What Google’s Page Experience update might mean for your website or blog

This post is intended as a little heads-up for my fellow Neocities webpage creators and bloggers. But first, I begin with my usual background insight.

When I was new to the internet and started creating websites, there were a few constraints I worked around:

  • I ensured my website looked and functioned ok on both Internet Explorer and Netscape Navigator,
  • my pages fit well on a screen with a resolution of 800×600, but would ideally expand to fit 1024×768 or higher,
  • and my pages didn’t take too long to load on a dial-up connection.

I pretty much set those criteria for myself based on what I was using (a Windows 98 desktop computer), and what seemed to be common at the time.

Gradually things changed; Netscape went and along came Firefox et al. Screens got bigger, and wider (and then smaller again), and connection speeds got quicker as broadband rolled out.

These days we have laptops, tablet computers, and smartphones, all with a variety of screen resolutions, a plethora of browsers, and varying input methods. Other “browsing technologies”, techniques, and tendencies have been built into those devices too; some are for security, others for a “better browsing experience” based on whatever device you are using at the time.

Personally, I’ve had little interest in keeping up. I still only browse on my desktop computer, and my internet connection is still pretty slow, relatively speaking. My screen resolution is higher, but for the web pages I create, I still consider the lowest screen resolution someone might be viewing a page at, and whether someone might want to print the text (making a page too wide can hinder this).

As “mobile devices” became increasingly popular, I still couldn’t be bothered to change my web pages to “adaptive” ones, and I still can’t, although when I changed blog themes I did consider some of the more “adaptive” options. However, I recently caught wind of Google (the search engine) changing how they rank websites based on these things and more. This website does a good job of explaining it [link].

I still enjoy creating web pages pretty much as I always have, and largely do this on Neocities, where there are many like-minded people who dabble in the aesthetics of the 2000s internet. Mostly I create content there just because that’s what I like doing, but sometimes I create a page that might be of use to someone else, someone who could land on it via a Google search.

Sadly though, it appears that with Google’s changes there will be less chance of this happening. Ultimately Google is cutting out “the old web” and favouring the kinds of websites that many in the Neocities community actually despise (many also seek to escape or avoid the clutches of “social media”). With this, the internet is being divided between those of the old world and those of the new: an internet that is increasingly commercial, amalgamated into a bland grey conformity, and lacking the diverse and truly personal spaces that each website could be (this is likely taking things a bit far, but it seems more people these days are used to having a bland profile rather than a whole site they can customise to their heart’s content).

Here are some details of how Google is/will be operating in the near future (May):

…Page Experience … includes three new Core Web Vitals. These are ranking signals that Google considers important to measure a page’s overall user experience (UX)…

    • First Input Delay (FID) [responsiveness: how quickly the page reacts to your first interaction]
    • Largest Contentful Paint (LCP) [how long the main content takes to load]
    • Cumulative Layout Shift (CLS) [how stable the content is as it loads]
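If you are curious how your own pages score on these, modern browsers expose the underlying measurements through the standard PerformanceObserver API. Below is a minimal sketch in TypeScript, to be run in the browser on your own page; note that these entry types are currently Chromium-centric, and the thresholds Google applies live in its tools rather than in any code you write, so treat this as illustrative only.

```typescript
// Minimal sketch: watching the three Core Web Vitals in the browser.
// Entry types 'largest-contentful-paint', 'first-input' and 'layout-shift'
// are Performance Timeline extensions; support outside Chromium varies.

// LCP: when the largest piece of content finished rendering.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('LCP candidate:', Math.round(entry.startTime), 'ms');
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });

// FID: gap between the user's first interaction and its handler running.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as Array<PerformanceEntry & { processingStart: number }>) {
    console.log('FID:', Math.round(entry.processingStart - entry.startTime), 'ms');
  }
}).observe({ type: 'first-input', buffered: true });

// CLS: running total of unexpected layout movement (ignoring shifts
// that immediately follow user input, as the metric specifies).
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as Array<PerformanceEntry & { value: number; hadRecentInput: boolean }>) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log('CLS so far:', cls.toFixed(3));
}).observe({ type: 'layout-shift', buffered: true });
```

Google also publishes a small web-vitals JavaScript library that wraps these observers, and tools like PageSpeed Insights report the same three numbers without you writing any code at all.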

Google will use these metrics to rank websites together with its existing search signals, which include:

    • HTTPS, the secure version of HTTP
    • Mobile friendliness
    • Lack of interstitial pop-ups that are considered intrusive
    • Safe browsing, meaning no presence of malware on your pages
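Two of those extra signals are easy to self-check from the browser console on your own page. The sketch below uses a checkPageExperience helper of my own invention, not anything Google provides; Google’s real evaluation happens in its crawler, not in your page.

```typescript
// Rough self-audit of two of the simpler signals. Illustrative only.
function checkPageExperience(): void {
  // HTTPS: is the page served over the secure protocol?
  console.log('HTTPS:', window.location.protocol === 'https:' ? 'yes' : 'no');

  // Mobile friendliness starts with a viewport meta tag; without one,
  // phones render the page at desktop width and shrink it to fit.
  const viewport = document.querySelector('meta[name="viewport"]');
  console.log('Viewport meta:', viewport?.getAttribute('content') ?? 'missing');
}

checkPageExperience();
```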

I find it somewhat ironic that it is the more ‘fancy and commercial’ websites that typically take longer for me to load (have a particularly high LCP – although many people need to learn to reduce the size of the images they plonk on a page), can be sluggish to respond to input (a high FID), need the security of HTTPS, and carry the threat of “unsafe browsing, laced with malware [or the exploitation of tracking]”. Not that more static pages can’t do that, of course. Pop-ups have been ruled out for a while now by the pop-up blockers built into browsers; instead, modern websites bombard the visitor with pages littered with advertising and other distracting material beyond the content that the page is “supposed to be” about.
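On that image point: the Resource Timing API will happily list what a page actually downloaded, which makes oversized images easy to spot. A quick sketch (transferSize reads as 0 for cross-origin resources unless the server sends a Timing-Allow-Origin header, so the figures are a lower bound):

```typescript
// List the heaviest image downloads on the current page, largest first.
const images = (performance.getEntriesByType('resource') as PerformanceResourceTiming[])
  .filter((entry) => entry.initiatorType === 'img')
  .sort((a, b) => b.transferSize - a.transferSize);

for (const img of images.slice(0, 10)) {
  console.log(`${Math.round(img.transferSize / 1024)} KiB  ${img.name}`);
}
```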

Some on Neocities will care little about making their website friendlier to Google’s system; even implementing “mobile friendliness” can be a challenge there. Those blogging on WordPress might check whether the theme their blog uses is mobile-friendly, or seek to change to one that is.

Incidentally, the YouTuber ActionRetro recently created a DuckDuckGo-powered search engine called FrogFind.com that strips out all of the bloat, and even keeps that bloat at bay when you follow a link in the search results; ultimately it presents pages as they would have appeared on the early internet, which also makes them usable on computers from that era. He also created a news feed in a similar fashion, called 68k.news.

I think these two examples illustrate the craving, at least among a small group, for the internet to be how it was, and how it seemingly was intended to be; we were happy with how things were, and while perhaps we were excited back then by the ever-changing nature of technology, we grew tired of the constant change. I think this change is more market-driven than consumer-driven, or even if it is the latter, it’s because consumers are directed/brainwashed by the market! It seems that every week I encounter more and more people who are “having to” change tech yet again, even while their “old tech” is still like new. It used to be that we “had to” upgrade our computers when Microsoft brought out a new version of Windows. Now even our smartphones and tablets become obsolete with a new iteration of iOS or Android. A new printer, for example, might insist on the use of an “App”, and that app requires a certain version of Android and no earlier; it might even lock a typical feature (like scanning) behind the requirement to create an account. All for what?

5 comments

  1. I quite agree Brian. I’m one of those who keep stubbornly using the old forms, but it gets harder and harder to do. I think there will be a day of reckoning though, as green movements take hold along with initiatives such as the right to repair.

  2. Thanks for the notice. They don’t seem to value the irritation factor for anyone who wants to use the net in any other way than what fits in with “you don’t have to worry your little head with that because we know exactly what’s best for you”. It was the kind of attitude that, around 2000, removed programming from school computing courses for 15 years, because someone persuaded someone else that it was no longer needed; so kids were taught at school only how to use Word, PowerPoint and Excel, at a rather trivial level that anyone could pick up themselves. I’ll stop there otherwise this will become a rant.

    • Indeed, I went from high school around that time with good Word and Excel experience, to college, where the IT tutor (in hindsight) was under the assumption that we already had some programming knowledge.
