An argument against lazy loading
Opinion: Most of the time, I really dislike using websites that lazy load images.
(Yeah, this post is full of my opinions. Yours may differ. That's okay 🙂)
So I've watched with dismay over the past few years as lazy loading has gone from a targeted performance optimisation technique to a blanket best practice.
Google's Lighthouse audit (which drives PageSpeed Insights) now gives you a big red fail if below-the-fold images aren't lazy loaded.
Since users can't see offscreen images when they load a page, there's no reason to download the offscreen images as part of the initial page load.
Lighthouse Audit Reference: Offscreen Images
As a result, every website owner, developer and framework author who cares about SEO performance in 2019 is lazy loading pretty much everything below The Fold™ (and plenty of things above it).
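In practice, that usually looks something like the following sketch: a hand-rolled IntersectionObserver that only swaps in the real image URL once the image is near the viewport. (This is an illustrative pattern, not any particular library's API.)

```typescript
// Illustrative lazy loading pattern (not any specific library): images ship
// with a data-src attribute instead of a real src, and only start downloading
// once they come within 200px of the viewport.
const lazyImages = document.querySelectorAll<HTMLImageElement>("img[data-src]");

const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src!; // the real request starts now, not at page load
      obs.unobserve(img);         // each image only needs to be upgraded once
    }
  },
  { rootMargin: "200px" }         // begin loading slightly before the image is visible
);

lazyImages.forEach((img) => observer.observe(img));
```

Until that swap happens, the browser has no request in flight for the real image, which is exactly why the busy indicator stops spinning long before the page is actually "done".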
In defence of the browser's native busy indicator
Your browser's busy indicator is an amazingly solid, consistent & nuanced feedback mechanism. It's so good, smart people have been trying to give developers better access to it for a long time.
When spinning, it's saying...
Yo! I'm still loading some things. You can go ahead and scroll or read or interact with what you can see, but there's more stuff coming.
And when it stops spinning...
I'm done! I'm now fully capable of doing what you came to this page to do. Scroll! Read! Interact!
Well, it used to mean that. Now it means...
I've loaded something. I may or may not be capable of doing what you came to this page to do yet. Maybe I'm loading more stuff right now. Or maybe I'll wait until you start scrolling.
Either way, I sure hope you have a constant, fast connection & are physically on the same side of the Pacific as my server...
That contract - the idea that the browser is telling the truth when it says a page has finished loading - is enormously valuable, especially in the web performance poster child scenario: on a low powered device with a patchy connection and limited or expensive bandwidth.
When I'm on a train/street/airport/public wifi, or when I'm somewhere that requires diligent airplane mode-ing to conserve precious 💸 megabytes, I'd gladly trade an extra second (or five) at load time for a guarantee that I can then reliably do the thing I came to that page to do without constantly going back to the network.
The browser's busy indicator can give that guarantee, but it's totally undermined by lazy loading implementations that vary wildly from site to site.
The experience is terrible
Read, scroll, wait.
Read, scroll, wait.
Anyone who has read a Medium post from Australia's latency-ridden soil will be familiar with this experience.
When Facebook popularised the "tiny blurry version of the image" technique a few years ago, they were using it for cover photos (i.e. background images). For images that are essentially just decoration, the technique makes sense!
For images that are content - like, say, an image surrounded by relevant words in a blog post - that blurry version is frustratingly useless. The frustration is compounded by the fact that we get no feedback from the browser’s native busy indicator.
Before jumping onto the tiny blurry image lazy loading bandwagon, pause and consider whether your blog or online store has the exact same requirements as a very specific part of Facebook's UI did in 2015.
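For context, the blur-up technique itself is only a few lines. Here's a rough sketch (file names and the CSS class are hypothetical): render a tiny placeholder immediately, then swap in the full image once it has downloaded.

```typescript
// Blur-up sketch: show a tiny, heavily scaled-up placeholder immediately,
// then swap in the full-resolution file once it has finished downloading.
// File names and the "is-blurry" class are hypothetical.
function blurUp(img: HTMLImageElement, fullSrc: string): void {
  const full = new Image();
  full.onload = () => {
    img.src = fullSrc;                 // swap only once the real image is ready
    img.classList.remove("is-blurry"); // e.g. a CSS blur() filter on the placeholder
  };
  full.src = fullSrc;                  // kicks off the full-resolution download
}

const cover = document.querySelector<HTMLImageElement>(".cover-photo");
if (cover) {
  cover.src = "/images/cover-tiny.jpg"; // a few kilobytes, maybe 20px wide
  cover.classList.add("is-blurry");
  blurUp(cover, "/images/cover-full.jpg");
}
```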
Making assumptions about how people consume your content
We don't load an entire video up front before users can watch it. Instead we stream it. When they click the play button, we incrementally load chunks of video just ahead of time.
We can do this because we can make a pretty reasonable assumption that the video will be consumed linearly at roughly 1 second per second.
We can't make that same assumption for an image-filled web page.
- You don't know how fast your user will scroll, or when they'll start scrolling, or where they're looking as they scroll.
- You don't know that they'll read the page from top to bottom at a constant rate.
- You don't even know if they'll start at the top of the page (a #fragment-url might send them somewhere further down).
- You don't know that the text content or above-the-fold images are more important to them than images below the fold.
It’s not just images
The popularity of code splitting has led to an explosion in page transitions that eschew the familiar browser busy indicator in favour of experiences like...
- Click. Wait. (No loading indicator, is anything happening? Should I click again?) New page suddenly appears!
- Click. The URL changed but nothing else appears to be happening. Wait. (Is that thin line at the top of the screen a loading bar? It's been stuck for a while...) Wait more. Give up & hard reload.
Code splitting has its advantages, especially in the context of a big JavaScript web app. Nobody wants to download & parse a bunch of code they'll never use.
But the approach comes with UX tradeoffs that we must consider. Is a faster start time worth an experience littered with awkward, buggy pauses? Is your custom loading UX as familiar, bug-free and edge-case-resilient as the browser's busy indicator?
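To make that tradeoff concrete, here's a rough sketch of a code-split page transition (paths, element IDs and helpers are illustrative). Because the chunk is fetched with a dynamic import() long after the page "finished" loading, the busy indicator never spins for it, and all of the loading feedback and error handling has to be rebuilt by hand.

```typescript
// Code splitting sketch (paths, IDs and markup are illustrative). The settings
// page's code lives in a separate chunk that is only fetched when the user
// navigates, so the browser's busy indicator never spins for it.
const spinner = document.getElementById("spinner")!;

async function navigateToSettings(): Promise<void> {
  spinner.hidden = false; // hand-rolled substitute for the busy indicator
  try {
    const { renderSettingsPage } = await import("./pages/settings");
    renderSettingsPage(document.getElementById("app")!);
  } catch {
    // On a patchy connection the chunk request itself can fail, which is the
    // "click, wait, give up & hard reload" experience described above.
    alert("Couldn't load that page. Please try again.");
  } finally {
    spinner.hidden = true;
  }
}
```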
What can we do?
So I didn't title this post "Never lazy load anything!"
Image lazy loading and code splitting are legitimately useful techniques in plenty of situations (especially when coupled with prefetching!). But they're frequently an engineering-only concern, aimed at improving some initial page load metric. They make our graphs look good.
Metrics like Time to Interactive and First CPU Idle are important! Improving the speed at which our users can navigate through common flows in our apps is important!
But I haven't really seen any discussion of the downsides to current approaches, now being cemented as a blanket baseline for every website by Google's Lighthouse audits.
Have you picked all the low-hanging fruit?
- Are your images crunched?
- Are you using responsive images to ensure your users only download the image their device needs? (There's a short sketch of this after the list.)
- Are your images (and other static resources) behind a global CDN, with filenames that allow them to be cached forever?
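On the responsive images point, a minimal sketch (with hypothetical file names) might look like this. Given srcset and sizes, the browser picks the smallest candidate that covers the rendered size, so a phone never downloads the 1600px original.

```typescript
// Responsive images sketch (file names are hypothetical). The browser chooses
// the smallest srcset candidate that satisfies the rendered width described
// by the sizes attribute.
const img = document.createElement("img");
img.src = "/images/harbour-800.jpg"; // fallback for browsers without srcset support
img.srcset = [
  "/images/harbour-400.jpg 400w",
  "/images/harbour-800.jpg 800w",
  "/images/harbour-1600.jpg 1600w",
].join(", ");
img.sizes = "(max-width: 600px) 100vw, 600px"; // how wide the image actually renders
img.alt = "Boats in the harbour";
document.body.appendChild(img);
```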
Does your project need a React-powered, image-lazy-loading, client-side-routing, code-splitting, hot-reloading, all-singing, all-dancing framework?
On a project where I can justify those tools, they're great! I love the modern JS ecosystem! But I didn't need any of them to make the site you're reading or this image-heavy photoblog nice & zippy. You may not either.
Have you tried Progressive JPEG?
So you really want that Medium-style progressive image display experience... OK, fine. The ancients had a mystical technology called Progressive JPEG:
When you see a progressive JPEG loading, you’ll see a blurry version of the full image, which gradually gets sharper as the bytes arrive.
That sounds like what we want, with a couple of added bonuses:
- Doesn't break the browser's busy indicator!
- Recognisable, useful (if not perfect, final, high-res) information in the image is visible sooner than with lazy loading techniques that make us wait for the full image to load.
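Generating them is straightforward with most image tooling. As one example (my choice of tool, and hypothetical file names), here's a sketch using the sharp library for Node:

```typescript
// Re-encoding an image as a progressive JPEG with the sharp library for Node.
// File names are hypothetical; pick a quality setting that suits your images.
import sharp from "sharp";

async function makeProgressive(input: string, output: string): Promise<void> {
  await sharp(input)
    .jpeg({ progressive: true, quality: 75 }) // progressive scans instead of a single top-to-bottom pass
    .toFile(output);
}

makeProgressive("photo-original.jpg", "photo-progressive.jpg").catch(console.error);
```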
After a long time spent travelling, relying on a mobile device on poor and expensive internet connections, I really, really like my browser's busy indicator.
More often than not, the lazy-loaded web feels broken to me. In those patchy connection scenarios, it often slows me down or entirely prevents me from doing what I visited a web page to do.
And yeah, this is just one person's anecdotal evidence. But if we're going to collectively break a well-understood feedback mechanism that worked pretty well for a couple of decades, we should at least consider the tradeoffs.