Browsing the web with a slow connection makes you realize something: we've built our web from a place of entitlement, an assumption of speed
— Craig Mod (@craigmod) June 2, 2015
A common data point I see brought up in web performance discussions is a test Google ran to measure the impact of slower search results on the number of searches a user performs. I found the original blog post, Speed Matters:
Our experiments demonstrate that slowing down the search results page by 100 to 400 milliseconds has a measurable impact on the number of searches per user of -0.2% to -0.6% (averaged over four or six weeks depending on the experiment).
The impact extends beyond the initial test period:
Users exposed to the 400 ms delay for six weeks did 0.21% fewer searches on average during the five week period after we stopped injecting the delay.
If you are going to reference this test and the corresponding data, please link back to the original Google blog post. Hopefully that will save others the time of hunting down the original source.
Combine this with the number of Internet users in the United States (~279 million) and you get: ~1% of United States Internet users are on AOL dial-up.
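The arithmetic behind that ~1% figure can be sketched in a couple of lines. Note the ~2.1 million dial-up subscriber count is an assumption on my part (it comes from AOL's 2015 earnings reporting and is not stated in the excerpt above):

```python
# Back-of-the-envelope share of US Internet users on AOL dial-up.
# ASSUMPTION: ~2.1 million dial-up subscribers, per AOL's 2015
# earnings reporting; not a figure taken from this post.
aol_dialup_subscribers = 2.1e6
us_internet_users = 279e6  # from the post above

share = aol_dialup_subscribers / us_internet_users
print(f"{share:.1%}")  # just under 1%
```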
And google.com is fast compared to most sites; nearly every modern web site is going to be horribly painful on a dial-up connection.
You may remember one of the original webperf rules was to put scripts at the bottom.
– The preload scanner scans the whole document as it comes in and issues fetch requests.
– The main parser sends a signal when it reaches the start of the body and issues fetch requests for any resources it discovers that are not already queued.
– The layout engine increases the priority of visible in-viewport images when it does layout.
– The resource loader treats all script and CSS as critical, regardless of where it is discovered (in the head or at the end of the body). See crbug.com/317785.
– The resource loader loads in two phases. The first (critical) phase delays non-critical resources, except for one image at a time. Once the main parser reaches the body tag, it removes the constraint and fetches everything.
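The two-phase policy in that last point can be sketched as a toy scheduler. This is a simplification for illustration, not Chrome's actual implementation; the resource names and the `schedule` function are my own:

```python
# Toy sketch of the two-phase fetch policy described above.
# Phase 1 (before the parser reaches <body>): fetch critical
# resources (script/CSS) plus at most one image. Phase 2: everything.
from dataclasses import dataclass

CRITICAL_TYPES = {"script", "css"}

@dataclass
class Resource:
    url: str
    kind: str  # "script", "css", "image", ...

def schedule(resources, body_reached):
    """Return the resources that would be fetched right now."""
    if body_reached:
        return list(resources)       # phase 2: no constraint
    fetch_now = []
    image_slots = 1                  # phase 1: one image at a time
    for r in resources:
        if r.kind in CRITICAL_TYPES:
            fetch_now.append(r)      # script/CSS is always critical
        elif r.kind == "image" and image_slots:
            fetch_now.append(r)
            image_slots -= 1
    return fetch_now

head = [Resource("app.js", "script"), Resource("style.css", "css"),
        Resource("hero.png", "image"), Resource("footer.png", "image")]

print([r.url for r in schedule(head, body_reached=False)])
# → ['app.js', 'style.css', 'hero.png']
print([r.url for r in schedule(head, body_reached=True)])
# → every resource, including footer.png
```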
TL;DR: don’t use defer for external scripts that can depend on each other if you need IE <= 9 support.
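A minimal illustration of the problematic pattern (the filenames are hypothetical):

```html
<!-- app.js depends on jquery.js having run first. Per spec, deferred
     scripts execute in document order, but IE <= 9 can interleave
     their execution, so app.js may start before jquery.js finishes.
     For old IE, plain scripts at the end of <body> are the safer bet. -->
<script defer src="jquery.js"></script>
<script defer src="app.js"></script>
```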
Even after 20 years of development there are days where browser environments still feel like the wild west.
Patrick Meenan mentioned first paint time improvements in Chrome 41. I noticed a ~25% improvement in the first view SpeedIndex times for one of our tests. It was easy to spot when the auto update from Chrome 40 to 41 happened:
I compared the individual tests before and after the update and this really is all about first paint times. The total time for the page to be visually complete was roughly the same.
It has been super exciting to see WordPress.com DNS performance rank #3 worldwide:
We are behind second place EdgeCast by just 0.66ms.
Serious kudos to our systems and network operations teams for including DNS in our Anycast network, which made this level of performance possible.
From a presentation by Kazuho Oku on the internals of the H2O web server (something I had previously mentioned), slide 21 (emphasis is mine):
Characteristics of a fast program:
1. Executes less instructions
– speed is a result of simplicity, not complexity
Getting the same result with less work and less complexity is a good thing.