I was using Google Movie Showtimes the other day to look up local times and noticed something really annoying: it doesn’t remember what location I want to search.
Most of the time web requests from my laptop go through a proxy connection. As a result, even though I’m signed into my Google account, the showtimes site does a geo-ip lookup and assumes I’m in New York instead of Utah. Not a big deal: I enter my zip code into the “Change Location” form, hit the button, and get the times I was looking for.
Then things get really strange: it doesn’t remember the location I entered. The next time I go to www.google.com/movies it uses geo-ip again and moves me from Utah back to New York.
The page is hard-coded to always use geo-ip, even when I tell it I’m somewhere else. Perhaps Google could add a checkbox to the ‘Change Location’ form that says ‘Make This My Default Location’.
Using geo-ip to come up with smart defaults can be very helpful. When it prevents me from setting the right location, though, it becomes irritating.
Two months ago Google announced Brotli, a new compression format:
Brotli is a whole new data format. This new format allows us to get 20–26% higher compression ratios over Zopfli. In our study ‘Comparison of Brotli, Deflate, Zopfli, LZMA, LZHAM and Bzip2 Compression Algorithms’ we show that Brotli is roughly as fast as zlib’s Deflate implementation. At the same time, it compresses slightly more densely than LZMA and bzip2 on the Canterbury corpus. The higher data density is achieved by a 2nd order context modeling, re-use of entropy codes, larger memory window of past data and joint distribution codes. Just like Zopfli, the new algorithm is named after Swiss bakery products. Brötli means ‘small bread’ in Swiss German.
Compression is a big deal for web performance: being able to send the same file in fewer bytes is a big win.
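Browsers and servers negotiate compression formats through the Accept-Encoding header, and Brotli’s registered token is br. As a rough sketch of how a server might prefer Brotli when the client supports it (my own illustration, not code from the Brotli project; the function name and preference order are assumptions):

```python
# Sketch: pick a response encoding from the client's Accept-Encoding
# header, preferring Brotli ("br") when the client advertises it.
# Quality values (";q=...") are ignored here for simplicity.

def pick_encoding(accept_encoding, preferred=("br", "gzip", "deflate")):
    """Return the best encoding the client offers, or 'identity'."""
    offered = {
        part.split(";")[0].strip().lower()
        for part in accept_encoding.split(",")
        if part.strip()
    }
    for encoding in preferred:
        if encoding in offered:
            return encoding
    return "identity"

print(pick_encoding("gzip, deflate, br"))  # br
print(pick_encoding("gzip, deflate"))      # gzip
print(pick_encoding(""))                   # identity
```

A real server would also honor q-values and fall back gracefully, but the core idea is the same: only send Brotli to clients that advertise br.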
I ran into the Google Docs downtime yesterday trying to edit a spreadsheet. Opening the document resulted in this error message:
For a few seconds this caused some serious tension. Did Google really remove my entire spreadsheet? Despite the error message claiming that the file I had been editing for weeks didn’t exist, I quickly switched gears, suspecting that something was wrong with the service and not with my file. The Google status page confirmed my suspicions.
This got me thinking about how services convey failure messages. In the broad sense there are at least two types of failure messages a service needs to provide. The first is some sort of user error, which is the one Google provided me yesterday. The second is a general ‘hey, we are down right now’ message.
My initial confusion and concern during the Google Docs downtime came from Google mixing up these two types of messages. They provided me with the first type (user error) when really it was the second type (the system is down).
“Style” covers a lot of ground, from “use camelCase for variable names” to “never use global variables” to “never use exceptions.” This project holds the style guidelines we use for Google code. If you are modifying a project that originated at Google, you may be pointed to this page to see the style guides that apply to that project.
I’ve noticed more developer-focused videos from Google lately (I’m probably just late to the party). One that I recently started watching is Compressor Head, a “video series explaining the theory and practice of compression algorithms”.
A common data point I see brought up in web performance discussions is a reference to an experiment Google ran to measure the impact of slower search results on the number of searches a user performs. I found the original blog post, Speed Matters:
Our experiments demonstrate that slowing down the search results page by 100 to 400 milliseconds has a measurable impact on the number of searches per user of -0.2% to -0.6% (averaged over four or six weeks depending on the experiment).
The impact extends beyond the initial test period:
Users exposed to the 400 ms delay for six weeks did 0.21% fewer searches on average during the five week period after we stopped injecting the delay.
If you are going to reference this test and the corresponding data, please link back to the original Google blog post. Hopefully that will save others the time of hunting down the original information.
I’ve been using Google PageSpeed Insights quite a bit recently. There isn’t much information on how exactly the tests are run, which can make it hard to reproduce the results. Then I noticed the user agent strings coming from PageSpeed Insights (emphasis mine):
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko; Google Page Speed Insights) Chrome/27.0.1453 Safari/537.36
Mozilla/5.0 (iPhone; CPU iPhone OS 6_0_1 like Mac OS X) AppleWebKit/537.36 (KHTML, like Gecko; Google Page Speed Insights) Version/6.0 Mobile/10A525 Safari/8536.25
The only difference between these and normal user agent strings is the “; Google Page Speed Insights” addition.
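Since that marker is the only difference, spotting PageSpeed Insights requests in server logs comes down to a substring check. A quick sketch (the function name is my own; the marker text comes from the user agent strings above):

```python
# Sketch: flag requests from Google PageSpeed Insights by the
# "Google Page Speed Insights" marker embedded in the user agent.

PAGESPEED_MARKER = "Google Page Speed Insights"

def is_pagespeed_insights(user_agent):
    """True if the user agent identifies a PageSpeed Insights request."""
    return PAGESPEED_MARKER in user_agent

desktop_ua = ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
              "(KHTML, like Gecko; Google Page Speed Insights) "
              "Chrome/27.0.1453 Safari/537.36")
regular_ua = ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
              "(KHTML, like Gecko) Chrome/27.0.1453 Safari/537.36")

print(is_pagespeed_insights(desktop_ua))  # True
print(is_pagespeed_insights(regular_ua))  # False
```

This makes it easy to either exclude PageSpeed traffic from analytics or isolate it when trying to reproduce a test run.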