Tag: google (page 2 of 6)

Google Reader, Gone

Sad to hear that Google Reader is being shut down. As of 1 July 2013 Google Reader will be gone. This is a major bummer; I use Google Reader every day.

When Google Reader came along in 2005 it wiped out most of the feed reader services that were around at the time. Prior to Google Reader I used Bloglines, back when Bloglines was still a feed reader. I had resisted moving from Bloglines to Google Reader because for a while I liked the experience in Bloglines better. Ultimately, though, Google Reader soaked up most of the oxygen in the feed reader service space and I moved entirely from Bloglines to Google Reader.

I suggest exporting your Google Reader data via Google Takeout soon. It only took a few minutes to walk through the steps to export my data. Other readers will likely support importing OPML, so best to export now instead of waiting until the last minute.

Despite how ugly this is going to be in the short term, I think this painful process may eventually yield better things. Marco Arment put it this way:

Now, we’ll be forced to fill the hole that Reader will leave behind, and there’s no immediately obvious alternative. We’re finally likely to see substantial innovation and competition in RSS desktop apps and sync platforms for the first time in almost a decade.

Technology, like nature, views a vacuum as an opportunity to fill a space.

So what now? There are a few alternatives available today, but none of them seem to be a clear winner. I imagine over the next twelve months there will be a number of new services that show up, trying to be _the_ replacement for Google Reader. It will be ugly for a while, but I’m hopeful in the end we’ll see some really strong replacements come out of this.

In the meantime, leave your Google Reader replacement suggestions in the comments below.

Zopfli Compression

Zopfli was named after a Swiss bread recipe

Google recently announced the new Zopfli Compression Algorithm:

Zopfli Compression Algorithm is a new zlib (gzip, deflate) compatible compressor. This compressor takes more time (~100x slower), but compresses around 5% better than zlib and better than any other zlib-compatible compressor we have found.

Being gzip compatible means that existing clients can decompress files that have been compressed with Zopfli. Specifically, web browsers will be able to understand this.

A number of people have asked what the big deal is over such a relatively small size reduction at the cost of a much slower compression process. Let’s take the latest jQuery file from the Google CDN as an example – http://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js.

First I cloned the Zopfli source repo and compiled it on my MacBook Air. It took about a minute or so to clone and build; the build process only required running make.
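
If you want to try this yourself, here is a rough sketch of the clone and build steps (the repository URL is the Google Code project from the announcement and may have moved since, so treat it as an example):

git clone https://code.google.com/p/zopfli/
cd zopfli
make
# the build drops a standalone zopfli binary in the source directory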

Here are the results of gzip -9 and zopfli --i1000 on jquery.min.js:

Compression       Size           Compression Time
None              92,629 bytes   -
gzip -9           32,660 bytes   0.009s
zopfli --i1000    31,686 bytes   16.376s
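
For reference, here is roughly how those numbers can be reproduced, assuming the zopfli binary built above is on your PATH (a sketch – exact byte counts and timings will vary):

# grab the file, then compress it both ways and compare sizes
curl -sO http://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js
time gzip -9 -c jquery.min.js > jquery.min.js.gzip.gz
time zopfli --i1000 jquery.min.js    # writes jquery.min.js.gz
ls -l jquery.min.js jquery.min.js.gzip.gz jquery.min.js.gz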

In this test Zopfli saved an additional 974 bytes and took over 16 seconds longer. In our Google CDN example the time it takes to do the compression doesn’t make any difference; it is something you’ll only be doing once per file. To see what the potential savings are from those extra 974 bytes we’d need to know how often the jQuery file is downloaded. I don’t know what the actual numbers are, so let’s make some up. Let’s say it is 1 million times per month.

So 974 bytes * 1,000,000 gives us 974,000,000 bytes per month in savings – nearly a gigabyte. No doubt this is a drop in the bucket when compared to total bandwidth usage at Google, but it is an improvement. The improvement wouldn’t just be for Google either; everyone who views web sites that serve the Zopfli compressed jQuery would have a slightly better experience as well. A smaller file gets downloaded faster over the same connection, which is extra good for mobile users.

I think there are plenty of cases where using Zopfli to compress your files will be a nice little improvement.
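
For example, a deploy step along these lines could precompress static assets once, so a web server configured to serve precompressed .gz files never pays the compression cost at request time (a sketch – public/ is a placeholder for your static files directory):

# precompress JavaScript and CSS with zopfli; each file gets a .gz written alongside it
find public/ \( -name '*.js' -o -name '*.css' \) | while read -r f; do
    zopfli --i1000 "$f"
done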

Google Glass

Google is setting some high expectations at the new Google Glass site – http://www.google.com/glass/start/ – if you haven’t seen it yet, go watch the How it feels video. Setting the expectations bar so high is either going to create a ton of demand for Glass, or it is going to kill it.

Beyond that though, the opportunities and challenges for Glass are fun to look at. For instance, will this style of wearable technology fall into the same “lame look” bucket as Bluetooth ear pieces? Beyond style, how will people find ways to use it inconspicuously? I mean, can you imagine having to put this on during a meeting to look up contact information or an email thread? Would Glass support whispering, or hand gestures, so that you could do things without having to talk out loud? Can you imagine how annoying it would get if there were 20 people in a room all trying to use their own Glass headset at the same time?

This leads me to wondering, would Glass be positioned as a phone replacement or a phone accessory? The ability to function as an amazing phone accessory will be very compelling. But like some people with tablets vs. laptops, there will no doubt be a segment of the market that will see Glass as a potential phone replacement. My guess is that will still be the minority for at least the first few years.

There are plenty of situations where having something voice activated and hands free is very desirable. With a few extra features this could do amazing things for firefighters, police officers, paramedics, etc. Tons of recreational and sports activities aren’t conducive to handling a phone, but something like Glass would work well. Imagine a composite video simulation of the Olympic opening ceremonies based on video feeds from every single person in the stands, the performers, and stadium crew, all wearing a Glass headset.

That would be an astronomical amount of data to process. It wouldn’t even take thousands of people in one stadium to generate a large amount of data. Just one person constantly taking pictures, shooting video, and sending messages can generate demand for a lot of storage and a reasonable way to manage it all. That is both a huge opportunity and a huge challenge that Google will have to deal with, and could be the key to making the whole thing a viable source of revenue. What would you be willing to pay for a Google Glass account that allowed you to store, access, and share everything you ever did using your Glass headset?

The Locations of Google Data Centers

The news that Google is building another data center in Iowa got me wondering if there was a list of all the Google data centers.

Turns out Google has a page with all of the data center locations, which is just one part of their ‘Data Center’ details area. There are 6 Google data centers in the United States, 2 in Europe, and 3 in Asia. Each data center has a page with more details.

Google’s plusone.js Doesn’t Support HTTP Compression

I was surprised to see that Google’s plusone.js doesn’t support HTTP compression. Here is a quick test:
curl -v --compressed https://apis.google.com/js/plusone.js > /dev/null

Request Headers:

> GET /js/plusone.js HTTP/1.1
> User-Agent: curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3
> Host: apis.google.com
> Accept: */*
> Accept-Encoding: deflate, gzip

Response Headers:

< HTTP/1.1 200 OK
< Content-Type: text/javascript; charset=utf-8
< Expires: Fri, 18 Nov 2011 02:35:20 GMT
< Date: Fri, 18 Nov 2011 02:35:20 GMT
< Cache-Control: private, max-age=3600
< X-Content-Type-Options: nosniff
< X-Frame-Options: SAMEORIGIN
< X-XSS-Protection: 1; mode=block
< Server: GSE
< Transfer-Encoding: chunked

You'll notice there is no Content-Encoding: gzip header in the response.
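
To put a number on what that costs, here is a quick sketch comparing the bytes curl actually receives with and without asking for compression (size_download is the downloaded body size):

# bytes received without advertising compression support
curl -s -o /dev/null -w '%{size_download}\n' https://apis.google.com/js/plusone.js
# bytes received when advertising gzip/deflate support
curl -s -o /dev/null --compressed -w '%{size_download}\n' https://apis.google.com/js/plusone.js

If the second number isn’t noticeably smaller, the response isn’t being compressed.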

We'll have to get Steve Souders to pester them about that.

Firefox Desperate To Mimic Chrome, Even Their Mistakes

Recently Firefox has been pushing a more aggressive upgrade schedule. There is little doubt that they are feeling the pressure from Google Chrome, which is becoming increasingly popular and has an aggressive upgrade cycle as well.

In the last year Chrome has become nearly as popular as Firefox. Many of the recent changes with Firefox, like the shorter release cycles, make it look like it is trying to play catch up with Chrome. Perhaps desperately so. Unfortunately, with the release of Firefox 7 it appears they are also desperate to copy the same mistakes Chrome has made.

It is no secret that I really don’t like the way Chrome broke copy and paste in the URL field. That was a horrible decision that irritates me on an almost daily basis. When I select something to be copied I expect to get an exact copy of what was selected; altering that under the hood completely breaks the concept of copy and paste.

So guess what new “feature” was added to Firefox 7? You got it:

The ‘http://’ URL prefix is now hidden by default

And it behaves in exactly the same broken way that Chrome does.

To the Mozilla team: look, I understand that you’re concerned about losing market share to Chrome, but please, please, please don’t mimic their mistakes. Now in order to copy and paste the URL properly I have to copy everything but the first character of the hostname, manually type that first character, then paste in the remainder. Absolutely horrible. This is one feature of Chrome that no one should ever copy, and I’d be thrilled to see it removed from Chrome as well.

If you want to no longer show ‘http://’ in the URL field, fine, but please stop breaking copy and paste.

UPDATE: Turns out Firefox has an option for disabling this “feature” (kudos to @ozh):

  • Enter about:config in the URL field
  • Filter on browser.urlbar.trimURLs
  • Set the value for browser.urlbar.trimURLs to false
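
If you would rather not click through about:config, Firefox also reads a user.js file from your profile directory at startup; a one-line sketch (the profile path varies by install):

// in your Firefox profile directory, in a file named user.js
user_pref("browser.urlbar.trimURLs", false);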

Not great that this is on by default, but at least there is an easy way to turn it off. Now, if only it were that easy to turn off this “feature” in Chrome.

What Google+ Looks Like For Me

Being part of the second-class citizen group that is Google Apps for Domains is becoming a real letdown.

Google APIs and SSL

Google has announced that several of their APIs will be moving to require SSL when making requests. This is a good thing.

If you aren’t planning for it already, now is a good time to expect new APIs to require SSL from the start. This is likely going to make TLS Server Name Indication an even bigger deal, as demand for SSL services increases and IP addresses become more expensive.
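
As a quick way to see why SNI matters, you can compare the certificate a server presents with and without the SNI extension (a sketch – example.com stands in for whichever API host you care about):

# certificate presented without SNI
openssl s_client -connect example.com:443 </dev/null 2>/dev/null | openssl x509 -noout -subject
# certificate presented when the hostname is sent via SNI
openssl s_client -connect example.com:443 -servername example.com </dev/null 2>/dev/null | openssl x509 -noout -subject

On a server hosting multiple SSL sites on a single IP address the two commands can return different certificates; without SNI the client only gets the default one.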

How To Make A Lazy Argument, Play the Fanboy Card

The term fanboy is often used as a negative label for someone who sticks to one product or vendor just because. The term is intended to be negative because it implies that the person gave no real thought to how good the product is or how well it fits their needs.

In a discussion about this vendor vs. that vendor, playing the fanboy card is a lazy way to try to strip any legitimacy from the other side’s point of view. After all, how can they possibly bring anything useful to a discussion if they are just a fanboy?

Sadly, playing the fanboy card is more often than not a lazy way of ignoring a different viewpoint instead of addressing it. This is particularly true of Google Android phones vs. the iPhone. Aaron Toponce recently employed this method:

People jumped onto the iPhone bandwagon when it was announced on AT&T for two reasons: Apple fanboys and superior hardware. People getting an iPhone on the Verizon network will be: Apple fanboys.

Aaron Toponce – The Verizon iPhone

Apparently the only possible reason for getting an iPhone on the Verizon network is being an Apple fanboy. By playing the fanboy card you simply get to ignore any possible counter point. I used Aaron’s post because it illustrated the point so well; folks on both sides play this game, it isn’t limited to one side or the other.

In the original post there was actually one detail provided, that the HTC Evo 4G is “head and shoulders over the iPhone 4. It’s no contest, and it’s already outdated hardware”. I’ve never used an HTC Evo 4G before and only played with an iPhone 4 for a few minutes. This made me curious to find out if there is any possible reason why someone would go with an iPhone 4 over an HTC Evo 4G. Fortunately others have already done the numbers comparison for me – Engadget lists numbers for the iPhone 4 vs. the HTC Evo 4G – with a chart for easy comparison.

Since the claim was “no contest” I only looked for items listed on the chart that someone could reasonably point to as providing some contest.

  • 802.11b/g/n for the iPhone 4, HTC Evo 4G only supports 802.11b/g
  • 960 x 640 resolution on the iPhone 4, HTC Evo 4G 800 x 480
  • 720p at 30fps video recording on the iPhone 4, HTC Evo 4G 720p at 24fps
  • Gyroscope on the iPhone 4, none on the HTC Evo 4G
  • Listed talk time: iPhone 4 – 7 hours on 3G, 14 hours on 2G; HTC Evo 4G – 6 hours

There are several other factors that could come into play, but I chose to limit the list to just specific numbers. Now it is entirely possible that none of the iPhone 4 advantages listed above make any difference given individual circumstances. The flip side is also true; some of these factors may be very important for some individuals. This constitutes a contest between the two.

The next time you get the urge to simply wave off counter points by calling the other side a fanboy, stop and think about what you are doing. Otherwise you may end up being the actual fanboy in the discussion :-)

mod_pagespeed for Apache

mod_pagespeed is an open-source Apache module that automatically optimizes web pages and resources on them. It does this by rewriting the resources using filters that implement web performance best practices. Webmasters and web developers can use mod_pagespeed to improve the performance of their web pages when serving content with the Apache HTTP Server.

via mod_pagespeed Overview.
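
As a rough idea of what turning it on looks like, here is a minimal Apache configuration sketch (the module path and filter names are examples – check the mod_pagespeed documentation for your install):

# load the module; the .so path varies by distribution/package
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
ModPagespeed on
# enable a couple of simple rewriting filters as an example
ModPagespeedEnableFilters collapse_whitespace,remove_comments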


© 2014 Joseph Scott
