Tuesday 30 November 2010

HTML5 Games, Jammed

(Updated 10 Dec 2010 -- corrected link to 3rd Place Hilversum developer Kornel Lesinski's Twitter page.)

Last month, more than 50 developers assembled in Hilversum, the Netherlands, and San Francisco, California, for an HTML5 game jam.

The idea of HTML5 gaming may seem unusual, but if the results from this event are anything to go by, there will be plenty more HTML5 games in the future. In just over 24 hours of coding, attendees were able to produce the seeds of great games, powered by standard web technologies. The games we saw were novel, visually appealing, and in many cases, already very playable.

HTML5 is making it easy to develop games for standard web browsers, and it also provides a way for developers to reach mobiles and tablets with a single code base. Watch for other initiatives, like Mozilla's current HTML5 gaming competition, to take HTML5 gaming to the next level.

Here’s a look at the winners from both venues. You can see a detailed list of all the entries here.

First Place, San Francisco: Ninja Leap

A novel 8-bit-style game where you “leap” over the bad guys. A good demo of the Canvas element and a complete game with levels and scoring. Congratulations to David Ganzhorn and Mike Rotondo on winning the HTML5 Game Jam in the USA.


First Place, Hilversum: Monkey Fortress

A puzzle game where you build a fortress to protect the monkey, demonstrating a physics engine in Canvas. Congratulations to Tom Hastjarjanto on winning the HTML5 Game Jam in Europe.


Second Place, San Francisco: Shell Shock

A platform shooter involving turtle-like creatures on wheels, using Canvas. By Wolff Dobson, Charles Lee, Nicolas Coderre, Dan Fessler, Sara Asher. (No online demo at present.)


Second Place, Hilversum: Snakes

A refresh of the classic “Snake” game, demonstrating multiplayer powered by Node.js and WebSockets, and 3D transforms of the canvas element. By David Durman & Ales Sturala. (No online demo at present, but a code repository is available.)


Third Place, San Francisco: Fruit Link

A casual puzzle game by Bruno Garcia, where you link up adjacent matching fruit.


Third Place, Hilversum: Enterprise

A stunning 3D game inspired by the classic Syndicate series, showcasing just how far we’ve come with Canvas-based graphics. Observe the collision detection, and be sure to hit the “Flying Carpet” button as well as the space bar to fire! This game was also shown in the “Web or Native for Mobile Development?” session at the recent Google Developer Days conferences in Europe. Created by Kornel Lesinski, Peter van der Zee, and Edwin Martin.


A few other readily playable games you might enjoy are:

We were also honoured to have keynotes by two pioneers of web-based gaming. In Hilversum, the speaker was Tino Zijdel, creator of DHTML Lemmings back in 2004. Tino, coincidentally a Hilversum local, explained the tricks he used to make the game playable in the browsers of the day. He has since written his own account of the Game Jam; it’s in Dutch, so here’s an English translation. There were additional presentations from Yu Jianrong, who covered ten tips for HTML5 game development, and from Paul Irish on HTML5.


The San Francisco keynote was given by Marcin Wichary, creator of the Pac-Man doodle and of the first version of the popular HTML5Rocks slides. Marcin talked about his experiences recreating Pac-Man and the timeless aspects of videogaming in the modern age, shared some behind-the-scenes trivia, and walked through the technology used to write and debug the doodle.

We thank SPIL Games for hosting and co-organising the Netherlands event, and we also thank Samsung for contributing a Galaxy Tab for the Game Jam at that venue. Developers working on touch apps were able to use the Tab for testing, and we later gave the device away as a prize. Congratulations to all who took part!

You can find more details about the event, including links to code repositories and further demos, at HTML5GameJam.com.

Monday 29 November 2010

Google’s sample OpenID relying party site

More and more websites are enhancing their login systems to include buttons for identity providers such as Google, Yahoo, Facebook, Twitter, and Microsoft. Users generally prefer this approach because it makes it easier to sign up for a new site. However, if a user already has an account at a website and is used to logging in with an email address and password, it is hard to get them to switch to using an identity provider.

Google has recently released a sample site that shows how a website can migrate users away from password-based logins and instead have them leverage an identity provider. This sample site incorporates many of the ideas of the Internet Identity community, as well as feedback from numerous websites that have been on the cutting edge of applying these techniques. The following video highlights some elements of the user experience.

The sample site is at openidsamplestore.com, but we suggest first reading this FAQ which describes the site and has links to additional videos of some of the features. We hope website developers will use these techniques to reduce the need for passwords on their site.

Tuesday 23 November 2010

Random Hacks of Kindness #2 - Come hack for humanity!

On the weekend of December 4 and 5, hackers will gather in cities around the globe to create software solutions that make a difference.

Google, Microsoft, The World Bank and Yahoo! are inviting software developers, independent hackers and students to participate in Random Hacks of Kindness (RHoK #2) next weekend.

RHoK brings together volunteer programmers and experts in disaster response for a two-day hackathon to create software solutions that focus on problems related to disaster risk and response. It is an opportunity to meet and work with top software developers and disaster experts, to create and improve open source applications that enable communities to recover from disasters, and to possibly win prizes.

Examples of previous hacks include the “I’m OK” app from RHoK #0 in November 2009, which was used during the response to recent earthquakes in Haiti and Chile, and the landslide prediction tool “Chasm,” winner of RHoK #1 in June 2010.

RHoK will be held simultaneously in many locations around the world. The five main stages will be in Chicago, São Paulo, Aarhus, Nairobi, and Bangalore, and there will be over a dozen satellite events in other cities around the globe. To find a location near you, see the latest list on the RHoK website.

Join us on December 4th and 5th, and visit www.rhok.org for more information.

Monday 22 November 2010

Edmunds partners with Google to make the web faster

Note: This is a guest post from Ismail Elshareef, who is the Principal Architect at Edmunds.com. Thanks for the post and for making the web faster, Ismail!

In the fall of 2008, we embarked on a complete redesign of our car enthusiast site, insideline.com. One of the main redesign objectives was to deliver the fastest page load possible to our consumers. Leading up to that point, we had been closely following and implementing the performance best practices championed by Google's Make the Web Faster team and others. We understood the impact performance has on user experience and the bottom line.

Some of the many performance-enhancing features that have been implemented on insideline.com (and now on our beta.edmunds.com) are:
  1. Reducing the number of HTTP requests: We combined CSS and JavaScript files as necessary, and used sprites and data URIs when appropriate. We have also reduced the number of blocking requests as much as possible to make the pages "feel" faster
  2. Serving static content from different domains: This helped maximize the browsers' parallel download capacity and reduced the request payload, since no cookies were sent over the wire to those domains
  3. Using Expires headers: Caching static files in the client's browser to eliminate unnecessary, redundant requests to our servers
  4. Lazy-loading page modules: Render the bare minimum page components first so that the user sees something on the page, and then go through the modules and load them in order of priority. We developed a JavaScript Loader component to help us accomplish this, which you can read more about on the Edmunds technology blog.
  5. Managing 3rd-party components: iFrame components could be lazy-loaded without a problem. JavaScript components, on the other hand, need to be loaded onto the page before the onLoad event fires, which had the potential to slow down our pages. The solution we devised was to delay calling those components until we initiate the lazy-loading of modules, right before the onLoad event fires
  6. Using non-blocking calls: Because the browser is single-threaded, we optimized the way resources are included on the page so that they don't block rendering and the page is perceived as fast by the user (a minimal sketch of this pattern follows the list).
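
To make points 4 and 6 concrete, here is a minimal sketch of the general pattern (this is not the actual Edmunds Loader component; the function name and module path below are made up for illustration). The script element is created dynamically and appended after the initial render, so fetching and executing it never blocks page rendering:

function loadScriptAsync(src, onLoad) {
  // Defer the DOM insertion so it happens after the initial render.
  window.setTimeout(function () {
    var script = document.createElement('script');
    script.async = true;                  // hint that the script must not block parsing
    script.src = src;
    if (onLoad) { script.onload = onLoad; }
    document.getElementsByTagName('head')[0].appendChild(script);
  }, 0);
}

// Usage: load a lower-priority module once the core content is visible.
loadScriptAsync('/js/photo-gallery-module.js', function () {
  // initialize the module here
});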

The results on insideline.com have been incredible. Page load time went from 9 seconds on average on the old site to 1.5 seconds on average on the new one, and that's while loading much richer content onto the page (measured with WebPageTest). We have also seen a 3% increase in ad revenue. On beta.edmunds.com, which will fully replace our legacy site in December 2010, we have seen a 17% increase in page views and a 2% reduction in the bounce rate for our landing pages in a controlled experiment.

Although we have a long way to go in making our pages and services faster, we are very pleased with the progress we’ve made so far. Working with Google to make the web faster has been an exciting adventure that will continue with more improvements and innovations for both our sites and the web as a whole. Get more details on the Edmunds technology blog and try these enhancements on your site today.

Thursday 18 November 2010

HTML5, browsers, and books, twenty years later

Update: Thanks for all the interest and feedback on 20 Things I Learned about Browsers and the Web! We hope to open-source the code in the coming months and will post an update when we do. Stay tuned.

Twenty years ago this month, Tim Berners-Lee published his proposal for the World Wide Web. Since then, web browsers and web programming languages have come a long way. A few of us on the Chrome team decided to write an online guide for everyday users who are curious about the basics of how browsers and the web work, and how their evolution has changed the way we work and play online. Called "20 Things I Learned about Browsers and the Web," this online guidebook is illustrated by Christoph Niemann, and built in HTML5, JavaScript and CSS3, with our friends at Fi.

In building an online book app, HTML5, JavaScript, and CSS3 gave us the ability to combine features that hearken back to what we love about books with the best aspects of the open web: the app works everywhere, on any device with a modern browser. Here are a few features of the book experience that we’re particularly excited about:

  • After the app has been visited once, you can also take the experience with you offline, thanks to the Application Cache API.
  • You can resume reading where you left off, as the book remembers your progress using the Local Storage API (see the sketch after this list). We also mark previously read chapters by folding down the top right corner of the page in the navigation.
  • The app utilizes the History API to provide a clutter-free URL structure that can be indexed by search engines.
  • The HTML5 canvas element is used to enhance the experience with transitions between the hard cover and soft pages of the book. The page flips, including all shadows and highlights, are generated procedurally through JavaScript and drawn on canvas.
  • The canvas element is also used to animate some of the illustrations in the book.
  • CSS3 features such as web fonts, animations, gradients and shadows are used to enhance the visual appeal of the app.
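
As a rough illustration of the Local Storage point above (this is not the book's actual code; the storage key and field names are made up), remembering reading progress can be as simple as:

// Save the reader's position whenever a page is turned.
function savePosition(chapter, page) {
  if (window.localStorage) {
    localStorage.setItem('reading-position',
        JSON.stringify({chapter: chapter, page: page}));
  }
}

// Restore it on the next visit so the book opens where the reader left off.
function restorePosition() {
  var saved = window.localStorage && localStorage.getItem('reading-position');
  return saved ? JSON.parse(saved) : {chapter: 1, page: 1};
}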

This illustrated guidebook is best experienced in Chrome or any up-to-date, HTML5-compliant modern browser. We hope you enjoy the read as much as we did creating it, at www.20thingsilearned.com or goo.gl/20things.

Monday 15 November 2010

Be part of improving Google Person Finder

Google Person Finder has become a useful tool in responding to natural disasters by reconnecting people with their family and friends. We’ve been looking at the next phase of Google Person Finder and decided to begin hosting the open source project at Google Code. We’re inviting the developer community to help improve Google Person Finder and the PFIF data format.

Google Person Finder provides a common place to search for, comment on, and connect records from many missing person registries. After the January 12th earthquake in Haiti, a team of Googlers worked with the U.S. Department of State to quickly create a site that helped people who were affected by the disaster. The site was used heavily after the Chile earthquake in February and put in action again in April after the Qinghai earthquake in China and in August for the Pakistan floods.

The software powering Google Person Finder is open source so we’re listing the open issues and feature requests we’ve received over the past few months in hopes the community can help us improve the code. We’ve created a Developer Guide to help developers get started. As always, we invite those interested to post questions on our public Person Finder discussion group. Those who are interested in improving the PFIF data format can also join the PFIF discussion group.

In addition to opening our product for developers, we’ve decided it’s now time to turn off our Google Person Finder instances for Haiti, Chile, China, and Pakistan. It doesn’t seem useful to be serving these missing person records on the Internet indefinitely, so we intend for each instance of Google Person Finder to be running for a limited time. Once an instance has served its purpose, we will archive the PFIF records in a secure location for historical preservation for one year while we work to identify a permanent owner for these records. Assuming a long-term owner cannot be found, we will delete the records after one calendar year. For more information, please feel free to review the Google Person Finder FAQ.

Instant Previews: Under the hood

If you’ve used Google Search recently, you may have noticed a new feature that we’re calling Instant Previews. By clicking on the (sprited) magnifying glass icon next to a search result, you see a preview of that page, often with the relevant content highlighted. Once activated, you can mouse over the rest of the results and quickly (instantly!) see previews of those search results, too.

Adding this feature to Google Search involved a lot of client-side JavaScript. Being Google, we had to make sure we could deliver this feature without slowing down the page. We know our users want their results fast. So we thought we’d share some of the techniques involved in making this new feature fast.

JavaScript compilation

This is nothing new for Google Search: all our JavaScript is compiled to make it as small as possible. We use the open-source Closure Compiler. In addition to minimizing the JavaScript code, it also rewrites expressions, reuses variables, and prunes out code that is not being used. The JavaScript on the search results page is deferred, and also cached very aggressively on the client side so that it’s not downloaded more than once per version.

On-demand JSONP

When you activate Instant Previews, the result previews are requested by your web browser. There are several ways to fetch the data we need using JavaScript. The most popular techniques are XMLHttpRequest (XHR) and JSONP. XHR generally gives you better control and error handling, but it has two drawbacks: browser caching tends to be less reliable, and only same-origin requests are permitted (though this is starting to change with modern browsers and cross-origin resource sharing). With JSONP, on the other hand, the requested script returns the desired data as a JSON object wrapped in a JavaScript callback function, which in our case looks something like

google.vs.r({"dim":[302,585],"url":"http://example.com",ssegs:[...]}).

Although error handling with JSONP is a bit harder than with XHR (not all browsers support onerror events), JSONP can be cached aggressively by the browser, and is not subject to same-origin restrictions. This last point is important for Instant Previews because web browsers restrict the number of concurrent requests that they send to any one host. Using a different host for the preview requests means that we don’t block other requests in the page.

There are a couple of tricks when using JSONP that are worth noting:

  • If you insert the script tag directly, e.g. using document.createElement, some browsers will show the page as still “loading” until all script requests are finished. To avoid that, make your DOM call to insert the script tag inside a window.setTimeout call.
  • After your requests come back and your callbacks are done, it’s a good idea to set your script src to null, and remove the tag. On some browsers, allowing too many script tags to accumulate over time may slow everything down.
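
Putting those two tips together, a bare-bones JSONP helper might look like the sketch below. This is not the code Google Search uses; the helper and callback names are made up, and it assumes the URL already contains a query string:

function jsonp(url, callbackName, onData) {
  var script = document.createElement('script');

  // The returned script invokes this global function with the JSON payload.
  window[callbackName] = function (data) {
    onData(data);
    // Tip 2: clear the src and drop the tag so old script tags don't pile up.
    script.src = null;
    if (script.parentNode) { script.parentNode.removeChild(script); }
    window[callbackName] = null;
  };

  // Tip 1: insert the tag inside a setTimeout so the browser doesn't keep
  // showing the page as "loading" while the request is outstanding.
  window.setTimeout(function () {
    script.src = url + '&callback=' + callbackName;
    document.getElementsByTagName('head')[0].appendChild(script);
  }, 0);
}

// Hypothetical usage:
jsonp('https://previews.example.com/preview?id=42', 'handlePreview', function (data) {
  // draw the preview using data.dim, data.ssegs, ...
});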

Data URIs

At this point you are probably curious about what we’re returning in our JSONP calls and, in particular, why we are using JSON and not just plain images. Perhaps you even used Firebug or your browser’s Developer Tools to examine the Instant Previews requests. If so, you will have noticed that we send back the image data as sets of data URIs. Data URIs are base64 encodings of image data that modern browsers (IE8+, Chrome, Safari, Firefox, Opera, etc.) can use to display images, instead of loading them from a server as usual.

To show previews, we need the image and the relevant content of the page for the particular query, with bounding boxes that we draw on top of the image to show where that content appears on the page. If we used static images, we’d need to make one request for the content and one request for the image; using JSONP with data URIs, we make just one request. Data URIs are limited to 32K on IE8, so we send “slices” that are all under that limit and then use JavaScript to generate the necessary image tags to display them. And even though base64 encoding adds about 33% to the size of the image, our tests showed that gzip-compressed data URIs are comparable in size to the original JPEGs.
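
As a rough sketch of that last step (not the actual Instant Previews code; the function and argument names are made up), stitching the slices back into a single preview might look like:

// 'slices' is an array of base64 data URIs, each kept under IE8's 32K limit.
function renderPreview(container, slices) {
  for (var i = 0; i < slices.length; i++) {
    var img = document.createElement('img');
    img.src = slices[i];             // e.g. "data:image/jpeg;base64,/9j/4AAQ..."
    img.style.display = 'block';     // stack the slices vertically with no gaps
    container.appendChild(img);
  }
}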

We use caching throughout our implementation, but it’s important not to forget about client-side caching as well. By using JSONP and data URIs, we limit the number of requests made and also make sure that the browser will cache the data, so if you refresh a page or redo a query, you should get the previews, well... instantly!

Thursday 11 November 2010

Check out Google’s latest cloud technologies at Cloudstock!

There’s an exciting new event happening on December 6th, dubbed the “Woodstock for Cloud Developers.” We’ll be participating at Cloudstock, an industry event taking place at San Francisco’s Moscone West that brings together a growing developer community and some of the leading cloud technology companies (such as Google, VMware, Salesforce.com, and Amazon) to learn, hack, and network.

Google is a strong believer in the open technologies powering the web, such as HTML5. Cloud computing is about powering innovation on the web with platforms and services that make developers like you more efficient and allow you to concentrate on solving business problems. No longer do you have to worry about the hassle of acquiring and managing servers, disks, RAM, and CPU -- it’s all accessible in the cloud.

Google will be presenting the following sessions at Cloudstock:

  • Introduction to Google’s Cloud Platform Technologies (Christian Schalk)
    This talk will provide an in-depth review of Google's cloud platform technologies, first reviewing Google App Engine and App Engine for Business, followed by an introduction to Google's new cloud technologies: Google Storage, the Google Prediction API, and BigQuery. Throughout the presentation, in-depth technical demonstrations will show how to use these technologies together in an integrated manner.
  • Selling your Cloud App on the Google Apps Marketplace (Ryan Boyd)
    This demo-focused session will review how to integrate your app with Google Apps and sell it on the Google Apps Marketplace to reach 30 million users at 3 million businesses. It will dive into the SaaSy Voice demo application, showing how technologies like OpenID-enabled Single Sign-On, OAuth and AtomPub make it easy to create great user experiences for your customers.

We have another session that will be announced shortly -- stay tuned to this blog and the GoogleCode Twitter account!

Register for the free Cloudstock event at:
http://www.cloudstockevent.com/

Moscone West
San Francisco, CA
Monday, December 6th, 2010

Looking forward to meeting you there!

Monday 8 November 2010

Introducing Project Hosting comments by email

Google Project Hosting is all about helping software developers work together on source code, code reviews, issues, and wiki pages for technical documentation. In fact, the projects we host have collectively accumulated several million issue comments. Working together means that, from time to time, the ball is in your court and you need to respond to other users.

We send out notification emails to let the appropriate users know when an issue has been entered or a comment has been added to an issue, wiki page, or code review. These emails contain a link that allows you to enter your response in your project on code.google.com/p. But starting now, we are making it much easier and faster to respond to these comments by processing email replies that you send us.

So, check your inbox for new notification emails sent directly to you. When you see an email footer line that says that you can reply, just press the reply button in your email client, bang out a thoughtful response, and hit “Send”. Project committers and owners can even update an issue’s status and other values via email. For example, to let your teammates know that you are working on an urgent defect report that just came in, reply with:

Subject: Re: Issue 123 in your-project: data loss when src == dst
Status: Started
Owner: your-email-address
Label: Priority-High

Thanks for that detailed defect report. I never realized that we need to handle that case specially. I’m going to add a check for it right now.

Please try it out the next time you receive a notification email. If you have questions, see our documentation on inbound email and user groups.

Thursday 4 November 2010

Introducing the Advocate Bios and Developer Events pages

We know that developers are always interested in learning about new APIs and best practices for existing ones. And, one of the best ways to learn is face to face interaction with an expert in the subject.

Your friendly neighborhood Google Developer Relations team members work every day with the APIs you care about. We host, as well as attend, a number of events around the world to help as many developers as possible throughout the year. However, it hasn’t been easy for interested developers to find relevant events close to them.

We also realized that while many developers have met at least a couple of our Developer Advocates, it’s hard to tie an Advocate to their API expertise.

Enter the Advocate Bios and Developer Events pages.

The Advocate Bios page provides names, pictures, and short descriptions of Developer Relations team members. You can filter them by what they work on and/or where they’re based.

The Developer Events page is a mashup of the Calendar and Maps APIs, running on an App Engine backend. Want to know about upcoming Android events in Prague? Or whether Patrick Chanezon is speaking at the GDD in Munich on Nov 9th? (He is!) You can do all of that and more with the Developer Events page.

Both the bios and the events pages are conveniently linked under the Developer Resources section on the Google Code home page.

We hope to see you at the events!

Wednesday 3 November 2010

Make your websites run faster, automatically -- try mod_pagespeed for Apache

Last year, as part of Google’s initiative to make the web faster, we introduced Page Speed, a tool that gives developers suggestions to speed up web pages. It’s usually pretty straightforward for developers and webmasters to implement these suggestions by updating their web server configuration, HTML, JavaScript, CSS and images. But we thought we could make it even easier -- ideally these optimizations should happen with minimal developer and webmaster effort.

So today, we’re introducing a module for the Apache HTTP Server called mod_pagespeed to perform many speed optimizations automatically. We’re starting with more than 15 on-the-fly optimizations that address various aspects of web performance, including optimizing caching, minimizing client-server round trips and minimizing payload size. We’ve seen mod_pagespeed reduce page load times by up to 50% (an average across a rough sample of sites we tried) -- in other words, essentially speeding up websites by about 2x, and sometimes even faster.

(Video comparison of the AdSense blog site with and without mod_pagespeed)

Here are a few simple optimizations that are a pain to do manually, but that mod_pagespeed excels at:

  • Making changes to pages built by a Content Management System (CMS) with no need to make changes to the CMS itself,
  • Recompressing an image when its HTML context changes to serve only the bytes required (typically tedious to optimize manually), and
  • Extending the cache lifetime of the logo and images of your website to a year, while still allowing you to update these at any time.

We’re working with Go Daddy to get mod_pagespeed running for many of its 8.5 million customers. Warren Adelman, President and COO of Go Daddy, says:

"Go Daddy is continually looking for ways to provide our customers the best user experience possible. That's the reason we partnered with Google on the 'Make the Web Faster' initiative. Go Daddy engineers are seeing a dramatic decrease in load times of customers' websites using mod_pagespeed and other technologies provided. We hope to provide the technology to our customers soon - not only for their benefit, but for their website visitors as well.”

We’re also working with Cotendo to integrate the core engine of mod_pagespeed as part of their Content Delivery Network (CDN) service.

mod_pagespeed integrates as a module for the Apache HTTP Server, and we’ve released it as open source for Apache on many Linux distributions. Download mod_pagespeed for your platform and let us know what you think on the project’s mailing list. We hope to work with the hosting, developer, and webmaster community to improve mod_pagespeed and make the web faster.

Monday 1 November 2010

Introducing the Google APIs Console and our latest API updates

After a busy year of creating, curating, and re-organizing our APIs, we’re pleased to share that:
  • We’re announcing the Google APIs console, a new tool to help you use our APIs in your applications and on your websites.
  • We’re introducing a new and improved Custom Search API and a new Translate API, which replace the old Web Search API and the old Translate API respectively; those older APIs are being retired, along with the old Local Search API.
  • We’ve reorganized and rewritten the documentation for some of your favorite APIs (read more on the AJAX APIs Blog).

New Google APIs Console Improves API Experience

The new APIs console helps you manage your API usage across all of your sites and apps. Key features include:
  • Log in with your Google account to see the API projects you’re working on.
  • Create and manage project teams for projects that are shared with your co-workers or friends.
  • Get developer credentials to track exactly how you are using each API.
  • View information about how your site or app is using the APIs, including which of your pages are making the most requests.

Initially, the console supports over half a dozen APIs, and that number is expected to grow rapidly over time. Please take a look at the APIs console and get started using Google’s new APIs today.

New Custom Search API Delivers Better Integrated Search Experience

Google Custom Search helps you create a curated search experience, tailoring a custom search engine precisely to your specifications. This is the perfect tool for helping your visitors find exactly what they’re looking for on your site, and is especially useful for businesses that want to create a customized search experience across their public content without the expense or hassle of developing and hosting their own search infrastructure.

Today we are enhancing our Custom Search offering with the introduction of new output formats and a new API. Now, in addition to using the Custom Search element or the XML API, the new API offers search results using your choice of Atom or JSON syndication formats. To get started, click here to log into the API console and add this API to your project.
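
As a hedged sketch of what a request might look like (the endpoint path and the key, cx, and q parameter names are assumptions based on the APIs console documentation, not spelled out in this post), you could build a JSON request URL like this:

// Assumed endpoint and parameters; check the APIs console docs for the real ones.
var key = 'YOUR_API_KEY';          // developer credential from the APIs console
var cx  = 'YOUR_SEARCH_ENGINE_ID'; // identifies your custom search engine
var q   = encodeURIComponent('html5 games');

var url = 'https://www.googleapis.com/customsearch/v1' +
          '?key=' + key + '&cx=' + cx + '&q=' + q;   // add &alt=atom for Atom output

// In a browser this cross-origin request needs JSONP or a server-side proxy;
// from a server, fetch the URL and parse the JSON (or Atom) response.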

Retirement of Older APIs

As part of our ongoing housekeeping of our first-generation APIs, the legacy Web Search API and the Local Search API are being retired, to be phased out over the next three years as per our deprecation policies. We’ll also be tightening up our enforcement of the rate limits for these and for the Translate API v1 over the next few months, with an eye toward mitigating unauthorized usage. We therefore encourage everyone to migrate to the new APIs available on the APIs console, or to the Custom Search Element, the Translate Element, or the Maps API GoogleBar, as your needs dictate.

Looking Forward

We’re excited about the opportunities that the new APIs console and this first batch of APIs built on our new API architecture will offer to developers. Even though we’ve been building APIs for several years now and are quickly approaching 100 tools, products, and APIs for developers, we still feel like we’re just getting warmed up. We’d love to hear your feedback on the new Google API console and our newest APIs — please let us know what you think.