
Mobile Web performance challenges and strategies

By Ramki Krishnan, Technical Program Manager

Consumers are increasingly relying on their mobile devices to access the Web, thrusting mobile web performance into the limelight. Mobile users expect web pages to display on their mobile devices as fast as or faster than on their desktops.

As part of Google’s effort to Make The Web Faster, we invited Guy Podjarny, CTO of Blaze.io, to talk about some of the major performance concerns in the mobile web and ways to alleviate these issues. Guy’s talk focused on Front-End Optimization and highlighted 3 areas: mobile network, software, and hardware. Each of these impacts performance in myriad ways. The full video is available here, and runs just under an hour. If you don’t have time to watch this enlightening talk, this post discusses some key takeaways.

Mobile networks have high latency, and reducing the number of requests and the size of downloads are well-known optimization strategies. Guy also mentions loading images on demand: above-the-fold images load by default, while the rest load only as they scroll into view. To handle network reliability, he recommends non-blocking requests to eliminate single points of failure, along with selective aggregation of the files needed for content display. Periodic pinging of the cell tower by the client can also reduce the latency associated with dropped connections, but judicious timeouts and the battery drain on the mobile device need to be factored in.
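
As a rough sketch of the on-demand image loading Guy describes, below-the-fold images can carry a placeholder and only receive their real source as they approach the viewport. This example uses the IntersectionObserver API available in current browsers, which postdates the talk, so treat it as one possible implementation rather than the one demonstrated:

    // Lazy-load below-the-fold images. Markup is assumed to look like
    // <img class="lazy" data-src="photo.jpg" src="placeholder.gif">
    // so only above-the-fold images carry a real src at page load.
    const lazyImages = document.querySelectorAll('img.lazy');

    const observer = new IntersectionObserver((entries, obs) => {
      entries.forEach(entry => {
        if (!entry.isIntersecting) return;
        const img = entry.target;
        img.src = img.dataset.src;   // trigger the actual download
        img.classList.remove('lazy');
        obs.unobserve(img);          // each image only needs to load once
      });
    }, { rootMargin: '200px' });     // start fetching shortly before the image scrolls into view

    lazyImages.forEach(img => observer.observe(img));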

Modern mobile browsers are built with mobile constraints in mind, and they can be helped further by using localStorage to cache CSS and JavaScript files. Pipelining multiple requests on a single connection is an option, but developers need to work around head-of-line blocking, for example by splitting dynamic and static resource requests across different domains.
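
A minimal sketch of the localStorage technique, assuming a stylesheet at /css/site.css and a developer-chosen cache key (both are placeholders, not from the talk): the first visit fetches the file and stores its text, and repeat visits inject the stored copy without touching the network.

    // Cache a stylesheet in localStorage so repeat visits skip the network request.
    // The versioned key is an assumption: bump it to invalidate stale copies.
    const CACHE_KEY = 'site-css-v1';

    function applyCss(cssText) {
      const style = document.createElement('style');
      style.textContent = cssText;
      document.head.appendChild(style);
    }

    const cached = localStorage.getItem(CACHE_KEY);
    if (cached) {
      applyCss(cached);                 // repeat visit: no request at all
    } else {
      fetch('/css/site.css')            // first visit: fetch, apply, and remember
        .then(resp => resp.text())
        .then(cssText => {
          applyCss(cssText);
          try {
            localStorage.setItem(CACHE_KEY, cssText);
          } catch (e) {
            // Storage may be full or disabled; the page still works from the network.
          }
        });
    }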

Mobile CPUs are weaker than their desktop counterparts. Guy points out the need to minimize JavaScript when designing mobile-friendly web pages, to avoid reflows, and to defer JavaScript until after the page loads. Clever image rendering techniques, such as automatically resizing images to fit the device and loading the full resolution only on zoom, can also help.
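
One simple way to defer non-essential JavaScript until after the page loads, in the spirit of this advice, is to inject the script from the window load handler; the file name below is only a placeholder:

    // Defer non-critical JavaScript until the page has finished loading,
    // so it doesn't compete with rendering on a weak mobile CPU.
    window.addEventListener('load', () => {
      const script = document.createElement('script');
      script.src = '/js/non-critical.js';   // placeholder path
      script.async = true;
      document.body.appendChild(script);
    });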

Guy’s presentation makes clear that mobile web optimizations need to mitigate latencies introduced by mobile networks, software, and hardware. Rapidly changing OSes and browsers add to the challenges facing publishers. New and evolved tools and technologies will help ensure an optimal web browsing experience for mobile users.


Ramki Krishnan works at Google on the "Make The Web Faster" team. When not at work, he dreams of being a tennis pro, a humorist, and a rock drummer all rolled into one.

Posted by Scott Knaster, Editor


Google Web Toolkit 2.0 - now with Speed Tracer

Tonight at a Google Campfire One we released Google Web Toolkit 2.0, aiming to do two main things for developers:
  • Make it easier to build faster apps
  • Speed up the overall development cycle
This is a very exciting release because it's the culmination of a year and a half of working with teams like Google Wave, AdWords, and Orkut (among many others inside and outside of Google) to evolve GWT to meet the needs of today's web applications. There are many features and improvements, but let me call out three that we're especially excited about.

Faster Apps

Introducing: Performance profiling with Speed Tracer
The first thing you'll notice in 2.0 is that we've added a new tool called Speed Tracer. Speed Tracer is a performance profiler for Google Chrome that allows developers to see what's going on in a way that hasn't been possible before. We've worked closely with the WebKit community to add instrumentation to the browser, enabling developers to gain deep insights into how code behaves and uncovering problems that have been hidden until now.

Introducing: Incremental app download with code splitting
Another feature we've added to Google Web Toolkit is developer-guided code splitting. Code splitting allows a developer to split up their application for much, much faster startup times. Imagine you have a settings page that users visit once a week: why download that JavaScript when the application starts up? With code splitting, your users download just the JavaScript they need to get started.
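
In GWT itself, split points are marked with GWT.runAsync and the compiler carves the marked code into separately downloaded fragments. As a rough plain-JavaScript analogue (not GWT's API), a dynamic import() gives the same effect: the settings code lives in its own module and is fetched only when the user asks for it. The element id and exported function here are hypothetical.

    // Plain-JavaScript analogue of code splitting: settings.js is a separate
    // chunk that is only downloaded when the settings page is opened.
    document.querySelector('#settings-link').addEventListener('click', async () => {
      const settings = await import('./settings.js');  // fetched on demand
      settings.showSettingsPage();                      // hypothetical exported function
    });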

Faster Development

Introducing: Declarative UI with UiBinder
UiBinder is a new declarative UI framework in Google Web Toolkit which enables rapid design iteration and a clean separation between presentation layer and application logic.

Dive into the details and more features in GWT 2.0.




Introducing Google Public DNS: A new DNS resolver from Google

Today, as part of our efforts to make the web faster, we are announcing Google Public DNS, a new experimental public DNS resolver.

The DNS protocol is an important part of the web's infrastructure, serving as the Internet's "phone book". Every time you visit a website, your computer performs a DNS lookup. Complex pages often require multiple DNS lookups before they finish loading. As a result, the average Internet user performs hundreds of DNS lookups each day, which collectively can slow down his or her browsing experience.

We believe that a faster DNS infrastructure could significantly improve the browsing experience for all web users. To enhance DNS speed while also improving the security and validity of results, Google Public DNS is trying a few different approaches that we are sharing with the broader web community through our documentation:
  • Speed: Resolver-side cache misses are one of the primary contributors to sluggish DNS responses. Clever caching techniques can help increase the speed of these responses. Google Public DNS implements prefetching: before the TTL on a record expires, we refresh the record continuously, asynchronously, and independently of user requests for a large number of popular domains (see the sketch after this list). This allows Google Public DNS to serve many DNS requests in the round trip time it takes a packet to travel to our servers and back.

  • Security: DNS is vulnerable to spoofing attacks that can poison the cache of a nameserver and can route all its users to a malicious website. Until new protocols like DNSSEC get widely adopted, resolvers need to take additional measures to keep their caches secure. Google Public DNS makes it more difficult for attackers to spoof valid responses by randomizing the case of query names and including additional data in its DNS messages.

  • Validity: Google Public DNS complies with the DNS standards and gives the user the exact response his or her computer expects without performing any blocking, filtering, or redirection that may hamper a user's browsing experience.
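
To make the prefetching idea in the Speed bullet concrete, here is a toy sketch of a resolver-side cache that schedules a background refresh shortly before each record's TTL expires, so user queries almost always hit a warm cache. The record shape, the refresh margin, and the resolveUpstream helper are all illustrative and are not Google Public DNS internals.

    // Toy resolver cache that refreshes records before their TTL expires.
    // resolveUpstream() stands in for a real recursive lookup (illustrative only).
    const cache = new Map();            // name -> { address, expiresAt }
    const REFRESH_MARGIN_MS = 5000;     // refresh this long before expiry

    async function refresh(name) {
      const { address, ttlSeconds } = await resolveUpstream(name);
      cache.set(name, { address, expiresAt: Date.now() + ttlSeconds * 1000 });
      // Schedule the next refresh independently of any user request.
      setTimeout(() => refresh(name), ttlSeconds * 1000 - REFRESH_MARGIN_MS);
    }

    async function lookup(name) {
      const entry = cache.get(name);
      if (entry && entry.expiresAt > Date.now()) {
        return entry.address;           // warm cache: no upstream round trip
      }
      await refresh(name);              // cold miss: one upstream round trip
      return cache.get(name).address;
    }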
We hope that you will help us test these improvements by using the Google Public DNS service today, from wherever you are in the world. We plan to share what we learn from this experimental rollout of Google Public DNS with the broader web community and other DNS providers, to improve the browsing experience for Internet users globally.

For more information on Google Public DNS, you can visit our site, read our documentation, and review our logging policies. We also look forward to receiving your feedback in our discussion group.


Edmunds partners with Google to make the web faster

Note: This is a guest post from Ismail Elshareef, who is the Principal Architect at Edmunds.com. Thanks for the post and for making the web faster, Ismail!

In the fall of 2008, we embarked on a complete redesign of our car enthusiast site, insideline.com. One of the main redesign objectives was to deliver the fastest page load possible to our consumers. Leading up to that point, we had been closely following and implementing the performance best practices championed by Google's Make the Web Faster team and others. We understood the impact performance has on user experience and the bottom line.

Some of the many performance-enhancing features that have been implemented on insideline.com (and now on our beta.edmunds.com) are:
  1. Reducing the number of HTTP requests: We combined CSS and JavaScript files as necessary, and used sprites and data URIs when appropriate. We also reduced the number of blocking requests as much as possible to make the pages "feel" faster.
  2. Serving static content from different domains: This helped maximize the browser's parallel download capacity and made the request payload smaller, since no cookies were sent over the wire to those domains.
  3. Using Expires headers: Caching static files in the client's browser to eliminate unnecessary, redundant requests to our servers.
  4. Lazy-loading Page Modules: Render the bare minimum page components first so that the user sees something on the page, and then go through the modules and load them in order of priority (see the sketch after this list). We developed a JavaScript Loader component to help us accomplish that, which you can read more about on the Edmunds technology blog.
  5. Managing 3rd-party components: iFrame components could be lazy-loaded without a problem. JavaScript components, on the other hand, need to be loaded onto the page before the onLoad event fires, which had the potential of slowing down our pages. The solution we devised was to delay calling those components until we initiate the lazy-loading of modules, right before the onLoad event fires.
  6. Using non-blocking calls: With the browser being a single-threaded process, we optimized ways of including resources on the page without affecting page rendering, so that the page is perceived to be fast by the user.
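
The JavaScript Loader mentioned in item 4 is described on the Edmunds technology blog; the sketch below is only a generic illustration of loading page modules in priority order after the core content has rendered, with made-up module names and a made-up loadScript helper rather than Edmunds' actual code.

    // Generic sketch of priority-ordered module lazy loading (not Edmunds' actual Loader).
    const modules = [
      { src: '/js/vehicle-pricing.js',  priority: 1 },   // placeholder module names
      { src: '/js/photo-gallery.js',    priority: 2 },
      { src: '/js/related-articles.js', priority: 3 },
    ];

    function loadScript(src) {
      return new Promise((resolve, reject) => {
        const s = document.createElement('script');
        s.src = src;
        s.onload = resolve;
        s.onerror = reject;
        document.body.appendChild(s);
      });
    }

    // After the bare-minimum page has rendered, load modules one at a time
    // in priority order so the most important content fills in first.
    window.addEventListener('load', async () => {
      const ordered = [...modules].sort((a, b) => a.priority - b.priority);
      for (const m of ordered) {
        try {
          await loadScript(m.src);
        } catch (e) {
          // A failed module should not block the ones behind it.
        }
      }
    });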

The results on insideline.com have been incredible. Page load time went from 9 seconds on average on the old site to 1.5 seconds on average on the new one, and that's while loading much richer content onto the page (measured with WebPageTest). We have also seen a 3% increase in ad revenue. On beta.edmunds.com, which will replace our legacy site fully in December 2010, we have seen a 17% increase in page views and a 2% reduction in the bounce rate for our landing pages in a controlled experiment.

Although we have a long way to go in making our pages and services faster, we are very pleased with the progress we've made so far. Working with Google to make the web faster has been an exciting adventure that will continue with more improvements and innovations, both for our sites and for the web as a whole. Get more details on the Edmunds technology blog and try these enhancements on your site today.
