
Turbocharging web sites with new PageSpeed Service optimizations

By Rahul Bansal and Kishore Simbili, PageSpeed Team

We spend a lot of time working to make the web faster. Last year, we introduced PageSpeed Service, an online service that automatically speeds up loading of web pages.

We are constantly working on new optimizations (rewriters) that can make pages load even faster. Along these lines, we are introducing a new rewriter called "Cache and Prioritize Visible Content". This rewriter enables users to start interacting with the web page and consuming the content much sooner. It accomplishes this by optimizing the page as a whole using the following web page-aware techniques and with minimal configuration needed:
  • Make HTML cacheable. Typically, most web pages are not cached because they contain small amounts of personalized information or other non-cacheable data. This rewriter separates the non-cacheable portions from the HTML and enables caching for the rest of the content on PageSpeed servers. When the page is loaded, PageSpeed servers send the cacheable parts immediately while non-cacheable parts are fetched from the origin server & patched into the browser later.
  • Prioritize visible content rendering. Rendering of a modern web page requires several network resources, but not all of them are needed right away. This rewriter automatically determines and prioritizes the content that is above the fold of the browser, so that it doesn’t have to compete with the rest of the page.
  • Defer Javascript. JavaScript execution is deferred until page load so that it doesn’t block rendering of visible content.

Early deployment of these techniques has shown significant improvements in user-perceived page load times. Below is a filmstrip view that compares the loading of pages on Power Line, a US-based political commentary website.


Joe Malchow, Publisher of Power Line says "With this rewriter the most important bytes, our content, load first and fast. To our readers, Power Line appears to be completely instantaneous, prompting deeper and lengthier reading sessions and more profound engagement with the site."

This rewriter works best when the page content is mostly generated on the server rather than via Javascript and only small portions of it are personalized. To see how this rewriter would benefit your site, you can check it out here. If you are satisfied with the results, you can sign up for PageSpeed Service here. If you already use PageSpeed Service, you can find more details about enabling this rewriter here. This rewriter will also be available to App Engine users of PageSpeed Service in the near future.


Rahul Bansal and Kishore Simbili are Software Engineers on Google’s PageSpeed Team in Bangalore, India, which is dedicated to making the web faster.

Posted by Scott Knaster, Editor

Measure and optimize with mod_pagespeed experiments

By Jeff Kaufman, Software Engineer, PageSpeed Team

Making your site fast shouldn’t require lots of manual optimization. With mod_pagespeed, an open-source Apache module, you can automatically apply web performance optimization best practices like cache extension, image optimization, and css inlining to speed up your site without a lot of hassle. As of version 0.10.22.4, mod_pagespeed now supports A/B tests integrated with Google Analytics, allowing you to measure how much it speeds up your site on live traffic and experimentally determine the best settings.

When running an experiment, mod_pagespeed randomly assigns visitors to experimental configurations based on percentages you choose. You can run an experiment on 1% of your traffic, 100%, or anywhere in between without affecting other visitors. It also injects JavaScript to report experiment assignments back to your Google Analytics account in a custom variable. Within Analytics you can track the impact of experimental configurations on page load times, bounce rates, conversions, or any other Analytics metric.

We ran an example experiment, comparing mod_pagespeed running with default settings to mod_pagespeed in pass-through mode, on a small blog. This required adding the following lines to our pagespeed.conf:
ModPagespeedRunExperiment on
ModPagespeedAnalyticsID "UA-XXXXXXXX-Y"

# half the users get the pagespeed optimizations
ModPagespeedExperimentSpec id=3;percent=50;default

# half get an unoptimized site
ModPagespeedExperimentSpec id=4;percent=50
While this site was static and contained mostly text, it did use some JavaScript and images and had not been manually optimized. We ran the experiment for a month, over which Analytics observed 11K page views, and we saw a 20% improvement in average page load time:


[Figure: experiment results]

Average page load time is sensitive to outliers, however, so to better understand the effects it’s helpful to check a histogram:


[Figure: detailed experiment results]

The clearest change is that mod_pagespeed moved about 7% of page loads from taking 1-3 seconds down to 0-1 second, but there is also an improvement in the long tail.

We encourage you to follow the experiment framework guide and start measuring the effect mod_pagespeed has on your site.


Jeff Kaufman works on mod_pagespeed, an open-source Apache module that helps make the web faster, and is interested in experiment measurement. He also plays for contra dances, organizes other dances, and blogs about dancing, giving, and tech.

Posted by Scott Knaster, Editor

Page Speed Service: Web performance, delivered.

By Ram Ramani, Engineering Manager

Update 7/29/11: We were notified of a bug in the measurement tool that sometimes causes incorrect measurements. If your results indicated a slowdown on your pages, please run the tests again, and make sure you specify a fully qualified domain such as www.example.com. We apologize for any inconvenience and confusion this may have caused.

Details:
Measurement tests run for bare domains (such as example.com, without the www prefix) previously indicated that pages were loading more slowly, rather than speeding up, when using Page Speed Service. If this error applies to you, the test results page now displays a prominent notice when you visit it, so please recheck your old measurement results. Running the tests again with a fully qualified domain such as www.example.com usually fixes the issue and gives you a correct measurement.


Two years ago we released the Page Speed browser extension and earlier this year the Page Speed Online API to provide developers with specific suggestions to make their web pages faster. Last year we released mod_pagespeed, an Apache module, to automatically rewrite web pages. To further simplify the life of webmasters and to avoid the hassles of installation, today we are releasing the latest addition to the Page Speed family: Page Speed Service.

Page Speed Service is an online service that automatically speeds up loading of your web pages. To use the service, you need to sign up and point your site’s DNS entry to Google. Page Speed Service fetches content from your servers, rewrites your pages by applying web performance best practices, and serves them to end users via Google's servers across the globe. Your users will continue to access your site just as they did before, only with faster load times. Now you don’t have to worry about concatenating CSS, compressing images, caching, gzipping resources or other web performance best practices.

In our testing we have seen speed improvements of 25% to 60% on several sites. But we know you care most about the numbers for your site, so check out how much Page Speed Service can speed up your site. If you’re encouraged by the results, please sign up. If not, be sure to check back later. We are diligently working on adding more improvements to the service.

At this time, Page Speed Service is being offered to a limited set of webmasters free of charge. Pricing will be competitive and details will be made available later. You can request access to the service by filling out this web form.

Ram Ramani is an Engineering Manager on the Make the Web Faster Team in Bangalore, India. He is a believer in "Faster is better".

Posted by Scott Knaster, Editor


Lightning fast! Performance tips for using Google APIs

By Anton Lopyrev and Sven Mawson, Google Developer Team

Over a year ago, we launched support for partial response and partial update for a number of APIs based on the Google Data Protocol. That launch was a part of our continuous effort to make the web faster. It was well received by our developer community as it significantly reduced network, memory, and CPU resources needed to work with certain Google APIs.

Today, we are adding support for partial response and an improved version of partial update, called patch, to a number of newer APIs such as Buzz, URL Shortener, Tasks and many others. In fact, all APIs available in the Google APIs Discovery Service and the APIs Explorer now support this feature.

To learn how to use partial response and partial update with a Google API, you can see the “Performance Tips” page in the documentation of the Tasks and Buzz APIs. We’ll roll out this page for all of the supported APIs over the next few months, but you can already use the algorithms with all of them today.

The partial response algorithm is identical to what was provided by the Google Data Protocol. By supplying a fields query parameter to any API call that returns data, you can request specific fields. Here is an example request that returns only titles and timestamps of a user’s public Buzz activities:
https://www.googleapis.com/buzz/v1/activities/antonlopyrev@gmail.com/@public?alt=json&pp=1&fields=items(title,updated)
Given that the full response is around 53KB and the partial response is only 3KB, the data sent to the client is reduced by almost 95%!

While the partial response algorithm is unchanged, the partial update algorithm has changed significantly compared to what was provided by Google Data Protocol. We’ve received feedback that the old algorithm was too complicated and hard to use, which prompted us to design something much simpler. The basics remain the same: you can use the HTTP PATCH verb in supported API methods to send partial updates to Google servers. However, the mechanics are different. Adding and modifying data uses the same 'merge' semantics as before. But deleting is simplified; just set a field to 'null'. Of course, the devil is in the details, so please check out the documentation for the nitty gritty.
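
To make the patch semantics concrete, here is a minimal sketch of a PATCH request as I would write it against one of these APIs; the endpoint path, field names, and token handling are illustrative placeholders rather than values taken from the documentation.

var accessToken = "YOUR_OAUTH_TOKEN";  // placeholder; obtain this through your normal OAuth flow
var patchBody = JSON.stringify({
  title: "Updated task title",  // merged into the stored resource
  notes: null                   // setting a field to null deletes it
});

var xhr = new XMLHttpRequest();
xhr.open("PATCH", "https://www.googleapis.com/tasks/v1/lists/LIST_ID/tasks/TASK_ID", true);
xhr.setRequestHeader("Content-Type", "application/json");
xhr.setRequestHeader("Authorization", "Bearer " + accessToken);
xhr.onload = function() {
  console.log("Patched resource:", xhr.responseText);
};
xhr.send(patchBody);

Only the two fields in the body travel over the wire, which is the whole point: the rest of the resource is left untouched on the server.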

You can try out both partial response and patch algorithms in the APIs Explorer. For partial responses, the fields parameter is available for most methods. In addition, the partial update methods are denoted by .patch in the method name. You can try both the fields parameter and the patch method on the “tasklist” resource in the APIs explorer.

If you are using Java or Python client libraries to access Google APIs, you can already ask for partial responses and send patch requests in the code. We are adding partial support to the rest of the Google APIs client libraries over time.

As our APIs get more and more use from devices with limited resources, taking advantage of performance optimizations such as partial response and patch is crucial for making your applications faster and more efficient. By using these features in your applications, you are joining us in our effort to make the web faster. For this, we thank you! Let us know of any issues and feature requests by posting to the developer forums of your favorite APIs or by leaving a comment on this post. Happy hacking!

Anton Lopyrev is an Associate Product Manager for Google APIs Infrastructure. He is a computer graphics enthusiast who is also passionate about product design.

Sven Mawson is a Software Engineer working on Google’s API Infrastructure. He believes well-designed, beautiful APIs need not sacrifice performance.


Posted by Scott Knaster, Editor


EdgeCast Networks makes the web faster with Google’s mod_pagespeed

By Joshua Marantz, Google PageSpeed Team, and Hayes Kim, EdgeCast Networks

At Google we want the whole web to be faster. We've built a fast browser, improved image encodings, developed better network protocols, and provided PageSpeed tools and optimization libraries. In November 2010, we launched mod_pagespeed, an open-source Apache module that speeds up web sites by rewriting HTML, JavaScript, CSS and images to reduce size, eliminate HTTP requests, and improve browser performance.

mod_pagespeed adoption is growing rapidly. Now EdgeCast Networks, one of the world’s largest CDN operators, has integrated mod_pagespeed into the core of its content delivery network and is making it available as an option in its Application Delivery Network (ADN) service offering.

Hayes Kim, EdgeCast Senior Product Manager, had this to say: "Edgecast has integrated mod_pagespeed alongside our HTTP engine and deployed this to our ADN edge locations worldwide. Our solution enables optimizations in real-time and local to the end user, leveraging the full compute capacity of our edge nodes. We leverage the local edge caches for the unoptimized resources and then cache the subsequent optimized resources processed by mod_pagespeed. EdgeCast's integration can speed up millions of websites either served directly by EdgeCast or indirectly through hosting providers using our technology."

Hayes says that early results show up to a 77% pageview performance improvement when leveraging the ADN service with mod_pagespeed, and a 33% performance improvement from mod_pagespeed alone.

Gogotech, an e-commerce solution provider, has been evaluating EdgeCast's ADN and edge optimizer services, with promising results so far. "This solution looks to be a strong contender for further improving our offerings to Gogotech clients, and we are looking forward to seeing it develop," said Alex Bolduc, IT Director at Gogotech.

The following images and this video show how mod_pagespeed and EdgeCast's ADN are speeding up a Gogotech site.

[Figure: comparative graph showing improvement with PageSpeed]

You can find more details about EdgeCast's mod_pagespeed integrated offerings here. And you can find information on Google’s PageSpeed technologies and tools here.


Joshua Marantz is a Software Engineer on Google’s Pagespeed Automatic team in Cambridge, MA, which is dedicated to making the web faster for everyone. Josh has been working on making software run fast for several decades, at Google and before that on accelerated chip simulation.

Hayes Kim has over eleven years of product development and leadership experience in online advertising, e-commerce, web acceleration, and social media. At EdgeCast, Hayes manages the development of the core HTTP technology that powers the CDN and Application Delivery Network.


Posted by Scott Knaster, Editor

Let's make the web faster

From building data centers in different parts of the world to designing highly efficient user interfaces, we at Google always strive to make our services faster. We focus on speed as a key requirement in product and infrastructure development, because our research indicates that people prefer faster, more responsive apps. Over the years, through continuous experimentation, we've identified some performance best practices that we'd like to share with the web community on code.google.com/speed, a new site for web developers, with tutorials, tips and performance tools.

We are excited to discuss what we've learned about web performance with the Internet community. However, to optimize the speed of web applications and make browsing the web as fast as turning the pages of a magazine, we need to work together as a community, to tackle some larger challenges that keep the web slow and prevent it from delivering its full potential:
  • Many protocols and formats that power the Internet and the web were developed when broadband and rich interactive web apps were in their infancy. Networks have become much faster in the past 20 years, and by collaborating to update foundational technologies such as TCP/IP and HTML we can create a better web experience for everyone. A great example of the community working together is HTML5. With HTML5 features such as AppCache, developers are now able to write JavaScript-heavy web apps that run instantly and work and feel like desktop applications.

  • In the last decade, we have seen close to a 100x improvement in JavaScript speed. Browser developers and the communities around them need to maintain this recent focus on performance improvement in order for the browser to become the platform of choice for more feature-rich and computationally-complex applications.

  • Many websites can become faster with little effort, and collective attention to performance can speed up the entire web. Tools such as Yahoo!'s YSlow and our own recently launched Page Speed help web developers create faster, more responsive web apps. As a community, we need to invest further in developing a new generation of tools for performance measurement, diagnostics, and optimization that work at the click of a button.

  • While there are now more than 400 million broadband subscribers worldwide, broadband penetration is still relatively low in many areas of the world. Steps have been taken to bring the benefits of broadband to more people, such as the FCC's decision to open up the white spaces spectrum, for which the Internet community, including Google, was a strong champion. Bringing the benefits of cheap reliable broadband access around the world should be one of the primary goals of our industry.
To find out what Googlers think about making the web faster, see the video below. If you have ideas on how to speed up the web, please share them with the rest of the community. Let's all work together to make the web faster!





(Cross-posted on the Official Google Blog, and the Google Webmaster Central Blog)

Tracking performance with HTTP Archive

By Arvind Jain, Make the Web Faster Team

At Google, we put a lot of effort into making the web faster. To understand the impact of our work, we need to track the speed of the web over time. HTTP Archive allows us to do that.

HTTP Archive generates regular reports illustrating trends such as page size and Page Speed score of the top pages on the web. Interested users can download the raw dataset for free, modify the source code to perform their own analyses, and unearth valuable trends.

HTTP Archive crawls the world’s top 18,000 URLs, with a plan to increase that number to a million or more in the coming months.

Google engineers built HTTP Archive as an open source service. We are now transitioning the ownership and maintenance of it to the Internet Archive. Google is proud to support the continued development of HTTP Archive and to help create a rich repository of data that developers can use to conduct performance research.

Arvind Jain founded and leads the Make the Web Faster initiative at Google. As part of that initiative, Arvind also started the Instant Pages effort, just announced yesterday.

Posted by Scott Knaster, Editor

Gmail for Mobile HTML5 Series: Suggestions for Better Performance

On April 7th, Google launched a new version of Gmail for mobile for iPhone and Android-powered devices. We shared the behind-the-scenes story through this blog and decided to share more of our learnings in a brief series of follow-up blog posts. This week, I'll talk about a few small things you can do to improve performance of your HTML5-based applications. Our focus here will be on performance bottlenecks related to the database and AppCache.

Optimizing Database Performance

There are hundreds of books written about optimizing SQL and database performance, so I won't bother to get into these details, but instead focus on things which are of particular interest for mobile HTML5 apps.

Problem: Creating and deleting tables is slow! It can take upwards of 200 ms to create or delete a table. This means a simple database schema with 10 tables can easily take 2-4 seconds (or more!) just to delete and recreate the tables. Since this often needs to be done at startup time, this really hurts your launch time.

Solution: Smart versioning and backwards-compatible schema changes (whenever possible). A simple way of doing this is to have a VERSION table with a single row that includes the version number (e.g., 1.0). For backwards-compatible version changes, just update the number after the decimal (e.g., 1.1) and apply any updates to the schema. For changes that aren't backwards compatible, update the number before the decimal (e.g., 2.0), at which point you can drop all the tables and recreate them. With a reasonable schema design to begin with, it should be very rare that a schema change is not backwards compatible, and even if this happens every month or so, users should get to use your application 20, 30, or even 100 times before they hit this startup delay again. If your schema changes very infrequently, a simple 1, 2, 3 versioning scheme will probably work fine; just make sure to only recreate the database when the version changes!
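
Here is a minimal sketch of that versioning scheme, assuming the Web SQL openDatabase API available in mobile WebKit at the time; the database name, table layout, and the three helper functions are hypothetical stand-ins for your own schema code.

var CURRENT_VERSION = {major: 2, minor: 1};

// Hypothetical helpers: replace the comments with your real schema statements.
function createAllTables(tx) { /* CREATE TABLE ... for the full schema */ }
function dropAndRecreateAllTables(tx) { /* DROP TABLE ... then recreate everything */ }
function applyBackwardsCompatibleUpdates(tx) { /* e.g. ALTER TABLE ... ADD COLUMN */ }

var db = openDatabase("myapp", "", "My App DB", 5 * 1024 * 1024);
db.transaction(function(tx) {
  tx.executeSql("CREATE TABLE IF NOT EXISTS VERSION (major INTEGER, minor INTEGER)");
  tx.executeSql("SELECT major, minor FROM VERSION", [], function(tx, results) {
    if (results.rows.length === 0) {
      createAllTables(tx);                    // first run: build the schema once
      tx.executeSql("INSERT INTO VERSION VALUES (?, ?)",
                    [CURRENT_VERSION.major, CURRENT_VERSION.minor]);
    } else if (results.rows.item(0).major < CURRENT_VERSION.major) {
      dropAndRecreateAllTables(tx);           // incompatible change: pay the full cost once
      tx.executeSql("UPDATE VERSION SET major = ?, minor = ?",
                    [CURRENT_VERSION.major, CURRENT_VERSION.minor]);
    } else if (results.rows.item(0).minor < CURRENT_VERSION.minor) {
      applyBackwardsCompatibleUpdates(tx);    // cheap, in-place tweak
      tx.executeSql("UPDATE VERSION SET minor = ?", [CURRENT_VERSION.minor]);
    }
  });
});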

Problem: Queries are slow! Queries are faster than creates and updates, but they can still take 100ms-150ms to execute. It's not uncommon for traditional applications to execute dozens or even hundreds of queries at startup – on mobile this is not an option.

Solution: Defer and/or combine queries. Any queries that can be deferred from startup (or at any other significant point in the application) should be deferred until the data is absolutely needed. Adding 2-3 more queries to a user-driven operation can turn an action from appearing instantaneous to feeling unresponsive. Any queries that are performed at startup should be optimized to require as few hits to the database as possible. For example, if you're storing data about books and magazines, you could use the following two queries to get all the authors along with the number of books and magazine articles they've written:

SELECT Author, COUNT(*) as NumArticles
FROM Magazines
GROUP BY Author
ORDER BY NumArticles;

SELECT Author, COUNT(*) as NumBooks
FROM Books
GROUP BY Author
ORDER BY NumBooks;


This will work fine, but the additional query will generally cost you an extra 100-200 ms compared with a single (albeit less pretty) combined query like:

SELECT Author, NumPublications, PubType
FROM (
  SELECT Author, COUNT(*) as NumPublications, 'Magazine' as PubType, 0 as SortIndex
  FROM Magazines
  GROUP BY Author
  UNION
  SELECT Author, COUNT(*) as NumPublications, 'Book' as PubType, 1 as SortIndex
  FROM Books
  GROUP BY Author
)
ORDER BY SortIndex, NumPublications;

This will return all the entries we want, with the magazine entries first in increasing order of number of articles, followed by the book entries, in increasing order of the number of books. This is a toy example and there are clearly other ways of improving this, such as merging the Magazines and Books tables, but this type of scenario shows up all the time. There's always a trade-off between simplicity and speed when dealing with databases, but in the case of HTML5 on mobile, this trade-off is even more important.

Problem: Multiple updates are slow!

Solution: Use Triggers whenever possible. When the result of a database update requires updating other rows in the database, try to do it via SQL triggers. For example, let's say you have a table called Books listing all the books you own and another called Authors storing the names of all the authors of books you own. If you give a book away, you'll want to remove it from the Books table. However, if this was the only book you owned by that author, you would also want to remove the author from the Authors table. This can be done with two separate statements, but a "better" way is to write a trigger that automatically deletes the author from the Authors table when the last book by this author is removed. This will execute faster, and because triggers happen asynchronously in the background, it will have less of an impact on the UI than executing two statements. Here's an example of a simple trigger for this case:

CREATE TRIGGER IF NOT EXISTS RemoveAuthor
AFTER DELETE ON Books
BEGIN
  DELETE FROM Authors
  WHERE Author NOT IN
    (SELECT Author FROM Books);
END;

We'll get into more detail on triggers and how to use them in another performance post to come.

Optimizing AppCache Performance

Problem: Logging in is slow!

Solution: Avoid redirects to the login page. AppCache is great because it can launch the application without needing to hit the network, which makes it much faster and allows you to launch offline. One problem you might encounter, though, is that the application will launch and then you'll need to hit the network to get some data for the current user. At this point you'll have to check that the user is authenticated, and it might turn out that they're not (e.g., their cookies might have expired or have been deleted). One option is to redirect the user to a login page somewhere, allow them to authenticate, and then redirect them back to the application. Regardless of whether or not the login page is listed in the manifest, when it redirects back to your application, the entire application will reload. A nicer approach is for the application itself to display an authentication interface that sends the credentials and does the authentication seamlessly in the background. This avoids any additional reloads of the application and makes everything feel faster and better integrated.
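
A bare-bones sketch of that in-app approach follows; the /login endpoint and form encoding are my own illustration, not Gmail's actual implementation.

function loginInBackground(username, password, onSuccess, onFailure) {
  var xhr = new XMLHttpRequest();
  xhr.open("POST", "/login", true);  // hypothetical authentication endpoint
  xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
  xhr.onreadystatechange = function() {
    if (xhr.readyState === 4) {
      // Stay inside the already-loaded app either way: no redirect, no reload.
      (xhr.status === 200 ? onSuccess : onFailure)();
    }
  };
  xhr.send("user=" + encodeURIComponent(username) +
           "&pass=" + encodeURIComponent(password));
}

On success you simply retry the data request that originally failed; on failure you show the login form again, and the AppCached application never reloads.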

Problem: AppCache reloading causes my app to be slow!

Solution: List as few URLs in the manifest as possible. In a series of posts on code.google.com, we talked about the HTML5 AppCache manifest file. An important aspect of the manifest file is that when the version gets updated, all the URLs listed in the file are fetched again. This happens in the background while the user is using the application, but opening all these network connections and transferring all that data can cause the application to slow down considerably during this process. Try to setup your application so that all the resources can be fetched from as few URLs as possible to speed up the manifest download and minimize this effect. Of course you could also just never update your manifest version, but what's the point of having rapid development if you never make any changes?
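
When you do bump the manifest version, you can at least react gracefully once the background refetch finishes. The sketch below uses the standard applicationCache events; whether you prompt the user or silently swap is up to you.

var appCache = window.applicationCache;
appCache.addEventListener("updateready", function() {
  if (appCache.status === appCache.UPDATEREADY) {
    appCache.swapCache();  // start serving the freshly downloaded resources
    if (window.confirm("A new version of this app is available. Reload now?")) {
      window.location.reload();
    }
  }
}, false);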


That's a brief intro to some performance considerations when developing HTML5 applications. These are all issues that we ran into ourselves and have either fixed or are in the process of fixing in our application. I hope this helps you to avoid some of the issues we ran into and makes your application blazing fast!

We plan to write several more performance related posts in the future, but for now stay tuned for next post where we'll discuss the cache pattern for building offline capable web applications.



Previous posts from Gmail for Mobile HTML5 Series
HTML5 and Webkit pave the way for mobile web applications
Using AppCache to launch offline - Part 1
Using AppCache to launch offline - Part 2
Using AppCache to launch offline - Part 3
A Common API for Web Storage

Page Speed for ads and trackers

At Google, we're passionate about making the web faster. To help web page owners optimize their pages for speed, we open-sourced the Page Speed web performance tool a year ago. Today, we're excited to launch a new Page Speed feature: Page Speed for ads, such as display and rich media ads, and trackers, also known as analytics.

Page Speed now enables developers to run a performance analysis of the ads, the trackers, or the remaining content of the page. Web developers can use Page Speed to determine how ads and trackers impact the performance of their web pages, and ad and tracker providers can use this feature to tune their services for speed.

For instance, when analyzing an example web page, Page Speed displays several suggestions that we can apply to make the page faster:


But which of these suggestions applies to the content on the page that we authored? Which apply to the ads and trackers? Using the "Analyze" menu, we can determine that, in this example, the ads are contributing to slowing down the page:


When we switch to analyzing only the content of the page that we authored, the score improves to 93. In this case, we can enable compression for a resource that is currently served uncompressed.


We hope that you try these and other new features and rules of Page Speed and find them useful to further optimize the speed of your web pages.

Please share your experience using this new feature in our discussion forum.


Nicholas C. Zakas: Speed Up Your JavaScript

Nicholas C. Zakas delivers the seventh Web Exponents tech talk at Google. Nicholas is a JavaScript guru and author working at Yahoo!. Most recently we worked together on my next book, Even Faster Web Sites. Nicholas contributed the chapter on Writing Efficient JavaScript, containing much of the sage advice found in this talk. Check out his slides and watch the video.



Nicholas starts by asserting that users have a greater expectation that sites will be fast. Web developers need to do most of the heavy lifting to meet these expectations. Much of the slowness in today's web sites comes from JavaScript. In this talk, Nicholas gives advice in four main areas: scope management, data access, loops, and DOM.

Scope Management: When a symbol is accessed, the JavaScript engine has to walk the scope chain to find that symbol. The scope chain starts with local variables, and ends with global variables. Using more local variables and fewer global variables results in better performance. One way to move in this direction is to store a global as a local variable when it's referenced multiple times within a function. Avoiding the with statement also helps, because it adds more layers to the scope chain. And make sure to use var when declaring local variables; otherwise they'll end up in the global scope, which means longer access times.
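
A short illustration of those points (my own example, not code from the talk): the global document object is copied into a local variable once, and every local is declared with var.

function buildList(items) {
  var doc = document;                  // global looked up once, then read locally
  var list = doc.createElement("ul");
  for (var i = 0; i < items.length; i++) {   // var keeps i out of the global scope
    var li = doc.createElement("li");
    li.appendChild(doc.createTextNode(items[i]));
    list.appendChild(li);
  }
  return list;
}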

Data Access: In JavaScript, data is accessed four ways: as literals, variables, object properties, and array items. Literals and variables are the fastest to access, although the relative performance can vary across browsers. Similar to global variables, performance can be improved by creating local variables to hold object properties and array items that are referenced multiple times. Also, keep in mind that deeper object property and array item lookup (e.g., obj.name1.name2.name3) is slower.

Loops: Nicholas points out that for-in and for each loops should generally be avoided. Although they provide convenience, they perform poorly. The choices when it comes to loops are for, do-while, and while. All three perform about the same. The key to loops is optimizing what is performed at each iteration in the loop, and the number of iterations, especially paying attention to the previous two performance recommendations. The classic example here is storing an array's length as a local variable, as opposed to querying the array's length property on each iteration through a loop.
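
For example (again my own sketch), caching the length means the property is read once instead of on every pass through the loop:

function sumValues(values) {
  var total = 0;
  for (var i = 0, len = values.length; i < len; i++) {  // length read only once
    total += values[i];
  }
  return total;
}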

DOM: One of the primary areas for optimizing your web application's interaction with the DOM is how you handle HTMLCollection objects: document.images, document.forms, etc., as well as the results of calling getElementsByTagName() and getElementsByClassName(). As noted in the HTML spec, HTMLCollections "are assumed to be live meaning that they are automatically updated when the underlying document is changed." Any idea how long this code takes to execute?

var divs = document.getElementsByTagName("div");
for (var i=0; i < divs.length; i++) {
  var div = document.createElement("div");
  document.body.appendChild(div);
}

This code results in an infinite loop! Each time a div is appended to the document, the divs array is updated, incrementing the length so that the termination condition is never reached. It's best to think of HTMLCollections as live queries instead of arrays. Minimizing the number of times you access HTMLCollection properties (hint: copy length to a local variable) is a win. It can also be faster to copy the HTMLCollection into a regular array when the contents are accessed frequently (see the slides for a code sample).
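
Two ways to make the loop above terminate, sketched from the advice in the talk rather than copied from the slides: cache the collection's length, or snapshot the live collection into a plain array before iterating.

var divs = document.getElementsByTagName("div");

// Fix 1: cache the length so the newly appended divs can't extend the loop.
for (var i = 0, len = divs.length; i < len; i++) {
  document.body.appendChild(document.createElement("div"));
}

// Fix 2: copy the live collection into a regular array and iterate over that.
function toArray(collection) {
  var arr = [];
  for (var j = 0, n = collection.length; j < n; j++) {
    arr.push(collection[j]);
  }
  return arr;
}
var divSnapshot = toArray(document.getElementsByTagName("div"));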

Another area for improving DOM performance is reflow - when the browser computes the page's layout. This happens more frequently than you might think, especially for web applications with heavy use of DHTML. If you have code that makes significant layout changes, consider making the changes within a DocumentFragment or setting the className property to alter styles.
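
A minimal sketch of the DocumentFragment technique: all the new nodes are assembled off-document, so the browser reflows once instead of a hundred times.

var fragment = document.createDocumentFragment();
for (var k = 0; k < 100; k++) {
  var row = document.createElement("div");
  row.className = "row";            // style via a class rather than many style writes
  fragment.appendChild(row);
}
document.body.appendChild(fragment);  // the single layout-affecting insertion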

There is hope for a faster web as browsers come equipped with JIT compilers and native code generation. But the legacy of previous, slower browsers will be with us for quite a while longer. So hang in there. With evangelists like Nicholas in the lead, it's still possible to find your way to a fast, efficient web page.


Check out other blog posts and videos in the Web Exponents speaker series.

Introducing Page Speed

At Google, we focus constantly on speed; we believe that making our websites load and display faster improves the user's experience and helps them become more productive. Today, we want to share with the web community some of the best practices we've used and developed over the years, by open-sourcing Page Speed.

Page Speed is a tool we've been using internally to improve the performance of our web pages -- it's a Firefox Add-on integrated with Firebug. When you run Page Speed, you get immediate suggestions on how you can change your web pages to improve their speed. For example, Page Speed automatically optimizes images for you, giving you a compressed image that you can use immediately on your web site. It also identifies issues such as JavaScript and CSS that your page loads but doesn't actually use to display the page, which can help reduce the time your users spend waiting for the page to download and display.

Page Speed's suggestions are based on a set of commonly accepted best practices that we and other websites implement. To help you understand the suggestions and rules, we have created detailed documentation to describe the rationale behind each of the rules. We look forward to your feedback on the Webmaster Help Forum.

We hope you give Page Speed a try.


Page Speed Online has a shiny new API

By Andrew Oates and Richard Rabbat, Page Speed Team

A few weeks ago, we introduced Page Speed Online, a web-based performance analysis tool that gives developers optimization suggestions. Almost immediately, developers asked us to make an API available to integrate into other tools and their regression testing suites. We were happy to oblige.

Today, as part of Google I/O, we are excited to introduce the Page Speed Online API as part of the Google APIs. With this API, developers now have the ability to integrate performance analysis very simply in their command-line tools and web performance dashboards.

We have provided a getting started guide that helps you to get up and running quickly, understand the API, and start monitoring the performance improvements that you make to your web pages. Not only that, in the request, you’ll be able to specify whether you’d like to see mobile or desktop analysis, and also get Page Speed suggestions in one of the 40 languages that we support, giving API access to the vast majority of developers in their native or preferred language.
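
As a rough sketch, a request from the browser looks something like this; the endpoint and the url, strategy, and locale parameters reflect my reading of the getting started guide, so treat them as assumptions and confirm against the current documentation.

var endpoint = "https://www.googleapis.com/pagespeedonline/v1/runPagespeed" +
    "?url=" + encodeURIComponent("http://www.example.com/") +
    "&strategy=mobile" +       // or "desktop"
    "&locale=fr" +             // suggestions in any supported language
    "&key=YOUR_API_KEY";       // placeholder for your Google APIs key

var xhr = new XMLHttpRequest();
xhr.open("GET", endpoint, true);
xhr.onload = function() {
  var result = JSON.parse(xhr.responseText);
  console.log("Page Speed score:", result.score);  // top-level score field, per the v1 response
};
xhr.send();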

We’re also pleased to share that the WordPress plugin W3 Total Cache now uses the Page Speed Online API to provide Page Speed suggestions to WordPress users, right in the WordPress dashboard. “The Page Speed tool itself provides extremely pointed and valuable insight into performance pitfalls. Providing that tool via an API has allowed me to directly correlate that feedback with actionable solutions that W3 Total Cache provides,” said Frederick Townes, CTO of Mashable and author of W3 Total Cache.

Take the Page Speed Online API for a spin and send us feedback on our mailing list. We’d love to hear your experience integrating the new Page Speed Online API.


Andrew Oates is a Software Engineer on the Page Speed Team in Google's Cambridge, Massachusetts office. You can find him in the credits for the Pixar film Up.

Richard Rabbat is the Product Management Lead on the "Make the Web Faster" initiative. He has launched Page Speed, mod_pagespeed and WebP. At Google since 2006, Richard works with engineering teams across the world.

Posted by Scott Knaster, Editor

The Future of Web Performance at Google I/O: JavaScript


This post is one in a series that previews Google I/O, our biggest developer event in San Francisco, May 28-29. Over the next month, we'll be highlighting sessions and speakers to give Google Code Blog readers a better sense of what's in store for you at the event. - Ed.

In April I announced that I'm starting another book. The working title is High Performance Web Sites, Part 2. This book contains the next set of web performance best practices that goes beyond my first book and YSlow. Here are the rules I have so far:
  1. Split the initial payload
  2. Load scripts without blocking
  3. Don't scatter scripts
  4. Split dominant content domains
  5. Make static content cookie-free
  6. Reduce cookie weight
  7. Minify CSS
  8. Optimize images
  9. Use iframes sparingly
  10. To www or not to www
I'm most excited about the best practices for improving JavaScript performance (rules 1-3). Web sites that are serious about performance are making progress on the first set of rules, but there's still a lot of room for improving JavaScript performance. Across the ten top U.S. sites approximately 40% of the time to load the page is spent downloading and executing JavaScript, and only 26% of the JavaScript functionality downloaded is used before the onload event.
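
As a taste of rule 2, here is the classic non-blocking loading pattern (my own sketch, not an excerpt from the book): a dynamically created script element downloads in parallel and never blocks rendering of the rest of the page.

function loadScriptAsync(src, callback) {
  var done = false;
  var script = document.createElement("script");
  script.src = src;
  script.onload = script.onreadystatechange = function() {
    if (!done && (!this.readyState || this.readyState === "loaded" ||
                  this.readyState === "complete")) {
      done = true;              // guard against firing twice in older IE
      callback();
    }
  };
  document.getElementsByTagName("head")[0].appendChild(script);
}

loadScriptAsync("/js/deferred-features.js", function() {
  // initialize functionality that isn't needed for first render
});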

In my session at Google I/O I'll present the research behind rules 1-3, talk about how the ten top U.S. web sites perform, demonstrate Cuzillion, and give several takeaways that you can use to make your web site faster.

Rob Campbell: Debugging and Testing the Web with Firebug

The sixth Web Exponents tech talk features Rob Campbell's presentation on Firebug. Rob works at Mozilla. He's one of the developers that Mozilla dedicated to the Firebug effort last July. Rob is one of the main drivers of the Firebug project, starting and heading up the weekly concalls, and closely tracking bugs and releases. As one of the founders of the Firebug Working Group, I'm excited to see Mozilla taking a more active role in Firebug. The benefits are clear as we see more features and greater stability with each Firebug release. Here's the video of Rob's presentation as well as a link to his slides.



Rob starts by highlighting what's new in Firebug 1.4 alpha. It's a joy for me to see that activation (enabling and disabling) has been simplified. Rob points out that the firebug icon serves also as a menu. One of the menu items is "Open With Editor", which developers will find useful for saving changes to their pages. A much needed UI change is flipping the tabs and buttons. The tabs used to be below the buttons. Putting them at the top is closer to what users expect from working with other tabbed UIs.

The new "pause" button will be useful for anyone debugging JavaScript. This implements "break on next" functionality, making it easier to stop when event handlers are called. Firebug's Net Panel has had significant improvements. The UI is better (colors!), but there's even more. The underlying timing information has been improved to give more accurate results. There are also markers for DOMContentLoaded and OnLoad, to show where those fire in relation to network requests.

Firebug Extensions provide a way for developers to add functionality that can be shared with others. Rob mentions several of these extensions in the talk. Writing an extension is a great way to explore future directions for Firebug.

Rob talks about future roadmap. Firebug 1.5 will focus on extensions - making them easier to build and use. Firebug 1.6 will change the underlying JavaScript debugging mechanism in Firefox to support new features. Add Rob's blog to your RSS reader to find out about these future releases and other improvements to Firebug.


Measure page load time with Google Analytics

By Zhiheng Wang, Make the Web Faster Team, and Phil Mui, Google Analytics Team

At Google, we’re passionate about speed and making the web faster, and we’re glad to see that many website owners share the same idea. A faster web is better for both users and businesses. A slow-loading landing page not only impacts your conversion rate, but can also impact AdWords Landing Page Quality and ranking in Google search.

To improve the performance of your pages, you first need to measure and diagnose the speed of a page, which can be a difficult task. Furthermore, even with page speed measurements, it’s critical to look at page speed in the context of other web analytics data.

Therefore, we are thrilled to announce the availability of the Site Speed report in Google Analytics. With the Site Speed report you can measure the page load time across your site right within your Google Analytics account.


Uses for the Site Speed report

With the Site Speed report, not only will you be able to monitor the speed of your pages, you can also analyze it along with other analytics data, such as:
  • Content: Which landing pages are slowest?
  • Traffic sources: Which campaigns correspond to faster page loads overall?
  • Visitor: How does page load time vary across geographies?
  • Technology: Does your site load faster or slower for different browsers?

Setting up the Site Speed report

For now, page speed measurement is turned off by default, so you’ll only see 0s in the Site Speed report until you’ve enabled it. To start measuring site speed, you need to make a small change to your Analytics tracking code. We have detailed instructions in the Site Speed article in the Analytics Help Center. Once you’ve updated your tracking code, a small sample of pageviews will be used to calculate the page load time.
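
For reference, at the time this was written the change amounted to one extra line in the asynchronous snippet; the _trackPageLoadTime call below is my recollection of that instruction, so double-check it against the Help Center article before relying on it.

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXXX-Y']);   // your existing property ID
_gaq.push(['_trackPageview']);
_gaq.push(['_trackPageLoadTime']);             // opts the page into Site Speed sampling

(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
           '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();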

Bringing the Site Speed report into Google Analytics is an important step of the Make the Web Faster effort, and we look forward to your feedback on Site Speed.


Zhiheng Wang spends most of his time at work building stuff so others can serve the web better. He spends the rest of his time at home fixing stuff so his family can surf the web better.

Phil Mui is the Group Product Manager of Google Analytics and has been leading its development since its early days. He has a Ph.D. from MIT and a M.Phil. from Oxford where he was a Marshall Scholar.

Posted by Scott Knaster, Editor

Web Exponents

Over the last few months, I've started inviting web gurus I know to give tech talks at Google. It's been great to kick off this speaker series with such luminaries as John Resig, Doug Crockford, and PPK. (I snuck in there, too.) The biggest benefit from these talks is the release of the videos and slides. The videos are popular, with thousands of viewers. The videos make it possible to share these insights with a wider audience who might not be at the next tech conference or workshop.

For various reasons (t-shirts, YouTube playlist), it's beneficial to give this speaker series a name. I've decided to call it Web Exponents. I use "exponents" in the sense of "a person who actively supports or favors a cause". The cause, in this case, is evangelizing innovation and best practices in web development. And now, thanks to fellow Googler Mark Chow, we have the Web Exponents playlist on YouTube. You can find all the past and future videos there.

Web Exponents speakers coming up next include Rob Campbell (Firebug) and Nicholas Zakas (Yahoo! JavaScript expert). I'll write a blog post for these and other future talks, and you can subscribe to the playlist to make sure you catch all the videos.

And now for the mandatory cheesy tagline: Web Exponents - raising web technology to a higher power.


SPDY performance on mobile networks

By Matt Welsh, Ben Greenstein, and Michael Piatek,
Mobile Web Performance Team


SPDY is a replacement for HTTP, designed to speed up transfers of web pages by eliminating much of the overhead associated with HTTP. SPDY supports several optimizations that give it an edge over HTTP when it comes to speed. SPDY is gaining a great deal of traction -- it has been implemented in Chrome, Firefox, and Amazon Silk, has been deployed widely by Google, and there is now SPDY support for Apache through the mod_spdy module.

We wondered what the performance of SPDY would be compared to HTTP for popular websites, using a Samsung Galaxy Nexus (running Android), a modern, SPDY-enabled browser (Chrome for Android), and a variety of pages from real websites (77 pages across 31 popular domains).

The net result is that using SPDY produced a mean page load time improvement of 23% across these sites, compared to HTTP. This is equivalent to a speedup of 1.3x for SPDY over HTTP. Much more work can be done to improve SPDY performance on 3G and 4G cellular networks, but this is a promising start.

The following graph shows the page load time for HTTP and SPDY, in milliseconds, across the 77 pages that were measured. As the graph shows, in all but one case, SPDY reduces load times, sometimes by as much as 50%.


Check out the full article for more details on the measurement methodology and results.


Matt Welsh, Ben Greenstein, and Michael Piatek are software engineers on Google’s Mobile Web Performance Team based in Seattle. They are working to speed up mobile web performance globally, and as part of their jobs, they run up impressive mobile bandwidth bills every month.

Posted by Scott Knaster, Editor

Cuzillion: Check your zillion web pages


Steve Souders, member of the performance group at Google, has released a new open source tool called Cuzillion. Steve was constantly creating sample test web pages that he used to test out theories on Web site performance. He realized that he was repeating a lot of the same steps, so why not create a tool that would enable him to build the samples quickly. Thus, Cuzillion was born.



If you take a look at the UI above, you will see that it mimics a Web page, with a <HEAD> and <BODY>. On the left-hand side you select types of elements, such as images, scripts, CSS, and other resources. You add these elements to the mini page on the right, and then you can select an element to set more properties on it. For example, you can quickly set the domain that it is served from, which allows you to test splitting your content across domains.

We sat down with Steve and produced the video below in two parts. It starts off with him discussing the project, and then delves into a screencast of the product itself. He gives us an introduction, and then shows how he used it to solve an issue with Orkut.


Speed up your sites with PageSpeed for Nginx

By Jeff Kaufman, Software Engineer, Make the Web Faster Team

When we released mod_pagespeed in 2010, we gave webmasters a way to speed up their sites without needing to become web performance optimization experts. As an Apache module, however, it was unavailable to sites running Nginx, the popular, high-performance open source web server that powers many large web sites. Today that changes: we're releasing PageSpeed Beta for Nginx, aka ngx_pagespeed.

Running as a module inside Nginx, ngx_pagespeed rewrites your webpages to make them faster for your users. This includes compressing images, minifying CSS and JavaScript, extending cache lifetimes, and many other web performance best practices. All of mod_pagespeed's optimization filters are now available to Nginx users.

After three months of alpha testing on hundreds of sites, ngx_pagespeed has proven its ability to serve production traffic. It's ready for beta, and it's ready for you to start using it on your site.

MaxCDN, a content delivery network provider, recently published a blog post on their experience testing ngx_pagespeed: “With PageSpeed enabled, we shaved 1.57 seconds from our average page load, dropped our bounce rate by 1%, and our exit percentage by 2.5%. In sum, we squeezed out extra performance with nothing but a few extra lines in our nginx config files... We are continuing to test the module with the PageSpeed team, and our goal is to make it available across our CDN and to all of our customers – stay tuned!”

ZippyKid, a popular WordPress hosting provider, is also one of the early beta testers of ngx_pagespeed: “PageSpeed for ZippyKid is the world’s first WordPress optimization service powered by ngx_pagespeed, designed to automatically apply web performance best practices to deliver fast WordPress sites. Our benchmarks indicate that PageSpeed for ZippyKid will deliver up to a 75% reduction in page sizes and a 50% improvement in page rendering speeds.”

Development of ngx_pagespeed is open source, with contributions by developers from Google, Taobao, We-Amp, and many other individual volunteers. Thanks everyone for helping us reach the Beta milestone!

To start using ngx_pagespeed, follow the installation instructions on GitHub.


Jeff Kaufman works on PageSpeed, an open-source server module that helps make the web faster, and is interested in experiment measurement. He also plays for contra dances, organizes other dances, and blogs about dancing, giving, and tech.

Posted by Scott Knaster, Editor

PageSpeed Service makes mobile sites faster

By Ram Ramani, Engineering Manager

PageSpeed Service (PSS) is an online service to speed up the rendering of your web pages by rewriting and serving them through Google. While PSS’s optimization techniques benefit most platforms and browsers, today I’d like to focus on some of the PSS rewriters that are especially effective on mobile web pages. PageSpeed Service optimizes the web pages in such a way that users can start viewing and interacting with your pages as soon as possible.



Prioritize Critical CSS: To avoid page reflows, modern browsers do not render pages until the CSS is downloaded and parsed. These CSS files are often tens of KBs because they include all the styles needed for the entire site. These blocking requests are especially bad on mobile devices, where network round trip times are high. The Prioritize Critical CSS rewriter speeds up rendering by identifying the minimal CSS required to render that page and including it in the HTML file. This not only saves an extra round trip to download additional files but also reduces the CPU consumed by the browser. Finally, a reference to the original CSS file is included at the end of the page to lazy-load the non-critical CSS.
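
The lazy-loading step at the end can be as simple as the sketch below (an illustration of the general technique, not the service's actual output); the stylesheet path is a placeholder.

function loadFullStylesheet() {
  var link = document.createElement("link");
  link.rel = "stylesheet";
  link.href = "/css/site-full.css";   // the original, non-critical CSS
  document.getElementsByTagName("head")[0].appendChild(link);
}

if (window.addEventListener) {
  window.addEventListener("load", loadFullStylesheet, false);
} else {
  window.attachEvent("onload", loadFullStylesheet);  // older IE
}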

Defer JavaScript: The HTML specification requires the browser to stop, download, and execute each synchronous JavaScript file before proceeding to build and render the page - this requirement can significantly slow down rendering. PSS circumvents this behavior by rewriting the HTML to defer execution of all JavaScript until after the page is first rendered. This benefits pages that are mostly rendered via HTML markup rather than JavaScript.
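
Conceptually, the rewriter marks scripts with a non-executable type so the parser skips them, then runs them after first render. The sketch below is a simplified stand-in for that behavior; the "text/deferred-js" type name is purely illustrative, not the attribute PageSpeed Service actually emits.

function runDeferredScripts() {
  var scripts = document.querySelectorAll('script[type="text/deferred-js"]');
  for (var i = 0, len = scripts.length; i < len; i++) {
    var s = document.createElement("script");
    if (scripts[i].src) {
      s.src = scripts[i].src;       // external script: fetched and executed now
    } else {
      s.text = scripts[i].text;     // inline script body
    }
    document.body.appendChild(s);   // executes after the page has rendered
  }
}
window.addEventListener("load", runDeferredScripts, false);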

Optimize Images: Mobile screens are almost always smaller than their desktop counterparts. Large, high quality images translate to excessive bytes on the wire, slowing down page loads. PSS can resize images on the server to fit required dimensions and re-compress them to the optimal format, without perceptible visual loss. For very large images above the fold, PSS can also inline a low quality preview image for initial rendering. Once the rest of the page content loads, it is replaced by the original image, creating a seamless experience. Furthermore, images below the fold can be lazy-loaded, which prevents them from competing with the rest of the page load.
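
Lazy-loading below-the-fold images follows the same pattern; this is again an illustration of the technique rather than the service's code, with the data-src attribute as an assumed convention.

function loadDeferredImages() {
  var images = document.querySelectorAll("img[data-src]");
  for (var i = 0, len = images.length; i < len; i++) {
    images[i].src = images[i].getAttribute("data-src");  // swap in the real image
    images[i].removeAttribute("data-src");
  }
}
window.addEventListener("load", loadDeferredImages, false);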

PageSpeed Service includes several rewriters that speed up the rendering of web pages. Using PageSpeed Service, the mobile pages of TopNewsToday and Net1News are now 61% faster and 68% faster respectively. Alex Tsvetanov of TopNews Today says, “With Google PageSpeed Service, we increased our unique visitors and total pageviews by 100%, while reducing our bounce rate by 30%”. Massimo Romanello, CEO of Net1News says, "Thanks to Google PageSpeed Service, we have been able to reach 200,000 unique daily visitors with the same existing infrastructure and have made our site one of the quickest in the news sector".

PageSpeed takes just a few minutes to set up and requires no code changes on your site. Check out how much PageSpeed can speed up your site. I encourage you to try out these features by signing up for PageSpeed Service and letting us know what you think at page-speed-service-discuss@googlegroups.com.


Ram Ramani is an Engineering Manager on the Make the Web Faster Team in Mountain View. He is a believer in "Faster is better".

Posted by Scott Knaster, Editor