We inserted <link rel="dns-prefetch"> tags to allow the browser to pre-resolve DNS for resources on the page. A waterfall trace of the page showed the improvement after inserting the hints. <link rel="dns-prefetch"> is supported on Chrome, Firefox and Internet Explorer.
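A minimal sketch of the hints (the hostnames here are placeholders for wherever a page's subresources live):
<head>
  <!-- Let the browser resolve these hostnames before any resource on them is requested. -->
  <link rel="dns-prefetch" href="//static.example.com">
  <link rel="dns-prefetch" href="//img.example.com">
</head>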
If you’ve used Google Search recently, you may have noticed a new feature that we’re calling Instant Previews. By clicking on the (sprited) magnifying glass icon next to a search result, you see a preview of that page, often with the relevant content highlighted. Once activated, you can mouse over the rest of the results and quickly (instantly!) see previews of those search results, too.
Adding this feature to Google Search involved a lot of client-side Javascript. Being Google, we had to make sure we could deliver this feature without slowing down the page. We know our users want their results fast. So we thought we’d share some techniques involved in making this new feature fast.
This is nothing new for Google Search: all our Javascript is compiled to make it as small as possible. We use the open-sourced Closure Compiler. In addition to minimizing the Javascript code, it also re-writes expressions, reuses variables, and prunes out code that is not being used. The Javascript on the search results page is deferred, and also cached very aggressively on the client side so that it’s not downloaded more than once per version.
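For reference, a typical Closure Compiler invocation looks like the following (file names are placeholders; ADVANCED_OPTIMIZATIONS is the level that enables the expression rewriting and dead-code pruning described above):
java -jar compiler.jar \
  --compilation_level ADVANCED_OPTIMIZATIONS \
  --js search.js \
  --js_output_file search.min.js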
When you activate Instant Previews, the result previews are requested by your web browser. There are several ways to fetch the data we need using Javascript. The most popular techniques are XmlHttpRequest (XHR) and JSONP. XHR generally gives you better control and error handling, but it has two drawbacks: browser caching tends to be less reliable, and only same-origin requests are permitted (this is starting to change with modern browsers and cross-origin resource sharing, though). With JSONP, on the other hand, the requested script returns the desired data as a JSON object wrapped in a Javascript callback function, which in our case looks something like
google.vs.r({"dim":[302,585],"url":"http://example.com",ssegs:[...]}).
Although error handling with JSONP is a bit harder to do compared to XHR (not all browsers support onerror events), JSONP can be cached aggressively by the browser, and is not subject to same-origin restrictions. This last point is important for Instant Previews because web browsers restrict the number of concurrent requests that they send to any one host. Using a different host for the preview requests means that we don’t block other requests in the page.
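Put together, a minimal JSONP fetch looks like the sketch below (the callback name google.vs.r matches the response shown above; the preview host, URL and resultUrl variable are placeholders):
<script type="text/JavaScript">
// Define the callback that the JSONP response will invoke.
window.google = window.google || {};
google.vs = google.vs || {};
google.vs.r = function(data) {
  // data.dim, data.url and data.ssegs are now available to the page.
};
// Inject a script tag pointing at a separate host, so preview requests
// don't count against this page's per-host connection limit.
var resultUrl = 'http://example.com';
var s = document.createElement('script');
s.src = 'http://previews.example.com/preview?url=' + encodeURIComponent(resultUrl);
document.getElementsByTagName('head')[0].appendChild(s);
</script>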
There are a couple of tricks when using JSONP that are worth noting.
At this point you are probably curious as to what we’re returning in our JSONP calls, and in particular, why we are using JSON and not just plain images. Perhaps you even used Firebug or your browser’s Developer Tools to examine the Instant Previews requests. If so, you will have noticed that we send back the image data as sets of data URIs. Data URIs are base64 encodings of image data that modern browsers (IE8+, Chrome, Safari, Firefox, Opera, etc.) can use to display images, instead of loading them from a server as usual.
To show previews, we need the image, and the relevant content of the page for the particular query, with bounding boxes that we draw on top of the image to show where that content appears on the page. If we used static images, we’d need to make one request for the content and one request for the image; using JSONP with data URIs, we make just one request. Data URIs are limited to 32K on IE8, so we send “slices” that are all under that limit, and then use Javascript to generate the necessary image tags to display them. And even though base64 encoding adds about 33% to the size of the image, our tests showed that gzip-compressed data URIs are comparable in size to the original JPEGs.
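A sketch of that reassembly step (assuming each entry in ssegs is the bare base64 payload of one JPEG slice; the container element is a placeholder):
<script type="text/JavaScript">
// Each slice is kept under IE8's 32K data URI limit;
// stacking the generated images rebuilds the full screenshot.
function renderPreview(data, container) {
  for (var i = 0; i < data.ssegs.length; i++) {
    var img = document.createElement('img');
    img.src = 'data:image/jpeg;base64,' + data.ssegs[i];
    img.style.display = 'block';
    container.appendChild(img);
  }
}
</script>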
We use caching throughout our implementation, but it’s important to not forget about client-side caching as well. By using JSONP and data URIs, we limit the number of requests made, and also make sure that the browser will cache the data, so that if you refresh a page or redo a query, you should get the previews, well... instantly!
By Matías Pelenur, Instant Previews team

Last year, as part of Google’s initiative to make the web faster, we introduced Page Speed, a tool that gives developers suggestions to speed up web pages. It’s usually pretty straightforward for developers and webmasters to implement these suggestions by updating their web server configuration, HTML, JavaScript, CSS and images. But we thought we could make it even easier -- ideally these optimizations should happen with minimal developer and webmaster effort.
So today, we’re introducing a module for the Apache HTTP Server called mod_pagespeed to perform many speed optimizations automatically. We’re starting with more than 15 on-the-fly optimizations that address various aspects of web performance, including optimizing caching, minimizing client-server round trips and minimizing payload size. We’ve seen mod_pagespeed reduce page load times by up to 50% (an average across a rough sample of sites we tried) -- in other words, essentially speeding up websites by about 2x, and sometimes even faster.
(Video comparison of the AdSense blog site with and without mod_pagespeed)
Here are a few simple optimizations that are a pain to do manually but that mod_pagespeed excels at, such as extending cache lifetimes, combining and minifying CSS and JavaScript, and optimizing images.
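As a sketch, enabling the module and a few of its rewriters in the Apache configuration looks roughly like this (the filter selection is our own illustration, not an official recommendation):
<IfModule pagespeed_module>
  ModPagespeed on
  # Each filter is one on-the-fly rewriter applied as pages are served.
  ModPagespeedEnableFilters extend_cache,combine_css,inline_css
  ModPagespeedEnableFilters rewrite_images,collapse_whitespace
</IfModule>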
We’re working with Go Daddy to get mod_pagespeed running for many of its 8.5 million customers. Warren Adelman, President and COO of Go Daddy, says:
"Go Daddy is continually looking for ways to provide our customers the best user experience possible. That's the reason we partnered with Google on the 'Make the Web Faster' initiative. Go Daddy engineers are seeing a dramatic decrease in load times of customers' websites using mod_pagespeed and other technologies provided. We hope to provide the technology to our customers soon - not only for their benefit, but for their website visitors as well.”
We’re also working with Cotendo to integrate the core engine of mod_pagespeed as part of their Content Delivery Network (CDN) service.
mod_pagespeed integrates as a module for the Apache HTTP Server, and we’ve released it as open source, with builds for many Linux distributions. Download mod_pagespeed for your platform and let us know what you think on the project’s mailing list. We hope to work with the hosting, developer and webmaster community to improve mod_pagespeed and make the web faster.
By Richard Rabbat, ‘Make the Web Faster’ initiative
Cross-posted from the Chromium Blog
As part of Google’s initiative to make the web faster, over the past few months we have released a number of tools to help site owners speed up their websites. We launched the Page Speed Firefox extension to evaluate the performance of web pages and to get suggestions on how to improve them, we introduced the Speed Tracer Chrome extension to help identify and fix performance problems in web applications, and we released a set of Closure tools to help build rich web applications with fully optimized JavaScript code. While these tools have been incredibly successful in helping developers optimize their sites, as we’ve evaluated our progress, we continue to notice that a single component of web pages is consistently responsible for the majority of the latency on pages across the web: images.
Most of the common image formats on the web today were established over a decade ago and are based on technology from around that time. Some engineers at Google decided to figure out if there was a way to further compress lossy images like JPEG to make them load faster, while still preserving quality and resolution. As part of this effort, we are releasing a developer preview of a new image format, WebP, that promises to significantly reduce the byte size of photos on the web, allowing web sites to load faster than before.
Images and photos make up about 65% of the bytes transmitted per web page today. They can significantly slow down a user’s web experience, especially on bandwidth-constrained networks such as a mobile network. Images on the web consist primarily of lossy formats such as JPEG, and to a lesser extent lossless formats such as PNG and GIF. Our team focused on improving compression of the lossy images, which constitute the larger percentage of images on the web today.
To improve on the compression that JPEG provides, we used an image compressor based on the VP8 codec that Google open-sourced in May 2010. We applied the techniques from VP8 video intra frame coding to push the envelope in still image coding. We also adapted a very lightweight container based on RIFF. While this container format contributes a minimal overhead of only 20 bytes per image, it is extensible to allow authors to save meta-data they would like to store.
While the benefits of a VP8-based image format were clear in theory, we needed to test them in the real world. In order to gauge the effectiveness of our efforts, we randomly picked about 1,000,000 images from the web (mostly JPEGs and some PNGs and GIFs) and re-encoded them to WebP without perceptibly compromising visual quality. This resulted in an average 39% reduction in file size. We expect that, in practice, developers will achieve even better file size reductions with WebP when starting from an uncompressed image.
To help you assess WebP’s performance against other formats, we have shared a selection of open-source and classic images along with file sizes so you can visually compare them on this site. We are also releasing a conversion tool that you can use to convert images to the WebP format. We’re looking forward to working with the browser and web developer community on the WebP spec and on adding native support for WebP. While WebP images can’t be viewed until browsers support the format, we are developing a patch for WebKit to provide native support for WebP in an upcoming release of Google Chrome. We plan to add support for a transparency layer, also known as an alpha channel, in a future update.
We’re excited to hear feedback from the developer community on our discussion group, so download the conversion tool, try it out on your favorite set of images, and let us know what you think.
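For example, converting an existing JPEG with the conversion tool is a one-line command (file names are placeholders; the quality setting of 80 is only an illustration):
cwebp -q 80 input.jpg -o output.webp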
By Richard Rabbat, WebP Team

Option 1: Dynamic script tag
<script type="text/JavaScript">
function loadFile(url) {
  // Create a script element and append it to the document head;
  // the browser then fetches and executes url asynchronously.
  var script = document.createElement('SCRIPT');
  script.src = url;
  document.getElementsByTagName('HEAD')[0].appendChild(script);
}
</script>
Option 2: XmlHttpRequest (XHR)
<script type="text/JavaScript">
function loadFile(url) {
  // Runs on each state change; once the response has fully loaded
  // (readyState 4) with a 200 status, evaluate the returned JavaScript.
  function callback() {
    if (req.readyState == 4) { // 4 = Loaded
      if (req.status == 200) {
        eval(req.responseText);
      } else {
        // Error handling goes here.
      }
    }
  }
  var req = new XMLHttpRequest();
  req.onreadystatechange = callback;
  req.open("GET", url, true);
  req.send("");
}
</script>
The next question is: when should you lazy load the modules? One strategy is to lazy load the modules in the background once the home page has been loaded. This approach has some drawbacks. First, JavaScript execution in the browser is single threaded, so while you are loading the modules in the background, the rest of your app becomes unresponsive to user actions. Second, it's very difficult to decide when, and in what order, to load the modules. What if a user tries to access a feature/page you have yet to lazy load in the background?

A better strategy is to associate the loading of a module with a user's action. Typically, user actions are associated with an invocation of an asynchronous function (for example, an onclick handler). This is the perfect time to lazy load the module, since the code has to be fetched over the network anyway. If mobile networks are slow, you can adopt a strategy where you prefetch the code of the modules in advance and keep it stored in the JavaScript heap, then parse and load the corresponding module only on user action. One word of caution: make sure your prefetching strategy doesn't impact the user's experience - for example, don't prefetch all the modules while you are fetching user data. Remember, dividing up the latency is far better for users than bunching it all together during startup.

<html>
...
<script id="lazy">
// Make sure you strip out (or replace) comment blocks in your JavaScript first.
/*
JavaScript of lazy module
*/
</script>
<script>
// One possible implementation of the helper used below (ours, for illustration):
function stripOutCommentBlock(text) {
  // Strip the leading /* and trailing */ that shield the module's code.
  return text.replace(/^\s*\/\*/, '').replace(/\*\/\s*$/, '');
}
function lazyLoad() {
  var lazyElement = document.getElementById('lazy');
  var lazyElementBody = lazyElement.innerHTML;
  var jsCode = stripOutCommentBlock(lazyElementBody);
  eval(jsCode);
}
</script>
<div onclick="lazyLoad()"> Lazy Load </div>
</html>
In the future, we hope that the HTML5 standard will allow more control over when the application cache should download resources in the manifest; using comments to pass along code is not elegant, but it worked nicely for us. In addition, the snippets of code are not meant to be a reference implementation, and one should consider many additional optimizations such as stripping whitespace and compiling the JavaScript to make its parsing and execution faster. To learn more about web performance, get tips and tricks to improve the speed of your web applications, and download tools, please visit http://code.google.com/speed.

This is a guest post by Owen Barton, partner and director of engineering at CivicActions. Owen has been working with Google's “Make the Web Faster” project team and the Drupal community to make improvements in Drupal 7 front-end performance. This is a condensed version of a more in-depth post over at the CivicActions blog.
Drupal is a popular free and open source publishing platform, powering high profile sites such as The White House, The New York Observer and Amnesty International. The Drupal community has long understood the importance of good front-end performance to successful web sites, being ahead of the game in many ways. This post highlights some of the improvements developed for the upcoming Drupal 7 release, several of which can save an additional second or more of page load times.
Drupal 7 has made its caching system more easily pluggable - to allow for easier memcache integration, for example. It has also enabled caching HTTP headers to be set so that logged-out users can cache entire pages locally, as well as improving compatibility with reverse proxies and content distribution networks (CDNs). There is also a patch waiting that reduces both the response size and the time taken to generate 404 responses for inlined page assets. Depending on the type of 404 (CSS 404s have a larger effect than images, for example), the slower 404s were adding 0.5 to 1 second to the calling page load times.
Drupal currently has the ability to aggregate multiple CSS and JavaScript files by concatenating them into a smaller number of files to reduce the number of HTTP requests. There is a patch in the queue for Drupal 7 that could allow aggregation to be enabled by default, which is great because the large number of individual files can add anything from 0-1.5 seconds to page loads.
One issue that has become apparent with the Drupal 6 aggregation system is that users can end up downloading aggregate files that include a large amount of duplicate code. On one page the aggregate may contain files a, b and c, whilst on a second page the aggregate may contain files a, b and d - the “c” and “d” files being added conditionally on specific pages. This breaks the benefits of browser caching and slows down subsequent page loads. Benchmarking on core alone shows that avoiding duplicate aggregates can save over a second across 5 page loads. A patch has already been committed that requires files to be explicitly added to the aggregate, and fixes Drupal core to add the appropriate files to the aggregate unconditionally.
Drupal has supported gzip compression of HTML output for a long time, however for CSS and JavaScript, the files are delivered directly by the webserver, so Drupal has less control. There are webserver based compressors such as Apache’s mod_deflate, but these are not always available. A patch is in the queue that stores compressed versions of aggregated files on write and uses rewrite and header directives in .htaccess that allow these files to be served correctly. Benchmarks show that this patch can make initial page views 20-60% faster, saving anything from 0.3 to 3 seconds total.
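The serving side of that approach looks roughly like the following .htaccess sketch (our illustration of the general technique, not the exact patch): if the client accepts gzip and a pre-compressed sibling file exists, rewrite the request to it and set the headers so browsers decode it correctly.
<IfModule mod_rewrite.c>
  RewriteEngine on
  # Serve styles.css.gz in place of styles.css when the browser accepts gzip.
  RewriteCond %{HTTP:Accept-Encoding} gzip
  RewriteCond %{REQUEST_FILENAME}\.gz -s
  RewriteRule ^(.*)\.css$ $1.css.gz [QSA]
</IfModule>
<FilesMatch "\.css\.gz$">
  # Without these, the pre-compressed file would be served as a plain gzip download.
  ForceType text/css
  Header set Content-Encoding gzip
</FilesMatch>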
The Drupal 7 release promises some real improvements from a front-end performance point of view. Other performance optimizations will no doubt continue to appear and be refined in contributed modules and themes, as well as in site building best practices and documentation. In Drupal 8 we will hopefully see further improvements in the CSS/JS file aggregation system, increased high-level caching effectiveness and hopefully more tools to help site builders reduce file sizes. If you have yet to try Drupal, download it now and give it a try and tell us in the comments if your site performance improves!
At Google, we are constantly looking at ways to make web pages load faster. One way to do this is by making web images smaller. This is especially important for mobile devices where smaller images save both bandwidth and battery life. Earlier this month, we released version 0.2 of the WebP library that adds support for lossless and transparency modes to compress images. This version provides CPU and memory performance comparable to or better than PNG, yet results in 26% smaller files.
WebP’s improved compression comes from advanced techniques such as dedicated entropy codes for different color channels, exploiting 2D locality of backward reference distances and a color cache of recently used colors. This complements basic techniques such as dictionary coding, Huffman coding and color indexing transform. We think that we've only scratched the surface in improving compression. Our newly added support for alpha transparency with lossy images promises additional gains in this space, helping make WebP an efficient replacement for PNG.
The new WebP modes are supported natively in the latest Beta version of Chrome. The bit stream specification for these new WebP modes has been finalized and the container specification has been updated. We thank the community for their valuable feedback and for helping us evolve WebP as a new image compression format for the web. We encourage you to try these new compression methods on your favorite set of images, check out the code, and continue to provide feedback.
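For instance, both new modes can be exercised with the conversion tool (file names are placeholders):
# Lossless mode: the decoded image is bit-exact with the source PNG.
cwebp -lossless input.png -o output.webp
# Lossy mode keeps the source's alpha channel by default.
cwebp -q 80 input_with_alpha.png -o output_alpha.webp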
Dr. Jyrki Alakuijala is a Software Engineer with a special interest in data compression. He is a father of five daughters, and sings in the Finnish Choir in Zürich. Before joining Google, Jyrki worked in neurosurgical and radiotherapy development.
Posted by Ashleigh Rentz, Editor Emerita