Let's say you figured out a wicked-cool way to speed up how quickly your website loads. You know with great certainty that the most popular web page will load much, much faster. Then, you remember that the page always loads much, much faster in your browser with the web server running on your development box. You need numbers that represent what your users will actually experience.

Depending on your development process, you may have several measuring opportunities such as after a live release, on a staging server, on a continuous build, or on your own development box.

Only a live release gives numbers that reflect what users actually experience--across different connection types, different computers, and different browsers. But it comes with some challenges:
  • You must instrument your site (one example tool is jiffy-web); a minimal beacon sketch follows this list.
  • You must isolate changes.
    • The most straightforward way to do that is to release only one change at a time. That avoids having multiple changes altering performance numbers--for better or for worse--at the same time.
  • Consider releasing changes to a subset of users.
    • That helps safeguard against real-world events like holidays, big news days, exploding hard drives, and, perhaps the worst fate of all, a slashdotting.
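
As a rough illustration of what instrumenting a site can look like, here is a minimal sketch of the beacon pattern in Python: each page reports its own load time to a logging endpoint. The endpoint path, query parameter, and log format are hypothetical; a real tool like jiffy-web records far more context.

```python
# Minimal sketch of a load-time beacon collector (hypothetical
# endpoint and field names). Pages would request
# /beacon?t_load=<milliseconds> once they finish loading.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        parsed = urlparse(self.path)
        if parsed.path == "/beacon":
            params = parse_qs(parsed.query)
            # t_load is the page-reported load time in milliseconds.
            t_load = params.get("t_load", ["?"])[0]
            with open("load_times.log", "a") as log:
                log.write(f"{self.client_address[0]}\t{t_load}\n")
            self.send_response(204)  # a beacon needs no response body
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), BeaconHandler).serve_forever()
```
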
Measuring real traffic is important, but performance should also be measured earlier in the development process. Millions of users will not hit your development web server--they won't, right?! You need a way to measure your pages without that.

Today, Steve Souders has released Hammerhead, a Firefox plug-in that is just the ticket for measuring web page load times. It has a sweet feature that repeatedly loads pages both with and without the browser cache so you can understand different use cases. One thing Hammerhead will not do for you is slow down your web connection. The page load times that you measure on your development machine will likely be faster than your users' wildest dreams.
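
If you want a quick, scriptable baseline before reaching for a browser plug-in, a few lines of Python can time repeated fetches of a page. This is only a crude stand-in for what Hammerhead measures: it captures the raw HTML transfer, not rendering or subresource loads, and it has no browser cache to exercise. The URL is a placeholder for your development server.

```python
# Crude timing sketch: fetch a URL several times and report each
# fetch time. Measures only the HTML transfer, not full page load.
import time
import urllib.request

def time_fetches(url: str, runs: int = 5) -> list[float]:
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # drain the body so transfer time is included
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    for i, t in enumerate(time_fetches("http://localhost:8080/"), 1):
        print(f"run {i}: {t * 1000:.1f} ms")
```
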

Measuring page load times over a real DSL or dial-up connection would be ideal, but if you cannot do that, all hope is not lost. You can try the following tools, which simulate slower connection speeds on a single box. (Please share your own favorite tools in the comments.) Each of these tools is easy to install and offers settings to simulate slower connections, but each comes with caveats that you need to keep in mind.

Firefox Throttle hooks into the WinSock API to limit bandwidth and avoids using proxy settings. (If you use it, be sure to disable "burst-mode".) Right now, Firefox Throttle only limits bandwidth: it controls how much data arrives in a given time period after the first bits arrive, but it does not limit latency, which is how long packets take to travel to and from the server. See Wikipedia's Relationship between latency and throughput for more details. For certain web pages, latency can make up a large part of the overall load time; the sketch below puts rough numbers on that. The next Firefox Throttle release is expected to include latency delays and other webmaster-friendly features for simulating slower, less reliable connections. With these enhancements, Firefox Throttle will be an easy recommendation.
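
To see how much latency can matter, here is a back-of-envelope model in Python: total load time is roughly the number of round trips times the round-trip time, plus bytes divided by bandwidth. The page profile below (200 KB over 8 round trips on a 1.5 Mbit/s link) is an illustrative assumption, not measured data.

```python
# Back-of-envelope model: load time ~= round_trips * RTT
#                                      + bits / bandwidth.
def load_time_s(total_bytes: int, round_trips: int,
                rtt_s: float, bandwidth_bps: float) -> float:
    return round_trips * rtt_s + (total_bytes * 8) / bandwidth_bps

page_bytes = 200_000   # assumed total page weight
round_trips = 8        # assumed DNS + TCP + request/response chains

# Same 1.5 Mbit/s DSL-class bandwidth, two different latencies:
for rtt in (0.020, 0.200):
    t = load_time_s(page_bytes, round_trips, rtt, 1_500_000)
    print(f"RTT {rtt * 1000:.0f} ms -> {t:.2f} s total")
```

With these numbers, raising the round-trip time from 20 ms to 200 ms more than doubles the total load time even though the bandwidth never changes, which is why a bandwidth-only throttle can paint too rosy a picture.
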

Fiddler and Charles act as proxies, and as a result they make browsers behave rather differently. For instance, IE and Firefox drastically limit the maximum number of connections (IE8 drops from 60+ to 6, and Firefox 3 from 30 to 8). If you happen to know that all your users go through a proxy anyway, this will not matter to you. Otherwise, it can mean that web pages load substantially differently.

If you have more time and hardware with which to tinker, you may want to check out tools like dummynet (FreeBSD or Mac OS X) or netem (Linux). They have even more knobs and controls and can be placed between the web browser hardware and the serving hardware.
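
For example, on Linux, netem can be driven from a few lines of Python via the tc command from iproute2 (root required). The interface name and the delay, jitter, and loss values below are assumptions to adapt to your own setup.

```python
# Sketch of driving netem from Python on Linux (requires root and
# iproute2). Adds 100 ms delay with 20 ms jitter and 0.5% packet
# loss on eth0; the interface name is an assumption.
import subprocess

IFACE = "eth0"

def netem(*args: str) -> None:
    subprocess.run(["tc", "qdisc", *args], check=True)

netem("add", "dev", IFACE, "root", "netem",
      "delay", "100ms", "20ms", "loss", "0.5%")
try:
    input("Throttling active; press Enter to restore...")
finally:
    # Remove the qdisc so the interface returns to normal.
    netem("del", "dev", IFACE, "root")
```
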

Measurements at each stage of web development can guide performance improvements. Hammerhead combined with a connection simulator like Firefox Throttle can be a great addition to your web development tool chest.

Cross-posted from the Chromium Blog

As part of Google’s initiative to make the web faster, over the past few months we have released a number of tools to help site owners speed up their websites. We launched the Page Speed Firefox extension to evaluate the performance of web pages and get suggestions on how to improve them, we introduced the Speed Tracer Chrome extension to help identify and fix performance problems in web applications, and we released a set of Closure tools to help build rich web applications with fully optimized JavaScript code. While these tools have been incredibly successful in helping developers optimize their sites, as we’ve evaluated our progress, we continue to notice that a single component of web pages is consistently responsible for the majority of the latency on pages across the web: images.

Most of the common image formats on the web today were established over a decade ago and are based on technology from around that time. Some engineers at Google decided to figure out if there was a way to further compress lossy images like JPEG to make them load faster, while still preserving quality and resolution. As part of this effort, we are releasing a developer preview of a new image format, WebP, that promises to significantly reduce the byte size of photos on the web, allowing web sites to load faster than before.

Images and photos make up about 65% of the bytes transmitted per web page today. They can significantly slow down a user’s web experience, especially on bandwidth-constrained networks such as a mobile network. Images on the web consist primarily of lossy formats such as JPEG, and to a lesser extent lossless formats such as PNG and GIF. Our team focused on improving compression of the lossy images, which constitute the larger percentage of images on the web today.

To improve on the compression that JPEG provides, we used an image compressor based on the VP8 codec that Google open-sourced in May 2010. We applied the techniques from VP8's intra-frame video coding to push the envelope in still image coding. We also adopted a very lightweight container based on RIFF. While this container format contributes a minimal overhead of only 20 bytes per image, it is extensible to allow authors to save meta-data they would like to store.
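
That 20-byte figure is just the RIFF wrapper, which is simple enough to inspect with Python's standard library: a 12-byte 'RIFF' header plus the first chunk's 8-byte FourCC and size. The file name below is a placeholder for any WebP file you have on disk.

```python
# Sketch: read the RIFF wrapper of a .webp file. The 20-byte
# container overhead is 'RIFF' + 4-byte size + 'WEBP' (12 bytes)
# plus the first chunk's FourCC and 4-byte size (8 bytes).
import struct

with open("photo.webp", "rb") as f:  # placeholder file name
    header = f.read(20)

riff, riff_size, webp = struct.unpack("<4sI4s", header[:12])
chunk_fourcc, chunk_size = struct.unpack("<4sI", header[12:20])

assert riff == b"RIFF" and webp == b"WEBP", "not a WebP file"
# The RIFF size field counts everything after its own 8 bytes.
print(f"file size per header: {riff_size + 8} bytes")
print(f"first chunk: {chunk_fourcc.decode()} ({chunk_size} bytes)")
```
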

While the benefits of a VP8-based image format were clear in theory, we needed to test them in the real world. To gauge the effectiveness of our efforts, we randomly picked about 1,000,000 images from the web (mostly JPEGs and some PNGs and GIFs) and re-encoded them to WebP without perceptibly compromising visual quality. This resulted in an average 39% reduction in file size. We expect that, in practice, developers will achieve even better file size reductions with WebP when starting from an uncompressed image.
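
You can reproduce this kind of before-and-after measurement on your own images. The sketch below uses Pillow's WebP support as a stand-in for Google's conversion tool; the file names and quality setting are assumptions, and your percentage will vary with the source material.

```python
# Sketch: re-encode one image to WebP and report the size change,
# using Pillow (pip install Pillow) as a stand-in converter.
import os
from PIL import Image

src = "photo.jpg"   # placeholder input image
dst = "photo.webp"

Image.open(src).save(dst, "WEBP", quality=80)

before, after = os.path.getsize(src), os.path.getsize(dst)
print(f"{src}: {before} bytes -> {dst}: {after} bytes "
      f"({100 * (1 - after / before):.1f}% smaller)")
```
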

To help you assess WebP’s performance against other formats, we have shared a selection of open-source and classic images, along with file sizes, so you can compare them visually on this site. We are also releasing a conversion tool that you can use to convert images to the WebP format. We’re looking forward to working with the browser and web developer community on the WebP spec and on adding native support for WebP. While WebP images can’t be viewed until browsers support the format, we are developing a patch for WebKit to provide native support for WebP in an upcoming release of Google Chrome. We plan to add support for a transparency layer, also known as an alpha channel, in a future update.

We’re excited to hear feedback from the developer community on our discussion group, so download the conversion tool, try it out on your favorite set of images, and let us know what you think.

By Scott Knaster, Google Code Blog Editor

The Dead Sea Scrolls were lost in the Judean desert for more than 2000 years before being rediscovered in 1947. Now The Digital Dead Sea Scrolls project makes five of the ancient documents available online to everyone.

The online scrolls contain incredibly high-resolution photography (up to 1200 megapixels) and an English translation along with the original Hebrew text. Looking through the scrolls online is a remarkable mashup of ancient artifacts and modern technology.

Not everything can be done online: sometimes you need to be there. When a magnitude 5.8 earthquake struck near Washington, D.C. last August, the Washington Monument suffered visible damage. This week the U.S. National Park Service sent its "difficult access team" to rappel up and down the monument to check for damage. Civil engineer Emma Cardini seemed to enjoy the task and was quoted as saying "It’s really cool to see the planes flying under you". See, that’s why it’s great to be an engineer.

Birds fly, too – but dinosaurs with feathers? Check out this news from Canada about the discovery of amber-bound feathers that belonged to dinosaurs and birds from the late Cretaceous period.


Fridaygram is a weekly post containing a cool Google-related announcement and a couple of fun science-based tidbits. But no cake.