
App Engine's System Status Dashboard


We recently announced a System Status Dashboard for Google App Engine. As developers depend on App Engine for important applications, we wanted to provide more visibility into App Engine's availability and performance.

Application development today is pretty different than it was just a few years ago. Most web apps now make use of hosted third-party services for features like search, maps, video, or translation (e.g., our AJAX APIs). These services mean developers don't have to invest in massive computing resources to build these features themselves, and can instead focus on what is exciting and new about their apps.

Building in dependencies to third-party services or moving to a new hosting infrastructure is not something developers take lightly. This new App Engine dashboard provides some of the same monitoring data that we use internally, so you can make informed decisions about your hosting infrastructure.

Learn more (about this and other recent announcements) on the App Engine blog, and please let us know what you think.

Measuring Speed The Slow Way


Let's say you figured out a wicked-cool way to speed up how quickly your website loads. You know with great certainty that the most popular web page will load much, much faster. Then, you remember that the page always loads much, much faster in your browser with the web server running on your development box. You need numbers that represent what your users will actually experience.

Depending on your development process, you may have several measuring opportunities such as after a live release, on a staging server, on a continuous build, or on your own development box.

Only a live release gives numbers that users actually experience--through different types of connections, different computers, different browsers. But it comes with some challenges:
  • You must instrument your site (one example tool is jiffy-web). A minimal sketch of this kind of instrumentation follows this list.
  • You must isolate changes.
    • The most straightforward way to do that is to release only one change at a time. That avoids having multiple changes altering performance numbers--for better or for worse--at the same time.
  • Consider releasing changes to a subset of users.
    • That helps safeguard against real-world events like holidays, big news days, exploding hard drives, and, perhaps the worst possible fate of all: a slashdotting.
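
Here's a minimal sketch of that kind of roll-your-own instrumentation: record a timestamp early in the page, then report the elapsed time at onload with an image beacon. The /beacon endpoint is an assumption for illustration, and this approach misses time spent before the HTML arrives.

<script>
// As early as possible in the <head>, record when the page started parsing.
var pageStartTime = new Date().getTime();

// Once the page finishes loading, report the elapsed time to the server
// via an image beacon.
window.onload = function() {
  var elapsed = new Date().getTime() - pageStartTime;
  new Image().src = '/beacon?t=' + elapsed;  // /beacon is an assumed endpoint
};
</script>
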
Measuring real traffic is important, but performance should also be measured earlier in the development process. Millions of users will not hit your development web server--they won't, right?! You need a way to measure your pages without that.

Today, Steve Souders has released Hammerhead, a Firefox plug-in that is just the ticket for measuring web page load times. It has a sweet feature that repeatedly loads pages both with and without the browser cache so you can understand different use cases. One thing Hammerhead will not do for you is slow down your web connection. The page load times that you measure on your development machine will likely be faster than your users' wildest dreams.

Measuring page load times with a real DSL or dial-up connection would be ideal, but if you cannot do that, all hope is not lost. You can try the following tools, which simulate slower connection speeds on a single box. (Please share your own favorite tools in the comments.) Each of these tools is easy to install and offers options to simulate slower connections. However, each has some caveats that you need to keep in mind.

Firefox Throttle hooks into the WinSock API to limit bandwidth and avoids using proxy settings. (If you use it, be sure to disable "burst-mode".) Right now, Firefox Throttle only limits bandwidth. That means it controls how much data arrives in a given time period after the first bits arrive. It does not limit latency. Latency controls how long it takes packets to travel to and from the server. See Wikipedia's Relationship between latency and throughput for more details. For certain webpages, latency can make up a large part of the overall load time; for example, a page that requires ten sequential round trips at 100 ms each spends a full second waiting on latency alone, no matter how much bandwidth is available. The next Firefox Throttle release is expected to include latency delays and other webmaster-friendly features to simulate slower, less-reliable connections. With these enhancements, Firefox Throttle will be an easy recommendation.

Fiddler and Charles act as proxies, and, as a result, they make browsers act rather differently. For instance, IE and Firefox drastically limit the maximum number of connections (IE8 from 60+ to 6 and FF3 from 30 to 8). If you happen to know that all your users go through a proxy anyway, then this will not matter to you. Otherwise, it can mean that web pages load substantially differently.

If you have more time and hardware with which to tinker, you may want to check out tools like dummynet (FreeBSD or Mac OS X), or netem (Linux). They have even more knobs and controls and can be put between the web browser hardware and the serving hardware.

Measurements at each stage of web development can guide performance improvements. Hammerhead combined with a connection simulator like Firefox Throttle can be a great addition to your web development tool chest.

Introducing Page Speed Online, with mobile support

Webmaster level: Intermediate

At Google, we’re striving to make the whole web fast. As part of that effort, we’re launching a new web-based tool in Google Labs, Page Speed Online, which analyzes the performance of web pages and gives specific suggestions for making them faster. Page Speed Online is available from any browser, at any time. This allows website owners to get immediate access to Page Speed performance suggestions so they can make their pages faster.

In addition, we’ve added a new feature: the ability to get Page Speed suggestions customized for the mobile version of a page, specifically smartphones. Due to the relatively limited CPU capabilities of mobile devices, the high round-trip times of mobile networks, and rapid growth of mobile usage, understanding and optimizing for mobile performance is even more critical than for the desktop, so Page Speed Online now allows you to easily analyze and optimize your site for mobile performance. The mobile recommendations are tuned for the unique characteristics of mobile devices, and contain several best practices that go beyond the recommendations for desktop browsers, in order to create a faster mobile experience. New mobile-targeted best practices include eliminating uncacheable landing page redirects and reducing the amount of JavaScript parsed during the page load, two common issues that slow down mobile pages today.


Page Speed Online is powered by the same Page Speed SDK that powers the Chrome and Firefox extensions and webpagetest.org.

Please give Page Speed Online a try. We’re eager to hear your feedback on our mailing list and how you’re using it to optimize your site.


Page Speed Service - Web Performance, Delivered.

Webmaster level: Advanced

Two years ago we released the Page Speed browser extension and earlier this year the Page Speed Online API to provide developers with specific suggestions to make their web pages faster. Last year we released mod_pagespeed, an Apache module, to automatically rewrite web pages. To further simplify the life of webmasters and to avoid the hassles of installation, today we are releasing the latest addition to the Page Speed family: Page Speed Service.

Page Speed Service is an online service that automatically speeds up loading of your web pages. To use the service, you need to sign up and point your site’s DNS entry to Google. Page Speed Service fetches content from your servers, rewrites your pages by applying web performance best practices, and serves them to end users via Google's servers across the globe. Your users will continue to access your site just as they did before, only with faster load times. Now you don’t have to worry about concatenating CSS, compressing images, caching, gzipping resources or other web performance best practices.

In our testing we have seen speed improvements of 25% to 60% on several sites. But we know you care most about the numbers for your site, so check out how much Page Speed Service can speed up your site. If you’re encouraged by the results, please sign up. If not, be sure to check back later. We are diligently working on adding more improvements to the service.

At this time, Page Speed Service is being offered to a limited set of webmasters free of charge. Pricing will be competitive and details will be made available later. You can request access to the service by filling out this web form.


Making more pages load instantly

Webmaster level: All


At Google we're obsessed with speed. We've long known that even seemingly minor speed increases can have surprisingly large impacts on user engagement and happiness. About a year ago we rolled out Instant Pages in pursuit of that goal. Instant Pages makes use of prerendering technology in Chrome to make your site appear to load instantly in some cases, with no need for any extra work on your part. Here's a video of it in action:



We've been closely watching performance and listening to webmaster feedback. Since Instant Pages rolled out, we've saved more than a thousand years of our users' time. We're very happy with the results so far, and we'll be gradually increasing how often we trigger the feature.

In the vast majority of cases, webmasters don't have to do anything for their sites to work correctly with prerendering. As we mentioned in our initial announcement of Instant Pages, search traffic will be measured in Webmaster Tools just like before this feature: only results the user visits will be counted. If your site keeps track of pageviews on its own, you might be interested in the Page Visibility API, which allows you to detect when prerendering is occurring and factor those out of your statistics. If you use an ads or analytics package, check with them to see if their solution is already prerender-aware; if it is, in many cases you won't need to make any changes at all. If you're interested in triggering Chrome's prerendering within your own site, see the Prerendering in Chrome article.
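
As a minimal sketch of that approach, the following checks the page's visibility state and defers counting until the page is actually shown. logPageview() is a hypothetical stand-in for your analytics call, and the prefixed names cover older WebKit builds:

function countPageviewWhenVisible() {
  var state = document.visibilityState || document.webkitVisibilityState;
  if (state === 'prerender') {
    // The page is being prerendered; don't count a view until the user
    // actually visits it.
    var eventName = document.visibilityState ?
        'visibilitychange' : 'webkitvisibilitychange';
    document.addEventListener(eventName, function handler() {
      document.removeEventListener(eventName, handler, false);
      logPageview();  // hypothetical analytics call
    }, false);
  } else {
    logPageview();
  }
}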

Instant Pages means that users arrive at your site happier and more engaged, which is great for everyone.



Page Speed Online has a shiny new API

Webmaster level: Intermediate

A few weeks ago, we introduced Page Speed Online, a web-based performance analysis tool that gives developers optimization suggestions. Almost immediately, developers asked us to make an API available to integrate into other tools and their regression testing suites. We were happy to oblige.

Today, as part of Google I/O, we are excited to introduce the Page Speed Online API as part of the Google APIs. With this API, developers now have the ability to integrate performance analysis very simply in their command-line tools and web performance dashboards.

We have provided a getting started guide that helps you to get up and running quickly, understand the API, and start monitoring the performance improvements that you make to your web pages. Not only that, in the request, you’ll be able to specify whether you’d like to see mobile or desktop analysis, and also get Page Speed suggestions in one of the 40 languages that we support, giving API access to the vast majority of developers in their native or preferred language.
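
For a flavor of what a request looks like, here's a hedged sketch in JavaScript. The endpoint follows the Google APIs conventions of the time, but treat the parameter and response field names as assumptions and check the getting started guide for the exact format:

// Build the request URL; strategy selects mobile or desktop analysis,
// and locale selects one of the supported languages.
var url = 'https://www.googleapis.com/pagespeedonline/v1/runPagespeed' +
    '?url=' + encodeURIComponent('http://www.example.com/') +
    '&strategy=mobile' +
    '&locale=fr' +
    '&key=YOUR_API_KEY';  // placeholder for your Google APIs key

var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function() {
  if (xhr.readyState === 4 && xhr.status === 200) {
    var result = JSON.parse(xhr.responseText);
    console.log('Page Speed score: ' + result.score);  // field name assumed
  }
};
xhr.open('GET', url, true);
xhr.send();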

We’re also pleased to share that the WordPress plugin W3 Total Cache now uses the Page Speed Online API to provide Page Speed suggestions to WordPress users, right in the WordPress dashboard. “The Page Speed tool itself provides extremely pointed and valuable insight into performance pitfalls. Providing that tool via an API has allowed me to directly correlate that feedback with actionable solutions that W3 Total Cache provides.” said Frederick Townes, CTO Mashable and W3 Total Cache author.

Take the Page Speed Online API for a spin and send us feedback on our mailing list. We’d love to hear your experience integrating the new Page Speed Online API.

Andrew Oates is a Software Engineer on the Page Speed Team in Google's Cambridge, Massachusetts office. You can find him in the credits for the Pixar film Up.

Richard Rabbat is the Product Management Lead on the "Make the Web Faster" initiative. He has launched Page Speed, mod_pagespeed and WebP. At Google since 2006, Richard works with engineering teams across the world.


Make the web faster with mod_pagespeed, now out of Beta


If your page is on the web, speed matters. For developers and webmasters, making your page faster shouldn’t be a hassle, which is why we introduced mod_pagespeed in 2010. Since then the development team has been working to improve the functionality, quality and performance of this open-source Apache module that automatically optimizes web pages and their resources. Now, after almost two years and eighteen releases, we are announcing that we are taking off the Beta label.
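
For a taste of what setup looks like, here's a minimal sketch of an Apache configuration enabling the module. The module path varies by installation and is an assumption here; the directive and filter names follow the mod_pagespeed documentation:

# Load and enable mod_pagespeed (adjust the path for your distribution).
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
ModPagespeed on
# Turn on a couple of optional rewriting filters.
ModPagespeedEnableFilters combine_css,extend_cache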

We’re committed to working with the open-source community to continue evolving mod_pagespeed, including more, better and smarter optimizations and support for other web servers. Over 120,000 sites are already using mod_pagespeed to improve the performance of their web pages using the latest techniques and trends in optimization. The product is used worldwide by individual sites, and is also offered by hosting providers, such as DreamHost, Go Daddy and content delivery networks like EdgeCast. With the move out of beta we hope that even more sites will soon benefit from the web performance improvements offered through mod_pagespeed.

mod_pagespeed is a key part of our goal to help make the web faster for everyone. Users prefer faster sites, and we have seen that faster pages lead to higher user engagement, conversions, and retention. In fact, page speed is one of the signals in search ranking and ad quality scores. Besides evangelizing for speed, we offer tools and technologies to help measure, quantify, and improve performance, such as Site Speed Reports in Google Analytics, PageSpeed Insights, and PageSpeed Optimization products. Both mod_pagespeed and PageSpeed Service are based on our open-source PageSpeed Optimization Libraries project, and are important ways in which we help websites take advantage of the latest performance best practices.



To learn more about mod_pagespeed and how to incorporate it in your site, watch our recent Google Developers Live session or visit the mod_pagespeed product page.

Using site speed in web search ranking

Webmaster Level: All

You may have heard that here at Google we're obsessed with speed, in our products and on the web. As part of that effort, today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.

Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we've seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don't just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites.

If you are a site owner, webmaster or a web author, here are some free tools that you can use to evaluate the speed of your site:
  • Page Speed, an open source Firefox/Firebug add-on that evaluates the performance of web pages and gives suggestions for improvement.
  • YSlow, a free tool from Yahoo! that suggests ways to improve website speed.
  • WebPagetest shows a waterfall view of your pages' load performance plus an optimization checklist.
  • In Webmaster Tools, Labs > Site Performance shows the speed of your website as experienced by users around the world. We've also blogged about site performance.
While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven't seen much change to your site rankings, then this site speed change possibly did not impact your site.

We encourage you to start looking at your site's speed (the tools above provide a great starting point) — not only to improve your ranking in search engines, but also to improve everyone's experience on the Internet.


Preparing your site for a traffic spike

Webmaster level: Intermediate

It’s a moment any site owner both looks forward to and dreads: a huge surge in traffic to your site (yay!) can often cause your site to crash (boo!). Maybe you’ll create a piece of viral content, or get Slashdotted, or maybe Larry Page will get a tattoo and your site about tech tattoos will suddenly be in vogue.

Many people go online immediately after a noteworthy event—a political debate, the death of a celebrity, or a natural disaster—to get news and information about that event. This can cause a rapid increase in traffic to websites that provide relevant information, and may even cause sites to crash at the moment they’re becoming most popular. While it’s not always possible to anticipate such events, you can prepare your site in a variety of ways so that you’ll be ready to handle a sudden surge in traffic if one should occur:
  • Prepare a lightweight version of your site.
    Consider maintaining a lightweight version of your website; you can then switch all of your traffic over to this lightweight version if you start to experience a spike in traffic. One good way to do this is to have a mobile version of your site, and to make the mobile site available to desktop/PC users during periods of high traffic. Another low-effort option is to just maintain a lightweight version of your homepage, since the homepage is often the most-requested page of a site as visitors start there and then navigate out to the specific area of the site that they’re interested in. If a particular article or picture on your site has gone viral, you could similarly create a lightweight version of just that page.
    A couple of tips for creating lightweight pages:
    • Exclude decorative elements like images or Flash wherever possible; use text instead of images in the site navigation and chrome, and put most of the content in HTML.
    • Use static HTML pages rather than dynamic ones; the latter place more load on your servers. You can also cache the static output of dynamic pages to reduce server load.
  • Take advantage of stable third-party services.
    Another alternative is to host a copy of your site on a third-party service that you know will be able to withstand a heavy stream of traffic. For example, you could create a copy of your site—or a pared-down version with a focus on information relevant to the spike—on a platform like Google Sites or Blogger; use services like Google Docs to host documents or forms; or use a content delivery network (CDN).
  • Use lightweight file formats.
    If you offer downloadable information, try to make the downloaded files as small as possible by using lightweight file formats. For example, offering the same data as a plain text file rather than a PDF can allow users to download the exact same content at a fraction of the filesize (thereby lightening the load on your servers). Also keep in mind that, if it’s not possible to use plain text files, PDFs generated from textual content are more lightweight than PDFs with images in them. Text-based PDFs are also easier for Google to understand and index fully.
  • Make tabular data available in CSV and XML formats.
    If you offer numerical or tabular data (data displayed in tables), we recommend also providing it in CSV and/or XML format. These filetypes are relatively lightweight and make it easy for external developers to use your data in external applications or services in cases where you want the data to reach as many people as possible, such as in the wake of a natural disaster. A short example follows this list.
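
For illustration, here is the same small table in both formats (the column names are made up):

date,region,requests
2013-01-01,north,120
2013-01-02,north,95

<records>
  <record date="2013-01-01" region="north" requests="120"/>
  <record date="2013-01-02" region="north" requests="95"/>
</records>
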
We’d love to hear your tips and tricks for weathering traffic spikes—come join us in our Webmaster Help Forum.


You and site performance, sitting in a tree...

Webmaster Level: Beginner to Intermediate

...k, i, s, s, i, n, g! Perhaps you heard our announcement that speed is a signal in rankings, but didn’t know where to start. We’d like to help foster a lasting relationship between you and a responsive experience for your users. Last week I filmed my updated presentation from "The Need For Speed: Google Says It Matters" which includes three first steps to understanding site performance. So grab headphones and some popcorn, then verify ownership of your website and download a plugin, and we’ll all be comfy with site performance in no time.



Just curious about the Q&A? No problem! Here you go:

Is it possible to check my server response time from different areas around the world?
Yes. WebPagetest.org can test performance from the United States (both East and West Coast—go West Coast! :), United Kingdom, China, and New Zealand.
What's a good response time to aim for?
First, if your competition is fast, they may provide a better user experience than your site for the same audience. In that case, you may want to make your site better, stronger, faster...

Otherwise, studies by Akamai claim 2 seconds as the threshold for ecommerce site "acceptability." Just as an FYI, at Google we aim for under a half-second.
Does progressive rendering help users?
Definitely! Progressive rendering is when a browser can display content incrementally as it becomes available, rather than waiting for all the content to display at once. This gives users faster visual feedback and helps them feel more in control. Bing experimented with progressive rendering by sending users their visual header (like the logo and searchbox) quickly, then the results/ads once they were available. Bing found a 0.7% increase in satisfaction with progressive rendering, and commented that this improvement was comparable to what they'd expect from rolling out a full new feature.

How can you implement progressive rendering techniques on your site? Put stylesheets at the top of the page. This allows a browser to start displaying content ASAP.
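
For example (file names are placeholders):

<html>
  <head>
    <!-- Stylesheets referenced in the <head> let the browser render
         content progressively as it arrives. -->
    <link rel="stylesheet" type="text/css" href="/css/styles.css">
  </head>
  <body>
    <p>This content can be displayed before the rest of the page arrives.</p>
  </body>
</html>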

The Page Speed plugin, videos, articles, and help forum are all found at code.google.com/speed/.


The Future of Web Performance at Google I/O: JavaScript


This post is one in a series that previews Google I/O, our biggest developer event in San Francisco, May 28-29. Over the next month, we'll be highlighting sessions and speakers to give Google Code Blog readers a better sense of what's in store for you at the event. - Ed.

In April I announced that I'm starting another book. The working title is High Performance Web Sites, Part 2. This book contains the next set of web performance best practices, which go beyond my first book and YSlow. Here are the rules I have so far:
  1. Split the initial payload
  2. Load scripts without blocking
  3. Don't scatter scripts
  4. Split dominant content domains
  5. Make static content cookie-free
  6. Reduce cookie weight
  7. Minify CSS
  8. Optimize images
  9. Use iframes sparingly
  10. To www or not to www
I'm most excited about the best practices for improving JavaScript performance (rules 1-3). Web sites that are serious about performance are making progress on the first set of rules, but there's still a lot of room for improving JavaScript performance. Across the ten top U.S. sites approximately 40% of the time to load the page is spent downloading and executing JavaScript, and only 26% of the JavaScript functionality downloaded is used before the onload event.

In my session at Google I/O I'll present the research behind rules 1-3, talk about how the ten top U.S. web sites perform, demonstrate Cuzillion, and give several takeaways that you can use to make your web site faster.

How we improved performance on Google Code


If you're a frequent visitor to code.google.com for product updates and reference materials for Google APIs you're working with, you might have noticed that the page loading time (or page rendering time, depending on how you see it) has dropped by varying degrees in the past several weeks.

As you'll see below, we've made several changes to help reduce user-perceived latency. This is not an exhaustive list of all improvements we've made recently, but these are the major ones we've made.

As Steve Souders emphasizes as the "Performance Golden Rule" in his book High Performance Web Sites, "only 10-20% of the end user response time is spent downloading the HTML document. The other 80-90% is spent downloading all the components in the page (p.5)".

We agree. That's why we focused our effort on reducing the number and size of downloads (HTTP requests) for the "components" throughout Google Code.
  • Combined and minimized JavaScript and CSS files used throughout the site
Downloading JavaScript and CSS files blocks rendering of the rest of the page. Thus, to reduce the number of HTTP requests made on the initial page load, we combined frequently-used JavaScript and CSS files into one file each. This technique brought 20 HTTP requests down to just 2. We also minimized the files by stripping out unnecessary whitespace and shortening function/variable names whenever possible.
  • Implemented CSS sprites for frequently-used images
There are 7 images prominently used throughout Google Code, including the Google Code logo, the googley balls at the bottom of every page, the plus and minus signs, as well as the subscribe icon inside each blog gadget.

Although browsers usually download several images in parallel, we concatenated these images into one image so only one HTTP request would be made. Of course, concatenating several images into one required us to make several changes in HTML/CSS. For example, instead of having:

<img src="/images/plus.gif" />


We had to change it to:

<div style="background-image:url(/images/sprites.gif); background-position:-28px -246px; width:9px; height:9px"></div>


where sprites.gif is the concatenated image, and background-position and width/height are carefully calculated.
  • Implemented lazy loading of Google AJAX APIs loader module (google.load)
We like to eat our own dogfood. Among other APIs, we use our very own AJAX Feed API on product homepages inside the blog gadgets and the AJAX Search API on the search page. These Google AJAX APIs require the Google loader module (google.load) to be loaded first before any of the specific AJAX APIs (i.e. AJAX Feed API, AJAX Search API, Maps API) can be initialized and used. Traditionally, the Google AJAX APIs loader module would be loaded by including the following <script> tag in the <head> section:
    <script type="text/javascript" src="http://www.google.com/jsapi"></script>
This works well in most cases, but when optimizing for the display of static content, this blocks the browser from rendering the rest of the page until it's finished loading that script, thus impacting the user-perceived latency. So instead of loading the Google AJAX APIs loader module upfront, we are now loading it lazily only on the pages where it's required. This is made possible as follows (please note that this is a stripped-down version of what we have on Google Code):

First, in the <head> section, we load the Google AJAX APIs loader module via DOM scripting only on the pages where it's required:

if (needToLoadGoogleAjaxApisLoaderModule) {
  // Load Google AJAX APIs loader module (google.load)
  var script = document.createElement('script');
  script.src = 'http://www.google.com/jsapi?callback=googleLoadCallback';
  script.type = 'text/javascript';
  document.getElementsByTagName('head')[0].appendChild(script);
}
It's important to add the 'callback' parameter in the src attribute, 'callback=googleLoadCallback'. This callback handler will then be called whenever the Google loader module is finished loading.

Then, in the Google loader callback handler (googleLoadCallback()), we initialize the AJAX Feed API and provide the function name that utilizes the AJAX Feed API (startUsingAjaxFeedAPI):
function googleLoadCallback() {
  // Initialize AJAX Feed API
  google.load('feeds', '1', {callback: startUsingAjaxFeedAPI});
}

function startUsingAjaxFeedAPI() {
  // Start using AJAX Feed API
  var feed = new google.feeds.Feed(someFeedUrl);
  ...
}

In effect, we're loading the AJAX Feed API on demand through the use of two consecutive callback handlers: first to load the Google AJAX APIs loader module (google.load), and then to initialize the AJAX Feed API before it's used. A similar technique can be used for the Maps API and the AJAX Search API.

By now you're probably wondering just how much of an impact these changes had on Google Code anyway. According to our latency measurement stats, the user-perceived latency on Google Code dropped quite a bit, anywhere between 30% and 70% depending on the page. This is a huge return for the relatively small investments we've made along the way, and we hope you'll find these techniques useful for your own web development as well.

Steve Souders: Life's Too Short, Write Fast Code (part 2)

By Steve Souders, Member of Technical Staff

I've been working on a follow-up book to High Performance Web Sites called Even Faster Web Sites. As I finish chapters, I talk about the findings at conferences and tech talks. The first three chapters are Split the Initial Payload, Load Scripts Without Blocking, and Don't Scatter Inline Scripts. You can hear about those best practices in my video from Google I/O.

This talk presents the next three chapters: Couple Asynchronous Scripts, Use Iframes Sparingly, and Flush the Document Early.

The adoption of JavaScript is growing, but the blocking behavior of external scripts is well known. That's why it's important to use one of the techniques to load scripts without blocking (see the Google I/O talk). But loading scripts asynchronously means that inlined code that uses symbols from the script must be coupled in some way. Without this coupling, undefined symbol errors occur when the inlined code is executed before the external script arrives.

There are five techniques for coupling asynchronous scripts: hardcoded callback, window onload, timer, script onload, and degrading script tags. All of the techniques work. Degrading script tags is the most elegant, but isn't well known. Script onload is the most versatile technique and is the one I recommend people use. In the talk, I then go into detail, including many code examples, on how to load scripts asynchronously and use these coupling techniques to speed up your web page.
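
Here's a minimal sketch of the script onload technique; menu.js and initMenu() are hypothetical, and the readystatechange branch covers Internet Explorer:

var script = document.createElement('script');
script.src = 'menu.js';  // hypothetical script that defines initMenu()
script.onload = function() {  // Firefox, Opera, Chrome, Safari
  initMenu();
};
script.onreadystatechange = function() {  // Internet Explorer
  if (script.readyState === 'loaded' || script.readyState === 'complete') {
    script.onreadystatechange = null;
    initMenu();  // inlined code coupled to the script's arrival
  }
};
document.getElementsByTagName('head')[0].appendChild(script);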

Iframes have a negative impact on web pages. They are the most expensive DOM element to create. They block the parent's onload event (although there's a workaround to this problem in Safari and Chrome). Also, the main page can block resources in the iframe. It's important to understand these interactions if you use iframes in your page.

Flushing the document early allows the browser to start rendering the page and downloading resources in the page, even before the entire HTML document has arrived. But getting flushing to work can feel like trying to get the stars to align. You need to understand PHP's output_buffering, HTTP/1.1's chunked encoding, Apache's DeflateBufferSize, the impact of proxies, minimum HTML size requirements in Safari and Chrome, and the need for domain sharding to avoid having the HTML document block other downloads.
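
The core of the technique in PHP is a single flush() call; here's a minimal sketch, assuming $header_html and $body_html are built by your page code:

<?php
// Send the <head> (stylesheets, scripts) as soon as it's ready...
echo $header_html;
flush();  // ...so the browser can start rendering and downloading resources
// ... slow work (database queries, etc.) happens here ...
echo $body_html;
?>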

If your company wants a better user experience, increased revenues, and reduced operating costs, the key is to create even faster web sites. For more information on these best practices, watch the video below and read the slides.



Check out other talks in this tech speaker series.

John Resig: Drop-in JavaScript Performance


Although Mozilla is right across the street, their JavaScript evangelist, John Resig, hails from Boston. When he comes to town, it's a great opportunity to learn about his current explorations in the world of JavaScript. I was fortunate to be able to host him for a Google tech talk last week. The video and slides are now available. In addition to his evangelism role at Mozilla, John is the creator of jQuery and Dromaeo, author of Pro JavaScript Techniques, and member of the Firebug Working Group. He's currently working on Secrets of the JavaScript Ninja, due out sometime this year.

In this talk, John starts off highlighting why performance will improve in the next generation of browsers, thanks to advances in JavaScript engines and new features such as process-per-tab and parallel script loading. He digs deeper into JavaScript performance, touching on shaping, tracing, just-in-time compilation, and the various benchmarks (SunSpider, Dromaeo, and the V8 benchmark). John plugs my UA Profiler, with its tests for simultaneous connections, parallel script loading, and link prefetching. He wraps up with a collection of many other advanced features in the areas of communication, DOM, styling, data, and measurements.



Wow, a lot of material to cover in one hour. An added benefit of having this talk given at Google is the questions from the audience. At one point, a member of the Google Chrome team goes into detail about how parsing works in V8. Many thanks to John for sharing his work and insights with all of us.
