
By Derek Slater, Policy Manager

More than 2 billion people around the world use the web to discover, work, share, and communicate. This week, Google Developers Live Presents will host a series on Internet regulation and the future of our web. The series airs Tuesday through Thursday at 3:30 pm PST (23:30 UTC), with technical, entrepreneurial, and policy experts weighing in on the economic and social impact of the Internet, as well as its future if we don't take action.


Visit the Google+ events to RSVP, add the episodes to your Google calendar, tune in live on GDL, and ask questions of our on-air guests. And, most importantly, raise your voice for a free and open web.

Tuesday: The State of Our Web | 3:30 pm PST | 23:30 UTC | Featuring M-Lab and the Transparency Report | Watch live | Add to calendar

How can you tell if an application is being throttled? What are the trends in governments seeking access to users' data? The minds behind M-Lab and the Transparency Report – two projects working to empower Internet users with data about the state of the Internet – join us in-studio.

Wednesday: Entrepreneurs on the #freeandopen web | 3:30 pm PST | 23:30 UTC | Featuring Google for Entrepreneurs and Engine Advocacy | Watch live | Add to calendar

Google for Entrepreneurs is helping startups around the world, and Engine Advocacy is the startup voice in government. Learn what they've picked up along the way about the culture of successful entrepreneurial communities, and about the policies on the table that may affect them.

Thursday: Internet Freedom and the ITU | 3:30 pm PST | 23:30 UTC | Featuring Access Now, Association for Progressive Communications, Centro de Technologia e Sociedade (Brazil), Fundacion Karisma (Colombia), Derechos Digitales (Chile) | Watch live | Add to calendar

This week, the world's governments are gathering in Dubai to discuss the future of the Internet. Some governments want to use this meeting to increase censorship and regulate the Internet. Hear from five leading advocacy groups from around the world about what’s at stake.

Connect with us at developers.google.com/live. Tune in to live programming, check out the latest in Google tools and technologies, and learn how to make great apps.


Derek Slater defends the open Internet on Google's public policy team. He supports the company's global advocacy efforts on innovation policy, and recently helped launch google.com/takeaction.

Posted by Scott Knaster, Editor

Today, as part of our efforts to make the web faster, we are announcing Google Public DNS, a new experimental public DNS resolver.

The DNS protocol is an important part of the web's infrastructure, serving as the Internet's "phone book". Every time you visit a website, your computer performs a DNS lookup. Complex pages often require multiple DNS lookups before they finish loading. As a result, the average Internet user performs hundreds of DNS lookups each day, which collectively can slow down his or her browsing experience.
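As a rough illustration (not part of the announcement itself), here is what those lookups amount to in Python; the hostnames are placeholders, and getaddrinfo simply asks whatever resolver the operating system is configured to use:

import socket

# One lookup per hostname referenced by a page.
for host in ("www.example.com", "static.example.com", "ads.example.com"):
    try:
        infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
        addresses = sorted({info[4][0] for info in infos})
        print(host, "->", addresses)
    except socket.gaierror as error:
        print(host, "-> lookup failed:", error)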

We believe that a faster DNS infrastructure could significantly improve the browsing experience for all web users. To enhance DNS speed while also improving the security and validity of results, Google Public DNS is trying a few different approaches that we are sharing with the broader web community through our documentation:
  • Speed: Resolver-side cache misses are one of the primary contributors to sluggish DNS responses. Clever caching techniques can help increase the speed of these responses. Google Public DNS implements prefetching: before the TTL on a record expires, we refresh the record continuously, asynchronously, and independently of user requests for a large number of popular domains. This allows Google Public DNS to serve many DNS requests in the round trip time it takes a packet to travel to our servers and back (see the prefetching sketch after this list).

  • Security: DNS is vulnerable to spoofing attacks that can poison the cache of a nameserver and route all of its users to a malicious website. Until new protocols like DNSSEC get widely adopted, resolvers need to take additional measures to keep their caches secure. Google Public DNS makes it more difficult for attackers to spoof valid responses by randomizing the case of query names and including additional data in its DNS messages (see the second sketch after this list).

  • Validity: Google Public DNS complies with the DNS standards and gives the user the exact response his or her computer expects without performing any blocking, filtering, or redirection that may hamper a user's browsing experience.
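Here is the prefetching sketch referenced in the Speed item above -- a minimal illustration of the idea, not Google's implementation. It assumes a hypothetical upstream_lookup(name) function that performs the real upstream query and returns (address, ttl_seconds); popular names are refreshed in the background shortly before their TTL expires, so user queries can usually be answered straight from the cache:

import threading
import time

cache = {}            # name -> (address, expiry timestamp)
cache_lock = threading.Lock()

def refresh_forever(name, upstream_lookup, margin=5.0):
    # Keep one popular name fresh: re-resolve it shortly before its TTL
    # runs out, independently of any user request.
    while True:
        address, ttl = upstream_lookup(name)
        with cache_lock:
            cache[name] = (address, time.time() + ttl)
        time.sleep(max(ttl - margin, 1.0))

def start_prefetching(popular_names, upstream_lookup):
    for name in popular_names:
        threading.Thread(target=refresh_forever,
                         args=(name, upstream_lookup), daemon=True).start()

def resolve(name, upstream_lookup):
    # User-facing path: answer from the prefetched cache whenever possible.
    with cache_lock:
        entry = cache.get(name)
    if entry and entry[1] > time.time():
        return entry[0]
    address, ttl = upstream_lookup(name)   # cache miss: pay the upstream round trip
    with cache_lock:
        cache[name] = (address, time.time() + ttl)
    return address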
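And the second sketch, a toy version of query-name case randomization (sometimes called the 0x20 trick), again only an illustration. DNS name matching is case-insensitive, but a legitimate response echoes the question exactly as sent, so a resolver can reject forged answers that fail to reproduce the random mixed case:

import random

def randomize_case(qname):
    # Flip each letter of the query name to a random case before sending it.
    return "".join(random.choice((ch.lower(), ch.upper())) for ch in qname)

def response_is_plausible(sent_name, echoed_name):
    # A blind spoofer must also guess the exact mixed-case spelling;
    # the comparison is deliberately case-sensitive.
    return sent_name == echoed_name

query = randomize_case("www.example.com")   # e.g. "wWw.ExAmPle.CoM"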
We hope that you will help us test these improvements by using the Google Public DNS service today, from wherever you are in the world. We plan to share what we learn from this experimental rollout of Google Public DNS with the broader web community and other DNS providers, to improve the browsing experience for Internet users globally.
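As a sketch rather than official setup instructions, one quick way to point a lookup at the service is the third-party dnspython package (version 2.x), querying Google Public DNS's resolver addresses 8.8.8.8 and 8.8.4.4 directly:

import dns.resolver   # third-party: pip install dnspython

resolver = dns.resolver.Resolver(configure=False)   # ignore the system resolver settings
resolver.nameservers = ["8.8.8.8", "8.8.4.4"]       # Google Public DNS

answer = resolver.resolve("www.google.com", "A")
for record in answer:
    print(record.address)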

To get more information on Google Public DNS, you can visit our site and read our documentation and logging policies. We also look forward to receiving your feedback in our discussion group.

Custom robots.txt for Blogger

Custom robots.txt is a way for you to instruct search engines that you don't want them to crawl certain pages of your blog ("crawl" means that crawlers, like Googlebot, go through your content and index it so that other people can find it when they search for it). For example, let's say there are parts of your blog that have information you would rather not promote, either for personal reasons or because it doesn't represent the general theme of your blog -- this is where you can spell out these restrictions.


However, keep in mind that other sites may have linked to the pages that you've decided to restrict. Further, Google may index your page if we discover it by following a link from someone else's site. To display it in search results, Google will need to show a title of some kind, and because we won't have access to any of your page content, we will rely on off-page content such as anchor text from other sites. (To truly block a URL from being indexed, you can use meta tags.)
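For reference, the meta tag mentioned above is the robots meta tag placed in a page's <head>; a page that serves it is kept out of the index even when other sites link to it:

<meta name="robots" content="noindex">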
To exclude certain content from being searched, go to Settings | Search Preferences and click Edit next to "Custom robots.txt." Enter the content which you would like web robots to ignore. For example:

User-agent: *
Disallow: /about
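As a quick local sanity check -- a sketch using Python's standard urllib.robotparser, not something Blogger runs for you, and with a placeholder blog URL -- you can confirm how crawlers that honor robots.txt will read these rules:

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /about
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /about is blocked for every user agent; other pages remain crawlable.
print(parser.can_fetch("*", "http://yourblog.blogspot.com/about"))                   # False
print(parser.can_fetch("*", "http://yourblog.blogspot.com/2013/01/some-post.html"))  # True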

You can also read more about robots.txt in this post on the Google Webmaster Central blog.

Warning! Use with caution. Incorrect use of this feature can result in your blog being ignored by search engines.