
Over a month ago we introduced an algorithmic improvement designed to help people find more high-quality sites in search. Since then we've gotten a lot of positive responses about the change: searchers are finding better results, and many great publishers are getting more traffic.

Today we’ve rolled out this improvement globally to all English-language Google users, and we’ve also incorporated new user feedback signals to help people find better search results. In some high-confidence situations, we are beginning to incorporate data about the sites that users block into our algorithms. In addition, this change also goes deeper into the “long tail” of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before. The impact of these new signals is smaller in scope than the original change: about 2% of U.S. queries are affected by a reasonable amount, compared with almost 12% of U.S. queries for the original change.

Based on our testing, we’ve found the algorithm is very accurate at detecting site quality. If you believe your site is high-quality and has been impacted by this change, we encourage you to evaluate the different aspects of your site extensively. Google's quality guidelines provide helpful information about how to improve your site. As sites change, our algorithmic rankings will update to reflect that. In addition, you’re welcome to post in our Webmaster Help Forums. While we aren’t making any manual exceptions, we will consider this feedback as we continue to refine our algorithms.

We will continue testing and refining the change before expanding to additional languages, and we’ll be sure to post an update when we have more to share.

Webmaster Level: All
(A nearly duplicate version :) cross-posted on the Official Google Blog)

Earlier this year, we launched Google Services for Websites, a program that helps partners such as web hosters and access providers offer useful and powerful tools to their customers. By making services such as Webmaster Tools, Custom Search, Site Search and AdSense accessible from the hoster control panel, the program lets hosters easily enable these tools for their webmasters. The tools help website owners understand search performance, improve user retention and monetize their content; in other words, run more effective websites.

Since we launched the program, several hosting platforms have enhanced their offerings by integrating with the appropriate APIs. Webmasters can configure accounts, submit Sitemaps with Webmaster Tools, create Custom Search Boxes for their sites and monetize their content with AdSense, all with a few clicks in their hoster's control panel. More partners are in the process of implementing these enhancements.
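To make the Sitemap step concrete, here is a minimal sketch, assuming Python and a hypothetical customer domain (www.example.com), of what such an integration automates behind the scenes: building a Sitemap per the sitemaps.org protocol and pinging Google once the file is live. A real partner integration would go through the program's APIs rather than a one-off script like this.

```python
import urllib.parse
import urllib.request
from xml.sax.saxutils import escape

# Hypothetical pages on a customer's site; a real integration would pull
# these from the hoster's own database of published pages.
pages = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/blog/hello-world",
]

# Build a minimal Sitemap following the sitemaps.org protocol.
urls = "\n".join(f"  <url><loc>{escape(p)}</loc></url>" for p in pages)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{urls}\n"
    "</urlset>\n"
)
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

# Once the file is reachable on the customer's domain, notify Google of it
# via the documented Sitemap ping endpoint.
sitemap_url = "https://www.example.com/sitemap.xml"
ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
with urllib.request.urlopen(ping) as resp:
    print("ping status:", resp.status)
```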

We've just added new tools to the suite:
  • Web Elements allows your customers to enhance their websites with the ease of cut-and-paste. Webmasters can provide maps, real-time news, calendars, presentations, spreadsheets and YouTube videos on their sites. With the Conversation Element, websites can create more engagement with their communities. The Custom Search Element provides inline search over your own site (or others you specify) without requiring any code, along with various options for further customization.
  • Page Speed allows webmasters to measure the performance of their websites. Snappier websites help users find things faster, and the recommendations from these latency tools help hosters and webmasters optimize website speed (a simple latency-sampling sketch follows this list). These techniques can also help hosters reduce resource use and optimize network bandwidth.
  • The Tips for Hosters page offers guidance on creating a richer website hosting platform. Hosters can improve the convenience and accessibility of tools while saving platform costs and earning referral fees. Tips include using analytics tools such as Google Analytics to help webmasters understand their traffic, and linguistic tools such as Google Translate to help websites reach a broader audience.
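As a rough illustration of the kind of latency measurement involved, here is a first-pass timing sketch in Python against a hypothetical site. This is not the Page Speed tool itself, just the sort of quick check a hoster might run before digging into the full recommendations.

```python
import time
import urllib.request

def average_fetch_time(url: str, samples: int = 5) -> float:
    """Average wall-clock seconds to download `url` over several samples."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # include body transfer time, not just response headers
        total += time.perf_counter() - start
    return total / samples

# Hypothetical customer site.
avg = average_fetch_time("https://www.example.com/")
print(f"average fetch time: {avg * 1000:.0f} ms")
```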
If you're a hoster and would like to participate in the Google Services for Websites program, please apply here. You'll have to integrate with the service APIs before these services can be made available to your customers, so the earlier you start that process, the better.

And if your hosting service doesn't have Google Services for Websites yet, send them to this page. Once they become a partner, you can quickly configure the services you want at your hoster's control panel (without having to come to Google).

As always, we'd love to get feedback on how the program is working for you, and what improvements you'd like to see.


Google is constantly trying new ideas to improve our coverage of the web. We already do some pretty smart things like scanning JavaScript and Flash to discover links to new web pages, and today, we would like to talk about another new technology we've started experimenting with recently.

In the past few months we have been exploring some HTML forms to try to discover new web pages and URLs that we otherwise couldn't find and index for users who search on Google. Specifically, when we encounter a <FORM> element on a high-quality site, we might choose to do a small number of queries using the form. For text boxes, our computers automatically choose words from the site that has the form; for select menus, check boxes, and radio buttons on the form, we choose from among the values in the HTML. Having chosen the values for each input, we generate and then try to crawl URLs that correspond to a possible query a user may have made. If we ascertain that the web page resulting from our query is valid, interesting, and includes content not in our index, we may include it in our index much as we would include any other web page.
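The mechanics can be sketched roughly as follows, in Python. The FormParser class, the page_words heuristic for text boxes, and the limit of ten URLs are all illustrative assumptions, not Google's actual implementation:

```python
import itertools
import urllib.parse
from html.parser import HTMLParser

class FormParser(HTMLParser):
    """Collect a GET form's action and the candidate values of each input."""
    def __init__(self):
        super().__init__()
        self.action = None
        self.fields = {}      # field name -> list of candidate values
        self._select = None   # name of the <select> currently being parsed

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "form" and a.get("method", "get").lower() == "get":
            self.action = a.get("action", "")
        elif tag == "input" and a.get("type", "text") in ("text", "radio", "checkbox"):
            if a.get("name"):
                self.fields.setdefault(a["name"], []).append(a.get("value", ""))
        elif tag == "select":
            self._select = a.get("name")
        elif tag == "option" and self._select:
            self.fields.setdefault(self._select, []).append(a.get("value", ""))

    def handle_endtag(self, tag):
        if tag == "select":
            self._select = None

def candidate_urls(html, base_url, page_words, limit=10):
    """Yield a small number of GET URLs corresponding to plausible form queries."""
    parser = FormParser()
    parser.feed(html)
    if parser.action is None:  # no GET form found on the page
        return
    # Free-text boxes have no preset values, so substitute words from the page.
    values = {name: (vals if any(vals) else page_words)
              for name, vals in parser.fields.items()}
    names = list(values)
    for combo in itertools.islice(itertools.product(*(values[n] for n in names)), limit):
        query = urllib.parse.urlencode(dict(zip(names, combo)))
        yield urllib.parse.urljoin(base_url, parser.action) + "?" + query

# Example: a search form with one text box and one select menu.
html = """<form action="/search" method="get">
  <input type="text" name="q">
  <select name="category"><option value="books"><option value="music"></select>
</form>"""
for url in candidate_urls(html, "https://example.com/", ["django", "recipes"]):
    print(url)
```

Running the example prints URLs such as https://example.com/search?q=django&category=books, each of which a crawler could then fetch and evaluate for inclusion.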

Needless to say, this experiment follows good Internet citizenry practices. Only a small number of particularly useful sites receive this treatment, and our crawl agent, the ever-friendly Googlebot, always adheres to robots.txt, nofollow, and noindex directives. That means that if a search form is forbidden in robots.txt, we won't crawl any of the URLs that a form would generate. Similarly, we only retrieve GET forms and avoid forms that require any kind of user information. For example, we omit any forms that have a password input or that use terms commonly associated with personal information such as logins, userids, contacts, etc. We are also mindful of the impact we can have on web sites and limit ourselves to a very small number of fetches for a given site.
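Those safeguards are straightforward to express in code. Below is a minimal sketch, again in Python, with a purely illustrative list of sensitive terms (the post names logins, userids, and contacts, but the real list is not public):

```python
import urllib.parse
import urllib.robotparser

# Illustrative terms associated with personal information; Google's
# actual list is an assumption here.
SENSITIVE = {"login", "user", "userid", "password", "passwd", "email", "contact"}

def form_is_safe(method, input_types, field_names):
    """Only GET forms, with no password inputs and no personal-info fields."""
    if method.lower() != "get":
        return False
    if "password" in input_types:
        return False
    return not any(term in name.lower()
                   for name in field_names for term in SENSITIVE)

def allowed_by_robots(url, user_agent="Googlebot"):
    """Check the site's robots.txt before fetching a form-generated URL."""
    parts = urllib.parse.urlsplit(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()  # downloads and parses the site's robots.txt
    return rp.can_fetch(user_agent, url)

url = "https://example.com/search?q=books"
if form_is_safe("get", {"text"}, ["q"]) and allowed_by_robots(url):
    print("ok to fetch:", url)
```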

The web pages we discover in our enhanced crawl do not come at the expense of regular web pages that are already part of the crawl, so this change doesn't reduce PageRank for your other pages. As such it should only increase the exposure of your site in Google. This change also does not affect the crawling, ranking, or selection of other web pages in any significant way.

This experiment is part of Google's broader effort to increase its coverage of the web. In fact, HTML forms have long been thought to be the gateway to large volumes of data beyond the normal scope of search engines. The terms Deep Web, Hidden Web, or Invisible Web have been used collectively to refer to such content that has so far been invisible to search engine users. By crawling using HTML forms (and abiding by robots.txt), we are able to lead search engine users to documents that would otherwise not be easily found in search engines, and provide webmasters and users alike with a better and more comprehensive search experience.