
About those fake penalty notification emails (2013)

(This post was translated from our German webmaster blog, originally written by Stefanie.)

We are aware that a number of German webmasters have received fake penalty notification emails that allegedly came from Google Search Quality. These spam emails have created some confusion about their authenticity, since we send very similar email notifications, which you can read more about here. Therefore, we clearly want to state that these emails are not related to any of Google's efforts concerning webmaster notification.

Updated: Because these emails are easy to mistake for authentic ones from Google, we've temporarily discontinued sending them as we work on ways to provide more secure communication mechanisms. We hope this will reduce confusion.

In the original post, we had listed the ways to tell if the email you received was not from Google. However, as we've temporarily stopped sending emails about guidelines violations, you can safely assume that any such email you receive isn't from us. Note that the emails we sent did not include attachments. In addition, some of the emails mentioned 301 redirects as being the violation in question. Rest assured that 301 redirects are not a violation of our Webmaster Guidelines. Note that we do provide information about some violations in Webmaster Tools. If your site previously violated the guidelines and you've made changes to fix it, you can let us know by filing a reinclusion request.

This post has been updated to indicate that we've temporarily stopped sending emails to webmasters about guidelines violations to reduce confusion.

Website testing & Google search (2013)

Webmaster level: Advanced

We’ve gotten several questions recently about whether website testing—such as A/B or multivariate testing—affects a site’s performance in search results. We’re glad you’re asking, because we’re glad you’re testing! A/B and multivariate testing are great ways of making sure that what you’re offering really appeals to your users.

Before we dig into the implications for search, a brief primer:
Website testing is when you try out different versions of your website (or a part of your website), and collect data about how users react to each version. You use software to track which version causes users to do-what-you-want-them-to-do most often: which one results in the most purchases, or the most email signups, or whatever you’re testing for. After the test is finished you can update your website to use the “winner” of the test—the most effective content.

A/B testing is when you run a test by creating multiple versions of a page, each with its own URL. When users try to access the original URL, you redirect some of them to each of the variation URLs and then compare users’ behaviour to see which page is most effective.

Multivariate testing is when you use software to change different parts of your website on the fly. You can test changes to multiple parts of a page—say, the heading, a photo, and the ‘Add to Cart’ button—and the software will show variations of each of these sections to users in different combinations and then statistically analyze which variations are the most effective. Only one URL is involved; the variations are inserted dynamically on the page.
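To make the mechanics of this primer concrete, here is a minimal, self-contained Python sketch of assigning users to variants and tallying conversions. It is an illustration only; real tests use dedicated tools with proper statistics, and all names here (variant labels, conversion rates) are made up.

    # Toy experiment tracker: assign each user to a variant at random,
    # record visits and conversions, and compare conversion rates.
    import random
    from collections import defaultdict

    VARIANTS = ["original", "variant_b"]
    assignments = {}                # user_id -> variant (sticky)
    visits = defaultdict(int)       # variant -> number of visits
    conversions = defaultdict(int)  # variant -> number of conversions

    def assign(user_id):
        # Keep each user in the same variant across visits.
        variant = assignments.setdefault(user_id, random.choice(VARIANTS))
        visits[variant] += 1
        return variant

    def record_conversion(user_id):
        conversions[assignments[user_id]] += 1

    # Simulate some traffic with made-up conversion rates.
    for uid in range(1000):
        variant = assign(uid)
        rate = 0.12 if variant == "variant_b" else 0.10
        if random.random() < rate:
            record_conversion(uid)

    for variant in VARIANTS:
        share = conversions[variant] / max(visits[variant], 1)
        print(f"{variant}: {visits[variant]} visits, "
              f"{conversions[variant]} conversions ({share:.1%})")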

So how does this affect what Googlebot sees on your site? Will serving different content variants change how your site ranks? Below are some guidelines for running an effective test with minimal impact on your site’s search performance.
  • No cloaking.
    Cloaking—showing one set of content to humans, and a different set to Googlebot—is against our Webmaster Guidelines, whether you’re running a test or not. Make sure that you’re not deciding whether to serve the test, or which content variant to serve, based on user-agent. An example of this would be always serving the original content when you see the user-agent “Googlebot.” Remember that infringing our Guidelines can get your site demoted or removed from Google search results—probably not the desired outcome of your test.
  • Use rel=“canonical”.
    If you’re running an A/B test with multiple URLs, you can use the rel=“canonical” link attribute on all of your alternate URLs to indicate that the original URL is the preferred version (see the sketch after this list). We recommend using rel=“canonical” rather than a noindex meta tag because it more closely matches your intent in this situation. Let’s say you were testing variations of your homepage: you don’t want search engines to drop your homepage from the index, you just want them to understand that all the test URLs are close duplicates or variations of the original URL and should be grouped as such, with the original URL as the canonical. Using noindex rather than rel=“canonical” in such a situation can sometimes have unexpected effects (e.g., if for some reason we choose one of the variant URLs as the canonical, the “original” URL might also get dropped from the index, since it would be treated as a duplicate).
  • Use 302s, not 301s.
    If you’re running an A/B test that redirects users from the original URL to a variation URL, use a 302 (temporary) redirect, not a 301 (permanent) redirect. This tells search engines that this redirect is temporary—it will only be in place as long as you’re running the experiment—and that they should keep the original URL in their index rather than replacing it with the target of the redirect (the test page). JavaScript-based redirects are also fine.
  • Only run the experiment as long as necessary.
    The amount of time required for a reliable test will vary depending on factors like your conversion rates, and how much traffic your website gets; a good testing tool should tell you when you’ve gathered enough data to draw a reliable conclusion. Once you’ve concluded the test, you should update your site with the desired content variation(s) and remove all elements of the test as soon as possible, such as alternate URLs or testing scripts and markup. If we discover a site running an experiment for an unnecessarily long time, we may interpret this as an attempt to deceive search engines and take action accordingly. This is especially true if you’re serving one content variant to a large percentage of your users.
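As a concrete illustration of the rel=“canonical” and 302 guidance above, here is a minimal sketch assuming a Flask app; the /home-test URL, the example.com domain, and the 50/50 split are placeholders, not details from this post.

    # Hedged sketch of an A/B test that follows the guidelines above:
    # the original URL 302-redirects part of the traffic to a variation
    # URL, and the variation page declares the original as canonical.
    import random
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/")
    def home():
        # The split must not depend on the user-agent (no cloaking).
        if random.random() < 0.5:
            return redirect("/home-test", code=302)  # temporary, not 301
        return "<html><body>Original homepage</body></html>"

    @app.route("/home-test")
    def home_test():
        # The variation points back at the original URL as the canonical.
        return ('<html><head>'
                '<link rel="canonical" href="https://www.example.com/">'
                '</head><body>Variation homepage</body></html>')

    if __name__ == "__main__":
        app.run()

Once the test concludes, the redirect and the variation URL should be removed, per the last point above.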
The recommendations above should result in your tests having little or no impact on your site in search results. However, depending on what types of content you’re testing, it may not even matter much if Googlebot crawls or indexes some of your content variations while you’re testing. Small changes, such as the size, color, or placement of a button or image, or the text of your “call to action” (“Add to cart” vs. “Buy now!”), can have a surprising impact on users’ interactions with your webpage, but will often have little or no impact on that page’s search result snippet or ranking. In addition, if we crawl your site often enough to detect and index your experiment, we’ll probably index the eventual updates you make to your site fairly quickly after you’ve concluded the experiment.

To learn more about website testing, check out these articles on Content Experiments, our free testing tool in Google Analytics. You can also ask questions about website testing in the Analytics Help Forum, or about search impact in the Webmaster Help Forum.


A new, improved form for reporting webspam (2013)

Webmaster level: All


Everyone on the web knows how frustrating it is to perform a search and find websites gaming the search results. These websites can be considered webspam - sites that violate Google’s Webmaster Guidelines and try to trick Google into ranking them highly. Here at Google, we work hard to keep these sites out of your search results, but if you still see them, you can notify us by using our webspam report form. We’ve just rolled out a new, improved webspam report form, so it’s now easier than ever to help us maintain the quality of our search results. Let’s take a look at some of our new form’s features:


Option to report various search issues
There are many search results, such as sites with malware and phishing, that are not necessarily webspam but still degrade the search experience. We’ve noticed that our users sometimes report these other issues using our webspam report form, causing a delay between when a user reports the issue and when the appropriate team at Google handles it. The new form’s interstitial page allows you to report these other search issues directly to the correct teams so that they can address your concerns in a timely manner.


Simplified form with informative links
To improve the readability of the form, we’ve made the text more concise, and we’ve integrated helpful links into the form’s instructions. Now, the ability to look up our Webmaster Guidelines, get advice on writing actionable form comments, and block sites from your personalized search results is just one click away.


Thank you page with personalization options
Some of our most valuable information comes from our users, and we appreciate the webspam reports you submit to us. The thank you page explains what happens once we’ve received your webspam report. If you want to report more webspam, there’s a link back to the form page and instructions on how to report webspam more efficiently with the Chrome Webspam Report Extension. We also provide information on how you can immediately block the site you’ve reported from your personalized search results, for example, by managing blocked sites in your Google Account.


At Google, we strive to provide the highest quality, most relevant search results, so we take your webspam reports very seriously. We hope our new form makes the experience of reporting webspam as painless as possible (and if it doesn’t, feel free to let us know in the comments).



More details about our webmaster guidelines (2013)

At SMX Advanced on Monday, Matt Cutts talked about our webmaster guidelines. Later, during Q&A, someone asked about adding more detail to the guidelines: more explanation about violations and more actionable help on how to improve sites. You ask -- we deliver! On Tuesday, Matt told the SMX crowd that we'd updated the guidelines overnight to include exactly those things! We work fast around here. (OK, maybe we had been working on some of it already.)

So, what's new? Well, the guidelines themselves haven't changed. But the specific quality guidelines now link to expanded information to help you better understand how to spot and fix any issues. That section is below so you can click through to explore these new details.

Quality guidelines - specific guidelines

As Riona MacNamara recently posted in our discussion forum, we are working to expand our webmaster help content even further and want your input. If you have suggestions, please post them in either the thread or as a comment to this post. We would love to hear from you!

Linkowanie (2013)


(This post was originally written in Polish; the post's own English version follows below.)

A particularly popular method of search engine optimization among Polish webmasters is the exchange or purchase of links with high PageRank. In the past this was undoubtedly an approach that actually produced results. Unfortunately, when choosing links, users and their interests are not always taken into account. This leads to links between sites and pages that are thematically unrelated. Such links carry no informational value for visitors and are regarded as an unethical SEO method, much like hidden text. Google's Webmaster Guidelines address such practices unambiguously.

With Polish users in mind, Google recently improved its algorithms and methods for evaluating relevant links. The goal of these efforts is to deliver the best possible SERPs (search engine results pages).

So how should you link between websites without going beyond Google's guidelines?
When trying to raise your PageRank, and thereby achieve a better position in the SERPs, let the needs of the potential users visiting your site guide you, both in your choice of content and in your choice of links. Linking to and from thematically related pages is valued by Google and will undoubtedly have a positive effect on your position in the index. At the same time, Google is working to eliminate the influence of mass exchanges of thematically unrelated links and of purchased links. This also applies to automated link exchange systems.

So how do you earn valuable links?
The best way to obtain good links is unique, interesting content that gains popularity in the Internet community in a natural way, especially among people interested in the topic, for example blog authors. Naturally earned links last longer than purchased ones, because links given freely are rarely removed. Whatever your website's type or topic, be guided solely by the needs of your potential users. Every linking decision should be preceded by the question: will this be useful to the people visiting my site?


Linking

One popular way to optimize webpages for search engines, especially among Polish webmasters, is with link exchanges or buying high-PageRank links. Unfortunately, in the choice of link partners, some webmasters' priority has not always been on what is best for the user. This causes some people to link to totally unrelated pages or engage in link exchanges with spammy sites. This kind of linking does not provide additional value to the page’s visitors and is an SEO method that, like hiding text, can be considered spammy. Google’s webmaster guidelines refer clearly to methods of this type under "quality guidelines".

Caring about our Polish users, Google recently improved algorithms and methods of link validation for our Polish search results. We do this because we want to provide our users with the best SERPs (Search Engine Result Pages) possible.

How to link in order not to violate Google’s webmaster guidelines?
If you want to increase your PageRank and to improve your position in the SERPs, you should always be thinking about your visitors’ needs. This refers to content as much as to linking.

Linking to and from related sites is still very much appreciated by Google and it will have a positive impact on the position in the index. Simultaneously, Google will work to stop the impact of excessive off-topic link exchanging or bought links, including automated link exchange programs.

How to create relevant links?
The best way to gain relevant links is to create unique, relevant content that can quickly gain popularity in the Internet community, especially among those who are interested in the topic, such as blog publishers. Also, look for editorially given links based on merit, since naturally earned links tend to last longer and will pass the test of time. The best way to go, therefore, is to focus on your visitors’ needs, whether that concerns content or linking. Before making any decision, ask yourself: is this going to be beneficial for my page’s visitors?


The Impact of User Feedback, Part 1 (2013)


About a year ago, in response to user feedback, we created a paid links reporting form within Webmaster Tools. User feedback, through reporting paid links, webspam, or suggestions in our Webmaster Help Group, has been invaluable in ensuring that the quality of our index and our tools is as high as possible. Today, I'd like to highlight the impact that reporting paid links and webspam has had on our index. In a future post, I'll showcase how user feedback and concerns in the Webmaster Help Group have helped us improve our Help Center documentation and Webmaster Tools.

Reporting Paid Links

As mentioned in the post Information about buying and selling links that pass PageRank, Google reserves the right to take action on sites that buy or sell links that pass PageRank for the purpose of manipulating search engine rankings. Even though we work hard to discount these links through algorithmic detection, if you see a site that is buying or selling links that pass PageRank, please let us know. Over the last year, users have submitted thousands and thousands of paid link reports to Google, and each report can contain multiple websites that are suspected of selling links. These reports are actively reviewed, and the feedback is invaluable to improve our search algorithms. We also are willing to take manual action on a significant fraction of paid link reports as we continue to improve our algorithms. More importantly, the hard work of users who have already reported paid links has helped improve the quality of our index for millions. For more information on reporting paid links, check out this Help Center article.

Reporting Webspam

Google has also provided a form to report general webspam since November 2001. We appreciate users who alert us to potential abuses for the sake of the whole Internet community. Spam reports come in two flavors: an authenticated form that requires registration in Webmaster Tools, and an unauthenticated form. We receive hundreds of reports each day. Spam reports to the authenticated form are given more weight and are individually investigated more often. Spam reports to the unauthenticated form are assessed in terms of impact, and a large fraction of those are reviewed as well. As Udi Manber, VP of Engineering & Search Quality, mentioned in his recent blog post on our Official Google Blog, in 2007 more than 450 new improvements were made to our search algorithms. A number of those improvements were related to webspam. It's no overstatement to say that users who have taken the time to report spam have been essential to many of those algorithmic enhancements.

Going forward

As users' expectations of search increase daily, we know it's important to provide a high-quality index with relevant results. We're always happy to hear stories in our Webmaster Help Group from users who have reported spam and seen noticeable results. Now that you know how Google uses feedback to improve our search quality, you may want to tell us about webspam you've seen in our results. Please use our authenticated form to report paid links or other types of webspam. Thanks again for taking the time to help us improve.

Google Webmaster Guidelines updated (2013)

Webmaster level: All

Today we’re happy to announce an updated version of our Webmaster Quality Guidelines. Both our basic quality guidelines and many of our more specific articles (like those on link schemes or hidden text) have been reorganized and expanded to provide you with more information about how to create quality websites for both users and Google.

The main message of our quality guidelines hasn’t changed: Focus on the user. However, we’ve added more guidance and examples of behavior that you should avoid in order to keep your site in good standing with Google’s search results. We’ve also added a set of quality and technical guidelines for rich snippets, as structured markup is becoming increasingly popular.

We hope these updated guidelines will give you a better understanding of how to create and maintain Google-friendly websites.


Requesting reconsideration using Google Webmaster Tools (2013)


If your site does not appear in Google Search results, you might be understandably worried. Here, we've put together some information to help you determine when and how to submit a reconsideration request for your site.

You can follow along as Bergy (the webmaster of example.com in our video) tries to find out whether he needs to submit a reconsideration request for his Ancient Roman Politics blog. Of course, not all webmasters' problems can be traced back to Wysz (-:, but the simple steps outlined below can help you determine the right solution for your particular case.


Check for access issues

You may want to check if there are any access issues with your site - you can do this by logging in to your Webmaster Tools account. On the Overview page you'll be able to see when Googlebot last successfully crawled the home page of your site. Another way to do this is to check the cache date for your site's homepage. For more detailed information about how Googlebot crawls your site, you might want to check the crawl rate graphs (find them in Tools > Set crawl rate).

On the Overview page you can also check whether there are any crawling errors. For example, if your server was busy or unavailable when we tried to access your site, you would get a "URL unreachable" error message. Alternatively, there might be URLs in your site blocked by your robots.txt file. You can see this in "URLs restricted by robots.txt". If there are URLs listed there which you did not expect, you can go to Tools and select "Analyze robots.txt" - there you can make sure that your robots.txt file is properly formatted and only blocking the parts of your site which you don't want Google to crawl.
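If you want to double-check locally what your robots.txt actually blocks, Python's standard-library robotparser can run the same kind of test; this is a small illustration with placeholder URLs, not a Webmaster Tools feature.

    # Check which URLs a robots.txt file allows Googlebot to fetch.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")  # placeholder site
    parser.read()  # fetch and parse the live robots.txt

    for url in ("https://www.example.com/",
                "https://www.example.com/private/page.html"):
        allowed = parser.can_fetch("Googlebot", url)
        print(f"{url}: {'crawlable' if allowed else 'blocked'} for Googlebot")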

Other than the examples mentioned above, there are several more types of crawl errors - HTTP errors and timed-out URLs, to name a few. Even though we haven't highlighted them here, you will still see alerts for all of them on the Overview page in your Webmaster Tools account.

Check for messages

If Google has no problems accessing your site, check to see if there is a message waiting for you in the Message Center of your Webmaster Tools account. This is the place Google uses to communicate important information to you regarding your Webmaster Tools account and the sites you manage. If we have noticed there is something wrong with your site, we may send you a message there, detailing some issues which you need to fix to bring your site into compliance with the Webmaster Guidelines.

Read the Webmaster Guidelines

If you don't see a message in the Message Center, check to see if your site is or has at some point been in violation of the Webmaster Guidelines. You can find them, and much more, in our Help Center.

Fix your site

If your site is in violation of the Webmaster Guidelines and you think that this might have affected how your site is viewed by Google, this would be a good time to submit a reconsideration request. But before you do that, make changes to your site so that it falls within our guidelines.

Submit a reconsideration request

Now you can go ahead and submit a request for reconsideration. Log in to your Webmaster Tools account. Under Tools, click on "Request reconsideration" and follow the steps. Make sure to explain what you think was wrong with your site and what steps you have taken to fix it.

Once you've submitted your request, you'll see a message from us in the Message Center confirming that we've received it. We'll then review your site for compliance with the Webmaster Guidelines.

We hope this post has helped give you an idea when and how to submit a reconsideration request. If you're not sure why Google isn't including your site, a great place to look for help is our Webmaster Help Group. There you will find many knowledgeable and friendly webmasters and Googlers, who would be happy to look at your site and give suggestions on how you could fix things. You can find links to both the Help Center and the Webmaster Group at google.com/webmasters.

How Google defines IP delivery, geolocation, and cloaking (2013)


Many of you have asked for more information regarding webserving techniques (especially related to Googlebot), so we made a short glossary of some of the more unusual methods.
  • Geolocation: Serving targeted/different content to users based on their location. As a webmaster, you may be able to determine a user's location from preferences you've stored in their cookie, information pertaining to their login, or their IP address. For example, if your site is about baseball, you may use geolocation techniques to highlight the Yankees to your users in New York.

    The key is to treat Googlebot as you would a typical user from a similar location, IP range, etc. (i.e. don't treat Googlebot as if it came from its own separate country—that's cloaking).

  • IP delivery: Serving targeted/different content to users based on their IP address, often because the IP address provides geographic information. Because IP delivery can be viewed as a specific type of geolocation, similar rules apply. Googlebot should see the same content a typical user from the same IP address would see.

    (Author's warning: This 7.5-minute video may cause drowsiness. Even if you're really interested in IP delivery or multi-language sites, it's a bit uneventful.)

  • Cloaking: Serving different content to users than to Googlebot. This is a violation of our webmaster guidelines. If the file that Googlebot sees is not identical to the file that a typical user sees, then you're in a high-risk category. A program such as md5sum can compute a checksum (or diff can compare the two files directly) to verify that the file served to Googlebot is identical to the file served to users; a quick self-check along these lines appears after this list.

  • First click free: Implementing Google News' First click free policy for your content allows you to include your premium or subscription-based content in Google's websearch index without violating our quality guidelines. You allow all users who find your page using Google search to see the full text of the document, even if they have not registered or subscribed. The user's first click to your content area is free. However, you can block the user with a login or payment request when he clicks away from that page to another section of your site.

    If you're using First click free, the page displayed to users who visit from Google must be identical to the content that is shown to the Googlebot.
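As a rough self-check for the cloaking definition above, the sketch below fetches the same URL twice, once with a browser user-agent and once with Googlebot's, and compares MD5 checksums in the spirit of the md5sum check mentioned earlier. This is a minimal illustration, not Google tooling; the URL is a placeholder, and legitimately dynamic pages (timestamps, rotating ads) will differ between any two fetches, so a mismatch is a prompt to investigate rather than proof of cloaking.

    # Hypothetical self-check: compare what a browser sees with what
    # Googlebot's user-agent gets from the same URL.
    import hashlib
    import urllib.request

    URL = "https://www.example.com/"  # placeholder URL
    USER_AGENTS = {
        "browser": "Mozilla/5.0",
        "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)"),
    }

    def fetch_md5(url, user_agent):
        # Fetch the page with the given user-agent and hash the raw bytes.
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return hashlib.md5(resp.read()).hexdigest()

    digests = {name: fetch_md5(URL, ua) for name, ua in USER_AGENTS.items()}
    print(digests)
    if len(set(digests.values())) > 1:
        print("Responses differ by user-agent: worth a closer look.")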
Still have questions? We'll see you at the related thread in our Webmaster Help Group.

Information about buying and selling links that pass PageRank (2013)


Our goal is to provide users the best search experience by presenting equitable and accurate results. We enjoy working with webmasters, and an added benefit of our working together is that when you make better and more accessible content, the internet, as well as our index, improves. This in turn allows us to deliver more relevant search results to users.

If, however, a webmaster chooses to buy or sell links for the purpose of manipulating search engine rankings, we reserve the right to protect the quality of our index. Buying or selling links that pass PageRank violates our webmaster guidelines. Such links can hurt relevance by causing:

- Inaccuracies: False popularity and links that are not fundamentally based on merit, relevance, or authority
- Inequities: Unfair advantage in our organic search results to websites with the biggest pocketbooks

In order to stay within Google's quality guidelines, paid links should be disclosed, either by adding rel="nofollow" to the link or through other techniques such as routing the link through a redirect page that is blocked by robots.txt (a sketch follows below).
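As a rough sketch of that second technique, the hypothetical handler below routes a paid link through a path that robots.txt blocks, so crawlers that honor robots.txt never follow it and no PageRank flows; Flask, the /paid/ path, and the advertiser URL are all assumptions for illustration.

    # Hypothetical paid-link redirect behind a robots.txt'ed-out path.
    #
    # robots.txt on this site would contain:
    #   User-agent: *
    #   Disallow: /paid/
    #
    # and the page's HTML would carry the paid link as:
    #   <a href="/paid/sponsor1">Our sponsor</a>
    # (adding rel="nofollow" to a direct link is the simpler alternative).
    from flask import Flask, redirect

    app = Flask(__name__)

    SPONSORS = {"sponsor1": "https://advertiser.example.com/"}  # made up

    @app.route("/paid/<name>")
    def paid(name):
        # Human visitors get sent on to the advertiser; compliant crawlers
        # never request this URL because robots.txt disallows /paid/.
        target = SPONSORS.get(name)
        if target is None:
            return "Unknown sponsor", 404
        return redirect(target, code=302)

    if __name__ == "__main__":
        app.run()

Here's more information explaining our stance on buying and selling links that pass PageRank: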

February 2003: Google's official quality guidelines have advised "Don't participate in link schemes designed to increase your site's ranking or PageRank" for several years.

September 2005: I posted on my blog about text links and PageRank.

December 2005: Another post on my blog discussed this issue, and said

Many people who work on ranking at search engines think that selling links can lower the quality of links on the web. If you want to buy or sell a link purely for visitors or traffic and not for search engines, a simple method exists to do so (the nofollow attribute). Google’s stance on selling links is pretty clear and we’re pretty accurate at spotting them, both algorithmically and manually. Sites that sell links can lose their trust in search engines.

September 2006: In an interview with John Battelle, I noted that "Google does consider it a violation of our quality guidelines to sell links that affect search engines."

January 2007: I posted on my blog to remind people that "links in those paid-for posts should be made in a way that doesn’t affect search engines."

April 2007: We provided a mechanism for people to report paid links to Google.

June 2007: I addressed paid links in my keynote discussion during the Search Marketing Expo (SMX) conference in Seattle. Here's a video excerpt from the keynote discussion. It's less than a minute long, but highlights that Google is willing to use both algorithmic and manual detection of paid links that violate our quality guidelines, and that we are willing to take stronger action on such links in the future.

June 2007: A post on the official Google Webmaster Blog noted that "Buying or selling links to manipulate results and deceive search engines violates our guidelines." The post also introduced a new official form in Google's webmaster console so that people could report buying or selling of links.

June 2007: Google added more specific guidance to our official webmaster documentation about how to report buying or selling links and what sort of link schemes violate our quality guidelines.

August 2007: I described Google's official position on buying and selling links in a panel dedicated to paid links at the Search Engine Strategies (SES) conference in San Jose.

September 2007: In a post on my blog recapping the SES San Jose conference, I also made my presentation available to the general public (PowerPoint link).

October 2007: Google provided comments for a Forbes article titled "Google Purges the Payola".

October 2007: Google officially confirmed to Search Engine Land that we were taking stronger action on this issue, including decreasing the toolbar PageRank of sites selling links that pass PageRank.

October 2007: An email that I sent to Search Engine Journal also made it clear that Google was taking stronger action on buying/selling links that pass PageRank.

We appreciate the feedback that we've received on this issue. A few of the more prevalent questions:

Q: Is buying or selling links that pass PageRank a violation of Google's guidelines? Why?
A: Yes, it is, for the reasons we mentioned above. I also recently did a post on my personal blog that walks through an example of why search engines wouldn't want to count such links. On a serious medical subject (brain tumors), we highlighted people being paid to write about a brain tumor treatment when they hadn't been aware of the treatment before, and we saw several cases where people didn't do basic research (or even spellchecking!) before writing paid posts.

Q: Is this a Google-only issue?
A: No. All the major search engines have opposed buying and selling links that affect search engines. For the Forbes article Google Purges The Payola, Andy Greenberg asked other search engines about their policies, and the results were unanimous. From the story:

Search engines hate this kind of paid-for popularity. Google's Webmaster guidelines ban buying links just to pump search rankings. Other search engines including Ask, MSN, and Yahoo!, which mimic Google's link-based search rankings, also discourage buying and selling links.

Other engines have also commented about this individually, e.g. a search engine representative from Microsoft commented in a recent interview and said

The reality is that most paid links are a.) obviously not objective and b.) very often irrelevant. If you are asking about those then the answer is absolutely there is a risk. We will not tolerate bogus links that add little value to the user experience and are effectively trying to game the system.

Q: Is that why we've seen some sites that sell links receive lower PageRank in the Google toolbar?
A: Yes. If a site is selling links, that can affect our opinion about the value of that site or cause us to lose trust in that site.

Q: What recourse does a site owner have if their site was selling links that pass PageRank, and the site's PageRank in the Google toolbar was lowered?
A: The site owner can address the violations of the webmaster guidelines and submit a reconsideration request in Google's Webmaster Central console. Before doing a reconsideration request, please make sure that all sold links either do not pass PageRank or are removed.

Q: Is Google trying to tell webmasters how to run their own site?
A: No. We're giving advice to webmasters who want to do well in Google. As I said in this video from my keynote discussion in June 2007, webmasters are welcome to make their sites however they like, but Google in turn reserves the right to protect the quality and relevance of our index. To the best of our knowledge, all the major search engines have adopted similar positions.

Q: Is Google trying to crack down on other forms of advertisements used to drive traffic?
A: No, not at all. Our webmaster guidelines clearly state that you can use links as means to get targeted traffic. In fact, in the presentation I did in August 2007, I specifically called out several examples of non-Google advertising that are completely within our guidelines. We just want disclosure to search engines of paid links so that the paid links won't affect search engines.

Q: I'm aware of a site that appears to be buying/selling links. How can I get that information to Google?
A: Read our official blog post about how to report paid links from earlier in 2007. We've received thousands and thousands of reports in just a few months, but we welcome more reports. We appreciate the feedback, because it helps us take direct action as well as improve our existing algorithmic detection. We also use that data to train new algorithms for paid links that violate our quality guidelines.

Q: Can I get more information?
A: Sure. I wrote more answers about paid links earlier this year if you'd like to read them. And if you still have questions, you can join the discussion in our Webmaster Help Group.