News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development

from web contents: Video Sitemaps 101: Making your videos searchable 2013

Webmaster Level: All

We know that some of you, or your clients or colleagues, may be new to online video publishing. To make it easier for everyone to understand video indexing and Video Sitemaps, we've created a video -- narrated by Nelson Lee, Video Search Product Manager -- that explains everything in basic terms:

[Embedded video: Video Sitemaps 101]

Also, last month we wrote about some best practices for getting video content indexed on Google. Today, to help beginners better understand the whys and hows of implementing a Video Sitemap, we added a starting page to the information on Video Sitemaps in the Webmaster Help Center. Please take a look and share your thoughts.


from web contents: Video Sitemaps: Understanding location tags 2013

Webmaster Level: All

If you want to add video information to a Sitemap or mRSS feed, you must specify the location of the video. This means you must include one of two tags: either video:player_loc or video:content_loc. In an mRSS feed, the equivalent tags are media:player and media:content, respectively. We need this information to verify that there is actually a live video on your landing page and to extract metadata and signals from the video bytes for ranking. If neither of these tags is included, we will not be able to verify the video, and your Sitemap/mRSS feed will not be crawled. To reduce confusion, here is some more detail about these elements.

Video Locations Defined

Player Location/URL: the player (e.g., .swf) URL with corresponding arguments that load and play the actual video.

Content Location/URL: the actual raw video bytes (e.g., .flv, .avi) containing the video content.

The Requirements

Either the player location (video:player_loc) or the content location (video:content_loc) is required. However, we strongly suggest you provide both, as they each serve distinct purposes: the player location is primarily used to help verify that a video exists on the page, and the content location helps us extract more signals and metadata to accurately rank your videos.

URL extensions at a glance:

Sitemap               mRSS                              Contents
<loc>                 <link>                            The play page URL
<video:player_loc>    <media:player> (url attribute)    The SWF URL
<video:content_loc>   <media:content> (url attribute)   The FLV or other raw video URL

NOTE: All URLs should be unique (every URL in your entire Video Sitemap and mRSS feed should be unique)
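
Putting this together, a single Sitemap entry that supplies both locations might look like the sketch below (the URLs are placeholders, and the full set of required child tags is described in the Help Center):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://example.com/videos/video_landing_page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>A short description of the video.</video:description>
      <!-- Content location: the raw video bytes -->
      <video:content_loc>http://example.com/video123.flv</video:content_loc>
      <!-- Player location: the player URL with arguments that load this video -->
      <video:player_loc>http://example.com/videoplayer.swf?video=123</video:player_loc>
    </video:video>
  </url>
</urlset>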

If you would like to better ensure that only Googlebot accesses your content, you can perform a reverse DNS lookup.
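
For example, here is a minimal Python sketch of that check, using the two-step reverse-then-forward lookup (the accepted suffixes reflect Googlebot's published googlebot.com and google.com hostnames):

import socket

def is_googlebot(ip):
    """Return True if the IP passes the reverse/forward DNS check for Googlebot."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup: IP -> hostname
    except socket.herror:
        return False
    if not host.endswith(('.googlebot.com', '.google.com')):
        return False
    try:
        # forward lookup: the hostname must resolve back to the same IP
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False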

For more information on Google Videos please visit our Help Center, and to post questions and search for answers check out our Help Forum.


from web contents: Joint support for the Sitemap Protocol 2013

We're thrilled to tell you that Yahoo! and Microsoft are joining us in supporting the Sitemap protocol.

As part of this development, we're moving the protocol to a new namespace, www.sitemaps.org, and raising the version number to 0.9. The sponsoring companies will continue to collaborate on the protocol and publish enhancements on the jointly-maintained site sitemaps.org.

If you've already submitted a Sitemap to Google using the previous namespace and version number, we'll continue to accept it. If you haven't submitted a Sitemap before, check out the documentation on www.sitemaps.org for information on creating one. You can submit your Sitemap file to Google using Google webmaster tools. See the documentation that Yahoo! and Microsoft provide for information about submitting to them.

If any website owners, tool writers, or webserver developers haven't gotten around to implementing Sitemaps yet, thinking this was just a crazy Google experiment, we hope this joint announcement shows that the industry is heading in this direction. The more Sitemaps eventually cover the entire web, the more we can revolutionize the way web crawlers interact with websites. In our view, the experiment is still underway.

from web contents: Video Tutorial: Google for Webmasters 2013

We're always looking for new ways to help educate our fellow webmasters. While you may already be familiar with Webmaster Tools, the Webmaster Help Discussion Groups, this blog, and our Help Center, we've added another tutorial to help you understand how Google works: a video of an upcoming presentation titled "Google for Webmasters." The video introduces how Google discovers, crawls, and indexes your site's pages, and how Google displays them in search results. It also touches lightly on challenges webmasters and search engines face, such as duplicate content and the effective indexing of Flash and AJAX content. Lastly, it covers the benefits of Webmaster Central and other useful Google products.


Take a look for yourself.

The presentation is split into five embedded video segments:
  • Discoverability
  • Accessibility - Crawling and Indexing
  • Ranking
  • Webmaster Central Overview
  • Other Resources

Google Presentations Version:
http://docs.google.com/Presentation?id=dc5x7mrn_245gf8kjwfx

Important links from this presentation, in the order they appear in the video:
Add your URL to Google
Help Center: Sitemaps
Sitemaps.org
Robots.txt
Meta tags
Best uses of Flash
Best uses of Ajax
Duplicate content
Google's Technology
Google's History
PigeonRank
Help Center: Link Schemes
Help Center: Cloaking
Webmaster Guidelines
Webmaster Central
Google Analytics
Google Website Optimizer
Google Trends
Google Reader
Google Alerts
More Google Products


Special thanks to Wysz, Chark, and Alissa for the voices.


from web contents: Sitemaps FAQs 2013


Last month, Trevor spoke on the Sitemaps: Oversold, Misused or On The Money? panel at Search Engine Strategies in Chicago. After receiving a lot of great questions at the conference in addition to all the feedback we receive in our Help Group, we've pulled together a FAQ:

Q: I submitted a Sitemap, but my URLs haven't been [crawled/indexed] yet. Isn't that what a Sitemap is for?
A: Submitting a Sitemap helps you make sure Google knows about the URLs on your site. It can be especially helpful if your content is not easily discoverable by our crawler (such as pages accessible only through a form). It is not, however, a guarantee that those URLs will be crawled or indexed. We use information from Sitemaps to augment our usual crawl and discovery processes. Learn more.

Q: If it doesn't get me automatically crawled and indexed, what does a Sitemap do?
A: Sitemaps give information to Google to help us better understand your site. This can include making sure we know about all your URLs, how often and when they're updated, and what their relative importance is. Also, if you submit your Sitemap via Webmaster Tools, we'll show you stats such as how many of your Sitemap's URLs are indexed. Learn more.

Q: Will a Sitemap help me rank better?
A: A Sitemap does not affect the actual ranking of your pages. However, if it helps get more of your site crawled (by notifying us of URLs we didn't previously know about, and/or by helping us prioritize the URLs on your site), that can lead to increased presence and visibility of your site in our index. Learn more.

Q: If I set all of my pages to have priority 1.0, will that make them rank higher (or get crawled faster) than someone else's pages that have priority 0.8?
A: No. As stated in our Help Center, "priority only indicates the importance of a particular URL relative to other URLs on your site, and doesn't impact the ranking of your pages in search results." Indicating that all of your pages have the same priority is the same as not providing any priority information at all.

Q: Is there any point in submitting a Sitemap if all the metadata (<changefreq>, <priority>, etc.) is the same for each URL, or if I'm not sure it's accurate?
A: If the value of a particular tag is the same for 100% of the URLs in your Sitemap, you don't need to include that tag in your Sitemap. Including it won't hurt you, but it's essentially the same as not submitting any information, since it doesn't help distinguish between your URLs. If you're not sure whether your metadata is accurate (for example, you don't know when a particular URL was last modified), it's better to omit that tag for that particular URL than to just make up a value which may be inaccurate.
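
For example (hypothetical URLs), you might include <lastmod> only for the page whose modification date you actually track:

  <url>
    <loc>http://example.com/docs/guide.html</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
  <url>
    <!-- modification date unknown: omit <lastmod> rather than guess -->
    <loc>http://example.com/docs/old-notes.html</loc>
  </url>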

Q: I've heard about people who submitted a Sitemap and got penalized shortly afterward. Can a Sitemap hurt you?
A: Only if it falls on you from a great height. (Seriously, though: if it ever happened that someone was penalized after submitting a Sitemap, it would have been purely coincidental. Google does not penalize you for submitting a Sitemap.)

Q: Where can I put my Sitemap? Does it have to be at the root of my site?
A: We recently enabled Sitemap cross-submissions, which means that you can put your Sitemap just about anywhere as long as you have the following sites verified in your Webmaster Tools account:
  • the site on which the Sitemap is located
  • the site(s) whose URLs are referenced in the Sitemap
Note that cross-submissions may not work for search engines other than Google. Learn more about Sitemap cross-submissions.

Q: Can I just submit the site map that my webmaster made of my site? I don't get this whole XML thing.
A: There's a difference between a (usually HTML) site map built to help humans navigate around your site, and an XML Sitemap built for search engines. Both of them are useful, and it's great to have both. A site map on your domain can also help search engines find your content (since crawlers can follow the links on the page). However, if you submit an HTML site map in place of a Sitemap, Webmaster Tools will report an error because an HTML page isn't one of our recognized Sitemap formats. Also, if you create an XML Sitemap, you'll be able to give us more information than you can with an HTML site map (which is just a collection of links). Learn more about supported Sitemap formats.

Q: Which Sitemap format is the best?
A: We recommend the XML Sitemap protocol as defined by sitemaps.org. XML Sitemaps have the advantage of being upgradeable: you can start simple if you want (by just listing your URLs), but—unlike a text file Sitemap—you can easily upgrade an XML Sitemap later on to include more metadata. XML Sitemaps are also more comprehensive than an Atom or RSS feed submitted as a Sitemap, since feeds usually only list your most recent URLs (rather than all the URLs you want search engines to know about).
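
To illustrate the difference (with hypothetical URLs): a text file Sitemap is nothing more than one URL per line, while the equivalent minimal XML Sitemap starts almost as simply but leaves room for metadata later:

http://example.com/
http://example.com/about.html

The same URLs as a minimal XML Sitemap:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about.html</loc></url>
</urlset>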

Q: If I have multiple URLs that point to the same content, can I use my Sitemap to indicate my preferred URL for that content?
A: Yes. While we can't guarantee that our algorithms will display that particular URL in search results, it's still helpful for you to indicate your preference by including that URL in your Sitemap. We take this into consideration, along with other signals, when deciding which URL to display in search results. Learn more about duplicate content.

Q: Does the placement of a URL within a Sitemap file matter? Will the URLs at the beginning of the file get better treatment than the URLs near the end?
A: No, and no.

Q: If my site has multiple sections (e.g. a blog, a forum, and a photo gallery), should I submit one Sitemap for the site, or multiple Sitemaps (one for each section)?
A: You may submit as few or as many Sitemaps as you like (up to these limits). Organize them in whatever way you find easiest to maintain. If you create multiple Sitemaps, you can use a Sitemap Index file to list them all. Learn more.

If your question isn't covered here, you can find even more questions and answers in our Sitemaps Help Group.

from web contents: New warnings feedback 2013


Given helpful suggestions from our discussion group, we've improved feedback for Sitemaps in Webmaster Tools. Now, minor problems in a Sitemap will be reported as "warnings," and will appear instead of, or in addition to, more serious "errors." (Previously, all problems were listed as errors.) Warnings allow us to provide feedback on portions of your Sitemap that may be confusing or inaccurate, while saving the real "error" alarm for problems that make your Sitemap completely unreadable. We hope the additional information makes it even easier to share your Sitemaps with Google.

The new set of warnings includes many problems that we had previously classified as errors, such as an incorrect namespace or an invalid date. We also crawl a sample of the URLs listed in your Sitemap and report warnings if Googlebot runs into any trouble with them. These warnings might suggest a widespread problem with your site that warrants further investigation, such as a stale Sitemap or a misconfigured robots.txt file.
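
For instance, here is a hypothetical entry that would draw both warnings, followed by a corrected version using the sitemaps.org namespace and the W3C Datetime format:

<!-- triggers "incorrect namespace" and "invalid date" warnings -->
<urlset xmlns="http://www.example.com/wrong-namespace">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>19/06/2013</lastmod>
  </url>
</urlset>

<!-- corrected -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-06-19</lastmod>
  </url>
</urlset>
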
Please let us know how you like this new feedback. Tell us what you think via the comments below or in the discussion group. We also appreciate suggestions for additional warnings that you would find useful.

from web contents: New: Content analysis and Sitemap details, plus more languages 2013


We're always striving to help webmasters build outstanding websites, and in our latest release we have two new features: Content analysis and Sitemap details. We hope these features help you to build a site you could compare to a fine wine -- getting better and better over time.

Content analysis

To help you improve the quality of your site, our new content analysis feature should be a helpful addition to the crawl error diagnostics already provided in Webmaster Tools. Content analysis contains feedback about issues that may impact the user experience or that may make it difficult for Google to crawl and index pages on your site. By reviewing the areas we've highlighted, you can help eliminate potential issues that could affect your site's ability to be crawled and indexed. This results in better indexing of your site by Google and other search engines.

The Content analysis summary page within the Diagnostics section of Webmaster Tools features three main categories. Click on a particular issue type for more details:

  • Title tag issues
  • Meta description issues
  • Non-indexable content issues

[Screenshot: content analysis usability section]

Selecting "Duplicate title tags" displays a list of repeated page titles along with a count of how many pages contain that title. We currently present up to thirty duplicated page titles on the details page. If the duplicate title issues shown are corrected, we'll update the list to reflect any other pages that share duplicate titles the next time your website is crawled.

Also, in the Title tag issues category, we show "Long title tags" and "Short title tags." For these issue types we identify title tags that are too short (for example, "IT" generally isn't a good title tag) or too long (a title tag was never intended to hold an epic novel). A similar algorithm identifies potentially problematic meta description tags. While these pointers won't directly help you rank better (pages with a particular <title> length aren't moved to the top of the search results), they may help your site display better titles and snippets in search results, which can increase visitor traffic.

In the "Non-indexable content issues," we give you a heads-up of areas that aren't as friendly to our more text-based crawler. And be sure to check out our posts on Flash and images to learn how to make these items more search-engine friendly.


[Screenshot: content analysis crawlability section]


Sitemap details page

If you've submitted a Sitemap, you'll be happy to see the additional information in Webmaster Tools revealing how your Sitemap was processed. You can find this information on the newly available Sitemap Details page, which (along with information that was previously provided for each of your Sitemaps) shows you the number of pages from your Sitemap that were indexed. Keep in mind that the number of pages indexed from your Sitemap may not be 100% accurate because the indexed number is updated periodically, but it's more accurate than running a "site:example.com" query on Google.

The new Sitemap Details page also lists any errors or warnings that were encountered when specific pages from your Sitemap were crawled. So the time you might have previously spent crafting custom Google queries to determine how many pages from your Sitemap were indexed can now be spent on improving your site. If your site is already the crème de la crème, you might prefer to spend the extra free time mastering your ice-carving skills or blending the perfect eggnog.

Here's a view of the new Sitemap details page:

[Screenshot: Sitemap details page]

Sitemaps are an excellent way to tell Google about your site's most important pages, especially if you have new or updated content that we may not know about. If you haven't yet submitted a Sitemap or have questions about the process, visit our Webmaster Help Center to learn more.

Webmaster Tools now available in Czech & Hungarian

We love expanding our products to help more people in their language of choice. We recently expanded Webmaster Tools to support Czech and Hungarian, in addition to the 20 other languages we already support. And we won't stop here: if your language of choice isn't currently supported, stay tuned -- there are more supported languages to come.

We always love to hear what you think. Please visit our Webmaster Help Group to share comments or ask questions.

from web contents: On-Demand Sitemaps for Custom Search 2013

Since we launched enhanced indexing with the Custom Search platform earlier this year, webmasters who submit Sitemaps to Webmaster Tools get special treatment: Custom Search recognizes the submitted Sitemaps and indexes URLs from these Sitemaps into a separate index for higher quality Custom Search results. We analyze your Custom Search Engines (CSEs), pick up the appropriate Sitemaps, and figure out which URLs are relevant for your engines for enhanced indexing. You get the dual benefit of better discovery for Google.com and more comprehensive coverage in your own CSEs.

Today, we're taking another step towards improving your experience with Google webmaster services with the launch of On-Demand Indexing in Custom Search. With On-Demand Indexing, you can now tell us about the pages on your websites that are new, or that are important and have changed, and Custom Search will instantly schedule them for crawl, and index and serve them in your CSEs usually within 24 hours, often much faster.

How do you tell us about these URLs? You guessed it... provide a Sitemap to Webmaster Tools, like you always do, and tell Custom Search about it. Just go to the CSE control panel, click on the Indexing tab, select your On-Demand Sitemap, and hit the "Index Now" button. You can tell us which of these URLs are most important via the priority and lastmod attributes that you provide in your Sitemap. Each CSE has a number of pages allocated within the On-Demand Index, and these attributes tell us which pages matter most for indexing. If you need a greater allocation in the On-Demand Index, as well as more customization controls, Google Site Search provides a range of options.


Some important points to remember:
  1. You only need to submit your Sitemaps once in Webmaster Tools. Custom Search will automatically list the Sitemaps submitted via Webmaster Tools and you can decide which Sitemap to select for On-Demand Indexing.
  2. Your Sitemap needs to be for a website verified in Webmaster Tools, so that we can verify ownership of the right URLs.
  3. In order for us to index these additional pages, our crawlers must be able to crawl them. You can use "Webmaster Tools > Crawl Errors > URLs restricted by robots.txt" or check your robots.txt file to ensure that you're not blocking us from crawling these pages.
  4. Submitting pages for On-Demand Indexing will not make them appear any faster in the main Google index, or impact ranking on Google.com.
We hope you'll use this feature to inform us regularly of the most important changes on your sites, so we can respond quickly and get those pages indexed in your CSE. As always, we're listening for your feedback on Custom Search.


from web contents: Duplicate content summit at SMX Advanced 2013

Last week, I participated in the duplicate content summit at SMX Advanced. I couldn't resist the opportunity to show how Buffy is applicable to the everyday search marketing world, but mostly I was there to get input from you on the duplicate content issues you face and to brainstorm how search engines can help.

A few months ago, Adam wrote a great post on dealing with duplicate content. The most important things to know about duplicate content are:
  • Google wants to serve up unique results and does a great job of picking a version of your content to show if your site includes duplication. If you don't want to worry about sorting through duplication on your site, you can let us worry about it instead.
  • Duplicate content doesn't cause your site to be penalized. If duplicate pages are detected, one version will be returned in the search results to ensure variety for searchers.
  • Duplicate content doesn't cause your site to be placed in the supplemental index. Duplication may indirectly influence this, however, if links to your pages are split among the various versions, causing lower per-page PageRank.
At the summit at SMX Advanced, we asked what duplicate content issues were most worrisome. Those in the audience were concerned about scraper sites, syndication, and internal duplication. We discussed lots of potential solutions to these issues and we'll definitely consider these options along with others as we continue to evolve our toolset. Here's the list of some of the potential solutions we discussed so that those of you who couldn't attend can get in on the conversation.

Specifying the preferred version of a URL in the site's Sitemap file
One thing we discussed was the possibility of specifying the preferred version of a URL in a Sitemap file, with the suggestion that if we encountered multiple URLs that point to the same content, we could consolidate links to that page and could index the preferred version.

Providing a method for indicating parameters that should be stripped from a URL during indexing
We discussed providing this either in an interface such as Webmaster Tools or in the site's robots.txt file. For instance, if a URL contains session IDs, the webmaster could indicate the variable for the session ID, which would help search engines index the clean version of the URL and consolidate links to it. The audience leaned towards an addition to robots.txt for this.

Providing a way to authenticate ownership of content
This would provide search engines with extra information to help ensure we index the original version of an article, rather than a scraped or syndicated version. Note that we do a pretty good job of this now, and not many people in the audience mentioned this as a primary issue. However, the audience was interested in a way of authenticating content as an extra protection. Some suggested using the page with the earliest date, but creation dates aren't always reliable. Someone also suggested allowing site owners to register content, although that could raise issues as well, as non-savvy site owners wouldn't know to register content and someone else could take the content and register it instead. We currently rely on a number of factors such as the site's authority and the number of links to the page. If you syndicate content, we suggest that you ask the sites that are using your content to block their version with a robots.txt file as part of the syndication arrangement, to help ensure your version is served in results.

Making a duplicate content report available for site owners
There was great support for the idea of a duplicate content report that would list pages within a site that search engines see as duplicate, as well as pages that are seen as duplicates of pages on other sites. In addition, we discussed the possibility of adding an alert system to this report so site owners could be notified via email or RSS of new duplication issues (particularly external duplication).

Working with blogging software and content management systems to address duplicate content issues
Some duplicate content issues within a site are due to how the software powering the site structures URLs. For instance, a blog may have the same content on the home page, a permalink page, a category page, and an archive page. We are definitely open to talking with software makers about the best way to provide easy solutions for content creators.

In addition to discussing potential solutions to duplicate content issues, the audience had a few questions.

Q: If I nofollow a substantial number of my internal links to reduce duplicate content issues, will this raise a red flag with the search engines?
The number of nofollow links on a site won't raise any red flags, but that is probably not the best method of blocking the search engines from crawling duplicate pages, as other sites may link to those pages. A better method may be to block pages you don't want crawled with a robots.txt file.
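
For example, if the duplicate pages share a common path (a hypothetical layout), a single robots.txt rule keeps all compliant crawlers out of that section:

User-agent: *
Disallow: /print/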

Q: Are the search engines continuing the Sitemaps alliance?
We launched sitemaps.org in November of last year and have continued to meet regularly since then. In April, we added the ability for you to let us know about your Sitemap in your robots.txt file. We plan to continue to work together on initiatives such as this to make the lives of webmasters easier.

Q: Many pages on my site primarily consist of graphs. Although the graphs are different on each page, how can I ensure that search engines don't see these pages as duplicate since they don't read images?
To ensure that search engines see these pages as unique, include unique text on each page (for instance, a different title, caption, and description for each graph) and include unique alt text for each image. (For instance, rather than alt="graph", use something like alt="graph that shows Willow's evil trending over time".)

Q: I've syndicated my content to many affiliates and now some of those sites are ranking for this content rather than my site. What can I do?
If you've freely distributed your content, you may need to enhance and expand the content on your site to make it unique.

Q: As a searcher, I want to see duplicates in search results. Can you add this as an option?
We've found that most searchers prefer not to have duplicate results. The audience member commented that she may not want information from one particular site and would like other choices, but in that case the other sites will likely not have identical information and will therefore still show up in the results. Bear in mind that you can add the "&filter=0" parameter to the end of a Google web search URL to see additional results which might be similar.

I've brought all the issues and potential solutions that we discussed at the summit back to my team and others within Google, and we'll continue to work on providing the best search results and expanding our partnership with you, the webmaster. If you have additional thoughts, we'd love to hear about them!

from web contents: A new Google Sitemap Generator for your website 2013

It's been well over three years since we initially announced the Python Sitemap generator in June 2005. In this time, we've seen lots of people create great third-party Sitemap generators to help webmasters create better Sitemap files. While most Sitemap generators either crawl websites or list the files on a server, we have created a different kind of Sitemap generator that uses several ways to find URLs on your website and then allows you to automatically create and maintain different kinds of Sitemap files.

[Screenshot: the Google Sitemap Generator admin console]

About Google Sitemap Generator


Our new open-source Google Sitemap Generator finds new and modified URLs based on your webserver's traffic, its log files, or the files found on the server. By combining these methods, Google Sitemap Generator can be very fast in finding these URLs and calculating relevant metadata, thereby making your Sitemap files as effective as possible. Once Google Sitemap Generator has collected the URLs, it can automatically create and maintain the appropriate Sitemap files for you.

In addition, Google Sitemap Generator can send a ping to Google Blog Search for all of your new or modified URLs. You can optionally include the URLs of the Sitemap files in your robots.txt file as well as "ping" the other search engines that support the sitemaps.org standard.

Sending the URLs to the right Sitemap files is simple thanks to the web-based administration console. This console gives you access to various features that make administration a piece of cake while maintaining a high level of security by default.

Getting started


Google Sitemap Generator is a server plug-in that can be installed on both Linux/Apache and Microsoft IIS Windows-based servers. As with other server-side plug-ins, you will need to have administrative access to the server to install it. You can find detailed information for the installation in the Google Sitemap Generator documentation.

We're excited to release Google Sitemap Generator with the source code and hope that this will encourage more web hosts to include this or similar tools in their hosting packages!

Do you have any questions? Feel free to drop by our Help Group for Google Sitemap Generator or ask general Sitemaps question in our Webmaster Help Forum.


from web contents: Google, duplicate content caused by URL parameters, and you 2013



How can URL parameters, like session IDs or tracking IDs, cause duplicate content?
When user and/or tracking information is stored through URL parameters, duplicate content can arise because the same page is accessible through numerous URLs. It's what Adam Lasnik referred to in "Deftly Dealing with Duplicate Content" as "store items shown (and -- worse yet -- linked) via multiple distinct URLs." In the example below, URL parameters create three URLs which access the same product page.


[Diagram: the same product page reached through three URLs with different parameters]

Why should you care?
When search engines crawl identical content through varied URLs, there may be several negative effects:

1. Having multiple URLs can dilute link popularity. For example, in the diagram above, rather than 50 links to your intended display URL, the 50 links may be divided three ways among the three distinct URLs.

2. Search results may display user-unfriendly URLs (long URLs with tracking IDs, session IDs), which:
  • decreases the chance of a user selecting the listing
  • offsets branding efforts


How we help users and webmasters with duplicate content
We've designed algorithms to help prevent duplicate content from negatively affecting webmasters and the user experience.

1. When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.

2. We select what we think is the "best" URL to represent the cluster in search results.

3. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.

Consolidating properties from duplicates into one representative URL often provides users with more accurate search results.


If you find you have duplicate content as mentioned above, can you help search engines understand your site?
First, no worries: there are many sites on the web that use URL parameters, and for valid reasons. But yes, you can help reduce potential problems for search engines by:

1. Removing unnecessary URL parameters -- keep the URL as clean as possible.

2. Submitting a Sitemap with the canonical (i.e. representative) version of each URL. While we can't guarantee that our algorithms will display the Sitemap's URL in search results, it's helpful to indicate the canonical preference.
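
For example, a Sitemap expressing that preference would list only the clean URL (reusing the hypothetical product URL from the log sample later in this post) and omit the parameterized variants:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- only the canonical URL; the affiliateid/trackingid variants are omitted -->
  <url>
    <loc>http://example.com/product.php?item=swedish-fish</loc>
  </url>
</urlset>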


How can you design your site to reduce duplicate content?
Because of the way Google handles duplicate content, webmasters need not be overly concerned with the loss of link popularity or loss of PageRank due to duplication. However, to reduce duplicate content more broadly, we suggest:

1. When tracking visitor information, use 301 redirects to redirect URLs with parameters such as affiliateID, trackingID, etc. to the canonical version.

2. Use a cookie to set the affiliateID and trackingID values.
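
To make this concrete, here is a minimal sketch of both suggestions combined, using Apache mod_rewrite (the cookie domain is a placeholder, and this is one possible implementation rather than the method described in the original post):

# .htaccess sketch: 301-redirect parameterized URLs to the canonical
# version, preserving the affiliate ID in a cookie instead of the URL.
RewriteEngine On
# Capture the affiliateid value and whatever surrounds it in the query string
RewriteCond %{QUERY_STRING} ^(.*?)&?affiliateid=([^&]+)(.*)$ [NC]
# Redirect to the same path without that parameter; CO= sets the cookie
RewriteRule ^(.*)$ /$1?%1%3 [R=301,L,CO=affiliateid:%2:.example.com]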

If you follow this guideline, your webserver logs could appear as:

127.0.0.1 - - [19/Jun/2007:14:40:45 -0700] "GET /product.php?category=gummy-candy&item=swedish-fish&affiliateid=ABCD HTTP/1.1" 301 -

127.0.0.1 - - [19/Jun/2007:14:40:45 -0700] "GET /product.php?item=swedish-fish HTTP/1.1" 200 74

And the session file storing the raw cookie information may look like:

category|s:11:"gummy-candy";affiliateid|s:4:"ABCD";

Please be aware that if your site uses cookies, your content (such as product pages) should remain accessible with cookies disabled.


How can we better assist you in the future?
We recently published ideas from SMX Advanced on how search engines can help webmasters with duplicate content. If you have an opinion on the topic, please join our conversation in the Webmaster Help Group (we've already started the thread).

Update: for more information, please see our Help Center article on canonicalization.

from web contents: Musings on Down Under 2013


Earlier this year, a bunch of Googlers (Maile, Peeyush, Dan, Adam and I) bunged ourselves across the equator and headed to Sydney, so we could show our users and webmasters that just because you're "down under" doesn't mean you're under our radar. We had a great time getting to know folks at our Sydney office, and an even greater time meeting and chatting with all the people attending Search Summit and Search Engine Room. What makes those 12-hour flights worthwhile is getting the chance to inform and be informed about the issues important to the webmaster community.

One of the questions we heard quite frequently: Should we as webmasters/SEOs/SEMs/users be worried about personalized search?

Our answer: a resounding NO! Personalized search takes each user's search behavior, and subtly tunes the search results to better match their interests over time. For a user, this means that even if you're a lone entomologist in a sea of sports fans, you'll always get the results most relevant to you for the query "cricket". For the webmaster, it allows niche markets that collide on the same search terms to disambiguate themselves based on individual user preferences, and this really presents a tremendous opportunity for visibility. Also, to put things in perspective, search engines have been moving towards some degree of personalization for years; for example, providing country/language specific results is already a form of personalization, just at a coarser granularity. Making it more fine-grained is the logical next step, and helps level the playing field for smaller niche websites which now have a chance to rank well for users that want their content the most.

Another question that popped up a lot: I'm moving my site from domain X to Y. How do I make sure all my hard-earned reputation carries over?

Here are the important bits to think about:
  • For each page on domain X, have it 301-redirect to the corresponding page on Y. (How? Typically through .htaccess; there's a sketch after this list, but check with your hosting provider.)
  • You might want to stagger the move, and redirect sub-sections of your site over time. This gives you the chance to keep an eye on the effects, and also gives search engines' crawl/indexing pipelines time to cover the space of redirected URLs.
  • http://www.google.com/webmasters is your friend. Keep an eye on it during the transition to make sure that the redirects are having the effect you want.
  • Give it time. How quickly the transition is reflected in the results depends on how quickly we recrawl your site and see those redirects, which depends on a lot of factors including the current reputation of your site's pages.
  • Don't forget to update your Sitemap. (You are using Sitemaps, aren't you?)
  • If possible, don't substantially change the content of your pages at the same time you make the move. Otherwise, it will be difficult to tell if ranking changes are due to the change of content or incorrectly implemented redirects.
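
As an illustration of the first point above, here is a minimal .htaccess sketch for the old domain X (the domain name is a placeholder; test on a sub-section of your site before rolling it out everywhere):

# On domain X: permanently redirect every path to the same path on domain Y
RewriteEngine On
RewriteRule ^(.*)$ http://www.example-y.com/$1 [R=301,L]
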
Before we sign off, we wanted to give a shout-out to a couple of the folks at the Sydney office: Lars (one of the original Google Maps guys) gets accolades from all of us jetlagged migrants for donating his awesome Italian espresso machine to the office. And Deepak, thanks for all your tips on what to see and do around Sydney.

from web contents: What's new with Sitemaps.org? 2013

What has the Sitemaps team been up to since we announced sitemaps.org? We've been busy trying to get Sitemaps adopted by everyone and to make the submission process as easy and automated as possible. To that end, we have three new announcements to share with you.

First, we're making the sitemaps.org site available in 18 languages! We know that our users are located all around the world and we want to make it easy for you to learn about Sitemaps, no matter what language you speak. Here is a link to the Sitemap protocol in Japanese and the FAQ in German.

Second, it's now easier for you to tell us where your Sitemaps live. We wondered if we could make it so easy that you wouldn't even have to tell us and every other search engine that supports Sitemaps. But how? Well, every website can have a robots.txt file in a standard location, so we decided to let you tell us about your Sitemap in the robots.txt file. All you have to do is add a line like

Sitemap: http://www.mysite.com/sitemap.xml

to your robots.txt file. Just make sure you include the full URL, including the http://. That's it. Of course, we still think it's useful to submit your Sitemap through Webmaster Tools so you can make sure that the Sitemap was processed without any issues and so you can get additional statistics about your site.

Last but not least, Ask.com is now also supporting the Sitemap protocol. And with the ability to discover your Sitemaps from your robots.txt file, Ask.com and any other search engine that supports this change to robots.txt will be able to find your Sitemap file.

from web contents: An Update on Sitemaps at Google 2013

Did you know that the number of website hosts submitting Sitemap files has almost tripled over the last year? It's no wonder: the secret is out - as a recent research study showed, Sitemaps help search engines find new and changed content faster. Using Sitemaps doesn't guarantee that your site will be crawled and indexed completely, but it certainly helps us understand your website better.

Together with the Webmaster Tools design update, we've been working on Sitemaps as well:
  • Google and the other search engines which are a part of Sitemaps.org now support up to 50,000 child Sitemaps for Sitemap index files (instead of the previous 1,000). This allows large sites to submit a theoretical maximum of 2.5 billion URLs with a single Sitemap Index URL (oh, and if you need more, you can always submit multiple Sitemap index files). 
  • The Webmaster Tools design update now shows you all Sitemap files that were submitted for your verified website. This is particularly useful if you have multiple owners verified in Webmaster Tools or if you are submitting some Sitemap files via HTTP ping or through your robots.txt file.
  • The indexed URL count in Webmaster Tools for your Sitemap files is now even more precise.
  • For the XML developers out there, we've updated the XSD schemas to allow Sitemap extensions. The new schema helps webmasters to create better Sitemaps by verifying more features. By validating Sitemap files with the new schema, you can be more confident that the Sitemap files are correct.
  • Do I need to mention that Sitemap file processing is much faster than ever before? We've drastically reduced the average time from submitting a Sitemap file to processing it and showing some initial data in Webmaster Tools. 


For more information about using Sitemaps, make sure to check out our blog post about frequently asked questions on Sitemaps and our Help Center. If you have any questions that aren't covered here, don't forget to search our Help Forum and start a thread in the Sitemaps section for more help.


from web contents: New third-party Sitemaps tools 2013

Hello, webmasters, I'm Maile, and I recently joined the team here at Google Webmaster Central. I already have good news to report: we've updated our third-party programs and websites information. These third-party tools provide lots of options for easily generating a Sitemap -- from plugins for content management systems to online generators.

Many thanks to this community for continuing to innovate and improve the Sitemap tools. Since most of my work focuses on the Sitemaps protocol, I hope to meet you on our Sitemaps protocol discussion group.

from web contents: Multiple Sitemaps in the same directory 2013

We've gotten a few questions about whether you can put multiple Sitemaps in the same directory. Yes, you can!

You might want to have multiple Sitemap files in a single directory for a number of reasons. For instance, if you have an auction site, you might want to have a daily Sitemap with new auction offers and a weekly Sitemap with less time-sensitive URLs. Or you could generate a new Sitemap every day with new offers, so that the list of Sitemaps grows over time. Either of these solutions works just fine.

Or, here's another sample scenario: Suppose you're a provider that supports multiple web shops, and they share a similar URL structure differentiated by a parameter. For example:

http://example.com/stores/home?id=1
http://example.com/stores/home?id=2
http://example.com/stores/home?id=3

Since they're all in the same directory, it's fine by our rules to put the URLs for all of the stores into a single Sitemap, under http://example.com/ or http://example.com/stores/. However, some webmasters may prefer to have separate Sitemaps for each store, such as:

http://example.com/stores/store1_sitemap.xml
http://example.com/stores/store2_sitemap.xml
http://example.com/stores/store3_sitemap.xml

As long as all URLs listed in the Sitemap are at the same location as the Sitemap or in a sub directory (in the above example http://example.com/stores/ or perhaps http://example.com/stores/catalog) it's fine for multiple Sitemaps to live in the same directory (as many as you want!). The important thing is that Sitemaps not contain URLs from parent directories or completely different directories -- if that happens, we can't be sure that the submitter controls the URL's directory, so we can't trust the metadata.

The above Sitemaps could also be collected into a single Sitemap index file and easily be submitted via Google webmaster tools. For example, you could create http://example.com/stores/sitemap_index.xml as follows:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.google.com/schemas/sitemap/0.84">
<sitemap>
<loc>http://example.com/stores/store1_sitemap.xml</loc>
<lastmod>2006-10-01T18:23:17+00:00</lastmod>
</sitemap>
<sitemap>
<loc>http://example.com/stores/store2_sitemap.xml</loc>
<lastmod>2006-10-01</lastmod>
</sitemap>
<sitemap>
<loc>http://example.com/stores/store3_sitemap.xml</loc>
<lastmod>2006-10-05</lastmod>
</sitemap>
</sitemapindex>

Then simply add the index file to your account, and you'll be able to see any errors for each of the child Sitemaps.

If each store includes more than 50,000 URLs (the maximum number for a single Sitemap), you would need to have multiple Sitemaps for each store. In that case, you may want to create a Sitemap index file for each store that lists the Sitemaps for that store. For instance:

http://example.com/stores/store1_sitemapindex.xml
http://example.com/stores/store2_sitemapindex.xml
http://example.com/stores/store3_sitemapindex.xml

Since Sitemap index files can't contain other index files, you would need to submit each Sitemap index file to your account separately.

Whether you list all URLs in a single Sitemap or in multiple Sitemaps (in the same directory or different directories) is simply based on what's easiest for you to maintain. We treat the URLs equally for each of these methods of organization.

from web contents: Adding Images to your Sitemaps 2013

Webmaster Level: All

Sitemaps are an invaluable resource for search engines. They can highlight the important content on a site and allow crawlers to quickly discover it. Images are an important element of many sites and search engines could equally benefit from knowing which images you consider important. This is particularly true for images that are only accessible via JavaScript forms, or for pages that contain many images but only some of which are integral to the page content.

Now you can use a Sitemaps extension to provide Google with exactly this information. For each URL you list in your Sitemap, you can add additional information about important images that exist on that page. You don’t need to create a new Sitemap, you can just add information on images to the Sitemap you already use.

Adding images to your Sitemaps is easy. Simply follow the instructions in the Webmaster Tools Help Center or refer to the example below:

<?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
   xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://example.com/sample.html</loc>
    <image:image>
        <image:loc>http://example.com/image.jpg</image:loc>
    </image:image>
  </url>
</urlset>


We index billions of images and see hundreds of millions of image-related queries each day. To take advantage of that traffic most effectively, take a moment to update your Sitemap file with information on the images from your site. Let us know in the Sitemaps forum if you have any questions.


from web contents: Sitemaps offer better coverage for your Custom Search Engine 2013


If you're a webmaster or site owner, you realize the importance of providing high quality search on your site so that users easily find the right information.

We just announced today that AdSense for Search is now powered by Custom Search. Custom Search (a Google-powered search box that you can install on your website in minutes) helps your users quickly find what they're looking for. As a webmaster, Custom Search gives you advanced customization options to improve the accuracy of your site's search results. You can also choose to monetize your traffic with ads tuned to the topic of your site. If you don't want ads, you can use Custom Search Business Edition.



Now, we're also looking to index more of your site's content for inclusion in your Custom Search Engine (CSE) used for search on your site. We figure out what sites and URLs are included in your CSE, and -- if you've provided Sitemaps for the relevant sites -- we use that information to create a more comprehensive experience for your site's visitors. You don't have to do anything specific, besides submitting a Sitemap (via Webmaster Tools) for your site if you haven't already done so. Note that this change will not result in more pages indexed on Google.com and your search rankings on Google.com won't change. However, you will be able to get much better results coverage in your CSE.

Custom Search is built on top of the Google index. This means that all pages that are available on Google.com are also available to your search engine. We're now maintaining a CSE-specific index in addition to the Google.com index for enhancing the performance of search on your site. If you submit a Sitemap, it's likely that we will crawl those pages and include them in the additional index we build.

In order for us to index these additional pages, our crawlers must be able to crawl them. Your Sitemap will also help us identify the URLs that are important. Please ensure you are not blocking us from crawling any pages you want indexed. Improved index coverage is not instantaneous, as it takes some time for the pages to be crawled and indexed.

So what are you waiting for? Submit your Sitemap!

from web contents: Using stats from site: and Sitemap details 2013

Webmaster Level: Beginner to Intermediate

Every now and then in the webmaster blogosphere and forums, this issue comes up: when a webmaster performs a [site:example.com] query on their website, the number of indexed results differs from what is displayed in their Sitemaps report in Webmaster Tools. Such a discrepancy may smell like a bug, but it's actually by design. Your Sitemap report only reflects the URLs you've submitted in your Sitemap file. The site operator, on the other hand, takes into account whatever Google has crawled, which may include URLs not included in your Sitemap, such as newly added URLs or other URLs discovered via links.

Think of the site operator as a quick diagnosis of the general health of your site in Google's index. Site operator results can show you:
  • a rough estimate of how many pages have been indexed
  • an indication of whether your site has been hacked
  • if you have duplicate titles or snippets
Here is an example query using the site operator: [site:example.com]

Your Sitemap report provides more granular statistics about the URLs you submitted, such as the number of indexed URLs vs. the number submitted for crawling, and Sitemap-specific warnings or errors that may have occurred when Google tried to access your URLs.

[Screenshot: Sitemap report]

Feel free to check out our Help Center for more on the site: operator and Sitemaps. If you have further questions or issues, please post to our Webmaster Help Forum, where experienced webmasters and Googlers are happy to help.

Posted by Charlene Perez

from web contents: Tips for News Search 2013

Webmaster Level: All

During my stint on the "How Google Works Tour: Seattle", I heard plenty of questions regarding News Search from esteemed members of the press, such as The Stranger, The Seattle Times and Seattle Weekly. After careful note-taking throughout our conversations, the News team and I compiled this presentation to provide background and FAQs for all publishers interested in Google News Search.

[Embedded video: Google News Search presentation]

Along with the FAQs about News Sitemaps and PageRank in the video above, here's additional Q&A to get you started:

Would adding a city name to my paper—for example, changing our name from "The Times" to "The San Francisco Bay Area Times"—help me target my local audience in News Search?
No, this won't help News rankings. We extract geography and location information from the article itself (see video). Changing your name to include relevant keywords or adding a local address in your footer won't help you target a specific audience in our News rankings.
What happens if I accidentally include URLs in my News Sitemap that are older than 72 hours?
We want only the most recently added URLs in your News Sitemap, as it directs Googlebot to your breaking information. If you include older URLs, no worries (there's no penalty unless you're perceived as maliciously spamming, which would be a rare case, so again, no worries); we just won't include those URLs in our next News crawl.
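
For reference, a News Sitemap entry uses the standard urlset structure plus a news namespace; here is a hypothetical sketch (publication name, URL, and date are placeholders -- keep the publication date within the last 72 hours):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/business/article55.html</loc>
    <news:news>
      <news:publication>
        <news:name>The Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2013-06-19</news:publication_date>
      <news:title>Companies A and B Announce Merger</news:title>
    </news:news>
  </url>
</urlset>
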
To get the full scoop, check out the video!
