News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development

Introducing a new Rich Snippets format: Events

Webmaster Level: All

Last year we introduced Rich Snippets, a new feature that makes it possible to surface structured data from your pages on Google's search results. So far, user reaction to Rich Snippets has been enthusiastic -- after all, Rich Snippets help people make more informed clicks and find what they need even faster.

We originally introduced Rich Snippets with two formats: reviews and people. Later in the year we added support for marking up video information which is used to improve Video Search. Today, we're excited to kick off the new year by adding support for events.

Events markup is based on the hCalendar microformat. Here's an example of what the new events Rich Snippets will look like:


The new format shows links to specific events on the page along with dates and locations. It provides a fast and convenient way for users to determine if a page has events they may be interested in.
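
For reference, here's a minimal sketch of hCalendar-style event markup for one listing; the event name, venue, and dates are invented, and the events documentation mentioned below lists the exact properties Google reads:

    <div class="vevent">
      <a class="url summary" href="http://www.example.com/events/jazz-night">Jazz Night</a>
      at <span class="location">The Green Room, Springfield</span>
      <abbr class="dtstart" title="2010-02-12T20:00">Feb 12, 8:00pm</abbr> to
      <abbr class="dtend" title="2010-02-12T23:00">11:00pm</abbr>
    </div>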

If you have event listings on your site, we encourage you to review the events documentation we've prepared to help you get started. Please note, however, that marking up your content is not a guarantee that Rich Snippets will show for your site. Just as we did for previous formats, we will take a gradual approach to incorporating the new event snippets to ensure a great user experience along the way.

Stay tuned for more developments in Rich Snippets throughout the year!


Finding Places on the Web: Rich Snippets for Local Search

Webmaster Level: All
Cross-posted from the Lat Long Blog.

We’re sharing some news today that we hope webmasters will find exciting. As you know, we’re constantly working to organize the world’s information - be it textual, visual, geographic or any other type of useful data. From a local search perspective, part of this effort means looking for all the great web pages that reference a particular place. The Internet is teeming with useful information about local places and points of interest, and we do our best to deliver relevant search results that help shed light on locations all across the globe.

Today, we’re announcing that your use of Rich Snippets can help people find the web pages you’ve created that may reference a specific place or location. By using structured HTML formats like hCard to mark up the business or organization described on your page, you make it easier for search engines like Google to properly classify your site, recognize and understand that its content is about a particular place, and make it discoverable to users on Place pages.

You can get started by reviewing these tips for using Rich Snippets for Local Search. Whether you’re creating a website for your own business, an article on a newly opened restaurant, or a guide to the best places in town, your precise markup helps associate your site with the search results for that particular place. Though this markup does not guarantee that your site will be shown in search results, we’re excited to expand support for making the web better organized around real world places.
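
As an illustration, here's a minimal hCard sketch for a business page; the name, address, and phone number are invented, and the Rich Snippets documentation lists the exact properties that are read:

    <div class="vcard">
      <span class="fn org">Blue Door Cafe</span>
      <div class="adr">
        <span class="street-address">123 Main Street</span>,
        <span class="locality">Springfield</span>,
        <span class="region">IL</span>
      </div>
      Phone: <span class="tel">(555) 555-0100</span>
    </div>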


Google SEO resources for beginners

Webmaster Level: Beginner

Want to eat healthier and exercise more in 2010? That's tough! Want to learn about search engine optimization (SEO) so you can disregard the rumors and know what's important? That's easy! Here's how to gain SEO knowledge as you go about your new start to 2010:

Step 1: Absorb the basics
  • If you like to learn by reading, download our SEO Starter Guide for reading while you're on an exercise bike, training for Ironman.
  • Or, if you're more a video watcher, try listening to my "Search Friendly Development" session while you're cleaning your house. Keep in mind that some parts of the presentation are a little more technical.

  • For good measure, and because at some point you'll hear references to them, check out our webmaster guidelines for yourself.

Step 2: Explore details that pique your interest
Are you done with the basics but now you have some questions? Good for you! Try researching a particular topic in our Webmaster Help Center. For example, do you want more information about crawling and indexing or understanding what links are all about?


Step 3: Verify ownership of your site in Webmaster Tools
It takes a little bit of skill, but we have tons of help for verification. Once you verify ownership of your site (i.e., signal to Google that you're the owner), you can see site-specific data and messages from Google, such as:


A sample message regarding the crawlability of your site


Step 4: Research before you do anything drastic
Usually the basics (e.g., good content/service and a crawlable site with indexable information) are the necessities for SEO. You may hear or read differently, but before you do anything drastic on your site, such as disallowing all of your directories in robots.txt or revamping your entire site architecture, please do your research first.
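
For context, this is the kind of blanket robots.txt rule that the "drastic" warning refers to; it blocks all compliant crawlers from your entire site, which is rarely what you want:

    # Blocks all compliant crawlers from every directory -- research before doing this
    User-agent: *
    Disallow: /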

Easier management of website verifications


Webmaster level: All

To help webmasters manage the verified owners for their websites in Webmaster Tools, we’ve recently introduced three new features:

  • Verification details view: You can now see the methods used to verify an owner for your site. On the Manage owners page for your site, you'll find the new Verification details link. This screenshot shows the verification details of a user who is verified using both an HTML file uploaded to the site and a meta tag (sample tokens for these verification methods are sketched after this list):

    Where appropriate, the Verification details include links to the exact URL on your site where the verification can be found, to help you locate it faster.

  • Requiring the verification method be removed from the site before unverifying an owner: You now need to remove the verification method from your site before unverifying an owner from Webmaster Tools. Webmaster Tools now checks the method that the owner used to verify ownership of the site, and will show an error message if the verification is still found. For example, this is the error message shown when an unverification was attempted while the DNS CNAME verification method was still found on the DNS records of the domain:

  • Shorter CNAME verification string: We’ve slightly modified the CNAME verification string, making it shorter so that it works with a larger number of DNS providers. Some systems limit the number of characters that can be used in DNS records, which meant that some users were not able to use the CNAME verification method. The shortened string removes that barrier, and existing CNAME verifications will continue to be valid.
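
For reference, here's a rough sketch of what each verification token looks like; the token values are placeholders, and Webmaster Tools shows you the exact strings and filenames to use:

    <!-- Meta tag method: added to the <head> of your home page -->
    <meta name="google-site-verification" content="TOKEN_FROM_WEBMASTER_TOOLS" />

    <!-- HTML file method: upload a file such as googleTOKEN.html to your site root;
         its only content is a single line like this -->
    google-site-verification: googleTOKEN.html

    <!-- DNS CNAME method: add a CNAME record under your domain; the host and target
         values below are placeholders for the ones Webmaster Tools generates -->
    TOKEN.example.com.    IN  CNAME    gv-TOKEN.dv.googlehosted.com.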

We hope these changes make it easier for you to use Webmaster Tools. As always, please post in our Verification forum if you have any questions or feedback.


Holiday source code housekeeping: Website clinic for non-profits

Webmaster Level: Beginner

Cross-posted on the Google Grants Blog

As the holiday season comes around, we all have a bit of housekeeping to do. This is precisely why we wanted to focus the second post in our site clinic series on cleaning up your source code. Throughout our analysis of submitted non-profit websites, we noticed some confusion about what HTML markup, or tags, to use where, and what content to place within them, both of which can have a significant impact on users and on how your website looks on the search results page.

Before you deck the halls, deck out your <title> elements
Out of all the submitted non-profit websites, 27% were misusing their <title> elements, which are critical in letting both Google and users know what’s important to your website. Typically, a search engine will display ~60 characters from your title element; this is valuable real estate, so you should use it! Before getting into the actual code, let’s first take a look at how a great title element from one of our submitted sites, Sharp, will appear in the search results page:


Ideally, a great <title> element will include the name of the organization, along with a descriptive tag line. Let’s take a look at some submitted examples:

Sharp (Best)
  <title> source code: <title>Top San Diego Doctors and Hospitals - Sharp HealthCare</title>
  Tag behavior: includes the organization’s name and a descriptive tag line

Interieur (Good)
  <title> source code: <title>Interieur 2010 - 15-24 October Kortrijk, Belgium</title>
  Tag behavior: includes the organization’s name and a non-descriptive tag line

VAMS International (Okay)
  <title> source code: <title>Visual Arts and Music for Society | VAMS International</title>
  Tag behavior: includes only the organization’s name


If you don’t specify a <title> tag, then Google will try to create a title for you. You can probably do better than our best guess, so go for it: take control of your <title> tag! It’s a simple fix that can make a huge difference. Using specific <title> tags for your deeper URLs is also important, and we’ll address that in our next site clinic post.

Keep an eye on your description meta tags
Description meta tags weren’t being utilized to their full potential in 54% of submitted sites. These tags are often used to populate the two-line snippet provided to users in the search results page. With a solid snippet, you can get your potential readers excited and ready to learn more about your organization. Let’s take another look at a good example from among the submitted sites, Tales of Aussie Rescue:


If description meta tags are absent or not relevant, a snippet will be chosen from the page’s content automatically. If you’re lucky and have a good snippet auto-selected, keep in mind that search engines vary in the way that they select snippets, so it’s better to keep things consistent and relevant by writing a solid description meta tag.
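
Here's a minimal sketch of a solid description meta tag; the organization and wording are invented for illustration:

    <head>
      <title>Example Animal Rescue - Helping Shelter Dogs Find Homes</title>
      <meta name="description" content="Example Animal Rescue places shelter dogs with
        foster and adoptive families. Learn how to adopt, volunteer, or donate.">
    </head>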

Keep your <h> elements in their place
Another quick fix in your housekeeping is ensuring your website makes proper use of heading tags. In our non-profit study, nearly 19% of submitted sites had room for improvement with heading elements. The most common problem was the tendency to begin headings with an <h2> or <h3> tag while not including an <h1> tag, presumably for aesthetic reasons.

Headings give you the opportunity to tell both Google and users what’s important to you and your website. The lower the number on your heading tag, the more important the text, in the eyes of Google and your users. Take advantage of that <h1> tag! If you don’t like how an <h1> tag is rendered visually, you can always alter its appearance in your CSS.
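
For example, here's a short CSS sketch that keeps the <h1> for your most important heading while toning down its default rendering; the sizes and colors are arbitrary:

    /* Keep the h1 semantically, but restyle it so it doesn't overwhelm the page */
    h1 {
      font-size: 1.3em;    /* smaller than the typical browser default */
      font-weight: normal;
      color: #444;
      margin: 0 0 12px;
    }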

Use alt text for images
Everyone is always proud to display their family photos come holiday season, but don’t forget to tell us what they’re all about. Over 37% of analyzed sites were not making appropriate use of the image alt attribute. If used properly, this attribute can:
  • Help Google understand what your image is
  • Allow users on text-only browsers, users with accessibility needs, or users on limited devices to understand your images
Keep in mind, rich and descriptive alt text is the key here. Let’s take another look at some of our submitted sites and their alt attribute usage:

Sponsor A Puppy (Best: the alt text specifies the image is the organization’s main logo)
  Source code: <img alt="Sponsor a Puppy logo" src=...
  Tag behavior: uses rich, descriptive alt text to describe images, buttons, and logos

Philanthropedia (Good: the alt text specifies the image is a logo, but does not describe the organization or what the image does)
  Source code: <img alt="Logo" height=...
  Tag behavior: uses non-descriptive alt text for images, buttons, and logos, or uses alt text only sporadically

Coastal Community Foundation (Not ideal: alt text not present)
  Source code: <img src="...">
  Tag behavior: no use of alt text, or use of text that does not add meaning (often seen in numbering the images)


A little window shopping for your New Year’s resolution
Google has some great resources to further address best practices in your source code. For starters, you can use our HTML Suggestion Tool in Webmaster Tools. Also, it’s always a good practice to make your site accessible to all viewers.


Quality links to your site

A popular question on our Webmaster Help Forum concerns best practices for organic link building. There seems to be some confusion, especially among less experienced webmasters, about how to approach the topic. Different perspectives have been shared, and we would also like to explain our viewpoint on earning quality links.

If your site is rather new and still unknown, a good marketing technique is to get involved in the community around your topic. Interact and contribute on forums and blogs, and keep in mind to contribute in a positive way rather than spamming or soliciting for your site. Building a reputation this way can drive people to your site, and they will keep on visiting it and linking to it. If you offer long-lasting, unique and compelling content -- something that lets your expertise shine -- people will want to recommend it to others. Great content can serve this purpose as much as providing useful tools.

A promising way to create value for your target group and earn great links is to think of issues or problems your users might encounter. Visitors are likely to appreciate your site and link to it if you publish a short tutorial or a video providing a solution, or a practical tool. Survey or original research results can serve the same purpose, if they turn out to be useful for the target audience. Both methods grow your credibility in the community and increase visibility. This can help you gain lasting, merit-based links and loyal followers who generate direct traffic and "spread the word." Offering a number of solutions for different problems could evolve into a blog which can continuously affect the site's reputation in a positive way.

Humor can be another way to gain both great links and get people to talk about your site. With Google Buzz and other social media services constantly growing, entertaining content is being shared now more than ever. We've seen all kinds of amusing content, from ASCII art embedded in a site's source code to funny downtime messages used as a viral marketing technique to increase the visibility of a site. However, we do not recommend counting only on short-lived link-bait tactics. Their appeal wears off quickly and as powerful as marketing stunts can be, you shouldn't rely on them as a long-term strategy or as your only marketing effort.

It's important to clarify that any legitimate link building strategy is a long-term effort. There are those who advocate for short-lived, often spammy methods, but these are not advisable if you care for your site's reputation. Buying PageRank-passing links or randomly exchanging links are the worst ways of attempting to gather links and they're likely to have no positive impact on your site's performance over time. If your site's visibility in the Google index is important to you it's best to avoid them.

Directory entries are often mentioned as another way to promote young sites in the Google index. There are great, topical directories that add value to the Internet. But there are not many of them in proportion to those of lower quality. If you decide to submit your site to a directory, make sure it's on topic, moderated, and well structured. Mass submissions, which are sometimes offered as a quick work-around SEO method, are mostly useless and not likely to serve your purposes.

It can be a good idea to take a look at similar sites in other markets and identify the elements of those sites that might work well for yours, too. However, it's important not to just copy success stories but to adapt them, so that they provide unique value for your visitors.


Social bookmarks on YouTube enable users to share content easily


Finally, consider making linking to your site easier for less tech savvy users. Similar to the way we do it on YouTube, offering bookmarking services for social sites like Twitter or Facebook can help spread the word about the great content on your site and draw users' attention.
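
A minimal sketch of such sharing links; the share URLs below follow Twitter's and Facebook's publicly documented patterns, and the page address is hypothetical:

    <!-- "Share this" links for a page at a hypothetical URL -->
    <a href="https://twitter.com/intent/tweet?url=http%3A%2F%2Fwww.example.com%2Fpost">Share on Twitter</a>
    <a href="https://www.facebook.com/sharer/sharer.php?u=http%3A%2F%2Fwww.example.com%2Fpost">Share on Facebook</a>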

As usual, we'd like to hear your opinion. You're welcome to comment here in the blog, or join our Webmaster Help Forum community.


Optimizing sites for TV

Webmaster Level: All

Just as mobile phones make your site accessible to people on the go, Google TV makes your site easily viewable to people lounging on their couch. Google TV is a platform that combines your current TV programming with the web and, before long, more apps. It’s the web you love, with the TV you love, all available on the sofa made for you. Woohoo!

Because Google TV has a fully functioning web browser built in, users can easily visit your site from their TV. Current sites should already work, but you may want to provide your users with an enhanced TV experience -- what's called the “10-foot UI” (user interface). They'll be several feet away from the screen, not several inches away, and rather than a mouse on their desktop, they'll have a remote with a keyboard and a pointing device.

For example, here’s YouTube for desktop users versus what we’re calling “YouTube Leanback” -- our site optimized for large screens:


YouTube desktop version on the left, YouTube Leanback on the right

See our Spotlight Gallery for more examples of TV-optimized sites.

What does "optimized for TV" mean?

It means that, for the user sitting on their couch, your site on their TV is an even more enjoyable experience:
  • Text is large enough to be viewable from the sofa-to-TV distance.
  • Site navigation can be performed through button arrows on the remote (a D-pad), rather than mouse/touchpad usage
  • Selectable elements provide a visual cue when selected (when you’re 10 feet away, it needs to be really, really obvious which selections are highlighted); see the CSS sketch after this list
  • and more...
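
A rough CSS sketch of the first and third points; the sizes and colors are guesses you would tune on an actual TV:

    /* Larger base text for the sofa-to-TV viewing distance */
    body {
      font-size: 21px;
      line-height: 1.4;
    }

    /* Make the focused element obvious for D-pad (arrow key) navigation */
    a:focus,
    button:focus {
      outline: 4px solid #fc0;
      background-color: #333;
    }
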
How can webmasters gain a general idea of their site’s appearance on TV?

First, remember that appearance alone doesn't tell you whether your site can be easily navigated by TV users (i.e. users with a remote rather than a mouse). With that said, here’s a quick workaround to give you a ballpark idea of how your site looks on TV. (For more in-depth info, please see the “Design considerations” in our optimization guide.)
  1. On a large monitor, make your window size 1920 x 1080.
  2. In a browser, visit your site at full screen.
  3. Zoom the browser to 1.5x the normal size. This is performed in different ways with different keyboards. For example, in Chrome if you press ctrl+ (press ctrl and + at the same time) twice, that’ll zoom the browser to nearly 1.5x the initial size.
  4. Move back to three times the distance between you and the monitor.
  5. Check out your site!
And don’t forget, if you want to see your site with the real thing, Google TV enabled devices are now available in stores.

How can you learn more?

Our team just published a developer site, with TV optimization techniques, at code.google.com/tv/web/.


State of the Index 2009

Webmaster Level: All

At PubCon in Las Vegas in November 2009, I gave a "State of the Index" talk which covers what Google has done for users, web developers, and webmasters in the last year. I recently recreated it on video for those of you who didn't make it to the conference. You can watch it below:


And here are the slides if you'd like to follow along:



URL removal explained, Part III: Removing content that you don't own

Webmaster Level: All

Welcome to the third episode of our URL removals series! In episodes one and two, we talked about expediting the removal of content that's under your control and requesting expedited cache removals. Today, we're covering how to use Google's public URL removal tool to request removal of content from Google’s search results when the content originates on a website not under your control.

Google offers two tools that provide a way to request expedited removal of content:

1. Verified URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site of which you’re a verified owner in Webmaster Tools (like your blog or your company’s site)

2. Public URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site for which you can’t verify ownership (like your friend’s blog)

Sometimes a situation arises where the information you want to remove originates from a site that you don't own or can't control. Since each individual webmaster controls their site and their site’s content, the best way to update or remove results from Google is for the site owner (where the content is published) to either block crawling of the URL, modify the content source, or remove the page altogether. If the content isn't changed, it would just reappear in our search results the next time we crawled it. So the first step to remove content that's hosted on a site you don't own is to contact the owner of the website and request that they remove or block the content in question.
  • Removed or blocked content

    If the website owner removes a page, requests for the removed page should return a "404 Not Found" response or a "410 Gone" response. If they choose to block the page from search engines, then the page should either be disallowed in the site's robots.txt file or contain a noindex meta tag. Once one of these requirements is met, you can submit a removal request using the "Webmaster has already blocked the page" option.



    Sometimes a website owner will claim that they’ve blocked or removed a page but they haven’t technically done so. If they claim a page has been blocked you can double check by looking at the site’s robots.txt file to see if the page is listed there as disallowed.
    User-agent: *
    Disallow: /blocked-page/
    Another place to check if a page has been blocked is within the page’s HTML source code itself. You can visit the page and choose “View Page Source” from your browser. Is there a meta noindex tag in the HTML “head” section?
    <html>
    <head>
    <title>blocked page</title>
    <meta name="robots" content="noindex">
    </head>
    ...
    If they inform you that the page has been removed, you can confirm this by using an HTTP response testing tool like the Live HTTP Headers add-on for the Firefox browser. With this add-on enabled, you can request any URL in Firefox to test that the HTTP response is actually 404 Not Found or 410 Gone.
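
    For example, a request for a removed page should produce an exchange like this (the host and path are hypothetical; a "410 Gone" status line is equally acceptable):

        GET /removed-page/ HTTP/1.1
        Host: www.example.com

        HTTP/1.1 404 Not Found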

  • Content removed from the page

    Once you've confirmed that the content you're seeking to remove is no longer present on the page, you can request a cache removal using the 'Content has been removed from the page' option. This type of removal--usually called a "cache" removal--ensures that Google's search results will not include the cached copy or version of the old page, or any snippets of text from the old version of the page. Only the current updated page (without the content that's been removed) will be accessible from Google's search results. However, the current updated page can potentially still rank for terms related to the old content as a result of inbound links that still exist from external sites. For cache removal requests you’ll be asked to enter a "term that has been removed from the page." Be sure to enter a word that is not found on the current live page, so that our automated process can confirm the page has changed -- otherwise the request will be denied. Cache removals are covered in more detail in part two of the "URL removal explained" series.


  • Removing inappropriate webpages or images that appear in our SafeSearch filtered results

    Google introduced the SafeSearch filter with the goal of providing search results that exclude potentially offensive content. For situations where you find content that you feel should have been filtered out by SafeSearch, you can request that this content be excluded from SafeSearch filtered results in the future. Submit a removal request using the 'Inappropriate content appears in our SafeSearch filtered results' option.

If you encounter any issues with the public URL removal tool or have questions not addressed here, please post them to the Webmaster Help Forum or consult the more detailed removal instructions in our Help Center. If you do post to the forum, remember to use a URL shortening service to share any links to content you want removed.

Edit: Read the rest of this series:
Part I: Removing URLs & directories
Part II: Removing & updating cached content
Part IV: Tracking requests, what not to remove
Companion post: Managing what information is available about you online


'New software version' notifications for your site

Webmaster level: All

One of the great things about working at Google is that we get to take advantage of an enormous amount of computing power to do some really cool things. One idea we tried out was to let webmasters know about their potentially hackable websites. The initial effort was successful enough that we thought we would take it one step further by expanding our efforts to cover other types of web applications—for example, more content management systems (CMSs), forum/bulletin-board applications, stat-trackers, and so on.

This time, however, our goal is not just to isolate vulnerable or hackable software packages, but to also notify webmasters about newer versions of the software packages or plugins they're running on their website. For example, there might be a Drupal module or Joomla extension update available but some folks might not have upgraded. There are a few reasons a webmaster might not upgrade to the newer version and one of the reasons could be that they just don't know a new version exists. This is where we think we can help. We hope to let webmasters know about new versions of their software by sending them a message via Webmaster Tools. This way they can make an informed decision about whether or not they would like to upgrade.

One of the ways we identify sites to notify is by parsing source code of web pages that we crawl. For example, WordPress and other CMS applications include a generator meta tag that specifies the version number. This has proven to be tremendously helpful in our efforts to notify webmasters. So if you're a software developer, and would like us to help you notify your users about newer versions of your software, a great way to start would be to include a generator meta tag that tells the version number of your software. If you're a plugin or a widget developer, including a version number in the source you provide to your users is a great way to help too.
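
For example, the generator meta tag that WordPress and many other CMSs emit looks roughly like this (the version number is illustrative):

    <meta name="generator" content="WordPress 3.0.1" />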

We've seen divided opinions over time about whether it's a good security practice to include a version number in source code, because it lets hackers or worm writers know that the website might be vulnerable to a particular type of exploit. But as Matt Mullenweg pointed out, "Where [a worm writer's] 1.0 might have checked for version numbers, 2.0 just tests [a website's] capabilities...". Meanwhile, the advantage of a version number is that it can help alert site owners when they need to update their site. In the end, we tend to think that including a version number can do more good than harm.

We plan to begin sending out the first of these messages soon and hope that webmasters find them useful! If you have any questions or feedback, feel free to comment here.


A helping holiday hand: Website clinic for non-profits

Webmaster Level: Beginner

Cross-posted on the Google Grants Blog

A New Year’s resolution
In the spirit of the holidays, here at Google we wanted to take the time to help out those who spend their days making our world a better place: non-profit organizations. A few weeks back, we asked webmasters of non-profits to submit their organization’s site to our Search Quality team for analysis. After some number crunching and trend analysis, we’re back to report on general areas for improvement and to guide you towards some useful resources!

Making our list, checking it twice
First, we’d like to thank all of the amazing organizations who participated by submitting their sites. We got some great results, and are excited about all the diverse non-profit causes out there.

Our analysis will take place in the following two posts. The first post will focus on cleaning up HTML tags in your source code, while the second will examine improving user experience via better content accessibility.

Visions of... URLs... dancing in our heads
The great news is, every single site submitted had at least one or two areas to tweak to make it even better! So this information should be helpful to everyone out there, big or small. Just to whet your appetites, here’s a quick list of items that will not be addressed in our following posts, but that had some room for improvement in a large percentage of submitted sites:
  • Keep an eye on proper canonicalization: 56% of analyzed non-profit sites could improve their canonicalization practices. You can read more about canonicalization in this blog post from a previous site clinic; a minimal rel="canonical" sketch follows this list.
  • Make sure your volunteer/support sections are visible: 29% of our submissions could improve their sites by making their support, volunteer, or donation sections easier to find. A great way to accomplish this is to add a donations tab to your navigation bar so it’s just one click away at all times.
  • Protect your confidential information: Lots of non-profits, especially those in the medical industry, deal with some very important and confidential information. Read up on how to control your crawled and indexed content, and remember to protect confidential content through proper authentication measures.
  • Make your Flash sites search engine friendly: We saw some beautiful sites running on Flash. Search engines have a hard time understanding Flash files, and we’re working to improve Flash comprehension on our end, but here are some discussion points on how you can help us understand your Flash content.
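
As promised in the canonicalization point above, here's a minimal rel="canonical" sketch: a <link> element placed in the <head> of duplicate or parameterized URLs, pointing at the preferred version of the page (the URL is hypothetical):

    <link rel="canonical" href="http://www.example.org/donate/">
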
Contributors: Aditya Goradia, Brandon Falls, Charlene Perez, Diara Dankert, Michael Wyszomierski, and Nelson Bradley

Showing more results from a domain

Webmaster Level: All

Today we’ve launched a change to our ranking algorithm that will make it much easier for users to find a large number of results from a single site. For queries that indicate a strong user interest in a particular domain, like [exhibitions at amnh], we’ll now show more results from the relevant site:



Prior to today’s change, only two results from www.amnh.org would have appeared for this query. Now, we determine that the user is likely interested in the American Museum of Natural History’s website, so seven results from the amnh.org domain appear. Since the user is looking for exhibitions at the museum, it’s far more likely that they’ll find what they’re looking for, faster. The last few results for this query are from other sites, preserving some diversity in the results.

We’re always reassessing our ranking and user interface, making hundreds of changes each year. We expect today’s improvement will help users find deeper results from a single site, while still providing diversity on the results page.



Structured Data Testing Tool

Webmaster level: All

Today we’re excited to share the launch of a shiny new version of the rich snippet testing tool, now called the structured data testing tool. The major improvements are:
  • We’ve improved how we display rich snippets in the testing tool to better match how they appear in search results.
  • The brand new visual design makes it clearer what structured data we can extract from the page, and how that may be shown in our search results.
  • The tool is now available in languages other than English to help webmasters from around the world build structured-data-enabled websites.
Here’s what it looks like:
The new structured data testing tool works with all supported rich snippets and authorship markup, including applications, products, recipes, reviews, and others.
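
If you'd like something to paste into the tool, here's a rough microdata sketch of review markup using schema.org vocabulary; the product, author, and rating are invented:

    <div itemscope itemtype="http://schema.org/Review">
      <span itemprop="name">Sturdy and easy to assemble</span>
      by <span itemprop="author">Pat</span>
      <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
        <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span> stars
      </div>
      <p itemprop="reviewBody">The example desk arrived quickly and feels solid.</p>
    </div>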

Try it yourself and, as always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.

Written by Yong Zhu on behalf of the rich snippets testing tool team




Make the most of Search Queries in Webmaster Tools

Level: Beginner to Intermediate

If you’re intrigued by the Search Queries feature in Webmaster Tools but aren’t sure how to make it actionable, we have a video that we hope will help!


Maile shares her approach to Search Queries in Webmaster Tools

This video explains the vocabulary of Search Queries, such as:
  • Impressions
  • Average position (only the top-ranking URL for the user’s query is factored into our calculation)
  • Click
  • CTR (clickthrough rate; a worked example follows this list)
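
To make those terms concrete with invented numbers: if a query showed your site 2,000 times (impressions) and drew 100 clicks, its CTR is 100 / 2,000 = 5%.
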
The video also reviews an approach to investigating Top queries and Top pages:
  1. Prepare by understanding your website’s goals and your target audience (then using Search Queries “filters” to support your knowledge)
  2. Sort by clicks in Top queries to understand the top queries bringing searchers to your site (for the given time period)
  3. Sort by CTR to notice any missed opportunities
  4. Categorize queries into logical buckets that simplify tracking your progress and staying in touch with users’ needs
  5. Sort Top pages by clicks to find the URLs on your site most visited by searchers (for the given time period)
  6. Sort Top pages by impressions to find valuable pages that can be used to help feature your related, high-quality, but lower-ranking pages
After you’ve watched the video and applied the knowledge of your site with the findings from Search Queries, you’ll likely have several improvement ideas to help searchers find your site. If you’re up for it, let us know in the comments what Search Queries information you find useful (and why!), and of course, as always, feel free to share any tips or feedback.


Five common SEO mistakes (and six good ideas!)

Webmaster Level: Beginner to Intermediate

To help you avoid common mistakes webmasters face with regard to search engine optimization (SEO), I filmed a video outlining five common mistakes I’ve noticed in the SEO industry. Almost four years ago, we also gathered information from all of you (our readers) about your SEO recommendations and updated our related Help Center article given your feedback. Much of the same advice from 2008 still holds true today -- here’s to more years ahead building a great site!




If you’re short on time, here’s the gist:

Avoid these common mistakes
1. Having no value proposition: Try not to assume that a site should rank #1 without knowing why it’s helpful to searchers (and better than the competition :)

2. Segmented approach: Be wary of setting SEO-related goals without making sure they’re aligned with your company’s overall objectives and the goals of other departments. For example, in tandem with your work optimizing product pages (and the full user experience once they come to your site), also contribute your expertise to your Marketing team’s upcoming campaign. So if Marketing is launching new videos or a more interactive site, be sure that searchers can find their content, too.

3. Time-consuming workarounds: Avoid implementing a hack rather than researching new features or best practices that could simplify development (e.g., changing the timestamp on an updated URL so it’s crawled more quickly instead of easily submitting the URL through Fetch as Googlebot).

4. Caught in SEO trends: Consider spending less time obsessing about the latest “trick” to boost your rankings and instead focus on the fundamental tasks/efforts that will bring lasting visitors.

5. Slow iteration: Aim to be agile rather than promote an environment where the infrastructure and/or processes make improving your site, or even testing possible improvements, difficult.
Six fundamental SEO tips
1. Do something cool: Make sure your site stands out from the competition -- in a good way!

2. Include relevant words in your copy: Try to put yourself in the shoes of searchers. What would they query to find you? Your name/business name, location, products, etc., are important. It's also helpful to use the same terms in your site that your users might type (e.g., you might be a trained “flower designer” but most searchers might type [florist]), and to answer the questions they might have (e.g., store hours, product specs, reviews). It helps to know your customers.

3. Be smart about your tags and site architecture: Create unique title tags and meta descriptions; include Rich Snippets markup from schema.org where appropriate. Have intuitive navigation and good internal links.

4. Sign up for email forwarding in Webmaster Tools: Help us communicate with you, especially when we notice something awry with your site.

5. Attract buzz: Natural links, +1s, likes, follows... In every business there's something compelling, interesting, entertaining, or surprising that you can offer or share with your users. Provide a helpful service, tell fun stories, paint a vivid picture and users will share and reshare your content.

6. Stay fresh and relevant: Keep content up-to-date and consider options such as building a social media presence (if that’s where a potential audience exists) or creating an ideal mobile experience if your users are often on-the-go.
Good luck to everyone!


Help Google index your videos

Webmaster Level: All

The single best way to make Google aware of all your videos on your website is to create and maintain a Video Sitemap. Video Sitemaps provide Google with essential information about your videos, including the URLs for the pages where the videos can be found, the titles of the videos, keywords, thumbnail images, durations, and other information. The Sitemap also allows you to define the period of time for which each video will be available. This is particularly useful for content that has explicit viewing windows, so that we can remove the content from our index when it expires.

Once your Sitemap is created, you can submit the URL of the Sitemap file in Google Webmaster Tools or through your robots.txt file.
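
Here's a trimmed sketch of a Video Sitemap entry, plus the robots.txt line that points crawlers at the Sitemap file; the URLs, title, and times are invented, and the Video Sitemap documentation lists all required and optional tags:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>http://www.example.com/videos/grilling-steaks</loc>
        <video:video>
          <video:thumbnail_loc>http://www.example.com/thumbs/steaks.jpg</video:thumbnail_loc>
          <video:title>Grilling steaks for summer</video:title>
          <video:description>How to grill the perfect steak in six steps.</video:description>
          <video:content_loc>http://www.example.com/video/steaks.flv</video:content_loc>
          <video:duration>240</video:duration>
          <video:expiration_date>2013-12-31T23:59:59+00:00</video:expiration_date>
        </video:video>
      </url>
    </urlset>

    # In robots.txt:
    Sitemap: http://www.example.com/video-sitemap.xml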

Once we have indexed a video, it may appear in our web search results in what we call a Video Onebox (a cluster of videos related to the queried topic) and in our video search property, Google Videos. A video result is immediately recognizable by its thumbnail, duration, and a description.

As an example, this is what a video result from CNN.com looks like on Google:


We encourage those of you with videos to submit Video Sitemaps and to keep them updated with your new content. Please also visit our recently updated Video Sitemap Help Center, and utilize our Sitemap Help Forum. If you've submitted a Video Sitemap file via Webmaster Tools and want to share your experiences or problems, you can do so here.


Upcoming changes in Google's HTTP Referrer

Webmaster level: all

Protecting users’ privacy is a priority for us and it’s helped drive recent changes. Helping users save time is also very important; it’s explicitly mentioned as a part of our philosophy. Today, we’re happy to announce that Google Web Search will soon be using a new proposal to reduce latency when a user of Google’s SSL-search clicks on a search result with a modern browser such as Chrome.

Starting in April, for browsers with the appropriate support, we will be using the "referrer" meta tag to automatically simplify the referring URL that is sent by the browser when visiting a page linked from an organic search result. This results in a faster time to result and a more streamlined experience for the user.
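
For context, the meta referrer mechanism is a tag in a page's <head>; a page opting into origin-only referrers carries something like this (the value follows the draft specification linked below):

    <meta name="referrer" content="origin">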

What does this mean for sites that receive clicks from Google search results? You may start to see "origin" referrers—Google’s homepages (see the meta referrer specification for further detail)—as a source of organic SSL search traffic. This change will only affect the subset of SSL search referrers which already didn’t include the query terms. Non-HTTPS referrals will continue to behave as they do today. Again, the primary motivation for this change is to remove an unneeded redirect so that signed-in users reach their destination faster.

Website analytics programs can detect these organic search requests by detecting bare Google host names using SSL (like "https://www.google.co.uk/"). Webmasters will continue to see the same data in Webmaster Tools—just as before, you’ll receive an aggregated list of the top search queries that drove traffic to your site.

We will continue to look into further improvements to how search query data is surfaced through Webmaster Tools. If you have questions, feedback or suggestions, please let us know through the Webmaster Tools Help Forum.


Your fast pass through security

Webmaster level: All

Security checks are nobody's cup of tea. We've never seen people go through airport baggage checks for fun. But while security measures are often necessary, that doesn't mean they have to be painful. In that spirit, we’ve implemented several major improvements to make the Google Site Verification process faster, more straightforward, and perhaps even a pleasure to use—so you can get on with the tasks that matter to you.

New verification method recommendations


You’ll quickly notice the changes we’ve made to the verification page, namely the new tabbed interface. These tabs allow us to give greater visibility to the verification method that we think will be most useful to you, which is listed in the Recommended Method tab.


Our recommendation is just an educated guess, but sometimes guesses can be wrong. It’s possible that the method we recommend might not work for you. If this is the case, simply click the "Alternate methods" tab to see the other verification methods that are available. Verifying with an alternate method is just as powerful as verifying with a recommended method.

Our recommendations are computed from statistical data taken from users with a similar configuration to yours. For example, we can guess which verification methods might be successful by looking at the public hosting information for your website. In the future we plan to add more signals so that we can provide additional customized instructions along with more relevant recommendations.

New Google Sites Are Automatically Verified
For some of you, we’ve made the process even more effortless—Google Sites administrators are now automatically verified for all new sites that they create. When you create a new Google Site, it’ll appear verified in the details page. The same goes for adding or removing owners: when you edit the owners list in your Google Site's settings, the changes will automatically appear in Webmaster Tools.

One thing to note is that we’re unable to automatically verify preexisting Google Sites at this time. If you’d like to verify your older Google Sites, please continue to use the meta tag method already available.

We hope these enhancements help get you through security even faster. Should you get pulled over and have any questions, feel free to check out our Webmaster Help Forums.



Test your webmaster know-how!

Webmaster Level: All

We thought it might be fun and educational to create a quiz for webmasters about issues we commonly see in the Webmaster Help Forum. Together with our awesome Bionic Posters, we've tried to come up with questions and answers that reflect recurring concerns in the forum and some information that may not be well known. Some things to keep in mind when taking this quiz:
  • The quiz will be available to take from today until Wednesday, January 27 at 5PM PST.
  • It doesn't cover all facets of webmaster problems that arise, and—as with any test—it is at best only a fun way to test your webmaster prowess ;). We leave discussion of specific cases to the forum.
  • We've set up the quiz using our very own Google Docs. This means you won't see results right away, but we plan to write a follow-up blog post explaining answers and listing top scorers. Be sure to save your answers or print out your completed quiz before submitting! This way you can check your answers against the correct ones when we publish them.
  • It's just for fun!


Updated malware feature in Webmaster Tools

Webmaster Level: All

A little over six months ago we released a new malware diagnostic tool in Webmaster Tools with the help of Lucas Ballard from the anti-malware team. This feature has been a great success; many of you were interested to know if Google had detected malicious software in your site, and you used the tool's information to find and remove that malware and to fix the vulnerabilities in your servers.

Well, a few days ago we promoted the malware diagnostics tool from Labs to a full Webmaster Tools feature. You can now find it under the Diagnostics menu. Not only that, we also added support for malware notifications. As you may already know, if your site has malware we may show a warning message in our search results indicating that the site is potentially harmful. If this is the case, you should remove any dangerous content as soon as possible and patch the vulnerabilities in your server. After you've done that, you can request a malware review in order to have the warning for your site removed. What's new in our latest release is that the form to request a review is now right there with the rest of the malware data:

Screenshot of the new malware feature in Webmaster Tools

We've also made several other improvements under the covers. Now the data is updated almost four times faster than before. And we've improved our algorithms for identifying injected content and can pinpoint exploits that were difficult to catch when the feature first launched.

On the Webmaster Tools dashboard you'll still see a warning message when you have malware on one of your sites. This message has a link that will take you directly to the malware tool. Here at Google we take malware very seriously, and we're working on several improvements to this feature so that we can tell you ASAP if we detect that your site is potentially infected. Stay tuned!

For more details, check out the Malware & Hacked sites help forum.

