News and Tutorials from Votre Codeur | SEO | Website creation | Software creation

from web contents: Easier management of website verifications 2013

Hello everyone, this is a topic from the Google Webmaster Central blog:

Webmaster level: All

To help webmasters manage the verified owners for their websites in Webmaster Tools, we’ve recently introduced three new features:

  • Verification details view: You can now see the methods used to verify an owner for your site. In the Manage owners page for your site, you can now find the new Verification details link. This screenshot shows the verification details of a user who is verified using both an HTML file uploaded to the site and a meta tag:

    Where appropriate, the Verification details will have links to the correct URL on your site where the verification can be found to help you find it faster.

  • Requiring the verification method be removed from the site before unverifying an owner: You now need to remove the verification method from your site before unverifying an owner from Webmaster Tools. Webmaster Tools now checks the method that the owner used to verify ownership of the site, and will show an error message if the verification is still found. For example, this is the error message shown when an unverification was attempted while the DNS CNAME verification method was still found on the DNS records of the domain:

  • Shorter CNAME verification string: We’ve slightly modified the CNAME verification string to make it shorter, in order to support a larger number of DNS providers. Some systems limit the number of characters that can be used in DNS records, which meant that some users were not able to use the CNAME verification method. The CNAME verification method now uses fewer characters. Existing CNAME verifications will continue to be valid.

We hope these changes make it easier for you to use Webmaster Tools. As always, please post in our Verification forum if you have any questions or feedback.

This is a topic published in 2013.

from web contents: Quality links to your site 2013

A popular question on our Webmaster Help Forum is in regard to best practices for organic link building. There seems to be some confusion, especially among less experienced webmasters, on how to approach the topic. Different perspectives have been shared, and we would also like to explain our viewpoint on earning quality links.

If your site is rather new and still unknown, a good marketing technique is to get involved in the community around your topic. Interact and contribute on forums and blogs. Just keep in mind to contribute in a positive way, rather than spamming or soliciting for your site. Simply building a reputation can drive people to your site, and they will keep on visiting it and linking to it. If you offer long-lasting, unique and compelling content -- something that lets your expertise shine -- people will want to recommend it to others. Great content can serve this purpose as much as providing useful tools.

A promising way to create value for your target group and earn great links is to think of issues or problems your users might encounter. Visitors are likely to appreciate your site and link to it if you publish a short tutorial or a video providing a solution, or a practical tool. Survey or original research results can serve the same purpose, if they turn out to be useful for the target audience. Both methods grow your credibility in the community and increase visibility. This can help you gain lasting, merit-based links and loyal followers who generate direct traffic and "spread the word." Offering a number of solutions for different problems could evolve into a blog which can continuously affect the site's reputation in a positive way.

Humor can be another way to gain both great links and get people to talk about your site. With Google Buzz and other social media services constantly growing, entertaining content is being shared now more than ever. We've seen all kinds of amusing content, from ASCII art embedded in a site's source code to funny downtime messages used as a viral marketing technique to increase the visibility of a site. However, we do not recommend counting only on short-lived link-bait tactics. Their appeal wears off quickly and as powerful as marketing stunts can be, you shouldn't rely on them as a long-term strategy or as your only marketing effort.

It's important to clarify that any legitimate link building strategy is a long-term effort. There are those who advocate for short-lived, often spammy methods, but these are not advisable if you care for your site's reputation. Buying PageRank-passing links or randomly exchanging links are the worst ways of attempting to gather links and they're likely to have no positive impact on your site's performance over time. If your site's visibility in the Google index is important to you it's best to avoid them.

Directory entries are often mentioned as another way to promote young sites in the Google index. There are great, topical directories that add value to the Internet. But there are not many of them in proportion to those of lower quality. If you decide to submit your site to a directory, make sure it's on topic, moderated, and well structured. Mass submissions, which are sometimes offered as a quick work-around SEO method, are mostly useless and not likely to serve your purposes.

It can be a good idea to take a look at similar sites in other markets and identify the elements of those sites that might work well for yours, too. However, it's important not to just copy success stories but to adapt them, so that they provide unique value for your visitors.


Social bookmarks on YouTube enable users to share content easily


Finally, consider making linking to your site easier for less tech savvy users. Similar to the way we do it on YouTube, offering bookmarking services for social sites like Twitter or Facebook can help spread the word about the great content on your site and draw users' attention.

As usual, we'd like to hear your opinion. You're welcome to comment here in the blog, or join our Webmaster Help Forum community.


from web contents: Optimizing sites for TV 2013

Webmaster Level: All

Just as mobile phones make your site accessible to people on the go, Google TV makes your site easily viewable to people lounging on their couch. Google TV is a platform that combines your current TV programming with the web and, before long, more apps. It’s the web you love, with the TV you love, all available on the sofa made for you. Woohoo!

Because Google TV has a fully functioning web browser built in, users can easily visit your site from their TV. Current sites should already work, but you may want to provide your users with an enhanced TV experience -- what's called the “10-foot UI” (user interface). They'll be several feet away from the screen, not several inches away, and rather than a mouse on their desktop, they'll have a remote with a keyboard and a pointing device.

For example, here’s YouTube for desktop users versus what we’re calling “YouTube Leanback” -- our site optimized for large screens:


YouTube desktop version on the left, YouTube Leanback on the right

See our Spotlight Gallery for more examples of TV-optimized sites.

What does "optimized for TV" mean?

It means that, for the user sitting on their couch, your site on their TV is an even more enjoyable experience:
  • Text is large enough to be viewable from the sofa-to-TV distance.
  • Site navigation can be performed through button arrows on the remote (a D-pad), rather than with a mouse or touchpad.
  • Selectable elements provide a visual cue when selected (when you’re 10 feet away, it needs to be really, really obvious which selections are highlighted).
  • and more...
How can webmasters gain a general idea of their site’s appearance on TV?

First, remember that appearance alone doesn't tell you whether your site can be easily navigated by TV users (i.e. users with a remote rather than a mouse). With that said, here’s a quick workaround to give you a ballpark idea of how your site looks on TV. (For more in-depth info, please see the “Design considerations” in our optimization guide.)
  1. On a large monitor, make your window size 1920 x 1080.
  2. In a browser, visit your site at full screen.
  3. Zoom the browser to 1.5x the normal size. This is performed in different ways with different keyboards. For example, in Chrome if you press ctrl+ (press ctrl and + at the same time) twice, that’ll zoom the browser to nearly 1.5x the initial size.
  4. Move back 3 x (the distance between you and the monitor).
  5. Check out your site!
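The arithmetic behind this workaround can be sketched in a few lines. This is a rough small-angle model; the 0.028 cm pixel pitch is an assumption corresponding to a typical 24-inch 1080p monitor:

```python
import math

def visual_angle_deg(size_cm, distance_cm):
    """Angle subtended at the eye by an object of height `size_cm`
    viewed from `distance_cm`."""
    return math.degrees(2 * math.atan((size_cm / 2) / distance_cm))

PIXEL_CM = 0.028  # assumed pixel pitch of a 24-inch 1080p monitor

# 16px text viewed from a typical desk distance of 60 cm
desk_angle = visual_angle_deg(16 * PIXEL_CM, 60)

# The same text zoomed 1.5x, viewed from 3x the distance
tv_sim_angle = visual_angle_deg(16 * PIXEL_CM * 1.5, 3 * 60)

# Zooming 1.5x while tripling the distance halves the apparent size,
# which approximates how desktop-sized text shrinks when viewed on a
# TV from across the room.
print(round(tv_sim_angle / desk_angle, 2))  # → 0.5
```

This is why the simulated view looks cramped: anything sized for desk viewing subtends about half the visual angle under these conditions, so TV-optimized text needs to be correspondingly larger.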
And don’t forget, if you want to see your site with the real thing, Google TV enabled devices are now available in stores.

How can you learn more?

Our team just published a developer site, with TV optimization techniques, at code.google.com/tv/web/.


from web contents: Website user research and testing on the cheap 2013

Webmaster level: Intermediate

As the team responsible for tens of thousands of Google’s informational web pages, the Webmaster Team is here to offer tips and advice based on their experiences as hands-on webmasters.

If you’ve never tested or analyzed usage of your website, ask yourself if you really know whether your site is useful for your target audience. If you’re unsure, why not find out? For example, did you know that on average users scroll down 5.9 times as often as they scroll up, meaning that once page content is scrolled past, it is often “lost”? (See Jakob Nielsen’s findings on scrolling, where he advises that users don’t mind scrolling, but within limits.)

Also, check your analytics—are you curious about high bounce rates from any of your pages, or very short time-on-page metrics?

First, think about your user


The start of a web project—whether it’s completely new or a revamp of an existing site—is a great time to ask questions like:

  • How might users access your site—home, office, on-the-go?
  • How tech-savvy are your visitors?
  • How familiar are users with the subject matter of your website?

The answers to some of these questions can be valuable when making initial design decisions.

For instance, if the user is likely to be on the road, they might be short on time to find the information they need from your site, or be in a distracting environment and have a slow data connection—so a simple layout with single purpose would work best. Additionally, if you’re providing content for a less technical audience, make sure it’s not too difficult to access content—animation might provide a “wow” factor, but only if your user appreciates it and it’s not too difficult to get to the content.

Even without testing, building a basic user profile (or “persona”) can help shape your designs for the benefit of the user—this doesn’t have to be an exhaustive biography, but just some basic considerations of your user’s behavior patterns.

Simple testing


Testing doesn’t have to be a costly operation – friends and family can be a great resource. Some pointers:

  • Sample size: Just five people can be a large enough number of users to find common problems in your layouts and navigation (see Jakob Nielsen’s article on why using a small sample size is sufficient).
  • Choosing your testers: A range of different technical ability can be useful, but be sure to only focus on trends—for example, if more than 50% of your testers have the same usability issue, it’s likely a real problem—rather than individual issues encountered.
  • Testing location: If possible, visit the user in their home and watch how they use the site—observe how he/she normally navigates the web when relaxed and in their natural environment. Remote testing is also a possibility if you can’t make it in person—we’ve heard that Google+ hangouts can be used effectively for this (find out more about using Google+ hangouts).
  • How to test: Based on your site’s goals, define 4 or 5 simple tasks to do on your website, and let the user try to complete the tasks. Ask your testers to speak aloud so you can better understand their experiences and thought processes.
  • What to test: Basic prototypes in clickable image or document format (for example, PDF) or HTML can be used to test the basic interactions, without having to build out a full site for testing. This way, you can test out different options for navigation and layouts to see how they perform before implementing them.
  • What not to test: Focus on functionality rather than graphic design elements; viewpoints are often subjective. You would only get useful feedback on design from quantitative testing with large (200+) numbers of users (unless, for example, the colors you use on your site make the content unreadable, which would be good feedback!). One format for getting some useful feedback on the design can be to offer 5-6 descriptive keywords and ask your user to choose the most representative ones.
Overall, basic testing is most useful for seeing how your website’s functionality is working—the ease of finding information and common site interactions.
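The “more than 50% of your testers” rule of thumb above can be sketched in a few lines. The session notes here are entirely hypothetical, just to show the tallying:

```python
from collections import Counter

# Hypothetical notes: the issues each of five testers ran into
sessions = [
    {"missed search box", "confusing nav labels"},
    {"confusing nav labels"},
    {"confusing nav labels", "tiny tap targets"},
    {"missed search box", "confusing nav labels"},
    {"tiny tap targets"},
]

# Count how many testers hit each issue
counts = Counter(issue for session in sessions for issue in session)
threshold = len(sessions) / 2

# Flag only issues hit by more than half the testers: these are
# likely real problems rather than individual quirks.
real_problems = sorted(i for i, n in counts.items() if n > threshold)
print(real_problems)  # → ['confusing nav labels']
```

With five testers, anything three or more of them stumble over is worth fixing; a problem seen by only one tester may just be personal preference.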

Lessons learned


In case you’re still wondering whether research and testing are really worth it, here are a few simple things we confirmed that we wouldn’t have known if we hadn’t sat with actual users and watched them use our pages, or analyzed our web traffic.

  • Take care when using layouts that hide/show content: We found that when using scripts to expand and collapse long text passages, users often didn’t realize the extra content was available; collapsed JavaScript-rendered content is also effectively hidden when the user searches within the page (for example, using Control + F, which we’ve seen used often).


    Wireframe of layout tested, showing “zipped”
    content on the bottom left



    Final page design showing anchor links in the top
    and content laid out in the main body of the page


  • Check your language: Headings, link and button text are what catches the user’s eye the most when scanning the page. Avoid using “Learn more…” in link text—users seem averse to clicking on a link which implies they will need to learn something. Instead, just try to use a literal description of what content the user will get behind the link—and make sure link text makes sense and is easy to understand out of context, because that is often how it will be scanned. Be mindful about language and try to make button text descriptive, inviting and interesting.
  • Test pages on a slower connection: Try out your pages using different networks (for example, try browsing your website using the wifi at your local coffee shop or a friend’s house), especially if your target users are likely to be viewing your pages from a home connection that’s not as fast as your office network. We found a considerable improvement in CTR and time-on-site metrics in some cases when we made scripted animations much simpler and faster (hint: use Google’s Page Speed Online to check performance if you don’t have access to a slower Internet connection).
So if you’re caught up in a seemingly never-ending redevelopment cycle, save yourself some time in the future by investing a little up front through user profiling and basic testing, so that you’re more likely to choose the right approach for your site layout and architecture.

We’d love to hear from you in the comments: have you tried out website usability testing? If so, how did you get on, and what are your favorite simple and low-cost tricks to get the most out of it?

from web contents: State of the Index 2009 2013

Webmaster Level: All

At PubCon in Las Vegas in November 2009, I gave a "State of the Index" talk which covers what Google has done for users, web developers, and webmasters in the last year. I recently recreated it on video for those of you who didn't make it to the conference. You can watch it below:


And here are the slides if you'd like to follow along:



from web contents: URL removal explained, Part III: Removing content that you don't own 2013

Webmaster Level: All

Welcome to the third episode of our URL removals series! In episodes one and two, we talked about expediting the removal of content that's under your control and requesting expedited cache removals. Today, we're covering how to use Google's public URL removal tool to request removal of content from Google’s search results when the content originates on a website not under your control.

Google offers two tools that provide a way to request expedited removal of content:

1. Verified URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site of which you’re a verified owner in Webmaster Tools (like your blog or your company’s site)

2. Public URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site whose ownership you can’t verify (like your friend’s blog)

Sometimes a situation arises where the information you want to remove originates from a site that you don't own or can't control. Since each individual webmaster controls their site and their site’s content, the best way to update or remove results from Google is for the site owner (where the content is published) to either block crawling of the URL, modify the content source, or remove the page altogether. If the content isn't changed, it would just reappear in our search results the next time we crawled it. So the first step to remove content that's hosted on a site you don't own is to contact the owner of the website and request that they remove or block the content in question.
  • Removed or blocked content

    If the website owner removes a page, requests for the removed page should return a "404 Not Found" response or a "410 Gone" response. If they choose to block the page from search engines, then the page should either be disallowed in the site's robots.txt file or contain a noindex meta tag. Once one of these requirements is met, you can submit a removal request using the "Webmaster has already blocked the page" option.



    Sometimes a website owner will claim that they’ve blocked or removed a page but they haven’t technically done so. If they claim a page has been blocked you can double check by looking at the site’s robots.txt file to see if the page is listed there as disallowed.
    User-agent: *
    Disallow: /blocked-page/
    Another place to check if a page has been blocked is within the page’s HTML source code itself. You can visit the page and choose “View Page Source” from your browser. Is there a meta noindex tag in the HTML “head” section?
    <html>
    <head>
    <title>blocked page</title>
    <meta name="robots" content="noindex">
    </head>
    ...
    If they inform you that the page has been removed, you can confirm this by using an HTTP response testing tool like the Live HTTP Headers add-on for the Firefox browser. With this add-on enabled, you can request any URL in Firefox to test that the HTTP response is actually 404 Not Found or 410 Gone.

  • Content removed from the page

    Once you've confirmed that the content you're seeking to remove is no longer present on the page, you can request a cache removal using the 'Content has been removed from the page' option. This type of removal--usually called a "cache" removal--ensures that Google's search results will not include the cached copy or version of the old page, or any snippets of text from the old version of the page. Only the current updated page (without the content that's been removed) will be accessible from Google's search results. However, the current updated page can potentially still rank for terms related to the old content as a result of inbound links that still exist from external sites. For cache removal requests you’ll be asked to enter a "term that has been removed from the page." Be sure to enter a word that is not found on the current live page, so that our automated process can confirm the page has changed -- otherwise the request will be denied. Cache removals are covered in more detail in part two of the "URL removal explained" series.


  • Removing inappropriate webpages or images that appear in our SafeSearch filtered results

    Google introduced the SafeSearch filter with the goal of providing search results that exclude potentially offensive content. For situations where you find content that you feel should have been filtered out by SafeSearch, you can request that this content be excluded from SafeSearch filtered results in the future. Submit a removal request using the 'Inappropriate content appears in our SafeSearch filtered results' option.
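The checks described in this section (robots.txt disallow rules, the noindex meta tag, HTTP status codes, and cache-removal term absence) can be sketched with Python's standard library. This is a rough illustration of the same verifications you can do by hand, not an official tool; the regex-based HTML checks are deliberately crude:

```python
import re
import urllib.robotparser
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def is_disallowed(robots_txt: str, path: str) -> bool:
    """True if the given robots.txt rules block `path` for all crawlers."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch("*", path)

def has_noindex(html: str) -> bool:
    """Crude check for a robots noindex meta tag in the page head."""
    return re.search(
        r'<meta\s+name=["\']robots["\']\s+content=["\'][^"\']*noindex',
        html, re.IGNORECASE) is not None

def status_code(url: str) -> int:
    """HTTP status for `url`; 404 or 410 confirms the page is gone."""
    try:
        return urlopen(Request(url, method="HEAD")).status
    except HTTPError as e:
        return e.code

def term_removed(live_html: str, term: str) -> bool:
    """A cache-removal term must NOT appear on the current live page,
    or the request will be denied."""
    text = re.sub(r"<[^>]+>", " ", live_html)  # crude tag strip
    return term.lower() not in text.lower()

robots = "User-agent: *\nDisallow: /blocked-page/\n"
page = '<html><head><meta name="robots" content="noindex"></head></html>'
print(is_disallowed(robots, "/blocked-page/"))  # → True
print(has_noindex(page))                        # → True
print(term_removed(page, "secret"))             # → True
```

For real pages, viewing the source and the robots.txt file directly, or using a header-inspection tool as described above, is just as effective.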

If you encounter any issues with the public URL removal tool or have questions not addressed here, please post them to the Webmaster Help Forum or consult the more detailed removal instructions in our Help Center. If you do post to the forum, remember to use a URL shortening service to share any links to content you want removed.

Edit: Read the rest of this series:
Part I: Removing URLs & directories
Part II: Removing & updating cached content
Part IV: Tracking requests, what not to remove
Companion post: Managing what information is available about you online


from web contents: 'New software version' notifications for your site 2013

Webmaster level: All

One of the great things about working at Google is that we get to take advantage of an enormous amount of computing power to do some really cool things. One idea we tried out was to let webmasters know about their potentially hackable websites. The initial effort was successful enough that we thought we would take it one step further by expanding our efforts to cover other types of web applications—for example, more content management systems (CMSs), forum/bulletin-board applications, stat-trackers, and so on.

This time, however, our goal is not just to isolate vulnerable or hackable software packages, but to also notify webmasters about newer versions of the software packages or plugins they're running on their website. For example, there might be a Drupal module or Joomla extension update available but some folks might not have upgraded. There are a few reasons a webmaster might not upgrade to the newer version and one of the reasons could be that they just don't know a new version exists. This is where we think we can help. We hope to let webmasters know about new versions of their software by sending them a message via Webmaster Tools. This way they can make an informed decision about whether or not they would like to upgrade.

One of the ways we identify sites to notify is by parsing source code of web pages that we crawl. For example, WordPress and other CMS applications include a generator meta tag that specifies the version number. This has proven to be tremendously helpful in our efforts to notify webmasters. So if you're a software developer, and would like us to help you notify your users about newer versions of your software, a great way to start would be to include a generator meta tag that tells the version number of your software. If you're a plugin or a widget developer, including a version number in the source you provide to your users is a great way to help too.
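As an illustration of how such parsing might work, here is a minimal sketch using Python's standard html.parser to pull a generator meta tag out of a page. This is a simplified stand-in for whatever Google actually does at crawl time, not their implementation:

```python
from html.parser import HTMLParser

class GeneratorFinder(HTMLParser):
    """Collect the content of any <meta name="generator"> tags."""
    def __init__(self):
        super().__init__()
        self.generators = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name", "").lower() == "generator":
            self.generators.append(d.get("content", ""))

html = '<html><head><meta name="generator" content="WordPress 3.5.1"></head></html>'
finder = GeneratorFinder()
finder.feed(html)
print(finder.generators)  # → ['WordPress 3.5.1']
```

From a string like "WordPress 3.5.1", the software name and version number can then be compared against the latest released version to decide whether a notification is warranted.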

We've seen divided opinions over time about whether it's a good security practice to include a version number in source code, because it lets hackers or worm writers know that the website might be vulnerable to a particular type of exploit. But as Matt Mullenweg pointed out, "Where [a worm writer's] 1.0 might have checked for version numbers, 2.0 just tests [a website's] capabilities...". Meanwhile, the advantage of a version number is that it can help alert site owners when they need to update their site. In the end, we tend to think that including a version number can do more good than harm.

We plan to begin sending out the first of these messages soon and hope that webmasters find them useful! If you have any questions or feedback, feel free to comment here.


from web contents: Showing more results from a domain 2013

Webmaster Level: All

Today we’ve launched a change to our ranking algorithm that will make it much easier for users to find a large number of results from a single site. For queries that indicate a strong user interest in a particular domain, like [exhibitions at amnh], we’ll now show more results from the relevant site:



Prior to today’s change, only two results from www.amnh.org would have appeared for this query. Now, we determine that the user is likely interested in the American Museum of Natural History’s website, so seven results from the amnh.org domain appear. Since the user is looking for exhibitions at the museum, it’s far more likely that they’ll find what they’re looking for, faster. The last few results for this query are from other sites, preserving some diversity in the results.

We’re always reassessing our ranking and user interface, making hundreds of changes each year. We expect today’s improvement will help users find deeper results from a single site, while still providing diversity on the results page.



from web contents: Structured Data Testing Tool 2013

Webmaster level: All

Today we’re excited to share the launch of a shiny new version of the rich snippet testing tool, now called the structured data testing tool. The major improvements are:
  • We’ve improved how we display rich snippets in the testing tool to better match how they appear in search results.
  • The brand new visual design makes it clearer what structured data we can extract from the page, and how that may be shown in our search results.
  • The tool is now available in languages other than English to help webmasters from around the world build structured-data-enabled websites.
Here’s what it looks like:
The new structured data testing tool works with all supported rich snippets and authorship markup, including applications, products, recipes, reviews, and others.

Try it yourself and, as always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.

Written by Yong Zhu on behalf of the rich snippets testing tool team




from web contents: Making search-friendly mobile websites — now in 11 more languages 2013

Webmaster level: Intermediate

As more and more users worldwide with mobile devices access the Internet, it’s fantastic to see so many websites making their content accessible and useful for those devices. To help webmasters optimize their sites we launched our recommendations for smartphones, feature-phones, tablets, and Googlebot-friendly sites in June 2012.

We’re happy to announce that those recommendations are now also available in Arabic, Brazilian Portuguese, Dutch, French, German, Italian, Japanese, Polish, Russian, Simplified Chinese, and Spanish. US-based webmasters are welcome to read the UK-English version.

We welcome you to go through our recommendations, pick the configuration that you feel will work best with your website, and get ready to jump on the mobile bandwagon!

Thanks to the fantastic webmaster-outreach team in Dublin, Tokyo and Beijing for making this possible!


from web contents: Help Google index your videos 2013

Webmaster Level: All

The single best way to make Google aware of all your videos on your website is to create and maintain a Video Sitemap. Video Sitemaps provide Google with essential information about your videos, including the URLs for the pages where the videos can be found, the titles of the videos, keywords, thumbnail images, durations, and other information. The Sitemap also allows you to define the period of time for which each video will be available. This is particularly useful for content that has explicit viewing windows, so that we can remove the content from our index when it expires.

Once your Sitemap is created, you can submit the URL of the Sitemap file in Google Webmaster Tools or through your robots.txt file.
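A minimal Video Sitemap entry can be generated with Python's standard xml.etree module. This is a rough sketch: the URLs and values are placeholders, and the Video Sitemap documentation lists the full set of supported tags:

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
VID = "http://www.google.com/schemas/sitemap-video/1.1"
ET.register_namespace("", SM)
ET.register_namespace("video", VID)

# One <url> entry: the page hosting the video, plus video metadata
urlset = ET.Element(f"{{{SM}}}urlset")
url = ET.SubElement(urlset, f"{{{SM}}}url")
ET.SubElement(url, f"{{{SM}}}loc").text = "http://example.com/videos/some_video.html"
video = ET.SubElement(url, f"{{{VID}}}video")
ET.SubElement(video, f"{{{VID}}}title").text = "Grilling steaks for summer"
ET.SubElement(video, f"{{{VID}}}thumbnail_loc").text = "http://example.com/thumbs/123.jpg"
ET.SubElement(video, f"{{{VID}}}duration").text = "600"  # seconds
# Optional viewing window: remove from the index after this date
ET.SubElement(video, f"{{{VID}}}expiration_date").text = "2021-11-05T19:20:30+08:00"

xml_out = ET.tostring(urlset, encoding="unicode")
print(xml_out)

# To submit via robots.txt, add a line like:
#   Sitemap: http://example.com/video-sitemap.xml
```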

Once we have indexed a video, it may appear in our web search results in what we call a Video Onebox (a cluster of videos related to the queried topic) and in our video search property, Google Videos. A video result is immediately recognizable by its thumbnail, duration, and a description.

As an example, this is what a video result from CNN.com looks like on Google:


We encourage those of you with videos to submit Video Sitemaps and to keep them updated with your new content. Please also visit our recently updated Video Sitemap Help Center, and utilize our Sitemap Help Forum. If you've submitted a Video Sitemap file via Webmaster Tools and want to share your experiences or problems, you can do so here.


from web contents: Upcoming changes in Google’s HTTP Referrer 2013

Webmaster level: All

Protecting users’ privacy is a priority for us and it’s helped drive recent changes. Helping users save time is also very important; it’s explicitly mentioned as a part of our philosophy. Today, we’re happy to announce that Google Web Search will soon be using a new proposal to reduce latency when a user of Google’s SSL-search clicks on a search result with a modern browser such as Chrome.

Starting in April, for browsers with the appropriate support, we will be using the "referrer" meta tag to automatically simplify the referring URL that is sent by the browser when visiting a page linked from an organic search result. This results in a faster time to result and more streamlined experience for the user.

What does this mean for sites that receive clicks from Google search results? You may start to see "origin" referrers—Google’s homepages (see the meta referrer specification for further detail)—as a source of organic SSL search traffic. This change will only affect the subset of SSL search referrers which already didn’t include the query terms. Non-HTTPS referrals will continue to behave as they do today. Again, the primary motivation for this change is to remove an unneeded redirect so that signed-in users reach their destination faster.

Website analytics programs can detect these organic search requests by detecting bare Google host names using SSL (like "https://www.google.co.uk/"). Webmasters will continue to see the same data in Webmaster Tools—just as before, you'll receive an aggregated list of the top search queries that drove traffic to your site.
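As an illustration, here is a minimal sketch of how an analytics program might classify such a referrer. The hostname check is an assumption based on the "bare Google host names using SSL" description above, not Google's own detection logic:

```python
from urllib.parse import urlparse

def is_google_origin_referrer(referrer):
    """Heuristically spot an 'origin' referrer from Google SSL search:
    a bare Google hostname over HTTPS with no path or query string."""
    parts = urlparse(referrer)
    return (
        parts.scheme == "https"
        and (parts.hostname or "").startswith("www.google.")
        and parts.path in ("", "/")
        and not parts.query
    )

print(is_google_origin_referrer("https://www.google.co.uk/"))           # True
print(is_google_origin_referrer("http://www.google.com/search?q=foo"))  # False
```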

We will continue to look into further improvements to how search query data is surfaced through Webmaster Tools. If you have questions, feedback or suggestions, please let us know through the Webmaster Tools Help Forum.


from web contents: Your fast pass through security 2013

Webmaster level: All

Security checks are nobody's cup of tea. We've never seen people go through airport baggage checks for fun. But while security measures are often necessary, that doesn't mean they have to be painful. In that spirit, we’ve implemented several major improvements to make the Google Site Verification process faster, more straightforward, and perhaps even a pleasure to use—so you can get on with the tasks that matter to you.

New verification method recommendations


You’ll quickly notice the changes we’ve made to the verification page, namely the new tabbed interface. These tabs allow us to give greater visibility to the verification method that we think will be most useful to you, which is listed in the Recommended Method tab.


Our recommendation is just an educated guess, but sometimes guesses can be wrong. It’s possible that the method we recommend might not work for you. If this is the case, simply click the "Alternate methods" tab to see the other verification methods that are available. Verifying with an alternate method is just as powerful as verifying with a recommended method.

Our recommendations are computed from statistical data taken from users with a similar configuration to yours. For example, we can guess which verification methods might be successful by looking at the public hosting information for your website. In the future we plan to add more signals so that we can provide additional customized instructions along with more relevant recommendations.

New Google Sites Are Automatically Verified
For some of you, we’ve made the process even more effortless—Google Sites administrators are now automatically verified for all new sites that they create. When you create a new Google Site, it’ll appear verified in the details page. The same goes for adding or removing owners: when you edit the owners list in your Google Site's settings, the changes will automatically appear in Webmaster Tools.

One thing to note is that we’re unable to automatically verify preexisting Google Sites at this time. If you’d like to verify your older Google Sites, please continue to use the meta tag method already available.

We hope these enhancements help get you through security even faster. Should you get pulled over and have any questions, feel free to check out our Webmaster Help Forums.



from web contents: Test your webmaster know-how! 2013

Webmaster Level: All

We thought it might be fun and educational to create a quiz for webmasters about issues we commonly see in the Webmaster Help Forum. Together with our awesome Bionic Posters, we've tried to come up with questions and answers that reflect recurring concerns in the forum and some information that may not be well known. Some things to keep in mind when taking this quiz:
  • The quiz will be available to take from today until Wednesday, January 27 at 5PM PST.
  • It doesn't cover all facets of webmaster problems that arise, and—as with any test—it is at best only a fun way to test your webmaster prowess ;). We leave discussion of specific cases to the forum.
  • We've set up the quiz using our very own Google Docs. This means you won't see results right away, but we plan to write a follow-up blog post explaining answers and listing top scorers. Be sure to save your answers or print out your completed quiz before submitting! This way you can check your answers against the correct ones when we publish them.
  • It's just for fun!


from web contents: Updated malware feature in Webmaster Tools 2013

Webmaster Level: All

A little over six months ago we released a new malware diagnostic tool in Webmaster Tools with the help of Lucas Ballard from the anti-malware team. This feature has been a great success; many of you were interested to know if Google had detected malicious software in your site, and you used the tool's information to find and remove that malware and to fix the vulnerabilities in your servers.

Well, a few days ago we promoted the malware diagnostics tool from Labs to a full Webmaster Tools feature. You can now find it under the Diagnostics menu. Not only that, we also added support for malware notifications. As you may already know, if your site has malware we may show a warning message in our search results indicating that the site is potentially harmful. If this is the case, you should remove any dangerous content as soon as possible and patch the vulnerabilities in your server. After you've done that, you can request a malware review in order to have the warning for your site removed. What's new in our latest release is that the form to request a review is now right there with the rest of the malware data:

Screenshot of the new malware feature in Webmaster Tools

We've also made several other improvements under the covers. Now the data is updated almost four times faster than before. And we've improved our algorithms for identifying injected content and can pinpoint exploits that were difficult to catch when the feature first launched.

On the Webmaster Tools dashboard you'll still see a warning message when you have malware on one of your sites. This message has a link that will take you directly to the malware tool. Here at Google we take malware very seriously, and we're working on several improvements to this feature so that we can tell you ASAP if we detect that your site is potentially infected. Stay tuned!

For more details, check out the Malware & Hacked sites help forum.



from web contents: Running desktop and mobile versions of your site 2013

(This post was largely translated from our Japanese version of the Webmaster Central Blog.)

Recently I introduced several methods to ensure your mobile site is properly indexed by Google. Today I'd like to share information useful for webmasters who manage both desktop and mobile phone versions of a site.

One of the most common problems for webmasters who run both mobile and desktop versions of a site is that the mobile version of the site appears for users on a desktop computer, or that the desktop version of the site appears when someone finds them from a mobile device. In dealing with this scenario, here are two viable options:

Redirect mobile users to the correct version
When a mobile user or crawler (like Googlebot-Mobile) accesses the desktop version of a URL, you can redirect them to the corresponding mobile version of the same page. Google notices the relationship between the two versions of the URL and displays the standard version for searches from desktops and the mobile version for mobile searches.

If you redirect users, please make sure that the content on the corresponding mobile/desktop URL matches as closely as possible. For example, if you run a shopping site and there's an access from a mobile phone to a desktop-version URL, make sure that the user is redirected to the mobile version of the page for the same product, and not to the homepage of the mobile version of the site. We occasionally find sites using this kind of redirect in an attempt to boost their search rankings, but this practice only results in a negative user experience, and so should be avoided at all costs.

On the other hand, when there's an access to a mobile-version URL from a desktop browser or by our web crawler, Googlebot, it's not necessary to redirect them to the desktop-version. For instance, Google doesn't automatically redirect desktop users from their mobile site to their desktop site, instead they include a link on the mobile-version page to the desktop version. These links are especially helpful when a mobile site doesn't provide the full functionality of the desktop version -- users can easily navigate to the desktop-version if they prefer.
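As a sketch of the first option, suppose the desktop site lives at www.example.com and the mobile site at m.example.com (hypothetical hosts) with matching paths. The key point from above is that the redirect preserves the path, so a product page maps to the same product page and never to the mobile homepage:

```python
from urllib.parse import urlsplit, urlunsplit

# Illustrative tokens only; real mobile detection is more involved.
MOBILE_TOKENS = ("googlebot-mobile", "iphone", "android")

def redirect_target(url, user_agent):
    """Return the mobile-equivalent URL for mobile visitors, or None."""
    scheme, host, path, query, frag = urlsplit(url)
    is_mobile = any(token in user_agent.lower() for token in MOBILE_TOKENS)
    if is_mobile and host == "www.example.com":
        # Same path and query: the same product page, not the homepage.
        return urlunsplit((scheme, "m.example.com", path, query, frag))
    return None
```

For example, a request for http://www.example.com/products/42 from an iPhone would be redirected to http://m.example.com/products/42, while a desktop browser would get no redirect.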

Switch content based on User-agent
Some sites have the same URL for both desktop and mobile content, but change their format according to User-agent. In other words, both mobile users and desktop users access the same URL (i.e. no redirects), but the content/format changes slightly according to the User-agent. In this case, the same URL will appear for both mobile search and desktop search, and desktop users can see a desktop version of the content while mobile users can see a mobile version of the content.

However, note that if you fail to configure your site correctly, your site could be considered to be cloaking, which can lead to your site disappearing from our search results. Cloaking refers to an attempt to boost search result rankings by serving different content to Googlebot than to regular users. This causes problems such as less relevant results (pages appear in search results even though their content is actually unrelated to what users see/want), so we take cloaking very seriously.

So what does "the page that the user sees" mean if you provide both versions with a URL? As I mentioned in the previous post, Google uses "Googlebot" for web search and "Googlebot-Mobile" for mobile search. To remain within our guidelines, you should serve the same content to Googlebot as a typical desktop user would see, and the same content to Googlebot-Mobile as you would to the browser on a typical mobile device. It's fine if the content served to Googlebot is different from the content served to Googlebot-Mobile.
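In other words, each crawler should be grouped with the audience it stands in for. A minimal sketch of that grouping follows; the User-agent tokens are illustrative, not an exhaustive list:

```python
def version_for(user_agent):
    """Serve Googlebot-Mobile what a mobile browser gets, and Googlebot
    what a desktop browser gets, per the guideline described above."""
    ua = user_agent.lower()
    if any(token in ua for token in ("googlebot-mobile", "iphone", "android")):
        return "mobile"
    return "desktop"  # includes regular Googlebot and desktop browsers
```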

One example of how you could be unintentionally detected for cloaking is if your site returns a message like "Please access from mobile phones" to desktop browsers, but then returns a full mobile version to both crawlers (so Googlebot receives the mobile version). In this case, the page which web search users see (e.g. "Please access from mobile phones") is different from the page which Googlebot crawls (e.g. "Welcome to my site"). Again, we detect cloaking because we want to serve users the same relevant content that Googlebot or Googlebot-Mobile crawled.

Diagram of serving content from your mobile-enabled site


We're working on a daily basis to improve search results and solve problems, but because the relationship between PC and mobile versions of a web site can be nuanced, we appreciate the cooperation of webmasters. Your help will result in more mobile content being indexed by Google, improving the search results provided to users. Thank you for your cooperation in improving the mobile search user experience.


from web contents: Troubleshooting Instant Previews in Webmaster Tools 2013

Webmaster level: All

In November, we launched Instant Previews to help users better understand if a particular result was relevant for their search query. Since launch, our Instant Previews team has been keeping an eye on common complaints and problems related to how pages are rendered for Instant Previews.

When we see issues with preview images, they are frequently due to:
  • Blocked resources due to a robots.txt entry
  • Cloaking: Erroneous content being served to the Googlebot user-agent
  • Poor alternative content when Flash is unavailable
To help webmasters diagnose these problems, we have a new Instant Preview tool in the Labs section of Webmaster Tools (in English only for now).
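The first cause, blocked resources, is also easy to check for yourself. Here is a sketch using Python's standard robots.txt parser with a hypothetical robots.txt that blocks a stylesheet directory; CSS blocked this way can leave a preview rendered without styling:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks all crawlers from the CSS directory.
robots_txt = """\
User-agent: *
Disallow: /assets/css/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "http://www.example.com/assets/css/main.css"))  # False
print(parser.can_fetch("Googlebot", "http://www.example.com/index.html"))           # True
```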



Here, you can input the URL of any page on your site. We will then fetch the page from your site and try to render it both as it would display in Chrome and through our Instant Preview renderer. Please keep in mind that both of these renders are done using a recent build of WebKit which does not include plugins such as Flash or Silverlight, so it's important to consider the value of providing alternative content for these situations. Alternative content can be helpful to search engines, and visitors to your site without the plugin would benefit as well.

Below the renders, you’ll also see automated feedback on problems our system can detect such as missing or roboted resources. And, in the future, we plan to add more informative and timely feedback to help improve your Instant Previews!

Please direct your questions and feedback to the Webmaster Forum.


from web contents: Verification time savers — Analytics included! 2013

Webmaster Level: All

Nobody likes to duplicate effort. Unfortunately, sometimes it's a fact of life. If you want to use Google Analytics, you need to add a JavaScript tracking code to your pages. When you're ready to verify ownership of your site in other Google products (such as Webmaster Tools), you have to add a meta tag, HTML file or DNS record to your site. They're very similar tasks, but also completely independent. Until today.

You can now use a Google Analytics JavaScript snippet to verify ownership of your website. If you already have Google Analytics set up, verifying ownership is as simple as clicking a button.


This only works with the newer asynchronous Analytics JavaScript, so if you haven't migrated yet, now is a great time. If you haven't set up Google Analytics or verified yet, go ahead and set up Google Analytics first, then come verify ownership of your site. It'll save you a little time — who doesn't like that? Just as with all of our other verification methods, the Google Analytics JavaScript needs to stay in place on your site, or your verification will expire. You also need to remain an administrator on the Google Analytics account associated with the JavaScript snippet.

Don't forget that once you've verified ownership, you can add other verified owners quickly and easily through the Verification Details page. There's no need for each owner to manually verify ownership. More effort and time saved!


We've also introduced an improved interface for verification. The new verification page gives you more information about each verification method. In some cases, we can now provide detailed instructions about how to complete verification with your specific domain registrar or provider. If your provider is included, there's no need to dig through their documentation to figure out how to add a verification DNS record — we'll walk you through it.


The time you save using these new verification features might not be enough to let you take up a new hobby, but we hope it makes the verification process a little bit more pleasant. As always, please visit the Webmaster Help Forum if you have any questions.


from web contents: Beyond Times and Arial - The New Web Safe Fonts 2013

Webmaster level: All

In the past, when you created a website or web app, you were largely limited to a few select “web safe” fonts such as Times and Arial. If you deviated from these fonts, you were required to use Adobe Flash or to embed text in images, which introduced a whole new set of trade-offs. For example, images aren’t semantic, cannot be translated into other languages automatically, and can be much larger in file size than text. In addition, text in images cannot be copied to a user’s clipboard, read with screen-reading software, or easily indexed by search engines.

The good news is, with Google Web Fonts it is now possible to use hundreds of web safe fonts on your web pages. Launched last May, Google Web Fonts allows you to simply choose the font(s) you’d like to use on your webpage, blog, or web app, and embed the snippet of HTML and CSS. In about 30 seconds, you can have beautiful fonts on your pages that will render correctly in the large majority of popular modern web browsers. No longer will you need to use images or Flash to embed the font of your choice.
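For example, embedding a font boils down to one stylesheet link plus a CSS rule along these lines; the font family used here, Tangerine, is just one example from the service's catalog:

```html
<link href="http://fonts.googleapis.com/css?family=Tangerine" rel="stylesheet" type="text/css">
<style>
  h1 { font-family: 'Tangerine', serif; }
</style>
```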

Unlike Times and Arial, which are references to fonts installed on a user’s local machine, web fonts are served via a browser request (much like an image would be served). That means you can push any web font to a user’s machine. Users will be delighted when they realize these fonts behave just as any other text in Arial would behave.


Some example web fonts, offered by the Google Web Fonts service


The adoption of the web font technology has been rapid. Google Web Fonts now serves roughly 50 million daily requests[1], across roughly 800,000 unique websites[2], and is growing at about 30% each month. Here at Google, we’re excited about the potential for web fonts to change the very fabric of the web. Beautiful typography makes the web more pleasant to browse, expressive, and interesting.

Here’s to a beautiful Web!



[1] A request is a single call to the Google Font API for one or more fonts.
[2] We count a unique website as unique domains, except that “www” subdomains are not counted. For example, www.myblog.com and myblog.com would count as one domain. However, sam.myblog.com and sally.myblog.com would count as two domains.
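The counting rule in [2] can be sketched in a few lines of Python; this is purely illustrative, not the counting code behind the statistics above:

```python
def canonical_domain(host):
    """Ignore a leading "www.", but keep other subdomains distinct."""
    return host[4:] if host.startswith("www.") else host

sites = ["www.myblog.com", "myblog.com", "sam.myblog.com", "sally.myblog.com"]
unique = {canonical_domain(site) for site in sites}
print(sorted(unique))  # ['myblog.com', 'sally.myblog.com', 'sam.myblog.com']
```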

from web contents: Accessing search query data for your sites 2013

Webmaster level: All

SSL encryption on the web has been growing by leaps and bounds. As part of our commitment to provide a more secure online experience, today we announced that SSL Search on https://www.google.com will become the default experience for signed-in users on google.com. This change will be rolling out over the next few weeks.

What is the impact of this change for webmasters? Today, a web site accessed through organic search results on http://www.google.com (non-SSL) can see both that the user came from google.com and their search query. (Technically speaking, the user’s browser passes this information via the HTTP referrer field.) However, for organic search results on SSL search, a web site will only know that the user came from google.com.

Webmasters can still access a wealth of search query data for their sites via Webmaster Tools. For sites which have been added and verified in Webmaster Tools, webmasters can do the following:
  • View the top 1000 daily search queries and top 1000 daily landing pages for the past 30 days.
  • View the impressions, clicks, clickthrough rate (CTR), and average position in search results for each query, and compare this to the previous 30 day period.
  • Download this data in CSV format.
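As a sketch of what you can do with that downloaded data, here is the CTR calculation (clicks divided by impressions) applied to a CSV export. The column names are assumptions for illustration, not the exact headers of the Webmaster Tools export:

```python
import csv
import io

# Stand-in for the contents of a downloaded search-query CSV file.
data = """Query,Impressions,Clicks
blue widgets,1000,50
widget repair,400,8
"""

ctrs = {}
for row in csv.DictReader(io.StringIO(data)):
    # Clickthrough rate as a percentage: clicks / impressions * 100.
    ctrs[row["Query"]] = 100.0 * int(row["Clicks"]) / int(row["Impressions"])

print(ctrs["blue widgets"])  # 5.0
```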
In addition, users of Google Analytics’ Search Engine Optimization reports have access to the same search query data available in Webmaster Tools and can take advantage of its rich reporting capabilities.

We will continue to look into further improvements to how search query data is surfaced through Webmaster Tools. If you have questions, feedback or suggestions, please let us know through the Webmaster Tools Help Forum.
