News and tutorials from Votre Codeur | SEO | Website creation | Software development

A faster image search

Webmaster level: all

People looking for images on Google often want to browse through many images, looking both at the images and their metadata (detailed information about the images). Based on feedback from both users and webmasters, we redesigned Google Images to provide a better search experience. In the next few days, you’ll see image results displayed in an inline panel so it’s faster, more beautiful, and more reliable. You will be able to quickly flip through a set of images by using the keyboard. If you want to go back to browsing other search results, just scroll down and pick up right where you left off.

Screenshot of new Google Images results using the query nasa earth as an example


Here’s what it means for webmasters:
  • We now display detailed information about the image (the metadata) right underneath the image in the search results, instead of redirecting users to a separate landing page.
  • We’re featuring some key information much more prominently next to the image: the title of the page hosting the image, the domain name it comes from, and the image size.
  • The domain name is now clickable, and we also added a new button to visit the page the image is hosted on. This means that there are now four clickable targets to the source page instead of just two. In our tests, we’ve seen a net increase in the average click-through rate to the hosting website.
  • The source page will no longer load up in an iframe in the background of the image detail view. This speeds up the experience for users, reduces the load on the source website’s servers, and improves the accuracy of webmaster metrics such as pageviews. As usual, image search query data is available in Top Search Queries in Webmaster Tools.
As always, please ask on our Webmaster Help forum if you have questions.


A reminder about selling links that pass PageRank

Webmaster level: all

Google has said for years that selling links that pass PageRank violates our quality guidelines. We continue to reiterate that guidance periodically to help remind site owners and webmasters of that policy.

Please be wary if someone approaches you and wants to pay you for links or "advertorial" pages on your site that pass PageRank. Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations. The consequences for a link-selling site start with losing trust in Google's search results, as well as a reduction of the site's visible PageRank in the Google Toolbar. The consequences can also include lower rankings for that site in Google's search results.

If you receive a warning for selling links that pass PageRank in Google's Webmaster Tools, you'll see a notification message to look for "possibly artificial or unnatural links on your site pointing to other sites that could be intended to manipulate PageRank." That's an indication that your site has lost trust in Google's index.

To address the issue, make sure that any paid links on your site don't pass PageRank. You can remove any paid links or advertorial pages, or make sure that any paid hyperlinks have the rel="nofollow" attribute. After ensuring that no paid links on your site pass PageRank, you can submit a reconsideration request and if you had a manual webspam action on your site, someone at Google will review the request. After the request has been reviewed, you'll get a notification back about whether the reconsideration request was granted or not.
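For reference, a paid link that doesn't pass PageRank carries the nofollow attribute; for example (the advertiser URL below is a placeholder):

    <a href="http://www.advertiser-example.com/" rel="nofollow">Advertiser name</a>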

We do take this issue very seriously, so we recommend you avoid selling (and buying) links that pass PageRank in order to prevent loss of trust, lower PageRank in the Google Toolbar, lower rankings, or in an extreme case, removal from Google's search results.


Introducing a new Rich Snippets format: Events

Webmaster Level: All

Last year we introduced Rich Snippets, a new feature that makes it possible to surface structured data from your pages on Google's search results. So far, user reaction to Rich Snippets has been enthusiastic -- after all, Rich Snippets help people make more informed clicks and find what they need even faster.

We originally introduced Rich Snippets with two formats: reviews and people. Later in the year we added support for marking up video information which is used to improve Video Search. Today, we're excited to kick off the new year by adding support for events.

Events markup is based on the hCalendar microformat. Here's an example of what the new events Rich Snippets will look like:


The new format shows links to specific events on the page along with dates and locations. It provides a fast and convenient way for users to determine if a page has events they may be interested in.
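As a rough sketch of the underlying markup (class names per the hCalendar microformat; the event details are invented), a single event listing could look like this:

    <div class="vevent">
      <a class="url summary" href="http://www.example.com/events/spring-concert">Spring Concert</a>
      <abbr class="dtstart" title="2010-03-20T19:30">March 20, 7:30pm</abbr>
      <span class="location">City Hall, Springfield</span>
    </div>

The summary, dtstart, and location properties correspond to the event name, date, and venue surfaced in the snippet.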

If you have event listings on your site, we encourage you to review the events documentation we've prepared to help you get started. Please note, however, that marking up your content is not a guarantee that Rich Snippets will show for your site. Just as we did for previous formats, we will take a gradual approach to incorporating the new event snippets to ensure a great user experience along the way.

Stay tuned for more developments in Rich Snippets throughout the year!


Finding Places on the Web: Rich Snippets for Local Search

Webmaster Level: All
Cross-posted from the Lat Long Blog.

We’re sharing some news today that we hope webmasters will find exciting. As you know, we’re constantly working to organize the world’s information - be it textual, visual, geographic or any other type of useful data. From a local search perspective, part of this effort means looking for all the great web pages that reference a particular place. The Internet is teeming with useful information about local places and points of interest, and we do our best to deliver relevant search results that help shed light on locations all across the globe.

Today, we’re announcing that your use of Rich Snippets can help people find the web pages you’ve created that may reference a specific place or location. By using structured HTML formats like hCard to mark up the business or organization described on your page, you make it easier for search engines like Google to properly classify your site, recognize and understand that its content is about a particular place, and make it discoverable to users on Place pages.
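As a quick sketch, hCard markup describing a business might look like the following (class names per the hCard microformat; the business details are invented):

    <div class="vcard">
      <span class="fn org">Joe's Coffee</span>
      <div class="adr">
        <span class="street-address">123 Main Street</span>,
        <span class="locality">Springfield</span>
      </div>
      <span class="tel">+1-555-0100</span>
    </div>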

You can get started by reviewing these tips for using Rich Snippets for Local Search. Whether you’re creating a website for your own business, an article on a newly opened restaurant, or a guide to the best places in town, your precise markup helps associate your site with the search results for that particular place. Though this markup does not guarantee that your site will be shown in search results, we’re excited to expand support for making the web better organized around real world places.


Making Websites Mobile Friendly

Webmaster level: Intermediate

We’ve noticed a rise in the number of questions from webmasters about how best to structure a website for mobile phones and how websites can best interact with Googlebot-Mobile. In this post we’ll explain the current situation and give you specific recommendations you can implement now.

Some Background

Let’s start with a simple question: what do we mean by “mobile phone” when talking about mobile-friendly websites?

A good way to answer this question is to think about the capabilities of the mobile phone’s web browser, especially in relation to the capabilities of modern desktop browsers. To simplify matters, we can break mobile phones into a few classifications:

  1. Traditional mobile phones: Phones with browsers that cannot render normal desktop webpages. This includes browsers for cHTML (iMode), WML, WAP, and the like.
  2. Smartphones: Phones with browsers that render normal desktop pages, at least to some extent. This category includes a diversity of devices, such as Windows Phone 7, Blackberry devices, iPhones, and Android phones, as well as tablets and eBook readers.

    We can further break down this category by support for HTML5:

    • Devices with browsers that do not support HTML5
    • Devices with browsers that support HTML5

Once upon a time, mobile phones connected to the Internet using browsers with limited rendering capabilities; but this is clearly a changing situation with the fast rise of smartphones which have browsers that rival the full desktop experience. As such, it’s important to note that the distinction we are making here is based on the current situation as we see it and might change in the future.

Googlebot and Mobile Content

Google has two crawlers relevant to this topic: Googlebot and Googlebot-Mobile. Googlebot crawls desktop-browser type of webpages and content embedded in them and Googlebot-Mobile crawls mobile content. The questions we’re seeing more of can be summed up as follows:

Given the diversity of capabilities of mobile web browsers, what kind of content should I serve to Googlebot-Mobile?

The answer lies in the User-agent that Googlebot-Mobile supplies when crawling. There are several User-agent strings in use by Googlebot-Mobile, all of which use this format:

[Phone name(s)] (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)

To decide which content to serve, assess which content your website has that best serves the phone(s) in the User-agent string. A full list of Googlebot-Mobile User-agents can be found here.

Notice that we currently do not crawl with Googlebot-Mobile using a smartphone User-agent string. Thus at the current time, a correctly-configured content serving system will serve Googlebot-Mobile content only for the traditional phones described above, because that’s what the User-agent strings in use today dictate. This may change in the future, and if so, it may mean there would be a new Googlebot-Mobile User-agent string.

For now, we expect smartphones to handle desktop experience content so there is no real need for mobile-specific effort from webmasters. However, for many websites it may still make sense for the content to be formatted differently for smartphones, and the decision to do so should be based on how you can best serve your users.

URL Structure for Mobile Content

The next set of questions ask about the URLs mobile content should be served from. Let’s look in detail at some common use cases.

Websites with only Desktop Experience Content

Most websites currently have only one version of their content, namely in HTML that is designed for desktop web browsers. This means all browsers access the content from the same URL.

These websites may not be serving traditional mobile phone users. The quality experienced by their smartphone users depends on the mobile browser they are using and it could be as good as browsing from the desktop.

If you serve only desktop experience content for all User Agents, you should do so for Googlebot-Mobile too; that is, treat Googlebot-Mobile as you treat all other or unknown User Agents. In these cases, Google may modify your webpages for an improved mobile experience.

Websites with Dedicated Mobile Content

Many websites have content specifically optimized for mobile users. The content could be simply reformatted for the typically smaller mobile displays, or it could be in a different format (e.g., served using WAP, etc.).

A very common question we see is: Does it matter if the different types of content are served from the same URL or from different URLs? For example, some websites have www.example.com as the URL desktop browsers are meant to access and have m.example.com or wap.example.com for the different mobile devices. Other websites serve all types of content from just one URL structure like www.example.com.

For Googlebot and Googlebot-Mobile, it does not matter what the URL structure is as long as it returns exactly what a user sees too. For example, if you redirect mobile users from www.example.com to m.example.com, that will be recognized by Googlebot-Mobile and both websites will be crawled and added to the correct index. In this case, use a 301 redirect for both users and Googlebot-Mobile.

If you serve all types of content from www.example.com, i.e. serving desktop-optimized content or mobile-optimized content from the same URL depending on the User-agent, this will also lead to correct crawling by Googlebot and Googlebot-Mobile. This is not considered cloaking by Google.

It is worth repeating that regardless of URL structure, you must correctly detect the User-agent as given by your users and Googlebot-Mobile, and serve both the same content. Don’t forget to keep the default content, the desktop-optimized content, for when an unknown User-agent requests it.
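On an Apache server, one way to sketch this detection is with mod_rewrite. The browser tokens below are illustrative only (a production list would be much longer), and the mobile hostname is a placeholder; Googlebot-Mobile is included because it crawls with traditional-phone User-agents:

    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (Googlebot-Mobile|DoCoMo|UP\.Browser|MIDP) [NC]
    RewriteRule ^(.*)$ http://m.example.com/$1 [R=301,L]

Any User-agent that doesn't match falls through to the default desktop-optimized content, as recommended above.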

Mobile Sitemaps in Webmaster Tools

Finally, we receive many questions about what URLs to put in Mobile Sitemaps. As explained in our Mobile Sitemaps Help Center articles, you should include only mobile content URLs in Mobile Sitemaps, even if these URLs also return non-mobile content when accessed by a non-mobile User-agent.
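For reference, a Mobile Sitemap is a regular Sitemap with an extra mobile namespace and an empty <mobile:mobile/> tag on each entry; the URL below is a placeholder:

    <?xml version="1.0" encoding="UTF-8" ?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
      <url>
        <loc>http://m.example.com/article.html</loc>
        <mobile:mobile/>
      </url>
    </urlset>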

More Questions?

A good place to start is our Mobile Sites Help Center articles and the relevant sections in our Search Engine Optimization Starter Guide. We also created a thread in our forums for you to ask questions about this post.


Easier management of website verifications

Webmaster level: All

To help webmasters manage the verified owners for their websites in Webmaster Tools, we’ve recently introduced three new features:

  • Verification details view: You can now see the methods used to verify an owner for your site. In the Manage owners page for your site, you can now find the new Verification details link. This screenshot shows the verification details of a user who is verified using both an HTML file uploaded to the site and a meta tag:

    Where appropriate, the Verification details will have links to the correct URL on your site where the verification can be found to help you find it faster.

  • Requiring the verification method be removed from the site before unverifying an owner: You now need to remove the verification method from your site before unverifying an owner from Webmaster Tools. Webmaster Tools now checks the method that the owner used to verify ownership of the site, and will show an error message if the verification is still found. For example, this is the error message shown when an unverification was attempted while the DNS CNAME verification method was still found on the DNS records of the domain:

  • Shorter CNAME verification string: We’ve slightly modified the CNAME verification string to make it shorter, so that it works with a larger number of DNS providers. Some systems limit the number of characters that can be used in DNS records, which meant that some users were not able to use the CNAME verification method. The shortened string avoids those limits. Existing CNAME verifications will continue to be valid.

We hope these changes make it easier for you to use Webmaster Tools. As always, please post in our Verification forum if you have any questions or feedback.


Quality links to your site

A popular question on our Webmaster Help Forum concerns best practices for organic link building. There seems to be some confusion, especially among less experienced webmasters, about how to approach the topic. Different perspectives have been shared, and we would also like to explain our viewpoint on earning quality links.

If your site is rather new and still unknown, a good marketing technique is to get involved in the community around your topic. Interact and contribute on forums and blogs. Just keep in mind to contribute in a positive way, rather than spamming or soliciting for your site. Building a reputation alone can drive people to your site, and they will keep visiting it and linking to it. If you offer long-lasting, unique and compelling content -- something that lets your expertise shine -- people will want to recommend it to others. Great content can serve this purpose as much as providing useful tools.

A promising way to create value for your target group and earn great links is to think of issues or problems your users might encounter. Visitors are likely to appreciate your site and link to it if you publish a short tutorial or a video providing a solution, or a practical tool. Survey or original research results can serve the same purpose, if they turn out to be useful for the target audience. Both methods grow your credibility in the community and increase visibility. This can help you gain lasting, merit-based links and loyal followers who generate direct traffic and "spread the word." Offering a number of solutions for different problems could evolve into a blog which can continuously affect the site's reputation in a positive way.

Humor can be another way to gain both great links and get people to talk about your site. With Google Buzz and other social media services constantly growing, entertaining content is being shared now more than ever. We've seen all kinds of amusing content, from ASCII art embedded in a site's source code to funny downtime messages used as a viral marketing technique to increase the visibility of a site. However, we do not recommend counting only on short-lived link-bait tactics. Their appeal wears off quickly and as powerful as marketing stunts can be, you shouldn't rely on them as a long-term strategy or as your only marketing effort.

It's important to clarify that any legitimate link building strategy is a long-term effort. There are those who advocate for short-lived, often spammy methods, but these are not advisable if you care for your site's reputation. Buying PageRank-passing links or randomly exchanging links are the worst ways of attempting to gather links and they're likely to have no positive impact on your site's performance over time. If your site's visibility in the Google index is important to you it's best to avoid them.

Directory entries are often mentioned as another way to promote young sites in the Google index. There are great, topical directories that add value to the Internet. But there are not many of them in proportion to those of lower quality. If you decide to submit your site to a directory, make sure it's on topic, moderated, and well structured. Mass submissions, which are sometimes offered as a quick work-around SEO method, are mostly useless and not likely to serve your purposes.

It can be a good idea to take a look at similar sites in other markets and identify the elements of those sites that might work well for yours, too. However, it's important not to just copy success stories but to adapt them, so that they provide unique value for your visitors.


Social bookmarks on YouTube enable users to share content easily


Finally, consider making linking to your site easier for less tech savvy users. Similar to the way we do it on YouTube, offering bookmarking services for social sites like Twitter or Facebook can help spread the word about the great content on your site and draw users' attention.

As usual, we'd like to hear your opinion. You're welcome to comment here in the blog, or join our Webmaster Help Forum community.


To slash or not to slash

Webmaster Level: Intermediate

That is the question we hear often. Onward to the answers! Historically, it’s common for URLs with a trailing slash to indicate a directory, and those without a trailing slash to denote a file:

http://example.com/foo/ (with trailing slash, conventionally a directory)
http://example.com/foo (without trailing slash, conventionally a file)

But they certainly don’t have to. Google treats each URL above separately (and equally) regardless of whether it’s a file or a directory, or it contains a trailing slash or it doesn’t contain a trailing slash.

Different content on / and no-/ URLs okay for Google, often less ideal for users

From a technical, search engine standpoint, it’s certainly permissible for these two URL versions to contain different content. Your users, however, may find this configuration horribly confusing -- just imagine if www.google.com/webmasters and www.google.com/webmasters/ produced two separate experiences.

For this reason, trailing slash and non-trailing slash URLs often serve the same content. The most common case is when a site is configured with a directory structure:
http://example.com/parent-directory/child-directory/

Your site’s configuration and your options

You can do a quick check on your site to see if the URLs:
  1. http://<your-domain-here>/<some-directory-here>/
    (with trailing slash)
  2. http://<your-domain-here>/<some-directory-here>
    (no trailing slash)
don’t both return a 200 response code, and that instead one version redirects to the other (a command-line way to run this check follows the list below).
  • If only one version can be returned (i.e., the other redirects to it), that’s great! This behavior is beneficial because it reduces duplicate content. In the particular case of redirects to trailing slash URLs, our search results will likely show the version of the URL with the 200 response code (most often the trailing slash URL) -- regardless of whether the redirect was a 301 or 302.

  • If both slash and non-trailing-slash versions contain the same content and each returns 200, you can:
    • Consider changing this behavior (more info below) to reduce duplicate content and improve crawl efficiency.
    • Leave it as-is. Many sites have duplicate content. Our indexing process often handles this case for webmasters and users. While it’s not totally optimal behavior, it’s perfectly legitimate and a-okay. :)
    • Rest assured that for your root URL specifically, http://example.com is equivalent to http://example.com/ and can’t be redirected even if you’re Chuck Norris.
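A quick way to run this check from the command line is with curl, where the -I flag fetches only the response headers (the directory name is a placeholder):

    curl -I http://example.com/foo/
    curl -I http://example.com/foo

Compare the two status lines: in the configuration described as ideal above, one URL answers 200 and the other answers 301 with a Location header pointing at it.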
Steps for serving only one URL version

What if your site serves duplicate content on these two URLs:

http://<your-domain-here>/<some-directory-here>/
http://<your-domain-here>/<some-directory-here>

meaning that both URLs return 200 (neither has a redirect or contains rel="canonical"), and you want to change the situation?
  1. Choose one URL as the preferred version. If your site has a directory structure, it’s more conventional to use a trailing slash with your directory URLs (e.g., example.com/directory/ rather than example.com/directory), but you’re free to choose whichever you like.

  2. Be consistent with the preferred version. Use it in your internal links. If you have a Sitemap, include the preferred version (and don’t include the duplicate URL).

  3. Use a 301 redirect from the duplicate to the preferred version. If that’s not possible, rel="canonical" is a strong option. rel="canonical" works similarly to a 301 for Google’s indexing purposes, and for other major search engines as well (see the sketch after this list).

  4. Test your 301 configuration through Fetch as Googlebot in Webmaster Tools. Make sure your URLs:
    http://example.com/foo/
    http://example.com/foo
    are behaving as expected. The preferred version should return 200. The duplicate URL should 301 to the preferred URL.

  5. Check for Crawl errors in Webmaster Tools, and, if possible, your webserver logs as a sanity check that the 301s are implemented.

  6. Profit! (just kidding) But you can bask in the sunshine of your efficient server configuration, warmed by the knowledge that your site is better optimized.
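To illustrate step 3 on an Apache server, here is a minimal mod_rewrite sketch (assuming the trailing-slash version is the preferred one; the directory name is a placeholder):

    RewriteEngine On
    # 301 the non-slash duplicate to the preferred trailing-slash URL
    RewriteRule ^foo$ /foo/ [R=301,L]

If a redirect isn't possible, the duplicate page can instead declare the preferred version from its <head>:

    <link rel="canonical" href="http://example.com/foo/">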


Optimizing sites for TV

Webmaster Level: All

Just as mobile phones make your site accessible to people on the go, Google TV makes your site easily viewable to people lounging on their couch. Google TV is a platform that combines your current TV programming with the web and, before long, more apps. It’s the web you love, with the TV you love, all available on the sofa made for you. Woohoo!

Because Google TV has a fully functioning web browser built in, users can easily visit your site from their TV. Current sites should already work, but you may want to provide your users with an enhanced TV experience -- what's called the “10-foot UI” (user interface). They'll be several feet away from the screen, not several inches away, and rather than a mouse on their desktop, they'll have a remote with a keyboard and a pointing device.

For example, here’s YouTube for desktop users versus what we’re calling “YouTube Leanback” -- our site optimized for large screens:


YouTube desktop version on the left, YouTube Leanback on the right

See our Spotlight Gallery for more examples of TV-optimized sites.

What does "optimized for TV" mean?

It means that, for the user sitting on their couch, your site on their TV is an even more enjoyable experience:
  • Text is large enough to be viewable from the sofa-to-TV distance.
  • Site navigation can be performed with the arrow buttons on the remote (a D-pad), rather than a mouse or touchpad.
  • Selectable elements provide a visual cue when selected (when you’re 10 feet away, it needs to be really, really obvious which selections are highlighted).
  • and more...
How can webmasters gain a general idea of their site’s appearance on TV?

First, remember that appearance alone doesn't tell you whether your site can be easily navigated by TV users (i.e. users with a remote rather than a mouse). With that said, here’s a quick workaround to give you a ballpark idea of how your site looks on TV. (For more in-depth info, please see the “Design considerations” in our optimization guide.)
  1. On a large monitor, make your window size 1920 x 1080.
  2. In a browser, visit your site at full screen.
  3. Zoom the browser to 1.5x the normal size. This is performed in different ways with different keyboards. For example, in Chrome if you press ctrl+ (press ctrl and + at the same time) twice, that’ll zoom the browser to nearly 1.5x the initial size.
  4. Move back 3 x (the distance between you and the monitor).
  5. Check out your site!
And don’t forget, if you want to see your site with the real thing, Google TV enabled devices are now available in stores.

How can you learn more?

Our team just published a developer site, with TV optimization techniques, at code.google.com/tv/web/.


Website user research and testing on the cheap

Webmaster level: Intermediate

As the team responsible for tens of thousands of Google’s informational web pages, the Webmaster Team is here to offer tips and advice based on their experiences as hands-on webmasters.

If you’ve never tested or analyzed usage of your website, ask yourself if you really know whether your site is useful for your target audience. If you’re unsure, why not find out? For example, did you know that on average users scroll down 5.9 times as often as they scroll up, meaning that often once page content is scrolled past, it is “lost”? (See Jakob Nielsen’s findings on scrolling, where he advises that users don’t mind scrolling, but within limits.)

Also, check your analytics—are you curious about high bounce rates from any of your pages, or very short time-on-page metrics?

First, think about your user


The start of a web project—whether it’s completely new or a revamp of an existing site—is a great time to ask questions like:

  • How might users access your site—home, office, on-the-go?
  • How tech-savvy are your visitors?
  • How familiar are users with the subject matter of your website?

The answers to some of these questions can be valuable when making initial design decisions.

For instance, if the user is likely to be on the road, they might be short on time to find the information they need from your site, or be in a distracting environment and have a slow data connection—so a simple layout with single purpose would work best. Additionally, if you’re providing content for a less technical audience, make sure it’s not too difficult to access content—animation might provide a “wow” factor, but only if your user appreciates it and it’s not too difficult to get to the content.

Even without testing, building a basic user profile (or “persona”) can help shape your designs for the benefit of the user—this doesn’t have to be an exhaustive biography, but just some basic considerations of your user’s behavior patterns.

Simple testing


Testing doesn’t have to be a costly operation – friends and family can be a great resource. Some pointers:

  • Sample size: Just five people can be a large enough number of users to find common problems in your layouts and navigation (see Jakob Nielsen’s article on why using a small sample size is sufficient).
  • Choosing your testers: A range of different technical ability can be useful, but be sure to only focus on trends—for example, if more than 50% of your testers have the same usability issue, it’s likely a real problem—rather than individual issues encountered.
  • Testing location: If possible, visit the user in their home and watch how they use the site—observe how he/she normally navigates the web when relaxed and in their natural environment. Remote testing is also a possibility if you can’t make it in person—we’ve heard that Google+ hangouts can be used effectively for this (find out more about using Google+ hangouts).
  • How to test: Based on your site’s goals, define 4 or 5 simple tasks to do on your website, and let the user try to complete the tasks. Ask your testers to speak aloud so you can better understand their experiences and thought processes.
  • What to test: Basic prototypes in clickable image or document format (for example, PDF) or HTML can be used to test the basic interactions, without having to build out a full site for testing. This way, you can test out different options for navigation and layouts to see how they perform before implementing them.
  • What not to test: Focus on functionality rather than graphic design elements; viewpoints are often subjective. You would only get useful feedback on design from quantitative testing with large (200+) numbers of users (unless, for example, the colors you use on your site make the content unreadable, which would be good feedback!). One format for getting some useful feedback on the design can be to offer 5-6 descriptive keywords and ask your user to choose the most representative ones.
Overall, basic testing is most useful for seeing how your website’s functionality is working—the ease of finding information and common site interactions.

Lessons learned


In case you’re still wondering whether it’s really worth research and testing, here are a few simple things we confirmed from actual users that we wouldn’t have known if we hadn’t sat with actual users and watched them use our pages, or analyzed our web traffic.

  • Take care when using layouts that hide/show content: We found when using scripts to expand and collapse long text passages, the user often didn’t realize the extra content was available—effectively “hiding” the JavaScript-rendered content when the user searches within the page (for example, using Control + F, which we’ve seen often).


    Wireframe of layout tested, showing “zipped”
    content on the bottom left



    Final page design showing anchor links in the top
    and content laid out in the main body of the page


  • Check your language: Headings, link and button text are what catches the user’s eye the most when scanning the page. Avoid using “Learn more…” in link text—users seem averse to clicking on a link which implies they will need to learn something. Instead, just try to use a literal description of what content the user will get behind the link—and make sure link text makes sense and is easy to understand out of context, because that is often how it will be scanned. Be mindful about language and try to make button text descriptive, inviting and interesting.
  • Test pages on a slower connection: Try out your pages using different networks (for example, try browsing your website using the wifi at your local coffee shop or a friend’s house), especially if your target users are likely to be viewing your pages from a home connection that’s not as fast as your office network. We found a considerable improvement in CTR and time-on-site metrics in some cases when we made scripted animations much simpler and faster (hint: use Google’s Page Speed Online to check performance if you don’t have access to a slower Internet connection).
So if you’re caught up in a seemingly never-ending redevelopment cycle, save yourself some time in the future by investing a little up front through user profiling and basic testing, so that you’re more likely to choose the right approach for your site layout and architecture.

We’d love to hear from you in the comments: have you tried out website usability testing? If so, how did you get on, and what are your favorite simple and low-cost tricks to get the most out of it?

State of the Index 2009

Webmaster Level: All

At PubCon in Las Vegas in November 2009, I gave a "State of the Index" talk which covers what Google has done for users, web developers, and webmasters in the last year. I recently recreated it on video for those of you who didn't make it to the conference. You can watch it below:


And here are the slides if you'd like to follow along:



URL removal explained, Part III: Removing content that you don't own

Webmaster Level: All

Welcome to the third episode of our URL removals series! In episodes one and two, we talked about expediting the removal of content that's under your control and requesting expedited cache removals. Today, we're covering how to use Google's public URL removal tool to request removal of content from Google’s search results when the content originates on a website not under your control.

Google offers two tools that provide a way to request expedited removal of content:

1. Verified URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site of which you’re a verified owner in Webmaster Tools (like your blog or your company’s site)

2. Public URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site for which you can’t verify ownership (like your friend’s blog)

Sometimes a situation arises where the information you want to remove originates from a site that you don't own or can't control. Since each individual webmaster controls their site and their site’s content, the best way to update or remove results from Google is for the site owner (where the content is published) to either block crawling of the URL, modify the content source, or remove the page altogether. If the content isn't changed, it would just reappear in our search results the next time we crawled it. So the first step to remove content that's hosted on a site you don't own is to contact the owner of the website and request that they remove or block the content in question.
  • Removed or blocked content

    If the website owner removes a page, requests for the removed page should return a "404 Not Found" response or a "410 Gone" response. If they choose to block the page from search engines, then the page should either be disallowed in the site's robots.txt file or contain a noindex meta tag. Once one of these requirements is met, you can submit a removal request using the "Webmaster has already blocked the page" option.



    Sometimes a website owner will claim that they’ve blocked or removed a page but they haven’t technically done so. If they claim a page has been blocked you can double check by looking at the site’s robots.txt file to see if the page is listed there as disallowed.
    User-agent: *
    Disallow: /blocked-page/
    Another place to check if a page has been blocked is within the page’s HTML source code itself. You can visit the page and choose “View Page Source” from your browser. Is there a meta noindex tag in the HTML “head” section?
    <html>
    <head>
    <title>blocked page</title>
    <meta name="robots" content="noindex">
    </head>
    ...
    If they inform you that the page has been removed, you can confirm this by using an HTTP response testing tool like the Live HTTP Headers add-on for the Firefox browser. With this add-on enabled, you can request any URL in Firefox to test that the HTTP response is actually 404 Not Found or 410 Gone.
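    A command-line HTTP client works for this check as well; for example, with curl (-I requests only the response headers; the URL is a placeholder):

        curl -I http://www.example.com/removed-page.html

    The first line of the response shows the status code, for example "HTTP/1.1 404 Not Found".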

  • Content removed from the page

    Once you've confirmed that the content you're seeking to remove is no longer present on the page, you can request a cache removal using the 'Content has been removed from the page' option. This type of removal--usually called a "cache" removal--ensures that Google's search results will not include the cached copy or version of the old page, or any snippets of text from the old version of the page. Only the current updated page (without the content that's been removed) will be accessible from Google's search results. However, the current updated page can potentially still rank for terms related to the old content as a result of inbound links that still exist from external sites. For cache removal requests you’ll be asked to enter a "term that has been removed from the page." Be sure to enter a word that is not found on the current live page, so that our automated process can confirm the page has changed -- otherwise the request will be denied. Cache removals are covered in more detail in part two of the "URL removal explained" series.


  • Removing inappropriate webpages or images that appear in our SafeSearch filtered results

    Google introduced the SafeSearch filter with the goal of providing search results that exclude potentially offensive content. For situations where you find content that you feel should have been filtered out by SafeSearch, you can request that this content be excluded from SafeSearch filtered results in the future. Submit a removal request using the 'Inappropriate content appears in our SafeSearch filtered results' option.

If you encounter any issues with the public URL removal tool or have questions not addressed here, please post them to the Webmaster Help Forum or consult the more detailed removal instructions in our Help Center. If you do post to the forum, remember to use a URL shortening service to share any links to content you want removed.

Edit: Read the rest of this series:
Part I: Removing URLs & directories
Part II: Removing & updating cached content
Part IV: Tracking requests, what not to remove
Companion post: Managing what information is available about you online


'New software version' notifications for your site

Webmaster level: All

One of the great things about working at Google is that we get to take advantage of an enormous amount of computing power to do some really cool things. One idea we tried out was to let webmasters know about their potentially hackable websites. The initial effort was successful enough that we thought we would take it one step further by expanding our efforts to cover other types of web applications—for example, more content management systems (CMSs), forum/bulletin-board applications, stat-trackers, and so on.

This time, however, our goal is not just to isolate vulnerable or hackable software packages, but to also notify webmasters about newer versions of the software packages or plugins they're running on their website. For example, there might be a Drupal module or Joomla extension update available but some folks might not have upgraded. There are a few reasons a webmaster might not upgrade to the newer version and one of the reasons could be that they just don't know a new version exists. This is where we think we can help. We hope to let webmasters know about new versions of their software by sending them a message via Webmaster Tools. This way they can make an informed decision about whether or not they would like to upgrade.

One of the ways we identify sites to notify is by parsing source code of web pages that we crawl. For example, WordPress and other CMS applications include a generator meta tag that specifies the version number. This has proven to be tremendously helpful in our efforts to notify webmasters. So if you're a software developer, and would like us to help you notify your users about newer versions of your software, a great way to start would be to include a generator meta tag that tells the version number of your software. If you're a plugin or a widget developer, including a version number in the source you provide to your users is a great way to help too.
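For example, a generator meta tag in a page's head section looks like this (the version string is illustrative):

    <meta name="generator" content="WordPress 3.0.1">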

We've seen divided opinions over time about whether it's a good security practice to include a version number in source code, because it lets hackers or worm writers know that the website might be vulnerable to a particular type of exploit. But as Matt Mullenweg pointed out, "Where [a worm writer's] 1.0 might have checked for version numbers, 2.0 just tests [a website's] capabilities...". Meanwhile, the advantage of a version number is that it can help alert site owners when they need to update their site. In the end, we tend to think that including a version number can do more good than harm.

We plan to begin sending out the first of these messages soon and hope that webmasters find them useful! If you have any questions or feedback, feel free to comment here.


Showing more results from a domain

Webmaster Level: All

Today we’ve launched a change to our ranking algorithm that will make it much easier for users to find a large number of results from a single site. For queries that indicate a strong user interest in a particular domain, like [exhibitions at amnh], we’ll now show more results from the relevant site:



Prior to today’s change, only two results from www.amnh.org would have appeared for this query. Now, we determine that the user is likely interested in the Museum of Natural History’s website, so seven results from the amnh.org domain appear. Since the user is looking for exhibitions at the museum, it’s far more likely that they’ll find what they’re looking for, faster. The last few results for this query are from other sites, preserving some diversity in the results.

We’re always reassessing our ranking and user interface, making hundreds of changes each year. We expect today’s improvement will help users find deeper results from a single site, while still providing diversity on the results page.



Structured Data Testing Tool

Webmaster level: All

Today we’re excited to share the launch of a shiny new version of the rich snippet testing tool, now called the structured data testing tool. The major improvements are:
  • We’ve improved how we display rich snippets in the testing tool to better match how they appear in search results.
  • The brand new visual design makes it clearer what structured data we can extract from the page, and how that may be shown in our search results.
  • The tool is now available in languages other than English to help webmasters from around the world build structured-data-enabled websites.
Here’s what it looks like:
The new structured data testing tool works with all supported rich snippets and authorship markup, including applications, products, recipes, reviews, and others.

Try it yourself and, as always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.

Written by Yong Zhu on behalf of the rich snippets testing tool team




Making search-friendly mobile websites — now in 11 more languages

Webmaster level: Intermediate

As more and more users worldwide with mobile devices access the Internet, it’s fantastic to see so many websites making their content accessible and useful for those devices. To help webmasters optimize their sites we launched our recommendations for smartphones, feature-phones, tablets, and Googlebot-friendly sites in June 2012.

We’re happy to announce that those recommendations are now also available in Arabic, Brazilian Portuguese, Dutch, French, German, Italian, Japanese, Polish, Russian, Simplified Chinese, and Spanish. US-based webmasters are welcome to read the UK-English version.

We welcome you to go through our recommendations, pick the configuration that you feel will work best with your website, and get ready to jump on the mobile bandwagon!

Thanks to the fantastic webmaster-outreach team in Dublin, Tokyo and Beijing for making this possible!


Make the most of Search Queries in Webmaster Tools

Level: Beginner to Intermediate

If you’re intrigued by the Search Queries feature in Webmaster Tools but aren’t sure how to make it actionable, we have a video that we hope will help!


Maile shares her approach to Search Queries in Webmaster Tools

This video explains the vocabulary of Search Queries, such as:
  • Impressions
  • Average position (only the top-ranking URL for the user’s query is factored into our calculation; see the example after this list)
  • Click
  • CTR
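As an illustration of average position under that definition (hypothetical numbers, and a simplified reading of the calculation): if your top-ranking URL for a query appeared at position 3 in 100 impressions and at position 7 in the other 50, the reported average position would be (3×100 + 7×50) / 150 ≈ 4.3.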
The video also reviews an approach to investigating Top queries and Top pages:
  1. Prepare by understanding your website’s goals and your target audience (then using Search Queries “filters” to support your knowledge)
  2. Sort by clicks in Top queries to understand the top queries bringing searchers to your site (for the given time period)
  3. Sort by CTR to notice any missed opportunities
  4. Categorize queries into logical buckets that simplify tracking your progress and staying in touch with users’ needs
  5. Sort Top pages by clicks to find the URLs on your site most visited by searchers (for the given time period)
  6. Sort Top pages by impressions to find valuable pages that can be used to help feature your related, high-quality, but lower-ranking pages
After you’ve watched the video and applied the knowledge of your site with the findings from Search Queries, you’ll likely have several improvement ideas to help searchers find your site. If you’re up for it, let us know in the comments what Search Queries information you find useful (and why!), and of course, as always, feel free to share any tips or feedback.


Five common SEO mistakes (and six good ideas!)

Webmaster Level: Beginner to Intermediate

To help you avoid common mistakes webmasters face with regard to search engine optimization (SEO), I filmed a video outlining five common mistakes I’ve noticed in the SEO industry. Almost four years ago, we also gathered information from all of you (our readers) about your SEO recommendations and updated our related Help Center article given your feedback. Much of the same advice from 2008 still holds true today -- here’s to more years ahead building a great site!




If you’re short on time, here’s the gist:

Avoid these common mistakes
1. Having no value proposition: Try not to assume that a site should rank #1 without knowing why it’s helpful to searchers (and better than the competition :)

2. Segmented approach: Be wary of setting SEO-related goals without making sure they’re aligned with your company’s overall objectives and the goals of other departments. For example, in tandem with your work optimizing product pages (and the full user experience once they come to your site), also contribute your expertise to your Marketing team’s upcoming campaign. So if Marketing is launching new videos or a more interactive site, be sure that searchers can find their content, too.

3. Time-consuming workarounds: Avoid implementing a hack rather than researching new features or best practices that could simplify development (e.g., changing the timestamp on an updated URL so it’s crawled more quickly instead of easily submitting the URL through Fetch as Googlebot).

4. Caught in SEO trends: Consider spending less time obsessing about the latest “trick” to boost your rankings and instead focus on the fundamental tasks/efforts that will bring lasting visitors.

5. Slow iteration: Aim to be agile rather than promote an environment where the infrastructure and/or processes make improving your site, or even testing possible improvements, difficult.
Six fundamental SEO tips
1. Do something cool: Make sure your site stands out from the competition -- in a good way!

2. Include relevant words in your copy: Try to put yourself in the shoes of searchers. What would they query to find you? Your name/business name, location, products, etc., are important. It's also helpful to use the same terms in your site that your users might type (e.g., you might be a trained “flower designer” but most searchers might type [florist]), and to answer the questions they might have (e.g., store hours, product specs, reviews). It helps to know your customers.

3. Be smart about your tags and site architecture: Create unique title tags and meta descriptions; include Rich Snippets markup from schema.org where appropriate (see the sketch after this list). Have intuitive navigation and good internal links.

4. Sign up for email forwarding in Webmaster Tools: Help us communicate with you, especially when we notice something awry with your site.

5. Attract buzz: Natural links, +1s, likes, follows... In every business there's something compelling, interesting, entertaining, or surprising that you can offer or share with your users. Provide a helpful service, tell fun stories, paint a vivid picture and users will share and reshare your content.

6. Stay fresh and relevant: Keep content up-to-date and consider options such as building a social media presence (if that’s where a potential audience exists) or creating an ideal mobile experience if your users are often on-the-go.
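To illustrate tip 3, here's a sketch of a unique title tag, meta description, and a snippet of schema.org markup (the business and all of its details are invented; CafeOrCoffeeShop is one of the schema.org types):

    <head>
      <title>Joe's Coffee | Espresso bar in Springfield</title>
      <meta name="description" content="Family-run espresso bar in downtown Springfield. Open 7am to 6pm, Monday to Saturday.">
    </head>
    ...
    <div itemscope itemtype="http://schema.org/CafeOrCoffeeShop">
      <span itemprop="name">Joe's Coffee</span>
      Call us: <span itemprop="telephone">+1-555-0100</span>
    </div>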
Good luck to everyone!


Help Google index your videos

Webmaster Level: All

The single best way to make Google aware of all your videos on your website is to create and maintain a Video Sitemap. Video Sitemaps provide Google with essential information about your videos, including the URLs for the pages where the videos can be found, the titles of the videos, keywords, thumbnail images, durations, and other information. The Sitemap also allows you to define the period of time for which each video will be available. This is particularly useful for content that has explicit viewing windows, so that we can remove the content from our index when it expires.
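As a sketch, a single Video Sitemap entry carrying that information might look like the following (tag names per the Video Sitemaps schema; all URLs and values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>http://www.example.com/videos/grilling_steaks.html</loc>
        <video:video>
          <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
          <video:title>Grilling steaks for summer</video:title>
          <video:description>How to grill steaks perfectly, every time</video:description>
          <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
          <video:duration>600</video:duration>
          <video:expiration_date>2013-11-05T19:20:30+08:00</video:expiration_date>
        </video:video>
      </url>
    </urlset>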

Once your Sitemap is created, you can submit the URL of the Sitemap file in Google Webmaster Tools or through your robots.txt file.

Once we have indexed a video, it may appear in our web search results in what we call a Video Onebox (a cluster of videos related to the queried topic) and in our video search property, Google Videos. A video result is immediately recognizable by its thumbnail, duration, and description.

As an example, this is what a video result from CNN.com looks like on Google:


We encourage those of you with videos to submit Video Sitemaps and to keep them updated with your new content. Please also visit our recently updated Video Sitemap Help Center, and utilize our Sitemap Help Forum. If you've submitted a Video Sitemap file via Webmaster Tools and want to share your experiences or problems, you can do so here.


Upcoming changes in Google’s HTTP Referrer

Webmaster level: all

Protecting users’ privacy is a priority for us and it’s helped drive recent changes. Helping users save time is also very important; it’s explicitly mentioned as a part of our philosophy. Today, we’re happy to announce that Google Web Search will soon be using a new proposal to reduce latency when a user of Google’s SSL-search clicks on a search result with a modern browser such as Chrome.

Starting in April, for browsers with the appropriate support, we will be using the "referrer" meta tag to automatically simplify the referring URL that is sent by the browser when visiting a page linked from an organic search result. This results in a faster time to result and more streamlined experience for the user.
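Concretely, the mechanism is a meta tag placed on the page carrying the outbound links (here, Google's search results page); per the meta referrer specification, supporting browsers then send only the linking page's origin as the referrer instead of routing the click through an intermediate redirect:

    <meta name="referrer" content="origin">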

What does this mean for sites that receive clicks from Google search results? You may start to see "origin" referrers—Google’s homepages (see the meta referrer specification for further detail)—as a source of organic SSL search traffic. This change will only affect the subset of SSL search referrers which already didn’t include the query terms. Non-HTTPS referrals will continue to behave as they do today. Again, the primary motivation for this change is to remove an unneeded redirect so that signed-in users reach their destination faster.

Website analytics programs can detect these organic search requests by detecting bare Google host names using SSL (like "https://www.google.co.uk/"). Webmasters will continue to see the same data in Webmaster Tools—just as before, you’ll receive an aggregated list of the top search queries that drove traffic to your site.

We will continue to look into further improvements to how search query data is surfaced through Webmaster Tools. If you have questions, feedback or suggestions, please let us know through the Webmaster Tools Help Forum.
