News and Tutorials from Votre Codeur | SEO | Website Creation | Software Creation


Turn on 2-step verification

Two-step verification makes your Gmail account more secure. Once you apply this method, you can reach your Gmail account only after a second sign-in step. The first step is the usual one: entering your username and password. Then comes the second step: Google sends a verification code to your mobile phone number, and only after you enter that code can you browse your email. A voice-call option is also available.





To set up 2-step verification:



  • From the drop-down menu, select the country where your phone is registered, and enter your phone number in the box.



  • Choose whether you’d like to receive your codes by text or by voice call. You can always change this later.
  • Click Send verification code to receive a code on your phone. We recommend you use a mobile phone number rather than a landline or Google Voice number.
  • Enter the code from the text or voice message into the box, then click Verify.



  • Next you’ll be asked whether you want to remember the computer you are using. If you check the box, you won’t need to enter a code to sign in on this computer for the next 30 days. Don’t check this box if you are using a public computer or a device that you don’t regularly use to sign in.

  • Click Turn on 2-step verification to finish the process! You’ll be automatically taken to your account settings page.

After you turn on 2-step verification, you cannot open your Gmail account without the 6-digit code sent to your mobile phone.
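
To make the mechanics of that second step concrete, here is a minimal sketch in Python of how a service might issue and check a one-time 6-digit code. This is an illustration under assumed names and expiry rules, not Google's actual implementation:

import secrets
import time

CODE_TTL_SECONDS = 5 * 60   # assume codes expire after five minutes
pending = {}                # phone number -> (code, time the code was issued)

def issue_code(phone_number):
    # Generate an unpredictable 6-digit code and record when it was issued.
    code = f"{secrets.randbelow(1_000_000):06d}"
    pending[phone_number] = (code, time.time())
    # A real service would now deliver the code by SMS or voice call.
    return code

def verify_code(phone_number, submitted_code):
    # Second sign-in step: accept only a fresh, matching, single-use code.
    entry = pending.pop(phone_number, None)
    if entry is None:
        return False
    code, issued_at = entry
    if time.time() - issued_at > CODE_TTL_SECONDS:
        return False
    return secrets.compare_digest(code, submitted_code)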

Turn off 2-step verification
  • Visit the Using 2-step verification page under your Google Account settings. Sign in with your username, password, and verification code if prompted.
  • Click Turn off 2-step verification.
  • A pop-up window will appear to confirm that you want to turn off 2-step verification. Click OK.



From the Google Webmaster Central blog:
Last spring, the Sitemaps protocol was expanded to include the autodiscovery of Sitemaps using robots.txt to let us and other search engines supporting the protocol know about your Sitemaps. We subsequently also announced support for Sitemap cross-submissions using Google Webmaster Tools, making it possible to submit Sitemaps for multiple hosts on a single dedicated host. So it was only a matter of time before we took the next logical step of marrying the two and allowing Sitemap cross-submissions using robots.txt. And today we're doing just that.

We're making it easier for webmasters to place Sitemaps for multiple hosts on a single host and then letting us know by including the location of these Sitemaps in the appropriate robots.txt.

How would this work? Say for example you want to submit a Sitemap for each of the two hosts you own, www.example.com and host2.google.com. For simplicity's sake, you may want to host the Sitemaps on one of the hosts, www.example.com. For example, if you have a Content Management System (CMS), it might be easier for you to change your robots.txt files than to change content in a directory.

You can now exercise the cross-submission support via robots.txt (by letting us know the location of the Sitemaps):

a) The robots.txt for www.example.com would include:
Sitemap: http://www.example.com/sitemap-www-example.xml

b) And similarly, the robots.txt for host2.google.com would include:
Sitemap: http://www.example.com/sitemap-host2-google.xml

By indicating in each individual host's robots.txt file where that host's Sitemap lives you are in essence proving that you own the host for which you are specifying the Sitemap. And by choosing to host all of the Sitemaps on a single host, it becomes simpler to manage your Sitemaps.
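
As an illustrative sketch of managing this from one place, the short Python script below writes a robots.txt file for each host, pointing at the Sitemaps hosted on www.example.com as in the examples above (the output file names and the permissive User-agent rule are assumptions for the example):

# Hypothetical mapping of each host to the Sitemap it should declare.
sitemaps = {
    "www.example.com": "http://www.example.com/sitemap-www-example.xml",
    "host2.google.com": "http://www.example.com/sitemap-host2-google.xml",
}

for host, sitemap_url in sitemaps.items():
    # Each generated file must be uploaded to the root of its own host,
    # e.g. http://host2.google.com/robots.txt.
    with open(f"robots-{host}.txt", "w") as f:
        f.write("User-agent: *\n")
        f.write("Disallow:\n\n")   # allow everything; adjust to your needs
        f.write(f"Sitemap: {sitemap_url}\n")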

We are making this announcement today on Sitemaps.org as a joint effort. To see what our colleagues have to say, you can also check out the blog posts published by Yahoo! and Microsoft.

From the Google Webmaster Central blog:

The quality of your snippet — the short text preview we display for each web result — can have a direct impact on the chances of your site being clicked (i.e. the amount of traffic Google sends your way). We use a number of strategies for selecting snippets, and you can control one of them by writing an informative meta description for each URL.

<META NAME="Description" CONTENT="informative description here">

Why does Google care about meta descriptions?
We want snippets to accurately represent the web result. We frequently prefer to display a page's meta description (when available) because it gives users a clear idea of the URL's content. This directs them to good results faster and reduces the click-and-backtrack behavior that frustrates visitors and inflates web traffic metrics. Keep in mind that meta descriptions made up of long strings of keywords don't achieve this goal and are less likely to be displayed in place of a regular snippet drawn from the page content. And it's worth noting that while accurate meta descriptions can improve clickthrough, they won't affect your ranking within search results.

[Screenshot: snippet showing a quality meta description]

[Screenshot: snippet showing a lower-quality meta description]

What are some good meta description strategies?
Differentiate the descriptions for different pages
Using identical or similar descriptions on every page of a site isn't very helpful when individual pages appear in the web results. In these cases we're less likely to display the boilerplate text. Create descriptions that accurately describe each specific page. Use site-level descriptions on the main home page or other aggregation pages, and consider using page-level descriptions everywhere else. You should obviously prioritize parts of your site if you don't have time to create a description for every single page; at the very least, create a description for the critical URLs like your homepage and popular pages.

Include clearly tagged facts in the description
The meta description doesn't just have to be in sentence format; it's also a great place to include structured data about the page. For example, news or blog postings can list the author, date of publication, or byline information. This can give potential visitors very relevant information that might not be displayed in the snippet otherwise. Similarly, product pages might have the key bits of information -- price, age, manufacturer -- scattered throughout a page, making it unlikely that a snippet will capture all of this information. Meta descriptions can bring all this data together. For example, consider the following meta description for the 7th Harry Potter Book, taken from a major product aggregator.

Not as desirable:
<META NAME="Description" CONTENT="[domain name redacted]: Harry Potter and the Deathly Hallows (Book 7): Books: J. K. Rowling,Mary GrandPré by J. K. Rowling,Mary GrandPré">

There are a number of reasons this meta description wouldn't work well as a snippet on our search results page:
  • The title of the book is a complete duplication of information already in the page title.
  • Information within the description itself is duplicated (J. K. Rowling, Mary GrandPré are each listed twice).
  • None of the information in the description is clearly identified; who is Mary GrandPré?
  • The missing spacing and overuse of colons make the description hard to read.

All of this means that the average person viewing a Google results page -- who might spend under a second scanning any given snippet -- is likely to skip this result. As an alternative, consider the meta description below.

Much nicer:
<META NAME="Description" CONTENT="Author: J. K. Rowling, Illustrator: Mary GrandPré, Category: Books, Price: $17.99, Length: 784 pages">

What's changed? No duplication, more information, and everything is clearly tagged and separated. No real additional work is required to generate something of this quality: the price and length are the only new data, and they are already displayed on the site.

Programmatically generate descriptions
For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions are more difficult. In the latter case, though, programmatic generation of the descriptions can be appropriate and is encouraged -- just make sure that your descriptions are not "spammy." Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation.
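
As a minimal sketch of such a generator (the record fields are hypothetical, borrowed from the book example above), consider the following Python function:

import html

def build_meta_description(record):
    # Assemble a clearly tagged description from structured fields,
    # skipping any empty field so each page's description stays distinct.
    parts = [f"{label}: {value}" for label, value in record.items() if value]
    content = html.escape(", ".join(parts), quote=True)
    return f'<META NAME="Description" CONTENT="{content}">'

book = {
    "Author": "J. K. Rowling",
    "Illustrator": "Mary GrandPré",
    "Category": "Books",
    "Price": "$17.99",
    "Length": "784 pages",
}
print(build_meta_description(book))
# -> <META NAME="Description" CONTENT="Author: J. K. Rowling, Illustrator: Mary GrandPré, ...">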

Use quality descriptions
Finally, make sure your descriptions are... descriptive. It's easy to become lax on the quality of the meta descriptions, since they're not directly visible in the UI for your site's visitors. But meta descriptions might be displayed in Google search results -- if the description is high enough quality. A little extra work on your meta descriptions can go a long way towards showing a relevant snippet in search results. That's likely to improve the quality and quantity of your user traffic.