News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development

Create a Google Sitemap.xml File (2013)

Seo Master presents to you:

How to create a Google Sitemap.xml
  1. Sign in to Google webmaster tools with your Google account at www.google.com/webmasters/sitemaps. If you don't have a Google account, you will need to create one.
  2. Add your site to your webmaster tools account.
  3. Verify your site, either by creating a blank HTML file with a specific name (generated by Google) or by inserting a specific meta tag into your site's index page.
  4. Go to the sitemap generator at http://www.xml-sitemaps.com/ and generate a sitemap automatically.
  5. Save the sitemap file to your desktop, then upload it to the root level of your website.
  6. Go back to Google's webmaster tools, and click on the Sitemaps tab.
  7. Click "Add a Sitemap".
  8. Select General Web Sitemap on the drop down list.
  9. Enter the URL of your sitemap (e.g. sitemap.xml) and click "Add Web Sitemap".
  10. That's it! It can take up to a couple of days for Google to download your new sitemap.
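The steps above use an online generator, but a minimal sitemap can also be built by hand. Below is a sketch in Python using only the standard library; the page URLs are hypothetical placeholders for your own pages.

```python
# Minimal sketch of sitemap generation with the standard library.
# The URLs listed are hypothetical placeholders, not real pages.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a minimal sitemap.xml document as a string."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a prefix
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in urls:
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS).text = page
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
# Write the result to the root level of the site (step 5 above):
# open("sitemap.xml", "w").write(sitemap)
```

The `<lastmod>`, `<changefreq>` and `<priority>` tags are optional and could be added the same way with `ET.SubElement`.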
2013, By: Seo Master

Search Engine Optimization (SEO) Checklist (2013)

Search Engine Optimization (SEO) Checklist

SEO Factor / Brief SEO Checklist Information
Keyword in Domain Name/URL
Main keyword(s) should always appear in the URL and, if possible, in the domain name.
Keyword in Title
Main keyword should be as close to the beginning as possible. Don't stuff your title, though, and don't use special characters.
Keyword in Meta Description Tag
Use one or two reasonable sentences in your meta description and use your keywords at least once.
Keywords Meta Tag
Main keyword should be as close to the beginning as possible. Don't use more than about 10 keywords, and don't repeat any single word more than twice.
Keyword Density in the Body
There is no recommended or perfect keyword density, but don't use more than 15% to 20% per main keyword.
Keywords in Header Tags
Most important is H1, followed by H2 and H3. The others are probably not that important.
Keyword Proximity
The closer together the keywords appear, the better they might rank.
Order of Key Phrases
Try to order your keywords so that they form key phrases that are queried more often in search engines.
Keyword Frequency
The most important main keyword should be repeated more than the others. However, don't spam the keyword where it makes no sense.
Keyword Prominence
The earlier the main keyword appears on the page, the higher its relevance.
Keyword in ALT and TITLE Tags
Describe the image, if possible with one of the main keywords. However, never spam the ALT tag; this is a common SEO mistake.
Keyword Anchor Text
Try to have keywords in your inbound anchor texts, that is, the links that point to subpages of your site.
Keyword Stemming
Stem your keywords: use singular, plural, past forms, etc.
Keyword Semantics
Make use of synonyms, and don't spam your site with one and the same keyword.
No Excessive Deep Linking
Check that all your pages can be reached in no more than 3 or 4 clicks.
Page-to-Page Linking
Try to link to appropriate subpages from a related subpage.
Domain Name Extension
.EDU and .ORG seem to be very important. For any .COM domain it is a little harder to prove trustworthiness and relevance due to the huge number of spam sites.
File Sizes
Never exceed 100 KB per page; try to stay below 40 KB per page.
Hyphenated File Names
Never use more than 4 words in a file name, as more can look like spam.
Fresh and New Content
Google loves new and regularly updated content, and so do your visitors!
Total Length of URL
Keep it to a minimum; as a rule of thumb, don't use more than about 60 characters.
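Keyword density, mentioned in the checklist above, is usually computed as occurrences of the keyword divided by the total word count. A minimal sketch in Python (the sample sentence is made up for illustration):

```python
# Rough sketch: percentage of words on a page that are a given keyword.
import re

def keyword_density(text, keyword):
    """Return keyword occurrences as a percentage of total words."""
    words = re.findall(r"[\w']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

print(keyword_density("seo tips for seo beginners", "seo"))  # 40.0
```

Real tools differ in how they tokenize text and whether they count multi-word phrases, so treat this as an approximation.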

History of Sitemap, Sitemap Format (2013)

The Sitemaps protocol gives search engines information about the URLs on a site that are available for crawling. A Sitemap is an XML file, saved with the .xml extension, that contains the list of URLs for a website. It also allows webmasters to include information about each URL: when it was last updated, how often it changes, and how important it is relative to the other URLs on the site.
History of Sitemap:
  • Google first introduced Sitemaps 0.84 in June 2005 so that anyone could publish lists of URLs.
  • Google, MSN and Yahoo announced joint support for the Sitemaps protocol in November 2006. The schema version was changed to "Sitemap 0.90", but no other changes were made.
  • In April 2007, Ask.com and IBM announced support for Sitemaps. Also, Google, Yahoo! and Microsoft announced auto-discovery of sitemaps through robots.txt.
  • In May 2007, the state governments of Arizona, California, Utah and Virginia announced they would use Sitemaps on their web sites.
Sitemap Format:
The Sitemap Protocol format consists of XML tags. The file itself must be UTF-8 encoded.
Sample
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
                            http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
  <url>
    <loc>http://online-seo-information.blogspot.com/</loc>
    <lastmod>2006-11-18</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
The following list gives the Sitemap submission URL and help page for several major search engines:
  • Google: http://www.google.com/webmasters/tools/ping?sitemap= (help: "How do I resubmit my Sitemap once it has changed?")
  • Yahoo!: http://search.yahooapis.com/SiteExplorerService/V1/updateNotification?appid=SitemapWriter&url= or http://search.yahooapis.com/SiteExplorerService/V1/ping?sitemap= (help: "Does Yahoo! support Sitemaps?")
  • Ask.com: http://submissions.ask.com/ping?sitemap= (help: "Q: Does Ask.com support sitemaps?")
  • Live Search: http://webmaster.live.com/ping.aspx?siteMap= (help: Webmaster Tools (beta))
  • Yandex: Sitemaps files
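Submitting to one of these ping endpoints is just an HTTP GET request with the sitemap's address appended, percent-encoded. A sketch using the Google endpoint from the table above; the sitemap address is a hypothetical example.

```python
# Build a ping URL for a sitemap submission endpoint.
# The sitemap address below is a hypothetical example.
from urllib.parse import quote

GOOGLE_PING = "http://www.google.com/webmasters/tools/ping?sitemap="

def google_ping_url(sitemap_url):
    """Return the full ping URL, with the sitemap address percent-encoded."""
    return GOOGLE_PING + quote(sitemap_url, safe="")

url = google_ping_url("http://www.example.com/sitemap.xml")
# Fetching `url` (e.g. with urllib.request.urlopen) would notify Google.
```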
Sitemap limits:
Sitemap files are limited to 50,000 URLs and 10 megabytes per sitemap. Sitemaps can be compressed using the gzip format. Multiple sitemap files are supported, with a Sitemap index file serving as an entry point for up to 1,000 sitemaps.
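Staying under the 50,000-URL limit just means chunking the URL list before writing each sitemap file. A minimal sketch; the limit parameter mirrors the figure above.

```python
def split_for_sitemaps(urls, limit=50_000):
    """Split a URL list into chunks that each fit in one sitemap file."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

# e.g. 120,000 URLs would yield three sitemap files,
# each then referenced from a single Sitemap index file.
```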

Robots.txt (2013)

Entry / Meaning

User-agent: *
Disallow:
Because nothing is disallowed, everything is allowed for every robot.

User-agent: mybot
Disallow: /
The mybot robot may not index anything, because the root path (/) is disallowed.

User-agent: *
Allow: /
For all user agents, everything is allowed.

User-agent: BadBot
Allow: /About/robot-policy.html
Disallow: /
The BadBot robot can see the robot policy document, but nothing else. All other user agents are allowed to see everything by default. This only protects a site if "BadBot" actually follows the directives in robots.txt.

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private
In this example, all robots can visit the whole site, with the exception of the two directories mentioned and any path that starts with "private" at the host root, including items in privatedir/mystuff and the file privateer.html.

User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /*/private/*
The blank line indicates a new "record", that is, a new user-agent block. BadBot may see nothing; all other robots can see everything except any subdirectory named "private" (using the wildcard character).

User-agent: WeirdBot
Disallow: /links/listing.html
Disallow: /tmp/
Disallow: /private/

User-agent: *
Allow: /
Disallow: /temp*
Allow: *temperature*
Disallow: /private/
This keeps WeirdBot from visiting the listing page in the links directory, the tmp directory and the private directory. All other robots can see everything except the temp directories or files, but should crawl files and directories named "temperature", and should not crawl private directories. Note that robots use the longest matching string, so temps and temporary will match the Disallow, while temperatures will match the Allow.
Bad Examples: Common Wrong Entries
Use one of the available robots.txt checkers to see whether your file is malformed.

User-agent: googlebot
Disallow /
NO! This entry is missing the colon after Disallow.

User-agent: sidewiner
Disallow: /tmp/
NO! Robots ignore misspelled user-agent names (it should be "sidewinder"). Check your server logs for the exact user-agent name, and consult published lists of user-agent names.

User-agent: MSNbot
Disallow: /PRIVATE
WARNING! Many robots and web servers are case-sensitive, so this path will not match any root-level folders named private or Private.

User-agent: *
Disallow: /tmp/

User-agent: Weirdbot
Disallow: /links/listing.html
Disallow: /tmp/

Robots generally read from top to bottom and stop when they reach something that applies to them, so Weirdbot would probably stop at the first record, *. If there's a record for a specific user agent, robots don't check the * (all user agents) block, so any general directives should be repeated in the specific blocks.
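Rules like those above can be checked programmatically: Python's standard library ships a robots.txt parser. A sketch using the BadBot example from the table; the robot names and paths are taken from that example.

```python
# Check robots.txt rules with the standard-library parser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: BadBot",
    "Disallow: /",
    "",
    "User-agent: *",
    "Disallow: /private/",
])

# BadBot is shut out entirely; everyone else only loses /private/.
print(rp.can_fetch("BadBot", "http://example.com/page.html"))   # False
print(rp.can_fetch("GoodBot", "http://example.com/page.html"))  # True
print(rp.can_fetch("GoodBot", "http://example.com/private/x"))  # False
```

Note that, like robots.txt itself, this is purely advisory: it tells you what a well-behaved crawler would do, not what any crawler actually does.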