News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development



Choosing the best blog theme is actually deceptively tricky. You have to decide what your blog will be about and what material you want to put into it.

Do you really need to choose the best blog theme?

Of course you do, because the blog's theme largely determines the SEO strategy you will apply to it later. For example, SEO optimization for a blog built around mixed themes will
Seo Master presents to you:
Protect your images: a Blogger trick
This is an awesome trick to protect your images. You can keep your images from being copied with this simple CSS Blogger trick. After applying it, anyone who saves the image will get a transparent image instead of the original.








Protect your image with CSS overlapping

  • Go to your Blogger account.
  • Add the following code where your image should appear (a complete sketch follows this list):
<!-- The image you want to protect -->
<img src="Place image URL Here" />
<!-- Transparent decoy layered over it; "Save image as" grabs this file instead of the one above -->
<img border="0" src="http://i.imgur.com/eYKPf7b.png" alt="NetOops protected image" width="200" height="200" style="left: 0px; opacity: 0; position: relative; top: -216px;" />
  • That's it.
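For reference, here is a minimal self-contained sketch of the same overlay idea, using an absolutely positioned decoy inside a relatively positioned wrapper so you do not have to guess the negative top offset. The photo URL and the 200×200 size are placeholders, not values from the original trick.

<!-- Sketch: transparent decoy stacked over the real image (placeholder URL and size) -->
<div style="position: relative; width: 200px; height: 200px;">
  <!-- The real image you want to protect -->
  <img src="YOUR-IMAGE-URL.jpg" width="200" height="200" alt="Protected photo" />
  <!-- Fully transparent decoy covering it; "Save image as" downloads this file -->
  <img src="http://i.imgur.com/eYKPf7b.png" width="200" height="200" alt=""
       style="position: absolute; top: 0; left: 0; opacity: 0;" />
</div>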

Protect your image with SPAN background

  • Use the following code (a full sketch follows this list):
<!-- The real image sits in the span's background; the visible <img> is a transparent decoy -->
<span style="background-image: url(Place image URL here)"><img src="http://i.imgur.com/eYKPf7b.png" width="200" height="200" border="0" alt="NetOops protected Image."></span>

  • You are done.
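As a self-contained sketch of the span method (again, the photo URL and the 200×200 size are placeholders), a post could use:

<!-- Sketch: real image as a background, transparent decoy in front (placeholder URL and size) -->
<span style="display: inline-block; background: url('YOUR-IMAGE-URL.jpg') no-repeat; background-size: 200px 200px;">
  <!-- Transparent decoy; this is the file a visitor actually downloads -->
  <img src="http://i.imgur.com/eYKPf7b.png" width="200" height="200" border="0" alt="Protected image" />
</span>

Note that both variants are cosmetic protection only: the real image URL remains visible in the page source and in the browser's developer tools.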
If you liked this article, please share and like.
2013, By: Seo Master
Salam everyone, this is a topic from the Google Webmaster Central blog:
We've upgraded the crawl rate setting in Webmaster Tools so that webmasters experiencing problems with Googlebot can now provide us more specific information. Crawl rate for your site determines the time used by Googlebot to crawl your site on each visit. Our goal is to thoroughly crawl your site (so your pages can be indexed and returned in search results!) without creating a noticeable impact on your server's bandwidth. While most webmasters are fine using the default crawl setting (i.e. no changes needed, more on that below), some webmasters may have more specific needs.

Googlebot employs sophisticated algorithms that determine how much to crawl each site it visits. For a vast majority of sites, it's probably best to choose the "Let Google determine my crawl rate" option, which is the default. However, if you're an advanced user or if you're facing bandwidth issues with your server, you can customize your crawl rate to the speed most optimal for your web server(s). The custom crawl rate option allows you to give Googlebot insight into the maximum number of requests per second and the number of seconds between requests that you feel are best for your environment.

Googlebot determines the range of crawl rate values you'll have available in Webmaster Tools. This is based on our understanding of your server's capabilities. This range may vary from one site to another and across time based on several factors. Setting the crawl rate to a lower-than-default value may affect the coverage and freshness of your site in Google's search results. However, setting it to a higher value than the default won't improve your coverage or ranking. If you do set a custom crawl rate, the new rate will be in effect for 90 days, after which it resets to Google's recommended value.

You may use this setting only for root-level sites and sites not hosted on a large domain like blogspot.com (we have special settings assigned for them). To check the crawl rate setting, sign in to Webmaster Tools and visit the Settings tab. If you have additional questions, visit the Webmaster Help Center to learn more about how Google crawls your site, or post your questions in the Webmaster Help Forum.

