News and Tutorials from Votre Codeur | SEO | Website Creation | Software Creation

Making Your Page Title More SEO Friendly

If you use Blogger, you can make your page title more SEO friendly. You can do this by changing the position of the page title, so that the post title appears before the blog name.

1. Go to Layout --> Edit HTML --> check "Expand Widget Templates".

Tip: Before editing your template, save a copy of your existing template first by clicking Download Full Template. Store it on your hard disk or other storage media, so that if anything goes wrong you have a backup to restore things to the way they were.

2. Near the top of the template you will see
<title><data:blog.title/></title>
3. Then replace that code with:

<b:if cond='data:blog.pageType == "item"'>
<b:section id='swaptitle'>
<b:widget id='Blog2' locked='false' title='Blog Posts' type='Blog'>
<b:includable id='nextprev'/>
<b:includable id='backlinks' var='post'/>
<b:includable id='post' var='post'>
<title>
<data:post.title/> | <data:blog.title/>
</title>
</b:includable>
<b:includable id='commentDeleteIcon' var='comment'/>
<b:includable id='status-message'/>
<b:includable id='feedLinks'/>
<b:includable id='backlinkDeleteIcon' var='backlink'/>
<b:includable id='feedLinksBody' var='links'/>
<b:includable id='postQuickEdit' var='post'/>
<b:includable id='comments' var='post'/>
<b:includable id='main' var='top'>
<b:loop values='data:posts' var='post'>
<b:include data='post' name='post'/>
</b:loop>
</b:includable>
</b:widget>
</b:section>
<b:else/>
<title><data:blog.pageTitle/></title>
</b:if>

4. Save the template.

Before changing your template, remember to save a copy of the old one first.

Give it a try.
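If you would rather not wrap an entire Blog widget around the title, a simpler variant of the same swap is commonly used. This is only a minimal sketch, and it assumes that data:blog.pageName holds the post title on item pages:

<b:if cond='data:blog.pageType == "item"'>
<title><data:blog.pageName/> | <data:blog.title/></title>
<b:else/>
<title><data:blog.pageTitle/></title>
</b:if>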

Related posts:
SEO guide: becoming number one in the search engines
Writing meta descriptions

How to Submit Your Blog to Technorati

Technorati is a search engine that specializes in blogs; millions of blogs are already registered at http://technorati.com/. As a large community, Technorati can be a good source of visitors.

That makes it well worth registering your blog with Technorati. Besides attracting visitors, Technorati can also be used to find links from other blogs on the same topic.
How to Claim a Blog on Technorati

Given how important Technorati is to a blog's growth, it's worth registering your blog there. Here's how to do it.

1) Go to http://technorati.com/signup/ and fill in the requested details: Real name – your real name; Member name – your alias or username; Email – the email address you use; New password – the password you will use to log in; Verify new password – repeat the password.



2) Next you'll be taken into the newly created account. On the account screen you can change the details you provided if needed. The Settings tab lets you edit the data you entered when signing up.



Bio is for editing your biography, Photo for displaying a picture of yourself, Blogs for registering your blog addresses, and Extras for subscribing to the newsletter by email.

Now let's register your blog with Technorati: click the Blogs tab, enter your blog's URL in the box provided, then click the Begin Claim button. This claims that the address in question is a blog you own.

Depending on the platform of the blog being claimed, Technorati will offer several ways to claim it. If the platform is Blogspot, Technorati offers three methods: Quick Claim, Post Claim and Embedded Claim.



For WordPress users, Technorati also offers three methods: OpenID, Quick Claim and Post Claim. We'll go through them one by one.


OpenID
Remember, once again, that this one is for WordPress users. To claim with OpenID, open a new browser window or tab, log in to your WordPress blog, then return to the Technorati screen and click Use OpenID Claim.


WordPress will then ask you to confirm that you agree to trust Technorati. Choose "Yes; just this time" or "Yes; always" to continue, and that's it: you have successfully claimed the blog address.
Quick Claim

This method works for both Blogspot and WordPress, and it's also the easiest: just enter the username and password you use to log in to your blog.


Post Claim
This method can also be used on either Blogspot or WordPress. Here's how to do it.


1) Click the Use Post Claim link on the Technorati screen. Technorati will display a page explaining how to claim via Post Claim. Copy the code Technorati gives you.

2) Log in to your blog if you haven't already, create a new post, open the code view of your blog editor, and paste the code you copied from Technorati into the Blogspot code editor. Give the post any title you like, then click PUBLISH to save it and display it on your blog (a sketch of such a post appears after step 3 below).

3) Click the Release the Spider button on the Technorati screen. Technorati will release a spider (figuratively speaking) that looks for the post you just created; once it finds it, the claiming process is complete.
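For illustration only, the claim post can contain nothing but the code Technorati gives you. The snippet below is a hypothetical placeholder showing the general shape of such a post; the link and token are made up, so use the exact code from your own Technorati account:

<!-- hypothetical example of a Technorati claim post; the link and token are invented -->
<p><a href="http://technorati.com/claim/abc123xyz" rel="me">Technorati Profile</a></p>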
Embedded Claim

This method can only be used on the Blogspot platform. Claiming via Embedded Claim means placing the code Technorati gives you into your template using a page element. Here's how:

1) On your blog, go to Template > Page Elements. 2) Click Add a Page Element and choose HTML/JavaScript. 3) Paste the code given by Technorati into the HTML/JavaScript element box. 4) Click SAVE CHANGES on the page element box. 5) Click the Release the Spiders! button on Technorati. 6) Done.
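The embed code Technorati hands out is specific to your account. The line below is only a hypothetical placeholder, meant to show that it is a small external script include, not the real code:

<!-- hypothetical placeholder; copy the actual embed snippet from your Technorati account -->
<script type="text/javascript" src="http://embed.technorati.com/embed/abc123xyz.js"></script>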

After you have successfully claimed the blog, Technorati asks you to provide information about it by editing the blog's settings.


Description – a brief explanation of the blog; write one that conveys what the blog is about, so people understand it and are tempted to visit. Language – the main language used in the blog's articles.


Tags – these are the same as keywords; enter all the keywords or core topics of the articles you write most often. Tags are very important for helping other people find your blog, so use them as well as you can.

At this point the claiming process is complete. So what benefits can we get from claiming a blog on Technorati? What makes Technorati important to bloggers?

How can Technorati be used to attract visitors? These are the questions we have to answer, and answering them takes regular research and plenty of experimentation.

How to Register with the ODP (DMOZ)


“Once you know the power of the ODP (DMOZ) as a foundation for your blog's traffic, don't blame me if your blog becomes the king of the directory search engines.”


A quick look at the ODP (DMOZ). The Open Directory Project (ODP), commonly called DMOZ, is currently the largest directory search engine, with roughly 37,000 volunteers serving as editors of the directory (source: the book “Top 10 On Google” by Rahmat Putra, published by Dian Rahmat).

DMOZ is something of a phenomenon among directory search engines because almost every well-known search engine draws on the DMOZ directory. Search engines that use DMOZ include Google, Alexa, Yahoo, AOL, Lycos, Teoma, Ask Jeeves, Netscape, HotBot and several others. Just imagine your site or blog being indexed in that directory: be proud if your blog is chosen as one that deserves a place in their categories, because every site or blog that gets in has passed a very strict screening process. It's not unusual for submitted sites to be rejected without any notification at all.

Will registering with the ODP increase your visitor traffic?

It would be too speculative to promise a dramatic increase, but in practice the traffic of a blog or site indexed in DMOZ does rise considerably. This is because your site or blog will appear in almost every major directory search engine, for the search terms that match your blog's description and title.

According to several observations, blogs and sites listed in the ODP rank higher than those that are not listed. Quite an advantage!

How long does it take for a site or blog to be indexed in the ODP?

I can't say for certain how long it takes to get listed in the ODP. What you need to understand is that ODP editing is done by humans, not by machines like Google's spider. In my experience, my blog was only indexed two weeks after I submitted it, and it can take even longer, perhaps depending on the editor's mood.

Keep at it and don't give up if your blog hasn't been indexed; it may still be in process. But if it isn't listed after three weeks, DMOZ allows you to submit it again, and you can also pick a different category that matches your blog's or site's theme.

A few important things before you submit.
For the title:
- Use the site's actual title.
- Don't use ALL CAPITALS.
- Remove advertising language from the title.

For the description:
- Don't use HTML tags.
- Avoid advertising language. Words and phrases like “cool” and “best darn site” will be removed.
- Don't use ALL CAPS in your description.
- Avoid capitalizing the first letter of every word in a sentence.
- Don't repeat your page title in the description.
- Check your spelling.

Read the DMOZ rules carefully:
- Please submit a URL to the Open Directory only once. If the site is not listed within three weeks, you may submit it again.
- Disguised submissions of the same URL more than once are likewise not allowed.
Example: http://Dmoz.org and http://Dmoz.org/index.html
- Don't submit mirror sites. Mirror sites have the same content but different URLs.
- Avoid submitting pages whose URL redirects to another URL.
- Please submit your URL to the most relevant category. Sites added to an unrelated category will be removed.
- Please wait until your site is finished before submitting it. Sites that are unfinished, carry an “Under Construction” notice, or have broken graphics and broken links are not good candidates for the directory.
- Sites with illegal content or pornography are not allowed in the Open Directory. Examples of illegal content include, but are not limited to, child pornography and copyright-infringing sites.

Note: all the rules above are official information from the DMOZ site, reproduced here without additions or omissions.

Important tips from OOM:
- Pick the right category and don't get it wrong; at the very least choose the one closest to your blog's or site's theme.
- Don't submit more than once. If you fail, wait three weeks and then submit your blog or site again.

- For the “Title”, it's best to use exactly the same title as your blog. Don't lie to the DMOZ editors; remember, they are humans, not machines.

- The “Site Description” is very important and affects search results. Use your keywords in the description, because in my experience several search engines, including Google, match searches against the description you enter. (My own blog's mistake was my carelessness in not entering the description correctly.)

- If your blog or site is about pornography, violence or viruses, just forget about DMOZ!

- Failed to get listed even though you followed the instructions correctly? First, try submitting once more; there's no harm in trying. Second, if you receive a message from an editor describing a mistake, try contacting that editor again; keep your message polite, firm and to the point, and certainly not rude. Third, try submitting to a different category, where the editor may be different too. Fourth, pray: maybe that's the best approach of all (just kidding).

Steps to register with the ODP
1. Open the site http://www.dmoz.com

2. Choose a directory related to your site or blog. For example, if your site or blog is about business, pick the business category; if it's about computers, pick computers. Or, say you want to join the personal blogs category where my blog is listed: click World – Bahasa_Indonesia – Masyarakat – Orang-orang – Halaman-halaman_pribadi – Anonim, or go to http://www.dmoz.org/World/Bahasa_Indonesia/Masyarakat/Orang-orang/Halaman-halaman_pribadi/Anonim/

3. Once you're happy with the chosen category, click "suggest URL" at the top right of the DMOZ page.

4. DMOZ will then take you to a form.
Warning: read all the notices and rules you must follow; this is very important!
Page URL: enter your site/blog address (for example: http://namablog.blogspot.com)
Page title: essentially the title; enter the title of your blog's page.
Page description: essentially your blog's description.
Your email address: enter your email address (for example: nama_email@gmail.com)
Verification code: enter the verification code shown in the image.

5. Double-check everything you've entered before it is processed; even a small mistake will cost you.

6. Then press the “Submit” button.

7. At this point the process is finished. If anything went wrong during submission, read the description or error notice and fix it right away.
If there are any mistakes in the information I've written, please point them out in the comments.

Optimizing sites for TV

Webmaster Level: All

Just as mobile phones make your site accessible to people on the go, Google TV makes your site easily viewable to people lounging on their couch. Google TV is a platform that combines your current TV programming with the web and, before long, more apps. It’s the web you love, with the TV you love, all available on the sofa made for you. Woohoo!

Because Google TV has a fully functioning web browser built in, users can easily visit your site from their TV. Current sites should already work, but you may want to provide your users with an enhanced TV experience -- what's called the “10-foot UI” (user interface). They'll be several feet away from the screen, not several inches away, and rather than a mouse on their desktop, they'll have a remote with a keyboard and a pointing device.

For example, here’s YouTube for desktop users versus what we’re calling “YouTube Leanback” -- our site optimized for large screens:


YouTube desktop version on the left, YouTube Leanback on the right

See our Spotlight Gallery for more examples of TV-optimized sites.

What does "optimized for TV" mean?

It means that, for the user sitting on their couch, your site on their TV is an even more enjoyable experience:
  • Text is large enough to be viewable from the sofa-to-TV distance.
  • Site navigation can be performed through button arrows on the remote (a D-pad), rather than mouse/touchpad usage.
  • Selectable elements provide a visual cue when selected (when you’re 10 feet away, it needs to be really, really obvious which selections are highlighted); a minimal CSS sketch of these two points follows this list.
  • and more...
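As a rough illustration of the D-pad and visual-cue points above, a few lines of CSS go a long way. This is a minimal sketch, not a complete 10-foot stylesheet; the sizes and colors are arbitrary:

<style>
  /* enlarge base text so it is readable from the couch */
  body { font-size: 150%; }
  /* make the element focused via D-pad navigation unmistakable */
  a:focus, button:focus, input:focus { outline: 6px solid #fc0; }
</style>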
How can webmasters gain a general idea of their site’s appearance on TV?

First, remember that appearance alone doesn't incorporate whether your site can be easily navigated by TV users (i.e. users with a remote rather than a mouse). With that said, here’s a quick workaround to give you a ballpark idea of how your site looks on TV. (For more in-depth info, please see the “Design considerations” in our optimization guide.)
  1. On a large monitor, make your window size 1920 x 1080.
  2. In a browser, visit your site at full screen.
  3. Zoom the browser to 1.5x the normal size. This is performed in different ways with different keyboards. For example, in Chrome if you press ctrl+ (press ctrl and + at the same time) twice, that’ll zoom the browser to nearly 1.5x the initial size.
  4. Move back to 3x your usual viewing distance (for example, if you normally sit 2 feet from the monitor, view your site from about 6 feet away).
  5. Check out your site!
And don’t forget, if you want to see your site with the real thing, Google TV enabled devices are now available in stores.

How can you learn more?

Our team just published a developer site, with TV optimization techniques, at code.google.com/tv/web/.


How to Prevent Auto Blogs from Copy-Pasting Your Articles



How to prevent auto blogs from copy-pasting articles - An auto blog is a blog that automatically republishes articles from other blogs via the feeds delivered to the auto blog's email.

At first an auto blog causes no real harm, because it gets indexed slowly; but over time it can hurt the blog being copied if the auto blog is indexed faster.

Why can it be indexed faster?

Because

Kerja Dalam Rumah (Working from Home)

Kerja Dalam Rumah ("Work from Home") is a hub for the latest information on businesses, ventures and work run from home. Kerja Dalam Rumah's vision and mission is to help internet beginners take advantage of available opportunities and turn them into a home business, and to help beginners develop their hobbies into a business.

Kerja Dalam Rumah



Kerja Dalam Rumah presents itself as a one-of-a-kind informative site, offering a picture of home businesses that are easy, simple and low-capital. It doesn't just publish information; it also runs training sessions and home-business challenges.

Home is a unique place for generating ideas, feelings and solutions to problems. That is why working from home matters, from nurturing an idea all the way to a real, growing business. Keep following the updates at www.kerjadalamrumah.com; we hope it proves useful and enlightening.

Regards, KDR



Click and like this fan page: Kerja Dalam Rumah

Website user research and testing on the cheap

Webmaster level: Intermediate

As the team responsible for tens of thousands of Google’s informational web pages, the Webmaster Team is here to offer tips and advice based on their experiences as hands-on webmasters.

If you’ve never tested or analyzed usage of your website, ask yourself if you really know whether your site is useful for your target audience. If you’re unsure, why not find out? For example, did you know that on average users scroll down 5.9 times as often as they scroll up, meaning that often once page content is scrolled past, it is “lost?” (See Jakob Nielsen’s findings on scrolling, where he advises that users don’t mind scrolling, but within limits.)

Also, check your analytics—are you curious about high bounce rates from any of your pages, or very short time-on-page metrics?

First, think about your user


The start of a web project—whether it’s completely new or a revamp of an existing site—is a great time to ask questions like:

  • How might users access your site—home, office, on-the-go?
  • How tech-savvy are your visitors?
  • How familiar are users with the subject matter of your website?

The answers to some of these questions can be valuable when making initial design decisions.

For instance, if the user is likely to be on the road, they might be short on time to find the information they need from your site, or be in a distracting environment and have a slow data connection—so a simple layout with single purpose would work best. Additionally, if you’re providing content for a less technical audience, make sure it’s not too difficult to access content—animation might provide a “wow” factor, but only if your user appreciates it and it’s not too difficult to get to the content.

Even without testing, building a basic user profile (or “persona”) can help shape your designs for the benefit of the user—this doesn’t have to be an exhaustive biography, but just some basic considerations of your user’s behavior patterns.

Simple testing


Testing doesn’t have to be a costly operation – friends and family can be a great resource. Some pointers:

  • Sample size: Just five people can be a large enough number of users to find common problems in your layouts and navigation (see Jakob Nielsen’s article on why using a small sample size is sufficient).
  • Choosing your testers: A range of different technical ability can be useful, but be sure to only focus on trends—for example, if more than 50% of your testers have the same usability issue, it’s likely a real problem—rather than individual issues encountered.
  • Testing location: If possible, visit the user in their home and watch how they use the site—observe how he/she normally navigates the web when relaxed and in their natural environment. Remote testing is also a possibility if you can’t make it in person—we’ve heard that Google+ hangouts can be used effectively for this (find out more about using Google+ hangouts).
  • How to test: Based on your site’s goals, define 4 or 5 simple tasks to do on your website, and let the user try to complete the tasks. Ask your testers to speak aloud so you can better understand their experiences and thought processes.
  • What to test: Basic prototypes in clickable image or document format (for example, PDF) or HTML can be used to test the basic interactions, without having to build out a full site for testing. This way, you can test out different options for navigation and layouts to see how they perform before implementing them.
  • What not to test: Focus on functionality rather than graphic design elements; viewpoints are often subjective. You would only get useful feedback on design from quantitative testing with large (200+) numbers of users (unless, for example, the colors you use on your site make the content unreadable, which would be good feedback!). One format for getting some useful feedback on the design can be to offer 5-6 descriptive keywords and ask your user to choose the most representative ones.
Overall, basic testing is most useful for seeing how your website’s functionality is working—the ease of finding information and common site interactions.

Lessons learned


In case you’re still wondering whether it’s really worth research and testing, here are a few simple things we confirmed from actual users that we wouldn’t have known if we hadn’t sat with actual users and watched them use our pages, or analyzed our web traffic.

  • Take care when using layouts that hide/show content: We found that when using scripts to expand and collapse long text passages, users often didn’t realize the extra content was available—the collapsed, JavaScript-rendered content is effectively “hidden” when the user searches within the page (for example, using Control + F, which we’ve seen often). See the sketch after this list.


    Wireframe of layout tested, showing “zipped” content on the bottom left

    Final page design showing anchor links in the top and content laid out in the main body of the page


  • Check your language: Headings, link and button text are what catches the user’s eye the most when scanning the page. Avoid using “Learn more…” in link text—users seem averse to clicking on a link which implies they will need to learn something. Instead, just try to use a literal description of what content the user will get behind the link—and make sure link text makes sense and is easy to understand out of context, because that is often how it will be scanned. Be mindful about language and try to make button text descriptive, inviting and interesting.
  • Test pages on a slower connection: Try out your pages using different networks (for example, try browsing your website using the wifi at your local coffee shop or a friend’s house), especially if your target users are likely to be viewing your pages from a home connection that’s not as fast as your office network. We found a considerable improvement in CTR and time-on-site metrics in some cases when we made scripted animations much simpler and faster (hint: use Google’s Page Speed Online to check performance if you don’t have access to a slower Internet connection).
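Following the hide/show finding above, here is a sketch of the pattern the final design used: anchor links at the top with all content visible in the main body, so in-page search can find it. The section names are invented for illustration:

<!-- minimal sketch: nothing is collapsed, so Ctrl+F and search engines see all the text -->
<ul>
  <li><a href="#setup">Setup</a></li>
  <li><a href="#faq">FAQ</a></li>
</ul>
<h2 id="setup">Setup</h2>
<p>Setup instructions stay rendered in the main body of the page.</p>
<h2 id="faq">FAQ</h2>
<p>FAQ content is likewise laid out in full rather than hidden behind a script.</p>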
So if you’re caught up in a seemingly never-ending redevelopment cycle, save yourself some time in the future by investing a little up front through user profiling and basic testing, so that you’re more likely to choose the right approach for your site layout and architecture.

We’d love to hear from you in the comments: have you tried out website usability testing? If so, how did you get on, and what are your favorite simple and low-cost tricks to get the most out of it?

The Linux Foundation Collaboration Summit @ Google


Last week, Google hosted the inaugural Linux Foundation Collaboration Summit. More than 200 developers and community leaders converged for three days of talks and working group meetings, giving birth to many new synergies within the community. Of particular interest was an initiative formed to improve power management functionality in Linux. If you're interested in learning more about the results of the summit and the Linux Foundation's ongoing activities, you can check out the Linux Foundation's Summit wrap-up or the Foundation's Summit press release.

We'd like to thank all of our guests for attending the summit. It was our pleasure and privilege to help make the summit a success.

Code Review: I/O Videos, Gears release, App Engine examples, and more


We are trying an experiment, putting up Code Review in a variety of formats, from text to audio (iTunes) and video.



You have probably heard by now, but all of the slides and video of the presentations at Google I/O are now available to watch and read. There are some real gems in there, such as Steve Yegge talking about dynamic languages and server side JavaScript.

Just as we come down from I/O, we head off to Google Developer Day events around the world. I am personally off to Brazil and Mexico City, and I am looking forward to meeting the local developers.

I gave a tech talk at Yahoo! where I discussed Google Back to Front, covering Gears and App Engine. I shared a simple App Engine example that takes a Gears-enabled Addressbook application that shows how you can store history in a visual way, and ports it to save the data on App Engine. You can watch a code walk through to see it in action.

Dick Wall (Google) and James Ward (Adobe) also got together to create an AIR application that talks to App Engine on the back end. The application, called QuickFix, takes a photo and has App Engine run the Picasa "I'm Feeling Lucky" transformation.

It is really fun to watch the great applications being built on App Engine already, such as Wordle, which builds "word clouds" from a series of text.

One final piece of news on App Engine. Nick Johnson (Google) created a little application in his spare time (read: not official) that is quite useful. smtp2web.com bridges SMTP to HTTP. This means that you can have your App Engine applications accepting email as input via the proxy. smtp2web will send an HTTP request when it gets an email on its doorstep.

There has been a lot of focus on the browser this week. Mozilla released Firefox 3, and it looks like they set a download record in the process. There was a lot of browser news overall, involving all of the major vendors.

The standards are moving too. HTML 5 has a new working draft, and we are seeing the germination of an Acid4 series of tests.

When it comes to Gears, we saw the full release of version 0.3 which included support for the new Firefox 3 browser. It also includes the ability to create desktop shortcuts, new install flow support, progress events, and much more.

We also saw more frameworks baking Gears in. Appcelerator uses Gears under the hood to make your existing Appcelerator based application a better user experience. Also, Frizione is a JavaScript development, testing, and deployment environment that also has Gears under the hood.

Speaking of testing, Markus Clermont and John Thomas wrote up an introduction to testing Ajax applications, something that is notoriously hard to do.

The Geo world is cooking as usual, and you can check out the numerous election mashups as the season continues to blossom.

If you fancy some fun on Google Maps, Katsuomi Kobayashi has created a 2D Driving Simulator using the new Flash API.

The folks at 360cities also have a great new interface that uses the Flash API, and they also seem to use every other Geo related product. We were fortunate enough to have them come in and sit down with us, and get a bunch of demos.

What else?

If you care about the social Web, check out Kevin Marks post on how not to be viral. It makes you think long term about your strategy.

Kevin Lim posted on the Custom Search API and the new developer guide. This API always surprises me with its richness, and how you can create a fantastic, custom, search experience on your own Web site.

Related to that API, we have another new AJAX Search API, Patent Search. I have to admit, I feel sorry for you if you have to use it (due to the content)!

And to finish up, Michael Ogawa has created some great visualizations of open source projects over time, such as the history of the Python code base. Check it out below.



As always, thanks for reading, listening, or watching, and let us know if there is anything that you would like to see.

State of the Index 2009

Webmaster Level: All

At PubCon in Las Vegas in November 2009, I gave a "State of the Index" talk which covers what Google has done for users, web developers, and webmasters in the last year. I recently recreated it on video for those of you who didn't make it to the conference. You can watch it below:


And here are the slides if you'd like to follow along:



Webmaster Tools gets a "Summer Shine"

The Google Webmaster Tools team has code names for each update we release. Today's update is aptly named "Summer Shine".

Here are a few highlights:
  • Our site selector now lists all verified sites that you own, and allows you to search as you type.
  • You can now block non-homepage sitelinks. Before today if you owned example.com, you couldn't block sitelinks for example.com/email.
  • You can now see URL removal requests submitted by other users for any sites you own, and revoke them if necessary. In the past, if another webmaster for your site mistakenly removed a URL on your site and left for vacation, it was a difficult process to undo the request.
  • Our "Home" page is much easier to navigate. We now make a clear distinction between verified and unverified sites.
We hope you like the improvements; tell us what you think.


URL removal explained, Part III: Removing content that you don't own

Webmaster Level: All

Welcome to the third episode of our URL removals series! In episodes one and two, we talked about expediting the removal of content that's under your control and requesting expedited cache removals. Today, we're covering how to use Google's public URL removal tool to request removal of content from Google’s search results when the content originates on a website not under your control.

Google offers two tools that provide a way to request expedited removal of content:

1. Verified URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site of which you’re a verified owner in Webmaster Tools (like your blog or your company’s site)

2. Public URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site which you can’t verify ownership (like your friend’s blog)

Sometimes a situation arises where the information you want to remove originates from a site that you don't own or can't control. Since each individual webmaster controls their site and their site’s content, the best way to update or remove results from Google is for the site owner (where the content is published) to either block crawling of the URL, modify the content source, or remove the page altogether. If the content isn't changed, it would just reappear in our search results the next time we crawled it. So the first step to remove content that's hosted on a site you don't own is to contact the owner of the website and request that they remove or block the content in question.
  • Removed or blocked content

    If the website owner removes a page, requests for the removed page should return a "404 Not Found" response or a "410 Gone" response. If they choose to block the page from search engines, then the page should either be disallowed in the site's robots.txt file or contain a noindex meta tag. Once one of these requirements is met, you can submit a removal request using the "Webmaster has already blocked the page" option.



    Sometimes a website owner will claim that they’ve blocked or removed a page but they haven’t technically done so. If they claim a page has been blocked you can double check by looking at the site’s robots.txt file to see if the page is listed there as disallowed.
    User-agent: *
    Disallow: /blocked-page/
    Another place to check if a page has been blocked is within the page’s HTML source code itself. You can visit the page and choose “View Page Source” from your browser. Is there a meta noindex tag in the HTML “head” section?
    <html>
    <head>
    <title>blocked page</title>
    <meta name="robots" content="noindex">
    </head>
    ...
    If they inform you that the page has been removed, you can confirm this by using an HTTP response testing tool like the Live HTTP Headers add-on for the Firefox browser. With this add-on enabled, you can request any URL in Firefox to test that the HTTP response is actually 404 Not Found or 410 Gone.

  • Content removed from the page

    Once you've confirmed that the content you're seeking to remove is no longer present on the page, you can request a cache removal using the 'Content has been removed from the page' option. This type of removal--usually called a "cache" removal--ensures that Google's search results will not include the cached copy or version of the old page, or any snippets of text from the old version of the page. Only the current updated page (without the content that's been removed) will be accessible from Google's search results. However, the current updated page can potentially still rank for terms related to the old content as a result of inbound links that still exist from external sites. For cache removal requests you’ll be asked to enter a "term that has been removed from the page." Be sure to enter a word that is not found on the current live page, so that our automated process can confirm the page has changed -- otherwise the request will be denied. Cache removals are covered in more detail in part two of the "URL removal explained" series.


  • Removing inappropriate webpages or images that appear in our SafeSearch filtered results

    Google introduced the SafeSearch filter with the goal of providing search results that exclude potentially offensive content. For situations where you find content that you feel should have been filtered out by SafeSearch, you can request that this content be excluded from SafeSearch filtered results in the future. Submit a removal request using the 'Inappropriate content appears in our SafeSearch filtered results' option.

If you encounter any issues with the public URL removal tool or have questions not addressed here, please post them to the Webmaster Help Forum or consult the more detailed removal instructions in our Help Center. If you do post to the forum, remember to use a URL shortening service to share any links to content you want removed.

Edit: Read the rest of this series:
Part I: Removing URLs & directories
Part II: Removing & updating cached content
Part IV: Tracking requests, what not to remove
Companion post: Managing what information is available about you online


Get Cooking with the Webmaster Tools API




As the days grow longer and summer takes full stage, many of us are flocking to patios and parks to engage in the time-honored tradition of grilling food. When it comes to cooking outdoors, the type of grills used span a spectrum from primitive to high tech. For some people a small campfire is all that's required for the perfect outdoor dining experience. For other people the preferred tool for outdoor cooking is a quad grill gas-powered stainless steel cooker with enough features to make an Iron Chef rust with envy.



An interesting off-shoot of outdoor cooking techniques is solar cooking, which combines primitive skills and modern ingenuity. At its most basic, solar cooking involves creating an "oven" that is placed in the sun and passively cooks the food it contains. It is simple to get started with solar cooking because a solar oven is something people can make themselves with inexpensive materials and a bit of effort. The appeal of simplicity, inexpensiveness and the ability to "do it yourself" has created a growing group of people who are making solar ovens themselves.
How all this relates to webmasters is that the webmaster community is also made up of a diverse group of people who use a variety of tools in a myriad of ways. Just like how within the outdoor cooking community there's a contingent of people creating their own solar ovens, the webmaster community has a subgroup of people creating and sharing their own tools. From our discussions with webmasters, we've consistently heard requests to open Webmaster Tools for third-party integration. The Webmaster Tools team has taken this request to heart and I'm happy to announce that we're now releasing an API for Webmaster Tools. The supported features in the first version of the Webmaster Tools API are the following:
  • Managing Sites
    • Retrieve a list of your sites in Webmaster Tools

    • Add your sites to Webmaster Tools

    • Verify your sites in Webmaster Tools

    • Remove your sites from Webmaster Tools

  • Working with Sitemaps
    • Retrieve a list of your submitted Sitemaps
    • Add Sitemaps to Webmaster Tools

    • Remove Sitemaps from Webmaster Tools



Although the current API offers a limited subset of all the functionality that Webmaster Tools provides, this is only the beginning. Get started with the Developer's Guide for the Webmaster Tools Data API to begin working with the API.
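To give a feel for the API (it followed the Google Data / Atom protocol of the era), here is a sketch of adding a site. The feed URL and payload shape are assumptions based on typical GData services of the time, so check the Developer's Guide for the authoritative details:

POST /webmasters/tools/feeds/sites/ HTTP/1.1
Host: www.google.com
Content-Type: application/atom+xml

<entry xmlns="http://www.w3.org/2005/Atom">
  <!-- the site to add is carried in the content element's src attribute -->
  <content src="http://www.example.com/" />
</entry>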



Webmasters... fire up your custom tools and get cooking!


Tips on using feeds and information on subscriber counts in Reader

Does your site have a feed? A feed can connect you to your readers and keep them returning to your content. Most blogs have feeds, but increasingly, other types of sites with frequently changing content are making feeds available as well. Some examples of sites that offer feeds:
Find out how many readers are subscribed to your feed
If your site has a feed, you can now get information about the number of Google Reader and Google Personalized Homepage subscribers. If you use Feedburner, you'll start to see numbers from these subscriptions taken into account. You can also find this number in the crawling data in your logs. We crawl feeds with the user-agent Feedfetcher-Google, so simply look for this user-agent in your logs to find the subscriber number. If multiple URLs point to the same feed, we may crawl each separately, so in this case, just count up the subscriber numbers listed for each unique feed-id. An example of what you might see in your logs is below:

User-Agent: Feedfetcher-Google; (+http://www.google.com/feedfetcher.html; 4 subscribers; feed-id=1794595805790851116)

Making your feed available to Google
You can submit your feed as a Sitemap in webmaster tools. This will let us know about the URLs listed in the feed so we can crawl and index them for web search. In addition, if you want to make sure your feed shows up in the list of available feeds for Google products, simply add a <link> tag with the feed URL to the <head> section of your page. For instance:

<link rel="alternate" type="application/atom+xml" title="Your Feed Title" href="http://www.example.com/atom.xml" />

Remember that Feedfetcher-Google retrieves feeds only for use in Google Reader and Personalized Homepage. For the content to appear in web search results, Googlebot will have to crawl it as well.

Don't yet have a feed?

If you use a content management system or blogging platform, feed functionality may be built right in. For instance, if you use Blogger, you can go to Settings > Site Feed and make sure that Publish Site Feed is set to Yes. You can also set the feed to either full or short and can add a footer. The URL listed here is what subscribers add to their feed readers. A link to this URL will appear on your blog.
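If you're building a feed by hand instead, a minimal Atom feed looks roughly like the sketch below; all titles, URLs and dates are placeholders:

<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Your Feed Title</title>
  <link href="http://www.example.com/"/>
  <id>http://www.example.com/</id>
  <updated>2013-06-01T00:00:00Z</updated>
  <author><name>Your Name</name></author>
  <entry>
    <title>An example post</title>
    <link href="http://www.example.com/example-post"/>
    <id>http://www.example.com/example-post</id>
    <updated>2013-06-01T00:00:00Z</updated>
    <summary>A short summary of the post.</summary>
  </entry>
</feed>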

More tips from the Google Reader team
In order to provide the best experience for your users, the Google Reader team has also put together some tips for feed publishers. This document covers feed best practices, common implementation pitfalls, and various ways to promote your feeds. Whether you're creating your feeds from scratch or have been publishing them for a long time, we encourage you to take a look at our tips to make the most of your feeds. If you have any questions, please get in touch.


We're back from SES Milan!

...with a couple of clarifications

Ciao everybody! We just got back from Italy—great weather there, I must say! We attended SES in Milan on the 29th and 30th of May. The conference was a great opportunity to talk to many of you. We really had a good time and want to thank all the people who stopped by to simply say "hi" or to talk to us in more detail about search engine strategies. This gave us a chance to talk to many participants and many of the big Italian players in the SEO and web marketing worlds. We discussed recent developments in the Italian internet market, SEO strategies and evangelizing.

A number of you have raised interesting questions, and we'd like to go through two of these in more detail.

This is a situation a webmaster might find himself/herself in: I optimized this site using some sneaky techniques that are not in accordance with Google's Webmaster Guidelines. I got away with it for a while, and it helped me rank in second position for certain keywords. Then, suddenly, I got an email from Google saying my site had been banned from the index because of those techniques (these emails always include an example of one of the infractions found). I have now cleaned up the site, and after a few days it was back in the index.
Why on earth doesn't my site rank in second position anymore, even though I've already paid for the sneaky techniques we used?

OK, before answering let me ask you a couple of questions:

  • Didn't you optimize your site with those techniques in order to artificially boost the ranking?
  • Didn't you think those techniques had worked out (in a short term perspective at least)?

So, if there has been spamming going on, we encourage a site that has gotten an email from Google to take this notification seriously. Many people clean up their sites after receiving a notification from us. But we must also take into account that besides the shady SEO techniques used on a particular site (for instance hidden text, redirecting doorway pages, etc) there are often off-site SEO techniques used such as creating artificial link popularity in order to gain a high position in Google's SERPs.

So, to make it straightforward, once those manipulations to make a site rank unnaturally high are removed, the site gains the position it merits based on its content and its natural link popularity. Note that of course the ranking of your site also depends on other sites related to the same topic and these sites might have been optimized in accordance to our guidelines, which might affect the ranking of your site.

Note that a site does not keep a stain or any residual negative effect from a prior breach of our webmaster guidelines, after it has been cleaned up.

That is why we first and foremost recommend working hard on the content made for the audience of your site, as the content is a decisive factor for building natural link popularity. We all know how powerful a strong natural link popularity can be.

Search quality, content quality and your visitor's experience.

During our conversations about search-related issues, another topic that came up frequently was landing pages and writing for search engines, which are often related when we consider organic search results.

So, think of your visitors who have searched for something with Google and have found your page. Now, what kind of welcome are you offering? A good search experience consists of finding a page that contains enough information to satisfy your original query.

A common mistake in writing optimized content for search engines is to forget about the user and focus only on that particular query. One might say, that's how the user landed on my page!

At the end of the day, taking this attitude to the extreme can lead to creating pages made only to satisfy that query but with no actual content on them. Such pages often adopt techniques such as, among others, mere repetition of keywords, duplicate content and overall very little value. In general, they might be in line with the keywords of the query – but for your visitor, they’re useless. In other words, you have written pages solely for the search engine and forgotten about the user. As a result, your visitor will find a page apparently on topic but totally meaningless.

These “meaningless” pages, artificially made to generate search engine traffic, do not represent a good search experience. Even though they do not employ other techniques we recommend against, such as hidden text and links, they are made solely for the purpose of ranking for particular keywords, or a set of keywords, while not actually offering a satisfying search result in themselves.

A first step to identify if you are causing a bad search experience for your visitor consists of checking that the pages that he or she finds are actually useful. They will have topical content, that satisfies the query for which your visitor has found it and are overall meaningful and relevant. You might want to start with the pages that are most frequently found and extend your check up to your entire site. To sum up, as general advice, even if you want to make a page that is easily found via search engines, remember that the users are your audience, and that a page optimized for the search engine does not necessarily meet the user's expectations in terms of quality and content. So if you find yourself writing content for a search engine, you should ask yourself what the value is for the user!


Search Queries Alerts in Webmaster Tools

Webmaster level: All

We know many of you check Webmaster Tools daily (thank you!), but not everybody has the time to monitor the health of their site 24/7. It can be time consuming to analyze all the data and identify the most important issues. To make it a little bit easier we’ve been incorporating alerts into Webmaster Tools. We process the data for your site and try to detect the events that could be most interesting for you. Recently we rolled out alerts for Crawl Errors and today we’re introducing  alerts for Search Queries data.

The Search Queries feature in Webmaster Tools shows, among other things, impressions and clicks for your top pages over time. For most sites, these numbers follow regular patterns, so when sudden spikes or drops occur, it can make sense to look into what caused them. Some changes are due to differing demand for your content, other times they may be due to technical issues that need to be resolved, such as broken redirects. For example, a steady stream of clicks which suddenly drops to zero is probably worth investigating.

The alerts look like this: [screenshot of a Search Queries alert message]

We’re still working on the sensitivity threshold of the messages and welcome your feedback in our help forums. We hope the new alerts will be useful. Don’t forget to sign up for email forwarding to receive them in your inbox.
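The post does not describe the detection logic itself, but the idea is easy to picture. Below is a minimal, hypothetical Python sketch of the kind of spike/drop check you could run yourself on a daily click series exported from Search Queries; the 7-day window and 3x ratio are made-up values for illustration, not Google's actual thresholds.

# Hypothetical spike/drop detector for a daily clicks series.
def find_anomalies(daily_clicks, window=7, ratio=3.0):
    alerts = []
    for i in range(window, len(daily_clicks)):
        baseline = sum(daily_clicks[i - window:i]) / window  # trailing average
        today = daily_clicks[i]
        # Flag days far above or far below the recent baseline
        if baseline > 0 and (today > baseline * ratio or today < baseline / ratio):
            alerts.append((i, baseline, today))
    return alerts

clicks = [120, 130, 125, 118, 122, 127, 124, 0]  # a steady stream, then zero
print(find_anomalies(clicks))  # flags the final day's drop to zero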

Posted by the Tech Lead, Webmaster Tools

seo Post Rating on Blogger 2013

Seo Master present to you: The Post Rating Widget is a widget that appears at the end of each of our posts and that visitors can vote on directly to rate the post. The widget looks like the image below

post rating widget


How do you use it on a Blogger (Blogspot) blog?

It's relatively easy; here's how:

1. Log in to your Blogger account

2. Click on Layout

3. Then click Edit HTML and tick the "expand widget template" box

4. Find the code below
<data:post.body/>
5. Then place the code below directly beneath the code above
<script type='text/javascript'>
// Pass the current post's permalink to the Outbrain rating widget
var OutbrainPermaLink='<data:post.url/>';
// Demo mode off: visitor votes are recorded for real
var OB_demoMode = false;
var OB_Script = true;
</script>
<script src='http://widgets.outbrain.com/OutbrainRater.js' type='text/javascript'/>
6. Save, and you're done.

Feel free to experiment; the code can be placed anywhere, as long as it stays inside include='post'.
2013, By: Seo Master

from web contents: Google's email communication with webmasters 2013

Greetings, everyone. This is a topic from the Google Webmaster Central blog: Posted by Ríona MacNamara, Webmaster Tools Team

In recent days there have again been attempts to unsettle German webmasters with fake emails claiming to be from Google. These emails do not come from Google. A few weeks ago, Google stopped notifying webmasters by email, and is currently working on a more reliable webmaster communication process.

We've noticed that someone is again trying to spoof the emails that Google sends to webmasters to alert them to issues with their site. These emails are not coming from Google; in fact, several weeks ago we temporarily discontinued sending these emails to webmasters while we explore different, secure ways of communicating with webmasters. Watch this space for more news, but in the meantime you can safely assume that any such email message you receive is not, in fact, from us.

seo Review of Bitdefender Internet Security 2013 2013

Seo Master present to you:

Bitdefender Internet Security 2013
According to AV-Test in Germany, Bitdefender was declared the best antivirus engine in terms of protection, repair, and usability. Bitdefender also received the "Best Product of the Year 2013" award and ranks at the top of Top Ten Reviews for performance, features, and help and support. There are too many achievements to list them all, so let's get on with the review of Bitdefender Internet Security 2013.

Features:


Bitdefender Internet Security offers a variety of security features to protect you from all kinds of online threats and to safeguard your privacy online. It comes with plenty of tools for parents to monitor their children's online activity, allowing you to block suspicious sites, as well as messages and emails that contain flagged words or phrases. You can also limit each child's internet access time. When you search for something on a search engine like Google, Bitdefender gives each URL in the search results a security rating based on the number of votes it receives.

Bitdefender makes online banking more secure through its Safepay (sandboxed web browser) feature, which prevents keyloggers, spyware, and the like from stealing information about your bank accounts or other transactions you make online. It is a must-have tool for safe online transactions.

Bitdefender automatically checks for updates periodically to protect you from the new kinds of threats emerging every day. This feature can be disabled if you prefer to check manually, but leaving it on is necessary to get the most out of the product's protection.

Speed and Performance:

Like other antivirus programs, Bitdefender appears to slow the system slightly while a scan is in progress, but fortunately it includes features that pick the best time to scan without slowing you down. Bitdefender Internet Security can run a full system scan quickly with little use of memory, and you can even schedule scans. Don't worry: these scans run in the background without affecting your work in progress, so you can keep multitasking. Usually Bitdefender scans only when your system is idle; the Scan Dispatcher tool starts a scan only if PC usage falls below a certain level.

Bitdefender's "AutoPilot" which is present at the top right corner of the window can be turned on/off. Autopilot is really a great option if you are busy and don't want to get interrupted in the middle of your work in applying decisions to the actions to be taken against the threats.

Help & Support:

When it comes to help and support, Bitdefender is available 24/7 and ready to assist at any time. You can search for answers online, put questions to the professionals, chat with them, or even call them. Their blog has interesting posts on security topics and emerging threats.

System Requirements:

If you're asking about minimum system requirements, here they are:
  • Operating system: Microsoft Windows XP SP3 (32-bit), Windows Vista (SP2), Microsoft Windows 7 (SP1), Microsoft Windows 8
  • CPU: 800 MHz processor
  • Memory (RAM): 1 GB
  • Available free hard disk space: 1.8 GB free space (at least 800 MB on the system drive)
  • Additional software: .NET Framework 3.5 (automatically installed by Bitdefender if necessary)

Price/Shopping:

Bitdefender's price is comparatively low next to other antivirus products, which offer fewer features than Bitdefender does.

1 PC          -  $24.97 (1 year)   $44.98 (2 years)   $64.98 (3 years)
Up to 3 PCs   -  $34.97 (1 year)   $54.97 (2 years)   $79.97 (3 years)
Up to 5 PCs   -  $54.97 (1 year)   $89.98 (2 years)   $124.97 (3 years)
Up to 10 PCs  -  $89.98 (1 year)   $159.97 (2 years)  $214.97 (3 years)

You can visit Bitdefender Antivirus to download a trial version or buy the product, or check out other antivirus solutions.
2013, By: Seo Master

from web contents: How to verify Googlebot 2013

Greetings, everyone. This is a topic from the Google Webmaster Central blog: Lately I've heard a couple of smart people ask that search engines provide a way to know that a bot is authentic. After all, any spammer could name their bot "Googlebot" and claim to be Google, so which bots do you trust and which do you block?

The common request we hear is to post a list of Googlebot IP addresses in some public place. The problem with that is that if/when the IP ranges of our crawlers change, not everyone will know to check. In fact, the crawl team migrated Googlebot IPs a couple years ago and it was a real hassle alerting webmasters who had hard-coded an IP range. So the crawl folks have provided another way to authenticate Googlebot. Here's an answer from one of the crawl people (quoted with their permission):


Telling webmasters to use DNS to verify on a case-by-case basis seems like the best way to go. I think the recommended technique would be to do a reverse DNS lookup, verify that the name is in the googlebot.com domain, and then do a corresponding forward DNS->IP lookup using that googlebot.com name; eg:

> host 66.249.66.1
1.66.249.66.in-addr.arpa domain name pointer crawl-66-249-66-1.googlebot.com.

> host crawl-66-249-66-1.googlebot.com
crawl-66-249-66-1.googlebot.com has address 66.249.66.1

I don't think just doing a reverse DNS lookup is sufficient, because a spoofer could set up reverse DNS to point to crawl-a-b-c-d.googlebot.com.


This answer has also been provided to our help-desk, so I'd consider it an official way to authenticate Googlebot. In order to fetch from the "official" Googlebot IP range, the bot has to respect robots.txt and our internal hostload conventions so that Google doesn't crawl you too hard.
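For anyone who wants to automate that check, here is a minimal Python sketch of the reverse-then-forward lookup described above. The function name is my own, and it accepts only the googlebot.com domain mentioned in the post.

# Verify a crawler IP as described above: reverse DNS lookup, check the
# name is under googlebot.com, then forward-resolve the name and make
# sure it maps back to the same IP.
import socket

def is_googlebot(ip):
    try:
        host = socket.gethostbyaddr(ip)[0]              # reverse DNS lookup
    except socket.herror:
        return False
    if not host.endswith('.googlebot.com'):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward lookup
    except socket.gaierror:
        return False
    return ip in forward_ips                            # must round-trip

print(is_googlebot('66.249.66.1'))  # True for the example above

The round-trip matters: as the quote notes, a spoofer can control the reverse record for their own IP, but they cannot make the forward record for a real googlebot.com name point at it.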

(Thanks to N. and J. for help on this answer from the crawl side of things.)

seo Registering Your Feed on Google FeedBurner 2013

Seo Master present to you: FeedBurner is a site that provides syndication, or feed, services. All of our posts are fully documented there. More interestingly, FeedBurner has now been acquired by Google (lucky for Blogspot users, no extra effort needed), so it is now known as Google FeedBurner.

Another advantage is that your blog will become known faster. Why? FeedBurner has a subscriber feature that lets your visitors keep up with your newest posts, and the feed can be registered with various search engines and blog directories so that they receive your post updates. Nice, right? And the service is free!

As I mentioned above, FeedBurner has been taken over by Google, so the service is easier for Blogspot users, who already have a Google account: just sign in to your Google account and click on FeedBurner. If you don't have an account yet, you'll have to create one first...

Next:

1. A welcome page from FeedBurner will appear

2. In the box below the words "Burn a feed right this instant", enter your blog address. Example: http://www.matrixar.com

3. Click the Next button

4. Change the Feed Title and Feed Address if you wish (up to you), then click the Activate Feed button

5. A "Congrats" message appears; just click the Next button

6. Tick the boxes next to "Clickthroughs" and "I want more! Have FeedBurner Stats PRO also track" (the PRO service has been free since the Google acquisition)

7. Click the Next button

8. For optimization, click the Optimize tab

9. Click SmartFeed, then click the Activate button

Feel free to configure the rest however you like. Why do I single out SmartFeed? Because this feature makes our feed compatible with all kinds of feed readers.

10. Next, click the Publicize tab

11. Click PingShot

One more thing I consider important is enabling PingShot, so that every time we publish a new post, FeedBurner immediately notifies the various search engines and blog directories.

12. Tick the box next to Ping-o-matic

13. Click Activate

14. And that's it.

Feel free to explore the other features on your own...

Hope this is useful.

Oops... one thing was left out: activating the Awareness API feature. It lets developers (in this case, the feed owner) build applications that display, analyze, and promote our feed traffic data outside FeedBurner.

How:
Still on the Publicize page (step 10), click Awareness API, then click Activate. A minimal sketch of using the API follows below.
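As an illustration, the Python snippet below fetches a feed's daily subscriber counts once the feature is active. Here 'yourfeedname' is a placeholder for your own feed URI, and the GetFeedData endpoint and XML attributes reflect the Awareness API as it was documented at the time; treat this as a sketch under those assumptions.

# Minimal sketch of a FeedBurner Awareness API call; replace
# 'yourfeedname' with your own feed URI. Each <entry> element in the
# XML response carries date, circulation (subscribers) and hits.
import urllib.request
import xml.etree.ElementTree as ET

url = ('https://feedburner.google.com/api/awareness/1.0/GetFeedData'
       '?uri=yourfeedname')
with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

for entry in tree.iter('entry'):
    print(entry.get('date'), entry.get('circulation'), entry.get('hits'))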
OK, hopefully nothing else has been left out. If anything has, we'll cover it next time, OK?!
2013, By: Seo Master