News and Tutorials from Votre Codeur | SEO | Website creation | Software development

From the Google Webmaster Central blog: Webmaster level: All

Today we’re announcing two updates to rich snippets.

First, we’re happy to announce that product rich snippets, which previously were only available in a limited set of locales, are supported globally.  Users viewing your site’s results in Google search can now preview information about products available on your website, regardless of where they’re searching from. Here’s an example of a product rich snippet:
[Image: a product rich snippet from www.google.fr]
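The post shows the snippet but not the markup behind it. As a hedged illustration, a product rich snippet like the one above is typically driven by structured data embedded in the product page. The sketch below uses schema.org microdata with a hypothetical product name, price, and rating (the schema.org Product and Offer vocabulary is real; the values are placeholders):

```html
<!-- Hypothetical product page fragment using schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Executive Anvil</span>
  <img itemprop="image" src="anvil.jpg" alt="Executive Anvil">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="priceCurrency" content="EUR">€</span>
    <span itemprop="price" content="119.99">119.99</span>
  </div>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.4</span> stars, based on
    <span itemprop="reviewCount">89</span> reviews
  </div>
</div>
```

Markup along these lines can be checked with the rich snippets testing tool before publishing it to a live page.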
Second, we’ve updated the rich snippets testing tool to support HTML input. We heard from many users that they wanted to be able to test their HTML source without having to publish it to a web page. This is now supported by the tool, as shown below.
[Image: previewing rich snippets from HTML source in the testing tool]
If you have any questions or feedback about these changes, please let us know in our Help Forum. You can find more information about rich snippets in our Help Center and Webmaster Education site.

From the Google Webmaster Central blog: Sites with more content can have more opportunities to rank well in Google. It makes sense that having more pages of good content represents more chances to rank in search engine result pages (SERPs). Some SEOs, however, do not focus on the user's needs, but instead create pages solely for search engines. This approach is based on the false assumption that increasing the volume of web pages with random, irrelevant content is a good long-term strategy for a site. These techniques are usually accomplished by abusing qlweb-style catalogues or by scraping content from sources known for good, valid content, like Wikipedia or the Open Directory Project.

These methods violate Google's webmaster guidelines. Purely scraped content, even from high quality sources, does not provide any added value to your users. It's worthwhile to take the time to create original content that sets your site apart. This will keep your visitors coming back and will provide useful search results.

In order to provide the best possible results to our Polish and non-Polish users, Google continues to improve its algorithms for validating web content.

Google is willing to take action against domains that try to rank more highly by just showing scraped or other autogenerated pages that don't add any value to users. Companies, webmasters, and domain owners who consider SEO consultation should take care not to spend time on methods which will not have worthwhile long-term results. Choosing the right SEO consultant requires in-depth background research, and their reputation and past work should be important factors in your decision.

PS: Head on over to our Polish discussion forum, where we're monitoring the posts and chiming in when we can!


Posted by Kaspar Szymanski, Search Quality
From the Google Webmaster Central blog: Thanks to everyone who stopped by to say hi at the Search Engine Strategies conference in San Jose last week!

I had a great time meeting people and talking about our new webmaster tools. I got to hear a lot of feedback about what webmasters liked, didn't like, and wanted to see in our Webmaster Central site. For those of you who couldn't make it or didn't find me at the conference, please feel free to post your comments and suggestions in our discussion group. I do want to hear about what you don't understand or what you want changed so I can make our webmaster tools as useful as possible.

Some of the highlights from the week:

This year, Danny Sullivan invited some of us from the team to "chat and chew" during a lunch hour panel discussion. Anyone interested in hearing about Google's webmaster tools was welcome to come and many did -- thanks for joining us! I loved showing off our product, answering questions, and getting feedback about what to work on next. Many people had already tried Sitemaps, but hadn't seen the new features like Preferred domain and full crawling errors.

One of the questions I heard more than once at the lunch was about how big a Sitemap can be, and how to use Sitemaps with very large websites. Since Google can handle all of your URLs, the goal of Sitemaps is to tell us about all of them. A Sitemap file can contain up to 50,000 URLs and should be no larger than 10MB when uncompressed. But if you have more URLs than this, simply break them up into several smaller Sitemaps and tell us about them all. You can create a Sitemap Index file, which is just a list of all your Sitemaps, to make managing several Sitemaps a little easier.
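The advice above can be sketched concretely. A Sitemap Index file, per the sitemaps.org protocol, is itself a small XML file that lists the individual Sitemap files; the domain, filenames, and dates below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical Sitemap Index: each <sitemap> entry points to one
     Sitemap file of up to 50,000 URLs -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml.gz</loc>
    <lastmod>2006-08-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml.gz</loc>
  </sitemap>
</sitemapindex>
```

You then submit just the index file, and Google discovers the individual Sitemaps from it.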

While hanging out at the Google booth I got another interesting question: One site owner told me that his site is listed in Google, but its description in the search results wasn't exactly what he wanted. (We were using the description of his site listed in the Open Directory Project.) He asked how to remove this description from Google's search results. Vanessa Fox knew the answer! To specifically prevent Google from using the Open Directory for a page's title and description, use the following meta tag:
<meta name="GOOGLEBOT" content="NOODP">

My favorite panel of the week was definitely Pimp My Site. The whole group was dressed to match the theme as they gave some great advice to webmasters. Dax Herrera, the coolest "pimp" up there (and a fantastic piano player), mentioned that a lot of sites don't explain their product clearly on each page. For instance, when pimping Flutter Fetti, there were many instances when all the site had to do was add the word "confetti" to the product description to make it clear to search engines and to users reaching the page exactly what a Flutter Fetti stick is.

Another site pimped was a Yahoo! Stores web site. Someone from the audience asked if the webmaster could set up a Google Sitemap for their store. As Rob Snell pointed out, it's very simple: Yahoo! Stores will create a Google Sitemap for your website automatically, and even verify your ownership of the site in our webmaster tools.

Finally, if you didn't attend the Google dance, you missed out! There were Googlers dancing, eating, and having a great time with all the conference attendees. Vanessa Fox represented my team at the Meet the Google Engineers hour that we held during the dance, and I heard Matt Cutts even starred in a music video! While demo-ing Webmaster Central over in the labs area, someone asked me about the ability to share site information across multiple accounts. We associate your site verification with your Google Account, and allow multiple accounts to verify ownership of a site independently. Each account has its own verification file or meta tag, and you can remove them at any time and re-verify your site to revoke verification of a user. This means that your marketing person, your techie, and your SEO consultant can each verify the same site with their own Google Account. And if you start managing a site that someone else used to manage, all you have to do is add that site to your account and verify site ownership. You don't need to transfer the account information from the person who previously managed it.
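As an illustration of per-account verification, each account's meta tag carries its own token, so several tags can sit in the same page's head at once and any one of them can later be removed to revoke that user's access. The tag name and token values below are placeholders, not the author's exact markup (the attribute name Google uses for verification has changed over the years):

```html
<head>
  <!-- Hypothetical tokens: one verification meta tag per Google Account -->
  <meta name="google-site-verification" content="TOKEN-for-marketing-account">
  <meta name="google-site-verification" content="TOKEN-for-seo-consultant">
</head>
```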

Thanks to everyone who visited and gave us feedback. It was great to meet you!