News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development

from web contents: Google now indexes SVG 2013

Webmaster Level: All

You can now use Google search to find SVG documents. SVG is an open, XML-based format for vector graphics with support for interactive elements. We’re big fans of open standards, and our mission is to organize the world’s information, so indexing SVG is a natural step.

We index SVG content whether it is in a standalone file or embedded directly in HTML. The web is big, so it may take some time before we crawl and index most SVG files, but as of today you may start seeing them in your search results. If you want to see it yourself, try searching for [sitemap site:fastsvg.com] or [HideShow site:svg-whiz.com]

If you host SVG files and you wish to exclude them from Google’s search results, you can use the “X-Robots-Tag: noindex” directive in the HTTP header.
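
For example, if your SVG files happen to be served by an Apache server with mod_headers enabled (an assumption for this sketch, not something the post specifies), a configuration like the following would add that header to every SVG response:

    <FilesMatch "\.svg$">
      Header set X-Robots-Tag "noindex"
    </FilesMatch>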

Check out Webmaster Central for a full list of file types we support.


from web contents: Spookier than malware 2013


hotdog

lion king
...and infinitely more fun: webmasters and their pets incognito! Happy Halloween, everyone! If you see any costumes that would pass the SafeSearch filter :), feel like sharing a gripe or telling a good story, please join the chat!

Take care, and don't forget to brush your teeth.
 Yours scarily,
  The Webmaster Central Team


Our glasses-wearing, no vampire-teeth vampire (Ryan), zombie Mur, Holiday Fail (Tiffany Lane), Colbert Hipster (Dan Vanderkam), Rick Astley Cutts, Homeboy Ben D'Angelo, Me -- pinker & poofier, Investment Bank CEO Shyam Jayaraman (though you can't see the golden parachute in his backpack)



Chark as Juno, Wysz as Beah Burger (our co-worker), Adi and Matt Dougherty as yellow ninja, red ninja!


Heroes come in all shapes and sizes...

Powdered toast man, Mike Leotta

Adam Lasnik as, let me see if I get this right, a "secret service agent masquerading as a backstage tech" :)


from web contents: Plumbing the web 2013


Today is Google Developer Day! We're hosting events for developers in ten cities around the world, as you can read about from Matt Cutts and on our Google Blog. Jonathan Simon and Maile Ohye, whom you have seen on this blog, at conferences, and in our discussion forum, are currently hanging out at the event in San Jose.

I've been at the Beijing event, where I gave a keynote about "Plumbing the Web -- APIs and Infrastructures" for 600 Chinese web developers. I talked about a couple of my favorite topics, Sitemaps and Webmaster Tools, and some of the motivations behind them. Then I talked a bit about consumer APIs and some of our backend infrastructures to support our platform.

Check out the video of my keynote on YouTube or see some of the other videos from the events around the globe.

from web contents: Making harmonious use of Webmaster Tools and Analytics 2013

Written by Reid Yokoyama, Search Quality Team

Occasionally in the discussion group, webmasters ask, "Should I be using Google Webmaster Tools or Google Analytics?" Our answer is: use both! Here are three scenarios that really highlight the power of both tools.

1. Make the most of your impressions
One of my favorite features of Webmaster Tools is that it will show you the Top 20 search queries your site appeared for along with the Top 20 clicked queries. The data from the Top Search Queries allows you to quickly pinpoint what searches your site appears for and which of those searches are resulting in clicks. Let's look at last week's data for www.google.com/webmasters as an example.


As you can see, Google Webmaster Central is receiving a great number of impressions for the query [gadgets] but may not be fully capitalizing on these impressions with user clicks. Click on [gadgets] to see how your site appears in our search results. Do your title and snippet look appealing to users? As my colleague Michael recently wrote, it might be time to do some "housekeeping" on your website -- it's a great, low-to-no-cost way to catch the attention of your users. For example, we could work to improve our snippet from:

To something more readable, such as "Use gadgets to easily add cool, dynamic content to your site...", by adding a meta description to the page.
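
In HTML terms, that is just a matter of adding (or rewriting) the meta description tag in the page's head; a minimal sketch using the wording above:

    <head>
      <title>Google Webmaster Central</title>
      <meta name="description" content="Use gadgets to easily add cool, dynamic content to your site...">
    </head>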

And what are users doing when they visit your site? Are they browsing your content or bouncing off your site quickly? To find out, Google Analytics will calculate your site's "bounce rate," or the percentage of single-page visits (e.g. someone just visiting your homepage and then leaving). This can be a helpful measure of the quality of your site's landing page and the traffic your site receives. After all, once you've worked hard to get your users to visit your site, you want to keep them there! Check out the Analytics blog for further information about "bounce rate."
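
As a made-up illustration: if 1,000 visits land on your homepage and 400 of those visitors leave without viewing a second page, the bounce rate for that page is 400 / 1,000 = 40%.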

2. Perform smart geo-targeting
Let's imagine you have a .com that you want to target at a Japanese market. Webmaster Tools allows you to set a geographic target for your site, where you would probably pick Japan. But doing so is not an immediate solution. You can confirm the location of your visitors using the map overlay of Analytics, down to the city level. You can also discover what types of users are accessing your site - including their browser and connection speed. If users cannot access your website due to an incompatible browser or slower connection speeds, you may need to rethink your website's design. Doing so can go a long way toward achieving the level of relevant traffic you would like.

3. Control access to sensitive content
One day, you log into Analytics and look at your "Content by Title" data. You shockingly discover that users are visiting your /privatedata pages. Have no fear! Go into Webmaster Tools and use the URL removal tool to remove those pages from Google's search results. Modifying your robots.txt file will also block Googlebot from crawling that section of your site in the future.
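
A minimal robots.txt sketch for the /privatedata example above (the path is hypothetical, and remember that robots.txt only prevents future crawling; the URL removal tool handles pages already in the results):

    User-agent: *
    Disallow: /privatedata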

For more tips and tricks on Analytics, check out the Analytics Help Center. If you have any more suggestions, feel free to comment below or in our Webmaster Help Group.

from web contents: How search results may differ based on accented characters and interface languages 2013

When a searcher enters a query that includes a word with accented characters, our algorithms consider web pages that contain versions of that word both with and without the accent. For instance, if a searcher enters [México], we'll return results for pages about both "Mexico" and "México."



Conversely, if a searcher enters a query without using accented characters, but a word in that query could be spelled with them, our algorithms consider web pages with both the accented and non-accented versions of the word. So if a searcher enters [Mexico], we'll return results for pages about both "Mexico" and "México."
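
This equivalence is nothing you need to implement yourself, but as a rough illustration of the idea (not Google's actual algorithm), accents can be folded away with Unicode normalization so that both spellings compare as equal:

    // NFD splits "é" into "e" plus a combining accent; the regex strips the accents
    function foldAccents(s) {
      return s.normalize("NFD").replace(/[\u0300-\u036f]/g, "");
    }
    foldAccents("México") === foldAccents("Mexico");  // true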



How the searcher's interface language comes into play
The searcher's interface language is taken into account during this process. For instance, the set of accented characters that are treated as equivalent to non-accented characters varies based on the searcher's interface language, as language-level rules for accenting differ.

Also, documents in the chosen interface language tend to be considered more relevant. If a searcher's interface language is English, our algorithms assume that the queries are in English and that the searcher prefers English language documents returned.

This means that the search results for the same query can vary depending on the language interface of the searcher. They can also vary depending on the location of the searcher (which is based on IP address) and if the searcher chooses to see results only from the specified language. If the searcher has personalized search enabled, that will also influence the search results.

The example below illustrates the results returned when a searcher queries [Mexico] with the interface language set to Spanish.



Note that when the interface language is set to Spanish, more results with accented characters are returned, even though the query didn't include the accented character.

How to restrict search results
To obtain search results for only a specific version of the word (with or without accented characters), you can place a + before the word. For instance, the search [+Mexico] returns only pages about "Mexico" (and not "México"). The search [+México] returns only pages about "México" and not "Mexico." Note that you may see some search results that don't appear to use the version of the word you specified in your query, but that version of the word may appear within the content of the page or in anchor text to the page, rather than in the title or description listed in the results. (You can see the top anchor text used to link to your site by choosing Statistics > Page analysis in webmaster tools.)

The example below illustrates the results returned when a searcher queries [+Mexico].


from web contents: Target visitors or search engines? 2013

Last Friday afternoon, I was able to catch the end of the Blog Business Summit in Seattle. At the session called "Blogging and SEO Strategies," John Battelle brought up a good point. He said that as a writer, he doesn't want to have to think about all of this search engine optimization stuff. Dave Taylor had just been talking about order of words in title tags and keyword density and using hyphens rather than underscores in URLs.

We agree, which is why you'll find that the main point in our webmaster guidelines is to make sites for visitors, not for search engines. Visitor-friendly design makes for search engine friendly design as well. The team at Google Webmaster Central talks with many site owners who care deeply about the details of how Google crawls and indexes sites (hyphens and underscores included), but many site owners out there are just concerned with building great sites. The good news is that the guidelines and tips about how Google crawls and indexes sites come down to wanting great content for our search results.

In the spirit of John Battelle's point, here's a recap of some quick tips for ensuring your site is friendly for visitors.

Make good use of page titles
This is true of the main heading on the page itself, but is also true of the title that appears in the browser's title bar.


Whenever possible, ensure each page has a unique title that describes the page well. For instance, if your site is for your store "Buffy's House of Sofas", a visitor may want to bookmark your home page and the order page for your red, fluffy sofa. If all of your pages have the same title: "Welcome to my site!", then a visitor will have trouble finding your site again in the bookmarks. However, if your home page has the title "Buffy's House of Sofas" and your red sofa page has the title "Buffy's red fluffy sofa", then visitors can glance at the title to see what it's about and can easily find it in the bookmarks later. And if your visitors are anything like me, they may have several browser tabs open and appreciate descriptive titles for easier navigation.

This simple tip for visitors helps search engines too. Search engines index pages based on the words contained in them, and including descriptive titles helps search engines know what the pages are about. And search engines often use a page's title in the search results. "Welcome to my site" may not entice searchers to click on your site in the results quite so much as "Buffy's House of Sofas".
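
In the page's HTML, that simply means giving each page its own title element, for example:

    <!-- home page -->
    <title>Buffy's House of Sofas</title>

    <!-- order page for the red sofa -->
    <title>Buffy's red fluffy sofa</title>
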
Write with words
Images, Flash, and other multimedia make for pretty web pages, but make sure your core messages are in text or use ALT text to provide textual descriptions of your multimedia. This is great for search engines, which are based on text: searchers enter search queries as words, after all. But it's also great for visitors, who may have images or Flash turned off in their browsers or might be using screen readers or mobile devices. You can also provide HTML versions of your multimedia-based pages (if you do that, be sure to block the multimedia versions from being indexed using a robots.txt file).
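
For instance, a sketch of an image tag with descriptive ALT text (the file path is made up):

    <img src="/images/red-fluffy-sofa.jpg" alt="Red fluffy sofa from Buffy's House of Sofas">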

Make sure the text you're talking about is in your content
Visitors may not read your web site linearly like they would a newspaper article or book. Visitors may follow links from elsewhere on the web to any of your pages. Make sure that they have context for any page they're on. On your order page, don't just write "order now!" Write something like "Order your fluffy red sofa now!" But write it for people who will be reading your site. Don't try to cram as many words in as possible, thinking search engines can index more words that way. Think of your visitors. What are they going to be searching for? Is your site full of industry jargon when they'll be searching for you with more informal words?

As I wrote in that guest post on Matt Cutts' blog when I talked about hyphens and underscores:

You know what your site’s about, so it may seem completely obvious to you when you look at your home page. But ask someone else to take a look and don’t tell them anything about the site. What do they think your site is about?

Consider this text:

“We have hundreds of workshops and classes available. You can choose the workshop that is right for you. Spend an hour or a week in our relaxing facility.”

Will this site show up for searches for [cooking classes] or [wine tasting workshops] or even [classes in Seattle]? It may not be as obvious to visitors (and search engine bots) what your page is about as you think.

Along those same lines, does your content use words that people are searching for? Does your site text say “check out our homes for sale” when people are searching for [real estate in Boston]?

Make sure your pages are accessible
I know -- this post was supposed to be about writing content, not technical details. But visitors can't read your site if they can't access it. If the network is down or your server returns errors when someone tries to access the pages of your site, it's not just search engines who will have trouble. Fortunately, webmaster tools makes it easy. We'll let you know if we've had any trouble accessing any of the pages. We tell you the specific page we couldn't access and the exact error we got. These problems aren't always easy to fix, but we try to make them easy to find.

from web contents: Introducing the Structured Data Dashboard 2013

Webmaster level: All

Structured data is becoming an increasingly important part of the web ecosystem. Google makes use of structured data in a number of ways including rich snippets which allow websites to highlight specific types of content in search results. Websites participate by marking up their content using industry-standard formats and schemas.

To provide webmasters with greater visibility into the structured data that Google knows about for their website, we’re introducing today a new feature in Webmaster Tools - the Structured Data Dashboard. The Structured Data Dashboard has three views: site, item type and page-level.

Site-level view
At the top level, the Structured Data Dashboard, which is under Optimization, aggregates this data (by root item type and vocabulary schema). Root item type means an item that is not an attribute of another item on the same page. For example, the site below has about 2 million schema.org annotations for Books (“http://schema.org/Book”).
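
As a rough sketch (not taken from the site in the example), a single Book annotation in microdata might look like this:

    <div itemscope itemtype="http://schema.org/Book">
      <span itemprop="name">A Hypothetical Book Title</span>
      by <span itemprop="author">Jane Example</span>
    </div>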


Itemtype-level view
It also provides per-page details for each item type, as seen below:


Google parses and stores a fixed number of pages for each site and item type. They are stored in decreasing order of crawl time (most recently crawled first). We also keep all their structured data markup. For certain item types we also provide specialized preview columns as seen in this example below (e.g. “Name” is specific to schema.org Product).


The default sort order puts the most recently added structured data first, making it easy to inspect.

Page-level view
Last but not least, we have a details page showing all attributes of every item type on the given page (as well as a link to the Rich Snippet testing tool for the page in question).


Webmasters can use the Structured Data Dashboard to verify that Google is picking up new markup, as well as to detect problems with existing markup, for example by monitoring changes in instance counts during a site redesign.


seo PubSubHubbub, Feeds, and the Feed API 2013

By Peter Dickman, Engineering Manager

Google has supported the PubSubHubbub (PuSH) protocol since its introduction in 2009. Earlier this year we completely rewrote our PuSH hub implementation, both to make it more resilient and to considerably enhance its capacity and throughput. Our improved PuSH hub means we can expose feeds more efficiently, coherently and consistently, from a robust secure access point. Using the PuSH protocol, servers can subscribe to an almost arbitrarily large number of feeds and receive updates as they occur.
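
As a rough sketch of the protocol (the hub endpoint shown is Google's public hub of that era, and the topic and callback URLs are placeholders), a subscriber registers interest in a feed by POSTing a form-encoded request to the hub; the hub then verifies the subscription by calling the subscriber's callback with a challenge to echo back:

    POST / HTTP/1.1
    Host: pubsubhubbub.appspot.com
    Content-Type: application/x-www-form-urlencoded

    hub.mode=subscribe&hub.topic=http%3A%2F%2Fexample.com%2Ffeed.xml&hub.callback=http%3A%2F%2Fsubscriber.example.net%2Fpush-callback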

In contrast, the Feed API allows you to download any specific public Atom or RSS feed using only JavaScript, enabling easy mashups of feeds with your own content and other APIs. We are planning some improvements to the Feed API, as part of our ongoing infrastructure work.
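
A minimal sketch of how the Feed API was typically used at the time (treat the loader details as an assumption; the feed URL is a placeholder):

    <script src="https://www.google.com/jsapi"></script>
    <script>
      google.load("feeds", "1");
      google.setOnLoadCallback(function() {
        var feed = new google.feeds.Feed("http://example.com/feed.xml");
        feed.load(function(result) {
          if (result.error) return;
          // log the title of each entry in the feed
          for (var i = 0; i < result.feed.entries.length; i++) {
            console.log(result.feed.entries[i].title);
          }
        });
      });
    </script>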

We encourage you to consider PuSH as a means of accessing feeds in bulk. To support that, we’re clarifying our practices around bots interacting with Google’s PuSH system: we encourage providers of feed systems and related tools to connect their automated systems for feed acquisition to our PuSH hub (or other hubs in the PuSH ecosystem). The PuSH hub is designed to be accessed by bots and it’s tuned for large-scale reading from the PuSH endpoints. We have safeguards against abuse, but legitimate users of the access points should see generous limits, with few restrictions, speed bumps or barriers. Similarly, we encourage publishers to submit their feeds to a public PuSH hub, if they don’t want to implement their own.

Google directly hosts many feed producers (e.g. Blogger is one of the largest feed sources on the web) and is a feed consumer too (e.g. many webmasters use feeds to tell our Search system about changes on their sites). Our PuSH hub offers easy access to hundreds of millions of Google-hosted feeds, as well as hundreds of millions of other feeds available via the PuSH ecosystem and through active polling.

The announcement of v0.4 of the PuSH specification advances our goal of strengthening infrastructure support for feed handling. We’ve worked with Superfeedr and others on the new specification and look forward to it being widely adopted.


Peter Dickman spends his days herding cats for the Search Infrastructure group in Zurich. He divides his spare time between helping government bodies understand cloud computing and systematically evaluating the products of Switzerland’s chocolatiers.

Posted by Scott Knaster, Editor

from web contents: Changes in the Chrome user agent 2013

Webmaster Level: Intermediate to Advanced

The Chrome team is exploring a few changes to Chrome’s UA string. These changes are designed to provide additional details in the user-agent, remove redundancy, and increase compatibility with Internet Explorer. They’re also happening in conjunction with similar changes in Firefox 4.

We intend to ship Chrome 11 with these changes, assuming they don't cause major web compatibility problems. To test them out and ensure your website remains compatible with Chrome, we recommend trying the Chrome Dev and Beta channel builds. If you have any questions, please check out the blog post on the Chromium blog or drop us a line at our help forum.
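
One quick, low-tech way to check (a sketch, not an official test procedure): open the JavaScript console in a Chrome Dev or Beta build, look at the UA string your pages will actually receive, and confirm that any UA-sniffing code on your site still matches it:

    console.log(navigator.userAgent);
    var m = /Chrome\/(\d+)/.exec(navigator.userAgent);
    console.log(m ? "Chrome major version: " + m[1] : "Chrome token not found");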


from web contents: Supplemental goes mainstream 2013


When Google originally introduced Supplemental Results in 2003, our main web index had billions of web pages. The supplemental index made it possible to index even more web pages and, just like our main web index, make this content available when generating relevant search results for user queries. This was especially useful for queries that did not return many results from the main web index, and for these the supplemental index allowed us to query even more web pages. Because we're able to place fewer constraints on the sites we crawl for the supplemental index, web pages that are not in the main web index can be included in the supplemental index. These are often pages with lower PageRank or those with more complex URLs. Thus the supplemental index (read more - and here's Matt's talk about it on video) serves a very important purpose: to index as much of the relevant content that we crawl as possible.

The changes we make must focus on improving the search experience for our users. Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing. Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.

The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results." Of course, you will continue to benefit from Google's supplemental index being deeper and fresher.

from web contents: Reorganizing internal vs. external backlinks 2013

Webmaster level: All

Today we’re making a change to the way we categorize link data in Webmaster Tools. As you know, Webmaster Tools lists links pointing to your site in two separate categories: links coming from other sites, and links from within your site. Today’s update won’t change your total number of links, but will hopefully present your backlinks in a way that more closely aligns with your idea of which links are actually from your site vs. from other sites.

You can manage many different types of sites in Webmaster Tools: a plain domain name (example.com), a subdomain (www.example.com or cats.example.com), or a domain with a subfolder path (www.example.com/cats/ or www.example.com/users/catlover/). Previously, only links that started with your site’s exact URL would be categorized as internal links: so if you entered www.example.com/users/catlover/ as your site, links from www.example.com/users/catlover/profile.html would be categorized as internal, but links from www.example.com/users/ or www.example.com would be categorized as external links. This also meant that if you entered www.example.com as your site, links from example.com would be considered external because they don’t start with the same URL as your site (they don’t contain www).

Most people think of example.com and www.example.com as the same site these days, so we’re changing it such that now, if you add either example.com or www.example.com as a site, links from both the www and non-www versions of the domain will be categorized as internal links. We’ve also extended this idea to include other subdomains, since many people who own a domain also own its subdomains—so links from cats.example.com or pets.example.com will also be categorized as internal links for www.example.com.
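
As a rough illustration of the rule described above (not Google's actual implementation; it only covers the root-domain case, and it uses a naive "last two labels" notion of root domain rather than the public suffix list):

    function rootDomain(url) {
      var host = new URL(url).hostname.toLowerCase();
      return host.split(".").slice(-2).join(".");
    }
    function isInternal(linkUrl, siteUrl) {
      // internal = same root domain, so www/non-www and other subdomains all count
      return rootDomain(linkUrl) === rootDomain(siteUrl);
    }
    isInternal("http://cats.example.com/a", "http://www.example.com/");  // true
    isInternal("http://www.example.org/x", "http://www.example.com/");   // false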

Links for www.google.com

Previously categorized as...
  External links: www.example.com/, www.example.org/stuff.html, scholar.google.com/, sketchup.google.com/, google.com/
  Internal links: www.google.com/, www.google.com/stuff.html, www.google.com/support/webmasters/

Now categorized as...
  External links: www.example.com/, www.example.org/stuff.html
  Internal links: scholar.google.com/, sketchup.google.com/, google.com/, www.google.com/, www.google.com/stuff.html, www.google.com/support/webmasters/

If you own a site that’s on a subdomain (such as www.matrixar.com) or in a subfolder (www.google.com/support/webmasters/) and don’t own the root domain, you’ll still only see links from URLs starting with that subdomain or subfolder in your internal links, and all others will be categorized as external links. We’ve made a few backend changes so that these numbers should be even more accurate for you.

Note that, if you own a root domain like example.com or www.example.com, your number of external links may appear to go down with this change; this is because, as described above, some of the URLs we were previously classifying as external links will have moved into the internal links report. Your total number of links (internal + external) should not be affected by this change.

As always, drop us a comment or join our Webmaster Help Forum if you have questions!


seo Enabling Public Service: All for Good and Google App Engine 2013


You may have seen the post on the Google Blog about All for Good, the new site that makes it easy to find and share volunteer activities within the United States. The site was built collaboratively by Google and several partners. We're especially proud that it was built using 100% open source Python code. All for Good's first release includes both gadgets and a free API, making it even easier for casual developers to build applications and embed All for Good listings in their own apps and sites.

If you're interested in seeing your software talents dedicated to community service, check out the API documentation, our Getting Started Guide and the complete source code for the core engine. We're especially looking forward to seeing what applications the developer community will create for mobile platforms and for Facebook. Happy Hacking!

by Guido van Rossum, Software Engineering Team

from web contents: Happy Halloween to our spooktacular webmasters! 2013


With apologies to Vic Mizzy, we've written a short verse to the tune of the "Addams Family" theme (please use your imagination):

We may be hobbyists or just geeky,
Building websites and acting cheeky,
Javascript redirects we won't make sneaky,
Our webmaster fam-i-ly!

Happy Halloween everyone! Feel free to join the discussion and share your Halloween stories and costumes.


Magnum P.I., Punk Rocker, Rubik's Cube, Mr. T., and Rainbow Brite
a.k.a. Several members of our Webmaster Tools team: Dennis Geels, Jonathan Simon, Sean Harding, Nish Thakkar, and Amanda Camp


Panda and Lolcat
Or just Evan Tang and Matt Cutts?


7 Indexing Engineers and 1 Burrito


Cheese Wysz, Internet Repairman, Community Chest, Internet Pirate (don't tell the RIAA)
Helpful members of the Webmaster Help Group: Wysz, MattD, Nathan Johns (nathanj) , and Bergy


Count++
Webspam Engineer Shashi Thakur (in the same outfit he wore to Searchnomics)


Hawaiian Surfer Dude and Firefox
Members of Webmaster Central's communications team: Reid Yokoyama and Mariya Moeva


Napoleon Dynamite and Raiderfan
Shyam Jayaraman (speaking at SES Chicago, hopefully doing the dance) and me

from web contents: DNS Verification FTW 2013

Webmaster Level: Advanced

A few weeks ago, we introduced a new way of verifying site ownership, making it easy to share verified ownership of a site with another person. This week, we bring you another new way to verify. Verification by DNS record allows you to become a verified owner of an entire domain (and all of the sites within that domain) at once. It also provides an alternative way to verify for folks who struggle with the existing HTML file or meta tag methods.

I like to explain things by walking through an example, so let's try using the new verification method right now. For the sake of this example, we'll say I own the domain example.com. I have several websites under example.com, including http://www.example.com/, http://blog.example.com/ and http://beta.example.com/. I could individually verify ownership of each of those sites using the meta tag or HTML file method. But that means I'd need to go through the verification process three times, and if I wanted to add http://customers.example.com/, I'd need to do it a fourth time. DNS record verification gives me a better way!

First I'll add example.com to my account, either in Webmaster Tools or directly on the Verification Home page.


On the verification page, I select the "Add a DNS record" verification method, and follow the instructions to add the specified TXT record to my domain's DNS configuration.
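
In a BIND-style zone file, the record ends up looking something like this (the token value here is made up; use the exact string shown on the verification page):

    ; TXT record used for Google site verification
    example.com.   3600   IN   TXT   "google-site-verification=abc123exampletoken"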



When I click "Verify," Google will check for the TXT record, and if it's present, I'll be a verified owner of example.com and any associated websites and subdomains. Now I can use any of those sites in Webmaster Tools and other verification-enabled Google products without having to verify ownership of them individually.

If you try DNS record verification and it doesn't work right away, don't despair!


Sometimes DNS records take a while to make their way across the Internet, so Google may not see them immediately. Make sure you've added the record exactly as it’s shown on the verification page. We'll periodically check, and when we find the record we'll make you a verified owner without any further action from you.
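
If you have the dig utility handy, one way to check whether the record is visible yet (the output shown is hypothetical):

    dig TXT example.com +short
    "google-site-verification=abc123exampletoken"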

DNS record verification isn't for everyone—if you don't understand DNS configuration, we recommend you continue to use the HTML file and meta tag methods. But for advanced users, this is a powerful new option for verifying ownership of your sites.

As always, please visit the Webmaster Help Forum if you have any questions.


from web contents: Better geographic choices for webmasters 2013

Written by Amanda Camp, Webmaster Tools and Trystan Upstill, International Search Quality Team

Starting today Google Webmaster Tools helps you better control the country association of your content on a per-domain, per-subdomain, or per-directory level. The information you give us will help us determine how your site appears in our country-specific search results, and also improves our search results for geographic queries.

We currently only allow you to associate your site with a single country and location. If your site is relevant to an even more specific area, such as a particular state or region, feel free to tell us that. Or let us know if your site isn't relevant to any particular geographic location at all. If no information is entered in Webmaster Tools, we'll continue to make geographic associations largely based on the top-level domain (e.g. .co.uk or .ca) and the IP of the webserver from which the content was served.

For example, if we wanted to associate www.google.com with Hungary:


But you don't want www.google.com/webmasters/tools associated with any country...


This feature isn't available for sites with a country-code top-level domain, as we'll always associate such a site with the country of its domain. (For example, google.ru will always be the version of Google associated with Russia.)


Note that in the same way that Google may show your business address if you register your brick-and-mortar business with the Google Local Business Center, we may show the information that you give us publicly.

This feature was largely initiated by your feedback, so thanks for the great suggestion. Google is always committed to helping more sites and users get better and more relevant results. This is a new step as we continue to think about how to improve searches around the world.

We encourage you to tell us what you think in the Webmaster Tools section of our discussion group.

seo Fridaygram 2013

By Scott Knaster, Google Code Blog Editor

The idea of Google Web Fonts is one of those things that makes you say “of course!” once you hear about it. Google Web Fonts are stored remotely and loaded via HTTP for use on your web pages, so you don’t have to wonder about which fonts are installed on users’ machines. Using a web font is easy: add a <link rel="stylesheet"> tag to specify the font you want, then add styles to your CSS that use the font.
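
A minimal sketch, using Open Sans as an example family (the family name and selector are assumptions, not part of the announcement):

    <link rel="stylesheet" href="http://fonts.googleapis.com/css?family=Open+Sans">
    <style>
      body { font-family: 'Open Sans', sans-serif; }
    </style>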

Earlier this week, the Web Fonts team launched an updated site with a three-step process for browsing and choosing fonts. It’s pretty simple:
  • In the Choose step, check out the available fonts and choose from among them by looking at whatever text you want, in any size. You can search for fonts by name, filter by category or thickness, or look for fonts that support specific scripts.
  • After you pick one or more fonts, use the Review step. This step includes a Test Drive feature to see your chosen fonts in a sample layout.
  • Finally, you’ll see a nice speedometer that shows you an estimate of loading time for your selected fonts, and you’ll also get the necessary code for adding the fonts to your pages.
From beautiful fonts to beautiful art: this week we announced that Google Goggles now knows all about the permanent collection of the J. Paul Getty Museum in Los Angeles. When you use Goggles on your phone to view a painting, you’ll get details and audio commentary about the work you’re looking at.

Finally, here’s a tale of danger in space: the crew of the International Space Station temporarily evacuated into docked capsules this week when a piece of space junk got a little too close to the station. That’s a story you don’t hear every day; in fact, the last time it happened was in 2009. Stay safe up there!

Fridaygram posts are lighter than our usual fare. They're designed for your Friday afternoon and weekend enjoyment. Each Fridaygram item must pass only one test: it has to be interesting to us nerds.


from web contents: Introducing Page Speed Online, with mobile support 2013

Webmaster level: intermediate

At Google, we’re striving to make the whole web fast. As part of that effort, we’re launching a new web-based tool in Google Labs, Page Speed Online, which analyzes the performance of web pages and gives specific suggestions for making them faster. Page Speed Online is available from any browser, at any time. This allows website owners to get immediate access to Page Speed performance suggestions so they can make their pages faster.

In addition, we’ve added a new feature: the ability to get Page Speed suggestions customized for the mobile version of a page, specifically smartphones. Due to the relatively limited CPU capabilities of mobile devices, the high round-trip times of mobile networks, and rapid growth of mobile usage, understanding and optimizing for mobile performance is even more critical than for the desktop, so Page Speed Online now allows you to easily analyze and optimize your site for mobile performance. The mobile recommendations are tuned for the unique characteristics of mobile devices, and contain several best practices that go beyond the recommendations for desktop browsers, in order to create a faster mobile experience. New mobile-targeted best practices include eliminating uncacheable landing page redirects and reducing the amount of JavaScript parsed during the page load, two common issues that slow down mobile pages today.


Page Speed Online is powered by the same Page Speed SDK that powers the Chrome and Firefox extensions and webpagetest.org.

Please give Page Speed Online a try. We’re eager to hear your feedback on our mailing list and how you’re using it to optimize your site.


seo Penguin 4.0 Updated May 2013 2013


We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. 
The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.

seo How to Get the Best Scholarship 2013

After your first degree, there is often a need to acquire advanced knowledge and new skills, either to remain relevant in your chosen career or to try out a new career path.

Similarly, the desire to acquire university or technical qualifications and skills is growing in many countries, in line with the global vision of education for all.


I explained what should be expected of you after your university career, though many people didn't get these experiences while in school for one reason or another.

Most people prefer to take up their further studies or professional development programs in another country other than their home country.

Consequently, many people lack the money to fund their education, especially when they want to study in a country whose currency is stronger than their own. This is evident in countries such as the United Kingdom, the USA, Canada and the Netherlands, where international students pay higher fees for tuition, accommodation and maintenance.


Most prospective students have resorted to seeking funding from organizations, universities and individuals in the form of scholarships.

One great challenge in securing a scholarship is that too many applications chase a small pool of funds.

This article reveals some of the rules to adopt when applying for a scholarship, to enhance the chances of your application being selected.

Gather your admission and scholarship materials beforehand.
 
Most scholarship applicants do not know the application documents required by the school they intend to apply to. 

Different schools require different application materials depending on the country, course of study or type and mode of programme. 

Generally, the following documents may be required:
  • A copy of your degree certificate (usually a certified or notarised copy for scholarship applications).
  • A copy of your transcript (usually a certified copy for scholarship applications).
  • A letter of motivation.
  • Two letters of reference, one professional and one academic.
  • A fully completed scholarship application form.
  • A letter of financial support or letter of sponsorship.
  • An English language certificate or other evidence of proficiency in English, where English is the language of instruction for the course of study.

This requirement varies in countries where other international languages are used instead of English, such as German, French, Japanese or Spanish. 

However, most universities will waive the English language requirement for applicants from countries where English is one of the official languages.

Start early to apply for admission into your chosen schools and programme(s).
 
When I was scouting for a scholarship for my graduate programme, one mistake I made was to start my admission process late, even though I thought I had started early. 

I later discovered that the admission process takes at least three months, depending on a range of factors: delays in the postal delivery of the application documents, getting reference letters from your academic and professional referees, or getting your transcript(s) from your school. 

Therefore, if you want to start your programme in September with a scholarship, it is good to start the application process by November of the preceding year. 


This will put you in a position to have an offer of admission from the school, whether conditional or unconditional, before the scholarship application deadline. 

The reason is that most universities and international funding organizations have their scholarship deadlines fixed between March and May annually. For example, the deadline for scholarship applications at the University of Westminster is May 30 for the September intake and November 30 for the January intake. 

Take note of the scholarship application deadline and work to beat it.
 
Remember that any scholarship application that misses the deadline will not be processed; you have no contractual agreement with the scholarship panel, and its decision is final. 

Most funding bodies do not permit sending the scholarship application documents by courier service, fax, or as an email attachment. This may be a way of reducing the number of applications received, given the limited funding available. 

Most scholarship applications are done online, but in some cases you are expected to send the supporting documents by mail, which can take several days to arrive or may be delayed in transit. 

Construct a good letter of Support. 

The letter of support should explain to the scholarship panel why you need the scholarship, what kind of support you need (tuition, maintenance, or accommodation), the effort you have made to secure funding from other scholarship bodies or from individual or company sponsorship, and why you are the best candidate for the scholarship.

The letter of support should not be written in a hurry. Be clear, precise, sincere, persuasive and convincing in your writing. Remember that in most cases you are not interviewed by the panel. 

Your write-up represents you and is one of the crucial factors considered before a scholarship is awarded to you. 

Remember that scholarship opportunities are few and highly competitive. 

About two-thirds of prospective students need financial support in one form or another, which makes it very competitive to get funding for your programme. 

Most international funding bodies and universities receive a large number of scholarship applications annually but have few openings.

These universities reduce the number of applications by considering only those that meet the criteria stipulated in the application form. 

Therefore, you should read the application criteria very carefully, considering the basic qualification required for the programme, the number of years of work experience required, the language of instruction, and the value and duration of the award. 

Some scholarships are partial while others are full. A partial scholarship may cover tuition (a full tuition waiver), accommodation only, maintenance only, or a combination of these, while a full scholarship covers tuition, accommodation and maintenance. 
Do not apply for a scholarship without an offer of admission. 

Almost all low-tuition colleges and universities require that you first secure admission into the programme of your choice before applying for a scholarship. This is one of the reasons why you need to start early to apply for admission. 

Most applicants spend time chasing funding opportunities without a corresponding offer of admission, only to regret it later.

The scholarship awarding body wants to be sure that you will take up the scholarship if it is awarded to you, and will therefore want to see evidence of acceptance into the school for your proposed programme of study. 

I wish you the best in your career ambitions.