News and Tutorials from Your Coder | SEO | Website Creation | Software Creation

seo How to create robots.txt and .htaccess files 2013

Seo Master presents to you:

Uses of the .htaccess file

The .htaccess file can be used on Apache servers running Linux or Unix to increase your web site's security and to customize the way your web site behaves. The main uses of the .htaccess file are to:
  • redirect visitors to custom error pages
  • stop directory listings
  • ban robots that gather email addresses for spam
  • ban visitors from certain countries and IP addresses
  • block visitors from bad referring (warez) sites
  • protect your web site from image hotlinking and bandwidth theft
  • redirect visitors from a requested page to a new web page
  • password-protect directories
Use the information in this article as a starting point to optimize and protect your web site.
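A minimal sketch of what such a file might contain; all paths, the IP address, and the example.com domain are placeholders, not taken from the article:

```apache
# Redirect visitors to a custom error page
ErrorDocument 404 /errors/not-found.html

# Stop directory listings
Options -Indexes

# Ban a specific visitor IP address
Order Allow,Deny
Allow from all
Deny from 192.0.2.10

# Protect images from hotlinking and bandwidth theft
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]
```

Directives like these only take effect when the server allows overrides for the directory (e.g., `AllowOverride All`), so behavior can vary by host.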
More details in htaccess


How to create a robots.txt file

The robots.txt file is a simple ASCII text file used to indicate which of a web site's files and directories should not be indexed. Many webmasters choose not to revise their robots.txt file because they are uncertain how changes could impact their rankings. However, a poorly written robots.txt file can cause your entire web site to be indexed, exposing information such as passwords, email addresses, hidden links, membership areas, and confidential files. A poorly written robots.txt file could also cause portions of your web site to be ignored by search engines.
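The effect of a robots.txt file can be checked locally with Python's standard-library parser; the rules and URLs below are illustrative only, not from the article:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: keep membership areas and private files out of the index.
rules = """\
User-agent: *
Disallow: /members/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Ask whether a crawler obeying these rules may fetch each URL.
print(rp.can_fetch("*", "http://example.com/members/list.html"))     # False: disallowed
print(rp.can_fetch("*", "http://example.com/articles/rankings.html"))  # True: allowed
```

Note that these directives are advisory: compliant crawlers honor them, but they do not actually secure the listed paths.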

More details in robots.txt

2013, By: Seo Master

seo Differences between robots.txt and .htaccess 2013

The robots.txt file consists of directives to search engine spiders (robots) as to what files and folders you want or do not want to be indexed. However, this will not necessarily prevent spiders from following links into those folders and there are some spiders that do not respect the robots.txt file (all of the major search engines do but there are still quite a few unscrupulous bots to worry about). Additionally, the use of robots.txt directives does not prevent human visitors from accessing those folders and directories if they know they are there (or if they're just hacking their way in via guesses, e.g., looking for index.html or index.php files).

Depending on how you set it up, the .htaccess file, in contrast, actually blocks access to certain files or folders. This applies to both human visitors and bots.
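Assuming the folder to protect is called /private/ (an illustrative name), the actual block described here can be a short .htaccess file placed inside that folder:

```apache
# /private/.htaccess
# Unlike a robots.txt Disallow line, this denies the folder to bots and humans alike.
Order Allow,Deny
Deny from all
```

A robots.txt `Disallow: /private/` line would merely ask compliant crawlers to stay out; this returns a 403 Forbidden to every request.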

seo Get a quality product for your teeth 2013

If you really care about your health, you need to be vigilant in choosing products for your body. One part of your body is your teeth. They are very sensitive, so you need to care for them properly. Never think that taking care of your teeth is too hard, because you definitely do not want to feel a toothache. If you want your teeth to stay healthy, consider teeth whitening. You may be surprised to see drastic changes after you use this quality product for your teeth: your teeth will look white and shining.

So what are you waiting for? Now you have a good recipe to whiten your teeth, and you will definitely be satisfied with the results. Furthermore, this product is sold at a low price, so you do not need a lot of money to buy it. In addition, Polanight is also available to you. Imagine using only quality products and enjoying healthy teeth at any time. You only need to read the instructions carefully and perform the maintenance accordingly, and you will get healthy teeth. You do not have to pay a premium for this, because you can do it yourself without paying a dentist.

seo Top 20 Free Image Editing Software List With Download Links (2012) 2013


This list of the top 20 best image editing software programs is newly updated and is ranked by total downloads and by reviews from editors and everyday users.



All 20 of the photo editing programs listed here are 100% free for all users! Paid and premium photo editing programs have not been included in this list.



1. PhotoScape – Free download PhotoScape



2. Irfan View - Download IrfanView Free



3. Paint.NET - Free Download Paint.NET



4. Photo Pos Pro - Download Photo Pos Pro Free



5. GIMP - Download Free GIMP Photo Editing Program


6. Fast Stone Image Viewer - Download Fast Stone Image Viewer


7. Photobie - Download Photobie Digital Image Editor


8. Pixia - Download Pixia Digital Photo Editor


9. InkScape - Free Download Inkscape Photo editor


10. Xn View - Free Download XnView


11. Photo Plus 6 - Free Download Photo Plus 6


12. Picasa 3 - Free Download Picasa 3


13. ColorPic 4.1 - Free Download ColorPic 4.1


14. PaintStar 2.70 - Free Download PaintStar 2.70


15. 5DFly 3.62.3 - Free Download 5DFly 3.62.3


16. Pinta 1.0 - Free Download Pinta 1.0


17. DigiKam 1.2.0 - Free Download DigiKam 1.2.0


18. Visual LightBox v4.8 - Free Download Visual LightBox v4.8


19. Stoik Imagic 5.06 - Free Download Stoik Imagic 5.06


20. KolourPaint 4.0 – Free Download KolourPaint 4.0



Please leave a comment about any recently launched free picture editing software, and I will add it to the list above.



from web contents: Microdata support for Rich Snippets 2013

Salam everyone, this is a topic from the Google Webmaster Central blog: Webmaster Level: All

HTML5 is the fifth major revision of HTML, the core language of the World Wide Web. The HTML5 specification includes a description of microdata, a new markup standard for specifying structured information within web pages.

Today, we’re happy to announce support for microdata for use in rich snippets in addition to our existing support for microformats and RDFa. By using microdata markup in your web pages, you can specify reviews, people profiles, or events information on your web pages that Google may use to improve the presentation of your pages in Google search results.

Here is a simple HTML block showing a section of a review of “L’Amourita Pizza”:
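The post's original code sample did not survive extraction; a plain-HTML review section of the kind being described might look like the following (the reviewer name, date, and rating are illustrative):

```html
<div>
  <strong>L’Amourita Pizza</strong>
  Reviewed by Ulysses Grant on Jan 6.
  Rating: 4.5 out of 5.
  Delicious, tasty pizza on Eastlake!
</div>
```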


Here is the same HTML with microdata added to specify the restaurant being reviewed, the author and date of the review, and the rating:
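Again the original sample is missing; based on the attributes and review vocabulary the post describes, the marked-up version might look like this (property names follow data-vocabulary.org's Review type; the specific values remain illustrative):

```html
<div itemscope itemtype="http://data-vocabulary.org/Review">
  <strong itemprop="itemreviewed">L’Amourita Pizza</strong>
  Reviewed by <span itemprop="reviewer">Ulysses Grant</span> on
  <span itemprop="dtreviewed">Jan 6</span>.
  Rating: <span itemprop="rating">4.5</span> out of 5.
  <span itemprop="summary">Delicious, tasty pizza on Eastlake!</span>
</div>
```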


Microdata has the nice property of balancing richness with simplicity. As you can see, it’s easy to add markup to your pages using a few HTML attributes like itemscope (to define a new item), itemtype (to specify the type of item being described), and itemprop (to specify a property of that item). Once you’ve added markup to a page, you can test it using the rich snippets testing tool to make sure that Google can parse the data on your page.

As with microformats and RDFa, the vocabulary that we support -- including which item types and item properties are understood by Google -- is specified in our rich snippets documentation as well as on data-vocabulary.org. Marking up your content does not guarantee that rich snippets will show for your site; Google will expand the use of microdata markup gradually to ensure a great user experience.

To get started, here are some helpful links:

this is a topic published in 2013... to get contents for your blog or your forum, just contact me at: devnasser@gmail.com

from web contents: What a feeling! Even better indexing of SWF content 2013

Webmaster Level: All

We often get questions from webmasters about how we index content designed for Flash Player, so we wanted to take a moment to update you on some of our latest progress.

About two years ago we announced that through a collaboration with Adobe we had significantly improved Google’s capability to index Flash technology based content. Last year we followed up with an announcement that we had added external resource loading to our SWF indexing capabilities. This work has allowed us to index all kinds of textual content in SWF files, from Flash buttons and menus to self-contained Flash technology based websites. Currently almost any text a user can see as they interact with a SWF file on your site can be indexed by Googlebot and used to generate a snippet or match query terms in Google searches. Additionally, Googlebot can also discover URLs in SWF files and follow those links, so if your SWF content contains links to pages inside your website, Google may be able to crawl and index those pages as well.

Last month we expanded our SWF indexing capabilities thanks to our continued collaboration with Adobe and a new library that is more robust and compatible with features supported by Flash Player 10.1. Additionally, thanks to improvements in the way we handle JavaScript, we are also now significantly better at recognizing and indexing sites that use JavaScript to embed SWF content. Finally, we have made improvements in our video indexing technology, resulting in better detection of when a page has a video and better extraction of metadata such as alternate thumbnails from Flash technology based videos. All in all, our SWF indexing technology now allows us to see content from SWF files on hundreds of millions of pages across the web.

While we’ve made great progress indexing SWF content over the past few years, we’re not done yet. We are continuing to work on our ability to index deep linking (content within a Flash technology based application that is linked to from the same application) as well as further improving indexing of SWF files executed through JavaScript. You can help us improve these capabilities by creating unique links for each page that is linked from within a single Flash object and by submitting a Sitemap through Google Webmaster Tools.

We’re excited about the progress we’ve made so far and we look forward to keeping you updated about further progress.


seo Google Technology User Groups 2013

My favorite part about Google I/O is the dozens of interesting conversations with developers -- getting a first-hand look at the different things that they are doing with our technologies. That's the spirit of the Google Technology User Groups -- regular meetups where local developers can get together to network and discuss, demo, and hack on Google's many developer offerings.

From lightning talks in Mountain View, to App Engine hackathons in Tokyo, to lectures in Berlin, the GTUGs are a great place to meet fellow developers and learn (or teach) something new.

At Google I/O, there were many folks eager to bring the spirit of the conference back to their hometowns by starting up GTUGs of their own. Since the conference ended, our list of current GTUGs has grown to include this 'baby boomer' generation of chapters. The following are all new groups looking for members and starting to set up their first events.

If there's one near you, check it out! Let the organizers know you're interested; suggest topics for discussion and even offer to do a talk about your own experiences.

Europe

Paris GTUG - http://groups.google.com/group/paris-gtug
Hamburg GTUG - http://www.hamburg-gtug.org
GTUG Munich - http://gtug-muc.org
Istanbul GTUG - http://www.istanbul-gtug.org/
Polish GTUG - http://www.gtug.pl

North America

Tri-Valley California GTUG - http://groups.google.com/group/tv-gtug
Berkeley GTUG - http://www.meetup.com/Berkeley-GTUG/
San Diego GTUG - http://www.meetup.com/sd-gtug/
NYC GTUG - http://sites.google.com/site/nycgtug
New Jersey GTUG - http://nj-gtug.org/
Philly/Delaware GTUG - http://sites.google.com/site/phillygtug/
Boston GTUG - http://groups.google.com/group/boston-gtug
Denver GTUG - http://groups.google.com/group/denver-gtug
Twin Cities GTUG - tc-gtug.org
Austin GTUG - http://sites.google.com/site/austingtug/
Michigan GTUG - http://groups.google.com/group/mi-gtug
Utah GTUG - http://utahgtug.blogspot.com/
Laguna GTUG - www.laguna-gtug.org
Quebec GTUG - http://groups.google.com/group/gtug-quebec/?pli=1

South America
Chile GTUG - http://groups.google.com/group/gtug-cl
Argentina GTUG - http://groups.google.com/group/gtug-ar

Asia
Kuala Lumpur GTUG - http://sites.google.com/site/gtugkl/
Hyderabad GTUG - http://sites.google.com/site/hydgtug/

Also a big shout-out to our existing chapters:

Silicon Valley GTUG - http://www.meetup.com/sv-gtug (watch the organizers, Kevin and Van, talk about GTUGs at Google I/O)
Pune GTUG - http://pune-gtug.blogspot.com/
Chico GTUG http://www.chico-gtug.org
Berlin GTUG - http://www.berlin-gtug.org
Tokyo GTUG - http://tokyo-gtug.org/



Don't see a chapter near you? Start one! Join our GTUG managers mailing list. Other info at gtugs.org.


from web contents: Google Videos best practices 2013

Webmaster Level: All

We'd like to highlight three best practices that address some of the most common problems found when crawling and indexing video content. These best practices include ensuring your video URLs are crawlable, stating what countries your videos may be played in, and that if your videos are removed, you clearly indicate this state to search engines.

  • Best Practice 1: Verify your video URLs are crawlable: check your robots.txt
    • Sometimes publishers unknowingly include video URLs in their Sitemap that are robots.txt disallowed. Please make sure your robots.txt file isn't blocking any of the URLs specified in your Sitemap. This includes URLs for the:
      • Playpage
      • Content and player
      • Thumbnail
      More information about robots.txt.

  • Best Practice 2: Tell us what countries the video may be played in
    • Is your video only available in some locales? The optional attribute “restriction” has recently been added (documentation at http://www.google.com/support/webmasters/bin/answer.py?answer=80472), which you can use to tell us whether the video can only be played in certain territories. Using this tag, you have the option of either including a list of all countries where it can be played, or just telling us the countries where it can't be played. If your videos can be played everywhere, then you don't need to include this.

  • Best Practice 3: Indicate clearly when videos are removed -- protect the user experience
    • Sometimes publishers take videos down but don't signal to search engines that they've done so. This can result in the search engine's index not accurately reflecting content of the web. Then when users click on a search result, they're taken to a page either indicating that the video doesn't exist, or to a different video. Users find this experience dissatisfying. Although we have mechanisms to detect when search results are no longer available, we strongly encourage following community standards.

      To signal that a video has been removed,
      1. Return a 404 (Not Found) HTTP response code; you can still return a helpful page to display to your users. Check out these guidelines for creating useful 404 pages.
      2. Indicate expiration dates for each video listed in a Video Sitemap (use the <video:expiration_date> element) or mRSS feed (<dcterms:valid> tag) submitted to Google.
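A Video Sitemap entry carrying the expiration element might be sketched like this (the URLs, title, and date are placeholders; a real entry also needs the other required video fields such as the thumbnail and description):

```xml
<url>
  <loc>http://example.com/videos/watch?v=123</loc>
  <video:video>
    <video:title>Sample clip</video:title>
    <video:expiration_date>2013-11-05T19:20:30+08:00</video:expiration_date>
  </video:video>
</url>
```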
For more information on Google Videos please visit our Help Center, and to post questions and search answers check out our Help Forum.


from web contents: Musings on Down Under 2013


Earlier this year, a bunch of Googlers (Maile, Peeyush, Dan, Adam and I) bunged ourselves across the equator and headed to Sydney, so we could show our users and webmasters that just because you're "down under" doesn't mean you're under our radar. We had a great time getting to know folks at our Sydney office, and an even greater time meeting and chatting with all the people attending Search Summit and Search Engine Room. What makes those 12-hour flights worthwhile is getting the chance to inform and be informed about the issues important to the webmaster community.

One of the questions we heard quite frequently: Should we as webmasters/SEOs/SEMs/users be worried about personalized search?

Our answer: a resounding NO! Personalized search takes each user's search behavior, and subtly tunes the search results to better match their interests over time. For a user, this means that even if you're a lone entomologist in a sea of sports fans, you'll always get the results most relevant to you for the query "cricket". For the webmaster, it allows niche markets that collide on the same search terms to disambiguate themselves based on individual user preferences, and this really presents a tremendous opportunity for visibility. Also, to put things in perspective, search engines have been moving towards some degree of personalization for years; for example, providing country/language specific results is already a form of personalization, just at a coarser granularity. Making it more fine-grained is the logical next step, and helps level the playing field for smaller niche websites which now have a chance to rank well for users that want their content the most.

Another question that popped up a lot: I'm moving my site from domain X to Y. How do I make sure all my hard-earned reputation carries over?

Here are the important bits to think about:
  • For each page on domain X, have it 301-redirect to the corresponding page on Y. (How? Typically through .htaccess, but check with your hosting provider).
  • You might want to stagger the move, and redirect sub-sections of your site over time. This gives you the chance to keep an eye on the effects, and also gives search engines' crawl/indexing pipelines time to cover the space of redirected URLs.
  • http://www.google.com/webmasters is your friend. Keep an eye on it during the transition to make sure that the redirects are having the effect you want.
  • Give it time. How quickly the transition is reflected in the results depends on how quickly we recrawl your site and see those redirects, which depends on a lot of factors including the current reputation of your site's pages.
  • Don't forget to update your Sitemap. (You are using Sitemaps, aren't you?)
  • If possible, don't substantially change the content of your pages at the same time you make the move. Otherwise, it will be difficult to tell if ranking changes are due to the change of content or incorrectly implemented redirects.
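The per-page 301 redirect mentioned in the first bullet is typically a few mod_rewrite lines in domain X's .htaccess; the domains below are placeholders:

```apache
# .htaccess on domain X: send every path to the same path on domain Y
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.org$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```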
Before we sign off, we wanted to shout-out to a couple of the folks at the Sydney office: Lars (one of the original Google Maps guys) gets accolades from all of us jetlagged migrants for donating his awesome Italian espresso machine to the office. And Deepak, thanks for all your tips on what to see and do around Sydney.

from web contents: Help us help you 2013

You're a webmaster, right? Well, we love webmasters! To ensure we give you the best support possible, we've set up a survey to get your thoughts on Webmaster Central and our related support efforts. If you have a few extra minutes this week, please click here to give us your honest feedback.

Thanks from all of us on the Webmaster Central Team.


seo DevFest Tour - Coming to a City Near You 2013


Just last month we had Google I/O and although we had attendees from all over the world join us for the festivities, we know that most of you could not join us in San Francisco. To help make up for that, we decided to do a DevFest tour and recently announced that we were on our way to visit Australia (Sydney), Israel, and Southeast Asia (Manila, Singapore, and Kuala Lumpur) in the next couple months. Then, there’s Spain (Madrid), Argentina (Buenos Aires), and Chile (Santiago) in October. Here’s your chance to hear about your favorite Google technologies and interact with the Googlers that work on them every day.

Today, we’ve updated the site to include the cities we’re visiting and topics we’d like to cover, along with registration links for the first round of events. Space is limited at each location and we cannot guarantee that everyone will be able to secure a spot so register early, check your email for confirmation and check back for any event updates.

For many of our international speakers, this is their first time visiting most of the cities on our tour, and they're incredibly excited to meet the local developer communities and learn what you're doing with our technologies - or what you're thinking of doing.

We hope to see you there!


seo Updated Themes API for iGoogle 2013


We are excited to open up an updated Themes API for developers to customize new features coming to iGoogle. Features you can modify include the left navigation and UI updates introduced in the iGoogle developer sandbox in April, as well as the chat feature that was released to the sandbox last week. If you have already created one of the 800 themes in the iGoogle directory, make sure to update your theme with the latest attributes and resubmit.

You can find more details in the updated developer's guide. We're hoping these feature additions will allow developers to customize even more of iGoogle.

As always, questions and feedback are welcome in the Google Themes API group.

from web contents: It's 404 week at Webmaster Central 2013

This week we're publishing several blog posts dedicated to helping you with one response code: 404.

Response codes are a numeric status (like 200 for "OK", 301 for "Moved Permanently") that a webserver returns in response to a request for a URL. The 404 response code should be returned for a file "Not Found".
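The standard reason phrases for these codes are available straight from Python's standard library, which makes a handy quick reference:

```python
from http import HTTPStatus

# Print the standard reason phrase for a few common response codes.
for code in (200, 301, 404):
    print(code, HTTPStatus(code).phrase)
# 200 OK
# 301 Moved Permanently
# 404 Not Found
```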

When a user sends a request for your webpage, your webserver looks for the corresponding file for the URL. If a file exists, your webserver likely responds with a 200 response code along with a message (often the content of the page, such as the HTML).

200 response code flow chart


So what's a 404? Let's say that in the link to "Visit Google Apps" above, the link is broken because of a typing error when coding the page. Now when a user clicks "Visit Google Apps", the particular webpage/file isn't located by the webserver. The webserver should return a 404 response code, meaning "Not Found".

404 response code flow chart


Now that we're all on board with the basics of 404s, stay tuned 4 even more information on making 404s good 4 users and 4 search engines.


from web contents: Google Friend Connect introduces the social bar 2013


Update: The described product or service is no longer available.

In our previous Google Friend Connect posts, we've enjoyed connecting with you, the webmasters, and hearing your feedback about Friend Connect. We're now standing on our own two feet -- find us over at the new Social Web Blog where we just announced the new social bar feature.

The social bar packages many of the basic social functions -- sign-in, site activities, comments, and members -- into a single strip that appears at the top or bottom of your website. You can use it alone, or use it to complement your existing social gadgets, by putting it on the top or bottom of as many of your webpages as you want.

For anyone visiting your site, the social bar offers a snapshot of the activity taking place within your website's community. One click on any of these features produces a convenient, interactive drop-down gadget, so users get all the functionality of the Friend Connect gadgets while you save real estate on your website. With the social bar, visitors can:
  • Join or sign in to your site, view and edit their profiles, and change their personal settings.
  • View recent activity on your website, including new members and posts on any of your pages.
  • Post on your wall or read and reply to others' comments.
  • See the other members of your site, check out other peoples' profiles, and become friends. Users can also find out if any of their existing friends are members of your site.
Watch this quick video to learn how easy it is to add a social bar to your website:


To try out the social bar before deciding whether to add it to your website, visit:
http://www.ossamples.com/socialmussie/


from web contents: High-quality sites algorithm goes global, incorporates user feedback 2013

Over a month ago we introduced an algorithmic improvement designed to help people find more high-quality sites in search. Since then we’ve gotten a lot of positive responses about the change: searchers are finding better results, and many great publishers are getting more traffic.

Today we’ve rolled out this improvement globally to all English-language Google users, and we’ve also incorporated new user feedback signals to help people find better search results. In some high-confidence situations, we are beginning to incorporate data about the sites that users block into our algorithms. In addition, this change also goes deeper into the “long tail” of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before. The impact of these new signals is smaller in scope than the original change: about 2% of U.S. queries are affected by a reasonable amount, compared with almost 12% of U.S. queries for the original change.

Based on our testing, we’ve found the algorithm is very accurate at detecting site quality. If you believe your site is high-quality and has been impacted by this change, we encourage you to evaluate the different aspects of your site extensively. Google's quality guidelines provide helpful information about how to improve your site. As sites change, our algorithmic rankings will update to reflect that. In addition, you’re welcome to post in our Webmaster Help Forums. While we aren’t making any manual exceptions, we will consider this feedback as we continue to refine our algorithms.

We will continue testing and refining the change before expanding to additional languages, and we’ll be sure to post an update when we have more to share.


from web contents: New tools for Google Services for Websites 2013

Webmaster Level: All
(A nearly duplicate version :) cross-posted on the Official Google Blog)

Earlier this year, we launched Google Services for Websites, a program that helps partners, e.g., web hosters and access providers, offer useful and powerful tools to their customers. By making services such as Webmaster Tools, Custom Search, Site Search and AdSense easily accessible via the hoster control panel, hosters can enable these services for their webmasters. The tools help website owners understand search performance, improve user retention and monetize their content; in other words, run more effective websites.

Since we launched the program, several hosting platforms have enhanced their offerings by integrating with the appropriate APIs. Webmasters can configure accounts, submit Sitemaps with Webmaster Tools, create Custom Search Boxes for their sites and monetize their content with AdSense, all with a few clicks at their hoster control panel. More partners are in the process of implementing these enhancements.

We've just added new tools to the suite:
  • Web Elements allows your customers to enhance their websites with the ease of cut-and-paste. Webmasters can provide maps, real-time news, calendars, presentations, spreadsheets and YouTube videos on their sites. With the Conversation Element, websites can create more engagement with their communities. The Custom Search Element provides inline search over your own site (or others you specify) without having to write any code and various options to customize further.
  • Page Speed allows webmasters to measure the performance of their websites. Snappier websites help users find things faster; the recommendations from these latency tools allow hosters and webmasters to optimize website speed. These techniques can help hosters reduce resource use and optimize network bandwidth.
  • The Tips for Hosters page offers a set of tips for hosters for creating a richer website hosting platform. Hosters can improve the convenience and accessibility of tools, while at the same time saving platform costs and earning referral fees. Tips include the use of analytics tools such as Google Analytics to help webmasters understand their traffic and linguistic tools such as Google Translate to help websites reach a broader audience.
If you're a hoster and would like to participate in the Google Services for Websites program, please apply here. You'll have to integrate with the service APIs before these services can be made available to your customers, so the earlier you start that process, the better.

And if your hosting service doesn't have Google Services for Websites yet, send them to this page. Once they become a partner, you can quickly configure the services you want at your hoster's control panel (without having to come to Google).

As always, we'd love to get feedback on how the program is working for you, and what improvements you'd like to see.


from web contents: Crawling through HTML forms 2013


Google is constantly trying new ideas to improve our coverage of the web. We already do some pretty smart things like scanning JavaScript and Flash to discover links to new web pages, and today, we would like to talk about another new technology we've started experimenting with recently.

In the past few months we have been exploring some HTML forms to try to discover new web pages and URLs that we otherwise couldn't find and index for users who search on Google. Specifically, when we encounter a <FORM> element on a high-quality site, we might choose to do a small number of queries using the form. For text boxes, our computers automatically choose words from the site that has the form; for select menus, check boxes, and radio buttons on the form, we choose from among the values of the HTML. Having chosen the values for each input, we generate and then try to crawl URLs that correspond to a possible query a user may have made. If we ascertain that the web page resulting from our query is valid, interesting, and includes content not in our index, we may include it in our index much as we would include any other web page.

Needless to say, this experiment follows good Internet citizenry practices. Only a small number of particularly useful sites receive this treatment, and our crawl agent, the ever-friendly Googlebot, always adheres to robots.txt, nofollow, and noindex directives. That means that if a search form is forbidden in robots.txt, we won't crawl any of the URLs that a form would generate.  Similarly, we only retrieve GET forms and avoid forms that require any kind of user information. For example, we omit any forms that have a password input or that use terms commonly associated with personal information such as logins, userids, contacts, etc. We are also mindful of the impact we can have on web sites and limit ourselves to a very small number of fetches for a given site.
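The enumeration described above (choose words for text boxes, choose values for select menus, then build candidate GET URLs) can be sketched in a few lines of Python; the form action, field names, and values are all hypothetical:

```python
from itertools import product
from urllib.parse import urlencode

# Hypothetical GET form: a text box named "q" and a select menu named "category".
action = "http://example.com/search"
text_choices = ["pizza", "reviews"]      # words drawn from the site's own pages
menu_choices = ["all", "restaurants"]    # the <option> values of the select menu

# Generate one candidate URL per combination of input values.
urls = [action + "?" + urlencode({"q": q, "category": c})
        for q, c in product(text_choices, menu_choices)]

for u in urls:
    print(u)
```

Each generated URL would then be fetched and, if the resulting page is valid and contains new content, considered for the index.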

The web pages we discover in our enhanced crawl do not come at the expense of regular web pages that are already part of the crawl, so this change doesn't reduce PageRank for your other pages. As such it should only increase the exposure of your site in Google. This change also does not affect the crawling, ranking, or selection of other web pages in any significant way.

This experiment is part of Google's broader effort to increase its coverage of the web. In fact, HTML forms have long been thought to be the gateway to large volumes of data beyond the normal scope of search engines. The terms Deep Web, Hidden Web, or Invisible Web have been used collectively to refer to such content that has so far been invisible to search engine users. By crawling using HTML forms (and abiding by robots.txt), we are able to lead search engine users to documents that would otherwise not be easily found in search engines, and provide webmasters and users alike with a better and more comprehensive search experience.

this is a topic published in 2013... to get contents for your blog or your forum, just contact me at: devnasser@gmail.com

from web contents: Analytics - Another tool for webmasters 2013

Webmaster tools from Google are indispensable for people who optimize their site for indexing in Google. Eighteen months ago, Google launched another free tool for webmasters - Google Analytics - which tells you about your visitors and the traffic patterns to your site using a JavaScript code snippet to execute tracking and reporting. This past Tuesday, Google Analytics launched a new version, with an easier-to-use interface that has more intuitive navigation and greater visibility for important metrics. We also introduced some collaboration and customization features such as email reports and custom dashboards.

But we wanted to highlight some of the webmaster-specific metrics within Google Analytics for our regular readers, since it offers a lot of easily-accessible information that will enrich the work you're doing.

For instance, do you know how many visitors to your site are using IE versus Firefox? And even further, how many of those IE or Firefox users are converting on a goal you have set up? Google Analytics can tell you information like this so you can prepare and tailor your website for your audience. Then, when you are designing, you can prioritize your testing to make sure that the site works on the most popular browsers and operating systems first.



Are your visitors using Java-enabled browsers? What version of Flash are the majority of your visitors using? What connection speed do they have? If you find that lots of visitors are using a dial-up service, you're going to want to put in some more effort to streamline the load time of images on your site, for example.

Plus, Google Analytics will make your company's marketing division very happy. It reports on the most effective search keywords, the most popular referring sources and the geographic location of visitors, as well as the performance of banner ads, PPC keyword campaigns, and email newsletters. If you haven't tried Google Analytics, watch the Flash tour of the product or set up a free account now and see statistics on your visitors and the traffic to your site within three hours.

Posted by Jeff Gillis, Google Analytics Team

from web contents: Got a website? Get gadgets. 2013

Google Gadgets are miniature-sized devices that offer cool and dynamic content -- they can be games, news clips, weather reports, maps, or most anything you can dream up. They've been around for a while, but their reach got a lot broader last week when we made it possible for anyone to add gadgets to their own webpages. Here's an example of a flight status tracker, for instance, that can be placed on any page on the web for free.

Anyone can search for gadgets to add to their own webpage for free in our directory of gadgets for your webpage. To put a gadget on your page, just pick the gadget you like, set your preferences, and copy-and-paste the HTML that is generated for you onto your own page.

Creating gadgets for others isn't hard, either, and it can be a great way to get your content in front of people while they're visiting Google or other sites. Here are a few suggestions for distributing your own content on the Google homepage or other pages across the web:

* Create a Google Gadget for distribution across the web. Gadgets can be most anything, from simple HTML to complex applications. It’s easy to experiment with gadgets – anyone with even a little bit of web design experience can make a simple one (even me!), and more advanced programmers can create really snazzy, complex ones. But remember, it’s also quick and easy for people to delete gadgets or add new ones to their own pages. To help you make sure your gadget will be popular across the web, we provide a few guidelines you can use to create gadgets. The more often folks find your content to be useful, the longer they'll keep your gadget on their pages, and the more often they’ll visit your site.

* If your website has a feed, visitors can put snippets of your content on their own Google homepages quickly and easily, and you don't even need to develop a gadget. However, you will be able to customize their experience much more fully with a gadget than with a feed.

* By putting the “Add to Google” button in a prominent spot on your site, you can increase the reach of your content, because visitors who click to add your gadget or feed to Google can see your content each time they visit the Google homepage. Promoting your own gadget or feed can also increase its popularity, which contributes to a higher ranking in the directory of gadgets for the Google personalized homepage.

from web contents: Webmaster Tools keeps your "messages waiting" 2013


We’re happy to announce that the Message Center supports a new “messages waiting” feature. Previously, it could only store penalty notifications for existing verified site owners (webmasters who had already verified their sites). However, the Message Center now has the ability to keep these waiting for future owners, i.e. those who haven’t previously registered with Google's Webmaster Tools.

Creating a new Webmaster Tools account and verifying your site gives you access to any message from Google concerning violations of our Webmaster Guidelines. Messages sent after the launch of this feature can now be retrieved for one year and will remain in your account until you choose to delete them.

Some questions you might be asking:

Q: What happens to old messages when a site changes ownership?
A: In the case of a change of ownership, the new verified owners will also be able to retrieve messages as noted above.

Q: If a site has more than one verified owner and one of them deletes a message, will it be deleted for all the other site owners as well?
A: No, each owner gets his or her own copy of the message when retrieving the message. Deleting one does not affect any past, current, or future message retrievals.

Just as before, if you've received a message alerting you to Webmaster Guidelines violations, you can make the necessary changes so that your site is in line with our guidelines. Then, sign in to Webmaster Tools and file a reconsideration request.

seo How To Make The Matrix Falling Code Effect 2013

Seo Master present to you:

Hello friends, here I will show you how to make the Matrix falling code effect on your computer. It's just a batch file with random numbers that repeat one after the other. Try it.




Open Notepad, copy the code below, and paste it in.

Copy this code:
@echo off
color 0a
:top
echo %1%random%2%random%3%random%4%random%7%random%8%random%9%random%
goto top
After pasting, save it as yourname.bat.
Run it and enjoy!

If you want to change the random numbers or letters, just replace the digit or letter that appears between the % signs, and type "%random%" after each one, as in "%1%random%2%random%3%random%4%random%". Substitute whatever letters or numbers you like for those digits.

If you want to change the letter color, find "color 0a" in the code above (Ctrl+F) and replace the "0a" with a suitable color code from the list below. The first character sets the background color and the second sets the text color.

0 = Black
1 = Blue
2 = Green
3 = Aqua
4 = Red
5 = Purple
6 = Yellow
7 = White
8 = Gray
9 = Light Blue
A = Light Green
B = Light Aqua
C = Light Red
D = Light Purple
E = Light Yellow
F = Bright White


* Save it only as a .bat file.
2013, By: Seo Master

from web contents: Introducing the Google Webmaster Team 2013


We’re pleased to introduce the Google Webmaster Team as contributors to the Webmaster Central Blog. As the team responsible for tens of thousands of Google’s informational web pages, they’re here to offer tips and advice based on their experiences as hands-on webmasters.

Back in the 1990s, anyone who maintained a website called themselves a “webmaster” regardless of whether they were a designer, developer, author, system administrator, or someone who had just stumbled across GeoCities and created their first web page. As the technologies changed over the years, so did the roles and skills of those managing websites.

Around 20 years after the word was first used, we still refer to ourselves as the Google Webmaster Team because it’s the only term that really covers the wide variety of roles that we have on our team. Although most of us have solid knowledge of HTML, CSS, JavaScript and other web technologies, we also have specialists in design, development, user experience, information architecture, system administration, and project management.


Part of the Google Webmaster Team, Mountain View

In contrast to the Google Webmaster Central Team—which mainly focuses on helping webmasters outside of Google understand web search and how things like crawling and indexing affect their sites—our team is responsible for designing, implementing, optimizing and maintaining Google’s corporate pages, informational product pages, landing pages for marketing campaigns, and our error page. Our team also develops internal tools to increase our productivity and help to maintain the thousands of HTML pages that we own.

We’re working hard to follow, challenge and evolve best practices and web standards to ensure that all our new pages are produced to the highest quality and provide the best user experience, and we’re constantly evaluating and updating our legacy pages to ensure their deprecated HTML isn’t just left to rot.

We want to share our work and experiences with other webmasters, so we recently launched our @GoogleWebTeam account on Twitter to keep our followers updated on the latest news about our projects, web standards, and anything else which may be of interest to other webmasters, web designers and web developers. We’ll be posting here on the Webmaster Central Blog when we want to share anything longer than 140 characters.

Before we share more details about our processes and experiences, please let us know if there’s anything you’d like us to specifically cover by leaving a comment here or by tweeting @GoogleWebTeam.


seo Race across screens and platforms, powered by the mobile web 2013

By Pete LePage, Chromium team

Cross-posted with the Chromium Blog

You may have seen our recent demo of Racer at Google I/O, and wondered how it was made. So today we wanted to share some of the web technologies that made this Chrome Experiment “street-legal” in a couple of months. Racer was built to show what’s possible on today’s mobile devices using an entirely in-browser experience. The goal was to create a touch-enabled experience that plays out across multiple screens (and speakers). Because it was built for the web, it doesn’t matter if you have a phone or a tablet running Android or iOS, everyone can jump right in and play.
   
Racer required two things: speedy pings and a consistent experience across screens. We delivered our minimal 2D vector drawings to each device’s HTML5 Canvas using the Paper.js vector library. Paper.js can handle the path math for our custom race track shapes without getting lapped. To eke out all the frame rate possible on such a large array of devices we rendered the cars as image sprites on a DOM layer above the Canvas, while positioning and rotating them using CSS3’s transform: matrix().
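As a rough sketch of that last technique (the class name and values here are made up, not from Racer's source), a car sprite can be rotated and positioned in a single transform with matrix(), where the first four values encode the rotation and the last two the translation:

```css
/* Hypothetical sprite on a DOM layer above the canvas.
   matrix(a, b, c, d, tx, ty): (a, b, c, d) is the rotation,
   (tx, ty) the translation, applied in one composited step. */
.car-sprite {
  position: absolute;
  /* ~30° rotation (cos 30° ≈ 0.866, sin 30° = 0.5), placed at (120px, 80px) */
  transform: matrix(0.866, 0.5, -0.5, 0.866, 120, 80);
}
```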

Racer’s sound experience is shared across multiple devices using the Web Audio API (available in latest iOS and Android M28 beta). Each device plays one slice of Giorgio Moroder’s symphony of sound—requiring five devices at once to hear his full composition. A constant ping from the server helps synchronize all device speakers allowing them to bump to one solid beat. Not only is the soundtrack divided across devices, it also reacts to each driver’s movements in real time—the accelerating, coasting, careening, and colliding rebalances the presence of every instrument.

To sync your phones or tablets, we use WebSockets, which enables rapid two-way communication between devices via a central server. WebSocket technology is just the start of what multi-device apps of the future might use. We’re looking forward to when WebRTC data channels—the next generation of speedy Web communication—are supported in the stable channel of Chrome for mobile. Then we’ll be able to deliver experiences like Racer with even lower ping times and without bouncing messages via a central server. Racer’s backend was built on the Google Cloud Platform using the same structure and web tools as another recent Chrome Experiment, Roll It.

To get an even more detailed peek under the hood, we just published two new case studies on our HTML5 Rocks site. Our friends at Plan8 wrote one about the sound engineering and Active Theory wrote one about the front-end build. You can also join the team at Active Theory for a Google Developers Live event this Thursday, June 13, 2013 at 3pm GMT for an in depth discussion.

Pete LePage is a Developer Advocate on the Google Chrome team and helps developers create great web applications and mobile web experiences.

Posted by Ashleigh Rentz, Editor Emerita

seo Google BigQuery new features: bigger, faster, smarter 2013

By Felipe Hoffa, Cloud Platform team

Google BigQuery is designed to make it easy to analyze large amounts of data quickly. Today we announced several updates that give BigQuery the ability to handle arbitrarily large result sets, use window functions for advanced analytics, and cache query results. You are also getting new UI features, larger interactive quotas, and a new convenient tiered pricing scheme. In this post we'll dig further into the technical details of these new features.

Large results

BigQuery is able to process terabytes of data, but until today BigQuery could only output up to 128 MB of compressed data per query. Many of you asked for more and from now on BigQuery will be able to output results as large as the largest tables our customers have ever had.

To get this benefit, you should enable the new "--allow_large_results" flag when issuing a query job, and specify a destination table. All results will be saved to the new specified table (or appended, if the table exists). In the updated web UI these options can be found under the new "Enable Options" menu.

With this feature, you can run big transformations on your tables, plus get big subsets of data to further analyze from the new table.

Analytic functions

BigQuery's power is in the ability to interactively run aggregate queries over terabytes of data, but sometimes counts and averages are not enough. That's why BigQuery also lets you calculate quantiles, variance and standard deviation, as well as other advanced functions.

To make BigQuery even more powerful, today we are adding support for window functions (also known as "analytical functions") for ranking, percentiles, and relative row navigation. These new functions give you different ways to rank results, explore distributions and percentiles, and traverse results without the need for a self join.

To introduce these functions with an advanced example, let's use the dataset we collected from the Data Sensing Lab at Google I/O. With the percentile_cont() function it's easy to get the median temperature over each room:


SELECT percentile_cont(0.5) OVER (PARTITION BY room ORDER BY data) AS median, room
FROM [io_sensor_data.moscone_io13]
WHERE sensortype='temperature'

In this example, each original data row shows the median temperature for each room. To visualize it better, it's a good idea to group all results by room with an outer query:


SELECT MAX(median) AS median, room FROM (
SELECT percentile_cont(0.5) OVER (PARTITION BY room ORDER BY data) AS median, room
FROM [io_sensor_data.moscone_io13]
WHERE sensortype='temperature'
)
GROUP BY room

We can add an additional outer query, to rank the rooms according to which one had the coldest median temperature. We'll use one of the new ranking window functions, dense_rank():


SELECT DENSE_RANK() OVER (ORDER BY median) rank, median, room FROM (
SELECT MAX(median) AS median, room FROM (
SELECT percentile_cont(0.5) OVER (PARTITION BY room ORDER BY data) AS median, room
FROM [io_sensor_data.moscone_io13]
WHERE sensortype='temperature'
)
GROUP BY room
)

We've updated the documentation with descriptions and examples for each of the new window functions. Note that they require the OVER() clause, with an optional PARTITION BY and sometimes required ORDER BY arguments. ORDER BY tells the window function what criteria to use to rank items, while PARTITION BY allows you to define multiple groups to be analyzed independently of each other.
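As a rough illustration of the ranking semantics (plain JavaScript, not BigQuery): DENSE_RANK() gives equal values the same rank and leaves no gaps in the rank sequence.

```javascript
// Sketch of DENSE_RANK() OVER (ORDER BY value) semantics:
// equal values share a rank, and ranks increase without gaps.
function denseRank(values) {
  // The distinct values in ascending order; their positions are the ranks.
  const distinct = [...new Set(values)].sort((a, b) => a - b);
  return values.map(v => distinct.indexOf(v) + 1);
}

console.log(denseRank([20.5, 18.0, 18.0, 22.1])); // [2, 1, 1, 3]
```

Note how the two 18.0 values both get rank 1, and 20.5 gets rank 2 rather than 3.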

The window functions don't work with the big GROUP EACH BY and JOIN EACH BY operators, but they do work with the traditional GROUP BY and JOIN BY. As a reminder, we announced GROUP EACH BY and JOIN EACH BY last March, to allow large join and group operations.

Query caching

BigQuery now remembers values that you've previously computed, saving you time and the cost of recalculating the query. To maintain privacy, queries are cached on a per-user basis. Cached results are only returned for tables that haven't changed since the last query, or for queries that are not dependent on non-deterministic parameters (such as the current time). Reading cached results is free, but each query still counts against the max number of queries per day quota. Query results are kept cached for 24 hours, on a best effort basis. You can disable query caching with the new flag --use_cache in bq, or "useQueryCache" in the API. This feature is also accessible with the new query options on the BigQuery Web UI.
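The behavior described above can be sketched in a few lines of plain JavaScript (the names and structure are illustrative, not BigQuery internals): results are keyed per user and per query, and a changed table version misses the cache.

```javascript
// Illustrative cache keyed by (user, query, table version). A fresh key
// means the query actually runs; a repeat returns the stored result.
const cache = new Map();

function cachedQuery(user, sql, tableVersion, runQuery) {
  const key = `${user}|${sql}|${tableVersion}`;
  if (!cache.has(key)) {
    cache.set(key, runQuery(sql)); // cache miss: execute and store
  }
  return cache.get(key); // cache hit: no recomputation
}
```

A second identical call by the same user is served from the cache; a different user, a changed query, or an updated table triggers a fresh run.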

BigQuery Web UI: Query validator, cost estimator, and abandonment

The BigQuery UI gets even better: You'll get instant information while writing a query if its syntax is valid. If the syntax is not valid, you'll know where the error is. If the syntax is valid, the UI will inform you how much the query would cost to run. This feature is also available with the bq tool and API, using the --dry_run flag.

An additional improvement: When running queries on the UI, previously you had to wait until its completion before starting another one. Now you have the option to abandon it, to start working on the next iteration of the query without waiting for the abandoned one.

Pricing updates

Starting in July, BigQuery pricing becomes more affordable for everyone: Data storage costs are going from $0.12/GB/month to $0.08/GB/month. And if you are a high-volume user, you'll soon be able to opt-in for tiered query pricing, for even better value.

Bigger quota

To support larger workloads we're doubling interactive query quotas for all users, from 200GB + 1 concurrent query, to 400 GB of concurrent queries + 2 additional queries of unlimited size.

These updates make BigQuery a faster, smarter, and even more affordable solution for ad hoc analysis of extremely large datasets. We expect they'll help to scale your projects, and we hope you'll share your use cases with us on Google+.


The BigQuery UI features a collection of public datasets for you to use when trying out these new features. To get started, visit our sign-up page and Quick Start guide. You should take a look at our API docs, and ask questions about BigQuery development on Stack Overflow. Finally, don't forget to give us feedback and join the discussion on our Cloud Platform Developers Google+ page.



Felipe Hoffa has recently joined the Cloud Platform team. He'd love to see the world's data accessible for everyone in BigQuery.

Posted by Ashleigh Rentz, Editor Emerita

from web contents: Validation: measuring and tracking code quality 2013

Webmaster level: All

Google’s Webmaster Team is responsible for most of Google’s informational websites like Google’s Jobs site or Privacy Centers. Maintaining tens of thousands of pages and constantly releasing new Google sites requires more than just passion for the job: it requires quality management.

In this post we won’t talk about all the different tests that can be run to analyze a website; instead we’ll just talk about HTML and CSS validation, and tracking quality over time.

Why does validation matter? There are different perspectives on validation—at Google there are different approaches and priorities too—but the Webmaster Team considers validation a baseline quality attribute. It doesn’t guarantee accessibility, performance, or maintainability, but it reduces the number of possible issues that could arise and in many cases indicates appropriate use of technology.

While paying a lot of attention to validation, we’ve developed a system to use it as a quality metric to measure how we’re doing on our own pages. Here’s what we do: we give each of our pages a score from 0-10 points, where 0 is worst (pages with 10 or more HTML and CSS validation errors) and 10 is best (0 validation errors). We started doing this more than two years ago, first by taking samples, now monitoring all our pages.
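That scoring scheme can be sketched directly; the two endpoints (0 errors is 10 points, 10 or more errors is 0 points) come from the description above, while the linear step in between is our assumption, not a documented detail.

```javascript
// 0 validation errors => 10 points; 10 or more errors => 0 points.
// We assume one point lost per combined HTML + CSS validation error.
function validationScore(errorCount) {
  return Math.max(0, 10 - errorCount);
}

console.log(validationScore(0));  // 10 (perfectly valid page)
console.log(validationScore(3));  // 7
console.log(validationScore(14)); // 0  (capped at the worst score)
```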

Since the beginning we’ve been documenting the validation scores we were calculating so that we could actually see how we’re doing on average and where we’re headed: is our output improving, or is it getting worse?

Here’s what our data say:


Validation score development 2009-2011.


On average there are about three validation issues per page produced by the Webmaster Team (as we combine HTML and CSS validation in the scoring process, information about the origin gets lost), down from about four issues per page two years ago.

This information is valuable for us as it tells us how close we are to our goal of always shipping perfectly valid code, and it also tells us whether we’re on track or not. As you can see, with the exception of the 2nd quarter of 2009 and the 1st quarter of 2010, we are generally observing a positive trend.

What has to be kept in mind are issues with the integrity of the data, i.e. the sample size as well as “false positives” in the validators. We’re working with the W3C in several ways, including reporting and helping to fix issues in the validators; however, as software can never be perfect, sometimes pages get dinged for non-issues: see for example the border-radius issue that has recently been fixed. We know that this is negatively affecting the validation scores we’re determining, but we have no data yet to indicate how much.

Although we track more than just validation for quality control purposes, validation plays an important role in measuring the health of Google’s informational websites.

How do you use validation in your development process?


from web contents: Îñţérñåţîöñåļîžåţîöñ 2013

Webmaster level: Intermediate

So you’re going global, and you need your website to follow. Should be a simple case of getting the text translated and you’re good to go, right? Probably not. The Google Webmaster Team frequently builds sites that are localized into over 40 languages, so here are some things that we take into account when launching our pages in both other languages and regions.

(Even if you think you might be immune to these issues because you only offer content in English, it could be that non-English language visitors are using tools like Google Translate to view your content in their language. This traffic should show up in your analytics dashboard, so you can get an idea of how many visitors are not viewing your site in the way it’s intended.)
More languages != more HTML templates
We can’t recommend this enough: reuse the same template for all language versions, and always try to keep the HTML of your template simple.

Keeping the HTML code the same for all languages has its advantages when it comes to maintenance. Hacking around with the HTML code for each language to fix bugs doesn’t scale–keep your page code as clean as possible and deal with any styling issues in the CSS. To name just one benefit of clean code: most translation tools will parse out the translatable content strings from the HTML document and that job is made much easier when the HTML is well-structured and valid.
How long is a piece of string?
If your design relies on text playing nicely with fixed-size elements, then translating your text might wreak havoc. For example, your left-hand side navigation text is likely to translate into much longer strings of text in several languages–check out the difference in string lengths between some English and Dutch language navigation for the same content. Be prepared for navigation titles that might wrap onto more than one line by figuring out your line height to accommodate this (also worth considering when you create your navigation text in English in the first place).

Variable word lengths cause particular issues in form labels and controls. If your form layout displays labels on the left and fields on the right, for example, longer text strings can flow over into two lines, whereas shorter text strings do not seem associated with their form input fields–both scenarios ruin the design and impede the readability of the form. Also consider the extra styling you’ll need for right-to-left (RTL) layouts (more on that later). For these reasons we design forms with labels above fields, for easy readability and styling that will translate well across languages.

Screenshots of Chinese and German versions of web forms
click to enlarge


Also avoid fixed-height columns–if you’re attempting to neaten up your layout with box backgrounds that match in height, chances are when your text is translated, the text will overrun areas that were only tall enough to contain your English content. Think about whether the UI elements you’re planning to use in your design will work when there is more or less text–for instance, horizontal vs. vertical tabs.
On the flip side
Source editing for bidirectional HTML can be problematic because many editors have not been built to support the Unicode bidirectional algorithm (more research on the problems and solutions). In short, the way your markup is displayed might get garbled:

<p>ابةتث <img src="foo.jpg" alt=" جحخد"< ذرزسش!</p>

Our own day-to-day usage has shown the following editors to currently provide decent solutions for bidirectional editing: particularly Coda, and also Dreamweaver, IntelliJ IDEA and JEditX.

When designing for RTL languages you can build most of the support you need into the core CSS and use the directional attribute of the html element (for backwards compatibility) in combination with a class on the body element. As always, keeping all styles in one core stylesheet makes for better maintainability.

Some key styling issues to watch out for: any elements floated right will need to be floated left and vice versa; extra padding or margin widths applied to one side of an element will need to be overridden and switched, and any text-align attributes should be reversed.

We generally use the following approach, including using a class on the body tag rather than a html[dir=rtl] CSS selector because this is compatible with older browsers:

Elements:

<body class="rtl">
<h1><a href="http://www.blogger.com/"><img alt="Google" src="http://www.google.com/images/logos/google_logo.png" /></a> Heading</h1>

Left-to-right (default) styling:

h1 {
height: 55px;
line-height: 2.05;
margin: 0 0 25px;
overflow: hidden;
}
h1 img {
float: left;
margin: 0 43px 0 0;
position: relative;
}

Right-to-left styling:

body.rtl {
direction: rtl;
}
body.rtl h1 img {
float: right;
margin: 0 0 0 43px;
}

(See this in action in English and Arabic.)

One final note on this subject: most of the time your content destined for right-to-left language pages will be bidirectional rather than purely RTL, because some strings will probably need to retain their LTR direction–for example, company names in Latin script or telephone numbers. The way to make sure the browser handles this correctly in a primarily RTL document is to wrap the embedded text strings with an inline element using an attribute to set direction, like this:

<h2>‫עוד ב- <span dir="ltr">Google</span>‬</h2>

In cases where you don’t have an HTML container to hook the dir attribute into, such as title elements or JavaScript-generated source code for message prompts, you can use this equivalent to set direction where &#x202B; and &#x202C;‬ are Unicode control characters for right-to-left embedding:

<title>&#x202B;‫הפוך את Google לדף הבית שלך‬&#x202C;</title>

Example usage in JavaScript code:
var ffError = '\u202B' +'כדי להגדיר את Google כדף הבית שלך ב\x2DFirefox, לחץ על הקישור \x22הפוך את Google לדף הבית שלי\x22, וגרור אותו אל סמל ה\x22בית\x22 בדפדפן שלך.'+ '\u202C';

(For more detail, see the W3C’s articles on creating HTML for Arabic, Hebrew and other right-to-left scripts and authoring right-to-left scripts.)
It’s all Greek to me…
If you’ve never worked with non-Latin character sets before (Cyrillic, Greek, and a myriad of Asian and Indic), you might find that both your editor and browser do not display content as intended.

Check that your editor and browser encodings are set to UTF-8 (recommended) and consider adding a meta element that declares the character encoding, along with the lang attribute on the html element, to your HTML template so browsers know what to expect when rendering your page–this has the added benefit of ensuring that all Unicode characters are displayed correctly, so using HTML entities such as &eacute; (é) will not be necessary, saving valuable bytes! Check the W3C’s tutorial on character encoding if you’re having trouble–it contains in-depth explanations of the issues.
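A minimal template head covering both points might look like this (Greek is used as an example language; the title text is placeholder content):

```html
<!DOCTYPE html>
<html lang="el">
  <head>
    <meta charset="utf-8">
    <title>Παράδειγμα</title>
  </head>
  <body></body>
</html>
```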
A word on naming
Lastly, a practical tip on naming conventions when creating several language versions. Using a standard such as the ISO 639-1 language codes for naming helps when you start to deal with several language versions of the same document.

Using a conventional standard will help users understand your site’s structure as well as making it more maintainable for all webmasters who might develop the site, and using the language codes for other site assets (logo images, PDF documents) is handy to be able to quickly identify files.
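For instance, hypothetical asset names using ISO 639-1 codes might look like:

```
/intl/en/about.html
/intl/de/about.html
logo_fr.png
brochure_ja.pdf
```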

See previous Webmaster Central posts for advice about URL structures and other issues surrounding working with multi-regional websites and working with multilingual websites.

That’s a summary of the main challenges we wrestle with on a daily basis, but we can vouch for the fact that putting in the planning and work up front towards well-structured HTML and robust CSS pays dividends during localization!


What's new with Sitemaps.org? 2013

What has the Sitemaps team been up to since we announced sitemaps.org? We've been busy trying to get Sitemaps adopted everywhere and to make the submission process as easy and automated as possible. To that end, we have three new announcements to share with you.

First, we're making the sitemaps.org site available in 18 languages! We know that our users are located all around the world and we want to make it easy for you to learn about Sitemaps, no matter what language you speak. Here is a link to the Sitemap protocol in Japanese and the FAQ in German.

Second, it's now easier to tell us where your Sitemaps live. We wondered if we could make it so easy that you wouldn't have to tell us, or any other search engine that supports Sitemaps, individually. But how? Every website can have a robots.txt file in a standard location, so we decided to let you point to your Sitemap from the robots.txt file. All you have to do is add a line like

Sitemap: http://www.mysite.com/sitemap.xml

to your robots.txt file. Just make sure you include the full URL, including the http://. That's it. Of course, we still think it's useful to submit your Sitemap through Webmaster Tools so you can confirm that it was processed without any issues and get additional statistics about your site.
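A complete robots.txt using the directive might look like this; the host and the Disallow rule are illustrative, only the Sitemap line is the new addition:

```
# Example robots.txt (paths and host are illustrative)
User-agent: *
Disallow: /private/

# Sitemap discovery: the full URL is required, including the scheme
Sitemap: http://www.mysite.com/sitemap.xml
```

The Sitemap line is not tied to any User-agent block, so any crawler that reads the file can discover it.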

Last but not least, Ask.com is now also supporting the Sitemap protocol. And with the ability to discover your Sitemaps from your robots.txt file, Ask.com and any other search engine that supports this change to robots.txt will be able to find your Sitemap file.