News and Tutorials from Votre Codeur | SEO | Website Creation | Software Creation

seo What Is Alexa? How To Improve Alexa Ranking 2013

Seo Master present to you:
What Is Alexa? How To Improve Alexa Ranking




Alexa rank is one indicator of your website's online presence, so it is worth every webmaster's time to improve it. The goal of this article is to explain what Alexa rank is and how to improve your website's ranking.




What Is Alexa Rank?



First of all, you need to know what exactly Alexa is and what Alexa rank means. Alexa is a website owned by Amazon that ranks websites around the world based on their traffic statistics over the past three months. The lower the Alexa rank, the better: a site with a lower number outranks one with a higher number.

Why Should You Improve Your Alexa Rank?









Alexa rank is one of the most widely quoted metrics for measuring a website's online presence, and a good rank improves your reputation with advertisers and readers alike. To earn money you need good traffic, and a site with good traffic will usually have a good Alexa rank as well.

How to Improve Your Alexa Rank?

When I said that "to earn money you need good traffic, and good traffic usually means a good Alexa rank", that does not mean traffic alone determines a website's Alexa rank, or that you should chase traffic purely to improve it. Alexa rank depends on several other factors as well. Below are ways to improve your Alexa rank that keep all of those factors in mind.




1. Quality and Quantity of Content







Publish content that is both high in quality and substantial in quantity. Try to publish at least one post per day, or if that is not possible, at least one post every two days. Whatever schedule you choose, stick to it; there is no way around consistency. Also make sure your content is 100% unique and original. Do not copy and paste content from elsewhere.

2. Verify Site/Blog Ownership

Go to Alexa, sign up, and verify ownership of your website. This makes your website look genuine in the eyes of Alexa.

3. Alexa Toolbar and Widget

Install the Alexa toolbar and encourage your readers to do the same, because every visit to your website from a browser with the Alexa toolbar installed is counted.

Put the Alexa widget in the sidebar or footer of your website. Have a look at the sidebar of this blog to see what the Alexa widget looks like. Whenever someone visits your website, an impression is recorded on the widget. In my experience, this improves Alexa rank very quickly.

4. Reviews of Your Website

Write a review of your website and ask your friends and readers to write reviews of your website/blog as well. This also helps to improve the Alexa rank of the website.

5. High-Quality Backlinks

Get quality backlinks to your website. Comment on websites with a better Alexa rank and higher PR than yours to earn backlinks from them. Participate in forums and include your website URL in your forum signature. Guest post on blogs in the same niche that have high PR and a good Alexa rank. Use these methods to build quality backlinks.

6. International Traffic

Try to attract traffic from all over the world, not just from one specific country or area. Aim for global traffic rather than local traffic, unless your website covers regional topics.

Note:
If you run into any difficulty regarding 'What Is Alexa? How To Improve Alexa Ranking', feel free to ask in the comments.


2013, By: Seo Master

from web contents: Video Sitemaps: Understanding location tags 2013

salam every one, this is a topic from google web master centrale blog: Webmaster Level: All

If you want to add video information to a Sitemap or mRSS feed you must specify the location of the video. This means you must include one of two tags, either the video:player_loc or video:content_loc. In the case of an mRSS feed, these equivalent tags are media:player or media:content, respectively. We need this information to verify that there is actually a live video on your landing page and to extract metadata and signals from the video bytes for ranking. If one of these tags is not included we will not be able to verify the video and your Sitemap/mRSS feed will not be crawled. To reduce confusion, here is some more detail about these elements.

Video Locations Defined

Player Location/URL: the player (e.g., .swf) URL with corresponding arguments that load and play the actual video.

Content Location/URL: the actual raw video bytes (e.g., .flv, .avi) containing the video content.

The Requirements

One of either the player video:player_loc or content video:content_loc location is required. However, we strongly suggest you provide both, as they each serve distinct purposes: player location is primarily used to help verify that a video exists on the page, and content location helps us extract more signals and metadata to accurately rank your videos.

URL extensions at a glance:

Sitemap              | mRSS                             | Contents
<loc>                | <link>                           | The play page URL
<video:player_loc>   | <media:player> (url attribute)   | The SWF URL
<video:content_loc>  | <media:content> (url attribute)  | The FLV or other raw video URL

NOTE: All URLs should be unique (every URL in your entire Video Sitemap and mRSS feed should be unique)
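
To make the tags concrete, here is a rough sketch of a single Video Sitemap entry that provides both locations (the URLs, title, and description are placeholders, and only a subset of the available video metadata is shown):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- The landing (play) page that hosts the video -->
    <loc>http://www.example.com/videos/some-video.html</loc>
    <video:video>
      <video:title>Example video title</video:title>
      <video:description>A short description of the video.</video:description>
      <!-- Player URL: the .swf (or similar player) that loads and plays the video -->
      <video:player_loc>http://www.example.com/player.swf?video=some-video</video:player_loc>
      <!-- Content URL: the raw video bytes (.flv, .avi, etc.) -->
      <video:content_loc>http://www.example.com/videos/some-video.flv</video:content_loc>
    </video:video>
  </url>
</urlset>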

If you would like to better ensure that only Googlebot accesses your content, you can perform a reverse DNS lookup.

For more information on Google Videos please visit our Help Center, and to post questions and search for answers check out our Help Forum.


seo Unescape Decoder and Encoder – Online Tool 2013

Seo Master present to you:

Friends, this is a handy tool for hiding your content from casual readers. It obfuscates your content without losing any data. You can convert text, HTML, JavaScript, and so on into escaped format, and you can also decode content with the same tool. Just paste your text, HTML, or JavaScript into the first box and hit the Encode button; your text will be encoded into a format like %3C%73%63%72%69%70%74. Decoding is just as easy: paste the encoded code into the second box and hit the Decode button. Enjoy!
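
To give a rough idea of what the escaped format looks like in practice (the string below is a made-up example), escaped content is usually written back into a page with JavaScript's built-in unescape() function:

<script type="text/javascript">
  // '%3C%62%3E%48%65%6C%6C%6F%3C%2F%62%3E' is the escaped form of '<b>Hello</b>'
  document.write(unescape('%3C%62%3E%48%65%6C%6C%6F%3C%2F%62%3E'));
</script>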








Leave a comment below about this encoder/decoder...... 2013, By: Seo Master

from web contents: Joint support for the Sitemap Protocol 2013

salam every one, this is a topic from google web master centrale blog: We're thrilled to tell you that Yahoo! and Microsoft are joining us in supporting the Sitemap protocol.

As part of this development, we're moving the protocol to a new namespace, www.sitemaps.org, and raising the version number to 0.9. The sponsoring companies will continue to collaborate on the protocol and publish enhancements on the jointly-maintained site sitemaps.org.

If you've already submitted a Sitemap to Google using the previous namespace and version number, we'll continue to accept it. If you haven't submitted a Sitemap before, check out the documentation on www.sitemaps.org for information on creating one. You can submit your Sitemap file to Google using Google webmaster tools. See the documentation that Yahoo! and Microsoft provide for information about submitting to them.
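
For reference, a minimal Sitemap file using the 0.9 namespace looks roughly like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry for each page you want crawlers to know about -->
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-16</lastmod>
  </url>
</urlset>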

If any website owners, tool writers, or webserver developers haven't gotten around to implementing Sitemaps yet, thinking this was just a crazy Google experiment, we hope this joint announcement shows that the industry is heading in this direction. The more Sitemaps eventually cover the entire web, the more we can revolutionize the way web crawlers interact with websites. In our view, the experiment is still underway.

seo Trendy Tube Blogger Template - Video Blogger Template 2013

Seo Master present to you:
Trendy Tube Blogger Template is a new, professional, and impressive design that is ideal for video blogs. What you will love about this template is its simple and stunning style. It has a 2-column layout with a beautiful right sidebar, and a search box is built into the top of the template. It is similar in design to YouTube, so you can easily upload and share your videos with it. It also works well in all major browsers, and you can download it for free from our blog.
Trendy Tube Blogger Template

Features/Properties Of Trendy Tube Blogger Template

  • Layout: Trendy Tube Blogger Template has a 2-column layout, ideal for video blogs.
  • SEO-Friendly: This template is SEO-ready.
  • Professional: This template has a professional look similar to YouTube.
  • Gray Background: The gray background adds to its appeal.
  • Right Sidebar: It has a stunning right sidebar.
  • Video
  • Works with all major browsers

License and Installation

2013, By: Seo Master

from web contents: Get a more complete picture about how other sites link to you 2013

salam every one, this is a topic from google web master centrale blog: For quite a while, you've been able to see a list of the most common words used in anchor text to your site. This information is useful, because it helps you know what others think your site is about. How sites link to you has an impact on your traffic from those links, because it describes your site to potential visitors. In addition, anchor text influences the queries your site ranks for in the search results.

Now we've enhanced the information we provide and will show you the complete phrases sites use to link to you, not just individual words. And we've expanded the number we show to 100. To make this information as useful as possible, we're aggregating the phrases by eliminating capitalization and punctuation. For instance, if several sites have linked to your site using the following anchor text:

Site 1 "Buffy, blonde girl, pointy stick"
Site 2 "Buffy blonde girl pointy stick"
Site 3 "buffy: Blonde girl; Pointy stick."

We would aggregate that anchor text and show it as one phrase, as follows:

"buffy blonde girl pointy stick"

You can find this list of phrases by logging into webmaster tools, accessing your site, then going to Statistics > Page analysis. You can view this data in a table and can download it as a CSV file.

And as we told you last month, you can see the individual links to pages of your site by going to Links > External links. We hope these details give you additional insight into your site traffic.

from web contents: Let visitors recommend your content 2013

salam every one, this is a topic from google web master centrale blog:
Webmaster Level: All

We recently posted about some of the engaging gadgets you can add to your site with Google Friend Connect. Here's one more that may be of interest if you're looking for another way to get feedback from your site's visitors:

The new Recommendation gadgets make it easy for your visitors to let you and the world know which parts of your site they like best. By placing recommendation buttons next to photos, articles or other content, visitors can recommend specific items to others with the click of a button. Your most popular items will surface to the top of the recommendation list.



To install a recommendation gadget on your site, or to check out the other gadgets that are available, please visit www.google.com/friendconnect.


seo Enable Custom robots header tags for blogger SEO Traffic 2013

Seo Master present to you:
enable Custom robots header tags in blogger
Blogger has recently introduced a Search Preferences tab and is continually adding new features to it to help fellow bloggers do better SEO on their blogs. These features help bloggers increase their blogs' traffic and get better rankings in Google search results.
One of the new features introduced by Blogger's developers is Custom Robots Header Tags. This tool plays a big role in a blog's search engine optimization: used properly, it helps you gain more visitors from search engines, which in turn increases your page views.
Search engines also do not want to index duplicate content from your site, and the Custom Robots Header Tags settings help you keep duplicate versions of your pages out of their indexes.

What are Custom Robots Header Tags ?


These are the header tags used in Blogger; their usage and meaning are defined below.

all:  There are no restrictions for indexing or serving. This is default for all pages
noindex:  Do not show this page in search results and do not show a "Cached" link in search results.
nofollow:  Do not follow the links on this page
none:  Equivalent to noindex, nofollow
noarchive:  Do not show a "Cached" link in search results.
nosnippet:  Do not show a snippet in the search results for this page
noodp:  Do not use metadata from the Open Directory project (DMOZ) for titles or snippets shown for this page.
notranslate:  Do not offer translation of this page in other languages in search results.
noimageindex:  Do not index images on this page.
unavailable_after: [RFC-850 date/time]: Do not show this page in search results after the specified date/time. The date/time must be specified in the RFC 850 format. Example: 17 May 2012 15:00:00 PST
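
As a rough illustration (not an exact copy of Blogger's output), these directives are what end up on your pages, either as a robots meta tag in the HTML head or as an equivalent X-Robots-Tag HTTP response header:

<!-- Meta tag form: keep this page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">
<!-- Equivalent HTTP header form: X-Robots-Tag: noindex, nofollow -->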

How to Enable Custom robots header tags in Blogger SEO

Follow these steps to enable Custom robots header tags in Blogger Blogs to increase your blog traffic and page views.

Step 1.  Log in to your blogger account Dashboard and click on your blog.
Step 2.  Go to Settings >> Search Preferences.
Step 3.  Click the Edit link next to Custom Robots Header Tags and then select the Yes option.
Step 4.  Once you click Yes, you will see several options. Simply tick the options as shown in the image below.
Best Custom robots header tags for blogger SEO Traffic

Step 5.  Now click on "Save changes" button and You are done!

That's all! If you have any doubts or need help with this post, please leave a comment.
2013, By: Seo Master

seo Working with Chrome's file browser handler 2013

Seo Master present to you:
By Jeremy Glassenberg, Platform Manager, Box

This post is part of Who's at Google I/O, a series of guest blog posts written by developers who appeared in the Developer Sandbox at Google I/O 2011.


During the day 2 keynote of Google I/O, I was excited to see Box's integration with the Chromebook's file browser handler getting demoed on the big stage. The integration makes local files and files you encounter on the web easily accessible to cloud services inside Chrome OS.

Chrome's file browser handler utilizes the new HTML5 file system API, designed to enable web applications to interact with local files. This API lets web applications read files, edit files, and create new files within a designated local space on a user's machine. This includes creating binary files for application data, and in Box's case, accessing user-created files to let people easily move their content to the cloud.

As mentioned during the Google I/O keynote, the integration between Box and the Chrome OS file browser handler only took our team a weekend to build. We were able to build the integration quickly because of the simplicity of both Chrome's file browser platform and Box's API, both of which were designed to make content integrations like this easy for developers to implement.

In this case, the Quick Importer tool from the Box API made the entire development process just a few steps:

1. We created a Chrome extension manifest to work with Box.
{
  "name": "Box Uploader",
  ...
  "file_browser_handlers": [
    {
      "id": "upload",
      "default_title": "Save to Gallery",  // What the button will display
      "file_filters": [
      ]
    }
  ],
2. In the Chrome manifest, we specified the relevant file types to which the service applies. In our case, that's most file types, as seen below. Specialized services may just want certain types, such as images for Picasa.
"file_browser_handlers": [
{
"id": "upload",
"default_title": "Save to Box",
"file_filters": [
"filesystem:*.*"
]
}
],
3. With some JavaScript code connecting to the file browser handler, we set up a way to upload files through Box’s Quick Importer.
var fm = new FileManager();
fm.uploadServer = 'https://www.box.net/<...>';

if (bgPage && bgPage.filesToUpload.length) {
  var entry;
  // Pop each queued file entry and upload it through Box's Quick Importer
  while (entry = bgPage.filesToUpload.pop()) {
    entry.file(function(file) {
      fm.uploadFile(file);
    });
  }
}
That's actually all there was to the integration.

Once the file is uploaded to the Box API's Quick Import URL, our page is displayed to authenticate the user, to let the user select a Box folder to save the file, and then to upload the file.


While such an integration can be customized through our API, our Quick Import provided an easy and fast means to connect the platforms. Developers can customize the integration by using direct calls to our API, and take advantage of additional features such as automatic sharing, if they prefer.

Thanks to the simplicity of Chrome's file browser handler and some extra tools in the Box API, our development time was very short (just a weekend), but it could have actually been even quicker. We had a couple of unusual complications that weekend:

1. The Google Chrome team was still experimenting with the file browser, so development from both sides was happening in parallel, which can be a bit tricky. Now that the file browser has been thoroughly tested, you should have an even easier time.

2. I took my girlfriend out a couple times, since her final exams were coming up soon afterward. I love you, Hayley!

Once the content has been uploaded to Box, it’s accessible to many Google services, including Gmail, Google Docs, and Google Calendar, through additional integrations on our site with Google Apps. Ah, the wonders of open platforms.


Jeremy Glassenberg is the Platform Manager at Box, where he oversees partner integrations, API and platform product management, and Box’s community of several thousand developers. In addition to managing Box's developer platform, Jeremy is a part-time blogger at ProgrammableWeb, and a contributor to several open-source projects.

Posted by Scott Knaster, Editor
2013, By: Seo Master

from web contents: New robots.txt feature and REP Meta Tags 2013

salam every one, this is a topic from google web master centrale blog:

We've improved Webmaster Central's robots.txt analysis tool to recognize Sitemap declarations and relative URLs. Earlier versions weren't aware of Sitemaps at all, and understood only absolute URLs; anything else was reported as Syntax not understood. The improved version now tells you whether your Sitemap's URL and scope are valid. You can also test against relative URLs with a lot less typing.

Reporting is better, too. You'll now be told of multiple problems per line if they exist, unlike earlier versions which only reported the first problem encountered. And we've made other general improvements to analysis and validation.

Imagine that you're responsible for the domain www.example.com and you want search engines to index everything on your site, except for your /images folder. You also want to make sure your Sitemap gets noticed, so you save the following as your robots.txt file:

disalow images

user-agent: *
Disallow:

sitemap: http://www.example.com/sitemap.xml

You visit Webmaster Central to test your site against the robots.txt analysis tool using these two test URLs:

http://www.example.com
/archives

Earlier versions of the tool would have reported this:



The improved version tells you more about that robots.txt file:





We also want to make sure you've heard about the new unavailable_after meta tag announced by Dan Crow on the Official Google Blog a few weeks ago. This allows for a more dynamic relationship between your site and Googlebot. Just think, with www.example.com, any time you have a temporarily available news story or limited offer sale or promotion page, you can specify the exact date and time you want specific pages to stop being crawled and indexed.

Let's assume you're running a promotion that expires at the end of 2007. In the headers of page www.example.com/2007promotion.html, you would use the following:

<META NAME="GOOGLEBOT"
CONTENT="unavailable_after: 31-Dec-2007 23:59:59 EST">


The second exciting news: the new X-Robots-Tag directive, which adds Robots Exclusion Protocol (REP) META tag support for non-HTML pages! Finally, you can have the same control over your videos, spreadsheets, and other indexed file types. Using the example above, let's say your promotion page is in PDF format. For www.example.com/2007promotion.pdf, you would use the following:

X-Robots-Tag: unavailable_after: 31 Dec 2007 23:59:59 EST


Remember, REP meta tags can be useful for implementing noarchive, nosnippet, and now unavailable_after tags for page-level instruction, as opposed to robots.txt, which is controlled at the domain root. We get requests from bloggers and webmasters for these features, so enjoy. If you have other suggestions, keep them coming. Any questions? Please ask them in the Webmaster Help Group.

seo Reversing Code Bloat with the JavaScript Knowledge Base 2013

Seo Master present to you:

JavaScript libraries let developers do more with less code. But JavaScript libraries need to work on a variety of browsers, so using them often means shipping even more code. If jQuery has code to support XMLHttpRequest over ActiveX on an older browser like IE6, then you end up shipping that code even if your application doesn't support IE6. Not only that, but you ship that code to the other 90% of newer browsers that don't need it.


This problem is only going to get worse. Browsers are rushing to implement HTML5 and EcmaScript5 features like JSON.parse that used to be provided only in library code, but libraries will likely have to keep that code for years if not decades to support older browsers.


Lots of compilers (including JSMin, Dojo, YUI, Closure, and Caja) remove unnecessary code from JavaScript to make the code you ship smaller. They seem like a natural place to address this problem. Optimization is just taking into account the context the code is going to run in; giving compilers information about browsers will help them avoid shipping code that only supports marginal browsers to modern browsers.

The JavaScript Knowledge Base (JSKB) on browserscope.org seeks to systematically capture this information in a way that compilers can use.

It collects facts about browsers using JavaScript snippets. For example, the JavaScript expression (!!window.JSON && typeof window.JSON.stringify === 'function') is true if JSON is defined. JSKB knows that this is true for Firefox 3.5 but not Netscape 2.0.

Caja Web Tools includes a code optimizer that uses these facts. If it sees code like

if (typeof JSON.stringify !== 'function') { /* lots of code */ }

it knows that the body will never be executed on Firefox 3.5, and can optimize it out. The key here is that the developer writes feature tests, not version tests, and as browsers roll out new features, JSKB captures that information, letting compilers produce smaller code for that browser.


The Caja team just released Caja Web Tools, which already uses JSKB to optimize code. We hope that other JavaScript compilers will adopt these techniques. If you're working on a JavaScript optimizer, take a look at our JSON APIs to get an idea of what the JSKB contains.


If you're writing JavaScript library code or application code then the JSKB documentation can suggest good feature tests. And the examples in the Caja Web Tools testbed are good starting places.


2013, By: Seo Master

from web contents: Handling legitimate cross-domain content duplication 2013

salam every one, this is a topic from google web master centrale blog: Webmaster level: Intermediate

We've recently discussed several ways of handling duplicate content on a single website; today we'll look at ways of handling similar duplication across different websites, across different domains. For some sites, there are legitimate reasons to duplicate content across different websites — for instance, to migrate to a new domain name using a web server that cannot create server-side redirects. To help with issues that arise on such sites, we're announcing our support of the cross-domain rel="canonical" link element.



Ways of handling cross-domain content duplication:
  • Choose your preferred domain
    When confronted with duplicate content, search engines will generally take one version and filter the others out. This can also happen when multiple domain names are involved, so while search engines are generally pretty good at choosing something reasonable, many webmasters prefer to make that decision themselves.
  • Enable crawling and use 301 (permanent) redirects where possible
    Where possible, the most important step is often to use appropriate 301 redirects. These redirects send visitors and search engine crawlers to your preferred domain and make it very clear which URL should be indexed. This is generally the preferred method as it gives clear guidance to everyone who accesses the content. Keep in mind that in order for search engine crawlers to discover these redirects, none of the URLs in the redirect chain can be disallowed via a robots.txt file. Don't forget to handle your www / non-www preference with appropriate redirects and in Webmaster Tools.
  • Use the cross-domain rel="canonical" link element
    There are situations where it's not easily possible to set up redirects. This could be the case when you need to move your website from a server that does not feature server-side redirects. In a situation like this, you can use the rel="canonical" link element across domains to specify the exact URL of whichever domain is preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible (see the sketch just after this list).
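
As a rough sketch with placeholder URLs, the duplicate page on the old domain would include a link element in its head pointing at the preferred URL on the new domain:

<!-- In the <head> of http://www.old-example.com/product.html -->
<link rel="canonical" href="http://www.new-example.com/product.html" />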


Still have questions?

Q: Do the pages have to be identical?
A: No, but they should be similar. Slight differences are fine.

Q: For technical reasons I can't include a 1:1 mapping for the URLs on my sites. Can I just point the rel="canonical" at the homepage of my preferred site?
A: No; this could result in problems. A mapping from old URL to new URL for each URL on the old site is the best way to use rel="canonical".

Q: I'm offering my content / product descriptions for syndication. Do my publishers need to use rel="canonical"?
A: We leave this up to you and your publishers. If the content is similar enough, it might make sense to use rel="canonical", if both parties agree.

Q: My server can't do a 301 (permanent) redirect. Can I use rel="canonical" to move my site?
A: If it's at all possible, you should work with your webhost or web server to do a 301 redirect. Keep in mind that we treat rel="canonical" as a hint, and other search engines may handle it differently. But if a 301 redirect is impossible for some reason, then a rel="canonical" may work for you. For more information, see our guidelines on moving your site.

Q: Should I use a noindex robots meta tag on pages with a rel="canonical" link element?
A: No, since those pages would not be equivalent with regards to indexing - one would be allowed while the other would be blocked. Additionally, it's important that these pages are not disallowed from crawling through a robots.txt file, otherwise search engine crawlers will not be able to discover the rel="canonical" link element.

We hope this makes it easier for you to handle duplicate content in a user-friendly way. Are there still places where you feel that duplicate content is causing your sites problems? Let us know in the Webmaster Help Forum!



from web contents: Duplicate content and multiple site issues 2013

salam every one, this is a topic from google web master centrale blog: Webmaster Level: All

Last month, I gave a talk at the Search Engine Strategies San Jose conference on Duplicate Content and Multiple Site Issues. For those who couldn't make it to the conference or would like a recap, we've reproduced the talk on the Google Webmaster Central YouTube Channel. Below you can see the short video reproduced from the content at SES:



You can view the slides here:




from web contents: For Those Wondering About Public Service Search 2013

salam every one, this is a topic from google web master centrale blog:

Update: The described product or service is no longer available. More information.

We recently learned of a security issue with our Public Service Search service and disabled login functionality temporarily to protect our Public Service Search users while we were working to fix the problem. We are not aware of any malicious exploits of this problem and this service represents an extremely small portion of searches.

We have a temporary fix in place currently that prevents exploitation of this problem and will have a permanent solution in place shortly. Unfortunately, the temporary fix may inconvenience a small number of Public Service Search users in the following ways:

* Public Service Search is currently not open to new signups.
* If you use Public Service Search on your site, you are currently unable to log in to make changes, but rest assured that Public Service Search continues to function properly on your site.
* The template system is currently disabled, so search results will appear in a standard Google search results format, rather than customized to match the look and feel of your site. However, the search results themselves are not being modified.


If you are a Public Service Search user and are having trouble logging in right now, please sit tight. As soon as the permanent solution is in place the service will be back on its feet again. In the meantime, you will still be able to provide site-specific searches on your site as usual.

Google introduced this service several years ago to support universities and non-profit organizations by offering ad-free search capabilities for their sites. Our non-profit and university users are extremely important to us and we apologize for any inconvenience this may cause.

Please post any questions or concerns in our webmaster discussion forum and we'll try our best to answer any questions you may have.

from web contents: Managing your reputation through search results 2013

salam every one, this is a topic from google web master centrale blog: (Cross-posted on the Official Google Blog)

A few years ago I couldn't wait to get married. Because I was in love, yeah; but more importantly, so that I could take my husband's name and people would stop getting that ridiculous picture from college as a top result when they searched for me on Google.

After a few years of working here, though, I've learned that you don't have to change your name just because it brings up some embarrassing search results. Below are some tips for "reputation management": influencing how you're perceived online, and what information is available relating to you.

Think twice

The first step in reputation management is preemptive: Think twice before putting your personal information online. Remember that although something might be appropriate for the context in which you're publishing it, search engines can make it very easy to find that information later, out of context, including by people who don't normally visit the site where you originally posted it. Translation: don't assume that just because your mom doesn't read your blog, she'll never see that post about the new tattoo you're hiding from her.

Tackle it at the source

If something you dislike has already been published, the next step is to try to remove it from the site where it's appearing. Rather than immediately contacting Google, it's important to first remove it from the site where it's being published. Google doesn't own the Internet; our search results simply reflect what's already out there on the web. Whether or not the content appears in Google's search results, people are still going to be able to access it — on the original site, through other search engines, through social networking sites, etc. — if you don't remove it from the original site. You need to tackle this at the source.
  • If the content in question is on a site you own, easy — just remove it. It will naturally drop out of search results after we recrawl the page and discover the change.
  • It's also often easy to remove content from sites you don't own if you put it there, such as photos you've uploaded, or content on your profile page.
  • If you can't remove something yourself, you can contact the site's webmaster and ask them to remove the content or the page in question.
After you or the site's webmaster has removed or edited the page, you can expedite the removal of that content from Google using our URL removal tool.

Proactively publish information

Sometimes, however, you may not be able to get in touch with a site's webmaster, or they may refuse to take down the content in question. For example, if someone posts a negative review of your business on a restaurant review or consumer complaint site, that site might not be willing to remove the review. If you can't get the content removed from the original site, you probably won't be able to completely remove it from Google's search results, either. Instead, you can try to reduce its visibility in the search results by proactively publishing useful, positive information about yourself or your business. If you can get stuff that you want people to see to outperform the stuff you don't want them to see, you'll be able to reduce the amount of harm that that negative or embarrassing content can do to your reputation.

You can publish or encourage positive content in a variety of ways:
  • Create a Google profile. When people search for your name, Google can display a link to your Google profile in our search results and people can click through to see whatever information you choose to publish in your profile.
  • If a customer writes a negative review of your business, you could ask some of your other customers who are happy with your company to give a fuller picture of your business.
  • If a blogger is publishing unflattering photos of you, take some pictures you prefer and publish them in a blog post or two.
  • If a newspaper wrote an article about a court case that put you in a negative light, but which was subsequently ruled in your favor, you can ask them to update the article or publish a follow-up article about your exoneration. (This last one may seem far-fetched, but believe it or not, we've gotten multiple requests from people in this situation.)
Hope these tips have been helpful! Feel free to stop by our Web Search Forum and share your own advice or stories about how you manage your reputation online.


from web contents: More on 404 2013

salam every one, this is a topic from google web master centrale blog: Now that we've bid farewell to soft 404s, in this post for 404 week we'll answer your burning 404 questions.

How do you treat the response code 410 "Gone"?
Just like a 404.

Do you index content or follow links from a page with a 404 response code?
We aim to understand as much as possible about your site and its content. So while we wouldn't want to show a hard 404 to users in search results, we may utilize a 404's content or links if it's detected as a signal to help us better understand your site.

Keep in mind that if you want links crawled or content indexed, it's far more beneficial to include them in a non-404 page.

What about 404s with a 10-second meta refresh?
Yahoo! currently utilizes this method on their 404s. They respond with a 404, but the 404 content also shows:

<meta http-equiv="refresh" content="10;url=http://www.yahoo.com/?xxx">

We feel this technique is fine because it reduces confusion by giving users 10 seconds to make a new selection, only offering the homepage after 10 seconds without the user's input.

Should I 301-redirect misspelled 404s to the correct URL?
Redirecting/301-ing 404s is a good idea when it's helpful to users (i.e. not confusing like soft 404s). For instance, if you notice that the Crawl Errors of Webmaster Tools shows a 404 for a misspelled version of your URL, feel free to 301 the misspelled version of the URL to the correct version.

For example, if we saw this 404 in Crawl Errors:
http://www.google.com/webmsters  <-- typo for "webmasters"

we may first correct the typo if it exists on our own site, then 301 the URL to the correct version (as the broken link may occur elsewhere on the web):
http://www.google.com/webmasters

Have you guys seen any good 404s?
Yes, we have! (Confession: no one asked us this question, but few things are as fun to discuss as response codes. :) We've put together a list of some of our favorite 404 pages. If you have more 404-related questions, let us know, and thanks for joining us for 404 week!
http://www.metrokitchen.com/nice-404-page
"If you're looking for an item that's no longer stocked (as I was), this makes it really easy to find an alternative."
-Riona, domestigeek

http://www.comedycentral.com/another-404
"Blame the robot monkeys"
-Reid, tells really bad jokes

http://www.splicemusic.com/and-another
"Boost your 'Time on site' metrics with a 404 page like this."
-Susan, dabbler in music and Analytics

http://www.treachery.net/wow-more-404s
"It's not reassuring, but it's definitive."
-Jonathan, has trained actual spiders to build websites, ants handle the 404s

http://www.apple.com/iPhone4g
"Good with respect to usability."
http://thcnet.net/lost-in-a-forest
"At least there's a mailbox."
-JohnMu, adventurous

http://lookitsme.co.uk/404
"It's pretty cute. :)"
-Jessica, likes cute things

http://www.orangecoat.com/a-404-page.html
"Flow charts rule."
-Sahala, internet traveller

http://icanhascheezburger.com/iz-404-page
"I can has useful links and even e-mail address for questions! But they could have added 'OH NOES! IZ MISSING PAGE! MAYBE TIPO OR BROKN LINKZ?' so folks'd know what's up."
-Adam, lindy hop geek


from web contents: Webmaster Central YouTube update for June 8th - 12th 2013

salam every one, this is a topic from google web master centrale blog:
Want to know what's new on the Webmaster Central YouTube channel? Here's what we've uploaded in the past week:

Matt Cutts answered a few new questions from the Grab Bag:
Matt also went over a great example of whitehat linkbait:



And if you've ever thought about hiding text, here's one technique that didn't fool Google:



Feel free to leave comments letting us know how you liked the videos, and if you have any specific questions, ask the experts in the Webmaster Help Forum.


seo Open Source Developers @ Google Speaker Series: Raph Levien 2013

Seo Master present to you:

On Monday, June 25th, Raph Levien will join us to present Lessons from Advogato. Raph, Advogato's founder, will give us insights into attack-resistant trust metrics and the other mechanisms used to build the website's user community.

Like all sessions of the Open Source Developers @ Google Speaker Series, Raph's presentation will be open to the public. Doors open at 6:30 PM at our Mountain View campus; guests should plan to sign in at Building 43 reception upon arrival. Refreshments will be served and all are welcome and encouraged to attend. Raph's presentation will also be taped and published along with all of the public Google Tech Talks.

For those of you who were unable to attend our last session, you can watch the video of Bob Lee's recent presentation Java on Guice: Dependency Injection the Java Way.
2013, By: Seo Master

from web contents: Building link-based popularity 2013

salam every one, this is a topic from google web master centrale blog: Late in November we were at SES in Paris, where we had the opportunity to meet some of the most prominent figures in the French SEO and SEM market. One of the issues that came up in sessions and in conversations was a certain confusion about how to most effectively increase the link-based popularity of a website. As a result we thought it might be helpful to clarify how search engines treat link spamming to increase a site´s popularity.

This confusion lies in the common belief that there are two ways for optimizing the link-based popularity of your website: Either the meritocratic and long-term option of developing natural links or the risky and short-term option of non-earned backlinks via link spamming tactics such as buying links. We've always taken a clear stance with respect to manipulating the PageRank algorithm in our Quality Guidelines. Despite these policies, the strategy of participating in link schemes might have previously paid off. But more recently, Google has tremendously refined its link-weighting algorithms. We have more people working on Google's link-weighting for quality control and to correct issues we find. So nowadays, undermining the PageRank algorithm is likely to result in the loss of the ability of link-selling sites to pass on reputation via links to other sites.

Discounting non-earned links by search engines opened a new and wide field of tactics to build link-based popularity: Classically this involves optimizing your content so that thematically-related or trusted websites link to you by choice. A more recent method is link baiting, which typically takes advantage of Web 2.0 social content websites. One example of this new way of generating links is to submit a handcrafted article to a service such as http://digg.com. Another example is to earn a reputation in a certain field by building an authority through services such as http://answers.yahoo.com. Our general advice is: Always focus on the users and not on search engines when developing your optimization strategy. Ask yourself what creates value for your users. Investing in the quality of your content and thereby earning natural backlinks benefits both the users and drives more qualified traffic to your site.

To sum up, even though improved algorithms have promoted a transition away from paid or exchanged links towards earned organic links, there still seems to be some confusion within the market about what the most effective link strategy is. So when taking advice from your SEO consultant, keep in mind that nowadays search engines reward sweat-of-the-brow work on content that bait natural links given by choice.

In French / en Francais

Liens et popularité.
[Translated by] Eric et Adrien, l’équipe de qualité de recherche.

Les 28 et 29 Novembre dernier, nous étions à Paris pour assister à SES. Nous avons eu la chance de rencontrer les acteurs du référencement et du Web marketing en France. L’un des principaux points qui a été abordé au cours de cette conférence, et sur lequel il règne toujours une certaine confusion, concerne l’utilisation des liens dans le but d’augmenter la popularité d’un site. Nous avons pensé qu’il serait utile de clarifier le traitement réservé aux liens spam par les moteurs de recherche.

Cette confusion vient du fait qu’un grand nombre de personnes pensent qu’il existe deux manières d’utiliser les liens pour augmenter la popularité de leurs sites. D’une part, l’option à long terme, basée sur le mérite, qui consiste à développer des liens de manière naturelle. D’autre part, l’option à court terme, plus risquée, qui consiste à obtenir des liens spam, tel les liens achetés. Nous avons toujours eu une position claire concernant les techniques visant à manipuler l’algorithme PageRank dans nos conseils aux webmasters.

Il est vrai que certaines de ces techniques ont pu fonctionner par le passé. Cependant, Google a récemment affiné les algorithmes qui mesurent l’importance des liens. Un plus grand nombre de personnes évaluent aujourd’hui la pertinence de ces liens et corrigent les problèmes éventuels. Désormais, les sites qui tentent de manipuler le Page Rank en vendant des liens peuvent voir leur habilité à transmettre leur popularité diminuer.

Du fait que les moteurs de recherche ne prennent désormais en compte que les liens pertinents, de nouvelles techniques se sont développées pour augmenter la popularité d’un site Web. Il y a d’une part la manière classique, et légitime, qui consiste à optimiser son contenu pour obtenir des liens naturels de la part de sites aux thématiques similaires ou faisant autorité. Une technique plus récente, la pêche aux liens, (en Anglais « link baiting »), consiste à utiliser à son profit certains sites Web 2.0 dont les contenus sont générés par les utilisateurs. Un exemple classique étant de soumettre un article soigneusement prépare à un site comme http://digg.com. Un autre exemple consiste à acquérir un statut d’expert concernant un sujet précis, sur un site comme http://answers.yahoo.com. Notre conseil est simple : lorsque vous développez votre stratégie d’optimisation, pensez en premier lieu à vos utilisateurs plutôt qu’aux moteurs de recherche. Demandez-vous quelle est la valeur ajoutée de votre contenu pour vos utilisateurs. De cette manière, tout le monde y gagne : investir dans la qualité de votre contenu bénéficie à vos utilisateurs, cela vous permet aussi d’augmenter le nombre et la qualité des liens naturels qui pointent vers votre site, et donc, de mieux cibler vos visiteurs.

En conclusion, bien que les algorithmes récents aient mis un frein aux techniques d’échanges et d’achats de liens au profit des liens naturels, il semble toujours régner une certaine confusion sur la stratégie à adopter. Gardez donc à l’esprit, lorsque vous demandez conseil à votre expert en référencement, que les moteurs de recherche récompensent aujourd’hui le travail apporté au contenu qui attire des liens naturels.

seo Fridaygram: Humanitarian hacking, Shenzhou 9, robot seeks Martians 2013

Seo Master present to you: Author PictureBy Ashleigh Rentz, Google Developers Blog Editor Emerita

The Googleplex is really buzzing this week with people furiously preparing for Google I/O!  As with any conference, there are physical limits on how many people can participate, so we’re striving to make the Google I/O Extended events around the world more interactive than just simple viewing parties.  Yesterday, we got to share the details of the Develop For Good hackathon contest sponsored by Google.org in conjunction with Google I/O Extended.

But this certainly isn’t the first time Google.org has engaged with developers to help make our planet a better place.  In fact, many Googlers recently participated in Random Hacks of Kindness Global, a twice-annual event where developers in 21+ cities around the world spark new ideas for making the world a better place through innovation and technology.  Among the many projects, one team in San Francisco worked on an algorithm for scanning textbooks and processing mathematical formulae in an accessible way for users with vision impairment.  Take some time this weekend to read the recaps and get inspired!

The Chinese space program may also make this an inspiring weekend when they attempt the country’s first manned docking mission, designated Shenzhou 9, and take off for the Tiangong 1 space laboratory on Saturday.  The attempt becomes even more inspiring since the three-person crew will include the first female Chinese astronaut.  We wish them godspeed.


Meanwhile, scientists at NASA in the United States are awaiting the arrival of a new Mars rover which will search for signs of life in a new way.  The rover, named Curiosity, is scheduled to land on the red planet in August, well ahead of any humans who might one day be en route.  Who can imagine what it might find there?



Even when Scott takes a well-deserved break, we bring you Fridaygram: a few items of Google and non-Google geekery to enjoy during the weekend. Ashleigh was our previous blog editor and now works behind the scenes on the Google Developers website. I write about space now; space is cool.
2013, By: Seo Master

from web contents: Video Tutorial: Google for Webmasters 2013

salam every one, this is a topic from google web master centrale blog:
We're always looking for new ways to help educate our fellow webmasters. While you may already be familiar with Webmaster Tools, Webmaster Help Discussion Groups, this blog, and our Help Center, we've added another tutorial to help you understand how Google works. Hence we've made this video of a soon-to-come presentation titled "Google for Webmasters." This video will introduce how Google discovers, crawls, indexes your site's pages, and how Google displays them in search results. It also touches lightly upon challenges webmasters and search engines face, such as duplicate content, and the effective indexing of Flash and AJAX content. Lastly, it also talks about the benefits of offerings Webmaster Central and other useful Google products.


Take a look for yourself.

Discoverability:



Accessibility - Crawling and Indexing:


Ranking:


Webmaster Central Overview:


Other Resources:



Google Presentations Version:
http://docs.google.com/Presentation?id=dc5x7mrn_245gf8kjwfx

Important links from this presentation as they chronologically appear in the video:
Add your URL to Google
Help Center: Sitemaps
Sitemaps.org
Robots.txt
Meta tags
Best uses of Flash
Best uses of Ajax
Duplicate content
Google's Technology
Google's History
PigeonRank
Help Center: Link Schemes
Help Center: Cloaking
Webmaster Guidelines
Webmaster Central
Google Analytics
Google Website Optimizer
Google Trends
Google Reader
Google Alerts
More Google Products


Special thanks to Wysz, Chark, and Alissa for the voices.


seo How to Get Adsense Account Approved for Any Domain 2013

Seo Master present to you:
How to get an AdSense account approved for Blogspot, WordPress, Webs, Weebly, and sites on paid hosting.

I know many of you want to earn good money from the Internet, especially with Google AdSense. Google AdSense is the best-known money-making program, and most bloggers and websites prefer it. But it's difficult to get into the AdSense program these days.
Below is the typical AdSense email that most of us get when we aren't eligible for the AdSense program.


Hello,

Thank you for your interest in Google AdSense. Unfortunately, after reviewing your application, we’re unable to accept you into AdSense at this time.

We did not approve your application for the reasons listed below.

Issues:
- Copyrighted material
- Insufficient content
- Site does not comply with Google policies

Regards,
The Google AdSense Team


But I will tell you how to get AdSense approved for your website or Blogspot blog quite easily. This guide is based entirely on my own experience: I have used this method of getting AdSense applications approved many times for my friends, and it has worked every time.

1. Register a New Email Account:


You will need a new email account; it can be with any provider. I registered my AdSense account with a Hotmail address and later changed it to Gmail, because Gmail works better with AdSense.
1. Go to https://accounts.google.com/SignUp
2. Now fill in your correct personal details.

2. Custom Domain Name:


Many of us focus too much on the domain name, but in my view it is not critical: blogspot.com and other free hosted domains can still be used. That said, if your website is on its own .com domain, that counts in your favor.

3. Site Design: 


Your website should have a professional design that will convince AdSense advertisers to publish ads on your site and will help your blog get approved by AdSense.

4. Content and Number of Posts:


You need 30 to 50 posts of high-quality content. Write useful articles of more than 500 words each, and make sure your content is unique before you apply for the AdSense program. If your content is copied from other blogs, don't even imagine getting approved. Your blog shouldn't contain any copyrighted material, and you can't provide links for downloading movies, eBooks, or anything of that sort. Your blog should also contain safe content that doesn't harm society, or in simple terms, is family-friendly. You also can't apply if you write in a regional language that AdSense doesn't support; there are only a few languages that Google supports for the AdSense program. You can read more in Google's AdSense program policies.

5. Traffic:


Your website must have a good amount of traffic to get an AdSense account approved.

  • Minimum traffic of 1,000 page views per day.
  • At least 300 unique visitors, with the majority coming from search engines, especially Google.
  • Traffic from the United States, United Kingdom, Australia, Germany, and Canada is a plus point.
If you don't have any traffic from search engines, then drop the idea of applying for AdSense.

6. Connect Your Site To Social Media:

It is very important that your blog is connected with social media networks, because social media sites play a big role in improving your blog's visibility, traffic, and ranking. The best social media sites for driving traffic to your latest blog posts are:
  1. Facebook
  2. Google+
  3. Twitter
  4. LinkedIn
  5. Stumbleupon
  6. Youtube
  7. Pinterest

7. Create Blog Pages:


Static pages let you create dedicated pages such as About Me, Contact, and Privacy Policy that are linked from your site. Static pages are basically the same as post pages, but a few things set them apart. According to the AdSense terms and conditions, every blog must have a working Contact Us page and a Privacy Policy page to get AdSense approved.
Create some pages for your website:
1. About  2. Contact  3. Privacy Policy  4. FAQ  5. Advertise, and create other pages according to your blog's needs.

Note: Sometimes Google AdSense rejects the application; the key is to resubmit your application.

I hope that once you complete all the instructions above, you will get your AdSense account approved.
2013, By: Seo Master