News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development

from web contents: Google Friend Connect - Now in 47 new languages 2013


Update: The described product or service is no longer available.


Have you been holding off on using Google Friend Connect because your site isn't in English? Starting today, Friend Connect is available in 47 new languages, including French, Italian, German, Spanish, Chinese, Japanese, Hindi and Portuguese. Now you can easily add social features that match the language of other content on your site.

Most Google-created gadgets (such as the members, comments, and recommendation gadgets) are now available in these new languages. Some developers have also created gadgets that support additional languages and we hope that there will be more to come in the future. To see a list of gadgets available in your language, visit the gadget gallery.

When you add Friend Connect to a new site, it will default to your primary language. But if your site is in another language, simply select it on the site settings tab and Friend Connect will automatically render the gadgets in that language. And if you have multiple sites in different languages, you can select a different language for each of your sites.

To learn more or see the full list of languages, check out the Social Web Blog.


from web contents: Better targeting your indic language site 2013

A lot has been said about how to start a multi-lingual site and how to better target content through meta tags. Our users have raised a number of interesting questions about creating websites in different languages, like the one below.

ganex:
> How does one do for INDIA.
> As there are many languages spoken here.
> My Site is primarily in English, but my site targets different cities in INDIA.
> For Hyderabad - I want in Urdu & Telugu and for Chennai I want in Tamil
> for Bengaluru I want in Kannada.
> For North I want in Hindi.

We’d like to introduce the transliteration API for Indic languages (languages spoken in India) in addition to our AJAX API for languages. With this API at your disposal, content creation is simplified because it not only helps you integrate transliteration into your website but also allows users visiting your site to type in Indic languages.

To include the transliteration API, first you need the AJAX API loader script:

<script type="text/javascript" src="http://www.google.com/jsapi"></>

This script tag loads the google.load function, which lets you load individual Google APIs. To load the Google Transliteration API, the call to google.load looks like this:

<script type="text/javascript">
google.load("elements", "1", {
packages: "transliteration"
});
</script>
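
Once the package loads, you attach transliteration to specific input fields on the page. The snippet below is a minimal sketch based on how the Transliteration API was typically wired up at the time; the element id "transl_text" and the choice of Hindi as the destination language are placeholders for illustration, not part of the original post.

<script type="text/javascript">
  // Runs after google.load("elements", ...) has finished loading the package.
  google.setOnLoadCallback(function() {
    var options = {
      // Users type in Latin script and get suggestions in the chosen Indic script.
      sourceLanguage: google.elements.transliteration.LanguageCode.ENGLISH,
      destinationLanguage: [google.elements.transliteration.LanguageCode.HINDI],
      shortcutKey: 'ctrl+g',          // keyboard shortcut to toggle transliteration
      transliterationEnabled: true
    };
    var control = new google.elements.transliteration.TransliterationControl(options);
    // "transl_text" is a hypothetical id of a <textarea> on your page.
    control.makeTransliteratable(['transl_text']);
  });
</script>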


When it comes to targeting, don't forget to add meta tags in your local language. And for your questions, we have a new addition to our already existing communication channels like the webmaster help groups and webmaster tools (available in 26 languages!). We also have our own official Orkut webmaster community! Here users can share thoughts and discuss webmaster related issues.

Sign up for our Orkut community now and if you have any additional thoughts we'd love to hear about them.

Cheers,

seo How to Easily Create a Blog on Blogger 2013



Creating a blog on Blogger is very easy. The full name for a blog is "weblog", which means a website that lets its user easily write or post about all sorts of things as they wish, and that visitors can comment on.

In general, there are two blogging platforms that are by far the most popular: Blogger and WordPress. But in this post we...

from web contents: Webmaster Central YouTube update for July 6th - 10th 2013

Want to see what's new on the Webmaster Central YouTube channel? Check out the answers to the latest Grab Bag questions:
Below is Matt's clarification about Google's use of the meta description tag:


Feel free to leave comments letting us know how you liked the videos, and if you have any specific questions, ask the experts in the Webmaster Help Forum.


from web contents: Better recipes on the web: Introducing recipe rich snippets 2013

Webmaster Level: All

Anticipating the start of the season of barbecues and potlucks, we’ve added recipes as our newest rich snippets format. This means that for certain sites with recipe content, Google users will see quick facts when these recipe pages show up as part of the search results.

For example, if you search for an easy-to-make Thai mango salad, you can now see user ratings, preparation time, and a picture of the dish directly in the search result snippets.


Recipes is the fifth format we support, following the introduction of reviews, people, video and, most recently, events.

If you have recipe content on your site, you can get started now by marking up your recipes with microdata, RDFa, or the hRecipe microformat. To learn more, read our documentation on how to mark up recipe information or our general help articles on rich snippets for a more complete overview.
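
As a rough illustration of what such markup can look like, here is a small hRecipe-style example; the class names follow the hRecipe microformat, while the dish, image path, and timings are placeholders rather than anything from the original post, so check the rich snippets documentation for the exact properties you need.

<div class="hrecipe">
  <h2 class="fn">Thai Mango Salad</h2>
  <img class="photo" src="mango-salad.jpg" alt="Thai mango salad">
  <p class="summary">A quick, tangy salad of green mango, lime, and herbs.</p>
  <span class="duration">Ready in 15 minutes</span>
  <ul>
    <li class="ingredient">2 green mangoes, julienned</li>
    <li class="ingredient">Juice of 1 lime</li>
  </ul>
  <p class="instructions">Toss everything together and serve chilled.</p>
</div>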

Please remember that to ensure a great user experience we’re taking a gradual approach to surfacing rich snippets. This means that we can’t guarantee that marking up your site will result in a rich snippet when your page shows up in our search results. However, we encourage you to get started, and once you’re done you can test your pages with our rich snippets testing tool.


seo Removing the Attribution from an SEO-Friendly Blog Template? 2013




Removing the Attribution from an SEO-Friendly Blog Template - In fact, every blogger or webmaster really wants to get backlinks, and that is also why many of them build templates or blogs just to earn backlinks....




But for the past month several commenters have been asking how to remove the attribution from our blog. This is actually hard to answer, because clearly making...

seo How to Find Vulnerabilities with web vulnerability scanner tools 2013


How to Find Vulnerabilities with web vulnerability scanner tools

Website security is a big problem these days. Many security researchers find vulnerabilities and receive gifts, hall-of-fame listings, acknowledgments, or bounties for reporting them. In the same way, black-hat hackers use those bugs to exploit websites (hacking the site and getting easy access to secret data such as credit card numbers, important files, and email).


[Image: top OWASP web vulnerabilities, 2013]


If you run a website and care about your Google PageRank or Alexa rank, or you are part of an organization, then website security is very important for you; and if you are a security researcher, this tutorial is important for you too. I have already explained in my previous tutorials how to find vulnerabilities in a website manually. So today I am going to cover how to find vulnerabilities with different website scanner tools. Let's start.

Common website vulnerabilities:


There are many kinds of security flaws in a website, but the most common vulnerabilities these days are listed below:

  • XSS (cross-site scripting)
  • SQL injection
  • Remote file inclusion (RFI)
  • Local file inclusion (LFI)
  • CSRF (cross-site request forgery)
  • Remote code execution
  • Full path disclosure
  • ...and many other bugs

List of web scanner software


There are lots of tools available on the internet for finding different types of vulnerabilities. A few that are good for newcomers are covered below.

Netsparker website security scanner:

Netsparker is a commercial tool and one of my favorites; it is also good for beginners. It is designed to find many different types of vulnerabilities, such as cross-site scripting (XSS), SQL injection, LFI, RFI, RCE, and many others, so use this tool and hopefully you will get good results.


Acunetix web application security tool:


Acunetix is another of my favorite tools for finding different types of vulnerabilities. It automatically scans a whole website for XSS, SQL injection, LFI, RFI, and other security flaws.

  • OWASP Zed Attack Proxy (ZAP)
  • w3af
  • Nikto
  • Websecurify

There are many other web vulnerability scanner tools as well. If you have any problem with this tutorial, leave a comment below.
 

from web contents: Expanding the webmaster central team 2013

You've probably already figured this out if you use webmaster tools, the webmaster help center, or our webmaster discussion forum, but the webmaster central team is a fantastic group of people. You have seen some of them helping out in the discussion forums, and you may have met a few more at conferences, but there are lots of others behind the scenes who you don't see, working on expanding webmaster tools, writing content, and generally doing all they can for you, the webmaster. Even the team members you don't see are paying close attention to your feedback: reading our discussion forum, as well as blogs and message boards. We introduced you to a few of the team before SES NY and Danny Sullivan told you about a few Googler alternatives before SES Chicago. We also have several interns working with us right now, including Marcel, who seems to have been the hit of the party at SMX Advanced.

I am truly pleased to welcome a new addition to the team, although she'll be a familiar face to many of you already. Susan Moskwa is joining Jonathan Simon as a webmaster trends analyst! She's already started posting on the forums and is doing lots of work behind the scenes. Jonathan does a wonderful job answering your questions and investigating issues that come up and he and Susan will make a great team. Susan is a bit of a linguistic genius, so she'll also be helping out in some of the international forums, where Dublin Googlers have started reading and replying to your questions. Want to know more about Susan? You just never know what you find when you do a Google search.

from web contents: Las Vegas Pubcon 2006 2013

As if working at Google isn't already a party, today I'm traveling to Las Vegas for WebmasterWorld PubCon 2006! But instead of talking bets and odds, I'll be talking about how Google can help webmasters improve their sites. I love chatting with webmasters about all the work that goes into creating a great website. Several other Googlers will be there too, so if you have a burning question or just wanna talk about random stuff feel free to stop us and say hi. Besides the sessions, we'll be at the Google booth on Wednesday and Thursday, so come by and introduce yourself.

Here's the list of Google events at PubCon:

Tuesday 14

10:15 - 11:30 SEO and Big Search Adam Lasnik, Search Evangelist

1:30 - 2:45 PPC Search Advertising Programs Frederick Vallaeys, Senior Product Specialist, AdWords

2:45 - 4:00 PPC Tracking and Reconciliation Brett Crosby, Senior Manager, Google Analytics

Wednesday 15

10:15 - 11:30 Contextual Advertising Optimization Tom Pickett, Online Sales and Operations

11:35 - 12:50 Site Structure for Crawlability Vanessa Fox, Product Manager, Google Webmaster Central

1:30 - 3:10 Duplicate Content Issues Vanessa Fox, Product Manager, Google Webmaster Central

5:30 - 7:30 Safe Bets From Google Cocktail party!

Thursday 16

11:35 - 12:50 Spider and DOS Defense Vanessa Fox, Product Manager, Google Webmaster Central

1:30 - 3:10 Interactive Site Reviews Matt Cutts, Software Engineer

3:30 - 5:00 Super Session Matt Cutts, Software Engineer

You can view this schedule on Google Calendar here:

Come to "Safe Bets From Google" on Wednesday 5:30-7:30pm -- it's a cocktail party where you can mingle with other webmasters and Googlers, learn about other Google products for webmasters, and in typical Google style enjoy some great food and drinks. I'll be there with some other engineers from our Seattle office. Don't miss it!this is a topic published in 2013... to get contents for your blog or your forum, just contact me at: devnasser@gmail.com

from web contents: Help Google index your mobile site 2013

(This post was largely translated from our Japanese Webmaster Central Blog.)

It seems the world is going mobile, with many people using mobile phones on a daily basis, and a large user base searching on Google’s mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn't easy. Mobile sites not only use a different format from a normal desktop site, but the management methods and expertise required are also quite different. This results in a variety of new challenges. As a mobile search engineer, it's clear to me that while many mobile sites were designed with mobile viewing in mind, they weren’t designed to be search friendly. I'd like to help ensure that your mobile site is also available for users of mobile search.

Here are troubleshooting tips to help ensure that your site is properly crawled and indexed:

Verify that your mobile site is indexed by Google

If your web site doesn't show up in the results of a Google mobile search even using the 'site:' operator, it may be that your site has one or both of the following issues:
Googlebot may not be able to find your site
Googlebot, our crawler, must crawl your site before it can be included in our search index. If you just created the site, we may not yet be aware of it. If that's the case, create a Mobile Sitemap and submit it to Google to inform us of the site’s existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, in the same way as with a standard Sitemap.
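For reference, a Mobile Sitemap is an ordinary Sitemap with an extra mobile namespace and an empty <mobile:mobile/> element inside each URL entry; the URL below is a placeholder.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://mobile.example.com/article100.html</loc>
    <mobile:mobile/>
  </url>
</urlset>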
Googlebot may not be able to access your site
Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site, and therefore making the site unsearchable. Our crawler for mobile sites is "Googlebot-Mobile". If you'd like your site crawled, please allow any User-agent including "Googlebot-Mobile" to access your site. You should also be aware that Google may change its User-agent information at any time without notice, so it is not recommended that you check if the User-agent exactly matches "Googlebot-Mobile" (which is the string used at present). Instead, check whether the User-agent header contains the string "Googlebot-Mobile". You can also use DNS Lookups to verify Googlebot.
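
As a minimal sketch of the substring check described above (not code from the original post), a server-side handler written in JavaScript might decide whether a request comes from Google's mobile crawler like this:

// Returns true for any User-Agent that contains "Googlebot-Mobile",
// rather than requiring an exact match against the full string.
function isGooglebotMobile(userAgent) {
  return typeof userAgent === 'string' &&
         userAgent.indexOf('Googlebot-Mobile') !== -1;
}

// Hypothetical usage: let the crawler through your mobile-only gate.
// if (isGooglebotMobile(request.headers['user-agent'])) { /* serve the page */ }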

Verify that Google can recognize your mobile URLs

Once Googlebot-Mobile crawls your URLs, we then check for whether the URL is viewable on a mobile device. Pages we determine aren't viewable on a mobile phone won't be included in our mobile site index (although they may be included in the regular web index). This determination is based on a variety of factors, one of which is the "DTD (Doc Type Definition)" declaration. Check that your mobile-friendly URLs' DTD declaration is in an appropriate mobile format such as XHTML Mobile or Compact HTML. If it's in a compatible format, the page is eligible for the mobile search index. For more information, see the Mobile Webmaster Guidelines.
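
For example, a page served as XHTML Mobile 1.0 would normally begin with a DTD declaration along these lines:

<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
    "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">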

If you have any questions regarding mobile sites, post them to our Webmaster Help Forum, where webmasters around the world, as well as the Google team, are happy to help you with your problem.


from web contents: New warnings feedback 2013


Given helpful suggestions from our discussion group, we've improved feedback for sitemaps in Webmaster Tools. Now, minor problems in a sitemap will be reported as "warnings," and will appear instead of, or in addition to, more serious "errors." (Previously all problems were listed as errors.) Warnings allow us to provide feedback on portions of your sitemap that may be confusing or inaccurate, while saving the real "error" alarm for problems that make your sitemap completely unreadable. We hope the additional information makes it even easier to share your sitemaps with Google.

The new set of warnings includes many problems that we had previously classified as errors, including the "incorrect namespace" and "invalid date" examples shown in the screenshot above. We also crawl a sample of the URLs listed in your sitemap and report warnings if the Googlebot runs into any trouble with them. These warnings might suggest a widespread problem with your site that warrants further investigation, such as a stale sitemap or a misconfigured robots.txt file.
Please let us know how you like this new feedback. Tell us what you think via the comments below, or in the discussion group. We also appreciate suggestions for additional warnings that you would find useful.

from web contents: Webmaster Tools API updated with Site Settings 2013

The Webmaster Tools GData API has been updated to allow you to get even more out of Webmaster Tools, such as setting a geographic location or your preferred domain. For those of you that aren't familiar with GData, it's a protocol for reading and writing data on the web. GData makes it very easy to communicate with many Google services, like Webmaster Tools. The Webmaster Tools GData API already allows you to add and verify sites for your account and to submit Sitemaps programmatically. Now you can also access and update site-specific information. This is especially useful if you have a large number of sites. With the Webmaster Tools API, you can perform hundreds of operations in the time that it would take to add and verify a single site through the web interface.
What can I do?
We've included four new features in the API. You can see and update these settings for each site that you have verified. The features are:
  • Crawl Rate: You can request that Googlebot crawl your site slower or faster than it normally would (the details can be found in our Help Center article about crawl rate control). If many of your sites are hosted on the same server and you know your server's capacity, you may want to update all sites at the same time. This is now a trivial task using the Webmaster Tools GData API.
  • Geographic Location: If your site is targeted towards a particular geographic location but your domain doesn't reflect that (for example with a .com domain), you can provide information to help us determine where your target users are located.
  • Preferred Domain: You can select which is the canonical domain to use to index your pages. For example, if you have a site like www.example.com, you can set either example.com or www.example.com as the preferred domain to use. This avoids the risk of treating both sites differently.
  • Enhanced Image Search: Tools like the Google Image Labeler allow users to tag images in order to improve Image Search results. Now you can opt in or out for all your sites in a breeze using the Webmaster Tools API.
How do I do it?
We provide you with Java code samples for all the current Webmaster Tools API functionality. Here's a sample snippet of code that takes a list of sites and updates the geographic location of all of them:

  // Authenticate against the Webmaster Tools service
  WebmasterToolsService service;
  try {
    service = new WebmasterToolsService("exampleCo-exampleApp-1");
    service.setUserCredentials(USERNAME, PASSWORD);
  } catch (AuthenticationException e) {
    System.out.println("Error while authenticating.");
    return;
  }

  // Read sites and geolocations from your database
  readSitesAndGeolocations(sitesList, geolocationsList);

  // Update all sites
  Iterator<String> sites = sitesList.iterator();
  Iterator<String> geolocations = geolocationsList.iterator();
  while (sites.hasNext() && geolocations.hasNext()) {
    // Create a blank entry and add the updated information
    SitesEntry updateEntry = new SitesEntry();
    updateEntry.setGeolocation(geolocations.next());

    // Get the URL to update the site
    String encodedSiteId = URLEncoder.encode(sites.next(),
        "UTF-8");
    URL siteUrl = new URL(
        "http://www.google.com/webmasters/tools/feeds/sites/"
        + encodedSiteId);

    // Update the site
    service.update(siteUrl, updateEntry);
  }

Where do I get it?
The main page for the Webmaster Tools GData API explains all the details of the API. It has a detailed reference guide and also many code snippets that explain how to use the Java client library, which is available for download. You can find more details about GData and all the different Google APIs in the Google Data API homepage.


from web contents: Webmaster Tools shows Crawl error sources 2013

Ever since we released the crawl errors feature in Webmaster Tools, webmasters have asked for the sources of the URLs causing the errors. Well, we're listening! We know it was difficult for those of you who wanted to identify the cause of a particular "Not found" error, in order to prevent it in the future or even to request a correction, without knowing the source URL. Now, Crawl error sources makes the process of tracking down the causes of "Not found" errors a piece of cake. This helps you improve the user experience on your site and gives you a jump start for links week (check out our updated post on "Good times with inbound links" to get the scoop).

In our "Not Found" and "Errors for URLs in Sitemaps" reports, we've added the "Linked From" column. For every error in these reports, the "Linked From" column now lists the number of pages that link to a specific "Not found" URL.



Clicking on an item in the "Linked From" column opens a separate dialog box which lists each page that linked to this URL along with the date it was discovered. The source URL for the 404 can be within or external to your site.





For those of you who just want the data, we've also added the ability to download all your crawl error sources at once. Just click the "Download all sources of errors on this site" link to download all your site's crawl error sources.



Again, if we report crawl errors for your website, you can use crawl error sources to quickly determine if the cause is from your site or someone else's. You'll have the information you need to contact them to get it fixed, and if needed, you can still put in place redirects on your own site to the appropriate URL. Just sign in to Webmaster Tools and check it out for your verified site. You can help people visiting your site—from anywhere on the web—find what they're looking for.


from web contents: New: Content analysis and Sitemap details, plus more languages 2013


We're always striving to help webmasters build outstanding websites, and in our latest release we have two new features: Content analysis and Sitemap details. We hope these features help you to build a site you could compare to a fine wine -- getting better and better over time.

Content analysis

To help you improve the quality of your site, our new content analysis feature should be a helpful addition to the crawl error diagnostics already provided in Webmaster Tools. Content analysis contains feedback about issues that may impact the user experience or that may make it difficult for Google to crawl and index pages on your site. By reviewing the areas we've highlighted, you can help eliminate potential issues that could affect your site's ability to be crawled and indexed. This results in better indexing of your site by Google and other search engines.

The Content analysis summary page within the Diagnostics section of Webmaster Tools features three main categories. Click on a particular issue type for more details:

  • Title tag issues
  • Meta description issues
  • Non-indexable content issues

content analysis usability section

Selecting "Duplicate title tags" displays a list of repeated page titles along with a count of how many pages contain that title. We currently present up to thirty duplicated page titles on the details page. If the duplicate title issues shown are corrected, we'll update the list to reflect any other pages that share duplicate titles the next time your website is crawled.

Also, in the Title tag issues category, we show "Long title tags" and "Short title tags." For these issue types we will identify title tags that are way too short (for example "IT" isn't generally a good title tag) or way too long (title tag was never intended to mean <insert epic novel here>). A similar algorithm identifies potentially problematic meta description tags. While these pointers won't directly help you rank better (i.e. pages with <title> length x aren't moved to the top of the search results), they may help your site display better titles and snippets in search results, and this can increase visitor traffic.
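
To make that concrete, here is a hypothetical before-and-after; the exact length that works best varies, but the point is a unique, descriptive title on every page.

<!-- Too short and generic: says almost nothing about the page -->
<title>IT</title>

<!-- More descriptive: unique to this page and readable as a result title -->
<title>IT Support Plans for Small Businesses | Example Co.</title>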

In the "Non-indexable content issues," we give you a heads-up of areas that aren't as friendly to our more text-based crawler. And be sure to check out our posts on Flash and images to learn how to make these items more search-engine friendly.


content analysis crawlability section


Sitemap details page

If you've submitted a Sitemap, you'll be happy when you see the additional information in Webmaster Tools revealing how your Sitemap was processed. You can find this information on the newly available Sitemap Details page which (along with information that was previously provided for each of your Sitemaps) shows you the number of the pages from your Sitemap that were indexed. Keep in mind the number of pages indexed from your Sitemap may not be 100% accurate because the indexed number is updated periodically, but it's more accurate than running a "site:example.com" query on Google.

The new Sitemap Details page also lists any errors or warnings that were encountered when specific pages from your Sitemap were crawled. So the time you might have previously spent on crafting custom Google queries to determine how many pages from your Sitemap were indexed, can now be spent on improving your site. If your site is already the crème de la crème, you might prefer to spend the extra free time mastering your ice-carving skills or blending the perfect eggnog.

Here's a view of the new Sitemap details page:


Sitemaps are an excellent way to tell Google about your site's most important pages, especially if you have new or updated content that we may not know about. If you haven't yet submitted a Sitemap or have questions about the process, visit our Webmaster Help Center to learn more.
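
If you haven't built one before, a minimal Sitemap is just an XML file listing your URLs; the URL and values below are placeholders, and lastmod, changefreq, and priority are optional.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>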

Webmaster Tools now available in Czech & Hungarian

We love expanding our product to help more people and in their language of choice. We recently put in effort to expand the number of Webmaster Tools available languages to Czech and Hungarian, in addition to the 20 other languages we already support. We won't be stopping here. Our desire to support even more languages in the future means that if your language of choice isn't currently supported, stay tuned -- there'll be even more supported languages to come.

We always love to hear what you think. Please visit our Webmaster Help Group to share comments or ask questions.

seo We'd love to hear from you 2013


Thanks to everyone who has sent in feedback about Google Developer Day. It really makes a difference. For instance, some of you pointed out that the U.S. session videos looked blurry, so we reencoded them at a higher bit rate. Others have asked for more code samples from the sessions. We're working on that too, so stay tuned.

Keep it coming! We're looking for feedback not just on GDD but on our developer program as a whole. We're always trying to make our APIs better. Sure, we have ideas about what to do next, but who better to tell us where to focus than you? Please take our survey before Wednesday, June 18. We promise to read all of your comments!

from web contents: Malware reviews via Webmaster Tools 2013


In the past year, the number of sites affected by malware/badware grew from a handful a week to thousands per week. We noted your suggestions to improve communication for webmasters of affected sites -- suggestions mentioned in our earlier blog post "About badware warnings" as well as the stopbadware discussion group. Now, Webmaster Tools provides malware reviews.

If you find that your site is affected by malware, either through malware-labeled search results or in the summary for your site in Webmaster Tools, we've streamlined the process to review your site and return it malware-label-free in our search results:
  1. View a sample of the dangerous URLs on your site in Webmaster Tools.
  2. Make any necessary changes to your site according to StopBadware.org's Security tips.
  3. New: Request a malware review from Google and we'll evaluate your site.
  4. New: Check the status of your review.
    • If we feel the site is still harmful, we'll provide an updated list of remaining dangerous URLs
    • If we've determined the site to be clean, you can expect removal of malware messages in the near future (usually within 24 hours).


We encourage all webmasters to become familiar with StopBadware's malware prevention tips. If you have additional questions, please review our documentation or post to the discussion group. We hope you find this new feature in Webmaster Tools useful in discovering and fixing any malware-related problems, and thanks for your diligence in staying aware of malware and preventing it.

seo 3 Options For Beginners To Design a Website 2013

Ten years ago designing your own website wasn't easy. You needed to know a lot of technical stuff about Web programming and Web design. But times change and in the world of Web design they change for the better. True, it's still good to know things like HTML, CSS, php and other Web programming languages and techniques. But even if you don't, you can still create a professionally-looking website. Here are three options that are best for beginners.


1. Use Weebly
2. Use Moonfruit
3. Use WordPress




If you know nothing about Web design but would still like to have your own website, try Weebly. Weebly is a San-Francisco based company that offers an awesome service for less techy users. With the help of this service you can quickly create a professional website for free. All you need to pay for is a domain name (best to get a domain name through Namecheap - you get great value for money). You can also publish a website using a Weebly subdomain without paying a cent (it will look like yoursite.weebly.com), but that's not the preferable option. A website using a proper domain is always best.

So, what does Weebly have to offer? Well, a lot. You can create webpages and a blog, you can add images, videos, upload files, and even create slideshows. If you are photographer, Weebly is great for portfolios. If you just want a personal website, Weebly is great and easy to use. If you know CSS and HTML, Weebly themes are editable and you can configure your website in any way you want. Best of all, it's free. You don't even have to pay for hosting. Just a note - there is a Pro plan, but it's not really necessary as you probably won't need the pro features they offer.






Moonfruit is another service that lets you build a professional website for free. They offer a number of different themes, designs and options. Moonfruit is ideal for portfolios and online stores, as you get a very vibrant design and some great e-commerce options for free. You can also start a blog and share it with the world using the integrated sharing features. Best of all, Moonfruit sites are mobile-friendly, which is great in the world of smartphones and tablets. In addition to the somewhat limited free plan (1 website, 15 pages, 20MB of storage), Moonfruit offers advanced paid plans. You can check them out here.
However, Moonfruit has one disadvantage - they build websites using Flash. This technology is OK for photography portfolios, but overall it's less SEO-friendly than conventional website building. Some people will disagree, but that's my experience with Flash. You just need to spend more time optimizing your site. In any case, Moonfruit websites look awesome and you can do a lot for free - just have a look at their website gallery.

WordPress is by far the best way to design a website. All right, you need more technical knowledge, but the results can satisfy even the choosiest of you. Don't forget that WordPress is used by thousands of Web design professionals all over the world and is the CMS of choice for thousands of professional and popular websites.

Using WordPress.org (not WordPress.com) requires you to purchase a domain name and a hosting plan. This is a small investment compared to what you get in return. You get a free website and blogging platform, you can choose from a massive number of free themes (there are paid themes too), WordPress is very search engine friendly and there is no end of customization to be done. With WordPress you can create virtually any website - just select the theme that suits your needs.

Well, those are the three most beginner-friendly options. I hope this post answers the question.


from web contents: Update on Public Service Search 2013

Public Service Search is a service that enables non-profit, university, and government web sites to provide search functionality to their visitors without serving ads. While we've stopped accepting new Public Service Search accounts, if you want to add the functionality of this service to your site, we encourage you to check out the Google Custom Search Engine. Note that if you already have a Public Service Search account, you'll be able to continue offering search results on your site.

A Custom Search Engine can provide you with free web search and site search with the option to specify and prioritize the sites that are included in your search results. You can also customize your search engine to match the look and feel of your site, and if your site is a non-profit, university, or government site, you can choose not to display ads on your results pages.

You have two opportunities to disable ads on your Custom Search Engine. You can select the "Do not show ads" option when you first create a Custom Search Engine, or you can follow the steps below to disable advertising on your existing Custom Search Engine:

1. Click the "My search engines" link on the left-hand side of the Overview page.
2. Click the "control panel" link next to the name of your search engine.
3. Under the "Preferences" section of the Control panel page, select the Advertising status option that reads "Do not show ads on results pages (for non-profits, universities, and government agencies only)."
4. Click the "Save Changes" button.

Remember that disabling ads is available only for non-profit, university, and government sites. If you have a site that doesn't fit into one of these categories, you can still provide search to your visitors using the Custom Search Engine capabilities.

For more information or help with Custom Search Engines, check out the FAQ or post a question to the discussion group.

seo Falling Object Generator For Blogs And Websites - Online Tool 2013

Friends, this is a tool for easily generating falling-object code, and it can create several different kinds of falling images. It produces HTML code that you can add directly to your Blogger blog. After installing the widget, you will see falling objects on your website. You can install it with the widget below. Use it and enjoy...






Leave a comment below.

from web contents: On-Demand Sitemaps for Custom Search 2013

Since we launched enhanced indexing with the Custom Search platform earlier this year, webmasters who submit Sitemaps to Webmaster Tools get special treatment: Custom Search recognizes the submitted Sitemaps and indexes URLs from these Sitemaps into a separate index for higher quality Custom Search results. We analyze your Custom Search Engines (CSEs), pick up the appropriate Sitemaps, and figure out which URLs are relevant for your engines for enhanced indexing. You get the dual benefit of better discovery for Google.com and more comprehensive coverage in your own CSEs.

Today, we're taking another step towards improving your experience with Google webmaster services with the launch of On-Demand Indexing in Custom Search. With On-Demand Indexing, you can now tell us about the pages on your websites that are new, or that are important and have changed, and Custom Search will instantly schedule them for crawl, and index and serve them in your CSEs usually within 24 hours, often much faster.

How do you tell us about these URLs? You guessed it... provide a Sitemap to Webmaster Tools, like you always do, and tell Custom Search about it. Just go to the CSE control panel, click on the Indexing tab, select your On-Demand Sitemap, and hit the "Index Now" button. You can tell us which of these URLs are most important to you via the priority and lastmod attributes that you provide in your Sitemap. Each CSE has a number of pages allocated within the On-Demand Index, and with these attributes, you can tell us which are most important for indexing. If you need greater allocation in the On-Demand index, as well as more customization controls, Google Site Search provides a range of options.


Some important points to remember:
  1. You only need to submit your Sitemaps once in Webmaster Tools. Custom Search will automatically list the Sitemaps submitted via Webmaster Tools and you can decide which Sitemap to select for On-Demand Indexing.
  2. Your Sitemap needs to be for a website verified in Webmaster Tools, so that we can verify ownership of the right URLs.
  3. In order for us to index these additional pages, our crawlers must be able to crawl them. You can use "Webmaster Tools > Crawl Errors > URLs restricted by robots.txt" or check your robots.txt file to ensure that you're not blocking us from crawling these pages.
  4. Submitting pages for On-Demand Indexing will not make them appear any faster in the main Google index, or impact ranking on Google.com.
We hope you'll use this feature to inform us regularly of the most important changes on your sites, so we can respond quickly and get those pages indexed in your CSE. As always, we're listening for your feedback on Custom Search.


from web contents: Webmaster Tools - Links to your site updated 2013

Webmaster Level: All

The "Links to your site" feature in Webmaster Tools is now updated to show you which domains link the most to your site, in addition to other improvements. On the overview page you'll notice that there are three main sections: the domains linking most to your site, the pages on your site with the most links, and a sampling of the anchor text external sites are using when they link to your site.


Who links the most
Clicking the “More »” link under the “Who links the most” section will take you to a new view that shows a listing of all the domains that link to your site. Each domain in the list can be expanded to display a sample of pages from your site which are linked to by that domain.


The "More »" link under each specific domain lists all the pages linked to by that domain. At the top of the page there's a total count of links from that domain and a total count of your site's pages linked to from that domain.


Your most linked content
If you drill into the “Your most linked content” view from the overview page, you’ll see a listing of all your site’s most important linked pages. There's also a link count for each page as well as a count of domains linking to that page. Clicking any of the pages listed will expand the view to show you examples of the leading domains linking to that page and the number of links to the given page from each domain listed. The data used for link counts and throughout the "Links to your site" feature is more comprehensive now, including links redirected using 301 or 302 HTTP redirects.


Each page listed in the "All linked pages" view has an associated "More »" link which displays all the domains linking to that specific page on your site.


Each domain listed leads to a report of all the pages from that domain linking to your specific page.


We hope the updated “Links to your site” feature in Webmaster Tools will help you better understand where the links to your site are coming from and improve your ability to track changes to your site’s link profile. Please post any comments you have about this updated feature or post your questions in the Webmaster Help Forum. We appreciate your feedback since it helps us to continue to improve the functionality of Webmaster Tools.


from web contents: Duplicate content summit at SMX Advanced 2013

Last week, I participated in the duplicate content summit at SMX Advanced. I couldn't resist the opportunity to show how Buffy is applicable to the everyday search marketing world, but mostly I was there to get input from you on the duplicate content issues you face and to brainstorm how search engines can help.

A few months ago, Adam wrote a great post on dealing with duplicate content. The most important things to know about duplicate content are:
  • Google wants to serve up unique results and does a great job of picking a version of your content to show if your site includes duplication. If you don't want to worry about sorting through duplication on your site, you can let us worry about it instead.
  • Duplicate content doesn't cause your site to be penalized. If duplicate pages are detected, one version will be returned in the search results to ensure variety for searchers.
  • Duplicate content doesn't cause your site to be placed in the supplemental index. Duplication may indirectly influence this however, if links to your pages are split among the various versions, causing lower per-page PageRank.
At the summit at SMX Advanced, we asked what duplicate content issues were most worrisome. Those in the audience were concerned about scraper sites, syndication, and internal duplication. We discussed lots of potential solutions to these issues and we'll definitely consider these options along with others as we continue to evolve our toolset. Here's the list of some of the potential solutions we discussed so that those of you who couldn't attend can get in on the conversation.

Specifying the preferred version of a URL in the site's Sitemap file
One thing we discussed was the possibility of specifying the preferred version of a URL in a Sitemap file, with the suggestion that if we encountered multiple URLs that point to the same content, we could consolidate links to that page and could index the preferred version.

Providing a method for indicating parameters that should be stripped from a URL during indexing
We discussed providing this in either an interface such as Webmaster Tools or in the site's robots.txt file. For instance, if a URL contains session IDs, the webmaster could indicate the variable for the session ID, which would help search engines index the clean version of the URL and consolidate links to it. The audience leaned towards an addition in robots.txt for this.

Providing a way to authenticate ownership of content
This would provide search engines with extra information to help ensure we index the original version of an article, rather than a scraped or syndicated version. Note that we do a pretty good job of this now and not many people in the audience mentioned this to be a primary issue. However, the audience was interested in a way of authenticating content as an extra protection. Some suggested using the page with the earliest date, but creation dates aren't always reliable. Someone also suggested allowing site owners to register content, although that could raise issues as well, as non-savvy site owners wouldn't know to register content and someone else could take the content and register it instead. We currently rely on a number of factors such as the site's authority and the number of links to the page. If you syndicate content, we suggest that you ask the sites who are using your content to block their version with a robots.txt file as part of the syndication arrangement to help ensure your version is served in results.

Making a duplicate content report available for site owners
There was great support for the idea of a duplicate content report that would list pages within a site that search engines see as duplicate, as well as pages that are seen as duplicates of pages on other sites. In addition, we discussed the possibility of adding an alert system to this report so site owners could be notified via email or RSS of new duplication issues (particularly external duplication).

Working with blogging software and content management systems to address duplicate content issues
Some duplicate content issues within a site are due to how the software powering the site structures URLs. For instance, a blog may have the same content on the home page, a permalink page, a category page, and an archive page. We are definitely open to talking with software makers about the best way to provide easy solutions for content creators.

In addition to discussing potential solutions to duplicate content issues, the audience had a few questions.

Q: If I nofollow a substantial number of my internal links to reduce duplicate content issues, will this raise a red flag with the search engines?
The number of nofollow links on a site won't raise any red flags, but that is probably not the best method of blocking the search engines from crawling duplicate pages, as other sites may link to those pages. A better method may be to block pages you don't want crawled with a robots.txt file.
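
For instance, if the duplicate versions live under predictable paths, a couple of robots.txt rules keep crawlers out of them entirely; the paths here are hypothetical.

User-agent: *
Disallow: /print/
Disallow: /archive/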

Q: Are the search engines continuing the Sitemaps alliance?
We launched sitemaps.org in November of last year and have continued to meet regularly since then. In April, we added the ability for you to let us know about your Sitemap in your robots.txt file. We plan to continue to work together on initiatives such as this to make the lives of webmasters easier.

Q: Many pages on my site primarily consist of graphs. Although the graphs are different on each page, how can I ensure that search engines don't see these pages as duplicate since they don't read images?
To ensure that search engines see these pages as unique, include unique text on each page (for instance, a different title, caption, and description for each graph) and include unique alt text for each image. (For instance, rather than use alt="graph", use something like alt="graph that shows Willow's evil trending over time".)
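
In markup, that advice amounts to something like the following; the file name and caption are made up for illustration.

<img src="willow-evil-by-season.png"
     alt="Graph that shows Willow's evil trending over time">
<p>Willow's evil level plotted by season, showing a clear upward trend.</p>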

Q: I've syndicated my content to many affiliates and now some of those sites are ranking for this content rather than my site. What can I do?
If you've freely distributed your content, you may need to enhance and expand the content on your site to make it unique.

Q: As a searcher, I want to see duplicates in search results. Can you add this as an option?
We've found that most searchers prefer not to have duplicate results. The audience member in particular commented that she may not want to get information from one site and would like other choices, but for that case, other sites will likely not have identical information and therefore will show up in the results. Bear in mind that you can add the "&filter=0" parameter to the end of a Google web search URL to see additional results which might be similar.

I've brought all the issues and potential solutions that we discussed at the summit back to my team and others within Google and we'll continue to work on providing the best search results and expanding our partnership with you, the webmaster. If you have additional thoughts, we'd love to hear about them!