Google Person Finder has become a useful tool in responding to natural disasters by reconnecting people with their family and friends. We’ve been looking at the next phase of Google Person Finder and decided to begin hosting the open source project at Google Code. We’re inviting the developer community to help improve Google Person Finder and the PFIF data format.

Google Person Finder provides a common place to search for, comment on, and connect records from many missing person registries. After the January 12th earthquake in Haiti, a team of Googlers worked with the U.S. Department of State to quickly create a site that helped people who were affected by the disaster. The site was used heavily after the Chile earthquake in February and put in action again in April after the Qinghai earthquake in China and in August for the Pakistan floods.

The software powering Google Person Finder is open source so we’re listing the open issues and feature requests we’ve received over the past few months in hopes the community can help us improve the code. We’ve created a Developer Guide to help developers get started. As always, we invite those interested to post questions on our public Person Finder discussion group. Those who are interested in improving the PFIF data format can also join the PFIF discussion group.

In addition to opening our product for developers, we’ve decided it’s now time to turn off our Google Person Finder instances for Haiti, Chile, China, and Pakistan. It doesn’t seem useful to serve these missing person records on the Internet indefinitely, so we intend for each instance of Google Person Finder to run for a limited time. Once an instance has served its purpose, we will archive the PFIF records in a secure location for historical preservation for one year while we work to identify a permanent owner for these records. If a permanent owner cannot be found, we will delete the records after one calendar year. For more information, please feel free to review the Google Person Finder FAQ.


With hundreds (if not thousands) of popular email clients and mail servers out there, importing email into another service can be a challenge, especially when you consider the troves of old email most people save. To ease this pain, we created the Google Apps Email Migration API.

This new API is available in Google Apps Premier, Partner, and Education Editions, and you can use it to migrate your existing email from anywhere into Google Apps. Let's say, for example, you want to import email from your Obscurix Email Server v2.0001715. Just write some parsing code and use our simple API to upload that email into the desired mailbox. For convenience, you can authenticate to the API not only as the end user of the destination mailbox, but also as a Google Apps administrator, and target any mailbox in the domain. This API uses the Google data API protocol, which means there are a host of client libraries to make importing even easier.
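
To make that concrete, here is a rough JavaScript sketch of what a single migration request might look like on the wire. The feed URL, the Atom element names, and the escapeXml helper below are assumptions for illustration; in practice the Google Data client libraries take care of authentication, escaping, and request construction for you.

// Rough sketch of one migration request (illustrative only: the feed
// URL and element names are assumptions; prefer a client library).
function escapeXml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}

function importMessage(domain, user, rawRfc822Message, authToken) {
  // The request body is an Atom entry carrying the raw RFC 822 message
  // plus optional properties (destination folder, labels).
  var entry =
      '<atom:entry xmlns:atom="http://www.w3.org/2005/Atom"' +
      ' xmlns:apps="http://schemas.google.com/apps/2006">' +
      '<apps:rfc822Msg>' + escapeXml(rawRfc822Message) + '</apps:rfc822Msg>' +
      '<apps:mailItemProperty value="IS_INBOX"/>' +
      '<apps:label labelName="Migrated"/>' +
      '</atom:entry>';

  var xhr = new XMLHttpRequest();
  // Hypothetical mailbox feed for the destination user.
  xhr.open('POST', 'https://apps-apis.google.com/a/feeds/migration/2.0/' +
      domain + '/' + user + '/mail', true);
  xhr.setRequestHeader('Content-Type', 'application/atom+xml');
  xhr.setRequestHeader('Authorization', 'GoogleLogin auth=' + authToken);
  xhr.send(entry);
}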

LimitNone (one of our Enterprise Professional partners) has already built a migration utility that works with calendars, email and contacts.

For more info, check out the Google Enterprise Blog, or just dive right into the developer's guide. And please, let us know what you think!

If you’ve used Google Search recently, you may have noticed a new feature that we’re calling Instant Previews. By clicking on the (sprited) magnifying glass icon next to a search result you see a preview of that page, often with the relevant content highlighted. Once activated, you can mouse over the rest of the results and quickly (instantly!) see previews of those search results, too.

Adding this feature to Google Search involved a lot of client-side JavaScript. Being Google, we had to make sure we could deliver this feature without slowing down the page. We know our users want their results fast. So we thought we’d share some techniques involved in making this new feature fast.

JavaScript compilation

This is nothing new for Google Search: all our JavaScript is compiled to make it as small as possible. We use the open-source Closure Compiler. In addition to minifying the JavaScript code, it rewrites expressions, reuses variables, and prunes out code that is not being used. The JavaScript on the search results page is deferred, and also cached very aggressively on the client side so that it’s not downloaded more than once per version.
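
As a rough illustration (hypothetical code, not Google's actual source), advanced compilation can turn readable source into a tiny, constant-folded equivalent:

// Hypothetical input: readable source with an unused helper function.
function computePreviewHeight(pageWidth, pageHeight) {
  var aspectRatio = pageHeight / pageWidth;
  return Math.round(302 * aspectRatio);  // previews are 302px wide
}
function debugLog(message) {  // never called anywhere
  window.console && window.console.log(message);
}
alert(computePreviewHeight(1024, 768));

// Roughly what the Closure Compiler emits with ADVANCED_OPTIMIZATIONS:
// debugLog is pruned as dead code, the call is inlined, and the
// arithmetic is folded down to a constant.
alert(227);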

On-demand JSONP

When you activate Instant Previews, the result previews are requested by your web browser. There are several ways to fetch the data we need using JavaScript. The most popular techniques are XMLHttpRequest (XHR) and JSONP. XHR generally gives you better control and error handling, but it has two drawbacks: browser caching tends to be less reliable, and only same-origin requests are permitted (though this is starting to change with modern browsers and cross-origin resource sharing). With JSONP, on the other hand, the requested script returns the desired data as a JSON object wrapped in a JavaScript callback function, which in our case looks something like

google.vs.r({"dim":[302,585],"url":"http://example.com",ssegs:[...]}).

Although error handling with JSONP is a bit harder to do compared to XHR (not all browsers support onerror events), JSONP can be cached aggressively by the browser, and is not subject to same-origin restrictions. This last point is important for Instant Previews because web browsers restrict the number of concurrent requests that they send to any one host. Using a different host for the preview requests means that we don’t block other requests in the page.

There are a couple of tricks when using JSONP that are worth noting:

  • If you insert the script tag directly, e.g. using document.createElement, some browsers will show the page as still “loading” until all script requests are finished. To avoid that, make your DOM call to insert the script tag inside a window.setTimeout call.
  • After your requests come back and your callbacks are done, it’s a good idea to set your script src to null and remove the tag; on some browsers, allowing too many script tags to accumulate over time may slow everything down. (Both tricks appear in the sketch after this list.)
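
Putting the two tricks together, a minimal JSONP helper might look something like this (the helper name, the callback-naming scheme, and the preview URL are illustrative assumptions, not the actual Instant Previews code):

// Minimal JSONP helper demonstrating both tricks (illustrative names).
var jsonpCounter = 0;
function jsonp(url, callback) {
  var callbackName = 'jsonpCallback' + (jsonpCounter++);
  var script = document.createElement('script');

  // The server wraps its JSON response in a call to this function.
  window[callbackName] = function(data) {
    callback(data);
    // Trick 2: null out src and remove the tag once we're done, so
    // finished script tags don't pile up and slow the page down.
    script.src = null;
    script.parentNode.removeChild(script);
    window[callbackName] = null;
  };

  // Trick 1: insert the tag from inside a setTimeout so the browser
  // doesn't keep reporting the page as "loading" during the request.
  window.setTimeout(function() {
    script.src = url + '&callback=' + callbackName;
    document.body.appendChild(script);
  }, 0);
}

// Usage: fetch a preview payload shaped like the example above.
jsonp('https://previews.example.com/preview?q=example', function(data) {
  renderPreview(data);  // see the Data URIs section below
});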

Data URIs

At this point you are probably curious about what we’re returning in our JSONP calls, and in particular, why we are using JSON and not just plain images. Perhaps you even used Firebug or your browser’s Developer Tools to examine the Instant Previews requests. If so, you will have noticed that we send back the image data as sets of data URIs. Data URIs are base64 encodings of image data that modern browsers (IE8+, Chrome, Safari, Firefox, Opera, etc.) can use to display images, instead of loading them from a server as usual.

To show previews, we need the image, and the relevant content of the page for the particular query, with bounding boxes that we draw on top of the image to show where that content appears on the page. If we used static images, we’d need to make one request for the content and one request for the image; using JSONP with data URIs, we make just one request. Data URIs are limited to 32K on IE8, so we send “slices” that are all under that limit, and then use JavaScript to generate the necessary image tags to display them. And even though base64 encoding adds about 33% to the size of the image, our tests showed that gzip-compressed data URIs are comparable in size to the original JPEGs.
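
For illustration, here is roughly how such slices could be reassembled in the browser. This sketch assumes each entry in ssegs is the base64 payload of one JPEG slice; the container element and styling are assumptions, not the actual implementation.

// Sketch: render a preview payload as a stack of data-URI image slices.
// Assumes each entry in ssegs is the base64 data of one JPEG slice,
// each under IE8's 32K data URI limit.
function renderPreview(data) {
  var container = document.getElementById('preview');  // assumed element
  container.style.width = data.dim[0] + 'px';
  for (var i = 0; i < data.ssegs.length; i++) {
    var img = document.createElement('img');
    img.src = 'data:image/jpeg;base64,' + data.ssegs[i];
    img.style.display = 'block';  // stack slices vertically, no gaps
    container.appendChild(img);
  }
}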

We use caching throughout our implementation, but it’s important to not forget about client-side caching as well. By using JSONP and data URIs, we limit the number of requests made, and also make sure that the browser will cache the data, so that if you refresh a page or redo a query, you should get the previews, well... instantly!
