News and Tutorials from Votre Codeur | SEO | Website creation | Software creation

Google Prediction API: faster, easier to use, and more accurate

By Marc Cohen, Developer Relations

This holiday season, the Google Prediction API Team is bringing you four presents and, thanks to the joys of cloud computing, no reindeer are required for delivery. Here’s what you’ve already received:
  • Faster on-ramp: We’ve made it easier to get started by enabling you to create an empty model (by sending a trainedmodels.insert request with no storageDataLocation specified) and add training data using the trainedmodels.update method. This change allows you to submit your model contents without needing to stage the data in Google Cloud Storage.
  • Improved updates: The algorithms used to implement model updates (adding additional data to existing models) have been modified to work faster than ever.
  • More classification algorithms: We’ve increased the number of classification algorithms used to build predictive models, resulting in across-the-board improvements in accuracy.
  • Integration with Google Apps Script: Prediction services are now available as part of Google Apps Script, which means you can integrate prediction services with Google Docs, Google Maps, Gmail, and other great Google products.
All of the above enhancements are supported by the current Prediction API version 1.4, so you can enjoy these features using the existing client libraries.
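The staging-free flow described above boils down to a pair of REST request bodies: an empty trainedmodels.insert, followed by trainedmodels.update calls carrying individual examples. A minimal sketch, with illustrative helper and field names (check the v1.4 reference for the exact request shapes):

```python
# Sketch of the staging-free training flow: create an empty model with
# trainedmodels.insert (no storageDataLocation), then add training
# examples one at a time with trainedmodels.update. Field names here
# are illustrative, not copied from the API reference.

def insert_body(model_id):
    # No storageDataLocation means the model starts out empty.
    return {"id": model_id}

def update_body(label, features):
    # One training example: an output label plus its input features.
    return {"label": label, "csvInstance": features}

requests = [insert_body("language-detector"),
            update_body("english", ["good afternoon"]),
            update_body("french", ["bonjour tout le monde"])]
```

Each body would be sent as JSON via the HTTP interface or one of the client libraries; the point is that no Cloud Storage staging step appears anywhere in the sequence.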

Happy Holidays from the Google Prediction API Team. We’re looking forward to bringing you more exciting features in 2012!


Marc Cohen is a member of Google’s Developer Relations Team in Seattle. When not teaching Python programming and listening to indie rock music, he enjoys using the Google Prediction API to peer into the future.

Posted by Scott Knaster, Editor

Introducing Au-to-do, a sample application built on Google APIs

By Dan Holevoet, Developer Relations Team

A platform is more than the sum of its component parts. You can read about it or hear about it, but to really learn what makes up a platform you have to try it out for yourself, play with the parts, and discover what you can build.

With that in mind, we started a project called Au-to-do: a full sample application implementing a ticket tracker, built using Google APIs, that developers can download and dissect.

Au-to-do screen shot

Au-to-do is built on several Google APIs and technologies, with additional integrations on the way. We are also planning a series of follow-up blog posts discussing each of the integrations in depth, with details on our design decisions and best practices you can use in your own projects.

By the way, if you’re wondering how to pronounce Au-to-do, you can say "auto-do" or "ought-to-do" — either is correct.

Ready to take a look at the code? Check out the getting started guide. Found a bug? Have a great idea for a feature or API integration? Let us know by filing a request.

Happy hacking!


Dan Holevoet joined the Google Developer Relations team in 2007. When not playing Starcraft, he works on Google Apps, with a focus on the Calendar and Contacts APIs. He's previously worked on iGoogle, OpenSocial, Gmail contextual gadgets, and the Google Apps Marketplace.

Posted by Scott Knaster, Editor




Google Prediction API graduates from labs, adds new features

By Zachary Goldberg, Product Manager

Since the general availability launch of the Prediction API this year at Google I/O, we have been working hard to give every developer access to machine learning in the cloud to build smarter apps. We’ve also been working on adding new features, accuracy improvements, and feedback capability to the API. Today we take another step by announcing Prediction v1.4. With the launch of this version, Prediction is graduating from Google Code Labs, reflecting Google’s commitment to the API’s development and stability. Version 1.4 also includes two new features:
  • Data Anomaly Analysis
    • One of the hardest parts of building an accurate predictive model is gathering and curating a high quality data set. With Prediction v1.4, we are providing a feature to help you identify problems with your data that we notice during the training process. This feedback makes it easier to build accurate predictive models with proper data.
  • PMML Import
    • PMML has become the de facto industry standard for transmitting predictive models and model data between systems. As of v1.4, the Google Prediction API can programmatically accept your PMML for data transformations and preprocessing.
    • The PMML spec is vast and covers many, many features. You can find more details about the specific features that the Google Prediction API supports here.



We’re looking forward to seeing what you create with these new capabilities!

Feel free to find us and ask questions about these new features on our discussion group or submit feedback via our feedback form.


Zachary Goldberg is Product Manager for the Google Prediction API. He has a strange fascination with the Higgs Boson.

Posted by Scott Knaster, Editor

Streak brings CRM to the inbox with Google Cloud Platform

By Aleem Mawani, Co-Founder of Streak

Cross-posted with the Google App Engine Blog

This guest post was written by Aleem Mawani, Co-Founder of Streak, a startup alum of Y Combinator, a Silicon Valley incubator. Streak is a CRM tool built into Gmail. In this post, Aleem shares his experience building and scaling their product using Google Cloud Platform.

Everyone relies on email to get work done – yet most people use separate applications from their email to help them with various business processes. Streak fixes this problem by letting you do sales, hiring, fundraising, bug tracking, product development, deal flow, project management and almost any other business process right inside Gmail. In this post, I want to illustrate how we have used Google Cloud Platform to build Streak quickly, scalably and with the ability to deeply analyze our data.



We use several Google technologies on the backend of Streak:

  • Google App Engine to serve our application and API, with the Datastore as the source of truth for our data.
  • Google Cloud Storage to archive our data and move it between services.
  • BigQuery to analyze our logs and power dashboards.
  • The App Engine Search API to give our users Gmail-style full text search.
  • The Prediction API to make smart suggestions in the app.

Our core learning is that you should use the best tool for the job. No one technology will be able to solve all your data storage and access needs. Instead, for each type of functionality, you should use a different service. In our case, we aggressively mirror our data in all the services mentioned above. For example, although the source of truth for our user data is in the App Engine Datastore, we mirror that data in the App Engine Search API so that we can provide full text search, Gmail style, to our users. We also mirror that same data in BigQuery so that we can power internal dashboards.
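The mirroring pattern described here can be sketched in a few lines. The class and store names below are stand-ins for the real services (Datastore, Search API, BigQuery), not actual client APIs:

```python
# Conceptual sketch of mirror-on-write: one source-of-truth write,
# then fan-out to each mirror (a full-text index, an append-only
# analytics log). In production each mirror write would be an async
# task-queue job rather than an in-process call.

class MirroredStore:
    def __init__(self):
        self.primary = {}        # source of truth (Datastore in Streak's case)
        self.search_index = {}   # full-text mirror (Search API)
        self.warehouse = []      # append-only analytics mirror (BigQuery)

    def put(self, key, record):
        self.primary[key] = record
        # Mirror the write everywhere the data is needed.
        self.search_index[key] = " ".join(str(v) for v in record.values())
        self.warehouse.append(dict(record, _key=key))

store = MirroredStore()
store.put("box1", {"name": "Sales pipeline", "owner": "aleem"})
```

The trade-off is write amplification in exchange for letting each read path hit the service best suited to it.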

System Architecture




App Engine - We use App Engine for Java primarily to serve our application to the browser and mobile clients in addition to serving our API. App Engine is the source of truth for all our data, so we aggressively cache using Memcache. We also use Objectify to simplify access to the Datastore, which I highly recommend.

Google Cloud Storage - We mirror all of our Datastore data as well as all our log data in Cloud Storage, which acts as a conduit to other Google cloud services. It lets us archive the data as well as push it to BigQuery and the Prediction API.

BigQuery - Pushing the data into BigQuery allows us to run non-realtime queries that can help generate useful business metrics and slice user data to better understand how our product is getting used. Not only can we run complex queries over our Datastore data but also over all of our log data. This is incredibly powerful for analyzing the request patterns to App Engine. We can answer questions like:

  • Which requests cost us the most money?
  • What is the average response time for every URL on our site over the last 3 days?

BigQuery also helps us monitor error rates in our application. We log all of our debug statements, along with an “error type” for any request that fails: if it’s a known error, we log something sensible, and otherwise we log the exception type. On top of this we built a dashboard that queries BigQuery for the most recent errors in the last hour, grouped by error type, so whenever we do a release we can monitor error rates in the application really easily.
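The query behind such a dashboard might look roughly like the following. The table and column names (request_logs, error_type, timestamp) are invented for illustration, and BigQuery's SQL dialect of the era differed in details:

```python
# Builds an illustrative dashboard query: failed requests in the last
# N hours, grouped by error type. Table/column names are hypothetical.

def recent_errors_query(hours=1):
    return (
        "SELECT error_type, COUNT(*) AS occurrences "
        "FROM request_logs "
        "WHERE error_type IS NOT NULL "
        f"AND timestamp > DATE_ADD(CURRENT_TIMESTAMP(), -{hours}, 'HOUR') "
        "GROUP BY error_type "
        "ORDER BY occurrences DESC"
    )

query = recent_errors_query()
```

Sorting by count puts any newly introduced error type near the top right after a release.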



A Streak dashboard powered by BigQuery showing current usage statistics
In order to move the data into Cloud Storage from the Datastore and LogService, we developed an open source library called Mache. It’s a drop-in library that can be configured to automatically push data into BigQuery via Cloud Storage. The data can come from the Datastore or from LogService and is very configurable - feel free to contribute and give us feedback on it!

Google Cloud Platform also makes our application better for our users. We take advantage of the App Engine Search API and again mirror our data there. Users can then query their Streak data using the familiar Gmail full text search syntax, for example, “before:yesterday name:Foo”. Since we also push our data to the Prediction API, we can help users throughout our app by making smart suggestions. In Streak, we train models based on which emails users have categorized into different projects. Then, when users get a new email, we can suggest the most likely box that the email belongs to.

One issue that arises is how to keep all these mirrored data sets in sync. The answer works differently for each service, based on that service’s architecture.




Having these technologies easily available to us has been a huge help for Streak. It makes our products better and helps us understand our users. Streak’s user base grew 30% every week for 4 consecutive months after launch, and we couldn’t have scaled this easily without Google Cloud Platform. To read more details on why Cloud Platform makes sense for our business, check out our case study and our post on the Google Enterprise blog.


Aleem Mawani is the co-founder of Streak.com, a CRM tool built into Gmail. Previously, Aleem worked on Google Drive and various ads products at Google. He has a degree in Software Engineering from the University of Waterloo and an MBA from Harvard University.

Posted by Scott Knaster, Editor

Prediction API: Make smart apps even smarter


Since its announcement at Google I/O, the Google Prediction API has seen an outstanding response from the developer community. Developers participating in the Prediction API preview are already using it to identify spam, categorize news, and more.

Today we’re adding new features to the Prediction API to make your apps even smarter:

Multi-category prediction: Imagine you’re writing a news aggregator that suggests articles based on the kinds of stories the user has read before. Previously, using the Prediction API, each article could only be tagged with one label - the most pertinent one. For example, an article about a new truck might be labeled as “truck,” but not “roomy” or “quiet.” Now articles can be tagged with all of those labels, with the labels ranked by pertinence, enabling your app to make better recommendations.

Continuous Output: You’d like to create a wine recommendation app. Matching a wine to personal preferences is a tricky task, dependent on many factors, including origin, grape, age, growing environment, and flavor presence. Previously, your app could only label wine as “good,” “decent,” “bad,” or some other set of pre-defined values. Using the new continuous output option, your app can provide a fine-grained ranking of wines based on how well they fit the user’s preferences.

Mixed Inputs: You’re creating an automatic moderator for your blog. You could already classify incoming posts automatically based on comment text and the username of the poster (text inputs), but not the number of times they’ve posted before or the number of users that have liked their posts (numeric inputs). We’ve now added support for mixed inputs, so both numeric and text data can be incorporated in your moderation helper, greatly improving accuracy and letting you get back to making content rather than managing it.

Combining Continuous Output with Mixed Inputs: To further enhance your automatic moderator, you can use continuous output to set thresholds for automatic posting, automatic rejection and manual moderation, further reducing your workload.
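A hedged sketch of how continuous output and thresholds could combine in the moderator example above; the score column, feature layout, and threshold values are all hypothetical:

```python
# Illustrative training rows mixing text and numeric inputs, plus a
# thresholding helper for continuous output. The first value in each
# row is the output (a spam score); the rest are input features.

rows = [
    # score, comment text,              posts so far, likes received
    (0.05, "Great write-up, thanks!",   42, 17),
    (0.92, "BUY CHEAP WATCHES NOW",      1,  0),
    (0.40, "First post, check my site",  1,  2),
]

def route(score, reject_above=0.8, approve_below=0.2):
    # Continuous output lets us carve the score range into actions.
    if score >= reject_above:
        return "reject"
    if score <= approve_below:
        return "approve"
    return "moderate"

decisions = [route(score) for score, *_ in rows]
```

Only the middle band lands in the manual-moderation queue, which is exactly the workload reduction the post describes.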

You can get all the details about these and other new features on the Prediction API website. We are continuing to offer the Prediction API as a preview to a limited number of developers. There is no charge for using the service during the preview. To learn more and sign up for an invitation, please join the waitlist.


Prediction API: Tunable predictive models

By Travis Green, Product Manager

Over the last year, the Prediction API has given you more and more tools to make your apps smarter and teach them to adapt and learn. Today we're adding a frequently requested feature: the ability to adjust models to get better performance.

Historically, getting the right predictive model has required detailed knowledge of algorithmic behavior, experience with similar datasets, and a lot of guess-and-check. With the Prediction API, we ask you what behavior you want to see, and we search across many algorithms to find the best-matching one.

How it works:
  1. Upload data to Google Storage for Developers.
  2. Ask the Prediction API to find a great predictive model.
  3. [new] Examine more detailed statistics about your model’s performance, including more training metadata and better accuracy statistics through a confusion matrix.
  4. Improve performance.
    1. Give your model more samples to learn from.
    2. Add in more information (see these samples).
    3. [new] Show the API what data is most important (categorical data only).
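Step 3's confusion matrix is easy to read programmatically. This sketch computes overall accuracy and per-class recall from one, using the usual convention of rows as actual labels and columns as predicted labels:

```python
# Reading accuracy statistics out of a confusion matrix.
# Rows = actual class, columns = predicted class.

labels = ["spam", "ham"]
confusion = [
    [90, 10],  # actual spam: 90 predicted spam, 10 predicted ham
    [5, 95],   # actual ham:   5 predicted spam, 95 predicted ham
]

total = sum(sum(row) for row in confusion)
correct = sum(confusion[i][i] for i in range(len(labels)))  # diagonal
accuracy = correct / total

# Per-class recall: correct predictions over all actual instances.
recall = {labels[i]: confusion[i][i] / sum(confusion[i])
          for i in range(len(labels))}
```

Low recall on one class is a common signal that the model needs more samples of that class, which is exactly what step 4 suggests.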

For those of you ready to get started, feel free to jump in through our newly updated code samples.

Travis Green's favorite part about his job is designing smart applications. In his spare time, he is in the great outdoors (looking for trouble).

Posted by Scott Knaster, Editor

BigQuery and Prediction API: Get more from your data with Google

To deliver our services, Google has had to develop sophisticated internal tools to process data more efficiently. We know that some of these tools could be useful to any developer, so we’ve been working to create external versions that we can share with the world.

We’re excited to introduce two new developer tools to get more from your data: BigQuery and Prediction API. These two tools can be used with your data stored on Google Storage for Developers.

BigQuery enables fast, interactive analysis over datasets containing trillions of records. Using SQL commands via a RESTful API, you can quickly explore and understand your massive historical data. BigQuery can help you analyze your network logs, identify seasonal sales trends, or find a needle in a haystack of big data.

Prediction API exposes Google’s advanced machine learning algorithms as a RESTful web service to make your apps more intelligent. The service helps you use historical data to make real-time decisions such as recommending products, assessing user sentiment from blogs and tweets, routing messages or assessing suspicious activities.

We are introducing BigQuery and Prediction API as a preview to a limited number of developers. There is no charge for using these services during the preview. To learn more and sign up for an invitation, please visit the BigQuery and Prediction API sites.

If you are in San Francisco for Google I/O, we look forward to meeting you. Please come to our session tomorrow to learn more.

Posted by Amit Agarwal and Jordan Breckenridge, BigQuery and Prediction API Teams

Google Prediction API helps all apps to adapt and learn

By Travis Green, Product Manager

Now your apps can get smarter with as little as a single line of code. They can learn to continually adapt to changing conditions and to integrate new information. This week at Google I/O, we’re making the Google Prediction API generally available, meaning you can create apps with these capabilities for yourself. Additionally, we’re introducing several significant new features, including:
  • The ability to stream data and tune your predictive models
  • A forthcoming gallery of user-developed, pre-built models to add smarts even faster.
The Google Prediction API can be used by almost any app to recommend the useful, extract the essential, and automate the repetitive. For example:
  • Recommend a new movie to a customer.
  • Identify your most important customers.
  • Automatically tag posts with relevant flags.
For example, Ford Motor Co. Research is working to use the Prediction API to optimize plug-in hybrid vehicle fuel efficiency by optionally providing users with likely destinations to choose from, and soon, optimizing driving controls to conserve fuel. Because the API is a cloud-hosted RESTful service, Ford has been able to access its computationally-intensive machine learning algorithms to find patterns that rank potential destinations based on previous driving paths. Ford will be demonstrating their work at the API’s I/O Session.

Here’s a summary of the features we added to the API today:
  • Streaming training data: Continually incorporate feedback for fast-adapting systems (e.g. user-chosen tags vs predicted ones, final purchases vs expected).
  • General availability: Anyone can now sign up to use the API. Paid users also receive a 99.9% SLA with increased quota.
  • New JavaScript library: Now deploy the Prediction API in your JavaScript – in addition to our updated Python and Java libraries.
Today, we are also announcing the Prediction API’s forthcoming gallery of pre-trained third party predictive models (try these demo models right now), and we will be adding more constantly (maybe yours – waitlist). Once complete, all Prediction API users will be able to:
  • Subscribe to others’ models: improve your apps with others’ predictive data tools.
  • Sell access to your models (e.g. sentiment analysis on social media).
  • Import customized models through the open-standard PMML encoding.
See our recent blog post for even more ideas, and get started at the Google APIs Console.

Thanks to our community of preview developers, who have played a crucial role in helping us make the Google Prediction API simpler and more powerful since its announcement last year at I/O 2010. We are thrilled to invite all developers to join them.


Travis Green's favorite part about his job is designing smart applications. In his spare time, he is in the great outdoors (looking for trouble).

Posted by Scott Knaster, Editor

Prediction API: Every app a smart app

By Travis Green of the Google Prediction API Team

If you’re looking to make your app smarter and you think machine learning is more complicated than making three API calls, then you’re reading the right blog post.

Today, we are releasing v1.2 of the Google Prediction API, which makes it even easier for preview users to build smarter apps by accessing Google’s advanced machine learning algorithms through a RESTful web service.

Some technical details of the Prediction API:
  • Chooses best technique from several available machine learning algorithms.
  • Supported inputs: numeric data and unstructured text.
  • Outputs hundreds of discrete categories, or continuous values.
  • Integrates with many platforms: Google App Engine, web and desktop apps, and command line.
  • v1.2 improvements:
    • Simpler interface: automatic data type detection, and score normalization.
    • Paid usage tier.
    • Improved usage monitoring and faster signup through the APIs Console.
Ideas to make the most of the Prediction API:
  • Recommendation: What products might a user be interested in? (example)
  • Filter RSS feeds, user comments, or feedback: Which posts are most relevant? Should a user comment be featured? Which feedback should we look at first? (example)
  • Customize homepages: Predict what content a user would like to see and populate the page with the user’s anticipated interests.
  • Sentiment analysis: Is this comment positive or negative? Does a commenter support Group A or Group B?
  • Message routing: Route emails to the appropriate person based on analysis of the email contents.
  • See the Prediction API website for many more!
To join the preview group, go to the APIs Console and click the Prediction API slider to “ON,” and then sign up for a Google Storage account.

We would also like to continue to thank our supportive preview users for their help making the API the service it is today. We look forward to seeing many more of you join us in making the web just a little bit smarter, and hearing your thoughts and feedback through our discussion group.

Travis Green's favorite part about his job is designing smart applications. In his spare time, he is in the great outdoors (looking for trouble).

Posted by Scott Knaster, Editor

Service Accounts have arrived

By Justin Smith, Product Manager

Starting today, Google supports Service Accounts, which provide certificate-based authentication for server-to-server interactions. This means, for example, that a request from a web application to Google Cloud Storage can be authenticated via a certificate instead of a shared key. Certificates offer better security properties than shared keys and passwords, largely because they are not human-readable or guessable.

Service accounts are currently supported by the following Google developer services:
  • Google Cloud Storage
  • Google Prediction API
  • Google URL Shortener
  • Google OAuth 2.0 Authorization Server
  • Google APIs Console
  • Google APIs Client Libraries for Python, Java, and PHP
Over time, more Google APIs and client libraries will be supported.

This feature is implemented as an OAuth 2.0 flow and is compliant with draft 25 of the OAuth 2.0 specification. An application implements the following steps to authenticate with a Service Account:
  1. Generate a JSON structure.
  2. Sign the JSON structure with a private key, and encode it as a JSON Web Token (JWT).
  3. Send the JWT to Google’s OAuth 2.0 Authorization Server in exchange for an access token.
  4. Send the access token to Google Cloud Storage or the Google Prediction API.
The Google APIs Client Libraries for Python, Java, and PHP wrap these steps into a few lines of code and abstract the error-prone signing and encoding operations from your applications. We strongly encourage you to use these libraries for this type of interaction. We will be expanding support to other client libraries (including Ruby and .NET). Library developers can find the specifics of the protocol in the OAuth 2.0 Service Accounts documentation.
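Steps 1 and 2 above can be sketched as follows. The service account email is a made-up placeholder, and the final RSA signing step (which needs real key material) is left as a comment, since the client libraries handle it for you:

```python
# Sketch of JWT construction for the Service Account flow: build the
# header and claim set, base64url-encode each, and join with a dot.
# The result still needs an RS256 signature over signing_input before
# it can be exchanged for an access token.
import base64
import json
import time

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

header = {"alg": "RS256", "typ": "JWT"}
now = int(time.time())
claims = {
    "iss": "1234567890@developer.gserviceaccount.com",  # hypothetical account
    "scope": "https://www.googleapis.com/auth/prediction",
    "aud": "https://accounts.google.com/o/oauth2/token",
    "iat": now,
    "exp": now + 3600,  # access requests are short-lived
}

signing_input = b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())
# signature = rsa_sha256_sign(private_key, signing_input)  # requires the key
```

The signed JWT (signing_input plus a third, signature segment) is what gets POSTed to the Authorization Server in step 3.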

If you’re a Google App Engine developer, all this might sound similar to what is described in these articles: App Engine & Storage, App Engine & Prediction. Service Accounts generalize this App Engine capability by making it available to other server-side platforms. When using another server-side platform, you can create a Service Account through the Google APIs Console. See the Google APIs Console documentation for more information on creating a Service Account.

As always, we welcome and appreciate feedback. Please post any questions or comments to the OAuth 2.0 Google group.


Justin Smith is a Google Product Manager and works on authentication and authorization technologies. He enjoys woodworking, cycling, country music, and the company of his wife and newborn daughter (not in that order).

Posted by Scott Knaster, Editor

Google Prediction API 1.5 adds enumeration, analysis, and more

By Marc Cohen, Developer Relations

The Google Prediction API Team has been hard at work on Release 1.5, which is available now, with the following new features:
  • Model enumeration. We’ve added the ability to list all of your models via the trainedmodels.list request. You can obtain the entire list in one response or you can iterate through a large listing in pieces using the maxResults and pageToken options.

  • Model analysis. We’ve added the ability to obtain more detailed information about data and models via the trainedmodels.analyze request, which returns information about the trained model’s output values, features, confusion matrix, and other information.

  • Simplified get method. We’ve simplified the output returned by the trainedmodels.get request. Model analysis data that was previously returned by a get request (e.g. the confusion matrix) is now returned by the new analyze request, along with additional analysis data. The get response now returns a simpler model description, plus new timestamps indicating when the model was inserted and when model training completed, which should make it easier to keep track of the model lifecycle.

  • New Google App Engine samples. We’ve created two new sample apps illustrating how to use the Prediction API from App Engine, coded in Python and Java. These samples show how to create and manage shared server OAuth 2.0 credentials, and how to make predictions on behalf of any site visitors using the shared server credentials. The sample code is available here and a live version of the sample app is available here: http://try-prediction.appspot.com.
You can read more about the API details here. The new release is available now via the HTTP RESTful interface and our various language-specific client libraries. You can also experiment with the new Prediction API 1.5 interactively via the Google APIs Explorer.
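Iterating through a large listing with maxResults and pageToken follows the usual pagination loop. In this sketch, fake_list stands in for the real trainedmodels.list call; only the loop structure is the point:

```python
# Paging through a model listing with maxResults and pageToken.
# fake_list simulates the API: it returns up to max_results items and
# a nextPageToken whenever more remain.

ALL_MODELS = [f"model-{i}" for i in range(7)]

def fake_list(max_results, page_token=None):
    start = int(page_token or 0)
    resp = {"items": ALL_MODELS[start:start + max_results]}
    if start + max_results < len(ALL_MODELS):
        resp["nextPageToken"] = str(start + max_results)
    return resp

models, token = [], None
while True:
    resp = fake_list(max_results=3, page_token=token)
    models.extend(resp["items"])
    token = resp.get("nextPageToken")
    if token is None:  # no token means the listing is exhausted
        break
```

Treating the token as opaque (as this loop does) is the safe pattern: only its presence or absence matters to the client.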

We’re always looking for ways to improve the Prediction API so, as always, please let us know about any problems or feature suggestions you might have. Happy Predicting!


Marc Cohen is a member of Google’s Developer Relations Team in Seattle. When not teaching Python programming and listening to indie rock music, he enjoys using the Google Prediction API to peer into the future.

Posted by Scott Knaster, Editor