You can tell a lot about a person by the way they doodle. That’s the conceit behind Quick Draw, an interactive web experiment launched by Google’s Creative Lab in November 2016. It recruits intrepid web surfers to illustrate prompts with sketches, all the while leveraging artificial intelligence (AI) to attempt to identify what was drawn — sort of like a high-tech version of Pictionary.

Quick Draw has collected more than 1 billion drawings across 345 categories, 50 million of which Google open-sourced last year — complete with metadata, including prompts and geographical user locations. Today, it’s making them available through Google Cloud Platform (GCP) in the form of an API and an accompanying Polymer component. (Polymer, for the uninitiated, is an open source JavaScript library for building web applications.)

Starting this week, any GCP customer who joins the public Google Group can add the API to their library by searching for it and adding it to a project. With the Polymer component, doodles can be displayed in a web-based app with a single line of code.

“When we released the dataset, it was basically a file for every one of the 345 categories, and it was a little bit cumbersome to work with,” Nick Jonas, creative technologist at Google’s Creative Lab, told VentureBeat in a phone interview. “A lot of these studies that have been done over the past year are large analyses on the entire dataset. We got some feedback from developers who said they wanted an easier way to prototype with the data quickly.”


Above: Adding the Quick, Draw! API to a project in Google Cloud.

Image Credit: Google

The Quick Draw API — which uses Google Cloud Endpoints to host a Node.js API, Jonas explained — provides access to the same 50 million drawings contained in the original dataset, but obviates the need to download them in their entirety. For each doodle, it returns a JSON object or an HTML canvas rendering.
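To give a sense of what working with a returned doodle looks like, here is a small sketch that turns one record's stroke data into an SVG path string. It assumes the simplified format used in the open dataset, where `drawing` is an array of strokes and each stroke is a pair of coordinate arrays `[[x0, x1, ...], [y0, y1, ...]]`; the sample doodle itself is made up.

```javascript
// Convert one doodle's strokes into a single SVG path string.
// Each stroke becomes a "move to" (M) followed by "line to" (L) segments.
function doodleToSvgPath(drawing) {
  return drawing
    .map(([xs, ys]) =>
      xs.map((x, i) => `${i === 0 ? "M" : "L"}${x},${ys[i]}`).join(" ")
    )
    .join(" ");
}

// A made-up two-stroke doodle in the simplified format.
const doodle = {
  word: "ladder",
  recognized: true,
  drawing: [
    [[0, 0, 10], [0, 20, 20]], // first stroke: three points
    [[15, 25], [10, 10]],      // second stroke: two points
  ],
};

console.log(doodleToSvgPath(doodle.drawing));
// "M0,0 L0,20 L10,20 M15,10 L25,10"
```

The resulting string can be dropped straight into an SVG `<path d="...">` element, which is one lightweight alternative to the canvas rendering the API provides.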

“It’s a way for users not to have to download gigs and gigs of data before they can start playing with it,” Jonas said.

There are surprising insights to be gleaned from the data. A study by Quartz in June found that 86 percent of U.S. players drew circles counterclockwise, while 80 percent of Japanese players drew them clockwise. (The difference, the publication found, can be attributed to the top-left-to-bottom-right stroke order in Japanese writing.) Meanwhile, an internal survey out of Google Research discovered that users from Western countries tend to doodle fish facing the opposite direction from those drawn by Asian users.
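An analysis like the Quartz circle study can be sketched with a few lines of geometry. The function below classifies a closed stroke as clockwise or counterclockwise using the shoelace formula; because drawing coordinates put y increasing downward (as on a canvas), a positive signed area corresponds to a clockwise stroke on screen. This is an illustrative approach, not Quartz's actual code.

```javascript
// Classify a stroke as clockwise (on screen) via the shoelace formula.
// xs and ys are parallel coordinate arrays; y grows downward.
function isClockwise(xs, ys) {
  let area = 0;
  for (let i = 0; i < xs.length; i++) {
    const j = (i + 1) % xs.length; // wrap around to close the loop
    area += xs[i] * ys[j] - xs[j] * ys[i];
  }
  return area > 0; // positive signed area = clockwise in screen coordinates
}

// A rough loop traced right -> down -> left -> up: clockwise on screen.
console.log(isClockwise([0, 1, 1, 0], [0, 0, 1, 1])); // true
// The same points visited in reverse run counterclockwise.
console.log(isClockwise([0, 1, 1, 0], [1, 1, 0, 0])); // false
```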

The dataset’s also been used creatively. British artist Neil Mendoza used a face-tracking algorithm to apply Quick Draw sketches atop a human head, and Germany-based computer scientist Deborah Schmidt tapped a subset of 300,000 random doodles to fill in letter templates with collages.


Above: Neil Mendoza mapping Quick Draw facial features to a human face.

Image Credit: Neil Mendoza

In the future, the team is considering migrating the doodles to a database, which would afford fine-grained access control. In theory, users would be able to perform queries like “Give me a recognized drawing from China in March 2017.”
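The kind of query Jonas describes can be illustrated as a plain in-memory filter over metadata records. The field names (`recognized`, `countrycode`, `timestamp`) follow the open dataset's metadata; the records themselves are made up, and a real database would of course index these fields rather than scan them.

```javascript
// Made-up metadata records in the shape of the open dataset's fields.
const records = [
  { word: "cat", recognized: true,  countrycode: "CN", timestamp: "2017-03-12 08:01:55" },
  { word: "cat", recognized: false, countrycode: "CN", timestamp: "2017-03-20 11:30:02" },
  { word: "cat", recognized: true,  countrycode: "US", timestamp: "2017-03-01 19:45:10" },
];

// "Give me a recognized drawing from China in March 2017."
const matches = records.filter(
  (r) =>
    r.recognized &&
    r.countrycode === "CN" &&
    r.timestamp.startsWith("2017-03")
);

console.log(matches.length); // 1
```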

“I just want to encourage people to use [the dataset] in new ways and to contribute and see how this can possibly expand,” Jonas said. “I just want to encourage developers to play with it.”