Google today announced several enhancements to its BigQuery cloud-based data analytics service. The biggest one is a new option for setting quotas, designed to keep users from overspending on the service on any given day.

Using the new Custom Quotas feature, admins can set quotas for individual users or for entire projects, Google technical product manager Tino Tereshko wrote in a blog post.

Google is also rolling out something called Query Explain, which puts your SQL queries through a sort of X-ray to show their infrastructural impact.

“You can now see if your queries are write, read or compute heavy, and where any performance bottlenecks might be,” Tereshko wrote. “You can use BigQuery Explain to optimize queries, troubleshoot errors or understand if BigQuery Slots might benefit you.”
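To make the read/write/compute distinction concrete, here is a minimal sketch of how a per-stage breakdown like the one Query Explain surfaces could be summarized. The ratio field names are modeled on BigQuery's query-plan output and should be treated as assumptions rather than something confirmed in the announcement; the numbers are made up.

```python
# Hypothetical sketch: label a query-plan stage by its dominant cost,
# the kind of signal Query Explain exposes. Field names (waitRatioAvg,
# readRatioAvg, etc.) are assumed from BigQuery's queryPlan format.

def dominant_phase(stage):
    """Return the phase ('wait', 'read', 'compute', 'write') with the
    highest average time ratio for one query-plan stage."""
    phases = {
        "wait": stage.get("waitRatioAvg", 0.0),
        "read": stage.get("readRatioAvg", 0.0),
        "compute": stage.get("computeRatioAvg", 0.0),
        "write": stage.get("writeRatioAvg", 0.0),
    }
    return max(phases, key=phases.get)

# An example stage with invented numbers: mostly compute-bound.
stage = {"name": "Stage 1", "readRatioAvg": 0.12,
         "computeRatioAvg": 0.81, "writeRatioAvg": 0.07}
print(dominant_phase(stage))  # compute
```

A stage that spends most of its time in "compute" points at expensive expressions or joins, while a "read"-heavy stage suggests scanning more data than necessary.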

Part of the growing Google Cloud Platform, BigQuery abstracts away the underlying compute and storage infrastructure, presenting a simple data warehouse for loading and querying data. It’s comparable to Redshift from public cloud market leader Amazon Web Services, or Microsoft Azure’s SQL Data Warehouse.

Google also recently enhanced the performance of Cloud SQL, its core managed database service.

Besides enhancing existing services, Google is also rolling out new features (like the Cloud CDN), lowering prices, and expanding its presence geographically.

Indeed, as of today the streaming application programming interface (API) for getting data into BigQuery supports deployments in the European Union; the Google cloud has a data center facility in St. Ghislain, Belgium. And going forward, there will be no more waiting for tables to warm up: you can query a table immediately after you start streaming data into it, Tereshko wrote.
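For a sense of what streaming into BigQuery looks like at the wire level, here is a hedged sketch that builds a request body for the public tabledata.insertAll REST endpoint. The project, dataset, and table names are placeholders, and the payload shape is an assumption to verify against the API reference rather than something the announcement spells out.

```python
# Hypothetical sketch: construct the URL and JSON body for a BigQuery
# streaming insert (tabledata.insertAll). Names below are placeholders.
import json

def insert_all_request(project, dataset, table, rows):
    """Build the URL and JSON body for a streaming insertAll call.
    Each row carries an insertId so BigQuery can de-duplicate retries."""
    url = ("https://www.googleapis.com/bigquery/v2/projects/"
           f"{project}/datasets/{dataset}/tables/{table}/insertAll")
    body = {
        "kind": "bigquery#tableDataInsertAllRequest",
        "rows": [{"insertId": str(i), "json": row}
                 for i, row in enumerate(rows)],
    }
    return url, json.dumps(body)

url, payload = insert_all_request("my-project", "eu_dataset", "events",
                                  [{"user": "u1", "action": "click"}])
```

Per the announcement, once such an insert succeeds the table is queryable right away; there is no longer a warm-up period before streamed rows become visible.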

He also wrote that BigQuery is finally getting audit logs to show what users are doing with the service.

Google first announced BigQuery in 2010 and made it available to all developers in 2012.