BigQuery, a cloud-based service from Google for analysing very large sets of data, is now publicly available after a period of limited-availability testing, Google said yesterday.
BigQuery is an online analytical processing (OLAP) system that can crunch data sets with billions of rows, spanning terabytes of data. It is aimed at businesses and enterprise developers that need to slice and dice “big data” in real time, according to Google.
Users can tap into the BigQuery service via several methods, including a web-based user interface, a REST API (application programming interface) and a command-line tool. Data can be imported into the Google BigQuery servers in CSV format.
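For the REST route, a query is sent as a JSON body to the service. The sketch below builds such a payload; the endpoint path follows the v2 API's synchronous query method, and the project, dataset, and table names are hypothetical. Actually sending the request would also require OAuth credentials, which are omitted here.

```python
import json

# Sketch only: builds the JSON body for a synchronous query against the
# BigQuery REST API (v2 "jobs.query" method). The project ID, dataset and
# table names are hypothetical placeholders.
ENDPOINT = "https://www.googleapis.com/bigquery/v2/projects/{project}/queries"

def build_query_request(sql, max_results=100):
    """Return the JSON payload for a BigQuery synchronous query."""
    return {"query": sql, "maxResults": max_results}

body = build_query_request(
    "SELECT word, COUNT(*) AS n FROM mydataset.words GROUP BY word"
)
print(json.dumps(body, sort_keys=True))
```

The same query could equally be typed into the web UI or passed to the command-line tool; all three front ends drive the same service.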
Pricing varies according to the amount of data being stored and the volume of queries processed.
Some early testers of BigQuery have built applications on it, according to Google. For example, a company called Claritics created a tool that lets game developers analyse user-behavior data in real time.
BigQuery isn’t suited to OLTP (online transaction processing) tasks, nor to making changes to the data sets stored on it. For companies that need a more traditional relational SQL database for managing small and medium data sets in OLTP scenarios, Google instead recommends its cloud-based database, Google Cloud SQL, which supports full SQL syntax and tables.
BigQuery was unveiled at the Google I/O developer conference two years ago and enhanced last November, when, among other things, the web-based UI was added.
With this release, Google continues to add to its arsenal of cloud-based enterprise computing products, a market where it competes with Amazon, Oracle, IBM, Microsoft and others.