Big Time Series Data


We have delivered phase 1 of a project storing 1,500 real-time events per minute into a 700-million-row (and growing) time series table.

We use a carefully curated set of Python-heavy tools, deployed in a modular architecture, offering high-capacity data storage, manipulation and visualisation at a competitive cost.

A cost so competitive that it enables novel applications.

State-of-the-art time series downsampling lets you inspect millions of samples in an instant, hosted either locally on your PC or inexpensively in the cloud.
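To give a feel for the idea (a minimal sketch, not our production code), a simple min/max bucket downsampler in Python keeps the peaks and troughs of each bucket so a plot of a few thousand points still shows the shape of millions of samples; the array names and bucket count below are illustrative:

```python
import numpy as np

def minmax_downsample(timestamps, values, n_buckets=1000):
    """Keep the min and max of each bucket so extremes survive downsampling."""
    n = len(values)
    if n <= 2 * n_buckets:
        return timestamps, values  # already small enough to plot directly

    # Split the series into roughly equal buckets.
    edges = np.linspace(0, n, n_buckets + 1, dtype=int)
    keep = []
    for start, stop in zip(edges[:-1], edges[1:]):
        chunk = values[start:stop]
        # Keep the bucket's extremes, preserving time order.
        keep.extend(sorted({start + int(np.argmin(chunk)),
                            start + int(np.argmax(chunk))}))
    idx = np.array(keep)
    return timestamps[idx], values[idx]

# Example: reduce 5 million synthetic samples to a few thousand plot points.
t = np.arange(5_000_000)
v = np.sin(t / 50_000) + np.random.normal(scale=0.1, size=t.size)
t_small, v_small = minmax_downsample(t, v)
```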

Our tech stack takes 100x less disk space than Postgres by using column-oriented storage, which enables high levels of lossless compression. These database technologies are excellent for OLAP (Online Analytical Processing) applications.
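As a rough illustration of the mechanism (not our exact pipeline), columnar formats such as Parquet compress repetitive sensor columns very effectively; the file and column names here are invented for the example:

```python
import os
import numpy as np
import pandas as pd

# A synthetic hour of 1 Hz sensor data: a slowly varying temperature and a
# mostly constant boolean flag compress very well when stored column by column.
n = 3600
df = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=n, freq="s"),
    "temperature_c": np.round(20 + np.cumsum(np.random.normal(0, 0.01, n)), 2),
    "door_open": np.zeros(n, dtype=bool),
})

# Row-oriented CSV versus column-oriented, zstd-compressed Parquet.
df.to_csv("sensors.csv", index=False)
df.to_parquet("sensors.parquet", compression="zstd", index=False)

print("csv bytes:", os.path.getsize("sensors.csv"))
print("parquet bytes:", os.path.getsize("sensors.parquet"))
```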

Our current application is deployed for remote condition monitoring of a fleet of trains. We sample a range of data types including boolean logic levels, voltages, pressures, temperatures, humidity, speed, lat/lon and many more. We can also run custom state machine logic rules over the data in real time to rapidly diagnose bespoke fault conditions.
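As a simplified, hypothetical example of the kind of rule we mean (the signal name, threshold and trip count are invented for illustration), a fault rule might only trip after a condition has held for several consecutive samples, so a single noisy reading doesn't raise an alarm:

```python
from dataclasses import dataclass

@dataclass
class OverPressureRule:
    """Trip a fault once pressure stays above a threshold for
    `trip_count` consecutive samples."""
    threshold_bar: float = 8.5
    trip_count: int = 5
    _hits: int = 0
    tripped: bool = False

    def update(self, pressure_bar: float) -> bool:
        if pressure_bar > self.threshold_bar:
            self._hits += 1
        else:
            self._hits = 0
            self.tripped = False
        if self._hits >= self.trip_count:
            self.tripped = True
        return self.tripped

# Feed samples in arrival order; the rule trips on the fifth high reading.
rule = OverPressureRule()
for p in [8.0, 8.6, 8.7, 8.9, 9.0, 9.1, 9.2]:
    print(p, rule.update(p))
```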

Whatever your industry, if you'd like to understand the costs and approaches for storing and processing large volumes of time series data, get in touch to find out more.

Use the contact form below to get started on your MVP now.

Demonstration Django Plotly Dash App

CONTACT

Contact us and we'll get back to you within 24 hours.

Hampshire, UK

info@rockstonedata.co.uk

Contact Us