It is now standard practice to connect any new piece of machinery or plant to the cloud to record and use its data, an approach often referred to as creating a 'Digital Twin' or as 'The Internet of Things'.
This is driven partly by the almost negligible cost and power requirements of sophisticated embedded systems, and partly by the rise of cloud services such as Amazon AWS and Microsoft Azure with their associated state-of-the-art AI and ML tools.
So we are currently installing remote condition monitoring collectors (RCMCs) on hundreds of remote devices.
Each RCMC collects data from a number of sensors, such as:
Logic signalling events
Electronic control unit logs
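To make the kind of data concrete, here is a minimal sketch of what a single collected event might look like on the wire. The field names and values are illustrative assumptions, not the project's actual schema.

```python
import json
from datetime import datetime, timezone

# Hypothetical shape of one RCMC event record (field names are assumptions).
event = {
    "device_id": "rcmc-0042",
    "sensor": "logic_signal",  # e.g. a logic signalling event or an ECU log line
    "timestamp": datetime(2021, 7, 2, 12, 0, tzinfo=timezone.utc).isoformat(),
    "payload": {"channel": 3, "state": "HIGH"},
}

# Records are serialised to JSON for transport to the ingest API.
wire = json.dumps(event)
decoded = json.loads(wire)
```

A flat, self-describing record like this serialises cleanly to JSON and maps directly onto both SQL rows and NoSQL documents downstream.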
Server load in this case is predictable, so the choice was to build on a known number of Ubuntu compute instances, in this case hosted on DigitalOcean. The software stack is fully open source, with a microservices architecture.
Data ingest is implemented with token-based authentication over TLS 1.3, using a dockerized nginx reverse proxy in front of a custom Go application that exposes a RESTful API with JSON bodies (many alternatives were investigated, including MQTT).
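The two server-side checks described above, bearer-token authentication and JSON body validation, can be sketched as follows. This is an illustrative Python sketch, not the production Go code; the token value and function names are assumptions.

```python
import hmac
import json

# Illustrative token; in production tokens would be provisioned per device.
EXPECTED_TOKEN = "s3cr3t-device-token"

def authorized(headers: dict) -> bool:
    """Accept requests carrying 'Authorization: Bearer <token>'."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    # Constant-time comparison avoids leaking the token via timing.
    return hmac.compare_digest(auth[len("Bearer "):], EXPECTED_TOKEN)

def parse_body(raw: bytes):
    """Return the decoded JSON body, or None if it is malformed."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return None
```

A request failing `authorized` would get a 401; one failing `parse_body` a 400. Note that token auth only makes sense here because nginx terminates TLS in front of the service, so the token never travels in the clear.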
To achieve high performance at a reasonable hosting cost, incoming data is first cached in a dockerized Redis in-memory store. A Python/Django background service then batch-reads from the cache and bulk-writes to a combination of SQL and NoSQL databases, all dockerized.
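The cache-then-batch pattern can be sketched self-contained: here a `deque` stands in for the Redis list and SQLite for the SQL database, so the batching logic runs without external services. The function names and batch size are assumptions for illustration.

```python
import sqlite3
from collections import deque

# In-memory stand-in for the Redis list used in production.
cache = deque()

def enqueue(reading):
    """Accept one reading from the ingest service (production: Redis LPUSH)."""
    cache.append(reading)

def drain(batch_size):
    """Pop up to batch_size readings from the cache (production: Redis RPOP)."""
    batch = []
    while cache and len(batch) < batch_size:
        batch.append(cache.popleft())
    return batch

# Bulk-write each batch in a single executemany round trip.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (device_id TEXT, value REAL)")

for i in range(250):
    enqueue((f"rcmc-{i % 5}", float(i)))

while cache:
    batch = drain(100)
    db.executemany("INSERT INTO readings VALUES (?, ?)", batch)
    db.commit()

count = db.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
```

The point of the design is that each database round trip carries a whole batch rather than one row, so write throughput scales with batch size while the cache absorbs bursts from the devices.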
Python is a great choice for the server implementation because of the many high-quality data science libraries available, and because prototyping in PyCharm and Jupyter Notebooks shortens development time. There are also many Python developers available worldwide for further development and maintenance.
The combination of Go for performance bottlenecks and Python for analysis is a well-proven one.
If you are interested in or require help with projects involving any of these topics, please contact us or call 0333 444 0763.
2nd July 2021