KX Use Case: Real-Time Processing of ELT Telemetry Data

24 October 2018 | 5 minutes

By John Lockhart

 

Construction of the world’s largest optical telescope is underway on a mountaintop in northern Chile, led by the European Southern Observatory (ESO), a 16-nation intergovernmental research organization for ground-based astronomy. The Extremely Large Telescope (ELT) is expected to significantly expand astrophysical knowledge and is planned for completion in 2024. As part of the preparations for the telescope’s launch, the ESO is already working on how to manage and analyze the enormous streams of astronomy data it anticipates the ELT will generate, as well as the telemetry data created in the operation of the telescope itself.

Real-time telemetry data for predictive maintenance

A unique element of the ELT is its advanced system of mirrors which will enable astronomers to study the universe with greater clarity than is currently possible. The precise positioning of these mirrors when they are deployed for research missions is a critical aspect of their effectiveness.

The ELT’s primary mirror has a 39-meter surface made up of 798 individual segments, each of which will have sensors on actuators that can move the segments. The sensors will produce position and condition data, as well as other information, at a rate that far exceeds what is collected from current telescopes, and also exceeds what conventional database technologies can handle. The sensors will also generate many more messages, and different types of messages, than today’s telescopes. In one possible scenario, the ESO theorizes that there could be over 28 million telemetry sensor values conveyed at a rate of 1.2 million messages per second.
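
As a rough illustration only, the sketch below shows how such a telemetry stream might be modeled as a kdb+ table in q. The table, column names and types are assumptions for the purpose of this example, not the ESO’s actual schema.

```q
/ hypothetical schema for ELT segment telemetry; names and types are illustrative only
telemetry:([]
  time:`timestamp$();       / nanosecond-resolution capture time
  segment:`int$();          / primary-mirror segment id
  sensor:`symbol$();        / sensor channel, e.g. `posX`posY`posZ
  val:`float$())            / measured value

/ append a single simulated reading to the in-memory table
`telemetry insert (.z.p; 42i; `posX; 0.0013)
```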

As the ELT’s primary mirror is maneuvered throughout a night of study, the ESO needs to know the exact alignment of each of the segments, because all of the segments need to behave together as one mirror. If they do not, the ESO wants to be able to quickly identify and reposition any segment of the mirror that is out of alignment. The ability to analyze streaming sensor telemetry from the segments in real time will speed up repairs and enable predictive maintenance that anticipates anomalies and negative events before they occur.
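
Continuing the illustrative schema above, a real-time alignment check of this kind could be expressed as a simple qSQL query; the one-second window and tolerance value below are hypothetical.

```q
/ illustrative real-time alignment check; tolerance and columns are assumptions
tol:0.005                                    / hypothetical alignment tolerance

/ latest reading per segment and sensor channel over the most recent second
latest:select last val by segment, sensor from telemetry
  where time > .z.p - 0D00:00:01

/ segments whose latest position reading breaches the tolerance
misaligned:select distinct segment from latest where abs[val] > tol
```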

The KX difference

KX recently conducted a study to demonstrate to the ESO the potential of KX technology for ingesting, storing and analyzing telemetry data based on both simulated ELT sensor data and historical data from a large telescope currently operating in Chile.

The study was conducted on the Amazon cloud (AWS) and showed that KX technology, with its underlying kdb+ database platform, met and sometimes exceeded targets set by the ESO: it quickly and accurately ingested data as it was produced, identified and flagged out-of-range values, and queried the data, all while operating on a small hardware footprint.

The sheer volume of data generated when the ELT is operational will require a solution that can handle millions of events and measurements per second, and gigabytes to petabytes of historical data at nanosecond resolution. KX, with its columnar, time-series database, kdb+, demonstrated its ability to perform more efficiently and cost-effectively than any available alternative, which makes it uniquely suited for the ELT predictive maintenance use case.
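
To give a flavor of querying such a historical store, here is a sketch of a qSQL aggregation over a hypothetical date-partitioned version of the same telemetry table; the date, sensor channel and column names are illustrative assumptions, not queries from the study.

```q
/ illustrative aggregation over a hypothetical date-partitioned telemetry history
/ hourly mean and maximum reading per segment for one night of observation
select avgVal:avg val, maxVal:max val
  by segment, hour:60 xbar time.minute
  from telemetry
  where date=2024.01.15, sensor=`posX
```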

Going forward in real time

The ESO Council has ambitious plans for the ELT, which will address some of astronomy’s most challenging questions. Areas of research are expected to include probing early-stage planetary systems, primordial stars and dark matter, as well as space research topics that cannot be known in advance of its operation.

A problem that can be tackled today, though, is the design of the advanced telemetry data analytics system for ELT predictive maintenance, learnings from which might be useful for currently operating telescopes as well. Accurate telemetry analytics based on sensor data from the mirror segments is a first step in determining whether an image is flawed or correct, which in turn is necessary information for the research scientists relying on those images.

Until now, operators of large telescopes have found sensor data ingestion and scaling challenging, and they have not had the ability to analyze the data in real time. Instead, they have run batch processing every 15 minutes to put data into a data store, and in some cases these batch loads are transferred to an online transaction processing (OLTP) application. This method will not be able to keep up with the volume and velocity of data from the ELT mirror telemetry sensors.
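
By way of contrast, the sketch below shows the shape of a streaming alternative in q: a minimal update callback of the kind a kdb+ real-time subscriber defines, so each batch of sensor messages is appended and checked as it arrives rather than every 15 minutes. It reuses the hypothetical telemetry table and tolerance from the earlier sketches and is not the ESO’s actual design.

```q
/ minimal real-time update callback, as a kdb+ subscriber process might define it
/ (illustrative only; assumes each incoming batch x is a table shaped like telemetry)
upd:{[t;x]
  t insert x;                                      / append the batch as it arrives
  bad:select from x where abs[val] > tol;          / illustrative tolerance check
  if[count bad; -1 "out-of-range readings: ",string count bad];
  }
```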

KX technology, with an ingestion rate of over 4,000,000 sensor points per second on a single core and the ability to quickly store and analyze trillions of records, can give organizations like the ESO accurate telemetry data in real time. The research benefits of this shift in positional awareness of the 798 segments of the ELT’s primary mirror, for example, are enormous, as are the cost savings made possible by anticipating anomalies and errors before they occur.
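
As a purely illustrative sketch of bulk ingestion in q (simulated data, not the benchmark run in the study), the snippet below builds one million random sensor readings and times their insertion into the in-memory table defined earlier.

```q
/ illustrative bulk-ingest timing; the data is simulated and figures are not the study's results
n:1000000
batch:([]
  time:.z.p + til n;                         / nanosecond-spaced timestamps
  segment:1i + n?798i;                       / random segment ids 1-798
  sensor:n?`posX`posY`posZ;                  / random sensor channels
  val:n?1f)                                  / random readings

\t `telemetry upsert batch                   / elapsed milliseconds for one million rows
```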

John Lockhart is head of the ESO IT Group in Garching, having joined ESO as a database administrator almost 20 years ago. Although settled in Bavaria, John is originally from Newry, Co. Down.

 
