The New-Age thruster for space exploration
The story of Apollo 13 is one of courage, innovation, partnership – and a treasure trove of data, expertly backed up, archived and stored using the systems made available to the mission. At first glance, the connection between space exploration and data may seem unclear. But the truth is, without data there can be no missions into the unknown.
To put it in perspective, National Aeronautics and Space Administration (NASA) missions collect several hundred terabytes of data per hour. The premier space exploration and aviation research agency runs dozens of missions, which means it relies on a gold mine of data.
Digital recordings linked to the launch and tracking of satellites, along with images of the Earth and distant galaxies, can yield solutions to some of the greatest challenges facing the human race.
All digital recordings must therefore be indexed, stored and processed so that scientists, software engineers, space engineers and people around the world can exploit the data. This requires a new class of solutions and policies to manage extremely complex record retention, eDiscovery and security requirements.
Fuel storage for a long, smooth journey
There are three main areas of data management in the context of space missions: storage, processing and access.
Data storage is the most difficult of the three, simply because of the enormous volumes involved. For example, once the Square Kilometre Array (SKA) – an array of thousands of telescopes in Australia and South Africa – is operational, it will send 700 terabytes of data per day, equivalent to doubling the data flowing through the Internet every day.
Another example is Dhruva Space, an Indian space technology start-up known for building small satellite systems for commercial and government use, which uses its satellites to gather huge amounts of terrestrial data from space and retransmit it to Earth for businesses and industries to generate value from. Faced with this Big Data, technology leaders are moving away from deploying more hardware and towards innovative software solutions and cloud models that store and retrieve data more efficiently.
Much like space travel, data also travels an unknown path, whether across multiple clouds, containers, or both. While multi-cloud and containerized environments offer simplicity and ease of scalability for data management, it is imperative to mitigate and minimize the risks of these uncharted territories through proactive data management.
Democratize data: high availability and accessibility
It is crucial to provide transparent access to archival data. Online archives provide primary access to raw and calibrated mission data. Scientists combine and analyze datasets across traditionally separate wavelength boundaries.
Therefore, it is important to determine where to store long-term archival data and how to make it durable and usable for the community.
In addition to overcoming archival challenges, technology leaders in space exploration also contend with complexity. The Mars Reconnaissance Orbiter, launched by NASA to study the climate and geology of the red planet, regularly returns images to Earth, each containing 120 megapixels. Needing advanced means to visualize this information, technology leaders at NASA’s Jet Propulsion Laboratory create computer graphics, animations and movies from the datasets.
To ensure that engineers and scientists can easily use the data, IT managers are working to automate the creation of these visualization products, paving the way for innovations that make the datasets received from space easy to work with.
Another aspect of big data is availability: users should be able to easily access whatever they need from archives. The data accumulated by the instruments aboard the Indian Space Research Organisation’s (ISRO) Chandrayaan-1 – its first mission to the Moon – has been widely used to answer questions in lunar science and in remote-sensing applications that study lunar evolution.
Satellite data is an important source of information for climate activists and environmentalists. However, the possibilities are not limited to comparing and viewing satellite images. By combining different images, one can derive critical information about soil hydration levels or vegetation health. Timely research in these areas can go a long way in protecting our environment.
It is therefore imperative that data is always available and accessible. The data management solution must eliminate single points of failure and promise continuous availability of operations over long periods of time.
Data analysis is the key to unraveling the mysteries of the universe, and processing data in milliseconds is essential. Ultimately, it boils down to having a solution that promises the high availability of big data.
Towards rock-solid space exploration
The coming decades will accelerate the growth in the number of terabytes generated by human exploration of space. The dawn of interplanetary missions, the deployment of faster communications technologies, and the acceleration of the deployment of commercial satellites will all contribute to this end.
The space industry is gradually moving in a direction where everything will be governed by digital connections to data, models and software. This digitization is necessary to ensure that all the data collected is used to the maximum.
To build a smart, resilient and secure data solution, space exploration agencies – public or private – must align with IT vendors capable of delivering a cutting-edge, scalable, borderless architecture that enables a seamless flow of data to and from the cloud. With such a state-of-the-art data solution, they will be able to control and mitigate risk through a single command center while protecting data from attack. As the challenges of Big Data grow, space agencies will need to keep innovating in technologies for processing, storing and visualizing (accessing) data.
To weather the inbound data storm, they must continue to align with a strong data protection and management partner and develop new strategies because, in space, outages are not an option.
–Pradeep Seshadri is Director, Sales Engineering at Commvault India and SAARC