How big data is fueling a new age in space exploration

In 2018, a group of organizations from around the world will begin construction of the largest radio telescope ever built, the Square Kilometre Array (SKA). With one million square meters of collecting area and enough optical fiber to wrap around the Earth twice, this marvel of modern engineering will be sensitive enough to detect airport radar on a planet 50 light years away.
 
SKA will also generate 700 terabytes of data every second, equivalent to roughly 35 times the data stored in the Library of Congress. At full capacity, the SKA’s aperture arrays are expected to produce 100 times more data than the entire Internet. It doesn’t take a rocket scientist to realize that such a deluge of information creates a big data problem, perhaps the biggest we have ever encountered.
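The comparison above is easy to sanity-check with back-of-envelope arithmetic. The sketch below assumes a commonly cited (unofficial) estimate of roughly 20 terabytes for the digitized text of the Library of Congress:

```python
# Back-of-envelope check of the SKA data-rate comparison.
# The 20 TB figure for the Library of Congress is a rough,
# commonly cited estimate, not an official number.
ska_rate_tb_per_s = 700        # SKA raw output, terabytes per second
library_of_congress_tb = 20    # assumed size of the LoC's holdings

ratio = ska_rate_tb_per_s / library_of_congress_tb
print(f"SKA produces ~{ratio:.0f}x the Library of Congress per second")
```

With a 20 TB estimate, the ratio works out to 35, matching the figure quoted above.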
 
Solving this big data problem for the space industry will require innovation in data storage, processing, and access (or visualization) technologies, which in turn creates ample opportunities for startups and large data-crunching companies. A few major factors will drive exponential growth in the volume of data falling on us from the skies over the next couple of decades: the increasing pace of commercial satellite deployment, the implementation of faster communication technology, and the onset of interplanetary missions.