How Big Data Analytics is Aiding Search for Flight 370
As the hours and days go by following the sudden and mysterious disappearance of Malaysia Airlines Flight 370 somewhere in Southeast Asia, more people and organizations are joining the search party. And they are using every tool at their disposal -- not the least of which is big data analytics -- to try to locate the Boeing 777, which carried 239 people and has left thousands of family members and friends heartsick.
Daniel Hardman, chief architect of Adaptive Computing in Provo, Utah, is a supervisor of sorts in this search. Adaptive is best known for the Moab data analytics platform, which is used by a number of enterprises globally, including Oak Ridge National Laboratory, the University of Cambridge, and The Weather Channel.
Adaptive Computing's Moab HPC Suite and Cloud Suite are an integral part of the company's Big Workflow data center package, which the company claims unifies all data center resources, optimizes the analysis process and guarantees services. Big Workflow derives its name from its ability to solve big data problems by streamlining workflows to deliver insights from massive quantities of data across multiple platforms, environments and locations.
Teaming Up for the Search
DigitalGlobe is an Adaptive customer and also deeply involved in the search. Based in Longmont, Colo., DigitalGlobe is a commercial vendor of space imagery and geospatial content, as well as an operator of civilian remote-sensing spacecraft.
It also has a crowdsourcing website, Tomnod, that it is using to enable the public to join in the jetliner hunt.
DigitalGlobe and Adaptive are currently operating a crowdsourced search for Flight 370, which was, as of March 16, still missing after disappearing from the world's radar screens on March 9. Anybody can join the search by going to the Tomnod site; you will need to enter your email address to participate.
See this article in Fast Company for more details about the crowdsourcing search for Flight 370.
"When DigitalGlobe gets a call, it's often under great time pressure," Hardman told eWEEK. "It's usually on the order of: 'We desperately need to search an area of the world, and we're looking for item X.' They have to train their satellites on that particular part of the world. They then gather their latest images, and then they run them through a gauntlet of computational steps. These steps involve taking [photo] overlays and reconciling the pixels in the photos so they don't overlap, and then building a giant mosaic of lots of small pictures."
Search from the Sky Focuses on Ocean
DigitalGlobe is focusing its search on the oceans around Malaysia, not on land. Its satellites take scores of photos, which are transmitted into its big data storage banks. Corrections are made to the photos as needed, such as making colors consistent and the contrast uniform, adjusting for the different camera angles [because the satellites are always moving], and detecting clouds that obscure the view. The system then eliminates photos that are unusable.
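The correction-and-filtering stage might be sketched like this. The `Photo` fields, cloud-cover threshold and brightness target below are assumptions for demonstration, not DigitalGlobe's real criteria:

```python
# Sketch: drop unusable frames (too much cloud), then give the survivors a
# uniform-contrast pass so frames can be compared side by side.
from dataclasses import dataclass

@dataclass
class Photo:
    photo_id: str
    cloud_cover: float      # fraction of frame obscured by cloud, 0.0-1.0
    mean_brightness: float  # average pixel value, 0-255

def preprocess(photos, max_cloud=0.4, target_brightness=128.0):
    """Discard cloudy frames, then normalize brightness across the rest."""
    usable = [p for p in photos if p.cloud_cover <= max_cloud]
    for p in usable:
        # Stand-in for a real contrast correction: pull every frame's
        # mean brightness to a common target value.
        p.mean_brightness = target_brightness
    return usable

batch = [
    Photo("A1", cloud_cover=0.1, mean_brightness=96.0),
    Photo("A2", cloud_cover=0.8, mean_brightness=140.0),  # too cloudy: dropped
    Photo("A3", cloud_cover=0.3, mean_brightness=180.0),
]
kept = preprocess(batch)
print([p.photo_id for p in kept])  # ['A1', 'A3']
```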
DigitalGlobe uses Adaptive's Big Workflow package, with Moab as the centerpiece, to dynamically allocate resources, maximize data throughput and monitor system efficiency as it analyzes its imagery; DigitalGlobe's archives already contain more than 4.5 billion square kilometers of global coverage. With that much coverage on hand, opening the Tomnod platform so that volunteers could help find Flight 370 using satellite imagery was a natural step.
"They will create a custom algorithm that says: 'Here's what a whole plane looks like; here's what possible pieces and parts look like,'" Adaptive spokeswoman Jill King told eWEEK. "They are training their computers to look for those types of shapes. It can also look for certain colors -- even certain types of reflective light.
"The data center then will use Moab to analyze each of those shapes to see if they're a match [to Flight 370]."
Using Big Workflow, DigitalGlobe can process geospatial big data and identify shapes such as aircraft within 90 minutes, aiding rescuers during natural disasters and events such as the disappearance of Flight 370.
Big Workflow Enables Big Search Area
"Moab enables our responsiveness when disaster strikes," said Jason Bucholtz, principal architect at DigitalGlobe. "With Big Workflow, we have been able to gain insights about our changing planet more rapidly—all without adding new resources to our existing infrastructure."
In the unusual case of the missing Flight 370, DigitalGlobe is gathering as much human-created data as possible through the crowdsourcing model.
"On Tomnod.com, any person can go and just look at photos in the grid, and you're supposed to flag anything that looks interesting," Hardman said. "The problem is, humans can see lots of things, but they might not always be the right things. For example, someone may see something they think is unusual, but it's really just some floating garbage or a whitecap wave that may look a little suspicious.
"So what [DigitalGlobe] does is get the input of many thousands of people, run it through big data filters on the back end that say things like: 'Are there areas of the Indian Ocean where a lot of people have flagged an item of interest?' They then do cluster analysis on that. Then, experts in search-and-rescue may say, 'There's a hot spot, go fly over this.'"
As of 2 p.m. Pacific time March 15, the free-to-use Tomnod.com site indicated that 421,338 photos -- each showing an area of the Indian Ocean about 1,000 to 2,000 feet wide -- had been produced and entered into the analytics engine.
"I don't know what size grid that ends up being, but it's a wide, wide area of the western side of the search," Hardman said.
As of March 15, the search was still on, with many thousands of human eyes -- and a very high-powered photo/data analytics engine -- trying to solve the mystery of the missing airliner.