Reservoir Labs Targets PERFECT Parallel Tools for DARPA
Developing sensor applications such as image formation, change detection and adaptive imaging for the new architectures requires specialized skill. Developers must understand the application and its physics, the required precision and algorithms, as well as the target hardware and parallelization and optimization techniques, Lethin said. This often forces an error-prone, two-step development process: first an application specialist uses a high-productivity language to build prototypes that validate the algorithm, and then a code jockey re-implements the application in a low-level language and parallelizes the software. The objective of SANE is to lower that barrier and reduce the requirements for writing successful code, so that developers need only understand the application, not the algorithms, target hardware and the rest.

Reservoir's R-Stream is a high-level compiler for embedded computing, parallel processing and HPC algorithms, designed to seamlessly generate parallelized code for target-specific or low-level C compilers, the company said. R-Stream is a source-to-source compiler: it accepts a sequential C program as input and produces code that has been parallelized and optimized for the new types of processors. It can output optimized code in a variety of formats for downstream processors, including highly optimized OpenMP and CUDA (originally short for Compute Unified Device Architecture). To achieve high performance, R-Stream performs multiple advanced transformations on the input C source. These include special forms of array expansion to remove constraints on parallelism, joint scheduling for parallelism and locality, task granularity selection, communications generation, software pipelining, memory region reshaping, and back-end dialect generation.
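R-Stream's actual output is far more sophisticated than any hand-written sketch, but the basic idea of a source-to-source parallelizer can be illustrated with a minimal, hypothetical example (not taken from R-Stream): a sequential C loop nest whose outer loop carries no dependences, and the OpenMP-annotated version such a tool might emit. Both functions below compute the same matrix-vector product; the pragma is simply ignored by compilers built without OpenMP support.

```c
#include <assert.h>

#define N 256

/* Sequential input: a dense matrix-vector product y = A * x. */
static void matvec_seq(double A[N][N], double x[N], double y[N]) {
    for (int i = 0; i < N; i++) {
        double acc = 0.0;
        for (int j = 0; j < N; j++)
            acc += A[i][j] * x[j];
        y[i] = acc;
    }
}

/* Illustrative parallelized output: each iteration of the outer loop
 * writes a distinct y[i] and reads shared data only, so the loop can
 * be marked parallel. A real tool would also consider tiling for
 * locality, task granularity, and the target memory hierarchy. */
static void matvec_par(double A[N][N], double x[N], double y[N]) {
    #pragma omp parallel for
    for (int i = 0; i < N; i++) {
        double acc = 0.0;           /* private accumulator per iteration */
        for (int j = 0; j < N; j++)
            acc += A[i][j] * x[j];
        y[i] = acc;
    }
}
```

This only scratches the surface: the transformations listed above (array expansion, joint scheduling, communications generation and so on) are precisely what distinguish a polyhedral mapper like R-Stream from a single hand-placed pragma.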
The resulting mapped program is more than simply parallelized; it represents a detailed choreography of computation and data motion across parallel units and through explicitly managed memory hierarchies, Lethin said.

Established in 1958, DARPA, then known simply as the Advanced Research Projects Agency (ARPA), is best known for its role in the creation of the Internet, which began with an idea to link time-sharing computers into a national system. The agency's ARPANET was a precursor to the Internet. A primary role of DARPA has been to spark innovations that strengthen the country's war-fighting capabilities. Over the years, however, many of the innovations that started at DARPA have found their way into the mainstream and into consumer and business uses.

Lethin said Reservoir hopes to add to that list. "The future of computing is 'physical' as embedded wireless sensors augmenting our reality," he told eWEEK. "The kinds of optimizations from our project, which are targeted at DOD ISR systems, will be useful for consumer devices performing video image processing, wireless communication, and other forms of sensing. It will also have application in big data processing."

Lethin also noted that Reservoir will be releasing research versions of its compiler on a quarterly basis; the next release is due Jan. 15, 2013. "We will license the compiler commercially for a fee and provide the compiler free to academics for research purposes," Lethin said. "We will also provide evaluation versions. The compiler uses the Gurobi tool internally for optimization. That can be embedded in the commercially licensed versions, and academics can obtain free licenses to Gurobi."
Among the innovations that started at DARPA and moved into the mainstream are voice-to-text software; distributed computing; projects that helped lead to Unix and Windows NT; packet switching; TCP/IP protocols; reduced instruction set computing; massively parallel processing; computer-aided design/computer-aided manufacturing; synchronous optical networking; asynchronous transfer mode; and computer graphics.