LexisNexis Risk Solutions, which has been running its homegrown, large-scale computing system in production for 10 years, recently announced that it is sharing its wealth with the open-source community. It’s now offering an alternative to Hadoop, which gets most of the “big data” press these days.
The well-established risk-management and fraud-detection service provider has made this data-intensive supercomputing platform available under a dual-license, open-source spinoff called HPCC Systems. HPCC Systems can manage, sort, link and analyze billions of records within seconds. That’s right.
It is expressly designed to help an enterprise solve big data problems, and as anybody who has had to deal with the increasing influx of human- and machine-created data knows, those can be big-headache-type issues.
HPCC Systems, which has been tested, pounded upon and proven with customers since 2001, provides a high-performance computing cluster with a single architecture and a consistent data-centric programming language, ECL (Enterprise Control Language).
“We think the time is right to do this, and we also believe that HPCC Systems will take big data computing to the next level,” CEO James M. Peck told The Station.
“We’ve been doing this quietly for years for our customers with great success. We are now ready to present it to the community to spur greater adoption.”
Peck & Co. are also counting on the talents and creativity of the open-source community to spur interest and development in the platform, and he’s probably right: the processing of big data has never attracted more attention. Check it out right here for all the deets. This post is merely an intro; you can expect to see more about how this develops here on eWEEK.