NEW YORK--(BUSINESS WIRE)--LexisNexis Risk & Information Analytics Group today announced that Sandia National Laboratories is leveraging the company's Data Analytics Supercomputer (DAS) to address the challenges posed by the exponential growth of data sets and next-generation informatics applications. In competitive performance tests conducted by LexisNexis Risk & Information Analytics Group and Sandia, the DAS, a leading-edge system built on high-performance computing cluster (HPCC) technology, performed 10 times faster than the next-fastest system on large-scale data analysis.
"While open source systems like Hadoop have come a long way in solving next-generation data challenges, the LexisNexis Data Analytics Supercomputer takes data analytics to the next level and enables extreme high-performance computing at a scale not previously available," said Armando Escalante, Chief Technology Officer of LexisNexis Risk & Information Analytics Group.
For more than thirty years, LexisNexis has been on the frontier of large-scale data management and analysis. The DAS was originally developed by LexisNexis Risk & Information Analytics Group to solve its internal data management and large-scale data analytics challenges and to deliver the speed and accuracy demanded by its expanding customer base. Today, the DAS powers all of the company's risk management solutions and helps customers solve large, complex data challenges such as national security issues.
Designed to handle the most complex and data-intensive analytical problems, the DAS offers industry-leading speed, capacity, accuracy, and ease of use. Built on high-performance computing cluster (HPCC) technology, the DAS processes, analyzes, and finds links and associations in high volumes of complex data significantly faster and more accurately than conventional systems. In addition, the DAS scales linearly from tens to thousands of nodes, handling tens of petabytes of data and supporting millions of transactions per day.
At the core of the DAS is Enterprise Control Language (ECL), a declarative programming language optimized for large-scale data management and query processing. ECL automatically manages workload distribution across all nodes, so programmers do not need to understand or manage the parallel processing environment themselves. LexisNexis reports that ECL programming efficiency far exceeds that of other approaches.
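To illustrate the declarative style described above, the following is a minimal ECL sketch (the dataset path and field names are hypothetical, chosen only for illustration). The programmer states *what* result is wanted; the platform decides how to partition and distribute the work across the cluster's nodes:

```ecl
// Hypothetical record layout for a distributed person dataset
PersonRec := RECORD
  STRING30 LastName;
  STRING30 FirstName;
END;

// Logical file name is illustrative; the data may span many nodes
Persons := DATASET('~thor::example::persons', PersonRec, THOR);

// Declarative aggregation: count records per surname.
// No explicit parallelism -- distribution is handled by the platform.
ByName := TABLE(Persons, {LastName, Cnt := COUNT(GROUP)}, LastName);

// Return the ten most common surnames
OUTPUT(TOPN(ByName, 10, -Cnt));
```

Note that nothing in the code references nodes, partitions, or message passing; that separation of query logic from execution strategy is what the paragraph above means by automatic workload distribution.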