New Samplify APAX Storage Library Accelerates Disk Throughput and Expands Storage Capacity for HPC, Big Data, and Cloud Computing Applications
APAX HDF Library Accelerates HDF-Enabled Applications 3-8X Without Any Software Changes
CAMPBELL, Calif., June 14, 2013 /PRNewswire/ -- Samplify, the leading intellectual property company for solving memory, storage, and I/O bottlenecks in computing, consumer electronics, and mobile devices, announces the availability of its APAX HDF (Hierarchical Data Format) Storage Library for high-performance computing (HPC), Big Data, and cloud computing applications. With APAX HDF, HPC users can accelerate disk throughput 3-8X and reduce the storage requirements of their HDF-enabled applications without modifying their application software. The APAX HDF Storage Library works with Samplify's APAX Profiler tool to analyze the inherent accuracy of each dataset being stored, and it applies the recommended encoding rate to maximize acceleration of algorithms with no effect on results.
"Our engagements with government labs, academic institutions, and private data centers reveal a continuous struggle to manage an ever increasing amount of data," says Al Wegener, Founder and CTO of Samplify. "We have been asked for a simpler way to integrate our APAX encoding technology in Big Data and cloud applications. By using plug-in technology for HDF, we enable any application that currently uses HDF as its storage format to get the benefits of improved disk throughput and reduced storage requirements afforded by APAX."
Next week at the International Supercomputing Conference (ISC'13), a paper co-authored by Deutsches Klimarechenzentrum (DKRZ) [German Climate Computing Centre], the University of Hamburg, and Samplify reports, "The most easily obtained benefit from lossy compression of climate datasets is a significant reduction in disk file size and a corresponding increase in disk bandwidth." On disk throughput, the authors observe, "APAX appears to be faster… APAX is a single-pass algorithm which leads to better cache usage." Comparing compression performance, the authors note, "APAX averaged 1.6X more compression." The authors conclude, "APAX offers better encoding for most climate variables due to its superior compression or data quality."
About HDF
HDF is an open-source library developed and maintained by The HDF Group. HDF5 is the underlying file format of NetCDF-4, the standard file format for the interchange of climate data. In the current release, HDF5 1.8.11, The HDF Group added support for third-party plug-ins in the storage pipeline. Samplify's APAX plug-in for HDF enables transparent access to APAX compression technology without requiring modification of the application software. When used in conjunction with Samplify's APAX Profiler, the encoding rate can be optimized for each dataset stored in the file.
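For readers curious what "transparent access" looks like in practice, the sketch below shows how an HDF5 1.8.11 application requests a dynamically loaded third-party filter through the standard H5Pset_filter call. The filter ID (32000) and the encoding-rate parameter are illustrative assumptions for this sketch, not values documented by Samplify; the actual values for the APAX plug-in would come from its documentation.

```c
#include <hdf5.h>

#define APAX_FILTER_ID 32000  /* hypothetical ID; real third-party IDs are assigned by The HDF Group */

int main(void) {
    hsize_t dims[2]  = {1024, 1024};
    hsize_t chunk[2] = {256, 256};

    hid_t file  = H5Fcreate("climate.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space = H5Screate_simple(2, dims, NULL);

    /* HDF5 filters apply only to chunked datasets. */
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 2, chunk);

    /* Request the third-party filter by its registered ID; HDF5 loads
       the plug-in at runtime from HDF5_PLUGIN_PATH.  The single client
       value here (a target encoding rate) is an assumption. */
    unsigned cd_values[1] = {4};
    H5Pset_filter(dcpl, APAX_FILTER_ID, H5Z_FLAG_MANDATORY, 1, cd_values);

    hid_t dset = H5Dcreate2(file, "temperature", H5T_NATIVE_FLOAT, space,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT);

    /* ... subsequent H5Dwrite() calls pass through the filter transparently ... */

    H5Dclose(dset);
    H5Pclose(dcpl);
    H5Sclose(space);
    H5Fclose(file);
    return 0;
}
```

Because the plug-in is resolved at runtime through the HDF5_PLUGIN_PATH environment variable, existing HDF5 readers such as h5dump can likewise decode the data without recompilation, which is what makes the "no software changes" claim possible.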
About APAX
The APAX technology is a universal numerical data encoder that operates on any integer or floating-point data type and achieves typical encoding rates of 3:1 to 8:1 without affecting the results of computing applications. Samplify's APAX SDK is a software library that can be linked into any computing application, enabling it to operate natively on APAX-encoded data in memory, on disk, or streaming across network interfaces. The APAX SDK is optimized for SIMD execution on Intel CPUs with SSE/AVX, achieving a throughput of 200 MB/sec per core, and its API is fully compatible with the company's APAX hardware IP core for SoC and FPGA integration, allowing APAX-enabled applications to take advantage of future APAX-enabled hardware in the data center. The web-based APAX Profiler analysis tool analyzes the inherent accuracy of the user's dataset and recommends an encoding rate that maximizes acceleration of the user's algorithm with no effect on results.
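Samplify has not published the APAX SDK's programming interface here, but the underlying idea of a rate-controlled numerical encoder trading expendable precision for bandwidth can be illustrated with a toy stand-in. The sketch below simply zeroes low-order mantissa bits of IEEE-754 floats; it is emphatically not the APAX algorithm, only a self-contained illustration of how controlled precision reduction yields lower-entropy, more compressible numerical data.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Toy illustration only -- NOT the APAX algorithm.  Zero the low mantissa
   bits of each float so that downstream encoding can pack the data more
   tightly, at a precision cost chosen by the caller. */
static void truncate_mantissa(float *buf, size_t n, int keep_bits) {
    uint32_t mask = ~((1u << (23 - keep_bits)) - 1u);  /* keep sign, exponent, top bits */
    for (size_t i = 0; i < n; i++) {
        uint32_t w;
        memcpy(&w, &buf[i], sizeof w);  /* type-pun without aliasing violations */
        w &= mask;
        memcpy(&buf[i], &w, sizeof w);
    }
}

int main(void) {
    float data[4] = {3.14159265f, 2.71828182f, 1.41421356f, 0.57721566f};
    truncate_mantissa(data, 4, 10);     /* keep 10 of 23 mantissa bits */
    for (int i = 0; i < 4; i++)
        printf("%.8f\n", data[i]);
    return 0;
}
```

In the APAX workflow, deciding how much precision a dataset can spare is the job of the APAX Profiler; in this toy, the keep_bits parameter plays that role by hand.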
Availability and Pricing
Samplify's APAX HDF Library is available immediately from the company for Linux platforms, with annual licensing starting at U.S. $50,000 for data centers with one petabyte of storage. For more information, go to www.samplify.com/apax-hdf.
Samplify will be exhibiting in Booth 365 at ISC'13, June 17-19, in Leipzig, Germany. The paper, entitled "Evaluating Lossy Compression on Climate Data," will be presented on Wednesday, June 19, at 9:40 AM in Session 6, Hall 5.
About Samplify
Samplify is a Silicon Valley startup providing the only software and hardware numerical encoder for solving memory, I/O, and storage bottlenecks in HPC, Big Data, cloud computing, consumer electronics, and mobile devices. Samplify is a privately held company with funding from Charles River Ventures, Formative Ventures, and strategic investors including Schlumberger, Mamiya, and IDT.
SOURCE Samplify