DOI: 10.1101/2023.05.22.541700

Compression strategies for large-scale electrophysiology data

A. P. Buccino, O. Winter, D. Bryant, D. Feng, K. Svoboda, J. H. Siegle
Abstract
With the rapid adoption of high-density electrode arrays for recording neural activity, electrophysiology data volumes within labs and across the field are growing at unprecedented rates. For example, a one-hour recording with a 384-channel Neuropixels probe generates over 80 GB of raw data. These large data volumes carry a high cost, especially if researchers plan to store and analyze their data in the cloud. Thus, there is a pressing need for strategies that can reduce the data footprint of each experiment. Here, we establish a set of benchmarks for comparing the performance of various compression algorithms on experimental and simulated recordings from Neuropixels 1.0 (NP1) and 2.0 (NP2) probes. For lossless compression, audio codecs (FLAC and WavPack) achieve compression ratios 6% higher for NP1 and 10% higher for NP2 than the best general-purpose codecs, at the expense of slower decompression speed. For lossy compression, the WavPack algorithm in 'hybrid mode' increases the compression ratio from 3.59 to 7.08 for NP1 and from 2.27 to 7.04 for NP2 (a compressed file size of ~14% of the original for both probe types), without adverse effects on spike sorting accuracy or spike waveforms. Along with the tools we have developed to make compression easier to deploy, these results should encourage all electrophysiologists to apply compression as part of their standard analysis workflows.
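To make the data-volume and compression-ratio figures concrete, the following is a minimal sketch, not the authors' benchmark code. It estimates the raw data rate of a Neuropixels-like AP-band recording (384 channels, 30 kHz, int16 samples) and measures a lossless compression ratio on a synthetic chunk using a general-purpose codec (Zstd) via the `numcodecs` library. The audio codecs evaluated in the paper (FLAC, WavPack) would be substituted via their respective numcodecs plugins; the channel count, sampling rate, and compression level below are illustrative assumptions.

```python
import numpy as np
from numcodecs import Zstd

# Neuropixels-like AP-band parameters (assumed for illustration)
n_channels = 384          # recording channels
fs = 30_000               # sampling rate (Hz)
duration_s = 3600         # one hour
bytes_per_sample = 2      # int16 samples

raw_bytes = n_channels * fs * duration_s * bytes_per_sample
print(f"Raw size: {raw_bytes / 1e9:.1f} GB")   # ~83 GB, i.e. "over 80 GB"

# Compress a short synthetic int16 chunk (1 s) and report the compression ratio.
# Real recordings compress better than random noise because neighboring samples
# and channels are correlated.
rng = np.random.default_rng(0)
chunk = rng.normal(0, 10, size=(fs, n_channels)).astype(np.int16)

codec = Zstd(level=9)                                   # hypothetical level choice
compressed = codec.encode(np.ascontiguousarray(chunk))  # returns compressed bytes
ratio = chunk.nbytes / len(compressed)
print(f"Compression ratio on synthetic data: {ratio:.2f}")
```

The same pattern extends to chunked on-disk storage (e.g., Zarr), where the codec is attached as the array compressor and the compression ratio is the uncompressed size divided by the stored size.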