Designing dedicated data compression for physics experiments within FPGA already used for data acquisition (1511.00856v1)

Published 3 Nov 2015 in cs.IT and math.IT

Abstract: Physics experiments produce enormous amounts of raw data, counted in petabytes per day, so there is a large effort to reduce this volume, mainly by filtering. The situation can be improved further by applying data compression techniques: removing redundancy and encoding the remaining information near-optimally. Preferably, both filtering and compression should fit in the FPGA already used for data acquisition, reducing the requirements on both data storage and the networking architecture. We briefly explain and discuss some basic techniques, applying them to the design of a dedicated data compression system based on sample data from a prototype of a tracking detector: 10000 events over 48 channels. We focus here on the time data, which, after discarding headers and applying data filtering, requires on average 1170 bits/event with the current coding. Encoding relative times (differences) and grouping data by channel reduces this number to 798 bits/event, still using fixed-length coding: a fixed number of bits for every value. Variable-length Huffman coding of the number of digital pulses per channel and of the most significant bits of the values (simple binning) reduces this further to 552 bits/event. Adaptive binning (denser for frequent values) together with an accurate entropy coder brings it down to 455 bits/event; this option can easily fit into unused resources of the FPGA currently used for data acquisition. Finally, using separate probability distributions for different channels, which could be done by a software compressor, leads to 437 bits/event, 2.67 times less than the original 1170 bits/event.
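To make the pipeline concrete, below is a minimal Python sketch of the middle steps named in the abstract: per-channel relative-time (delta) encoding, then Huffman coding of the per-channel pulse count and of the most significant bits of each difference (simple binning), with the low bits kept at fixed length. The event layout, the 4-bit bin width, and the synthetic data are assumptions made for illustration; the sketch only estimates the coded size rather than emitting a bitstream, and it is not the authors' FPGA implementation.

```python
import heapq, random
from collections import Counter
from itertools import count

LOW_BITS = 4  # assumed bin width: Huffman-code (d >> LOW_BITS), store low bits raw

def huffman_lengths(freqs):
    """Huffman code length for each symbol, from a symbol -> count mapping."""
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    tie = count()  # unique tie-breaker so equal counts never compare the dicts
    heap = [(f, next(tie), {s: 0}) for s, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        fa, _, a = heapq.heappop(heap)
        fb, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (fa + fb, next(tie),
                              {s: d + 1 for s, d in {**a, **b}.items()}))
    return heap[0][2]

def deltas(times):
    """Relative times: keep the first value, then successive differences."""
    return [times[0]] + [b - a for a, b in zip(times, times[1:])]

# Synthetic stand-in for detector data: 1000 events x 48 channels, each
# channel holding a few sorted hit times (hypothetical, not the paper's data).
random.seed(0)
events = [[sorted(random.randrange(1 << 12) for _ in range(random.randint(0, 4)))
           for _ in range(48)]
          for _ in range(1000)]

# Symbol statistics over all events, using one shared distribution
# (the paper's final step uses a separate distribution per channel).
count_freq, msb_freq = Counter(), Counter()
for ev in events:
    for ch in ev:
        count_freq[len(ch)] += 1
        for d in (deltas(ch) if ch else []):
            msb_freq[d >> LOW_BITS] += 1

count_len = huffman_lengths(count_freq)
msb_len = huffman_lengths(msb_freq)

# Estimated cost: Huffman-coded pulse count per channel, then for each
# difference a Huffman-coded most-significant-bits bin plus raw low bits.
total_bits = 0
for ev in events:
    for ch in ev:
        total_bits += count_len[len(ch)]
        for d in (deltas(ch) if ch else []):
            total_bits += msb_len[d >> LOW_BITS] + LOW_BITS
print(f"average: {total_bits / len(events):.0f} bits/event")
```

The further gains reported in the abstract come from replacing the fixed-width low bits and uniform bins with adaptive binning (denser where values are frequent) and an accurate entropy coder, and finally from maintaining a separate probability distribution per channel.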
