Compression for Multiple Reconstructions (1802.03937v1)

Published 12 Feb 2018 in cs.MM

Abstract: In this work we propose a method for optimizing lossy compression for a network of diverse reconstruction systems. We focus on adapting a standard image compression method to a set of candidate displays that present the decompressed signals to viewers. Each display is modeled as a linear operator applied after decompression, together with its probability of serving a network user. We formulate an operational rate-distortion optimization that trades off the network's expected mean-squared reconstruction error against the compression bit-cost. Using the alternating direction method of multipliers (ADMM), we develop an iterative procedure in which the network structure is separated from the compression method, enabling reliance on standard compression techniques. We present experimental results showing our method to be the best approach for adjusting high bit-rate image compression (using the state-of-the-art HEVC standard) to a set of displays modeled as blur degradations.
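The distortion term the abstract describes is an expectation over displays: each display applies a linear operator (here, a blur) to the decompressed signal, and contributes its mean-squared error weighted by its probability of serving a user. The following is a minimal illustrative sketch of that expected-distortion objective for 1-D signals; the function name, kernel choices, and use of `np.convolve` are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def expected_network_mse(x, x_hat, blur_kernels, probs):
    """Expected MSE over a set of candidate displays.

    Each display i is modeled as a linear blur operator H_i applied after
    decompression, used with probability probs[i]. Returns
    sum_i probs[i] * mean((H_i x_hat - H_i x)^2).
    (Illustrative sketch only; not the paper's code.)
    """
    assert abs(sum(probs) - 1.0) < 1e-9, "display probabilities must sum to 1"
    total = 0.0
    for kernel, p in zip(blur_kernels, probs):
        # Apply the display's blur to both the original and the
        # decompressed signal, then accumulate the weighted MSE.
        hx = np.convolve(x, kernel, mode="same")
        hx_hat = np.convolve(x_hat, kernel, mode="same")
        total += p * np.mean((hx_hat - hx) ** 2)
    return total
```

In the paper's rate-distortion formulation this term is traded off against the compression bit-cost, and ADMM splits the problem so a standard codec (e.g. HEVC) handles the compression subproblem while a separate step accounts for the display operators.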

Citations (3)
