
Information Mechanics

Published 3 May 2016 in physics.gen-ph (arXiv:1605.01673v4)

Abstract: Despite the wide use of information as a concept in science, we have yet to develop a clear and concise scientific definition of it. This paper lays the foundations for a new theory of the mechanics of information and its intimate relationship with working processes; principally, it aims to provide a better understanding of what information is. We find that, like entropy, information is a state variable, and that the values of both are surprisingly contextual. Contrary to popular belief, we find that information is not negative entropy; however, unlike entropy, information can be both positive and negative. We further find that a communication process can be treated as a working process, and that Shannon's entropy applies only to the modelling of statistical distributions. By extension, information appears to exist within any system, even a thermodynamic system at equilibrium. Surprisingly, the amount of mechanical work a thermodynamic system can do relates directly to the corresponding information available: if the system lacks that corresponding information, it can do no work, irrespective of how much energy it contains. Following the theory presented here, it will become evident to the reader that information is an intrinsic property that exists naturally within our universe, not an abstract human notion.
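The abstract argues that Shannon's entropy applies only to the modelling of statistical distributions. For readers unfamiliar with that quantity, a minimal sketch of the standard definition, H(X) = -Σ p(x) log₂ p(x) over a discrete distribution — this illustrates Shannon's textbook formula, not the paper's own formalism:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits.

    Zero-probability outcomes contribute nothing (the 0 * log 0 := 0 convention).
    """
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([1.0]))       # 0.0
```

Note that H depends only on the probabilities, not on what the outcomes mean — which is one sense in which, as the abstract suggests, it measures a statistical distribution rather than information in any broader physical sense.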


Authors (2)
