Information Mechanics (1605.01673v4)
Abstract: Despite the wide usage of information as a concept in science, we have yet to develop a clear and concise scientific definition of it. This paper aims to lay the foundations for a new theory concerning the mechanics of information and its intimate relationship with working processes. Principally, it aims to provide a better understanding of what information is. We find that, like entropy, information is a state variable, and that both their values are surprisingly contextual. Moreover, contrary to popular belief, we find that information is not negative entropy; unlike entropy, however, information can be both positive and negative. We further find that a communications process can be treated as a working process, and that Shannon's entropy is applicable only to the modelling of statistical distributions. By extension, it appears that information could exist within any system, even a thermodynamic system at equilibrium. Surprisingly, the amount of mechanical work a thermodynamic system can do relates directly to the corresponding information available: if the system lacks the corresponding information, it can do no work, irrespective of how much energy it contains. From the theory we present, it will become evident to the reader that information is an intrinsic property that exists naturally within our universe, and not an abstract human notion.