Title
Kolmogorov-Sinai entropy and dissipation in driven classical Hamiltonian systems
Abstract
A central concept in the connection between physics and information theory is entropy, which represents the amount of information extracted from the system by an observer performing measurements in an experiment. Indeed, Jaynes' principle of maximum entropy allows one to establish the connection between entropy in statistical mechanics and information entropy. In this sense, the dissipated energy in a classical Hamiltonian process, known as the thermodynamic entropy production, is connected to the relative entropy between the forward and backward probability densities. Recently, it was revealed that energetic inefficiency and model inefficiency, defined as the difference in mutual information that the system state shares with the future and past environmental variables, are equivalent concepts in Markovian processes. As a consequence, the question arises of whether model unpredictability and energetic inefficiency are connected in the framework of classical physics. Here, we address this question by connecting a measure of the random behavior of a classical Hamiltonian system, the Kolmogorov-Sinai entropy, with a measure of its energetic inefficiency, the dissipated work. This approach allows us to provide meaningful interpretations of information concepts in terms of thermodynamic quantities.
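For context, the standard relations that the abstract alludes to can be sketched as follows; this is a minimal sketch of known results from the literature, not equations reproduced from the paper itself, and the notation (\rho_F and \tilde{\rho}_B for the forward and time-reversed backward phase-space densities, \lambda_i for the Lyapunov exponents) is an assumption made here for illustration:

% Dissipated work as a Kullback-Leibler divergence between forward and
% time-reversed backward phase-space densities (Kawai-Parrondo-Van den Broeck type relation):
\[
  \langle W_{\mathrm{diss}} \rangle
  \;=\; \langle W \rangle - \Delta F
  \;=\; k_B T \, D_{\mathrm{KL}}\!\left(\rho_F \,\|\, \tilde{\rho}_B\right),
  \qquad
  D_{\mathrm{KL}}\!\left(\rho_F \,\|\, \tilde{\rho}_B\right)
  = \int \rho_F \ln\frac{\rho_F}{\tilde{\rho}_B}\, d\Gamma .
\]
% Kolmogorov-Sinai entropy as a measure of dynamical randomness; for smooth ergodic
% systems it equals the sum of the positive Lyapunov exponents (Pesin identity):
\[
  h_{\mathrm{KS}} \;=\; \sum_{\lambda_i > 0} \lambda_i .
\]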
Year
2018
DOI
10.1103/PhysRevE.98.052109
Venue
PHYSICAL REVIEW E
Field
Information theory, Statistical physics, Statistical mechanics, Classical physics, Hamiltonian system, Entropy production, Mutual information, Principle of maximum entropy, Classical mechanics, Kullback–Leibler divergence, Physics
DocType
Journal
Volume
98
Issue
5
ISSN
1539-3755
Citations
0
PageRank
0.34
References
2
Authors
4
Name              Order  Citations  PageRank
Matheus Capela    1      0          0.34
M. Sanz           2      7          4.20
Enrique Solano    3      10         7.28
Lucas C. Céleri   4      0          0.34