Description
The talk analyses the definition of information by comparing the foundational definitions formulated by Shannon and von Neumann and by highlighting some of their shortcomings.
Statistical mathematics laid the groundwork for our understanding of information with Shannon's proposal in A Mathematical Theory of Communication, where information is quantified through the concept of information entropy. However, this notion can create confusion about what information is: first, because of its formal resemblance to its thermodynamic counterpart, and second, because Shannon's definition is very specific, treating entropy as the amount of uncertainty in a transmitted message.
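For reference, the quantity Shannon introduces is the entropy of a discrete random variable X with outcome probabilities p(x), here stated in its standard form with the logarithm in base 2 so that entropy is measured in bits:

H(X) = -\sum_{x} p(x) \log_2 p(x)

For a fair coin, p(\text{heads}) = p(\text{tails}) = 1/2 and H = 1 bit: exactly the uncertainty resolved by receiving one binary symbol.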
The advent of quantum mechanics marked another turning point in the definition of information and, more precisely, in the exchange of information. Von Neumann proposed a new definition that, at first glance, seems very close to Shannon's: the two formulas have a very similar form, and the von Neumann entropy looks like the information entropy transposed to the microscopic world. However, von Neumann's definition captures something different: entanglement. The two equations coincide in some situations, but the same can be said of Shannon entropy and its thermodynamic counterpart. We will investigate whether the two definitions are truly mirror images of each other.
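For comparison, the von Neumann entropy replaces the probability distribution with a density matrix \rho; in its standard form (again with a base-2 logarithm) it reads:

S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho)

When \rho is diagonal with eigenvalues p_i, this reduces exactly to Shannon's expression. A standard illustration of the difference the talk alludes to: a Bell pair is globally pure, so S(\rho_{AB}) = 0, yet each subsystem on its own is maximally mixed, with S(\rho_A) = 1 bit, a situation with no classical analogue.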
The seminar will be structured as follows: in the first part, we will analyse Shannon's definition; in the second, we will discuss von Neumann's; and in the last section, we will compare the two, drawing on recent work on this topic.