Information Theory

Information theory is the scientific study of the quantification, storage, and communication of digital information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley in the 1920s and of Claude Shannon in the 1940s. It lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A key measure in information theory is entropy, which quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a die roll (six equally likely outcomes); a worked sketch follows below. Other important measures include mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.

This topic covers theoretical and experimental aspects of information theory and coding. It includes material in ACM Subject Class E.4 and intersects with H.1.1.
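To make the coin/die comparison concrete, here is a minimal Python sketch (the entropy helper is illustrative, not taken from this page) computing Shannon entropy H(X) = -Σ p·log2(p) in bits for each distribution:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6) ≈ 2.585 bits,
# i.e. more uncertainty (and more information gained on learning the outcome).
print(entropy([1/6] * 6))    # 2.584962500721156
```

Among distributions over a fixed number of outcomes, the uniform distribution maximizes entropy, which is why the six-outcome die carries more uncertainty than the two-outcome coin.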
Related topics: Networking and Internet Architecture, Quantization, Convolution, Frequency Division Multiplexing, Discrete Mathematics, Cryptography and Security, Compressive Sensing, ML, Time Series, GPS

Key Scholars

Jean-François Cardoso: 171,039 citations, 376 papers
Michael I. Jordan: 150,356 citations, 1,056 papers
Yang Yang: 143,009 citations, 2,784 papers
Terrence J. Sejnowski: 134,448 citations, 931 papers
Vladimir Vapnik: 134,337 citations, 136 papers
Stephen Boyd: 132,438 citations, 861 papers
John R. Yates: 124,537 citations, 1,104 papers
Claude E. Shannon: 121,827 citations, 63 papers
David Cox: 117,105 citations, 661 papers
Bradley Efron: 114,291 citations, 300 papers