[OPE] information theory

From: Dave Zachariah <davez@kth.se>
Date: Mon Jan 05 2009 - 19:53:50 EST

Paula wrote:
> could you explain this materialist notion of information - in
> layperson terms? Thanks.

It arose from an engineering problem in telecommunication. The general
idea can be thought of as a 'source' and a 'receiver'. The source can
assume different states, and the receiver is uncertain which one. When the
source transmits a signal, the receiver's uncertainty about the
source is reduced. That reduction in uncertainty is the 'information'
obtained. Hence the maximum amount is obtained when the uncertainty
about the source is depleted. That is the source's 'information content'.
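To make the idea above concrete, here is a minimal sketch (my own
illustration, not anything from the post) of Shannon's measure: the
information content of a source is the expected uncertainty over its
possible states, in bits.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the expected uncertainty about a source
    whose states occur with the given probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: the receiver is maximally uncertain, and one bit of
# signal depletes that uncertainty entirely.
print(entropy([0.5, 0.5]))   # -> 1.0

# A source with only one possible state: the receiver already knows it,
# so no signal can reduce its uncertainty -- zero information content.
print(entropy([1.0]))        # -> 0.0
```

A biased source (say, probabilities 0.9 and 0.1) falls between these
extremes: the receiver is less uncertain than with a fair coin, so each
signal carries less than one bit.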

Information theory was formalised by Claude Shannon in 1948, who gave
the notion of information a precise quantitative meaning. Note
that it does not require the existence of human minds, only two
configurations of matter that can change their states and manipulate
the physical configuration of the medium that connects them (a 'signal' over
a 'channel'). Moreover, Shannon's term for information content was
'entropy' which has a parallel in thermodynamics, with possibly deeper
connections than he foresaw.

A deeper development of information theory was given by Andrey
Kolmogorov and others in the mid-1960s. This notion goes beyond the
communication between a source and a receiver and considers the
'information content' of a structure as its minimum possible description
using a universal language. Again, this is not predicated on human
language but on a universal computer language. It turns out that it is
connected with Shannon's notion of information.
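Kolmogorov's 'information content' (the length of the shortest program
that reproduces a structure) is not computable in general, but the
length of a compressed description gives a computable upper bound. The
sketch below (my own illustration, using zlib as a stand-in for a
universal description language) shows that a highly regular structure
admits a far shorter description than its raw size:

```python
import zlib

def description_length(data: bytes) -> int:
    """A computable upper bound on the information content of `data`:
    the length, in bytes, of one compressed description of it."""
    return len(zlib.compress(data, 9))

# A highly regular structure: 1000 bytes of raw data, but its minimal
# description is essentially "repeat 'ab' 500 times" -- very short.
regular = b"ab" * 500
print(len(regular), description_length(regular))
```

An irregular (incompressible) structure, by contrast, has a description
about as long as the structure itself; in Kolmogorov's sense it has
maximal information content for its size.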

//Dave Z
ope mailing list
Received on Mon Jan 5 19:58:41 2009

This archive was generated by hypermail 2.1.8 : Sat Jan 31 2009 - 00:00:03 EST