From: Paul Cockshott (wpc@DCS.GLA.AC.UK)
Date: Thu Feb 24 2005 - 05:04:03 EST
A question about the information in prices

I have a question about how much of the information in a price is value and how much is noise generated by market disequilibria.

Start out with Farjoun and Machover's psi variable. This is a random variable defined as the price of a randomly selected hour's worth of embodied labour. They measure the price not in terms of what has more recently been called the MELT but in terms of the hourly wage. Thus

   psi = price in hourly wages / hours of embodied labour

Since they expect s/v = 100%, they argue that psi will be normally distributed with a mean of 2. They argue that it is very unlikely that a price will be too low to cover all of the labour costs, so they assume that if e = s/v, the standard deviation of psi can be no more than e/3, as otherwise too many firms would be making a loss. (The mean of psi is 1 + e, so a standard deviation of e/3 keeps psi above 1, i.e. the price above the wage bill, to within three standard deviations.)

When one translates the results that Allin and I have published for the UK psi into F&M's formulation, one finds that, adjusting for the difference in e between the UK and their assumed 100%, the observed standard deviation of psi of approximately 0.15 almost exactly fits their prediction: for the rate of exploitation we measured for 1984, namely 46%, the predicted bound is e/3 = 0.46/3, which is about 0.153.

I then ask the question: how much information is encoded in the random variable psi? If we perform numerical integration of the Shannon entropy formula on a normal distribution with sigma = 0.15, we get an entropy of just under 6 bits; for the F&M case we would have an entropy of 7. Thus the deviation of price from value represents around 6 to 7 bits of information.

The next question is how much information there is in a randomly selected price. Generally prices are given to about 3 significant figures, but they can range from around 10 pence to 1 billion pounds, say for a large ship. This is about 10 orders of magnitude.
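The numerical claims above can be checked with a short script: the predicted ceiling e/3 on the standard deviation at e = 0.46, and the entropy of a normally distributed psi. Note that the Shannon entropy of a continuous variable depends on the measurement resolution; the post does not state one, but a bin width of 0.01 (an assumption on my part) reproduces both figures, just under 6 bits for sigma = 0.15 and about 7 bits for the F&M case of sigma = 1/3:

```python
import math

def normal_cdf(x, mu, sigma):
    """Normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def binned_entropy(mu, sigma, width=0.01):
    """Shannon entropy in bits of a normal distribution discretized
    into bins of the given width: numerical sum of -p * log2(p)."""
    lo, hi = mu - 8.0 * sigma, mu + 8.0 * sigma
    h = 0.0
    x = lo
    while x < hi:
        p = normal_cdf(x + width, mu, sigma) - normal_cdf(x, mu, sigma)
        if p > 0.0:
            h -= p * math.log2(p)
        x += width
    return h

# F&M's bound on the dispersion of psi at the UK rate of exploitation
e = 0.46
print(e / 3)                        # ~0.153, close to the observed 0.15

# Entropy of psi: UK case (sigma = 0.15) and F&M's e = 100% case
# (sigma = e/3 = 1/3). The mean does not affect the entropy.
print(binned_entropy(1.0 + e, 0.15))  # just under 6 bits
print(binned_entropy(2.0, 1.0 / 3.0)) # about 7 bits
```

With a coarser or finer resolution the absolute figures shift by log2 of the ratio of bin widths, but the roughly one-bit gap between the two cases is unchanged.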
Thus you might need 3 digits to encode a price and one digit to give its order of magnitude. Since a digit encodes roughly 3.3 bits, this means that the entropy of the original prices is unlikely to exceed 14 bits. This implies that the value of a commodity probably accounts for something between 6 and 8 bits of an actual price.

My question is whether anyone can think of a realistic functional form for the statistical distribution of prices. If one had that, one could in principle integrate the Shannon formula H = - integral of p(x) log p(x) dx over the pdf of prices and work out the likely entropy of prices somewhat more accurately than the rough and ready estimate of < 14 bits. I would suspect that some sort of gamma distribution may do it, but one has to take into account what sort of weighting to use. Should the pdf of prices be value weighted, so that a purchase of an aircraft carrier embodying perhaps 100 million hours of labour counts for more than a packet of crisps embodying perhaps 4 minutes of labour?
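As a rough check on the < 14 bit estimate, and a sketch of the proposed integration: the script below first computes the 4-digit bound, then the entropy of prices quoted to 3 significant figures over 10 orders of magnitude under a gamma distribution. The shape and scale parameters are pure assumptions chosen for illustration, since the realistic functional form is exactly what is in question, and no value weighting is applied:

```python
import math

def gamma_pdf(x, shape, scale):
    """Density of the gamma distribution with given shape and scale."""
    return (x ** (shape - 1.0) * math.exp(-x / scale)
            / (math.gamma(shape) * scale ** shape))

# Rough bound: 3 significant digits plus 1 digit of magnitude
rough_bound = 4 * math.log2(10)
print(rough_bound)               # about 13.3 bits, i.e. under 14

# Prices quoted to 3 significant figures, from 10 pence up to about
# 1 billion pounds: 900 distinct mantissas per decade, 10 decades,
# so 9000 possible quoted prices and at most log2(9000) ~ 13.1 bits.
shape, scale = 0.2, 1.0e6        # hypothetical gamma parameters
probs = []
for exp10 in range(-1, 9):       # decades 10^-1 .. 10^8 pounds
    for m in range(100, 1000):   # mantissas 1.00 .. 9.99
        price = (m / 100.0) * 10.0 ** exp10
        width = 0.01 * 10.0 ** exp10
        probs.append(gamma_pdf(price, shape, scale) * width)

total = sum(probs)               # renormalize the discretized masses
entropy = -sum((p / total) * math.log2(p / total) for p in probs if p > 0.0)
print(entropy)                   # entropy in bits under these assumptions
```

Whatever parameters one picks, the entropy cannot exceed log2(9000), about 13.1 bits, which is consistent with the back-of-envelope bound. A value-weighted version would reweight each bin's mass by the labour embodied in the corresponding purchase before renormalizing.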
This archive was generated by hypermail 2.1.5 : Fri Feb 25 2005 - 00:00:02 EST