[OPE-L:3762] Re: Two "causes" of deviation?

Paul Cockshott (wpc@cs.strath.ac.uk)
Mon, 2 Dec 1996 08:33:31 -0800 (PST)


Mike W:
>
>1. Is it your position that a capitalist economy approximates a chaotic system?
>(In the sense in which, for example, the solar system may be said to
>approximate a Newtonian particle system.)

Yes.

Mike W
>2. Are prices, to some degree of approximation, what they are 'presented as' by
>F&M- random variables with a sharp-pointed distribution of the kind you
>describe?

I would not use the phrase 'sharp pointed' but I agree with the gist of this.

>3. If so, why are they random? Is it because we have not or cannot discover any
>structural determinations of them? Or because we can discover so many that
>variations approximate random movements?
>

This raises interesting questions about what a random system is.
The Kolmogorov/Chaitin definition of a random sequence of numbers R is one that
cannot be produced by any generator function G whose encoding would require
fewer bits than R itself.
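
Kolmogorov complexity is not computable in general, so any empirical test has
to fall back on a computable stand-in. A minimal sketch, taking the length of a
zlib-compressed, quantised encoding as the proxy for I(x) (the choice of
compressor and of three-decimal quantisation are assumptions of the sketch, not
part of the argument):

import zlib

import numpy as np


def info_bits(x, decimals=3):
    # Approximate I(x) in bits: compress a quantised byte encoding of x.
    # The compressed length is only an upper bound on the true
    # (uncomputable) Kolmogorov complexity.
    data = np.round(np.asarray(x, dtype=float), decimals).tobytes()
    return 8 * len(zlib.compress(data, 9))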

Clearly in this sense prices are not random. If we take as our sequence of
numbers an NxN I/O matrix M, plus a labour input vector L, plus a vector of
aggregate output prices P, we have a sequence of N(N+2) numbers. If, for the
moment, we assume that [M,L] is irreducible, or incompressible, then we want to
know whether there exists a function G(M,L) which can give us more information
about P than it takes to encode G. If such a G exists, then the prices are not
entirely random.

I assume that at least one such G does exist for sufficiently large N, and
that labour value theory V is one such generator function.
Let I(x) be the information content of x; then we can express this as:

I(V) < I(P) - I(P - V(M,L)) for some N

The information content of a suitably concise encoding of the formulae of the
labour theory of value is less than the information content of the original
price vector, less the information content of the vector of discrepancies
between actual prices and those predicted by value theory.
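
Continuing the sketch above, the inequality could be tested along the following
lines. Labour values are taken here as the vertically integrated labour
coefficients v = L(I - M)^-1; the cost I(V) of the generator is approximated by
the compressed length of its own source text; and v and P are assumed to be
expressed in comparable units. The names M, L, P stand for data from a real I/O
table, not anything supplied here.

import inspect


def labour_values(M, L):
    # Generator V(M,L): vertically integrated labour per unit of output,
    # v = L (I - M)^-1, i.e. the solution of v = v M + L.
    N = M.shape[0]
    return L @ np.linalg.inv(np.eye(N) - M)


def value_theory_is_informative(M, L, P):
    # I(V): bits needed to encode the generator itself (its source text).
    i_generator = 8 * len(zlib.compress(inspect.getsource(labour_values).encode()))
    # I(P - V(M,L)): bits left unexplained once values are subtracted out.
    residual = P - labour_values(M, L)
    return i_generator < info_bits(P) - info_bits(residual)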

If this holds, then the theory has some validity. Within the context of
this theory, there remains some unexplained information whose entropy is
given by the expression I(P - V(M,L)). This is a measure of the uncertainty
of prices with respect to value theory. Your question about randomness then
translates into whether or not there exists any other generator function
which, given M, L and V, can reduce the entropy of the differential term
I(P - V(M,L)) without itself costing more bits to encode than the information
it provides.

I suspect that there exists at least one other such generator: price of
production theory. But as we work through second- and third-order correction
functions we soon arrive at a point where we can make no further improvement
in our predictions; at that point, with respect to the available information,
the residual errors in predicted prices are Kolmogorov random.
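
One standard way to write such a second generator, sketched below on the same
conventions as above, is an equal-profit-rate price of production calculation.
The profit rate r and wage rate w are taken as given parameters here (rather
than solved for simultaneously), and wages are treated as advanced capital;
both are simplifying assumptions of the sketch, and their encoding cost would
have to be charged against the generator in the inequality.

def production_prices(M, L, r, w=1.0):
    # Second-order generator: prices of production with a uniform profit
    # rate r charged on input costs plus advanced wages w*L.
    # The fixed-point iteration converges provided (1 + r) times the
    # dominant eigenvalue of M is below one.
    N = M.shape[0]
    p = np.ones(N)
    for _ in range(1000):
        p_new = (1.0 + r) * (p @ M + w * L)
        if np.allclose(p_new, p):
            break
        p = p_new
    return p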

>4. Or are you enough of an instrumentalist to say that it doesn't matter since
>the predicted correlation between price and vertically integrated labour times
>seems to be empirically supported? Are we not interested in why labour times
>and production prices are attractors of prices? How are attractors and
>attractees distinguished - is it in the mathematics? or in the data? The term
>'attractor' invites a causal comparison (gravitational or electromagnetic
>attraction). But we are not to interpret it like that - is that right? Why can
>we not instead (as I think Duncan came close to mooting some time ago) take
>this correlation as grounds for doubting the usefulness of trying to separate
>the labour-time system from its monetary expression in prices?

There are too many questions for me to answer here.

>5. We know that you and I work with widely varying methodological
>'settlements', and that such settlements are not susceptible to definitive
>empirical or logical determination. Nevertheless I would be interested in your
>reasons for assuming that a system characterised by Marx as people making
>history, but not in conditions of their own choosing, is adequately grasped
>('represented'?) by the mathematics of chaos. Perhaps only because of my
>ignorance, I do not find implausible the ontological claim of some physicists
>that physical (sub-atomic particle) systems are in themselves mathematical. I
>have not seen an argument why we should even consider the possibility that
>variables which are indeed the outcome of intentional action within social
>constraints should approximate a chaotic system. John Hicks, and more recently
>Frank Kahn, both seemed to think that the non-linear dynamics of economic
>variables involved a path dependence pushing us inevitably towards real
>historical and sociological description. One advantage of the Marxist
>tradition is that it offers a much richer conceptualisation for such a task.
>Doesn't it?

People make history not in circumstances of their own choosing, and they also
lack the ability to predict the consequences of their actions. The results may
then be poorly correlated with their intentions. Beyond this, the actions of
large numbers of individuals tend to be chaotic with respect to one another.
It is a function of political parties to reduce this entropy, increasing the
likelihood of a coherent outcome from political action.

>6. I think that the complexity of capitalist competition, and therefore the
>expected weakness and intermittent nature of tendencies dependent upon entry
>and exit of firms into markets (a medium if not a long-term process), does
>make demands on an adequate interpretation of single-system, as well as the
>two-system 'models'. However, the task of the former is less formidable given
>that it deals with the variables upon which capitalists actually make their
>calculations, about which they form their expectations, etc.
>

When attempting to predict actual prices from I/O tables you are right: one
gets a better prediction if one does not transform the inputs. But in this
sense one is not comparing like with like. The single-system theories have a
further hidden parameter - the current price vector P, or, in a temporal
version, last year's price vector. The fact that the single-system generator
SS requires this extra information means that it may fail the inequality

I(SS) + I(P) < I(P) - I(P - SS(M,L,P))

needed to be a valid predictive theory.

This inequality says that the information content of the SS generator, plus
the information content of its hidden parameter (the current price vector P),
must be less than the information gained by using the theory to predict
prices. The problem is that the extra information required, in the form of the
current or previous price vector, will outweigh the increase in accuracy that
it gains vis-a-vis value theory when predicting prices.
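
In the terms of the sketches above, the test for a single-system generator has
to charge for that hidden parameter as well. The generator itself is left
abstract here - ss_generator stands for whichever single-system transformation
one adopts, and p_hidden for the current (or, in a temporal version, last
year's) price vector it relies on:

def ss_is_informative(ss_generator, M, L, p_hidden, P):
    # Test of I(SS) + I(p) < I(P) - I(P - SS(M,L,p)), where p is the
    # price vector the single-system generator takes as an input.
    i_ss = 8 * len(zlib.compress(inspect.getsource(ss_generator).encode()))
    residual = P - ss_generator(M, L, p_hidden)
    return i_ss + info_bits(p_hidden) < info_bits(P) - info_bits(residual)
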
Paul Cockshott

wpc@cs.strath.ac.uk
http://www.cs.strath.ac.uk/CS/Biog/wpc/index.html