A few comments on the arguments Julian expounds so clearly:
----- Original Message -----
Sent: Monday, January 17, 2000 6:24 PM
Subject: [OPE-L:2196] Statistical regularities
> Since murder -- and especially self-murder (as many thought of it in
> those days) seem on the face of it the most extreme outcomes of human will
> and intentionality, one might expect the annual rates of these to be very
> erratic, whereas they are (or were) apparently rather regular.
Not if that intentionality is structured (inter alia via family-resemblance
commonalities in ideas). I seem to remember Elster or Roemer or one of their
comrades made a similar argument in a reductio ad absurdum mode to defend
Rational Choice Marxism: i.e. the only justification for *not* theorising
choice in accordance with preferences would be a (to them clearly untenable)
argument that constraints (of social structure) were binding to an extent
that made issues of choice and motivation irrelevant.
> Quetelet was an outright statistical fatalist: he fully believed
> that the regularities which he discovered implied that the agents were under
> compulsion to carry out the acts involved: "society prepares the crimes", he
> said. Thus he argued that responsibility and punishment were inappropriate
> categories in this connection.
This is the standard rabid right wing argument against Hampstead Liberals.
It is clearly a non sequitur: because we understand an act doesn't mean that
we, individually or socially, must condone it. It may of course have a
bearing on what preventative regimes might be considered efficacious.
> point is HOW one distinguishes (if one can) causally-determined actions from
> those resulting from free will, faced with a bare regularity, such as the
> distribution of suicide rates.
But surely this is just a particular spin on Hume's account of causality -
as Julian goes on to suggest?
> given very simple assumptions about firms' behaviour --
> essentially that the firms' actions with regard to inputs and outputs are
> unco-ordinated ("independent"), and that their interactions are solely
> through exchange in a competitive market.
These may be very simple assumptions, but they are also very restrictive.
> F&M argue that
> market power, collusion, etc. undoubtedly reduce the number of degrees of
> freedom in the system (increase the amount of co-ordination among agents),
> but in practice not by enough to significantly reduce the number of degrees
> of freedom.
I would appreciate a bit of an account of what counts as 'enough' and 'not
enough' here.
> Since not only are there millions of distinct commodity-types in a
> real capitalist economy, but billions of daily transactions, market
> intervention would have to be pretty thorough to undermine the applicability
> of statistical mechanical formalism.
Yes, but surely there is a prior subversive point here, analogous to an
occasionally mentioned but rarely systematically developed critique of GE
theory: there are far fewer markets than there are actual and certainly
possible distinct commodity types (especially if we allow differentiation by
time and space subscripts, and characteristics - of which commodities are
'bundles' a la Lancaster). Thus the problem of missing markets is not
confined to the usual 'externalities', but is ubiquitous. I'm not sure how
this would map into a dynamic statistical mechanical framework?
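As an aside, the law-of-large-numbers intuition running through both
Quetelet's regularities and the F&M argument can be sketched in a few lines
of Python. This is not from the original exchange, and all names and
parameter values are illustrative: each of N independent "agents" acts with
some fixed probability, and the relative fluctuation of the aggregate
shrinks roughly as 1/sqrt(N), which is why millions of uncoordinated
transactions can yield stable annual rates.

```python
import random
import statistics

def aggregate_cv(n_agents, trials=500, p=0.1):
    """Coefficient of variation of the aggregate outcome when each of
    n_agents independently 'acts' (e.g. transacts) with probability p,
    estimated over repeated trials."""
    totals = [sum(random.random() < p for _ in range(n_agents))
              for _ in range(trials)]
    return statistics.stdev(totals) / statistics.mean(totals)

random.seed(1)
cv_small = aggregate_cv(100)     # relative fluctuation with few agents
cv_large = aggregate_cv(10000)   # with 100x more agents: roughly 10x smaller
print(cv_small, cv_large)
```

The point is only that regularity at the aggregate level needs no compulsion
at the individual level: independence plus large numbers suffices.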
Dr Michael Williams
Economics and Social Sciences
De Montfort University
fax: 0870 133 1147
[This message may be in html, and any attachments may be in MSWord 97. If
you have difficulty reading either, please let me know.]
This archive was generated by hypermail 2b29 : Mon Jan 31 2000 - 07:00:08 EST