From: Jurriaan Bendien (adsl675281@TISCALI.NL)
Date: Wed Oct 31 2007 - 12:21:57 EDT
Paul, I didn't follow what you mean by that. Entropy is commonly defined in at least five ways:

- For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
- A measure of the disorder or randomness (implying uncertainty) in a closed system.
- A measure of the loss of information in a transmitted message.
- The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
- Inevitable and steady deterioration of a system or society.

The entropy of English text is between 1.0 and 1.5 bits per letter (Schneier, B.: Applied Cryptography, 2nd edition, p. 234).

I am always a bit wary of applying analogies and metaphors taken from natural science to social phenomena in human society, admitting, of course, that exponents of historical materialism have often ignored the biological and physical basis of human life.

The tendency for industrial profit rates to level out could be viewed as a process, instead of an end state in which all profit rates are equal. I think Marx hypothesizes that competition in a developed capitalism, or in the pure case, will ultimately establish a general norm of surplus-labour extraction, and a general norm for the minimum acceptable industrial profit rate, which inform the trade in labour and capital. But that is something that is very difficult to prove.

Jurriaan
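[A note not in the original message: the information-theoretic sense of entropy above can be illustrated with a short sketch. The Shannon entropy of a text, estimated from single-letter frequencies, is H = -sum(p * log2(p)) over the letter probabilities p. Note that such a unigram estimate for English prose typically comes out near 4 bits per letter; Schneier's 1.0-1.5 figure is lower because it accounts for correlations between neighbouring letters. The sample string here is illustrative only.]

```python
from collections import Counter
from math import log2

def unigram_entropy(text):
    """Shannon entropy in bits per letter, estimated from
    single-letter frequencies (a unigram model)."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    # H = -sum p * log2(p) over observed letter probabilities
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "the quick brown fox jumps over the lazy dog"
print(unigram_entropy(sample))  # well above Schneier's 1.0-1.5 bits/letter
```

A longer, more representative corpus would give a steadier estimate, but even then the unigram figure stays well above 1.5 bits, since it ignores the redundancy that letter-to-letter dependencies add.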
This archive was generated by hypermail 2.1.5 : Fri Nov 02 2007 - 00:00:19 EDT