[OPE] Reply to Paul Cockshott on Ochoa indirect labour methodology

From: Jurriaan Bendien (adsl675281@tiscali.nl)
Date: Wed Jul 16 2008 - 09:28:41 EDT


Paul, 

The central problem, as I said, is that the procedure for obtaining indirect labour inputs depends on the assumption of a fixed ratio between labour-time and output value.

I had a look at the Ochoa methodology when I worked for Statistics New Zealand in the 1990s, but the mathematical statisticians there were actually skeptical of the procedure for obtaining indirect labour-values from I/O tables, not least because of the quality of the price observations.

You can of course let loose very sophisticated manipulations on a data set even when the data, because of their quality, do not warrant it. I was referred inter alia to Oskar Morgenstern's work on price data. However, I did not write all this up in a technical paper at the time, and in fact my (irreplaceable) copy of the Langston Memorial Volume got stolen.

You describe the Ochoa procedure fairly clearly, in a useful paper:

"If we divide the directly utilised labour by the dollar value of the industry's output, we get an initial figure for the amount of [direct] labour in each dollar of the output. For industry A we see that 0.32 units of labour go directly into each dollar of output. Since we already know the number of dollars worth of A's output used by every other industry, we can use this to work out the amount of indirect labour used in each industry when it spends a dollar on the output of industry A. This gives a second estimate for the labour used in each industry, which in turn gives us a better estimate for the number of units of labour per dollar output of all industries. We can repeat this process many times and as we do so, our estimates will converge on the true value." www.dcs.gla.ac.uk/~wpc/reports/rethinking.pdf

In reality, however, that sort of argument involves quite a few conceptual and quantitative assumptions which, to my knowledge, have never really been spelled out explicitly in the relevant research literature. But just as you ought to examine what you put in your mouth first, lest you break your teeth or suffer disease, so the empiricist researcher ought to be aware of his own assumptions before he operates computationally on the data.

Eight issues, from memory (there are more), are:

- There must be fixed labour/output-value ratios that apply to all transacting sectors involved with a given output.
- It is assumed that all producers' input-purchase and output-sale prices are final prices, valued in the same way.
- In reality, the official sectoral I/O totals are not all valued and estimated by sector in the same way.
- At least a quarter of inputs by value are imported, and often revalued on resale.
- Certain operating expenses or revenues of business are excluded or imputed in the I/O table (due to the gross output valuation definition).
- The I/O table for a year abstracts from variations in production time.
- The iteration procedure, which aims to converge on the "true value", assumes empirically that all the labour-content estimated for the product-chain is attributable to the labour-content of the output value being estimated.
- The accuracy of I/O tables may not be very great: many sectors are not covered by a direct annual enterprise survey, response error creeps in through poorly designed questionnaires, and figures may be extrapolated, inferred indirectly or imputed.

It is in fact also possible to find a "robust correlation" (to use Ian Wright's term) between constant capital inputs and output, by applying the same Ochoa iteration procedure not to labour inputs but to constant capital inputs. If we assume (as we must) fixed labour/output ratios in order to find labour-values, applying the law of averages, then there is a sense in which we have already assumed what we are trying to prove. The aim of the exercise was to demonstrate a strong correlation between quantities of labour-time worked and output values, but in reality we are inferring those very quantities from output and input magnitudes.
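To see the point concretely: the routine sketched earlier is indifferent to what the direct-input vector measures. Feed it hypothetical constant-capital outlays per dollar instead of direct labour hours and it converges just as readily (all figures again invented for illustration):

import numpy as np

# Same hypothetical coefficient matrix as in the earlier sketch.
A = np.array([[0.20, 0.30],
              [0.10, 0.25]])

def integrated_content(direct_per_dollar, A):
    # Ochoa-style iteration: direct input per dollar, plus the input
    # content embodied in purchased inputs, repeated to convergence.
    total = direct_per_dollar.copy()
    for _ in range(1000):
        updated = direct_per_dollar + total @ A
        if np.allclose(updated, total, atol=1e-12):
            break
        total = updated
    return total

# "Labour-values": direct labour hours per dollar of output (hypothetical).
print(integrated_content(np.array([0.32, 0.25]), A))

# "Capital-values": direct constant-capital outlays per dollar of output
# (hypothetical). The procedure is formally identical, which is the point.
print(integrated_content(np.array([0.45, 0.40]), A))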

In general, the statistical literature I have read so far suggests that the LTV has very strong predictive power for the manufacturing sector, but not for many other production sectors. It is fairly easy to show statistically that fluctuations in output values correlate strongly with fluctuations in labour expenditure, but I do not think this observation alone is sufficient for a theory of price formation or price determination.

More generally, I think that, epistemologically speaking, it is impossible for science to prove conclusively that any particular theory of value is true, because of the nature of the phenomenon of value itself. What we can prove, however, is that in most price-analysis of any kind, assumptions about economic value are being made, even if economists are blissfully unaware of that, and that those assumptions may be better or worse than other assumptions, or have more explanatory, predictive or heuristic power than others.

I think the nearest we get to a meaningful MELT is the ratio of the (modal-average, post-tax and inflation-adjusted) labour-earnings per year, to the (modal-average) hours worked per year. 
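As a purely hypothetical back-of-envelope illustration of that ratio (the figures are invented, not measured):

# Modal post-tax, inflation-adjusted labour earnings per year, divided by
# modal hours worked per year, gives a monetary expression per labour-hour.
modal_net_annual_earnings = 30000.0   # hypothetical, in constant dollars
modal_annual_hours_worked = 1800.0    # hypothetical
melt = modal_net_annual_earnings / modal_annual_hours_worked
print(f"approx. {melt:.2f} dollars per labour-hour")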

Jurriaan





