Re: [OPE-L] marx's conception of labour

From: Howard Engelskirchen (howarde@TWCNY.RR.COM)
Date: Sat Nov 18 2006 - 10:33:46 EST


Hi Ian,

This is challenging stuff.

I think I disagree.  You write:

>Consider a thermostat. Does the temperature setting, if it differs
> from the current ambient temperature of the room, refer to a future
> temperature? Would it do so even if humans were not there to observe
> it?

Later you refer to absenting an absence, and I don't have any problem with
that.  But the key idea is reference.  I don't think the thermostat
"refers."  Reference takes an entity to interpret.   An interpreter is
essential to sign making.  And interpretation takes consciousness or
proto-consciousness.  What the thermostat gives us is a thing in process.  A
ball rolling down a hill is a thing in process.  Dominos falling.
Mechanical things are processes.  We can interpret all such things as goal
oriented, but I think this takes interpretation and interpretation takes
consciousness.  I don't know anything about artificial intelligence, really,
and have no judgment on whether or not machine consciousness is possible.
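
A minimal sketch in Python, purely for illustration, of the thermostat as a
thing in process: the whole of its behaviour is a comparison and a switch,
with nothing in the loop that looks like interpretation.  The class and the
figures here are mine and arbitrary, not anything in Marx or in Ian's post.

class Thermostat:
    def __init__(self, setpoint):
        self.setpoint = setpoint      # the "goal" temperature
        self.heater_on = False

    def step(self, reading):
        # Pure mechanism: switch the heater on when the reading falls
        # below the setting, off otherwise.  No interpretation anywhere.
        self.heater_on = reading < self.setpoint
        return self.heater_on

# Room at 18 degrees, setting at 21: the heater switches on.
t = Thermostat(setpoint=21.0)
print(t.step(reading=18.0))   # True
print(t.step(reading=22.0))   # False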

I do think that efforts to suggest that the activity of reference or
representation is peculiar to humans are wrong.  It is clear that the
capacity to refer exists in at least some crude form for many forms of life
(all?) -- clyder's wonderful example of the spider makes this point very
clearly.  It follows that, in saying intentionality is characteristic of
humans, I do not mean to say that it is peculiar or exclusive to humans.  In
general we should be very suspicious of anything that looks to cut us off
from the rest of the natural world.  At some point quantity becomes quality,
and it does look like language marks off such a difference, but this is a
difference of degree that has become significant, not something more.

Now, I am open to the idea, but I have no basis for thinking this argument
extends to machines, or that we can speak of anything like machine
intentionality or machine consciousness.  I just don't know what the
evidence for this would be.  Machines can translate from one language to
another, but that doesn't mean they know what is being said.

Howard


----- Original Message -----
From: "Ian Wright" <wrighti@ACM.ORG>
To: <OPE-L@SUS.CSUCHICO.EDU>
Sent: Friday, November 17, 2006 7:09 PM
Subject: Re: [OPE-L] marx's conception of labour


> Hi Howard
>
> > But there is nothing in Marx's analysis to suggest that the sign we form
> > to guide practice functions as a mechanical template imposed on a person's
> > labor the way a robot might be programmed or without regard to class
> > relations and the other points you mention.
>
> Consider a thermostat. Does the temperature setting, if it differs
> from the current ambient temperature of the room, refer to a future
> temperature? Would it do so even if humans were not there to observe
> it?
>
> I think the answer is yes to both questions. The thermostat therefore
> has intentionality of a kind. It represents the absence of a
> temperature. And it has a causal structure that changes the world to
> absent that absence. It is a goal-following mechanism. The goal is
> real, a part of objective reality, rather than the subjective
> ascription of a human scientist attempting to understand its
> operation.
>
> I take Paul to be implying that it isn't fruitful to classify the
> world of autonomous mechanisms into the human and non-human. Instead,
> to understand human cognition, we also need to understand animal
> cognition and machine cognition, and the relations between them. For
> instance, I do not agree that intentional activity is characteristic
> of humans. Plenty of other mechanisms have intentions, although they
> may not tell us about them.
>
> Best wishes,
> -Ian.
>

