Re: Mind

From: Andrew Brown (Andrew@LUBS.LEEDS.AC.UK)
Date: Thu Jun 03 2004 - 14:01:22 EDT


Hi Ian,

I have always felt it would be great to relate Ilyenkov properly to Western
philosophical debates, especially the debate on the philosophy of mind. (Bakhurst's
book tries to do exactly this, but alas Bakhurst cannot remove his Western
philosophical lens, hence fundamentally fails to grasp Ilyenkov, in my view.) No
doubt an important initial step would be to critique Ryle from Ilyenkov's viewpoint.
But I suspect I will never get the time; I'm afraid our email exchange is as far as I
have got on this issue!

Given the above defensive caveat, a reply:

>
> > The question is what is the objective material process that
> > expresses this subjectivity. You seem to think it is
> > neurophysiological processes / 'mechanisms'. I think it is outer
> > bodily activity, at the core of which is social labour. This is a
> > materialist dialectics view of thought, not prevalent within AI
> > discourse.
>
> Whether it is a materialist dialectics view of thought or not,
> I think it slides into behaviourism.

It can't be behaviourism for many reasons. I guess the fundamental one is that
behaviourism tries to reduce human activity to a simple impulse-response schema,
whereas, of course, human activity can be thought of in this context as having an
impulse-reflection-response schema. We think before we respond (as you say
below)! I hasten to add that a full schema would include activity, object, sociality,
idea, language (I would say these are moments in human labour). Note how
impoverished the behaviourist schema is by comparison!

Another (related) difference is that there is no denial of the internal side of thought
for Ilyenkov. There are 'inner experiences' as well as outward bodily activity. Ryle is
wrong to 'miss out the mind'. The question is, how do they (outward activity and
inner experience, the ideal) relate? The answer embraces the view that they are
internally related.

>
> To remove a confusion, I do not think that mind is reducible to
> neurophysiological mechanisms. I think mind consists of a large
> collection of virtual machines ultimately implemented in neural
> mechanisms but not reducible to them. Many folk-psychological
> terms, such as "belief", "desire", "emotion", and so forth,
> pre-theoretically refer to states and processes of those virtual
> machines, and not only to behavioural dispositions that are
> ultimately observable, as Ryle and Wittgenstein, and perhaps
> Ilyenkov, maintain. In other words mind is software not only
> outputs.

1) What is a 'virtual machine'?
2) Beliefs and desires, etc. must be located within the context of general human life,
and outward bodily activity (labour in fact) is essential to that context.
3) The reason for the stress on 'outputs' is the following fundamental notion: the
relevant physiology, neurophysiology, etc. of the human being must be characterised
by the abilities (emergent properties) of self-transformation and reflection. Relevant
inner mechanisms of a human are not fixed like computer hardware. Rather, they are
capable of self-change. The change is not random but guided by reflection.
Examination of the inner structure of the body should therefore seek out the
mechanisms (structures) enabling reflection, and the structures that are being changed
(developed): the mechanism of self-change or development. Grasping 'thoughts'
does not rest on such examination; rather, thoughts are to be grasped as part of the
examination of outer activity and its context. Self-change is guided not by inner
mechanisms but by reflection on outer activity and its context. Guided, in fact, by
social labour. (A toy sketch of this contrast follows below.)
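
To make the contrast with fixed computer hardware vivid, here is a toy sketch in
Python (my own construction, nothing of Ilyenkov's; all names and numbers are
invented for illustration). The 'inner mechanism' is a single weight, and it is
revised by reflection on how the outer activity fared against its object, rather
than being wired in once and for all:

# Toy contrast: a fixed rule vs. a self-changing one. The inner
# "mechanism" is a single weight, revised by reflection on how outer
# activity fared against its object. (All names/numbers are invented.)

def act(weight, situation):
    # Outer activity: respond to a situation using the current inner state.
    return weight * situation

def reflect(weight, situation, outcome_wanted, rate=0.1):
    # Self-change: revise the inner mechanism in light of how the
    # activity actually fared, rather than leaving it fixed.
    error = outcome_wanted - act(weight, situation)
    return weight + rate * error * situation

weight = 0.0
for _ in range(50):  # repeated cycles of activity and reflection
    weight = reflect(weight, situation=2.0, outcome_wanted=6.0)
print(round(weight, 2))  # ~3.0: the mechanism has changed itself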

<snip>
>
> You could argue that ultimately private information must make
> some kind of difference to outer bodily activity, for example
> keeping a secret may cause someone to hesitate in certain
> contexts. I have no objection to that, except that reduction to
> observable behaviour does not exhaust the analysis.

I hope you can see that there is no such 'reduction'.

>
> For example, why do some people hesitate when they try to
> suppress a secret? The answer is that it takes computational
> work to check and guard against the betrayal of a secret, which
> can take a discernible amount of time. Reduction of all mental
> terms to outer bodily activity misses such phenomena. Investigation of
> the mechanisms of mind tries to give an account of it.

Perhaps the above has helped indicate how I would incorporate your comment here?

>
> To state this in another way: thoughts are real and not only
> dispositions to act. There is an enormous amount of internal
> behaviour that, unless communicated, remains private and
> unobservable. Another example: I may form a sentence in my
> mind in response to one of your points, then decide against it,
> and write a different one instead. The sentence I did not write
> but only thought did not manifest in outer bodily activity. Do
> you deny its existence?

Of course not (but thanks for allowing me to clarify). The point is, rather, that the
sentence cannot be understood outside of a grasp of outward human activity. The
sentence must not be reduced to the neuron movement etc. that goes on as you
think up a sentence. The sentence was learnt in the context of outward social
activity, and only in light of such a context do we grasp the sentence. The neuron
movement (etc.) helps to enable these things but it does not constitute them.
(I am not saying you are making such a reduction; indeed, I am curious to explore
your notion of a 'virtual machine'.)
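
In case it moves the discussion along: the only sense of 'virtual machine' I know
from computer science is roughly the following, given as a minimal Python sketch of
my own (a toy, not a claim about your usage; whether this is your sense is exactly
what I am asking). The machine's states -- its stack and instruction pointer -- are
real and do causal work, yet they are not identical with any fixed piece of the
host's hardware:

# A minimal "virtual machine": a tiny stack machine interpreted by a host.
# The addition below exists only as a pattern in the interpreter's run,
# not as a dedicated physical adder in the hardware.

def run(program):
    stack, ip = [], 0  # the VM's own state: a stack and an instruction pointer
    while ip < len(program):
        op, arg = program[ip]
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "PRINT":
            print(stack[-1])
        ip += 1

run([("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PRINT", None)])  # prints 5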

<snip>

>
> But now you have swapped from outer bodily activity to "inner
> structures" of the body. It seems to me that the term "body" is
> entirely inappropriate in this context. Take a computer -- it has a
> material body manifest to the eye, but it also instantiates a large
> number of virtual machines that are not. Why not replace your talk of
> "innter structures of the body" with the modern ontology of
> information processing mechanisms? Why not follow the prevalent
> paradigm in the field of cognitive science, which, granted, may not
> have a sophisticated understanding of materialist dialectics, but due
> to the division of labour, does concentrate all its labour on
> answering these questions.

The ultimate answer to your question has to do with the problem of theory-ladenness
and reference, believe it or not. If the mind were like a computer, how could it know
that its ideas were true? Its ideas are one thing, but their referents are totally
different. How could the chasm be bridged? I don't think it could. Reference requires
access to something akin to the object referred to. What could this be, for humans?
It is outward human bodily activity. Such activity is in *direct* contact with the object.
This activity continually and fluidly adapts to the object. This activity in fact changes
itself and the object simultaneously. Any designer worth their salt would, if they
could, ensure that self-awareness of outer bodily activity was at the heart of any
mechanisms associated with reference.
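
A toy sketch of the closed loop I have in mind (again my own construction; all names
and numbers are invented): the inner 'idea' is kept answerable to the object not by
comparing it with an inner copy, but through outer activity that is in direct contact
with the object and continually corrected by it:

# Toy closed loop: reference grounded in activity. The inner "idea" is
# made true of the object because outer activity touches the object and
# is continually corrected by it. (All names/numbers are invented.)

def track(object_position, steps=20):
    hand = 0.0  # outer bodily activity: where the hand actually is
    idea = 0.0  # the inner "idea" of where the object is
    for _ in range(steps):
        contact_error = object_position - hand  # direct contact with the object
        hand += 0.5 * contact_error             # activity fluidly adapts to the object
        idea = hand                             # the idea answers to the activity
    return idea

print(round(track(object_position=4.0), 3))  # ~4.0: the idea converges on the object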

>
> > Your argument re 'private' experience is of course heavily attacked
> > by Wittgenstein.
>
> He attacks the possibility of a private language. I do not think
> this is relevant.

It means that thoughts are public, like outward bodily activity.

>
> To avoid another confusion -- I do think as a rule of thumb that
> what is conscious is precisely what is social, although the vast
> majority of information processing that occurs is unconscious,
> following Freud.
>
> > How do you avoid a collapse to solipsism?
> You'll need to expand this point if I am to answer it.

Hmm... it doesn't make sense in this context, so best ignored!

Many thanks,

Andy

