From: Ian Wright (iwright@GMAIL.COM)
Date: Sun May 30 2004 - 14:22:23 EDT
Hi Andy,

> Ryle makes the mistake of 'missing out the mind' as Searle would put it.
> Certainly we have 'inner' subjective experience, in my view (do you
> agree?).

Yes.

> The question is what is the objective material process that
> expresses this subjectivity. You seem to think it is neurophysiological
> processes / 'mechanisms'. I think it is outer bodily activity, at the core
> of which is social labour. This is a materialist dialectics view of
> thought, not prevalent within AI discourse.

Whether it is a materialist dialectics view of thought or not, I think it slides into behaviourism.

To remove a confusion: I do not think that mind is reducible to neurophysiological mechanisms. I think mind consists of a large collection of virtual machines, ultimately implemented in neural mechanisms but not reducible to them. Many folk-psychological terms, such as "belief", "desire", "emotion", and so forth, pre-theoretically refer to states and processes of those virtual machines, and not only to behavioural dispositions that are ultimately observable, as Ryle and Wittgenstein, and perhaps Ilyenkov, maintain. In other words, mind is software, not only outputs.

AI "discourse" has contributed an entirely new ontology of information processing mechanisms. For example, how is it possible that matter can support mechanisms that have beliefs? There have been many philosophical answers to this question. The great merit of information processing theories of mind is that they allow us to build machines that have beliefs.

Moving on to your emphasis that mind is fully expressed in outer bodily activity. There are many objective mental processes and states that do not necessarily manifest in outer bodily activity, e.g. keeping a secret. You could argue that ultimately private information must make some kind of difference to outer bodily activity; for example, keeping a secret may cause someone to hesitate in certain contexts. I have no objection to that, except that reduction to observable behaviour does not exhaust the analysis. For example, why do some people hesitate when they try to suppress a secret? The answer is that it takes computational work to check and guard against the betrayal of a secret, and that work can take a discernible amount of time. Reduction of all mental terms to outer bodily activity misses such phenomena. Investigation of the mechanisms of mind tries to give an account of them.

To state this another way: thoughts are real, and not only dispositions to act. There is an enormous amount of internal behaviour that, unless communicated, remains private and unobservable. Another example: I may form a sentence in my mind in response to one of your points, then decide against it, and write a different one instead. The sentence I did not write but only thought did not manifest in outer bodily activity. Do you deny its existence? If the materialist dialectics view of thought denies the existence of mental states, processes and mechanisms, then it is wrong.

> To grasp 'happiness', you have to grasp objects, actions, ideas,
> sociality, language. Grasping neurophysiology doesn't do this, rather it
> (neurophysiology) should try to show one facet of how the activity and
> subjective state in question is enabled, *not* the nature of the action or
> the state.

As I mentioned above, I do not think ideas are reducible to neurophysiology, and I have never claimed that. Neither have I argued against the relevance of actions, the importance of social relations, or language, to the understanding of emotions.
Instead, I have usually given an account of mechanisms that support ideas, actions, and language. Information processing mechanisms are not "mechanical" in the sense of nineteenth-century notions of mechanism: they manipulate information, and information refers to referents. For example, there are information processing sub-states in my mind that refer to all kinds of social phenomena. So, of course, a full account of mental contents involves semantics and relations between the individual mind and the social world.

> There are no 'happiness' mechanisms waiting to be uncovered.

I have spent a good deal of time in the past arguing against the existence of a "happiness" mechanism, so I agree.

> What is waiting to be uncovered is how the inner structures of the body
> enable subjectivity / human activity.

But now you have swapped from outer bodily activity to "inner structures" of the body. It seems to me that the term "body" is entirely inappropriate in this context. Take a computer -- it has a material body manifest to the eye, but it also instantiates a large number of virtual machines that are not. Why not replace your talk of "inner structures of the body" with the modern ontology of information processing mechanisms? Why not follow the prevalent paradigm in the field of cognitive science, which, granted, may not have a sophisticated understanding of materialist dialectics, but which, due to the division of labour, does concentrate all its labour on answering these questions?

> Your argument re 'private' experience is of course heavily attacked by
> Wittgenstein.

He attacks the possibility of a private language. I do not think this is relevant here. To avoid another confusion: I do think, as a rule of thumb, that what is conscious is precisely what is social, although the vast majority of information processing that occurs is unconscious, following Freud.

> How do you avoid a collapse to solipsism?

You'll need to expand this point if I am to answer it.

Gotta dash ... sorry for the abrupt end.

-Ian.
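
P.S. To make the point about unexpressed inner states a little more concrete, here is a deliberately toy sketch in Python (the class, its methods and the example "beliefs" are all invented for illustration; nothing here is offered as a model of a real mental mechanism). It shows a tiny virtual machine whose internal state -- a belief store -- is perfectly real, and is specified at the software level, even while nothing in its outer behaviour yet expresses it.

class TinyMind:
    """A minimal virtual machine with an unobservable internal state."""

    def __init__(self):
        self.beliefs = set()      # internal state; not itself behaviour

    def perceive(self, fact):
        self.beliefs.add(fact)    # a state change with no outward sign

    def asked(self, question):
        # Only here does the internal state make a behavioural difference.
        return "yes" if question in self.beliefs else "don't know"

m = TinyMind()
m.perceive("it is raining")       # nothing observable happens yet
print(m.asked("it is raining"))   # -> yes
print(m.asked("it is snowing"))   # -> don't know

Between perceive() and asked() the belief exists as a state of the virtual machine, whatever hardware it happens to run on -- which is all I mean by saying that mind is software implemented in, but not reducible to, its substrate.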
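
P.P.S. A second toy sketch, this time of the claim that guarding a secret takes computational work which shows up as a discernible delay. Again, the names and numbers are invented purely for illustration.

import time

SECRETS = {"the surprise party", "the password"}

def speak_freely(thought):
    # No inner check: the thought is uttered as it comes.
    return thought

def speak_guardedly(thought):
    # Inner check: scan the candidate utterance for anything that would
    # betray a secret, and suppress it before speaking.
    for secret in SECRETS:
        if secret in thought:
            return "er... nothing much"   # hesitation / suppression
    return thought

def timed(speak, thought, reps=200000):
    start = time.perf_counter()
    for _ in range(reps):
        speak(thought)
    return time.perf_counter() - start

thought = "we are planning the surprise party for Friday"
print("free   :", timed(speak_freely, thought))
print("guarded:", timed(speak_guardedly, thought))

The guarded speaker's extra latency is the only outward trace of an inner checking process that is itself unobservable -- the kind of phenomenon a purely behavioural analysis can record but cannot explain.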