Tuesday, December 29, 2009

Bees and perspectives

An old DI2 post. 

BDK: Afterthought: it would be great for antinaturalists to answer Bennett's question. In general, your answer to this question starkly reveals your philosophical stripes. This is all about propositional thought and the like, truth, reference and all that.

As for what you'd have to add to make bees conscious, or whether bees are already conscious, I have no strong opinion. I think Dretske believes they are conscious. I am agnostic. Do qualia precede propositional contents in evolution? I tend to think so, but am not sure: even leeches might feel little flashes of pains and excitements.

VR: I think what is needed is the perspective of an agent who sees certain things as the case, and who is introspectively aware of what it means when it says something.

Example: I enter a conversation and misuse a word consistently. The community of language speakers makes a word mean one thing, but I meant something else, and in spite of the sniggers that I got from everyone, I think to myself "But I was using it to mean that." I can recognize two words that sound the same but mean different things, and I can identify two words that mean the same but sound different.

Add to this the perception of necessary relationships that obtain among propositions. We have to be people who exist at particular places and times who know that some things exist regardless of place or time. And I see difficulty with that so long as what gives us pieces of information are temporally locatable physical brains and causal connections from those brains to particular states of affairs in the world.

Now, could we solve these problems naturalistically if we could just solve the hard problem of consciousness naturalistically? My answer is that raw feels by themselves aren't going to solve it; we're going to need a connection between consciousness and the mental states involved in rationality.

6 comments:

Blue Devil Knight said...

Bennett's question (from his wonderful book Rationality) is, 'What capacities would we have to add to bees before we would consider them rational?' I first brought up Bennett's work here.

Let me elaborate on the issues. First, note that bee brains clearly represent states of the world (nectar location) and communicate such contents to other members of their kind (the bee dances). How does this primitive representation-communication system need to be supplemented before we should describe bees as rational animals? This assumes you think bees are not rational.

The present post is Victor's answer. I don't have a worked-out opinion on the matter.

It's a brilliant little thought experiment. Naturalists, antinaturalists, agnostics, and apathetics alike can have a go at it.

Blue Devil Knight said...

As far as my more recent thoughts on Bennett's question, I believe that Robert Brandom's recent work provides a very interesting framework.

Brandom describes fairly precisely how you might scale the semantic ladder from simple representational systems (e.g., bees) to more complex representational systems.

Brandom's paper 'How Analytic Philosophy has Failed Cognitive Science', which you can find here, goes over this in some detail.

His seems to be a very interesting approach to Bennett's question, and it resonates with Victor's response. According to Brandom, the emergence of the ability to make (and recognize) inferences allows for the emergence of new types of thoughts.

Brandom thinks it was a crucial semantic achievement when animals emerged that were no longer constrained to think simple thoughts of the form 'A is B' (as a bee's representational states may well be described). This change marks the ability to entertain conditionals that relate two propositions, and hence to discuss (and understand) the act of inference.

I'm still mulling over Brandom's work, frankly. It is very tough at parts, and he sometimes comes off a bit arrogant (e.g., putting his four tiers of semantic content on the same level of intellectual achievement as the Chomsky hierarchy from the theory of computation is a bit self-congratulatory).

I clearly agree with a lot of what he says, and I clearly disagree with a lot. I haven't waded through the twists and turns yet. Too soon for me to have a strong opinion. There is a lot of great stuff in there, sometimes buried in the self-hype and premature conclusion-drawing.

It would be wonderful if a philosopher out there would write a paper 'Brandom answers Bennett' or some such, as Brandom (to my knowledge) never explicitly discusses Bennett's question.

Note this is all orthogonal to the question of naturalism/nonnaturalism. The naturalist and antinaturalist alike could agree on the features that must be added to a simple system of representational states before we'd agree it supported rationality. E.g., we could argue about whether consciousness of X is necessary for rationality even if we disagree about whether consciousness is natural.

Mike Darus said...

I probably don't comprehend the real issues, but I think it would take a bee communicating something like, "Hell, no. I won't go." That would be a sign of rationality in the midst of blind, instinctive obedience.

Blue Devil Knight said...

Mike, I think it is possible to be rational but completely obedient, e.g., a servant of God or something who carries out God's will extremely well. That said, your point is interesting, focusing on the (relative) lack of individuality, autonomy, and independence in the bee case. I had never considered that (Jean-Luc Picard would be proud).

Steven Carr said...

VICTOR
As for what you'd have to add to make bees conscious, or whether bees are already conscious, I have no strong opinion.

CARR
Yes, dualists can contribute nothing to our understanding of how things work, being unable even to say what is or is not conscious.

Blue Devil Knight said...

Steven Carr posted:
VICTOR
As for what you'd have to add to make bees conscious, or whether bees are already conscious, I have no strong opinion.

CARR
Yes, dualists can contribute nothing to our understanding of how things work, being unable even to say what is or is not conscious.


It was actually me, not Victor, who wrote the first bit.

My lack of a strong opinion is based on the lack of a consensus, empirically grounded theory of consciousness (unless Carr knows of research I do not).