I’m on the road again. This time visiting Washington State wineries. Did you know that Washington State is the USA’s second-largest wine-producing state, with 20,000 hectares of vines, about double the vineyard area of Oregon and one-fifth that of South Africa? I shall be looking for stories, interesting people, worthy patches of terroir and great wines.
But first, some slightly jet-lagged considerations on the nature of consciousness and perception. On the flight over to Seattle I watched Ex Machina, one of the best movies I have seen in a long time. Smart and disturbing in equal measure, it explores the concept of artificial intelligence. It raises a lot of questions.
Is this related to wine? Well, wine tasting is a perceptual process that involves a lot of senses, and consciousness plays a role.
In Ex Machina, Caleb, a smart young employee of a Google-like organization, is selected to spend a week at a remote hideaway that’s home to Nathan, the scary genius who founded said organization. Nathan has brought Caleb here to be the human component in a Turing test, assessing whether his latest AI, the stunningly beautiful Ava, possesses intelligence that can’t be distinguished from a human’s.
So, this film has got me asking several questions.
First of all, I’m curious about the central role of language in perception. It is through naming objects and learning about the usual and expected behaviours of these objects that we are able to begin to understand the world around us. We manipulate the world through recognizing and picking out the features of the visual world around us in the form of objects. It is a quick way to compute our environment, because for us to function and survive, we need to be able to process what is around us in milliseconds.
Language seems to be hard-wired. We are born with a capacity to acquire it: the structures are all there, it seems, and the actual language we use is the bit the environment provides. Language is fundamental to being human, because we are social beings, and the reason we have these big brains is likely because we need them to compute the really difficult stuff: keeping track of our social relationships, which are fundamental to our success and survival. But to what extent does the specific language we use to think and converse with shape our perception? Does the vocabulary we have shape our experience of the world, to a degree? Does a French speaker see the world slightly differently to someone who thinks and speaks in English, even though the languages are closely related?
Another thing language does is to permit the oral and written traditions that form the narrative for people-groups. The stories we take on board tell us who we are, where we fit into the world, how we are required to respond and behave, and many other factors that shape our interaction with, and perception of, the world around us.
Language, and the related concept of narrative, are crucial to how we understand wine. Wine on its own is a liquid. It contains chemicals, some of which have tastes and smells. But wine is so much more than a liquid with flavour. And flavour itself is not the result of us merely perceiving these chemicals: it is influenced by what we bring to the tasting experience, even without our being aware of it.
So, what about consciousness? Is it possible to create an AI with consciousness, as in Ava in Ex Machina? Can a machine have theory of mind? I guess the first step in creating an AI that might be able to do this would be to understand how we do it: what is the neural basis of consciousness? Can it be understood with reductionist approaches? As far as I know, these questions haven’t been answered properly by scientists, so this remains a block for AI.
What am I doing as I am conscious? What is going on? I think that an important perspective here is that sensation is a unity. We tend to break down perception into separate senses, but in truth my conscious experience isn’t modular like this. I experience a seamless, unified perception that involves all senses, plus my memory, plus my internal state, plus the thoughts that come to mind, some verbalized, some not. There’s also the issue of attention: what do I choose to focus my perception on? Some of this is more-or-less automatic; some is under my conscious control. So in one sense, I could say that consciousness is the unity of my sensation.
So, a thought. Would a true AI have a consciousness that comes from experiencing the world through these unified sensations? Could you have true AI just from the processing of words and ideas, or would the experience of the world across a number of sensory modalities, coupled with memory, be needed? Would you have to create an AI with certain hardwired mental modules (just as we are born with) that are then informed by experience of the world? In this sense, perhaps it would be necessary for the AI to start off like a baby, and then grow up, for it to have something that would resemble human consciousness.
Humans are remarkable. In some ways it is reassuring that being human is something that we have, as yet, found too hard to mimic convincingly with a machine.