Monday, December 11, 2017

Consciousness and language: why AI will never succeed

This Blog Was Written By
Gabriela Armas

It took me three weeks to pen this down into something concrete. I've written and rewritten it over a dozen times while dozing off; it felt like I only ever really wanted to write this post in my sleep.

The relationship between consciousness and language is something linguists toy with pretty frequently. Most often, it's reduced to the question of "do we think in language?" and related discussions, which then twist into the notion that one does not need language to think at all.

One thing particularly excruciating to endure is listening to such discourse presented as binary postulations, a habit often found in academic settings. People, all too often, are so desperate to 'come up with something' that, in trying to neatly tie all the answers into one, they overlook the nuances of a complex philosophy and debate.
It is true that you don't need language to think; Pinker makes several good points in (1). He describes a never-ending feedback loop: the mode of thought in which we access our ideas, and the context we produce, plug right back in and shape how we think, which shapes how we speak, which shapes how others process our thoughts and, in turn, what they think and say. He hesitates, however, to tie this to concepts like self-awareness (in all fairness, this is a four-part video, and only part 1 is available).

But ultimately, what does it all mean when we're discussing consciousness and self-awareness? Just because individuals know they exist, are they aware that others exist outside of themselves? That others have lives of their own, feelings of their own, experiences and perceptions of their own?
Does language, its utilization, its very concept, and a proper grasp of how to convey context make one more readily aware of others' consciousness as well? And if that is the case, can you teach it?
This image, or meme really, is what spurred this question in my mind. It was funny, horrifying, and completely fascinating all at the same time...

(The following is a Reddit post.)

After reading it, I kind of just sat with my mouth open, trying to process the fact that there are actually people like this. And even further, there are fully neurotypical people who may in fact live their entire lives never reaching even this baseline of self-awareness. I'll link the Reddit thread at the bottom as well (2); that specific user has a bunch of other posts that are just as peculiar and deal with depersonalization, which is pretty atypical. But the comments and experiences in the thread are pretty interesting (and also exactly what you'd expect to see on Reddit).

I'm sure by now many of you have heard people say that if you talk to yourself, you're crazy, or alternatively, that if you talk to yourself, you're a genius. Many researchers have tried to distinguish between the two, but overall, talking to yourself is pretty universal. There are even different layers of it, with different names: soliloquy, private speech, silent speech, subvocalization, inner monologue, etc. But the reason many people associate the practice with intelligence is that it helps you retain and understand concepts more clearly, as well as remember and find things more easily.

By this logic, I'm inclined to believe the original poster on Reddit and their experiences, and would even be inclined to say that thinking in these more concrete ways, rather than in weird, floaty, abstract thoughts without words, would train you to be more aware of others and possibly even more empathetic.

Moreover, could you train AI this way?

If any of you are into sci-fi or tech in any way, then you probably have some idea of what the Turing test is (as well as its namesake, Alan Turing). Basically, it's a test meant to measure the intellectual capabilities of an artificially intelligent program (being?): if it can converse with a person and go undetected, such that the human thinks they're speaking with another human, then you've successfully produced a strong AI, one with consciousness, sentience, and a mind.
If consciousness were not tied to language so strongly, then why is this linguistic interaction our basis for consciousness? Trying to teach or convey context even to another human is nearly impossible; there are far slimmer chances that you can teach such an imperatively nuanced idea or skill to a computer. That may sound dramatic to some of you, but by context in this sense, I mean your entire life as context. This is why, a lot of the time, you can see somebody 'understand' what you're saying, and they may even identify with whatever idea you're expressing, but they don't and probably won't ever fully get it, because they haven't gone through the rhythm and sensations, feelings and responses, thoughts and desires of your own sublime consciousness.
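For anyone who hasn't seen it spelled out, the structure of Turing's imitation game can be sketched in a few lines of code. This is a toy, hypothetical sketch, not anyone's real implementation: the respondents, the judge, and the scoring are all invented for illustration. The point it demonstrates is that "passing" just means the judge can't do better than chance at picking out the machine.

```python
import random

def respond_human(question):
    # Hypothetical stand-in for a real human interlocutor.
    return "Hmm, let me think... " + question

def respond_machine(question):
    # Hypothetical stand-in for the program under test;
    # here it simply imitates the human's style exactly.
    return "Hmm, let me think... " + question

def imitation_game(questions, judge):
    """One round: the judge sees answers from labels 'A' and 'B'
    without knowing which hides the machine, then guesses."""
    assignment = {"A": respond_human, "B": respond_machine}
    if random.random() < 0.5:  # shuffle who sits behind which label
        assignment = {"A": respond_machine, "B": respond_human}
    transcript = {label: [fn(q) for q in questions]
                  for label, fn in assignment.items()}
    guess = judge(transcript)  # the label the judge thinks is the machine
    truth = next(l for l, fn in assignment.items() if fn is respond_machine)
    return guess == truth      # True = machine detected

def random_judge(transcript):
    # With indistinguishable answers, the judge can only guess.
    return random.choice(list(transcript))

rounds = 1000
detected = sum(imitation_game(["What are you thinking about?"], random_judge)
               for _ in range(rounds))
print(f"machine detected in {detected}/{rounds} rounds")  # hovers near 500
```

Because the machine's answers are indistinguishable from the human's, detection stays around 50%, which is exactly the condition under which the machine is said to pass.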
...This thought is getting a little *too* abstract, so I guess I'll end here. 

I'd be EXTREMELY interested in hearing your emotional reactions or thoughts about the Reddit post.

