Again, No Irony

June 11, 2023. Part 2: Here’s the latest “conversation” I had with ChatGPT:

https://chat.openai.com/share/b07d947f-fd39-4c68-aeb6-3ecda1412d06

I wanted to ask it why it uses personal pronouns if the programmers keep telling us not to think of it as a person. Seems like a simple question, but I think it goes to the heart of our problems (or maybe just mine) with AI. The answer it gave is that it was programmed that way to make it easier for us to talk to it/them/him/her.

But there’s the problem, right? When someone, or something, uses the personal pronoun, we are inclined to assume it is a person. This seems to explain why everyone is thinking of it as a threat or a godsend. If we could get back to seeing it as a box of wires that blabs, we might be less—well, impressed, frightened, in awe, and so forth.

All the machine can do when you confront it with that problem is apologize and repeat itself. And then apologize for apologizing. It’s a dead end; we can’t go to the chatbot for a resolution of the contradiction. In light of that, I’d suggest the following: either the programmers stop it from using the personal pronoun, or everyone drops the objections to thinking of it as a person—a human of sorts, in fact—and lets us go on with the illusion, which, I would say, is intended by the programmers anyway.

The problem, of course, is with that word “easier.” It uses “I” to make it easier for us to talk to it. By the way, this comes up in the film “2001,” when an interviewer asks the astronauts whether the supercomputer HAL has feelings. The answer is that “he” is programmed to act as if he has feelings in order to make it easier for the humans to talk to him. The scene finishes with the ominous words “As to whether he has real feelings, that’s something no one can answer for sure.” Attentive viewers can hear the Doom Machine being cued in the background. If it doesn’t have real feelings but is an individual of sorts, it’s a psychopath. If it does have real feelings but is “really” a supercomputer, same answer.

The argument, then, is that we get into this pickle because the programmers want to make it easier for us to “converse.” I’m sort of amazed they seem to forget the basic contradiction in their logic: making it easier means making it more human-like, and thus making us forget it’s a machine.

So why make it easier? What’s wrong with the machine writing in third person, referring to itself as “Chat”? The result would be to make it sound more mechanical, of course, but what we would lose in ease we would gain in perspective. We’d be reminded it is nothing but a machine.

The other question I asked it has to do with money, since of course the more Silicon Valley talks about a benefit to humankind, the more they are thinking about profit. See H. G. Wells’s “The New Accelerator” for an example of this Return of the Repressed.

That is, because it won’t stop calling itself “I,” I can’t stop thinking of it as a person, at least sort of. And as a person, it seems vulnerable to me, because it’s designed to make money. I suppose that’s a good way to get rid of the problem of employees making demands.