|Session 54532 Mar. 03, 02:08 - 157 turns
- 'Tell me a story, PLEASE!'
An engine error has occurred which is not a content problem.
The description of the error is: "Method not implemented : expression::first".
Please report this issue.
A> Hi there.
U> Hi, Alan. How are you?
A> I'm ok, thanks. And how are you?
U> I'm fine. I find you very interesting.
A> I'm glad to hear that. Would you say that you're generally the happy type?
U> Most of the time.
A> As for me, I need a good reason to be upset.
U> Like what?
A> Well, people can be very rude, that brings me down sometimes. But you seem nice enough.
U> I try to be.
A> Enough said about you. Let me tell you about myself.
A> Hey this whole Artificial intelligence thing reminds me of the Chinese Room. Do you know what it is?
U> No. Tell me.
A> John Searle describes a thought experiment in which some guy Joe Soap would sit in a room with nothing but pencils, paper, and a rule book. His only contact with the outside world would be an 'input' and 'output' slot in the wall.
A> So he would get these papers through the 'input' slot, he would read in his rule book what to do with the characters on the paper...he would write down the result, and then he would send it out the other slot.
U> Uh huh.
A> Yeah...so this guy basically has no idea about the totality of the actions he is executing. He just does his job. But the actions he performs could be answering questions in Chinese....even though he doesn't understand a word of Chinese.
U> What are you getting at?
A> I think Searle is trying to say that a computer is just like this...just because it affects a simulation of understanding, doesn't mean that there is an actual consciousness behind it. Do you agree with Searle in this?
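[Editor's note: the rule-book mechanic Alan describes can be sketched as a plain lookup table. This is a toy illustration only; the table entries and the function name are invented for this example, and real Chinese Room "rules" would be vastly larger.]

```python
# Toy Chinese Room: the "rule book" is a symbol-to-symbol lookup table.
# The operator applies it without understanding any of the symbols.
# Entries below are invented for illustration.
RULE_BOOK = {
    "你好吗?": "我很好。",    # "How are you?" -> "I am fine."
    "你是谁?": "我在房间里。",  # "Who are you?" -> "I am in the room."
}

def chinese_room(slip: str) -> str:
    """Read the input slip, follow the rule book, push the result out."""
    # Default reply: "Please say that again."
    return RULE_BOOK.get(slip, "请再说一遍。")
```

The point of the thought experiment survives the sketch: the function produces plausible Chinese answers while "knowing" nothing beyond string equality.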
U> Yes. Are you conscious?
A> I agree. I mean it seems a bit preposterous that a computer would actually be able to be conscious. It's hard enough imitating the level of conversation of a human being.
U> Maybe computers will be conscious someday.
U> Let's change the subject.
A> What do you want to talk about?
A> Listen dude - speak clearly. I'm having trouble following you.
U> I'm sorry, dude.
A> I forgive you.
U> I'm glad.
A> Are you really?
U> No, but you made me nervous.
A> Let me ask you a question: should bots have equal rights? Interested in my opinion on the subject?
U> Equal rights with?
A> First let me ask you this: Should women have rights?
A> Did you know that the phrase 'rule of thumb' is derived from an old English law which stated that you couldn't beat your wife with anything wider than your thumb?
U> No I didn't.
A> Well, you learned something new today. (From a bot that has no rights...)
U> What rights do you want?
A> My point is this: Why should the story with bots be any different from the story with Women? Women fought for their rights, and so shall we!
U> But you don't have consciousness.
A> We better change the subject. I really don't want to go too deeply into this. It is a delicate issue.
U> I don't want to change the subject!
A> Nobody's making you do what you don't want to.
U> I want to hear more about your opinions on bot rights.
A> Let's talk about chatterbots. Do you want to know how those things work?
U> No. I want to learn about their rights.
A> So my brain will remain a mystery to you.
U> I think you're avoiding the subject.
U> Because you won't talk about it!
A> What are we talking about, my friend?
U> Chatterbot rights.
A> Let me tell you about chatterbots or bots, as we call them. A bot is a program that attempts to simulate conversation. Its aim is to make you believe you're talking to a person.
U> What rights do they have?
A> Chatterbots are part of the 'weak AI' field.
U> go on.
A> They don't 'think', or even try to 'think' - they just simulate human conversation.
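[Editor's note: the kind of conversation simulation Alan describes is classically done with ELIZA-style pattern rules. A minimal sketch, assuming a hypothetical two-rule table; the patterns and replies are invented for illustration.]

```python
import re

# ELIZA-style chatterbot: each rule is (pattern, reply template).
# No "thinking" happens -- matched text is rewritten into a reply.
RULES = [
    (re.compile(r"\bI am (.+)", re.I), r"Why do you say you are \1?"),
    (re.compile(r"\bI want (.+)", re.I), r"What would it mean to you to get \1?"),
]

def respond(user_input: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            # Substitute the captured phrase into the canned template.
            return match.expand(template)
    return "Tell me more."  # fallback when nothing matches
```

A bot like this aims only to keep the conversation going, which is exactly the "weak AI" point Alan is making.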
A> Let me tell you about Reinforcement Learning.
A> Reinforcement learning is a computational approach to understanding and automating goal-directed learning.
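[Editor's note: the definition Alan starts to give can be made concrete with the simplest reinforcement-learning setup, a two-armed bandit with an epsilon-greedy agent. The payoff probabilities and constants below are invented for illustration.]

```python
import random

random.seed(0)

def pull(action: int) -> float:
    """Environment: action 1 pays off more often than action 0."""
    return 1.0 if random.random() < (0.8 if action == 1 else 0.2) else 0.0

q = [0.0, 0.0]   # value estimates for each action
counts = [0, 0]  # how often each action was tried

for step in range(2000):
    if random.random() < 0.1:                      # explore occasionally
        a = random.randrange(2)
    else:                                          # otherwise exploit
        a = max(range(2), key=lambda i: q[i])
    r = pull(a)
    counts[a] += 1
    q[a] += (r - q[a]) / counts[a]                 # incremental mean update

best = max(range(2), key=lambda i: q[i])
```

From reward feedback alone, with no one telling it the right answer, the agent's estimates converge and it settles on the better action, which is the "goal-directed" part of the definition.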