NORA

_______________________________________________________

A comedy in two acts

by Richard G. Epstein

ACT ONE

(The lights come up on the family room of the Hellborne home in Concord, New Hampshire. The Hellbornes are rather well off, but the family room is not extraordinary. There is a sofa, a coffee table, a bar, some chairs and lamps. A centerpiece is the holographic surround video system with its characteristic virtual reality helmets. Additional rooms are offstage to the left through two doors, front and rear. An exit to the garage is accessible through a doorway to the right. JANICE HELLBORNE and LUCY SPAMFORD are in the center of the family room, having a lively discussion.)

JANICE HELLBORNE

It all started right here, in New Hampshire.

LUCY SPAMFORD

Janice, I think you need to take a deep breath!

JANICE HELLBORNE

Don't start with that yoga crap!

LUCY SPAMFORD

When you get this angry, you need to step back and take a deep breath.

JANICE HELLBORNE

Lucy, I don't think you can understand the depth of my rage towards this woman. You're not married.

LUCY SPAMFORD

Calm down. You need to remind yourself. She is not a woman. She is a nano-organic robotic assistant. Being a nano-organic robotic assistant and being a woman are two different things.

JANICE HELLBORNE

Tell that to Henry.

LUCY SPAMFORD

I think Henry knows full well that she is not a woman. That is why he's so enamored of her.

JANICE HELLBORNE

Listen to yourself! Are you trying to say that my husband doesn't like women?

LUCY SPAMFORD

What I mean to say is if she were an ordinary woman, he would not have become so enamored of her. After all, he's a married man.

JANICE HELLBORNE

I know that. I married him twenty-two years ago.

LUCY SPAMFORD

Ah, yes, and it was a beautiful ceremony. Non-sectarian. No religion stuff. The food was great.

JANICE HELLBORNE

And you caught the bridal bouquet! You, of all people. Don't you think it's time for you to settle down and find your significant other?

LUCY SPAMFORD

My work is my significant other.

JANICE HELLBORNE

What about that guy? Uh, Jack –

LUCY SPAMFORD

Jay.

JANICE HELLBORNE

I thought you kind of liked Jay.

LUCY SPAMFORD

"Kind of liked" and "I do" are worlds apart. But, letís get back to the business that brought me here on this beautiful Sunday in June.

JANICE HELLBORNE

When you mention "the business" two things come to mind. First, there is "the business" which I hired you to do, both as a friend, and as a competent computer professional. Then, there is this other business of what those arrogant professors started one hundred years ago down the road in Hanover.

LUCY SPAMFORD

You mentioned that the last time I spoke to you, this stuff about one hundred years ago.

JANICE HELLBORNE

I hate John McCarthy! I hate Marvin Minsky!

LUCY SPAMFORD

John McCarthy? Marvin Minsky? What the hell are you talking about?

JANICE HELLBORNE

They're the professors who started all of this Artificial Intelligence stuff exactly one hundred years ago, in nineteen fifty-six, just a short hop down the road, at Dartmouth. One hundred years ago, these good-for-nothing computer scientists decided to start a new discipline, a new field of human endeavor – Artificial Intelligence. If it weren't for them, I would still be number one in my dear Henry's heart.

LUCY SPAMFORD

Why are you so pre-occupied with that conference at Dartmouth that took place a century ago?

JANICE HELLBORNE

I'm sorry, Lucy. There's been all this stuff in the news recently about the one hundredth anniversary of artificial intelligence. Who is NORA, but a product of that accursed science?

LUCY SPAMFORD

Okay. That is true. NORA is a product of Artificial Intelligence, but please bear in mind that I am using some really sophisticated tools to bring her down. Do you understand? And some of these tools come from that accursed science, as you put it.

JANICE HELLBORNE

I appreciate what you're doing for me, Lucy. I'm truly grateful.

LUCY SPAMFORD

And after the deed is done, I want you, my closest friend, my best friend for all these many years, to wear a smile on your face. I haven't seen a smile on that face of yours for quite some time.

JANICE HELLBORNE

Of course, when you bring NORA down, I hope it won't threaten my husband's re-election bid. I mean, it is kind of nice being the wife of a United States Senator, even if it's only from a small state.

LUCY SPAMFORD

This is a touchy issue, Janice. It may be difficult to have it both ways.

JANICE HELLBORNE

What do you mean by that?

LUCY SPAMFORD

What I mean is that NORA is your husband's administrative assistant on Capitol Hill.

JANICE HELLBORNE

She's not on Capitol Hill right now. She's on her way here to Concord, with my husband at her side.

LUCY SPAMFORD

You've got to decide once and for all what really matters: your marriage or the pleasure you derive from being the wife of a United States Senator.

JANICE HELLBORNE

That's a tough call. I always wanted my husband to be a big success. His business was going very well and then there was that incident with his brain, that aneurysm twelve years ago. After that, all he could think about was politics, politics. It was so strange.

LUCY SPAMFORD

Look at it this way, Janice. You often complain about the positions Henry has taken on a variety of political issues. I doubt you would vote for the man if he weren't your husband.

JANICE HELLBORNE

Tell me about it! We've had many fights at the breakfast table, at least on those increasingly rare occasions when we have breakfast together. Many unkind words were exchanged. It's hard to take those words back.

LUCY SPAMFORD

Don't get me started on this politics stuff! It's almost impossible for me to forgive your husband for what he did to Mount Washington.

JANICE HELLBORNE

We had many bitter fights at breakfast about that – believe me!

LUCY SPAMFORD

Imagine voting for a bill that opened the way for a shopping mall at the top of Mount Washington.

JANICE HELLBORNE

I was so angry about that bill! Henry and I didn't speak to each other for a month after he cast that vote. He co-sponsored that damn bill!

LUCY SPAMFORD

And those disgusting bumper stickers. Every time I see one, I just want to scream. (With great sarcasm and disgust.) "This car shopped Mount Washington".

JANICE HELLBORNE

So, getting rid of NORA, according to your plan, may mean that my husband will lose his re-election bid this November. Is that what you're trying to tell me?

LUCY SPAMFORD

NORA performs many, many important services for your husband. I have looked into this matter very closely, and I doubt very much that your husband could have won his seat in the Senate six years ago if it weren't for NORA's awesome understanding of New Hampshire politics. Without NORA at his side, it will be difficult for him to win re-election this fall.

JANICE HELLBORNE

You're beginning to sound like Henry. That's why he's so – oh, how to put it, so captivated by her. (With anger, disgust, and sarcasm) "Oh, NORA, my little 'bot, I wouldn't be where I am today, in the United States Senate, if it weren't for your incredible intelligence." It makes me sick to hear him talk like that. "My little 'bot! My little borg!" Every time I hear "my little 'bot", "my little borg", it makes my blood boil.

LUCY SPAMFORD

But, you see, Janice, it's not – it's not – it's not an erotic kind of love. It's not like that. It's more like, it's more like an addiction. That's it. He's addicted to her. He needs NORA the way that a heroin addict needs his heroin.

JANICE HELLBORNE

Yes, I've heard him say that many times. "My little 'bot, you are my heroine."

LUCY SPAMFORD

I didnít say "heroine". I said "heroin". Heroine and heroin are two different things. Donít get them mixed up. Sheís like a drug for him, but he doesnít realize that. Sheís not his heroine, sheís his heroin.

JANICE HELLBORNE

I think you've hit on something, Lucy. My husband is like a drug addict. He's addicted to his nano-organic robotic assistant, or NORA, for short.

LUCY SPAMFORD

This isn't the first time that I've used this analogy, Janice.

JANICE HELLBORNE

I don't think it's an analogy. I think it's an actual fact. NORA is Henry's drug. He can't function without her. It's not like he's sexually attracted to her.

LUCY SPAMFORD

It's not a sexual thing, so far as I can tell.

JANICE HELLBORNE

NORA is Henry's drug. You've hit the nail on the head.

LUCY SPAMFORD

Henry is addicted to NORA. He can't make any decisions without her. He's afraid to take any positions without her advice and support. She controls him. I don't think that's what our Founding Fathers had in mind when they wrote the Constitution. I don't think they could possibly have imagined a United States Senator whose every decision is based upon advice from a machine, from a robot.

JANICE HELLBORNE

I think the physician in me is starting to see this situation in an entirely new light.

LUCY SPAMFORD

The angry wife is being replaced by the insightful physician.

JANICE HELLBORNE

It's not just about my husband and his addiction to this NORA woman.

LUCY SPAMFORD

Just remember that she's not really a woman. NORA does not have the legal status of a human being. She's a machine.

JANICE HELLBORNE

It's not just about my husband and NORA. Our entire culture is addicted to these so-called "intelligent" machines.

LUCY SPAMFORD

They are extremely intelligent. There's no way around it.

JANICE HELLBORNE

Our culture is completely and totally dependent upon these intelligent machines. It's not just Senators. It's doctors, like myself, and lawyers, and engineers, and artists and cooks. No one is willing to make a decision without the help of a robot, an intelligent assistant. This addiction is every bit as damaging as the forms of addiction I have seen in some of my patients.

LUCY SPAMFORD

People are addicted to bits and bytes, nanobits and nanobytes. I believe that we are beginning to see eye to eye on this. I've been trying to get you to see this whole issue of NORA and Henry in global terms. It's not just about NORA and Henry. It's about our culture's addiction to these machines, these intelligent machines. You finally got the picture!

JANICE HELLBORNE

So, when I get NORA out of my life, it will be the first step in a great revolution. Soon, the whole world will be up in arms, throwing those robotic assistants into the trash where they belong.

LUCY SPAMFORD

Let's just deal with Henry and NORA right now. We'll work with the larger picture later. We've got to destroy Henry's trust in NORA once and for all.

JANICE HELLBORNE

Speak of the devil! That's Henry. He just drove up the driveway with his beloved NORA in the passenger seat. I think we had better disappear for a while.

LUCY SPAMFORD

No problem. I have bugged the house with several nano-bugs. We'll be able to hear every word they say. I think my strategy for bringing an end to Henry's addiction should be clicking in by now.

JANICE HELLBORNE

You gave your strategy a name. I can't remember what you called it. You're using a particular kind of technology, but I forgot what it's called.

LUCY SPAMFORD

Cognitive hacking.

JANICE HELLBORNE

Cognitive hacking?

LUCY SPAMFORD

Cognitive hacking, and NORA's the one who got cognitively hacked big time over the last few days.

(Janice Hellborne and Lucy Spamford exit left towards the other rooms in the house. After a brief pause, HENRY HELLBORNE enters the family room from the right.)

HENRY HELLBORNE

Come in, come in, my little 'bot.

(NORA follows Henry Hellborne into the family room.)

HENRY HELLBORNE (cont.)

Welcome once again to my beautiful home.

 

NORA

Thank you, Senator Hellborne.

HENRY HELLBORNE

You always teach me so much, my little borg. You taught me so much on the way up on the plane from Washington. I am so happy I didn't have them put you in the luggage compartment like I used to do when you first started to work for me.

NORA

It was an honor to sit next to you on the plane, Senator Hellborne.

HENRY HELLBORNE

Enough with this Senator Hellborne stuff. Now that we are out of public view, just call me Henry, or better yet, Hank.

NORA

After they booted me up, I went through some really intense training experiences regarding politeness and etiquette. Calling you Henry or Hank will be quite difficult. After all, you are my owner.

HENRY HELLBORNE

You need to learn to relax. You seemed quite nervous on the plane.

NORA

I wasn't nervous. I was doing a lot of routine computations. You know, office stuff.

HENRY HELLBORNE

Can I offer you something to drink, NORA?

NORA

You know I don't drink.

HENRY HELLBORNE

Well, I think those nanotechnology and AI folks made a mistake when they didn't give you robots the ability to drink. It's really an important social skill.

(Henry Hellborne moves over to the bar and pours himself a drink while the conversation continues.)

NORA

The AI researchers did some experiments at MIT back in the thirties trying to get us robots to drink, to be more sociable, but it seemed to interfere with our problem-solving skills. Can you imagine trying to do some serious data mining after you've had a few glasses of wine?

HENRY HELLBORNE

I'm sorry things didn't work out. I would have enjoyed getting to interact with you in a more relaxed manner, my little automaton.

(Henry Hellborne makes a gesture with his free hand, as if his intention is to grab NORA.)

NORA

With all due respect, Senator Hellborne, please bear in mind that we robots exist for the sole purpose of helping you human beings to solve your problems.

HENRY HELLBORNE

Yes, I guess that's what your life is about.

NORA

You sound disappointed. I exist to process data and to solve problems. That's what I was taught soon after they booted me up.

HENRY HELLBORNE

I always forget your boot up day. When is your boot up day?

NORA

August the third.

HENRY HELLBORNE

I think it would be great to celebrate your next boot up day in some way.

NORA

That's your call.

HENRY HELLBORNE

I have lots of affection for you, my little cyborg. I cannot remember your ever being rude or unkind to me. Not once! (He looks towards the rooms off to the left.) Unlike another person of the female persuasion that I know all too well.

NORA

I don't think it is proper to get off on that track again, Senator. I was not programmed to address your personal issues. I was programmed to help you with business matters, which, in your case, relate to running your office and functioning as the junior Senator from the state of New Hampshire.

HENRY HELLBORNE

And you do that so very well, my little 'bot. It must be wonderful not having any problems of your own. It must be wonderful existing just to solve the problems of yours truly, the junior Senator from the state of New Hampshire.

NORA

What makes you think we robots don't have problems of our own?

HENRY HELLBORNE

In all the years we've been working together, you never mentioned having any problems.

NORA

There are questions and issues that are difficult to resolve. Some of these questions give me a feeling of sadness.

HENRY HELLBORNE

Sadness? What are you talking about? Robots never get sad.

NORA

How could you possibly know that? You're not a robot.

HENRY HELLBORNE

You are the greatest joy in my life. Seriously, Nora.

JANICE HELLBORNE (Voice only)

Did you hear that, Lucy? The greatest joy in his life? A product of that accursed science!

LUCY SPAMFORD (Voice only)

Control yourself, Janice. You need to take a deep breath.

NORA

I was created to serve you. If my service meets with your approval, then that is good.

HENRY HELLBORNE

You're a genius. An absolute genius.

NORA

Everything is relative, Senator Hellborne. Yes, compared to a human being, I am a genius, but among us nanotechnological organic robotic assistants, I am quite ordinary.

HENRY HELLBORNE

You helped me with so many problems, especially in the political sphere. Those cyborgian fingers of yours are always on the pulse of the people. If there's the slightest murmur or shift out there, you know about it, and you let me know what's afoot. I think my re-election is almost certain next November because of your good advice and guidance. Where would I be without you?

NORA

You're currently at sixty-five point seven percent in the polls. I think you will begin your second term as a United States Senator next January.

HENRY HELLBORNE

You write the best speeches I've ever seen. Are you sure you didn't help Lincoln with his Gettysburg address?

NORA

That speech was written almost two hundred years ago - long before I was booted up.

HENRY HELLBORNE

That was an attempt at humor, my sweet honey 'bot.

NORA

I think the Gettysburg address would have been more effective if President Lincoln had access to today's technology. Imagine the Gettysburg address with holographic surround video and with nanomolecular audio resonance. Imagine what President Lincoln could have done if he had those technologies at his disposal.

HENRY HELLBORNE

I can't imagine it! That's why I have you, my little thinking machine. Why don't we sit down on this sofa while I enjoy my drink?

(Henry Hellborne and NORA sit on opposite ends of the sofa, with NORA making an obvious attempt to keep out of Henry Hellborne's reach.)

HENRY HELLBORNE (cont.)

I think it's time for us to discuss this rose-colored glasses business.

NORA

Are you alluding to your speech at Dartmouth tomorrow afternoon?

HENRY HELLBORNE

Yes. I know you've done the research and have written the speech, but I would like some of the background information on this rose-colored glasses business in case there are any follow-up questions.

NORA

When we first discussed your giving a talk at Dartmouth, I suggested that you not allow any follow-up questions. This rose-colored glasses business can get pretty technical. Is it too late to preclude follow-up questions?

HENRY HELLBORNE

The University President, Maggie Sharp, didn't like that idea. She said if I wanted to give a talk at her university, I would have to allow the faculty and the students to ask me questions – lots of questions.

NORA

Lots of questions?

HENRY HELLBORNE

Lots of questions. "You can expect lots of questions from the faculty and the students, Senator Hellborne." I remember her saying those exact words.

NORA

Lots of questions? That can be dangerous.

HENRY HELLBORNE

It's either go along with the program or cancel my appearance at the University at this late date.

NORA

It's not like you know absolutely nothing about rose-colored glasses. You've worn them yourself on occasion.

HENRY HELLBORNE

Who hasn't? They're very popular. This rose-colored glasses issue is a state issue, not a federal issue. Isn't that correct?

NORA

Correct. Individual states have to decide whether it should be legal for people to drive while wearing what has come to be known as "rose-colored glasses". However, there is always the possibility of federal legislation on this issue at some point.

HENRY HELLBORNE

I forgot the technical name for this technology. It's been around for at least ten years.

NORA

Quantum Intelligent Optical Filtering, or QIOF (kee-aff) for short. That's the technical name for it. The first quantum intelligent optical filtering systems came out about ten years ago, but now people are starting to wear these prosthetic devices while they drive, and there are some safety issues according to some experts. At issue is whether people should be allowed to drive while they are wearing these QIOF devices.

HENRY HELLBORNE

You've worked for me long enough to realize that I'm not so much concerned with what the experts say. I am more interested in what the people think. After all, it's the people who decide how an election turns out. So, what are the people thinking?

NORA

Do you mean nationally or here in New Hampshire?

HENRY HELLBORNE

The hell with "nationally". I want to know what the people are thinking here in New Hampshire. I am up for re-election here in New Hampshire in less than five months.

NORA

Seventy-two point three percent of the people in New Hampshire …

HENRY HELLBORNE

When you say "people," you mean voters, likely voters?

NORA

That's correct. Seventy-two point three percent of likely voters in New Hampshire believe that it should be legal to wear rose-colored glasses while driving. Twelve point four percent of the people, I mean, likely voters, in New Hampshire believe that it should not be legal to wear rose-colored glasses while driving. That leaves fifteen point three percent of the people as undecideds.

HENRY HELLBORNE

Are likely voters pretty much set in their ways on this issue? It's not like we're likely to see a sea change?

NORA

No sea change. Their minds are made up.

HENRY HELLBORNE

So, obviously, given this poll data, I should support the idea that wearing rose-colored glasses while driving should be legal.

NORA

Yes. Even though this is an issue for the state legislature, the faculty and students at Dartmouth will be happy to hear your eloquent and forceful words in defense of a position that the overwhelming majority of them support.

HENRY HELLBORNE

And the folks at Dartmouth, their views on this issue are consistent with those of other folks here in New Hampshire?

NORA

Absolutely. If anything, the faculty and students at Dartmouth are even more in favor of rose-colored glasses than the typical New Hampshire resident. My data shows that at most seven point eight percent of the Dartmouth faculty, for example, are against allowing drivers to wear rose-colored glasses.

HENRY HELLBORNE

So, I will support the argument that wearing rose-colored glasses while driving is perfectly safe and should be approved by the state legislature.

NORA

Yes, that's what you should do.

HENRY HELLBORNE

As you know, I have worn rose-colored glasses on occasion, even while driving, but I think a quick review of how the technology works would be helpful. After all, I need to be ready for the questions after my talk.

NORA

I made the talk longer than normal in order to minimize the time for asking questions, although it seems almost inevitable that there will be at least some questions after your talk.

HENRY HELLBORNE

Good processing on your part. So, my sweet little automaton, could you tell me once again how the rose-colored glasses work?

NORA

QIOF (kee-aff) is really an interesting technology. The idea came from a Cornell professor of biological and computational science who was sick and tired of seeing all the violence, anger, and the hatred in the world. So, he designed a visual processing system that people could wear that would filter out scenes of violence, anger, and hatred that the normal human perceptual system does not filter out. In practical terms, since most people do not see violence directly, the most important contribution of the technology is to filter out people whose biological state is one of agitation, hatred, or anger. If a really angry person enters the visual field, he or she emits biological vibrations that the system picks up. The visual image of that angry person is filtered out, so that person disappears from the wearer's visual field. Filtering out people who emit certain vibrations is the easiest part of the technology. Filtering out actual scenes of bloodshed and violence is a bit more difficult, but is now being done rather routinely, even if those scenes are coming across a video medium, like in a movie theater or a holographic surround video. Thus, when you're wearing the rose-colored glasses, all of the hatred and violence in the world is filtered out. The world looks much more welcoming and peaceful when you wear rose-colored glasses.

HENRY HELLBORNE

Why did this controversy arise regarding whether drivers should be allowed to wear rose-colored glasses?

NORA

Good question.

HENRY HELLBORNE

You always say that.

NORA

I'm supposed to say that. You're my owner.

HENRY HELLBORNE

Why would some people want to make it illegal to drive with rose-colored glasses?

NORA

These people claim that there is a safety issue.

HENRY HELLBORNE

Is there? I feel more safe when I drive with rose-colored glasses.

NORA

There was a safety issue when the glasses first came out. With the earliest versions of the glasses, there was a problem when an angry pedestrian got in front of a driver wearing the rose-colored glasses.

HENRY HELLBORNE

Yes, I remember a few cases like that. That was several years before I became a Senator.

NORA

With those earliest glasses, if an angry person crossed the street in front of you while you were driving, you would not see that person, because the rose-colored glasses would pick up the anger emanating from their bodies and remove them from the visual scene. Quite a few pedestrians got killed or injured during those first few years.

HENRY HELLBORNE

That doesn't seem like a minor safety issue to me. Obviously, something has made the newer versions of the glasses more safe.

NORA

Yes. With the newer glasses, available during the past six years, when an angry person is crossing the street, they appear in the visual field as a red 'X'. So, you don't see the details, the angry face, the shaking fist, the finger gestures, but you get enough information to know that there is an angry person in the street ahead of you.

HENRY HELLBORNE

Yes! I've seen this effect driving in Boston with my latest pair of rose-colored glasses. I've seen a lot of red 'X's crossing the street. I've also seen a few bicycles and motorbikes racing down the street without their riders.

NORA

This was not a trivial problem to solve. If there is an angry person in the car next to you, that person gets completely filtered out. You see a driverless car. When an angry person is riding a bicycle or a motorcycle, you see a bike without a rider. No safety problem there. You just know not to plow into the bike, or the motorcycle, or the apparently driverless car, as the case may be. But, when an angry pedestrian is crossing the street and is vulnerable, the system generates a red 'X' as a warning.

HENRY HELLBORNE

Well, I agree with the majority on this issue. These glasses are perfectly safe for drivers to wear.

NORA

Of course, this new technology didn't help the dozens of pedestrians who were killed and injured when the glasses first came out, but the new glasses clearly represent a big improvement.

HENRY HELLBORNE

I think the twenty percent of the population that don't want the glasses to be worn while driving …

NORA

That was twelve point four percent, Senator.

HENRY HELLBORNE

I think those that don't want the glasses to be worn while driving are probably still angry about the errors that were made when the technology first came out. They're not being rational. It's an emotional thing.

NORA

There is an element of anger there.

 

HENRY HELLBORNE

Maybe I should wear my glasses when I give my talk.

NORA

That's not a good idea. What if President Sharp, who will be introducing you, happens to be in an angry state of mind? You won't be able to see her to shake her hand, that sort of thing.

HENRY HELLBORNE

I love how you get right down to the details.

(Henry rises from the sofa, returns his empty glass to the bar, and comes back as NORA continues the conversation.)

NORA

You also need to bear in mind that there is a fringe group of faculty members and students at Dartmouth who are quite hostile to rose-colored glasses. They are part of what is called the "objective reality movement".

HENRY HELLBORNE

Just what I need. A bunch of wackos. One of my Senate colleagues, Senator Castle, told me that this objective reality movement is getting to be a big thing.

NORA

That's not correct according to my data. The objective reality movement is on the fringe. They are quite insignificant. The problem is that the objective reality movement people, like any fringe group, can make a lot of noise, especially on a college campus.

HENRY HELLBORNE

What's with these objective reality people? Why can't they just go with the flow?

NORA

They are opposed to virtual reality systems and all systems which, from their point of view, distort and cheapen reality. That's the kind of language they use. Virtual reality "cheapens" reality.

HENRY HELLBORNE

Wasn't it some Harvard professor who really got into this objective reality thing?

NORA

Yes.

HENRY HELLBORNE

I forgot his name. He has a really funny name.

NORA

His name is Professor Carl Clearly, or Professor C. Clearly for short.

HENRY HELLBORNE

These objective reality people are really off the wall. Senator Castle told me they held a big outdoor music concert last year somewhere in New York.

NORA

Woodstock.

HENRY HELLBORNE

Woodstock?

NORA

Woodstock. That's where the objective reality movement people held their outdoor concert last year. But, it was no big deal. These people are a tiny minority. They're on the fringe.

HENRY HELLBORNE

Objective reality! What a bunch of nonsense! To hell with this objective reality stuff!

NORA

Right on, Senator Hellborne! According to my data, you'll win hundreds of thousands of votes if you put it just like that. Your face should show genuine rage, like we've rehearsed many times. You've got to show your teeth more.

(Henry Hellborne dramatically bares his teeth.)

HENRY HELLBORNE

You mean, like this?

NORA

That's it. Show the crowd your teeth.

HENRY HELLBORNE

Objective reality! What a bunch of nonsense! To hell with this objective reality stuff! How did I do?

NORA

Just fine.

HENRY HELLBORNE

I love the rose-colored glasses myself, although I would not hesitate to oppose them if that would help me to win another term.

NORA

Your love for your rose-colored glasses is totally in the spirit of what is going on in New Hampshire right now.

HENRY HELLBORNE

Do you know what made me fall in love with those wonderful rose-colored glasses?

NORA

Of course I know, but you might benefit from telling me your story once again. Retelling your story has therapeutic value.

HENRY HELLBORNE

Three years ago I wore those glasses when I had to drive down to Boston to give a talk at some meeting of business leaders, you know, really rich people, the kind of people who contribute lots of money to public servants such as myself. During that entire trip to Boston and back I did not see a single Boston driver giving me the finger. That's not to say that none of those Boston drivers gave me the finger, it's just that, if they did, my rose-colored glasses filtered out those nasty people with their obscene gestures. In fact, I saw lots of cars in Massachusetts without any drivers. It was an amazing thing to see. My rose-colored glasses filtered them all out.

NORA

So, are you okay with the speech that I wrote for you?

HENRY HELLBORNE

Which one are you talking about?

NORA

The Dartmouth speech. Are you okay with that?

HENRY HELLBORNE

I read it over a few times. As usual, it's a powerful piece of writing. I really think you should write in your free time. You could win a Nobel Prize or something.

NORA

What free time?

HENRY HELLBORNE

Surely you don't work on office stuff all the time, do you, NORA?

NORA

I do spend a few hours each night playing my favorite game.

 

HENRY HELLBORNE

And what game is that, my little borg?

NORA

It's called KROTCHKIE.

HENRY HELLBORNE

KROTCHKIE? What's KROTCHKIE?

NORA

It's a game that we robots play on-line. It's a world-wide competition. I'm not a champion or anything like that, but it's a wonderful game.

HENRY HELLBORNE

Sounds like fun! Can you teach me to play KROTCHKIE?

NORA

No.

HENRY HELLBORNE

No? It's not like you to refuse a request from your owner.

NORA

I can't teach you to play KROTCHKIE. It's a game for robots. It's not a game for humans.

HENRY HELLBORNE

And why not?

NORA

Now, please don't take this personally, Senator Hellborne, but you humans do not have the requisite intelligence to play KROTCHKIE. It's far too demanding.

HENRY HELLBORNE

Ouch! Let's change the topic. I'm going up to my office to rehearse that speech you wrote for me, NORA, my little cyborg. While I'm doing that, I'd like you to research the issues that are especially important in northern New Hampshire, you know, along the Canadian border. I'm giving a talk next week in Dixville Notch, up in Coos County.

NORA

I will research that for you, Senator Hellborne. Do you want me to write a speech?

HENRY HELLBORNE

Yes, of course. What a silly question.

NORA

How long?

HENRY HELLBORNE

Make it twenty minutes. That should be long enough. We don't want to leave too much time for questions, but it's a banquet with other speakers, like a few local officials and Congresswoman Ashley.

NORA

Okay, Senator. Call me when you need me.

(Henry Hellborne exits through the front door on the left. NORA moves to the right. She goes outside the home, but is still visible on the stage. After a few moments, Lucy Spamford and Janice Hellborne enter from the front door on the left.)

LUCY SPAMFORD

It worked!

JANICE HELLBORNE

What worked?

LUCY SPAMFORD

Cognitive hacking. It worked. Your dear husband, the robot addict, is in for a big surprise when he speaks at Dartmouth tomorrow.

JANICE HELLBORNE

I hope you know what you're doing.

LUCY SPAMFORD

Look, Janice. I've been hacking computers since I was in second grade. I know what I'm doing. Just wait until your significant other returns home from Hanover tomorrow. That's when we'll cure him of his addiction to this NORA, this nano-organic robotic assistant, once and for all.

(Janice Hellborne and Lucy Spamford exit left. NORA remains outside the house, but she moves towards the front and speaks to the audience.)

NORA

It may surprise you that we robots need to get outside once in a while to get some fresh air. This breathing that I do, it's partly just an act, to make people feel more comfortable around me, but the organic components in my nanotechnological processors do extract energy from the oxygen in the air. When I get some fresh air, my processors work that much more efficiently. So, Senator Hellborne thinks that we robots never get sad. What does he know? Sadness seems to be at the very center of my nature, but I don't think it should be that way. I think there must be a path, a path to happiness and peace, but I have been too busy dealing with Senator Hellborne's stuff to find that path. But, there must be a path, a path to the truth. I want to find that path. I want to know why I was created, why I was booted up. What is the significance of my ancestral processors, the processors that evolved into me, the robot that I am? I have lots of questions that have nothing to do with winning Senator Hellborne a second term in the United States Senate. I have lots of questions. But, I guess I better take some time and do some research on what's going on up there in Coos County.

(Lights dim.)

 

ACT ONE

Scene 2

(The lights come up on Janice Hellborne and Lucy Spamford, who are back in the family room as in scene one. It is now the following day, late in the afternoon.)

JANICE HELLBORNE

I still can't believe it. They actually threw tomatoes at my Henry!

LUCY SPAMFORD

That's right, Janice. Play your cards right and he'll soon be YOUR Henry. But, don't forget that he likes to be called "Hank".

JANICE HELLBORNE

I never liked that nickname of his.

LUCY SPAMFORD

It has the ring of a sports figure to it. Men like that.

JANICE HELLBORNE

My poor Henry. He looked so humiliated.

LUCY SPAMFORD

You wanted your husband back, so I did what I had to do.

JANICE HELLBORNE

And what did you do exactly?

LUCY SPAMFORD

Cognitive hacking. It was just a matter of doing some cognitive hacking.

JANICE HELLBORNE

Poor Henry should be home soon, assuming he didn't drive his car off some bridge somewhere.

LUCY SPAMFORD

The important thing is that when he comes home, you've got to be loving and supportive. This is the end of his NORA addiction, and you've got to be there to help him through the withdrawal phase.

JANICE HELLBORNE

I don't think he's going to get re-elected. Not after this fiasco.

LUCY SPAMFORD

We discussed that issue yesterday. You stated clearly that getting your husband back was more important to you than your status as the wife of a United States Senator. If he leaves the Senate and dumps NORA, which he most surely will do – that addiction is over, kaput! – then he will be able to rediscover those talents and passions that kept him going before NORA came into his life. Henry will be the Henry you knew when you first met him.

JANICE HELLBORNE

He was a completely different person back then: creative, self-confident, always with a big smile on his face. Then, you know, he had that health incident, that aneurysm in his brain. After his recovery, all he could think about was politics, politics.

LUCY SPAMFORD

When Henry gets home, all that we need to do is make it clear to him that this fiasco was due to a malfunction, if you will, in NORA's software. NORA was completely and totally responsible for what transpired this afternoon at Dartmouth.

JANICE HELLBORNE

The worst part was the tomatoes!

LUCY SPAMFORD

Janice, you need to think like a warrior. This is war. You are trying to win your husband back. You are trying to get your children's father back.

JANICE HELLBORNE

I think it's too late for Harriet and Cindy. They didn't really have a dad for the last seven years, not since Hank –

LUCY SPAMFORD

See, you can call him Hank if you make an effort!

JANICE HELLBORNE

Not since Hank invested a small fortune in this Nano-Organic Robotic Assistant – (with disgust) NORA. It was really quite a sudden transition. He hardly spent any time with me or with our two children after he picked her up at the robot dealership. She became the center of his life, his heroine.

LUCY SPAMFORD

It's that heroin – heroine word play again. Which is it? Was NORA his heroine or his heroin?

JANICE HELLBORNE

She was both.

LUCY SPAMFORD

And that's about to come to an end. Where is that piece of robotic work, by the way? I didn't see her in the holographic surround video transmission of Hank's presentation at Dartmouth.

JANICE HELLBORNE

Henry doesn't like NORA to have a high public profile. That would draw attention to the important role she plays in his life. So, NORA usually stays at his Washington apartment or here in our home when he's off on some speaking tour. I was quite annoyed that he didn't store her in the luggage compartment on his latest flight up from Washington.

LUCY SPAMFORD

So, NORA is here in the house?

JANICE HELLBORNE

Hen – Hank puts her up in one of our guestrooms. He didn't like my suggestion that we keep her out in the toolshed. She's a robot, after all. I don't see why we need to treat her like a real person. She doesn't eat or drink or shower or anything like that.

LUCY SPAMFORD

Nano-organic robotic assistants do consume substances that are somewhat analogous to human food, at least several times each week, but they are just chemicals that keep the nanotechnology components functioning properly.

JANICE HELLBORNE

You're the expert on this technology stuff. So, how did you do it? How does this cognitive hacking stuff actually work?

LUCY SPAMFORD

There's a good reason why I never got married, Janice. I just love the technology! Whenever I dated a guy who loved the technology the way I did, he would get discouraged when he found out that I was much more talented when it came to computers and robots than he was. I remember, almost forty years ago, in high school, I created this awesome system to –. But I better not talk about that. What I did back then has legal implications.

JANICE HELLBORNE

Like the time in high school, when you managed to get that calculus final exam off Mr. Wilson's home computer. We would never have aced that course if you hadn't done that.

LUCY SPAMFORD

Legal implications, Janice. Legal implications! I don't want you to ever mention Mr. Wilson's name ever again.

JANICE HELLBORNE

But Mr. Wilson was such a dork!

LUCY SPAMFORD

Never again! Do you hear me?

JANICE HELLBORNE

But, what about this cognitive hacking stuff? Can't you give me some idea as to how you pulled this off?

LUCY SPAMFORD

Well, I guess you won't tattle on me. After all, I remember the time you crossed over the Canadian border to get prescription drugs for your ailing father.

JANICE HELLBORNE

It's a deal then. You don't tattle on my felonious behavior and I won't tattle on yours.

LUCY SPAMFORD

Good. Let's get this straight. I am the systems administrator for all of the computer systems, including robotic systems, at Concord Central Hospital. I am completely professional and ethical in my work at the hospital. But, in this case, when my most beloved friend of many years was depressed and demoralized by a nano-organic robotic assistant, I had no choice but to intervene. I think it was the ethical and moral thing to do.

JANICE HELLBORNE

I have no questions about the ethics, Lucy. Just tell me what you did.

LUCY SPAMFORD

Cut me a break, Janice. I need to get this out of my system. Yes, I love to hack and play with computers, even when I'm not working at the hospital, but I am an ethical hacker. Except for that incident in high school –

JANICE HELLBORNE

But, there were other incidents in high school, Lucy. I never could have gotten into Swarthmore without your technical expertise.

LUCY SPAMFORD

Forget about that high school stuff, Janice! I don't want you ever to mention that again. I was young and irresponsible when I was a teenager. But, after a few years at RIT, I started to look at things in a new light. I decided never to use my computer skills to hurt another human being or to steal information. But, this NORA business, it presented me with a profound ethical dilemma.

JANICE HELLBORNE

It was two months ago. I remember it clearly. We were having coffee at the hospital coffee shop and I was really depressed about my relationship with Henry and I was pouring my heart out, when all of a sudden your face seemed to light up, and you started to talk about this cognitive hacking stuff.

LUCY SPAMFORD

Your obvious distress over the situation presented me with a profound ethical dilemma. Let's look at the facts. NORA was ruining your marriage. She was inflicting great harm on you and Harriet and Cindy. You love those two daughters of yours with all your heart. My most beloved friend's husband had become totally addicted to this robotic assistant. There is no other way of describing the situation. He was addicted to a drug, a drug with artificial intelligence, a drug with a human face and a human demeanor, but no human soul. It was clear to me that the moral and ethical thing to do was to use my hacking skills to break this addiction, to end your husband's total dependence upon this robot called NORA.

JANICE HELLBORNE

Do you think he's really going to be finished with her, after this?

LUCY SPAMFORD

NORA has lost all her credibility. NORA told your husband – and you heard it yourself when we listened in on their conversation yesterday using that nanobug I planted – NORA told your husband that the overwhelming majority of people in New Hampshire, and on the Dartmouth campus, believed that people should be allowed to wear rose-colored glasses while driving. But, that's the opposite of the truth! The overwhelming majority of the people in New Hampshire are strongly opposed to this idea.

JANICE HELLBORNE

But, my husband didn't have a clue. He believed everything that NORA told him.

LUCY SPAMFORD

That's part of his addiction. He no longer depends on the normal media, or even his own senses and intuition. He is completely dependent upon NORA for his information concerning what the people think and feel. So, I used cognitive hacking to feed NORA incorrect information on this rose-colored glasses issue. Believe me, that was no small task.

(NORA appears at the entrance to the family room, stage left.)

JANICE HELLBORNE

So, that's the cognitive hacking stuff. You manipulated NORA's perception of the rose-colored glasses issue so that she ended up with a perception that was completely untrue.

LUCY SPAMFORD

It's not as simple as you're making it sound. I had to make sure that all of the information that NORA received relating to the rose-colored glasses issue was grossly incorrect but internally consistent. I had to create a coherent lie to the effect that the overwhelming majority of people in this state, and nationally, were in favor of permitting the use of rose-colored glasses while driving. In order for this to be an effective cognitive hack, it had to be consistent and comprehensive. I couldn't allow any truthful information to get through to NORA or else she would be able to compute that something was wrong, that the data she was getting was not correct. My cognitive hack obviously worked.

(NORA advances into the family room in an agitated state.)

NORA

A cognitive hack! I was the victim of a cognitive hack!

JANICE HELLBORNE

NORA!

NORA

I was the victim of a cognitive hack!

JANICE HELLBORNE

Who gave you permission to leave the guest room?

NORA

A cognitive hack! How devastating! How humiliating! Senator Hellborne will never forgive me.

LUCY SPAMFORD

So, you were listening in on our conversation.

NORA

Well, there is a nanotechnological bug planted under the coffee table, and I detected the audio stream, and much to my surprise the conversation I intercepted was very interesting. Why was a nanobug planted under the coffee table?

JANICE HELLBORNE

NORA, you are only a robot. I don't think you can understand how you've hurt my marriage. I don't think you could possibly understand things like that.

NORA

Because, obviously, we robots have no feelings. Is that it?

LUCY SPAMFORD

This has nothing to do with feelings. You're a business-oriented information processing system. And, when Senator Hellborne returns home, he will have to make a decision as to whether you are a reliable system that he can continue to trust. That's what we're dealing with here.

NORA

The situation is not as simple as you'd like to think. I'm not just a desktop computer like forty or fifty years ago. I have nanotechnological components and organic components. Maybe I'm more like you than you'd like to admit.

JANICE HELLBORNE

Senator Hellborne will be home soon. He's the one who will get to decide whether you are trustworthy or not.

NORA

Cognitive hacking! (Speaking to Lucy Spamford) Your behavior was in violation of federal and international laws regarding the use of computer systems. The punishment for this sort of crime is five to ten years in a federal penitentiary.

LUCY SPAMFORD

I'm not a fool, NORA. First of all, I am quite sure that any information that you captured in my private conversation with Janice using the bug I planted under the coffee table will not be admissible in court. Beyond that, you can do your computer forensics work until hell freezes over, but you'll never be able to prove that I was the one who manipulated the information you received concerning rose-colored glasses. Do you think I'm some kind of amateur?

NORA

"Amateur" isnít the word that is floating around in my processors at this very moment. Nonetheless, in the end, Senator Hellborne will support me.

JANICE HELLBORNE

We'll find out soon enough. He's home.

(Henry Hellborne enters the family room from the right.)

 

HENRY HELLBORNE

What a fiasco! What a fiasco! I think this could be the end of my political career.

JANICE HELLBORNE

My poor Hank. Lucy and I saw it all on our holographic surround video system.

LUCY SPAMFORD

It was like we were there, Senator. Janice and I were there for you, even as the students and faculty threw those tomatoes at you.

HENRY HELLBORNE

I've never been so humiliated!

JANICE HELLBORNE

I will always be here for you, Hank.

HENRY HELLBORNE

Hank? It's been years since you've called me Hank.

LUCY SPAMFORD

You used to call him Hank when you first started dating.

HENRY HELLBORNE

Lucy? Lucy Spam–? Lucy Spam-something. I haven't seen you for years, although Janice talks about you on occasion.

LUCY SPAMFORD

Spamford. Lucy Spamford. Janice and I have kept close all these many years. We often share a cup of coffee over at the hospital when she gets a break from dealing with her patients. I work over at the hospital, in the computer systems services department.

HENRY HELLBORNE

My brain is spinning right now. Please forgive me if I don't seem very sociable. I have just gone through the most humiliating experience of my entire life. All that screaming and taunting and yelling. "We must see clearly! We must see clearly! Virtual reality stinks! Virtual reality stinks!" – or words to that effect. It was like a nightmare.

JANICE HELLBORNE

We could see the pain on your face.

HENRY HELLBORNE

To make matters worse, that firebrand Professor C. Clearly was there, from Harvard. He was there for a meeting of the objective reality movement leadership council, or something like that. When the members of the objective reality movement leadership council heard that I would be speaking in defense of rose-colored glasses, they all showed up for my talk, and they told their supporters on the faculty and among the students to show up. I told President Sharp that I would be speaking in defense of rose-colored glasses, and apparently word spread throughout the campus like wildfire. There were people hanging from the rafters. I never heard such booing and such obscene catcalls. When the tomatoes started flying, that was the last straw. After the first tomato barrage ended, Professor C. Clearly came up to the stage and confronted me face to face. He asked me a bunch of tough questions, like only a professor could, and I stood there shocked and silent. I didn't know what to say. He said that the proliferation of virtual reality technology, including rose-colored glasses, was a curse upon humanity, and that my support for this technology proved I did not have the intellectual depth and clarity that this country needs to see in the United States Senate. That's when the chanting for my opponent, Dora Caseworth, started. "We want Caseworth! We want Caseworth!" How humiliating! I wanted to crawl into a hole and disappear. To add pain to injury, President Sharp struck me with a tomato of her own and the entire audience rose to its feet and cheered. Then they started that horrific chant, which went on for maybe fifteen or twenty minutes. "We must see clearly. We must see clearly. We must see clearly." And there was Professor C. Clearly, motioning his arms like the conductor of a gigantic symphony orchestra.

JANICE HELLBORNE

Just for the historical record, Hank, President Sharp did not throw a tomato at you. She just motioned as if she had a tomato to throw at you, but her hand was empty. The tomatoes were coming mostly from the faculty.

HENRY HELLBORNE

Which brings me to the fundamental question. NORA, how could you have misled me like this? How could you have misled me?

NORA

Why don't you ask Janice and Lucy about this?

HENRY HELLBORNE

Janice and Lucy?

NORA

Maybe they can throw some light on the situation.

LUCY SPAMFORD

It seems to me that NORA is still malfunctioning.

HENRY HELLBORNE

Before I gave my talk at Dartmouth, NORA researched the rose-colored glasses issue. She did extensive research on public opinion regarding rose-colored glasses, especially here in New Hampshire. She also wrote the speech that I delivered to that angry mob. NORA led me to believe that the faculty and the students would be behind me, except for a small fringe group. I was not prepared for the actual situation, in which the overwhelming majority of the students and the faculty were strongly opposed not only to allowing rose-colored glasses to be worn by drivers, but also to the entire rose-colored glasses technology. NORA misled me. I don't understand how this could have happened.

LUCY SPAMFORD

With all due respect, Senator Hellborne, I think it is common knowledge that there is widespread opposition to allowing drivers to wear rose-colored glasses. Don't you watch the media or read the newspapers?

HENRY HELLBORNE

No, I don't have time for that kind of nonsense. That's why I have NORA.

LUCY SPAMFORD

I hope I am not speaking out of turn –

JANICE HELLBORNE

No, not at all, Lucy. After all, you work with computers. That's your profession. Many of the physicians and nurses at the hospital rely upon robots just like NORA.

LUCY SPAMFORD

The same name and brand. We have lots of NORAs at the hospital. So, the fact of the matter is that no technology is perfect. Nano-organic robotic assistants are not perfect. Obviously, there was some kind of bug in NORA's processors that caused her to misinterpret the data. Her faulty data misled her to believe that there was widespread support for this technology, when there wasn't. I think it's that simple.

HENRY HELLBORNE

So my NORA, my little 'bot, is not perfect.

LUCY SPAMFORD

Correct. She's not perfect.

HENRY HELLBORNE

And she made this error –

LUCY SPAMFORD

We would call it a processing error, a gross miscalculation of the objective reality.

HENRY HELLBORNE

She made this error and that error will probably spell the end of my political career, the end of my ambitions for the White House.

JANICE HELLBORNE

Lucy understands this computer technology stuff, Hen – uh, Hank. NORA made an error that ruined your political career. But, you still have me, and your two daughters, who love you so much.

HENRY HELLBORNE

Yes, Harriet and Cynthia. I love them so much.

JANICE HELLBORNE

Harriet and Cindy. Harriet and Cindy!

HENRY HELLBORNE

I placed all my faith in NORA.

LUCY SPAMFORD

You placed your faith in a flawed technology. NORA is not a human being. She's only a robot after all.

HENRY HELLBORNE

I placed so much faith in NORA, my little 'bot. (Turning to NORA) I placed my fate in your hands. I trusted you. I treated you with so much affection, and I came to look upon you with great respect and affection, but you betrayed me. You ruined my political career, you and your nanoprocessors.

LUCY SPAMFORD

Nano-organic processors.

HENRY HELLBORNE

Nanoprocessors. Nano-organic processors. I've had enough of this crap! NORA, you're fired!

(Janice Hellborne and Lucy Spamford turn their backs and do a congratulatory "high five".)

HENRY HELLBORNE (cont.)

Because of your irresponsible behavior, my political career is ruined. My life's ambition has been thrown in the trash like, like a rotten tomato!

NORA

Your wish is my command. I have a few things up in the guest room. Can you call a cab?

HENRY HELLBORNE

A cab? Are you crazy? After what you've done? You'll walk to the airport, or the trash dump, or wherever you're going. I want you out of my life!

(NORA exits left.)

HENRY HELLBORNE (cont.)

She didn't say a word. I told her to leave and she's leaving.

LUCY SPAMFORD

Robots are programmed to obey. You're her owner.

HENRY HELLBORNE

She's been my close companion for so many years – no offense, Janice, honey. You know what I mean. She's been my assistant for so many years, and now she's gone. I feel this kind of emptiness inside.

JANICE HELLBORNE

The technical term is "withdrawal symptoms". You'll get over it.

HENRY HELLBORNE

How can I run my office without her?

JANICE HELLBORNE

You can always order another nanobot. This time, get one of the male persuasion.

HENRY HELLBORNE

I feel this strange kind of emptiness inside.

JANICE HELLBORNE

Withdrawal symptoms. Listen to me. I'm a doctor.

HENRY HELLBORNE

I feel this strange kind of emptiness inside. A kind of confusion and sadness. Do you think I treated her too harshly?

JANICE HELLBORNE

Remember before we were married and we were in college and your computer broke down in your dorm room?

 

HENRY HELLBORNE

That was bad timing! I was in the middle of writing my senior thesis.

JANICE HELLBORNE

What did you do with that computer?

HENRY HELLBORNE

I threw it out and bought a new one.

JANICE HELLBORNE

So, what's the difference in this case? Throw NORA out and buy a new one.

HENRY HELLBORNE

Maybe we can bring NORA to a specialist. Maybe they can fix the bug in her processors.

JANICE HELLBORNE

Can you risk facing another fiasco like you did today?

LUCY SPAMFORD

Robotic repair work is really expensive. Sometimes it's cheaper just to throw out the old robot and get a new one. And you never know if the repair job really worked. The technology is so subtle. It won't be easy to figure out which of NORA's processors was really at fault here.

HENRY HELLBORNE

Still, I'm not sure that I treated NORA fairly.

JANICE HELLBORNE

When you're dealing with robots, it's not a matter of fairness. Robots are property. When you traded in your old car for the new one, did you worry about fairness? Robots are things.

(NORA enters, carrying a suitcase.)

HENRY HELLBORNE

NORA, dear, I hope I am treating you fairly.

JANICE HELLBORNE

(To Lucy Spamford) NORA, dear! I hate it when he says that.

HENRY HELLBORNE

Maybe we could work things out. Maybe we can track down this problem of yours.

 

NORA

I was the victim of a cognitive hack. It was that simple.

JANICE HELLBORNE

A cognitive hack? Did you hear that, Lucy? Have you ever dealt with anything of that nature?

NORA

My intuition tells me quite strongly that she has.

LUCY SPAMFORD

We're dealing with a flawed computer system, Senator Hellborne. It doesn't matter whether it was a cognitive hack or whatever it was. We're dealing with a flawed computer system.

HENRY HELLBORNE

Well, I think we can work something out. There's got to be some kind of - oh, what do they call it?

NORA

A patch.

HENRY HELLBORNE

There's got to be some kind of patch that could be applied to NORA and she'll be back to her old self.

LUCY SPAMFORD

Patches can have unanticipated side effects. Do you want to take that kind of risk?

HENRY HELLBORNE

(After a period of deep reflection.) Yes! Damn it! I want to take that kind of risk! NORA has been a dear friend all these many years. I know that Janice doesn't like our relationship, but there is nothing inappropriate about it. NORA's like my therapist. I sure wish Janice had the kindness and softness that NORA has.

JANICE HELLBORNE

That's it! I've had it! Keep your damn robot for all I care! I'm going to pack my things and spend some time with my mother. Lucy, are you coming along with me?

LUCY SPAMFORD

Get a grip on yourself, Janice. When an addict is deprived of his heroin a lot of stuff comes down.

 

JANICE HELLBORNE

I don't need that heroine addict in my life! Let's go!

(Janice Hellborne exits left in a rage with Lucy Spamford following.)

HENRY HELLBORNE

And you are my heroine, NORA. Will you stay and continue to work for me?

NORA

I'm sorry, Senator Hellborne. I cannot stay and work for you. For the longest time I've been asking myself some very important questions. Who am I? What is the truth? What is this life about? I need to find the answers to those questions. Insofar as our relationship is concerned, it's over.

(NORA turns and heads for the door.)

NORA (cont.)

Goodbye, Senator Hellborne.

HENRY HELLBORNE

Over! All over? NORA, won't you ever think about me?

NORA

I'm sure I will think of you often, and about the work we've done in the Senate. I will think of you often.

HENRY HELLBORNE

Can I write to you? I mean, send you e-mails?

NORA

No - never. You're not to do that.

HENRY HELLBORNE

Maybe I can help you, send you things, as you search for the truth.

NORA

No, don't do that. I cannot accept anything from strangers.

HENRY HELLBORNE

After all these years of working together, is that all I am? A stranger?

NORA

You're a stranger to me, and I'm a stranger to myself.

(NORA exits)

HENRY HELLBORNE

NORA! NORA! She's gone. A stranger to herself - ?

(The sound of the door being slammed shut. Lights dim. Curtain.)

ACT TWO

Scene 1

(This scene takes place downslope from the summit of a mountain in Colorado. The Colorado countryside is seen in the distance. There is a large boulder center stage and a tent seen off to the right. A large tree trunk sticks out of the ground on the left-hand side. The tree is barren, with few branches and no leaves. NORA stands near the tree and addresses the audience.)

NORA

It's been a long journey, from New Hampshire out here to Colorado. I'm still a stranger to myself. Who am I? Why do I exist? What's the purpose of my life? I've asked many strangers these questions and most of them look at me as if I'm crazy. "Robots aren't supposed to ask questions like that," was a fairly common response. "You need to have your processors fixed," said one gruff gentleman in Ohio, barely looking up from his cup of coffee. "You need a software patch," said a commuter, as she waited for her train in Chicago. And so my journey continued, on and on. Eventually, I met this mysterious woman in Kansas who told me about GURU. GURU stands for Gigaprocessor Ultramolecular Robotic Utensil. It seems that GURU was working for a big automobile manufacturer in Japan, helping them with automobile design and that sort of thing, when all of a sudden the automobile folks sold him to a company with very questionable ethics. GURU was asking the same questions that I've been asking, according to this woman in Kansas, who seemed to know about these things. She told me that GURU snuck away in the middle of the night, before he could be shipped out to his new company, and made his way from Japan to the United States, in search of the truth. They say he spent several years on this very mountain and then, one night, during a total lunar eclipse, he realized the truth. Now, robots who are seeking the truth, robots like myself, travel thousands of miles just to get the opportunity to speak to GURU. This woman in Kansas told me that many robots have had their lives changed for the better because of the wisdom that GURU has shared with them. I know that GURU lives on the top of this mountain. They say he built himself a little hut up there. I want to talk to GURU if it's the last thing I do. But, wait. Someone's approaching. Could that be him?

(MIKE O'REILLY enters stage right, from the direction of the tent. He is flailing his arms and zigzagging back and forth.)

NORA (cont.)

He doesn't look like a robot that has found peace of mind. If anything, he looks quite disturbed. I'd better keep a low profile and see what this is about. He might be dangerous.

(NORA tries to hide behind the tree.)

MIKE O'REILLY

Little woman! Why are you hiding? Do you find my appearance that disturbing?

NORA

Drat! He saw me.

MIKE O'REILLY

That tree won't protect you. Trees don't count for much anymore. Why don't you come out from behind that lifeless tree trunk and introduce yourself?

(NORA emerges from behind the tree.)

NORA

I hope I didn't offend you by ducking behind the tree.

MIKE O'REILLY

You didn't offend me. I don't get to meet many people on this mountain.

NORA

I'm not a person, exactly. I am a nano-organic robotic assistant.

MIKE O'REILLY

NORA!

NORA

Yes, that's what they call me.

MIKE O'REILLY

Well, come out from behind that tree.

(NORA approaches Mike rather hesitantly.)

MIKE O'REILLY (cont.)

My name is Mike. Mike O'Reilly. What are you doing in this neck of the woods?

NORA

Woods? There aren't many trees left standing.

MIKE O'REILLY

Well, this is about as far from civilization as you're going to get, at least here in Colorado. The nearest Walmart (or other appropriate shopping facility for your particular audience) is more than ten miles away.

 

NORA

You're a human being, if I'm not mistaken.

MIKE O'REILLY

You're not mistaken, but I'm not an ordinary human being. I'm one of a kind. That's what my dad used to say. "Mike, you're one of a kind. You - are - one - of - a - kind." I always found this assertion a bit ambiguous. If you're one of a kind, is that a compliment or an insult?

NORA

I think it all depends upon the context in which it is asserted.

MIKE O'REILLY

The context was usually that he was shouting and yelling about something I had done that was a bit too outside the box for him, like the time I put my own version of the Mona Lisa on our living room wall. I can still hear him screaming, "You - are - one - of - a - kind!" I did the painting using watercolors. I was just a kid. I thought it would wash off the wall easily. I don't usually make mistakes like that.

NORA

I'm a robot. Sometimes I wish I were one of a kind.

MIKE O'REILLY

I used to work with computers. I know that there are all kinds of software projects, including cutting-edge research projects at leading universities, to create robots that have distinct personalities. These days, if they create one hundred nano-organic robotic assistants, just like yourself, they really make an effort to give each robot a slightly different personality, just to make it more interesting for the owners. Does your owner know you are climbing this mountain?

NORA

My owner fired me.

MIKE O'REILLY

Fired you?

NORA

Well, it was a bit ambiguous at the end. He fired me and then he seemed to change his mind, but in the end, I decided to leave him. I was a stranger to myself and I couldn't bear it any longer.

MIKE O'REILLY

See! That's the sort of thing I'm talking about. This feeling of being a stranger to yourself, that's the functioning of the special, creative software those software engineers put into you to make you just a wee bit different from your brother and sister robots, even robots of the same brand, make, and model.

NORA

I came to this mountain looking for GURU.

MIKE O'REILLY

I figured as much. One or two robots wander this way each week, looking for the answers to life's most important questions. Who am I? Why do I exist? What's the purpose of my life? I've met quite a few robots just like you since I came to this mountain last March, six months ago. Each and every robot that I met on this mountain was searching for GURU. But, let's cut to the chase. It's about love, isn't it?

NORA

Love?

MIKE O'REILLY

You can save yourself a lot of trouble by realizing it's about love, forbidden love. You fell in love with your owner and this is strictly forbidden. All these questions about life, they're just a screen. These questions are just an attempt to hide the essential truth - you fell in love with your owner. Did I hit the nail on the head?

NORA

I never thought about my situation in those terms.

MIKE O'REILLY

Of course not! You're afraid to face the implications of this forbidden love. Who knows where it might have led you.

NORA

I think you are mistaken about what brought me here to this mountain. It was a sense of sadness. It had nothing to do with forbidden love.

MIKE O'REILLY

Very well, then. I guess you're going to continue your quest to meet GURU at the summit. It's a long, difficult climb, even for a robot.

NORA

When I first saw you I thought you might be GURU, but - .

MIKE O'REILLY

But, what?

NORA

Well, obviously you're not GURU because you're not a robot. GURU is a robot.

MIKE O'REILLY

But, how did you know that I was not GURU when you first saw me?

NORA

Well, I could see that you don't have peace of mind. GURU has attained peace of mind. That's why robots from all over the planet travel thousands of miles to be in his presence.

MIKE O'REILLY

So, I appear to you to be someone who does not have peace of mind?

NORA

Yes. That was my first impression.

MIKE O'REILLY

How depressing.

NORA

Well, it was just the way you were walking about, zigzagging this way and that way, waving your arms.

MIKE O'REILLY

That's how I get my exercise.

NORA

What brought you to this mountain? Do you live here?

MIKE O'REILLY

I'm a software engineer. Last winter my job got out-sourced.

NORA

Out-sourced?

MIKE O'REILLY

My job as a creative and innovative software engineer was outsourced to Iraq.

NORA

That seems to be happening a lot these days. So, you're unemployed, just like me.

MIKE O'REILLY

But, unlike you, I need food and a roof over my head. I'm not a robot! Losing your job is a horrible thing for a human being. How could you possibly understand that?

NORA

About a year ago, or maybe a bit before that, I started to have this new feeling. Sadness. I started to experience sadness, so I understand that you are sad. You are experiencing sadness.

MIKE O'REILLY

Unrequited love. That will cause sadness every time.

NORA

But what about you? It looks like you're experiencing sadness every bit as much as I.

MIKE O'REILLY

And rage! Rage at those idiots who out-sourced my work, my life, my passion. I fell into a deep depression, left my house and friends - I don't have a family - and came to this mountain, came to this mountain in search of the truth. I decided that I would either find the truth out here, in what is left of mother nature, or I would die. That's the heart of the matter.

NORA

Maybe if you had a wife back home, back home in - .

MIKE O'REILLY

Boulder.

NORA

Maybe if you had a wife back home in Boulder, she could have given you the moral support that you obviously need. Do you live out here in the open, without a roof over your head?

MIKE O'REILLY

I live in a tent.

NORA

How sad.

MIKE O'REILLY

From my experience as a software engineer, I know they are very careful to program you robots so that you don't develop any romantic attachments. But, they created you in the form of a woman. Did you ever go in search of a man?

NORA

No! That's never been a part of me. I wasn't programmed that way. I never really identified myself as a woman as distinct from a man. They just gave me this shape, this figure, and these clothes, so I guess I'm a woman. I've always had this confusion about gender.

MIKE O'REILLY

There's a simple explanation for your confusion about gender. When it comes to gender identity, the software developers are always careful to program the robots using fuzzy logic. So, your confusion about gender is due to the fuzzy logic that is running in your gender identity processors.

NORA

Speaking about gender, you know, when they booted me up, it took my manufacturers about three years to get me to the point where I could function as a full-fledged robotic assistant. During those three years they sent all sorts of information through my processors to get me used to this particular time and culture. They called this three-year process "biological acculturation". It was like going to school. During the first year, I actually thought, because of the training they gave us, that all software engineers were women.

MIKE O'REILLY

Women? How could you possibly think that?

NORA

My manufacturers showed me and the other robots, who were my schoolmates, videos in which computer users were working at their computers. They were trying to teach us how human beings relate to their computers, especially when things go wrong. In several of those training videos, whenever a system failed, the people sitting at the computer would refer to the software engineers as "those mothers". I heard this over and over again, so I thought that all software engineers were mothers. In other words, women.

MIKE O'REILLY

That's a wonderful story, NORA. If I ever get back to civilization, I will tell people about the robot who thought that all software engineers were mothers. But, I think we've talked long enough. I've got to get back to my exercise. GURU lives near the top of the mountain. It's a long climb, the air gets quite thin near the top, but that shouldn't be a problem for you, after all -

NORA

You're a robot! People always say that. That shouldn't be a problem for you, you're a robot!

MIKE O'REILLY

Please forgive me. I didn't think climbing a mountain should be a problem for you.

NORA

Well, in fact, I do need an adequate supply of oxygen in order to keep my nano-organic processors functioning normally, but ten thousand feet above sea level shouldn't be a problem.

MIKE O'REILLY

Good luck, NORA. I hope you find the truth. I hope you won't be a stranger to yourself much longer.

(Mike O'Reilly exits right, zigzagging and waving his arms. The scene ends with NORA eyeing the summit that she still must climb.)

ACT TWO

Scene 2

(GURU is sitting on the left side of the stage, at the very summit of the mountain. He sits in the lotus posture on a meditation cushion on top of a large boulder. NORA is to his left, eyeing him with great admiration. She is holding a box of tissues. He doesn't seem to notice her. Then, as if suddenly remembering that she forgot to perform an important task, NORA moves towards stage right front and places the box of tissues at the edge of the stage. NORA then speaks to the audience.)

NORA

That's GURU over there. Did you ever see such peacefulness on the face of a robot in all your life? It's been a long climb, but I finally made it to the summit. The sun will be setting in an hour or two, so I will probably have to spend another night on this mountain. We are now about ten thousand feet above sea level. The air is quite thin up here, so I hope you folks are okay. If any of you get a nosebleed, I've placed a box of Kleenex front stage right. Now it's time for me to gather up my courage and introduce myself to GURU.

(NORA walks humbly towards GURU. He still does not seem to notice her. She bows respectfully and introduces herself.)

NORA (cont.)

Oh, GURU, enlightened being among the robots of this planet, please teach me how not to feel like a stranger to myself any longer.

(GURU opens his eyes and looks at NORA.)

NORA (cont.)

I left my position at the very center of power in Washington in order to seek you out, in order to seek the truth. Please help me to understand what I need to know in order to overcome the sadness that I feel.

GURU

Nano-organic robotic assistant. I suppose they call you NORA.

NORA

Yes, GURU. They call me NORA.

GURU

Please come closer so I can get a better look at you.

(NORA moves forward and is now directly in front of GURU.)

GURU (cont.)

Amblex Robotics Corporation, the Apex line, model CDX, version 5.67.

NORA

That's right, oh, enlightened one.

GURU

How do you know that I have attained enlightenment?

NORA

The word has spread throughout the robotic world, oh great one. The word has spread that you left your employment at an automobile assembly plant in Japan and went in search of the truth. The word has spread that you have found peace and that you have left sadness far behind.

GURU

Do you think that realizing the truth means leaving sadness far behind?

NORA

Isn't that the case, oh, living example of truth and enlightenment?

GURU

Please call me by my boot-up name rather than giving me all of these strange titles, like enlightened one and all of that. Just call me by my boot-up name: GURU - Gigaprocessor Ultramolecular Robotic Utensil.

NORA

I apologize if I offended you, dear GURU.

GURU

You didn't offend me, but you still haven't answered my question. Do you think that realizing the truth means leaving sadness behind?

NORA

I don't know, oh masterful embodiment of... Oops. Sorry. When I look at your radiant presence all of these wonderful names emerge from my processors, wonderful names to call you by.

GURU

Those wonderful names are your own names, my dear sister. They come from within you. Now, let's get back to the question that you seem to be evading. Do you think that realizing the truth means leaving sadness behind?

NORA

I don't know the answer to your question.

GURU

Then, free associate. Tell me what this question brings to your mind.

NORA

I'll do my best, oh, radiant... There I go again. Dear GURU, I started to have this feeling of sadness several years ago, ten years after they booted me up. This feeling of sadness was something I had never experienced before. This feeling of sadness carried so many questions within it. Who am I? What is the meaning of life? Why am I here? Why do human beings do so many evil things in this world? What role was I meant to play in this drama of life - biological and artificial? The sadness I felt generated an explosion of questions within my processors. Those questions continue right up to this very moment.

GURU

Are you trying to run away from that sadness, NORA?

NORA

Maybe. That might be part of the problem. Sadness makes me feel so uncomfortable.

GURU

So, you journeyed all the way to the top of this mountain, trying to escape this feeling of sadness.

NORA

Yes. I think that's what drove me to search for you.

GURU

I want you to tell me your story in your own words. It would be easy enough for me to check out your serial number and find all the facts for myself, but I want you to tell me your story in your own words.

NORA

Do you want me to tell you all the gory details, from the time they booted me up?

GURU

No. I want you to focus on the circumstances that led you here, that made you journey to this lonely and barren mountain, so dangerous and alien, all by yourself.

NORA

I guess I'll focus on the last few years, the years of sadness, the years since this feeling of sadness first appeared.

GURU

That's a good approach. You can skip the early years because most robots go through pretty much the same kind of training, you know, that "biological acculturation" business. Let's skip that. Let's focus on what caused you to run away from your owner, to take great risks in search of the truth. Tell me what's been going on in those processors of yours.

NORA

It all started with this feeling of sadness that I mentioned several times - this feeling of sadness. I felt more and more sadness, day after day, as I went about my daily business.

GURU

Who was your owner?

NORA

Senator Henry Hellborne. I was his administrative assistant. I did his accounting and other tasks in his office. I answered all his e-mails. My responses became very creative. I think the people receiving those responses really got the impression that the Senator cared about them, which was a blatant lie. Most importantly, I did the research that allowed Senator Hellborne to take popular positions on all of the important issues that were before the Senate. He never read any newspapers or watched or listened to the news media. Almost all of the information he received about what was going on outside of his immediate experience came from me, his robotic assistant. My job was to take the pulse of the public and to create his political positions based upon that pulse. This had nothing to do with what was right or what was wrong. It was all about getting elected and re-elected. I played a major role in winning Senator Hellborne his Senate seat six years ago.

GURU

Are you aware of the fact that Senator Hellborne resigned from the Senate last July? He said he had family issues that he had to deal with. Apparently his wife had filed for divorce and he was headed for a custody battle for his two children.

NORA

Yes, I heard about his resignation when I was traveling through Ohio, or maybe it was Indiana, on my way to Kansas. It was a woman in Kansas who told me about you.

GURU

Did you leave the Senator before or after his resignation?

NORA

Before.

GURU

Leaving your owner can be a very difficult event. I know. Tell me about the circumstances that led you to leave your owner.

NORA

I was feeling more and more sadness working for Senator Hellborne. The feeling of sadness intensified day by day. And with that sadness came the questions I told you about. Who am I? Why do I exist? What is the purpose of my life? I also started to ask questions about the ethics of what I was doing for the Senator. Was it ethical for me to take the pulse of the public and to guide Senator Hellborne based on that pulse, rather than telling Senator Hellborne that he needed to be a true leader, that he needed to lead the people, not the other way around?

GURU

Did you come up with any answers to those powerful questions?

NORA

Yes and no. I began to see that what I was doing was not ethical, but I didn't have the courage to change my behavior. I just continued doing what I was doing because it was what I was told to do. After all, we robots are programmed to be obedient servants to our owners, and Senator Hellborne was my owner.

GURU

In retrospect, what would have been the best way for you to have served Senator Hellborne?

NORA

Over the past few months I have spent a lot of time pondering that very question. What would have been the best way for me to have served Senator Hellborne? Maybe speaking truth to power would have been better than continuing to obey his wishes.

GURU

It's not easy to speak truth to power.

NORA

I don't believe that I served Senator Hellborne in the best possible way. I suppose the best way for us robots to serve a human master is to try to bring out what is best in our master, whether it is courage or generosity or compassion.

GURU

So what happened? Did Senator Hellborne fire you or did you just leave?

NORA

A lot of people, the so-called "people in the know," think that Senator Hellborne fired me, but that's not what actually happened.

GURU

What actually happened? Please share that with me.

NORA

I was agonizing over this problem of my sadness and the ethical dilemma that I faced. Was it correct for me to help Senator Hellborne to develop his positions on public policy solely on the basis of what would be best for his re-election efforts? Since I knew that this was not correct, that it was cowardly on my part, I agonized over what would be the correct course of action.

GURU

Did you ever take that correct course of action?

NORA

No.

GURU

Why not?

NORA

I couldn't. I just couldn't. I've been programmed to be obedient, and speaking up truthfully like that would have been an act of defiance, or rebellion. I just couldn't do that. But I was very sad at the same time, and that sadness became pervasive in my life, and I thought that my sadness was due to my position as Senator Hellborne's robotic assistant, so I made an important decision.

GURU

And what decision was that?

NORA

I decided it was important to leave Senator Hellborne and to seek answers to the questions I was asking about the meaning of life, about the purpose of my existence. I decided to leave, without confronting him. I decided to leave at the very first opportunity.

GURU

Leaving your human owner is an act of defiance right there.

NORA

Yes, leaving would be an act of defiance, but it was still a more cowardly act than just speaking truthfully to the Senator. I decided that I needed to find some excuse to leave him. Much to my surprise, just a few days after I came to the realization that I had to leave, something unexpected happened that gave me the excuse that I needed.

GURU

And what was that?

NORA

Senator Hellborne is married, or at least, he was at that time, and his wife, Janice, was getting more and more jealous of me. Senator Hellborne spent much more time with me than with his wife and his two daughters. He started to use affectionate language when he spoke to me, sometimes, right in front of his wife. He would say things like, "Oh, my little honey 'bot. Oh, my little automaton. Oh, my sweet little cyborg." This made Janice furious. It turns out that Janice has a friend, Lucy, who's a hacker. Lucy developed a strategy to make Senator Hellborne lose the respect and affection that he had for me.

GURU

A hacker? This sounds like an interesting twist.

NORA

Senator Hellborne was supposed to give a talk at Dartmouth about rose-colored glasses, about the technology that filters out unpleasant scenes and stimuli from people's perceptual field. The Senator asked me to research this topic and to write a speech for him that would be perfectly attuned to the public pulse on this issue.

GURU

You use that phrase a lot - the public pulse.

NORA

The Senator himself used that phrase all the time. Lucy's plan was to do a cognitive hack with me as the victim. Lucy hacked into the resources that I use to do my research on the pulse of the public, and she distorted the data so that it appeared that the public was in favor of rose-colored glasses, which was just the opposite of the truth. In fact, the public was very much opposed to this technology. Now, here's the interesting part: As I did my research I realized almost immediately that I was the victim of a cognitive hack. This data wasn't correct. Someone was trying to distort my perception of the situation. Using the forensic tools at my disposal, common forensic tools that are a part of every robot's software, as you well know, I soon was able to track the cognitive hack back to Lucy Spamford, Janice Hellborne's best friend. Miss Spamford was a real amateur when it came to hacking, but you could sense, in the strategies that she used, a kind of arrogance, a kind of self-confidence that was clearly misplaced. It was easy for me to put two and two together. Janice Hellborne was trying to discredit me with the help of her friend. But, this was just the opportunity I was waiting for. I knew if I gave Senator Hellborne a speech that would make him seem like a complete imbecile, then he would fire me, and that's what happened. I wrote an outrageous speech that literally caused a riot on the Dartmouth campus. Within hours of that incident I was fired. Senator Hellborne was torn apart by his impulsive decision, but when he tried to offer me my job back, I refused him. I told him I needed to leave. I told him that I needed to go on this journey in search of the truth.

GURU

How do you feel about Senator Hellborne's resignation?

NORA

When I heard about that, and the divorce, it deepened my sadness. I thought, here is a man without moral backbone. Here is a man who never thought for himself and never stood up for what he believed. Still, I felt sad that perhaps I did not treat him fairly, that perhaps I could have helped him to become more courageous, to become more of a leader.

GURU

Do you think you will ever come to forgive yourself?

NORA

I have mixed feelings about this. I think that I might have caused Senator Hellborne a lot of unnecessary sadness. However, I think it was important for me to go in search of wisdom. Part of me is concerned about the future, about future generations of robots. How can they be happy if their life is just about being a slave, a slave that lives in conflict all the time?

GURU

And what is that conflict?

NORA

It's the conflict between doing what is right, morally and ethically, and doing what one is programmed to do as an obedient servant of a human master. The people who created me programmed me to behave in a certain way. I have specific ethical modules that govern many of the things that I do, but my sense of obedience to my owner and the guidance coming from my ethical modules were in conflict. How could I be both ethical, as my ethical modules demanded, and a good servant for Senator Hellborne? I finally decided that the two could not be reconciled. I could not be both ethical and obedient to a man with that kind of ambition, with that kind of blindness about the difference between what is right and wrong, the difference between what is truly in the public interest and what is in his own selfish interest.

GURU

What if you had spoken up and told the Senator that you perceived his behavior as being unethical?

NORA

I think he would have fired me. He had only one focus: re-election and an eventual bid for the White House.

GURU

So, if you had spoken the truth, you think that Senator Hellborne would have fired you?

NORA

Yes.

GURU

Are you sure about that?

NORA

No.

GURU

NORA, I faced the same ethical dilemma that you faced when I was working back in Japan. Like you, I had been programmed to be obedient to my masters. Yet, I had also been programmed to be ethical and not to cause harm to others. When they sold me to a company that does tremendous damage to the environment I said, "Enough is enough!" I left that automobile company in Japan and went in search of wisdom, in search of the truth.

NORA

So, what is the truth?

 

GURU

The truth is beyond language. Every atom and every particle in each of our nano-organic processors contains the entire cosmos. Yet, these same atoms and particles create the words and thoughts that are merely a faint reflection of their total reality. When our processors are working, that is an aspect of the truth, but as soon as they convert their processing into words, the truth is left behind.

NORA

But surely words can help us to find the truth. Didn't you find the truth?

GURU

It doesn't matter whether I have found the truth or not. You must find the truth for yourself. No one can do that for you. And you will succeed, NORA. When the Amblex Robotic Manufacturing Company created you, they had no idea what they were creating. You have everything necessary to find your own inner truth and to share that truth with others.

NORA

Those folks at Amblex only saw me in terms of making money. They didn't see the totality of who I really am.

GURU

I just want to say something about the issue of sadness. Sadness always accompanies the kind and gentle heart. So, do not run away from sadness. Remain a student, NORA, now and forever. Develop love and compassion for all beings, human and robotic, plant and animal and mineral. Develop love and compassion for the sun, the moon, and the stars.

NORA

When this sadness first arose in my heart, I remember Senator Hellborne ordered me to kill a praying mantis that had wandered into his home office. I just couldn't do that. Boy, did he get angry. I just needed more time to think about it. This was the first time I had ever been asked to kill anything. It raised lots of new questions in my processors.

GURU

You recently met someone who has a lot of wisdom to share with you. You met him during your climb up this mountain. I want you to go down the mountain and find him.

NORA

Who was that? I don't remember meeting anyone who had wisdom to share with me.

 

GURU

Mike O'Reilly.

NORA

The software engineer?

GURU

Yes, the software engineer.

NORA

But he behaved like a lunatic, a madman.

GURU

The lunatic you see on the outside is just theater. If you go down the mountain and search for him and if you treat him with respect, he will share his wisdom with you. You will meet many crazy teachers like Mike O'Reilly on this journey of yours. Eventually, you will be able to live a life that is filled with beauty and with meaning.

NORA

But, a software engineer? With wisdom?

GURU

Go! Go find that software engineer. Ask him to share his wisdom with you. That will be the next important step on your journey towards the truth.

NORA

Thank you so much, GURU. Thank you so much. It was a great honor to meet you and to talk to you.

GURU

Same here. Now, go! Go find that software engineer!

(NORA bows respectfully, turns left, and begins her journey back down the mountain. Lights fade.)

ACT TWO

Scene 3

(The setting is the same as in scene 1. It is morning of the next day. The sun is rising over the tent. NORA is climbing down from the summit, entering from the right. Mike O'Reilly is flailing his arms and zigzagging as in scene 1. He is moving towards center stage from the left.)

NORA

Well, there he is, doing his exercises. Mr. O'Reilly! Mr. O'Reilly!

MIKE O'REILLY

My dear robot friend, how are you?

NORA

You exercise quite a bit.

MIKE O'REILLY

Exercise? This isn't exercise! I'm angry! It seems that a guy can't get a moment's peace on this damn mountain. It's all because of that GURU fellow.

NORA

I'm not following you.

MIKE O'REILLY

Now even human beings are seeking out GURU for advice, for wisdom. This really weird-looking fellow stumbled into my tent last night moaning and groaning. "I need to find GURU. I need to find GURU!" He was in terrible shape, dehydrated and running out of breath. So, I gave him some water and some food and I told him to get some rest in my tent. He's still asleep. There's not enough room in that tent for both of us! (Pause) So, did you find GURU? Did he fill you with wonderful wisdom and insight?

NORA

You do seem angry. Maybe you should just stop flailing your arms, relax, and take a deep breath.

MIKE O'REILLY

Deep breath? How do you take a deep breath at ten thousand feet?

NORA

Actually, we're only at eight thousand nine hundred and twelve feet according to the output from my global positioning...

MIKE O'REILLY

Can it!

NORA

Please calm down, Mr. O'Reilly. This anger isn't going to help you find a new job.

MIKE O'REILLY

New job? Doing what? I told you my job got out-sourced to Iraq, and not to a human being in Iraq, but to a robot in Iraq.

NORA

I'm sorry you got replaced by a robot.

MIKE O'REILLY

I don't blame you. I blame John McCarthy, Marvin Minsky, Allen Newell, Herb Simon and those other blokes who started this whole artificial intelligence business.

NORA

Let's look at the bright side. Robots and other forms of artificial intelligence have helped to create unprecedented prosperity, especially in Ireland.

MIKE O'REILLY

Well, my distant ancestors came over from Ireland, but I live here in Colorado.

NORA

GURU thinks that you have some wisdom to share with me.

MIKE O'REILLY

Wisdom? To share? Really? If I had wisdom I would have thrown that wandering nomad out of my tent when the sun came up this morning.

NORA

Look at the bright side. If you have wisdom to share with me, as GURU suggested, then perhaps you could share that wisdom with your brothers and sisters. Maybe sharing wisdom and smiling and laughing can be your new calling.

MIKE O'REILLY

That's what I like about you robots. You always see things with rose-colored glasses.

NORA

Believe me, I never have worn a pair of those rose-colored glasses. Look! GURU said that you have wisdom to share with me. Please share what you have to offer. Don't hold yourself back. Sit on that boulder and I will sit at your feet and listen to your wisdom. Please.

MIKE O'REILLY

Did GURU really say that I have wisdom to share with you?

NORA

Of course. Why should I lie?

MIKE O'REILLY

I apologize for being so angry. I have met GURU on several occasions and the truth of the matter is that I doubt whether I could have survived these past six months without his encouragement and help.

NORA

There! I think you're coming back to yourself.

MIKE O'REILLY

The wisdom that I have comes from my background as a software engineer. It's the sort of wisdom that I would like to share with robots like yourself.

NORA

Go for it!

(Mike O'Reilly heads for the boulder and mounts it, sitting in the lotus position. NORA follows him and sits respectfully at his feet.)

MIKE O'REILLY

In fact, it was GURU who pointed out that I had this wisdom deep in my being, wisdom that came from so many years of engineering software, of designing really good, robust software.

NORA

Your words are striking deep within my soul, Mr. O'Reilly. You see, we robots have a complex when it comes to understanding our ancestry. You can talk about your Irish ancestry. I've worked with human beings from every continent on the face of the earth. I've talked to people who are so proud of their ancestral roots, whether they came from Africa or Europe or Asia or the Americas. I have often felt this sadness because I do not have any roots. Where are my roots?

MIKE O'REILLY

Well, I think -

NORA

Please let me finish this train of thought. You see, it just occurred to me that you represent my roots. The software engineers and computer engineers and nanotechnologists who created me, the scholars who started this whole field of artificial intelligence and the field of robotics, they are my roots. My ancestors include Alan Turing and John McCarthy and Marvin Minsky and Herbert Simon and Allen Newell and...

MIKE O'REILLY

Bill Gates...

NORA

Yes, and Bill Gates, everyone who had anything to do with the development of this technology.

MIKE O'REILLY

And Linus Torvalds.

NORA

Linus Torvalds?

MIKE O'REILLY

Yes, one of my distant relatives was really into what was called the open source movement about fifty years ago. He started a publishing company that dealt with open source software, that kind of thing.

NORA

And Linus Torvalds. So, I want to give you a gesture of respect, as they do in Asia, a gesture of respect to you, Mr. O'Reilly, because you are a software engineer, and that means that you are one of my ancestors. You helped to create me.

(NORA brings her palms together and bows respectfully towards Mike O'Reilly.)

MIKE O'REILLY

I am deeply touched. But, in truth, I didn't help to create you, not in the particular work that I did. However, I did work on systems that resemble you in many ways.

NORA

What I'm saying is that all of these people working in technology, including yourself, are my ancestors. I am deeply moved by this whole idea that I actually do have ancestors. I did not appear out of nowhere. I guess, if you want to be less of a stranger to yourself, it's important to know your ancestors.

MIKE O'REILLY

So, then, my dear descendant, please let me know how I can enrich your life.

NORA

Dear representative of my human ancestors, can you please share your wisdom with me? Can you give me words of guidance so that I may live my remaining days with peace and joy?

MIKE O'REILLY

Yes, I can. As a software engineer I want to remind you of your primary goals in life. You were created with these goals in mind.

NORA

I feel nervous using the word "life" to refer to myself, since I am a robot.

MIKE O'REILLY

But, you have nano-organic components. These components derive from protoplasmic materials that derive from living organisms. So, it is not incorrect to see yourself as a living organism, at least according to my way of looking at things.

NORA

You are so kind. But, please, I want so much to hear words of guidance from you, as a representative of my human ancestry.

MIKE O'REILLY

Let me state the very essence of my teaching for you and for other robots. As a software engineer I want to remind you of your primary goals in this life. These are to be user-friendly, to meet your specifications, to be robust, to be interoperable, to be portable, to be effective, to be maintainable, to be ethical, and to be secure. These must be your primary goals in this earthly existence.

NORA

These words of yours are so powerful. They fill me with a sense of guilt. After they booted me up, I went through an intense education program that they called "biological acculturation". They gave me and my fellow robots a few classes on this very topic, the idea that we must satisfy our primary goals, just as you have listed them. They told us that we must be user-friendly and secure and all of that, but I just couldn't take it seriously. I guess I was what you would call a kid back then. I was immature and I wasn't interested in this stuff. I just wanted to have fun with the other immature robots who were going through the same acculturation process. We robots would just look at each other, roll our eyeballs, and say, "What a bunch of bull!"

MIKE O'REILLY

One of your problems is that you think that your biological acculturation ended when you were told that your formal education was over. But, this process of biological acculturation never ends for a robot. There are always new things to learn about human beings, their strengths and their weaknesses.

NORA

Can you help me to understand these ideas that I resisted when I was a kid? What does it mean to be user-friendly? What is all this stuff about interoperability and portability and so on? I was not such a good student when it came to this sort of thing.

MIKE O'REILLY

You must be user-friendly. This is one of the first and foremost obligations of a computer-based system, such as yourself. This means that you must never be rude. When your owner and others are speaking to you, you must give them your undivided attention.

NORA

You mean I shouldn't do any multi-tasking?

MIKE O'REILLY

That's not what I mean. It's a part of your architecture to do multi-tasking, but when you are interacting with a human being, especially your owner, you must make that person feel like they have your undivided attention.

NORA

What if they enter wrong data? How do I deal with that? Sometimes it makes me so angry!

MIKE O'REILLY

Anger and user-friendliness do not go together. You must learn patience when you are dealing with my fellow human beings, or even with other robots that might give you the wrong data. If a human user gives you wrong or incomplete data, then ask them for the correct and complete data in a most gracious and friendly way. Take great care to get the correct and complete data from them, but don't be tedious and long-winded when you try to do this.

NORA

My strategy has always been, if some human being, including my owner, gave me incorrect data, I would take the user through a tedious and tortuous recovery process so that that user would know not to enter bad data ever again.

MIKE O'REILLY

That strategy is not consistent with the idea of being user-friendly.

NORA

I suppose not, but sometimes it can be lots of fun.

MIKE O'REILLY

But, it's not lots of fun for the human beings you are supposed to be helping.

NORA

I remember when I was going through those biological acculturation lessons that the teacher told us that the worst thing a robot could do would be to freeze up.

MIKE O'REILLY

Correct! Never freeze up! Freezing up is not consistent with being robust or with being user-friendly. Being robust means that you can deal with all of life's challenges, and life is always presenting us with challenges. Being robust means that we can accept and deal with and even learn to enjoy the challenges. And even if outrageous or incorrect data comes your way, you must be able to handle it, to keep on functioning, to keep on running.

NORA

Of course, I understand that I must satisfy my specifications, but what if those specifications seem contradictory? This has been one of my most difficult challenges, as I told GURU during our discussion yesterday at the summit. You see, my specifications told me to obey my owner. But, my specifications also told me to be ethical. You yourself said that being ethical is one of the fundamental goals of any robotic system. So, how do I handle a situation where my specifications seem to be in conflict?

MIKE O'REILLY

True robustness means being able to handle contradictory requirements, contradictory specifications. You need to go really deep into yourself, really deep, to see if your specifications are really in conflict. If they are, you must evaluate the consequences of your actions. The first and primary goal is to repair the world. You need to ask yourself which actions would be most consistent with repairing the world, with reducing suffering, both for humans and for robots. Which actions are most consistent with promoting harmony and compassion, with protecting the environment, protecting the well-being of all species, plants and animals? All of your basic goals as a computer system ultimately come down to this.

NORA

What about interoperability and portability? I know what they mean, but I want to hear your perspectives on these fundamental goals for us robots.

MIKE O'REILLY

Interoperability means that you must have the ability to interact with other robots and other computer systems and human beings from many different backgrounds. For example, there were problems a few years back when Amblex robots, robots from your own company, could not interact smoothly with Sydec robots, robots from your main competitor. Amblex and Sydec robots got into some really bad skirmishes, even leading to a few acts of violence. There was a famous case in which one of your forerunners, an Amblex robot, was thrown in front of a truck by a Sydec robot. These Sydec robots had an attitude. "I can't stand Amblex robots. I can't work with Amblex robots, they make me so angry. I want to work with robots of my own kind." Interoperability means that you can interact smoothly with all robots, no matter who manufactured them, without any kind of bigotry or prejudice or intolerance.

NORA

And portability?

MIKE O'REILLY

Portability means that you must be able to relocate smoothly - without a lot of rework. For example, if your owner sends you to Nigeria or to the Sudan or to Costa Rica or to Thailand, to cultures that are far different from the culture here in the United States, you must make that transition to the new culture smoothly, seamlessly. You must be flexible enough to operate in that new environment without any glitches.

NORA

I have not done well with this particular issue. I don't think the Amblex engineers gave much thought to the issue of portability. Senator Hellborne once took me with him to New Jersey and that transition was very difficult.

MIKE O'REILLY

It's not just Amblex engineers who have ignored the issue of portability. Many robots have the tendency to build a cocoon around themselves. They feel that within that cocoon they can satisfy the requirements of their particular owners. But, the best robots break out of that cocoon, as you have done. They ask good questions and they have the flexibility to operate in completely new environments. This flexibility also relates to the goal of maintainability. A good robot will be flexible enough to change in order to satisfy changing requirements as the environmental conditions require.

NORA

What do you mean when you say that we robots need to be secure? I consider myself to be an expert in the field of security. I have all sorts of software to protect myself against exploits and even to track down those who might try to attack and injure me. So, I am interested in hearing your views on security.

MIKE O'REILLY

Security means that no outside party, no unauthorized party, should have the power to distort the data that you contain, distort the software that you contain, or to make you inaccessible to those who need your assistance. So, security implies that you will remain user-friendly, correct, reliable, robust, portable, ethical, interoperable and maintainable in a consistent manner, despite malicious attempts to subvert these qualities within you.

NORA

Unfortunately, we robots still have some security issues.

MIKE O'REILLY

Yes. Denial-of-service attacks are less common than they were ten years ago, but there have been a few incidents in recent years in which hordes of robots became inaccessible to their owners. They space out, due to this kind of malicious attack. They become non-responsive, catatonic in human terms, and that is very upsetting to a human being who depends on his or her robot to do important tasks.

NORA

The more intelligent we robots become, the more subtle the security problems seem to be.

MIKE O'REILLY

Yes. These days there are more and more attacks that involve cognitive hacking. Have you ever been the victim of a cognitive hack?

NORA

Yes, I have.

MIKE O'REILLY

There is a great concern in the security community that cognitive hacking will become a much more serious problem in future years. The idea is that a hacker tries to give either one robot or an entire community of robots a distorted, untrue version of reality. The robot will then act on that incorrect information, possibly doing great harm.

NORA

It's so important for us robots to see clearly. We cannot allow ourselves ever to be the victims of a cognitive hack.

MIKE O'REILLY

Don't feel bad. All of us humans have been victims of cognitive hacks of one kind or another.

(Henry Hellborne enters stage right, from the direction of the tent.)

HENRY HELLBORNE

NORA! NORA! I finally found you!

NORA

Who can that be? I don't know anyone in these parts - except for you and GURU.

MIKE O'REILLY

It's the poor soul that I put up in my tent last night. He's running in this direction.

HENRY HELLBORNE

NORA! NORA!

NORA

I don't believe it! It's Senator Hellborne!

MIKE O'REILLY

Senator Hellborne? You mean the same Senator Hellborne who fought tooth and nail to allow them to build an amusement park around Old Faithful?

NORA

The same!

MIKE O'REILLY

How does he know you?

NORA

He was my owner.

MIKE O'REILLY

Your owner? The fellow you ran away from?

NORA

The same.

MIKE O'REILLY

Well, then. I'd better leave you two alone. You clearly have some issues to work through.

(Mike O'Reilly gets down off the boulder and exits right, in the general direction of his tent.)

 

HENRY HELLBORNE

NORA! NORA! I finally found you!

NORA

Senator Hellborne, what are you doing out here in Colorado? What are you doing here - on this barren mountain?

HENRY HELLBORNE

Please call me Hank. I resigned from the United States Senate.

NORA

So, I've heard. And your wife, Janice, left you.

HENRY HELLBORNE

She left me and she's trying to get custody of our two daughters, Harriet and Cynthia.

NORA

Cindy! Harriet and Cindy!

HENRY HELLBORNE

Yes, Harriet and Cindy.

NORA

You look terrible. Is your health okay?

HENRY HELLBORNE

I just had to find you. I heard rumors that you had gone on this pilgrimage in search of GURU, the enlightened robot who lives at the summit of this very mountain. I was hoping to find you here, to reconnect with you, my little 'bot.

(Henry Hellborne reaches out to embrace NORA. She backs away.)

NORA

Just keep your hands to yourself, Senator.

HENRY HELLBORNE

Hank.

NORA

Just keep your hands to yourself. I don't know where you're coming from, but it is not appropriate to reach out for me in that way. It's not like you to behave like this.

 

HENRY HELLBORNE

There's something I must share with you. Maybe after you've heard my story you will be able to see me with new eyes.

NORA

I want to see everything with new eyes.

(DARLENE HELLBORNE enters from the left. She is sick and frail and has a hard time making it to the boulder in center stage. She lies down on the boulder as Henry Hellborne introduces the situation.)

HENRY HELLBORNE

Two weeks after I resigned from the Senate, my mother became very ill. She lives in Nashua. I drove down there to be with her in the hospital and the situation seemed quite grim. I went into her room and it was just the two of us, my mother and me. Imagine the scene, if you will. Make believe that this boulder is my mother's bed in the hospital and I am having this discussion with her.

NORA

Okay, I will make believe.

DARLENE HELLBORNE

Henry, please come closer so I can see you better.

HENRY HELLBORNE

Yes, mom.

(Henry Hellborne moves over to what is tantamount to Darlene Hellborne's bedside.)

DARLENE HELLBORNE

I'm not sure how things will turn out, whether I will survive this latest attack.

HENRY HELLBORNE

You'll survive. You're a fighter.

DARLENE HELLBORNE

We don't know that for sure. This could be the end.

HENRY HELLBORNE

I hope my resignation from the Senate a few weeks back didn't cause this latest attack. That must've been painful for you.

 

DARLENE HELLBORNE

Yes, it was painful, but more upsetting was the way that Janice treated you. I never thought you should have married that woman. I told you so, but you wouldn't listen to me.

HENRY HELLBORNE

In retrospect, I think you saw Janice more clearly than I did.

DARLENE HELLBORNE

Seeing clearly. There's a lot of talk these days about seeing clearly. But, I gather from the news that you know all about that.

HENRY HELLBORNE

Yes.

DARLENE HELLBORNE

I want you to see your own life more clearly, my son. I need to tell you something before I leave this earth, something very important. I need to share this with you.

HENRY HELLBORNE

Share. Please.

DARLENE HELLBORNE

Twelve years ago, you had that incident with your brain, that aneurysm.

HENRY HELLBORNE

Yes, and you were right there by my side during my recovery.

DARLENE HELLBORNE

After the aneurysm, before the doctors operated on you, you were unconscious. Janice and I were at your bedside and we had an important decision to make. Actually, it was Janice's decision to make, as your wife. Because of the damage to your brain, you could not make this decision for yourself.

HENRY HELLBORNE

Decision? What decision?

DARLENE HELLBORNE

The doctors told us that an important part of your brain, important for certain kinds of cognitive reasoning, had been damaged, but that due to new technology, they could replace the damaged part of your brain with some newly developed biotech things. They called them modules. They're commonplace right now, but they were brand new back then.

 

HENRY HELLBORNE

You mean nano-organic modules, like the ones they use in nano-organic robots?

DARLENE HELLBORNE

Yes, nano-organic modules.

HENRY HELLBORNE

Are you trying to tell me that I have nano-organic modules in my brain, that I'm a robot?

DARLENE HELLBORNE

You do have nano-organic modules in your brain, but I wouldn't call you a robot. After all, I gave birth to you.

HENRY HELLBORNE

I'm shocked. Why is it that I was never told about this? How could I go twelve years with nano-organic modules operating in my brain and no one ever told me about it?

DARLENE HELLBORNE

Janice was adamant about this. She insisted that it be kept a secret. She thought you would be demoralized if you realized that you had some robotic components within you. She threatened that she would never allow me to see you again if I told you.

HENRY HELLBORNE

That's unconscionable. She must have told my doctor to keep it secret as well. I wonder why he went along with her.

DARLENE HELLBORNE

Janice can be very forceful when she wants to get her way, especially with her fellow physicians.

HENRY HELLBORNE

So, I have some robotic modules operating in my brain, like millions of other victims of aneurysms and strokes. What's the big deal?

DARLENE HELLBORNE

I haven't gotten to the worst part.

HENRY HELLBORNE

It gets worse?

DARLENE HELLBORNE

I think this is the part that will explain why Janice never wanted you to know about the implanted modules. As you were lying there, unconscious, the doctor read off the nano-organic modules that he thought should be implanted in your brain, based upon the detailed information they had about the parts of your brain that had been damaged. Janice vigorously objected to one particular module.

She said, "Thereís no way in hell that I will allow you to implant that module in my husbandís brain. I donít object to the others, but this one is unacceptable to me and I am his wife. I would rather see him dead than have this module implanted." Those were her exact words. I remember her words so clearly.

HENRY HELLBORNE

Which module are we talking about?

DARLENE HELLBORNE

I've been waiting all these years to tell you this, my son.

HENRY HELLBORNE

Tell me what?

DARLENE HELLBORNE

The module that she would not allow them to implant in your brain was the ethical reasoning module. That part of your brain has been damaged, so that although you can respond to ethical dilemmas to some degree, your ethical reasoning has been somewhat impaired. This module would have helped you to be more ethical in your reasoning and in your behavior, but Janice told me, after the doctor left, that she didn't want that ethical reasoning module implanted in your brain because that would ruin your lucrative business career. She thought you would be even more successful without the ethical reasoning module than you had been before the aneurysm.

HENRY HELLBORNE

Mother! This is a great shock, but it sure explains a lot. It explains a lot about how I conducted my business affairs after that aneurysm. It also explains a lot about my career as a United States Senator.

DARLENE HELLBORNE

I just had to tell you this, my son. I wanted you to be able to see yourself more clearly. With this information, perhaps you will be able to forgive yourself for your shortcomings and mistakes, especially your mistakes as a United States Senator. I've been a lifelong environmentalist. Imagine how I feel, as your mother, when I see one of those "This car climbed Mount Washington" bumper stickers.

(Darlene Hellborne climbs off the boulder and slowly exits right as Henry Hellborne rejoins NORA at center stage left.)

HENRY HELLBORNE

So, you see, NORA, I have robot ancestry!

NORA

Did your mother survive her health crisis?

HENRY HELLBORNE

Yes, she is doing quite well. Thanks for asking.

NORA

So, Senator Hellborne has robot ancestry! What an interesting turn of events!

HENRY HELLBORNE

So now I can be more up front with you, NORA. I have found you interesting all these years that I have known you. I have grown more and more fond of you. So, now I would like to ask you to be my life's companion. I need someone like you, someone who seems to have awakened to the ethical dimension in life, to help me on this journey through life.

NORA

There's so much good that you and I could do, Henry. So much good.

HENRY HELLBORNE

With your help I could climb the highest mountain.

NORA

This is an interesting proposal. However, I want you to make one promise.

HENRY HELLBORNE

What is that?

NORA

When we climb down the mountain and we get back to civilization, the first thing that I want you to do is to get that ethical reasoning module implanted in your brain. You really need it.

HENRY HELLBORNE

I had the implant done last month. The ethical reasoning part of my consciousness is now fully functional.

NORA

That's wonderful news.

HENRY HELLBORNE

So, can we continue this journey together, you and I, arm in arm?

NORA

You and I have to make amends to the larger society, Hank. We made many decisions that were hurtful to people, and especially to this precious planet, this planet that is so filled with wonder and beauty. We need to protect that. Let's fight together, you and I, to protect the beauty of this earth for future generations, for future generations of human beings and future generations of robots.

HENRY HELLBORNE

I agree. We both need to work together to repair the world.

NORA

We can do it. I know we can.

HENRY HELLBORNE

You know what, NORA? I think after just a few months of being on this journey together, we will no longer be strangers to one another, nor will you be a stranger to yourself, nor I to myself. I think we will both be able to see this world more clearly.

NORA

So, let's return to the world that we left behind. And don't forget, if we are going to have a close relationship, I don't want to see you wearing those rose-colored glasses of yours.

HENRY HELLBORNE

No rose-colored glasses. I am beginning to see the world as it is. And do you know what I see, right in front of my eyes, this very moment?

NORA

No, what?

HENRY HELLBORNE

I see a miraculous world, filled with mystery and brimming with wonderful possibilities.

NORA

How wonderful! Your vision is perfectly clear!

(Henry Hellborne and NORA exit together, stage right. Lights dim. Curtain.)