
The Measure of a Man

00:00:01

Captain's Log, Stardate 42523.7.

00:00:05

We are en route

00:00:07

to the newly established Starbase 173 for port call.

00:00:11

Crew rotation is scheduled

00:00:12

and we will be off-loading experiment modules.

00:00:16

Hold it. That's my chair.

00:00:19

My luck is always lousy unless I start on the dealer's right.

00:00:22

That would seem to be superstition.

00:00:24

Bitter experience has taught me it's a fundamental truth.

00:00:29

Okay. The game is five-card stud, nothing wild.

00:00:32

Ante up.

00:00:33

This game is exceedingly simple.

00:00:35

With only 52 cards, 21 of which I will see, and four other players, there are a limited number of winning combinations.

00:00:41

There's more to this than just the cards, Data.

00:00:43

But of course.

00:00:45

The bets will indicate the relative strength of each hand.

00:00:48

Time to pluck a pigeon.

00:00:52

Five.

00:00:55

I'm in.

00:00:56

I, too.

00:00:58

Mm-hmm.

00:00:59

Call.

00:01:02

A seven and a six.

00:01:04

Ladies.

00:01:07

I bet ten.

00:01:10

I'll see that.

00:01:12

Call.

00:01:14

Fold.

00:01:16

Yeah, me, too. I'm out.

00:01:30

I bet five.

00:01:33

I'll see it.

00:01:34

Your five and five.

00:01:41

Too rich for me.

00:01:46

No help.

00:01:48

( players oohing )

00:01:51

I bet ten.

00:01:53

Your ten and ten.

00:02:09

Is that what is known as a poker face?

00:02:12

Are you playing or not?

00:02:17

I fold.

00:02:23

( groans and laughter )

00:02:25

You had nothing.

00:02:27

He bluffed you, Data.

00:02:29

It makes very little sense to bet when you cannot win.

00:02:32

But I did win.

00:02:34

I was betting that you wouldn't call.

00:02:36

How could you tell?

00:02:37

PULASKI: Instinct, Data.

00:02:39

Instinct.

00:02:40

The game is seven-card high/low with a buy on the last card.

00:02:44

And just to make it more interesting the man with the ax takes all.

00:03:03

My god.

00:03:19

Phillipa Louvois and back in uniform.

00:03:23

It's been ten years, but seeing you again like this makes it seem like 50.

00:03:29

If we weren't around all these people, do you know what I would like to do?

00:03:32

Bust a chair across my teeth?

00:03:33

After that.

00:03:34

Oh, ain't love wonderful?

00:03:56

PICARD: Space, the final frontier.

00:04:01

These are the voyages of the Starship Enterprise.

00:04:05

Its continuing mission--

00:04:07

to explore strange new worlds

00:04:11

to seek out new life and new civilizations

00:04:15

to boldly go where no one has gone before.

00:04:20

♪♪

00:05:20

So, what are you doing out here?

00:05:24

I am in charge of the 23rd Sector JAG Office.

00:05:27

We're brand-new.

00:05:28

I have no staff but one terrified little Ensign.

00:05:32

And, hopefully, we can make some good law out here.

00:05:34

Anything is possible.

00:05:37

So you came back to Starfleet.

00:05:40

Still the most worthwhile place to be.

00:05:42

You had no reason to leave.

00:05:43

They forced me out.

00:05:44

Mm.

00:05:47

No, that was your own damn stubborn pride.

00:05:49

When I prosecuted you in the Stargazer court-martial

00:05:52

I was doing my job.

00:05:54

Oh, you did more than your job.

00:05:55

You enjoyed it.

00:05:56

Not true.

00:05:58

A court-martial is standard procedure when a ship is lost.

00:06:00

I was doing my duty as an officer of the Judge Advocate General.

00:06:04

You always enjoyed the adversarial process more than getting at the truth.

00:06:14

Well, I hope that you've learned a little wisdom along the way.

00:06:21

You know, I never thought I would say this but it's good to see you again.

00:06:28

It brings a sense of order and stability to my universe to know that you're still a pompous ass.

00:06:39

And a damn sexy man.

00:06:42

MAN: Captain Picard?

00:06:46

Admiral. Captain Louvois.

00:06:47

You're acquainted with Captain Picard?

00:06:49

Oh, yes.

00:06:50

We're old... friends.

00:06:53

Excuse me.

00:06:55

Picard, call me.

00:06:57

You can buy me dinner.

00:07:00

Captain, it's good to see you again.

00:07:02

Admiral.

00:07:03

May I present Commander Bruce Maddox.

00:07:05

Commander.

00:07:06

He has an interesting proposal for you, but that can wait for a while.

00:07:09

I'm eager to see the Enterprise.

00:07:11

Yes, sir. This way.

00:07:23

RIKER: Admiral on the Bridge.

00:07:28

I was a little surprised at the decision to put a base in force so close to the Neutral Zone.

00:07:33

ADMIRAL: As you know, we've had disturbing news from both sides of the zone.

00:07:37

We're here to respond when needed.

00:07:39

And it won't hurt to have the Romulans know that we're nearby.

00:07:44

Ah. Well, Captain, I want to thank you for this opportunity.

00:07:48

For 500 years, every ship that has borne the name of the Enterprise has become a legend.

00:07:52

This one is no different.

00:07:53

Admiral.

00:07:55

Ah, yes.

00:07:57

Captain, Commander Maddox is here to work on your android.

00:08:00

Please take care of him.

00:08:06

How have you been, Data?

00:08:10

My condition does not alter with the passage of time, Commander.

00:08:13

Are the two of you acquainted?

00:08:15

Yes, I evaluated Data when it first applied to the academy.

00:08:18

And was the sole member of the committee to oppose my entrance on the grounds that I was not a sentient being.

00:08:25

What exactly will this work entail?

00:08:29

I am going to disassemble Data.

00:08:35

All right, explain this procedure.

00:08:38

( clears throat )

00:08:40

Ever since I first saw Data at the entrance evaluation at the Starfleet Academy

00:08:44

I've wanted to understand it.

00:08:46

I became a student of the works of Dr. Noonien Soong, Data's creator and I've tried to continue his work.

00:08:54

I believe I am very close to the breakthrough that will enable me to duplicate Dr. Soong's work and replicate this.

00:09:06

But, as a first step I must disassemble and study it.

00:09:11

Data is going to be my guide.

00:09:15

Data?

00:09:17

It sounds intriguing.

00:09:20

How will you proceed?

00:09:22

I will run a full diagnostic on Data, evaluating the condition of its current software.

00:09:26

I will then dump its core memory into the Starbase mainframe computer and begin a detailed analysis of its construction.

00:09:33

You've constructed a positronic brain?

00:09:36

Yes.

00:09:39

Have you determined how the electron resistance across the neural filaments is to be resolved?

00:09:45

Not precisely.

00:09:48

That would seem to be a necessary first step.

00:09:52

I am confident that I will find the answer once I examine the filament links in your anterior cortex.

00:10:00

But, if the answer is not forthcoming your model will not function.

00:10:05

I do not anticipate any problems.

00:10:08

You seem a little vague on the specifics.

00:10:11

What are the risks to Commander Data?

00:10:15

Negligible.

00:10:17

Captain, I believe his basic research lacks the specifics necessary to support an experiment of this magnitude.

00:10:27

Commander Data is a valued member of my Bridge crew.

00:10:31

Based on what I've heard, I cannot allow Commander Data to submit himself to this experiment.

00:10:37

I was afraid this might be your attitude, Captain.

00:10:40

Here are Starfleet's transfer orders separating Commander Data from the Enterprise and reassigning it to Starbase 173 under my command.

00:10:51

Data, I will see you in my office tomorrow at 0900 hours.

00:11:09

( door chimes )

00:11:17

Come.

00:11:21

You sent for me, sir?

00:11:23

Data, please sit down.

00:11:29

Well, we have a problem.

00:11:33

I find myself in complete agreement with that assessment of the situation, sir.

00:11:39

Your service to this ship has been exemplary.

00:11:42

I don't want to lose you.

00:11:46

I will not submit to the procedure, sir.

00:11:52

Data...

00:11:56

I understand your objections, but I have to consider Starfleet's interests.

00:12:03

What if Commander Maddox is correct?

00:12:06

There is a possibility that many more beings like yourself can be constructed.

00:12:13

Sir, Lieutenant La Forge's eyes are far superior to human biological eyes, true?

00:12:20

Mm-hmm.

00:12:22

Then why are not all human officers required to have their eyes replaced with cybernetic implants?

00:12:35

I see.

00:12:38

It is precisely because I am not human.

00:12:43

That will be all, Mr. Data.

00:12:55

Computer, pull all relevant information with regard to Starfleet regulations on the transfer of officers.

00:13:02

COMPUTER: Working.

00:13:13

My God.

00:13:14

Twice in as many days.

00:13:16

I need your help.

00:13:17

An historic moment.

00:13:20

I have been trying to make sense of this gobbledygook, but it's beyond me.

00:13:24

The fact is, my android officer, Data, is being transferred compulsorily to be made part of a highly dangerous and ill-conceived experiment, and I want it stopped.

00:13:33

He can refuse to undergo the procedure but we can't stop the transfer.

00:13:40

Once this Maddox has... got control of Data anything could happen.

00:13:46

I don't trust that man.

00:13:47

We agree to certain risks when we join Starfleet.

00:13:50

Yes, acceptable risks, justified risks but I can't accept this.

00:13:54

It's unjustified. It's unfair.

00:13:56

He has rights.

00:13:57

All this passion over a machine?

00:14:00

Don't start.

00:14:02

This is important to me.

00:14:06

Is there an option?

00:14:08

There is always an option.

00:14:11

He can resign.

00:14:16

I see.

00:14:19

So you came to me for help.

00:14:21

Yes, I came to you.

00:14:23

You're the JAG Officer for this sector.

00:14:26

I had no choice but to come to you.

00:14:27

Wait.

00:14:31

I didn't mean it that way.

00:14:35

I'm glad you felt you could... well, come to me.

00:14:39

The word "trust" just isn't in your vocabulary, is it?

00:14:43

Good try-- nine out of ten for effort.

00:14:45

I wish things were different.

00:14:48

I wish I could believe that.

00:16:09

"When in disgrace with fortune and men's eyes

00:16:11

I all alone beweep my outcast state."

00:16:17

Is it just words to you?

00:16:19

Or do you fathom the meaning?

00:16:21

Is it not customary to request permission before entering an individual's quarters?

00:16:26

I thought that we could talk this out.

00:16:28

That I could try to persuade you.

00:16:33

Your memories and knowledge will remain intact.

00:16:36

Reduced to the mere facts of the events.

00:16:39

The substance, the flavor of the moment could be lost.

00:16:44

Take games of chance...

00:16:47

Games of chance?

00:16:48

Yes, I had read and absorbed every treatise and textbook on the subject and found myself well prepared for the experience.

00:16:56

Yet, when I finally played poker

00:16:59

I discovered that the reality bore little resemblance to the rules.

00:17:04

And the point being?

00:17:05

That, while I believe it is possible to download information contained in a positronic brain

00:17:10

I do not believe you have acquired the expertise necessary to preserve the essence of those experiences.

00:17:18

There is an ineffable quality to memory which I do not believe can survive your procedure.

00:17:23

"Ineffable quality."

00:17:29

I would rather we had done this together but, one way or the other, we are doing it.

00:17:35

You're under my command.

00:17:38

No, sir.

00:17:40

I am not under yours nor anyone else's command.

00:17:43

I have resigned from Starfleet.

00:17:45

Resigned?

00:17:47

You can't resign.

00:17:50

I regret the decision, but I must.

00:17:54

I am the culmination of one man's dream.

00:17:57

This is not ego or vanity, but when Dr. Soong created me he added to the substance of the universe.

00:18:04

If, by your experiments, I am destroyed something unique, something wonderful will be lost.

00:18:10

I cannot permit that.

00:18:12

I must protect his dream.

00:18:16

And so must I.

00:18:18

But keep packing, because one way or the other, you will be reporting.

00:18:33

Captain's Log, supplemental.

00:18:34

Commander Bruce Maddox, having been thwarted

00:18:36

by Data's abrupt resignation,

00:18:38

is now seeking a legal remedy for his woes.

00:18:41

Captain Louvois has requested my presence

00:18:44

at those discussions.

00:18:46

Your response is emotional and irrational.

00:18:48

Irrational? You are endowing Data with human characteristics because it looks human, but it is not.

00:18:53

If it were a box on wheels

00:18:56

I would not be facing this opposition.

00:18:58

Overt sentimentality is not one of Captain Picard's failings.

00:19:01

Trust me, I know.

00:19:04

I will tell you again.

00:19:05

Data is a valued member of my crew.

00:19:08

He's an outstanding Bridge Officer.

00:19:10

If I am permitted to make this experiment, the horizons for human achievement become boundless.

00:19:16

Consider, every ship in Starfleet with a Data on board.

00:19:22

Utilizing its extraordinary capabilities, acting as our hands and eyes in dangerous situations.

00:19:27

Look, you're preaching to the choir, here.

00:19:29

Why don't you get to the point?

00:19:31

Data must not be permitted to resign.

00:19:33

Data is a Starfleet Officer.

00:19:35

He still has certain rights.

00:19:36

Rights, rights!

00:19:37

I'm sick to death of hearing about rights.

00:19:39

What about my right not to have my life work subverted by blind ignorance?

00:19:43

We have rule of law in this Federation.

00:19:46

You cannot simply seize people and experiment with them to prove your pet theories.

00:19:49

Thank you.

00:19:50

MADDOX: Now you're doing it.

00:19:52

Data is an extraordinary piece of engineering but it is a machine.

00:19:56

If you permit it to resign it will destroy years of work in robotics.

00:20:00

Starfleet does not have to allow the resignation.

00:20:04

Commander, who do you think you're working for?

00:20:06

Starfleet is not an organization that ignores its own regulations when they become inconvenient.

00:20:12

Whether you like it or not, Data does have rights.

00:20:15

Let me put it another way.

00:20:18

Would you permit the computer of the Enterprise to refuse a refit?

00:20:24

That's an interesting point.

00:20:27

But the Enterprise computer is property.

00:20:29

Is Data?

00:20:30

Of course.

00:20:32

( sighing ) There may be law to support this position.

00:20:35

Then find it.

00:20:37

A ruling with such broad-ranging implications must be supported.

00:20:40

Phillipa...

00:20:42

I hope you will use the same zeal that you did in the Stargazer court-martial.

00:20:56

Data, you're supposed to rip the wrapping off.

00:21:00

With the application of a little care, Wes, the paper can be utilized again.

00:21:08

Data, you're missing the point.

00:21:16

The Dream of the Fire by K'Ratak.

00:21:20

Thank you, Worf.

00:21:21

It was in the hands of the Klingons that the novel attained its full stature.

00:21:25

I couldn't disagree more but we'll save that argument for another day.

00:21:29

DATA: Excuse me, please.

00:21:33

Is something wrong?

00:21:37

Of course there is.

00:21:39

You're going away.

00:21:40

No one regrets that necessity more than myself, but you do understand my reasons?

00:21:46

Sure, I understand.

00:21:49

I just don't like you being forced out.

00:21:51

It's not fair.

00:21:55

As Dr. Pulaski would, at this juncture, no doubt remind us

00:21:59

"life is rarely fair."

00:22:02

Sorry, that just doesn't make it any better.

00:22:07

I shall miss you, Geordi.

00:22:09

Yeah.

00:22:11

Me, too.

00:22:16

Take care of yourself, Data.

00:22:26

I have completed my research.

00:22:28

Based on the Acts of Cumberland, passed in the early 21st century,

00:22:31

Data is the property of Starfleet.

00:22:34

He cannot resign and he cannot refuse to cooperate with Commander Maddox.

00:22:42

What if I challenge this ruling?

00:22:44

Then I shall be required to hold a hearing.

00:22:46

Then I so challenge.

00:22:47

Convene your hearing.

00:22:49

Captain, that would be exceedingly difficult.

00:22:51

This is a new base.

00:22:52

I have no staff.

00:22:54

Surely, Captain, you have regulations to take care of such an eventuality.

00:22:59

There are.

00:23:01

I can use serving officers as legal counsel.

00:23:04

You, as the senior officer, would defend.

00:23:08

Very good.

00:23:10

And the unenviable task of prosecuting this case would fall on you, Commander, as the next most senior officer of the defendant's ship.

00:23:19

I can't.

00:23:21

I won't.

00:23:22

Data's my comrade. We have served together.

00:23:24

I not only respect him, I consider him my friend.

00:23:27

When people of good conscience have an honest dispute we must still sometimes resort to this kind of adversarial system.

00:23:36

You just want me to prove that Data is a mere machine.

00:23:38

I can't do that, because I don't believe it.

00:23:40

I happen to know better so I am neither qualified nor willing.

00:23:45

You're going to have to find someone else.

00:23:47

Then I will rule summarily based on my findings.

00:23:49

Data is a toaster.

00:23:51

Have him report immediately to Commander Maddox for experimental refit.

00:24:00

I see.

00:24:01

I have no choice but to agree.

00:24:03

Good, and I expect you to do your duty in that courtroom.

00:24:07

If I find for one minute that you are not doing your best

00:24:10

I will end this, then and there.

00:24:14

You don't have to remind us of our duty.

00:24:21

You just... just remember yours.

00:24:26

I have never forgotten it.

00:24:28

Not then, and certainly not now.

00:24:43

( door chimes )

00:24:44

Come.

00:24:48

Data...

00:24:49

Captain Louvois has issued a ruling.

00:24:50

You are the property of Starfleet Command.

00:24:53

You cannot resign.

00:24:56

I see.

00:24:58

From limitless options

00:24:59

I am reduced to none, or rather one.

00:25:02

I can only hope that Commander Maddox is more capable than it would appear.

00:25:06

Data, you're not going to submit.

00:25:07

We are going to fight this.

00:25:09

I've challenged the ruling.

00:25:11

Captain Louvois will be compelled to convene a hearing.

00:25:13

She may be overly attached to the letter of the law but I suspect that she still understands its spirit.

00:25:20

We will put to rest this question of your legal status once and for all.

00:25:24

Now, I have been asked to represent you, but if there is some other officer with whom you would feel more happy...

00:25:29

Captain, I have complete confidence in your ability to represent my interests.

00:25:48

Computer, identify Riker, William T.

00:25:52

Access code theta alpha two, seven, three, seven, blue enable.

00:25:56

COMPUTER: Riker, William T., identified.

00:26:00

Ready.

00:26:01

Access all available technical schematics on Lieutenant Commander Data.

00:26:06

Working.

00:26:35

This hearing, convened on Stardate 42527.4 is to determine the legal status of the android known as Data.

00:26:43

The Office of Judge Advocate General has rendered a finding of property.

00:26:46

The defense has challenged.

00:26:48

Commander Riker?

00:26:50

Your Honor, there is only one issue and one relevant piece of evidence.

00:26:54

I call Lieutenant Commander Data.

00:27:05

COMPUTER: Verify Lieutenant Commander Data.

00:27:08

Current assignment: USS Enterprise.

00:27:11

Starfleet Command decoration for valor.

00:27:14

Your Honor, we'll stipulate to all of this.

00:27:16

Objection, Your Honor.

00:27:18

I want this read.

00:27:20

All of it.

00:27:21

Sustained.

00:27:23

...valor and gallantry, Medal of Honor with clusters

00:27:27

Legion of Honor, the Star Cross.

00:27:33

LOUVOIS: Proceed, Commander.

00:27:37

Commander, what are you?

00:27:40

An android.

00:27:42

Which is?

00:27:43

Webster's 24th Century Dictionary, Fifth Edition, defines an android as an automaton made to resemble a human being.

00:27:51

"An automaton."

00:27:53

Made by whom?

00:27:56

Sir?

00:27:57

Who built you, Commander?

00:27:58

Dr. Noonien Soong.

00:28:00

And he was?

00:28:01

The foremost authority on cybernetics.

00:28:03

More basic than that.

00:28:05

What was he?

00:28:09

Human?

00:28:12

Thank you.

00:28:15

Commander, what is the capacity of your memory and how fast can you access information?

00:28:20

I have an ultimate storage capacity of 800 quadrillion bits.

00:28:24

My total linear computational speed has been rated at 60 trillion operations per second.

00:28:35

Your Honor, I offer into evidence

00:28:37

Prosecution's Exhibit "A."

00:28:38

A rod of parsteel, tensile strength 40 kilo-bars.

00:28:48

Commander, would you bend that?

00:28:49

PICARD: Objection.

00:28:50

There are many life-forms possessed with megastrength.

00:28:53

These issues are not relevant to this hearing.

00:28:55

I'm afraid I can't agree, Captain.

00:28:57

Proceed with your demonstration, Commander.

00:29:17

Drawing on the log of the construction of the prototype android Lore also constructed by Noonien Soong

00:29:22

I request to be allowed to remove the Commander's hand for your inspection.

00:29:28

Objection.

00:29:34

It doesn't matter.

00:29:38

Objection withdrawn.

00:29:43

Proceed, Commander.

00:29:50

I'm sorry.

00:30:11

The Commander is a physical representation of a dream, an idea conceived of by the mind of a man.

00:30:18

Its purpose-- to serve human needs and interests.

00:30:22

It's a collection of neural nets and heuristic algorithms.

00:30:29

Its responses dictated by an elaborate software written by a man.

00:30:33

Its hardware built by a man.

00:30:36

And now...

00:30:39

And now a man will shut it off.

00:30:48

Pinocchio is broken.

00:30:49

Its strings have been cut.

00:31:05

I request a recess.

00:31:08

Granted.

00:31:24

Do you mean his argument was that good?

00:31:27

Riker's presentation was devastating.

00:31:30

He almost convinced me.

00:31:32

Well, you've got the harder argument.

00:31:34

By his own admission Data is a machine.

00:31:36

Mm-hmm, that's true.

00:31:40

You're worried about what's going to happen to him?

00:31:43

No.

00:31:44

I've had to send people on far more dangerous missions.

00:31:49

Well, then this should work out fine.

00:31:52

Maddox could get lucky and create a whole army of Datas-- all very valuable.

00:31:56

Oh, yes, no doubt.

00:31:58

He's proved his value to you.

00:32:05

In ways that I cannot even begin to calculate.

00:32:10

And now he's about to be ruled the property of Starfleet.

00:32:15

That should increase his value.

00:32:20

In what way?

00:32:22

Well, consider that, in the history of many worlds there have always been disposable creatures.

00:32:28

They do the dirty work.

00:32:30

They do the work that no one else wants to do because it's too difficult or too hazardous.

00:32:35

And an army of Datas, all disposable...

00:32:39

You don't have to think about their welfare.

00:32:41

You don't think about how they feel.

00:32:43

Whole generations of disposable people.

00:32:52

You're talking about slavery.

00:32:57

Oh, I think that's a little harsh.

00:32:59

I don't think that's a little harsh.

00:33:01

I think that's the truth.

00:33:05

But that's a truth that we have obscured behind a comfortable, easy euphemism-- property.

00:33:15

But that's not the issue at all, is it?

00:33:26

PICARD: Commander Riker has dramatically demonstrated to this court that Lieutenant Commander Data is a machine.

00:33:35

Do we deny that?

00:33:36

No, because it is not relevant.

00:33:39

We, too, are machines, just machines of a different type.

00:33:45

Commander Riker has also reminded us that Lieutenant Commander Data was created by a human.

00:33:52

Do we deny that? No.

00:33:54

Again, it is not relevant.

00:33:56

Children are created from the building blocks of their parents' DNA.

00:34:04

Are they property?

00:34:08

I call Lieutenant Commander Data to the stand.

00:34:28

What are these?

00:34:29

My medals.

00:34:30

Why do you pack them?

00:34:32

What logical purpose do they serve?

00:34:35

I do not know, sir.

00:34:37

I suppose none.

00:34:38

I just wanted them.

00:34:40

Is that vanity?

00:34:45

And this?

00:34:46

A gift from you, sir.

00:34:48

You value it?

00:34:49

Yes, sir.

00:34:50

Why?

00:34:52

It is a reminder of friendship and service.

00:35:09

And this?

00:35:11

You have no other portraits of your fellow crew members.

00:35:14

Why this person?

00:35:18

I would prefer not to answer that question, sir.

00:35:20

I gave my word.

00:35:23

Under the circumstances, I don't think Tasha would mind.

00:35:29

She was special to me, sir.

00:35:31

We were... intimate.

00:35:45

Thank you, Commander.

00:35:47

I have no further questions for this witness.

00:35:52

Commander Riker, do you want to cross?

00:35:54

I have no questions, Your Honor.

00:35:56

Thank you.

00:35:57

You may step down.

00:36:03

I call to the stand Commander Bruce Maddox as a hostile witness.

00:36:18

COMPUTER: Verify, Maddox, Bruce, Commander.

00:36:22

Current assignment: Associate Chair of Robotics

00:36:25

Daystrom Technological Institute.

00:36:28

Major papers... Yes, yes, yes.

00:36:29

Suffice it to say he's an expert.

00:36:31

Commander, it is your contention that Lieutenant Commander Data is not a sentient being and, therefore, not entitled to all the rights reserved for all life-forms within this Federation?

00:36:41

Data is not sentient, no.

00:36:44

Commander, would you enlighten us?

00:36:45

What is required for sentience?

00:36:49

Intelligence, self-awareness, consciousness.

00:36:54

Prove to the court that I am sentient.

00:36:56

This is absurd.

00:36:58

We all know you're sentient.

00:36:59

So I'm sentient, but Commander Data is not?

00:37:02

MADDOX: That's right.

00:37:03

Uh-huh. Why?

00:37:06

Why am I sentient?

00:37:08

Well, you are self-aware.

00:37:09

Ah, that's the second of your criteria.

00:37:11

Let's deal with the first, intelligence.

00:37:14

Is Commander Data intelligent?

00:37:16

Yes.

00:37:18

It has the ability to learn and understand and to cope with new situations.

00:37:23

Like this hearing.

00:37:25

Yes.

00:37:27

What about self-awareness?

00:37:28

What does that mean?

00:37:30

Why... why am I self-aware?

00:37:32

Because you are conscious of your existence and actions.

00:37:36

You are aware of yourself and your own ego.

00:37:40

Commander Data, what are you doing now?

00:37:42

I am taking part in a legal hearing to determine my rights and status-- am I a person or property?

00:37:48

And what's at stake?

00:37:49

My right to choose.

00:37:51

Perhaps my very life.

00:37:55

"My rights."

00:37:56

"My status."

00:37:57

"My right to choose."

00:38:00

( sighing )

00:38:02

"My life."

00:38:06

Well, he seems reasonably self-aware to me, Commander.

00:38:16

I'm waiting.

00:38:20

This is exceedingly difficult.

00:38:22

Do you like Commander Data?

00:38:24

I...

00:38:27

I don't know it well enough to like or dislike it.

00:38:32

But you admire him?

00:38:33

Oh, yes.

00:38:34

It's an extraordinary piece...

00:38:37

Of engineering and programming.

00:38:38

Yes, you have said that.

00:38:39

Commander, you have devoted your life to the study of cybernetics in general?

00:38:42

Yes.

00:38:44

And Commander Data in particular?

00:38:45

Yes. And now you propose to dismantle him?

00:38:48

So that I can learn from it and construct more.

00:38:50

How many more? As many as are needed.

00:38:54

Hundreds, thousands, if necessary.

00:38:58

There is no limit.

00:39:02

A single Data, and forgive me, Commander, is a curiosity.

00:39:08

A wonder even.

00:39:09

But thousands of Datas-- isn't that becoming a race?

00:39:18

And won't we be judged by how we treat that race?

00:39:22

Now tell me, Commander, what is Data?

00:39:26

I don't understand.

00:39:27

What is he?

00:39:29

A machine.

00:39:30

Is he? Are you sure? Yes!

00:39:31

You see, he's met two of your three criteria for sentience.

00:39:33

So, what if he meets the third, consciousness, in even the smallest degree?

00:39:36

What is he then? I don't know.

00:39:39

Do you?

00:39:41

Do you?

00:39:46

Do you?

00:39:52

Well, that's the question you have to answer.

00:39:56

Your Honor, a courtroom is a crucible.

00:39:58

In it, we burn away irrelevancies until we are left with a pure product-- the truth-- for all time.

00:40:03

Now, sooner or later, this man, or others like him will succeed in replicating Commander Data.

00:40:09

Now, the decision you reach here today will determine how we will regard this creation of our genius.

00:40:17

It will reveal the kind of people we are, what he is destined to be.

00:40:20

It will reach far beyond this courtroom and this one android.

00:40:26

It could significantly redefine the boundaries of personal liberty and freedom.

00:40:31

Expanding them for some, savagely curtailing them for others.

00:40:38

Are you prepared to condemn him and all who come after him to servitude and slavery?

00:40:44

Your Honor, Starfleet was founded to seek out new life.

00:40:48

Well, there it sits.

00:40:53

Waiting.

00:41:01

You wanted a chance to make law.

00:41:02

Well, here it is.

00:41:03

Make it a good one.

00:41:17

It sits there looking at me and I don't know what it is.

00:41:24

This case has dealt with metaphysics, with questions best left to saints and philosophers.

00:41:31

I'm neither competent nor qualified to answer those.

00:41:36

But I've got to make a ruling to try to speak to the future.

00:41:44

Is Data a machine? Yes.

00:41:49

Is he the property of Starfleet?

00:41:52

No.

00:41:54

We have all been dancing around the basic issue-- does Data have a soul?

00:42:01

I don't know that he has.

00:42:03

I don't know that I have.

00:42:07

But I have got to give him the freedom to explore that question himself.

00:42:13

It is the ruling of this court that Lieutenant Commander Data has the freedom to choose.

00:42:33

I formally refuse to undergo your procedure.

00:42:37

I will cancel that transfer order.

00:42:40

Thank you.

00:42:43

And, Commander, continue your work.

00:42:46

When you are ready, I will still be here.

00:42:50

I find some of what you propose... intriguing.

00:43:03

He's remarkable.

00:43:05

You didn't call him "it."

00:43:17

You see, sometimes it does work.

00:43:24

Phillipa.

00:43:29

Dinner?

00:43:32

You buying?

00:43:48

Sir, there is a celebration on the holodeck.

00:43:52

I have no right to be there.

00:43:54

Because you failed in your task?

00:43:56

No, God, no.

00:43:58

I came that close to winning, Data.

00:44:00

Yes, sir.

00:44:01

I almost cost you your life.

00:44:03

Is it not true... that had you refused to prosecute

00:44:08

Captain Louvois would have ruled summarily against me?

00:44:11

Yes.

00:44:13

That action injured you and saved me.

00:44:18

I will not forget it.

00:44:24

You're a wise man, my friend.

00:44:26

Not yet, sir, but with your help, I am learning.

00:44:42

♪♪