The Measure of a Man
00:00:01Captain's Log, Stardate 42523.7.
00:00:05We are en route
00:00:07to the newly established Starbase 173 for port call.
00:00:11Crew rotation is scheduled
00:00:12and we will be off-loading experiment modules.
00:00:16Hold it. That's my chair.
00:00:19My luck is always lousy unless I start on the dealer's right.
00:00:22That would seem to be superstition.
00:00:24Bitter experience has taught me it's a fundamental truth.
00:00:29Okay. The game is five-card stud nothing wild.
00:00:32Ante up.
00:00:33This game is exceedingly simple.
00:00:35With only 52 cards, 21 of which I will see, and four other players, there are a limited number of winning combinations.
00:00:41There's more to this than just the cards, Data.
00:00:43But of course.
00:00:45The bets will indicate the relative strength of each hand.
00:00:48Time to pluck a pigeon.
00:00:52Five.
00:00:55I'm in.
00:00:56I, too.
00:00:58Mm-hmm.
00:00:59Call.
00:01:02A seven and a six.
00:01:04Ladies.
00:01:07I bet ten.
00:01:10I'll see that.
00:01:12Call.
00:01:14Fold.
00:01:16Yeah, me, too. I'm out.
00:01:30I bet five.
00:01:33I'll see it.
00:01:34Your five and five.
00:01:41Too rich for me.
00:01:46No help.
00:01:48( players oohing )
00:01:51I bet ten.
00:01:53Your ten and ten.
00:02:09Is that what is known as a poker face?
00:02:12Are you playing or not?
00:02:17I fold.
00:02:23( groans and laughter )
00:02:25You had nothing.
00:02:27He bluffed you, Data.
00:02:29It makes very little sense to bet when you cannot win.
00:02:32But I did win.
00:02:34I was betting that you wouldn't call.
00:02:36How could you tell?
00:02:37PULASKI: Instinct, Data.
00:02:39Instinct.
00:02:40The game is seven-card high/low with a buy on the last card.
00:02:44And just to make it more interesting the man with the ax takes all.
00:03:03My god.
00:03:19Phillipa Louvois and back in uniform.
00:03:23It's been ten years but seeing you again like this makes it seem like 50.
00:03:29If we weren't around all these people do you know what I would like to do?
00:03:32Bust a chair across my teeth?
00:03:33After that.
00:03:34Oh, ain't love wonderful?
00:03:56PICARD: Space, the final frontier.
00:04:01These are the voyages of the Starship Enterprise.
00:04:05Its continuing mission--
00:04:07to explore strange new worlds
00:04:11to seek out new life and new civilizations
00:04:15to boldly go where no one has gone before.
00:04:20♪♪
00:05:20So, what are you doing out here?
00:05:24I am in charge of the 23rd Sector JAG Office.
00:05:27We're brand-new.
00:05:28I have no staff, but one terrified little Ensign.
00:05:32And, hopefully, we can make some good law out here.
00:05:34Anything is possible.
00:05:37So you came back to Starfleet.
00:05:40Still the most worthwhile place to be.
00:05:42You had no reason to leave.
00:05:43They forced me out.
00:05:44Mm.
00:05:47No, that was your own damn stubborn pride.
00:05:49When I prosecuted you in the Stargazer court-martial
00:05:52I was doing my job.
00:05:54Oh, you did more than your job.
00:05:55You enjoyed it.
00:05:56Not true.
00:05:58A court-martial is standard procedure when a ship is lost.
00:06:00I was doing my duty as an officer of the Judge Advocate General.
00:06:04You always enjoyed the adversarial process more than getting at the truth.
00:06:14Well, I hope that you've learned a little wisdom along the way.
00:06:21You know, I never thought I would say this but it's good to see you again.
00:06:28It brings a sense of order and stability to my universe to know that you're still a pompous ass.
00:06:39And a damn sexy man.
00:06:42MAN: Captain Picard?
00:06:46Admiral. Captain Louvois.
00:06:47You're acquainted with Captain Picard?
00:06:49Oh, yes.
00:06:50We're old... friends.
00:06:53Excuse me.
00:06:55Picard, call me.
00:06:57You can buy me dinner.
00:07:00Captain, it's good to see you again.
00:07:02Admiral.
00:07:03May I present Commander Bruce Maddox.
00:07:05Commander.
00:07:06He has an interesting proposal for you, but that can wait for a while.
00:07:09I'm eager to see the Enterprise.
00:07:11Yes, sir. This way.
00:07:23RIKER: Admiral on the Bridge.
00:07:28I was a little surprised at the decision to put a base in force so close to the Neutral Zone.
00:07:33ADMIRAL: As you know, we've had disturbing news from both sides of the zone.
00:07:37We're here to respond when needed.
00:07:39And it won't hurt to have the Romulans know that we're nearby.
00:07:44Ah. Well, Captain, I want to thank you for this opportunity.
00:07:48For 500 years, every ship that has borne the name of the Enterprise has become a legend.
00:07:52This one is no different.
00:07:53Admiral.
00:07:55Ah, yes.
00:07:57Captain, Commander Maddox is here to work on your android.
00:08:00Please take care of him.
00:08:06How have you been, Data?
00:08:10My condition does not alter with the passage of time, Commander.
00:08:13Are the two of you acquainted?
00:08:15Yes, I evaluated Data when it first applied to the academy.
00:08:18And was the sole member of the committee to oppose my entrance on the grounds that I was not a sentient being.
00:08:25What exactly will this work entail?
00:08:29I am going to disassemble Data.
00:08:35All right, explain this procedure.
00:08:38( clears throat )
00:08:40Ever since I first saw Data at the entrance evaluation at the Starfleet Academy
00:08:44I've wanted to understand it.
00:08:46I became a student of the works of Dr. Noonien Soong, Data's creator and I've tried to continue his work.
00:08:54I believe I am very close to the breakthrough that will enable me to duplicate Dr. Soong's work and replicate this.
00:09:06But, as a first step I must disassemble and study it.
00:09:11Data is going to be my guide.
00:09:15Data?
00:09:17It sounds intriguing.
00:09:20How will you proceed?
00:09:22I will run a full diagnostic on Data, evaluating the condition of its current software.
00:09:26I will then dump its core memory into the Starbase mainframe computer and begin a detailed analysis of its construction.
00:09:33You've constructed a positronic brain?
00:09:36Yes.
00:09:39Have you determined how the electron resistance across the neural filaments is to be resolved?
00:09:45Not precisely.
00:09:48That would seem to be a necessary first step.
00:09:52I am confident that I will find the answer once I examine the filament links in your anterior cortex.
00:10:00But, if the answer is not forthcoming your model will not function.
00:10:05I do not anticipate any problems.
00:10:08You seem a little vague on the specifics.
00:10:11What are the risks to Commander Data?
00:10:15Negligible.
00:10:17Captain, I believe his basic research lacks the specifics necessary to support an experiment of this magnitude.
00:10:27Commander Data is a valued member of my Bridge crew.
00:10:31Based on what I've heard, I cannot allow Commander Data to submit himself to this experiment.
00:10:37I was afraid this might be your attitude, Captain.
00:10:40Here are Starfleet's transfer orders separating Commander Data from the Enterprise and reassigning it to Starbase 173 under my command.
00:10:51Data, I will see you in my office tomorrow at 0900 hours.
00:11:09( door chimes )
00:11:17Come.
00:11:21You sent for me, sir?
00:11:23Data, please sit down.
00:11:29Well, we have a problem.
00:11:33I find myself in complete agreement with that assessment of the situation, sir.
00:11:39Your service to this ship has been exemplary.
00:11:42I don't want to lose you.
00:11:46I will not submit to the procedure, sir.
00:11:52Data...
00:11:56I understand your objections, but I have to consider Starfleet's interests.
00:12:03What if Commander Maddox is correct?
00:12:06There is a possibility that many more beings like yourself can be constructed.
00:12:13Sir, Lieutenant La Forge's eyes are far superior to human biological eyes, true?
00:12:20Mm-hmm.
00:12:22Then why are not all human officers required to have their eyes replaced with cybernetic implants?
00:12:35I see.
00:12:38It is precisely because I am not human.
00:12:43That will be all, Mr. Data.
00:12:55Computer, pull all relevant information with regard to Starfleet regulations on the transfer of officers.
00:13:02COMPUTER: Working.
00:13:13My God.
00:13:14Twice in as many days.
00:13:16I need your help.
00:13:17An historic moment.
00:13:20I have been trying to make sense of this gobbledygook, but it's beyond me.
00:13:24The fact is, my android officer, Data, is being transferred compulsorily to be made part of a highly dangerous and ill-conceived experiment, and I want it stopped.
00:13:33He can refuse to undergo the procedure but we can't stop the transfer.
00:13:40Once this Maddox has... got control of Data anything could happen.
00:13:46I don't trust that man.
00:13:47We agree to certain risks when we join Starfleet.
00:13:50Yes, acceptable risks, justified risks but I can't accept this.
00:13:54It's unjustified. It's unfair.
00:13:56He has rights.
00:13:57All this passion over a machine?
00:14:00Don't start.
00:14:02This is important to me.
00:14:06Is there an option?
00:14:08There is always an option.
00:14:11He can resign.
00:14:16I see.
00:14:19So you came to me for help.
00:14:21Yes, I came to you.
00:14:23You're the JAG Officer for this sector.
00:14:26I had no choice but to come to you.
00:14:27Wait.
00:14:31I didn't mean it that way.
00:14:35I'm glad you felt you could... well, come to me.
00:14:39The word "trust" just isn't in your vocabulary, is it?
00:14:43Good try-- nine out of ten for effort.
00:14:45I wish things were different.
00:14:48I wish I could believe that.
00:16:09"When in disgrace with fortune and men's eyes
00:16:11I all alone beweep my outcast state."
00:16:17Is it just words to you?
00:16:19Or do you fathom the meaning?
00:16:21Is it not customary to request permission before entering an individual's quarters?
00:16:26I thought that we could talk this out.
00:16:28That I could try to persuade you.
00:16:33Your memories and knowledge will remain intact.
00:16:36Reduced to the mere facts of the events.
00:16:39The substance, the flavor of the moment could be lost.
00:16:44Take games of chance...
00:16:47Games of chance?
00:16:48Yes, I had read and absorbed every treatise and textbook on the subject and found myself well prepared for the experience.
00:16:56Yet, when I finally played poker
00:16:59I discovered that the reality bore little resemblance to the rules.
00:17:04And the point being?
00:17:05That, while I believe it is possible to download information contained in a positronic brain
00:17:10I do not believe you have acquired the expertise necessary to preserve the essence of those experiences.
00:17:18There is an ineffable quality to memory which I do not believe can survive your procedure.
00:17:23"Ineffable quality."
00:17:29I'd rather we had done this together but, one way or the other, we are doing it.
00:17:35You're under my command.
00:17:38No, sir.
00:17:40I am not under yours nor anyone else's command.
00:17:43I have resigned from Starfleet.
00:17:45Resigned?
00:17:47You can't resign.
00:17:50I regret the decision, but I must.
00:17:54I am the culmination of one man's dream.
00:17:57This is not ego or vanity, but when Dr. Soong created me he added to the substance of the universe.
00:18:04If, by your experiments, I am destroyed something unique, something wonderful will be lost.
00:18:10I cannot permit that.
00:18:12I must protect his dream.
00:18:16And so must I.
00:18:18But keep packing because one way or the other you will be reporting.
00:18:33Captain's Log, supplemental.
00:18:34Commander Bruce Maddox, having been thwarted
00:18:36by Data's abrupt resignation,
00:18:38is now seeking a legal remedy for his woes.
00:18:41Captain Louvois has requested my presence
00:18:44at those discussions.
00:18:46Your response is emotional and irrational.
00:18:48Irrational? You are endowing Data with human characteristics because it looks human, but it is not.
00:18:53If it were a box on wheels
00:18:56I would not be facing this opposition.
00:18:58Overt sentimentality is not one of Captain Picard's failings.
00:19:01Trust me, I know.
00:19:04I will tell you again.
00:19:05Data is a valued member of my crew.
00:19:08He's an outstanding Bridge Officer.
00:19:10If I am permitted to make this experiment, the horizons for human achievement become boundless.
00:19:16Consider, every ship in Starfleet with a Data on board.
00:19:22Utilizing its extraordinary capabilities, acting as our hands and eyes in dangerous situations.
00:19:27Look, you're preaching to the choir, here.
00:19:29Why don't you get to the point?
00:19:31Data must not be permitted to resign.
00:19:33Data is a Starfleet Officer.
00:19:35He still has certain rights.
00:19:36Rights, rights!
00:19:37I'm sick to death of hearing about rights.
00:19:39What about my right not to have my life work subverted by blind ignorance?
00:19:43We have rule of law in this Federation.
00:19:46You cannot simply seize people and experiment with them to prove your pet theories.
00:19:49Thank you.
00:19:50MADDOX: Now you're doing it.
00:19:52Data is an extraordinary piece of engineering but it is a machine.
00:19:56If you permit it to resign it will destroy years of work in robotics.
00:20:00Starfleet does not have to allow the resignation.
00:20:04Commander, who do you think you're working for?
00:20:06Starfleet is not an organization that ignores its own regulations when they become inconvenient.
00:20:12Whether you like it or not, Data does have rights.
00:20:15Let me put it another way.
00:20:18Would you permit the computer of the Enterprise to refuse a refit?
00:20:24That's an interesting point.
00:20:27But the Enterprise computer is property.
00:20:29Is Data?
00:20:30Of course.
00:20:32( sighing ) There may be law to support this position.
00:20:35Then find it.
00:20:37A ruling with such broad-ranging implications must be supported.
00:20:40Phillipa...
00:20:42I hope you will use the same zeal that you did in the Stargazer court-martial.
00:20:56Data, you're supposed to rip the wrapping off.
00:21:00With the application of a little care, Wes, the paper can be utilized again.
00:21:08Data, you're missing the point.
00:21:16The Dream of the Fire by K'Ratak.
00:21:20Thank you, Worf.
00:21:21It was in the hands of the Klingons that the novel attained its full stature.
00:21:25I couldn't disagree more but we'll save that argument for another day.
00:21:29DATA: Excuse me, please.
00:21:33Is something wrong?
00:21:37Of course there is.
00:21:39You're going away.
00:21:40No one regrets that necessity more than myself, but you do understand my reasons?
00:21:46Sure, I understand.
00:21:49I just don't like you being forced out.
00:21:51It's not fair.
00:21:55As Dr. Pulaski would, at this juncture, no doubt remind us
00:21:59"life is rarely fair."
00:22:02Sorry, that just doesn't make it any better.
00:22:07I shall miss you, Geordi.
00:22:09Yeah.
00:22:11Me, too.
00:22:16Take care of yourself, Data.
00:22:26I have completed my research.
00:22:28Based on the Acts of Cumberland passed in the early 21st century
00:22:31Data is the property of Starfleet.
00:22:34He cannot resign and he cannot refuse to cooperate with Commander Maddox.
00:22:42What if I challenge this ruling?
00:22:44Then I shall be required to hold a hearing.
00:22:46Then I so challenge.
00:22:47Convene your hearing.
00:22:49Captain, that would be exceedingly difficult.
00:22:51This is a new base.
00:22:52I have no staff.
00:22:54Surely, Captain, you have regulations to take care of such an eventuality.
00:22:59There are.
00:23:01I can use serving officers as legal counsel.
00:23:04You, as the senior officer, would defend.
00:23:08Very good.
00:23:10And the unenviable task of prosecuting this case would fall on you, Commander, as the next most senior officer of the defendant's ship.
00:23:19I can't.
00:23:21I won't.
00:23:22Data's my comrade. We have served together.
00:23:24I not only respect him, I consider him my friend.
00:23:27When people of good conscience have an honest dispute we must still sometimes resort to this kind of adversarial system.
00:23:36You just want me to prove that Data is a mere machine.
00:23:38I can't do that, because I don't believe it.
00:23:40I happen to know better so I am neither qualified nor willing.
00:23:45You're going to have to find someone else.
00:23:47Then I will rule summarily based on my findings.
00:23:49Data is a toaster.
00:23:51Have him report immediately to Commander Maddox for experimental refit.
00:24:00I see.
00:24:01I have no choice but to agree.
00:24:03Good, and I expect you to do your duty in that courtroom.
00:24:07If I find for one minute that you are not doing your best
00:24:10I will end this, then and there.
00:24:14You don't have to remind us of our duty.
00:24:21You just... just remember yours.
00:24:26I have never forgotten it.
00:24:28Not then, and certainly not now.
00:24:43( door chimes )
00:24:44Come.
00:24:48Data...
00:24:49Captain Louvois has issued a ruling.
00:24:50You are the property of Starfleet Command.
00:24:53You cannot resign.
00:24:56I see.
00:24:58From limitless options
00:24:59I am reduced to none, or rather one.
00:25:02I can only hope that Commander Maddox is more capable than it would appear.
00:25:06Data, you're not going to submit.
00:25:07We are going to fight this.
00:25:09I've challenged the ruling.
00:25:11Captain Louvois will be compelled to convene a hearing.
00:25:13She may be overly attached to the letter of the law but I suspect that she still understands its spirit.
00:25:20We will put to rest this question of your legal status once and for all.
00:25:24Now, I have been asked to represent you but if there is some other officer with whom you would feel more happy...
00:25:29Captain, I have complete confidence in your ability to represent my interests.
00:25:48Computer, identify Riker, William T.
00:25:52Access code theta alpha two, seven, three, seven, blue enable.
00:25:56COMPUTER: Riker, William T., identified.
00:26:00Ready.
00:26:01Access all available technical schematics on Lieutenant Commander Data.
00:26:06Working.
00:26:35This hearing, convened on Stardate 42527.4 is to determine the legal status of the android known as Data.
00:26:43The Office of Judge Advocate General has rendered a finding of property.
00:26:46The defense has challenged.
00:26:48Commander Riker?
00:26:50Your Honor, there is only one issue and one relevant piece of evidence.
00:26:54I call Lieutenant Commander Data.
00:27:05COMPUTER: Verify Lieutenant Commander Data.
00:27:08Current assignment: USS Enterprise.
00:27:11Starfleet Command decoration for valor.
00:27:14Your Honor, we'll stipulate to all of this.
00:27:16Objection, Your Honor.
00:27:18I want this read.
00:27:20All of it.
00:27:21Sustained.
00:27:23...valor and gallantry, Medal of Honor with clusters
00:27:27Legion of Honor, the Star Cross.
00:27:33LOUVOIS: Proceed, Commander.
00:27:37Commander, what are you?
00:27:40An android.
00:27:42Which is?
00:27:43Webster's 24th Century Dictionary, Fifth Edition, defines an android as an automaton made to resemble a human being.
00:27:51"An automaton."
00:27:53Made by whom?
00:27:56Sir?
00:27:57Who built you, Commander?
00:27:58Dr. Noonien Soong.
00:28:00And he was?
00:28:01The foremost authority on cybernetics.
00:28:03More basic than that.
00:28:05What was he?
00:28:09Human?
00:28:12Thank you.
00:28:15Commander, what is the capacity of your memory and how fast can you access information?
00:28:20I have an ultimate storage capacity of 800 quadrillion bits.
00:28:24My total linear computational speed has been rated at 60 trillion operations per second.
00:28:35Your Honor, I offer into evidence
00:28:37Prosecution's Exhibit "A."
00:28:38A rod of parsteel, tensile strength 40 kilo-bars.
00:28:48Commander, would you bend that?
00:28:49PICARD: Objection.
00:28:50There are many life-forms possessed with megastrength.
00:28:53These issues are not relevant to this hearing.
00:28:55I'm afraid I can't agree, Captain.
00:28:57Proceed with your demonstration, Commander.
00:29:17Drawing on the log of the construction of the prototype android Lore, also constructed by Noonien Soong
00:29:22I request to be allowed to remove the Commander's hand for your inspection.
00:29:28Objection.
00:29:34It doesn't matter.
00:29:38Objection withdrawn.
00:29:43Proceed, Commander.
00:29:50I'm sorry.
00:30:11The Commander is a physical representation of a dream, an idea conceived of by the mind of a man.
00:30:18Its purpose-- to serve human needs and interests.
00:30:22It's a collection of neural nets and heuristic algorithms.
00:30:29Its responses dictated by an elaborate software written by a man.
00:30:33Its hardware built by a man.
00:30:36And now...
00:30:39And now a man will shut it off.
00:30:48Pinocchio is broken.
00:30:49Its strings have been cut.
00:31:05I request a recess.
00:31:08Granted.
00:31:24Do you mean his argument was that good?
00:31:27Riker's presentation was devastating.
00:31:30He almost convinced me.
00:31:32Well, you've got the harder argument.
00:31:34By his own admission Data is a machine.
00:31:36Mm-hmm, that's true.
00:31:40You're worried about what's going to happen to him?
00:31:43No.
00:31:44I've had to send people on far more dangerous missions.
00:31:49Well, then this should work out fine.
00:31:52Maddox could get lucky and create a whole army of Datas-- all very valuable.
00:31:56Oh, yes, no doubt.
00:31:58He's proved his value to you.
00:32:05In ways that I cannot even begin to calculate.
00:32:10And now he's about to be ruled the property of Starfleet.
00:32:15That should increase his value.
00:32:20In what way?
00:32:22Well, consider that, in the history of many worlds there have always been disposable creatures.
00:32:28They do the dirty work.
00:32:30They do the work that no one else wants to do because it's too difficult or too hazardous.
00:32:35And an army of Datas, all disposable...
00:32:39You don't have to think about their welfare.
00:32:41You don't think about how they feel.
00:32:43Whole generations of disposable people.
00:32:52You're talking about slavery.
00:32:57Oh, I think that's a little harsh.
00:32:59I don't think that's a little harsh.
00:33:01I think that's the truth.
00:33:05But that's a truth that we have obscured behind a comfortable, easy euphemism-- property.
00:33:15But that's not the issue at all, is it?
00:33:26PICARD: Commander Riker has dramatically demonstrated to this court that Lieutenant Commander Data is a machine.
00:33:35Do we deny that?
00:33:36No, because it is not relevant.
00:33:39We, too, are machines, just machines of a different type.
00:33:45Commander Riker has also reminded us that Lieutenant Commander Data was created by a human.
00:33:52Do we deny that? No.
00:33:54Again, it is not relevant.
00:33:56Children are created from the building blocks of their parents' DNA.
00:34:04Are they property?
00:34:08I call Lieutenant Commander Data to the stand.
00:34:28What are these?
00:34:29My medals.
00:34:30Why do you pack them?
00:34:32What logical purpose do they serve?
00:34:35I do not know, sir.
00:34:37I suppose none.
00:34:38I just wanted them.
00:34:40Is that vanity?
00:34:45And this?
00:34:46A gift from you, sir.
00:34:48You value it?
00:34:49Yes, sir.
00:34:50Why?
00:34:52It is a reminder of friendship and service.
00:35:09And this?
00:35:11You have no other portraits of your fellow crew members.
00:35:14Why this person?
00:35:18I would prefer not to answer that question, sir.
00:35:20I gave my word.
00:35:23Under the circumstances, I don't think Tasha would mind.
00:35:29She was special to me, sir.
00:35:31We were... intimate.
00:35:45Thank you, Commander.
00:35:47I have no further questions for this witness.
00:35:52Commander Riker, do you want to cross?
00:35:54I have no questions, Your Honor.
00:35:56Thank you.
00:35:57You may step down.
00:36:03I call to the stand Commander Bruce Maddox as a hostile witness.
00:36:18COMPUTER: Verify, Maddox, Bruce, Commander.
00:36:22Current assignment: Associate Chair of Robotics
00:36:25Daystrom Technological Institute.
00:36:28Major papers... Yes, yes, yes.
00:36:29Suffice it to say he's an expert.
00:36:31Commander, it is your contention that Lieutenant Commander Data is not a sentient being and, therefore, not entitled to all the rights reserved for all life-forms within this Federation?
00:36:41Data is not sentient, no.
00:36:44Commander, would you enlighten us?
00:36:45What is required for sentience?
00:36:49Intelligence, self-awareness, consciousness.
00:36:54Prove to the court that I am sentient.
00:36:56This is absurd.
00:36:58We all know you're sentient.
00:36:59So I'm sentient, but Commander Data is not?
00:37:02MADDOX: That's right.
00:37:03Uh-huh. Why?
00:37:06Why am I sentient?
00:37:08Well, you are self-aware.
00:37:09Ah, that's the second of your criteria.
00:37:11Let's deal with the first, intelligence.
00:37:14Is Commander Data intelligent?
00:37:16Yes.
00:37:18It has the ability to learn and understand and to cope with new situations.
00:37:23Like this hearing.
00:37:25Yes.
00:37:27What about self-awareness?
00:37:28What does that mean?
00:37:30Why... why am I self-aware?
00:37:32Because you are conscious of your existence and actions.
00:37:36You are aware of yourself and your own ego.
00:37:40Commander Data, what are you doing now?
00:37:42I am taking part in a legal hearing to determine my rights and status-- am I a person or property?
00:37:48And what's at stake?
00:37:49My right to choose.
00:37:51Perhaps my very life.
00:37:55"My rights."
00:37:56"My status."
00:37:57"My right to choose."
00:38:00( sighing )
00:38:02"My life."
00:38:06Well, he seems reasonably self-aware to me, Commander.
00:38:16I'm waiting.
00:38:20This is exceedingly difficult.
00:38:22Do you like Commander Data?
00:38:24I...
00:38:27I don't know it well enough to like or dislike it.
00:38:32But you admire him?
00:38:33Oh, yes.
00:38:34It's an extraordinary piece...
00:38:37Of engineering and programming.
00:38:38Yes, you have said that.
00:38:39Commander, you have devoted your life to the study of cybernetics in general?
00:38:42Yes.
00:38:44And Commander Data in particular?
00:38:45Yes. And now you propose to dismantle him?
00:38:48So that I can learn from it and construct more.
00:38:50How many more? As many as are needed.
00:38:54Hundreds, thousands, if necessary.
00:38:58There is no limit.
00:39:02A single Data, and forgive me, Commander, is a curiosity.
00:39:08A wonder even.
00:39:09But thousands of Datas-- isn't that becoming a race?
00:39:18And won't we be judged by how we treat that race?
00:39:22Now tell me, Commander, what is Data?
00:39:26I don't understand.
00:39:27What is he?
00:39:29A machine.
00:39:30Is he? Are you sure? Yes!
00:39:31You see, he's met two of your three criteria for sentience.
00:39:33So, what if he meets the third, consciousness, in even the smallest degree?
00:39:36What is he then? I don't know.
00:39:39Do you?
00:39:41Do you?
00:39:46Do you?
00:39:52Well, that's the question you have to answer.
00:39:56Your Honor, a courtroom is a crucible.
00:39:58In it, we burn away irrelevancies until we are left with a pure product-- the truth-- for all time.
00:40:03Now, sooner or later, this man, or others like him will succeed in replicating Commander Data.
00:40:09Now, the decision you reach here today will determine how we will regard this creation of our genius.
00:40:17It will reveal the kind of a people we are, what he is destined to be.
00:40:20It will reach far beyond this courtroom and this one android.
00:40:26It could significantly redefine the boundaries of personal liberty and freedom.
00:40:31Expanding them for some, savagely curtailing them for others.
00:40:38Are you prepared to condemn him and all who come after him to servitude and slavery?
00:40:44Your Honor, Starfleet was founded to seek out new life.
00:40:48Well, there it sits.
00:40:53Waiting.
00:41:01You wanted a chance to make law.
00:41:02Well, here it is.
00:41:03Make it a good one.
00:41:17It sits there looking at me and I don't know what it is.
00:41:24This case has dealt with metaphysics, with questions best left to saints and philosophers.
00:41:31I'm neither competent nor qualified to answer those.
00:41:36But I've got to make a ruling to try to speak to the future.
00:41:44Is Data a machine? Yes.
00:41:49Is he the property of Starfleet?
00:41:52No.
00:41:54We have all been dancing around the basic issue-- does Data have a soul?
00:42:01I don't know that he has.
00:42:03I don't know that I have.
00:42:07But I have got to give him the freedom to explore that question himself.
00:42:13It is the ruling of this court that Lieutenant Commander Data has the freedom to choose.
00:42:33I formally refuse to undergo your procedure.
00:42:37I will cancel that transfer order.
00:42:40Thank you.
00:42:43And, Commander, continue your work.
00:42:46When you are ready, I will still be here.
00:42:50I find some of what you propose... intriguing.
00:43:03He's remarkable.
00:43:05You didn't call him "it."
00:43:17You see, sometimes it does work.
00:43:24Phillipa.
00:43:29Dinner?
00:43:32You buying?
00:43:48Sir, there is a celebration on the holodeck.
00:43:52I have no right to be there.
00:43:54Because you failed in your task?
00:43:56No, God, no.
00:43:58I came that close to winning, Data.
00:44:00Yes, sir.
00:44:01I almost cost you your life.
00:44:03Is it not true... that had you refused to prosecute
00:44:08Captain Louvois would have ruled summarily against me?
00:44:11Yes.
00:44:13That action injured you and saved me.
00:44:18I will not forget it.
00:44:24You're a wise man, my friend.
00:44:26Not yet, sir but, with your help, I am learning.
00:44:42♪♪