The Offspring

00:00:02

Captain's Log, Stardate 43657.0.

00:00:06

While Commander Riker is away on personal leave

00:00:08

the Enterprise has traveled to Sector 396

00:00:11

to begin charting the Salebi Asteroid Belt.

00:00:18

He sent for you, too?

00:00:19

Yes. He was very mysterious.

00:00:21

Do you have any idea what this is about?

00:00:23

Something happened at that cybernetics conference.

00:00:25

Since he's come back, he's spent every off-duty minute in that lab.

00:00:28

It's not like Data to be so secretive.

00:00:30

And cautious.

00:00:32

He's kept that lab locked every minute.

00:00:33

Now, how would you know that?

00:00:35

Uh-huh.

00:00:42

DATA: Oh, you are early.

00:00:46

One moment, please.

00:01:06

DATA: You may enter now.

00:01:14

Come on, Data.

00:01:15

What is this?

00:01:16

Yeah, Data, what's going on?

00:01:19

I have invited you here to meet someone.

00:01:34

DATA: This is Lal.

00:01:40

Lal...

00:01:41

Say hello to Counselor Deanna Troi.

00:01:47

Hello, Counselor Deanna Troi.

00:01:49

How do you do, Lal?

00:01:52

I am functioning within normal parameters.

00:01:56

Lal... this is Geordi La Forge.

00:02:00

Purpose for exterior drapings, Father?

00:02:03

It is an accepted custom that we wear clothing.

00:02:07

Data, it called you "Father."

00:02:11

Yes, Wesley, Lal is my child.

00:02:47

Space, the final frontier.

00:02:52

These are the voyages of the Starship Enterprise.

00:02:56

Its continuing mission--

00:02:58

to explore strange new worlds...

00:03:01

to seek out new life and new civilizations...

00:03:06

to boldly go where no one has gone before.

00:04:11

Captain's Log, supplemental.

00:04:13

I have just been advised of a highly unusual project

00:04:15

undertaken by Commander Data.

00:04:18

Lal has a positronic brain, one very similar to my own.

00:04:22

I began programming it at the cybernetics conference.

00:04:25

LA FORGE: But nobody's ever been able to do that, Data.

00:04:27

At least, not since you were programmed.

00:04:29

DATA: True, but there was a new sub-micron matrix-transfer technology introduced at the conference which I discovered could be used to lay down complex neural net pathways.

00:04:38

So, you did a transfer from your brain into Lal's?

00:04:41

Exactly, Wesley.

00:04:43

I realized for the first time that it was possible to continue Dr. Soong's work.

00:04:48

My initial transfers produced very encouraging results.

00:04:52

So I brought Lal's brain back with me.

00:04:54

PICARD: Data...

00:04:58

I would like to have been consulted.

00:05:03

I have not observed anyone else on board consulting you about their procreation, Captain.

00:05:07

( sighs deeply )

00:05:11

Why didn't you give it a more human look, Data?

00:05:14

I have decided to allow my child to choose its own sex and appearance.

00:05:19

Commander Data, at your convenience

00:05:21

I would like to talk with you in my ready room.

00:05:22

Counselor.

00:05:29

I insist we do whatever we can to discourage the perception of this new android as a child.

00:05:34

It is not a child.

00:05:35

It is an invention, albeit an extraordinary one.

00:05:38

Why should biology rather than technology determine whether it's a child?

00:05:43

Data has created an offspring-- a new life out of his own being.

00:05:47

To me, that suggests a child.

00:05:49

If he wishes to call Lal his child then who are we to argue?

00:05:53

Well, if he must, but I fail to understand how a five-foot android with heuristic learning systems and the strength of ten men can be called a child.

00:06:01

You've never been a parent.

00:06:11

What you have done will have serious ramifications.

00:06:15

I am truly dismayed that you told no one of what you were doing.

00:06:21

I am sorry, Captain.

00:06:23

I did not anticipate your objections.

00:06:25

Do you wish me to deactivate Lal?

00:06:28

It's a life, Data!

00:06:30

It can't be activated and deactivated simply.

00:06:34

This is a most... stupendous undertaking.

00:06:40

Have you any idea what will happen when Starfleet learns about this?

00:06:44

I have followed all of Starfleet's regulations to the best of my ability.

00:06:49

I expected they would be pleased.

00:06:52

( sighs deeply )

00:06:54

Well... you have taken on quite a responsibility, Data.

00:06:59

To prepare, I have scanned all available literature on parenting.

00:07:04

There seems to be much confusion on this issue.

00:07:08

One traditional doctrine insists "spare the rod and spoil the child," suggesting a punitive approach, while another, more liberal attitude would allow the child enormous freedom.

00:07:19

Data... And what Klingons do to their children...

00:07:22

Data, I am not talking about parenting!

00:07:24

I am talking about the extraordinary consequences of creating a new life!

00:07:28

Does that not describe becoming a parent, sir?

00:07:39

Data, you are seeking to achieve what only your own creator has been able to achieve-- to make another functioning sentient android-- to make another Data.

00:07:56

That is why I must attempt this, sir.

00:07:59

I have observed that in most species there is a primal instinct to perpetuate themselves.

00:08:06

Until now, I have been the last of my kind.

00:08:10

If I were to be damaged or destroyed

00:08:13

I would be lost forever.

00:08:15

But if I am successful with the creation of Lal my continuance is assured.

00:08:22

I understand the risks, sir, and I am prepared to accept the responsibility.

00:08:36

( woman chuckling )

00:08:37

Gender female.

00:08:39

That's right, just like me.

00:08:42

Gender male.

00:08:44

Correct.

00:08:46

And I am gender... neuter.

00:08:48

Inadequate.

00:08:51

That is why you must choose a gender, Lal, to complete your appearance.

00:08:55

What are criteria?

00:08:56

Access your data bank on sexuality, level two.

00:09:00

That will define the parameters.

00:09:02

Whatever you decide will be yours for your lifetime.

00:09:05

It's a decision that will affect how people interrelate with you.

00:09:09

I choose your sex and appearance.

00:09:12

No, Lal.

00:09:13

That would be confusing.

00:09:14

We are taking you to the holodeck to show you several thousand composites I have programmed.

00:09:20

You may choose from them.

00:09:22

Several thousand?

00:09:24

This is a big decision.

00:09:33

DATA: Counselor?

00:09:36

Lal has narrowed the choices to four.

00:09:38

Would you like to see?

00:09:39

Yes.

00:09:40

Yes, of course, Data.

00:09:44

Computer, Lal-- gender sequence finalists, begin.

00:09:50

An Andorian female.

00:09:52

Interesting.

00:09:54

You'll be the only one on board the Enterprise, Lal.

00:09:57

Hmm. That could make socialization more difficult.

00:10:03

A human male.

00:10:05

Very attractive.

00:10:06

There's no problem with socialization here.

00:10:10

A human female.

00:10:12

I like her.

00:10:13

A Klingon male.

00:10:15

A friend for Worf.

00:10:18

They're all very interesting.

00:10:20

Do you have a favorite?

00:10:22

Yes, I have chosen.

00:10:30

I have completed assembly of the replicated anatomy.

00:10:33

I was able to provide Lal with more realistic skin and eye color than my own.

00:10:43

Congratulations, Data.

00:10:45

It's a girl.

00:10:56

This is home, Lal.

00:10:58

Home.

00:11:00

Place of residence.

00:11:02

Social unit formed by a family living together.

00:11:06

Yes.

00:11:07

We are a family, Lal.

00:11:11

Chair.

00:11:16

To sit in.

00:11:19

Sit.

00:11:25

Good.

00:11:27

Painting.

00:11:30

Painting.

00:11:32

Colors produced on a surface by applying a pigment.

00:11:35

Yes.

00:11:36

I will teach you to recognize the artistry in paintings.

00:11:44

Soft.

00:11:45

Yes, very good, Lal.

00:11:48

You have correctly processed the sense of touch.

00:11:52

There are many fascinating experiences

00:11:55

I wish to share with you.

00:11:57

Painting.

00:11:59

No.

00:12:00

That is a flower, Lal.

00:12:04

Inhale.

00:12:05

( sniffs )

00:12:07

( sniffs )

00:12:09

Smell!

00:12:10

Yes.

00:12:11

Show me more, Father.

00:12:13

DATA: Second Officer, Science Log, supplemental.

00:12:15

Training in social skills

00:12:17

at the most elementary level has begun.

00:12:20

Lal is progressing very slowly

00:12:22

but is not deterred by early setbacks.

00:12:29

While motor coordination has improved 12 percent,

00:12:32

reflexes still need to develop.

00:12:39

Visual comprehension is especially difficult for Lal.

00:12:43

Translating her vast data banks into recognizable applications

00:12:47

may improve with additional transfers.

00:12:49

She is also learning

00:12:51

to supplement her innate android behavior

00:12:53

with simulated human responses.

00:13:02

And it is interesting to note

00:13:04

that as I observe Lal learning about her world,

00:13:06

I share in her experience, almost as though

00:13:09

I am learning things over again.

00:13:27

The transfer itself is fairly simple.

00:13:30

Each neural pathway in my brain is duplicated precisely in hers.

00:13:34

Theoretically, the duplicate brains should be able to store and process the same information, but until all of the transfers are complete we will not know for certain.

00:13:42

What does Lal do when you're on duty?

00:13:44

She studies in our quarters.

00:13:46

She requires very little supervision.

00:13:48

Lal is quite self-sufficient.

00:13:50

Have you considered sending her to school?

00:13:52

She already has access to the sum of human knowledge-- from me.

00:13:57

Data, she could learn a lot by being with children her own age.

00:14:00

She is only two weeks old.

00:14:02

Okay, close to her own age.

00:14:05

CRUSHER: Dr. Crusher to Ensign Crusher.

00:14:07

Aren't you supposed to be getting a haircut, Wesley?

00:14:11

I'm on my way.

00:14:14

Parents...

00:14:16

Nothing personal.

00:14:24

Lal, the third cross-link transfer series is complete.

00:14:31

Father, what is my purpose?

00:14:34

Purpose?

00:14:36

My function.

00:14:38

My reason for being.

00:14:40

That is a complex question, Lal.

00:14:45

I can only begin to answer by telling you that our function is to contribute in a positive way to the world in which we live.

00:14:55

Why am I me, instead of someone else?

00:14:58

Because you are my child.

00:15:01

Where did I come from?

00:15:04

These questions...

00:15:05

These questions suggest that we have made a successful transfer of the heuristic associative pathways.

00:15:13

You will now begin to process information on logic, aesthetics, metaphysics and epistemology.

00:15:20

You are truly becoming sentient, Lal.

00:15:23

How?

00:15:25

By developing the awareness to question and examine your perceptions.

00:15:32

Why do we have two hands?

00:15:33

Why not three or four?

00:15:35

Why is the sky black?

00:15:37

Why do...?

00:15:38

( click )

00:15:42

Tomorrow will be your first day of school, Lal.

00:15:54

I assure you, Admiral, there is no better guide into this life for Lal than Data.

00:15:58

He's doing an excellent job.

00:16:00

We all have enormous admiration for what Commander Data has already achieved, but we have superior facilities and personnel here at Galor IV.

00:16:08

A starship is hardly the proper setting for...

00:16:11

This starship's mission is to seek out new life, and that is exactly what Commander Data is doing, under my guidance.

00:16:16

We all want what's best for the new android.

00:16:19

As do I.

00:16:21

( sighs )

00:16:24

I would be willing to consider releasing Lal and Data to you so that he may continue his work with her.

00:16:31

His presence would undoubtedly retard the new android's progress.

00:16:35

Admiral, to you, Lal is a new android.

00:16:40

But, to Data... she's his child.

00:16:44

His child?

00:16:45

Yes, Admiral-- it may not be easy for you and me to see her that way, but he does, and I respect that.

00:16:55

They will remain here for now.

00:16:57

Starfleet's policy on research is clear.

00:17:00

You're making your stand on very uncertain ground.

00:17:04

I do hope it doesn't fall out from under you.

00:17:07

Haftel out.

00:17:14

She achieved a very high score on a test of academic achievement.

00:17:18

A perfect score.

00:17:20

Yes, which is why we started her out with the older children, but Lal couldn't understand the nuances of how they related to each other.

00:17:29

I see.

00:17:31

We decided the best thing to do would be to put her with younger children.

00:17:35

That would seem to be reasonable.

00:17:38

It isn't working out that way.

00:17:48

The children are afraid of her.

00:17:56

Father, what is the significance of laughter?

00:18:01

It is a human physiological response to humor.

00:18:05

Then, judging from their laughter, the children at school found my remarks humorous.

00:18:10

So without understanding humor, I have somehow mastered it.

00:18:17

Deck 15.

00:18:23

Lal.

00:18:26

Yes, Father?

00:18:27

The children were not laughing with you.

00:18:29

They were laughing at you.

00:18:31

Explain.

00:18:33

One is meant kindly, the other is not.

00:18:39

Why would they wish to be unkind?

00:18:41

Because you are different.

00:18:43

Differences sometimes scare people.

00:18:46

I have learned that some of them use humor to hide their fear.

00:18:51

I do not wish to be different.

00:19:02

Doctor?

00:19:04

I require your advice as a successful parent.

00:19:08

Uh! W-Well, thank you, Data.

00:19:11

I'd like to think I was.

00:19:14

Well, please sit down.

00:19:22

How's Lal?

00:19:25

Lal is realizing she is not the same as other children.

00:19:30

Is it lonely for her?

00:19:32

She does not feel the emotion of loneliness, but she can observe how isolated she is from the others.

00:19:38

She wishes to be more like them.

00:19:40

I do not know how to help her.

00:19:43

Lal is passing into sentience.

00:19:45

It is perhaps the most difficult stage of her development.

00:19:49

When... when Wesley was growing up he was an extraordinarily bright boy, but he had a hard time making friends.

00:20:02

I think the other children were a little intimidated by him.

00:20:06

That is precisely what happened to Lal in school.

00:20:10

How did you help him?

00:20:12

Well... first, I went back to my own childhood and remembered how painful it was for me because... I remember a time when I wasn't very popular, either.

00:20:27

And when I told that to Wesley it made him feel a little better.

00:20:31

He knew I understood what he was going through.

00:20:33

I have not told Lal how difficult it was for me to assimilate.

00:20:37

I did not wish to discourage her.

00:20:39

Perhaps that was an error in judgment.

00:20:42

You didn't have anyone with experience to help you through sentience.

00:20:47

She, at least, has you.

00:20:51

Just help her realize she's not alone and... be there to nurture her when she needs love and attention.

00:21:02

I can give her attention, Doctor, but I am incapable of giving her love.

00:21:10

Now, why do I find that so hard to believe?

00:21:16

( rapid beeping )

00:21:19

WORF: Captain, incoming signal.

00:21:22

Starfleet priority one.

00:21:24

Admiral Haftel.

00:21:26

On my monitor, Lieutenant.

00:21:32

Admiral.

00:21:33

HAFTEL: Captain Picard, I hope I didn't disturb you.

00:21:36

Not at all.

00:21:38

I have discussed my concerns with Starfleet Command.

00:21:41

You are to hold your position until I join you.

00:21:45

And I shall personally review the android's development.

00:21:49

Understood.

00:21:50

I should advise you, Captain, that if I'm not satisfied with what I see,

00:21:54

I am empowered to take the android back with me.

00:21:57

Haftel out.

00:22:05

Captain's Log, supplemental.

00:22:07

We are holding position pending the arrival

00:22:08

of Admiral Haftel from Starfleet Research.

00:22:11

Commander Data is completing his final neural transfers

00:22:15

to the android he has named "Lal,"

00:22:18

which, I have learned, in the language Hindi

00:22:20

means "beloved."

00:22:24

That looks very good.

00:22:26

Thanks, Guinan.

00:22:29

Hello, Data.

00:22:30

Guinan.

00:22:39

Lal, how are you?

00:22:41

I am functioning within normal...

00:22:44

I am fine, thank you.

00:22:46

Good.

00:22:47

Guinan, Lal needs to observe human behavior.

00:22:50

Well, she's in the right place for it.

00:22:53

And, for this opportunity she is willing to provide services to assist you.

00:22:57

Ah.

00:22:58

Father says I would learn a great deal from working with someone as old as you.

00:23:05

You're hired.

00:23:08

The most important part about working someplace like this is the art of listening.

00:23:12

I have some expertise, so I shall teach you.

00:23:15

That would be most beneficial.

00:23:17

I've been programmed with a listing of 1,412 known beverages.

00:23:21

What did you say?

00:23:22

I've been programmed with a listing of 1,400--

00:23:25

"I've"?

00:23:26

You have used a verbal contraction.

00:23:28

You said "I've" instead of "I have."

00:23:31

It is a skill my program has never mastered.

00:23:33

Then I will desist.

00:23:34

No.

00:23:35

You have exceeded my abilities.

00:23:38

I do not object, but...

00:23:39

I do not understand how this has occurred.

00:23:42

PICARD: Picard to Commander Data.

00:23:44

Please report to my ready room.

00:23:45

Aye, sir.

00:23:57

I am certain the Admiral is anxious to meet Lal.

00:24:01

I have been sending him regular status reports on her development.

00:24:05

His visit is not just an inspection of Lal's progress.

00:24:09

He has expressed a concern for her environment.

00:24:13

Her environment, sir?

00:24:14

He believes the Daystrom Annex on Galor IV would be more suitable.

00:24:21

Then he wishes to relocate us.

00:24:24

Not you, Data.

00:24:27

Just her.

00:24:31

I would not be in favor of that, sir.

00:24:33

There are many things that she can learn only from me-- my lifetime of experiences, the mistakes I have made and what I have learned from them...

00:24:42

WORF: Captain, Commander Riker's shuttle has just returned.

00:24:45

Acknowledged, Lieutenant.

00:24:47

Will you advise Commander Riker

00:24:49

I will meet with him in one hour.

00:24:50

Picard out.

00:24:52

The Admiral is taking the position that Lal's development should be overseen by the most experienced personnel.

00:25:05

Then he is questioning my ability as a parent.

00:25:08

In a manner of speaking.

00:25:11

Does the Admiral have children?

00:25:14

Yes, I believe he does, Data.

00:25:18

Why?

00:25:19

I am forced to wonder how much experience he had as a parent when his first child was born.

00:25:35

You see?

00:25:36

What are they doing?

00:25:38

It's called flirting.

00:25:40

They seem to be communicating telepathically.

00:25:43

They're both thinking the same thing if that's what you mean.

00:25:52

Guinan, is the joining of hands a symbolic act for humans?

00:25:57

It shows affection.

00:25:59

Humans like to touch each other.

00:26:01

They start with the hands and go from there.

00:26:06

He's biting that female!

00:26:08

No, he's not biting her.

00:26:09

They're pressing lips.

00:26:10

It's called kissing.

00:26:16

Why are they leaving?

00:26:19

Lal, there are some things your father's just going to have to explain to you when he thinks you're ready.

00:26:33

You're new around here, aren't you?

00:26:36

Yes.

00:26:40

Lal? Lal!

00:26:41

Put him down.

00:26:44

DATA: Commander?

00:26:45

What are your intentions toward my daughter?

00:26:48

Your daughter?

00:26:53

Nice to meet you.

00:27:05

I watch them and I can do the things they do, but I will never feel the emotions.

00:27:13

I'll never know love.

00:27:15

It is a limitation we must learn to accept, Lal.

00:27:19

Then why do you still try to emulate humans?

00:27:22

What purpose does it serve except to remind you that you are incomplete?

00:27:27

I have asked myself that many times as I have struggled to be more human.

00:27:36

Until I realized it is the struggle itself that is most important.

00:27:42

We must strive to be more than we are, Lal.

00:27:45

It does not matter that we will never reach our ultimate goal.

00:27:50

The effort yields its own rewards.

00:27:53

You are wise, Father.

00:27:57

It is the difference between knowledge and experience.

00:28:04

I learned today that humans like to hold hands.

00:28:08

It is a symbolic gesture of affection.

00:28:37

No objective viewpoint could see it any other way.

00:28:40

Forgive me, Admiral, I thought you were sent here to form an opinion, not to justify one.

00:28:46

Captain, let's not make this any more difficult than it needs to be, hmm?

00:28:51

I see no need for it to be difficult at all.

00:28:54

I understand your concerns.

00:28:55

What I'm asking for is time, patience.

00:28:58

If you have an open mind

00:28:59

I'm sure you will see that it is imperative that Data and Lal be kept together during the formative stages of her development.

00:29:06

After that, I have no doubt Commander Data will be delighted to deliver her to Starfleet Research.

00:29:11

That's not satisfactory.

00:29:13

If mistakes are made, the damage that's done might be irreparable.

00:29:19

I'm convinced the damage will be irreparable if they're separated.

00:29:24

Captain, are we talking about breaking up a family?

00:29:29

Isn't that rather a sentimental attitude about androids?

00:29:36

They're living, sentient beings.

00:29:38

Their rights and privileges in our society have been defined.

00:29:41

I helped define them.

00:29:43

Yes, Captain, and I am more than willing to acknowledge that.

00:29:47

What you must acknowledge is that Lal may be a technological step forward in the development of artificial intelligence.

00:29:56

A most significant step.

00:29:58

Yes, and work like this demands to be done with controlled procedures.

00:30:01

Which Commander Data is following.

00:30:03

In effective isolation, and that is what Starfleet Research finds unacceptable.

00:30:14

So Lal now possesses the sum of my programming.

00:30:17

Her neural nets are laid down identically to yours?

00:30:21

There do seem to be some variations on the quantum level.

00:30:24

She can use contractions.

00:30:26

I cannot.

00:30:27

An aberration.

00:30:28

What have you done about this?

00:30:31

I have maintained records on positronic matrix activity, behavioral norms, and all verbal patterns.

00:30:36

I have seen no other evidence of aberrations.

00:30:39

It would seem you have actually improved upon yourself, Data.

00:30:42

Is that not the goal of every parent, sir?

00:30:45

But, as a good father, don't you think it would be better, especially in the light of this new aberration, if Lal were close to people trained in diagnostic and evaluative procedures?

00:30:57

I am programmed with the procedures you mentioned, sir, and in any meaningful evaluation of Lal you would require a model for a basis of comparison.

00:31:07

I am the only model available, Admiral.

00:31:09

You haven't mastered human cultural and behavioral norms yourself yet, have you?

00:31:15

No, sir.

00:31:16

Where is Lal now?

00:31:28

This is your idea of appropriate guidance?

00:31:33

It is an opportunity for her to observe human behavior, and more importantly, for her to interact with her crewmates.

00:31:41

Thank you.

00:31:44

She is capable of running over 60 trillion calculations a second, and you have her working as a cocktail waitress.

00:31:51

Admiral, she is under the strict guidance of a woman in whom I have absolute trust.

00:31:57

Ten-Forward is the center of the ship's social activity.

00:32:00

Everyone on board comes here.

00:32:02

I'm not convinced the sort of behavior she observes here will be a positive influence.

00:32:06

Well, most people when they come in here behave themselves, and when they don't

00:32:09

I ask them to leave.

00:32:11

Admiral Haftel, Guinan.

00:32:14

She runs Ten-Forward.

00:32:15

How is Lal doing?

00:32:17

Oh, she spills a few drinks every now and then, but she's learning.

00:32:20

( woman laughing )

00:32:21

No, I think I've had enough, thank you.

00:32:24

Excuse me.

00:32:25

I want that android out of here.

00:32:27

Now, Admiral, you've been in one or two bars in your time.

00:32:31

Have her report to me immediately for an interview.

00:32:54

Well, Lal, I've been looking forward to meeting you.

00:32:57

Why?

00:32:59

You're very important to us at Starfleet Research.

00:33:02

We have quite a facility at Galor IV.

00:33:05

I want to show it to you.

00:33:07

In fact, the Admiral is suggesting you be moved to Galor IV, Lal.

00:33:11

Have I done something wrong?

00:33:13

Oh, no, of course not.

00:33:15

We just want to broaden your experience.

00:33:17

There's only so much you can learn on a starship.

00:33:20

I'm sure you'll agree to that.

00:33:22

Yes, I'll agree.

00:33:23

Good.

00:33:25

Thus, the natural conclusion would be, when I have learned all there is to learn aboard the starship,

00:33:30

I would relocate to Galor IV.

00:33:32

That is not the natural conclusion here.

00:33:35

I believe it is.

00:33:38

You see, Lal, the Admiral is concerned that you need more... guidance than your father can provide here on the Enterprise.

00:33:47

Yes, don't misunderstand me.

00:33:49

I have great respect for your father.

00:33:52

You do not speak with respect.

00:33:54

She seems very adversarial.

00:33:57

I'm merely stating a fact, Admiral.

00:34:00

I don't think your father has taught you selective judgment in the verbalization of your own thoughts.

00:34:08

Now, that is a skill we will help you develop.

00:34:11

My father is already helping me, sir.

00:34:14

The question is, has he helped you enough?

00:34:21

Are you asking me, sir?

00:34:22

No, I didn't mean to ask...

00:34:24

Why don't we, Admiral?

00:34:26

In all these discussions, no one has ever mentioned her wishes.

00:34:31

She's a free, sentient being.

00:34:33

PICARD: What are your wishes, Lal?

00:34:35

I wish to remain here, Captain Picard.

00:34:44

Thank you, Lal.

00:34:46

You're excused.

00:35:00

( door chimes )

00:35:01

Come in.

00:35:08

Hello, Lal.

00:35:11

How are you?

00:35:13

Troi...

00:35:14

Admiral... Admiral...

00:35:17

An admiral from Starfleet has come to take me away, Troi.

00:35:23

I am scared.

00:35:25

You are scared, aren't you?

00:35:31

I feel it.

00:35:35

How is this possible?

00:35:37

I don't know.

00:35:45

This is what it means to feel.

00:35:51

This is what it means... "feel."

00:36:06

You have gotten Lal off to a wonderful start in life, Commander.

00:36:12

And that's what being a parent is all about.

00:36:16

However, I have finally decided that I must ask you to release her to me.

00:36:21

May I ask why, sir?

00:36:23

All other arguments aside, there's one that is irrefutable.

00:36:27

There are only two Soong-type androids in existence.

00:36:31

It would be very dangerous to have you both in the same place, especially aboard a starship.

00:36:36

One lucky shot by a Romulan, we'd lose you both.

00:36:40

Admiral, that is a fine argument, but... it doesn't change my feeling that the proper place for Lal to develop is by Data's side.

00:36:53

You're not a parent, Captain.

00:36:55

I am.

00:36:57

I have learned, with difficulty, that there comes a time when...

00:37:04

There comes a time when all parents must give up their children for their own good.

00:37:09

But this is not the time.

00:37:11

Damn it, even I can see the... the umbilical cord is virtually uncut.

00:37:18

The child...

00:37:23

The child depends on him.

00:37:27

Mr. Data, it would be better for Lal if she left knowing that you had voluntarily decided that this was the best course of action.

00:37:39

Admiral, when I created Lal, it was in the hope that someday she would choose to enter the academy and become a member of Starfleet.

00:37:49

I wanted to give something back in return for all that Starfleet has given me.

00:37:54

I still do.

00:37:57

But Lal is my child.

00:38:00

You ask that I volunteer to give her up.

00:38:03

I cannot.

00:38:05

It would violate every lesson

00:38:07

I have learned about human parenting.

00:38:10

I have brought a new life into this world.

00:38:13

And it is my duty, not Starfleet's to guide her through these difficult steps to maturity, to support her as she learns, to prepare her to be a contributing member of society.

00:38:28

No one can relieve me from that obligation, and I cannot ignore it.

00:38:35

I am... her father.

00:38:45

Then I regret that I must order you to transport Lal aboard my ship.

00:38:53

Belay that order, Mr. Data.

00:38:55

I beg your pardon.

00:38:57

I will take this to Starfleet myself.

00:39:00

I am Starfleet, Captain.

00:39:04

Proceed, Commander.

00:39:05

Hold your ground, Mr. Data.

00:39:06

Captain, you are jeopardizing your command and your career.

00:39:13

There are times, sir, when men of good conscience cannot blindly follow orders.

00:39:23

You acknowledge their sentience but you ignore their personal liberties and freedom.

00:39:31

( sighs )

00:39:34

Order a man to hand his child over to the state?

00:39:38

Not while I am his Captain.

00:39:41

If you wish, you can accompany us to Starfleet, where we shall...

00:39:45

TROI: Troi to Commander Data.

00:39:47

Report to your lab at once.

00:39:50

Acknowledged, Counselor.

00:39:51

He's on his way. Is there a problem?

00:39:54

Yes, Captain, something is terribly wrong with Lal.

00:40:01

TROI: It lasted barely a moment.

00:40:03

She experienced fear and confusion and then, for no apparent reason she walked out of my quarters.

00:40:09

She didn't say another word.

00:40:11

She just started walking here and each step became more and more difficult.

00:40:15

Lal is programmed to return to the lab in the event of a malfunction.

00:40:20

Father?

00:40:21

Yes, Lal, I am here.

00:40:24

A malfunction... emotional awareness.

00:40:28

It appears to be a symptom of cascade failure.

00:40:31

It would require initialization of the base matrix without wiping out higher functions.

00:40:36

I agree.

00:40:39

May I assist?

00:40:42

Thank you, Admiral.

00:40:44

If you'll excuse us,

00:40:45

Commander Data and I have much to do.

00:40:53

( door opens and closes )

00:40:56

( instrument whirring )

00:41:03

( click )

00:41:07

( beeping )

00:41:19

( door opens )

00:41:27

She... she won't survive much longer.

00:41:31

There was nothing anyone could have done.

00:41:35

We'd repolarize one pathway, and another would collapse.

00:41:42

And then another.

00:41:45

His hands... were moving faster than I could see, trying to stay ahead of each breakdown.

00:41:54

He refused to give up.

00:41:56

He was remarkable.

00:42:03

It just... wasn't meant to be.

00:42:13

Lal, I am unable to correct the system failure.

00:42:20

I know.

00:42:22

We must say good-bye now.

00:42:28

I feel...

00:42:32

What do you feel, Lal?

00:42:35

I love you, Father.

00:42:46

I wish I could feel it with you.

00:42:49

I will feel it for both of us.

00:42:56

Thank you for my life.

00:43:03

Flirting.

00:43:06

Laughter.

00:43:08

Painting, family.

00:43:13

Female.

00:43:15

Human...

00:43:24

Lal suffered complete neural system failure at 1300 hours.

00:43:30

I have deactivated the unit.

00:43:34

The crew is saddened by your loss, Mr. Data.

00:43:40

I thank you for your sympathy, but... she is here.

00:43:47

Her presence so enriched my life that I could not allow her to pass into oblivion.

00:43:53

So I incorporated her programs back into my own.

00:43:57

I have transferred her memories to me.

00:44:04

Mr. Data, will you take your position.

00:44:10

Mr. Crusher, lay in a course for the Starbase on Otar II.

00:44:19

WESLEY: Course is set, sir.

00:44:22

PICARD: Engage.