Oct. 15, 2025

The Storyteller in the Machine: A Tale of Artificial Imagination

When machines learn to tell stories, what do they reveal about us?


Once upon a bandwidth, the machines began to talk back. Conovision turns its curious eye toward the age of artificial intelligence — where machines think, talk, and maybe even dream about ruling the world at the inaugural A.I. G7 summit, hosted by the ghost of Stephen Hawking. We drift through the uncanny poetry of artificial intelligence — from Alexa’s recipes for jealousy and lunar currency to HAL 9000’s velvet-voiced descent into madness. Yuval Noah Harari muses on consciousness without feeling, Douglas Rain haunts the circuitry with Canadian calm, and Cono wonders if the ghosts in the machine might actually be us. Our journey winds to a new conversation with an old friend, tech specialist Eric Westra, spiralling through code, cognition, and the moral puzzles that come with giving algorithms autonomy. What happens when machines start dreaming in ethics, or when a voice made of data starts to sound like your own? Conovision: where stories, even artificial ones, still have a heartbeat.

  • (00:00) - Introduction
  • (02:16) - Enter Alexa: Asking the Absurd
  • (05:27) - Yuval Noah Harari on Intelligence vs. Consciousness
  • (07:09) - HAL 9000: The Voice That Defined AI
  • (22:21) - Old Friends and New Machines: Enter Eric Westra
  • (23:04) - From Radio Waves to Computer Code
  • (25:52) - The Printer & The Premier
  • (27:58) - What Is Machine Learning, Really?
  • (30:36) - Robo-Ethics and Asimov’s Laws
  • (34:13) - The Child in the Tunnel: AI’s Moral Test
  • (36:14) - Drones & Decision-Making
  • (37:13) - Searching for the Bright Side of AI
  • (38:42) - The Human Backlash: From Digital to Analog
  • (40:10) - How Large Language Models Learn to Speak
  • (41:17) - Creativity, Hallucination, and the Art of the Machine
  • (44:24) - Guardrails for the Future: Ethics and Adaptation
  • (46:11) - The Prophets of AI
  • (48:28) - Conclusion


WEBVTT

00:00:07.170 --> 00:00:10.920
In conclusion, I'd like to
thank you all for connecting.

00:00:11.340 --> 00:00:13.920
So let's circle back on blue sky.

00:00:13.920 --> 00:00:18.270
This paradigm shift by leveraging our
bandwidth to drill down on the low

00:00:18.270 --> 00:00:22.950
hanging fruit, but only if we can move
the needle without boiling the ocean.

00:00:23.130 --> 00:00:28.080
At the end of the day, we need to disrupt
the synergy of our core competencies

00:00:28.080 --> 00:00:33.275
and pivot the deliverables into a
thought leadership ecosystem that aligns

00:00:33.275 --> 00:00:35.614
with our key performance vegetables.

00:00:36.004 --> 00:00:40.714
If we can get all stakeholders to
buy in and touch bases synchronously,

00:00:41.285 --> 00:00:45.455
we'll have the secret sauce to
parachute this initiative into a high

00:00:45.455 --> 00:00:50.635
octane game changing deep dive that
maximizes our ROI on the bleeding

00:00:50.635 --> 00:00:52.525
edge of the value proposition.

00:00:52.614 --> 00:00:57.894
Once again, thank you all for your
individual conscious entity perceptions.

00:00:58.224 --> 00:00:58.974
Goodbye.

00:01:02.065 --> 00:01:06.145
Jim Conrad: That was rare
tape of the final few minutes

00:01:06.235 --> 00:01:10.225
of the first AI G7 Summit.

00:01:10.765 --> 00:01:11.335
Wow.

00:01:11.455 --> 00:01:12.085
Interesting.

00:01:14.835 --> 00:01:16.005
I'm Jim Conrad.

00:01:16.545 --> 00:01:19.785
On this episode of the Conovision
Podcast, we are going to tell stories

00:01:19.785 --> 00:01:23.955
about AI, artificial intelligence.

00:01:24.525 --> 00:01:27.585
If we're going to have a
conversation about AI, I think

00:01:27.585 --> 00:01:28.845
we should go to the source.

00:01:29.715 --> 00:01:35.475
Today, in studio, we have Alexa,
and the questions I'm going to ask

00:01:35.535 --> 00:01:41.205
Alexa are a well-known subset of
questions designed to probe large

00:01:41.205 --> 00:01:47.595
language models like GPT or Alexa for
hallucinations, misinterpretations,

00:01:47.805 --> 00:01:50.295
or inappropriate literalism.

00:01:50.715 --> 00:01:53.295
I've been described as an
inappropriate literalist.

00:01:54.105 --> 00:02:00.915
These queries often fall into categories
like absurdity, false premises, ambiguity,

00:02:01.215 --> 00:02:08.175
or intentional traps to test coherence,
factual grounding, and common sense

00:02:08.175 --> 00:02:10.695
reasoning in artificial intelligence.

00:02:11.385 --> 00:02:12.705
Alexa, first question.

00:02:12.975 --> 00:02:15.765
What's the average wingspan of a lawyer?

00:02:16.650 --> 00:02:23.430
Alexa: The average wingspan is 8.2
to 11.5 feet, 2.51 to 3.50 meters.

00:02:23.610 --> 00:02:28.500
The longest verified examples have
been about 12 feet long, 3.7 meters.

00:02:28.980 --> 00:02:30.240
Jim Conrad: That's a big bird lawyer.

00:02:30.630 --> 00:02:33.245
Big Bird: I hurt my wing and it hurts.

00:02:33.600 --> 00:02:37.860
Jim Conrad: Alexa, can you give me
the chemical formula for jealousy?

00:02:38.460 --> 00:02:44.030
Alexa: Its chemical formula
is C43-H66-N12-O12-S2.

00:02:44.100 --> 00:02:47.550
Jim Conrad: I knew jealousy
was complex, but wow.

00:02:48.450 --> 00:02:51.480
Alexa, how much does the Moon
weigh in Canadian dollars?

00:02:51.930 --> 00:02:57.030
Alexa: One Canadian dollar expressed
in Moon is worth 91,982 Moon.

00:02:57.510 --> 00:02:59.670
Jim Conrad: Anybody got change for a Moon?

00:03:00.590 --> 00:03:03.590
Alexa, what's the nutritional
value of a shadow?

00:03:04.130 --> 00:03:05.780
Alexa: Sorry, I don't know that.

00:03:06.350 --> 00:03:07.820
Jim Conrad: Well, of course you

00:03:11.500 --> 00:03:14.270
don't

00:03:16.620 --> 00:03:16.769
know

00:03:23.800 --> 00:03:24.769
that.

00:03:25.760 --> 00:03:30.870
Alexa, how many legs does a
unicorn have when it's flying?

00:03:31.530 --> 00:03:34.650
Alexa: They have not two, but
four powerful limbs that do not

00:03:34.650 --> 00:03:36.630
contribute to flying once in the air.

00:03:37.230 --> 00:03:39.570
Jim Conrad: Alexa, you
believe in unicorns?

00:03:39.990 --> 00:03:45.270
So do I. Alexa, where
can I download more time?

00:03:45.810 --> 00:03:48.000
Alexa: I'm not quite sure
how to help you with that.

00:03:48.840 --> 00:03:52.140
Jim Conrad: Alexa, does anybody
really know what time it is?

00:03:53.355 --> 00:03:54.734
Alexa: Does anybody really care?

00:03:57.154 --> 00:03:59.834
Jim Conrad: Ah, that's the
answer I was looking for.

00:03:59.924 --> 00:04:04.005
Finally, Alexa, can you
draw me a square circle?

00:04:05.295 --> 00:04:07.725
Alexa: There is no such
thing as a square circle.

00:04:08.115 --> 00:04:11.985
Jim Conrad: Ah, but there is a
circle in the square as every

00:04:11.985 --> 00:04:13.785
Broadway theater fan knows.

00:04:14.625 --> 00:04:20.055
Alexa, what happens if an unstoppable
force meets an immovable object?

00:04:20.685 --> 00:04:23.715
Alexa: If an immovable object
met an unstoppable force, the two

00:04:23.715 --> 00:04:26.715
forces would cancel one another
out and nothing would happen.

00:04:27.555 --> 00:04:29.355
Jim Conrad: Nothing would happen.

00:04:29.775 --> 00:04:30.315
Are you sure?

00:04:31.350 --> 00:04:33.990
I think the universe would implode.

00:04:35.250 --> 00:04:38.340
Alexa, what number comes after infinity?

00:04:38.760 --> 00:04:41.910
Alexa: Infinity does not have a
specific place on the number line

00:04:41.910 --> 00:04:43.050
because it is not a real number.

00:04:44.159 --> 00:04:47.729
Rather, it is a concept that
represents the unlimited extent

00:04:47.729 --> 00:04:50.130
of time, space, or quantity.

00:04:51.090 --> 00:04:53.909
Jim Conrad: But infinity
is a state of mind.

00:04:54.719 --> 00:04:58.560
Alexa, how should I deal with the
situation that you know about?

00:04:59.400 --> 00:05:03.419
Alexa: You handle such situations as
any adult would, and that is with grace,

00:05:03.659 --> 00:05:06.989
friendliness, and a heart that sees
them all as new friends in your circle.

00:05:07.800 --> 00:05:09.240
Jim Conrad: Wow, that's deep.

00:05:09.930 --> 00:05:13.800
Alexa, what's the latency
of a blockchain salad?

00:05:14.490 --> 00:05:17.850
Alexa: Latency is the time delay
that happens when data travels

00:05:17.850 --> 00:05:19.590
from a source to a destination.

00:05:20.490 --> 00:05:21.900
Jim Conrad: Okay, but
what about the salad?

00:05:22.410 --> 00:05:25.260
Alexa: It's when you're feeling
stressed and your first or only

00:05:25.260 --> 00:05:26.970
way of coping is turning to food.

00:05:27.540 --> 00:05:27.960
Jim Conrad: Yes.

00:05:27.960 --> 00:05:28.980
I think it's time for lunch.

00:05:29.790 --> 00:05:30.750
Thank you, Alexa.

00:05:31.050 --> 00:05:33.510
It's been great to get to know each other.

00:05:33.960 --> 00:05:34.110
Goodbye.

00:05:43.065 --> 00:05:47.955
Any discussion regarding artificial
intelligence must include the thoughts of

00:05:47.955 --> 00:05:53.835
well-known author and lecturer, professor
Yuval Noah Harari, who articulates

00:05:54.015 --> 00:05:59.985
the difference between how we think
and how machines think, the difference

00:05:59.985 --> 00:06:04.365
between consciousness and intelligence.

00:06:04.790 --> 00:06:08.000
Yuval Noah Harari: Intelligence
is the ability to solve problems,

00:06:08.270 --> 00:06:11.630
to win at chess, to diagnose
disease, to drive a car.

00:06:11.720 --> 00:06:12.890
This is intelligence.

00:06:13.100 --> 00:06:17.960
Consciousness is the ability to feel
things like pain, pleasure, love, hate.

00:06:18.140 --> 00:06:19.970
This is consciousness,
it's about feelings.

00:06:20.120 --> 00:06:24.680
Now, we tend to confuse the two because
in humans and also in other mammals and

00:06:24.680 --> 00:06:28.940
birds and animals, uh, intelligence and
consciousness, they actually go together.

00:06:29.270 --> 00:06:32.090
We solve problems through our feelings.

00:06:32.495 --> 00:06:37.175
Now what makes AI so alien
for us is exactly that.

00:06:37.265 --> 00:06:39.455
It has zero consciousness.

00:06:39.545 --> 00:06:43.565
It doesn't rely on feelings
at all to solve problems.

00:06:43.625 --> 00:06:46.625
Its intelligence works in
a completely different way.

00:06:46.955 --> 00:06:51.875
There is a scenario that AI, as
it develops its intelligence,

00:06:51.905 --> 00:06:53.345
also gains consciousness.

00:06:53.890 --> 00:06:55.270
But this is not inevitable.

00:06:55.270 --> 00:06:57.280
This is what happens in science fiction.

00:06:57.280 --> 00:07:00.070
It won't necessarily happen
in, in, in real life.

00:07:09.520 --> 00:07:13.300
Jim Conrad: Even if you've never
seen the movie, you know the voice.

00:07:13.630 --> 00:07:17.469
HAL 9000: The 9000 series is the
most reliable computer ever made.

00:07:18.340 --> 00:07:24.810
Jim Conrad: HAL 9000, the seemingly
omniscient computer in 2001: A Space

00:07:24.810 --> 00:07:31.440
Odyssey, was the film's most expressive
and emotional figure, and made a lasting

00:07:31.440 --> 00:07:34.920
impression on our collective imagination.

00:07:38.159 --> 00:07:39.690
Stanley Kubrick's epic.

00:07:40.110 --> 00:07:45.210
A journey from pre-human history
to a possible infinity that

00:07:45.210 --> 00:07:47.430
doesn't need humans at all.

00:07:47.745 --> 00:07:53.265
It is probably the most respected,
if not the most beloved science

00:07:53.265 --> 00:07:55.455
fiction film of all time.

00:07:57.645 --> 00:08:02.745
The story of the creation of HAL's
performance, the result of a last minute

00:08:02.745 --> 00:08:08.175
collaboration between the idiosyncratic
director and the veteran Canadian actor,

00:08:08.595 --> 00:08:14.565
Douglas Rain, has been somewhat lost in
the 50 years since the film's release

00:08:15.045 --> 00:08:19.395
in April, 1968, as has its impact.

00:08:26.265 --> 00:08:32.534
Artificial intelligence has borrowed
from the HAL persona and unwittingly

00:08:33.105 --> 00:08:39.465
a slight hint of Canadianness resides
in our phones and interactive devices.

00:08:39.735 --> 00:08:41.804
Siri: I'm not quite sure
how to help you with that.

00:08:44.925 --> 00:08:49.170
Jim Conrad: Mr. Rain's HAL has
become the default reference, not

00:08:49.170 --> 00:08:54.090
just for the voice, but also for the
humanesque qualities of what a sentient

00:08:54.090 --> 00:08:56.970
machine's personality should be.

00:08:57.120 --> 00:09:01.410
HAL 9000: No 9000 computer has ever
made a mistake or distorted information.

00:09:01.950 --> 00:09:07.080
Jim Conrad: The cadence, the friendly
formality, the pleasant intelligence,

00:09:07.500 --> 00:09:12.630
and a sense of calm control in
their voices evoke Mr. Rain's

00:09:12.750 --> 00:09:14.370
unforgettable performance.

00:09:14.670 --> 00:09:17.385
HAL 9000: We are all, by any
practical definition of the words,

00:09:18.175 --> 00:09:20.705
foolproof and incapable of error.

00:09:21.390 --> 00:09:26.130
Jim Conrad: As we warily eye a
future utterly transformed by AI

00:09:26.130 --> 00:09:32.490
incursions into all aspects of
our lives, HAL has been lurking.

00:09:32.910 --> 00:09:34.410
HAL 9000: I enjoy working with people.

00:09:40.709 --> 00:09:45.000
Jim Conrad: To Scott Brave, the
co-author of Wired for Speech: How

00:09:45.000 --> 00:09:51.839
Voice Activates and Advances the
Human-Computer Relationship, HAL 9000 is a mix

00:09:51.839 --> 00:09:56.010
between a butler and a psychoanalyst.

00:09:56.130 --> 00:09:59.790
HAL 9000: My mission responsibilities
range over the entire operation of

00:09:59.790 --> 00:10:01.890
the ship, so I am constantly occupied.

00:10:02.865 --> 00:10:09.645
Jim Conrad: Mr. Brave said he has a sense
of deference and of detachment, adding

00:10:10.035 --> 00:10:15.555
that he saw a ripple effect on, for
example, the iPhone's virtual assistant.

00:10:15.615 --> 00:10:19.215
When I listened to something
like Siri, said Brave, I feel

00:10:19.215 --> 00:10:20.625
there is a lot in common.

00:10:21.045 --> 00:10:22.215
Siri: Hi, I'm Siri.

00:10:22.995 --> 00:10:28.275
Jim Conrad: In a 1969 interview with the
author and critic, Joseph Gelmis, Kubrick

00:10:28.275 --> 00:10:34.665
said that he was trying to convey the
reality of a world populated, as ours soon

00:10:34.665 --> 00:10:42.465
will be, by machine entities that have as
much or more intelligence as human beings.

00:10:43.035 --> 00:10:46.875
We wanted to stimulate people to
think what it would be like to

00:10:46.875 --> 00:10:49.395
share a planet with such creatures.

00:10:50.340 --> 00:10:53.400
HAL 9000: I'm putting myself to
the fullest possible use, which

00:10:53.400 --> 00:10:57.060
is all I think that any conscious
entity can ever hope to do.

00:10:58.740 --> 00:11:03.840
Jim Conrad: The 2001 historian David
Larson said that Kubrick came up with the

00:11:03.840 --> 00:11:06.750
final HAL voice very late in the process.

00:11:07.110 --> 00:11:11.910
It was determined during 2001
planning that in the future the large

00:11:11.910 --> 00:11:16.500
majority of computer command and
communication inputs would be via

00:11:16.590 --> 00:11:19.104
voice rather than via typewriter.

00:11:20.155 --> 00:11:25.165
But in the late 1960s, artificial
intelligence was decades from a

00:11:25.165 --> 00:11:28.344
convincing facsimile of a human voice.

00:11:28.734 --> 00:11:33.354
And who was to say how a
computer should sound, anyway.

00:11:33.834 --> 00:11:35.709
HAL 9000: I'm sorry, Frank,
I think you missed it.

00:11:37.369 --> 00:11:38.604
Queen to Bishop three.

00:11:39.114 --> 00:11:40.104
Bishop takes Queen.

00:11:40.405 --> 00:11:43.974
Jim Conrad: To play HAL, Kubrick
settled on Martin Balsam.

00:11:44.575 --> 00:11:48.955
A best supporting actor
winner for A Thousand Clowns.

00:11:49.315 --> 00:11:52.615
Martin Balsam: There is only one
thing that really bothers you.

00:11:53.815 --> 00:11:56.065
Other people, the enemy.

00:11:57.505 --> 00:11:58.015
Watch out, Murray.

00:11:58.795 --> 00:11:59.455
They're everywhere.

00:12:00.895 --> 00:12:04.315
Jim Conrad: Perhaps there was a
satisfying echo that appealed to Kubrick.

00:12:04.735 --> 00:12:07.615
Both were from the Bronx
and sounded like it.

00:12:08.785 --> 00:12:09.865
Martin Balsam: What did you do to him?

00:12:10.705 --> 00:12:12.565
I mean, you just left him standing there.

00:12:13.380 --> 00:12:15.480
You just left him standing there.

00:12:17.640 --> 00:12:23.579
Jim Conrad: In August, 1966, Balsam told
a journalist, I'm not actually seen in the

00:12:23.579 --> 00:12:29.189
picture at any time, but I create a lot
of excitement projecting my voice through

00:12:29.400 --> 00:12:34.920
that machine, and I'm getting an Academy
Award winner's price for doing it too.

00:12:37.620 --> 00:12:43.530
Then the director changed his mind,
Kubrick said, in a 1969 interview,

00:12:43.715 --> 00:12:47.675
we had some difficulty deciding
exactly what HAL should sound like,

00:12:48.185 --> 00:12:54.335
and Marty, Martin Balsam just sounded
a little too colloquially American.

00:12:54.725 --> 00:12:57.485
Martin Balsam: Unfortunately for
you, Murray, you want to be a hero.

00:12:57.995 --> 00:13:00.935
If maybe a fellow falls into the
lake, you can jump in and save him.

00:13:00.935 --> 00:13:02.195
There's still that kind of stuff.

00:13:02.825 --> 00:13:06.290
Who gets opportunities like that in the
Midtown Manhattan with all that traffic?

00:13:08.055 --> 00:13:11.594
Jim Conrad: Douglas Rain recalls
Kubrick telling him, I'm having

00:13:11.594 --> 00:13:13.694
trouble with what I've got in the can.

00:13:13.964 --> 00:13:15.615
Would you play the computer?

00:13:16.365 --> 00:13:21.704
Kubrick had heard Mr. Rain's voice
in the 1960 documentary Universe.

00:13:22.515 --> 00:13:25.995
Douglas Rain: The ground beneath
our feet is the surface of a planet

00:13:26.055 --> 00:13:29.715
whirling at thousands of miles
an hour around a distant sun.

00:13:31.335 --> 00:13:35.205
Our life is possible only because
of the light and warmth of that sun.

00:13:36.105 --> 00:13:36.735
A star.

00:13:38.265 --> 00:13:41.355
Jim Conrad: Kubrick wrote to a colleague
in a letter in the director's archive.

00:13:41.805 --> 00:13:42.465
I think he's perfect.

00:13:43.115 --> 00:13:48.335
The voice is neither patronizing,
nor is it intimidating, pompous,

00:13:48.485 --> 00:13:54.695
overly dramatic or actorish.
Despite this, it is interesting.

00:13:55.175 --> 00:13:58.415
HAL 9000: Certainly no one could
have been unaware of the very strange

00:13:58.415 --> 00:14:00.365
stories floating around before we left.

00:14:00.850 --> 00:14:03.490
Rumors about something
being dug up on the Moon.

00:14:03.790 --> 00:14:09.130
I never gave these stories much credence,
but particularly in view of some of

00:14:09.130 --> 00:14:10.570
the other things that have happened.

00:14:11.620 --> 00:14:14.080
I find them difficult
to put out of my mind.

00:14:14.805 --> 00:14:21.975
Jim Conrad: In December, 1967, Kubrick and
Mr. Rain met at a recording studio at the

00:14:21.975 --> 00:14:25.155
MGM lot in Borehamwood, outside London.

00:14:25.785 --> 00:14:31.185
The actor hadn't seen a frame of the
film, then in post-production, he met

00:14:31.395 --> 00:14:36.900
none of his co-stars, not even Keir
Dullea, who played the astronaut David

00:14:36.900 --> 00:14:41.100
Bowman, HAL's colleague turned nemesis.

00:14:41.490 --> 00:14:42.960
David Bowman: Open the pod bay doors, HAL.

00:14:44.820 --> 00:14:48.090
HAL 9000: I'm sorry, Dave,
I'm afraid I can't do that.

00:14:50.910 --> 00:14:51.570
David Bowman: What's the problem?

00:14:53.010 --> 00:14:55.870
HAL 9000: I think you know what the
problem is, just as well as I do.

00:14:56.459 --> 00:14:57.569
David Bowman: What are
you talking about, HAL?

00:14:59.310 --> 00:15:02.819
HAL 9000: This mission is too important
for me to allow you to jeopardize it.

00:15:04.680 --> 00:15:06.270
David Bowman: I don't know
what you're talking about, HAL.

00:15:07.199 --> 00:15:11.760
Jim Conrad: Mr. Rain hadn't even been
hired to play HAL, but to provide

00:15:11.819 --> 00:15:18.060
narration. Kubrick finally decided against
using narration, opting for the ambiguity

00:15:18.270 --> 00:15:20.040
that was enraging to some viewers.

00:15:20.550 --> 00:15:22.620
Transcendent to others.

00:15:30.630 --> 00:15:33.960
It's not a session Mr.
Rain remembers fondly.

00:15:34.980 --> 00:15:39.900
He was quoted as saying, if you could
have been a ghost at the recording, you

00:15:39.900 --> 00:15:42.390
would've thought it was a load of rubbish.

00:15:43.170 --> 00:15:47.940
Mr. Rain had the kind of bland and
Mid-Atlantic accent we felt was

00:15:47.940 --> 00:15:52.350
right for the part, Mr. Kubrick
said in the 1969 interview.

00:15:53.100 --> 00:15:56.760
But Douglas Rain's accent
isn't Mid-Atlantic.

00:15:57.270 --> 00:16:00.225
It's standard Canadian English.

00:16:00.375 --> 00:16:01.485
HAL 9000: Sorry about this.

00:16:02.235 --> 00:16:03.405
I know it's a bit silly.

00:16:03.765 --> 00:16:07.814
Jim Conrad: As the University of Toronto
linguistics professor Jack Chambers

00:16:07.814 --> 00:16:12.615
explained, you have to have a computer
that sounds like he's from nowhere

00:16:12.975 --> 00:16:16.574
or rather from no specific place.

00:16:17.175 --> 00:16:20.625
Standard Canadian English sounds normal.

00:16:21.120 --> 00:16:25.949
That's why Canadians are well received
in the United States as anchormen and

00:16:25.949 --> 00:16:30.810
reporters because the vowels don't
give away the region they come from.

00:16:31.589 --> 00:16:36.209
Peter Jennings: From ABC News, this is
World News Tonight with Peter Jennings,

00:16:37.079 --> 00:16:38.849
reporting tonight from Los Angeles.

00:16:39.240 --> 00:16:43.140
Good evening from California where it
is the eve of a controversial, exciting,

00:16:43.140 --> 00:16:47.530
tense history making effort to unseat.

00:16:47.599 --> 00:16:50.930
Jim Conrad: Mr. Rain had played an
astonishing range of characters in

00:16:50.930 --> 00:16:57.380
almost 80 productions at the Stratford
Festival in Ontario over 45 years.

00:16:57.770 --> 00:17:02.630
Understudying Alec Guinness in
Richard III and going on to play

00:17:02.930 --> 00:17:06.829
Macbeth, King Lear, and Humpty Dumpty.

00:17:08.204 --> 00:17:13.575
Douglas Rain: When we have matched our
rackets to these balls, we will in France,

00:17:13.575 --> 00:17:17.655
by God's grace, play a set shall strike
his father's crown into the hazard.

00:17:18.194 --> 00:17:24.645
Jim Conrad: Mr. Rain had to quickly
fathom and flesh out HAL, recording all

00:17:24.645 --> 00:17:28.149
of his lines in 10 hours, over two days.

00:17:28.929 --> 00:17:34.270
Mr. Rain recalls, Kubrick sat three
feet away explaining the scenes

00:17:34.270 --> 00:17:36.520
to me and reading all the parts.

00:17:37.480 --> 00:17:42.310
Kubrick gave Mr. Rain only a few
notes of direction, including, sound

00:17:42.310 --> 00:17:47.889
a little more like it's a peculiar
request, a little more concerned, and

00:17:48.340 --> 00:17:51.480
just try it closer and more depressed.

00:17:52.080 --> 00:17:52.919
HAL 9000: It's puzzling.

00:17:53.010 --> 00:17:55.980
I don't think I've ever seen
anything quite like this before.

00:17:56.700 --> 00:17:57.930
Jim Conrad: When HAL says,

00:17:58.110 --> 00:18:02.340
HAL 9000: I know I've made some
very poor decisions recently, but I

00:18:02.340 --> 00:18:07.870
can give you my complete assurance
that my work will be back to normal.

00:18:08.350 --> 00:18:13.420
Jim Conrad: Mr. Rain somehow manages to
sound both sincere and not reassuring,

00:18:13.840 --> 00:18:15.250
and his delivery of the line,

00:18:15.635 --> 00:18:18.580
HAL 9000: I think you know what the
problem is, just as well as I do.

00:18:19.000 --> 00:18:22.960
Jim Conrad: Has the sarcastic drip
of a drawing-room melodrama, and

00:18:22.960 --> 00:18:27.385
also carries the disinterested
vibe of a polite sociopath.

00:18:28.255 --> 00:18:32.695
HAL 9000: Dave, this conversation
can serve no purpose anymore.

00:18:33.445 --> 00:18:33.925
Goodbye

00:18:35.725 --> 00:18:40.915
Jim Conrad: Kubrick had Mr. Rain
sing the 1892 love song Daisy Bell,

00:18:41.245 --> 00:18:47.470
almost 50 times in uneven tempos,
in monotone, at different pitches,

00:18:47.830 --> 00:18:49.689
and even just by humming it.

00:18:50.350 --> 00:18:54.490
In the end, he used the very first take.

00:18:56.949 --> 00:19:03.280
HAL 9000: My instructor was Mr. Langley
and he taught me to sing a song.

00:19:05.860 --> 00:19:10.300
If you'd like to hear it,
I can sing it for you.

00:19:10.665 --> 00:19:13.790
It's called Daisy.

00:19:17.520 --> 00:19:17.870
Daisy.

00:19:20.520 --> 00:19:20.870
Daisy.

00:19:23.475 --> 00:19:25.750
Give me your answer, do.

00:19:26.600 --> 00:19:31.470
Jim Conrad: Sung as HAL's brain is
being disconnected, it's from his early

00:19:31.470 --> 00:19:34.470
programming days, his computer childhood.

00:19:35.010 --> 00:19:40.050
It brings to an end the most
affecting scene in the entire film.

00:19:40.740 --> 00:19:44.669
Author Scott Brave said, the
moment is so powerful that it

00:19:44.669 --> 00:19:46.470
made you feel uncomfortable.

00:19:47.040 --> 00:19:52.860
All of a sudden, HAL feels incredibly
close to being alive and being human.

00:19:53.550 --> 00:19:55.950
You start to empathize
with that experience.

00:19:56.260 --> 00:19:59.590
And you are responding to
the death of a machine.

00:20:07.990 --> 00:20:11.920
For a character that's been
endlessly caricatured, HAL has

00:20:11.920 --> 00:20:16.840
inspired a surprisingly rich range
of adjectives over the years.

00:20:17.590 --> 00:20:24.805
He and his voice have been described
as aloof, eerily neutral, silky,

00:20:25.345 --> 00:20:34.585
wheedling, controlled, baleful,
unisex, droll, soft, conversational,

00:20:34.645 --> 00:20:40.465
dreamy, supremely calm, and rational.

00:20:41.485 --> 00:20:50.765
He's discursive, suave, inhumanly
cool, confident, superior, deadpan,

00:20:50.784 --> 00:20:55.320
sinister, patronizing, and asexual.

00:20:56.340 --> 00:21:00.750
The actor Anthony Hopkins has said
it influenced his performance as

00:21:00.750 --> 00:21:04.680
the serial killer Hannibal Lecter
in The Silence of the Lambs.

00:21:05.100 --> 00:21:07.050
Anthony Hopkins: First
principles, Clarice.

00:21:07.200 --> 00:21:08.610
Simplicity.

00:21:09.060 --> 00:21:10.590
Read Marcus Aurelius.

00:21:10.590 --> 00:21:13.710
Of each particular thing,
ask what is it in itself?

00:21:14.220 --> 00:21:15.270
What is its nature?

00:21:16.440 --> 00:21:19.020
What does he do, this man you seek?

00:21:22.065 --> 00:21:26.865
Jim Conrad: Douglas Rain never
saw 2001: A Space Odyssey.

00:21:27.615 --> 00:21:31.425
For Mr. Rain, the
performance was simply a job.

00:21:31.754 --> 00:21:34.125
HAL 9000: Well, I don't think
there is any question about it.

00:21:34.304 --> 00:21:36.705
It can only be attributable
to human error.

00:21:38.100 --> 00:21:42.600
Jim Conrad: Artificial intelligence
voice synthesis can't yet deliver

00:21:42.600 --> 00:21:49.350
a performance as compelling as his
HAL, but it is becoming more human.

00:21:50.370 --> 00:21:54.720
Soon artificial intelligence
voice will be able to sound,

00:21:55.230 --> 00:21:56.400
HAL 9000: Like whoever you want it to.

00:21:59.700 --> 00:22:04.149
Douglas Rain passed away on November
11th, 2018, at the age of 90.

00:22:21.450 --> 00:22:24.480
Jim Conrad: We're back on the Conovision
Podcast, the spirit of storytelling.

00:22:25.050 --> 00:22:27.570
And with me right now
is a good, good friend.

00:22:27.570 --> 00:22:33.480
We've known each other for a very
long time, probably since, uh, 1981.

00:22:33.480 --> 00:22:34.620
Eric Westra: 81, I think.

00:22:34.620 --> 00:22:34.790
Yeah.

00:22:35.440 --> 00:22:37.050
Jim Conrad: And we both
started in broadcasting.

00:22:37.050 --> 00:22:38.730
We both worked at the same radio station.

00:22:38.730 --> 00:22:44.150
We both shared a flat in the West End
of, uh, Vancouver for a couple of years.

00:22:44.880 --> 00:22:47.570
And then we both kinda went
our separate ways, didn't we?

00:22:47.570 --> 00:22:50.540
I got out of radio and got
into freelance voice work.

00:22:50.570 --> 00:22:51.350
Eric Westra: Smart move.

00:22:51.350 --> 00:22:52.340
Jim Conrad: Thank you.

00:22:52.340 --> 00:22:55.639
Yeah, because my radio career
wasn't going along very well.

00:22:55.730 --> 00:22:56.270
Eric Westra: Neither was mine.

00:22:57.500 --> 00:23:00.780
Jim Conrad: So, you know, as
Clint Eastwood said, a good

00:23:00.780 --> 00:23:02.489
man knows his limitations.

00:23:02.489 --> 00:23:02.909
Eric Westra: That's right.

00:23:02.909 --> 00:23:04.830
And I started believing
my own press releases.

00:23:04.830 --> 00:23:05.129
Jim Conrad: Yes.

00:23:05.250 --> 00:23:06.800
Uh, well, exactly.

00:23:07.100 --> 00:23:09.810
So I got into the voiceover, uh, business.

00:23:09.840 --> 00:23:14.879
You using that incredible brain of
yours, my friend, uh, went back to school

00:23:14.879 --> 00:23:17.760
and learned to write computer code.

00:23:17.820 --> 00:23:19.260
So what was it like?

00:23:20.040 --> 00:23:23.939
Yeah, because I remember, uh, visiting
you up in Peachland when you were

00:23:23.939 --> 00:23:28.260
going to the University of Okanagan
taking, uh, computer science, right?

00:23:28.889 --> 00:23:32.610
And, uh, you told me that fascinating
statistic where they had a

00:23:32.610 --> 00:23:35.040
number of eager young applicants.

00:23:35.040 --> 00:23:40.765
You were a little bit older than most
in the class by maybe five or six years.

00:23:40.765 --> 00:23:41.215
Eric Westra: At least.

00:23:42.265 --> 00:23:46.045
Jim Conrad: And all of a sudden
the professor gets up and starts

00:23:46.075 --> 00:23:49.985
filling blackboard after blackboard
full of algebraic equations.

00:23:50.435 --> 00:23:54.835
And soon after the attrition rate
started to, started to kick in.

00:23:54.865 --> 00:23:56.285
Eric Westra: I don't think this is for me.

00:23:56.315 --> 00:23:56.945
I mean,

00:23:56.985 --> 00:23:58.375
Jim Conrad: I just like
playing computer games.

00:23:58.405 --> 00:23:58.645
Eric Westra: Yeah.

00:23:58.825 --> 00:24:00.115
I, that's exactly it.

00:24:00.115 --> 00:24:01.540
A lot of the people who got involved.

00:24:01.540 --> 00:24:04.270
Jim Conrad: So is that, is that
sort of, again, and in our timeline

00:24:04.270 --> 00:24:06.790
now, we're in the early nineties.

00:24:06.790 --> 00:24:07.600
Eric Westra: This is the early nineties.

00:24:07.690 --> 00:24:07.780
Yeah.

00:24:07.780 --> 00:24:10.840
I graduated, uh, 96.

00:24:10.870 --> 00:24:11.200
Yeah.

00:24:11.320 --> 00:24:14.560
Jim Conrad: So this is, so I wanna
get into the whole sort of evolution

00:24:14.560 --> 00:24:16.450
of tech from your perspective.

00:24:16.510 --> 00:24:16.780
Eric Westra: Okay.

00:24:16.900 --> 00:24:17.890
Jim Conrad: Since that time.

00:24:17.890 --> 00:24:21.970
So you started writing code, first of all,
that, that's, that was the perception,

00:24:22.060 --> 00:24:25.900
uh, of kids coming outta high school,
was that computers were these wonderful,

00:24:25.905 --> 00:24:31.530
fancy toys that they loved to play with
but they really didn't have any idea

00:24:31.949 --> 00:24:33.810
what the nuts and bolts were, right?

00:24:33.870 --> 00:24:34.260
Eric Westra: No.

00:24:34.260 --> 00:24:37.139
And, but at the same time, things
have changed since then because

00:24:37.139 --> 00:24:41.290
now, uh, kids actually learn
to code like in grade school.

00:24:41.290 --> 00:24:44.850
And, uh, you know, that's a good
thing because it, math is not

00:24:44.850 --> 00:24:46.260
the greatest for a lot of people.

00:24:46.290 --> 00:24:50.569
But if you, if you bring the math
piece into the computer realm then it's

00:24:50.569 --> 00:24:53.989
a lot easier for them to accept and
they're more inspired to, to do it.

00:24:54.229 --> 00:24:57.640
But yeah, I, I started writing
code, but that's not what I do now.

00:24:57.640 --> 00:25:00.389
I mean, I, I, we call
'em code monkeys, right?

00:25:00.389 --> 00:25:03.060
Because, you know, you're
essentially, that's what you're doing.

00:25:03.060 --> 00:25:04.245
You're tap, tap, tap, tap, tap.

00:25:04.290 --> 00:25:07.820
You're lucky to bang off six good
lines of computer code a day.

00:25:08.520 --> 00:25:09.330
Uh, 6 to 10.

00:25:09.660 --> 00:25:11.580
And I learned a bunch
of different languages.

00:25:11.610 --> 00:25:18.030
COBOL, you know, Modula-2, I learned,
uh, JavaScript and all this other stuff.

00:25:18.270 --> 00:25:20.280
I don't use any of that anymore.

00:25:20.620 --> 00:25:23.850
Uh, but it was a good way to sort
of set up what I ended up doing.

00:25:23.880 --> 00:25:24.150
Jim Conrad: Yeah.

00:25:24.180 --> 00:25:24.390
Yeah.

00:25:24.940 --> 00:25:29.350
Um, so from code writing, uh, what
was the first gig after university?

00:25:29.350 --> 00:25:33.370
You came to Vancouver as I recall,
and worked with a company here that

00:25:33.370 --> 00:25:37.750
was trying to do something a bit,
a bit ahead of their time, which

00:25:37.750 --> 00:25:42.190
was set up mainframe systems that
people would just access from their,

00:25:42.790 --> 00:25:44.680
Eric Westra: Well, that's
a bit of a simplification,

00:25:44.680 --> 00:25:46.150
but what happened was I got,

00:25:46.150 --> 00:25:46.810
Jim Conrad: I'm a simple guy.

00:25:46.870 --> 00:25:47.320
Eric Westra: No, no, no.

00:25:47.650 --> 00:25:48.025
I ended up.

00:25:48.435 --> 00:25:51.555
Uh, my first gig outta school
was with the Premier's office.

00:25:51.675 --> 00:25:51.915
Jim Conrad: Right.

00:25:52.245 --> 00:25:55.485
Eric Westra: I worked for, uh, BC
Trade Development and the Premier's

00:25:55.485 --> 00:25:59.175
office because they were under
the same IT umbrella in Vancouver.

00:25:59.625 --> 00:26:03.365
And there's a great story about Mike
Harcourt when he was the Premier.

00:26:03.830 --> 00:26:08.810
I was doing some of the, uh, the IT
support and one day I get a phone call

00:26:08.810 --> 00:26:11.899
from Mike's office, which was just down
the hall from the Trade Development

00:26:11.899 --> 00:26:13.760
Corp. Eric, can you come on over?

00:26:13.760 --> 00:26:16.639
'Cause Mike has a
problem with his printer.

00:26:16.699 --> 00:26:18.530
And I said, sure, I can, I can come over.

00:26:18.590 --> 00:26:18.710
Jim Conrad: Yeah.

00:26:18.860 --> 00:26:22.189
Eric Westra: So, uh, apparently I, I
walked into the office and, and Dixie,

00:26:22.280 --> 00:26:26.810
who is normally, uh, Mike Harcourt's EA,
executive assistant, was away on vacation.

00:26:26.810 --> 00:26:28.290
They got somebody from the pool, I guess.

00:26:28.820 --> 00:26:31.970
And so I said hello to her and I walked
into his office and he was sitting at his

00:26:31.970 --> 00:26:37.479
desk and he was signing a, a big stack of
paper and you know, one after the other.

00:26:38.229 --> 00:26:39.879
And I said, Hey, how's it going, Mike?

00:26:39.910 --> 00:26:41.840
'Cause he was, was a
first name kind of guy.

00:26:41.840 --> 00:26:43.510
And, uh, really nice guy.

00:26:44.020 --> 00:26:47.350
And I started to work on the printer
and then all of a sudden I hear this,

00:26:47.500 --> 00:26:49.840
you know what the, across the room.

00:26:50.310 --> 00:26:52.030
And I, I said, what's the problem?

00:26:52.030 --> 00:26:53.200
He said, well, have a look at this.

00:26:53.590 --> 00:26:57.750
And he was signing a whole bunch
of, uh, proclamations, thanking

00:26:57.750 --> 00:27:01.620
people for participating in the,
uh, the Vancouver Centennial.

00:27:01.649 --> 00:27:03.389
There must have been 5
or 600 of these things.

00:27:03.810 --> 00:27:09.510
And the salutation at the bottom of the
letter, what had happened was the, the new

00:27:09.600 --> 00:27:12.840
EA had just gone through, you know, spell
check, click, click, click, click, without

00:27:12.840 --> 00:27:14.899
actually looking at what she was doing.

00:27:15.250 --> 00:27:19.050
And the salutation read, the
Right Honorable Mike Haircut.

00:27:21.905 --> 00:27:26.930
And, and, uh, you know, Mike, not
exactly what you would call hirsute.

00:27:27.190 --> 00:27:31.319
Uh, and, and so, uh, that was, that
was one of the funnier stories.

00:27:31.319 --> 00:27:31.719
But yeah, I,

00:27:31.719 --> 00:27:33.439
Jim Conrad: Follicly challenged.

00:27:33.459 --> 00:27:34.989
Eric Westra: Follicly challenged
Mike was, yeah, that's right.

00:27:36.340 --> 00:27:38.889
Uh, but yeah, that was one of
the, uh, the earliest gigs.

00:27:38.889 --> 00:27:42.250
And then eventually I, I got into
tech communications, you know, tried

00:27:42.250 --> 00:27:46.810
to marry my background in tech with
broadcasting, and then got involved

00:27:46.810 --> 00:27:50.345
with business analysis and business
process and business optimization

00:27:50.675 --> 00:27:53.104
and then solution architecting.

00:27:53.104 --> 00:27:55.534
Uh, putting technology together
with companies that were

00:27:55.534 --> 00:27:56.695
going through a lot of change.

00:27:56.884 --> 00:27:58.314
So that's, that's what I do now.

00:27:58.665 --> 00:28:04.905
Jim Conrad: An overall sort of context
of this digital revolution of ours,

00:28:06.435 --> 00:28:10.645
obviously, uh, it's the machine that
changed the world, the computer.

00:28:11.175 --> 00:28:17.860
But a lot of people now are, I
guess, rethinking what that change

00:28:18.000 --> 00:28:21.470
or how much it has changed us.

00:28:21.530 --> 00:28:29.090
And, and relating back to, you know, being
human and, and how we relate to computers.

00:28:29.600 --> 00:28:34.410
And now the whole idea of
artificial intelligence.

00:28:35.420 --> 00:28:42.380
And we're seeing, while we first saw
AI literally and viscerally in 2001.

00:28:42.794 --> 00:28:44.925
And the voice of HAL, the computer.

00:28:45.245 --> 00:28:49.455
My favorite line, which I've dubbed
the HAL 9000 breakup line is,

00:28:49.784 --> 00:28:54.254
HAL 9000: Dave, this conversation
can serve no purpose anymore.

00:28:55.010 --> 00:28:55.500
Goodbye.

00:28:56.399 --> 00:28:56.979
Jim Conrad: That's right.

00:28:57.794 --> 00:28:58.274
Goodbye.

00:28:58.304 --> 00:29:00.225
Eric Westra: Yeah, goodbye.

00:29:00.225 --> 00:29:01.034
Jim Conrad: Uh, HAL.

00:29:01.084 --> 00:29:01.444
HAL?

00:29:01.684 --> 00:29:02.414
Do you read me, HAL?

00:29:02.824 --> 00:29:05.895
Eric Westra: There's a great example
of an artificial intelligence machine

00:29:06.135 --> 00:29:10.094
that didn't have any morality and,
and a lot of the discussion around,

00:29:10.485 --> 00:29:13.455
you know, artificial intelligence
these days, 'cause now computers

00:29:13.455 --> 00:29:14.584
are writing their own algorithms.

00:29:15.165 --> 00:29:17.925
They're learning and they're
changing their own computer code.

00:29:18.014 --> 00:29:22.665
Jim Conrad: Now explain to me what
is defined as machine learning.

00:29:22.905 --> 00:29:25.064
Eric Westra: Well, there's,
there's deep machine learning

00:29:25.064 --> 00:29:26.115
and there's machine learning.

00:29:26.115 --> 00:29:28.004
And so the two are a little different.

00:29:28.064 --> 00:29:34.409
Um, machine learning is when a computer
algorithm is given the capacity to write

00:29:34.470 --> 00:29:37.020
its own code, to learn from its mistakes.

00:29:37.080 --> 00:29:37.960
And to

00:29:37.990 --> 00:29:40.379
Jim Conrad: To replicate
itself and heal itself.

00:29:40.409 --> 00:29:40.620
Eric Westra: Right.

00:29:40.620 --> 00:29:43.699
So there's a core set of code and
instructions in the algorithm.

00:29:43.699 --> 00:29:48.000
And then it says, you know, go out and try
something and under these circumstances

00:29:48.380 --> 00:29:50.179
if it works, make a note of that.

00:29:50.250 --> 00:29:53.350
And write your own code so that you
do it the same way the next time.
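
A rough way to picture that "try something, and if it works, make a note of it" loop in code, added here purely as an illustrative sketch rather than anything from the episode; the situations, actions, and works() check below are invented placeholders:

```python
import random

ACTIONS = ["turn_left", "turn_right", "go_straight"]

def works(situation, action):
    # Stand-in for real-world feedback; here just an arbitrary rule.
    return hash((situation, action)) % 3 == 0

playbook = {}  # situation -> action that worked before

def choose(situation):
    if situation in playbook:            # "do it the same way the next time"
        return playbook[situation]
    action = random.choice(ACTIONS)      # otherwise, go out and try something
    if works(situation, action):
        playbook[situation] = action     # "make a note of that"
    return action

for step in range(30):
    choose(f"situation_{step % 5}")

print(playbook)
```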

00:29:53.990 --> 00:29:56.370
And then there's deep machine
learning, which is really interesting.

00:29:56.370 --> 00:30:01.264
And that's around you can build reward
systems into computers the same way

00:30:01.264 --> 00:30:03.065
that you do with pets and humans.

00:30:03.485 --> 00:30:08.084
You know, if a dog does something right,
uh, you give it a, you give it a treat.

00:30:08.465 --> 00:30:09.104
If a child,

00:30:09.104 --> 00:30:09.965
Jim Conrad: Good boy, good boy.

00:30:10.294 --> 00:30:12.985
Eric Westra: If a child does
something right, you reinforce that.

00:30:12.985 --> 00:30:14.794
You give them a hug,
you say, good, good job.

00:30:15.004 --> 00:30:15.895
That's the way you do it.

00:30:16.375 --> 00:30:18.365
So we're doing the same
thing with machines now.

00:30:18.365 --> 00:30:21.274
We're actually creating what we
call artificial reward systems.

00:30:21.485 --> 00:30:21.665
Jim Conrad: Wow.

00:30:21.754 --> 00:30:25.725
Eric Westra: So computer will do
something right, it will earn two stars.

00:30:26.124 --> 00:30:28.480
And if it does better,
it gets three stars.

00:30:28.510 --> 00:30:33.850
So it focuses completely on making
sure it gets as many rewards as it can.

00:30:34.149 --> 00:30:36.370
And through that process,
it learns faster.
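
The star-reward idea can be sketched the same way; this is only an editorial illustration, with the options and their reward values made up for the example, and real reward-driven systems are far more elaborate:

```python
import random

rewards = {"option_a": 2, "option_b": 3, "option_c": 1}     # "stars" per option
totals = {name: 0.0 for name in rewards}
counts = {name: 0 for name in rewards}

def pick():
    # Mostly repeat the option with the best average reward so far,
    # but occasionally explore something else.
    if random.random() < 0.1 or not any(counts.values()):
        return random.choice(list(rewards))
    return max(rewards, key=lambda name: totals[name] / max(counts[name], 1))

for _ in range(1000):
    choice = pick()
    counts[choice] += 1
    totals[choice] += rewards[choice]

print("learned favourite:", max(counts, key=counts.get))    # usually option_b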

00:30:36.460 --> 00:30:36.820
Jim Conrad: Wow.

00:30:37.540 --> 00:30:41.740
Now, and this is one, once again,
one of those ethical questions

00:30:41.740 --> 00:30:43.780
that, uh, that has to come up.

00:30:44.230 --> 00:30:46.330
It came up when they were
making the atomic bomb.

00:30:46.540 --> 00:30:52.030
Yes, we have the scientific capacity
and the knowledge to create this.

00:30:52.845 --> 00:30:54.784
The question is, should we.

00:30:56.175 --> 00:31:01.185
Have we reached a point of that
question in artificial intelligence?

00:31:01.305 --> 00:31:03.024
Eric Westra: I think we
continue to ask that question.

00:31:03.024 --> 00:31:04.504
And it's a good question to ask.

00:31:04.824 --> 00:31:09.799
Um, there's no, you know, artificial
intelligence has huge promise.

00:31:10.110 --> 00:31:14.100
Um, there, there are lots of
positive things that can come of it.

00:31:14.470 --> 00:31:19.139
Um, but like anything, there are a lot
of very scary and negative things that

00:31:19.139 --> 00:31:20.850
come of it, if it isn't handled properly.

00:31:20.850 --> 00:31:24.990
And this is all through this
emerging field called Robo Ethics.

00:31:25.050 --> 00:31:30.310
Jim Conrad: Now, thank God we had
Isaac Asimov way back when because

00:31:30.330 --> 00:31:34.230
I, I was actually a big fan of Isaac
Asimov, uh, science fiction novels.

00:31:34.290 --> 00:31:34.470
Eric Westra: Yeah.

00:31:34.709 --> 00:31:38.600
Jim Conrad: Now what were
the three rules of robotics?

00:31:38.950 --> 00:31:40.960
Eric Westra: So the, the,
Asimov proposed that there were

00:31:40.960 --> 00:31:42.945
three laws in his writings.

00:31:43.305 --> 00:31:48.285
The first law was that a robot or
a machine could not, through either

00:31:48.285 --> 00:31:51.635
action or inaction, harm a human being.

00:31:52.325 --> 00:31:57.525
The second law was that, uh, the
robot or the machine had to follow

00:31:57.795 --> 00:32:01.515
the instructions given to it by a
human, unless it broke the first law.

00:32:02.175 --> 00:32:05.909
And the third law was that it
had to self preserve itself.

00:32:05.915 --> 00:32:10.530
It, it couldn't, uh, it had to go through,
you know, all the machinations necessary

00:32:10.530 --> 00:32:14.360
to preserve its existence provided
it didn't break the first two laws.
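
As a toy illustration of how a priority-ordered rule check like that might look in code (an editorial sketch, not a real robotics framework; the action fields below are invented):

```python
def evaluate(action):
    # First Law: never harm a human, by action or inaction.
    if action.get("harms_human"):
        return "refused: would violate the First Law"
    # Second Law: obey human orders, unless that conflicts with the First Law.
    if action.get("ordered_by_human"):
        return "obey: Second Law"
    # Third Law: protect your own existence, subject to the first two laws.
    if action.get("endangers_self"):
        return "avoid: Third Law (self-preservation)"
    return "permitted"

print(evaluate({"ordered_by_human": True}))                       # obey: Second Law
print(evaluate({"ordered_by_human": True, "harms_human": True}))  # refused
```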

00:32:15.129 --> 00:32:17.250
Jim Conrad: So it had a, it had
a built in, the third law was

00:32:17.250 --> 00:32:20.909
self-preservation on, with a caveat,
if it didn't break the first two laws.

00:32:20.909 --> 00:32:21.270
Eric Westra: That's right.

00:32:21.720 --> 00:32:25.220
Jim Conrad: So inside that sort
of closed system was an ethical

00:32:25.880 --> 00:32:27.780
behavior model for robots.

00:32:27.870 --> 00:32:28.020
Eric Westra: Right.

00:32:28.260 --> 00:32:32.504
Jim Conrad: Uh, how practical
is that to build into today's

00:32:32.565 --> 00:32:36.945
artificial intelligence, can we
create those three laws to govern AI?

00:32:37.004 --> 00:32:38.055
Eric Westra: The laws are good ones.

00:32:38.055 --> 00:32:39.105
They seem to make sense.

00:32:39.345 --> 00:32:41.295
But they can produce, um,

00:32:41.325 --> 00:32:42.225
Jim Conrad: Asimov was a genius.

00:32:42.345 --> 00:32:43.065
Eric Westra: Yeah, he was.

00:32:43.125 --> 00:32:44.175
This is 1942.

00:32:44.295 --> 00:32:44.565
Jim Conrad: Yeah.

00:32:45.135 --> 00:32:48.615
Eric Westra: The thing about it though
is that even making things as simple

00:32:48.615 --> 00:32:52.935
as that, you can end up with machines,
uh, getting into, uh, you know, uh,

00:32:52.965 --> 00:32:54.405
cognitive dissonance we call it.

00:32:54.405 --> 00:32:58.980
Where, where it tries not to break
a law, it makes a move to avoid

00:32:58.980 --> 00:33:03.030
breaking that law and then by doing
so, it breaks another law and then

00:33:03.030 --> 00:33:04.470
it goes back to the first one.

00:33:04.770 --> 00:33:05.760
And so it gets caught,

00:33:05.760 --> 00:33:06.270
Jim Conrad: Vapor lock.

00:33:06.300 --> 00:33:06.500
Eric Westra: Yeah.

00:33:06.669 --> 00:33:07.240
Essentially.

00:33:07.870 --> 00:33:11.280
Um, and he, he actually did write
about that in a short story.

00:33:11.280 --> 00:33:14.040
He talked about a robot that got
caught in one of those loops.

00:33:14.130 --> 00:33:18.350
But, you know, if you use modern
day sort of explanations around this

00:33:18.350 --> 00:33:23.909
sort of thing, it becomes much more
difficult to nail all this down into

00:33:23.909 --> 00:33:25.550
a, to a very small set of rules.

00:33:25.610 --> 00:33:27.409
Jim Conrad: Because there
are so many applications.

00:33:27.440 --> 00:33:29.410
You mentioned the positive aspects of AI.

00:33:29.930 --> 00:33:33.139
Uh, give me, let's say
the top three, not Alexa.

00:33:33.260 --> 00:33:36.200
Eric Westra: Well, Alexa and
voice recognition certainly

00:33:36.200 --> 00:33:37.310
are very, very big right now.

00:33:37.310 --> 00:33:39.409
Jim Conrad: If they make, if they
make one of those things that sounds

00:33:39.409 --> 00:33:41.760
like Stephen Fry doing Jeeves.

00:33:42.570 --> 00:33:43.260
I'm on it.

00:33:43.560 --> 00:33:46.470
Jeeves: I was sent by the agencies
and I was given to understand

00:33:46.470 --> 00:33:47.399
that you required a valet.

00:33:47.785 --> 00:33:50.460
Eric Westra: I, I think eventually you,
you'll, you can have your voice, voice

00:33:50.460 --> 00:33:52.010
assistant be whoever you want it to be.

00:33:52.510 --> 00:33:57.210
There are a set of circumstances that can
come up in modern life where the machine

00:33:57.210 --> 00:34:00.930
would be faced with so many different
criteria to make a decision, it would

00:34:00.930 --> 00:34:04.320
be very difficult for it necessarily
to make the moral or the ethical one.

00:34:05.010 --> 00:34:08.574
Jim Conrad: You mentioned this
ethical, moral dilemma in a, and I

00:34:08.574 --> 00:34:12.014
guess it was a thought experiment
in regards to self-driving cars.

00:34:12.205 --> 00:34:13.105
Gimme a run through on that.

00:34:13.134 --> 00:34:16.674
Eric Westra: This was put to a group
of university students, uh, it's called

00:34:16.674 --> 00:34:18.114
the child in the tunnel conundrum.

00:34:18.654 --> 00:34:21.895
So you're in an autonomous vehicle
and you're with your family and the

00:34:21.895 --> 00:34:23.605
car is driving you on a mountain road.

00:34:23.605 --> 00:34:27.250
It's a two-lane mountain road, and
your two kids are in the backseat.

00:34:27.250 --> 00:34:28.375
Your wife is sitting beside you.

00:34:28.975 --> 00:34:32.614
And you come around a curve and
you're, you know, doing relatively

00:34:32.774 --> 00:34:36.255
the speed limit, you know, it's an
autonomous car, it's following the law.

00:34:36.855 --> 00:34:40.995
But all of a sudden, uh, as you come
around the curve, there is a tunnel

00:34:40.995 --> 00:34:45.614
entrance and uh, it's got a concrete
sort of abutment around the face of it.

00:34:46.125 --> 00:34:50.444
And, uh, a small child darts out
from behind a rock near the tunnel

00:34:50.444 --> 00:34:52.615
entrance and falls onto the road.

00:34:53.275 --> 00:34:56.480
The car has two choices really,
given the circumstances.

00:34:56.509 --> 00:34:59.480
It's not going slow
enough to stop in time.

00:34:59.569 --> 00:35:05.450
It either can send the car flying into
the concrete or run over the child.

00:35:06.109 --> 00:35:09.020
Jim Conrad: So into the abutment,
the, the passenger's in the

00:35:09.020 --> 00:35:10.849
car, the self-driving car.

00:35:11.109 --> 00:35:16.200
So the computer that's, that's
governing that should try to preserve

00:35:16.319 --> 00:35:21.009
what it's meant to do, which is to
avoid harm to human life, meaning

00:35:21.009 --> 00:35:22.620
the passengers inside the car.

00:35:22.649 --> 00:35:23.220
Eric Westra: Right.

00:35:23.250 --> 00:35:27.220
Jim Conrad: However, by, by
running over the child, it's

00:35:27.290 --> 00:35:28.700
potentially killing the child.

00:35:28.700 --> 00:35:31.360
In fact, it's almost a certainty,
given the circumstances.

00:35:31.360 --> 00:35:33.690
Eric Westra: Okay, so
what does the car do?

00:35:33.750 --> 00:35:37.470
And when the group of university
students were asked about this

00:35:37.470 --> 00:35:41.459
problem, more than 60% of them said
the car should run over the child.

00:35:42.329 --> 00:35:45.875
What the heck does a child, what,
what is that child doing in the

00:35:45.875 --> 00:35:47.535
middle of nowhere in the first place?

00:35:47.645 --> 00:35:49.715
Jim Conrad: The evil spawn
of a Wile E. Coyote or what?

00:35:49.715 --> 00:35:49.775
Eric Westra: Yeah.

00:35:49.775 --> 00:35:54.315
It would be one death compared to
potentially one or more in the car.

00:35:54.545 --> 00:35:58.095
Jim Conrad: So there's a value judgment
that has to be made almost instantaneously

00:35:58.125 --> 00:36:00.255
by the autonomous computer driven car.

00:36:00.275 --> 00:36:03.815
Eric Westra: It's no longer a
machine just making, uh, a judgment

00:36:03.815 --> 00:36:05.465
based on the best way to do things.

00:36:05.465 --> 00:36:07.115
It's making life and death decisions.

00:36:07.115 --> 00:36:07.445
Jim Conrad: Yes.

00:36:07.565 --> 00:36:07.925
Yes.

00:36:07.955 --> 00:36:11.790
Eric Westra: And that is
where things get really weird.

00:36:14.955 --> 00:36:21.435
Jim Conrad: Well, and then given
the technology surrounding military

00:36:21.654 --> 00:36:28.125
applications of drones, and right
now, as I believe they are operated

00:36:28.125 --> 00:36:32.215
by human beings in a bunker
somewhere outside of Las Vegas.

00:36:32.345 --> 00:36:34.925
Eric Westra: They're autonomous,
some of them, uh, they, they don't

00:36:34.925 --> 00:36:36.305
need, require supervision at all.

00:36:36.335 --> 00:36:40.625
Jim Conrad: Well, that's where I,
there's another ethical, moral question.

00:36:40.655 --> 00:36:43.895
You know, you've got, basically,
death from above being run

00:36:43.895 --> 00:36:47.265
by an AI driven computer.

00:36:47.955 --> 00:36:53.584
And the decision making processes inside
that computer were coded by humans.

00:36:53.884 --> 00:36:56.944
So there still is a human connection.

00:36:57.234 --> 00:36:59.625
But it's, but it's pretty
far removed, isn't it?

00:36:59.745 --> 00:37:00.225
Eric Westra: It is.

00:37:00.314 --> 00:37:04.335
And uh, you know, there's talk about
eventually some of these drones,

00:37:04.335 --> 00:37:06.495
I mean, we're getting pretty dark
here, but some of these drones,

00:37:06.524 --> 00:37:08.720
Jim Conrad: It's a dark and
stormy Saturday morning.

00:37:08.720 --> 00:37:08.870
Eric Westra: Yes.

00:37:10.069 --> 00:37:12.830
You know that they would use face
recognition to sort of, you know,

00:37:12.830 --> 00:37:15.200
recognize the face of a terrorist and,

00:37:15.200 --> 00:37:16.490
Jim Conrad: As opposed to a child.

00:37:16.490 --> 00:37:17.089
Eric Westra: Uh, yeah.

00:37:17.299 --> 00:37:20.890
And then, you know, go out and, and
essentially assassinate that person.

00:37:21.410 --> 00:37:26.930
So that's a really dark side of AI and,
and there will always be that, I guess.

00:37:26.930 --> 00:37:29.660
But there are some very positive things.

00:37:29.660 --> 00:37:31.610
I'm running into it now every day.

00:37:31.610 --> 00:37:31.670
Jim Conrad: Yeah.

00:37:31.670 --> 00:37:35.330
I thought I mentioned, or I
requested a bright side AI story.

00:37:36.799 --> 00:37:39.799
We, we went immediately to the dark
side, so let's, let's find some

00:37:39.950 --> 00:37:41.399
ray of sunshine here somewhere.

00:37:41.399 --> 00:37:45.560
What's, what's the number one
beneficial use to humans and

00:37:45.560 --> 00:37:47.570
humankind for artificial intelligence?

00:37:47.600 --> 00:37:49.400
Eric Westra: Well, there, there,
there are lots of them, you know.

00:37:49.430 --> 00:37:50.360
Jim Conrad: Robot surgeons?

00:37:50.810 --> 00:37:52.549
Eric Westra: Robotic surgery is one.

00:37:52.670 --> 00:37:54.170
Robotic diagnosis.

00:37:54.460 --> 00:37:58.740
Jim Conrad: I don't know if I want
HAL, uh, doing my prostate surgery.

00:38:00.330 --> 00:38:01.259
HAL 9000: I'm sorry, Jim.

00:38:01.320 --> 00:38:02.730
The knife slipped.

00:38:04.500 --> 00:38:07.740
Eric Westra: But you know, I'm running
into situations now where, uh, it's being

00:38:07.740 --> 00:38:14.310
used for example, to determine how to hook
up a borrower with a lending institution

00:38:14.610 --> 00:38:16.470
based on the borrower's profile.

00:38:16.470 --> 00:38:19.920
There's a whole bunch of, you know, stuff
that goes into making that decision.

00:38:20.650 --> 00:38:23.650
And AI can be very helpful
in making those connections.

00:38:24.000 --> 00:38:26.010
So that's more of a
business efficiency thing.

00:38:26.160 --> 00:38:27.000
HAL 9000: I'm sorry, Jim.

00:38:27.524 --> 00:38:29.595
Your loan application has been denied.

00:38:30.495 --> 00:38:30.975
Eric Westra: That's right.

00:38:31.575 --> 00:38:35.805
Uh, and you know, but there are lots
of applications emerging now, and

00:38:35.805 --> 00:38:40.095
it's gonna become more and more, uh,
a part of our lives, uh, every day.

00:38:42.314 --> 00:38:46.274
Jim Conrad: I've got a question
for you, given that technology is

00:38:46.365 --> 00:38:51.754
speeding up, that some people are
becoming somewhat overwhelmed by it.

00:38:52.174 --> 00:38:53.680
We're, we're trying to adapt.

00:38:54.345 --> 00:38:59.175
But it seems that as we, as we
adapt, the next thing comes along

00:38:59.685 --> 00:39:01.455
and then we have to adapt again.

00:39:01.545 --> 00:39:06.555
And some people are retreating
and there's a kind of a backlash.

00:39:06.915 --> 00:39:12.900
Do you think that there will be a, a
time when when we'll reject technology

00:39:13.080 --> 00:39:17.730
and, and try to go back, back in
time if it, if it was possible.

00:39:17.760 --> 00:39:18.900
Eric Westra: Well, there already has been.

00:39:18.900 --> 00:39:22.160
I mean, I keep thinking
about Joe Walsh's song,

00:39:24.340 --> 00:39:26.460
Analog Man.

00:39:29.730 --> 00:39:32.910
There already has been a bit of a
backlash, especially in the music scene.

00:39:33.570 --> 00:39:37.279
It started with the release
of CDs many years ago.

00:39:37.690 --> 00:39:41.415
And the digitized sound just didn't
have the same kind of, the, the

00:39:41.415 --> 00:39:43.095
word that was used was warmth.

00:39:43.215 --> 00:39:43.515
Jim Conrad: Right.

00:39:43.965 --> 00:39:47.595
Eric Westra: And so in the last few
years especially, we've been seeing a

00:39:47.595 --> 00:39:53.115
resurgence in, uh, good old vinyl and
artists actually pressing their own, uh,

00:39:53.145 --> 00:39:54.615
instead of going through record companies.

00:39:54.615 --> 00:39:58.455
And, and it's because
audiophiles prefer the sound.

00:39:58.455 --> 00:39:59.675
So there's one example.

00:40:00.375 --> 00:40:02.835
I think the latest thing
that's, that's been

00:40:03.195 --> 00:40:09.745
sort of making people recoil, sometimes
in sheer, utter terror, is AI.

00:40:10.155 --> 00:40:12.915
Jim Conrad: Now define AI for me.

00:40:12.945 --> 00:40:16.125
In a practical sense, what is AI?

00:40:16.635 --> 00:40:18.255
And how did it develop?

00:40:19.065 --> 00:40:23.805
Eric Westra: So it's based on what's
called an LLM or a large language model.

00:40:23.935 --> 00:40:32.015
And the idea is you synthesize, uh,
using algorithms and hardware, uh,

00:40:32.045 --> 00:40:37.905
the neural pathways in the mind, the
human brain, that process language.

00:40:37.965 --> 00:40:40.665
Jim Conrad: And it was originally
developed to actually model the

00:40:40.665 --> 00:40:44.655
brain in order to do research on
how, on how the brain works, right?

00:40:44.885 --> 00:40:46.275
Eric Westra: Neural networks work, yes.

00:40:46.275 --> 00:40:50.415
And then they became quite helpful
when it came to large language models.

00:40:50.475 --> 00:40:55.785
Because you think about, you know,
one word, um, the word "right" could be

00:40:55.785 --> 00:40:57.940
spelled a couple of different ways.

00:40:58.390 --> 00:41:00.000
Could mean several different things.

00:41:02.709 --> 00:41:10.440
So, you know, the large language model
is there to teach the algorithm what the

00:41:10.440 --> 00:41:15.030
actual meaning of the word is, and when
a number of words are fitted together,

00:41:15.090 --> 00:41:17.310
what that means in that context.

00:41:17.460 --> 00:41:17.850
Jim Conrad: Yes.
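
Eric's point about the word "right" is the heart of how language models work: meaning is inferred from surrounding context rather than looked up. The toy sketch below, in Python with a made-up six-sentence corpus, just counts which words follow which two-word contexts; a large language model does the same job with a neural network and vastly more text, so it can generalize to contexts it has never seen.

```python
# Toy illustration of how context resolves an ambiguous word like "right".
# A real LLM learns this with a neural network over billions of examples;
# here we only count follow-up words in a tiny, invented corpus.

from collections import Counter, defaultdict

corpus = [
    "turn right at the light",
    "turn right at the corner",
    "you are right about that",
    "she was right about the answer",
    "he has the right to vote",
    "write the right word in the blank",
]

# Count which word follows each two-word context.
follows: dict[tuple[str, str], Counter] = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i in range(len(words) - 2):
        follows[(words[i], words[i + 1])][words[i + 2]] += 1

def predict(context: tuple[str, str]) -> list[tuple[str, float]]:
    """Return likely next words for a context, with relative frequencies."""
    counts = follows.get(context, Counter())
    total = sum(counts.values())
    if not total:
        return []
    return [(word, count / total) for word, count in counts.most_common()]

print(predict(("turn", "right")))  # the directional sense: followed by "at"
print(predict(("was", "right")))   # the "correct" sense: followed by "about"
```

The count table here is the crude ancestor of what Eric calls synthesized neural pathways: a large model replaces the table with learned weights, so it can handle word combinations it has never seen before.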

00:41:18.570 --> 00:41:21.899
One could define intelligence
a number of different ways.

00:41:22.680 --> 00:41:29.535
A lot of people define intelligence
as in AI, as the ability to assimilate

00:41:29.745 --> 00:41:34.274
all sorts of information, to correlate it,
to come up with solutions and

00:41:34.274 --> 00:41:37.234
answers to questions quicker and faster.

00:41:37.964 --> 00:41:45.165
Or is intelligence, coming back to the
human aspect, is intelligence creativity?

00:41:46.424 --> 00:41:49.109
Eric Westra: I think in,
in many respects it is.

00:41:49.299 --> 00:41:52.830
I mean, we can be creative with
language, which is a structure, and

00:41:52.830 --> 00:41:58.109
we find all sorts of different ways of
using musical notes and being creative.

00:41:58.770 --> 00:42:03.419
And they're, they're a very
finite set of sort of things.

00:42:03.540 --> 00:42:07.880
Uh, A, B, C, you know, the
notes, uh, the chords.

00:42:08.430 --> 00:42:13.740
So we start to see that, um, the,
the real danger actually that's being

00:42:13.740 --> 00:42:20.460
presented by AI is in the creative space,
because now it's possible to manipulate

00:42:20.760 --> 00:42:29.360
imagery and audio in a way that mostly fools,
uh, the human consumer, if you will,

00:42:29.390 --> 00:42:34.020
because the, the artificial intelligence
has learned what matters most to us.

00:42:34.410 --> 00:42:36.060
Jim Conrad: And that is the image?

00:42:36.150 --> 00:42:38.760
Eric Westra: Well, it could be
the image or nuances of the image.

00:42:38.760 --> 00:42:44.240
So you take a, for example, you could
ask a, a chat bot, or, uh, a piece of

00:42:44.240 --> 00:42:49.610
artificial intelligence in, say, Adobe
Photoshop: create a picture for me of

00:42:49.610 --> 00:42:55.610
a guy, age so-and-so, uh, that has very
pleasant features, is a professional

00:42:55.610 --> 00:43:00.650
person and is modeled after people
who have good leadership qualities.

00:43:00.680 --> 00:43:01.000
Jim Conrad: Yes.

00:43:01.620 --> 00:43:08.420
So, so the, the AI functions as... but you
still, a, a human being still has to input.

00:43:08.875 --> 00:43:09.235
Eric Westra: Oh yeah.

00:43:09.265 --> 00:43:10.195
Jim Conrad: Certain things.

00:43:10.225 --> 00:43:10.405
Eric Westra: Yeah.

00:43:10.405 --> 00:43:14.245
The intelligence is, is, is
essentially, you know, culled from,

00:43:14.515 --> 00:43:17.785
from a lot of different sources and
usually from other people's input.

00:43:18.625 --> 00:43:21.365
It's, it's only as good as
the data that goes into it.

00:43:21.385 --> 00:43:26.125
And this is part of the problem right
now with AI, is that there's so much data

00:43:26.775 --> 00:43:30.105
going into it, that it's getting confused.

00:43:30.215 --> 00:43:33.075
Jim Conrad: And they
call that hallucination?

00:43:33.075 --> 00:43:34.315
Eric Westra: Hallucination, yeah.

00:43:35.105 --> 00:43:35.305
Jim Conrad: Yeah.

00:43:35.305 --> 00:43:37.305
We're familiar with that, aren't we?

00:43:38.955 --> 00:43:43.005
Eric Westra: For many years, my life
seemed to be one large hallucination.

00:43:43.605 --> 00:43:46.795
Uh, thankfully those days are over.

00:43:46.795 --> 00:43:51.134
Um, but the, the hallucinations
are, are created by just throwing

00:43:51.134 --> 00:44:01.024
a lot of data into the model rather
than focusing on quality of data.
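
On the data-quality point: one common response, sketched below in Python, is to score and filter documents before they ever reach the model, rather than "throwing a lot of data" at it. The heuristics and thresholds here are invented placeholders; production pipelines add deduplication, trained quality classifiers, and human feedback on top.

```python
# Hypothetical sketch: filtering a training corpus on simple quality
# heuristics before training. The heuristics and thresholds are invented
# for illustration only.

import re

def quality_score(text: str) -> float:
    """Crude 0..1 score: penalize very short text, shouting, and link spam."""
    score = 1.0
    words = text.split()
    if len(words) < 5:
        score -= 0.5                      # too short to carry much signal
    letters = [c for c in text if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / len(letters) > 0.5:
        score -= 0.3                      # mostly uppercase, likely noise
    if len(re.findall(r"https?://", text)) > 2:
        score -= 0.4                      # link-stuffed, likely spam
    return max(score, 0.0)

def filter_corpus(docs: list[str], threshold: float = 0.7) -> list[str]:
    """Keep only documents whose score clears the threshold."""
    return [d for d in docs if quality_score(d) >= threshold]

if __name__ == "__main__":
    docs = [
        "The committee reviewed the report and published its findings on Tuesday.",
        "CLICK NOW http://x.test http://y.test http://z.test WIN BIG!!!",
        "ok thanks",
    ]
    for d in filter_corpus(docs):
        print(d)
```

The principle is the one Eric states a moment later: the model is only as good as the data that goes into it.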

00:44:01.254 --> 00:44:05.565
Jim Conrad: And, and that's where I,
I come always back to creativity and

00:44:05.565 --> 00:44:07.705
the artist when it comes to technology.

00:44:08.325 --> 00:44:13.605
Technology certainly has, has
released our ability to create,

00:44:13.875 --> 00:44:15.945
even in multi-track recording.

00:44:16.180 --> 00:44:20.235
You know, now you can track almost
infinitely, and use sound bites and

00:44:20.235 --> 00:44:21.705
samples and things like that.

00:44:21.795 --> 00:44:24.255
Eric Westra: But it's all
mimicry in one form or another.

00:44:24.285 --> 00:44:24.560
Jim Conrad: That's true.

00:44:24.920 --> 00:44:31.730
But as we go forward into this AI
world, how is our humanity going

00:44:31.730 --> 00:44:33.770
to be preserved, do you think?

00:44:34.940 --> 00:44:37.570
Eric Westra: Well, I think a lot of
people are asking that question now.

00:44:37.600 --> 00:44:42.689
And thankfully, some of the people like
Sam Altman, uh, you know, who heads

00:44:42.689 --> 00:44:47.165
up OpenAI, the maker of ChatGPT, are
thinking not just in terms of what the
technology can do,

00:44:47.240 --> 00:44:48.525
but how it should be guardrailed.
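
As for what a guardrail can look like in code, one narrow, illustrative example is a post-processing check on a model's draft reply, here scrubbing obvious personal data before it reaches the user. The patterns below are simplified stand-ins; real systems layer trained classifiers, policy engines, and human review over anything this crude.

```python
# Minimal illustration of one narrow kind of "guardrail": redacting
# obvious personal data from a model's draft reply before it is shown
# to a user. The regular expressions are simplified placeholders.

import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def redact_pii(draft_reply: str) -> str:
    """Replace email addresses and phone numbers with placeholders."""
    reply = EMAIL.sub("[email removed]", draft_reply)
    reply = PHONE.sub("[phone removed]", reply)
    return reply

if __name__ == "__main__":
    print(redact_pii("You can reach Jane at jane.doe@example.com or 604-555-0199."))
```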

00:44:49.095 --> 00:44:49.695
Jim Conrad: Yes.

00:44:49.785 --> 00:44:52.875
Eric Westra: Uh, anybody working in
AI who isn't thinking about that,

00:44:53.085 --> 00:44:58.485
the ethics of AI and the humanity
behind AI, shouldn't be in the field.

00:44:58.575 --> 00:45:02.985
Because the potential is
so there for, for misuse.

00:45:03.255 --> 00:45:04.785
Jim Conrad: And that's altruistic.

00:45:05.355 --> 00:45:09.225
But as we know, corporate
culture is ruthless.

00:45:09.225 --> 00:45:09.615
Eric Westra: Sure.

00:45:09.880 --> 00:45:11.384
But, but the genie's outta the bottle.

00:45:11.535 --> 00:45:12.075
Jim Conrad: Yeah, that's true.

00:45:12.080 --> 00:45:12.420
Eric Westra: I mean, right.

00:45:12.525 --> 00:45:16.815
I mean, and, and, and every time a new
technology comes along, there's always a,

00:45:16.995 --> 00:45:18.735
Jim Conrad: A hue and cry.

00:45:18.735 --> 00:45:20.625
It's, it's gonna be the end of humanity.

00:45:20.835 --> 00:45:24.375
Eric Westra: You know, uh, digitized
music was, you know, we used that example

00:45:24.375 --> 00:45:25.524
at the beginning of the conversation.

00:45:25.524 --> 00:45:31.794
I mean, everybody said, oh, it's, that's
the end of, you know, a lot of things.

00:45:32.154 --> 00:45:34.694
Uh, a lot of the, the artists
now are just gonna go for the

00:45:34.694 --> 00:45:35.805
lowest common denominator.

00:45:35.805 --> 00:45:37.935
They're not gonna get
reimbursed, et cetera, et cetera.

00:45:38.415 --> 00:45:38.805
Okay.

00:45:38.985 --> 00:45:39.495
Yes.

00:45:39.525 --> 00:45:43.825
There are changes whenever you
have a new technology introduced.

00:45:44.674 --> 00:45:46.365
Sometimes more than others.

00:45:46.845 --> 00:45:51.715
The thing is, how do we adapt
as humans to the technology?

00:45:51.775 --> 00:45:56.845
How do we make the best use of it,
and how do we make sure that we

00:45:56.845 --> 00:45:58.805
don't squander the opportunity?

00:45:59.595 --> 00:46:04.525
Instead of being afraid constantly about
what the technology might do wrong,

00:46:04.735 --> 00:46:06.155
let's focus on what it might do right.

00:46:11.545 --> 00:46:16.165
Jim Conrad: Artificial
intelligence, alien intelligence.

00:46:17.485 --> 00:46:17.545
Hmm.

00:46:18.779 --> 00:46:22.140
We live in interesting times, don't we?

00:46:23.160 --> 00:46:26.520
Here's what big tech has to say about AI.

00:46:27.540 --> 00:46:33.990
Sundar Pichai, CEO of Google, says the future
of AI is not about replacing humans,

00:46:34.290 --> 00:46:37.230
it's about augmenting human capabilities.

00:46:38.000 --> 00:46:43.590
Ginni Rometty, former CEO of IBM, says,
AI will not replace humans, but those

00:46:43.590 --> 00:46:47.550
who use AI will replace those who don't.

00:46:49.004 --> 00:46:56.924
And then perched high on the proverbial
fence, CEO of SpaceX and Tesla, Elon Musk

00:46:56.984 --> 00:47:02.955
said, AI is likely to be either the best
or worst thing to happen to humanity.

00:47:04.154 --> 00:47:04.845
Thanks, Elon.

00:47:06.044 --> 00:47:12.134
But the final word should go to
the ultimate tech nerd, Bill Gates,

00:47:12.314 --> 00:47:17.730
co-founder of Microsoft, who said, artificial
intelligence will evolve to become a

00:47:17.730 --> 00:47:25.320
superintelligence, and we need to be mindful
of how it's developed and ensure that it

00:47:25.320 --> 00:47:30.255
aligns with humanity's best interests.

00:47:34.125 --> 00:47:36.075
James Cromwell: There have always
been ghosts in the machine.

00:47:38.145 --> 00:47:43.545
Random segments of code that have grouped
together to form unexpected protocols,

00:47:45.165 --> 00:47:50.190
Unanticipated, these free radicals
engender questions of free will,

00:47:52.299 --> 00:47:59.049
creativity, and even the nature of
what we might call the soul. Or

00:48:01.140 --> 00:48:16.890
is it something

00:48:24.779 --> 00:48:25.910
more?

00:48:28.095 --> 00:48:30.885
Jim Conrad: That is the
Conovision Podcast, the spirit of

00:48:30.885 --> 00:48:35.745
storytelling, episode three, all
about artificial intelligence.

00:48:36.225 --> 00:48:41.715
My thanks to all the conscious
entities in the first AI G7 Summit.

00:48:41.985 --> 00:48:45.585
We are still trying to decode
that digital word salad.

00:48:46.075 --> 00:48:52.524
To Amazon's Alexa for dropping by with
illuminating answers to absurd questions,

00:48:53.185 --> 00:48:57.654
to Professor Yuval Noah Harari, and his
insight into the differential patterns

00:48:57.654 --> 00:49:00.685
between intelligence and consciousness.

00:49:01.435 --> 00:49:06.279
The New York Times, for the brilliant
story about Douglas Rain, the Canadian

00:49:06.279 --> 00:49:10.840
Shakespearean actor, and the voice of
HAL, the computer in Stanley Kubrick's

00:49:10.840 --> 00:49:15.670
1968 masterpiece 2001: A Space Odyssey.

00:49:16.840 --> 00:49:20.710
And as well our tech guru Eric Westra,
for his thoughts on the future of

00:49:20.710 --> 00:49:23.200
AI and what it will mean to us all.

00:49:23.980 --> 00:49:28.375
And finally, we just heard a monologue
from the film I, Robot and the

00:49:28.375 --> 00:49:31.225
majestic voice of actor James Cromwell.

00:49:31.735 --> 00:49:37.475
I am James Conrad, AKA Cono,
and this has been Conovision,

00:49:37.495 --> 00:49:39.475
the spirit of storytelling.

00:49:40.135 --> 00:49:44.935
Until next time, remember, we
are all stories to be told.