Episode 8

Published on:

13th Feb 2025

Audacity, Antiquity, and AI

In this episode of the Confluence Podcast, hosts Randall Stevens and Evan Troxel are joined by special guest Christy Chapman from the University of Kentucky to discuss how AI has unlocked antiquity.

The discussion focuses on the origins of the Vesuvius Challenge, highlighting its key figure, Brent Seales, a research professor at the University of Kentucky. Christy, who works closely with Brent, shares insights about the ambitious goal to virtually unroll and read ancient scrolls carbonized by the eruption of Mount Vesuvius.

Using advanced technology and AI, Christy explains the significant progress made, including the pivotal role of the open-source contest funded by Nat Friedman, former CEO of GitHub. The conversation covers the technical and human aspects of the project, emphasizing the importance of clear communication, community involvement, and the relentless pursuit of seemingly impossible ideas.

Later in the interview, the discussion around communication in technical projects has direct implications for both the Architecture, Engineering, and Construction (AEC) industry and software development teams. Just as the Vesuvius Challenge team had to bridge the gap between technical experts and conservators, AEC professionals and software developers must effectively communicate complex technical solutions to various stakeholders, from clients and product managers to contractors and end users. The emphasis on understanding stakeholder concerns, building trust, and creating clear communication channels is particularly relevant when implementing new technologies or methodologies in construction projects and software development cycles. The conversation highlights how successful innovation requires not just technical excellence, but also the ability to bring people along through empathy, clear explanation, and careful consideration of their perspectives and concerns.

Episode Links:

-----

The Confluence podcast is a collaboration between TRXL and AVAIL, and is produced by TRXL Media.

Transcript
Randall Stevens:

Welcome, welcome to another Confluence Podcast.

2

:

I'm Randall Stevens, and as

usual, I've got Evan Troxel with me,

3

:

and today's guest is Christy Chapman.

4

:

It's going to be a fun episode, a little

bit different than previous ones we've

5

:

done in that Christy, She isn't in the

AEC industry, but she was, uh, I was

6

:

telling Evan before the call at this

year's Confluence, she spoke, I'll give

7

:

the background about why, why she was

invited, and a little bit about what she

8

:

talked about, but she was the favorite

speaker, like I was just telling her,

9

:

it's like, we survey everybody afterwards,

and it's always like, a lot of fun, so.

10

:

First of all, thanks Christy for coming

on and yeah, this is going to be fun

11

:

to talk about so I'll kind of cue it

up and then we can just kind of riff

12

:

on this and you can jump in but So

the connection is that I'm here in

13

:

Lexington, Kentucky. I've been involved.

14

:

I'm a graduate of the

University of Kentucky.

15

:

I went to architecture school

there and I've known for now 25

16

:

years a guy named Brent Seales,

who's, uh, you can look him up.

17

:

Uh, he's kind of famous now.

18

:

Uh, and you'll, you'll learn through

this conversation why, but he, when I

19

:

first met him, you know, in the late

nineties, uh, he was at the university

20

:

and was really the computer vision expert.

21

:

Uh, one of those guys there that

was, I was into the graphic side.

22

:

And so I got introduced to Brent and

got to know him at the university.

23

:

So he's been a research professor

at the university of Kentucky.

24

:

Uh, since, you know, sometime, I assume,

in the 90s that he came and, uh, became

25

:

a, uh, a tenured professor there, uh, in

the computer science department, but he

26

:

was running what was called the Center for

Visualization and Virtual Environments,

27

:

and I was just involved over there because

of all the things, graphics, and all the

28

:

kind of cool stuff that they were doing.

29

:

Fast forward, you know, uh,

if you've seen, I keep saying

30

:

it's like the sexiest podcast

31

:

project that you could

imagine being involved with.

32

:

So Brent had this bright idea

and We're going to try to get

33

:

Brent on, uh, some point too.

34

:

We'll, we'll get Brent on here to talk

about this, but at some point, 20 odd

35

:

years ago, Brent had this idea that you

should be able to use a CT scanner or,

36

:

you know, scan these Vesuvian scrolls.

37

:

So if you know when Mount

Vesuvius erupted, you know,

38

:

it covered Pompeii in ash.

39

:

Now what they found is

there are all these scrolls.

40

:

I would say they look

like little piles of poop.

41

:

They

42

:

look like a turd.

43

:

Uh, but it's like these scrolls were

covered in ash and there are hundreds of

44

:

them that they found, but they could never

read them or know what was inside of them.

45

:

So Brent started this project 20 years

ago thinking that, Hey, could you

46

:

scan these like with a CT scanner?

47

:

And then.

48

:

virtually unwrap them so that you

could read what was on these pages.

49

:

It sounds like crazy.

50

:

Crazy enough that 20 years

later, they're actually doing it.

51

:

So it's actually, you know,

it's, it's, it's amazing.

52

:

You know, it's one of those, and

that's what Christy can, you know,

53

:

her talk was like, you know, how do

you go from some crazy idea through

54

:

all these hurdles and hoops and ups

and downs and all this kind of stuff.

55

:

But, uh, for those of you, you know,

listening to this podcast, go Google

56

:

Vesuvius challenge and go watch.

57

:

They've been on 60 Minutes, they've

been on all kinds of TV shows and all

58

:

these interviews now that it's happened.

59

:

But anyway, I'll use that to tee it up.

60

:

Christy, maybe you can tell a little

bit about yourself and . We can just,

61

:

Christy: Yeah, well, thanks

so much for having me.

62

:

I had a great time at the, uh,

Confluence event, and really

63

:

enjoyed meeting everybody.

64

:

Um, I'm happy to hear that everyone

enjoyed my talk because you, if you

65

:

remember, I did go a little bit over

and so people were getting really

66

:

hungry, so I'm glad to know that

didn't affect anything, but, Um,

67

:

I've been working for Brent Seales.

68

:

That's um, as uh, Randall said,

uh, that's the professor's name.

69

:

You can Google that too.

70

:

It's probably, a lot more will

pop up about the history and

71

:

everything if you use his name

instead of the Vesuvius Challenge.

72

:

But, I've been working for him since 2016.

73

:

I kind of started working for him as

just a freelance writer and editor.

74

:

And then, um, my role kind of

evolved into basically a, you know

75

:

Chief cat herder is what I say.

76

:

I try to manage or I do manage.

77

:

Pretty much all of the projects,

um, uh, on the logistics side.

78

:

I do everything except

code, is what I tell people.

79

:

And I actually do a little bit

of that, but not for my job.

80

:

So, um, yeah.

81

:

I came to the project right after

the En-Gedi scroll was, that work had

82

:

been done and helped write that paper

and get that paper published and out.

83

:

And, um, it made front page news

way back then.

84

:

Um, so Yeah, that's a little bit about me.

85

:

I actually work remotely most of the

time from Florida But I am back and forth

86

:

to UK because most of what I do on the

team I don't have to be on campus to do.

87

:

Randall Stevens: Yeah.

88

:

So just to dig in a little bit more

background about the kind of evolution

89

:

of that project or kind of what led to

all this was they began scanning and kind

90

:

of just proving out the theory and then

writing code in the early days, literally

91

:

to, you know, if you imagine taking a

piece of paper and rolled it up and if

92

:

you were to scan it, the theory was that

the ink would show up or whatever was

93

:

on the paper would show up different in

those scans and those cross section scans.

94

:

And then the trick would be, okay, if

you've scanned through this, can you

95

:

virtually put it back together and then

96

:

unfold it, unwrap it?

97

:

And that was the first stuff went up.

98

:

First,

99

:

Christy: Well, and that was that was

the principle that continued I mean

100

:

with the scroll from Engaddy, which

I mentioned that was the first time

101

:

Anything had ever been virtually or

digitally opened and read that could

102

:

never be opened and read physically.

103

:

And that was actually a scroll that was

found near the Dead Sea, um, in Israel.

104

:

It was found in the 70s.

105

:

They were excavating this, uh, ancient,

uh, not ancient, but, Medieval Byzantine

106

:

time period synagogue that had burned and

came across where the ark would have been.

107

:

They came across that spot and noticed

there were these little charcoal Things

108

:

about the size of my finger lying there.

109

:

So, one of the archaeologists scooped

them up and put them in a box, and

110

:

they sat on a shelf until, you know,

:

111

:

the Dead Sea Scrolls and stuff.

112

:

And so they went ahead and scanned that,

Like you said, in a micro CT scanner.

113

:

And because at the time it was

written, most of the ink was made,

114

:

um, with ingredients

115

:

that included some type of metal

like, um, you know, uh, lead in

116

:

the ink or something like that.

117

:

And because the way x ray works

is it basically is looking

118

:

for differences in density.

119

:

Because the metal in the ink

is so much denser than the, uh,

120

:

than the parchment on which the

information was written.

121

:

It showed up in the x ray.

122

:

And so, you know, there was a process.

123

:

It didn't just appear, which everyone

always kind of thinks, but there

124

:

was a set of algorithmic steps

that the data had to go through,

125

:

um, in order for that to happen.
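
To make the density point concrete, here is a small, purely illustrative Python sketch (not the project's actual processing, and every number in it is invented): a metal-bearing ink stroke is denser than the parchment around it, so a simple intensity threshold on a CT slice picks it out, while a carbon-ink stroke with essentially the same density as its support does not.

```python
# Illustrative sketch only; all values are made up for the demonstration.
import numpy as np

rng = np.random.default_rng(0)

# Fake 2D "CT slice": parchment background with a small inked stroke.
parchment = rng.normal(loc=0.40, scale=0.02, size=(128, 128))
slice_metal_ink = parchment.copy()
slice_carbon_ink = parchment.copy()

ink_mask = np.zeros((128, 128), dtype=bool)
ink_mask[60:68, 30:100] = True  # a crude "stroke"

# Metal-bearing ink: clearly denser than parchment in the scan.
slice_metal_ink[ink_mask] += 0.30
# Carbon ink on carbonized papyrus: essentially the same density.
slice_carbon_ink[ink_mask] += 0.005

def threshold_recall(ct_slice, mask, thresh=0.55):
    """Fraction of true ink pixels recovered by a plain density threshold."""
    detected = ct_slice > thresh
    return (detected & mask).sum() / mask.sum()

print("metal ink recovered :", threshold_recall(slice_metal_ink, ink_mask))   # ~1.0
print("carbon ink recovered:", threshold_recall(slice_carbon_ink, ink_mask))  # ~0.0
```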

126

:

So that was the first time, and because

it was out of, um, because it had the, uh,

127

:

some type of metal in the ink, it was, it

was easier than the Herculaneum scrolls.

128

:

The Herculaneum scrolls had a

couple of, um, More challenging, um,

129

:

issues than the scroll from En-Gedi.

130

:

Randall Stevens: Yeah, they, some of

the imagery that you'll see if you go

131

:

and Google around this project was,

you know, fast forward, AI has now come

132

:

into play and they actually, we can

maybe dive in a little bit, Christy,

133

:

to the story about how this all got,

kind of came together during kind of

134

:

COVID time, but anyway, there was a

135

:

challenge put together and now they're

using AI because, um, It's not even human.

136

:

You can't even, a human can't even see the

137

:

letters

138

:

on pieces of parchment that you know, that

139

:

you, that you can read.

140

:

It's indiscernible to a human eye and

it's so minute little changes that only

141

:

you know, now all of a sudden using

some training with ai all of a sudden

142

:

is being able to pick up all this.

143

:

Anyway, it's like I

said, it's a very sexy.

144

:

Project.

145

:

Christy: At least a positive, um, you know,

positive kind of hopeful use of AI.

146

:

It's actually giving something

to society instead of really.

147

:

You know, taking something

away or replacing something.

148

:

Um, the way that the Vesuvius Challenge

came about is, um, like I said, because

149

:

the ink of, um, the En-Gedi scroll

had metal in it, it, it, it, It was,

150

:

our process worked very well on it.

151

:

But the problem with the ink that was

used in the, um, in ancient Rome in

152

:

the, at the time these were penned,

uh, is that it was made outta carbon.

153

:

Um, you know, they would like burn wood

literally and mix the soot with something,

154

:

and that would, and something else.

155

:

And that would be their ink.

156

:

So when you take something that

is, um, you know, take an ink made

157

:

out of carbon, and then you write

on papyrus, which is a, a carbon.

158

:

Um, object has carbon in it, and

then you carbonize it all, you

159

:

have carbon on carbon in carbon.

160

:

So it becomes impossible, yeah, you're

exactly right, it becomes impossible

161

:

to see the, the ink in the x ray.

162

:

Um, it's completely invisible,

like it just looks like a,

163

:

a blank x ray of papyrus.

164

:

Um, and so, I guess about in 2017, they

started working on a convolutional neural

165

:

network, which is just a type of, you

know, machine learning tool to train up

166

:

a network that could look at the CT data.

167

:

Metaphorically speaking, of course.

168

:

Look at the CT data, um, learn

that this voxel, which is a 3D

169

:

pixel, basically, very, very tiny.

170

:

This tiny 3D spot, um, has ink

and this tiny 3D spot doesn't.

171

:

And you train the network to

recognize the difference in the data.

172

:

Something that we can't see, but

the computer, because at the end

173

:

of the day, data is all just a

bunch of zeros and ones, right?

174

:

Arranged in different ways or whatever,

and so we could pick up the different

175

:

way the data appears when there's

ink versus when there isn't ink.

176

:

And we had to train that network,

we had to take photographs of some

177

:

of the scrolls that had been opened,

you know, hundreds years ago.

178

:

Um.

179

:

Because they did try.

180

:

I mean, when they found them at first,

they burned them because they, like I

181

:

said, they look like, and you said that

they look like just charred logs or waste,

182

:

and so they burned them for fuel, but

then eventually someone dropped one, and

183

:

they realized there was Greek in it, and

so they started excavating and really

184

:

tried to open them and read them, and they

sort of successfully were, were able to

185

:

open some, and, and there have been some

editions created of the opened papyrus

186

:

telling us what they say, but it's, that

was still very challenging and incomplete.

187

:

It's a very fragmented collection, but

those, that destruction actually, which

188

:

if you Google and you see pictures,

you'll see it's just a huge mess.

189

:

That destruction actually is what provided

the ground truth or the training data

190

:

that we needed because we took pictures

of those and And we Also put those, put

191

:

some fragments in a micro CT scanner,

the same one, we had a photo of it and

192

:

then we had x ray data of it, and aligned

those in a way, um, so that every spot

193

:

where there was ink in the photo, we

could tell the micro CT data, this spot

194

:

has ink, and so you go through that

whole process, you know, Lots and lots

195

:

and lots of times, and it learns what

the data signature, if you will, is when

196

:

there's ink versus when there's not ink.
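
A minimal sketch of the ground-truth step described here, assuming the hard part (registering a photograph of an opened fragment to its micro-CT scan) has already been done; the function name photo_to_ct and the array shapes are hypothetical stand-ins, not the team's actual code.

```python
# Hedged sketch: once a fragment photo is aligned to its micro-CT volume,
# every ink pixel in the photo labels the corresponding voxel as "ink".
# `photo_to_ct` is a hypothetical mapping assumed to exist already.
import numpy as np

def build_voxel_labels(ct_shape, ink_mask_photo, photo_to_ct):
    """Return a boolean label volume: True where the aligned photo shows ink.

    ct_shape       -- (z, y, x) shape of the micro-CT volume
    ink_mask_photo -- 2D boolean array derived from the fragment photograph
    photo_to_ct    -- function (row, col) -> (z, y, x) voxel index
    """
    labels = np.zeros(ct_shape, dtype=bool)
    rows, cols = np.nonzero(ink_mask_photo)
    for r, c in zip(rows, cols):
        z, y, x = photo_to_ct(r, c)
        labels[z, y, x] = True
    return labels
```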

197

:

And then you just apply that to

new CT data on a fragment or a

198

:

scroll that it's never seen, and

the idea is that it will be able to

199

:

do the same thing in that scroll.

200

:

It will be able to take what it's

learned from the old, from the other

201

:

thing, the fragment, and apply it

to the thing it's never seen before

202

:

and make those same determinations.

203

:

Yeah, here's ink, there's not ink, etc.

204

:

And that's what makes the difference.

205

:

The ink appear and, you know, we always

have a, we always have to explain to

206

:

people that this is, we're, this is

not letter recognition, you know, we're

207

:

not training a network to recognize

using English, uh, an A or a B or a C.

208

:

We're training a network to

recognize that this tiny little

209

:

3D square basically has ink.

210

:

So, if you can imagine, you know,

just pinpointing each of those

211

:

spots where the computer sees.

212

:

realizes there's ink, eventually you

see a letter shape, and eventually

213

:

you see spaces, and eventually you see

words, and eventually you see lines.

214

:

So that's how it works.
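
For readers who want a concrete picture of "train a network to recognize that this tiny 3D spot has ink," here is a schematic PyTorch sketch of that kind of per-voxel classifier; the architecture, sizes, and random stand-in data are placeholders, not the Vesuvius Challenge models.

```python
# Schematic sketch of a per-voxel ink classifier: a small 3D CNN looks at a
# cube of CT data around one voxel and predicts "ink" vs "no ink".
# Everything here is a placeholder, not the project's actual network.
import torch
import torch.nn as nn

class InkDetector3D(nn.Module):
    def __init__(self, patch=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
        )
        self.classifier = nn.Linear(16 * (patch // 4) ** 3, 1)

    def forward(self, x):                      # x: (batch, 1, patch, patch, patch)
        h = self.features(x)
        return self.classifier(h.flatten(1))   # one logit: ink vs. no ink

model = InkDetector3D()
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One schematic training step on random stand-in data: `cubes` would be CT
# sub-volumes cut around surface voxels, `labels` the photo-derived ink labels.
cubes = torch.randn(32, 1, 16, 16, 16)
labels = torch.randint(0, 2, (32, 1)).float()
loss = loss_fn(model(cubes), labels)
loss.backward()
opt.step()
```

At inference time the same kind of model is slid across sub-volumes of a scroll it has never seen, and the per-voxel predictions are what eventually assemble into letter shapes, as described above.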

215

:

Randall Stevens: Also

means any language, right?

216

:

Christy: Yeah, exactly.

217

:

And, and It would work on drawings even,

you know, anything that, That you can

218

:

train in terms of how the ink would

look, you know. So the way the Vesuvius

219

:

challenge came about is we had done that

work and we published a paper

220

:

where we had done that work on prototypes

and we had a very, very tiny Herculaneum

221

:

fragment, like, just one character was

like a lunate sigma, um, and we had, we had

222

:

proven that the concept work, but it was

a, it's, if you know anything about AI,

223

:

it requires a lot, a lot of training data.

224

:

And we didn't have very much, and

um, it also requires just a lot of

225

:

coding and a lot of work, and we

were, we had basically one graduate

226

:

student working on the problem.

227

:

So, progress was slow, then 2020

happened, and of course, you know,

228

:

we all know that our lives all

kind of came to a screeching halt.

229

:

Um, because even if we could

continue working in the lab,

230

:

on the computer, it was still a

very distracting, upsetting time.

231

:

So, progress slowed down, and because

people didn't have anything to do, as

232

:

you know, a lot of people just started

surfing the internet, trying to fill time.

233

:

And Nat Friedman is actually one of

those, and he fell down the rabbit hole

234

:

of, uh, Information about ancient Rome

and one link led to another, and he

235

:

stumbled upon a lecture that Brent gave


236

:

and the progress we had made, and that we

knew that this, this method would work.

237

:

Randall Stevens: just for everybody,

Nat, Nat was the CEO of GitHub.

238

:

Christy: right.

239

:

Randall Stevens: at Microsoft, had

made a lot of money at Microsoft, and

240

:

found this project and called up Brent.

241

:

Christy: Yeah, he, uh, he, uh, He read

about it, listened to the talk, and

242

:

then, you know, read about it, and

there's a quote from him, uh, in one

243

:

of the news articles that came out

later after the, after the challenge,

244

:

when the grand prize was awarded, and

he, it was like, something like, How

245

:

the hell did I not know about this?

246

:

I mean, and then he called it the coolest,

you know, the coolest tech project ever.

247

:

So, yeah, he reached out to Brent, um,

Brent didn't really recognize him, or,

248

:

he just sent emails, and, Eventually,

Brent did respond and, um, to make a long

249

:

story short, they met, um, they talked

about what the challenges were, where we

250

:

were, and why we were, why it was so slow.

251

:

Academic research is very slow.

252

:

Um, it involves, you know, you gotta

write a grant and get funding and

253

:

then you have to find the right,

you have to have the students and

254

:

et cetera, et cetera, et cetera.

255

:

So, they came up with this

idea and Nat suggested, well,

256

:

hey, why don't we do a contest?

257

:

I'll raise the money.

258

:

You know, from all my

259

:

Randall Stevens: Rich friends.

260

:

Christy: Silicon Valley entrepreneur

type people who like to venture,

261

:

uh, invest in new ventures.

262

:

Um, and you can release all your data.

263

:

You can release all the software

code that you've already written.

264

:

You can write tutorials so that

people can understand how to use the

265

:

things that you've already written.

266

:

And it'll be great.

267

:

We'll, you know, have a contest and

somebody will be a winner and we'll win

268

:

and we'll be reading Herculaneum.

269

:

So, um, that's how that came about,

and, and, uh, Brent agreed, even though,

270

:

it was, you can imagine that we, we

actually had team meetings about it to

271

:

decide, do we really want to do this?

272

:

What were the risks?

273

:

And what was the upside?

274

:

And was it worth it?

275

:

And academia is a place where people

hold on very, very tightly to their

276

:

Randall Stevens: I don't think

it's just there though, Christy,

277

:

I think that's one of the big

278

:

Christy: Oh, really?

279

:

Randall Stevens: in this, right?

280

:

That people tend to, you know,

They want to hold on to everything.

281

:

Everything's like some intellectual

property and, you know,

282

:

sometimes you have to let it go.

283

:

I mean, we talk, uh, on this podcast

because we have a lot of people

284

:

developing software and using technology

about, you know, build or buy.

285

:

Do I, you know, do I want

286

:

Christy: Mm

287

:

hmm.

288

:

Randall Stevens: Do I

289

:

want to build this and kind of own

it and it's some proprietary thing

290

:

or should I be, you know, looking

for something that's commercially

291

:

available out there and, you know, is

it really special what I'm doing or not?

292

:

But I

293

:

think there, I think there are

294

:

lessons to be learned in,

in this part, in this story.

295

:

Christy: Yeah.

296

:

Well, it was like, yeah, it was very,

um, it was very difficult To decide

297

:

that because Stephen was really

the grad student, Stephen Parsons.

298

:

He was in the middle of his PhD work and

he's basically giving it up, you know,

299

:

um, to the world, yeah.

300

:

And it it, it took some work for

our, for the little team that we had.

301

:

There was me, who's not a coder.

302

:

There was Dr.

303

:

Seales, who really doesn't, or Brent, who

doesn't code anymore, and then we had

304

:

two guys, you know, who were working on

trying to help get the people up to speed

305

:

about how to use the tools, you know.

306

:

Anyway, so Nat raised the money and

we started with a $200,000 prize pool

307

:

and then within like, I don't know,

just a couple of days, the word spread

308

:

and other people contacted him and

wanted to contribute and we wound

309

:

up with a million dollar prize pool.

310

:

and then there was some press about

the contest launching, and that

311

:

was on a couple of podcasts that

are heavily, um, listened to by

312

:

coders, programmers, and so yeah.

313

:

before long, you know, we had a lot of

people on the, on the, um, Discord server,

314

:

which is the chat, um, Uh, platform that

they used to communicate with each other

315

:

and off we went, So it was exciting, um,

and also nerve wracking a little bit,

316

:

but in the end it really, it really was a

genius move on Brent's part because, you

317

:

know, we achieved so much more than we

would have been able to achieve just with

318

:

one person.

319

:

Randall Stevens: Christy, the

first, the prize was broken up

320

:

into like, first one to decipher.

321

:

X amount of text or the first

words even, or a sentence.

322

:

One,

323

:

it was a word.

324

:

right?

325

:

The

326

:

Christy: Right.

327

:

Yeah.

328

:

So the first, the first thing

they, they did is they had a

329

:

Kaggle contest, which was basically

just, um, improving the ink ID.

330

:

So they did a little contest then just

on Kaggle, but the, the major one was,

331

:

yeah, the first letters prize, um, which

you had to find so many letters within

332

:

a certain, um, certain size space

333

:

with only so many missing, right?

334

:

and it is so they they intentionally

made it a first letters prize Because

335

:

you might find a you might find letters

within that space with some missing

336

:

letters and therefore you're not

going to be able to Have a word but

337

:

turned out, you know, Luke Farritor,

the person who won, he found a word:

338

:

"purple."

339

:

Randall Stevens: Wasn't he an undergrad?

340

:

Christy: Yes, he was an undergrad?

341

:

at the University of

342

:

Randall Stevens: Nebraska

343

:

Christy: and he had been working as a

SpaceX intern Um, and he heard about

344

:

the contest basically, um, through

Nat talking on the podcast about it.

345

:

And he was like, Oh, I got to do this.

346

:

You know, it's kind of funny, they

use this term called nerd sniped.

347

:

Um, and I guess, you know, they,

there are so many people who are so

348

:

passionate about this project that

it's, it's really been interesting,

349

:

um, to see the, You know, technological

community, the computer programmers

350

:

get so excited And be so invested, you

know, Luke took his money that he won

351

:

$40,000, and he bought more computers.

352

:

So he had more computing power because

you have to have a lot of computing

353

:

power to, I mean, the, the challenge

had set some up, you know, that people

354

:

could use, but he, he, He bought more

computers, reinvested in other words.

355

:

Um,

356

:

Randall Stevens: And then, and then

357

:

Evan Troxel: you said?

358

:

Christy: So it was in 2020 when

Nat sort of fell down the rabbit

359

:

hole of, um, the Rome, ancient Rome

and found out about Brent's work.

360

:

And then what happened was he just kept

following, waiting for us to reveal

361

:

more text and there was no, no word.

362

:

2020 turned into 2021, 2021 turned

into 2022, and when Nat finally reached out to

363

:

Brent was in 2022, like in

the summer of that year.

364

:

They met, and then we had a

meeting in January, and then the

365

:

contest launched in March of 2023.

366

:

Randall Stevens: it went fast too,

Evan, if you go look at the story,

367

:

the, uh, The first word, and then it

was, and then within just a handful

368

:

of months, they had done hundreds of

words like were, you know, it went

369

:

really fast once they started, you

370

:

Christy: Once it worked,

it worked, yeah, The,

371

:

Evan Troxel: I have a few questions here.

372

:

I'm just curious because

it started going fast.

373

:

So, so were, were teams allowed

to build on other teams work?

374

:

Is this kind of like the whole

idea was that it was all open?

375

:

Every, everything

376

:

everybody did was open.

377

:

Christy: Yes, it was an open

source, um, completely open.

378

:

source contest.

379

:

And, um, You know, the, the organizers,

the team, um, was really brilliant in

380

:

that they realized even though it's

supposed to be open source, Right.

381

:

people are not necessarily

going to share what they're

382

:

doing, what they're working on.

383

:

And everybody, we all knew it was a, it

was going to take a village, literally.

384

:

I mean, it was going to take

everybody working and everybody

385

:

sharing because there was just

so many parts to the puzzle.

386

:

And so they set up what

were called progress prizes.

387

:

The, uh, First Letters

Prize was one of those.

388

:

It was the, it was a big one,

but they also set up other

389

:

prizes: $5,000, $2,000, $10,000.

390

:

They would have a call for a

particular tool or a particular

391

:

problem to be solved or whatever.

392

:

And, uh, People would do that

work and submit it and they

393

:

would be rewarded for it.

394

:

And one of the stipulations was that if

you submitted for the progress prize,

395

:

you had to release that data and release

or release that code so it could be

396

:

used by the rest of the community.

397

:

So, yeah, by the, by the time

the contest ended, um, or maybe

398

:

even by the, yeah, by the end of

the year, that was the deadline.

399

:

December 31st, 2023 was the deadline.

400

:

They had awarded something

like 50 progress prizes.

401

:

Um, you know, I think they've, except

for the $40,000 First Letters prize

402

:

they varied from like $1,000 to $10,000,

403

:

Randall Stevens: that, that, that the

big prize that got awarded the young man

404

:

that won the original prize teamed up

with two others right in Europe, and they

405

:

ended up

406

:

together winning the big grand prize.

407

:

Christy: Yeah, one of the things that,

you know, the other thing that Progress

408

:

Prizes did was alert team members to

each other and what their strengths

409

:

were and what They were working on, etc.

410

:

And so, yeah, Luke and, um,

uh, Youssef, uh, Luke Farritor

411

:

was from the University of

412

:

Nebraska. Youssef is from Egypt

originally, but he was a graduate

413

:

student at a university in Germany.

414

:

Um, Youssef came in right behind

Luke, um, and was the second prize

415

:

winner in that first letters prize.

416

:

And actually, when you talk about it

moving fast, Um, we had those, those

417

:

first letters, and, from Luke, and

then within a week, we had, you know,

418

:

five partial columns from Yusef.

419

:

So it just really went, and then by the

end of the year, so that was in October,

420

:

when that was awarded, um, by the end

of the year, December 31st, we ended up

421

:

with those 15 and a half columns, um,

which was the most, You know, text ever

422

:

revealed from, from inside something

like that that couldn't be opened.

423

:

Randall Stevens: They think,

Evan, that these scrolls were in

424

:

the private library of the father

in law of Julius Caesar, right?

425

:

Evan Troxel: Wow.

426

:

Christy: yeah, that's right.

427

:

Evan Troxel: Geez.

428

:

it's the snowball,

429

:

Randall Stevens: hundreds of them still

430

:

there, right, that they haven't done

431

:

Evan Troxel: charcoal, charcoal, fingers,

432

:

but so,

433

:

Christy: These are actually bigger.

434

:

The En-Gedi scroll is small,

but these are more like, you

435

:

know, this size, I would say.

436

:

Evan Troxel: So, maybe talk about

the physical, like what's actually

437

:

happening with the, micro CT machine.

438

:

So like, is it, it's bombarding this thing

from all sides and, and capturing what,

439

:

like, I'm just curious, like what this

440

:

process looks

441

:

like in physical world.

442

:

Christy: Sure,

443

:

So it's a cat scan, basically

super high powered cat scan.

444

:

Same thing you would have done,

you know, your brain or whatever.

445

:

But, um, this is much, much higher

446

:

energy and more radiation.

447

:

So, you know, much higher than

448

:

Randall Stevens: Yeah, they've

had to go to didn't they go to

449

:

London or some different place

to find the right equipment?

450

:

It's not just pure.

451

:

Christy: right.

452

:

Like, yeah,

453

:

Evan Troxel: can't just get it

454

:

Christy: we have,

455

:

yeah, we've used desktop

machines, but they weren't

456

:

capturing fine enough resolution.

457

:

In other words, like I was saying, the

ink is in these little tiny voxels, right?

458

:

You're telling it voxel by voxel.

459

:

Yes, there's ink.

460

:

Yes, there's ink.

461

:

And if You say your voxels are, you know.

462

:

Randall Stevens: How many microns?

463

:

So

464

:

Christy: is is getting up.

465

:

This you're gonna miss it

It's got to be really fine.

466

:

So the tape the benchtop sources

weren't, weren't advanced enough. So

467

:

yeah, what we did is we took the scrolls,

to a particle accelerator a synchrotron

468

:

high energy physics facility that does

all kinds of experiments and stuff But

469

:

one of them is they can do Micro CT, um,

and that was no small feat, that's a whole

470

:

other story, but getting the institutions

that own these precious objects to allow

471

:

us to pack them up, fly them to, you

know, Or take the train, in that case.

472

:

They went from Paris, actually.

473

:

This, this, the one, the first one that,

um, they, they virtually unrolled was

474

:

from Paris, the collection in Paris.

475

:

Um, they took the train

to, you know, Oxford.

476

:

It's near Oxford in England.

477

:

That was a, that was a

major accomplishment and

478

:

a major ordeal in and of

479

:

Randall Stevens: there's different,

480

:

these scrolls, even though they originated

in Pompeii, they've ended up, some of them

481

:

have ended up in museums around the world.

482

:

Christy: yeah, they were found

in a sister city of Pompeii.

483

:

called Herculaneum.

484

:

So the volcano is here and Pompeii is

over here and Herculaneum's back there

485

:

Randall Stevens: did I tell you

that I was there a few years ago?

486

:

Yeah, I actually was near,

uh, near the Vesuvius and got

487

:

Christy: Oh, the volcano, Yeah.

488

:

Randall Stevens: but this past summer

we, uh, we vacationed in Sicily and

489

:

there's another active volcano there.

490

:

Literally, it was like in

our backyard, like spewing.

491

:

I was like, holy crap, this

is, it harkened back to like,

492

:

I'm like, I'm going to bed.

493

:

I may not be here in the morning.

494

:

If this thing explodes,

it's like, Start to feel it.

495

:

Christy: and even, uh, there's some

near Naples on the, other side of

496

:

Naples that are active, you know, too.

497

:

so basically, you know, we had to devise

a way of, of placing these objects in the

498

:

scanner, in the beam is what we call it.

499

:

Basically, yeah, it just,

it, it's, there's a beam that

500

:

comes out, an X ray beam.

501

:

What happens in a particle accelerator is

you have electrons in a circle.

502

:

You can, you can Google diamond

light source and you'll see this

503

:

beautiful picture of the facility and

it's just a big, huge metal circle.

504

:

And the electrons go around faster

and faster and faster and faster.

505

:

And when they're diverted from

that course, they emit an X ray.

506

:

So, around that circle, uh, every

so many meters, there's a, uh,

507

:

an X ray chamber, basically.

508

:

And the beam is, is diverted, they use

magnets, actually, to move the electrons.

509

:

But, it's, uh, the, uh, beam

is diverted into these, uh,

510

:

into that space

and creates the x ray.

511

:

You have the beam coming in and

then you have a detector basically

512

:

that captures the picture on

the other side. That's how it works.

513

:

It's, it's just a 3D x

ray is really what it is.

514

:

And yeah, you have it.

515

:

We, you, we had to put it

in there vertically in order

516

:

to get the best imaging.

517

:

And so we created special cases, you

know, that were form fitted and the

518

:

way it works is the object sits in the

beam that's coming out and it rotates.

519

:

It's ever so minutely and captures

X-rays at every one of those points.

520

:

And then at the end there's

software that takes all of that

521

:

and puts it together, reconstructs

it into a 3D model of the object.

522

:

And because it's an X-ray, you

can see everything inside it too.

523

:

You don't just see the 3D shape of it,

but you get these every time it turns.

524

:

You know, you, you get these slices.

525

:

Um, and you can see the, structure

and it's those slices that we

526

:

use to do the virtual unwrapping,
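
The rotate, capture, and reconstruct loop described here is, in textbook form, tomographic reconstruction. The following scikit-image sketch simulates the principle on a single 2D slice; it is only an illustration of the idea, not the synchrotron facility's actual reconstruction pipeline.

```python
# Textbook illustration of the rotate-and-reconstruct idea: many 1D X-ray
# projections taken at different angles are combined back into one 2D slice.
# The real scans are far larger; this uses a standard test phantom instead.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

slice_2d = rescale(shepp_logan_phantom(), 0.5)         # stand-in cross-section
angles = np.linspace(0.0, 180.0, 180, endpoint=False)  # one projection per degree

sinogram = radon(slice_2d, theta=angles)               # what the detector records
reconstruction = iradon(sinogram, theta=angles)        # rebuild the slice

error = np.abs(reconstruction - slice_2d).mean()
print(f"mean reconstruction error: {error:.4f}")       # small -> slice recovered
```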

527

:

Randall Stevens: We'll put some

links in the in the show notes to

528

:

these videos that show all of this.

529

:

It's uh, yeah, I mean, it's just,

it's amazing when you go look at it.

530

:

Evan Troxel: It sounds absolutely

531

:

incredible, like straight out

of a movie, and I'm pretty sure

532

:

they do this every week on like

a some CSI show on CBS, right?

533

:

I but it is, like science fiction

534

:

come

535

:

to reality

536

:

Randall Stevens: can't make it up.

537

:

And you know, I mean, I think Christy,

you know, the, the talk you gave was,

538

:

you know, a lot about just, how audacious

this just even for Brent to like, have

539

:

this thought, and then to dedicate

that much time and energy, Right?

540

:

with all the ups and downs and everything.

541

:

It's like, it should never have happened.

542

:

And then when it

543

:

finally does,

544

:

everybody just assumes it like,

Oh, they did this overnight.

545

:

It's like, no, no, no, no, no, no.

546

:

Evan Troxel: You

547

:

invented tons of

548

:

Randall Stevens: on lots and lots of work.

549

:

And, You know, to Brent's credit,

right, really became his life, you know,

550

:

passion for, but it's, uh,

you know, I say it's sexy

551

:

because it's like, it's

technology, it's antiquity.

552

:

It's the, you know, who knows what's

written on all these things, right?

553

:

That will get unlocked and what

people were thinking and writing

554

:

about it in the time and, um,

555

:

Christy: Well, and it's really, it's

really important in terms of that

556

:

because we have so little material

actually from the ancient world.

557

:

A lot of what we have that was written by

people actually living in antiquity was

558

:

copied at some point by the medievalists.

559

:

yeah.

560

:

Um, and so you, you get it

through some, you, you get it

561

:

almost secondhand, So to speak.

562

:

You know, there was an original

word that was copied, and that's

563

:

kind of what we have today.

564

:

So we have very little original

material, actually, from antiquity.

565

:

And, you know, people have said who

are experts in ancient history and

566

:

things like that, that because there

is so much, I mean, you're right,

567

:

there are hundreds of these scrolls.

568

:

And if we can virtually unwrap them and

open them all, that it could change a lot

569

:

about, you know, what we think about the ancient

world or, you know, at minimum, explain a

570

:

lot more, uh, most of what they've found

so far is just philosophy, philosophical

571

:

texts, you know, um, Epicurean philosophy

in particular, but not everything.

572

:

And, you know, I don't know what the

percentage of the overall collection

573

:

that's actually been opened that they've

been able to read, but it's pretty small

574

:

compared to, you know, if you count

all of the scrolls that they found.

575

:

So, yeah, it is an important

collection, you know, because of that.

576

:

Randall Stevens: So that's

the big challenge now, right?

577

:

That, uh, now that they kind of

have started getting the technology.

578

:

one, there's different people

want to claim ownership of

579

:

Christy: Yeah, well, you know,

580

:

Randall Stevens: It's hard to get

your hands on the scrolls now.

581

:

It's like everybody starts, you

know, kind of withdrawing into

582

:

their corners and it's like, okay,

583

:

Evan Troxel: Mm.

584

:

Yeah.

585

:

Christy: because, um, They've always been

valued, you know, and they, they, the

586

:

libraries and institutions, what happened

was when they were found, the, the king

587

:

of Naples at the time, sort of gave them

out, um, to fellow dignitaries, um, pretty

588

:

much, I think it's peace offerings, right?

589

:

They, he gave a set of three or five,

maybe, to Napoleon, and that's why France

590

:

has some, because, you know, Napoleon

was running around taking over everybody,

591

:

so.

592

:

Randall Stevens: gift here.

593

:

Christy: Yeah, so France wound up with

some, um, and then they, he gave several

594

:

to the Prince of Wales at the time, and

those are now in the British Library

595

:

in London and at the, um, University

of Oxford's Bodleian Library in Oxford.

596

:

Um, the ones in Paris are

at the Institut de France.

597

:

Um, so that's how that happened

that they got so dispersed.

598

:

Um, and yeah, now, They've always been

well taken care of, but nobody ever

599

:

really thought they would amount to

anything other than a relic from the past.

600

:

Right.

601

:

So now, yeah, It's even the people

who own them have gotten a lot

602

:

more, um, interested in what

we're doing and, and, and stuff.

603

:

People, you know, a couple of

originally Brent, when I think I, I

604

:

mentioned in my talk, he originally

went to the University of Oxford.

605

:

Um, and asked him because he knew

they had, had some scrolls and

606

:

asked him if he could scan them.

607

:

And they're like, no, no, thanks.

608

:

You know, because again, it

was a pretty audacious idea.

609

:

This is back.

610

:

This was in 2005.

611

:

Um, and I mean, you know, we

didn't have an iPhone in 2005.

612

:

I don't think. I think the iPhone was 2008 or

613

:

Evan Troxel: Do that.

614

:

Yeah.

615

:

Christy: So when you think about,

if you can, if we can manage to get

616

:

ourselves back there, it's so hard

to even think of life without an

617

:

iPhone and All of the things that have

happened technologically along with

618

:

it since then, you know, it was, it

was audacious, absolutely an audacious

619

:

idea, um, but now they have actually,

uh, had, they've allowed the team to

620

:

scan one of their scrolls at the Diamond

Light Source Synchrotron in Oxford,

621

:

and that data was just released right

622

:

before Thanksgiving, I believe, to the

Vesuvius challenge, and already they

623

:

have had, um, had results where ink

is appearing in that particular scan.

624

:

Um, and that, this one actually,

apparently, was penned with some

625

:

kind of ink that had some type of

metal in it, or something that's

626

:

creating a density, density shift.

627

:

There's something in the ink that makes

it, um, Um, have a density that is

628

:

enough different from the papyrus that

it's on, that the ink shows up with the

629

:

traditional virtual unwrapping method

without even yet applying the AI tool.

630

:

So, you know, we are, and the good

thing about that is, you don't have to

631

:

train on fragments from another scroll.

632

:

So, if, when you train on what you

know is ink from the scroll itself,

633

:

then you're, you're really good.

634

:

You're training apples to apples, right?

635

:

Um, you're training it to read itself

instead of training something else

636

:

that's in the same, you know, collection.

637

:

But it could be very different,

and we have actually found that.

638

:

The scrolls are actually, they

respond very differently to the AI.

639

:

Just because the network works on one,

in other words, doesn't necessarily

640

:

mean it's going to work on another.

641

:

But because if you think about

it, you know, this is ancient

642

:

Rome, the ink recipe is not.

643

:

standardized.

644

:

You know, the scribe is just mixing

up his ink, and it may be one way, one

645

:

time, the same scribe, and it may be,

646

:

Evan Troxel: Right.

647

:

Christy: Yeah,

648

:

and the same thing with the papyrus, you

know, the papyrus is not always the same.

649

:

Randall Stevens: Didn't

come out of a factory.

650

:

Christy: Yes, exactly.

651

:

So, you know, we, we kind of, I think,

naively in our modern day think, oh,

652

:

these are all in the same library.

653

:

They're all carbonized.

654

:

They all look the same, right?

655

:

They're just wads of black.

656

:

So, therefore, it's all just going

to, it's going to be transferable.

657

:

And the truth is, even if you just

think of going to a, library today,

658

:

the differences in, you know, paper

books, you know, they're very different.

659

:

Different fonts, different sizes,

different papers, you know, so, um,

660

:

we've definitely discovered since the

grand prize, um, and since we've gotten

661

:

some more data, because we've been able

to scan some more intact scrolls, that

662

:

they're very different and respond very

663

:

Randall Stevens: Yeah, I guess it's a,

you know, in the, in the, When I said

664

:

the success has made, you know, some

people kind of retreat, but it also

665

:

probably has the opposite effect too,

which is now that we've seen success,

666

:

now you have permission to be, Oh,

we want, we want some of that too.

667

:

Can you,

668

:

Christy: yeah.

669

:

exactly.

670

:

No one wants to be left out now.

671

:

Yeah.

672

:

Randall Stevens: to take.

673

:

I would think, uh, maybe Christy, you

know, a little bit about, you know, what

674

:

you talked about, uh, at our live event

was, You know, the challenge is, you know,

675

:

there's obviously a bunch of technical,

very technical things going on behind

676

:

the scenes, but the other, because it

is such a, Human, everybody, everybody

677

:

that sees this project would probably

find something interesting about it.

678

:

So very broad, uh, audience and

obviously it caught the attention.

679

:

And like I said, it's been on a lot of

every newspaper and TV show and podcast.

680

:

When you go start Googling, if you

hadn't seen it already, all of a sudden,

681

:

it'll just start showing up all over the

place because it's been widely covered.

682

:

But you were talking about, you

know, from your position and what

683

:

you're doing with the team there.

684

:

Like, how do you, you know, which

I think can relate to the audience

685

:

that, uh, that Evan and I, you know,

kind of catered this podcast to.

686

:

But the idea that you've

got to communicate.

687

:

What can be complex and technical

things to a, to a general audience?

688

:

And how do you, how do you get people

interested in something like this?

689

:

What, what have you learned

in this process that, that,

690

:

that you could share with us?

691

:

Christy: Yeah, well, you know, one

of the main things about any type of

692

:

communication is know your audience.

693

:

So, you know, a computer scientist

speaking computer-science-ese

694

:

to a librarian at an institution

that's been around for hundreds

695

:

of years is not going to work.

696

:

So you really have to know your audience

and think about, you know, what, number

697

:

one, what that person cares about.

698

:

So for us, you know, It it was all

about, um, most of the people, the,

699

:

the institutions, of course, they

care about the preservation and

700

:

the conservation of the object.

701

:

So our first thing we had to do

before Brent could really even

702

:

do anything was figure out a

way to keep these objects safe.

703

:

And I didn't include this in the

talk because I had so much, but, you

704

:

know, he went and, and there's an artist

that you probably know him, Randall.

705

:

I can't, I don't know his name,

even though I've been out to his.

706

:

I want to say his name was Tim something,

but, you know, Brent worked with him

707

:

to create plaster models that They

could build and use to transport, you

708

:

know, put it in it and stand it up.

709

:

in the machine.

710

:

And that was back in 2009, um,

before you could really do 3D

711

:

scanning, which is what we do now.

712

:

We do something called photogrammetry

where you just take regular

713

:

photographs all the way around an

object and there's software that

714

:

will put it together in a 3D form.

715

:

Um, so that was first, you know,

first you have to, you have to

716

:

figure out what the person's.

717

:

Priorities are and

what's important to them.

718

:

Um, and how you can allay

any fears that they have.

719

:

And then the second thing is, when you

are trying to explain what you're going

720

:

to do with this object, you have to

put it in terms that relate to them.

721

:

I always start out by explaining that,

you know, micro CT is a cat scan.

722

:

Just, you know, everybody knows someone

pretty much who's had a cat scan.

723

:

And that's what it is.

724

:

And It's a 3D x ray.

725

:

Everyone's had an x ray.

726

:

So

727

:

Randall Stevens: It

728

:

doesn't hurt you.

729

:

Christy: Yeah, so if you can get out

of the, if you can get away from the

730

:

terminology that is so present in your

mind and think about it in a different

731

:

way, you know, like you would try to

explain it to your grandmother or try to

732

:

explain it to a child, not in a way that

is pedantic or Um, off putting or you're

733

:

talking down to someone, but just so you

can relate what you're doing to them.

734

:

Um, that's another very very

important thing is being humble.

735

:

Um, you know, these

objects, they're not ours.

736

:

They're not even American.

737

:

Um, you know, I think that, um,

Originally, if you watch the 60

738

:

Minutes piece, which I played a

clip of, you know, Brent did kind of

739

:

go in and he just thought everyone

was going to, yeah, here you go.

740

:

Great.

741

:

I'm So glad this American

scientist is over here and wants

742

:

to, you know, open my scrolls.

743

:

And that was absolutely not what happened.

744

:

Um, and I think that, and he will tell

you this, because I've heard him say

745

:

it, you know, he learned to respect.

746

:

The, the, history and the, not that he

didn't originally, but, but you just don't

747

:

necessarily think that way when you're

a scientist and you're, you've got a

748

:

goal and you want to build something or

749

:

Randall Stevens: It's logical.

750

:

It's all very logical.

751

:

Yep.

752

:

Hmm.

753

:

Christy: um, you know, you have

to be humble and have respect

754

:

to for the the lay person or for

the, the person who's, who's not.

755

:

Technically, um, uh,

involved in your field.

756

:

Randall Stevens: So

757

:

I would categorize the, you know, the

explaining as like risk mitigation.

758

:

That's

759

:

like,

760

:

mitigate the risk.

761

:

There's got to though, what's the carrot?

762

:

Like, what's the, why would they even

763

:

want it?

764

:

I get the, okay, now that I know you're

not going to destroy these things,

765

:

why would I still want to do it?

766

:

Christy: Yeah, that's the next step is

to, um, figure out what's in it for them.

767

:

You know, how can you communicate

what's in it for them in a way that

768

:

makes them want to take the risk?

769

:

Because it is a big risk, and you

know, the team, my team, one of the

770

:

reasons Brent hired me was actually

to represent the non technical person.

771

:

I mean, I understand it all,

because I kind of have that

772

:

kind of brain, but I also come from a

humanities background and I'm a writer,

773

:

so I'm a translator is what I tell people.

774

:

But, um, You know, you, you have to

understand, what it looks like from them,

775

:

and, and my team would get frustrated.

776

:

Look, it's not invasive, you

know, we built these cases,

777

:

you don't have to worry.

778

:

And I'm like, you know what guys,

that's great, but their job,

779

:

their one job, they have one job.

780

:

And it is to keep these objects, yes,

it is to keep these objects safe.

781

:

So if they weren't difficult,

For you guys to work with, they

782

:

would not be doing their job.

783

:

So you have to let them do their

job, answer their questions.

784

:

I mean, I'll never forget, we were

doing a project in New York at

785

:

the Morgan Library and Museum, and

it was actually a book, a codex.

786

:

From, like, 5th or 6th century.

787

:

Um, it's written in an Egyptian Coptic

language, which is a transitional language

788

:

from hieroglyphics into Greek, actually.

789

:

Um, and they, it had been burned.

790

:

So you couldn't really open it, but

there were some loose leaves and

791

:

they knew it contained the Acts of

the Apostles, at least on two pages.

792

:

So they wanted to, we wanted

to use our technology.

793

:

You know, scan it, virtually open it,

and et cetera, and the last day of the

794

:

project, the very last day, I came in that

morning, and the, um, paper conservator

795

:

there, the person who is in charge of

this, she had a stack of papers that she

796

:

had been reading about the damage that

synchrotron imaging, which I've explained

797

:

to you about what a synchrotron is and

how it's very different from a desktop

798

:

machine, which we were using at the

time, synchrotron imaging, you know, It

799

:

causes molecular changes to papyrus and

all this kind of stuff, and she was not

800

:

going to put that book in the scanner.

801

:

We had scanned, already scanned it maybe

once, but today was the day we were

802

:

gonna have the longest scan and the

best scan and the one that was probably

803

:

gonna give us the data we needed, and

she was not gonna put it in there.

804

:

I mean, she was not.

805

:

And I tried to explain to her, well,

that those studies were done, you know,

806

:

at a much, much higher energy facility

with much, much stronger radiation.

807

:

And, you know, if you, it's the same as,

you know, if you take a magnifying glass,

808

:

the light coming into the magnifying

glass, Is at one level, but when it goes

809

:

through the magnifying glass and comes

out at the other end, it'll burn paper.

810

:

We're just using the, the light before

it gets to the magnifying glass.

811

:

This is not after it comes outta

the magnifying glass, which is kind

812

:

of the, you know, analogy with the

synchrotron and the particle accelerator.

813

:

Nope, she wasn't doing it,

so Brent had already left.

814

:

And he, his family had come with him

and they had already left and were

815

:

in the car driving and I called him

and I'm like, you know, Maria is

816

:

not gonna put this in the machine.

817

:

You have got to talk to her.

818

:

So he gets on the phone, she gets

off and she says, you know, he, he

819

:

said the exact same thing you did.

820

:

He told me blah, blah, blah, blah.

821

:

I used The analogy, of

the magnifying glass.

822

:

And so then she was okay.

823

:

So, You know, it wasn't, and that, that

doesn't, you have to respect that, right?

824

:

It's her job to keep these safe.

825

:

She has scientific documentation

that we're changing the molecular

826

:

structure of this object, and

827

:

it wasn't enough for me, you

know, a non-computer scientist.

828

:

I had only been working for Brent at that

point, like, a year, maybe, um, you know,

829

:

she needed to hear from the guy, right?

830

:

And you just have to.

831

:

You know, sometimes you have to do that.

832

:

You have to talk people off the

ledge, and you have to understand,

833

:

and you have to, you know, give them

the access and the comfort they need.

834

:

I mean, we went ahead and did it, and it

was fine, and we actually were able to,

835

:

you can see some of it on our website.

836

:

Um, it's, it's called the M910 project,

um, and you can see some of the

837

:

images where we were able to see what

the pages actually said, you know.

838

:

But, that's a perfect example

of, you know, being able

839

:

to communicate to somebody.

840

:

in a way that is respectful and

also explains things so that

841

:

it puts their fears at ease.

842

:

I don't know what kind of communication

issues, you know, your listeners might

843

:

have, but, um, I have found even just

among my colleagues, um, you know, being

844

:

nice, saying thank you, saying please,

you know, it, you, it may not matter

845

:

right now in this conversation, but

it builds an impression of you that.

846

:

Is, is a part of who's asking

the thing the next time that

847

:

they are uncomfortable doing.

848

:

Right?

849

:

Um, and not talking down to people.

850

:

I mean, you know, in an academic

environment, it's kind of funny because,

851

:

you know, I'm not an academic, but I

will talk to some of my colleagues who

852

:

are academics and just me one on one.

853

:

It's one kind of conversation,

but you get the two of them

854

:

talking together to each other.

855

:

And I'm like.

856

:

What on earth did you two get into?

857

:

Yeah,

858

:

and they just, you know, it's just weird.

859

:

Um, so just don't talk

down to people either.

860

:

If you're in a position where you know

a lot about something and they don't,

861

:

they're automatically intimidated.

862

:

You don't have to intimidate them.

863

:

They're already intimidated.

864

:

So the best thing you can do is just,

you know, not talk down to people and

865

:

explain to them what they need to know.

866

:

The other thing that's always good

is to have them ask questions.

867

:

Because then you can figure out what

it is they actually need to hear or,

868

:

you know, I always listen way more than

I talk when I'm in a new situation.

869

:

Evan Troxel: two

870

:

ears, one mouth, right?

871

:

Yeah, I, the one of the other things that

I thought of when you were talking about

872

:

the, you know, she talks to you and then

she talks to Brent and she got kind of the

873

:

same story, but maybe hearing it a second

874

:

time, it landed a little bit better.

875

:

Um, but also like his, his.

876

:

His reputation's on

the line and hearing it

877

:

from him, like,

878

:

like, one thing that bothers me a lot

about like corporate communication is

879

:

when the boss, whoever that is, he or

she would, would give somebody else the

880

:

job of delivering bad news or delivering

881

:

some, making a promise or

882

:

whatever it is.

883

:

And it means so much more coming

from someone who really has skin

884

:

in the game and accountability,

you know, and, and, and that I

885

:

think can make a big difference.

886

:

And so part of the strategy

of communication is also who

887

:

delivers it, how they deliver it,

888

:

what language they're speaking.

889

:

It's all of these things

and it's complicated, right?

890

:

But, but it's all part of the equation.

891

:

Christy: Yeah, absolutely.

892

:

I mean, that's, like I said, he was the man, and she had to talk to him.

893

:

And I think, that was right, you

know, for her to want that and

894

:

for him to take the time to do it.

895

:

Um, because that's her job.

896

:

She has one job

897

:

Evan Troxel: Yeah.

898

:

Randall Stevens: I think I

899

:

think a lot of the people that

Evan and I do end up, you know,

900

:

communicating with through this

podcast are in usually technical roles.

901

:

Uh, and then, you know, they're having to

communicate to, to people that you may

902

:

think are technical, but a lot of times

in, in these firms, they,

903

:

a lot of times aren't that technical.

904

:

And, uh...

905

:

My guess is maybe Evan, you've got more

insights into this because you, you were

906

:

even in your own practices that you were

involved with, but probably the people

907

:

that are most successful in those kinds

of roles are the ones that can communicate

908

:

in the ways that you're, you know,

909

:

Evan Troxel: Absolutely.

910

:

100%.

911

:

Randall Stevens: have to be empathetic,

have to understand, you know, I like

912

:

to think, I, I do teach, and I've always felt like I'm a teacher at heart.

913

:

So I love to explain things.

914

:

Something that I think

I know something about.

915

:

I'll talk all day long about it.

916

:

I love, you know, I just love explaining

things that I think, that I think I

917

:

know something about to somebody else.

918

:

So maybe that's part of it too, right?

919

:

It's like, okay, you gotta understand

that you like to explain things to people.

920

:

The, the not, you know, not talking

down, uh, you know, I think,

921

:

I think that's a biggie, right?

922

:

It's to be like, hey, don't, you're not a show-off just 'cause you know, you know.

923

:

Christy: Yeah.

924

:

And, you know

925

:

Randall Stevens: They know a lot

more than I do about a million other

926

:

things that I don't understand.

927

:

It works both directions, right?

928

:

Christy: Right.

929

:

Everybody has their area of expertise.

930

:

It's just not the same

as yours always, right?

931

:

And, um, I think a lot of times

people don't realize that they're

932

:

talking down to someone, or that

it's perceived as them talking down.

933

:

You know, um, when you've explained

this technical process or this tool

934

:

a million times, you get frustrated

when, you know, and, and so it's

935

:

just, you really have to constantly

be checking yourself and realize that

936

:

it is important enough to do that.

937

:

Um, I think communication is extremely

underrated in terms of how important

938

:

it is to keep things functioning.

939

:

Um, I mean, you know, we all

have communication programs and

940

:

departments and stuff like that.

941

:

But, you know, just your daily

conversations, it's really, really

942

:

important for you to think about what

you're saying and how you're saying it.

943

:

Um, and, you know, a lot of times

people in these situations when they're,

944

:

they've, you know, they

feel like they should know, maybe.

945

:

They feel like, um, you're

gonna expect them to know.

946

:

And so they might pretend to know and

be, you know... And you just kind of have

947

:

to ignore that and, and recognize on

your own where they're coming from or whatever.

948

:

And ask, like I said, just ask questions,

949

:

Randall Stevens: they

950

:

won't always

951

:

Christy: to feel that out.

952

:

Randall Stevens: right?

953

:

People won't always tell you when

they don't understand something,

954

:

Christy: No, they will absolutely not.

955

:

You know, it's funny because, um, we'll

be in meetings and somebody will like,

956

:

even our students, you know, we have our

students working on different projects

957

:

with different equipment here in the lab.

958

:

And it's all virtually, I mean,

it's all digital restoration

959

:

kinds of stuff for the most part.

960

:

You know, humanities oriented stuff.

961

:

Scientific work, or computer technology

work on humanities projects, but they'll

962

:

sometimes mention something because

they're doing something over here that,

963

:

you know, Brent's not really involved

in, or that's not his specialty,

964

:

and he still will ask questions.

965

:

You know, um, so you have to make it

so that, and that's just because he's

966

:

confident enough that, you know, he

doesn't, he knows what he doesn't know,

967

:

and he's happy, he's okay with that.

968

:

But, but when you're, when you're

in a position of power, so to.

969

:

speak, which, uh, you

970

:

know, a tech, um, a tech,

uh, person is often in these,

971

:

or they feel like it, right,

972

:

in these, um, companies.

973

:

Um.

974

:

You know, the person may not feel safe

asking questions or, or displaying

975

:

their ignorance or, you know, and

you just really, a lot of it

976

:

is putting people at ease, you

know, trying to put people at ease.

977

:

Um, you know, we work a lot

with our students actually on

978

:

their communication skills.

979

:

That's another thing I do here

is work with our undergrads.

980

:

You know, they have to do

presentations about their work

981

:

and it's because engineers are

not always the best communicators.

982

:

Um, you know.

983

:

We're all made differently and the

way our brains work gravitate one way

984

:

or another, and so sometimes

they're just not the best communicators,

985

:

but you can learn, you know. You can

absolutely learn to listen first and to

986

:

ask questions and to smile. And, you know,

some of the things that are just simple

987

:

and basic, you know. Um, not talk down

to people, you know, not use jargon.

988

:

We really try to hammer that home

because it's really important.

989

:

Um, you know, most people who are

computer scientists are not going to

990

:

be just talking to computer scientists.

991

:

Evan Troxel: Right.

992

:

Christy: They're not going to be

just talking to tech people because

993

:

what their work is doing is, you

know, is for non tech people, right?

994

:

We're all using the things

that the tech people create.

995

:

So, it's a really important

skill that we really try to

996

:

work with our students on. Yeah.

997

:

Evan Troxel: on with our youngest child, and he's 18, and it's like, you can't just...

998

:

It can totally overwhelm the

person with the things that

999

:

you're interested in, right?

:

00:56:13,329 --> 00:56:17,439

And so it's, you know, the saying is

like, it's like better for someone

:

00:56:17,439 --> 00:56:19,839

to think that you don't know what

you're talking about than to open

:

00:56:19,839 --> 00:56:20,979

your mouth and prove it, right?

:

00:56:20,989 --> 00:56:23,579

So there, there is a

way around that, right?

:

00:56:23,589 --> 00:56:26,269

And it is to listen and it

is to ask questions and just

:

00:56:26,279 --> 00:56:30,479

be curious because if you can learn

something, you're in a way better position

:

00:56:30,649 --> 00:56:35,189

than you are just, you know, sitting

there nodding along saying nothing versus

:

00:56:35,299 --> 00:56:39,669

overpowering the situation with what

you're interested in that they may be

:

00:56:39,669 --> 00:56:41,029

completely disinterested in.

:

00:56:41,029 --> 00:56:41,629

So I,

:

00:56:41,779 --> 00:56:44,409

I, I love the things that you're talking

about because communication is an

:

00:56:44,419 --> 00:56:47,019

underrated skill and it really does move

:

00:56:47,019 --> 00:56:48,689

things along in, in

:

00:56:48,810 --> 00:56:50,549

very profound

:

00:56:50,559 --> 00:56:51,049

ways.

:

00:56:51,069 --> 00:56:51,709

And it's

:

00:56:51,719 --> 00:56:54,169

it's really interesting to hear

you talking about that even

:

00:56:54,169 --> 00:56:55,839

at, in the grad school levels

:

00:56:55,839 --> 00:56:56,584

and academics

:

00:56:56,884 --> 00:56:57,684

Randall Stevens: I'm pretty, I'm

:

00:56:57,684 --> 00:57:02,454

pretty sure 20 years ago that I probably

would not have admitted that I didn't

:

00:57:03,065 --> 00:57:06,345

know something in a conversation,

but I can tell you it's one

:

00:57:06,345 --> 00:57:07,715

of the weapons that I've

:

00:57:07,715 --> 00:57:08,055

learned.

:

00:57:08,945 --> 00:57:12,685

Oh man, It's like if somebody says

something and it happens all the

:

00:57:12,685 --> 00:57:17,305

time, you know, I'm in conversations

with our customer base and it's like,

:

00:57:17,305 --> 00:57:21,375

if they say something, I'll just be

like, I have no idea what that means.

:

00:57:21,375 --> 00:57:25,225

And it's, you know, all of a

sudden it's disarming, right?

:

00:57:25,225 --> 00:57:25,545

It's like,

:

00:57:25,545 --> 00:57:26,005

okay.

:

00:57:26,505 --> 00:57:30,015

And then when I do act like I know

something, I think it's, I think that's,

:

00:57:30,115 --> 00:57:31,545

that's what I've learned is like.

:

00:57:32,095 --> 00:57:34,915

There are things that I think

I know a lot about, it's

:

00:57:34,935 --> 00:57:35,735

why people come,

:

00:57:35,905 --> 00:57:40,015

right, and and seek out, you know, either

stuff that we're doing, but you can

:

00:57:40,385 --> 00:57:46,005

easily kind of disarm that conversation

as soon as you let your, hey, I don't

:

00:57:46,005 --> 00:57:50,285

know everything, when I think I know

something, I was telling Evan before

:

00:57:50,285 --> 00:57:52,985

this call about another friend of mine,

Lamar, that I have come and talked to in

:

00:57:52,985 --> 00:57:54,995

my class, and one of the favorite things.

:

00:57:54,995 --> 00:58:00,630

he, that I love that he says is, if

it came out of my mouth, I believed

:

00:58:00,630 --> 00:58:02,230

it. It didn't mean it was right, but I

:

00:58:02,230 --> 00:58:02,640

believed it,

:

00:58:03,090 --> 00:58:03,470

right.

:

00:58:03,890 --> 00:58:06,405

It's like, that's such a great, it's

:

00:58:06,430 --> 00:58:09,980

like, you can trust me. And I

think that, uh, you know, today

:

00:58:10,070 --> 00:58:11,800

there's a lot of distrust, right?

:

00:58:11,800 --> 00:58:12,330

When people

:

00:58:12,330 --> 00:58:13,340

are spewing

:

00:58:13,485 --> 00:58:16,560

things and you know, and when you get

caught, not knowing it, it's like,

:

00:58:16,560 --> 00:58:18,660

man, I'll never trust you again.

:

00:58:19,280 --> 00:58:19,900

Uh, but

:

00:58:20,200 --> 00:58:24,190

I do think that there's a huge lesson

in, Hey, if somebody says something,

:

00:58:24,530 --> 00:58:26,960

people are using acronyms and jargon

all the time, I'd be like, I've

:

00:58:26,960 --> 00:58:28,730

never, I've never heard that never.

:

00:58:29,075 --> 00:58:31,465

Christy: Well, and like I said,

it just rolls off their tongue.

:

00:58:31,465 --> 00:58:32,245

They don't even realize they're

:

00:58:32,435 --> 00:58:32,775

Randall Stevens: I call it

:

00:58:32,775 --> 00:58:33,115

talking

:

00:58:33,155 --> 00:58:36,655

shop. We talk shop all the time,

especially when you're around colleagues.

:

00:58:36,655 --> 00:58:40,105

And it's like, when you go out to a

different audience, it's like, they're not

:

00:58:40,105 --> 00:58:41,305

going to know what you were talking about.

:

00:58:41,605 --> 00:58:41,995

Christy: Right.

:

00:58:42,175 --> 00:58:42,505

Yeah.

:

00:58:42,755 --> 00:58:45,265

Randall Stevens: and you may, you may not know it because they

:

00:58:45,265 --> 00:58:48,375

may not admit that they didn't even

understand, you know, half the words

:

00:58:48,395 --> 00:58:49,425

that were coming out of your mouth.

:

00:58:49,435 --> 00:58:50,755

So, a lot,

:

00:58:50,775 --> 00:58:54,095

a lot of lessons to be learned there, and just in internal communication.

:

00:58:54,435 --> 00:58:59,495

But you know, one of the, um, one of the

things that we talk about, you know, in

:

00:59:00,430 --> 00:59:04,280

our, our business with software

going into these firms.

:

00:59:05,770 --> 00:59:09,310

Buying software is the, is the easy part.

:

00:59:09,720 --> 00:59:14,540

We, you know, we talk about how the

real challenge is, how does the person

:

00:59:14,540 --> 00:59:19,750

who's going to buy this get 300 people

inside their firm to do something that.

:

00:59:19,750 --> 00:59:21,870

they want them to do with this?

:

00:59:21,930 --> 00:59:27,350

That's the really hard thing to get

done is like, how do I, but, but I think

:

00:59:27,360 --> 00:59:28,690

that's where the lessons in this are.

:

00:59:28,690 --> 00:59:29,279

It's like, yeah.

:

00:59:29,550 --> 00:59:30,560

What do you communicate?

:

00:59:30,600 --> 00:59:31,250

What's the outcome?

:

00:59:31,250 --> 00:59:34,000

Why do they want, why

should they want to do this?

:

00:59:34,000 --> 00:59:36,240

Or, you know, cause you're talking

about changing behaviors and

:

00:59:36,600 --> 00:59:37,205

when you change

:

00:59:37,220 --> 00:59:39,650

somebody's behavior, you

know, good luck, Right.

:

00:59:39,760 --> 00:59:39,960

It's

:

00:59:40,035 --> 00:59:42,735

Christy: Well, and you know,

with software, so much of

:

00:59:42,735 --> 00:59:44,445

it comes down to efficiency.

:

00:59:45,015 --> 00:59:49,485

You will become more efficient using

the new thing, but there's a long

:

00:59:49,485 --> 00:59:52,905

time before that and you're more

efficient on the old thing, right?

:

00:59:53,545 --> 00:59:56,475

So, um, that's, that's always tricky.

:

00:59:56,625 --> 00:59:59,405

Um, it's, it's very painful.

:

00:59:59,495 --> 01:00:00,235

Absolutely.
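
As an aside on that efficiency dip, here is a minimal sketch in Python with entirely made-up numbers; the rates, the eight-week ramp, and the function names are assumptions for illustration, not anything measured on a real rollout. It estimates when cumulative output on a new tool catches up with having stayed on the old one.

    # Hypothetical illustration of the adoption "efficiency dip": you are slower
    # on the new tool at first, faster later. All numbers here are made up.

    def weekly_output(week, old_rate=10.0, new_rate=14.0, ramp_weeks=8):
        """Units of work per week on the new tool, assuming a linear ramp
        from 60% of the old rate up to the new rate over `ramp_weeks`."""
        if week >= ramp_weeks:
            return new_rate
        start = 0.6 * old_rate
        return start + (new_rate - start) * (week / ramp_weeks)

    def breakeven_week(old_rate=10.0, max_weeks=52, **kwargs):
        """First week where cumulative output on the new tool catches up
        with what the old tool would have produced."""
        cum_new = cum_old = 0.0
        for week in range(1, max_weeks + 1):
            cum_new += weekly_output(week - 1, old_rate=old_rate, **kwargs)
            cum_old += old_rate
            if cum_new >= cum_old:
                return week
        return None

    if __name__ == "__main__":
        print(breakeven_week())  # with these made-up numbers, week 9

With these particular assumptions the new tool breaks even around week nine; the real numbers will differ, but that finite, explainable dip is exactly what the person rolling the software out has to communicate.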

:

01:00:00,360 --> 01:00:02,430

Randall Stevens: be way up

somebody's priority list, right?

:

01:00:02,440 --> 01:00:05,190

To, to, to want to take on that pain.

:

01:00:06,100 --> 01:00:06,600

Uh,

:

01:00:06,900 --> 01:00:07,940

uh, but anyway,

:

01:00:08,495 --> 01:00:10,935

Christy: Well, I'm figuring out, you

know, how to help people through that.

:

01:00:10,935 --> 01:00:14,955

I, think that you can't just throw

software, I mean, You know, a lot of

:

01:00:14,955 --> 01:00:19,755

times you get software thrown at you, and

here you go, and even the training is not

:

01:00:19,755 --> 01:00:25,775

good, you know, and there's, I think there

is, again, it's a communication problem.

:

01:00:25,815 --> 01:00:28,165

It's, it's a pure communication problem.

:

01:00:28,165 --> 01:00:33,455

How do you explain to someone what

this is, how to use it, how it's

:

01:00:33,455 --> 01:00:36,755

going to benefit them, and then

help them make that transition.

:

01:00:36,985 --> 01:00:41,175

You know, coming up with, with, uh,

with ways that alleviate the pain.

:

01:00:41,705 --> 01:00:45,285

Um, I, I guess you can't avoid

some of it, but, I do feel like

:

01:00:45,285 --> 01:00:50,345

there are probably ways that, you know, it can be mitigated too.

:

01:00:50,865 --> 01:00:54,685

Um, if I figure that out, maybe

I'll make a lot of money, right?

:

01:00:54,685 --> 01:00:58,454

Consultant.

:

01:00:58,760 --> 01:01:00,800

Randall Stevens: this down, but, you

know, I will, I do want to try to get

:

01:01:00,800 --> 01:01:02,960

Brent on, uh, but, but I was just gonna

:

01:01:02,960 --> 01:01:06,590

say, you know, I've known, obviously as

I said, I've known Brent for a long time,

:

01:01:06,590 --> 01:01:10,520

but, you know, he is One of those special

guys that can bridge between, you know,

:

01:01:10,910 --> 01:01:16,430

su super, super smart technology,

but approachable, and you can talk to

:

01:01:16,430 --> 01:01:17,660

him and have a normal conversation.

:

01:01:17,660 --> 01:01:18,140

He's not,

:

01:01:18,185 --> 01:01:18,495

Christy: Yes.

:

01:01:18,865 --> 01:01:19,160

Yes.

:

01:01:19,940 --> 01:01:20,660

Randall Stevens: so I'm sure that

:

01:01:20,660 --> 01:01:23,420

that helped in the, uh,

evolution of this project.

:

01:01:23,425 --> 01:01:24,145

Christy: Oh yes.

:

01:01:24,455 --> 01:01:24,725

Yes.

:

01:01:24,775 --> 01:01:25,575

Absolutely.

:

01:01:25,575 --> 01:01:30,480

And you know, he, like I said,

he adapted to, you know, he,

:

01:01:30,480 --> 01:01:32,170

he figured out his audience,

:

01:01:32,830 --> 01:01:33,070

Randall Stevens: yeah,

:

01:01:33,070 --> 01:01:36,590

Christy: you know, realized who

his audience were, what their needs

:

01:01:36,590 --> 01:01:38,070

were, what their concerns were.

:

01:01:38,130 --> 01:01:38,450

yeah,

:

01:01:39,336 --> 01:01:43,886

Evan Troxel: thing that I was going to say

that, that really is impressive about

:

01:01:43,896 --> 01:01:48,236

this whole thing was this, this prize,

the way that that prize thing happened.

:

01:01:48,236 --> 01:01:52,776

And, and Brent had this idea almost 20

years ago, maybe 20 plus years ago, right?

:

01:01:52,796 --> 01:01:56,056

You're, you're coming

up on 20 years at least.

:

01:01:56,076 --> 01:01:59,656

And, A lot has changed in 20 years.

:

01:01:59,656 --> 01:02:02,256

It went from, from an

idea to reality, right?

:

01:02:02,256 --> 01:02:05,446

And I, I would be maybe curious,

maybe we save this question for

:

01:02:05,446 --> 01:02:09,586

Brent, but like what, is there,

is there another audacious goal in

:

01:02:09,586 --> 01:02:10,386

the works or,

:

01:02:11,146 --> 01:02:15,846

but the, the, there's this, going back

to the original thing, it's like where,

:

01:02:15,846 --> 01:02:19,136

where a lot of people keep these,

these ideas close to the vest, right?

:

01:02:19,136 --> 01:02:22,506

And they, they don't want people to

know, they don't want to share, but the

:

01:02:22,506 --> 01:02:25,766

way you ran the prize, because of, um,

:

01:02:26,526 --> 01:02:27,206

who was, who was the,

:

01:02:27,286 --> 01:02:31,516

because of the way Nat came in and,

and had this idea about how to run this

:

01:02:31,526 --> 01:02:33,626

prize and, and to open source it all.

:

01:02:33,936 --> 01:02:36,736

I'm sure there were a lot of cool

lessons learned there, but, but

:

01:02:36,736 --> 01:02:40,566

just the idea of like, Brent sharing

this idea, somebody else kind of

:

01:02:40,566 --> 01:02:42,586

following along in the shadows, right?

:

01:02:42,606 --> 01:02:46,046

Like not, not really just, just

waiting for more information, never,

:

01:02:46,086 --> 01:02:47,326

the information never came out.

:

01:02:47,326 --> 01:02:51,416

And then they, they approached

Brent to, to, to open this up.

:

01:02:51,746 --> 01:02:52,646

To scale, right?

:

01:02:52,646 --> 01:02:55,306

I mean, and that's what a lot of venture

people are pretty good at, right?

:

01:02:55,306 --> 01:02:58,576

It's like figuring out ways to, to

get it out in front of more people.

:

01:02:58,896 --> 01:03:04,586

But, but just the idea behind

all that of sharing ideas and

:

01:03:04,586 --> 01:03:06,646

somebody else comes alongside that.

:

01:03:13,616 --> 01:03:15,836

And I just want to encourage

the audience, right?

:

01:03:15,836 --> 01:03:18,216

There's a lot of people working

on their little pet projects.

:

01:03:18,986 --> 01:03:22,826

It's really cool to talk about those

projects because you will find the

:

01:03:22,836 --> 01:03:26,716

other weirdos out there who are as

interested in that thing as you are.

:

01:03:27,241 --> 01:03:31,771

Or slightly adjacent who are willing to

team up and do these things together.

:

01:03:32,021 --> 01:03:34,801

I just think there's a really important

lesson there and it's, and we're seeing

:

01:03:34,801 --> 01:03:41,061

it here at the highest levels in academia

with venture and GitHub and technology

:

01:03:41,061 --> 01:03:45,891

and, and the antiquities and, and

there's just so much going on there.

:

01:03:45,891 --> 01:03:48,651

But, but if, if no one ever

talked about this stuff,

:

01:03:48,816 --> 01:03:50,756

nothing would have ever happened.

:

01:03:51,156 --> 01:03:54,936

and and so I just wanted to kind

of go back to that part of it

:

01:03:54,936 --> 01:03:56,476

and just get your kind of your

:

01:03:56,476 --> 01:03:57,696

take on, on that,

:

01:03:57,846 --> 01:03:59,036

how that all unfolded.

:

01:03:59,336 --> 01:04:04,581

Christy: Yeah, so, you know, the Forbes

magazine, I don't know if you guys saw

:

01:04:04,581 --> 01:04:08,171

it, I think I mentioned it at Confluence,

um, and I'll try to send you the link,

:

01:04:08,211 --> 01:04:14,211

because it's an excellent, um, assessment

of, of the Vesuvius Challenge, um, and

:

01:04:14,211 --> 01:04:16,821

it's talking about that specifically.

:

01:04:16,881 --> 01:04:22,051

I mean, there's, he talks about

Brent's, um, leadership and generosity.

:

01:04:22,276 --> 01:04:26,326

And being willing to bring so many

people in at the moment where he was

:

01:04:26,326 --> 01:04:31,056

on the verge, actually, you know, we

we had the proof of concept.

:

01:04:31,466 --> 01:04:34,286

so anyway, it's absolutely about that.

:

01:04:34,326 --> 01:04:36,029

And, we've been burned.

:

01:04:36,029 --> 01:04:41,520

I mean, you know, Brent had some

collaborations go sour and he had, um,

:

01:04:41,720 --> 01:04:47,330

colleagues, partners who were working with

him on this project who ran off, took the

:

01:04:47,330 --> 01:04:49,210

data and published papers without him.

:

01:04:49,730 --> 01:04:52,040

So, you know, there are no guarantees.

:

01:04:52,515 --> 01:04:58,654

But, you know, what you know, Brent

gained and what the team gained, what

:

01:04:58,654 --> 01:05:05,035

everybody gained by, by not holding

so tight onto the idea and the data

:

01:05:05,065 --> 01:05:09,895

and being willing to share and, and

to just open it up, like you said,

:

01:05:10,195 --> 01:05:12,665

open it up to, to the world at large.

:

01:05:13,035 --> 01:05:19,325

Um, it really is, I think, uh, a lesson,

as you say, and that particular piece,

:

01:05:19,325 --> 01:05:22,325

the, um, somebody from the, um,

:

01:05:22,695 --> 01:05:27,175

Sloan School of Management, they write a

column for Forbes, and it's really good

:

01:05:27,215 --> 01:05:30,005

in terms of, you know, talking about that.

:

01:05:30,148 --> 01:05:32,293

it's hard to really

lose by being generous.

:

01:05:32,593 --> 01:05:32,993

Evan Troxel: Hmm.

:

01:05:34,818 --> 01:05:38,958

Christy: Sometimes you get burned, but at

the end of the day, you know, even on the,

:

01:05:38,968 --> 01:05:44,468

the situation where the, the other, um,

team members published the article without

:

01:05:44,468 --> 01:05:47,078

him, in the end, they wound up being sued.

:

01:05:48,628 --> 01:05:51,428

And if Brent had been a part of that

project, if he had been a part of

:

01:05:51,438 --> 01:05:53,298

that paper, he would have been sued.

:

01:05:53,298 --> 01:05:57,508

And, you know, there's

always karma or whatever

:

01:05:57,593 --> 01:05:58,683

Evan Troxel: Yeah, that's what I was gonna

:

01:05:58,683 --> 01:05:59,363

say, Karma.

:

01:05:59,688 --> 01:06:00,288

Christy: Yeah.

:

01:06:00,798 --> 01:06:04,198

So, um, and I mean, you know, the

other thing is I make the point

:

01:06:04,218 --> 01:06:09,643

in the, in the talk, you know, be

audacious, but don't be a fool,

:

01:06:10,328 --> 01:06:10,658

right?

:

01:06:11,078 --> 01:06:12,248

There's two different things.

:

01:06:12,628 --> 01:06:16,448

You know, being audacious is having the,

having the idea and believing in it.

:

01:06:16,488 --> 01:06:16,928

And.

:

01:06:17,388 --> 01:06:20,258

you know, pursuing it with wisdom.

:

01:06:21,378 --> 01:06:25,448

Um, and you have to be wise about

who you choose your partners to be,

:

01:06:25,543 --> 01:06:25,913

Evan Troxel: Yeah.

:

01:06:25,993 --> 01:06:26,913

Discernment is

:

01:06:26,913 --> 01:06:27,678

the word I think of

:

01:06:27,693 --> 01:06:28,343

there, right?

:

01:06:28,503 --> 01:06:29,213

Yeah, you do.

:

01:06:29,223 --> 01:06:31,343

you can't just jump into it willy nilly.

:

01:06:31,343 --> 01:06:32,733

You, you really do need to

:

01:06:33,293 --> 01:06:34,693

apply discernment to the

:

01:06:34,693 --> 01:06:35,433

situation.

:

01:06:35,433 --> 01:06:36,473

And sometimes it's a yes.

:

01:06:36,473 --> 01:06:37,453

And sometimes it's a no.

:

01:06:37,663 --> 01:06:39,203

I would say, more often than not, it's...

:

01:06:39,653 --> 01:06:41,933

If it's a maybe, it's

probably a no, right?

:

01:06:42,063 --> 01:06:43,853

It's like, hell yes, or

:

01:06:44,133 --> 01:06:44,543

no.

:

01:06:44,948 --> 01:06:45,328

Christy: Right.

:

01:06:45,953 --> 01:06:48,423

Evan Troxel: And, and that happens

through relationships and it

:

01:06:48,433 --> 01:06:49,743

happens through that discernment.

:

01:06:50,838 --> 01:06:51,088

Christy: Yeah.

:

01:06:51,088 --> 01:06:56,108

Well, like I said, you know, we had many

meetings talking about this and about the

:

01:06:56,108 --> 01:06:58,028

contest, whether or not we should do it.

:

01:06:58,108 --> 01:07:03,898

And, um, we went out there, Brent

had us come with him at one point

:

01:07:03,938 --> 01:07:09,568

to meet and talk to Nat because he

wanted mine and Stephen's intuition.

:

01:07:10,163 --> 01:07:13,703

and discernment to add to his own, right.

:

01:07:13,953 --> 01:07:17,823

He didn't want to make the decision by

himself because it was a big decision.

:

01:07:18,233 --> 01:07:20,983

And we all agree 100 percent

it was the right thing to do,

:

01:07:21,433 --> 01:07:22,813

but you don't know the future.

:

01:07:22,843 --> 01:07:25,183

And again, you know, he

wasn't foolish about it.

:

01:07:25,193 --> 01:07:29,043

He didn't just get starry eyed and

be like, Oh, wow, this Nat Friedman,

:

01:07:29,073 --> 01:07:31,243

he's got a lot of money and he's Mr.

:

01:07:31,423 --> 01:07:32,403

GitHub and Oh, wow.

:

01:07:32,453 --> 01:07:37,323

No, you know, he really stopped and

took the time and, and thought about it.

:

01:07:37,343 --> 01:07:38,143

And, and

:

01:07:38,478 --> 01:07:44,088

asked the other team members to think

about it and help him make the decision.

:

01:07:44,088 --> 01:07:52,088

So, you know, I think that you just

have to be wise in, in those kinds of

:

01:07:52,088 --> 01:07:55,918

choices, but don't be afraid either.

:

01:07:55,988 --> 01:07:56,478

You know,

:

01:07:56,883 --> 01:07:58,593

Evan Troxel: mean, looking

back on it, it must be really

:

01:07:58,593 --> 01:07:59,273

satisfying.

:

01:07:59,708 --> 01:08:01,978

Christy: it is, it is actually, um,

:

01:08:02,278 --> 01:08:02,493

Evan Troxel: it was

:

01:08:02,493 --> 01:08:03,918

probably a nervous.

:

01:08:04,218 --> 01:08:05,593

Christy: we just didn't

know how it was going to go,

:

01:08:06,048 --> 01:08:09,628

you know, is everyone going to say,

yeah, those stupid Kentucky folk couldn't

:

01:08:09,628 --> 01:08:12,818

do it, so they had to have a Silicon

Valley guy come in and save them.

:

01:08:13,523 --> 01:08:20,582

You know, um, and that's why it's so

rewarding reading that Forbes article

:

01:08:20,723 --> 01:08:22,893

because the exact opposite happened.

:

01:08:23,883 --> 01:08:27,853

So, in other words, people didn't,

there's not been one disparaging word

:

01:08:27,893 --> 01:08:32,122

written about the Kentucky team or,

or the fact that we couldn't do it.

:

01:08:32,133 --> 01:08:35,783

The only, the only one who did that,

actually it just happened last week,

:

01:08:35,803 --> 01:08:42,452

is, is, uh, one of the authors who wrote

the, who wrote the, uh, paper without

:

01:08:42,452 --> 01:08:44,792

Brent, but that's a whole other story.

:

01:08:45,122 --> 01:08:51,783

Um, you know, no one is, everyone

recognizes the beauty in bringing people

:

01:08:51,783 --> 01:08:53,233

in and letting them be a part of it.

:

01:08:53,243 --> 01:08:57,223

And, you know, the, the, the contestants

are so happy to be a part of it.

:

01:08:57,702 --> 01:09:03,363

Um, so it is very satisfying, you

know, it was risky and we didn't

:

01:09:03,363 --> 01:09:04,573

really know how it would go.

:

01:09:04,643 --> 01:09:07,712

What would, you know, what would

the messaging end up being?

:

01:09:08,313 --> 01:09:10,923

Um, that's the other thing

about communication is

:

01:09:11,002 --> 01:09:12,563

controlling your communication.

:

01:09:12,883 --> 01:09:16,462

You know, um, we were

very intentional about

:

01:09:17,563 --> 01:09:21,372

You know, getting ahead of the

game and having a press event

:

01:09:21,443 --> 01:09:27,042

and, you know, sending people, um,

press releases and stuff so that

:

01:09:27,042 --> 01:09:30,243

we could control the messaging,

make sure that people understood.

:

01:09:30,533 --> 01:09:34,542

We already had the tool, we had already

thought of machine learning, we had

:

01:09:34,542 --> 01:09:36,523

already tested it and knew it would work,

:

01:09:36,872 --> 01:09:37,313

you know.

:

01:09:37,693 --> 01:09:38,622

Evan Troxel: It's not just an idea.

:

01:09:38,712 --> 01:09:38,823

Yeah.

:

01:09:38,923 --> 01:09:44,133

Christy: Exactly, um, and so, and for

the most part, it was all very positive.

:

01:09:44,133 --> 01:09:45,752

We did have a few instances.

:

01:09:46,198 --> 01:09:52,278

They're interesting, where, um, the story

kind of got turned upside down and, you

:

01:09:52,278 --> 01:09:56,998

know, it was Nat's idea to do machine

learning and all kinds of different stuff.

:

01:09:57,008 --> 01:10:01,288

But the media is one that is

very difficult to control.

:

01:10:02,077 --> 01:10:05,298

So you do everything you can

because it is so difficult to

:

01:10:05,448 --> 01:10:07,168

control and then things just happen.

:

01:10:07,168 --> 01:10:13,688

And fortunately, you know, people

recognize what a, what a big, generous,

:

01:10:14,198 --> 01:10:17,118

you know, thing it was for Brent

to release all the data, um,

:

01:10:17,138 --> 01:10:21,238

and to, you know, develop tools

to help people understand how

:

01:10:21,238 --> 01:10:22,548

to, how to work with the data.

:

01:10:23,048 --> 01:10:26,978

Um, you know, it was a big

project getting it up and going.

:

01:10:27,308 --> 01:10:29,568

And of course Nat had a

team and they were great.

:

01:10:30,108 --> 01:10:33,458

Um, but we had to do a lot on our

side too, you know, we couldn't just

:

01:10:34,008 --> 01:10:35,448

throw it to them and say, here you go.

:

01:10:35,618 --> 01:10:39,048

I mean, you know, there are probably

two people in the world, now three,

:

01:10:39,077 --> 01:10:43,673

three people in the world who know

as much as Brent and Steven and

:

01:10:43,673 --> 01:10:48,013

Seth, our other team member, do

about, you know, what Herculaneum

:

01:10:48,023 --> 01:10:50,343

scrolls look like in tomography.

:

01:10:51,253 --> 01:10:54,813

You know, I mean, it, and it's,

it's a, it's not something that you

:

01:10:54,813 --> 01:10:56,223

can just sit down and understand.

:

01:10:56,223 --> 01:10:59,973

I mean, it's, you know, the data is

difficult to get your mind around.

:

01:11:00,003 --> 01:11:04,813

And, um, so we couldn't just turn it over.

:

01:11:04,853 --> 01:11:07,403

We had to, you know,

provide some structure.

:

01:11:07,988 --> 01:11:09,478

Scaffolding, so to speak.
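
For a sense of what that kind of scaffolding can look like, here is a minimal, hypothetical Python sketch; it is not the team's actual tooling, and the folder name, file format, and coordinates are assumptions. It stacks a directory of CT slice images into a 3D volume and cuts out a small subvolume to inspect.

    # Hypothetical "scaffolding" for volumetric scroll data: load a stack of CT
    # slice images into a 3D array and cut out a small cube to look at.
    # Directory layout, filenames, and sizes are assumptions, not the project's
    # actual data format.

    from pathlib import Path

    import numpy as np
    from PIL import Image  # pip install pillow

    def load_volume(slice_dir: Path) -> np.ndarray:
        """Stack 2D slice images (e.g. 0000.tif, 0001.tif, ...) into a (z, y, x) volume."""
        slice_paths = sorted(slice_dir.glob("*.tif"))
        slices = [np.asarray(Image.open(p)) for p in slice_paths]
        return np.stack(slices, axis=0)

    def subvolume(vol: np.ndarray, z: int, y: int, x: int, size: int = 64) -> np.ndarray:
        """Cut a cube of side `size` starting near (z, y, x), clipped to the volume."""
        h = size // 2
        return vol[max(z - h, 0):z + h, max(y - h, 0):y + h, max(x - h, 0):x + h]

    if __name__ == "__main__":
        vol = load_volume(Path("scroll_slices"))  # hypothetical folder of TIFF slices
        print(vol.shape, vol.dtype)               # (depth, height, width), e.g. uint16
        cube = subvolume(vol, z=100, y=2048, x=2048)
        print("subvolume mean intensity:", float(cube.mean()))

Even a helper this small lowers the barrier for a newcomer, who can check the volume's shape and intensity range before worrying about segmentation or ink detection.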

:

01:11:09,827 --> 01:11:11,748

Yeah, it's been a wild ride.

:

01:11:12,150 --> 01:11:14,890

and there's a company making a movie now.

:

01:11:15,470 --> 01:11:15,960

Evan Troxel: Oh, really?

:

01:11:16,360 --> 01:11:18,840

Christy: Yeah, you know, we've

had a lot, like Randall said,

:

01:11:18,840 --> 01:11:20,590

we've had a lot of press coverage.

:

01:11:20,630 --> 01:11:21,960

I mean, just a lot.

:

01:11:22,000 --> 01:11:25,360

And we've had, you know, we

were on Expedition Unknown

:

01:11:25,370 --> 01:11:27,130

with Josh Gates not long ago.

:

01:11:27,130 --> 01:11:28,170

he came to the lab.

:

01:11:28,190 --> 01:11:29,630

And what else?

:

01:11:29,930 --> 01:11:34,350

Just, you know, Secrets of the

Dead, PBS recently, that was on.

:

01:11:34,580 --> 01:11:38,240

Um, we, you know, National

Geographic, I think, has been here.

:

01:11:38,240 --> 01:11:43,790

I mean, just a lot of people, uh, since

the contest grand prize was awarded.

:

01:11:43,800 --> 01:11:47,280

And even before that, you know,

En-Gedi brought a lot of attention.

:

01:11:47,280 --> 01:11:51,800

But this particular group, they're

making a, like a major, a feature film.

:

01:11:52,470 --> 01:11:55,410

And interestingly, this

time, instead of it

:

01:11:55,420 --> 01:11:58,360

being all about the

technology and, and the wow

:

01:11:58,360 --> 01:12:02,150

factor of what that's able to do,

they're, they're telling the story

:

01:12:02,960 --> 01:12:08,680

of Brent's Quest, basically, you

know, it's actually almost 25 years

:

01:12:08,910 --> 01:12:16,210

because, uh, back then he was working

on, you know, the digital library.

:

01:12:16,620 --> 01:12:22,305

He had a grant from the NSF and, um, I

think, actually, he may have gotten the

:

01:12:22,305 --> 01:12:27,245

, but by:

thinking about these objects that, you

:

01:12:27,245 --> 01:12:32,515

know, can't, you can't always flatten

the book down and take a picture of

:

01:12:32,515 --> 01:12:37,125

it on a scanner because it's all, it's

brittle and it's bumpy and when you try

:

01:12:37,125 --> 01:12:38,615

to do that, it doesn't, it doesn't work.

:

01:12:38,615 --> 01:12:41,205

And if you take a photograph, just

a 2D photograph, it doesn't work.

:

01:12:41,205 --> 01:12:43,725

So it goes all the way back to 99.

:

01:12:43,725 --> 01:12:44,835

So 25 years, I would say.

:

01:12:46,130 --> 01:12:46,620

For sure.

:

01:12:47,160 --> 01:12:52,700

Um, but they're wanting to create a

film about that whole story and kind

:

01:12:52,700 --> 01:12:56,620

of the, the Vesuvius Challenge being

the culmination, you know, of that.

:

01:12:57,130 --> 01:13:02,120

so it's been interesting to, you know,

answer all their questions and stuff,

:

01:13:02,130 --> 01:13:04,430

thinking about it, um, in those terms.

:

01:13:05,070 --> 01:13:09,040

Randall Stevens-1: Well, we'll, uh,

like I said, we'll put some links, uh,

:

01:13:09,520 --> 01:13:12,380

there'll be plenty of things for people

to go read and go watch some of these

:

01:13:12,710 --> 01:13:15,970

videos and we'll put links to, just

to kind of get the visual behind

:

01:13:16,880 --> 01:13:18,590

what all we've been talking about.

:

01:13:18,600 --> 01:13:24,110

But, uh, you know, I, you know, one,

thanks again for, uh, coming and talking

:

01:13:24,110 --> 01:13:26,430

to the group at the live Confluence event.

:

01:13:26,450 --> 01:13:28,240

And, uh, I think.

:

01:13:28,680 --> 01:13:31,170

I think it brought a lot, it brings

a lot of things together, not only

:

01:13:31,170 --> 01:13:35,020

the project you're working on from

that standpoint, but these ideas that

:

01:13:35,020 --> 01:13:39,440

you have to communicate, there's a

technology piece of it, but there's,

:

01:13:39,610 --> 01:13:44,520

there's people on the other side that

you have to bring along with you, right?

:

01:13:44,520 --> 01:13:46,850

If you're going to, if you're

going to lead and get things done.

:

01:13:46,850 --> 01:13:50,020

So I think that that's the,

uh, the big lesson, right?

:

01:13:50,030 --> 01:13:51,400

And why I was, uh,

:

01:13:51,460 --> 01:13:51,670

Christy: Yeah.

:

01:13:51,670 --> 01:13:55,090

Well, I, I really enjoyed it so

much and I, I thank you for inviting

:

01:13:55,090 --> 01:13:57,840

me and I'm so glad Brent was out

of town and he couldn't do it.

:

01:13:58,335 --> 01:13:59,565

He's so jealous of me.

:

01:13:59,895 --> 01:14:04,155

Um, but I have to ask, did I get

like a hundred percent rating on the

:

01:14:04,595 --> 01:14:07,145

Randall Stevens-1: I'll, I'll go,

I'll go back and look, but I just know

:

01:14:07,145 --> 01:14:07,385

when,

:

01:14:07,395 --> 01:14:07,735

Christy: if I

:

01:14:08,055 --> 01:14:08,745

didn't, I have to come

:

01:14:08,745 --> 01:14:10,525

back so I can get the hundred percent

:

01:14:10,684 --> 01:14:13,934

Randall Stevens-1: when the surveys come

in, you know, it's like, we always ask

:

01:14:13,934 --> 01:14:15,425

like, which was your favorite blah, blah.

:

01:14:15,425 --> 01:14:17,605

And then all of a sudden,

it's like, Christy, Christy,

:

01:14:17,655 --> 01:14:18,684

Christy, Christy, Christy.

:

01:14:18,825 --> 01:14:20,595

It's like, you know, so anyway.

:

01:14:20,630 --> 01:14:25,110

Christy: Well, you know, that's great,

I'm glad, and it's not my story, you

:

01:14:25,110 --> 01:14:28,010

know, it's just a great story, and that's

the reason there's been so much coverage

:

01:14:28,010 --> 01:14:31,340

of it, and they're, they are making a

movie out of it, and, you know, there's

:

01:14:31,340 --> 01:14:35,750

a, there are big philosophical sort of

lessons too, I mean, these things are

:

01:14:35,750 --> 01:14:40,340

trash, they're trash, they're useless.

:

01:14:40,885 --> 01:14:43,765

except as a relic, you know,

that was found and we're gonna

:

01:14:43,765 --> 01:14:44,985

store in a closet somewhere.

:

01:14:45,184 --> 01:14:50,105

They're just trash and they're being

redeemed and their original purpose

:

01:14:50,125 --> 01:14:54,565

as a written text with information

that was meant for someone to read

:

01:14:54,815 --> 01:14:59,715

is being restored and, you know,

there's just, you know, beauty from

:

01:14:59,715 --> 01:15:02,135

ashes as the book of Isaiah said.

:

01:15:02,145 --> 01:15:06,905

There's just such a hopeful sort

of thing about it too that I think

:

01:15:06,925 --> 01:15:10,305

resonates with people, whether they

even consciously realize it or not.

:

01:15:10,800 --> 01:15:14,050

You know, um, Yeah.

:

01:15:14,090 --> 01:15:15,800

so we all have trash in our lives,

:

01:15:15,800 --> 01:15:16,200

right?

:

01:15:16,450 --> 01:15:23,720

And to, to, to see, physically, in the, in the

flesh, something being trash and being now

:

01:15:23,720 --> 01:15:26,175

turned into this great prize, literally,

:

01:15:26,690 --> 01:15:30,760

um, it's just really, you know, it's

really exciting to be a part of it.

:

01:15:30,920 --> 01:15:34,590

And, um, I'm, I am definitely

an evangelist for the

:

01:15:34,590 --> 01:15:35,710

project because I think it's

:

01:15:36,065 --> 01:15:37,855

Randall Stevens-1: Well,

we're starting to plan for.

:

01:15:38,305 --> 01:15:42,195

Next year's Confluence event in

October, and maybe we'll coax you

:

01:15:42,195 --> 01:15:43,725

to come back and hang out with us.

:

01:15:43,830 --> 01:15:44,240

Christy: Yeah.

:

01:15:44,240 --> 01:15:44,910

Well, let me know.

:

01:15:44,910 --> 01:15:46,750

Cause I can come up with all

kinds of things to talk about.

:

01:15:46,790 --> 01:15:48,080

Obviously I'm not.

:

01:15:48,225 --> 01:15:48,915

Evan Troxel: an invite this

:

01:15:49,020 --> 01:15:49,230

Christy: Yeah.

:

01:15:49,565 --> 01:15:49,945

Randall Stevens-1: out.

:

01:15:52,850 --> 01:15:54,100

Christy: Well, he's a

very good speaker too.

:

01:15:54,100 --> 01:15:55,460

You probably heard him, Randall.

:

01:15:55,490 --> 01:16:01,710

And, um, you know, a lot of my speaking

skills actually I've picked up from him.

:

01:16:02,160 --> 01:16:06,070

Um, I mean, I have my own, but I've

also picked up some tips from him too.

:

01:16:06,070 --> 01:16:12,690

So you would definitely be,

uh, be pleased, but I don't

:

01:16:12,690 --> 01:16:13,710

want him to take my place.

:

01:16:13,710 --> 01:16:13,990

No.

:

01:16:15,955 --> 01:16:16,365

Evan Troxel: Sounds like

:

01:16:16,365 --> 01:16:16,925

a twofer.

:

01:16:17,095 --> 01:16:17,255

Randall Stevens-1: All

:

01:16:17,255 --> 01:16:17,705

right.

:

01:16:17,795 --> 01:16:18,925

Well, thanks, Christy.

:

01:16:18,925 --> 01:16:19,305

This has

:

01:16:19,805 --> 01:16:20,645

been great.

:

01:16:21,275 --> 01:16:25,105

Appreciate your coming on and, uh, and

sharing the story and then, you know,

:

01:16:25,105 --> 01:16:29,965

these insights into, you know, how to, how

to, how to get better outcomes from these,

:

01:16:29,965 --> 01:16:32,085

uh, initiatives and, uh, technology.

:

01:16:32,450 --> 01:16:32,620

Christy: Yeah.

:

01:16:33,255 --> 01:16:33,595

Don't give

:

01:16:33,750 --> 01:16:34,470

Randall Stevens-1: Bridging on it.

:

01:16:34,470 --> 01:16:35,410

Yeah, be audacious.

:

01:16:36,710 --> 01:16:37,070

Great.

:

01:16:37,135 --> 01:16:37,365

Christy: go

:

01:16:37,365 --> 01:16:39,125

forth and be audacious, everyone.

:

01:16:40,059 --> 01:16:40,790

Randall Stevens-1: Thanks again.

:

01:16:41,395 --> 01:16:42,675

Christy: Yeah, thank you.



About the Podcast

Confluence
The director's commentary track for AEC industry software development.
The Confluence podcast is the director's commentary track for AEC industry software. Go behind the scenes with us to learn how and why decisions were made in the creation of your favorite software for the architecture, engineering, and construction industries.

It's a collaboration between Randall Stevens of AVAIL and Evan Troxel of TRXL.

About your hosts

Evan Troxel

An industry-leading design and technology expert with a passion for connecting people, Evan is a licensed architect in California and is most well known for his podcasts that focus on the AEC industry.

He has over 25 years of experience in practice and technology in the architectural profession, working with large teams to deliver large public projects for clients. He brings his experiences together on the Archispeak and TRXL podcasts, and now on the Confluence podcast.

Randall Stevens

An AEC industry veteran with 25 years of software development, sales, and management experience, Randall offers a unique combination of expertise in software and graphics technology, coupled with a background and degree in architecture.

In 1991 he founded ArchVision, a software firm specializing in 3D graphics, specifically Rich Photorealistic Content (RPC). Through ArchVision, Randall has built an extensive network with the industry’s leading experts, architectural firms, and visualization software companies, which led him to product development of the AVAIL platform.