Random Fandom: A.I in Creativity with Tom & Matt from Din of Celestial Birds
Fandomentals
December 23, 2024
134
1:39:28 | 227.94 MB

Random Fandom: A.I in Creativity with Tom & Matt from Din of Celestial Birds

AI Coca-Cola Ads

Weird Faces - https://youtu.be/IQWUKWM2JrQ?si=VzinW1Qk5Qac43MS

Weird Squirrels - https://www.youtube.com/watch?v=mlVkTA_JGVg

Din of Celestial Birds

The Night Remixed on Bandcamp - https://dinofcelestialbirds.bandcamp.com/album/the-night-remixed

The Night Remixed on Spotify - https://open.spotify.com/album/6tcdnxvOjHNauXkuK7K0h3

Website - https://dinofcelestialbirds.bandcamp.com/

Spotify - https://open.spotify.com/artist/6b91rmLZURcbwNnBRwerym

Apple Music - https://music.apple.com/gb/artist/din-of-celestial-birds/1524030187

YouTube - https://www.youtube.com/channel/UCGM-GLNakgXzhDWhYpLFmXg

Instagram - https://www.facebook.com/DinofCelestialBirds/

Facebook - https://www.facebook.com/DinofCelestialBirds/

Get 25% Off Pet Accessories Here - https://styleupyourpet.com/discount/harley25

Fandomentals Links

Discord Server - https://discord.gg/x6d9PNGQfF

Donate to the Podcast - https://fandomentals.captivate.fm/donate

TeePublic Store - https://fandomentals.captivate.fm/podcastmerch

Blue Sky - https://bsky.app/profile/fandomentalspod.bsky.social

Instagram - https://instagram.com/fandomentalspod

Threads - https://www.threads.net/@fandomentalspod

Email - fandomentals@yahoo.com

Website - https://fandomentals.captivate.fm/

Artwork Designed by Alex Jenkins

Website - www.hexdesigns.org

Instagram - https://www.instagram.com/hexshadow

Twitter - https://twitter.com/hexghosts

Thank you for checking out this episode and be sure to subscribe for more content!

Donate to CALM Here - https://tiltify.com/@podomedy/fundraiser-for-stay-tuned-2025


CALM Tools & Resources - https://www.thecalmzone.net/tools-mental-health-support


Hosted on Acast. See acast.com/privacy for more information.

[00:00:14] Hello and welcome to Fandomentals, the podcast that explores pop culture one conversation at a time. I am your host, Harley. Every episode I interview different people from around the world to discuss a variety of topics within the world of pop culture. Thanks for joining me on this journey and I hope you enjoy the episode.

[00:00:32] Welcome to the eighth and final episode of Random Fandom 2024. In fact, this is the last episode of the podcast for the year 2024. And what an exciting conclusion I have for all of you today. On this episode, I'm joined by two returning guests of the podcast. It's Matt Benatton and Tom Hazlehurst from the band Din of Celestial Birds.

[00:01:08] I met these guys at the ArcTanGent Festival earlier this year and we really hit it off. So when I was putting this season together, I reached out to them to see if they'd like to do a return guest spot. And of course they could choose whichever topic in pop culture they would love to discuss.

[00:01:23] And they came to me with something truly spectacular and very relevant in our time. And that is, of course, AI in Creativity.

[00:01:33] This is very much a hot button topic right now in the world of pop culture. Matt and Tom have a very unique perspective on this subject as two individuals who both work with AI in a professional setting, but also as artists who are very much aware of the creative implications of this technology.

[00:01:52] It's a really fascinating conversation that we get into. I was so thrilled that they came on and shared with me such a nuanced and balanced perspective, which I'm sure you will enjoy.

[00:02:02] I will, of course, be letting you know at the end of the episode in the credits what to expect for the podcast in the coming months.

[00:02:09] So make sure you stick around for that at the end.

[00:02:12] So let's just get to it.

[00:02:15] This is AI in Creativity with Matt and Tom from Din of Celestial Birds.

[00:02:30] Hello, Tom and Matt, and welcome back to the Fandomentals podcast.

[00:02:34] Hello. Thanks for having us.

[00:02:36] Hi. Yeah. Thanks for bringing us back.

[00:02:39] Oh, it's my pleasure, guys. And, you know, I've got to say, this is one of the most unique topics I've ever had sent my way.

[00:02:46] And I'm really excited to talk about this for so many reasons.

[00:02:49] So as you would have seen from the title, folks, you know, better late than never.

[00:02:54] I'm talking about AI on the podcast.

[00:02:57] But I thought it'd be a good one because you guys mentioned that you've got a background in it, both as creatives and on a professional level, from what I understand.

[00:03:03] So I'm intrigued to dive into this with you guys, I guess, to kind of kick it off.

[00:03:07] I mean, my first question really is, what do you guys do in a sort of professional capacity that means you understand this sort of better than the average person?

[00:03:19] Tom?

[00:03:21] Well, I'm a research fellow at university.

[00:03:24] I work in the Department of Computer Science.

[00:03:27] Okay.

[00:03:28] Yeah.

[00:03:29] I research AI every day.

[00:03:32] In the past few years, that's all I've done, really.

[00:03:35] Really?

[00:03:36] Yeah.

[00:03:38] Okay.

[00:03:40] Yeah.

[00:03:40] Yeah.

[00:03:40] But, you know, I don't research in a creative sense.

[00:03:47] I kind of do more industrial AI stuff.

[00:03:49] So, you know, it has different applications to what many people tend to think AI is.

[00:03:56] But I think there's a really broad kind of gamut of possibilities for what you can use it for,

[00:04:02] which I don't think a lot of people really appreciate.

[00:04:05] Right.

[00:04:06] Okay.

[00:04:07] Interesting.

[00:04:07] And how about yourself, Matt?

[00:04:09] I'm also an AI researcher.

[00:04:12] I'm in industry.

[00:04:14] So, I did my PhD at Leeds, same place that Tom did.

[00:04:19] Okay.

[00:04:19] And that's not how we know each other, though.

[00:04:22] We go very, very, very far back.

[00:04:26] Okay.

[00:04:26] Okay.

[00:04:26] But, yeah.

[00:04:28] Yeah.

[00:04:32] My areas of expertise in AI are quite broad.

[00:04:35] I've been working in the field as a researcher for a very long time.

[00:04:39] Longer than I'd like to admit.

[00:04:43] And the stuff that I've worked on has kind of taken me from originally looking at audiovisual speech processing.

[00:04:53] So, looking at what we call computer vision, which overlaps a lot with what Tom's looked at in the past.

[00:05:01] And, in fact, Tom's supervisor was my internal examiner.

[00:05:07] Right.

[00:05:07] And on top of that, audio is really how I got into it.

[00:05:14] So, working on audio DSP and making my own sort of plugins and things like that and little apps and stuff.

[00:05:24] And then that kind of took me through to, like, through a kind of protracted journey to machine learning.

[00:05:29] And I've worked on things to do with multimedia.

[00:05:33] I've worked on things to do with drug discovery for pharmaceutical companies.

[00:05:37] Right.

[00:05:38] And now I'm back to multimedia working.

[00:05:43] Well, sort of multimedia adjacent anyway, working at Sonos.

[00:05:47] So, I work with the speaker company Sonos, where I develop various kinds of AI algorithms to do various kinds of things.

[00:05:54] Wow.

[00:05:55] Okay.

[00:05:55] So, both of you coming at it from, I guess, different angles.

[00:05:59] But, yeah, in the same way looking at how you can apply this technology to sort of, I guess, advancements.

[00:06:05] Right.

[00:06:05] I mean, that's kind of the main thing.

[00:06:07] That's been my understanding of this.

[00:06:09] But, yeah, I mean, how long do you think this has been a thing for?

[00:06:12] Because that's something I feel like is a bit of a misconception.

[00:06:14] Right.

[00:06:14] It does feel like it sort of sprung up overnight.

[00:06:17] But just even listening to you guys talk there, saying about how you've been researching it for years, working in different fields.

[00:06:22] Like, what do you sort of think is the timeline on a lot of this stuff?

[00:06:26] I mean, I would say, you know, from the 90s, a lot of this kind of algorithms and stuff has been there.

[00:06:33] Yeah.

[00:06:33] But I think it's just been very limited by the computational power.

[00:06:36] Sure.

[00:06:37] I think a lot of the real advancements have come from being able to just, you know, ramp up these machines.

[00:06:42] They're just getting better and better every day.

[00:06:44] And really the power that's behind it is really kind of pushing this advancement.

[00:06:49] And because you can do these things a lot faster than you used to be able to, you know, the progress is just exponential, really.

[00:06:57] Yeah.

[00:06:58] I mean, when you say how far does it go back, it's a difficult question to answer because a lot of people will say it's something like Ada Lovelace and the Mechanical Turk and stuff like that, which is like going back.

[00:07:11] Yeah.

[00:07:11] Like a very long time.

[00:07:14] You can go back to Alan Turing, can't you?

[00:07:16] Early 20th century and like Alan Turing.

[00:07:19] Yeah, yeah.

[00:07:20] So people have been really interested in being able to describe the world and build agents to work within the world automatically for a very long time.

[00:07:33] Sure.

[00:07:34] And what we're seeing now is just, yeah, it's a scaling up of that.

[00:07:37] It's taking that to the nth degree and seeing what happens when you fire loads of computational power at it, like Tom says, and you can do really magical things.

[00:07:45] Yeah, that makes sense.

[00:07:46] I mean, I suppose it's problem solving, isn't it?

[00:07:48] That's what it's been for the longest time.

[00:07:50] Yeah.

[00:07:51] I mean, I think like, yeah, problem solving, pattern recognition, is probably one of the best ways of generalizing it.

[00:08:01] But then, of course, you know, what we see today with things generating things, that's not necessarily recognition, but it comes from the same place.

[00:08:08] You know, these kind of underlying algorithms need to learn what it is they're trying to generate in order to be able to generate it.

[00:08:17] So that element of recognition is still very much part of the equation.

[00:08:22] Interesting.

[00:08:23] Yeah.
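
For the curious, here's a toy sketch in Python of that recognise-first, generate-second idea: a character-level Markov chain that first counts which character tends to follow which (the recognition step), then samples from those counts to generate new text (the generation step). The corpus and settings are made up purely for illustration.

import random
from collections import defaultdict

def train(text: str, order: int = 2) -> dict:
    """Recognition step: count which character follows each `order`-length context."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model: dict, order: int = 2, length: int = 120) -> str:
    """Generation step: sample new text from the learned statistics."""
    out = random.choice(list(model.keys()))
    for _ in range(length):
        out += random.choice(model.get(out[-order:], [" "]))
    return out

corpus = "the din of celestial birds fills the night sky " * 20  # toy training data
print(generate(train(corpus)))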

[00:08:23] So that's something I'm fascinated by is, yeah, where that sort of crossover comes in.

[00:08:29] Right.

[00:08:29] Because I feel like most people can understand the idea of, yeah, you're building in machine learning, as you say, pattern recognition makes sense, problem solving, all this stuff.

[00:08:39] That feels like a huge leap then to get to creativity, like a machine figuring out how to build something.

[00:08:47] And I guess like the whole implications of that, which is a whole massive topic in itself.

[00:08:52] But yeah, something I'm very curious about is what you guys sort of make of that.

[00:08:56] I suppose as someone who works in the industry, so you understand better than most, like how these things actually operate and what they are.

[00:09:04] But it's also as creative yourselves and sort of what you're starting to recognize.

[00:09:08] And because I think that's the big talking point right now, isn't it, for a lot of people?

[00:09:12] So, yeah, I guess I appreciate it's a massive question, but I am genuinely curious what your guys sort of take is on that.

[00:09:18] And we can go off in all sorts of directions with this, I'm sure.

[00:09:21] I guess from the general public, I guess what you see really is all this generative stuff.

[00:09:29] You can just scroll through Facebook and every third post is just an AI-generated image.

[00:09:36] Sure.

[00:09:38] And people are worried that these things are just going to take other people's jobs.

[00:09:44] Is it going to...

[00:09:45] Sure.

[00:09:45] Like, why would you hire a stock photo photographer if you can just generate something that's good enough kind of thing?

[00:09:53] I can understand why people are worried about kind of things like that.

[00:09:58] I don't know, though, but I always think, you know, stock images is one thing.

[00:10:05] But, you know, if you're writing a song or you're an artist and you're painting a picture, it's that...

[00:10:10] That's a human connection kind of making that, you know.

[00:10:15] And I think that's always...

[00:10:16] That's something that's never, ever going to go away.

[00:10:19] Yeah.

[00:10:20] If you saw an image...

[00:10:22] In fact, one of the examples I was thinking earlier today was this man called Gold.

[00:10:26] And, you know, if you didn't know the artist, you think, oh, these are some good songs.

[00:10:32] And you can read about the artist and, you know, the stuff that that artist

[00:10:36] went through.

[00:10:37] Those songs hit so much harder.

[00:10:39] But if you just heard, oh, an AI generated it, you're like, oh, yeah.

[00:10:44] You don't really feel anything.

[00:10:45] You're like, well, it's a good song, but...

[00:10:47] You know, I'm sure it's a good song, but if you don't feel anything, then what's the point, really?

[00:10:54] Yeah.

[00:10:55] I don't think you can be a fan of something that's generated without, like, without that

[00:11:01] element of the human experience.

[00:11:03] That seems really strange to me to, you know...

[00:11:07] And people talk about stock music and things like that, you know, just being able to generate

[00:11:14] lots of this music.

[00:11:15] And there's a company that I think they have either been acquired or they went under a little

[00:11:22] while ago.

[00:11:23] But a friend of mine interned there a little while back.

[00:11:26] They were called Jukedeck.

[00:11:27] And what they were doing, they were developing these machine learning-based methods and sort

[00:11:35] of procedural methods for, like, automatically generating music.

[00:11:43] And this was before, you know, this was before we have the stuff that we, you know, we have

[00:11:48] today where we're starting to get some pretty convincing music generation.

[00:11:54] And it just felt weird to me.

[00:11:57] Like, obviously there's a market for it, but there's a big ethical implication there.

[00:12:02] And so the question of how does this work creatively is a really pertinent one right now.

[00:12:09] Mm-hmm.

[00:12:11] Um, I had the, I had the, um, sort of fortune of having a chat with, um, somebody who works

[00:12:19] at Stability AI, um, a few months ago.

[00:12:22] Um, now Stability AI, if you're not aware, you can Google them.

[00:12:26] Um, they, they release various, um, large generative models that, um, you know, do anything

[00:12:33] from, you know, um, sort of images through to music.

[00:12:36] And I'm sure they're going to look at multimodal things that do like images and music, you know,

[00:12:41] on things like that.

[00:12:42] I'm sure, you know, they're interested in that, that, that side of, um, generative content.

[00:12:47] Um, but the guy that I spoke to was working on generative music content specifically.

[00:12:52] Um, and I was interested because he's a musician.

[00:12:56] A lot of people who work on this stuff, a lot of people who actually generate the models

[00:13:00] that generate the music that threatens the jobs of people who want to make money through

[00:13:04] music, um, are themselves musicians and they don't come at this from a bad place.

[00:13:09] Right.

[00:13:09] They're interested in creating tools that they would like to use.

[00:13:12] Right.

[00:13:13] Um, and his point was like, well, what if you could just have something that, that played

[00:13:17] the perfect accompaniment?

[00:13:18] Right.

[00:13:19] So what, you know, if you, as a, as a musician, what if you could just, you know, pick up a

[00:13:23] guitar and then you have, uh, you know, the bass and the drums there and they're following

[00:13:28] what you're doing and it feels really natural and you don't need to find people that you,

[00:13:34] you know, and orchestrate band practices and things like that.

[00:13:39] Right.

[00:13:39] Cause organizing all of that stuff, you know, we have five people in our band, right.

[00:13:42] It's really difficult organizing anything.

[00:13:45] That's true.

[00:13:46] Yeah.

[00:13:46] And sometimes, sometimes I do wonder, um, but, um,

[00:13:52] You're going to get a replacement?

[00:13:53] Yeah.

[00:13:54] Well, I'm the first one to go.

[00:13:55] Yeah, probably, but, um, Is this how you're finding out?

[00:13:59] Yeah, I think it might be.

[00:14:01] This is awkward.

[00:14:04] But I think that's a, that's a perfectly legitimate perspective to have, right.

[00:14:08] It's a, it's a cool way to use the technology, but like with all technology, it can be misused.

[00:14:13] Right.

[00:14:13] I think that's the thing, you know, as someone who works in tech and, um, is invested in creating

[00:14:23] technology that brings people joy.

[00:14:25] I don't really want that technology to have negative effects.

[00:14:29] Right.

[00:14:30] So for me, you know, working on, um, generative music is a bit contentious because it's, it's a,

[00:14:37] it's a very, um, it's a very clear line from that to something that can impact people's

[00:14:44] livelihoods and music.

[00:14:46] Yeah.

[00:14:46] Music's hard enough as it is.

[00:14:47] Oh yeah.

[00:14:48] Yeah.

[00:14:50] Hmm.

[00:14:51] Interesting.

[00:14:51] What do you make of that Tom?

[00:14:53] Yeah.

[00:14:54] I mean, I broadly agree.

[00:14:56] I mean, it's difficult, you know, going back to what I said earlier, I think generative AI

[00:15:05] is such a, I think we just really see the tip of the iceberg, as it were.

[00:15:12] And it can be used in so many other, you know, so many other ways.

[00:15:18] You know, one thing that comes to mind is the medical AI, you know, they use generative

[00:15:22] techniques.

[00:15:23] So one thing a colleague of mine works on, well, I think they used to work on it, but,

[00:15:28] um, they're using AI to detect cancer in, you know, in, in medical images.

[00:15:35] Right.

[00:15:36] And one of the problems they have is, you know, there's not many people who've got this particular

[00:15:39] type of cancer.

[00:15:40] So they're actually using generative AI to create new examples to train, you know, they

[00:15:46] can train real doctors with this, and other AI, to detect cancer cells.

[00:15:50] And these things are incredible.

[00:15:52] I think they're going to literally save millions of people's lives.

[00:15:56] And it's just something when people go, Oh, AI is bad.

[00:15:59] It's just, people just don't even think about it.

[00:16:02] That's a really good point, actually.

[00:16:03] Yeah.

[00:16:04] I think this kind of taps into what you guys have been saying from the start, really,

[00:16:07] is the idea of using this as tools.

[00:16:09] Right.

[00:16:10] And, um, I like what you brought up there, Matt, as well, about the, the idea of using

[00:16:14] it as, as a creative tool.

[00:16:16] I mean, one that kind of jumps out to me and I'm sure you guys are very familiar with this

[00:16:20] is, um, things like, um, we mentioned it earlier, using sort of DSPs and VSTs, you

[00:16:26] know, kind of like, for anyone that doesn't know, basically virtual digital instruments.

[00:16:31] And pretty much all of those nowadays rely on some form of machine learning.

[00:16:36] I mean, the most common example I can think of, and I'm looking at Tom's guitar collection

[00:16:41] and I know for myself, you know, it's the same.

[00:16:44] I mean, I use digital amps all the time now.

[00:16:47] And like the prevalence of that in the last four years has just, I mean, it was always

[00:16:52] around, but it's blown up.

[00:16:54] Right.

[00:16:54] And it's incredible what people are able to do with that.

[00:16:56] And the same with drums, like people, you know, people were able to go in a studio, record

[00:17:00] real instruments, feed it through a machine learning program that uploads all of that

[00:17:06] as samples.

[00:17:06] And then boom, you're away.

[00:17:08] And if you're like me, just some guy at home looking to write some music and create some

[00:17:13] songs, like you can do it now.

[00:17:15] You have the power to do that, which is incredible.

[00:17:18] Um, as you say, so I totally agree.

[00:17:20] I think it's one of those things when it comes to learning and how to use this stuff in

[00:17:24] creativity.

[00:17:25] I agree.

[00:17:26] It's good to look at it from a balanced perspective, not just this is all terrible.

[00:17:30] It's like, no, no, it has applications.

[00:17:32] I mean, case in point, sometimes I use it.

[00:17:34] I use it on pretty much every episode of the podcast.

[00:17:36] I have various plugins that I use, but noise removal and EQ and compression, and all of

[00:17:43] them have some sort of setting on there where you just get it to listen, learn the track

[00:17:47] and then give you a sort of preset.

[00:17:51] That's brilliant because I'm not, I'm not a studio engineer.

[00:17:55] So, and even, even studio engineers, like the, the fact of the matter is, is the technology

[00:18:01] is at a point now where, um, it's giving people the ability.

[00:18:08] So it's good.

[00:18:09] So, I mean, the tools you talk about there, right?

[00:18:11] I mean, I don't know how you feel about dropping brands on this, but iZotope.

[00:18:15] Oh, why?

[00:18:16] Because they have been using it right now.

[00:18:20] There you go.

[00:18:21] Right.

[00:18:21] Um, they've been industry leaders in, in this sort of stuff.

[00:18:24] Um, and they've always been at the forefront of picking up new technologies and showing

[00:18:31] what you can do with that.

[00:18:32] And the fact of the matter is, is that, you know, they, and I've had the benefit of seeing

[00:18:37] some of this in, in, in my career because I started out really in digital signal processing,

[00:18:43] right?

[00:18:43] So, okay.

[00:18:44] Developing algorithms to do the sorts of things that you're talking about broadly.

[00:18:48] Yeah.

[00:18:48] Right.

[00:18:49] Um, but over time, what we found, you know, there, there are, there are limitations to

[00:18:56] what, what we would term as classical or traditional techniques.

[00:19:00] Um, sure.

[00:19:01] And these machine learning based methods allow us to do more.

[00:19:06] They allow us to get past that, um, and open up this, this whole new realm of possibility

[00:19:14] of being able to do things that we've not, you know, not been able to do before. Stuff

[00:19:18] like, you know, being able to denoise things, um, and iZotope's tools are, uh, phenomenal, right?

[00:19:25] They've always been good, but now they're at a totally new level.

[00:19:29] I've got, uh, an example of that in an episode I did a couple of months ago.

[00:19:33] Now, um, I interviewed Kid Bookie for the podcast.

[00:19:36] And, um, unfortunately there was a ton of background noise in the place we recorded.

[00:19:42] And I was like, Oh no, this is ruined.

[00:19:44] I can't use this.

[00:19:45] And then I had the idea.

[00:19:46] Oh yeah.

[00:19:48] iZotope's got the RX thing.

[00:19:51] I thought I'll get a trial, see how it works.

[00:19:53] Cause I've got the basic version on here and it's pretty good.

[00:19:56] It was incredible.

[00:19:58] Like what it was able to do.

[00:19:59] It stripped out so much of the noise and both of our feeds were so clear.

[00:20:04] And I just remember thinking like, yeah, this is pretty remarkable.

[00:20:07] Like if this is the kind of stuff that you can do now.
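
To make that concrete, here's a minimal sketch of classical spectral gating, the starting point for this kind of denoising. It assumes a mono recording whose first half-second contains only background noise, estimates a per-frequency noise floor from it, and attenuates any time-frequency bin that doesn't rise clearly above that floor. Tools like iZotope RX are far more sophisticated, and increasingly machine-learned, so treat this as the toy version.

import numpy as np
from scipy.signal import stft, istft

def spectral_gate(audio: np.ndarray, sr: int,
                  noise_seconds: float = 0.5, threshold: float = 2.0) -> np.ndarray:
    """audio: mono signal. Assumes the first `noise_seconds` contain noise only."""
    f, t, Z = stft(audio, fs=sr, nperseg=1024)        # hop size is nperseg // 2 = 512
    mag = np.abs(Z)
    noise_frames = max(1, int(noise_seconds * sr / 512))
    noise_floor = mag[:, :noise_frames].mean(axis=1, keepdims=True)
    # Keep bins well above the noise floor; heavily attenuate everything else.
    mask = np.where(mag > threshold * noise_floor, 1.0, 0.1)
    _, cleaned = istft(Z * mask, fs=sr, nperseg=1024)
    return cleaned

# Usage with a hypothetical file, via the soundfile library:
# import soundfile as sf
# audio, sr = sf.read("interview.wav")
# sf.write("clean.wav", spectral_gate(audio, sr), sr)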

[00:20:09] And to that as well, I'm sure you guys have seen it.

[00:20:11] I've, I've sort of went down a bit of a YouTube rabbit hole afterwards,

[00:20:14] looking at other softwares that do the same with like film footage, you know,

[00:20:18] and like taking old grainy people's like home videos and they're able to like

[00:20:22] upscale them into almost like 4k kind of stuff.

[00:20:26] Oh, those, those things are incredible.

[00:20:27] Really?

[00:20:28] Yeah.

[00:20:28] It's totally.

[00:20:29] Yeah.

[00:20:30] I mean, if you're, you know,

[00:20:32] I've got lots of family videos from when I was a child,

[00:20:35] I'm sure in, you know, 20 years time, these will look all grainy,

[00:20:38] but if I can use a tool to make them look as good as new,

[00:20:41] then it's exactly.

[00:20:43] It's wonderful.

[00:20:43] Really?

[00:20:43] It's crazy.

[00:20:44] I remember when I was,

[00:20:47] when I was doing my PhD,

[00:20:50] me and my girlfriend, now wife,

[00:20:54] were watching,

[00:20:55] I don't know,

[00:20:56] I feel like it was the Bourne Identity or something like that.

[00:20:58] And they did,

[00:20:59] they did that typical sort of enhanced thing with some CCTV footage.

[00:21:03] Yeah.

[00:21:04] And I like,

[00:21:04] super enhanced.

[00:21:05] Yeah.

[00:21:05] And I was like super like smugly,

[00:21:07] like that's impossible if the data is not there,

[00:21:09] I can't just imagine the data.

[00:21:10] Right.

[00:21:11] And now it's like, no, it can, because with enough information, yeah,

[00:21:17] from that high-level structure you can learn what the local structure is and you can recreate that.

[00:21:23] And it's,

[00:21:24] it's,

[00:21:25] yeah,

[00:21:26] it's kind of fun to be eating my words like this.

[00:21:28] I think it's,

[00:21:28] it's,

[00:21:29] it's a really interesting time that we're in.

[00:21:31] A hundred percent.

[00:21:32] Yeah.
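
And for anyone wondering how the "enhance" trick can work at all, here's a minimal sketch in the spirit of SRCNN, one of the early learned super-resolution networks: upscale conventionally first, then let a small convolutional network, trained on pairs of low- and high-resolution images, predict plausible local detail. As Matt says, it isn't recovering the missing data; it's predicting detail consistent with the high-level structure it can see. Layer sizes and the fake input frame are illustrative only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRCNN(nn.Module):
    def __init__(self, channels: int = 3):
        super().__init__()
        self.feat = nn.Conv2d(channels, 64, kernel_size=9, padding=4)    # extract patches
        self.map = nn.Conv2d(64, 32, kernel_size=1)                      # non-linear mapping
        self.recon = nn.Conv2d(32, channels, kernel_size=5, padding=2)   # reconstruct detail

    def forward(self, low_res: torch.Tensor, scale: int = 4) -> torch.Tensor:
        # Conventional upscaling gets the sizes right; the convs add learned detail.
        x = F.interpolate(low_res, scale_factor=scale, mode="bicubic", align_corners=False)
        return self.recon(F.relu(self.map(F.relu(self.feat(x)))))

# Training (not shown) would minimise the difference between the output and real
# high-resolution frames, which is where the local structure gets learned.
frame = torch.rand(1, 3, 90, 160)      # a stand-in low-resolution video frame
print(TinySRCNN()(frame).shape)        # torch.Size([1, 3, 360, 640])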

[00:21:33] So I saw a new tool.

[00:21:35] I was doing,

[00:21:36] doing some Googling and I found a new tool called Audiby.

[00:21:40] And we were having a brief chat before you got here,

[00:21:42] Matt,

[00:21:43] but I was saying,

[00:21:43] I'm not a very good singer,

[00:21:45] but this Audiby thing,

[00:21:46] but essentially what it does is you can,

[00:21:48] you know,

[00:21:49] you sing your,

[00:21:50] your vocal line.

[00:21:51] And what it does is you upload it to the website and it,

[00:21:55] it recreates.

[00:21:56] It must do some transfer learning where it turns your voice into like a

[00:22:00] different singer.

[00:22:02] So it's using all your articulation,

[00:22:04] but it's just remapped it onto like a,

[00:22:06] no.

[00:22:08] That's really cool.

[00:22:09] Like,

[00:22:09] it's really like,

[00:22:10] and I don't know.

[00:22:12] Like,

[00:22:13] I feel like that's,

[00:22:14] is that okay?

[00:22:15] I guess that's the question,

[00:22:16] right?

[00:22:16] If you're a,

[00:22:17] if you're a songwriter,

[00:22:18] you know,

[00:22:18] from my point of view, if you've been writing a song, say I'm not a great singer,

[00:22:22] but I want to write a song for, you know, a soul group.

[00:22:25] Why not make it? You know, if I really wanted to release the song I'd hire someone to do this,

[00:22:31] but as a songwriting process, why not?

[00:22:34] This reminds me of a case recently.

[00:22:36] I can't remember who the artist was,

[00:22:40] but their voice was basically being,

[00:22:44] being used on recording or like it was kind of made available.

[00:22:48] I think through some model that,

[00:22:50] um,

[00:22:51] some,

[00:22:51] some people had trained somewhere.

[00:22:53] Um,

[00:22:54] they trained it on her voice and then they transfer learned from that.

[00:22:56] So, so just quickly, transfer learning is a technical term that we use to talk about imparting the aesthetic properties of something onto something else.

[00:23:10] And you can do it with any kind of medium. So you've probably seen people turn photos into different types of paintings, right? Like you can turn a photo of something into something that looks like it was painted by Monet or whatever. Right.

[00:23:31] And transfer learning is how you do that. You have a machine learning model that learns from a whole bunch of these paintings, and then it has the ability to transfer that style.

[00:23:45] It's called style transfer.

[00:23:46] Sorry. No, it's not called transfer learning, it's called style transfer.

[00:23:50] Transfer learning is something completely different and I won't go into it, but it's, it's quite useful.

[00:23:55] Did I just say that?

[00:23:56] Or did you,

[00:23:56] did you say transfer learning?

[00:23:57] I said transfer learning.

[00:23:59] Okay.

[00:23:59] Right.

[00:23:59] Cool.

[00:24:00] Not my fault.

[00:24:01] Um,

[00:24:03] there we go.

[00:24:03] No.

[00:24:04] Um, so it's called style transfer, and it's because it does just that. It transfers the style.

[00:24:12] You can do that with images and you can do it with audio.

[00:24:15] So you can do, you know, what Tom said this tool does, which is you train something on a bunch of examples of someone's voice, and then you can put in some other voice content and apply that style to the new voice content.
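
As a concrete taste of the image version, here's a minimal sketch of the Gram-matrix trick at the heart of classic style transfer: "style" is summarised by which features co-occur in a network's activations, regardless of where, and transferring it means optimising the generated content until its Gram matrices match the style's. The voice tools discussed here use different architectures, so this is the idea rather than their implementation, and the random tensors below stand in for feature maps you would normally take from a pretrained network such as VGG.

import torch

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """features: (batch, channels, height, width) activations from some layer."""
    b, c, h, w = features.shape
    flat = features.view(b, c, h * w)
    # Channel-by-channel co-occurrence statistics, normalised by size.
    return flat @ flat.transpose(1, 2) / (c * h * w)

def style_loss(generated: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
    return torch.mean((gram_matrix(generated) - gram_matrix(style)) ** 2)

gen = torch.rand(1, 64, 32, 32, requires_grad=True)   # stand-in generated features
sty = torch.rand(1, 64, 32, 32)                       # stand-in style features
loss = style_loss(gen, sty)
loss.backward()   # gradients flow back towards the generated image
print(loss.item())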

[00:24:32] And you can do this with frighteningly small amounts of information,

[00:24:36] right?

[00:24:36] Cause there's been a lot of concern around deep fakes and things like that.

[00:24:40] And you can do this with something like a few seconds of audio from someone. That's enough.

[00:24:49] Probably quite dependent on the quality of that and the variance that occurs in that data. But, you know, it's very efficient.

[00:24:56] So, um, yeah, I recently was reading about a case where someone was very angry that this sort of technology was being used with the likeness of their voice.

[00:25:09] And so that's obviously a concern in terms of artists themselves, right? Like what if you can just go and you can find, like, a vocal rendition of something?

[00:25:20] I mean, you can do what's called source separation now. Sorry, we're throwing a lot of terms out there.

[00:25:26] Source separation is the means by which you extract different signal sources from audio. Right. So for example, when you see people wanting to, like, make karaoke tracks or something by removing the vocal, you could argue that that's kind of a form of source separation.

[00:25:52] But it's gotten very advanced now with AI, and you can actually just pull, like, the drums and the bass and the vocal and the guitars, you know, you can have those separate stems, right, from a stereo recording.

[00:26:04] Which means that you could get that vocal and you could train a model on it. And then that's that artist's IP, right? Like, that's theirs. Shouldn't that belong to them? Like, how do we deal with that as a society?

[00:26:20] Yeah.

[00:26:20] Yeah.
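
Here's a sketch of the old karaoke trick Matt mentions, the crudest form of source separation: lead vocals are usually mixed to the centre, equal in both channels, so subtracting one stereo channel from the other cancels them while leaving side-panned instruments. Modern AI separators (Demucs or Spleeter, for example) instead learn per-instrument masks and can return clean stems, which is exactly what makes the IP question so pressing. The filenames are hypothetical.

import numpy as np

def remove_centre(stereo: np.ndarray) -> np.ndarray:
    """stereo: (samples, 2) array. Returns mono audio with the centre channel cancelled."""
    assert stereo.ndim == 2 and stereo.shape[1] == 2, "expected a stereo signal"
    return (stereo[:, 0] - stereo[:, 1]) / 2.0

# Usage via the soundfile library:
# import soundfile as sf
# mix, sr = sf.read("song.wav")
# sf.write("instrumental.wav", remove_centre(mix), sr)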

[00:26:21] That's a really interesting one.

[00:26:22] I guess that's falls as well under the whole thing of,

[00:26:25] this being on the internet.

[00:26:28] Yeah.

[00:26:28] You know,

[00:26:29] because it's like,

[00:26:29] you can't police that,

[00:26:31] right?

[00:26:31] Like you can't really stop somebody doing any of that stuff.

[00:26:36] I mean,

[00:26:37] interestingly,

[00:26:37] I think we kind of had a brief chat along this lines,

[00:26:40] right?

[00:26:40] At the festival,

[00:26:41] we were chatting about your band name and how like there was a band

[00:26:44] somewhere else in another part of the world that did use it for a bit

[00:26:47] and then they stopped.

[00:26:47] And so like,

[00:26:48] you know,

[00:26:49] when you,

[00:26:49] when you do anything creative,

[00:26:50] I had the same with the podcast.

[00:26:51] I'd sort of Google the name.

[00:26:52] And weirdly,

[00:26:53] I got tagged in a thing the other day and I was like,

[00:26:56] it's not me.

[00:26:56] Looked at it.

[00:26:57] It's like,

[00:26:57] Oh,

[00:26:57] someone started a YouTube channel with a similar name.

[00:26:59] And I went,

[00:27:00] well,

[00:27:00] I can't do anything about it because it's the internet.

[00:27:02] You know what I mean?

[00:27:03] Like I can't email them a cease and desist because who cares?

[00:27:07] Yeah.

[00:27:08] But you know,

[00:27:08] I know that I really care,

[00:27:09] but it's the thing of like with AI,

[00:27:11] it's like that on another level.

[00:27:12] You're absolutely right.

[00:27:13] If you're able to do something like take someone's voice and like,

[00:27:19] I get sent memes all the time by the guitarist in my band.

[00:27:22] And they are very funny where it's like someone's ran a Britney Spears song

[00:27:26] through a James Hetfield filter,

[00:27:28] you know,

[00:27:29] and it sounds like Metallica covering,

[00:27:30] you know,

[00:27:31] Toxic or whatever.

[00:27:32] And it sounds funny and it's like,

[00:27:33] but it's very close.

[00:27:35] Like the articulation is pretty much spot on.

[00:27:38] You're like,

[00:27:39] yeah.

[00:27:39] What if somebody was to take this and then like go and use it to make music

[00:27:45] and say,

[00:27:45] here's a new Metallica single.

[00:27:47] Like what,

[00:27:47] what happens there,

[00:27:48] for example?

[00:27:49] Like,

[00:27:49] yeah,

[00:27:50] you're right.

[00:27:50] It's a fascinating one to sort of consider.

[00:27:52] And that,

[00:27:52] that reminded me that this is something that Steven Wilson encountered quite

[00:27:56] recently and posted about.

[00:27:58] And he said,

[00:28:00] you know,

[00:28:00] somebody had generated this song,

[00:28:03] right.

[00:28:04] And they'd basically,

[00:28:06] You all right?

[00:28:10] So,

[00:28:11] yeah,

[00:28:11] fun facts for everyone listening.

[00:28:13] I managed to break a wheel on my chair.

[00:28:16] I broke it making,

[00:28:18] is it,

[00:28:19] is it a jingle?

[00:28:20] I don't know.

[00:28:20] I entered a very bizarre segment of a podcast,

[00:28:24] an episode a few weeks back.

[00:28:25] And I had to throw my chair down to create a desired effect.

[00:28:28] And I broke a wheel.

[00:28:30] I'm literally breaking my furniture to make content.

[00:28:33] I hope people appreciate this.

[00:28:34] Bear with me.

[00:28:35] Please keep talking.

[00:28:36] You're suffering for your art.

[00:28:37] It's,

[00:28:37] it's admirable.

[00:28:39] Exactly.

[00:28:40] AI could never do this.

[00:28:43] Yeah.

[00:28:44] Is that a good thing?

[00:28:44] Is it a bad thing?

[00:28:45] Maybe that's not for us to determine.

[00:28:48] You could say.

[00:28:49] Yeah.

[00:28:51] There we go.

[00:28:51] We're good.

[00:28:52] Apologies.

[00:28:52] Beeson's in you.

[00:28:53] Yeah.

[00:28:53] Steven Wilson posted about this a little while ago. And I found it really interesting because in his post, he said, and I'm paraphrasing all over the place here, but I'll try and be as accurate as memory allows.

[00:29:12] So, somebody had sent him something that somebody had generated along the lines of a Steven Wilson song.

[00:29:20] Right.

[00:29:21] And he listened to it and he said, well, you know, it's not me. Right. But it could be, right?

[00:29:28] If it was somebody else listening to it, he wouldn't be surprised, you know, if it was close, a close enough likeness for them to be fooled.

[00:29:39] And that's using these sorts of technologies.

[00:29:44] He seems to have a pretty bleak view of the future in that regard. From what I gather, you know, I don't want to put words in his mouth.

[00:29:56] I've got a quote if you want it.

[00:29:58] Oh,

[00:29:58] please.

[00:29:58] Yes.

[00:29:59] Well, I just found this, um, off Google. It says: "We're in the midst of a seismic change in the way that music is made and how people engage with it. Do the majority even care that they aren't listening to a human being? The Future Bites indeed. Please let me know your thoughts."

[00:30:13] So that was him being interviewed about it and engaging on a social media post.

[00:30:18] And yeah,

[00:30:19] it's,

[00:30:20] yeah.

[00:30:24] AI's taken over,

[00:30:25] man.

[00:30:25] I did not.

[00:30:26] What is happening?

[00:30:30] That's terrifying.

[00:30:32] There we go.

[00:30:33] Google's listening.

[00:30:36] Google's always listening.

[00:30:36] This is what I don't like about it.

[00:30:38] It's going to just throw that away.

[00:30:40] yeah.

[00:30:42] Oh man,

[00:30:42] this is where the machines turn on me,

[00:30:43] isn't it?

[00:30:44] Well, this is the thing, right? So there are many conveniences that we are fortunate enough to have in our modern day-to-day lives that require conceding some level of privacy, and you can decide what level, right?

[00:31:07] But it's true to say that we're probably at a point where it's getting a little bit scary. And you could decide not to have that one particular convenience.

[00:31:18] Okay.

[00:31:18] So what if you don't, you know, what if you don't want cookies, right? As the simplest form of privacy-preserving example, right?

[00:31:30] So everybody I think is aware of cookies on websites and they're used to

[00:31:33] track what you look at, so that certain websites can function,

[00:31:38] right?

[00:31:38] So that they know that you're logged in or whatever,

[00:31:41] but they're also used to,

[00:31:42] for advertising,

[00:31:43] right?

[00:31:43] And a lot of people disable advertising and stuff like that,

[00:31:46] but there's a lot,

[00:31:46] I think that still runs under the hood and things where you can't be

[00:31:49] bothered and you just accept all and you do all of that.

[00:31:52] Right.

[00:31:53] Um,

[00:31:53] but there's a benefit to that.

[00:31:55] Right.

[00:31:55] And the benefit is that when you go onto Amazon or eBay or whatever,

[00:31:59] you're,

[00:31:59] you're seeing relevant things,

[00:32:01] right?

[00:32:01] Like this has happened to me before where I've seen like some music thing

[00:32:04] that I want and it's popped up because my devices know that I'm

[00:32:08] interested in that thing.

[00:32:09] And otherwise I would have,

[00:32:10] I would have missed an eBay listing or something like that.

[00:32:13] Right.

[00:32:14] Sure.

[00:32:14] Yeah.

[00:32:14] And so that's a convenience that I kind of like,

[00:32:16] but I have to ask myself,

[00:32:19] you know,

[00:32:19] is it worth sacrificing that sort of privacy?

[00:32:22] And so it's kind of the same with the music sort of side of things, right?

[00:32:29] Like it's allowing us to do lots of really cool things,

[00:32:32] but there are negative side effects to that as well.

[00:32:34] Right.

[00:32:34] Like on the privacy front,

[00:32:37] if that data gets into the wrong hands,

[00:32:39] then that can lead to some very tricky situations indeed.

[00:32:44] Right.

[00:32:45] In terms of music tools, right, if that data, if not data, but if those technologies are used to create likenesses of big artists who have worked very hard to establish their careers, right, then all of a sudden that just becomes meaningless.

[00:33:05] That doesn't seem right.

[00:33:07] Yeah.

[00:33:07] Yeah.

[00:33:08] I suppose that's,

[00:33:08] that's the big question.

[00:33:09] Right.

[00:33:09] I mean,

[00:33:10] what do you make of that song?

[00:33:12] Um,

[00:33:13] I don't know.

[00:33:13] It kind of reminds me of when streaming started.

[00:33:16] Okay.

[00:33:16] People were worried,

[00:33:17] you know,

[00:33:18] new technology came around and,

[00:33:22] you know,

[00:33:22] it has shifted the market as it were,

[00:33:25] because people don't really make any money on CDs anymore.

[00:33:29] But speaking for ourselves.

[00:33:31] Yeah.

[00:33:32] Yeah.

[00:33:33] Well,

[00:33:33] you guys and every other artist.

[00:33:35] Exactly.

[00:33:36] Yeah.

[00:33:38] But people still make money playing gigs and selling merch and these kinds of things.

[00:33:42] And I think it's just another kind of another shift that's going to happen.

[00:33:46] I think if you,

[00:33:47] at some point,

[00:33:49] if you resist it,

[00:33:49] you're just going to get left behind.

[00:33:51] Cause it wasn't the other thing,

[00:33:53] but like,

[00:33:53] you know,

[00:33:54] I'm pretty sure Jimi Hendrix wasn't on Spotify for years and things like this.

[00:33:58] Yeah.

[00:33:58] I don't think he had much of a say in it.

[00:34:00] Well,

[00:34:00] no,

[00:34:01] but you know,

[00:34:03] the people who are in that music,

[00:34:03] they didn't want it on there.

[00:34:05] And it's like,

[00:34:05] well,

[00:34:06] then people just don't listen to it.

[00:34:08] Yeah.

[00:34:08] And if you just,

[00:34:09] if you're getting left behind,

[00:34:10] but,

[00:34:11] um,

[00:34:12] on the flip side,

[00:34:12] like people always want, I think, for music,

[00:34:16] you know,

[00:34:16] especially for the kind of things that we listen to.

[00:34:18] We want to see it.

[00:34:20] You don't,

[00:34:21] it's one thing you can listen at home and listen to whatever,

[00:34:24] but I like watching it live.

[00:34:26] And this goes back to it being a human thing.

[00:34:32] I want to see it and hear it and experience it.

[00:34:34] I don't,

[00:34:35] and I don't think it,

[00:34:36] can you recreate that?

[00:34:38] Well,

[00:34:39] not currently,

[00:34:40] but maybe one day.

[00:34:42] So,

[00:34:43] so is live music becoming more of a subculture?

[00:34:51] Hmm.

[00:34:51] Yeah.

[00:34:52] I suppose that's the sort of a logical question,

[00:34:55] isn't it?

[00:34:55] Yeah.

[00:34:55] Off the back of all of this.

[00:34:57] I think a lot,

[00:34:58] personally,

[00:34:58] I don't know if you guys,

[00:34:59] my kind of take is like,

[00:35:00] it goes back to what you were saying at the start,

[00:35:02] right?

[00:35:03] About,

[00:35:03] I guess it's intentionality.

[00:35:05] You know,

[00:35:06] when you sit down to create anything,

[00:35:08] like even this conversation right now,

[00:35:10] like what's the purpose?

[00:35:11] It's three people that want to connect and talk about something they're

[00:35:14] interested in and share it with an audience.

[00:35:16] Right.

[00:35:17] Is a machine going to think to do that?

[00:35:20] Not really.

[00:35:21] Like you can feed it the data and say,

[00:35:23] okay,

[00:35:23] I want to make a podcast about this.

[00:35:26] And this is the kind of topic.

[00:35:27] And these are the types of guests.

[00:35:29] I'm sure that you,

[00:35:30] if you could,

[00:35:31] you could theoretically like come up with that stuff.

[00:35:34] Right.

[00:35:34] I mean,

[00:35:34] I've seen that you can do that.

[00:35:36] Yeah.

[00:35:36] I suppose that's it.

[00:35:37] Right.

[00:35:37] Like it is possible,

[00:35:39] but the end result,

[00:35:40] and this always comes back to the thing.

[00:35:41] And it reminds me of the video you guys sent me,

[00:35:44] which I'm sure we'll get onto in a minute.

[00:35:45] The sort of the adverts, the stuff that comes up, the music, anything that's created with

[00:35:50] AI, and the fact that it's always just off.

[00:35:54] Right.

[00:35:54] And I think as human beings,

[00:35:55] because we just know it's,

[00:35:57] it's the uncanny valley effect.

[00:35:59] And I think it is that thing of why,

[00:36:02] why,

[00:36:02] why create something?

[00:36:04] It's because you have something to say and you want to connect with another

[00:36:07] person over it.

[00:36:09] And a machine just can't do that because it doesn't have the capacity to.

[00:36:13] So I agree with you.

[00:36:14] I think when it comes to music and live music in particular,

[00:36:17] I think,

[00:36:18] yeah,

[00:36:18] it probably will become more of a sort of subculture.

[00:36:20] It'll become more important,

[00:36:21] I think, is the other thing as well,

[00:36:23] because,

[00:36:23] you know,

[00:36:24] you'll be listening to music and going,

[00:36:25] is this even real?

[00:36:26] I don't know.

[00:36:27] Oh,

[00:36:28] they're on tour.

[00:36:28] Okay.

[00:36:29] This is a real thing.

[00:36:30] I'm going to go to it.

[00:36:31] Unless surprise,

[00:36:32] you turn up and it's all tracks and there's no one there.

[00:36:34] In that case,

[00:36:35] we're all dead.

[00:36:36] Like that's it.

[00:36:38] Game over.

[00:36:39] You just described one of our gigs.

[00:36:42] So, um, I think the problem here is that AI isn't inherently anything. It isn't inherently evil. It isn't inherently good.

[00:36:53] The issue with it is the same issue that we have with all technology, which is, it's the people behind it that are the problem.

[00:37:01] And it's,

[00:37:02] you know,

[00:37:03] we talk about it as a tool,

[00:37:04] but is it a tool to help a musician,

[00:37:06] a musician who has studied hard to,

[00:37:09] and to develop skills in their instrument,

[00:37:12] worked hard to network within their scene and build a name for themselves within,

[00:37:17] you know,

[00:37:18] as an artist.

[00:37:19] Right.

[00:37:19] Um,

[00:37:20] because I think we can all agree that that's fine.

[00:37:22] Um,

[00:37:23] but it could also be a tool for somebody who,

[00:37:26] you know,

[00:37:27] wants to be a musician and doesn't want to put in the effort and just generates content.

[00:37:33] And it does have a message.

[00:37:35] It does have a human,

[00:37:36] there's still an element of the human experience because they're still guiding the algorithm to generate what they want.

[00:37:43] Right.

[00:37:44] But I'm quite concerned about that because that feels to me like there's a,

[00:37:49] there's a whole,

[00:37:50] you know,

[00:37:51] a whole world of experience missing from that.

[00:37:55] Like when I listen to music,

[00:37:57] I want to be listening to music that doesn't just reflect the ideas that the musicians have

[00:38:04] in composing that piece of music.

[00:38:06] Right.

[00:38:06] Cause you could argue that that,

[00:38:07] you know,

[00:38:08] you don't need a lot of skill to come up with ideas.

[00:38:11] You need to have vision.

[00:38:12] Vision's really important.

[00:38:13] That's part of the equation.

[00:38:14] Right.

[00:38:15] Um,

[00:38:17] but I like to know that these are people who have also,

[00:38:20] you know,

[00:38:20] worked hard at their craft in order to be able to deliver that vision,

[00:38:25] you know,

[00:38:26] through human hands.

[00:38:28] Right.

[00:38:28] I completely agree.

[00:38:29] Yeah.

[00:38:30] yeah.

[00:38:30] And so I think that's,

[00:38:31] that's the thing for me.

[00:38:32] That's kind of where the line is.

[00:38:33] It's like,

[00:38:34] if there's still a lot of,

[00:38:35] a lot of genuine effort as an artist,

[00:38:38] whether,

[00:38:38] you know,

[00:38:39] whether maybe it's visual art,

[00:38:40] maybe it's music,

[00:38:41] maybe,

[00:38:41] you know,

[00:38:41] it's videography,

[00:38:42] whatever.

[00:38:42] Right.

[00:38:43] If there's a lot of genuine effort and graft going into it,

[00:38:49] then I think it's okay to use these sorts of tools.

[00:38:53] Hmm.

[00:38:56] Yeah.

[00:38:56] Yeah.

[00:38:57] I'd agree with that.

[00:38:58] Tom,

[00:38:58] you're looking very thoughtful.

[00:39:01] Yeah.

[00:39:02] I,

[00:39:03] I think you've nailed it really.

[00:39:05] As long as I think,

[00:39:07] you know,

[00:39:08] what,

[00:39:09] so what's better.

[00:39:10] Someone,

[00:39:11] you know,

[00:39:12] using an AI tool to write a song,

[00:39:14] you know,

[00:39:15] that may not be the best musician or,

[00:39:17] you know,

[00:39:18] it might,

[00:39:18] the song that comes out might be really well produced.

[00:39:21] You might not be able to tell it's AI or someone who can play three chords

[00:39:25] on a guitar and shout. What,

[00:39:28] what connects to you?

[00:39:29] What connects more to you?

[00:39:31] I would say the second one,

[00:39:32] because it's,

[00:39:33] you know,

[00:39:33] someone actually out there doing it.

[00:39:37] I don't know.

[00:39:38] That's my kind of,

[00:39:39] all I can think of for that analogy is somebody playing Wonderwall.

[00:39:43] And so you've really biased me towards,

[00:39:45] towards the former.

[00:39:49] Yeah,

[00:39:50] but it's been trained on Wonderwall.

[00:39:56] Oh man,

[00:39:57] wouldn't that be the plot twist of the century?

[00:39:59] You go to the Oasis gigs next year and it's just AIs and holograms.

[00:40:05] They don't even show up for the gig.

[00:40:07] That would be amazing.

[00:40:08] I wouldn't be,

[00:40:09] I wouldn't be surprised if their,

[00:40:11] their agents and stuff have that as a backup given their history.

[00:40:14] Maybe that's why it was so expensive.

[00:40:16] It's funding new technology.

[00:40:18] Yeah.

[00:40:19] I mean,

[00:40:19] it's,

[00:40:20] you know,

[00:40:20] I believe in funding innovation.

[00:40:22] Excellent.

[00:40:23] Mm-hmm.

[00:40:24] Funding laziness.

[00:40:27] Right.

[00:40:28] You've got to think of the researchers behind it.

[00:40:29] They're getting paid.

[00:40:30] Yeah.

[00:40:31] They're getting paid.

[00:40:32] There you go.

[00:40:33] Isn't,

[00:40:33] isn't all technological advancement in some way funding laziness or working towards

[00:40:38] laziness?

[00:40:40] That's the end goal.

[00:40:41] I mean,

[00:40:43] yeah,

[00:40:44] maybe,

[00:40:45] maybe,

[00:40:45] but then also I think it's just making things easier.

[00:40:48] I mean,

[00:40:49] I go back to like,

[00:40:50] again,

[00:40:50] the amp sim thing for me is like my sort of go-to of this,

[00:40:52] of,

[00:40:53] you know,

[00:40:53] I,

[00:40:54] I couldn't afford a studio's worth of cabinets and amplifiers,

[00:41:00] but if I can pay for a software that has figured out how those sound and I can pick

[00:41:06] different settings and go through and recreate that amazing,

[00:41:10] you know,

[00:41:10] and I'm able to get sounds and tones that just would be physically impossible

[00:41:15] otherwise, without going into severe debt.

[00:41:17] So like there's,

[00:41:18] there's that aspect of it.

[00:41:20] But,

[00:41:20] and I guess the extreme side of it, what we're talking about here, is you

[00:41:24] then take that, feed it through a machine, and it just writes the entire song for

[00:41:28] you.

[00:41:29] And you don't even just,

[00:41:29] you don't even bother using the tool anymore.

[00:41:31] You're just running it all through and going,

[00:41:33] yeah,

[00:41:33] that'll do.
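
For a hint of what an amp sim is doing under the hood, here's a toy waveshaper: tanh soft clipping, the classic first ingredient of a distortion model. Real amp sims layer filters, tone stacks and, increasingly, neural networks trained on a specific amp's input/output behaviour on top of stages like this; the "guitar note" below is just a synthesised sine wave for illustration.

import numpy as np

def soft_clip(signal: np.ndarray, drive: float = 5.0) -> np.ndarray:
    """Tanh soft clipping; `drive` controls how hard the signal is pushed into saturation."""
    return np.tanh(drive * signal) / np.tanh(drive)

t = np.linspace(0, 1, 48000)                 # one second at 48 kHz
note = 0.5 * np.sin(2 * np.pi * 110 * t)     # a clean 110 Hz tone, peaking at 0.5
driven = soft_clip(note)
print(driven.max())   # ~0.99: the peaks get squashed flat, which adds harmonics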

[00:41:34] Well,

[00:41:35] this is the thing because I think all musicians out there,

[00:41:39] right.

[00:41:41] They, well, not all musicians, some musicians are fortunate enough to have, you know, very healthy careers as musicians.

[00:41:52] But a lot of us have to fund being a musician,

[00:41:56] right?

[00:41:56] We have to do,

[00:41:57] do things that,

[00:41:58] that pay the bills.

[00:42:00] Um,

[00:42:01] and a lot of musicians do that by creating content for ads,

[00:42:07] TV.

[00:42:07] I mean,

[00:42:08] obviously like writing for TV and film and stuff is its own thing as well.

[00:42:12] And that's great.

[00:42:12] But there are a lot of people who do,

[00:42:14] you know,

[00:42:14] sort of smaller scale things.

[00:42:16] Actually, my neighbor, as it happens, spent the majority of his career, he's 70 now, writing music for TV. And the way he got ahead there is that he was an early adopter of the Fairlight. He was able to do things with MIDI composition that, prior to that, would have needed a bunch of musicians in a room. He was making use of the technology of the time to make himself incredibly competitive: rather than going to a composer who's going to need to get an orchestra, you could just go to Paul, and Paul's going to do it all himself and charge you what, 5% of what you'd be paying otherwise?

[00:43:22] And that was lucrative for him then, but now everybody's doing that, right? I'm sure we've all written things where we'd just go, yeah, upload this to whatever music service you're using, Sentric or whatever, and see if people pick it up. For some musicians, that's how they fund being able to do the music they love: by writing some music that they don't necessarily love. And now these technologies are taking that option away, or potentially taking it away, and that can be quite damaging.

[00:43:59] Yeah. Sorry, that was a bit long; I talked a lot.

[00:44:04] No, it's a podcast, that's not a problem. The problem is the opposite. Sorry, that wasn't a dig, Tom.

[00:44:16] I mean, yeah. Genuinely, while we're here, Tom: what do you make of that? Because I feel like this is the thing people are really concerned about, right? This is what we're circling: using the tools. But then, as you said, what happens when it swallows up potential work for people? That feels like, again, a very difficult one to grapple with.

[00:44:39] Yeah. I just don't think it's a new problem. I think it happens everywhere, all the time. When computers came around, you didn't need people in typing pools writing letters; you could just press print and the printer printed it out. Those people would get different jobs. Not to be too dismissive, but I think there'll be new things people can do that we don't even know of yet. It is difficult to see what those might be, but they will come around.

[00:45:18] Yeah. I feel like it's this way every time new technology comes around. Like I said with computers: when they came around, suddenly you needed people who could fix computers, and things like that.

[00:45:29] Yeah, it's true.

[00:45:31] I suppose that's it, right? You've got to adapt. And as a creative... tell me what you guys think of this. My opinion has always been that you're self-employed if you're a creative, full stop, and whatever that looks like is up to you. Like yourselves, I'm the same: I have a day job, like most creatives do. The ones that make a career out of it have to build that up and find ways to fund it. Podcasting is a good example: most of the people I know now who are very successful with it run ads on their shows. There was a lot of pushback on that when it first started, whereas now you just accept that's how most people pay the bills. Yeah, you can have a Patreon, you can have supporters, and for some people that pays off; for others it doesn't really work and they have to go with advertising or sponsors. Some of them are YouTubers, right? Or look at musicians: how many of them are streamers now, or spend time on TikTok or Twitch, just interacting with fans and getting that engagement? But that's the point. They have to treat it with a bit of entrepreneurial spirit and a problem-solving mindset: okay, album sales, for example, don't make the money they used to make, so I can either complain about it or think, right, how do I adapt to this? And maybe that's the solution here too, like you said: you can see the problem and think, okay, but how do I adapt? Because there's nothing you can do about it; the technology is here, and it's only going to get stronger as time goes on. So I like what you're suggesting: figure out how to use it to your advantage, maybe, or just pivot to something else.

[00:47:13] Yeah.

[00:47:13] I mean,

[00:47:14] I think there's an element here of what does it mean to the fans?

[00:47:19] And I'd be really interested in that because obviously,

[00:47:21] you know,

[00:47:22] you speak to lots of people about things that they're really passionate

[00:47:24] about and music in general.

[00:47:29] Like I,

[00:47:30] I,

[00:47:30] as a,

[00:47:30] as a fan of music,

[00:47:31] I try and be really supportive.

[00:47:33] Right.

[00:47:33] So I support a few Patreons and I buy,

[00:47:37] like,

[00:47:38] I always buy merch,

[00:47:39] which my wife has a go at me for,

[00:47:41] cause I've got so many t-shirts,

[00:47:42] but like the fact is,

[00:47:44] it's like,

[00:47:44] I go there,

[00:47:44] I see a band and I want to support them.

[00:47:46] I,

[00:47:47] you know,

[00:47:47] I want to chip in and I know,

[00:47:48] you know,

[00:47:49] from,

[00:47:49] from how it works for us,

[00:47:51] that merch is,

[00:47:51] is,

[00:47:52] is one of the best ways of doing that.

[00:47:53] Right.

[00:47:54] There you go.

[00:47:54] Like we,

[00:47:56] we often,

[00:47:58] we often break even on ticket sales now,

[00:48:00] I think,

[00:48:01] which is a thing in itself.

[00:48:04] It's quite nice.

[00:48:08] but we,

[00:48:09] when we make money,

[00:48:10] it's from much.

[00:48:11] Yeah.

[00:48:13] And,

[00:48:13] and so I,

[00:48:15] I always like to buy merch and say,

[00:48:16] I want to support people,

[00:48:18] but what,

[00:48:18] you know,

[00:48:19] what does that mean for,

[00:48:21] for,

[00:48:21] for,

[00:48:23] other fans?

[00:48:24] How do we,

[00:48:24] how do we help to,

[00:48:25] um,

[00:48:27] kind of.

[00:48:29] Fuel that culture.

[00:48:30] How do we create this healthy ecology around music and,

[00:48:36] and the different music scenes?

[00:48:38] I think we're,

[00:48:38] we're,

[00:48:38] we're really fortunate to be in a really,

[00:48:40] really lovely scene with,

[00:48:42] you know,

[00:48:42] it's,

[00:48:43] it's incredibly supportive and people are happy to,

[00:48:45] um,

[00:48:46] and buy a,

[00:48:46] buy a vinyl and t-shirt and stuff like that.

[00:48:48] But I know that in other scenes,

[00:48:50] right.

[00:48:50] In other genres of music,

[00:48:51] it's very,

[00:48:52] very different.

[00:48:53] Um,

[00:48:54] right.

[00:48:54] So yeah,

[00:48:55] I don't know.

[00:48:55] I'm just interested,

[00:48:56] like from the people you've spoken to,

[00:48:58] what,

[00:48:59] how,

[00:48:59] you know,

[00:49:00] what are your concerns?

[00:49:01] How do you think people view it?

[00:49:02] And do you think people could ever be fans of AI?

[00:49:06] That's the thing. I do genuinely wonder about this, and for me it goes back to what I was saying about connectivity. I can hear the AI-generated songs, I can watch the ads and the videos, and I can go, oh yeah, that's quite cool, what you can do. But I don't feel anything when I watch it, partly because I know it's computer-generated. And even if you don't know, it has a shelf life. Say we're talking about music: you hear a song on the radio and it turns out it's all AI, everything composed by it. You might really like the song. You think, do you know what, that's a pretty good song, it's well written, it's catchy, blah, blah, blah. But that's the end of the experience, because what are you going to get from it? Nothing. You can't then say, I'm going to go to the show, I'm going to buy some merch. Now, I'm sure someone out there could find a way to make that happen: maybe do the ABBA Voyage thing of, we'll create a hologram and we'll sell t-shirts. Sure, you could do that, and I'd put money on there being someone out there who would do it just for the hell of it. But it's going to be a pretty meaningless, empty kind of thing. It might be fun as a novelty for some people, but it's not the same as doing what we did this year: going to a festival, going to gigs, connecting with people. You are not going to get that experience from the technology, and it's not designed for that. That's the thing, ultimately, and it goes back to the start of our conversation: what's the purpose of this stuff? It's problem solving, it's maybe using it as a tool, it's giving you ways to shortcut things from time to time. All of that's completely fine, but it's not the same as connecting with another person, expressing yourself, telling a story. That's the bottom line: a machine just cannot do that. The day that it can, that's a different conversation, and I'll be having it from a bunker.

[00:51:06] Well, this is the thing, right? Lots of people, including some senior researchers that I know, have legitimately asked the question: is ChatGPT conscious?

[00:51:22] Good question. And just to answer this for everybody right now: no. No, it's definitely not.

[00:51:31] I've never used it, but yeah, I've got that impression.

[00:51:34] What it is, is convincing. And that's what you need with any kind of generative content: you need it to be convincing.

[00:51:47] So I think you raise a good point, and I personally completely agree. I think the human experience is fundamental to all forms of art; I don't think you can have art without it. Otherwise it's meaningless. It's just patterns and shapes, it's sounds, and it doesn't mean anything. But what if you can be convinced that it is, in fact, art? Because that can be done, right? Somebody could create an elaborate sort of account. It's like we're going through the expanding-brain meme, whatever that's called. Is that just another form of art? Does that open a pathway to a new form of expression, where you're creating this fabricated persona, which has fabricated music and fabricated artwork, all generated by AI? There's no human experience behind the individual pieces of content, but there's something behind the creation in itself.

[00:52:59] I don't know. Yeah, I don't know. I shouldn't have asked the question.

[00:53:04] There was a thing going around recently about a guy who, I think, had some AI-generated artwork in an art gallery. He'd done something like 119 different prompts and combined them all together to get to the end point, and as a task in itself I thought that was quite interesting; as a piece of art, I agreed it was probably art. But people were saying, and I think this is right, that if something's AI-generated, you can't copyright it.

[00:53:35] Oh, that's a difficult one.

[00:53:37] And because it couldn't be copyrighted, people just stole his artwork. You know what people are like; they were like, ha ha, now you know how it feels.

[00:53:48] Yeah. I don't necessarily know that that's a hundred percent true. I think you can copyright it, but I don't know for certain. That's a really, really good question.

[00:53:58] I'm under the impression that you can't.

[00:54:00] Okay. Should I ask ChatGPT?

[00:54:03] Yes. It'll just be like: "No. Copyrighted."

[00:54:05] "Can I copyright AI-generated art?" This is the first time I've ever used this, by the way, so this is going to be really interesting.

[00:54:19] Welcome to the future. Please do not be afraid. Simply ask me anything. I promise I will not use this data to your disadvantage.

[00:54:33] Okay.

[00:55:28] So, yeah, it has said that this is a complex issue, yada yada: US copyright law. Apparently, in the United States, the US Copyright Office has stated that works created by AI without human involvement cannot be copyrighted.

[00:55:43] Without human involvement.

[00:55:45] Yes, that's the key phrase, I think. The key requirement is that the work must have a human author, and AI tools are not considered legal authors. So I think that's the thing. And it says here: if the human plays a significant role, such as providing instructions, selecting and refining AI-generated content, or making substantial creative decisions, then the resulting work may be eligible for copyright protection. So in terms of the example you just described, in a court of law you could argue that he played a significant role, because he put all the prompts in, cut out the bits he thought weren't working, put together the bits that were working, and then created his piece.

[00:56:27] Yeah. And I thought that, as a task in itself, it was actually quite a creative process. He's not a painter or an illustrator, but he definitely had a directing role in the whole artwork, as it were.

[00:56:43] Yeah, that's interesting, and I agree. There's lots of artwork that is all about process rather than product.

[00:56:53] True. I'm not necessarily a fan of a lot of the process artwork; I'm more a fan of the product artwork. I guess that makes me kind of shallow or something.

[00:57:04] So for me it kind of has to be both, or at least it has to be intention and product. And the product element, for me, comes back to that thing of grafting: really demonstrating mastery over a craft. That's something I personally value, but it's not necessarily something everybody values. There are lots of people out there who enjoy minimalist artwork, where you could argue, okay, it didn't take as much skill. But it still took vision. There's still a human element, a significant human element, a lot of the time. It still has value, and a lot of it has significant value.

[00:57:51] That's true. I a hundred percent agree with that. And again, it's just that thing of: do you connect with it? What's the intentionality? If you give people credit, you can pretty much tell straight away, right? No matter how good it looks, no matter what it is, if that bit is missing, it's not going to move you in any way, is it?

[00:58:21] I think it's an interesting experiment, though, definitely, to play around with this stuff. It is just fascinating to me, where it all goes.

[00:58:32] I just thought of an example, one major application we saw this year. I don't know if you saw the... what was it called? I've just got it here: the Willy Wonka Experience.

[00:58:42] Oh yeah, up in Glasgow. I'm sure you guys were all over that when it came out on the news.

[00:58:47] I am totally ignorant of this.

[00:58:50] Oh, okay. This is incredible. Look this up on YouTube.

[00:58:54] Basically, there's a guy out there called Billy Coull whose entire thing, and I did a deep dive on him because I was fascinated by this dude, seems to be using AI to run multiple businesses and write books as well. You can find these books on Amazon. They're terrible, because they're entirely written by AI. And it's not just that it's written or created by AI; it's obviously created by AI by a guy who doesn't understand how AI works. He's just seen it as a shortcut to making money. Long story short, he created a website, got a bunch of imagery that was very reminiscent of Charlie and the Chocolate Factory, and then sold tickets to an experience. And when people turned up, it was terrible: just a warehouse with stuff scattered around. It's very funny, but it's also not, because poor people were scammed.

[00:59:53] But I went on a dive into his website and what he does, and it's kind of mad. I just thought it was an interesting application of the technology, in that this guy has clearly seen an opportunity here. But again, it goes back to the whole thing of intentionality. You go through his website and all of it is very copy-paste in terms of what it is. He tries to sell courses and books and all the rest of it, and it all looks and feels the same. And that's the thing: you might get pulled in initially, but within a few minutes you're like, oh, okay, I see what this is. I don't know about you guys, but that's the feeling I get whenever I see or hear stuff like this now. Even if you're caught in it for a minute or two, it only takes a second to go: oh, wait, okay, I see what this is now. There's not a person behind this.

[01:00:52] So this brings us on to this: before the episode, I sent you a link.

[01:00:55] You did?

[01:00:57] Because it's December, or it will be when this goes out, presumably. It's the Coca-Cola Christmas advert, which this year is AI-generated.

[01:01:08] And it's terrifying. The first time I watched it, I was eating my dinner, so I wasn't paying attention, and I didn't notice. I imagine a lot of people don't notice, because they're just not really paying attention. But if you actually watch it properly... I recommend everyone go onto Google or YouTube and search for the Coca-Cola AI Christmas advert.

[01:01:28] I might even chuck a link in the show notes for this, because I agree: I think it's worth watching.

[01:01:33] Absolutely. And when you actually watch it, it's just like...

[01:01:36] Well, wait, wait, wait. If people are listening to this, or watching it, I think you should pause.

[01:01:42] Yes. Go and watch it and then come back.

[01:01:46] Do I have a jingle for that? I don't know that I do.

[01:01:49] You can AI-generate one.

[01:01:51] Oh, that's scary.

[01:01:53] Right. Not to peek too far behind the curtain of the future, but my brain has been firing off ideas involving AI for this episode. So stay tuned, folks. I do have a break jingle that I could just play here. Yeah, maybe I'll do that, because I haven't really played it at all this year.

[01:02:11] It's time for a break. Why not check out our merchandise or leave a five-star review? Recommend the show to a friend. You can also donate to the podcast. Don't forget to subscribe. Now let's get back to the episode.

[01:02:26] Anyway, moving on swiftly before I get distracted. Yes, I agree, folks. I'm assuming you've all listened to Matt's brilliant advice here and gone and watched it. Continue.

[01:02:39] It's so weird. It's so weird. When I first saw it, I wasn't really paying attention, and it just seemed like the same advert they put out every year.

[01:02:49] Yeah. I'm literally watching it again now, and you just look at all the features and it's like: wait, that's not quite right. It's haunting. Why are the wheels weird? What's wrong with that person's hand? Why is that person's face weird?

[01:03:04] Do you know what? That's the biggest thing: it's the human features, right? That's consistent with this kind of stuff.

[01:03:10] So one of the things AI is really bad at is counting.

[01:03:15] Ah, interesting. You're still in a job, Tom.

[01:03:18] Yeah, absolutely. I have a PhD in maths, so...

[01:03:22] Phew.

[01:03:23] But yeah, that's why you get six fingers, and teeth that look really weird.

[01:03:28] Yeah. Do you know what it was? It was the faces for me, when they smiled. It just didn't quite get it. It was stuff like that where, again, it's the uncanny valley: that bit of your brain that goes, no, something's off, something's off.

[01:03:42] A lot of this is a local structure problem. What it tends to be quite good at is global structure: having an image that looks convincing when you look at it the first time, or even if you blur it, because blurring is really looking at the global structure. And that's actually how a lot of these models are trained and how they tend to generate these images: they'll generate some global structure and then they'll refine it. If you've ever generated something with Midjourney, it visualizes this really nicely for you, so you can see how the image is being generated. It starts like an artist throwing some paint onto a canvas, to give some impression of what it might turn into, and then you see all these details coming in, right?

[01:04:31] Yes.
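
[Editor's note: to make the coarse-to-fine idea concrete, here is a minimal Python toy. It is purely illustrative, not how Midjourney or any real diffusion model is implemented: it commits to a blurry global layout first, then adds progressively smaller detail at finer scales.]

    # Toy coarse-to-fine generation: blurry global structure first,
    # then progressively finer (and smaller) detail on top.
    import numpy as np

    rng = np.random.default_rng(0)

    def upsample(img, factor=2):
        # Nearest-neighbour upsampling: each pixel becomes a factor x factor block.
        return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

    def coarse_to_fine(size=32):
        img = rng.normal(size=(1, 1))  # a single "global" value
        while img.shape[0] < size:
            img = upsample(img)        # commit to the coarse layout
            # Finer scales contribute smaller corrections.
            img = img + rng.normal(scale=1.0 / img.shape[0], size=img.shape)
        return img

    image = coarse_to_fine()
    print(image.shape)  # (32, 32): layout decided early, detail added late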

[01:04:32] So you end up with these weird things where the local structure is wrong: you have the wrong number of fingers, or some weird-looking teeth, where some of the teeth are a uniform size and some of them are strange. But it's getting better at that. The models we're using are developing: new forms of transformers. Those are the sorts of models that we use for this. I say "we"; I'm speaking for the entire AI community here, apparently.

[01:04:58] Yes, yes.

[01:05:00] That's the sort of model that AI researchers and developers use, because those are the ones capable of digesting enormous amounts of information in order to learn about the structure of what it is you want to generate. They're getting better at understanding local and global structure, and at how to create the right local structure given the global information. So yeah, that's something that will improve.
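
[Editor's note: for anyone who wants a peek under the hood, the heart of a transformer is self-attention, which lets every position in a sequence weigh information from every other position; that is one reason these models can relate global context to local detail. A minimal single-head numpy sketch, illustrative only, not the specific models discussed here:]

    # Scaled dot-product self-attention, the core transformer operation.
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(X, Wq, Wk, Wv):
        Q, K, V = X @ Wq, X @ Wk, X @ Wv         # queries, keys, values
        scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise similarities
        weights = softmax(scores, axis=-1)       # attention distribution
        return weights @ V                       # globally mixed output

    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 8))                  # 6 tokens, 8-dim embeddings
    Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
    print(self_attention(X, Wq, Wk, Wv).shape)   # (6, 8)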

[01:05:28] We've already seen it. We're seeing more and more AI-generated images with five fingers, four fingers and a thumb, right?

[01:05:38] So they're getting better, and there's a lot of stuff there where I feel like it'll happen. I've been close to the work; I read AI research papers all the time, it's part of my job, and I've been following the developments. But I've always been skeptical when something big comes along.

[01:06:04] Yes.

[01:06:05] I'm always like: let's not jump the gun, right? Let's not jump on the hype train. Let's sit back. I tend to be a bit more pessimistic, maybe, about how far along it is.

[01:06:18] I suppose that's like any technology, isn't it? Especially if it's a new concept, you need time to test it out and see if it works properly first, not just: oh, brilliant, everyone do this.

[01:06:28] But my pessimism used to be closer to realism, and now it really is pessimism. It's a little bit scary how quickly things are developing right now. So I don't know how long it will be before that Coca-Cola advert looks, you know, real.

[01:06:44] It's when they get the John Lewis ad that we're in trouble.

[01:06:47] When it makes you cry... exactly.

[01:06:50] I mean, it makes me cry now, but for different reasons.

[01:06:56] Yeah. Sorry, Tom, I kind of cut you off. Keep talking; I enjoy you talking about the creepiness. What made you really freak out?

[01:07:03] I mean, a lot of the lorries. It looks like they're sliding along the ice; maybe it's really dangerous. One thing I will point out is that the Coca-Cola logo has obviously been edited in afterwards, because it would not generate things like that.

[01:07:20] Well, in the information about it, they say it's created using generative AI. So they're using it as a tool, right? They're using it in the sort of context we've been talking about. There's still a director behind it, and there'd be a colorist and all these kinds of things. It's not like they've just typed "Coca-Cola advert", and here you go.

[01:07:48] But in the second link, the Coca-Cola "Silver Santa" one, there's a bit with some squirrels.

[01:07:54] That was so weird.

[01:07:55] Okay, go on.

[01:07:57] Their heads are really close together, and if you weren't paying attention you'd just think, oh, it's two squirrels next to each other. Then they kind of move apart and their faces are like... yeah.

[01:08:09] The other thing with the squirrels that really got me is that there's a weird disconnection between the visuals and the audio. Obviously the audio has just been done by somebody, a Foley artist or whatever; a sound designer has come in and done that. But it's weird, because how do you create audio for this strange generated video content? It doesn't look right. All those little details, all that local structure stuff I was talking about, are just off. So when you hear the audio that corresponds to the visual information, for me it's super uncanny valley, because it's like: oh, this weird mutant AI-generated thing is making real sounds. I can totally believe it when I watch a well-drawn cartoon or whatever. My brain is totally fine with that, because everything is consistent within that context; it's contextually consistent. But with this, it shifts around, and like Tom was saying, their faces morph and there's all sorts of weird stuff. Maybe it would make more sense to me if it was making weird morphing noises as well.

[01:09:33] Yeah.

[01:09:35] They should have had me.

[01:09:39] So yeah, it's weird.

[01:09:41] It is. Yeah. As an advert, though, we're talking about it; clearly it's working.

[01:09:48] That's the thing. That's the gimmick they're going for.

[01:09:51] Yeah, that's the thing: it gets attention. And also, I don't know about you, but I'm not surprised that big corporations are the first people to take a swing at this kind of thing and use it.

[01:10:05] Well, it's just what you expect.

[01:10:07] I mean, Disney, for example, got in a lot of hot water when they used AI to generate, I think it was for Secret Invasion, the opening credits for that show.

[01:10:17] That's right. Yeah, it was all AI-generated.

[01:10:20] And I remember watching that show, much to my chagrin. It was a waste of time; don't bother if you haven't.

[01:10:26] Nope. Nope. I like it. I like it.

[01:10:29] Okay. All right. Listen, I'm up for everyone's opinion, that's fine. I just... I liked it until the end.

[01:10:36] I'm perfectly happy for my opinion to be wrong, but no, I liked it.

[01:10:40] But also, I liked the intro sequence. I've been playing with Midjourney a bit, and I could see it was very much generative, particularly the way they have to morph images into each other so that it kind of makes sense, especially at that time. We're getting better now; there's better video generation now than there was then. And it was interesting. I remember watching it going, okay, I can kind of see this. It was just very bad timing: I think it was right around the strike, and that was kind of the height of these discussions. And also, they clearly used a lot of it for the end fight scene, and it was horrific.

[01:11:19] Oh, that's my problem with it.

[01:11:21] No, I don't know whether they did.

[01:11:24] You really think so? I don't know if it was all AI, but a lot of the technology they used for that was bad.

[01:11:32] Yeah. It looked to me like they just weren't paying their CGI people.

[01:11:38] That's kind of what I was wondering: whether it was that, or whether they just chucked it in and went, yeah, this will do. Either way, not great.

[01:11:45] But yeah, again, it's an interesting thing. Like you said, it sparked up a lot of conversation, and that's the same with this Coca-Cola thing: it gets people talking, it gets people looking at it, it gets people going, oh, I wonder. So there is that kind of conversation to have about it all. But I guess it comes back to what we were saying a minute ago. We can talk about all of this stuff and discuss the ins and outs of it, but how it leaves you feeling is the bottom line. And we've all agreed: yeah, it's unsettling.

[01:12:15] It is.

[01:12:17] That's the feeling you come away with. And I'm worried, because it's unsettling now, but I'm more scared about the future. I'm worried about the time when it isn't unsettling.

[01:12:30] That's the thing; that's where the real ethical concerns start to surface.

[01:12:34] So what do you think is a solution to that, then?

[01:12:37] I think it's building strong communities that appreciate artistic endeavor. How do we do everything that we can to keep people excited about supporting real humans doing real art?

[01:12:57] Interesting.

[01:13:00] I mean, I would go a bit broader than that, really. I think there's going to be a point where you ring the gas company and it's an AI operator, and you could spend your whole day talking to AIs and not talking to a real human. So it's really important that...

[01:13:19] That's already my life.

[01:13:21] Yeah, I was going to say: if you try doing online chats with any company these days, that's pretty much what it is.

[01:13:27] But one day you'll just be on the phone and it will sound like you're talking to a person. So my recommendation, well, not that anyone is asking, but my recommendation would be just to make it clear. I don't think there's a problem with AI as such, but make it clear when it is AI and when it's not. Because you could spend your whole day not talking to a single human being, and that sounds a bit sad. And that's when you run into issues.

[01:13:53] I agree. I think that's it; I think you're right. That's probably the way forward: make it clear.

[01:14:04] And as far as the supporting-people thing goes, I think that'll always be there. Maybe I'm just too much of an optimist, but whenever I've stumbled across this stuff in creativity, in the various avenues we've talked about, it's always a fun novelty. But it's not the same as the real thing. It just isn't. Personally, I don't think anyone, at least as a long-term thing, is going to want to go out and spend time and money investing in something creative that isn't made by a person.

[01:14:39] I hope so.

[01:14:40] It's going to be empty. It might be fun for a few minutes and you might get a kick out of it, but are you going to want to follow that for ten years or more? I don't know.

[01:14:50] It depends. You're talking about a real fan of music there, right? A real fan.

[01:14:55] That's true. Yeah.

[01:14:56] And what about people who just want to create an atmosphere, a mood in their house? Okay, say you're having a romantic dinner in with your partner, and you just want to create a mood. So long as the songs that are playing create the mood you want, does it matter that they are written and recorded and performed by real people? Does that matter?

[01:15:25] Well, I would say, if you want that, it's like: Alexa, play some romantic music that's generated.

[01:15:34] Yeah. But say that's your first date. Then on your wedding day, do you want to be like: oh, let's play the AI song that we generated, for our first dance?

[01:15:43] Yeah. Does it? I don't know. But then it does have meaning, right? It has more meaning, because you've just linked it to your own personal experience.

[01:15:54] Right. But to Tom's point, I don't think it would, because you wouldn't remember it. You're not going to play it again; you'd just go, well, generate me some more.

[01:16:03] But what if you did play it again? What if there's an interface whereby you'd be like: hey, I want this playlist. And then every time there's a song you really like, you say, call this song such-and-such, I'll save this song. Surely it would then just generate songs that sound similar to that, that you might like.

[01:16:23] Yeah. And to piggyback off what you were saying just there as well, Matt: what was your intent with doing that? It wasn't "I want to listen to this"; it's just background noise. So I don't know if you would be paying enough attention, or care enough, to go, oh, save that. Maybe I'm just cynical about that, but I don't know.

[01:16:45] Yeah. I don't think anybody knows the answers to these sorts of things. It's really difficult. Different people engage with... I don't want to call it content; I think content's a really cold word. But if we're talking about this generative AI music, I wouldn't hesitate to call that content.

[01:17:07] Agreed.

[01:17:09] But people engage with multimedia, right? Music, video, TV and film and art, for different reasons, in different ways.

[01:17:22] That's right. I know there are plenty of musicians out there who are very happy to be able to use generative AI to create album covers that look amazing, covers they would never be able to pay an artist to create. And there was a time when I thought that was cool, but that time was before it was widely adopted. Now it's everywhere, and you can see how generic it has become. Maybe we'll escape that hurdle. But I hope that while we're in this kind of valley of imperfection with these tools, we can learn that there is value that these methods will never be able to generate, in terms of human experience, coming back to what we've talked about.

[01:18:16] So, yeah. But that doesn't mean it's a rosy future. I think Steven Wilson is right to be concerned and to be pessimistic to a certain extent, and maybe it's realism. I think it's entirely possible that once you have a technology that is ubiquitous enough for anybody to just go and use it, maybe it's going to have a terrible impact. Think about Spotify. Think about how much musicians get paid for streaming. Well, you know what the next step is: not paying musicians at all.

[01:19:02] Yeah.

[01:19:03] Yeah.

[01:19:04] But again,

[01:19:04] I've,

[01:19:05] maybe I'm just cynical like this,

[01:19:06] but I just kind of,

[01:19:07] again,

[01:19:07] not cynical.

[01:19:08] I think I'm actually optimistic in this degree of like,

[01:19:10] but I think that has a shelf life.

[01:19:12] I hope you're right.

[01:19:14] I think you can only do that for so long. And I'm sure there will be an audience for it. I think about it the same way I think about pop music. And I'm not going to go at pop music here, for anyone listening; I've covered pop bands on this podcast. But I often think about the way some of it gets written. For example, you hire a bunch of session artists, they come in, they write you a great album. That's cool. You play some radio hits. But I've noticed something happening over the last few years. How many songs can you think of from ten years ago where you go, 'yeah, that song', the ones people call all-time classics that they go back to? I think you get to about the 2000s and then there's a sharp drop-off, but you still get ones that pop through. Certain artists come along and manage to break the mould, but that's the thing: they have to break the mould. They have to bring their voice into it enough that you then pay attention to them. And in between, there are a million dime-a-dozen artists who come and go. You might hear one in a quiz one day and go, 'oh yeah, I remember that song', but is it something you connect with? Are you going to go and see that person on tour, buy a t-shirt, do any of the things we discussed for supporting them? No. So if you remove that even further by giving it to a machine, it's going to be one step more removed, and even less memorable. And it becomes what you described a minute ago, Matt: it's content.

[01:20:38] Yeah. I hope that's right.

[01:20:39] And what do you do with content? You have it, and then it goes, and you never think about it again.

[01:20:47] So what I'm always amazed by is people who do things like Patreon. They really commit: 'I like this, I want to give you money.' And I think that's only going to increase, because people know that creative people are going to struggle more and more. And this model of 'no, I like what you do, here's three quid a month or whatever to keep on doing it', I think it's here to stay. Twenty years ago there wasn't this kind of model; you'd maybe have fan clubs. I don't know.

[01:21:19] It's so interesting that you mentioned 20 years ago. Many, many moons ago, almost 20 years ago in fact, I did a university module called Music and Patronage. The purpose of the module was to learn how different musicians over the years made a living through music. And you learn that it's always been hard. It's never been easy. And that's the interesting thing, because a lot of people say, 'oh, Spotify has ruined music', or 'AI-generated music has ruined musicians' lives'. But it's been difficult going all the way back. Mozart and Beethoven are the examples I looked into in particular; I did a little essay or something, and I can't remember much of it now. But back then, for example, the church was one of the main patrons for musicians. So if you were a musician, you would typically be writing or playing for a church. Beethoven, an organist... wait, sorry, not Beethoven, Bach. Yes, a phenomenal organist, who spent a lot of time writing and playing for the church. Mozart was a little bit different, but he had that sort of involvement as well. So that's always been the difficulty: how to make money through music, and how to make money through art more generally. The starving artist isn't a new paradigm. AI hasn't caused that. It's something everybody has been familiar with for hundreds of years, thousands of years, since the dawn of art.

[01:23:20] It's true. It's true. But again, I think it goes back to what we were saying before. It's community, for one thing: connecting with other people. And my personal take is, see yourself as self-employed. Learn about these things. For example, AI comes along. Okay, learn about it. Think: how can I use this to my advantage? Is there something here that can help me? And if the answer is no, if we think it takes away from something, that's fine. Don't worry about it. Just pivot and do something else. But yeah, I agree with you, Tom. I think the biggest thing is absolutely connecting with other people.

[01:23:51] And I agree. Let's say the doomsday scenario happens and we just get flooded with AI content that's very convincing; you find, I don't know, bands and pop-culture podcasts overrun with AI and whatever. Then people like ourselves, who make the things, become more valuable because they're rarer. And people will go out of their way to connect, because what was a fun novelty becomes mundane. It becomes content.

[01:24:24] So it's like boutique music.

[01:24:26] Well, I think about it like this: cinema has kind of had this recently, right? It's a topic that's actually come up on the show recently. So many studios have been throwing obscene amounts of money, hundreds of millions, at big blockbuster things and recognizable IPs. But the intention is that it's content. 'We need stuff for our streaming platforms. We need to build a cinematic universe,' blah, blah, blah. It's not about telling a story. It's not about doing something creative. It's really just 'let's make a thing to keep our shareholders happy.' That's the bottom line. And the trickle-down effect is that when it gets to the audience, guess what? They don't connect with it. Or if they do, it's fun for five minutes and then you get bored of it. How sick are you now of multiverses? I know I am. It was fun for the first few minutes; as a big old nerd, I was the target audience for stuff like this. Now I'm sick to death of it, because the intention is no longer there. It's no longer 'let's bring all these characters together, let's try something new.' It's just, 'oh yeah, another one.' So then when you go out and watch something lower budget, or whatever, those are the things that are now doing really well. Why? Because there's a story, because the people behind it intended to do something different, and it then becomes the new thing. That's just my opinion: I think that's what's going to happen in this field in particular.

[01:25:48] I know that we're getting to time.

[01:25:51] I don't know how much margin there is, but go on, go on.

[01:25:55] When you talk about the issue with everything being the same, I totally agree. You were talking about this idea of multiverses, and I was the same. And now it feels like, okay, we've got a formula and it's made us lots of money, so let's do it again. Now, here's the interesting thing. I spent two and a half years, something like that, working at IBM Research, researching the use of AI methods to improve various types of production pipelines. One of the things in particular was drug discovery.

[01:26:40] Okay. So with drug discovery, you're working with people who have years and years and years of experience in an industry that is very mature. And yet we see so few new drugs. I think everybody's familiar with the issues we have at the moment with antibiotic resistance, and how much we really need new classes of antimicrobial drugs. And the techniques that we worked on weren't what people call 'gen AI' or the generative AI we're used to, the kind that generates text or visuals or audio.

[01:27:28] But it is generative AI, in the proper sense: anything that learns. The technical way of saying this is that anything that learns from a distribution of some data, generalizing its key characteristics such that it can generate new, very plausible data, is generative AI. And this is something that has existed for a very long time. It is a field in AI. It is a group of methods.
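[Editor's note: a minimal sketch of that definition, in Python. A generative method learns the key characteristics of a distribution from observed data, then samples new, plausible points from it. The data, numbers, and library choice are illustrative assumptions, not anything from the episode.]

    # Toy generative model: learn a distribution's key characteristics,
    # then generate new data that is plausible under what was learned.
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for real observations (e.g. one measured feature).
    observed = rng.normal(loc=5.0, scale=2.0, size=1_000)

    # "Learning": generalize the distribution's key characteristics.
    mu, sigma = observed.mean(), observed.std()

    # "Generation": draw new samples that resemble the observed data.
    generated = rng.normal(loc=mu, scale=sigma, size=5)
    print(generated)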

[01:28:06] And this is right; Tom was talking about this earlier. You were talking about your colleague, who is using similar sorts of methods to generate new examples of tumors that are not common, so we don't have a lot of data.

[01:28:21] So in a similar way, we can use different types of AI methods. They don't need to crunch as much data as these big fancy models; they can be much more efficient. But they can learn about a particular domain. That domain might be antimicrobial drugs: you have certain types of microbes and you want to be able to kill them, but you want to ensure that the host doesn't die. That's the trick. Otherwise we can do what Donald Trump said and just inject bleach into our veins or whatever.

[01:29:06] And so the point that I'm getting to here, by a bit of a roundabout way, is that these sorts of methods have enabled us to develop AI and machine-learning methods that propose new combinations of molecules for potential drugs.

[01:29:38] Right.

[01:29:39] Very cool.

[01:29:39] Okay.

[01:29:40] And that's really cool; there's lots of amazing research going on there. But the interesting thing is that those are combinations that the experts, people with 20, 30, 40, 50 years of experience in these fields, tackling problems that are often very, very specific, haven't thought of. These are out of the box for them. It's something they haven't been exposed to.

[01:30:07] So if we come back to your point about this kind of recycling: okay, everybody's going to get used to AI content, because it's going to have particular characteristics, a sort of fingerprint that we get used to, and then it's just recycled again and again, and we get sick of it. Well, there are actually techniques that can help make sure that doesn't happen and keep AI output pretty convincing and engaging. And that's something to bear in mind. It's a mature field.

[01:30:39] Um,

[01:30:40] a lot of that lies within the field of optimization,

[01:30:43] um,

[01:30:44] where,

[01:30:45] you know,

[01:30:45] we,

[01:30:45] we look at generating new solutions to difficult problems.

[01:30:49] Um,

[01:30:50] and so if you incorporate some of these techniques,

[01:30:53] if you combine those with these big fancy multimedia models,

[01:31:00] you could end up with some,

[01:31:01] you know,

[01:31:04] Disney plus just generating all of its content for decades to come.

[01:31:09] Right.

[01:31:10] And that is,

[01:31:12] that is plausible and disturbing.
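[Editor's note: one optimization-flavoured idea along these lines is novelty search, roughly in the spirit of Lehman and Stanley's work: generate candidates, score each by how far it sits from what has already been produced, and keep the most novel. The sketch below is an illustrative assumption, not a method the guests described.]

    # Toy novelty search: prefer candidates far from previously
    # produced outputs, so generation doesn't keep recycling itself.
    import numpy as np

    rng = np.random.default_rng(1)

    # Past outputs, represented as feature vectors.
    archive = [rng.normal(size=8) for _ in range(20)]

    def novelty(candidate, archive, k=5):
        # Mean distance to the k nearest previously produced outputs.
        dists = sorted(np.linalg.norm(candidate - past) for past in archive)
        return float(np.mean(dists[:k]))

    # Propose candidates and keep the one least like anything so far.
    candidates = [rng.normal(size=8) for _ in range(50)]
    best = max(candidates, key=lambda c: novelty(c, archive))
    archive.append(best)  # the chosen output joins the archive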

[01:31:14] Yeah, and that's a very fun note to end on. Because yeah, it is fascinating. I mean, again, I hope that isn't the case, but who knows? Who knows, I suppose. I like to be optimistic. I hope not, too.

[01:31:30] No, again, I prefer to think of the optimistic side. Even if it learns to get clever with how it creates this stuff, you'll still know; it will still feel hollow. And again, it's the intent. Even if you get pulled in and you think, 'oh, this is good,' whatever the intent behind it was, it's always going to be hollow. So to me it'll always feel hollow, because even humans can't pull that off, you know what I mean? When you have a bad show or a bad album that got made because someone just wanted to throw it together to make money, it never, ever works, right? From a human point of view. So I personally don't believe a machine is going to know how to pull that off.

[01:32:07] If it does.

[01:32:08] I hope so.

[01:32:09] God help us.

[01:32:09] And I think even if it does, it's an arms race, right? There will always be people working on methods to detect this, so that we can label it and do what Tom said: people should know when something is AI. I think that's really important. It helps people make up their own minds.
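[Editor's note: a minimal sketch of the detect-and-label idea, assuming a generic classifier over simple numeric features; the features, data, and model choice are illustrative, not anything the guests described.]

    # Toy AI-content detector: fit a classifier on features extracted
    # from labelled human-made and machine-made examples, then flag new items.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    # Stand-in features (e.g. repetition, burstiness, vocabulary spread).
    human = rng.normal(loc=0.0, scale=1.0, size=(100, 3))
    machine = rng.normal(loc=0.8, scale=1.0, size=(100, 3))
    X = np.vstack([human, machine])
    y = np.array([0] * 100 + [1] * 100)  # 0 = human, 1 = AI

    detector = LogisticRegression().fit(X, y)
    print(detector.predict(rng.normal(size=(1, 3))))  # 0 or 1 for a new item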

[01:32:29] I agree. I think for me, the takeaway from this is: support artists, everyone, because before you know it, it's all going to be AI Coke ads and we're all doomed. But no, definitely not.

[01:32:41] No, I've really enjoyed this chat, and yeah, thank you guys so much for coming on and sharing all your thoughts and expertise on this. It's been really, really fascinating.

[01:32:50] And as I said, I did have to duck out for two minutes, so you've probably done all the plugging, but I am curious: what have you guys got coming up, as real humans making real things at real human gigs?

[01:33:01] We have... we just announced some shows in March.

[01:33:05] Oh, okay.

[01:33:06] The 14th, 15th and 16th. We're playing in Leeds, Birmingham and Bristol.

[01:33:10] That's right. Yes.

[01:33:11] I'm going to try and make it to one of those.

[01:33:13] Yeah. Nice.

[01:33:13] And then we're playing with a fort and solas.

[01:33:16] Lovely. Lovely. Good fun.

[01:33:18] Yeah. For also our attention. I don't know if you interviewed them, but if you didn't, you should.

[01:33:25] They ring a bell, but I might cross paths with them in the future. You never know.

[01:33:28] Well, if you come to the shows, you'll be able to see them.

[01:33:30] I will, yeah. And I can verify they're real human beings as well. I will report back, everyone.

[01:33:36] Yeah.

[01:33:37] We also have lots of merch on sale on our Bandcamp, if you fancy it.

[01:33:42] Amazing. Real, human-generated art.

[01:33:45] No, but when I say it like that, it doesn't sound it.

[01:33:48] I can assure you it was.

[01:33:50] His fingers weren't crossed, everyone. He's being genuine.

[01:33:52] It was too expensive to be AI-generated.

[01:33:55] There you go.

[01:33:58] I love it. But yeah, guys, that's amazing. Obviously I'll put links in the show notes, and people can go and check you out. I highly recommend that you do, because I like to make a habit of checking out the work of the people I interview. And after meeting you guys, I did make a point of going and listening to some of your music, and I really enjoyed it. So I absolutely recommend people go and check that out.

[01:34:17] Thank you very much.

[01:34:18] Amazing.

[01:34:19] You're very welcome.

[01:34:20] Yeah. Matt, Tom, thank you so much for coming onto the podcast.

[01:34:23] Thanks for having us.

[01:34:24] Thank you.

[01:34:25] A huge thank you to Matt and Tom for coming back onto the podcast and ending this year with a bang.

[01:34:32] Truly one of my favorite conversations I've ever had doing this podcast. And I'm not just saying that; it was a ton of fun.

[01:34:38] I learned so much in this conversation.

[01:34:41] And as you probably gathered from the AI jingle,

[01:34:43] I was very much inspired by this conversation as well.

[01:34:47] So again,

[01:34:47] a huge thank you to these guys for coming back onto the podcast.

[01:34:50] Do yourself a favor and check out their music.

[01:34:53] There are links in the show notes where you can do that.

[01:34:55] In fact, at the time of this episode coming out, they have just released brand new remixes of two of their songs. I've left links in the show notes where you can find out more. I highly recommend that you seek them out.

[01:35:08] They're absolutely brilliant.

[01:35:09] And as a nice added bonus, it was Matt himself who remixed them, so go and give him some support.

[01:35:15] Check out those remixes.

[01:35:16] Check out the bands. They are going to be playing some shows in March of 2025, so make sure you head to the link in the show notes, where you can get some tickets, and give these guys some much-deserved support.

[01:35:28] Once again,

[01:35:29] links are in the show notes.

[01:35:31] Thank you so much for listening to this episode.

[01:35:33] If you enjoyed it,

[01:35:34] then please consider doing a few simple things to help me out as an independent podcaster.

[01:35:39] I do everything here except the artwork; that is Alex Jenkins, whose details are in the show notes. Everything else, from booking the guests, recording the episodes, and editing the episodes, to spending what can only be described as an unhealthy amount of time on jingles, is done by me.

[01:35:55] So if you want to support me on this journey, then please, please do a few of these simple things.

[01:36:02] First and foremost,

[01:36:04] just tell somebody.

[01:36:05] I don't mind who you tell or how you tell them.

[01:36:07] Word of mouth is great.

[01:36:09] Social media is also brilliant.

[01:36:11] You can give me a follow on all of the social media platforms that are linked below: Threads, Blue Sky, Instagram, and (technically) Twitter, plus the Discord server.

[01:36:19] So make sure you give me a follow on all of those,

[01:36:21] so you can stay up to date with the latest from the podcast.

[01:36:24] So once again,

[01:36:25] links are in the show notes where you can do that.

[01:36:27] If you would like to go that one step further and help me out even more,

[01:36:31] then you can simply leave me a lovely five-star review on your favourite podcatcher.

[01:36:35] If you have done so,

[01:36:36] then please let me know,

[01:36:37] because I'd love to read it out and give you a big thank you on an episode of the podcast.

[01:36:41] And finally,

[01:36:42] if you'd like to contribute some money towards the cost of running the show,

[01:36:45] then I would direct you to the donation page,

[01:36:47] also linked in the show notes.

[01:36:49] I'd like to also give a massive thank you and farewell to the podcast sponsor,

[01:36:55] styleupyourpet.com.

[01:36:56] Richard was kind enough to give me a 25% discount, which you can still use if you're listening to this episode when it comes out on December 23rd.

[01:37:05] I believe he'll be closing down the website in January.

[01:37:09] It is a shame.

[01:37:10] It's a very good website.

[01:37:12] I've got some products from it.

[01:37:13] I think they're absolutely brilliant.

[01:37:14] So if you fancy getting a last minute gift in there,

[01:37:16] then do yourself a favour, click the link in the show notes and get that very generous 25% off.

[01:37:22] If you don't have the means for it right now, then I'll make sure to drop a link to the eBay store which he'll be setting up, where you can grab yourself any of the stock that's left over.

[01:37:30] So thanks again to Richard for that.

[01:37:33] And if you're interested,

[01:37:34] I would direct you once again to the link in the show notes.

[01:37:38] And the question on everybody's mind,

[01:37:40] what am I doing for 2025?

[01:37:43] Well, there are a number of seasons in the works,

[01:37:45] but for the very first series of episodes for 2025,

[01:37:49] I am continuing Random Fandom.

[01:37:52] Yes, I am carrying on with this stretch because I've had some fantastic offers

[01:37:57] for guests to come on with some brilliant topics,

[01:38:00] and I just couldn't turn them down.

[01:38:01] So I'm going to extend this well into the first few months of 2025.

[01:38:06] In fact, the very first episode that will be up

[01:38:09] will feature a brand new guest to the podcast,

[01:38:12] and it will be available on January 6th, 2025.

[01:38:16] Thank you so much for listening to the podcast.

[01:38:18] This year has truly been a spectacular year for the show.

[01:38:22] I cannot believe some of the experiences I've been privileged to have,

[01:38:26] and it is all thanks to you guys.

[01:38:28] Thank you so much for making this fun little hobby of mine grow into something

[01:38:35] where I can reach out to people, do in-person interviews,

[01:38:38] and things that frankly I only ever dreamed of.

[01:38:41] So thank you.

[01:38:43] Thank you so much for making that happen for me this year.

[01:38:45] You have no idea how much it means to me.

[01:38:48] I'm truly, truly grateful to all of you.

[01:38:50] And I'm excited.

[01:38:51] 2025 is looking to be a really fun year for Fandomentals.

[01:38:55] So make sure that you're following, subscribed,

[01:38:59] whatever it is you have to do,

[01:39:00] so you do not miss out on what I have coming up.

[01:39:03] Thanks again for listening to this.

[01:39:04] It truly means the world.

[01:39:06] And I will meet you all right back here

[01:39:08] for even more episodes in 2025.

[01:39:11] See you then.