The Charleston Marketing Podcast

How AI Can Help Without Replacing Human Creativity w/ JT White

Charleston AMA Season 3

How are we doing? Who do you want to learn from next? Text us with notes and ideas.

A teen builds a hip-hop forum, hears its bars on the radio, and sends Sony a homemade cease and desist—then grows up to lead an AI video platform. That’s how our conversation starts, and the twists don’t stop. We bring JT White (Augie.studio) and filmmaker Jake from Koz Films into the studio for a frank, funny, and deeply practical look at where AI helps creators—and where it crosses the line.

We dig into the difference between augmentation and replacement. On one side: AI that cleans audio, removes filler words, accelerates edits, and prototypes storyboards so clients know what they’re buying. On the other: generative video trained on dubious datasets, copyright pitfalls, and the “race to the bottom” that devalues craft. JT argues for licensed models, provenance, and transparency; Jake champions on-set authenticity, human performance, and the years of practice behind great cinematography. Together we test real scenarios—a fast-turn social ad, LED wall shoots, product placement in B-roll—and show how to navigate them without losing the soul of the work.

We also go under the hood: why many models aren’t commercially safe, how data scraping works, what “licensed training” really means, and why your masters should live in your own storage. You’ll hear practical steps any marketer or creator can use today: insist on clear rights, document sources, keep protected archives, and use AI where it cuts drudgery, not where it steals credit. This is a playbook for ethical AI in marketing and video production—ruthlessly useful, legally aware, and grounded in the reality of budgets, deadlines, and brand trust.

If this conversation helps you think sharper about AI and creativity, follow the show, share it with a colleague, and drop a review with your take on where you draw the line.


Support the show

Title Sponsor: Charleston American Marketing Association

Presenting Sponsor: Charleston Media Solutions

Sponsor: SCRA (South Carolina Research Authority)

Cohosts: Stephanie Barrow, Mike Compton, Rachel Backal, Tom Keppeler

Produced and edited by: RMBO Advertising

Photographer | Co-host: Kelli Morse

Art Director: Taylor Ion

Score by: The Strawberry Entrée; Jerry Feels Good, CURRYSAUCE, DBLCRWN, DJ DollaMenu
Studio Engineers: Brian Cleary and Mathew Chase


SPEAKER_00:

Welcome to the Charleston Marketing Podcast, brought to you by the Charleston AMA and broadcasting from our friends at Charleston Media Solutions Studios. Thanks to our awesome sponsors at CMS, we get to chat with the cool folks making waves in Charleston. From business and art to hospitality and tech. These movers and shakers choose to call the low country home. They live here, work here, and make a difference here. So what's their story? Let's find out together.

SPEAKER_01:

Hello, listeners. Mike Compton here, and we have a special guest. All the guests are special, but this one has the mustache. Special kind of guy. You do. If you're on YouTube, you'll see it; if you're listening on audio, you wish you were seeing it. Anyhow, he's also got a face for radio. Used to work in radio. I did. JT White's in the house. Say hello, JT. Hello, JT. Good work, good work. JT is here. He's on our board. I am. As is Jake. So we're all board-cast today on the Charleston Marketing Podcast. JT, talk about Augie. I want to hear about Augie real quick, and then how you got to Augie, your journey to that, if you will. And listeners, he likes to talk, so get ready, because he's a smart dude.

SPEAKER_02:

I'm going to temper it back today, because between Jake and I, there won't be a lot of air left in the room if we start. So, my journey to Augie. I'll start from the beginning. When I was 16 years old, I accidentally sold a company to Sony, in the music space. That was a really good lesson for me in how much I didn't want to work in music. Okay. And also just business in general.

SPEAKER_01:

What was that? Hold on. What did you do? What was it called? "Accidentally"?

SPEAKER_02:

It was called Rhymeandreason.net. So there used to be this site. I was very young, like 16 years old. Actually, I was 14 when I started it. Dear Lord. So there was this thing called RapBattles.com, run by this guy named Shadow; I can't remember his real name, but that was his username. It was a chance to battle online, or cipher. I've been into hip-hop my entire life. Where are you from? I'm from New York. Okay. And me and a couple friends hated the forum because it was really negative. Everybody was just dogging you, and nobody was trying to help.

SPEAKER_01:

And this is circa what, when you're 14? Let's age ourselves.

SPEAKER_02:

So I'm 14, so this is like '99. '99? '98. Just the beginning. I wasn't even a thought. Yeah, it's true. Just a gleam, a glimmer in somebody's eyeball. Life was amazing. This is pre-Jake. So, PJ. And so me and my friend Jake, actually, funny enough, started a website called Rhyme and Reason, which was like, hey, we want to build a community of people that want to get better, because if you just go on and get crushed every time, you'll stop liking the music. And that's not what we wanted. So we built a thing and it got pretty good; we had, not hundreds of thousands, but like fifty to seventy thousand users or something like that. And then from there, we wound up, unintentionally (this is why I say unintentionally), with two or three artists that were really good. And then we started listening to some of the underground tapes coming out from known artists, and we were like, dude, they're stealing lines from our website.

SPEAKER_01:

Right. Well, hold on, the phone's ringing.

SPEAKER_02:

I'll get it.

SPEAKER_01:

Answer it real quick. No, it's not. Oh. "Hello, this is Matthew." Perfect. Oh dear. Anyways. Take three. So, Napster is coming to mind right now.

SPEAKER_02:

No, so Napster and LimeWire, those things were out, but what happened is that we started hearing stuff that was on our site on the radio. And we were like, dog, what's happening, they're stealing from us. And then, come to find out (this is actually kind of funny), we sent a fake cease and desist, because we didn't have lawyers. Because I was 16. Yeah. So we sent a cease and desist to Sony, and they got in touch with us and were like, actually, that is the artist, using your site as a forum to test-bed some of their new work. We had some very good MCs, like Royce da 5'9", Chino, who were kind of using our site to test all of their new rhymes, which was cool. And then they sort of took it over, and I don't know what they did with it.

SPEAKER_01:

Oh, that's awesome.

SPEAKER_02:

I was gone. It was fun, yeah. It was a really good experience. And then from there I worked at Best Buy, and I launched ESPN Radio on FM across the country. Whoa, whoa, whoa. Really? Yeah. Yeah.

SPEAKER_01:

Well, was it like one of those million-dollar deals?

SPEAKER_02:

Like, no, I got a 1997 Dodge Avenger and a real distaste for the music industry. I love that.

SPEAKER_01:

That's what I got out of that deal. No. But you were 14, 16 years old. I was 16, man. I really wanted a Dodge Avenger.

SPEAKER_04:

So did you really make them give you a Dodge Avenger?

SPEAKER_01:

No, it's just what I bought with the cash. That would be awesome. That would be hilarious.

SPEAKER_02:

It would have been funnier if I made them give me a Dodge Avenger.

SPEAKER_01:

That's how we did it in '99.

SPEAKER_02:

Yeah, yeah. No, I went and got one with the cash that we made. It wasn't a big acquisition. It was literally just a "you guys go away and we're gonna do cool stuff with this."

SPEAKER_01:

And then you became a cool kid because you had this Avenger.

SPEAKER_02:

Yeah, and then I crashed it. It's the only car I've ever crashed. Oh my gosh.

SPEAKER_01:

No kidding. Okay, so you can go on now then. That was so yeah.

SPEAKER_02:

Then I went to college, at SUNY Albany, for evolutionary psychology. Shout out. While I was there, I was working at Best Buy, and then I got a job at a local radio station. From there I wound up working specifically with ESPN Radio, launching all their FM broadcasts across the country, because they used to just be on AM, which you guys wouldn't remember. There was a thing called AM, which is a different kind of radio. Do you guys know radio? Just kidding. Yeah. So we launched that, and it got me very into the digital space. And I had an amazing boss named Bob Osfeld who was like, "Didn't you sell a website when you were 16?" I was like, "Yeah." He's like, "You're in charge of the internet." And I was like, "What? I don't know what I'm doing." But he let me learn on the fly. Then I worked for a smattering of other companies and did a bunch of app development: we launched the WWE Network, we launched all the full-episode players for NBC Universal, we did the Google TV release, and built a bunch of really beautiful apps. And I've actually been in AI for the past 11 years now. I started on the big-data, machine-learning optimization side, and I've been in the AI video space for the last three and a half years with a company I founded called Log X Labs.

SPEAKER_01:

Wow, you've done a lot over a couple decades there. That wasn't that long, though. I feel like you're about forty, so it's been twenty-something.

SPEAKER_02:

I mean, young is relative on this panel.

SPEAKER_01:

Well, speaking of young: young Jake. And listeners, Augie is a video AI platform, right? A video production platform, run by JT. Augie.studio, look it up. Jake, we all know, has a video production company with real people.

SPEAKER_04:

Koz Films.

SPEAKER_01:

Koz Films. There are real people who work at Augie, too. I paused, because I was gonna say, you are a real person, Jake. You're a very real person. And I'm sure there's a few other people, right? And to that point, there are a lot of jobs being made because of AI. For sure. They're not just taking jobs away. Am I wrong, Jake?

SPEAKER_04:

No, you're not wrong.

SPEAKER_01:

Right. Okay, end of story. Good job.

SPEAKER_02:

But to be clear, they are also taking jobs away. Yeah, but it's a balance, right? The big difference between Jake and I is that Jake is shooting original, beautiful, incredible, high-quality content for premium brands and premium producers and shows, right? Right. Our platform can use his stuff to repurpose it for different things, or we're leveraging Getty and a bunch of other AI platforms, but we're not creating visuals. The visuals that we use are still commercially produced by somebody like a Jake.

SPEAKER_01:

Sounds like a great tool, Jake. Sounds like a wonderful tool. Have you used Augie yet? Put him on the spot.

SPEAKER_02:

I know he's adamantly opposed to using it.

SPEAKER_04:

I'm not adamantly opposed to it. Now that you hear it like that, yeah. For myself, it's definitely an avenue I would explore for clients who want to take the content that I shoot and repurpose it quickly.

SPEAKER_03:

Yep.

SPEAKER_04:

I think that is a great tool. However, there are a few toes being stepped on, but it's not something I'm entirely opposed to. Usually how we work is that we set a scope of deliverables with the client, and that's what they get based on our work. Sure. When it comes to AI, I hate platforms that use generative content to create, and I hate that people are using it as a replacement for services like mine.

SPEAKER_02:

Yeah, I think that's reasonable, but I also think that like right now it's not commercially viable anyway, because none of the models are. Absolutely.

SPEAKER_04:

Well, I think we should dive into the transparency and the legality aspect of what's created. Yeah. Mike, do you want to talk about what happened last night?

SPEAKER_01:

So we had a great Four Chairs panel last night. Every quarter we're gonna have this panel. Our first panel was on influencers; that was back in April with Olivia Flowers. It was great. You did a great job on that video. And then last night we did "the good, the bad, the ugly, and AI." Jake and JT were both on the panel, along with Melissa Hortman and Dave Ingram. Melissa works for Microsoft. Let me repeat that for the people in the back: Microsoft. And Dave is the brilliant founder of Query. Look that up. Yeah, please look that up, listeners. And Augie.studio. Koz Films. RMBO.co. Matt, we'll get yours in in a minute. Matthew, our engineer.

SPEAKER_04:

So we talked about a lot of different subjects: the benefits that we see in AI, what we hope to see in the future, as well as some of the risks and what's currently happening in the AI scene.

SPEAKER_01:

And, real quick, we were at Code and Trust. I want to give a shout out to Code and Trust for hosting us. Go ahead, Jake.

SPEAKER_04:

We discussed, as three panelists coming from different backgrounds, what we each see. I came as the skeptic: someone who has a successful video production company and is worried about what AI is gonna do to my industry. Melissa came in on the educator side and said, this is what I hope AI can do in the future, and this is how we're currently using it at Microsoft. And Dave came in as the engineer who said, this is what AI can currently do, this is how people are currently implementing it, and this is what the future looks like.

SPEAKER_01:

Yeah, it was a beautiful martech synergy discussion on AI. I put the martech in there, don't you know. And you're our martech chair too, JT, just saying. So it was a really, really cool event. Jake, how many people came up to you at the end? I thought it was very gutsy for you to do this, right? We're going into an AI building, Code and Trust; they're all AI in there. You're sitting next to a woman from Microsoft who is super smart, much more seasoned than you, probably more than everyone else on the panel. I'm talking life-seasoned as well, right? I'm not calling her, you know what I mean, but she was super smart. And then this guy is super smart too; we all know that. And then Dave, he's been traveling, super busy, really successful with Query. It was gutsy of you to go in there. But how many people came up to you at the end of the event and said, I really loved your passion?

SPEAKER_04:

Everyone. I lost count on my two hands.

SPEAKER_01:

Yeah, it kind of made me throw up in the bathroom a little bit.

SPEAKER_04:

I mean, I've always been very firm in my beliefs, and I don't want to change. Yeah.

SPEAKER_02:

And that, my friend, is youth. Being willing to change is important.

SPEAKER_04:

If presented with new information, I'd absolutely be willing to change. But right now the current laws and regulations, as well as the transparency aspect, aren't there for me yet. Until they can show that AI can be used ethically, I will try to limit the use of generative AI in my work as much as possible.

SPEAKER_02:

Okay, so I want to dive into what I thought was the most interesting discussion we had last night, because we didn't get enough time. Yeah. You just said the word "ethical." That's a really vague word, because the line of ethics changes per person. Always. And it changes over time, too, right? Because you learn more things. You're not old enough, but there's stuff that was regular when we were younger that would never fly now, because our ethics have evolved. Yeah, right. So, first of all, I want to be very clear for listeners: there's a huge difference here. Generative AI, when it comes to visuals and video, right now is not in a place where it's commercially viable anywhere, period, hard stop. Getty has an engine that can do it, trained on proper models, on licensed content that is paid for. It's not really distributed a ton yet, just because it doesn't have the whole internet, so it's not as good as the ones that are trained on the internet, right? But I think it's an important differentiation that when you say "generative," you mean specifically generative video. Because, okay: you are using generative audio, though.

SPEAKER_04:

I wouldn't say no, but the database it's trained on is through Adobe.

SPEAKER_02:

So that's ethical, yes.

SPEAKER_04:

Well, so this is the question that I ask people: where do you draw the line? Is using AI okay as a cost-saving measure? And does that ethical line shift if you're in the same industry? A business owner who's like, I need to put out content, doesn't care where it comes from, and just needs to put it out on budget and uses AI, that's his ethical line. But as a video business owner, I'm not gonna outsource my creative to a computer. I'm not gonna do that. I'm not gonna outsource my creative to an algorithm.

SPEAKER_02:

But I think it's also important to know, Jake, that like your clients wouldn't want you to anyway, because that's not why they come to you.

SPEAKER_04:

Absolutely.

SPEAKER_02:

Right? So like I'm not, and by the way, I'm not a proponent of it.

SPEAKER_04:

So where do you draw the ethical line? Let's say we have a huge project, and after the fact we're like, we need to get this one scene, and we don't have the money or the funding or resources to go out and get it.

SPEAKER_03:

Okay.

SPEAKER_04:

Do you then train the AI off of what you just shot and captured, and have it regenerate that scene? Is that ethical? That's the... I mean, it's like this: I'm not gonna do that. Even though it's footage that you shot.

SPEAKER_01:

But wait, it's footage that you shot. You created this footage, yeah. And it's learning off of what you created.

SPEAKER_02:

So, okay, here's a real-life example, right? We're not gonna say who, but let's say you're a protein brand. You sell protein to the public. Protein, okay. Yeah, powder protein, the kind you'd find in GNC or whatever, right? So you want to create a commercial about people using protein. Now, let's say you have B-roll of a bunch of people you've shot before, or you have a relationship with Getty or Shutterstock or whoever. Is it unethical to then use AI to put your protein product in the frame?

SPEAKER_01:

What's around the frame?

SPEAKER_02:

Well, the rest of the frame is content that you own or have the rights to, and that everybody's been paid for. So all you're using is: there's video in the background, and now you're gonna have your container of protein spin, the top twist open, and a bunch of protein shoot in the air while there's a bunch of dudes playing volleyball behind it, or whatever. Is that ethical?

SPEAKER_04:

I mean, me personally, I'd be asking if the AI was all licensed, if it was all...

SPEAKER_02:

Well, you own your own content, right? You own the protein; you own the thing behind it, right? I think that's an ethical use. Because fundamentally, everyone who should have been paid has been paid.

SPEAKER_01:

You've got that motion graphics animator that I was gonna pay to build that protein canister in the shot, make it twist off, make the whole thing. Yeah. That's a position, a job, that I was gonna hire somebody to do.

SPEAKER_02:

But by using the AI that did it, they have paid other engineers and tons of talent to be able to do the thing that that person does.

SPEAKER_04:

So you're still taking money out of the production and entertainment industry and pumping it into the platform, even if it's just one small thing.

SPEAKER_01:

Stop.

SPEAKER_02:

What I'm saying is, you still had to have a shoot. You still had to have all these other pieces. So this is about augmentation: there's a piece of this where you can marry two things.

SPEAKER_04:

Well, I mean, if it's like that, that's the ethical line. I don't know what I would say to that if a client presented it to me. Right now, the technology is not there to be able to create the shot in the way that they want it.

SPEAKER_02:

That's where you're wrong.

SPEAKER_04:

If I have B-roll footage on me right now, if you have a protein thing, I would love to see an AI-created one.

SPEAKER_02:

100%. We can't do it on air, but afterward, if you want, we will put it in.

SPEAKER_04:

And I would love to see it. One of my family friends is a co-producer at Sony Imageworks. When I had a conversation with her the other day, they talked about how they've been trying to implement AI in the motion picture industry, but the models are not up to standard. True. With their data, they said the directors are so picky about exactly what they want, and these generative AIs will get you close, but not all the way there, and that's not good for the directors and the people in charge.

SPEAKER_02:

So hold on. This is an important distinction, though: we're talking about two wildly different use cases. Should Kevin Feige and Marvel be using AI? No, that's crazy. That's not a good idea. But if you're making a 15-second, nine-by-sixteen commercial that's gonna run on TikTok for a little while... I mean, I'll leave it at this.

SPEAKER_04:

As someone in the entertainment and production industry, I would completely say that's unethical. However, I would... Why, though?

SPEAKER_02:

I want to know why.

SPEAKER_04:

Because it's taking jobs away from people.

SPEAKER_02:

But it created jobs for people that created the platform. So your jobs are more important than their jobs?

SPEAKER_04:

I'm not saying my jobs are more important than their jobs, but... I mean, it sounds like that's exactly what you're saying.

SPEAKER_01:

How about this? Would you shoot in a volume?

SPEAKER_04:

What do you mean?

SPEAKER_01:

Do you know what a volume is? A volume is what, like, The Mandalorian was shot in.

SPEAKER_04:

Oh, an LED wall.

SPEAKER_01:

An LED wall, yeah. When you get a big one, they're called volumes, and they're building them like mad all over the country. Yeah. And you can make those screens look like anywhere in the world.

SPEAKER_04:

Oh, absolutely.

SPEAKER_01:

So would you shoot in a volume and use that technology?

SPEAKER_04:

It's all circumstantial. Could be, yeah. What I've found with LED walls is that the actors aren't in a real environment; they're in a studio, surrounded by screens. You don't get as true of a performance as you would putting them in the real world. I saw The Mandalorian; as a Star Wars fan, I absolutely hated it. It was awful acting, an awful plot. However, I can see it for certain use cases. Okay. If there's a scene, like a mountain-climbing scene, sure. There are safety issues, and you can't get a shot of the actor hanging off the side of the mountain with wind blowing and all that.

SPEAKER_02:

Unless it's Tom Cruise, because it'll actually be a good one.

SPEAKER_04:

Yeah, well, exactly. Then I completely see AI being used... or, I'm sorry, not AI, I completely see using a volume or an LED wall to shoot something like that. But to shoot an entire series on it? I just don't think it makes for good acting. I really don't think it makes for good cinema. As a cinematographer, I really look up to Christopher Nolan as a director. He's shooting the entire Odyssey on 70-millimeter IMAX, all on location, all with real sets and real props. Now, I understand not every business has a half-a-billion-dollar budget to go do that. So where is the line? The thing I'm really worried about with AI is that it's a rat race to the bottom, where people want to get the cheapest they can for the work.

SPEAKER_02:

Well, yeah, I mean, look, the truth of it is that this is not a new thing, right? We've done this with everything. We did it with consumer electronics. You guys won't remember this, but plasma TVs used to cost like 10 grand. I can just move closer.

SPEAKER_04:

Oh well, oh like he's watching the levels, he's good.

SPEAKER_02:

I'm watching the levels. No, but look, things get commoditized over time. Yeah, right. Now, the first tranche of any new technology is going to be shit. It just is, right? You're gonna get a bunch of sludge. When mobile apps first came out, most of them were trash. When TVs came out, most of them were trash. Anybody who's ever played a PlayStation knows the very first gen they drop is probably not gonna be great, and they're gonna have to do upgrades. Anyone who's ever owned a BMW knows never to buy first gen, because the tech's gonna break. It takes a while for everything to settle, and then you've got to figure it out. Give me a second. I see you're like two questions ahead already. Just give me a second. I hear your fear, and I think it's reasonable and valid. I'm not invalidating your fear at all. But you also have to give everyone time, and "everyone" is a lot of people, to wrap their heads around how this is gonna work. Experimentation is necessary for invention. So there is gonna be a bunch of weird shit that gets made. Some of it will be good ideas, a lot of it will be bad ideas, and after a year or two or three, it will start to settle into place. And in your industry specifically, we're already seeing a lot of people pulling way back and going, okay, now that we know what it actually is, everybody chill out. There are total use cases for it, and there are a lot of use cases not for it, and we'll roll it out and see what happens.

SPEAKER_04:

So I want to clarify. We were talking about the ethical topic, and I want to circle back to that. What I'm concerned about is what happened with the Industrial Revolution and all the factories popping up in America. There were no regulations; they absolutely polluted our waterways, and it has lasted for hundreds of years.

SPEAKER_02:

Yeah.

SPEAKER_04:

And I'm worried about this with the AI space. That's a good analogy. We are absolutely polluting the regulatory environment. And I get it, there needs to be experimentation, but it needs to be ethical, it needs to be legal; I don't want stuff to have lasting effects for hundreds of years. That's why I think we should speak out and make sure these laws and regulations are implemented sooner, because as humans, we will repeat the same stuff we did in the past. Oh yeah. The Industrial Revolution was: how do we produce a product in the quickest and cheapest way ever? It was all about profits. They did not care about the environment, they did not care about the safety of their workers. They were just like, we want to put out stuff and sell it to the masses. Capitalism.

SPEAKER_02:

Well, yeah, exactly. But here's the thing, right? By the way, I want to say I personally agree with you, but I'm gonna play devil's advocate because we're on camera. The thing I think is interesting is, and look, this is not a political podcast, we're not gonna go there. But should there be regulation around AI? Obviously; it's a dangerous technology if it's put in the wrong hands, right?

SPEAKER_01:

Like the internet is dangerous.

SPEAKER_02:

Well, the internet is terrible. But then take it a step further, right? Okay, so it could be dangerous, it could replace the wrong jobs, it could replace some of the right jobs. There are a lot of second- and third-order consequences to the decisions that are getting made. That's important. Also, the environmental impact of AI, because of all these enormous data centers they're building; the blow-off of that is non-trivial. It's a serious problem. Also, all of our competitors in the rest of the world may or may not give a shit. So there is a weird line you have to draw, right, which starts to get into the gray space of what's ethical and what keeps us ahead, to make sure that we don't fall so far behind that we become susceptible to having our economy dictated by somebody else, or losing all the jobs because we didn't adopt AI, and now everybody here is broke and we slide down the socioeconomic ladder.

SPEAKER_04:

I don't want to talk about politics, but you mentioned the economy. I mean, right now the economy is in shambles.

SPEAKER_02:

It's not because of AI.

SPEAKER_04:

Well, no, it really isn't because of AI. But there's becoming no more middle class. The wealth gap has gotten worse over the past couple decades, where the wealthiest of the wealthy are becoming wealthier, and the poor and the middle class are getting poorer.

SPEAKER_02:

Well, the middle class is getting completely eradicated. You're either above a line or below a line. Yeah. But okay, let's bring it back to AI for a second, though. Right? So that's interesting. Yeah, please. We don't want to get boxed into one of these pockets. So here's what's interesting: there's an opportunity. We started talking about this last night, and you and I have wildly different opinions on it. I built a small product for my kids. It's called Play Arty, P-L-A-Y-A-R-T-I dot com. You can just go use it, it's free. It's an opportunity for kids to use their imagination and use AI to create visuals for themselves. You had a very fair argument, which was: just give them paper and a pen and let them go make stuff. Now, what we didn't get to talk about last night is that of course I let them do that too. My daughter's in Gifted and Talented Arts. She went to three weeks of camp just to use her hands: puppetry, glassblowing, installations. I'm a big art guy, I love artists, which is why you and I will ultimately always get along even when we disagree. However, I do think there's a very interesting thing with AI, which is that it's an accelerant to creativity. People who aren't as naturally gifted as you, who aren't as naturally visual thinkers as you, now have the opportunity to take the thing that's in their head that their hands physically cannot do and start to put some shape to it. And I think there's something beautiful there. Ultimately, if you think long term, that will create more opportunity and access for more people, because it's pretty inexpensive right now. The fact that you can have ChatGPT for $20 a month is a wild deal, because a coffee is like $10 right now.

SPEAKER_01:

Exactly.

SPEAKER_02:

Right? I think cigarettes in New York City are probably over $20 by now. There are other things you could cut out of your life and have a full GPT instance that's dedicated to you, that you could be training on the way you think, and talking to about new business ideas, or how to better handle your finances. The things it's actually good at.

SPEAKER_04:

So I want to touch on two different things here. First, what you said about the kids using it. It was playarty.com?

SPEAKER_01:

Yeah, dude, this is the cutest thing I've ever played with. This is so cool. I'm doing it right now.

SPEAKER_04:

It's adorable. Yeah, let me turn it on.

SPEAKER_01:

Playarty.com.

SPEAKER_02:

So I added all these elements. Yeah, you just pick some elements. This is such a bad visual right now, sorry, podcast. But what we did is we just created this. By the way, this was made by an artist. All of these pieces were made by artists; actual people drew all this stuff. What happens is it goes through, and say you picked a dog on an iceberg, dancing, and we have prompts behind the scenes that automatically build the request. You just gotta give it a second and a look. It's creating. So it'll go through, and we have prompts to make it safe, because the internet is a scary place and it does a lot of weird stuff. It's also relying on Replicate to run, so it might not actually work. But what it does is create a fun image based on the things you selected, so you have some place to start from. Oh, it didn't work.
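For the curious, the pattern described here, fixed guard prompts wrapped around kid-selected elements before anything reaches a hosted model like Replicate, can be sketched in a few lines. Everything below (the element list, the wrapper text, the function name) is a hypothetical illustration, not Play Arty's actual code:

```python
# Hypothetical sketch of 'prompts behind the scenes': kids pick from
# artist-drawn elements, and a fixed wrapper turns the picks into a
# safe prompt before any model ever sees free-form text.
ALLOWED_ELEMENTS = {"dog", "iceberg", "dancing", "giraffe", "skateboard", "moon"}

def build_safe_prompt(picked):
    picks = [p.lower() for p in picked if p.lower() in ALLOWED_ELEMENTS]
    if not picks:
        raise ValueError("pick at least one known element")
    # The fixed wrapper text is what keeps output kid-friendly no
    # matter which combination is chosen.
    return ("A cheerful, colorful children's-book illustration of "
            + ", ".join(picks)
            + ". Friendly, whimsical, G-rated.")

prompt = build_safe_prompt(["dog", "iceberg", "dancing"])
print(prompt)

# The prompt would then go to a hosted image model (requires an API
# token and network access), e.g.:
#   import replicate
#   image = replicate.run("<some-image-model>", input={"prompt": prompt})
```

Keeping the allowlist and wrapper server-side is what prevents a kid (or the internet) from feeding arbitrary text to the model.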

SPEAKER_04:

Oh, well, pen and paper would never do that.

SPEAKER_02:

By the way, I pay for this out of pocket for my children, because it's not a real business.

SPEAKER_04:

So when you first described it last night, I thought you were talking about something like Google Veo, where it generates realistic people.

SPEAKER_02:

Like, I'm not gonna give my kids that. Listen, sorry, but the internet is full of porn. You've got to keep kids away from all that. So this is a very, very specific use case where a kid can go, I would like to see a giraffe skateboarding on the moon, yeah, and you can give them a visual for it.

SPEAKER_04:

Yeah, I mean, that's harmless. And that's way more dumbed down than what I was thinking. Yeah. You know, all right, then that's fine. But my big fear, my concern with using AI models to learn how to create stuff, is that people don't actually learn the mediums, they learn how to use the AI model.

SPEAKER_02:

I think that's a fair concern.

SPEAKER_04:

There is, you know... you mentioned that I'm very gifted with videography. It's not a gift, it's years and years of learning, years of YouTubing, years of trial and error. It's a skill. It's practice. It takes practice, yeah. And it's something I practice every single day. I wasn't handed, you know, all right, go find Google Images and throw them in an editing software. It was: here's a camera, here's how you take a photo with it, here's all the settings you use, here's how you light it, here's how you move the actors, here's how you record audio, here's how you put everything together. It's not, all right, give me stock footage of doctor smiling holding pill bottle.

SPEAKER_02:

But okay, so I've been saying this for years on stupid panels and stuff: content is king, curation is queen. Okay. How you curate the content matters. The type of content you're making, nobody wants computers to make. Yeah. Nobody wants that. I'm not asking for that. I don't want it.

SPEAKER_04:

So you also brought up earlier having your own instance of ChatGPT. Did you see the Harvard study about how, for 95% of users, after using ChatGPT...

SPEAKER_02:

78% of people's cognitive load went down.

SPEAKER_04:

Yeah.

SPEAKER_02:

Right. Which is a lot. Now, that said, remember, evolutionary psychology is what I went to school for. Yeah. How you run tests matters. Absolutely.

SPEAKER_03:

Yes.

SPEAKER_02:

Right. The subjects of that test, we don't know who they are. Was it peer-reviewed? We don't know the actual full picture, like, we don't know what tasks they were given. How dumb they were before. Right.

SPEAKER_04:

I mean, so they were assigned to write an essay. One was a control group that had to write an essay every single week for a set period of time. The other group was given ChatGPT to help out. At the beginning they used it sparingly, and by the end of the study, yeah, the cognitive load went down, because they didn't have to do anything, because it was being done for them.

SPEAKER_02:

Yeah. But I would also argue that that's a really poorly run experiment. Now, I know I don't like hard work.

SPEAKER_04:

I mean, it was very much a clickbait headline. Right.

SPEAKER_02:

The whole point of that is, if you give anybody a shortcut, people are people: the number of people who will stick to their guns and not take the shortcut is unbelievably low. It's even lower than you think. At Koz Films, we don't do shortcuts. Sponsored by Koz Films. But look, I hear what you're saying. I'll tell you very specifically, and this is a thing I didn't get to say last night, because Mike told me not to talk that much. I never said that.

SPEAKER_04:

Mike tells me to have a talk on that list.

SPEAKER_02:

I didn't say that. No. So here's the thing. In that room last night, we were very lucky to have a bunch of College of Charleston students, and I think that was fun. I love when those kids come out, and they had great questions. And your collegiate board over here? Yeah. So here's the one thing I didn't say to them that I wish I had: all of them should be spending $20 to have their own instance of GPT. And here's why: specifically, the terms and conditions. Because if you pay, it's a different wall.

SPEAKER_01:

Yeah, that's what Jamie was telling me.

SPEAKER_02:

It's a completely different wall. So paying that $20, so that your intellectual property and the things you're asking are contained and remembered in your own instance, is worth the money. First of all, just to save the stuff you want. Because look, I don't think anybody at ChatGPT is reading your ideas. They don't care.

SPEAKER_04:

It can be used in lawsuits, though. AI isn't protected: if you type something up in AI, the government can take it, the same way the government can take a text or an email. Or Twitter. Well, okay.

SPEAKER_02:

Don't call it Twitter. I don't care.

SPEAKER_04:

Well, X, Twitter, whatever, same difference. When you're using Grok, your history and your Grok prompts are now searchable. They're indexed. Yeah. And it's like, I don't want that.

SPEAKER_02:

Nobody should want that. But it goes back to the ethical thing, right? What's unethical about the space right now, which is dangerous for me to say because I've been in it for so long, is that we don't have good education about this part. Yeah. About the sign-on, right? Like, can I say the F-word here or no? Sure, go ahead. Okay, well, I'll just say F. When you're making products, you try to make products that are F-ing magic, right? That's the whole goal. If you're making digital products, and actually that's true out in the world too. If you're a chef, or if you're making fishing rods, a Toadfish, whatever, you're trying to make something that feels F-ing magic in a moment. That's the whole goal. So the problem with AI, and the scary thing about AI, is that it feels like magic instantly. Uh-huh. Immediately. There's an immediacy to it. And that immediate dopamine hit of, oh my god, that's a real physical reaction. Don't get me going down the dopamine line.

SPEAKER_04:

It really is such a huge issue, especially with people my age, with TikTok, Instagram, the social media algorithms. Screen time. It's an instant hit of dopamine.

SPEAKER_02:

Yeah. Listen, this gets into my world of stuff that I like as a nerd. It's really scary. It is. That said, thinking about how you ethically use AI every day, once you get past the F-ing magic, there are fundamental use cases for it that really matter. Look, I use it a lot. Obviously, I work in AI. I've been using it for years. You're building stuff.

SPEAKER_04:

I mean, we talked about this last night. As much as I dislike AI, I'm firm in my beliefs and my morals about not using generative AI. However, my editing software is being built with AI features in it. You can't get away from it. And these AI features save so much time. And this goes back to the cost-effectiveness and the ethical measure. I'm still gonna be editing it, I'm still gonna be doing all this stuff myself. But now, instead of having to hire someone to do the audio mastering or remove noise from a dialogue track, or spending hours of my time doing that, I can hit one button with AI and it will do it for me.

SPEAKER_02:

Right. And so it's like, well, isn't that taking jobs away from somebody?

SPEAKER_04:

I mean, is it? Like I said, the ethical question: where do you draw the line? Yeah, where's your line? Yeah, exactly. Oh, and Adobe Podcast, by the way.

SPEAKER_01:

If nobody knows about it.

SPEAKER_04:

I mean, it's just a trade secret here. You can solo an audio track from a video production that has a noisy background, export that piece of dialogue, run it through Adobe Podcast, fine-tune your settings, and then download it and drop it back into your edit.

SPEAKER_02:

Dude, Descript. I have a podcast, and Descript is crazy, because you put your audio in and it'll get rid of all your filler words immediately. Descript? Is that what you're saying? D-E-S-C-R-I-P-T. It's a great platform. Riverside FM does the same thing. There are all these platforms now that will automatically consolidate all your audio, get rid of the background noise, isolate everybody, remove the filler words, and actually fill the gap, and you won't even be able to tell.
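The filler-word pass these tools run can be pictured with a toy text-only sketch. Real products work on audio with per-word timestamps and speech models; this just illustrates the cut-list idea, with a made-up filler list:

```python
import re

# A made-up filler list; real tools detect these with speech models
# rather than a fixed vocabulary.
FILLERS = {"um", "uh", "er", "you know"}

def remove_fillers(transcript: str) -> str:
    """Strip standalone filler words (and a trailing comma) from text."""
    # Longest fillers first so "you know" wins over shorter matches;
    # \b boundaries keep words like "umbrella" untouched.
    alternatives = "|".join(re.escape(f) for f in sorted(FILLERS, key=len, reverse=True))
    cleaned = re.sub(r"\b(?:" + alternatives + r")\b,?\s*", "", transcript, flags=re.IGNORECASE)
    return re.sub(r"\s{2,}", " ", cleaned).strip()

print(remove_fillers("So, um, I think, uh, the edit is, you know, basically done."))
```

The audio versions do the same thing against a word-aligned transcript, then cut the matching time ranges out of the waveform.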

SPEAKER_01:

Crazy.

SPEAKER_02:

I have to do that in GarageBand, like, word by word.

SPEAKER_04:

We're not going to escape AI in the future. No, not the future. Now. It's here now. Cameras are currently being built with AI chips in the camera itself, for autofocus tracking, where they've trained on a large dataset of images to detect people's eyes, or whether it's a plane, a bird, a moving car, an animal, a dog, and to pick the exact preset to focus better, without you having to tell your camera, this is what I want you to focus on in my image.

SPEAKER_02:

But okay, so this is interesting. Now I'm gonna take your side for a minute. Here's why I hate that: I also love film. Like, I love films, right? There's a concept in Japanese culture called wabi-sabi. Do you guys know wabi-sabi? I've heard of it. Wabi-sabi is the idea that it's the imperfection that makes perfection. I agree. Right. So when you think about Wes Anderson, the way Wes shoots things is often out of frame, or a little out of focus until it's in focus. And if the camera starts doing the same thing for every film, I'm not interested in that.

SPEAKER_04:

Well, I mean, there are settings, you can still pull focus, and I'm talking about cases where you know you need that setting. There's always a way to turn off autofocus on all cameras, and in film and cinema today they don't use lenses with autofocus. Of course they do. They still have a focus puller doing it all. Some are shooting on film still.

SPEAKER_02:

But the point is, to my point earlier about humans: the longer a thing is around that just does something for you, the more people are gonna go, oh, that is pretty easy, and they'll start to sacrifice one little thing or another. I do think about the separation of art, creativity, and business. In art and creativity, using AI as an accelerant to what you want to do, and then going and creating with your hands and brain, is my favorite version of it. The business world, I'm gonna be very clear: people are losing jobs, hard stop. It's coming for a lot of people's jobs, and they're gonna have to figure out how to be better at different things.

SPEAKER_01:

Call centers, I mean you name it.

SPEAKER_02:

There's just a lot of stuff. There are a lot of rote activities that don't actually have to be done by human beings on a regular basis, that are both wildly inefficient and not particularly good, because human error is a real thing and the machines don't mess up simple math. If you're just rounding stuff up, dude, AI's good. It's got you. So my personal ethical dilemma, in the art and creation space, is: where's the line between AI helping you get something from brain to paper, and actually putting it in the world?

SPEAKER_04:

I mean, I talked about this last night. A lot of big agencies are using tools like Google Veo to create storyboards. Before we go into a production, we meet with a client and try to nail down exactly what they want. For example, say you hire me to do a 30-second ad for a soccer company. Back in the day, I would literally draw out the exact frame-by-frame of every single shot we were gonna get for that video. Then we take it to the client, pitch it, and say: do you approve and sign off on this? What AI is helping with now is taking those storyboards and generating what we actually drew, so we can show that to the client and say: hey, we're not gonna use this for the final product, but this is really similar to what it's gonna look like. Then we go out and shoot.

SPEAKER_02:

And it's super efficient for you, because now you're not wasting production time on site, because having to go reshoot is everybody's nightmare, right? Now the sun's different, and there are all the things you have to take into consideration. I think that's a really interesting use case, but it still requires you to go shoot the thing. And you brought up the Coca-Cola thing last night, right? For people who don't know, Coca-Cola had an ad that they made entirely through AI, and it was a very poorly thought-out process on their part, and an even worse release. It just did not go over well. A company that is Coca-Cola probably shouldn't be trying to cut corners. First of all, you have a brand expectation that you've already set in public. As a marketer, I thought that was a wild choice, because people expect a certain aesthetic, tone, and life to the things that you do, and then it was devoid of that, because they were like, well, we're gonna try something else. I appreciate the attempt, but making it your actual Christmas commercial? Bold strategy, let's see how it plays out. They should be calling people like you; they're always gonna need people like you. But I do think there's space for all these other small businesses that just want to have a better Instagram.

SPEAKER_04:

So now I want to play devil's advocate. Remember the Kalshi commercial that aired a couple months ago during a big sports game? Yeah. Kalshi is a betting app, where you can bet on the outcomes of real-life events. And South Park just made fun of it last night. Did you watch it? It was so funny, it was good. But they had a really successful commercial made with Google Veo. And I want to say, it blew up, it was trending, it was extremely popular and really well rated. However, I think at the same time, they did it on top of a trend, they hit it at the peak of the trend, and the ridiculousness, the timing, and the placement mattered so much more.

SPEAKER_02:

Yeah. Look, I think that's part of the thing. Every couple of years, specifically in the media space, but just in general, there are buzzwords, right? It was the year of connected television, the year of mobile, the year of second screens, the year of whatever. We've been doing this forever. Now it's AI. Everybody's gonna be doing AI. That particular thing, I actually appreciate the way they did it, because they leaned all the way into the ridiculousness of it, right? They played that really well, and it was a fun little thing. But they also were very upfront about what it was. They were like, this is basically an AI sludge video, and we think it's funny. And, sick, if that's actually your campaign, then you should do that. But trying to pass it off as not your campaign, pretending this is normal, this is what we've always done, while using a completely different technology and no human beings, or a limited set of human beings leveraging AI technology, that feels disingenuous.

SPEAKER_04:

When you asked us last night, I think the opening question was: where do you see AI? AI should be complementary to your workflow, not supplementary. It should not be used to replace an entire process; it's there to make things more efficient.

SPEAKER_02:

I mean, I'm gonna be honest with you, dog, I disagree. In your workflow, maybe in the creative industry, I think that's fair. But go talk to some people who work in office jobs, buddy. There are whole divisions that don't have to be there.

SPEAKER_04:

I'm speaking from a creative standpoint, from the video production and video curation side of things.

SPEAKER_02:

Then I agree with you there.

SPEAKER_04:

I'm speaking from my own experience. And yeah, there are some manual data entry jobs that could easily be solved with AI. Yeah, but you said it, some basic stuff. And this is where we go back to the ethical line: I don't care if you use AI for that. Those people do, though. Exactly. You don't care. So yeah, exactly. Well, so where is this ethical line in using AI as a replacement?

SPEAKER_02:

Well, here's what I think. I actually have an answer to that. First of all, and sorry if they're listening, but I personally think we've done a very poor job of picking the spokespeople of the AI industry. The people currently out there speaking, it's Sam Altman, it's Elon, it's Zuck. A lot of people with very strong opinions, but they're also shareholder-driven. They're profit-driven. And I think we need to be more careful with this, because I know how brains work, that's what I care about, and it's scary. There's some stuff we have to think about here. The way we solve that at the micro level, because the macro level we can't control, is people like you and me and Matt, everybody using it ethically, finding their own boundaries, and talking about where they think it starts to get ugly. Because the more we all collectively agree that certain things don't feel good, the more we just won't use them. It's a weird example, but everybody just canceled Disney over Jimmy Kimmel. We have power en masse, if we say things like: hey, you know what, this part feels gross, guys, so don't use it.

SPEAKER_03:

Right.

SPEAKER_02:

And if everybody just stops using certain parts, you can force hands. So I do think it's a groundswell, because let's be very clear: the companies operating this are capitalist organizations driven by shareholder value, and they're gonna chase the most money they can regardless of impact. They've proven that over the last year.

SPEAKER_01:

Yeah, yeah. That brings up a good point from an older podcast that we did with Zach Giglio, and he's so smart. Yeah, Zach and Emma. They're both so smart. They're working with small to medium-sized businesses on implementing AI into their workflows. Yeah. Those are the type of people you're talking about that we need more of. Because he's preaching: yes, everybody use it, everybody figure it out. Educate yourself. You would educate yourself on anything else, right? Educate yourself on AI so you can figure out what's wrong with it, and then we can all talk about it on the internet.

SPEAKER_02:

Well, and that's Washington, too. That's like going to Washington and talking. People like him, that's a great example. By the way, local dude, super, super smart. Him and his wife are both brilliant. And they care. They care about making small businesses more efficient and better, and leveraging the technology in ethical ways that are reasonable. Now, look, the truth is it is going to cost people some jobs. Hard stop. That's just the truth. Yeah. And that's a bummer, specifically in our current economy, because unemployment's up. Right. But what that means is: what are we doing to use AI to train those people on new things? Exactly. Those conversations are the ones we have to have locally, because they're not going to happen nationally. They're just not.

SPEAKER_01:

That's a great point. We're kind of wrapping up here, Jake and James.

SPEAKER_04:

Yeah, one of the other things I want to talk about that we touched on is the transparency and the legality of some of these models, and what we create and how it's created. You mentioned Disney earlier. Disney, Universal, and Warner Brothers are currently suing Midjourney, because when you type into Midjourney, create Darth Vader, an exact replica of Darth Vader will pop up, and that is their intellectual property. So with these models, you have to be careful, and that's why I really like what you brought up with ChatGPT. And, I forget who brought it up last night, but the first rule of copyright is that for a thing to be copyrightable, it needs to be created by a human. Yeah. So none of the work people are creating with AI is copyrightable.

SPEAKER_02:

Presently it's not, and there's actually a bunch of interesting litigation happening about what counts when you prompt a thing, so there's some interesting stuff that's gonna come out of that. Most importantly, and I can actually make this conversation super short: if you are creating images or video with AI today that's not on the Getty version, it is not permissibly usable commercially. Period. Hard stop. None of the other ones are, because we don't know, because they're not transparent about where the training data comes from. Listen, it's the same thing with ChatGPT. I use it every day and I'm not talking smack about it, but the vast majority of data that comes out of it is sourced from Reddit.

SPEAKER_01:

No kidding. Reddit. I do see a lot of Reddit results pop up.

SPEAKER_02:

Reddit.

SPEAKER_01:

Uh-huh.

SPEAKER_02:

Reddit is not a place where truth is very high on the fucking pecking order. Sorry, I did just say the F-word now. But it's just not. If you've been in Reddit forums, dog, it's not safe. It's just a bunch of people with opinions. Yeah. Being keyboard heroes, going, oh, this is what I think. Reddit's a scary place. I also love Reddit, because it's also an amazing place, but it's a big place. So yes, I think everything you do with it, you should be checking its work. And the nice thing about owning your own ChatGPT and paying for it is that my GPT has wildly explicit instructions about what it is and isn't allowed to do.

SPEAKER_04:

So you brought this up earlier, with ChatGPT scraping Reddit for data. As a creator, how do I protect myself and my data from being scraped by these companies?

SPEAKER_02:

You don't load it anywhere.
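One partial, concrete answer worth adding here: besides keeping masters offline, sites can signal crawlers through robots.txt, and the big AI crawlers publish user agents (OpenAI's is GPTBot) that you can disallow. Honoring the file is voluntary, so it only deters polite bots. A minimal check with Python's standard library, against a hypothetical site policy:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks an AI training crawler while
# leaving the site open to everyone else. Compliance is voluntary.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved GPTBot would be refused; a regular browser is fine.
print(parser.can_fetch("GPTBot", "https://example.com/portfolio/reel.mp4"))
print(parser.can_fetch("Mozilla/5.0", "https://example.com/portfolio/reel.mp4"))
```

The same file, served at your site's `/robots.txt`, is what compliant scrapers check before pulling pages.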

SPEAKER_04:

Yeah, and that's just not possible. You know, I think one of the ex-Meta executives came out and said that if the government banned data scraping, it would completely ruin the AI industry.

SPEAKER_02:

It wouldn't ruin the AI industry; it would change it dramatically, probably some for the better and some for the worse. Here's what I would say as a creator. But let's talk about what data scraping is first. Data scraping is basically this: you can build a web scraper on anything. Think about a spider. If a spider had eight legs and you put it in the middle of a page, it would reach its legs out, grab everything, and bring it back to the web. That's what scraping is. Scraping is going to a page, pulling all the things down, categorizing them using AI by meta-tagging, saying, okay, this is talking about this, this is talking about that, and then smashing it into a database and going, okay, now we can query it. Now, YouTube, oddly, has done a really good job of making that really hard, because they built all this tech up front, using AI by the way, to know if something is copyrighted content. So you can't just put Spider-Man on YouTube. It will get taken down and they'll sue you. They don't mind. They've got time and they've got lawyers, right? So what I think about for you as a content creator: make sure you're using platforms that you know stand behind the process. YouTube and Vimeo are both really good, because they care about that stuff, because they're creator-focused, because that's how they make money. Putting your stuff up on Instagram, the Meta place, everything in Meta right now is a little sketch, because we don't really know what they're doing and they're not super transparent. Should you be putting stuff on TikTok? There are a lot of questions about that one. It depends on what you're trying to do, right?
So if you're creating original content, make sure you're sourcing it and, first of all, storing it somewhere you have your own instance that's not a shared drive. Pay the extra for an S3 bucket that's yours.
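The spider-legs picture maps onto code almost directly. Below is a toy version of the grab, tag, and store loop using only Python's standard library; the page content is made up, and a real scraper would fetch it over HTTP (subject, ideally, to robots.txt):

```python
import sqlite3
from html.parser import HTMLParser

# Toy page standing in for a fetched URL.
PAGE = """
<html><body>
  <h1>Lighting a night exterior</h1>
  <p>Use practicals and negative fill to shape the frame.</p>
  <a href="/gear">gear list</a>
</body></html>
"""

class TextGrabber(HTMLParser):
    """The 'spider legs' step: reach into the page and pull every
    text node and link back out."""
    def __init__(self):
        super().__init__()
        self.texts, self.links = [], []
    def handle_data(self, data):
        if data.strip():
            self.texts.append(data.strip())
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

grabber = TextGrabber()
grabber.feed(PAGE)

# The 'smash it into a database' step, so it can be queried later.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE scraped (source TEXT, kind TEXT, content TEXT)")
db.executemany(
    "INSERT INTO scraped VALUES ('toy-page', ?, ?)",
    [("text", t) for t in grabber.texts] + [("link", l) for l in grabber.links],
)

rows = db.execute("SELECT kind, content FROM scraped").fetchall()
for kind, content in rows:
    print(kind, content)
```

Scale this up across billions of pages, add model-based tagging instead of the crude text/link split, and you have the pipeline being described.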

SPEAKER_01:

I'm gonna start paying for mine right after this.

SPEAKER_02:

I can talk to you about that. Yeah, or keep it offline. There are ways to do it, but it's about educating yourself so that you know you're set up to keep it safe. And then be really careful about how you distribute stuff, or don't be careful and understand that it's no longer yours. It's a choice, right? Right. And you just have to make the one that's right for your particular business and your ethical line. As long as we're transparent about what's happening, uh-huh, and that's the part that I'm mad about: I don't think we're telling people enough information. Right. But if you're transparent, look, if it's on the internet, it's out there.
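"Keeping it safe" can be as lightweight as a checksum manifest of your masters: a fingerprint per file that you store alongside your source notes, so you can later show exactly what you created. A minimal sketch using only the standard library; the folder and file names here are invented for the demo.

```python
import hashlib
import json
from pathlib import Path

def fingerprint(path: Path) -> str:
    """SHA-256 of a master file, read in chunks so large video
    files don't have to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def build_manifest(master_dir: Path) -> dict:
    """Record a stable hash per file in the archive folder."""
    return {p.name: fingerprint(p)
            for p in sorted(master_dir.glob("*")) if p.is_file()}

# Demo with a throwaway folder instead of a real masters archive.
demo = Path("masters_demo")
demo.mkdir(exist_ok=True)
(demo / "episode_01.txt").write_text("original master content")

manifest = build_manifest(demo)
print(json.dumps(manifest, indent=2))
```

The same manifest works whether the masters live in a private S3 bucket or on an offline drive; the hashes change only if the files do, which is the point.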

SPEAKER_01:

Hard stop.

SPEAKER_02:

I always laugh when people say things like, oh, they're watching me on my phone. I'm like, dog, if you knew how advertising works: they know everything about you. Do you have an Amazon account? You're screwed. Do you have an Amex? You're screwed. Everybody sells data. That's part of the ecosystem, that's how it works. Facebook's not list... well, Facebook might be listening, but Facebook's not listening the way you think. But every time someone says, I was talking to my dad about lawnmowers, and then I got a lawnmower ad, I can tell you digitally how that happens. And it's not because the mic's on.

SPEAKER_01:

That's funny.

SPEAKER_02:

It's because your dad probably looked at a lawnmower the day before, which is why he brought it up with you. And your phone has a device ID, and that device ID showed up in his house, on his IP address. Now they know you were in his house with your phone while he was looking at lawnmowers. Well, you might talk about lawnmowers. So they're going to show you a lawnmower ad to see if you pay attention. If you click on it, guess how many lawnmower ads you're gonna get? Oh, yeah. All of them. That's how that happens. So it's not that Facebook's listening, it's that you have a digital footprint that you carry around with you everywhere you go. Right. And the second you hop on another network, ha, gotcha. Now they know where you were. It's crazy, the stuff that people don't understand. And again, that's the lack of transparency in the digital space, of just being honest about what we're actually doing here.
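The cross-device logic JT walks through, two device IDs seen on the same IP, so one device inherits the other's interests, can be sketched as a toy household graph. Every field name and value below is invented for illustration; real ad-tech graphs use far more signals.

```python
from collections import defaultdict

# Toy event log: which device ID was seen on which IP, and what it browsed.
events = [
    {"device": "dad-phone",  "ip": "73.0.0.1", "viewed": "lawnmower"},
    {"device": "your-phone", "ip": "10.0.0.9", "viewed": "sneakers"},
    {"device": "your-phone", "ip": "73.0.0.1", "viewed": None},  # visiting dad
]

def household_graph(events):
    """Group device IDs by the IPs they've shared --
    the 'your phone was in his house' step."""
    by_ip = defaultdict(set)
    for e in events:
        by_ip[e["ip"]].add(e["device"])
    return by_ip

def candidate_ads(device, events):
    """Offer a device the interests of every device it has shared an IP with."""
    graph = household_graph(events)
    neighbors = set()
    for devices in graph.values():
        if device in devices:
            neighbors |= devices
    return {e["viewed"] for e in events
            if e["device"] in neighbors and e["viewed"] and e["device"] != device}

print(candidate_ads("your-phone", events))  # -> {'lawnmower'}
```

One shared IP is enough to link the two phones, which is why "the mic is on" is never the needed explanation.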

SPEAKER_01:

Yes. God dang it, JT. I dropped the mic, but it's on a stand. One of the more interesting points of last night, though, if you recall, talking about privacy and not duplicating, was that my friend Matt Grayson over at Level Up had a clone of me being made. So I'm going into AI. I'm learning everything I can, right, Jake? You judge away, buddy. I suggest you do the same. But I couldn't watch the clone because it thought I was a celebrity. Yeah. That's what he really wants to end on. Yep, that was it. That was exactly what I wanted to end on. So go ahead, listeners, make a comment on what celebrity you think it thought I was. Steve Buscemi. Who? Steve Bashem... Buscemi. I've been crazy guys all day long. Very good actor. You were like, crazy guys, money. Incredibly good actor. Oh, incredibly good actor. Guys, wow. Um, Matthew wasn't even working. He was just watching and listening to you guys, I think. Our sound engineer. Hopefully you guys will be a little bit more... yeah, no kidding. But thank you guys, JT, Jake. Any final words?

SPEAKER_04:

No, this is great. I mean, like you said, JT, I think this show's great. One thing: as a society as a whole, I feel like we're heading toward a place where people can have opinions on stuff and then absolutely hate each other. I think this shows that we can have opposing viewpoints but still get along, and go have a beer afterwards and laugh about it.

SPEAKER_02:

That's it, man. What I would say is: experiment intelligently. That's what I encourage. And ethically. Well, yeah, I think being ethical is being intelligent. That's what I encourage my kids to do all the time: experiment intelligently. Go see what this stuff does, be aware of what it is, and then make your own choices. And don't trust anybody else to make the decision for you. Go make your own choice.

SPEAKER_01:

God love it. This is what podcasts are all about. Thank you to our sponsors, Charleston Media Solutions. Thanks to JT and Jake. Uh, which one? Augie.studio? Sure, just go with that one. KozFilms.com.

SPEAKER_04:

KozFilms.com.

SPEAKER_01:

K-O-Z Films dot com. K-O-Z Films dot com. Um, Charleston Media Solutions. Let's thank DJ Jerry Feels Good. Why not? We haven't given him a shout-out in a while. Thank my mom. What's up, Ma? Thank your mom.

SPEAKER_04:

Thank you, Matthew, for being on the board today.

SPEAKER_01:

Yeah, thank you, Matt. Yep, yep, exactly. And um that's it. We'll see you guys next time, folks.