[00:00:00] Speaker A: This episode brought to you by Studio Network Solutions. Media teams have enough things to worry about.
[00:00:08] Speaker B: Storage shouldn't be one of them.
[00:00:10] Speaker A: That's where Studio Network Solutions comes in. SNS makes your shared storage, media management and cloud workflows easy so you can.
[00:00:19] Speaker C: focus on what you do best: creating.
[00:00:22] Speaker A: See how SNS can help your team at studionetworksolutions.com.
[00:00:33] Speaker A: Welcome to Broadcast to Post. Today's topic is deploying AI effectively in creative teams. We're thrilled to be joined by Blackspot, a forward-thinking creative agency based in New York. Over the past few months, Blackspot has been at the forefront of integrating AI into their creative media pipeline. In this episode we discuss the boundaries and guidelines post facilities need to consider when using AI in client projects. While many focus on which AI tools to adopt, few discuss how to roll these new systems out to staff and clients. We'll explore the challenges of migrating production teams to AI, cloud versus air-gapped solutions, managing security and legal concerns, internal policies, ethics, team adoption, and the hidden pitfalls facilities should consider. Joining me today from Blackspot is John Laskas, creative director and founder, and Anthony Carvalho, director of post production at Blackspot. Gentlemen, thank you for being here. John, let's start from the beginning. How did industry trends and expectations influence your decision to adopt AI at Blackspot?
[00:01:36] Speaker C: Well, we've really seen a lot of change in the industry over the last five to six years. Realistically, the pressure to close that fast, good, and cheap triangle is getting greater all the time. Audiences want to see more content than ever before, clients want to see more content than ever before, you have to put it in more places, and it's got to be really, really good at the same time. We're in marketing and entertainment. We make promos and trailers and things of that nature in addition to long form, but we do a lot of short-form stuff. And with the decline in production due to Covid, and then the two strikes, we found that a lot of our clients are relying more than ever on library content. So how do you create new stuff from these vast reservoirs of content that our clients are trying to mine? Sometimes we're looking at on the order of a thousand hours of footage. How can you economically go through that, look at it, process it, and then creatively come up with a way to utilize it if you are simply going to do it with traditional post methods? We found it was almost impossible to do that without employing hundreds of people. So Anthony and I started looking around for tools that would help us automate the workflow a little bit, to give us an advantage, to make world-class stuff that we can put anywhere almost instantly. We started about 18 months ago looking at some tools that were not quite mature, but over the last year and a half things have really exponentially improved and given us an advantage that we wouldn't have otherwise had.
[00:03:13] Speaker A: So Anthony, when Blackspot was looking at AI, what were you looking for? Was it tools that would edit, or tools that would help log? What were you looking for at the big trade shows?
[00:03:26] Speaker B: Well, we were looking for improvements across the board, really.
We had been working internally on a system called Thresher that takes the very last step of post production, the deliverables process, and automates it: all the reformatting, the resizing, delivering, adding bugs to specific versions, all of the unique exports that you have to do. Really, the stuff that computers should be doing anyway.
So we found a way to do that, and it saved us, I mean, a 90-to-1 ratio difference in hours.
So we were obviously very interested in finding anything else like that out there that either exists or, you know, on the horizon.
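To make the kind of deliverables automation Anthony describes concrete, here is a minimal sketch of batch-exporting versioned deliverables from a finished master with ffmpeg. This is illustrative only, not Blackspot's Thresher: the file names, presets, and bug graphic are hypothetical, and it assumes ffmpeg is installed on the system.

```python
# Illustrative only: a minimal batch-deliverables script in the spirit of what
# Anthony describes. Paths, presets, and the bug file are hypothetical; a real
# pipeline (like Thresher) would be far more involved.
import subprocess
from pathlib import Path

MASTER = Path("master_prores.mov")      # hypothetical finished master
BUG = Path("network_bug.png")           # hypothetical corner bug / watermark

# Each deliverable: output name, target frame size, and whether to burn in the bug.
PRESETS = {
    "broadcast_1080.mov":  {"size": "1920x1080", "bug": False},
    "social_vertical.mp4": {"size": "1080x1920", "bug": True},
    "web_720.mp4":         {"size": "1280x720",  "bug": True},
}

def render(master: Path, out_name: str, size: str, bug: bool) -> None:
    """Run one ffmpeg export: scale/pad to the target frame, optionally overlay a bug."""
    w, h = size.split("x")
    vf = f"scale={w}:{h}:force_original_aspect_ratio=decrease,pad={w}:{h}:(ow-iw)/2:(oh-ih)/2"
    cmd = ["ffmpeg", "-y", "-i", str(master)]
    if bug:
        cmd += ["-i", str(BUG)]
        # Scale/pad first, then overlay the bug in the top-right corner.
        cmd += ["-filter_complex", f"[0:v]{vf}[base];[base][1:v]overlay=W-w-40:40"]
    else:
        cmd += ["-vf", vf]
    cmd += ["-c:a", "aac", out_name]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for name, spec in PRESETS.items():
        render(MASTER, name, spec["size"], spec["bug"])
```

The time savings come from the same place Anthony points to: once the presets are defined, every additional version is another loop iteration instead of another manual export.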
[00:04:22] Speaker A: So it sounds like a lot of the tools that you were looking into were things to optimize, kind of the grunt work, right? The reformatting, the logging as opposed to putting together something that was just ready to go.
I think what a lot of folks are interested in, because there are so many AI tools out there: are you able to kind of walk us through what your current AI workflow is?
[00:04:43] Speaker C: Yeah, and we had several. I mean, I think with any innovation, you're always going to be looking to combine tools that are excellent and put them together in a novel way in order to make a greater improvement overall. So we don't have any one tool that's a silver bullet. We do have tools that help us with logging.
So one of the tools that we use is called Cara 1, and that is a system that lets us look at huge, huge libraries of content and automatically meta tags it. And it's not really meta tagging; we're not actually looking at meta tags. What it lets us do is type in, in a natural language processing kind of way, the way that you use ChatGPT or another large language model. We can ask it questions as if we've got the best assistant editor on the planet, or the best team of assistant editors on the planet. That lets us really find B-roll. It also helps us find situations that we would not otherwise be able to find, because it will actually infer, based on your query, the kind of results that might work well for you. And then it ranks them. It really gives editors or producers or writers the ability to understand hundreds and hundreds of hours of footage simultaneously.

We use another tool called Quickture, which does do some automated assembly of cuts. Again, it's sort of like working with a really talented assistant editor. You can feed it a bunch of sequences and say, okay, from this show, let's say we worked on an episodic crime procedural and they wanted to put it on a social media channel. We needed two-to-three-minute cutdowns, and we needed to end with a cliffhanger in order to drive back to their streaming platform. We're able to just put a show in; in about 10 minutes it processes it, logs it, ranks sentiment, and then we can query it and say, make this two-minute cutdown or make this three-minute cutdown, start with a joke, end with a cliffhanger, and don't show any dead bodies. And it's incredibly effective at doing that. It can do it in about 10 or 15 minutes. So we're really watching our workload decrease by a ratio of about 10 to 1 in those situations.

But the more interesting thing is that because it's doing the work that assistant editors were doing, those assistants are now free to be creatives. Because they have access to all this stuff and they don't need to be doing the grunt work, the slogging work that's difficult about the post production process, all of a sudden the creative we're getting out of our junior people is just as good as what we're getting out of a lot of senior people. So it's really expanding our ability as a creative agency to use everybody to their maximum potential.
[00:07:30] Speaker A: So if I understand correctly, just to paraphrase some of the things you said: you're loading the libraries of footage, the previously shot content, the finished content, and indexing that utilizing Cara 1, which is from Scale Logic. Then you're using Quickture, and your assistant editors are prompting, or crafting a prompt, to take the results from Cara 1 and put them into an assembly cut. That's then reviewed and refined, and an editor is making the edit more pleasing for the end client.
[00:08:06] Speaker C: Right. So the editor is still there. You really need a talented team of people to do this; you can't just rely on the tool. You really need to have top-notch creatives doing that work. You're just giving them an advantage. We equate it to the introduction of the NLE.
[00:08:23] Speaker B: Right.
[00:08:23] Speaker C: So before the NLE came out, before Avid was introduced, every time that you needed to make a change to film, it cost a fortune. Or if you were doing tape to tape, the process was just onerous. When you got an NLE, you could go to cut 30, right? There's an undo button, right? So this is sort of the same way. And to give an example, we were working with one network. They gave us 25 years of shows and asked us to make one Christmas or holiday-related spot. With a team of five people, in two days, working with a thousand hours of footage, we made 25 spots, and they were all great.
So that's the potential of what you can do. Instead of going to cut 30, now just make 25 different versions, or make as much stuff as you possibly can and decide what's going to work. Or plug that into an algorithm on social, where you're not doing A/B testing, you're doing A-through-Z testing. So you know which version is actually going to work, which is actually going to engage people and drive back to whatever platform your call to action is trying to send people to.
[00:09:32] Speaker A: That brings up a really good point, Anthony. Could you kind of talk about the infrastructure that was needed to accommodate these AI tools?
[00:09:41] Speaker B: The infrastructure wasn't all that different from what we already had as an already running studio. The amount of hardware that we added barely takes up two spaces on the rack. A lot of this stuff is just software.
Computers that do post production work are already up to a certain standard, so this is just running software on top of those. As far as actual infrastructure, it's not much different than what most people already have.
[00:10:16] Speaker A: And these tools worked with multiple NLEs or was this kind of siloed to just one type of video editor?
[00:10:23] Speaker B: They did not work with any NLEs when we first got them.
In fact, that was one of the first things that we worked on with Scale Logic, with Cara 1. Because you would do all of your searches and it would be like, oh great, here are all my searches. And those searches were awesome, don't get me wrong.
But we worked with them on getting an AAF export so that it just shows up as a timeline in your edit system, whether you're in Premiere, Avid, or whatever.
That's kind of the thing that's been cool working with a lot of these companies so early in the process.
It's kind of good for both of us. Right.
They're building the tools that we need and they're figuring out what the needs of the customers or the user base are. Right.
So yeah, so it's been a good, it's been a fun process doing that.
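The AAF hand-off Anthony describes is essentially "turn ranked search hits into a timeline the NLE can open." Below is a minimal sketch of that idea using OpenTimelineIO; it is not Cara 1's actual export path. The clip paths, frame ranges, and 24 fps rate are hypothetical, and writing real AAF for Avid would go through the separate otio-aaf-adapter rather than the native .otio format shown here.

```python
# Illustrative only: not how Cara 1's export actually works. This sketches the
# general idea of turning search hits into an editable timeline with OpenTimelineIO.
import opentimelineio as otio

# Hypothetical search results: (media path, start frame, duration in frames).
HITS = [
    ("/mnt/library/A012_C004.mov", 120, 96),
    ("/mnt/library/B007_C011.mov", 48, 72),
]

RATE = 24.0  # assumed frame rate for the demo
timeline = otio.schema.Timeline(name="cara_search_results")
track = otio.schema.Track(name="V1")
timeline.tracks.append(track)

for path, start, dur in HITS:
    clip = otio.schema.Clip(
        name=path.split("/")[-1],
        media_reference=otio.schema.ExternalReference(target_url="file://" + path),
        source_range=otio.opentime.TimeRange(
            start_time=otio.opentime.RationalTime(start, RATE),
            duration=otio.opentime.RationalTime(dur, RATE),
        ),
    )
    track.append(clip)

# Write an interchange file the NLE side can pick up. True AAF for Avid would
# use the separate AAF adapter; .otio is OpenTimelineIO's native format.
otio.adapters.write_to_file(timeline, "search_results.otio")
```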
[00:11:15] Speaker C: Yeah. And a lot of the tools will exist as either a panel in Premiere or, you know, through AAF with Avid. Obviously Avid's got a different API situation from Premiere, but you do have a lot of options. There are on-premises tools that you can use; we use those, and we use web-based tools as well, and they both have their advantages and disadvantages. If you're working with a web-based tool, like there's a tool called Social Department, which is really amazing, it will take almost any piece of content you have and customize it for a specific audience or a specific demographic, and that's entirely web based. So you upload to the cloud, it's all completely secure, and it processes there. Then your investment from an infrastructure point of view is nothing. All you're doing is paying them for the storage; you pay by terabyte or by whatever metric they have. It seems to us like about 80% of the tools that are currently available are cloud based.
And then for the on-premises stuff, you're buying a server, and they really are not tremendously expensive or difficult to integrate. We have, I think, three or four different kinds of shared storage here, and all the tools work very well with all the storage we've got.
[00:12:30] Speaker A: John, you mentioned utilizing various tools. Some were web based, some were on prem, which kind of brings up the point of hidden expenses or hidden costs you didn't anticipate. So when speccing this out and looking at the financials, what were some hidden costs that you might not have thought of beforehand?
[00:12:50] Speaker C: We have actually been very surprised at how economical it is, because what you're getting, I mean, as a. As an owner, what you're getting out of these tools is so much more powerful than whatever the expense would be.
You know, when the pandemic started, our first attempt at going remote, and we're completely remote at this point, was using a remote product from one of the larger NLE companies. It was considered a very powerful tool, but it was expensive. It was cheaper for us to actually replicate that on premises.
Once we had that infrastructure, applying everything else, applying these AI tools, was not onerous at all.
It's sort of shocking, because you are freeing up so many people to do so much more stuff. If you have an expense on the order of $50,000 to $100,000, you're gaining so many more people. It's really a force multiplier for your staff and for your actual creatives, so the cost really becomes negligible. It approaches zero, because you will find that you are so much more effective as a creative team by using these tools.
[00:14:06] Speaker A: For folks who haven't worked with AI or even asset management systems, one of the core things is making sure that the media that's being indexed or meta tagged has the right tags, so the computer systems know what to do with that content. So, Anthony, how did you approach the difference between metadata tagging versus this kind of indexing?
[00:14:27] Speaker B: Well, now we don't meta tag anything at all.
With meta tagging, you're limited by whoever is doing the description. Right? So you might miss something. It really depends on how granular you want to get with the data. When you're doing strict meta tagging, you're limited by whatever tags are there, so you'll get omissions. For instance, if I'm looking for an audio bite that says this guy is the MVP, and it was recorded as "this guy's the most valuable player," I won't get that unless somebody added it to the meta tags.
So I find meta tagging super limited. So we, yeah, we don't really tag anything anymore, which has been great.
[00:15:32] Speaker C: Yeah. And, you know, you can get that kind of metadata out of the tools. But to Anthony's point, you're really not using the tools to their proper potential if that's all you're doing. You can search in PhraseFind, or you can search a transcription and find the specific stuff that you want, but you're not going to get that inferred stuff. I was working on a documentary and I was stuck. I couldn't make a connection that I needed to make. So I just typed into one of the tools, tell me why nitrogen and phosphorus are so damaging to the Chesapeake Bay, and within seconds I had 10 clips and I could make that connection. If I had gone through and searched the transcript, which I had been doing previously, I was not finding anything that was useful for me, because I wasn't asking the question in the right way. So it really gives you the power. You're just talking to a very knowledgeable colleague; that's what you're getting out of these tools. So you sort of need to change the way that you think about what post production is, and also change the way that you think about how you manage media.
Because it is phenomenal the difference that this stuff will make.
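The difference John and Anthony are describing, exact-match tags versus inferred, ranked results, is what embedding-based semantic search does. The sketch below is illustrative only: it is not Cara 1's implementation or API, and the clip log and model name are assumptions for the demo. It assumes the sentence-transformers Python package is installed.

```python
# Illustrative sketch of semantic search over clip descriptions/transcripts.
# NOT Cara 1's API; the clip log and model choice are assumptions for the demo.
from sentence_transformers import SentenceTransformer, util

# Hypothetical clip log: clip id -> what was said / what the shot shows.
CLIPS = {
    "INT_014": "coach says this guy is the most valuable player on the team",
    "GAME_202": "wide shot of the stadium crowd cheering at night",
    "INT_031": "player talks about rehabbing his knee during the off-season",
}

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed small embedding model
clip_ids = list(CLIPS)
clip_vecs = model.encode([CLIPS[c] for c in clip_ids], convert_to_tensor=True)

def search(query: str, top_k: int = 3):
    """Rank clips by semantic similarity to a natural-language query."""
    q_vec = model.encode(query, convert_to_tensor=True)
    scores = util.cos_sim(q_vec, clip_vecs)[0]
    ranked = sorted(zip(clip_ids, scores.tolist()), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_k]

# An exact-match tag search for "MVP" would miss INT_014; an embedding search
# still ranks it highly because "MVP" and "most valuable player" sit close
# together in embedding space.
print(search("find me a sound bite calling this guy the MVP"))
```

The same mechanism is what lets a query like John's Chesapeake Bay question surface clips whose transcripts never use the exact words of the query.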
[00:16:44] Speaker A: Both of you have mentioned that you were working with these companies very early on. In fact, you mentioned that one of the tools didn't even work with NLEs at the time. So that is just one challenge. Anthony, what are some of the other challenges that you encountered when deploying AI at Blackspot?
[00:17:01] Speaker B: Oh, wow, that's not a short list.
One of the issues, one of the problems, is just inherent in, you know, the term AI itself. It's such an open sandbox. Right.
And every single problem requires either a different set of AIs or a combination of them.
So, you know, literally, almost by design, every new project is a new design problem. Right. You've got to figure out, like, oh, we're just doing cutdowns of this thing. Okay, cool, let's use this tool for that. Or we're working on a documentary, like John just said; let's use the NLP to figure out some scientific facts or whatever.
So, yeah, every day is a different challenge, and they're mostly unforeseen.
[00:17:56] Speaker C: Yeah. And I would add, staffing wise, when you introduce these tools, people are reluctant to use them. Obviously, when you talk about generative AI, which is not what we're doing, we can't do anything generative for legal reasons, people get worried. Your writers are going to freak out; they're going to say, well, it's going to take my job. And your editors are going to freak out, because they say, well, why am I not the right person to do this job? So convincing people that these are just tools that enhance their creativity, instead of something that is coming for their job or coming to take their profession away from them, is a challenge. And we still have people on staff, we have an enormous show that we do, where a lot of the team won't use the tools because we haven't convinced them yet. But if you start slowly, if you start with something incremental, something as simple as, like, Trint, where you're getting your transcriptions done, or if you're doing a transcription in Premiere, just starting small really helps you introduce people. The resistance is real and warranted. But one of the great things about having a forum like this is to convince people that you don't need to go the generative route in order to reap the rewards from what this technology can do.
[00:19:19] Speaker A: I think that's a really important point. And for those who haven't been dipping a toe in the AI pool, there's a big difference between generative AI, which generates new media, and analytical AI, which some companies call utility AI, which is more about applying computer vision and AI models to content and then saying, this is what I'm seeing, quote unquote, in the content.
And as you mentioned, it's getting the content ready for the creatives to put the human touch on. It's not replacing that part of the creative process.
[00:19:54] Speaker B: Yeah, that's kind of the whole point. And one of the examples I use: there's a color software, Colourlab Ai.
You know, if I'm coloring a commercial, I usually have about a day to do it, a 30 or 60 seconds or whatever. 70% of my day is just leveling. Right. Just getting the blacks to match, just your balance.
[00:20:16] Speaker A: Pass.
[00:20:16] Speaker B: Yeah, yeah, yeah. Just. Yeah, my first pass. Right.
This does it in like 15 minutes. Right. So I'm not going to spend half my day just balancing; I'm going to spend most of my day doing what a colorist actually does, like color. I'm going to have extra time to do some windows passes. I'm going to make it look nicer as a result. Right. I'm not taking that work away. I'm just making it better in the same amount of time.
[00:20:42] Speaker C: And just as another anecdote, which I love to talk about because it just seems like magic to me: Anthony was once working on a project that was shot anamorphic, in PAL, and it came in and it wasn't soft, it was out of focus, and it was 40% of the content for the show.
He put it back in focus easily, and the network, you know, wasn't even concerned. The tools are crazy, and they just get better every day.
[00:21:13] Speaker A: I'm sure a lot of facility owners are concerned about the security and data protection. Right. We've seen some web based AI systems potentially utilizing user input without their approval. We've seen some newer AI systems that may be phoning home to China. So what are some of the things that you've had to worry about in terms of the data protection and security of the content that you're putting through AI?
[00:21:37] Speaker C: You have to worry about it all the time. Not from the vendor point of view, but with your clients. You need to be very clear about the tools that you're using and make sure that they're okay. Because we have some clients who will not let us work with certain tools.
Fortunately for us, the ones we work with on a regular basis, especially the on-prem stuff, are closed large language models. They do not look at the Internet, they do not talk to anything. They exist purely inside that box and nothing can escape. And it's very sound.
Some of the other tools that are out there, yeah, you really do need to make sure that they're MPAA secured. And you know, anybody who's in this business, I mean we all get audited for security all the time.
These automation tools also need to be audited all the time to make sure that they are compliant with the terms of service or the scope of work that you've signed with each client. We have a large sports client; it took us eight months to get through legal, and they preclude us from using some of the tools that we have access to.
We had another client where another vendor had used a tool to simulate an actor's voice, just for one or two words. The showrunners saw it, freaked out, and it took us a long time to write that contract. Because people need to understand that their data is not going to be used to train anything. They need to know that it's not going to leave your facility, or, if you're using a cloud based service, that it's not going to leave that service and be used for any other purpose. So the thing that we recommend highly is: have your lawyers talk to your client's legal department, disclose everything, and every time you add a new tool, have that conversation with your client.
[00:23:15] Speaker A: So we've touched on this a little bit, and I kind of want to pull on this thread a little bit more, and that's the ethical considerations of using AI, whether it be personal moral beliefs or business practices. Can you elaborate a little bit more on the ethical considerations that you went through before deciding to pull the trigger on AI?
[00:23:34] Speaker C: I mean, so you know, anytime that you introduce a new tool or anything that's going to affect the creative process, you really have to work with your creatives to make sure that they're comfortable with it. Because that is the heart of what we do. You know, if the creatives aren't happy, you're not going to have a good product.
So that's the biggest thing. We are, you know, 100% confident that the tools that we are using are simply there to give everybody who works with us a better experience, let them make better stuff, give them opportunities that they wouldn't otherwise have, and also let them make more stuff, and better stuff.
So, you know, that is a conversation that you need to have with the people who are actually creating the stuff that you put out into the world. And if they're not comfortable with it, some people simply won't use it, and that's also okay. As far as larger considerations about working with other people's intellectual property:
We will not employ a tool unless we know it is completely secure and completely closed. And we have some stuff that we developed proprietarily, so we know that's secure. We constantly have conversations with our vendors to make sure that we are not ever going to leak anything into the outside world. And I think that if you stay with that utilitarian or iterative AI kind of stuff, there really aren't any issues you need to address beyond the personal ones with the people who are making the stuff. If you get into generative stuff, it's a completely different world, and we won't touch it.
[00:25:04] Speaker A: Anthony, from a user level, what kind of policies or guidelines did you put in place for the creatives so that you did kind of walk that line between, you know, ethical and responsible AI use?
[00:25:17] Speaker B: Yeah, I think in the initial talk we had with everybody, a lot of people, especially the older editors, had...
[00:25:30] Speaker C: A little.
[00:25:30] Speaker B: Trepidation in the beginning. But, you know, I can understand that. But like John just said, if people don't want to use it, cool. You want to do math longhand? Go for it, you know.
But ethics wise, we really didn't have any moral ambiguity. I mean, we're not doing anything generative. We're just handing you tools to make your job faster and better.
[00:26:00] Speaker C: We got really lucky when the tools were sort of in their infancy.
One of our most experienced creatives was really reluctant; he didn't want to try it. But he took the chance and used some of the automation stuff.
First he made his own cut, and then he applied the tools. The tools gave him about 80% of where he had wound up without them, except that they did it instantly, whereas he had spent two or three days coming up with his original cut. And when he saw that the end product is still the stuff coming out of his mind and his own creative vision, that it just got him there faster, all of a sudden he was the biggest convert. So that's sort of the way that it goes. This does not get in the way of your own personal vision. It just lets you get there faster.
[00:26:52] Speaker A: We talked a little earlier about lawyers and some of the potential legal ramifications of utilizing AI. Have there been any certifications or any kind of audits that you've been through where things check out? So folks who are watching this know that there is a path to be, shall we say, given a thumbs up that these tools can be used.
[00:27:12] Speaker C: Yeah, and it depends on the client. Everybody's got their different issues. Some gigantic networks are normally okay with their standard audits, because those are so in-depth anyway. Smaller clients want to talk more about the individual tools: they want to talk to the engineers, and they want to talk to the company that manufactures it. If it's something that you haven't developed de novo, something that's not in house, they want to talk to the engineers who are actually developing it. Sometimes they'll say no, and that's okay. There are a lot of tools out there, and more come out every day. I just can't stress enough that you need to be transparent about it.
[00:27:55] Speaker A: You mentioned something earlier that I also want to examine a little bit more. You mentioned voice cloning, right. It was used for just a few words, and that was the one...
[00:28:09] Speaker B: A client that we work with had another company, or somebody in house, I think, use it, and that's what happened. And then they had, like, an across-the-board policy: no AI in this house, kind of thing.
[00:28:28] Speaker C: But that's an overreaction, and we understand why. Obviously, if you're using that kind of technology, you're in violation of all the SAG contracts that were just signed. So we can understand why there's that overreaction to close stuff down. This is a process of not only teaching yourselves how to use these tools, but also teaching your clients how to use them, and that it's okay to use them. Let everybody be nervous about it, let everybody take small steps, because the rewards are so great. Usually people come around pretty quickly.
[00:29:00] Speaker A: The proliferation of AI tools is predominantly in the cloud.
[00:29:04] Speaker B: Right.
[00:29:04] Speaker A: There are obviously tools that you're running on prem. So I'm kind of curious: how did you evaluate the cloud based tools versus the on prem tools, and what made you decide to go with a more pay-upfront model on prem versus pay-as-you-go in the cloud?
[00:29:21] Speaker C: Anthony told me not to be an idiot is what happened.
When we started looking, he just said, try every tool you can; what do you have to lose? And that was basically it. As the owner of the facility, I prefer the on-prem stuff because I always know what my costs are going to be. But if you're using stuff that's in the cloud, you just build that into the budget. There really isn't a significant difference in utility; both are valid. You're going to find a lot more stuff in the cloud than you will on premises, but that shouldn't scare you away from it.
[00:30:04] Speaker A: So looking back, are there things you wish you knew when you first started? Advice you can give folks who are looking to incorporate AI into their traditional post production pipeline?
[00:30:16] Speaker C: That sounds like an anti question.
[00:30:18] Speaker B: Well, looking back, the only thing I could say is, you know, just try everything. I mean, most of these things have trials, right? And there's a lot of options. So the whole risk-reward is not that crazy here. Right. And if you don't like it, you don't use it.
I wouldn't tool up specifically for a job having never done it before. So use this time to figure out what works for you and what doesn't, because everybody's going to need a different solution, because everybody's working on different types of things, and with all of these things, the cement is still wet. So make relationships with the companies where you're probably among the first or first ten customers; they'll work with you. At least that's what I've found in our experience so far. But yeah, like I said, just go out there and try it.
[00:31:21] Speaker C: Yeah. And don't expect everything to work, or anything to work per se. You really need to combine them, you need to work with them. You also need to be willing to experiment and to push as hard as you can.
There's nothing that I would say that I wish we knew because we didn't know anything when we started with this, because it didn't really exist. We had spent some time working on our own natural language processing system. So we knew a bit about what it was and we knew what the limitations were. But yeah, it's a constant process of making sure that you communicate with the vendor because they do want to talk like they want to make it better. This stuff learns all the time. It gets better every week, it gets better every month. So just always ask questions, try everything and try to break it. And if you try to break it and try to combine things, then you'll wind up with the tool that really will help your team accomplish amazing stuff.
[00:32:16] Speaker A: So in production and post production, a lot of folks are aware of the Holy Trinity, right? Good, fast, cheap, and you can only pick two. So I understand, John, that there's kind of an augmentation you'd like to make to that.
[00:32:28] Speaker C: I mean, I almost wish we didn't have to solve that problem. I would rather everything be good, fast, and expensive. But realistically, that's not where the market is, and that's not what people need. People need good, fast, and cheap. It was not possible to do that before. It is possible now, and it's going to get more possible going forward.
So, you know, I would say embrace it, enjoy the ride, because you're going to be happier with the stuff that you can create if you embrace this change.
[00:32:59] Speaker A: So, as we wrap up, I always like to ask, where do you see the industry going from here? What do you think is in a year, in five years and forward?
[00:33:10] Speaker C: I mean, I would imagine that a lot of the tools that we're using today will be incorporated into the mainline NLEs.
And Anthony would be better qualified to talk about this because he actually speaks with, you know, everybody who's trying to incorporate it better.
[00:33:23] Speaker B: Yeah, I think. I think in the very near future, there's going to be a lot of really great breakthroughs. They're not going to be very sexy.
It's going to be like, oh, this AI groups all your clips and attaches the audio. All stuff that, like, again, computers should be doing already.
And I think, I mean, I hope the NLE companies start opening up a little more and allowing the streamlining of, like, being able to build plugin packages, because they're going to need to. These tools are going to start eating their lunch pretty soon. You know what I mean?
[00:34:02] Speaker C: But, yeah, there's no question. If you look at just the changes that have happened in the last three or four years with NLEs, you can see the progress, because automatic transcription is something that would have been magical six years ago. So all that stuff does happen. Every time you see a demo at NAB, you sort of expect that it's mostly going to be vaporware, but this stuff is actually maturing and becoming real. So there's no question that this stuff is going to be more and more integrated and part of what we do.
We used to pay $2,000 an hour to go into a Spirit room and use the same tools that are now available for free in Resolve. Progress is inevitable, or change is inevitable.
And on the whole, I think it is. It's a net positive for everybody.
[00:34:51] Speaker A: That's about all the time we have today. Again, thank you to John and Anthony from Blackspot Creative, and thank you to the entire Key Code Media team. We'll see you on the next podcast.
[00:35:01] Speaker C: Thanks for watching Broadcast to Post.
[00:35:03] Speaker A: Please make sure to subscribe to the.
[00:35:04] Speaker C: podcast to receive future episodes. Follow Key Code
[00:35:07] Speaker A: Media on LinkedIn, Twitter, Facebook, or Instagram to receive news on additional AV, broadcast, and post production technology content. See you next time, folks.