The VFX Artists Podcast

AI in the VFX & Creative Industry: Impact, Issues & Solutions | TVAP EP45

November 07, 2022 The VFX Artists Podcast Episode 45

Join us as we discuss the impact of AI on the VFX and creative industries.

With the rise of platforms such as Midjourney, DALL-E 2, and Stable Diffusion influencing artists, we open a discussion about what the impact on our industries could be if laws and rules aren't put in place to protect artists and creators worldwide.

We touch on copyright ownership and rights, privacy, ethics, and more!

Join us this Wednesday, Nov 2nd, 2022, for the discussion and to take part in our Q&A session with professionals from the VFX and creative industries.

Our guests are, in no particular order:

Simon Legrand - Realtime Supervisor at Untold Studios London
Justin Braun - FX TD at Untold Studios London
Vladimir Venkov - Modeler/Texture Artist
Carlo Santoriello - 3D Artist, Animator & Graphic Designer
Anthony Martin - Senior Lecturer & Course Leader for Concept Art at Staffordshire University
Jamie Bakewell - Company Owner / Creative Director (Visualisation for Film, Game & TV) at Bigtooth Studios
Tom Paton - Writer // Director // Producer focused on integrating AI and UE5

Artwork created by Carlo Santoriello

Listen to all episodes on our website

#thevfxartistspodcast #vfxpodcast #aiart #midjourney

You can watch all our episodes on our YouTube channel.

Thank you for your support! We appreciate you!

Transcript

[Music] ...and, I guess, we are an independent podcast by artists. We generally bring on artists from all kinds of VFX departments to discuss how they got into the industry, their progress through their journey, and all sorts of VFX topics. Today we thought we'd use our platform to discuss AI and the rise of AI art, which over the past six months has become quite popular and quite noticeable, especially on LinkedIn. We thought we would explore what the impacts or issues could be for our industry and for the creative industry in general. So here we are.

Kofi, just to follow that up real quick: should we be live on YouTube as well? We are live on YouTube?

Yes. Okay, I was just checking on the side here. Let me share my screen with the presentation I made. So: AI and the VFX and creative industry. We're looking to explore the impacts, issues and solutions. I have to say that this webinar is inspired by the Concept Art Association, who did an AI Town Hall webinar on YouTube discussing this same topic. I think they did a great job covering the basics, so I just want to give them credit. The plan for today is to go through the impacts, the issues and the solutions, possibly do a Q&A, and see what people think. Maybe I'll actually ask Simon, because you might know more about this: are you able to tell us a bit more about why we're here, what AI art is, and the conversation we have on screen?

From my point of view, the reason I reached out to you and thought this would be a good conversation is that I found myself talking about AI in broad terms within the company I work for, Untold. And just before we go any further, I want to say that all opinions I express today are my own and not my company's. I thought there is a lot of talking that needs to be done in our industry about this subject, because AI (a very broad term, but machine learning and neural networks specifically, and diffusion as a subset of that) has hit the visual industry like a truck in the last four or five months. When Midjourney really came out, I personally got a bit of a strange feeling of "what is happening here?" After a few days I realized, okay, it can't do this, it can't do that, we're safe. But then I watched the entire industry and the subject progressing, and realized how much granular control people are getting every couple of weeks. This is moving very, very fast, and I can talk to my small network of friends as much as I like, but there's a broad discussion that needs to be had, because it's going to affect us all, whether we work in graphic design, concept art, texture painting, modeling, or even rendering. There are diffusion plugins for Blender right now that essentially try to replace a renderer. They're not killing it yet, but four months ago Midjourney was not killing it at concept art, and now it is.
So I just thought: why not talk about it with as broad a group of people as we can find? And in many ways, over to you guys, because I have my own opinions and potential fears. I also think it's awesome, and you can't ignore progress. That's essentially why I reached out to you, Kofi.

Oh, you're still muted.

Yes, of course I was. Funny enough, this is my first time going live, so I'd only just realized we were live and I didn't have YouTube open. Now we're live. Okay. Simon, say all of that again, but a shorter version this time.

When I reached out to you, Kofi, my main concern was that AI was progressing a lot faster than I initially thought it would when Midjourney first reared its head about four months ago. Four months ago Midjourney was impressive; it made images that were kind of amazing (one of them is still the wallpaper on my desktop), but you could tell it was doing some cool stuff and a concept artist still needed to be involved to make it make sense. In the last four or five months, though, the control over what you can do has changed: you can literally paint out a hand and try again and again until the hand starts coming through. It's becoming actually useful, or it looks like it's becoming useful. And I think there's a broad discussion to be had in VFX, because it's not just going to impact graphic designers and concept artists. It's absolutely going to impact the daily job of texture artists (look at some of the current pushes by Nvidia), modelers and lighting artists, with Blender plugins that use diffusion as essentially the render engine. It still seems slightly uncontrollable, but so did Midjourney four months ago, and we're only talking about a four-month gap from Midjourney to being able to use DreamStudio now, where you can put your own head on an anime character in a beautiful picture you just created with a prompt. So I think all of us in visual media and the VFX industry should be keeping a very close eye on this and, more importantly, talking about it, instead of keeping our anxieties or our hopes to ourselves. It's good to put it out there, talk about it, and learn from each other.

If you don't mind, because I forgot to ask you to introduce yourself: can you tell us about yourself and what you do?

Sure. I'm a real-time supervisor at Untold Studios in London. Initially my job was mainly to take care of Unreal and real-time elements, but I found myself sliding into AI when it came out, and lately I've been the point of contact for AI at Untold as well. A good thing to mention again: all the opinions I give today are my own; I'm not speaking for Untold.

Great. I'm just going to re-present my slides for everyone that missed them. For everyone that's just tuned in, welcome to the AI in the VFX and Creative Industry webinar. Today we're trying to discuss the impacts and the issues, and possibly find solutions to how AI could influence or help our industry.
This webinar is inspired by the Concept Art Association, which did an AI Town Hall about a month ago discussing the same topic; I found it very interesting and insightful, so I gathered some inspiration from it. As I said, we're going to touch on the impacts, the issues and the solutions, and possibly take some questions from the crowd. We'll hopefully be touching on these three platforms: Midjourney on the left, DALL-E, and Stable Diffusion, maybe the top three companies currently.

To explain what AI art is: the way these platforms work is that you give the AI a prompt, a set of keywords or a sentence, and ask it to generate an art piece. This slide is a comparison between the results generated by Midjourney, DALL-E and Stable Diffusion using the same prompt, which was along the lines of "behind the scenes of shooting the moon landing, Hollywood studio in 1969, backstage photograph, astronaut actors and lighting". All three platforms gave us three different results from the same prompt, and as you can tell, you can create practically any image from the prompt you give it. Today we're going to try to work out where the copyright sits, where the datasets are being gathered from, who owns the rights, and all of that.

The image on the left is one I tried to make in Midjourney, because I was looking to make an image for this webinar. I gave it a prompt along the lines of "a photograph of an advanced-technology humanoid with a female head, long hair, pretty, in a dress, but with mechanical arms, holding a painting brush in the left hand and a palette in the other hand, standing in front of [indistinct]". That is what it generated, and on the right is what Carlo, one of our guests, made and sent me. Comparing the two, his felt stronger as an image, as a concept and as a final render, so I chose his over what I created with Midjourney. That's something to think about. So let's discuss how this will impact us and see how it goes.

Simon, it would be great to hear how you came across AI, because I think I discovered it through you on LinkedIn over the past six months; it was a buzz for some time and everyone wanted to try it. Let us know how you've been using it, what you've been finding, and your thoughts about it.

Sure. Maybe before I start rambling on about myself too much, would it be a good idea to have everybody introduce themselves?

Yes, sure.

I'm just going to go with the order I see on screen. Vlad, you're the first person I see.

Hi guys. I work in the VFX industry, I live in the UK, and I've worked for a lot of companies as a modeler and texture artist. I'm a sculptor as well, and I was also a character artist. As I said before, I've never used AI; I haven't generated a single image. So I know how it's done in theory, but in practice I don't have the hands-on experience. I've got a lot of questions, though.

You're the very definition of a VFX artist; it's great to have you here. Excellent.
Carlo, you're the next one on my list.

I'm Carlo, I'm a 3D artist, a Blender artist. I'm a digital nomad, but I'm in Valencia at the moment; I got stuck here during lockdown and haven't moved, and it's a nice place to be. I don't come from a VFX background, I come from an engineering background, and I create mainly B2B content; a lot of my work is that, though not all of it. So that's the angle I'm looking at this from: whether marketing agencies will be able to do the work I do for them on their own with these tools. A bit like Vlad, I don't have much experience. I've used Midjourney a little, delved into it, saw the big rise of it, and honestly got a bit bored of it after some dabbling. But I think there's a lot to talk about when it comes to B2B.

For sure. Tom, you're the next little image on my screen.

I'm Tom Paton, I'm a writer, director and producer and founder of a new company called Pigeon Shrine, which is based out of the West Midlands, and I'm a massive AI fanboy. I suspect I might end up being the bad guy on this particular chat, because a lot of what I'm doing is about completely re-jigging the production pipeline. The advantage I've got as a fresh company is that I'm coming off the back of nine movies but without a pipeline of product that needs to be finished; I don't have a backlog of clients whose work is already in an existing workflow. So we're essentially rebuilding the entire production pipeline from the ground up using AI assistance at pretty much every level, including concept art, VFX work, sound mixing and post-production, really integrating all those elements. There's a real push for me to bring production costs down and push the final product quality up, and in turn perhaps change some of the culture around the way VFX artists are treated within the film world, period. So that's me.

For sure. Tom, I think we're getting a little bit of a cut on your video; we can hear most of what you're saying, but sometimes it drops.

Sure, let me swap my internet.

If you could do that, that might work. In the meantime, Jamie, do you want to jump in?

Sure. Hi everybody, I'm Jamie Bakewell. I've been in the film industry since 2009, working as a previsualization artist on films like Jurassic World, James Bond and Guardians of the Galaxy and plenty of other big movies, and I set up my own previs company in the West Midlands in 2018. We're a West Midlands-based visualization company also working on Netflix movies, commercials and video game cinematics for big titles.

Don't forget the plug: what's the name of the company?

A company called Bigtooth Studios, based in the West Midlands. I haven't really delved too much into AI; I've just started using Midjourney, having a little play in spare moments, but a few of the artists on our team are obsessed with generating artwork of themselves as superheroes and all sorts of really cool stuff. And as of a few months ago, Tom and I have been sending lots of AI stuff back and forth: articles, videos, links, all that cool stuff.
So it's very exciting, and scary, and curious, all sorts of feelings. But here we are.

Welcome back, Tom.

Sorry, I've fixed it now. Is that better?

That works. Perfect. Excellent. Thanks again, Tom, and thank you, Jamie. Previs is actually something we can talk about as well, maybe later, because AI for animation is properly happening as we speak. Anthony, you're the next beautiful icon on my page.

Gosh, well, you're charming me, Simon. I'm Anthony Martin. I work at Staffordshire University as a course leader and senior lecturer for concept art, and we have a concept art course there with quite a lot of students. I've been very interested in watching AI seemingly explode almost out of nowhere over the last four or five months, and I've been watching it with interest. I'm fairly skeptical of it, but I'm also excited about some of the applications that don't necessarily affect concept artists. Because I also used to do VFX and animation before I started teaching at university, I'm actually not quite as convinced that AI imagery (and I'd like to call it AI imagery rather than AI art at the moment) will necessarily replace concept artists as such. I think it will pose a big threat to illustrators, and that saddens me to a certain extent. But I'm very interested in how it can be integrated into the nuts and bolts of VFX work, like you mentioned earlier: how it can almost act as a substitute renderer and deal with compositing tasks and texture generation. The techie part of me finds that more interesting than generating what, to me, looks like very clichéd artwork. But I'm sure we'll get into that a bit later.

I'm sure we will. I'll take a quick note, actually; I'll need to come back to that. Excellent. And Justin: I know you because we work together at Untold, but introduce yourself for everyone else.

Thank you so much. I'm Justin Braun, I'm working as an FX artist at Untold Studios, and I would say I'm a generalist, working mostly on commercials and TV shows. Before that I was working as a compositor and a little bit as a DoP, but I quickly transitioned into 3D, and that's my main focus right now. I'm also in the lucky position at Untold to do a little bit with machine learning: I'm integrating Stable Diffusion into Nuke, and we're making a lot of progress there. Thank you for the invitation; I'm super happy to be here.

If I can add something to your introduction: Justin has been on top of machine learning since at least 2018, maybe 2017, when I first met him, and he used some early machine learning techniques to make an amazing short film where he used machine learning to mix and match Megascans and create new Megascans out of them. I don't quite understand how that all works, but Justin has been using machine learning hands-on for quite a long time, so as humble as he seems right now, he's deep into it.

Thank you.

Cool.
And then, I guess, Kofi, you were just asking me to introduce myself, so: I'm the real-time supervisor at Untold, and the way I got into machine learning was initially out of interest, talking to Justin years ago. I started a motion capture studio a few years ago called The Mocap Studio, in London. Very creative name, I know, but it's good for Google hits. While we ran that studio, one of our main gripes was motion capture cleanup. Mocap cleanup has traditionally just been a manual job: you bring mocap into MotionBuilder or Maya (back then Houdini existed, but KineFX didn't) and fix it by hand. So I was very interested in seeing how machine learning could help with that, and back then there were already quite a few efforts and startups that were even emailing us, as a motion capture company, asking for data: "Can you just send us data, animation, mocap data, anything you've got?" We were trying to run a business, so we asked what we'd get in return, and the answer was something like a one-month trial of their amazing new AI motion capture cleanup solver. At the time we said thanks but no thanks, because we had no connection with these companies and didn't understand what they were doing. But I was interested, and I wanted to see what that really meant.

It turns out that long before Disco Diffusion, and diffusion in general, came out, machine learning was already doing pretty well with simple data. Simple data is parameters and numbers, basically a spreadsheet, and writing some machine learning to manage a spreadsheet is actually pretty easy, dare I say, because there aren't that many numbers to play with. Motion capture is the same: you've only got a certain number of bones, each bone has a rotation value and a translation value over a bunch of frames, and it's basically just a big spreadsheet. So at the time it felt like it was within my reach to do some of that myself. There were a few tutorials out there, a bunch of GitHub repos, and I started getting deeper into whether I could actually write some training to clean up the motion capture we were generating. Then I met Justin, who started telling me how much harder it was than I thought. At that point I didn't personally push machine learning for mocap cleanup much further; I just moved on to running my business.

Honestly, it wasn't until Midjourney reared its head that I came back to it. I had almost forgotten about machine learning at that point; it seemed cool, but really hard to get anything done with. Then Midjourney came out, along with the open models like Disco Diffusion and the LAION-5B dataset that Stable Diffusion was based on, and about four or five months ago it was as if somebody lit a match and set fire to this whole industry, with progress happening at breakneck speed. So now I'm running along with everybody else, trying to keep up to date with everything that's happening. I'm on all the subreddits and all the Discord channels, and still the amount of information is overwhelming; it's very hard to keep up with.
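Simon's description of mocap as "basically just a big spreadsheet" (a fixed set of bones, each with rotation and translation values per frame) can be made concrete with a toy sketch. Everything below is invented for illustration and is not his studio's tooling: a clip is stored as a frames-by-bones-by-channels NumPy array, flattened into tabular rows, and a plain scikit-learn regressor is trained to predict clean frames from noisy ones, which is roughly the kind of machine learning on simple numeric data he is talking about.

```python
# Toy sketch: motion capture as a "big spreadsheet" plus a simple learned cleanup.
# The synthetic data and the model choice are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_frames, n_bones = 500, 24      # 6 channels per bone: 3 rotation + 3 translation
phase = np.linspace(0, 8 * np.pi, n_frames)[:, None, None]
clean = np.sin(phase + np.arange(n_bones)[None, :, None]) * np.ones((1, 1, 6))
noisy = clean + rng.normal(0.0, 0.15, clean.shape)     # simulated marker jitter

# Each training row is a small window of noisy frames; the target is the clean
# centre frame. Flattening turns the clip into exactly the spreadsheet described
# above: one row per sample, one column per number.
win = 5
X = np.stack([noisy[i - win:i + win + 1].ravel()
              for i in range(win, n_frames - win)])
y = clean[win:n_frames - win].reshape(len(X), -1)

model = Ridge(alpha=1.0).fit(X[:400], y[:400])
print("held-out MSE:", float(np.mean((model.predict(X[400:]) - y[400:]) ** 2)))
```

Real mocap cleanup is far harder than this, which is apparently what Justin told him, but the tabular representation itself really is that simple.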
So, are you able to tell us about your integration of it into your pipeline, how you use it, and maybe what you worry about?

I'll jump in very quickly about how I've used it, and then I'll let Justin talk about how we're looking at integrating it into our pipeline. How I've used it was for a WWF commercial (not the wrestling, the World Wildlife Fund), which should come out about two days from now. That was with a director who is famous for using cutting-edge technology in strange ways; a few years ago he did a short film, or a commercial, that used depth sensors to create very interesting shapes and forms, so he's very in tune with emerging tech and really enjoys using it in his work. He approached us to do this WWF TV commercial, and at some point I made the mistake of saying, "Maybe we could do this with AI," and he said, "AI? I like that. Let's talk more." Disco Diffusion was kind of the thing at that point, specifically Disco Diffusion Warp, a fork of Disco Diffusion that uses optical flow to deform the Disco Diffusion output, which is an easy way to make it animate without just flickering. We started using that, and it was a weird three or four weeks of generating random stuff by typing prompts like "fire", "death", "fish dying", "skulls", just words, with supervisors jumping in going, "Type this sentence in and let's see what it does." It was a very strange workflow, because as VFX artists we're used to controlling every single pixel we make and massaging it into place, and this was more "here's a whole bunch of stuff, let's see what we can do with it." Luckily we had really talented comp artists to turn my absolutely insane output into something cohesive. So that's when we first used it as a studio. Justin is actually the one taking the torch in terms of implementing it in a serious way, not in a hacker-in-the-bedroom kind of way. Still doing that too, though. So Justin, you can probably talk about the new stuff we're doing at the moment.

Sure. In general I think the opportunities for AI are already quite broad with the existing setups; there are a lot of different departments that can benefit from them. Basically, what we're doing right now is integrating it mostly into Nuke, and we're running everything locally. I've seen a lot of plugins already out there, for Photoshop for example, that communicate with a server running somewhere on the internet; we're actually using our internal hardware and running an internal server, which right now just runs on the machine the user is on, and it connects to different software. For example, right now it's just Nuke and Houdini that can talk to this Stable Diffusion server, and it accepts prompts: you can load an image in, select an area, type something you want to change, and it generates the result quite quickly on the fly and loads it back in. We're trying to make it as seamless as possible, so you can iterate fast with input images: you quickly draw something with the paint node, feed that into the server, and it generates something you can use. That's basically the idea behind it. We've had really good feedback from the compositing team; we're just trying to roll it out, giving people the opportunity to test it and collecting feedback, so that's roughly the stage we're at. There are also plans for getting other departments onto the Stable Diffusion setup. We have some progress on the texture generation part: in Houdini, for example, you can use the server to generate textures for objects you have. And we're going to implement something similar to the Blender rendering tool, but for Houdini, so we have something that works for our pipeline, because we don't really use Blender that much.
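Untold's actual integration isn't public, so the following is only a hedged sketch of the general pattern Justin describes: a small server running locally on the artist's machine, wrapping a Stable Diffusion inpainting model, that a panel inside Nuke or Houdini can send an image, a mask for the selected area, and a prompt to. The endpoint name, payload layout and model choice (the open-source Hugging Face diffusers inpainting pipeline behind Flask) are all assumptions for illustration.

```python
# Hypothetical local "Stable Diffusion server" in the spirit of the setup
# described above; the endpoint and payload are invented, not Untold's pipeline.
import base64
import io

import torch
from flask import Flask, jsonify, request
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

app = Flask(__name__)
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

def decode(b64: str) -> Image.Image:
    return Image.open(io.BytesIO(base64.b64decode(b64))).convert("RGB")

@app.route("/inpaint", methods=["POST"])
def inpaint():
    data = request.get_json()
    image = decode(data["image"]).resize((512, 512))
    mask = decode(data["mask"]).resize((512, 512))   # white = area to regenerate
    result = pipe(prompt=data["prompt"], image=image, mask_image=mask).images[0]
    buf = io.BytesIO()
    result.save(buf, format="PNG")
    return jsonify({"image": base64.b64encode(buf.getvalue()).decode()})

if __name__ == "__main__":
    # A Nuke or Houdini panel would POST the selected region and prompt here,
    # then load the returned PNG back into the comp or scene.
    app.run(host="127.0.0.1", port=5000)
```

Keeping the model behind one local service is what lets several DCC applications share it without sending client imagery to an external API, which matches the point above about running everything on internal hardware.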
Sorry, I was curious about something. Tom, you mentioned that you're also a writer and director, is that right?

Yeah, that's correct. My background is in production: I've produced, directed and written most of them, and delivered nine movies to international buyers at this stage.

The reason I was really curious to hear what you had to say is that I have a friend who's a director, and when Midjourney came out I sent him an invite and he just went crazy. All of a sudden he said, "I can make the posters, I can make all the pitch decks I always wanted to make," the ones he'd always had to call on a friend or a favour to help him visualize. He's been so happy, because he's been able to generate these pitch decks all on his own. How do you feel about it? Is that something you've been doing as well?

Yeah. A little bit like Justin, really, I've been on the AI tip for a very long time. I watched a short film called Sunspring, which was written by an AI, back in 2016.
They got Thomas Middleditch to come and star in it, and it was this bizarre, Shakespearean thing that didn't make any sense but was kind of entertaining. At the time it was treated as a jokey product, everyone going "ha ha, look at the silly AI short film," but I remember thinking even then: this is the future for all departments. So I've made it a vested part of my journey to stay on top of integrating AI wherever we can. It hit the sound world first: it became super quick and easy for us to use for sound cleanup and for bringing those costs down. We've started using it within the composing and music departments, and I'm working with my composer to speed up turnaround times using AI. Just recently we filled in ADR on a movie we'd just delivered that had fluffed QC, and we never actually got the actors back; we trained voice models to replicate them, I essentially did the performances and filled in the gaps, and we flew through it. The thing with AI is that it lets you do this across all departments.

With the visual stuff, Midjourney especially, a lot of filmmakers are finding it solves the problem of getting something off the ground in the first place. Before a product ever gets to you guys, where there's a locked budget and we can say "this is how much we've got to spend on this," there are years of front-loaded work. Most of the time in indie film you start with absolutely zero, and you're just trying to be your own hype machine and convince people to put some development funds in. That development fund then gets spent on people like Vlad and Carlo, but by that point you're already nine or ten months into the process, and most of these producers and directors are eating pot noodles and starving, and they end up quitting and the project never gets going. All of a sudden, Midjourney empowers them to put that out there. One of the interesting things we've done this year: I've got three projects that are all about to go, all now funded and ready, and nearly all of them attached their cast by generating images of that actor inside the character and sending them to the actor directly. There was one guy, and I won't say any names, but I'm on a Zoom with this quite famous actor trying to talk him into it, and while I'm talking to him I'm generating stuff on Stable Diffusion down in my lap and sending it over: "Check this out." I'm certain that's what made him join the project, and we're funded for that. So this stuff is really powerful just in terms of getting your project off the ground in the first place.

Well, I'm really glad you said that, because there's a lot to unpack there. Number one: as you mentioned at the beginning, we all understand that when you're starting your own film or your own production there's no money there; it's just blood, sweat and tears and belief in yourself, there's absolutely no doubt about it.

Unless you've got really rich parents.

True. But it's one of those things where, a while back, you would have had to go to Vlad.
Now, I imagine that Vlad is just fine and has plenty of work, but we have to remember that there are younger people, juniors, who might have been happy to put blood, sweat and tears into doing the concept art for your film, and who may not get that way in now.

No, I think it's about repurposing. That's part of what we're trying to address. I know better than anybody: I never even went to university to study filmmaking, I started my production company when I was 20, probably 15 or 16 years ago now, and just hustled it into existence, and I never want to cheat anybody out of work. I guess it's about how you repurpose people. A big thing Jamie and I have been discussing is the potential for using AI in animation. Where I would previously have come to Vlad at that initial stage to sell the project, now I might use Stable Diffusion to actually fund the project, then bring it to Jamie with those development funds and grayscale out the entire film. If it's animated, I'll then hire the Vlads of the world to essentially concept-art the first frame of every shot and feed that back into an AI model (something more advanced, but along the lines of EbSynth) and take that concept art and create AI-assisted animation where the delivery times are tiny. And that actually lets us say: now I pay Vlad more. One of the big things Jamie and I have been discussing, and that I'm insistent AI will let us do, is to start treating VFX artists like above-the-line talent, where people show up to see these products and buy them for your work just as much as for the cast names. VFX artists are normally right at the back end of the production, after we've messed it all up and the shoot didn't go the way it was supposed to, and then we rely on them to come in and patch it up, but with the smallest amount of money. AI lets us change that workflow, so we can put more funds into the VFX team themselves and treat them like we would above-the-line talent. That's what we're trying to do with it.

That's super interesting, I get it. And sorry, Vlad, that we keep using you as the example, but let's say young Vlad doesn't have to work for free for the first couple of years to get the project off the ground; AI can do that, and then you can pay current Vlad his actual wage to do proper work on the film. That makes a lot of sense; it's actually a really good angle on it.

And at faster speeds as well; that's the thing.

How do you feel about that, Vlad, since we've mentioned you so much?

Just on a side note, I'm a 3D artist.

That doesn't change Tom's point, actually.

Yeah, it goes for everybody. Well, as Tom said, it's all about repurposing. I know I'm not going to be able to stop it; it's like a snowball, it's unstoppable, so we'll have to adapt to it.

But to go, maybe, to Anthony, who's teaching young concept artists just starting out in the industry: I guess I'm old enough now that I'm probably not quite one hundred percent in touch with what it means to be young and starting out today.
So maybe Anthony can enlighten us on that front, because, as we just highlighted with Tom, entry-level jobs might potentially become more scarce. Anthony?

I think it was already difficult for young people to get a foot in the door as a concept artist. There's a lot of competition in the VFX art and games art world anyway, but with concept art the competition is even greater, because there are inherently fewer concept artists working on a game or a film than there are 3D artists or compositors or lighters. Some of my students are very excited: they're wowed by the possibilities and want to explore what it can generate, almost in that purely joyous, young way of "wow, I'm living in the future and it feels like a science fiction novel." But there are others who are legitimately worried, because they think: what is the point of learning to draw, learning to use something like Photoshop or Blender, learning to develop an eye for art direction, when a client, whether that's the producer or director of a game or the art director or production designer of a film, could just say, "I don't really need you any more to do this part"? They're a bit scared of that. Sorry, were you going to say something?

Sorry, I think we lost you for about ten seconds.

Oh no. I'm not sure where I got to. So: there are students who are excited about it, probably a minority at the moment, but genuinely excited, because they feel like they're living in the future; and there are others who are genuinely fearful, because they think: what is the point of doing three or four years of university education to learn how to draw, how to paint, how to design, when a client, whether in games or film or TV, can just come up with a particularly good prompt and get what they want? I try to reassure them. As I mentioned when I introduced myself, I think for illustrators it genuinely is a scary time. They operate on a scary, generally freelance kind of level, they have very short deadlines, and their work is often not massively appreciated or credited. I remember seeing, around the Alex Jones trial when he was being taken to court and sued, that a fairly high-profile outlet (I think it was The Atlantic, or one of the New York publications) ran an AI-generated illustration at the top of their article, and it looked perfectly good; it worked. I think a lot of illustrators looked at that and thought, "Oh my God, this is it, I can see the writing on the wall." I think the nitty-gritty of concept art is a bit different, because concept artists will do illustration-type work, but that's not necessarily the majority of what they do.
Especially in a games company, they have to work with layout artists and level designers and make sure their designs make coherent sense in 3D and are consistent, and I think at the moment AI has a little bit of trouble doing that. One of the telltale signs of AI art is that it can look really cool, really amazing, but there's a certain sloppy messiness to it that makes you think: what if I ask for that exact same scene, but from 90 degrees, and then 180 degrees? How rock solid could it make that? I'm sure it probably will get there eventually; I think this is inevitable now. As someone mentioned, the cat's out of the bag, and it's happening whether we like it or not. When it comes to teaching, if students ask to use AI art in their assignments, I'm okay with that as long as they're clear and open about using it. And to be honest, I don't think a student could really hide it at the moment, because it would be a bit like: "Here are my drawings, and they're really sloppy and terrible and the perspective is awful, but oh look, here's my final product, and it's this perfectly rendered, prize-winning image." They couldn't get away with that. I'm kind of all over the place on this: I'm very impressed with the technology, but I really don't like the business model of these companies. Some of them are very sketchy.

That's something we could move on to, unless anybody else wants to jump in after Anthony to add anything.

For me, I'm kind of new to it, so I haven't really dived in enough to know too much about it, but from a previs side of things it's quite interesting. There are all sorts of levels of previs: it can be grayscale, or it can be high-end previs where the client wants super high quality, or they want just grayscale cubes; they want to see something that isn't final but that paints a picture and tells a story. I can see it benefiting speed, because we're constantly going back and forth with changes: tweak this, change that, what does it look like over here. And when Tom talked about grayscaling the whole scene and then being able to paint a concept frame and generate that through the whole thing, that's time saving, and we're all about saving time and being efficient (not that efficiency is always good). It's very interesting. So I'm kind of here to observe, ask questions, and find out more, and if there are any questions about previs and how we see it being used in future, I'll happily answer them from a visualization perspective.

I just wanted to chip in about the concept artists, especially the students that are learning now: in my opinion they shouldn't panic. It's one thing being able to type in "wizard with a hat on" and generate that image, but actually the art of prompt creation is itself rooted in an artist's knowledge.
The more detailed a prompt becomes... you can always tell when somebody who is a very talented artist, with deep knowledge of and respect for where they learned their art, uses AI: their prompts are so complex and so informed that all of a sudden there's nothing ropey about that AI image. And when you give it to somebody who doesn't have that artistic background and they just type "wizard with a hat on", that's when you get that traditional AI look. So I feel there will always be value in the knowledge that student artists possess. Even if you're still paying them a traditional day rate, they're generating you thousands of versions of what you need, using their knowledge base to produce it, and I'd still pay them for that skill set, because it's one nobody else on the team possesses. I don't see AI coming along and stealing that bottom layer from people starting out; I think it gives them a window to apply the knowledge they've learned in a much broader way that gives the client more choice.

Exactly: the knowledge needs to be there to get to that point. It's the obvious comparison, someone without the knowledge versus someone with the experience and knowledge of film composition, of what looks good and what looks bad, generating a really good image. Vlad, you were...

Sorry, just a question. Will the new young students, concept artists and artists in general, have the drive and inspiration to study hours and hours of composition and drawing just to get to the point of using this, just to generate images by typing words? Are they going to have that inspiration, that drive? That's the question.

I think many of them will. It's interesting, when you talked about the complexity of prompts: I've been looking at a lot of the prompts that generate the more impressive images, and they're using knowledge of lenses; they'll often describe the specific f-stop they want for an image. It's almost the language of a cinematographer or a director rather than that of a traditional artist, who will think about depth of field but won't think about it the way a photographer, cinematographer or director does. So I actually see it as something that directors, cinematographers, and maybe storyboard artists will take to very well. The younger artists who are learning that traditional, historical visual-art method learn about value and composition in a way they don't necessarily describe verbosely; they're artists, they feel it. You can describe a certain amount to them, but a big part of it comes from the repetition of actually, practically doing the art until they begin to feel it. I don't think many of my students would think, "I want to do this course so that I can be a really good AI prompt person." That almost feels like more of a film-course thing than a concept-art thing.
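To make the contrast between "wizard with a hat on" and an informed prompt concrete: the panel is mostly talking about Midjourney's Discord bot, but Stable Diffusion is openly scriptable, so here is a minimal text-to-image sketch using the Hugging Face diffusers library. The two prompt strings are just illustrative examples of the plain versus cinematography-flavoured phrasing discussed above, not prompts from the episode.

```python
# Minimal sketch: the same subject with a bare prompt and with the kind of
# lens-and-lighting-aware prompt the panel describes. Prompts are examples only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

naive_prompt = "wizard with a hat on"
informed_prompt = (
    "portrait of an elderly wizard in a wide-brimmed hat, rim lighting, "
    "volumetric fog, 85mm lens, f/1.8, shallow depth of field, "
    "film still, muted colour palette"
)

for name, prompt in [("naive", naive_prompt), ("informed", informed_prompt)]:
    image = pipe(prompt, guidance_scale=7.5, num_inference_steps=30).images[0]
    image.save(f"wizard_{name}.png")
```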
But I guess there's always the argument that you can deploy image-to-image. Let's say Carlo does me an initial design and I go, "Wow, I love it, I need an entire pitch deck of this." He can essentially train a local model on his particular style, or on that particular image, and generate the rest of them off the back of it. It's about finding out how to marry the two.

I find that aspect of it, the way you can feed your own work into it and get variations, far more interesting than just trying to get a pretty picture or a cool image out of a good prompt. If I could use my own stuff, and only my own stuff, to get a new idea that would still feel like me, I think that's far more interesting.

That kind of opens the door to talking about all of the artists who are currently in the diffusion models we use and who never agreed to being used in those models.

Exactly.

There is definitely a very big question to be asked there. It's an opt-out process at the moment, and it should be an opt-in process. And even when you opt out, you're only opting out of future datasets; you'll probably still be in the ones that are already out there.

What do you even opt out of, though? That's the real question. As a concept artist right now, a lot of my friends are on ArtStation, and I have some work on ArtStation myself, not a lot but some. You don't really opt out, because the reality is that with the current Midjourney model, if you mention a specific ArtStation artist by name, you will get a style reminiscent of that particular artist. That means Midjourney has had to scrape ArtStation, and that is against ArtStation's terms of service.

I was looking into this. I can't speak to the dataset Midjourney uses, but I know that with the LAION datasets, if you read their FAQ about copyright infringement and how they scrape these images, they make fairly clear that they're not actually scraping the images themselves; they're scraping a URL that points to the image.

It sounds like an NFT.

A lot like that, yeah. And reading the FAQ, there's a certain mealy-mouthedness to the way they try to reassure you that it's okay: "it's not like we're stealing; our system just points to the location on the internet where that art is, it never downloads it." I don't really like that aspect of it. Which brings me to the other point: there's Dance Diffusion, isn't there, where they're doing a dance-music version of AI generation, and they're very clear that with music they will only use Creative Commons, copyright-free music or music that's specially commissioned, because they know that music publishers will come down on them like a ton of bricks. They already have, actually; I read an article recently that they already have lawyers geared up to fight. The thing is, the music industry has a representative body that can and will fight for them, and visual artists don't. And you can totally see why that is: the music industry comes from big publishers.
With publishers, licensing, royalties and the legal framework are all there, and artists work for record labels that then get published, whereas visual artists tend to work for themselves. They might be hobbyists or freelancers, and there is no massive union that can support them. So of course there's no real incentive for these AI companies to proactively say, "Let's make it opt-in," even though I think that's silly in a way, because I think there are many visual artists who would love to opt in and would be very happy to contribute. Of course, you wouldn't get two billion artists doing that.

Then everything would look like anime.

Well, that's another thing, isn't it. It wasn't me, but I read about someone who wondered if they could break AI by just typing "AI" into Midjourney. What happens when you type "AI" into Midjourney? The four different images they got were basically the same white, sexy, pretty woman who looked vaguely cyborg, with vaguely Asian features. So it's input, output: it's being steered towards a certain demographic. I think one of the examples Kofi had in his slideshow was "old man": nowhere in the description did it say "white old man", but all three images that came out were white old men. So these things are being steered in a certain direction; the datasets are being aesthetically judged.

Let's all be thankful that these AIs didn't scrape 4chan.

I wouldn't be surprised if they had.

Probably. It's basically a big collection of everything, and this is maybe capturing that kind of thing as well. What I wanted to ask is: do you think of these AIs as something like a compression algorithm, taking all those billions of pictures and interpolating between them? Because if you open your browser and go to ArtStation, as a concept artist you can look at any image and use it as reference. Would you treat this the same way, or, in your mind, does it feel more like an interpolation between existing work? Because as a concept artist or a director, you usually have your own feelings, your own real-world-anchored experience that you try to get into your art; you have an intention that goes into it.

Yes, so there's no intent at all happening with AI. And I think you're right that in some ways it's a lot like the front page of ArtStation, which I find ironic and not surprising: it's scraping lots and lots of stuff, smearing it out, and you're getting an averaged view of what's popular and what people generate a lot of. When I look at ArtStation's front page, I see technically really well-done art, but I also see a lot of exactly the same kind of stuff. People are into anime, people are into what I'd call cheesecake imagery, a sort of generic attractiveness of characters, and a generic view of science fiction environments where all the futuristic cities, even when generated by real people, tend to look roughly the same.
That takes the discussion more towards social media and what gets clicks, and I think that's a really pointed way to take it, because in theory we could think of AI and prompts as the next social media: instead of clicks and likes, you're getting a lot of prompts requesting your style, a lot of ads being generated. When we look at past art, when we were all disconnected and there was no real connection between us as humans, African art had its own style, look and definition; Egyptian art is still famous to this day for its very particular look and style; the Renaissance brought a whole bunch of new stuff. Back then we weren't clicking "like"; it wasn't a whole bunch of humans saying "I want more of this, I want more of that," it was just people making things with very limited information about what might appeal to the people immediately around them. Social media helped popularize a particular type of art, and I one hundred percent agree with you, Anthony: if you just go on ArtStation you'll see what that art style is. That's basically the representation of the people clicking "like" in the world right now, the ones with access to computers and video games and anime characters. It's a relatively small subset of Earth; we think it's the entire world, but it really isn't. And, I hate to use the word, but it is just the right word: we're seeing this kind of ancestral development of art, artists inspiring artists inspiring artists, looking for clicks rather than making art for its own sake.

You see, I don't judge that kind of art, even if it might have sounded like I did in the way that came across. Styles and fashions and what people are into change, and even before social media everyone in Europe and America was probably watching similar Hollywood movies, watching the same cartoons, and into superhero movies and superhero comics, so they all liked that kind of stuff, and I don't think there's anything inherently bad about that. I find it interesting that AI sort of super-spreads it, because it takes something that's already a very mainstream set of tastes and flavours of what art is, and spreads it out even more. I would like it to go the other way; I'd almost want the AI to dream for itself, to see what the AI thinks of when there is no human prompt. I remember, before Midjourney came along (well, came along in front of my eyes), there was an app on my phone called Wombo, I think, and I think it's changed its name a couple of times now. You could give it a prompt and then an art style, a fairly generic style like "Christmas" or "cyberpunk", and I think it was almost trippier and weirder and more interesting, because the technology was more primitive and it wasn't as good at understanding the prompt.
Prompt and so the results you got looked more like uh a a weird otherworldly kind of dream but I think when you get something like stable diffusion which is technically an incredible and now can generate stuff that genuinely looks like a real photo um it almost then becomes to me of a purely aesthetic artistic kind of point of view less interesting because it's replicating photography now almost and um and it's cheap when it's more convenient and and I can see see the appeal of it um but uh you know I kind of want to see that the smaller quirkland stuff I'm more interested in that I suppose from a personal point I mean I I just I just want to jump in though because there was intelligent yeah sorry there was a there was a thing about how AI you know AI art hasn't got intent and like and I feel that's kind of discounting what I can see on you know especially in the you know the year of r d that we've been doing you know there's actually sort of this emerging kind of it's it's like uh it's like using a new tool it's like it's it to me it's like saying you know when artists started to use Photoshop and started to create digital art that you know what this is this is a tool that's you know it's not real art because it's digital and it's like I feel like actually you know when you when you're using artificial intelligence in all sorts of ways you know artistically speaking you know and I'm talking directly as a filmmaker here and earlier we spoke about you know using my knowledge of lenses and light flares and composition you know and then coupling that with AI artwork to create something that often feels like it came directly out of my head now you can always say well that you know where where are those images coming from I think all of us as artists are influenced by the imagery that you know you know art is definitely imitating life and life is imitating art at this stage and I don't think any of us know where that line is anymore you know and and I kind of feel like what AI is able to do if used correctly is create it's a tool for a new style of artists to generate you know a kind of a new way of getting across their well point of view that existed in a way that they weren't able to access before so I kind of feel like we shouldn't discount the power for for air to kind of use somebody's knowledge base and help them generate a world view that perhaps we haven't seen that they wouldn't have been able to achieve traditionally absolutely yeah I mean with the intent uh comment um what's more like without the human behind it it's probably just like a random intent that's trained on one of those billions uh images and you just get like a random version like a random thing from from this big data set not that there isn't any intent uh originally in those images I'm just thinking if we if we really use it as a tool and there's someone really steering this AI towards like this intent or like this idea uh then then I think it's it's definitely extremely helpful but just just like randomizing things and just looking at things that you like it it might not be the yeah it doesn't really create this art that that you're used to uh when when you yeah from scratch yeah I guess what it's doing is it's letting the idea yeah it's sort of letting people you know like traditionally artists will always be inspired by other arts and we're always you know reference or have that you know yeah it's it's very hard as a director to go on set and not be thinking about those things that influence you to to you know want 
Yeah. Traditionally, artists have always been inspired by other art; we're always referencing. It's very hard as a director to go on set and not be thinking about the things that influenced you to want to make this thing, and then try to find your own space within it. I guess what AI art allows is for people with no artistic integrity to simply replicate imagery they're seeing. It's like Simon's point, to some degree: I think you'll start to see the gap between genuine artists who are using AI and people who are just replicating a fad they've seen online, and that gap will widen over the next year as more and more people use this.

I've got a question on YouTube from Ian Naylor. He says that, playing with prompts on Midjourney for the first time, he found it refusing to use certain words, and he's wondering what you guys think about the AI protecting the artists from themselves.

You can tell that user that, using Stable Diffusion, he'll be just fine using whatever words he wants. Disco... sorry, Midjourney in particular, being a commercial product, hides its model; it was never made public. They've since implemented Stable Diffusion on top of what they already had, but their original model was never public, and it was probably very much inspired by ArtStation, I'm just going to wager here. Because they came out as a commercial product right off the bat, not an open-source project, not a research project in any way, shape or form, I think they essentially made a bunch of words unusable. The reality is, if you use any of the other models out there, you can put anything you want in the prompt, and you'll get some horrible results. I'm talking from experience here: it works.
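For context on that point, here is a minimal sketch of running an open checkpoint locally, assuming the Hugging Face diffusers library and a public Stable Diffusion model; the speakers don't name a specific toolchain, so the model ID and settings are illustrative. The contrast with a hosted service is that any filtering, such as the pipeline's bundled safety checker, runs on your own machine and is under your control rather than being a server-side word list.

```python
# Illustrative only: local text-to-image with an open Stable Diffusion
# checkpoint via Hugging Face diffusers. Model ID and settings are examples.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # any public SD checkpoint works
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a CUDA GPU; use "cpu" (much slower) otherwise

prompt = "a derelict futuristic city at dusk, matte painting, volumetric light"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("backdrop.png")
```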
We've just added Davis. Davis is a VFX student who was watching us, and I think he's got some opinions. Tell us a bit about yourself and where you study.

My name is Davis, I'm a visual effects student at the Savannah College of Art and Design, and I'm pretty heavy into AI right now. I've played with Midjourney, Stable Diffusion and DALL·E 2, but I really want to push the boundary and go beyond that with 3D ML stuff, hopefully merging some deepfaking with the diffusion models.

Deepfaking is close to my heart, actually; I've been playing around with it for quite a while. I've done a bunch of tests, some of which might be on my LinkedIn if you check it out, where I made a very basic MetaHuman version of myself and then deepfaked it with a model of my face. I think deepfakes are interesting, and they raise very similar concerns about getting the rights to use somebody's face in a certain way; there are a lot of question marks there. All the deepfakes I've done have used my face and my friends' faces, so we're in the clear. The concern with deepfakes is that we live in a world where you could load up enough footage of a certain actor or actress, make them do whatever you want, and create a video of it. The reason I'm not that worried about the morality of that side of things, and this is just me personally, is that we've had Photoshop since the early '90s, and back then everyone was saying you could paste any Spice Girl's face onto anything and the entire world was going to collapse because anybody could be put into a bad situation. The reality is that it never happened, because journalism can't just run with a single picture; it needs witnesses and documents and things to prove that the photo you're showing is true. So with deepfakes there is a weird, scary future world where anybody's face could be put into any situation, but we already lived through that in the '90s with Photoshop, and it never really went anywhere.

I think with the deepfake stuff, there's real commercial value. Again, from a producing perspective: for me to get anything greenlit so I can be in a position to hire VFX teams, you have to cast bankable talent, and you would be shocked to find out just how tiny that list is. There are actors you would be convinced would land you a North American sale so you could fund your project, and they're worth nothing; the list is that tiny, and those actors are booked out nearly all the time, which is why it's so difficult to get independent film made. But I think deepfaking with AI, especially coupled with real-time rendering and UE5, is going to let us exist in a world where I can pay an actor to use their likeness, and they'll have final say. I have a different actor on set, we use artificial intelligence to deepfake the star's face and voice onto them, and the star has approval of the performance, but that essentially gets these films greenlit. So there's a world where this deepfaking technology generates more greenlit movies, which means more work for all of us.

And what about AI itself? There are many companies and studios that use AI to predict what's going to be a successful movie. They've got databases saying that if so-and-so is in this film with this cast, there's a 75% chance it makes a lot of money. If AI has learned what a successful trailer or a successful type of movie looks like, do we just generate those types of movies? Where does that end?

Here's where the really scary thing happens, Jamie: a year or two from now, when all of these models can generate a pretty good temp score, a pretty good animated version of the film. If you think about what a script really is, it's the most detailed text prompt you could feed an AI on a project. So we're looking at a future where someone says, okay, we have a script, we're deciding whether to spend two and a half million dollars on this, let's give it to the AI, we'll watch the AI version of the film, and if it's no good we won't give them the money. I think we are marching at rapid speed towards somebody housing all of this under one big umbrella and offering it as a service to companies like Netflix. Saying it out loud, I'm now kind of wishing that's what I was doing, because it feels like there's going to be a ton of money in it. But that is where we're heading, and it's going to make actual artistic integrity within movies very difficult when it arrives, because first impressions on a film are everything.
Fun future films.

Well, yeah, it's going to be a real pain. And this is it: you're going to end up in a situation where those bigger movies get more and more homogenized, and then there's this space for AI-powered, Unreal Engine, real-time work. Even in virtual production terms, I can foresee very quickly that we'll be able to generate a virtual production background in 2D with Stable Diffusion, feed it to something like Google's Infinite Nature, and within two clicks of a button have a virtual set OBJ I can open up on a VP stage, and off we go. I think that kind of thing is going to increase the market value of independent productions. But there still has to be somebody gatekeeping, to make sure that people like Netflix aren't AI-generating whole movies to decide whether they get funded or not. So I guess that's where my vested interest comes in right now: trying to steer that model in a direction where it's just a handy tool we use to sell the film in the first place, without it becoming the point of the movie.

So the more detailed, descriptive writing you'd use to generate a trailer or a movie becomes the actual movie.

I think it's going to be a depressing future for script readers when this kind of tech becomes ubiquitous, in the next five years, say. You're not going to get your script greenlit unless you've written it in such a descriptive manner that it generates a good AI temp film for the people deciding whether to fund you, and that's going to make script reading this boring, arduous task where everything reads like prompts. So this isn't just a thing that's going to affect the art.

Isn't it already? I mean, it is, but it's already terrible.

That's an interesting argument you just made there, Simon, because in a weird way it ties back to what Anthony was saying earlier: when your background is in art or VFX, you become very protective of the craft, and when your background is as a writer, like me, you become very protective of the craft of screenwriting. But to a panel of VFX artists it's "that's the worst part of my job, having to read the script, it's so boring." AI is encroaching on all of those spaces; there's just no way around it. So the question is how we keep some level of integrity in every department and make sure we're actually using the AI to pay you guys more, not less. That's got to be the key to this.

I think a big key is definitely keeping people aware of what's happening. Because it's really easy for a 14- or 15-year-old right now, just discovering what they like in terms of entertainment, to fall into a vortex of self-replicating AI stuff that didn't really come out of a person; it came out of enough social media retweets, enough AIs taking that in and generating more. It's really easy for us as humans to dig ourselves into a single thing because it's comfortable and it's what we like. That's why some people only like one type of entertainment, and if you try to show them an old French movie they'll recoil without even watching it: "I'm not going to like that, I know I'm not going to like that, because I've spent most of my life liking this." The main thing with AI is that it's going to accelerate the generation of that stuff. At the moment I know people who love scrolling comics, those comics designed for phones; they're funny, and they always have that particular phone-comic style. That's probably the very first point of attack for AI, because there are some awesome AI comics out there already; there are AI comic books that are genuinely good. I think I've read about 25 this year where I thought, okay, that's actually pretty good.

I think conversations like this are super important, and educating people who aspire to work in the arts, whether as concept artists, in VFX or as film directors, really educating them on the power of using artificial intelligence to get their vision across, is important. Because the thing that worries me about AI is that 15-year-old you mentioned, who gets stuck in their own echo chamber: they make one image with one prompt, it blows their mind, and now they only work on variants of that prompt, so they never discover the artistic style they would have found through trial and error. That's what I think we can avoid, as long as we're educating people on how to use AI to express themselves, as opposed to getting trapped in that ArtStation echo chamber we're all concerned about.

Sure. Anthony, as a lecturer in concept art, have you found anything similar to what Tom just said? Have you found students using Midjourney to generate an image that then sparks inspiration, like "I've combined Blade Runner with Mad Max and now I've got an idea that's kind of amazing, I'm going to go off and do something for myself", rather than getting stuck in it? Using it as prompt words to inspire themselves and then generating something manually?

Yeah. I'm supervising a final-year student's project at the moment, and the reference board he's created for himself contains photography and film stills he wants to work towards, but also some Midjourney prompts of landscapes he wants to hit. So they are using it, and I think if we can somehow divorce the ethics of how these things work from the practicalities of how useful they are, then yes, totally, it works. It's like one of the techniques we'd already use: gather lots and lots of different photos, a little like the AI would, collage them together, put all the crazy blending modes on in Photoshop, and then make a little window you can look through, very much like seeing castles in the clouds. So they're using Midjourney in that kind of way.
I can see how it's really useful right at the beginning, for that initial, totally blue-sky thinking. But where I think they come unstuck, as artists, not as directors or screenwriters or producers but as actual visual artists, is how they progress from that point. Once they've got that idea, what do they do? Do they treat it as if they'd just created reference and then work traditionally as usual, or do they keep manipulating and fine-tuning their prompts to get what they want? It's so new; this is the first year it's even slightly become something students deal with, and I know they're using it in different ways. One student is very excited about it and wants to make it the focus of her entire final-year project, and I'm interested to see what she does, but she's also very paranoid about it, worried that people will judge her for working this way. I try to tell my students: you're a student, you're not doing commercial work, you're not making money off this, so don't worry about that. Use it as an excuse to research it; you're in academia, you might as well fulfil that part of being at university and see how it works.

I think as humans, when we care about the art we're looking at, whether it's writing or photography, the people who care about it intrinsically sometimes care about the effort more than about what they're actually looking at.

Oh yeah, and that's a whole thing, isn't it, the process. Even if you're a commercial artist, even a concept artist who isn't an Artist with a capital A hanging work in galleries, it's part of the production pipeline, but the process is still a big part of it; there's a pleasure in it.

Like when you get positive feedback: it took me ten attempts to create this piece or this animation, and the director is blown away by it, and it gives you that boost of, whoa, that feels really good.

I think if you're not an artist and you see someone who's really good at art, they're a little bit like a magical creature: you don't really know how they do it, but you're impressed. It's almost like a trick. Were they born with talent, or how hard did they have to work to get there? Even if you're a director or a producer, like you said, you can tell a good piece of art that's had a lot of work put into it, and that can't help but impress you. But there's also a certain amount of envy: when you see a magician do a trick, you want to know how it's done. And now AI will let people who don't go through that process, who don't put in the years of hard work and boring, boring practice of drawing the same fundamentals over and over, do the trick as well. I think it was inevitable, the way technology was going, and I find it fascinating to think about, but you've got to remember we're dealing with humans. It's a tool, and in VFX it's a great tool, brilliant for all the things we've talked about in Nuke and Houdini and Blender. But as an artist it's like this: you were working in the car factory, and initially Photoshop was a brand-new type of spanner that just did the manufacturing better, so fair enough, you got to grips with that. Then 3D came along, another, even better spanner you could use. But now AI comes along and it's not a spanner any more, it's one of those robots, and the robot is making the car, and you're really good at using the spanner. No matter how quick and brilliant you are with those spanners, maybe you have to learn to control the robot instead, and you don't need as many people to control the robot as you do when it's just people. I don't want to be accused of being against progress or anything like that, but it is going to have an impact on people.

Carlo, that's actually a pretty good segue into B2B, which is a lot of what you work on and work with. When you mentioned B2B, and it might not be the right thought to have at that point, I did think of companies that sell stock images. I won't mention any names, but we know the stock companies. When Midjourney first popped up, I have to admit I immediately thought, if I had shares in a stock image company right now, I'd probably pull them. Their business model is essentially to take honest work, things that took a long time to make, and put them up for sale to a wide audience instead of a single buyer, making each use cheaper but hopefully generating more money overall for the artist who puts it out there. With AI requiring very little effort, it could play into their hands: they could lawyer up and essentially take over some of the market by building custom models nobody else has access to, based on their own internal database, and take everybody else who used that database to train a model to court to make sure nothing from their stock website was used, concentrating their power that way. Or, if they don't catch up, it could all unravel before them: people will just use a service that generates stock images, and these stock companies will be left out in the cold. Does that ring true?

Yes, well, from my point of view, if you're working for a big company, they've got their budgets and they always want something as cheap and as quick as they can get it. A lot of the time I have to visualize an abstract idea; it's not necessarily a product, it's not even a service, it's the idea of the company, and they want to express that visually. So they might come to me for a bespoke set of images for a website or something,
and I can see that, with AI, someone in their marketing department, instead of writing an email to me with the brief, could just type that brief into an AI generator and get what they want. On the flip side, going back to what Tom was saying about it being a tool: I couldn't do what I do now without the technological advancement of things like Blender and Photoshop, because I can't draw that well. I can't get the ideas out of my head and onto paper with a pen; until digital came along I couldn't do it, and a lot of people saw that as "not real art, because you're pressing a button or just using a mouse".

I think everyone on this panel would agree that it's real art already.

Yeah, I know that now, but in some of the industries I've worked in, and I came a really long way, from the shop floor to draftsman to digital, even when I moved to doing drawings on a computer you'd get the old draftsmen coming up and saying, "well, you just press a button"; they didn't think the same amount of work was involved. I think that attitude has largely dropped away for digital art now; people recognize it as an art and know it's difficult to do. And I suppose AI can take it back the other way, where you literally are just pressing a few buttons. But you've still got to have the ideas, haven't you? You've still got to have that idea to get something out of it.

Yeah, which it can't replicate. For now.

For now, but it's moving fast.

Cool. How are we doing for time, out of curiosity? I'm happy to start concluding.

Can I just say one thing first? Sorry, can I ask you something? You said you think stock footage sites are in a bad position, but I would argue they're in a really, really great position, because they have the one thing the AI needs: the data set. That sets them apart from everything else. The scraped data sets are maybe useful for research purposes, but if this moves into higher-quality images, generating 4K or 6K or whatever, I would argue licensed stock images are the best way to go, without watermarks and all of that. And I don't know if you saw, but there's this partnership between Shutterstock and one of the image-generation companies, and they're actually going to pay a certain amount to the people who uploaded the images when those are used for training. I haven't gone completely in depth on it, but it was interesting to see some reaction to this.

Yeah, those were my thoughts exactly: they're either going to lose out or do really well, and if they scramble fast enough to concentrate the power they currently have, they'll be fine. I'm sure a few of them won't be.

I was going to say, I kind of agree; that's the story of the world. And I agree with Justin in the sense that, weirdly, the stock sites might be the answer to some of the concerns Anthony brought up earlier,
and that you brought up, Simon: how do you financially look after the artist? What Shutterstock has as a company is a giant user base already paying a monthly subscription fee, so they have front-end users, and they have the actual artists on the site too. If they only use imagery from their own data set, then it's exactly that: if you generate an image through their front end and your pictures were specifically used to train towards that image, they pay you a royalty, similar to what Spotify pays a musician. I think that's about the only functioning business model that looks after both the front-end user and the actual artists being replicated.

I'm probably not knowledgeable enough to know whether it's even measurable. With something like Midjourney, or Disco Diffusion, or Stable Diffusion, you type in a couple of prompts and you get a lot of random, sometimes very unexpected stuff. How measurable are those influences? That would take somebody far more knowledgeable than me about how these models are built. It may be that a company like Shutterstock needs some way of mechanically measuring who gets what; if you put 20 artists' names inside one prompt, working out what percentage of the resulting image belongs to this artist, that artist and the other becomes very complex.

I think it'll end up just being a flat fee for everyone: okay, everybody gets this flat fee, and it won't be as profitable as you're used to, maybe. I used to go to the record store and happily pay 13.99 for 11 songs on an album; now I get every song ever made for 11 quid a month. So we're unfortunately going to end up in the same position, where I can see ArtStation having their own AI generator...

Sorry, no, go for it, Tom, the audio is doing that funny thing again.

Yeah, I don't know, I'll fix it. [Music] [Laughter]
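To make the two payout ideas in that exchange concrete, here is a purely illustrative sketch: Simon's question is whether per-artist contribution can be measured at all, and Tom's guess is that it collapses to a flat fee. The numbers and function names are invented for illustration; nothing here reflects how Shutterstock, Spotify or anyone else actually calculates royalties.

```python
# Toy comparison of the two payout models discussed above. Entirely
# hypothetical: pool sizes, weights and names are invented for illustration.

def flat_fee_split(pool: float, artists: list[str]) -> dict[str, float]:
    """Every referenced artist gets the same share of the royalty pool."""
    share = pool / len(artists)
    return {name: round(share, 2) for name in artists}

def weighted_split(pool: float, weights: dict[str, float]) -> dict[str, float]:
    """Pay in proportion to some (hard-to-define) contribution estimate."""
    total = sum(weights.values())
    return {name: round(pool * w / total, 2) for name, w in weights.items()}

if __name__ == "__main__":
    pool = 100.0  # hypothetical royalty pool for one generated image
    print(flat_fee_split(pool, ["artist_a", "artist_b", "artist_c"]))
    print(weighted_split(pool, {"artist_a": 0.6, "artist_b": 0.3, "artist_c": 0.1}))
```

The hard part, as Simon notes, is that the weights in the second model have no agreed way of being measured, which is why the flat-fee model is the more plausible outcome.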
I was wondering: just like we have end credits at the end of a movie, would it be a good idea for an image, or the data set behind it, to credit the people it drew on?

Definitely, yeah. It's interesting that you mention Shutterstock and stock images, because they do get credited: if a publication uses stock imagery, you'll often see the provider credited underneath the photo in a journal or newspaper. And I think it's totally doable here too. If they can scrape billions of images and they know which URLs they came from, they can probably figure out some form of credit, and credit, even without money, goes a long way. There's something that stings about seeing a bit of your art out there uncredited. The problem is that the public doesn't really care that much, but artists do, and I don't think it would be difficult to do. I just think people are sometimes unwilling to credit because they don't want to dilute their vision: "it's my vision, I came up with the prompt, and I don't necessarily want to share the credit." Taken to the most cartoonish level, and sorry, I'm going to mention him, he was always going to come up: Elon Musk infamously said he would never credit art he found on Twitter and that artists shouldn't be credited. That obviously rankles artists an incredible amount, but a lot of people either agree with him or just don't care. I think it wouldn't hurt to put the credit on, and I find it puzzling that people don't want to credit others. I have my armchair-psychiatrist suspicions about why that might be, but I'm not trained enough to say.

Kofi, it's like when we've worked on a movie and maybe done a week's worth of cover work on, I don't know, a big film, and you don't get credited. It's that question of how much you need to work on something to get credited. If I've worked on a shot for a week, like a post-vis shot, and the final shot is pretty much that once they've filmed it, but you didn't work on it for four months, so you don't get credited. I guess it's similar: what percentage justifies how much credit a person gets? Even one week's worth of work should be recognized. The public won't care, but as the artist, watching it and knowing that week was hard graft, surely it would be cool to get that recognition, just a little name in there.

And at the very least, you can cram so much text into metadata; at the very least it should be required to be in the metadata. I know social media platforms strip metadata out, so that won't survive if you're posting on Instagram or Facebook or Twitter or wherever, and I have my own opinions about all of those companies. But the reality is, if you can't cram an entire massive prompt at the bottom of an image, which can be unrealistic because prompts get really long once you get convoluted, then at least isolate the artist names. You don't need to credit "a house on a prairie with a cow", but at least mention it if there's an artist's name in your database as a recognized artist. Obviously a lot of people share names, so that could be problematic too. And Anthony, you make a very good point: Shutterstock still gets a little "from Shutterstock" or "courtesy of" at the bottom of an image on the BuzzFeed front page. If BuzzFeed starts generating most of its images with AI, I think it's fair game to expect them to at least mention the artists they used in the prompt, if they used any. If they just ask for "a mouse on a bicycle", fine, fair game. But if they ask for "a mouse on a bicycle in the style of" some current, struggling artist, then they should mention that artist.
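A minimal sketch of that metadata idea: stamping the prompt and any referenced artist names into a PNG's text chunks with Pillow. The field names and file names are made up for illustration, and, as noted in the discussion, most social platforms strip this metadata on upload, so it only survives where files are passed around intact.

```python
# Illustrative only: embed prompt and artist credits as PNG text chunks.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

image = Image.open("backdrop.png")  # placeholder file name

meta = PngInfo()
meta.add_text("prompt", "a mouse on a bicycle, in the style of ...")
meta.add_text("referenced_artists", "Artist One; Artist Two")
meta.add_text("generator", "stable-diffusion (local)")

image.save("backdrop_credited.png", pnginfo=meta)

# Reading the credits back from the saved file:
print(Image.open("backdrop_credited.png").text)
```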
But do certain artists even want to be credited, if it's some other artist's name on work they didn't do?

Yeah, actually, you make a very good point. I have a very good friend, whose name I won't mention, but he designed a very famous throne; let's just leave it at that. He designed a very famous throne in a series a lot of people have watched, and when Midjourney became public he started getting a lot of Twitter mentions. It was art that looked inspired by his work, but it always said "by" his name, and because he kept getting tagged, he'd see it and go, "I didn't do that." He didn't like it. He absolutely hated everything that was being generated; he thought it looked terrible: "I would never paint that, I didn't make that, but everyone on Twitter thinks I did." He's still very angry about it.

So there was no intention on his part behind those images at all. Yeah, he's not happy about it in any way. That's a good point.

All right, I guess we can sum it up, and my last question is: can we live with it?

I don't think we've got a choice. Inevitable, I'd say.

Isn't the progress a bit too fast for us, though? As humans we've got a certain capacity for absorbing things. It's okay now, but in five years AI is probably going to be ten times more powerful than it is today. Will it put us off as humans, make us think, what's the point in learning that, because next week...

Exactly. I think we all just have to hope the AI keeps a few of us in a zoo for posterity and looks after those that are left.

Maybe they can use us as batteries to power...

Perfect. I feel like that's quite a good film, possibly.

Let's type it up, Tom, and pitch it.

I've just told the AI scriptwriter to generate that one. There we go. Okay, go get a buyer.

All right, great. Well, it's been a great discussion, so thank you guys for the time and for sharing your thoughts.

My pleasure. Thank you, Kofi, for pulling it together.

Yeah, thank you, it was awesome to chat with you all.

Bye guys, good night. Bye everyone. Bye guys. [Music]