Prior to venturing into VFX and CGI – his childhood passion – Method Studios’ VFX Supervisor Ivan Guerrero had a career in genetics and molecular biology, working in a lab at Harvard Medical School on the Human Genome Project. “I’ve always had a love of nature and science. There’s a rigour and discipline one learns to use when analysing and dissecting how natural phenomena work, and part of that comes from learning the scientific method, and part of that comes from doing practical labs.”
In the early 2000s, Ivan had been studying 3D animation at the University of Colorado, and had become well versed in hair simulations. This led to him working as a technical director on a CGI film at Curious Pictures in New York, where he learnt about asset management and the power of programming and automation in the animation pipeline. His career in New York took him to Framestore, MPC, The Mill, Hornet and Rhino, getting as much on-set experience as possible to observe how VFX supervisors worked.
Ivan joined Method Studios in 2013 as a lighting technical director, and ascribes his development into a successful VFX supervisor – a role he officially took in 2019 – to the VFX supervisors who mentored him over the last decade, including Dan Glass, Sean Konrad and Jim Rider, as well as creative directors like Jon Noorlander and Johnny Likens.
Speaking to LBB, Ivan discusses the changes and challenges of the VFX industry today, why his team dissected a chicken for a beer ad, working on ‘Black Panther’, and how new AI tools and other emerging tech are making the impossible possible.
Ivan> I think there are a few misconceptions among the public about VFX. A big one lately is that VFX somehow diminishes the quality of a movie or TV show because it wasn’t completely captured ‘in camera’. Another misconception is that VFX = CGI, and that’s just not always the case. In fact, one could argue there are far more non-CGI VFX shots done for movies, TV shows and commercials than shots with CGI.
Ivan> Invisible effects can be quite satisfying to pull off, and they are some of the most important work we do in service of storytelling. There are many challenges that come with this type of work, including little or no planning before a shoot, the need to gather as many real-life references as possible, getting solid match-move and paint-over work done, short turnaround times, and managing viewers’ expectations. The hardest situation to be in is when production doesn’t have the time or budget to plan and execute something, but they must shoot it anyway, and it becomes a ‘fix it in post’ situation, often with a very short turnaround window. Invisible effects require a certain amount of time, budget and the right artists to execute, and usually one or more of these elements is compromised.
Big, glossy effects are a different beast altogether. They share similar challenges with invisible-effect shots (planning, time constraints, budget, etc.), but the execution often needs to work across many shots or scenes. This necessitates modular setups that can scale, a solid pipeline, and strong leads and supervisors to bring it all together. These types of shots typically require many teams to pull off (concept art, pre-viz, integration, animation, rigging, texturing, lookdev, FX/simulations, tech animation, 3D lighting, compositing, etc.), which demands a great deal of coordination and data sharing to execute well and meet the filmmakers’ demands.
Ivan> To ensure a clear vision, utilise references, shooting boards and previsualisation for complex shots, and include the DoP in VFX discussions, especially for commercials. VFX supervisors are allies, not obstacles; build trust with them for a successful production.
Ivan> The very start of a project usually begins with understanding the scope of work. Usually, things start with a producer or EP, a script or treatment and, in the case of films and episodics, a bid package. I’ll often start by breaking down the script or treatment, sorting what could be practical, CG, 2D, digital matte painting, miniature, etc., and coming up with rough estimates of the man-hours for all that work. This allows us to work out what staff and freelance artists we may need and plan how many people it will take to deliver the work by the delivery date.
We’ll also need to work out with our technology team what our tech demands might look like. Things like our render farm needs, GPU and workstation needs, software and licence needs, etc. The producer and I will also work in conjunction with our comp, FX and CG supervisors and creative directors to home in on the most effective and impactful way to carry out the various tasks.
Sometimes we’ll need to seek a compromise with the agency and/or director based on budgets and short timelines for delivery. Once scope and process are aligned, VFX production kicks off with crew formation, scheduling, and asset delivery roadmaps.
Ivan> We have a term that goes by a few different names. For us it’s ‘CBB’, which stands for ‘could be better’. As you run through dailies and see the progress of the work, with a rough idea of how much time you can invest in a shot, you have to start making calls on what you can live with being acceptable rather than perfect – what you’re willing to invest, both financially and physically/emotionally, to make as awesome as possible, and what the client is willing to accept to meet their artistic and/or commercial goals.
At the end of the day, we are providing a product or service to a client, so they dictate when it’s ‘final’. There are times when a shot may have something that few people notice, but I’m not 100% happy with it, and I know that with just a little more effort, or a render fix, or a simulation or animation tweak, we can turn a CBB into a solid final shot, or even a good final shot into a great final shot. Sometimes it’s hard to let go of those shots, but you always do, because soon enough new work comes, money runs out, or the show has wrapped, and you must move on.
Ivan> I don’t think my answer is a shocking surprise, but for me the exponential rise of AI, machine learning and real-time graphics is clearly the tech of the future and incredibly exciting for VFX. These tools can help us achieve greater clarity in visual storytelling, particularly when it comes to speed of execution and iteration. It’s not a secret that schedules for productions keep shrinking, imaginations continue to expand, and bids are getting increasingly competitive.
I’ve also found that when we are pitching for work, AI tools – from generative models like Midjourney and Stable Diffusion, to Nuke’s Cattery and Flame’s ML tools, to offerings from Wonder Dynamics – plus packages like Unreal Engine allow us to quickly put together strong presentations that would have been almost impossible to create even just a few years ago, given the time constraints we continue to encounter.
Ivan> Real time has become quite a versatile tool in the VFX toolkit, gaining traction in virtual production (VP). In some ways, VP front-loads the VFX process by forcing directors and producers to have a clear idea of what they want in their virtual environments, since everything needs to be built before you start shooting. The trouble comes when the virtual production content is not quite what you want and, depending on the VP stage, doesn’t give you all the coverage you need, so you still have to extend or sometimes outright redo the background anyway. However, it offers a much better experience for actors and excels at environments, effects, DMPs and pre-vis. Real time also streamlines workflows and integrates with AI models for crowds and digital doubles. Powerful GPUs – the key hardware – are energy-intensive and difficult to source, however. Despite these limitations, traditional VFX artists are impressed by the potential of real-time graphics, seeing it as another tool to tackle tight deadlines, shrinking budgets and rising expectations.
Ivan> My background in genetics fostered a love of science that I carried into VFX. Research is paramount for me. For a digital eagle project, I dissected chickens with my team, consulted experts, and studied real falcons.
We were tasked with creating a digital eagle for a Tecate advertising campaign. I knew a bit about bird anatomy, but I was more versed in mammalian anatomy than avian. I felt the best way to understand the process was to dissect a chicken, like we would in the lab, so I took our riggers, texture artists, modellers and animators to our kitchen, and we carefully dissected a couple of chickens to not only see the anatomy, but also push and pull the various parts of them to see where the limits were, filming and photographing the whole process for later reference.
I spoke with supervisors who had worked with birds before at Animal Logic and Blue Sky Studios and learned some of the things they did to cheat these types of actions. To ensure the eagle's realism, we went beyond reference videos. We obtained high-resolution feather scans from US Fish and Wildlife and even observed live falcons to understand their hunting behaviour. This meticulous research is what allows VFX to create believable creatures that resonate with audiences.
Ivan> If I’m taken out of the story by something that’s seemingly a visual effect, then my supervisor ‘spidey sense’ starts to analyse the picture instead of enjoying the picture. When I go into a movie like the latest James Cameron ‘Avatar’ film, I’m already in that analysis mode, so it’s up to the filmmaker to pull me in with their storytelling and characters so that I stop thinking about how it was made and just become immersed in it. When it’s well executed, I’ll often go see the film a second time and watch it from a craft perspective.
Noticeable VFX often suffers from inconsistencies between live-action and CG elements, like mismatched motion blur or lighting. Unrealistic matte paintings lacking depth, visible seams where elements are combined, and unconvincing simulations (even for fantastical creatures) can also be giveaways. Animation flaws include characters that don't move realistically for their size or lack subtle details like hair and muscle movement. Finally, lens flares that clash with the live-action footage, especially in anamorphic shots, can break the illusion.
Ivan> I was enormously proud of the work our small Method New York team did for the first ‘Black Panther’ film. Our Method Vancouver office was overwhelmed by a huge last-minute ask from Marvel, with only two months until the film’s release. The Vancouver office needed us to concept, simulate and render the energy shield elements for them to comp, which were used in the climactic final battle of the film. We had roughly six weeks to pull it off, with weekly discussions with director Ryan Coogler and the Marvel team about the intricacies of how the shields would deploy, look and break down. It was a wild ride! At the time, none of us knew how the film would be received. I recall we delivered everything about three weeks before the film was released in early 2018, and the rest is history.
Ivan> There was an incredible commercial/PSA from Sheba, called ‘Hope Reef’. I believe Framestore did the work, and created a photoreal underwater reef ecosystem that was breath-taking and so well executed from a photographic perspective – the CG is a treat to watch.
There was also a commercial for Ladbrokes, using footage from the first ‘Rocky’ movie’s iconic training run up the steps of the Philadelphia Museum of Art, but incorporating various hordes of fans (and vehicles) joining Rocky on the run up to the climactic moment when he reaches the top of the stairs. It was a VFX treat, very well planned and brought to life.