Issue 00 — Spring 2025

Christina Caspers-Römer on the smart tools behind the scenes that make space for creativity





Christina explores how cloud-based pipelines, game engines, and AI tools are reshaping VFX workflows. From closer client-vendor collaboration to rethinking IP and automation, the conversation reveals how innovation and creativity continue to make the real difference amid increasing complexity across the film production ecosystem.








GALLERY
Images © MARVEL, 2019 / 2021 / 2023. Courtesy of TRIXTER.



Christina Caspers-Römer is Managing Director at TRIXTER, overseeing production from acquisition to delivery for major studios and streamers and integrating cutting-edge technology into creative workflows. She is also co-founder of the Institute of Immersive Media. Previously Managing Director at Dark Bay, with a background in VFX and XR, she has worked on projects including The Marvels, 1899, Black Widow, and I Am Groot.



TRIXTER is bringing incredible creativity to leading film studios. It is also operating within a film production ecosystem that is becoming increasingly rich and complex. What technologies are having the biggest impact on how you work?

CCR  The industry is moving toward more collaborative workflows and shared standards. And at the heart of that shift, I’d say cloud-based pipelines and game engines are the most exciting technologies — both in virtual and traditional production.

Many pipeline tools today rely on USD — Universal Scene Description — a shared format that helps transfer data between applications and departments. That’s a great foundation, but the game-changer would be full cloud integration. We’re not quite there yet; we still exchange a lot of data manually.
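
To make that more concrete, here is a minimal sketch of how USD is typically scripted, using Pixar’s open-source pxr Python bindings (available via the usd-core package). The file name and scene contents are purely illustrative and not taken from any TRIXTER pipeline.

    # One department authors a simple scene and saves it as a .usda layer.
    from pxr import Usd, UsdGeom

    stage = Usd.Stage.CreateNew("shot010_layout.usda")   # illustrative file name
    UsdGeom.Xform.Define(stage, "/World")
    ball = UsdGeom.Sphere.Define(stage, "/World/Ball")
    ball.GetRadiusAttr().Set(2.0)
    stage.GetRootLayer().Save()

    # Another department, possibly working in a different application, opens
    # the same layer and reads the scene back without any manual conversion.
    stage = Usd.Stage.Open("shot010_layout.usda")
    for prim in stage.Traverse():
        print(prim.GetPath(), prim.GetTypeName())

Because any tool that speaks USD reads and writes the same layers, this kind of hand-off works between Maya, Houdini, game engines, and in-house tools alike, which is the shared foundation that cloud-based pipelines would build on.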

Traditionally, for example, you’d be sending emails, spreadsheets, separate breakdowns — often misaligned. What cloud-based pipelines promise is shared, real-time visibility. A director, an editor, a VFX vendor — all seeing the same data, at the same time, with no need for days of file transfers during principal photography and no missed communications.

It’s the same in post-production: if clients leave notes on specific shots in real time, vendors can instantly see and act on them. And those updates could feed directly into bidding sheets, giving everyone more transparency on cost changes as they happen.



Cloud-based pipelines and game engines are the most exciting technologies out there. A director, an editor, a VFX vendor — all seeing the same data, at the same time.




How close are we to making these cloud-based workflows a reality?

CCR  We’re getting closer, especially through collaboration. Right now, we're working hand in hand with some of our clients to co-develop these pipelines. Not just as vendors, but as partners helping shape the infrastructure. And that’s a big shift.

After the pandemic and the U.S. strikes, there's more openness. More recognition that if we want better results — on time, on budget — we need better systems. And that means listening to the people doing the work.

The investment required is significant, and we can’t build the full pipeline on our own. But we’re part of the conversation — and that, already, is starting to shape a standard. Long-term, my hope is that we’ll see a cloud-based tool that’s open-source, or at least accessible to smaller creative studios as well.




How do game engines fit into this picture?

CCR   Game engines are incredibly powerful across the production chain: from pre-vis to final pixel. We can use them at almost every stage. We still need tools like Nuke, Maya, or Houdini to achieve a pixel-perfect result. But game engines let us integrate real-time data directly into the workflow and establish an early shared language.

When you’re working with green screens, for example, you often need to check another monitor, switch tools, slow down. Game engines let us visualize things live. That saves time and opens up creative possibilities — especially during pre-production, where we can import existing assets, sketch out lighting and camera angles, and make decisions with everyone looking at the same virtual image.

Clients are also increasingly asking to reuse VFX assets — like characters — in other contexts: games, AR, advertising. Once permissions are in place and copyright holders are protected, that’s where the workflow matters. If we’ve created something in a traditional VFX pipeline, using ten different tools, we might need to rebuild it to make it usable elsewhere. But if we’ve built it in a game engine from the start, we’ve already taken that next step. “Build once, use many times” — that’s the mindset.

I always say: game engines should be used more in film production. Some providers have pulled back from film-focused development, which is a shame — real-time feedback doesn’t just save money. It boosts creativity.



Build once, use many times — that’s the mindset.




You’ve been calling for stronger dialogue between vendors and clients, as well as improved integration and interoperability throughout the entire production chain. How is the industry evolving in this respect?

CCR   Communication now happens much more on an equal footing. I’m working with clients who actively invite us to assess together which tools are most suitable to create the content they need. That kind of dialogue makes a real difference.

Of course, there are still practical constraints. Some of these tools rely on public data, which creates legal or confidentiality concerns when dealing with sensitive content. But larger studios are beginning to partner directly with developers to create offline versions, opening up new possibilities.

Standardization across the production chain isn’t just about technology, but also about finance, HR, and all the various systems that different studios use.

In the short term, I believe we’ll see more proprietary platforms built by individual studios. But eventually, we’ll need to align and standardize. That’s what will move the industry forward.



Eventually, we’ll need to align and standardize. That’s what will move the industry forward.




Are new AI tools also significantly shaping creative workflows?

CCR  I’m lucky to work with a team that embraces AI. They’re not afraid these tools will take their jobs — because in the end, AI is just another tool in the toolbox. It can help automate the kind of repetitive tasks no one enjoys. I remember a project where we had to remove a tiny chocolate stain from an actress’s face — frame by frame. That’s not the kind of work anyone’s proud of. If AI can handle that, we get to focus on what really matters.

At TRIXTER, we have dedicated chat spaces where people can share their latest nerdy developments. Within the Cinesite group, to which TRIXTER belongs, there’s a whole division focused on integrating the latest VFX technologies into daily workflows.

That said, our clients are very clear: no AI tools can be used to create an image or a character. Otherwise, the content simply isn’t usable in production. So we follow that rule strictly — we build everything from scratch, using carefully selected references.

Still, I’d say it’s a gray area. To me, the real question — especially when it comes to IP — is: how different is prompting from browsing thousands of images of a “red car” and then creating your own version? Isn’t prompting sometimes a faster way of doing that?

The fear, of course, is that even a small AI-generated detail — the shape of an eye — could trigger a rights claim. So maybe the real question is: do we need to renegotiate what IP is?

When I’m building a new asset — say, a generic car — I’ll define the form, the texture, the color and materials. Suddenly, it’s a specific car. But the base model could easily be reused to create thousands of others. The same applies to animals. At TRIXTER, we’ve made a lot of cats! And in the end, a cat isn’t so different from a lion. Maybe we could start from the same model, make it bigger, and add longer hair. It’s about reusing structure, and changing texture.



Maybe the real question is: do we need to renegotiate what IP is?




So what role does creativity still play when much of the process is automated and standardized?

CCR   Let’s face it: time and budget are always limited — but the release date rarely changes. So we need to work faster, with more iterations.

Take one of our current projects, where the client said, “We don’t like how this character walks.” Our supervisor came back asking, “And now?” So we go back to what we know best. We ask: Where does this character live — on the ground, in the air? How does it protect itself from rain or snow? What is it afraid of? And we let those answers shape the way it moves.

We have a concept art department, and our animation department has a motion capture room where our animators can jump into the MoCap suit themselves and experiment with how the character would move. It means we can iterate quickly — and if we had everything integrated into a cloud-based tool, we could also track all those decisions more transparently.

In the end, that’s where the real value lies: when technology doesn’t just automate, but amplifies the work we love doing.