Actor Tye Sheridan may not consider himself a technology expert, but his knowledge of visual effects (VFX) processes led him to cofound AI-driven startup Wonder Dynamics. With the company’s new product, Wonder Studio, creators can upload 2D video and transform it into 3D animations at a fraction of the cost of the motion-capture animation process typically used by Hollywood studios.
Tye joins this episode of the Me, Myself, and AI podcast to talk about the genesis of his company, and he shares his views on artificial intelligence’s impact on creativity in the film industry and the opportunities it can offer creators.
Subscribe to Me, Myself, and AI on Apple Podcasts or Spotify.
Transcript
Sam Ransbotham: Job loss from AI gets lots of attention because increasingly capable machines can do more and more every day. But these increasing AI capabilities can also help create jobs.
On today’s episode, learn how one Hollywood actor is using his industry knowledge to help creators without massive Hollywood budgets.
Tye Sheridan: I’m Tye Sheridan from Wonder Dynamics, and you’re listening to Me, Myself, and AI.
Sam Ransbotham: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I’m Sam Ransbotham, professor of analytics at Boston College. I’m also the AI and business strategy guest editor at MIT Sloan Management Review.
Shervin Khodabandeh: And I’m Shervin Khodabandeh, senior partner with BCG and one of the leaders of our AI business. Together, MIT SMR and BCG have been researching and publishing on AI since 2017, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.
Sam Ransbotham: Hi, everybody. Today, Shervin and I are pleased to be joined by Tye Sheridan, cofounder and president of Wonder Dynamics. But you may be more familiar with Tye from films like Ready Player One, X-Men: Apocalypse, and Mud. I did my homework by watching movies over the weekend here. Tye, thanks for taking the time to talk with us. Let’s get started.
Tye Sheridan: Thanks so much for having me. And thanks for doing some homework on me, apparently.
Sam Ransbotham: Well, the homework was easier than usual. Although, I mean, some of the homework involved looking at patents that you’ve filed, which was something I wasn’t expecting.
Tye Sheridan: Oh, really?
Sam Ransbotham: So maybe we can get into that, too, a little bit later. But if we think about a movie like Ready Player One, there’s clearly a bunch of AI in the content of that movie, but that’s not what we’re focusing on here. Tell us about Wonder Dynamics. What’s going on with that?
Tye Sheridan: Wonder Dynamics is a company that I started with a friend of mine, Nikola Todorovic. Nikola comes from a visual effects background: he started as a visual effects artist, later became a supervisor, and is a filmmaker as well.
For folks listening who don’t know who I am, I kind of came up through the world of film as an actor. I got involved in films from a really early age and fell in love with it, but I was always completely obsessed with filmmaking technology and the entire process.
When I met Nikola, we [were] working on a very small film called Last Days in the Desert. This was over 10 years ago. We were both kind of wanting to make our own films one day and knew we wanted to make big films and knew that we didn’t have the budget to do so. And so we really kind of started collaborating together and started developing projects, and then that’s when we realized, “Oh wow, this is going to cost $150 million. Where are we going to get a check for this?”
And so that’s really where Wonder Dynamics was born — really out of our frustration of trying to make big films and not necessarily having the budget. Wonder Dynamics’s entire mission is to enable indie filmmakers anywhere in the world to make studio-level movies with CG [computer-generated] effects but on an indie budget. We do that with a suite of AI/machine learning models running in our flagship platform, Wonder Studio, which we launched back in July [2023].
Wonder Studio is an AI-driven VFX studio. You upload any 2D video, and you can extract a lot of data out of that video, from 2D pose and the lighting in the scene to the noise and the way your camera is moving, and there are 25-plus machine learning models running in the background. And then we take all of that data and we enable it to be exported into 3D software so that artists can continue working in the traditional software that they are used to working in across the industry.
Sam Ransbotham: That makes a lot of sense. Let’s go into the cost part, which I think is a pretty fascinating thing. Is that the future of movies — that we’re not going to need big budgets? Or is there still going to be room for big budgets? How do you see this ecosystem playing out?
Tye Sheridan: I hope so. I think that to date, only a certain handful of people or studios have been able to make certain kinds of films. And it’s also not only limited by your budget, but it’s limited by where you live.
If you’re not in Hollywood, to date it’s been very difficult to break into the industry at that level. And I think that’s the real excitement around these new innovations, especially in AI and machine learning: they present a huge opportunity to artists and to the industry. And I think the industry as a whole is in … we’re going through a rough patch; let’s just put it that way.
[With] the global box office taking a huge hit during COVID, and the quickly evolving landscape of distribution, the industry’s been totally shaken up, and it’s really hard to get any film made, much less a huge $200 million movie with $100 million worth of VFX.
I do think we’re on the cusp of a new era, and … it’s not unlike digital cameras or the advent of the camera itself. Technology has been pushing our industry forward since it started. We wouldn’t be able to make the films that we’re already making without certain technologies that we have in the industry currently.
Shervin Khodabandeh: Can you walk us through what the normal process would’ve been and, with Wonder Studio, whether that process is almost the same but now without the technological and cost limitations that they would have had, or is this a completely new process? From a user’s perspective — is it like they need to learn something completely new, or is it almost the same? I’m just curious how the ways of working of the artist and the creative team are changing because of this.
Tye Sheridan: Yeah, it’s a good question. So in theory, you could just get a 2D video out of Wonder Studio, and for some people, it’s going to be good enough for their final VFX. For someone who’s not a 3D artist, it might be good enough: if you’re a 12-year-old boy and you want to go out and make a sci-fi short film with your friends and have some cool visual effects, that’s fine. If you’re working on a Warner Bros. science fiction movie … let’s just say you’re working on Avatar — the next Avatar. Well, you know, there’s a certain standard, right?
And Wonder Studio will only get you part of the way. So you can think about it as your first visual effects pass — getting you 80% there. And then you take what Wonder Studio gives you, speeding up your process tenfold, and continue to work on that.
Traditionally, the way it works — and … I just mentioned Avatar, or you can take movies like Ready Player One, which I worked on, that Steven Spielberg directed, where we spent eight weeks in motion capture — traditionally, the animation is motion capture. So you need a huge volume with this Vicon system, with infrared cameras and sensors that are basically tracking those little balls that you’ve seen people wearing with the funny suits, and that’s what’s giving you your animation data.
Now, you still have to go and clean that up. It’s not perfect. There are going to be a lot of artifacts within the animation data itself, even with that method, but that method is very expensive — I mean, to rent these stages. That’s why only studios, really, can do that. And then, if you’re an animator yourself, in theory, you could go and hand-animate this animation, but it takes a long time.
So what Wonder Studio essentially is offering to artists is an avenue of achieving that animation at a super-reduced cost.
Shervin Khodabandeh: And faster, too, probably.
Tye Sheridan: And faster. You can upload, let’s just say, a one-minute video and extract all the animation data and get to your “first pass” of animation in an hour.
And, yeah, it’s not going to be perfect in every shot, but it might be pretty darn close in a lot of shots, you know? Sometimes it’s pretty close to final. But, again, Wonder Studio is really not built to replace the process. It’s really there to supplement the artists’ process, to speed them up, to create opportunities for filmmakers or storytellers who have a big idea but don’t necessarily have the resources or the knowledge to go and see that idea [become] a reality.
Shervin Khodabandeh: Yeah. You know, I’m looking at you in our video Zoom, and you’re a young, quite accomplished actor, and folks know you from your films and all that. They may not know your background and the patents and all of the sort of technical stuff that you’ve been doing. Tell us more about how you got into that, and when did you have the time to learn all these things? Just tell us a little bit more about your technical background and how it came about.
Sam Ransbotham: Maybe there’s a lot of downtime when he’s wearing the [motion capture] suit.
Tye Sheridan: The truth is, I don’t really have a technical background. I grew up in a really small town in East Texas, and working on films — the idea of that was just so far-fetched. I couldn’t even fathom that it was a thing you could do as a job. That’s how far away from the industry I was.
I really randomly kind of fell into the industry. I got randomly invited to come and audition for this film called The Tree of Life; it was directed by Terrence Malick. But essentially, they recruited 10,000 kids in the state of Texas through the public school systems to come and audition for this film. And I just got an invitation, and my parents said, “Oh, why don’t you go to this audition? You can always tell people that you got to audition for a movie one time.” I said, “Yeah, that sounds cool. Why not?” So I went on a whim. And after a yearlong audition process, I ended up getting cast in this film. I was 11 years old at the time. So the first day on the film set, acting was just as new to me as the entire process.
And I just became obsessed with it all. So, really, I say I don’t have a technical background, [but] I do when it pertains to storytelling.
For me, it’s the personal connection that’s interesting. So in terms of technology that you use in the filmmaking process, I’m all over it. I’m all about it. It’s very interesting to me. Everything else is … I love cameras, that kind of stuff. Beyond that, I would say my technical background and knowledge is very, very, very limited.
Shervin Khodabandeh: But it’s a lot of the design thinking and understanding the pain points and the desirability of a product like this.
Tye Sheridan: Right. And that’s been our approach the whole time. Nikola and I are artists ourselves. We really have been approaching this as artists building out of frustration with the process versus “Hey, all these new research models are at a certain place now, and why don’t we start a company to generate videos for creatives?” We have a very different approach. We’re coming at it from a completely different direction. And I do think that’s partly what makes Wonder Dynamics unique in comparison to maybe some of the other generative tools out there or AI-driven products out there.
Sam Ransbotham: These tools are … I don’t want to simplify them and make them sound too easy, but the barrier to getting into these tools is so much lower than it was before, which then lets somebody who has more knowledge of the process come in and create a tool versus having to start in the research lab, like you were saying, Tye.
Shervin Khodabandeh: No, it’s so true. And look, I mean, Tye: Sam and I have been researching AI and adoption for eight years, and a lot of my work at BCG has been over the last decade or so around implementing AI and coming up with AI strategies for companies and all that. But the missing link has always been around process, right? It’s been “data and tech only gets you so far” — models and all that. At the end of the day, the biggest lift is, you’ve got to change somebody’s way of working, you’ve got to change the process — all of that.
And so, if I look back at all of our podcasts, all of our conversations, we always start with, “OK, this is this wonderful technology, wonderful product. How do you get people to use it?” Whereas your story is very refreshing, where you’re starting with a problem, but because technology has become so much more available now than it was maybe a decade ago, the barrier for you to create something that’s really disrupting the industry, that’s helping, is much lower than it would’ve been, and it’s really, really refreshing. I think this is such an inspiring story that we’re hopefully going to see a lot more stories like it, where you actually start from the user and work your way to the technology.
Tye Sheridan: I think that what we learned is that good technology without good presentation is a wash.
Shervin Khodabandeh: I want to ask you something maybe a bit more philosophical or existential. That is, we’re seeing these technologies advance quite rapidly, at a much more accelerated pace than before, right? I mean, it took over a decade for computer vision to get good enough. It took less than half a year for large language models to really, really get super cognitive and able to understand and have logic.
So if you play that forward, you know, three, four, five years from now, clearly it is encroaching upon a whole bunch of things that we used to say, “These are uniquely human things,” including the creative process, including the open-ended brainstorming things. And we’ve seen images and videos created by these large language models; we’ve seen poetry and entire scripts and movies and all that.
So my question is, how do you see the future and the human role in the creative process itself? And the second part of that question is, do you think there should be some hard regulators or limiters in what should be allowed?
Tye Sheridan: It’s a good question. I mean, I think we often get so caught up in what’s happening in our day-to-day and what the new tech is and how it’s affecting things, and I think we forget that, fundamentally, as a species, we build tools. We build tools. That’s what we do. They make our lives easier, and our survival rate goes up. Essentially, that’s what we’re doing, and I don’t think you’re ever going to stop that. It is our nature to keep building tools, to, I guess, make our lives easier, right?
But it makes sense for people to get defensive about something that so quickly has changed, fundamentally, the entire ecosystem or the landscape of an industry or has the potential to fundamentally change the landscape of an industry. It’s scary. That’s our natural kind of mechanism — to be defensive.
But I think the more important thing that we should all be doing is having these conversations. Talking, posing the exact same question you just posed to me back at all of your friends and your family and your coworkers and thinking about, well, what does the future look like? And I’m glad you said three, four, five years because beyond that, who knows? It’s all moving so rapidly.
In terms of regulation and how we balance this innovation with trying to maintain the integrity of industries and people’s livelihoods, it’s a very, very difficult question. I do think we have to be cognizant of it, and we have to be approaching it ethically and very, very thoughtfully as innovators [regarding] the effect that it will have on society as a whole.
That’s super important, but I think for me, I try to look at these things optimistically and look at the ways that it can improve people’s lives [and] create new possibilities. And I think people had a similar reaction when the internet came about.
Sam Ransbotham: Yeah, it’s not our first time around with this.
Tye Sheridan: Yeah, exactly. It’s not our first time around. We’ve had technologies that have come into play, and it’s shaken up our world in a huge way, but they’ve also created a lot of new opportunities. I’m really excited about a future where our lives are better — where technology makes our lives better. And, yeah, I also think, today, we’re so immersed in our own technology devices.
But I’m curious: Does AI alleviate some of that and, in some way, allow us to have more human connections? I don’t know.
Sam Ransbotham: I think we’re all kind of hoping that. I mean, you mentioned the time factor. We’re all crunched on time, and we’re back-to-back scheduling things, and that has to help out there.
At the same time, there’s some other startup out there right now that is thinking about eating your lunch too. And that’s the perpetual dynamic there. There’s another technology coming down the pike. How do you think about both of those things?
Tye Sheridan: It’s a great question. I was in a really interesting place during the [2023] strikes in the industry. A big point of contention was around AI. Of course, with the advent of ChatGPT and some of these large language models, writers were very concerned, and I think that the Screen Actors Guild became very concerned that studios would take someone’s likeness and then just go make movies with that likeness without the artist involved.
And I think, one, looking at it from the outside, I think there was a lot of misunderstanding of what was currently possible. I think that people were giving a lot of credit to technology that maybe existed but [was] in its infancy still. But regardless, fundamentally, we’re right to be concerned about artists and thinking about how this is going to impact their careers and their lives. I think that was a really good example. I think we’ll continue to see that conversation evolve.
And, yeah, I was in a really interesting place because, you know, [I am the] cofounder of a company that’s technically building AI-driven production tools, and then I’m an actor as well. I think for us, we’ve always really been thoughtful about that — that we wanted to build tools for artists and not encroach on their process.
Beyond that and building a great brand, if you can get in the artists’ corner and build something that they love and they find very useful and continue to expand on that and continue building, I think it’s really the only way to stay competitive.
Sam Ransbotham: I think everyone’s facing that.
Shervin Khodabandeh: You said something quite simple and quite deep that sort of brought this all back to our humanity, which is that we build tools. We’ve been building tools [throughout] all of our evolution. And this is not the first time we’ve built a tool that could potentially be dangerous. I mean, you think about the automobile and electricity and …
Tye Sheridan: Oppenheimer.
Shervin Khodabandeh: You know, all of it. All of this. There’s always been a trade-off between the power of the tool and the unintended consequences, the harm it could have done, and we’ve threaded that needle with some dialectic process over time [and] not always gotten it perfectly, but we’ve always sort of landed. There’s a futurist-optimist view here that says there is actually no reason we would not expect the same this time around, right?
Sam Ransbotham: Predictive analytics says that we’re going to have this; we’re going to come out all right.
Tye Sheridan: That’s right.
Sam Ransbotham: So, Tye, we have a segment where we’re going to ask you a bunch of rapid-fire questions to kind of close this out.
Tye Sheridan: OK.
Sam Ransbotham: Short answers are fine. What’s the biggest opportunity for artificial intelligence right now?
Tye Sheridan: I always go back to the humanity thing and human experiences. To have better human experiences and live in a world where people are happier, people are more loving to each other, people understand each other more. I think that is the biggest opportunity with AI.
And I think that’s why storytelling is important to me. That’s why I’m completely obsessed with it. One of the most important tools we have is telling stories because it allows us to understand each other, it allows us to connect in a realm that’s some in-between world, where there are two minds coming together and colliding with the same idea in a union of understanding, and that is powerful.
So AI-driven storytelling, obviously, is an interest of mine, but I’m really excited about the opportunities there.
Sam Ransbotham: And, to your point, I mean, you’re building tools that can help more people tell better stories. So, what’s the biggest misconception that people have about AI?
Tye Sheridan: Well, there are a lot of misconceptions, but I’ll just say, specifically in the film industry, that it’s going to replace everyone overnight and that it’s going to replace the creative process. And I don’t necessarily believe that. Artists are essential, and I think there’s something innately human about that creative process that’s hard to replace.
Sam Ransbotham: So growing up in Palestine, Texas, what was the first career you wanted? You said you didn’t want to be a film star or didn’t even think that was a possibility. What did you want to do?
Tye Sheridan: I wanted to be a baseball player. I played year-round pretty much until I was about 16, and then I kind of had to choose. I remember I had to choose between the film thing and baseball. I just didn’t have time anymore to do both. And it was the right decision. I was never going to make it as a baseball player.
Sam Ransbotham: You don’t know that.
Tye Sheridan: Yeah, yeah; that’s true. I was decent. Not, you know, probably not all-star level.
Sam Ransbotham: So when is there too much artificial intelligence? When are people using the tool in places that they shouldn’t?
Tye Sheridan: When we’re all gone.
Sam Ransbotham: When your first scenario doesn’t play out, huh?
Tye Sheridan: Yeah, exactly. So when is there too much artificial intelligence? I think when we’re all sitting at home and not doing anything, not having any outside experience.
Sam Ransbotham: Actually, I wasn’t thinking about this, but that’s a perfect tie-in to Ready Player One. I know that you end that movie by turning off the OASIS two days a week. What’s one thing you wish that AI could do that it currently can’t?
Tye Sheridan: Oh, man, that is a great question. Printers. If AI could just make the whole printing ecosystem easier, that would just really save a lot of people some frustration.
Sam Ransbotham: We haven’t heard that one; that’s a good one. Tye, it’s been great talking to you. I think one of the things you’ve really brought out is this idea of enabling the creative process versus replacing [it]. So much of the discussion is about replacement thinking, and you really point out how AI could be a real enabler for human experiences. We appreciate you bringing that to our show. Thanks for taking the time.
Tye Sheridan: Thank you guys so much. Thanks for having me. It was a lot of fun.
Shervin Khodabandeh: Thanks for listening. On our next episode, Sam and I speak with Paul Romer, former chief economist at the World Bank and director of Boston College’s Center for the Economics of Ideas. Please join us.
Allison Ryder: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn’t start and stop with this podcast. That’s why we’ve created a group on LinkedIn specifically for listeners like you. It’s called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share your insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We’ll put that link in the show notes, and we hope to see you there.