The new short film by the protagonist of Twilight has turned out to be a pleasant surprise for the machine learning community.
Computer science and art are two fields that do not usually overlap; the image we have of Hollywood stars is not exactly one of people with the knowledge needed to innovate in computing.
A study where you least expect it
There are exceptions, of course; Hedy Lamarr's case is by far the most famous. The popular actress laid the foundations for what we know today as WiFi with her co-invention of spread spectrum, and the influence of that invention is still felt today.
Kristen Stewart's invention does not reach that level, but it is perhaps more interesting in the current context. The name may ring a bell if I tell you that she is most famous for playing Bella Swan in the Twilight saga.
Yes, the infamous series of films, considered among the worst we have seen in the 21st century; both the films and the actress have won several Razzie awards (the so-called anti-Oscars). At the box office, though, they were a huge success, with profits of hundreds of millions of dollars.
Let's say that, if we go by stereotypes, Stewart is not the person from whom we would expect a detailed study on Artificial Intelligence. And yet, that is exactly what she has published on the scientific portal arXiv.
The short film by the protagonist of Twilight was processed with an AI
It turns out that Kristen Stewart is not only an actress but also a director; she recently made her directorial debut with the short film Come Swim, which premiered this week at the Sundance Film Festival.
So far the short seems to have received good reviews, attracting attention especially for its impressionistic style and its use of various alternative techniques.
The interesting thing is that part of that original style comes from the use of an Artificial Intelligence; that is what the study published by Stewart (pdf) is about, as she took advantage of the film to experiment with neural networks.
Neural networks are computing systems made up of interconnected units; they are able to learn on their own by communicating with each other, much as the neurons in our brain do.
Stewart's idea was to apply the style of a painting she had made to the movie; to do this, the team fed the image of the painting to the neural network and tasked it with applying that style to a frame of the film.
The neural network was able to capture the painting's style and transfer it to the movie frame; once the developers fine-tuned the effect, Stewart applied it to various scenes in the short.
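The technique described here is known as neural style transfer, and the usual way it works is by representing "style" as the correlations between a network's feature maps (their Gram matrices), then nudging the target image until its correlations match the painting's. The sketch below illustrates only that core idea in plain NumPy, using random arrays as stand-ins for the real convolutional features a pretrained network would produce; the function names and shapes are illustrative assumptions, not taken from Stewart's paper.

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) activation maps from one conv layer.
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Channel-to-channel correlations capture texture/"style",
    # independent of where things sit in the image.
    return f @ f.T / (c * h * w)

def style_loss(painting_feats, frame_feats):
    # Mean squared difference between the two Gram matrices;
    # optimizing the frame to reduce this pushes its style
    # toward the painting's.
    g_p = gram_matrix(painting_feats)
    g_f = gram_matrix(frame_feats)
    return float(np.mean((g_p - g_f) ** 2))

# Toy example: random "activations" standing in for real features.
rng = np.random.default_rng(0)
painting = rng.standard_normal((8, 16, 16))
frame = rng.standard_normal((8, 16, 16))
print(style_loss(painting, frame))     # positive: styles differ
print(style_loss(painting, painting))  # 0.0: identical styles
```

In a real pipeline the loss would be computed over features from several layers of a pretrained network and minimized by gradient descent on the frame's pixels, which is roughly the adjustment step the developers performed before applying the effect to the short.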
The end result is striking, to say the least. It is not the first time we have seen a painted-frame effect; it is something we can achieve with any image editor. What is innovative is that it is not just any painting: it is the style of the director's own painting applied to her own film.
This study may be a good indication of how cinema can evolve using machine learning. Each director could bring their personal touch to the image using AIs; who knows whether it could be as big a revolution as CGI (computer graphics) was.