
Sunday, 30 July 2017

Neural Networks Model Audience Reactions to Movies


Deep learning software models audience reactions to movies

Blockbusters and tentpole movies have become mega events, not just for fans but also for the studios. A huge amount of money is at stake when a movie is released, yet for some time now many films have failed to deliver the results studio executives expect.

Engineers at Disney Research have developed new deep learning software that uses neural networks to map and assess viewers' facial expressions as they watch movies. The software is the result of a collaboration between Disney Research and researchers from Caltech and Simon Fraser University.

This deep learning software will arm studios with knowledge of how movies are likely to perform at the box office, using a newly developed algorithm called factorized variational autoencoders (FVAEs).

How it works

The software uses deep learning to automatically translate images of highly complex objects, anything from a human face to a forest of trees to moving objects, into sets of numbers. This step is called encoding, and the resulting compact numerical description is known as a latent representation.
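
As a rough illustration of that encoding step, the minimal sketch below (written in PyTorch, with made-up layer sizes and names; it is not Disney's actual model) compresses a small face image into a short latent vector:

    import torch
    import torch.nn as nn

    class FaceEncoder(nn.Module):
        """Toy encoder: compresses a 64x64 grayscale face crop into a short latent vector."""
        def __init__(self, latent_dim=16):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=4, stride=2, padding=1),   # 64x64 -> 32x32
                nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),  # 32x32 -> 16x16
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, latent_dim),                    # -> 16 numbers
            )

        def forward(self, x):
            return self.net(x)

    encoder = FaceEncoder()
    face_batch = torch.randn(8, 1, 64, 64)   # eight dummy face crops
    latents = encoder(face_batch)            # the "latent representation"
    print(latents.shape)                     # torch.Size([8, 16])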

From these latent representations the researchers could gauge how people react to movies, for example how much they were smiling or how worried they looked during a particular scene. In the next stage the neural networks are fed metadata, which helps build a better understanding of the audience responses.
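
One plausible way to combine a latent reaction code with scene metadata is simply to concatenate them and train a small predictor on top. The sketch below is purely illustrative and assumed rather than taken from the paper; the metadata vector and the "smile intensity" target are invented:

    import torch
    import torch.nn as nn

    # Hypothetical inputs: a 16-number latent code per face, plus a tiny metadata
    # vector describing the scene (here a made-up one-hot for comedy/action/drama).
    latent_dim, meta_dim = 16, 3

    predictor = nn.Sequential(
        nn.Linear(latent_dim + meta_dim, 32),
        nn.ReLU(),
        nn.Linear(32, 1),   # e.g. a single "smile intensity" score for that scene
    )

    latent_code = torch.randn(1, latent_dim)       # from the face encoder above
    scene_meta = torch.tensor([[1.0, 0.0, 0.0]])   # pretend this scene is tagged "comedy"
    smile_score = predictor(torch.cat([latent_code, scene_meta], dim=1))
    print(smile_score.item())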

The researchers are set to present their findings at the upcoming IEEE Conference on Computer Vision and Pattern Recognition in July.

Future prospects and applications of this new software

The research team has tested the software extensively to see how well neural networks can capture the way people perceive movies in real life. It was applied to more than 150 showings of nine blockbusters, ranging from The Jungle Book and Big Hero 6 to Star Wars: The Force Awakens.

In a 400-seat cinema, the researchers used four infrared cameras to pick out the audience's facial reactions in the dark. The tests yielded striking findings drawn from some 16 million individual facial images captured by the cameras.


The lead researcher has stressed that the amount of data collected by the software is far too much for a person to comprehend on their own. The FVAEs made sense of what the neural networks produced and surfaced some of the most valuable findings for the researchers, showing how the audience reacted to particular scenes and how filmmaking can be fine-tuned to strike a chord with audiences.
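
The "factorized" part of the FVAE name refers, loosely, to breaking the giant table of reactions (viewers by moments in the film) into a small set of shared patterns. The snippet below illustrates only that basic idea with a plain low-rank matrix factorization in NumPy; the real FVAE combines such a factorization with a variational autoencoder, and every number here is invented:

    import numpy as np

    # Invented "reaction" matrix: rows = viewers, columns = moments in the film,
    # entries = some scalar reaction score such as smile intensity.
    rng = np.random.default_rng(0)
    reactions = rng.random((400, 500))   # 400 viewers, 500 time steps

    # Low-rank factorization via truncated SVD: every viewer and every moment is
    # summarised by a small vector of shared factors instead of the full table.
    k = 8
    U, s, Vt = np.linalg.svd(reactions, full_matrices=False)
    viewer_factors = U[:, :k] * s[:k]    # (400, 8): one short vector per viewer
    moment_factors = Vt[:k, :]           # (8, 500): one short vector per moment

    # Multiplying the factors back together approximates every viewer's reaction
    # at every moment from far fewer numbers than the original table.
    approx = viewer_factors @ moment_factors
    print(np.abs(reactions - approx).mean())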

The software will not be limited to studying audience reactions to movies; it could also be applied to other subjects and places, such as a forest, where it could show how trees respond to different climatic and environmental changes. That understanding could later be used to create precise animated simulations of the flora around us.
