"If you're looking at the real world using this, you can now poke and prod real things in the real world and see how they would move if you had poked them in real life," Justin Chen, a doctoral student who worked on the project, told NBC News.
In the case of "Pokemon Go," which was used as an example to demonstrate the technology, that means Pikachu, a virtual character, could bounce across a real-life bush, moving in accordance with the real-life vibrations and movements of the plant as it blows in the wind.
Pulling off such a feat would typically require building a 3D model, which can be expensive and time-consuming. For the MIT team, which also included doctoral student Abe Davis and Professor Fredo Durand, all it took was a few seconds of video and an algorithm.
"It provides a lot of the information we normally depend on 3D models for without actually having to capture a full 3D model," Davis said.
The researchers looked for "vibration modes" at various frequencies in video clips, the ways an object reacts to a force such as wind or a bang on a table. They then analyzed those vibrations to infer the physical properties of the object captured on video.
"An algorithm analyzes [the object] and uses it to determine how it will respond to arbitrary forces," Davis told NBC News.
This allowed the team to put virtual characters in real-life situations where they could then interact with their environments.
"The results pretty much looked better than we ever thought it would," Chen said.
Aside from its potential for games such as "Pokemon Go," the team believes the technology could be used in moviemaking, or by engineers who want to find out how an old bridge might respond to inclement weather.
Alyssa Newcomb is an NBC News contributor who writes about business and technology.