Researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new technique called 'Interactive Dynamic Video' that allows users -- or virtual characters -- to manipulate objects in a video without actually moving anything in the real world. The technology could address one of the most persistent limitations in augmented reality: no matter how realistic a virtual character looks, it could previously only interact with flat, static scenes.
Interactive Dynamic Video is powered by a predictive algorithm. In its current form, users take a short video of the scene they wish to manipulate, and the algorithm registers the tiny vibrations of the objects it captures. It then extrapolates from these vibrations to predict how each object would move if more force were applied. The result is a video environment that users or virtual characters can push, pull, and otherwise interact with through the Interactive Dynamic Video system.
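The idea of treating observed vibrations as a physical model can be sketched in simplified form: each recovered vibration mode behaves like a damped harmonic oscillator, and once its frequency and damping are known, its response to a new, user-applied force can be simulated. The sketch below is a conceptual illustration only, not MIT's actual pipeline; the frequency and damping values are assumed stand-ins for parameters that the real system would estimate from tiny pixel motions in the input video.

```python
import numpy as np

def modal_response(freq_hz, damping, force, dt=1/240):
    """Simulate one vibration mode as a damped harmonic oscillator.

    In the real system, freq_hz and damping would be estimated from
    small motions observed in the video; here they are assumed values.
    `force` is a per-timestep sequence of applied force.
    """
    omega = 2 * np.pi * freq_hz
    x, v = 0.0, 0.0  # displacement and velocity of the mode
    trajectory = []
    for f in force:
        # Semi-implicit Euler step of: x'' = f - 2*damping*omega*x' - omega^2*x
        a = f - 2 * damping * omega * v - omega**2 * x
        v += a * dt
        x += v * dt
        trajectory.append(x)
    return np.array(trajectory)

# A brief "poke" followed by free ringing -- loosely analogous to a
# virtual character brushing against a bush and letting it settle.
force = np.zeros(480)          # 2 seconds at 240 steps/second
force[:10] = 50.0              # short impulse at the start
traj = modal_response(freq_hz=3.0, damping=0.05, force=force)
```

After the impulse ends, the mode oscillates and gradually decays back toward rest, which is the behavior the predictive algorithm exploits: observe how an object rings on its own, then replay that ringing under forces that never actually occurred.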
To put it in truly world-shaking terms: the MIT system could eventually let AR characters like those in Pokémon Go rustle bushes or ripple water on users' phone screens.