We have Virtual Reality and Augmented Reality, but what if you could explore live scenes, whether in the next room or on a distant planet, using the same head-mounted gear?

You may recall the scene from Blade Runner where Rick Deckard uses the voice-activated Esper Machine on a photo to zoom in, enhance, track left, pan right, moving around the room as awkwardly as if he were there. Jasper van Loenen’s Esper experiment attempts to recreate a live version of the idea using modern technology — a VR headset and wireless cameras.

The installation uses 60 ESP32-CAM wireless IP cameras with a custom mounting and power system, a router, a PC, and an HTC Vive. The PC runs a program built in Unity, and the camera locations are recorded using an AR Android app made with openFrameworks.
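Jasper hasn't published the glue code here, but the plumbing is easy to picture. Below is a minimal Python sketch, not from the project itself, that polls a few ESP32-CAM boards for JPEG stills over HTTP; it assumes each camera runs web-server firmware exposing a /capture endpoint (as the stock ESP32-CAM example does), and the IP addresses are made up for illustration.

```python
import time
import requests  # assumes the requests library is installed

# Hypothetical camera addresses on the installation's local network;
# the real firmware and endpoints may differ.
CAMERA_IPS = [f"192.168.1.{100 + i}" for i in range(60)]

def grab_still(ip, timeout=2.0):
    """Fetch one JPEG still from an ESP32-CAM whose firmware
    exposes a /capture endpoint."""
    resp = requests.get(f"http://{ip}/capture", timeout=timeout)
    resp.raise_for_status()
    return resp.content  # raw JPEG bytes

if __name__ == "__main__":
    for ip in CAMERA_IPS[:3]:  # sample a few cameras
        try:
            jpeg = grab_still(ip)
            name = f"cam_{ip.replace('.', '_')}_{int(time.time())}.jpg"
            with open(name, "wb") as f:
                f.write(jpeg)
            print(f"{ip}: saved {len(jpeg)} bytes to {name}")
        except requests.RequestException as err:
            print(f"{ip}: no frame ({err})")
```

In the installation itself, frames like these end up as floating image planes in the Unity scene, positioned using the camera locations recorded by the AR app.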

Your view of another room through the VR headset is recreated using a series of floating images taken from the wireless cameras. It's a dizzying, mish-mashed photo gallery of sorts, built from static images alone, but the concept holds and sets the stage for the next level, where a complete 3D view of the room becomes possible.

For instance, fewer cameras could be used to capture video instead of static images. These could then be interpolated, stitched together, and used to generate a 3-dimensional recreation of the space and its objects (a flat version of that stitching step is sketched below). Better yet, marry this concept with VR live streaming and you could have live environments to explore, plus the ability to jump forward and back in time.
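To give a flavour of that stitching step, here is a hedged Python sketch using OpenCV's high-level Stitcher to merge overlapping stills into a single panorama. It's only a flat stand-in for true 3D reconstruction, which would need photogrammetry or depth estimation, and the filenames are hypothetical, matching the capture sketch above.

```python
import glob
import cv2  # OpenCV; pip install opencv-python

# Load stills captured from neighbouring cameras (hypothetical filenames).
images = [cv2.imread(path) for path in sorted(glob.glob("cam_*.jpg"))]
images = [img for img in images if img is not None]

if len(images) < 2:
    print("Need at least two overlapping views to stitch.")
else:
    # The Stitcher finds overlapping features between views and blends
    # them into one panorama.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(images)

    if status == cv2.Stitcher_OK:
        cv2.imwrite("stitched_view.jpg", panorama)
        print(f"Stitched {len(images)} views into stitched_view.jpg")
    else:
        print("Stitching failed, status code:", status)
```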

You can see highlights of the project in the video below and learn more about Esper and Jasper's other projects on his website.

Author

Josh is founder and editor at SolidSmack.com, founder at Aimsift Inc., and co-founder of EvD Media. He is involved in engineering, design, visualization, the technology making it happen, and the content developed around it. He is a SolidWorks Certified Professional and excels at falling awkwardly.