
Tutorial: How to map a VR environment to your home

Same space, new reality! I recently created a prototype for a small escape from the reality of home isolation during the ongoing COVID-19 pandemic. After sharing a video of it on Twitter, I received a lot of questions and requests for a tutorial, so I created this guide to show how it was made and how you can recreate it.
Continue reading “Tutorial: How to map a VR environment to your home”
AR indoor navigation

Can AR be used for reliable indoor navigation?
Since GPS technologies are not suitable for indoor navigation, an obvious solution would be to use the underlying technology that makes AR (augmented reality) possible: SLAM (simultaneous localization and mapping). SLAM is used in both AR and VR for mapping and inside-out positional tracking, and it seems to be the perfect candidate for indoor navigation. As a bonus, AR overlays such as arrows and destination points would make indoor navigation in large premises less of a hassle and perhaps even enjoyable.
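To make the overlay idea concrete, here is a minimal, self-contained sketch (hypothetical names, not tied to any particular AR SDK) of the core math: given the user's position from SLAM tracking and the next waypoint on a route, compute the yaw angle an AR arrow should point at.

#include <cmath>
#include <cstdio>

constexpr float kPi = 3.14159265f;

struct Vec3 { float x, y, z; };

// Yaw (in degrees, around the vertical Y axis) from `from` towards `to`,
// ignoring height differences between floors.
float arrowYawDegrees(const Vec3& from, const Vec3& to) {
    float dx = to.x - from.x;
    float dz = to.z - from.z;
    return std::atan2(dx, dz) * 180.0f / kPi;
}

int main() {
    Vec3 user     {1.0f, 0.0f, 2.0f};   // position reported by SLAM tracking
    Vec3 waypoint {4.0f, 0.0f, 6.0f};   // next point on the computed route
    std::printf("Point the arrow at %.1f degrees\n",
                arrowYawDegrees(user, waypoint));
    return 0;
}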
Continue reading “AR indoor navigation”

PaintShop: a multiplayer painting app for the Oculus Quest

VR is a lot more fun with friends. As Tilt Brush multiplayer doesn’t seem to be anywhere close to release, I decided to spend a few days adding Oculus Quest support and multiplayer functionality to a VR painting code base I created a while back.
You can download the app from here to try with a friend: https://github.com/eman-insilico/Virtual-Studio/releases
Continue reading “PaintShop: a multiplayer painting app for the Oculus Quest”

InductVR is now on Steam

The VR Training application I’ve been working on for the past year is now available to download for free on Steam.
Continue reading “InductVR is now on Steam”

Experimenting with making tutorials in VR

While there are many great VR games out there, good educational content is still very scarce. The main problem is that the cost and effort of development are too high for content that is usually accessible for free. This is an attempt at an easier way to create tutorials and educational content: an instructor or developer records their actions and voice in VR, and users can follow along either in a live one-to-one tutorial or in a prerecorded session.
This method has the added advantage that videos can be created and shared with non-VR users.
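As a minimal sketch of the recording idea (the data layout is an assumption, not the app's actual format), a session could be stored as timestamped controller poses that can be replayed later or streamed to a live follower.

#include <cstdio>
#include <vector>

struct Pose {
    float time;             // seconds since the recording started
    float px, py, pz;       // controller position
    float qx, qy, qz, qw;   // controller orientation (quaternion)
};

class SessionRecorder {
public:
    void record(const Pose& p) { frames_.push_back(p); }

    // Return the most recent pose at or before `t`, for playback.
    const Pose* sample(float t) const {
        const Pose* last = nullptr;
        for (const Pose& p : frames_) {
            if (p.time > t) break;
            last = &p;
        }
        return last;
    }

private:
    std::vector<Pose> frames_;  // in practice this would be written to disk
};

int main() {
    SessionRecorder rec;
    rec.record({0.0f, 0.0f, 1.2f, 0.0f, 0, 0, 0, 1});
    rec.record({0.5f, 0.0f, 1.3f, 0.1f, 0, 0, 0, 1});
    if (const Pose* p = rec.sample(0.6f))
        std::printf("Replay pose at t=0.6: y=%.2f\n", p->py);
    return 0;
}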
Building force-feedback devices
With the release of the Vive trackers, I decided to start prototyping some ideas for haptic and force-feedback hardware to be used in conjunction with the InductVR app.
Besides their smaller, more versatile form factor, the Vive trackers allow for some basic input/output functionality, meaning they can be used for wireless communication between any sensors, Arduino, or Pi controllers and your main VR application.
By shorting the ground pin (2) with any of the other pins (I used pin 4 here) on the Vive tracker, you can emulate the usual controller trigger/grip/menu functions.
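As a hedged illustration (the optocoupler wiring and the Arduino pin number are my assumptions, not part of the original build), an Arduino could toggle that short to simulate trigger presses:

// Hypothetical Arduino sketch: drive an optocoupler (or relay) that shorts
// the Vive tracker's pogo pin 4 to the ground pin 2, emulating a trigger
// press. The Arduino-side pin number is an assumption for illustration.
const int TRIGGER_PIN = 7;   // Arduino pin wired to the optocoupler's LED side

void setup() {
  pinMode(TRIGGER_PIN, OUTPUT);
  digitalWrite(TRIGGER_PIN, LOW);   // start with the "button" released
}

void loop() {
  digitalWrite(TRIGGER_PIN, HIGH);  // close the circuit: trigger pressed
  delay(200);
  digitalWrite(TRIGGER_PIN, LOW);   // open the circuit: trigger released
  delay(1000);
}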
After this initial test, it was time to start building something a little more interesting.
Virtual Telepresence
This was a case study I did for a remotely operated robot arm. The arm was 3D printed and actuated by two small servos controlled by an Arduino. The live link with the VR controller was done using Unity. More information on this project and how to recreate it can be found in this document. I am currently working on a bigger version of this project, which I hope to use to simulate haptic feedback and to experiment with teleoperation.
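The actual code is in the linked document; as a rough sketch of what the Arduino side of such a link typically looks like (the pin numbers and serial protocol here are assumptions), two servo angles are read from the serial port and applied to the arm:

// Hypothetical Arduino sketch: receive two comma-separated servo angles per
// line over serial (e.g. "90,45\n") from the Unity app and apply them.
#include <Servo.h>

Servo baseServo;
Servo elbowServo;

void setup() {
  Serial.begin(115200);
  baseServo.attach(9);    // assumed pins; check your wiring
  elbowServo.attach(10);
}

void loop() {
  if (Serial.available()) {
    int base  = Serial.parseInt();      // first angle
    int elbow = Serial.parseInt();      // second angle (comma is skipped)
    if (Serial.read() == '\n') {        // only apply complete lines
      baseServo.write(constrain(base, 0, 180));
      elbowServo.write(constrain(elbow, 0, 180));
    }
  }
}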
Another recent experiment I did was related to body orientation in virtual reality. I wanted to see what it would feel like to reorient my virtual body and walk upwards at a 90-degree angle.
I knew from experience that if I were to just rotate the virtual environment 90 degrees, it would only give the sensation that the world is moving, not my body. So I added a visual representation of feet (using 3D-tracked boots) and came up with the construction below to give some feeling of movement.
The mapping of the movement was done using a potentiometer and an Arduino.
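A rough sketch of that Arduino side (the analog pin and the 0–90 degree scaling are my assumptions) could look like this:

// Hedged reconstruction of the setup described above: the Arduino reads the
// potentiometer attached to the moving rig and streams an angle over serial,
// where the VR app maps it to the rotation of the virtual world.
const int POT_PIN = A0;   // assumed analog pin

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(POT_PIN);          // 0..1023
  float angle = raw * (90.0 / 1023.0);    // map to the 0..90 degree shift
  Serial.println(angle);
  delay(20);                              // ~50 updates per second
}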
(special thanks to Jorge for the idea 🙂 )
I set up the virtual experience in three stages. The first was identification with the virtual feet, produced by one-to-one visual and tactile feedback: participants would see their feet and could feel the edges of the virtual cliff. The tactile feedback accentuated the vertigo effect, which in turn strengthened identification with the virtual feet.
The next stage was visual identification. In the virtual environment there was a large mirror that gradually got bigger and closer to the participant, and shifted orientation as they neared the edge of the cliff.
The last stage was the actual physical movement, reflected in the shift of the virtual world. Once participants got to the edge of the cliff and experienced the world shifting with their movement, they turned around, the world flipped 90 degrees, and they walked up the cliff at a 90-degree orientation.
DeFabriek, Eindhoven 22/04/2017 – Emanuel Tomozei
Open source ‘Tilt Brush’
I’ve spent the last few weeks making an open source version of Google’s Tilt Brush.
It was developed to be used inside the Unity editor, to speed up development by sketching or planning UI/UX right in the scene.
At the time of writing, it has all the functionality you will find in Tilt Brush.