The IEEE Virtual Reality Conference and the IEEE Symposium on 3D User Interfaces are two of my favorite conferences of the year (fortunately, they happen to be co-located!). I was able to catch up with many colleagues I don’t often get to see, and I had the opportunity to meet quite a few new people. Although I can’t cover both conferences in their entirety, as that would make for quite a lengthy article, below are some combined highlights along with my personal experiences and perspectives.
The week started with a tutorial I organized on VR development using Unity. I began the first session with an overview of Unity and the Oculus Unity SDK. For the second session, I moved on to a discussion of latency, user interfaces, and some tips for reducing simulator sickness.
Here is a video of the second session:
IEEE VR Tutorial Session 2.
Sebastien Kuntz, Founder and President of I’m in VR, then provided an overview of his MiddleVR middleware for Unity. He started with the basics and then walked the audience through integrating the components step by step in more detail.
Arno Hartholt, lead of the Virtual Human Toolkit from the University of Southern California Institute for Creative Technologies, and Lakulish Antani of Impulsonic also showed off their work within Unity. I really wanted to see these sessions but was unable to since I had to attend the Workshop on Immersive Volumetric Interaction and also present the paper “Bimanual Selection and Interaction with Volumetric Regions of Interest”. I guess that is a sign of a good conference when you can’t possibly see everything you want to see!
We ended the tutorial with a question-and-answer session between the audience and the panel of speakers.
Henry Fuchs gave an inspiring keynote that was one of the best talks I have ever witnessed! Henry started by saying that he changed his entire speech after arriving at the conference and discussing the acquisition of Oculus VR by Facebook with other colleagues. He believes this is the best thing that could have happened to VR, as it validates the work many of us have been doing for years, and the industry will be encouraged to expand into VR and innovate. He ended by stating that few professionals ever have a chance to change the world over the course of their careers, but in our case the timing is right, and we have that rare opportunity.
I led a panel of HMD experts consisting of Steve Ellis, Research Scientist in the Advanced Displays Group at NASA Ames Research Center; Yuval Boger, CEO of Sensics; and David A. Smith, Chief Innovation Officer at Lockheed Martin MTLS. The four of us collectively had more than 50 years of experience with virtual reality!
Steve started off the panel with a brief history of HMDs. I think his slide showing some HMDs over the last 400 years might be my favorite slide of all time!
Steve continued by talking about what Steve Jobs, if he were still with us today, might consider to be the essential requirements of HMDs.
Yuval Boger discussed the different parts of HMDs and other VR components that will be useful in the future (e.g., eye tracking, biometric sensors, gesture recognition). I asked a tough question about whether low-cost commodity HMDs would overtake professional high-end HMDs in a similar way that Nvidia put Silicon Graphics out of business over a decade ago. Yuval’s great response was that HMDs are different from graphics cards, and there will always be a high-end market that requires ruggedized HMDs (e.g., for military and medical applications, where HMDs must be certified and able to withstand harsh conditions).
David Smith talked about how we are creating a totally new medium and language, in a similar way that film and radio had to be invented. I could not agree more with David’s claim that great content designed specifically for the experience is what matters; porting existing content to VR will never result in the ideal experience. David also stated that we need to be better at selling the story and the emotion if we want mainstream adoption. Listing a set of features is not going to get us there.
Demos, demos, and more demos!
One of my favorite parts of a conference is trying the demos. This is the real test: just because a system sounds great on paper does not mean it will work the same in practice.
My good friend Luv Kohli ran the research demos this year, and the research demos team did a great job of selecting some really innovative work. It was impressive how many research groups were able to bring working demos to the conference. One of my favorites was from my friend Evan Suma’s group out of the USC Institute for Creative Technologies, where they were scanning in bodies (i.e., attendees) using the Kinect. They scanned me into their system and sent the result to my email; I can’t wait to put my avatar into a VR environment!
University of Minnesota demos
Some universities are fortunate enough to have a VR research lab. The University of Minnesota has three research labs/groups that focus on virtual reality! They have a full CAVE as well as two HMD labs, both with large-area tracking, where we were given some cool demos.
As one of the 3DUI contest co-chairs this year, I was very happy to see some extremely innovative systems participate in the challenging task of annotating hierarchies of point-cloud datasets. To view the systems in action, head over to http://3dui.org/contestants. The two systems that shared first place were:
• “Go’Then’Tag: A 3-D Point Cloud Annotation Technique” by Manuel Veit and Antonio Capobianco from Holo3 in Schiltigheim and ICube at the University of Strasbourg.
• “Slice-n-Swipe: A Free-Hand Gesture User Interface for 3D Point Cloud Annotation” by Felipe Bacim, Mahdi Nabiyouni, and Doug A. Bowman from the Center for Human-Computer Interaction and Department of Computer Science at Virginia Tech.
One team won a voucher for a Sixense Stem system (which will be shipped this summer) and the other team won a set of Leap Motion sensors.
Zach Wendt, President of IGDA Twin Cities, organized a social gaming event Wednesday night. This was a great chance for VR professionals to meet the local gaming community in a casual environment. Rob Lindeman of the Worcester Polytechnic Institute and I were happy to demonstrate a couple of VR games (Rob showed one of his students’ projects, AaaAAaaaAaaCULUS!!!!, and I showed a version of my VR Apocalypse game featured on ABC’s Shark Tank with Virtuix) to several individuals who had not previously experienced VR.
Upcoming VR Events
East Coast Game Conference (April 23-24, Raleigh, North Carolina).
I will be speaking about VR and also showing my VR Apocalypse game on the exhibit floor, where you can try it for yourself!
Neurogaming Conference (May 7-8, San Francisco, California).
The potential of integrating neurosensors with virtual reality is largely unexplored, and I expect plenty of attendees and speakers to have ideas on how to combine the two. I’ll be leading the panel “Immersive Experiences—Virtual Reality Neurogaming” with Palmer Luckey, Founder of Oculus VR; Ana Maiques, CEO of Neuroelectrics; and Amir Rubin, CEO of Sixense Entertainment.
Silicon Valley Virtual Reality (May 19-20, Mountain View, California).
I will be leading a panel on locomotion and simulator sickness.
Augmented World Expo (May 27-29, San Jose, California).
I will be speaking about virtual humans in augmented and virtual reality.