HTC Vive & Oculus Development

1. Action Points

1.1. Interaction Library

1.1.1. ViveController

1.1.2. Grab&Release

1.1.3. Navigation

1.1.3.1. Jan and David will work on this

1.1.3.2. We have a new interaction feature using the touch sensor. 1. It should be faster, since Isector is not used; the drawback is that it needs another viewport. Testing will show which approach is fastest. 2. The new Outline Select shader that Jhonny originally wrote is almost working now.
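One common way to implement selection through an extra viewport (as opposed to Isector-style ray casting) is color-ID picking: each selectable object is drawn into an offscreen viewport with a unique flat color, and the pixel under the pointer is decoded back to an object id. A minimal sketch of the encode/decode step, assuming 24-bit object ids (this is an illustration of the general technique, not EON's actual implementation):

```python
# Sketch of color-ID picking: encode each selectable object's id as a
# unique flat RGB color for the offscreen selection viewport, then decode
# the pixel under the pointer back into the object id.

def id_to_color(obj_id):
    """Encode an integer object id into an (r, g, b) triple, 8 bits each."""
    return ((obj_id >> 16) & 0xFF, (obj_id >> 8) & 0xFF, obj_id & 0xFF)

def color_to_id(rgb):
    """Decode an (r, g, b) triple back into the original object id."""
    r, g, b = rgb
    return (r << 16) | (g << 8) | b

# Round-trip check for an arbitrary id:
assert color_to_id(id_to_color(123456)) == 123456
```

Reading one pixel from a small offscreen buffer is typically cheaper than intersecting the full scene geometry, which is why this can beat ray-cast selection.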

1.1.4. Physics interaction

1.2. Demo Application

1.2.1. SG - Human Anatomy

1.2.2. Irvine - Motor Room & Machine Assembly

1.2.3. Jan: In Irvine we have set up a demo room in an empty office space. Brock (US modeler) made a few 3D models that match the exact dimensions of the physical space, to create a "physically immersive" experience. It works like this: when you see a wall in the virtual space, the real wall around you is in the same position.
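For the virtual walls to land exactly on the real ones, the virtual room has to be registered to the tracking space. A minimal sketch of one way to do that, assuming two floor reference points measured in both the real (tracking) space and the virtual model (the function and point names are illustrative, not part of any EON API):

```python
import math

# Illustrative 2D registration: recover the yaw + translation that maps
# virtual floor coordinates onto real tracking-space coordinates, given
# two corresponding reference points (e.g. two wall corners).

def align_2d(real_a, real_b, virt_a, virt_b):
    """Return (yaw_radians, tx, ty) mapping virtual coords onto real coords."""
    # Direction of the reference edge in each space
    ang_real = math.atan2(real_b[1] - real_a[1], real_b[0] - real_a[0])
    ang_virt = math.atan2(virt_b[1] - virt_a[1], virt_b[0] - virt_a[0])
    yaw = ang_real - ang_virt
    # Rotate the virtual anchor point, then translate it onto the real anchor
    c, s = math.cos(yaw), math.sin(yaw)
    rx = c * virt_a[0] - s * virt_a[1]
    ry = s * virt_a[0] + c * virt_a[1]
    return yaw, real_a[0] - rx, real_a[1] - ry
```

Applying the resulting transform to the whole model keeps every virtual wall aligned with its physical counterpart.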

1.2.4. SG - IOT Retail application (port from Accenture Retail app)

1.2.5. Accenture Underground

1.2.6. SG - Engine Explorer

1.3. Jan: Develop a standardized UI to be used on both Oculus and Vive (and other similar systems).

1.4. Jan: Improve documentation with guidelines and help to develop content.

1.5. Test with Dynamic load to see if we could develop a demo library similar to Icube 30

2. Ideas

2.1. Develop a library of interactions and logic to make app development faster

2.2. Develop a good technical demo for EON HMD

2.3. EON AVR Player for HMD

3. Real-time AR camera for tracking of HTC Vive

3.1. The HW setup

3.1.1. Green-screen stage

3.1.2. Position tracking camera

3.1.3. The camera connected to a workstation

3.1.3.1. One Vive/Oculus controller will be on top of the camera

3.1.3.2. Can we use a phone camera instead of a DSLR or a video camera? That would make it much more user-friendly

3.1.4. Projectors and a big screen to present to a large audience

3.1.5. Headset and controller, connected to workstation

3.1.6. Graphics workstation

3.2. The scenario

3.2.1. 1. The user wears the headset and plays the game on the green-screen stage

3.2.2. 2. A camera with a Vive controller attached on top films the user's interactions

3.2.3. 3. The audience watches the virtual game, with the player shown inside the game interacting with the virtual game objects
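The key math behind the scenario above is that the physical camera's pose in tracking space is the tracked controller's pose composed with a fixed, pre-measured controller-to-camera mount offset; the virtual camera that renders the composited view follows that pose. A sketch of the composition, using plain 4x4 matrices (assumed math for illustration, not an EON or SteamVR API):

```python
# Sketch of the mixed-reality camera pose: compose the tracked controller
# pose with the fixed controller-to-camera mount offset measured once
# during calibration.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def camera_pose(controller_pose, controller_to_camera):
    """World pose of the physical camera = controller pose * mount offset."""
    return mat_mul(controller_pose, controller_to_camera)

# Example: controller at (1, 2, 0); camera lens mounted 0.1 m below it.
controller = [[1, 0, 0, 1], [0, 1, 0, 2], [0, 0, 1, 0], [0, 0, 0, 1]]
offset     = [[1, 0, 0, 0], [0, 1, 0, -0.1], [0, 0, 1, 0], [0, 0, 0, 1]]
pose = camera_pose(controller, offset)
```

Rendering the scene from this pose and keying the green-screen footage over it is what puts the player "inside" the game for the audience.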

3.3. The challenges

4. Creating alternative controllers. For example, a phone case that would allow a phone to become part of the VR experience: https://drive.google.com/drive/u/0/folders/0B4-3TjSkDRk7bU1ZcThBTFQ3Skk

5. Goals

5.1. To enable other developers to develop apps faster

5.2. To be able to demonstrate our HMD capabilities and win more projects

5.3. To have fun with the new trend & devices

6. Problems

6.1. Lack of development knowledge of the new devices

6.2. Lack of a channel for sharing our ideas and work

6.3. Lack of good content for AR/VR

7. EON AVR Player for HMD

7.1. Application Flow

7.1.1. 1. The user downloads the JSON file of the AVR lesson and puts it into C:\lesson

7.1.2. 2. The user runs PlayTemplate_HMD.eoz. The simulation performs the following sequence:

7.1.2.1. 2.1 Upon initialization, load lesson.json from the C:\lesson folder

7.1.2.2. 2.2 While parsing lesson.json, replace .EMP references with .EOP

7.1.2.3. 2.3 Once the model is fully loaded, the user presses START

7.1.2.4. 2.4 The user lands in VR mode automatically and needs to put the headset on

7.1.2.4.1. 2.4.1 First, the user explores the lesson: dissect, toggle annotations, etc. Both controllers can be used to interact

7.1.2.4.2. 2.4.2 When clicking Play, the user does the Quiz, Identify, Locate, and Build activities

7.1.2.4.3. 2.4.3 Upon completion, the user is presented with their score
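Steps 2.1 and 2.2 of the flow above can be sketched as a small loader that reads the lesson file and swaps the model-file extension before parsing. This is a hedged illustration: the field names inside the sample lesson are invented, and the real lesson.json schema may differ.

```python
import json
import re

# Sketch of flow steps 2.1-2.2: load lesson.json from the lesson folder
# and replace .EMP model references with .EOP before parsing.

def load_lesson(path):
    """Read a lesson.json file, rewriting .EMP extensions to .EOP."""
    with open(path, encoding="utf-8") as f:
        text = f.read()
    # Case-insensitive swap of the model-file extension
    text = re.sub(r"\.EMP\b", ".EOP", text, flags=re.IGNORECASE)
    return json.loads(text)
```

Doing the substitution on the raw text before `json.loads` keeps the rewrite in one place, regardless of where in the lesson structure the model references appear.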

7.2. Consideration

7.2.1. Code refactoring: we might just need to disable the AR & Touch modes and change the state machine to switch to VR mode automatically
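The refactoring idea above can be sketched as a minimal mode state machine in which AR and Touch are disabled, so startup falls straight through to VR. The state and class names here are assumptions for illustration, not the app's actual identifiers.

```python
# Illustrative sketch of the proposed refactor: with AR and Touch
# disabled, the startup transition lands in VR mode automatically.

class ModeStateMachine:
    MODES = ("AR", "TOUCH", "VR")

    def __init__(self, disabled=("AR", "TOUCH")):
        self.disabled = set(disabled)
        self.mode = None

    def start(self):
        """Enter the first enabled mode; raises if everything is disabled."""
        for mode in self.MODES:
            if mode not in self.disabled:
                self.mode = mode
                return self.mode
        raise RuntimeError("no enabled mode")

sm = ModeStateMachine()
assert sm.start() == "VR"
```

Keeping AR and Touch as disabled states (rather than deleting them) means the same state machine still serves the original AVR app unchanged.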

7.2.2. We need to adjust the VR experience to use the SteamVRManager instead

7.2.3. We need an option for the user to switch between several standard environments

7.2.4. Interaction

7.2.4.1. Grab & Release

7.2.4.2. Navigation

7.2.4.3. Physics

7.3. Status

7.3.1. POC, VR mode of the current AVR app

7.3.2. Use the right controller to interact (button sliding, grab & release)

7.3.3. Constraint: rethink the UI

8. Virtual Prototype of UI

8.1. Oculus Store type

8.2. Target user: designers designing UX for virtual applications