Slides from a workshop on Augmented and Virtual Reality in Unity3D.
Part of the Embodied Vision course of the Media Technology MSc. program (http://mediatechnology.leiden.edu/)
More information on www.robindelange.com
2. Hello world
EMBODIED VISION: AUGMENTED AND VIRTUAL REALITY IN UNITY3D
Robin de Lange
Part-time PhD student @ Media Technology
Part-time entrepreneur
Studied: Physics, Philosophy, Artificial Intelligence, Media Technology
3. What are we going to do?
Introduction
- Idea for this workshop
Getting started with Unity3D
- What is Unity3D?
- Unity3D interface and project overview
- Creating a simple scene
- Scripting: Hello world application
Virtual Reality: Oculus Rift
- How to install the Unity plugin
- Demonstration of a demo Oculus Rift scene
- Creating a simple Oculus Rift scene
Augmented Reality: Vuforia
- Installing the Vuforia library
- Running a Vuforia demo application
Individual/teamwork
- Making everything work, experimenting for the assignment
- Tutorials and questions!
5. Introduction: Graduation project
Main goal: raise questions and discussion
- How would this change our mathematical understanding?
- How can AR headsets change the way we solve problems?
- Will we be able to solve more complex problems by off-loading part of the cognitive load?
- How should education change in response to the rise of these new tools?
Similar approach in PhD research
6. Introduction: Other projects
Started using Unity3D for other projects
Scan children's books
View a 3D cloud with opinions
Add your own reviews
Together with Berber de Vries
7. Introduction: Other projects
3 Oktober app
• App around Leidens Ontzet
• Android and iOS
• Only GUI: timetable, soundboard, etc.
8. Introduction: Other projects
Brainstorm:
Live EEG data visualization for Oculus Rift
Together with Eva Delincakova & Bert Spaan @ the Hack the Brain hackathon
9. Getting started: What is Unity?
Let’s ask David Helgason, one of the founders of Unity.
http://www.youtube.com/watch?v=4mtzAXSiR1w
10. Getting started: Why use Unity?
• Not only for games, but also for other interactive environments
• Very easy multi-platform output, with no restrictions for mobile apps
• Integration with devices such as the Rift, Kinect and Emotiv EPOC
• Integration with 3D modelling software: Blender/3ds Max/Maya/…
11. Getting started: Why use Unity?
• Active community of millions
• Thousands of models, scripts and plugins via the Asset Store
• Many tutorials and guides available
• Nice combination of a 3D environment and scripting
• Environment meant for collaboration between programmers, designers and everyone in between
12. Getting started: Non-game projects
BitGym
http://www.youtube.com/watch?v=0r9V57cTovI
Bike configurator
http://www.bikeconfig.com/
Physics demo
http://www.coffeeshopphysics.com/magnetodynamics/index.php?demo=VanDeGraaffExample
13. Getting started: Interface Overview
START UNITY
14. Getting started: Oculus Rift example
Download from: https://developer.oculusvr.com/
Signing up is required; use the Dropbox link otherwise
If a Rift is connected: run OculusConfigUtil
A Rift is not required for development
Import the OculusUnityIntegration package into your project
15. Getting started: AR with Vuforia
Introduction videos:
https://developer.vuforia.com/resources/sample-apps/features
Instructions on downloading and installing the Vuforia library:
https://developer.vuforia.com//resources/sdk/unity?d=aw2
16. Individual/teamwork: Assignment
During the lectures you have learned about many different special and visual effects used in film, and the different goals (such as distraction, shock, spectacle, narrative, integration and immersion) that can be reached by applying these effects.
For this assignment you are challenged to make use of the visual effects offered by Augmented and Virtual Reality to support one (or more) of the goals you find most interesting.
Since learning Unity3D is an essential part of this workshop, you should use this software for your project. Exceptions can be made, however, if you can give good reasons for them.
Groups: 1-3 persons
17. Individual/teamwork: Assignment
Ideas:
• Use AR/VR visuals for goals like shock, spectacle and discontinuity, similar to the early days of film
• Explore the effect of editing in AR/VR
18. Individual/teamwork: Learn more
Now:
• Get Vuforia and/or the Oculus Rift working
• Experiment with examples
• Follow a few tutorials
• Think about assignment
• Ask questions!
Later:
• Feel free to contact me
• www.robindelange.com for details.
19. Individual/teamwork: Learn more
General Unity tutorials:
http://unity3d.com/learn
http://www.unity3dstudent.com/
Vuforia:
https://developer.vuforia.com/resources/tutorials
Graphical User Interface: NGUI
http://www.tasharen.com/?page_id=140
Data visualization:
https://vimeo.com/59696565
http://catlikecoding.com/unity/tutorials/graphs/
Editor's notes
Good morning everyone. In this workshop I'll give an introduction to Unity3D and show how to get started with building Augmented and Virtual Reality applications.
Let me first briefly introduce myself.
I'm Robin de Lange and I'm a part-time entrepreneur and part-time PhD student, here at Media Technology. For my Bachelor I studied Physics and Philosophy and did extra courses on Artificial Intelligence. After that I followed the Media Technology program, finishing last December.
So, what are we going to do today?
I found Unity to be a really useful environment and started to use it for other projects as well.
After doing these projects I realized that Unity is a tool that could be very useful to many Media Tech students. That's why I decided to do this workshop; hopefully you're going to use it for some interesting projects.
Let’s first look how one of the founders of Unity answers that question
3D physics engine and environment to develop games
Almost industry standard
Why would you use Unity? You’ve already learned Processing, Max MSP, Open Frameworks and who knows what else. I’m going to be a fanboy
- Everyone: Start Unity
Creating a new project
Finding folder
Possibility to import different standard packages
Discussion of different panels
Project panel: here you see the file structure of the Assets folder of the project. Save the scene to show this scene is also an asset.
Scene panel: here you see what your scene looks like, and you can navigate it
Hierarchy: Here you see all the Game Objects, which are the elements of a scene. Everything that you want in the scene needs to be in the hierarchy
Game Objects have different components, you see these in the Inspector panel
Every new scene starts with a new camera. The camera decides what is shown in the Game panel
Creating a scene
Create an empty Game object
Transform component. Every object has a position in 3D space, also Game Objects which only have a script and no graphical representation.
Let’s create a very simple scene.
Create a cube and drag it in front of the camera
Let’s look at what this cube is made of: mesh filter, mesh renderer and box collider
Run: the cube does not move and is very dark
Make the cube a Rigidbody, now gravity affects it
Run: the cube keeps falling
Create terrain, explain briefly
Run
Add light to the scene. This is essential, sometimes you don’t understand why something isn’t working, turns out to be that there are no lights in the scene.
Explain about different lights: point lights, directional and ambient light
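The scene-building steps above are done in the Unity editor during the workshop, but the same setup can also be sketched in code, which makes the role of each component explicit. The following is a minimal C# sketch (class and object names are my own, not from the slides); attach it to an empty GameObject in a fresh scene:

```csharp
using UnityEngine;

// Sketch: building the demo scene from a script instead of the editor.
public class SceneSetup : MonoBehaviour
{
    void Start()
    {
        // A cube above the ground, in view of the camera
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 5f, 5f);

        // Adding a Rigidbody makes gravity affect the cube
        cube.AddComponent<Rigidbody>();

        // A plane for the cube to land on (the workshop uses a Terrain)
        GameObject ground = GameObject.CreatePrimitive(PrimitiveType.Plane);
        ground.transform.position = Vector3.zero;

        // Without a light the scene renders almost black
        GameObject lightObj = new GameObject("Directional Light");
        Light light = lightObj.AddComponent<Light>();
        light.type = LightType.Directional;
        lightObj.transform.rotation = Quaternion.Euler(50f, -30f, 0f);
    }
}
```

The point is the same as in the editor walkthrough: a GameObject only falls once it has a Rigidbody, and nothing is visible until the scene has a light.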
Scripting
Now suppose we want to make a custom behavior. We can create a new script in which we program this behavior
In Unity3D you can program in C#, Javascript or Boo. I will use C# as I know it best. I wouldn't recommend Boo as it is not very common. You can use Javascript too; it's generally a bit looser as a language.
Explain basic script structure: two standard methods, Start and Update. Update runs every frame; the time between frames can differ.
Script has to be attached to a GameObject, otherwise it doesn’t do anything. Connection between code and 3D environment.
Code: Debug.Log("Hello world") if gameObject.transform.position.y < 0.5f
Run and show console
Code: private int framecount; log "On the ground for " + framecount
Run and show console
Create 3D text, now we want to change the text of this object by the script that is attached to the Cube object
Need to refer to this other Game Object. Create public GameObject textObject = null;
textMesh = textObject.GetComponent<TextMesh>();
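Putting the scripting notes above together, the "Hello world" script could look roughly like this (a sketch reconstructed from the note fragments; the class name and exact strings are assumptions). Attach it to the Cube and drag the 3D Text object onto the textObject field in the Inspector:

```csharp
using UnityEngine;

// Sketch of the workshop's "Hello world" script, attached to the Cube.
public class HelloWorld : MonoBehaviour
{
    public GameObject textObject = null;  // the 3D Text object, set in the Inspector
    private int framecount = 0;

    // Update runs every frame; the time between frames can differ
    void Update()
    {
        // Is the cube near the ground?
        if (gameObject.transform.position.y < 0.5f)
        {
            framecount++;
            Debug.Log("On the ground for " + framecount);

            // Change the text of the other GameObject via its TextMesh component
            if (textObject != null)
            {
                TextMesh textMesh = textObject.GetComponent<TextMesh>();
                textMesh.text = "On the ground for " + framecount;
            }
        }
    }
}
```

This also illustrates the note about attaching scripts: the script only runs because it sits on a GameObject, and it reaches other objects through public fields wired up in the Inspector.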
Output to stand-alone program
Output to Android phone
Oculus Rift example
What to download
Open Tuscany example >> BREAK
Run Tuscany demo
How does this work? Explain how the scene is built
There is only one GameObject that makes this suitable for the Oculus Rift: the OVR Player Controller. It consists of two cameras at eye height, separated from each other by eye distance.
Now make our own scene viewable with Oculus Rift
Import the OculusUnityIntegration package
Open old scene and drag OVR prefab into it. Explain what a prefab is.
Add a few extra objects and run the application
Okay, after this very quick introduction to getting started making VR applications with Unity, let's go into AR applications.
Vuforia is a free library from Qualcomm, one of the biggest chip manufacturers in the world. It offers excellent tracking and also text recognition.
Discuss how to install
Image Targets. Different ways to add them.
Adding each Image Target to the hierarchy and giving them different features
User defined targets
Cloud recognition
Image targets can be any picture, but they work much better if there is good local contrast and no repetition. Images with high clarity. There are all sorts of
Open Image Targets demo.
Show all the essential parts of the scene. The ARCamera which relates the camera of your mobile device with the camera in the scene.
The light, very important
Scene has no Main Camera. When starting a new scene you should delete this.
Image targets with script. Gets activated whenever
Duplicating teapot
Run program on mobile phone
Adding ImageTarget with other
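The "gets activated whenever" note refers to reacting when an Image Target is found or lost. In the Vuforia Unity extension of this era, that is done with a trackable event handler; the sketch below is modelled on Vuforia's own DefaultTrackableEventHandler (exact namespaces and class names vary between SDK versions, so treat this as an approximation). Attach it to an ImageTarget in the hierarchy:

```csharp
using UnityEngine;
using Vuforia;  // namespace in later SDK versions; early QCAR releases differ

// Sketch of a trackable event handler for an ImageTarget.
public class MyTargetHandler : MonoBehaviour, ITrackableEventHandler
{
    void Start()
    {
        // Register this script so Vuforia calls it on tracking changes
        TrackableBehaviour trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
            Debug.Log("Target found: " + gameObject.name);
        else
            Debug.Log("Target lost: " + gameObject.name);
    }
}
```

This is where AR editing effects hook in: losing tracking is an event you can respond to, rather than just a failure state.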
Later you can work further to get everything working and experiment with these examples.
For now, I would like to explain about the Assignment.
I have seen only very few experiments with the VR possibilities like in the early days of film. Most projects aim to achieve immersion, to use the visual effects for the narrative.
Dan mentioned the video artist who tried to simulate the effect of closing his eyes and rubbing them with his fingers. I would find it interesting to use a similar approach in VR, to explore the medium.
Another project that might be interesting is experimenting with the concept of editing in AR or VR. Changing or cutting of images in AR is associated with a loss of tracking, but can we also use it in other ways? The Oculus Rift is not only being used for interactive environments, but now also for 3d video worlds.