AR/VR

Thesis- Mindscape Prototype 1

This is a prototype of my IdeaScape thesis project: an immersive creative environment that combines 3D drawing, dance and music to generate new ideas and insights. It is not about producing a finished work of art, but about what happens before the artist starts painting and about the environment where creation takes place.

For this use case, the user is working on traditional Brazilian Carnaval costumes and culture. The hub environment is a simple masonry house like the ones most Brazilians live in. The walls have posters depicting Carnaval costumes, photographed by me when visiting the arts and crafts museum of Rio de Janeiro in 2017. Populating the room are 3D models that I created in another VR 3D modeling program. These are also “IdeaObjects”: vessels to which other objects, data, animations, images or entire worlds can be attached. When one of the objects is activated, a dancing version of it shows up and music plays (unfortunately, not Brazilian drumming in this demo). Not only that, but this specific model activates an entire environment outside the house. This colorful space was also created by me inside Oculus Medium.

Now immersed in music and dance, I can also paint in 3D space while I activate further “IdeaObjects” that were contained in that original model. In this case, after I generate three dancers, another, bigger dancer shows up. They all dance and generate a cacophony of sound that should immerse the artist in the mood and ambiance assigned by the original IdeaObject.

The goal of this interface is not to be “efficient” in the traditional sense of giving quick and easy access to data, but to be “efficient” in an inspiring way. It is a place for the mind of the user to wander. It is not about “getting things done quickly”, but about making things that are meaningful.

Steam VR: Assigning inputs to OpenVR controllers in Unity

Creating spheres by pressing the trigger button.



UNITY 2018.3.0f2 – OPEN VR INPUT

Assigning input to OpenVR controllers

Since I started dabbling in VR development I’ve been struggling with the apparently simple operation of assigning actions to VR touch controllers. What is a simple scripting operation for keyboard and mouse becomes an esoteric guessing game when applied to VR controllers, one that involves Steam VR accounts, binding profiles and the creation of action sets.

The philosophy behind this convoluted process is to create a product-agnostic input system, but I’m still baffled as to why I can’t just press a button and assign the script I want. To complicate matters, because of the shifting sands of VR development, any software update can cause minor changes that make scripts stop working and render trusty tutorials out of date.

Because of this I decided to make this guide on assigning OpenVR inputs for the latest Unity version (2018.3). It is not intended as a tutorial (I don’t know if I’ll even be able to make it work) but as a guide and journal of my process.

Open VR Inputs for Unity 2018.3

  • Downloaded SteamVR v2.2.0 assets from the github page.

    • Installing package: Assets – Import Packages – Custom Package – (steamvr2.2 folder)
  • Setup the scene: Player object, Input and Teleport System

    • Create a plane object.

    • Add a Steam VR Player gameobject (type “player” on the Project tab search bar to find it). Place it anywhere on the scene.

    • Disable the “Main Camera”

    • Activate the Input System: Window – SteamVR Input – a prompt will show up: choose “yes”.

      • The SteamVR Input window is where you create actions that will later be assigned to buttons on a different screen (I’ll check this later). For now press “Save and generate”. This will activate the SteamVR sample actions to get you started. Close the window.
  • Add Teleport system:

    • Duplicate the Plane, rename it (e.g. “TeleportPlane”) and move it up on the Y axis just a little bit (e.g. 0.05) – this plane will become the teleport area.

    • Add the SteamVR “Teleporting” game object to the scene – this will control the teleporting action.

    • Go to the copied plane (“TeleportPlane”) and “Add Component” – TeleportArea. Now the plane turns transparent and defines the teleport area.

    • Press play and test the scene on your headset.

    • ! (For some bizarre reason, the headset wasn’t displaying anything. Again, I spent time looking for errors until I pulled the classic “turn the PC off and on again” and it worked!).

  • Steam VR Key binding – I want to create a sphere at my hand’s position when I press the trigger button.

    • Window – SteamVR Input

    • Create a new input. I named mine “DrawTrigger” and set it to “Boolean”. This means it can only be “on” or “off”, true/false. If I wanted to use the analog trigger control to detect how much pressure I’m applying to the trigger, I would have to choose “Vector1”. “Vector2” is for the analog stick 2D position and “Vector3” is for the controller gyroscope… I think.

    • Save and Generate

    • Go back to the SteamVR Input

    • Click “Open binding UI”. This will open a browser window with your steam account and the binding profile for this specific Unity project.

      • Press “edit” on your browser: It will open the “public binding for oculus touch in (unity project name)”.

      • Here you can assign SteamVR Inputs to specific buttons. It also has several tabs with different button binding collections, like “platformer”, “buggy”… I’ll stay on the “default” tab, because this is where I created my action in Unity.

      • On “Trigger” press the “+” icon to create a new button assignment. Choose “Button”

      • The new “Button” has two modes: “Click” and “Touch”. I want “Click”

      • Click “none” next to “Click”. Choose the action you just created – “drawtrigger” in my case.

      • Save by clicking “Save Personal Binding” at the bottom of the window.

      • Go back to Unity.

  • Code Key Command – With the button assigned, it is time to code what you want it to do. UNITY VERSION ALERT! – OpenVR is always evolving, which means the coding seems to change all the time. Most tutorials I looked at didn’t work on Unity 2018.3. That’s the main reason I’m writing this tutorial for myself.

    • Create Script – Create your new script. Ideally make a folder for it inside “Assets”; I created a “_Script” folder. I named my script “Draw”. Recapitulating: it will create a sphere at my hand’s position. Open the script in your code editor.

    • !!!!1 – Add Valve VR Libraries. Before coding you have to add the libraries:

When you open the script, the library area at the top should look like this:

using System.Collections;

using System.Collections.Generic;

using UnityEngine;

Add the following

using Valve.VR;

using Valve.VR.InteractionSystem;
  • !!!!2 – Add Namespace. I banged my head on this one for a while.
    The `public class Draw : MonoBehaviour` must be inside the braces of a namespace `namespace Valve.VR.InteractionSystem.Sample`, like this:
using System.Collections;

using System.Collections.Generic;

using UnityEngine;

using UnityEngine.UI;

using Valve.VR;

using Valve.VR.InteractionSystem;

namespace Valve.VR.InteractionSystem.Sample

{

public class Draw : MonoBehaviour

{

}

}
  • Now the script is ready for your code; without this it wouldn’t recognize any of the SteamVR inputs.

  • Calling the buttons and actions. Creating the Steam VR variables

  • Inside `public class Draw : MonoBehaviour` add

namespace Valve.VR.InteractionSystem.Sample

{

public class Draw : MonoBehaviour

{

//Create variables for the SteamVR actions

public SteamVR_Action_Boolean triggerPress;

//Call Rigidbody objects to be created.

public Rigidbody paintObject;

private void Update()

{

//when the left hand trigger button is pressed down, create an object at the same
//position and rotation of the current parent object

if (triggerPress.GetStateDown(SteamVR_Input_Sources.LeftHand))

{

Rigidbody paintObjectClone = (Rigidbody)Instantiate(paintObject,
transform.position, transform.rotation);

}

}

}

}
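As a side note on the action types mentioned earlier: if I wanted the analog trigger value instead of an on/off click, my understanding is that a “Vector1” action maps to SteamVR_Action_Single in code. A minimal, untested sketch (the action name “SqueezeTrigger” is just a placeholder, and these lines would live inside the same namespace and class as the Draw script above):

//Sketch only – not part of the Draw script above. Assumes a "Vector1" action named
//"SqueezeTrigger" was created in the SteamVR Input window and bound to the trigger.
public SteamVR_Action_Single squeezeTrigger;

private void Update()
{
    //GetAxis returns a float from 0 (released) to 1 (fully pressed)
    float pressure = squeezeTrigger.GetAxis(SteamVR_Input_Sources.LeftHand);

    if (pressure > 0.1f)
    {
        Debug.Log("Trigger pressure: " + pressure);
    }
}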
  • Assigning objects to the variables on the Unity Inspector screen

The “public” variables declared at the beginning of the script must be assigned “by hand” in the Unity Inspector. This allows the same script to be used with different objects in different contexts.

    • Create a “Sphere” 3D object and make it a prefab by dragging and dropping it inside the folder you assigned for prefabs (I named mine “_Prefab”). The sphere will be the “paint” created by the “Draw” script when I press the trigger button.
  • Add the “Draw” script to the object you want to control it – in this case the LeftHand object under SteamVRObjects.

  • On the “Draw” script component inside the LeftHand object, assign the “DrawTrigger” SteamVR action to “Trigger Press” and the sphere prefab to “Paint Object” – as shown in the animation below:

Assigning the variable objects to the script on the Inspector window


Little Button AR

“Little Button” is an augmented reality app where you play with a little boy who is able to make flowers blossom wherever he goes. This is part of my “Magic Windows” class for my Interactive Telecommunications Program degree at NYU.

In its current form, a glitch turned it into a flower painting program, where the user can become my “Little Button” character and spread flowers in the real world.

This AR experiment is part of the larger “Little Button” world. This is a character I’ve been working on for a while, aimed at children’s books and apps.

For this app the final vision is to introduce kids to this little blue guy!

Magic Windows

 

Setting up ARKit in unity

 

There is no plugin at the asset store! So go here: https://bitbucket.org/Unity-Technologies/unity-arkit-plugin

The zipped file looks like a Unity Package, but it isn’t! You should copy its content to your project “Assets” folder.

Unity will update

Unlike Vuforia, ARKit doesn’t create a new Camera object, but uses Unity’s Main Camera:

·       Create an empty object named “Camera Parent”

·       Move the “Main Camera” to “Camera Parent” – making it ‘child’.

o   Main Camera – Inspector

§  Clear Flags: Depth Only

§  Culling Mask: Everything

§  Field of View: 60

§  Clipping Planes: Near: 0.1  / Far: 37.81

o   Add component: Unity AR Video

§  Material: YUVMaterial (or any clear material)

o   Add component: Unity AR Camera Near Far

·       Create empty object: ARCameraManager

o   Add component: Unity AR Camera Manager

§  Camera: Drag n drop the Main Camera

§  Plane Detection: Select if you want it to track planes on either horizontal or vertical or both. Sometimes you don’t want to track walls or the floor.

§  Get Point Cloud: toggle

§  Enable Light Estimator: Toggle

§  Enable Auto Focus: Toggle (YES!)

§  DONE!

ARKit is setup. Now setup the example scene

·       Create Empty Object “DebugViz”

o   Inside it create another empty object named: PointFinder

§  Add component: Unity Point Cloud Example

·       Increase Num Points to Show

·       Point Cloud Prefab: (grab example asset) PointCloudPrefab

§  Add another empty object to DebugViz: PlaneFinder

·       Add component: Unity AR Generate Plane

o   Plane Prefab: debugPlanePrefab (also part of the example collection)

Build for IOS

·       Player Settings

o   Bundle Identifier: add your company name:

o   Camera Usage Description: (type whatever you want to ask the user to allow the use of the phone’s camera… could be “ARBABY”… whatever)

o   Target minimum iOS Version:  11.0 (it has to be 11.0 or above)

o   Requires ARKit support: toggle

Open the project in XCode and go through all the usual Apple nuisance of adding team/your apple developer coder (automatic); connect a phone to your computer; authorize everything and if everything works you should have an icon with the Unity App in your phone.

 

Unity AR Hit Test example

·       Create empty object: hit_parent

o   Inside it create an empty object: hit / OR… create a 3D object (e.g. Cube) and…

o   Add the “Unity AR Hit test example” script

             

§  Open the script

This script detects whether a touch on the screen is hitting one of the AR objects.

              Remove the Unity Editor part – from “#if” all the way to “#else” – and remove the “#endif”. We don’t need it to run inside the Unity editor player.

             

              Uncomment all the ARHitTestResultType.xxxxxxx lines (activating them).
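From memory, the core of the trimmed hit-test logic looks roughly like the sketch below. Treat the exact API names as assumptions and check them against UnityARHitTestExample.cs in the plugin; the “SimpleHitTest” class and “hitTransform” field are my own placeholder names.

//Rough sketch of the trimmed hit test, based on the plugin's UnityARHitTestExample.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class SimpleHitTest : MonoBehaviour
{
    public Transform hitTransform; //the object ("hit") to move to the touched point

    void Update()
    {
        if (Input.touchCount > 0 && hitTransform != null)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Began)
            {
                //convert the screen touch to a viewport point, then ask ARKit what it hits
                Vector3 viewportPoint = Camera.main.ScreenToViewportPoint(touch.position);
                ARPoint point = new ARPoint { x = viewportPoint.x, y = viewportPoint.y };

                List<ARHitTestResult> results = UnityARSessionNativeInterface
                    .GetARSessionNativeInterface()
                    .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

                if (results.Count > 0)
                {
                    //place the object on the first plane hit
                    hitTransform.position = UnityARMatrixOps.GetPosition(results[0].worldTransform);
                    hitTransform.rotation = UnityARMatrixOps.GetRotation(results[0].worldTransform);
                }
            }
        }
    }
}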

 

Build, compile and run… it sort of works. A cube showed up and stayed in place… once. But no interaction.

 

 

 

AR Final – Little Button

 

Little Button is a tiny little boy who lives inside a flower that only blooms when bathed in moonlight. He is one of the characters of a children’s book I want to write.

In the AR experience the user controls the moonlight. At first you have to shine light on the flower until it blossoms, revealing a sleeping Little Button inside. He then wakes up and jumps from the flower, eager to play with the user, who controls a spotlight representing moonlight. Little Button will follow the light, and flowers blossom in his wake wherever he goes. By moving him around, the player can create drawings and patterns with the flowers, mixed with their real environment through the use of AR.

Goals

·       Make a character that moves wherever the player points.

·       Make flowers bloom in his wake.

·       Model the character, the flower and the flower path.

·       Animate Character

·       Add sound and effects

·       Add Music.

·       Deploy/Build to ARKit – iPhone.

 

The following code makes the character rotate and then move toward a mouse-click position. Not exactly what I need, but still a good start.

using System.Collections;

using System.Collections.Generic;

using UnityEngine;

 

public class CharController : MonoBehaviour

{

    Quaternion charRot;

    Vector3 targetPosition;

    Vector3 lookAtTarget;

    float rotSpeed = 5;

    float speed =5;

 

    //public so the FlowerPath script further below can check it
    public bool moving = false;

 

    // Use this for initialization

    void Start()

    {

 

    }

 

    // Update is called once per frame

    void Update()

    {

        if (Input.GetMouseButton(0))

        {

            SetTargetPosition();

        }

        //if Move() is left running unconditionally in Update, the object will jitter,
        //so it is only called while "moving" is true

        if (moving)
        {
            Move();
        }

        Debug.Log(transform.position.x);

    }

 

    void SetTargetPosition()

    {

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

        RaycastHit hit;

        //If you are not hitting anything change the "1000" value.

        if (Physics.Raycast(ray, out hit, 1000))

        {

            targetPosition = hit.point;

            //LookAt snaps toward the position direction. Not what I need

            // this.transform.LookAt(targetPosition);

 

            //Turning around without making it look in the air. Transform only the Y position. Turn around the Y axis, left and right. By finding the X and Y differences we get a position to rotate the character. It projects the target X to the same level as the character

            lookAtTarget = new Vector3(targetPosition.x - transform.position.x, transform.position.y, targetPosition.z - transform.position.z);

            //use a Quaternion function with the values from LookAtTarget. Rotation that has to occur in order to look at the target.

            charRot = Quaternion.LookRotation(lookAtTarget);

 

           

 

            //toggle moving

            moving = true;

 

        }

    }

    //Moving the character...smoothly using "Slerp" - Slerp smooths values

    void Move()

    {

        //smoothly rotates character toward the target position. "Spherically interpolates between a and b by t. The parameter t is clamped to the range [0,1]

        transform.rotation = Quaternion.Slerp(transform.rotation, charRot, rotSpeed * Time.deltaTime);

 

        //move toward target

        transform.position = Vector3.MoveTowards(transform.position, targetPosition, speed * Time.deltaTime);

 

        //toggle bool moving to false. Too accurate, might cause problems. Must define a range for what being "on target" means.

        if (transform.position == targetPosition)

        {

            moving = false;

        }

    }

 

}

 

 

 

It does have a bug, where the character floats toward the camera if clicked upon. I thought that the code should cancel that, but it still floats. Going back to basics on raycasting to figure out why.

 

Now I have to update the code to behave more closely to what I need. Instead of moving to where I click, I want it to constantly follow the mouse.

It seems that the clicking movement might be better when using the phone. Just have to update to touch.

Adding Touch: Touching Phases and Touch Controls

YouTube tutorial on Screen and World Coordinates for Raycast here. I had to take a look at tutorials on using touch interfaces. The code for the mouse movement doesn’t use “ScreenToWorldPoint”, but “ScreenPointToRay” – why?
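As far as I can tell, ScreenPointToRay builds the ray from the camera for you, while ScreenToWorldPoint needs an explicit z depth, so you end up constructing the ray by hand – which is what the exercise below does. A quick comparison sketch (my own notes, not from the tutorial):

//ScreenPointToRay does the near/far construction in one call:
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

//...which is roughly equivalent to building the ray by hand with ScreenToWorldPoint:
Vector3 nearPoint = Camera.main.ScreenToWorldPoint(new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.nearClipPlane));
Vector3 farPoint = Camera.main.ScreenToWorldPoint(new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.farClipPlane));
Ray manualRay = new Ray(nearPoint, farPoint - nearPoint);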

The following code is an exercise in how coordinates work on mobile. It covers raycasting and the “ScreenToWorldPoint” function. It also makes things disappear when clicked on. I tried to make it play a sound, since I’d like to have sound, but it didn’t work.

using System.Collections;

using System.Collections.Generic;

using UnityEngine;

 

//require a component for playing sound

[RequireComponent(typeof(AudioSource))]

public class touchManager : MonoBehaviour {

 

   public  AudioSource audioData;

    public AudioClip fxClip;

 

    // Use this for initialization

    void Start () {

             

       }

      

       // Update is called once per frame

       void Update () {

        if (Input.GetMouseButton(0))

        {

            //Orthogonal End position - Vector3 position of the far end of the camera projection. For the Z position I will use the "Far"plane.

            //

            Vector3 mousePosFar = new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.farClipPlane);

 

            //For the Near position Z = Camera NearClipPlane

            Vector3 mousePosNear = new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.nearClipPlane);

 

            //Note: mousePosNear/mousePosFar are still screen coordinates here; the world-space ray is drawn below after conversion.
            //Debug.DrawRay(mousePosNear, mousePosFar - mousePosNear, Color.green);

 

            //Convert the above "mousePosFar" and "mousePosNear" to ScreenToWorldPoint

 

            Vector3 mousePosF = Camera.main.ScreenToWorldPoint(mousePosFar);

            Vector3 mousePosN = Camera.main.ScreenToWorldPoint(mousePosNear);

 

            //Debug: Ray(start position, direction, color)

            Debug.DrawRay(mousePosN, mousePosF - mousePosN, Color.green);

 

            //Raycast hit object. Part of Unity Physics system

            RaycastHit hit;

 

            if(Physics.Raycast(mousePosN,mousePosF-mousePosN, out hit))

            {

                audioData = GetComponent<AudioSource>();

                //Play(0) only plays the clip assigned on the AudioSource itself; PlayOneShot uses the fxClip declared above
                audioData.PlayOneShot(fxClip);

               // Destroy(hit.transform.gameObject);

          }

 

        }

       }

}
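Since the code above still uses the mouse, here is a minimal sketch of how the same raycast could be driven by touch phases on the phone. This is my own sketch (class and variable names are mine), not tested on a device:

using UnityEngine;

//Hypothetical touch variant of the touchManager exercise above.
public class TouchRaycaster : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);

        //TouchPhase.Began fires once per tap, like GetMouseButtonDown;
        //Moved/Stationary keep firing while the finger stays on the screen.
        if (touch.phase == TouchPhase.Began || touch.phase == TouchPhase.Moved)
        {
            //same near/far construction as above, fed from the touch position
            Vector3 nearPos = new Vector3(touch.position.x, touch.position.y, Camera.main.nearClipPlane);
            Vector3 farPos = new Vector3(touch.position.x, touch.position.y, Camera.main.farClipPlane);

            Vector3 worldNear = Camera.main.ScreenToWorldPoint(nearPos);
            Vector3 worldFar = Camera.main.ScreenToWorldPoint(farPos);

            RaycastHit hit;
            if (Physics.Raycast(worldNear, worldFar - worldNear, out hit))
            {
                Debug.Log("Touched " + hit.transform.name);
            }
        }
    }
}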

 

 

Been watching these tutorials for an hour. Stopped to eat. Losing track of the big picture. What do I need? Get the character to go where the user indicates using touch.

Parenthesis – Debugging an iPhone while on Windows

 

Testing iOS builds while working on a Windows computer is a pain. It usually involves building the scene, copying the (big) file to a Mac and building it to an iOS device using Xcode.

I’ve been trying to use ARKitRemote to let Unity run games on iOS devices connected with a USB cable, but I’ve had errors all the time. I’ve built the file in Unity with the player debug settings, but still can’t connect. Maybe I don’t have the iOS device drivers for Windows. I’ll install iTunes for Windows again as commented here. Still no success though.

 

Back to coding.

Updating priorities

 

I have exactly 24h before my deadline for this project. At this point I only have a character that rotates and moves toward a clicked location in 3D space. Also, I have the iOS deployment pipeline solved. What next?

I have to make some strategic decisions. Either focus on getting the mechanics right, or add some visual bells and whistles that would help show where I’d like to get with this project. Both options must have the flower path system implemented, since this was a direct request from my instructor Rui Pereira.

Since I have a moving character – although a still buggy one – I will work on the flower path.

I just hope this code will translate well to AR.

Flower Path

 

After the introduction, where the character wakes up and jumps from the flower, he will start following the moonlight/AR camera, leaving a path of flowers wherever he goes.

I have a couple of options for the flower path:

·       I could code a 3D line following a drawing app tutorial (here) and instead of using the finger, the line would follow the character. The problem is making flowers instead of a line.

·       Or I could instantiate random blooming flower prefabs. I have to be careful not to tax the mobile system too much. Also, I don’t really know how to do this: I know how to instantiate objects, but not how to time them very well. I will go with this one though.

 


Goal: instantiate placeholder objects where the character is moving.

using System.Collections;

using System.Collections.Generic;

using UnityEngine;

 

public class FlowerPath : MonoBehaviour {

 

    public Transform flowers;

    public Transform origin;

 

    public CharController CharScript;

   

   

 

       // Use this for initialization

       void Start () {

      

       }

      

       // Update is called once per frame

       void Update () {

 

        //create condition to start drawing path/running "createPath" based on the "bool moving" in the "CharController" script located in the "Character Dummy" GameObject (for now)

        if (CharScript.moving)

        {

            createPath();

        }

    }

    //instantiate "flowers" on the position of "origin"

    void createPath()

    {

        Vector3 pos = new Vector3(0.2f, 0f, 0f);

        Instantiate(flowers, origin.position + pos, flowers.rotation);

    }

}

 

 

 

Now the character instantiates one “flower” every frame after checking if the “moving” boolean from the “CharController” is “true”.

Now I have to find a way to give an interval between the creation of the “flowers”.

I tried turning the “createPath” into an IEnumerator:
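(The screenshot of that attempt is missing, but reconstructing it from the version later in this post, it looked roughly like this:)

//Rough reconstruction of the IEnumerator attempt – this only delays when the
//function starts, not how fast it instantiates "flowers".
IEnumerator createPath()
{
    yield return new WaitForSeconds(2);
    Instantiate(flowers, origin.position, flowers.rotation);
}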

 

 

I only managed to delay the activation of the function, and not how fast it instantiates “flowers”.

Fixed the Jittering!

Character was jittering and making it difficult to toggle the “moving” Boolean. At first I thought that it was because the condition for toggling was too strict:

        if (transform.position == targetPosition)

 

 

So I added a distance condition:

float dist = Vector3.Distance(targetPosition, transform.position);

        Debug.Log(dist);

 

        if (dist <= 1)

           

        {

            moving = false;

        }

 

But it kept jittering. Looking at the distance variable debug in the console, I realized that it kept fluctuating.

I fixed it by changing from “Update” to “FixedUpdate”. Now it is stable!
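A sketch of how the relevant part of CharController could look with both fixes applied, using the same fields declared above (this is my summary, not a verbatim copy of the project code):

//Sketch of the two fixes combined, using the fields declared in CharController above.
void FixedUpdate()
{
    if (Input.GetMouseButton(0))
    {
        SetTargetPosition();
    }

    if (moving)
    {
        Move();
    }
}

void Move()
{
    transform.rotation = Quaternion.Slerp(transform.rotation, charRot, rotSpeed * Time.deltaTime);
    transform.position = Vector3.MoveTowards(transform.position, targetPosition, speed * Time.deltaTime);

    //stop once we are within a small radius of the target instead of requiring an exact match
    float dist = Vector3.Distance(targetPosition, transform.position);
    if (dist <= 1f)
    {
        moving = false;
    }
}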

 

Create random position for the flowers

 

My beautiful flowers/cylinders are being instantiated, but I’m still not happy with the interval, since it sometimes instantiates two flowers in one frame. But it is fine for now.

Another problem is that the “flowers” are always appearing in the same position relative to the character. I want this position to be a little more random. I’ll have to figure out how to randomize the position while still keeping some kind of limit.

Here’s the solution:

using System.Collections;

using System.Collections.Generic;

using UnityEngine;

 

public class FlowerPath : MonoBehaviour {

 

    public Transform flowers;

    public Transform origin;

 

    public CharController CharScript;

 

    private float InstantiationTimer = 0.2f;

   

   

 

       // Use this for initialization

       void Start () {

      

       }

      

       // Update is called once per frame

       void Update () {

 

        //create condition to start drawing path/running "createPath" based on the "bool moving" in the "CharController" script located in the "Character Dummy" GameObject (for now)

        if (CharScript.moving)

        {

            StartCoroutine("createPath");

        }

    }

    //instantiate "flowers" on the position of "origin"

    IEnumerator createPath()

    {

        InstantiationTimer -= Time.deltaTime;

        if(InstantiationTimer <= 0)

        {

           

            yield return new WaitForSeconds(0);

            //Position origin offset

            Vector3 position = new Vector3(Random.Range(0.0f, 1.0f),-0.5f , Random.Range(0.0f, 1.0f));

 

            Instantiate(flowers, origin.position + position, flowers.rotation);

            InstantiationTimer = 0.2f;

        }

       

    }

}

 

 

Now the “flowers” are being instantiated a little apart from one another. I’d like to find a way for them to always be fixed on the ground plane. At the moment they are always relative to the character’s position, and if it jumps some “flowers” will float in order to follow its Y position. I made the Y offset a little negative to ground them.
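One possible way to pin them to the ground (just an idea, not implemented yet) would be to raycast straight down from the spawn point inside createPath and use the hit point instead of the character’s Y:

//Hypothetical tweak to createPath: drop the flower onto whatever surface is below the spawn point.
Vector3 spawn = origin.position + new Vector3(Random.Range(0.0f, 1.0f), 0f, Random.Range(0.0f, 1.0f));

RaycastHit groundHit;
if (Physics.Raycast(spawn + Vector3.up, Vector3.down, out groundHit, 10f))
{
    Instantiate(flowers, groundHit.point, flowers.rotation);
}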

Following the moonlight.

 

For now the character moves to the touch/click coordinates. I want it to follow an object – the moonlight, to be more specific. (Still not sure of this scheme though. Touching seems so straightforward, but it doesn’t tell the story I want.)

 

Options for the moonlight: I could use a real light source, a projection map or a 3D object.

Before that I have to see if the character control still works in AR.

(Developing on a Windows PC and building to an iPhone is a pain!)

Added AR and tested… it doesn’t work as I thought!

 

Deployed to my iPhone and the game is visible, but the character is not really moving on top of the AR plane, just on instantiations of the normal plane. It does go where I point it to go, but I don’t think it is working as it should. It is also too big. I’m afraid I’ll have to adapt the character control script to some ARKit sample code. The problem is that it takes forever to deploy a build from my Windows laptop to the iPhone. Well… I’ll have to check this tomorrow.

 

New day – a new computer

It seems that I really had to code starting from the ARKit sample scripts from the beginning. Coding a normal touch/mouse game and adding the ARKit camera and scripts later doesn’t actually work. The game runs, but as an overlay on top of the camera, not as objects really interacting with the AR planes generated by ARKit. Also, the proportions are all wrong. The Character Dummy is huge, and just “scaling” it in Unity doesn’t seem to be the best way to approach scale, as described in this Unity blog post on the subject.

Also, I’m installing my version of Unity (2018.2.10) on a Mac laptop to streamline development. With my Windows PC I’m unable to quickly test code.

I’ve been trying to start my project on a Mac and run ARKitRemote for more than an hour now. I haven’t even started transferring my code to the new project. 4 hours to the deadline!

 

Rebuilt the dummy character and the flower path code on the Mac version. Still don’t have a touch control to make the character move.

Adapting Character Control to ARkit

I have to add my character control, which is based on raycast detection, to this ARKit template code:

using System.Collections;

using System.Collections.Generic;

using UnityEngine;



public class FlowerPath : MonoBehaviour {



    public Transform flowers;

    public Transform origin;



    public CharController CharScript;



    private float InstantiationTimer = 2f;







       // Use this for initialization

       void Start () {



       }



       // Update is called once per frame

       void Update () {



        //create condition to start drawing path/running "createPath" based on the "bool moving" in the "CharController" script located in the "Character Dummy" GameObject (for now)

        if (CharScript.moving)

        {

            StartCoroutine("createPath");

        }

    }

    //instantiate "flowers" on the position of "origin"

    IEnumerator createPath()

    {

        InstantiationTimer -= Time.deltaTime;

        if(InstantiationTimer <= 0)

        {

            Vector3 pos = new Vector3(0.2f, 0f, 0f);

            yield return new WaitForSeconds(2);

            Instantiate(flowers, origin.position + pos, flowers.rotation);

        }



    }

}

Disaster Syndrome - Rope physics

Gabriel Brasil and Michael Fuller's amazing VR adventures! Today we tried the OBI Rope physics plugin in Unity together with VR controllers and it was a success! VR high fives for everybody!

NYU / ITP

November 10, 2018

Class: Desert of the Real - Deep dive in Social VR

Instructor: Igal Nassima

DISASTER SYNDROME – 2018 11 08

Created a new Unity Project named “Disaster Syndrome”.

Set up SteamVR and its basic interactions for our first scene.

GOAL 1: create two networked characters and create the rope dynamics interactions.

OBI Rope Asset

We bought the OBI Rope asset at the Unity Asset Store, as recommended by our instructor Igal.

OBI Rope comes with several solutions for our Rope dynamics.

Goal 1.1: Use the OBI Rope plugin in conjunction with SteamVR.

“Obi is a collection of particle-based physics plugins for Unity. That means that everything is made out of small spheres called particles. Particles can affect each other, affect and be affected by other objects all through the use of constraints.”

“Obi only takes care of particle allocation and constraint setup. The physical simulation itself is performed by Oni (japanese for "demon") which is a completely engine-agnostic particle physics library written in C++. You can think of Obi as the middle-layer between Unity and Oni. Let's take a look at Obi's component architecture, and why it's made that way…”

Looked at the OBI webpage for manuals and tutorials on setting up the plugin.

 

Goal 1.1 achieved!: It works!

We started with the Obi Rope sample scenes, looking at each of different mechanics available. The Rope Chain scene has a chain attached to one Cube object that can be controlled by the user in the Scene window, and at the other end of the chain there is a big ball. When the Cube is moved, the chain and the ball move accordingly, knocking down physics objects in the scene.

After playing with the scene, we copied it and added the Steam VR plugins to it: Teleporting; Player; Teleport Area; Interactables and Throwable.

The goal was to attach the chain link to a throwable object.

First we just made the original chain Cube a child of the hand, but it couldn’t be thrown and the pivot point was far away from the hand/controller position.

Our second test involved adding SteamVR’s “Throwable” and “Interactable” to the Cube attached to the chain. This made the cube move erratically around the scene, apparently “running” away from my hand. It was funny, but not what we were looking for.

Third time was the charm, when we created a new throwable object, “Capsule”, added the “Obi Particle Renderer” to it and assigned the “Obi Chain” object/actor to it.

The Sphere was positioned on a platform since I couldn’t physically reach it on the floor (wonders of VR). In a moment that reminded us of Raiders of the Lost Ark, I carefully approached the “Capsule” and it remained attached to the chain and the ball. When thrown, it worked: the Capsule dragged the chain and the sphere with it.

Desert of the Real - Create a VR paintbrush

This is a 3-week-old assignment that I didn’t manage to do until now. The goal is to create a VR painting program.

Every time I start to code I just freeze looking at the empty script. I just don’t know how to organize my thoughts and don’t have a clear idea of what is to be achieved. For this reason, I decided to write out everything I want to do in plain English.

Before coding I set up the scene using the basic SteamVR package. I set up a teleport area for locomotion; a plane to make this area visible; a cube platform to set my objects on and for orientation; three cube prefabs, each with a different material/color (red, green and blue); and finally a SteamVR player controller. I also put in some throwable objects just to test the controller responsiveness.

The class template has an empty object called “InteractiveObject” where the BrushHandler and ColorSelector scripts are kept. The ControllerHandler script is found in the Player’s “RightHand”.

 

 

 

Now for the coding part:

Goals:

·       Have a Brush object

·       The brush is controlled with the VR controller.

·       Have colored objects on the scene that will serve as colorPalette.

·       When Brush touches one of these objects – it changes to the objects color.

·       When pressing the trigger button on the controller, the Brush creates many copies of itself named paintObjects, allowing the user to paint the scene with it.

·       The paintObject must have the color of the brush.

·       If the brush touches a colorPalette with a different color, it should change color, but the paintObjects already in the scene should keep their original color – once paint is set, it should remain the same color (no changing the color of what was already drawn).

·       There should be a limit on how many paintObjects are in the scene at any given time: a limit on how much paint can be in the scene. If this limit is reached, the ‘oldest’ paintObject must be deleted (one way to do this is sketched below).

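One way to enforce that limit (my own sketch, not part of the class template; the class and method names are hypothetical) is a simple first-in-first-out queue that the painting script registers every new paintObject with:

using System.Collections.Generic;
using UnityEngine;

//Hypothetical helper: keeps at most maxPaint paintObjects alive and destroys the oldest one.
public class PaintLimiter : MonoBehaviour
{
    public int maxPaint = 200;

    private Queue<GameObject> paintObjects = new Queue<GameObject>();

    public void Register(GameObject paintObject)
    {
        paintObjects.Enqueue(paintObject);

        //once over the limit, the 'oldest' paintObject is removed from the scene
        if (paintObjects.Count > maxPaint)
        {
            Destroy(paintObjects.Dequeue());
        }
    }
}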

 

There are three scripts already created by the class instructor, Igal Nassima. They come with comments and guidelines, but I don’t really understand his workflow. I’m not sure where Igal wants us to put the scripts, or whether the Brush object is created by a script or will already be in the scene.

I will try to do it somehow…

·       ControllerHandler

o   Makes the brush paint

o   Checks what color is in the brush

o   Get the brush color

o   Creates instances of the paintObject that has the current color of the brush.

·       ColorSelector

o   “Create a public Color Type that can be set from the inspector to be used in each menu Item” – (comment by Igal) – Don’t understand what he means by “color type” and why it must be set from the inspector.

o   “Setup Collider Trigger to send the color of this object to the Brush Handler” – Checks if there is a collision between the brush and another object and send the color of this other object back to the Brush Handler.

o   “Find Object of Type → then call the function that sets the currentColor” –

·       BrushHandler

o   Store the current color on the brush

o   Sets what gameobject will be drawn/created/instantiated when activated by the trigger on the controllerHandler.

o      //create a public function that SETS UP the color of the currentColor (hint, function must have a Color variable)

o       //create a public function that RETURNS the Color of the currentColor

§  Don’t know the difference between “sets up the color” and “returns the Color” and what it is supposed to do.

§  Also, Don’t know how to make scripts communicate with one another.

Found an example in my colleague Ellen Nickles’ documentation of this assignment:

//Inside the ControllerHandler script it sets the color on BrushHandler:
  private void OnTriggerEnter(Collider other)

    {

        FindObjectOfType<BrushHandler>().currentColor = myColor;

    }
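My own guess at what the two BrushHandler comments are asking for (the method names below are mine, not Igal’s): “setting up” the color means other scripts write it in, and “returning” it means other scripts read it back.

using UnityEngine;

//Sketch of the BrushHandler setter/getter pair as I understand the template comments – hypothetical names.
public class BrushHandler : MonoBehaviour
{
    private Color currentColor = Color.white;

    //"SETS UP the color": called by ColorSelector when the brush touches a palette object
    public void SetCurrentColor(Color newColor)
    {
        currentColor = newColor;
    }

    //"RETURNS the Color": called by ControllerHandler when it instantiates a paintObject
    public Color GetCurrentColor()
    {
        return currentColor;
    }
}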

 

 

 

Creating the Brush

The Brush doesn’t paint, but stores its own color and changes color when it touches one of the colorPalette objects.

·       The brush is a Sphere prefab named ‘Brush’.

·       It will have a color.

·       It will get the color of the object it touches

·       It will return the color it has

·       It has a script called BrushHandler. It will set the following behaviors:

o   Stores the currentColor

o   Creates/Returns the prefab that will be drawn (the “paint”)

 

Starting to write the code itself

Don’t really know where to start. I look at the three template scripts, but don’t know which functions to write. I don’t even know if I should create the brush object or if it is instantiated by one of the scripts.

 

Instantiation Tests

In order to practice basic coding, I also created a Test script to try things out.

I created a transform function that makes a cube move and it asks for a GameObject. I added a different object in the Inspector, but the object the script was attached to – the cube – was still the one that moved. Why? (Most likely because the script moves its own `transform` rather than the transform of the referenced GameObject.)

1 Creating – Instantiating an object

I will create a brush and make it instantiate copies of itself when the user presses the Oculus controller trigger button. I’ll figure out the other actions (changing color) later.

We managed to run a basic instantiate example copied from the Unity API manual. At first it didn’t work, because we added the script to the prefab we wanted to instantiate. This froze Unity, because it kept trying to instantiate the object WITH the instantiate script attached. Once we added a different prefab to the public GameObject slot in the Inspector, the instantiate code worked fine.

Basic Unity Instantiate example 1:

//The manual didn't use "public".

 public Transform prefab;

    void Start()

    {

        for (int i = 0; i < 30; i++)

        {

            Instantiate(prefab, new Vector3(i * 2.0F,i* 1, 0), Quaternion.identity);

        }

    }

Basic Unity Instantiate example 2:

public class TestScript : MonoBehaviour

{

    // Instantiate a rigidbody then set the velocity

 

    public Rigidbody projectile;

 

    void Update()

    {

        // Ctrl was pressed, launch a projectile

        if (Input.GetButtonDown("Fire1"))

        {

            // Instantiate the projectile at the position and rotation of this transform

            Rigidbody clone;

            clone = Instantiate(projectile, transform.position, transform.rotation);

 

            // Give the cloned object an initial velocity along the current

            // object's Z axis

            clone.velocity = transform.TransformDirection(Vector3.forward * 10);

        }

    }

}

 

STARTING FROM SCRATCH (?!!)

For some reason it seems that the SteamVR plugin is no longer functional in my project. I can’t find the SteamVR Input window inside the ‘Window’ menu. I downloaded the template project again and it seems to be working fine there. I’ll have to set up my scene again.

Update: Maybe the SteamVR plugin wasn’t turned on (for some reason). I started a new project anyway.

 

 

 

Spheres everywhere! Instantiating is a success (and out of control)


Making the paint brush:

Now we have two scripts that allow us to create spheres in the scene. One creates one sphere every frame and the other creates one sphere when “Fire1”/Ctrl is pressed on the keyboard.

By changing the prefab instantiated by these scripts it was possible to create a static sphere in the scene – one that has no gravity and doesn’t fly away. By pressing “Fire1”/Ctrl it was possible to create flying spheres and ‘paint’ the scene with them.

Goal:  Create static instances of a sphere while the Oculus Controller Trigger button is pressed and stop creating when not pressing.

This is the code I created:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class DrawTest : MonoBehaviour
{
    public Transform paint;

    void Update()
    {
        if (Input.GetButtonDown("Fire1"))
        {
            createPaint();
        }
    }

    void createPaint()
    {
        for (int i = 0; i < 1; i++)
        {
            Instantiate(paint, transform.position, transform.rotation);
        }
    }
}

 

This script works like the first one: It creates one sphere every time “Fire1”/Ctrl is pressed, but it doesn’t keep creating more balls if the button is still pressed. Maybe I need to change “Input.GetButtonDown” to something else…

Changed from “Input.GetButtonDown” – that returns true when the button is pressed down and only resets if released – to “Input.GetButton” – that returns true every frame the button is pressed. And it works!
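For reference, the only change is the input check inside Update:

//GetButton stays true for every frame the button is held, so paint keeps being created
if (Input.GetButton("Fire1"))
{
    createPaint();
}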

Nice smooth sphere paint created on the scene.


3D Sound for Immersive Experiences

For my week 5 assignment, I had to create my own sounds using VCV Rack - an open source modular synth simulator.

I had a blast creating more than an hour of blips, bloops and other original sounds.

Snippets of the track were edited and mastered in Reaper to be turned into spatial sound objects for a future VR experience created with the Unreal Engine.

Using VCV sounds in Unreal - 3D audio spatialization

I selected some of my favorite and most relevant sounds created in VCV Rack and added them to Unreal Engine as 3D spatial sound objects.

The sound files were edited for “looping” and placed in areas around this environment I’ve built using basic template objects.

The goal was to create distinct transitions between areas with the help of careful placement of 3D sound.

For some of the effects I created sound cues - programmable sound objects. These allow Unreal to change sound parameters in many ways. In this case I combined two different sound files and added a random filter and several different EQ modulators. This allowed for the creation of different sounds inside the “temple” area every time the user entered it. It almost works; for now it only creates a few simple variations of the original sounds.

The next goal is to add some “head locked sounds”: ambient sound that covers the entire area, creating a more cohesive experience as well as amplifying the sense of being in an open, living world.


Magic Windows - Live VR Body Painting

Exploring the boundaries of augmented and virtual reality with live body painting projection.

Recently I’ve been exploring drawing software like Tilt Brush and Quill, which allow you to use VR as a creative medium from inside VR.

For this projection mapping/augmented reality assignment, I decided to do a live VR painting that would be projected onto my body.


The projector setup was very basic, with it standing on a tripod facing me. This time I didn’t use any live mapping software, since the challenge was how to keep me - the artist - aware of my surroundings and of myself being projected upon while immersed in the VR painting program.

For this I found an app called OVRDrop. Together with Virtual Desktop, these programs re-create your desktop inside any VR environment.


At first I used a webcam and later a DSLR camera connected to my computer to feed OVRDrop with external footage of myself. This allowed me to carefully move my head and arms in order to properly align my drawings with the projection.

Projector connected to a YouTube stream. Webcam taped on top of it - later replaced by a DSLR camera.


I also experimented with using a YouTube live stream to connect the projector with what I was seeing/drawing inside the headset. I was also able to see the streamed video inside the VR painting app. This created some very interesting, albeit disconcerting, experiences as I tried to synchronize my movements with the delay of the streamed video. One possibility is to further extrapolate this multi-body/multi-temporal perspective, allowing several streams and projectors to be connected to my VR point of view.

Having someone to help setup and position myself was essential. Thanks Dan!


As for whether this is VR or AR, the whole goal was to explore the gradient between the two. Artist and viewers are both seeing the same animation, but they are in different positions on the AR/VR gradient: I’m immersed in VR while I see and manipulate my “augmented” self.


Next Steps

For future installments, I’d like to allow the viewers to interact further with the experience, either on location or through streaming services.

I’d also like to use computer vision to track my body and features, allowing me to move and interact with the live VR drawings being created.