Magic Windows

Little Button AR

“Little Button” is an augmented reality app where you play with a little boy who’s able to make flowers blossom wherever he goes. This is part of my “Magic Windows” class for my Interactive Telecommunications Program degree at NYU.

In its current form, a glitch turned it into a flower painting program, where the user can become my “Little Button” character and spread flowers in the real world.

This AR experiment is part of the larger “Little Button” world. This is a character I’ve been working on for a while, aimed at children’s books and apps.

For this app, the final vision is to introduce kids to this little blue guy!

Magic Windows

 

Setting up ARKit in Unity

 

There is no plugin in the Asset Store! Get it here: https://bitbucket.org/Unity-Technologies/unity-arkit-plugin

The zipped file looks like a Unity package, but it isn’t! Copy its contents into your project’s “Assets” folder.

Unity will then import everything.

Unlike Vuforia, ARKit doesn’t create a new camera object; it uses Unity’s Main Camera:

·       Create an empty object named “Camera Parent”

·       Move the “Main Camera” into “Camera Parent”, making it a child.

o   Main Camera – Inspector

§  Clear Flags: Depth Only

§  Culling Mask: Everything

§  Field of View: 60

§  Clipping Planes: Near: 0.1  / Far: 37.81

o   Add component: Unity AR Video

§  Material: YUVMaterial (or any clear material)

o   Add component: Unity AR Camera Near Far

·       Create empty object: ARCameraManager

o   Add component: Unity AR Camera Manager

§  Camera: drag and drop the Main Camera

§  Plane Detection: select whether you want it to track horizontal planes, vertical planes, or both. Sometimes you don’t want to track walls or the floor.

§  Get Point Cloud: toggle

§  Enable Light Estimator: Toggle

§  Enable Auto Focus: Toggle (YES!)

§  DONE!

ARKit is set up. Now set up the example scene:

·       Create Empty Object “DebugViz”

o   Inside it create another empty object named: PointFinder

§  Add component: Unity Point Cloud Example

·       Increase Num Points to Show

·       Point Cloud Prefab: (grab example asset) PointCloudPrefab

§  Add another empty object to DebugViz: PlaneFinder

·       Add component: Unity AR Generate Plane

o   Plane Prefab: debugPlanePrefab (also part of the example collection)

Build for iOS

·       Player Settings

o   Bundle Identifier: add your company name

o   Camera Usage Description: type whatever message you want the user to see when asked to allow use of the phone’s camera. It could be “ARBABY”… whatever.

o   Target minimum iOS Version:  11.0 (it has to be 11.0 or above)

o   Requires ARKit support: toggle

Open the project in Xcode and go through the usual Apple nuisance: add your team/Apple developer account (automatic signing), connect a phone to your computer, and authorize everything. If it all works, you should have the Unity app’s icon on your phone.
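For reference, the Camera Usage Description from Player Settings ends up in the generated Xcode project’s Info.plist as the NSCameraUsageDescription key (the sample text below is just a placeholder):

```xml
<!-- Fragment of the Info.plist Unity generates in the Xcode project -->
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for augmented reality.</string>
```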

 

Unity AR Hit Test Example

·       Create empty object: hit_parent

o   Inside it, create an empty object named “hit”. Or create a 3D object (e.g. a Cube) and…

o   Add the “Unity AR Hit Test Example” script

             

§  Open the script

This script detects whether a touch on the screen hits one of the AR objects.

Remove the Unity Editor part of Update(), from the “#if” all the way down to the “#else”, and also remove the “#endif”. We don’t need it to run inside the Unity editor player.

             

Uncomment all the ARHitTestResultType.xxxxxxx lines (activating them).
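After those edits, the device-only part of the script should look roughly like this (a sketch from memory of the old Unity ARKit plugin’s UnityARHitTestExample; exact member names may differ in your plugin version):

```csharp
// Sketch of the trimmed Update() in UnityARHitTestExample.cs.
// Assumes the plugin's types (ARPoint, ARHitTestResultType) and the
// existing HitTestWithResultType() helper; names may vary by version.
void Update()
{
    if (Input.touchCount > 0 && m_HitTransform != null)
    {
        var touch = Input.GetTouch(0);
        if (touch.phase == TouchPhase.Began || touch.phase == TouchPhase.Moved)
        {
            var screenPosition = Camera.main.ScreenToViewportPoint(touch.position);
            ARPoint point = new ARPoint
            {
                x = screenPosition.x,
                y = screenPosition.y
            };

            // Try the result types in order of accuracy, all now uncommented.
            ARHitTestResultType[] resultTypes = {
                ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent,
                ARHitTestResultType.ARHitTestResultTypeHorizontalPlane,
                ARHitTestResultType.ARHitTestResultTypeFeaturePoint
            };
            foreach (ARHitTestResultType resultType in resultTypes)
            {
                if (HitTestWithResultType(point, resultType))
                    return;
            }
        }
    }
}
```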

 

Build, compile, and run… it sort of works: a cube showed up once and stayed in place, but there is no interaction yet.

 

 

 

AR Final – Little Button

 

Little Button is a tiny boy who lives inside a flower that only blooms when bathed in moonlight. He is one of the characters of a children’s book I want to write.

In the AR experience the user controls the moonlight. At first you have to shine light on the flower until it blossoms, revealing a sleeping Little Button inside. He then wakes up and jumps from the flower, eager to play with the user, who controls a spotlight representing moonlight. Little Button follows the light, and wherever he goes, flowers blossom in his wake. By moving him around, the player can create drawings and patterns with the flowers, mixed with their real environment through AR.

Goals

·       Make a character that moves wherever the player points.

·       Make flowers bloom in his wake.

·       Model the character, the flower, and the flower path.

·       Animate Character

·       Add sound and effects

·       Add Music.

·       Deploy/build with ARKit to an iPhone.

 

The following code makes the character rotate and then move toward the mouse-click position. Not exactly what I need, but still a good start.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class CharController : MonoBehaviour
{
    Quaternion charRot;
    Vector3 targetPosition;
    Vector3 lookAtTarget;
    float rotSpeed = 5;
    float speed = 5;

    public bool moving = false; // read by the FlowerPath script

    // Update is called once per frame
    void Update()
    {
        if (Input.GetMouseButton(0))
        {
            SetTargetPosition();
        }
        // If Move() is left running unconditionally in Update, the object
        // jitters, so only call it while "moving" is true.
        if (moving)
        {
            Move();
        }
        Debug.Log(transform.position.x);
    }

    void SetTargetPosition()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        // If you are not hitting anything, change the "1000" max distance.
        if (Physics.Raycast(ray, out hit, 1000))
        {
            targetPosition = hit.point;
            // LookAt snaps straight toward the target direction. Not what I need:
            // this.transform.LookAt(targetPosition);

            // Turn around only the Y axis, left and right, without making the
            // character look up into the air. By taking the X and Z differences
            // we get a direction that projects the target to the character's level.
            lookAtTarget = new Vector3(targetPosition.x - transform.position.x, transform.position.y, targetPosition.z - transform.position.z);
            // Quaternion for the rotation that has to occur to look at the target.
            charRot = Quaternion.LookRotation(lookAtTarget);

            // Toggle moving.
            moving = true;
        }
    }

    // Move the character smoothly using Slerp, which smooths the values.
    void Move()
    {
        // Smoothly rotates the character toward the target rotation.
        // "Spherically interpolates between a and b by t. The parameter t
        // is clamped to the range [0,1]."
        transform.rotation = Quaternion.Slerp(transform.rotation, charRot, rotSpeed * Time.deltaTime);

        // Move toward the target.
        transform.position = Vector3.MoveTowards(transform.position, targetPosition, speed * Time.deltaTime);

        // Toggle moving to false. Exact equality is very strict and might cause
        // problems; a distance range for "on target" would be safer.
        if (transform.position == targetPosition)
        {
            moving = false;
        }
    }
}

 

 

 

It does have a bug: the character floats toward the camera if clicked on. I thought the code should prevent that, but it still floats. Going back to raycasting basics to figure out why.
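One likely cause: the ray from the camera can hit the character’s own collider first, so targetPosition lands on the character’s body and each click pulls it toward the camera. A hedged fix is to raycast only against the ground; the “Ground” layer name here is my assumption and would need to be set up in the editor:

```csharp
// Sketch: ignore the character's own collider by raycasting only
// against an assumed "Ground" layer (configure it in the Unity editor).
void SetTargetPosition()
{
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    RaycastHit hit;
    int groundMask = LayerMask.GetMask("Ground");
    if (Physics.Raycast(ray, out hit, 1000, groundMask))
    {
        targetPosition = hit.point; // can no longer land on the character
        // ...rotation and moving toggle as before...
    }
}
```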

 

Now I have to update the code to behave more like what I need. Instead of moving where clicked, I want the character to constantly follow the mouse…

Then again, click-to-move might work better on the phone. I just have to update it to use touch.

Adding Touch: Touching Phases and Touch Controls

YouTube tutorial on screen and world coordinates for raycasting here. I had to take a look at tutorials on using touch interfaces. The code for the mouse movement doesn’t use “ScreenToWorldPoint”, but “ScreenPointToRay” – why?
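As far as I can tell, they are two routes to the same thing: ScreenPointToRay is a convenience that does the near/far unprojection for you. A sketch of the equivalence:

```csharp
// Both snippets produce essentially the same ray from the camera
// through the clicked pixel.

// 1) The shortcut:
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

// 2) The manual version: unproject two screen points, one on the near
// clip plane and one on the far clip plane, and connect them.
Vector3 near = Camera.main.ScreenToWorldPoint(
    new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.nearClipPlane));
Vector3 far = Camera.main.ScreenToWorldPoint(
    new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.farClipPlane));
Ray manualRay = new Ray(near, far - near);
```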

The following code is an exercise in how coordinates work on mobile. It covers raycasting and the “ScreenToWorldPoint” function. It also makes things disappear when clicked on. I tried to make it play a sound, since I’d like to have sound, but that didn’t work.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Require a component for playing sound.
[RequireComponent(typeof(AudioSource))]
public class touchManager : MonoBehaviour {

    public AudioSource audioData;
    public AudioClip fxClip; // assigned but never used below, which is likely
                             // why no sound plays: Play(0) uses the
                             // AudioSource's own clip, not fxClip

    // Update is called once per frame
    void Update () {
        if (Input.GetMouseButton(0))
        {
            // Far end of the camera projection: the mouse position with Z set
            // to the camera's far clip plane.
            Vector3 mousePosFar = new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.farClipPlane);

            // Near end: Z set to the camera's near clip plane.
            Vector3 mousePosNear = new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.nearClipPlane);

            // Convert "mousePosFar" and "mousePosNear" to world space.
            Vector3 mousePosF = Camera.main.ScreenToWorldPoint(mousePosFar);
            Vector3 mousePosN = Camera.main.ScreenToWorldPoint(mousePosNear);

            // Debug ray in world space: DrawRay(start position, direction, color).
            Debug.DrawRay(mousePosN, mousePosF - mousePosN, Color.green);

            // Raycast hit object. Part of Unity's physics system.
            RaycastHit hit;
            if (Physics.Raycast(mousePosN, mousePosF - mousePosN, out hit))
            {
                audioData = GetComponent<AudioSource>();
                audioData.Play(0);
                // Destroy(hit.transform.gameObject);
            }
        }
    }
}

 

 

I’ve been watching these tutorials for an hour. Stopped to eat. Losing track of the big picture. What do I need? Get the character to go where the user indicates using touch.

Parenthesis – Debugging an iPhone while on Windows

 

Testing iOS builds while working on a Windows computer is a pain. It usually involves building the scene, copying the (big) file to a Mac, and building it to an iOS device using Xcode.

I’ve been trying to use ARKitRemote to let Unity run games on iOS devices connected with a USB cable, but I keep getting errors. I’ve built the file in Unity with the player debug settings, but still can’t connect. Maybe I don’t have the iOS device drivers for Windows. I’ll install iTunes for Windows again, as suggested here. Still no success though.

 

Back to coding.

Updating priorities

 

I have exactly 24 hours before my deadline for this project. At this point I only have a character that rotates and moves toward a clicked location in 3D space. I also have the iOS deployment pipeline solved. What next?

I have to make some strategic decisions: either focus on getting the mechanics right, or add some visual bells and whistles that would help show where I’d like to take this project. Either way, the flower path system must be implemented, since it was a direct request from my instructor Rui Pereira.

Since I already have a moving character (although still a buggy one), I will work on the flower path.

I just hope this code will translate well to AR.

Flower Path

 

After the introduction, where the character wakes up and jumps from the flower, he will start following the moonlight/AR camera, leaving a path of flowers wherever he goes.

I have a couple of options for the flowers:

·       I could code a 3D line following a drawing-app tutorial (here), and instead of following the finger, the line would follow the character. The problem is making flowers instead of a line.

·       Or I could instantiate random blooming flower prefabs. I have to be careful not to tax the mobile system too much. Also, I don’t really know how to do this yet: I know how to instantiate objects, but not how to time them well. I will go with this option though.

 

AR_dev_flowerPath01.gif

Goal: instantiate placeholder objects where the character is moving.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class FlowerPath : MonoBehaviour {

    public Transform flowers;
    public Transform origin;

    public CharController CharScript;

    // Update is called once per frame
    void Update () {
        // Only draw the path (run "createPath") while the "moving" bool in the
        // "CharController" script on the "Character Dummy" GameObject (for now)
        // is true.
        if (CharScript.moving)
        {
            createPath();
        }
    }

    // Instantiate "flowers" at the position of "origin".
    void createPath()
    {
        Vector3 pos = new Vector3(0.2f, 0f, 0f);
        Instantiate(flowers, origin.position + pos, flowers.rotation);
    }
}

 

 

 

Now the character instantiates one “flower” every frame while the “moving” boolean from the “CharController” is true.

Next I have to find a way to put an interval between the creation of the “flowers”.

I tried turning the “createPath” into an IEnumerator:

 

 

I only managed to delay the activation of the function, not how fast it instantiates “flowers”.
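A pattern that would control the spawn rate itself, rather than just delaying the first call, is a single long-lived coroutine with a loop. A minimal sketch, assuming the same flowers/origin/CharScript fields as above:

```csharp
// Sketch: start this coroutine ONCE (e.g. from Start()), not every frame.
// It spawns at most one flower per interval while the character is moving.
IEnumerator CreatePathLoop()
{
    while (true)
    {
        if (CharScript.moving)
        {
            Instantiate(flowers, origin.position, flowers.rotation);
        }
        yield return new WaitForSeconds(0.2f); // the spawn interval
    }
}
```

Because the wait sits inside the loop, the interval applies to every spawn, not just the first one.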

Fixed the Jittering!

The character was jittering, making it difficult to toggle the “moving” boolean. At first I thought the condition for toggling was too strict:

        if (transform.position == targetPosition)

 

 

So I added a distance condition:

float dist = Vector3.Distance(targetPosition, transform.position);
Debug.Log(dist);

if (dist <= 1)
{
    moving = false;
}

 

But it kept jittering. Looking at the distance debug value on the console, I realized that it kept fluctuating.

I fixed it by changing from “Update” to “FixedUpdate”. Now it is stable!
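Concretely, the change amounts to leaving input sampling in Update and moving the movement call into FixedUpdate, which ticks at a fixed timestep. A sketch of the split:

```csharp
// Input is still sampled every rendered frame in Update.
void Update()
{
    if (Input.GetMouseButton(0))
    {
        SetTargetPosition();
    }
}

// Movement runs on the fixed timestep; inside FixedUpdate,
// Time.deltaTime automatically returns Time.fixedDeltaTime.
void FixedUpdate()
{
    if (moving)
    {
        Move();
    }
}
```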

 

Create random positions for the flowers

 

My beautiful flowers/cylinders are being instantiated, but I’m still not happy with the interval, since it sometimes instantiates two flowers in one frame. But it is fine for now.

Another problem is that the “flowers” always appear in the same position relative to the character. I want this position to be a little more random. I’ll have to figure out how to randomize the position while still keeping some kind of limit.

Here’s the solution:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class FlowerPath : MonoBehaviour {

    public Transform flowers;
    public Transform origin;

    public CharController CharScript;

    private float InstantiationTimer = 0.2f;

    // Update is called once per frame
    void Update () {
        // Only draw the path (run "createPath") while the "moving" bool in the
        // "CharController" script on the "Character Dummy" GameObject (for now)
        // is true.
        if (CharScript.moving)
        {
            StartCoroutine("createPath");
        }
    }

    // Instantiate "flowers" near the position of "origin".
    IEnumerator createPath()
    {
        InstantiationTimer -= Time.deltaTime;
        if (InstantiationTimer <= 0)
        {
            // WaitForSeconds(0) just yields until the next frame.
            yield return new WaitForSeconds(0);
            // Random offset from the origin; slightly negative Y to ground the flowers.
            Vector3 position = new Vector3(Random.Range(0.0f, 1.0f), -0.5f, Random.Range(0.0f, 1.0f));

            Instantiate(flowers, origin.position + position, flowers.rotation);
            InstantiationTimer = 0.2f;
        }
    }
}

 

 

Now the “flowers” are being instantiated a little apart from one another. I’d like to find a way for them to always sit on the ground plane. At the moment they are always relative to the character’s position, so if he jumps, some “flowers” will float in order to follow his Y position. I made the Y offset a little negative to ground them.
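One way to pin the flowers to the ground regardless of the character’s Y would be to raycast downward from each spawn point and snap to whatever surface is below. A hedged sketch, assuming the ground has a collider:

```csharp
// Sketch: drop each flower onto the surface below its spawn point.
Vector3 spawn = origin.position
    + new Vector3(Random.Range(0.0f, 1.0f), 0f, Random.Range(0.0f, 1.0f));

RaycastHit groundHit;
// Cast from slightly above the spawn point, straight down, up to 10 units.
if (Physics.Raycast(spawn + Vector3.up, Vector3.down, out groundHit, 10f))
{
    spawn.y = groundHit.point.y; // snap to the ground surface
}
Instantiate(flowers, spawn, flowers.rotation);
```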

Following the moonlight.

 

For now the character moves to the touch/click coordinates. I want it to follow an object instead – the moonlight, to be more specific. (Still not sure about this scheme though. Touching seems so straightforward, but it doesn’t tell the story I want.)

 

Options for the moonlight: I could use a real light source, a projection map, or a 3D object.

Before that, I have to check whether the character control still works in AR.

(Developing on a Windows PC and building to an iPhone is a pain!)

Added AR and tested… it doesn’t work as I thought!

 

Deployed to my iPhone and the game is visible, but the character is not really moving on top of the AR plane; it moves on instantiations of the normal plane instead. It does go where I point it to go, but I don’t think it is working as it should. It is also too big. I’m afraid I’ll have to adapt the character control script to some ARKit sample code. The problem is that it takes forever to deploy a build from my Windows laptop to the iPhone. Well… I’ll have to check this tomorrow.

 

New day – a new computer

It seems that I really should have been coding from ARKit sample scripts from the beginning. Coding a normal touch/mouse game and adding the ARKit camera and scripts later doesn’t actually work: the game runs, but as an overlay on top of the camera feed, not as objects really interacting with the AR planes generated by ARKit. Also, the proportions are all wrong. The Character Dummy is huge, and just “scaling” it in Unity doesn’t seem to be the best way to approach scale, as described in this Unity blog post on the subject.

Also, I’m installing my version of Unity (2018.2.10) on a Mac laptop to streamline development. On my Windows PC I’m unable to quickly test code.

I’ve been trying to start my project on a Mac and run ARKitRemote for more than an hour now. I haven’t even started transferring my code to the new project. Four hours to the deadline!

 

Rebuilt the dummy character and the flower path code in the Mac version. I still don’t have a touch control to make the character move.

Adapting Character Control to ARKit

I have to add my character control, which is based on raycast detection, to this ARKit template code:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class FlowerPath : MonoBehaviour {

    public Transform flowers;
    public Transform origin;

    public CharController CharScript;

    private float InstantiationTimer = 2f;

    // Update is called once per frame
    void Update () {
        // Only draw the path (run "createPath") while the "moving" bool in the
        // "CharController" script on the "Character Dummy" GameObject (for now)
        // is true.
        if (CharScript.moving)
        {
            StartCoroutine("createPath");
        }
    }

    // Instantiate "flowers" at the position of "origin".
    IEnumerator createPath()
    {
        InstantiationTimer -= Time.deltaTime;
        if (InstantiationTimer <= 0)
        {
            Vector3 pos = new Vector3(0.2f, 0f, 0f);
            yield return new WaitForSeconds(2);
            Instantiate(flowers, origin.position + pos, flowers.rotation);
        }
    }
}


Magic Windows - Live VR Body Painting

Exploring the boundaries of augmented and virtual reality with live body painting projection.

Recently I’ve been exploring VR drawing software like Tilt Brush and Quill, which let you use VR as a creative medium while inside VR.

For this projection mapping / augmented reality assignment, I decided to do a live VR painting that would be projected onto my body.

vr body mapping.JPG

The projector setup was very basic: it stood on a tripod facing me. This time I didn’t use any live mapping software, since the challenge was making me, the artist, aware of my surroundings and of myself being projected upon while immersed inside the VR painting program.

For this I found an app called OVRDrop. Together with Virtual Desktop, this software re-creates your desktop inside any VR environment.

ego vr setup.JPG

At first I used a webcam and later a DSLR camera connected to my computer to feed OVRDrop with external footage of myself. This allowed me to carefully move my head and arms in order to properly align my drawings with the projection.

Projector connected to youtube stream. Webcam taped on top of it - later replaced by DSLR camera.


I also experimented with using a YouTube live stream to connect the projector with what I was seeing/drawing inside the headset. I was also able to see the streamed video inside the VR painting app. This created some very interesting, albeit disconcerting, experiences as I tried to synchronize my movements with the delay of the streamed video. One possibility is to further extrapolate this multi-body / multi-temporal perspective, allowing several streams and projectors to be connected to my VR point of view.

Having someone to help setup and position myself was essential. Thanks Dan!


As for whether this is VR or AR, the whole goal was to explore the gradient between the two. Artist and viewers both see the same animation, but they are at different points on the AR/VR gradient: I’m immersed in VR while I see and manipulate my “augmented” self.

0C4A7220.JPG

Next Steps

For future installments, I’d like to allow the viewers to interact further with the experience, either on location or through streaming services.

I’d also like to use computer vision to track my body and features, allowing me to move and interact with the live VR drawings as they are created.