Featured

Thesis- Mindscape Prototype 1

This is a prototype of my IdeaScape thesis project: an immersive creative environment that combines 3D drawing, dance and music to generate new ideas and insights. It is not about making a final work of art, but about what happens before the artist starts painting, and about the environment where creating takes place.

For this use case, the user is working on traditional Brazilian Carnaval costumes and culture. The hub environment is a simple masonry house like the ones most Brazilians live in. The walls have posters depicting Carnaval costumes, photos I took while visiting the arts and crafts museum of Rio de Janeiro in 2017. Populating the room are 3D models that I created in another VR 3D modeling program. These are also “IdeaObjects”: vessels to which other objects, data, animations, images or entire worlds can be attached. When one of the objects is activated, a dancing version of it shows up and music plays (unfortunately, not Brazilian drumming in this demo). Not only that, but this specific model activates an entire environment outside the house. This colorful space was also created by me inside Oculus Medium.

Now immersed in music and dance, I can also paint in 3D space while I activate further “IdeaObjects” that were contained in that original model. In this case, after generating three dancers, another, bigger dancer shows up. They all dance and generate a cacophony of sound that immerses the artist in the mood and ambiance assigned to the original IdeaObject.

The goal of this interface is not to be “efficient” in the traditional sense of giving quick and easy access to data, but to be “efficient” in an inspiring way. It is a place for the user's mind to wander. It is not about “getting things done quickly”, but about making things that are meaningful.

Live Performance - Tension of the Self

Teaser Promo for live performance using motion capture, live music and dance.

Created by: Gabriel Brasil (Concept and Choreography), Shuai Wang (Art Direction and Unreal Development), Shuting Jiang (Music and Unreal Development) and Terrick Gutierres (Production and Concept).

Part of the "Bodies in Motion" class - NYU IMA/ITP

Instructors: Todd Bryant and Kat Sullivan

NYU Magnet, 8th floor

2 Metrotech Center - Brooklyn

As concept artist and choreography director, my role was to elaborate and implement an overarching aesthetic, all while taking into consideration our experience using the motion capture studio.

Through this experience I became proficient in motion capture studio management, direction and data cleaning.

The studio, located at NYU's Magnet Brooklyn campus, included a 15-camera OptiTrack system.

Early concepts based on the works of the renowned choreographer Martha Graham.


UX Design and Innovation Consultant

Between 2016 and 2017 I was a Senior Innovation Designer at the MJV Innovation headquarters in Rio de Janeiro, Brazil. The company has 700+ employees in Brazil, Europe and the United States. I was initially hired to help with their first Virtual Reality project for a major international corporation and became a full-time part of their team, coordinating innovation projects, organizing workshops and conducting in-depth field research for product development.

I created new business models for the cement giant Lafarge-Holcim, conducting extensive field research and interviews and using 2D illustration and 3D modeling skills to create tangible scenarios. Both projects were approved and are now being applied across South America. I also coordinated ideation workshops using Canvas, Agile and Design Thinking for corporate clients.

Extensive interviews and market research for retail and B2B clients.

In-house ideation process to coordinate innovation strategies for corporate clients.

Multi-level approach to innovation, from data analysis to broad insights based on field research.

I was responsible for persona research and illustration. My drawing skills became valuable tools throughout the process.

Early frameworks for app prototyping.

Concept for immersive sales force training using Virtual Reality.

Workshops with clients and stakeholders were an integral part of our process.

I used illustration to create trend and innovation boards on future technologies and brand strategies.

Instructional and Graphic Design - Odebrecht and Santiago Calatrava

Instructional and Graphic Designer during the Porto Rio urban renewal project

I worked for two years for a consortium of construction companies, where I was responsible for creating, distributing and managing all instructional material for the 3,000 construction workers and managers involved in the largest urban renewal project in the country, which included the construction of Calatrava's Museum of Tomorrow.

The work was done on site and involved daily field interviews and reports. In two years, I created more than 400 original graphics and booklets. My journalistic communications experience was essential in understanding and creating bridges between all stakeholders, while my design and art skills allowed for optimizing the daily creation of original content.

Little Button AR

“Little Button” is an augmented reality app where you play with a little boy who's able to make flowers blossom wherever he goes. This is part of my “Magic Windows” class for my Interactive Telecommunications Program degree at NYU.

In its current form, a glitch turned it into a flower painting program, where the user can become my “Little Button” character and spread flowers in the real world.

This AR experiment is part of the larger “Little Button” world. This is a character I've been working on for a while, aimed at children's books and apps.

For this app, the final vision is to introduce kids to this little blue guy!

Magic Windows

 

Setting up ARKit in unity

 

There is no plugin at the asset store! So go here: https://bitbucket.org/Unity-Technologies/unity-arkit-plugin

The zipped file looks like a Unity Package, but it isn’t! You should copy its content to your project “Assets” folder.

Unity will then import the plugin and update the project.

Unlike Vuforia, ARKit doesn't create a new Camera object, but uses Unity's Main Camera:

·       Create an empty object named “Camera Parent”

·       Move the “Main Camera” to “Camera Parent” – making it ‘child’.

o   Main Camera – Inspector

§  Clear Flags: Depth Only

§  Culling Mask: Everything

§  Field of View: 60

§  Clipping Planes: Near: 0.1  / Far: 37.81

o   Add component: Unity AR Video

§  Material: YUVMaterial (or any clear material)

o   Add component: Unity AR Camera Near Far

·       Create empty object: ARCameraManager

o   Add component: Unity AR Camera Manager

§  Camera: Drag n drop the Main Camera

§  Plane Detection: Select if you want it to track planes on either horizontal or vertical or both. Sometimes you don’t want to track walls or the floor.

§  Get Point Cloud: toggle

§  Enable Light Estimator: Toggle

§  Enable Auto Focus: Toggle (YES!)

§  DONE!

ARKit is setup. Now setup the example scene

·       Create Empty Object “DebugViz”

o   Inside it create another empty object named: PointFinder

§  Add component: Unity Point Cloud Example

·       Increase Num Points to Show

·       Point Cloud Prefab: (grab example asset) PointCloudPrefab

§  Add another empty object to DebugViz: PlaneFinder

·       Add component: Unity AR Generate Plane

o   Plane Prefab: debugPlanePrefab (also part of the example collection)

Build for IOS

·       Player Settings

o   Bundle Identifier: add your company name:

o   Camera Usage Description: (Type whatever you want the user to see when asked to allow use of the phone's camera... it could be “ARBABY”... whatever.)

o   Target minimum iOS Version:  11.0 (it has to be 11.0 or above)

o   Requires ARKit support: toggle

Open the project in Xcode and go through all the usual Apple nuisance of adding your team/Apple developer account (automatic signing), connecting a phone to your computer and authorizing everything. If everything works, you should have an icon for the Unity app on your phone.

 

Unity AR Hit Test example

·       Create empty object: hit_parent

o   Inside it, create an empty object named “hit”, OR create a 3D object (e.g. a Cube) and...

o   Add the “Unity AR Hit test example” script

             

§  Open the script

This script detects whether a touch on the screen is hitting one of the AR objects.

              Remove the Unity-editor-only part of Update – from the “#if” all the way to the “#else” – and remove the “#endif”. We don't need it to run inside the Unity editor player.

             

              Uncomment all the ARHitTestResultType.xxxxxxx lines (activating them).
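For reference, the part being uncommented is the list of hit-test result types that the script tries in priority order, roughly like the snippet below. This is paraphrased from memory of the plugin's example, so treat the exact names and ordering as assumptions:

// Result types to try, from most precise (existing plane within its extent) to least precise (feature point).
ARHitTestResultType[] resultTypes = {
    ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent,
    ARHitTestResultType.ARHitTestResultTypeHorizontalPlane,
    ARHitTestResultType.ARHitTestResultTypeFeaturePoint
};

foreach (ARHitTestResultType resultType in resultTypes)
{
    // HitTestWithResultType (defined in the example script) asks ARKit whether the touch
    // hits anything of this type and, if so, moves the target transform there.
    if (HitTestWithResultType(point, resultType))
    {
        return;
    }
}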

 

Build/compile and run... it sort of works. A cube showed up and it stays there... once. But no interaction.

AR Final – Little Button

 

Little Button is a tiny little boy who lives inside a flower that only blooms when bathed in moonlight. He is one of the characters of a children's book I want to write.

In the AR experience the user controls the moonlight. At first you have to light up the flower until it blossoms, revealing a sleeping Little Button inside. He then wakes up and jumps from the flower, eager to play with the user, who controls a spotlight representing moonlight. Little Button will follow the light, and wherever he goes, flowers blossom in his wake. By moving him around, the player can create drawings and patterns with the flowers, mixed with their real environment through the use of AR.

Goals

·       Make a character that moves wherever the player points.

·       Make flowers bloom in his wake.

·       Model the character, the flower and the flower path.

·       Animate Character

·       Add sound and effects

·       Add Music.

·       Deploy/Build to ARKit – iPhone.

 

The following code makes the character rotate and then move toward a mouse-click position. Not exactly what I need, but still a good start.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class CharController : MonoBehaviour
{
    Quaternion charRot;
    Vector3 targetPosition;
    Vector3 lookAtTarget;
    float rotSpeed = 5;
    float speed = 5;

    //public so other scripts (like the FlowerPath script later on) can check whether the character is moving
    public bool moving = false;

    // Use this for initialization
    void Start()
    {

    }

    // Update is called once per frame
    void Update()
    {
        if (Input.GetMouseButton(0))
        {
            SetTargetPosition();
        }
        //if "Move()" is left running unconditionally on the update, the object will jitter!
        if (moving)
        {
            Move();
        }
        Debug.Log(transform.position.x);
    }

    void SetTargetPosition()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        //If you are not hitting anything change the "1000" value.
        if (Physics.Raycast(ray, out hit, 1000))
        {
            targetPosition = hit.point;
            //LookAt snaps toward the target direction. Not what I need
            // this.transform.LookAt(targetPosition);

            //Turning around without making the character look into the air: rotate only around the Y axis, left and right. By taking the X and Z differences we get a direction to rotate the character toward, projecting the target to the character's own height.
            lookAtTarget = new Vector3(targetPosition.x - transform.position.x, transform.position.y, targetPosition.z - transform.position.z);
            //Use a Quaternion function with the values from lookAtTarget: the rotation that has to occur in order to look at the target.
            charRot = Quaternion.LookRotation(lookAtTarget);

            //toggle moving
            moving = true;
        }
    }

    //Moving the character smoothly using "Slerp" - Slerp smooths values
    void Move()
    {
        //Smoothly rotates the character toward the target position. "Spherically interpolates between a and b by t. The parameter t is clamped to the range [0,1]."
        transform.rotation = Quaternion.Slerp(transform.rotation, charRot, rotSpeed * Time.deltaTime);

        //move toward target
        transform.position = Vector3.MoveTowards(transform.position, targetPosition, speed * Time.deltaTime);

        //Toggle bool moving back to false. Exact equality is too strict and might cause problems; must define what being "on target" means as a range.
        if (transform.position == targetPosition)
        {
            moving = false;
        }
    }
}

It does have a bug where the character floats toward the camera if clicked upon. I thought the code should prevent that, but it still floats. Going back to basics on raycasting to figure out why.

 

Now I have to update the code to behave more closely to what I need. Instead of moving to where I clicked, I want the character to constantly follow the mouse.

It seems that the clicking movement might be better when using the phone. Just have to update to touch.

Adding Touch: Touching Phases and Touch Controls

YouTube tutorial on screen and world coordinates for raycasting here. Had to take a look at tutorials on using touch interfaces. The code for the mouse movement doesn't use “ScreenToWorldPoint”, but “ScreenPointToRay” – why?

The following code is an exercise in how coordinates work on mobile. It covers raycasting and the “ScreenToWorldPoint” function. It also makes things disappear when clicked. I tried to make it play a sound, since I'd like to have sound, but it didn't work.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

//require a component for playing sound
[RequireComponent(typeof(AudioSource))]
public class touchManager : MonoBehaviour {

    public AudioSource audioData;
    public AudioClip fxClip;

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {
        if (Input.GetMouseButton(0))
        {
            //Orthogonal end position - Vector3 position of the far end of the camera projection. For the Z position I will use the "Far" plane.
            Vector3 mousePosFar = new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.farClipPlane);

            //For the near position, Z = Camera nearClipPlane
            Vector3 mousePosNear = new Vector3(Input.mousePosition.x, Input.mousePosition.y, Camera.main.nearClipPlane);

            Debug.DrawRay(mousePosNear, mousePosFar - mousePosNear, Color.green);

            //Convert the above "mousePosFar" and "mousePosNear" with ScreenToWorldPoint
            Vector3 mousePosF = Camera.main.ScreenToWorldPoint(mousePosFar);
            Vector3 mousePosN = Camera.main.ScreenToWorldPoint(mousePosNear);

            //Debug: Ray(start position, direction, color)
            //Debug.DrawRay(mousePosN, mousePosF - mousePosN, Color.green);

            //Raycast hit object. Part of the Unity Physics system
            RaycastHit hit;

            if (Physics.Raycast(mousePosN, mousePosF - mousePosN, out hit))
            {
                audioData = GetComponent<AudioSource>();
                audioData.Play(0);
                // Destroy(hit.transform.gameObject);
            }
        }
    }
}

Been watching these tutorials for an hour. Stopped to eat. Losing track of the big picture. What do I need? Get the character to go where the user indicates using touch.

Parenthesis – Debugging an iPhone while on Windows

 

Testing iOS builds while working on a Windows computer is a pain. It usually involves building the scene, copying the (big) file to a Mac and building it to an iOS device using Xcode.

I've been trying to use ARKitRemote to let Unity run games on iOS devices connected with a USB cable, but I've had errors all the time. I've built the file in Unity with the player debug settings, but still can't connect. Maybe I don't have the iOS device drivers for Windows. I'll install iTunes for Windows again as commented here. Still no success though.

 

Back to coding.

Updating priorities

 

I have exactly 24 hours before my deadline for this project. At the moment I only have a character that turns and moves toward a clicked location in 3D space. Also, I have the iOS deployment pipeline solved. What next?

I have to make some strategic decisions. Either focus on getting the mechanics right, or add some visual bells and whistles that would help show where I’d like to get with this project. Both options must have the flower path system implemented, since this was a direct request from my instructor Rui Pereira.

Since I have a moving character – although a still buggy one – I will work on the flower path.

I just hope this code will translate well to AR.

Flower Path

 

After the introduction, where the character wakes up and jumps from the flower, he will start following the moonlight/AR camera, while leaving a path of flowers wherever he goes.

I have a couple of options for the flowers:

·       I could code a 3D line following a drawing app tutorial (here) and instead of using the finger, the line would follow the character. The problem is making flowers instead of a line.

·       Or I could instantiate random blooming flower prefabs. I have to be careful not to tax the mobile system too much. Also, I don't really know how to do this. I know how to instantiate objects, but not how to time them very well. I will go with this one though.

 


Getting it to instantiate placeholder objects where the character is moving.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class FlowerPath : MonoBehaviour {

    public Transform flowers;
    public Transform origin;

    public CharController CharScript;

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {
        //create condition to start drawing path/running "createPath" based on the "bool moving" in the "CharController" script located in the "Character Dummy" GameObject (for now)
        if (CharScript.moving)
        {
            createPath();
        }
    }

    //instantiate "flowers" on the position of "origin"
    void createPath()
    {
        Vector3 pos = new Vector3(0.2f, 0f, 0f);
        Instantiate(flowers, origin.position + pos, flowers.rotation);
    }
}

Now the character instantiates one “flower” every frame after checking if the “moving” variable from the “CharController” is “true”.

Now I have to find a way to give an interval between the creation of the “flowers”.

I tried turning the “createPath” into an IEnumerator:
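(The snippet itself didn't survive, so here is a rough reconstruction of the kind of attempt described, with Update() calling StartCoroutine("createPath") instead of createPath(); the delay value is assumed.)

//instantiate "flowers" on the position of "origin", after a delay
IEnumerator createPath()
{
    // This only delays when each flower appears after its coroutine starts;
    // since Update() still starts a new coroutine every frame while moving, flowers keep spawning every frame.
    yield return new WaitForSeconds(0.2f);
    Vector3 pos = new Vector3(0.2f, 0f, 0f);
    Instantiate(flowers, origin.position + pos, flowers.rotation);
}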

 

 

I only managed to delay the activation of the function, and not how fast it instantiates “flowers”.

Fixed the Jittering!

The character was jittering, making it difficult to toggle the “moving” Boolean. At first I thought it was because the condition for toggling was too strict:

        if (transform.position == targetPosition)

 

 

So I added a distance condition:

float dist = Vector3.Distance(targetPosition, transform.position);
Debug.Log(dist);

if (dist <= 1)
{
    moving = false;
}

 

But it kept jittering. Watching the distance variable's debug output on the console, I realized that it kept fluctuating.

I fixed it by changing from “Update” to “FixedUpdate”. Now it is stable!
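A minimal sketch of what that looks like inside CharController, assuming the rest of the script stays as shown earlier and combining both fixes (the threshold of 1 is the one from the snippet above):

// Physics-rate update: moving the character here instead of in Update() stopped the jitter.
void FixedUpdate()
{
    if (moving)
    {
        Move();
    }

    // Stop once we are "close enough" instead of requiring exact equality.
    float dist = Vector3.Distance(targetPosition, transform.position);
    if (dist <= 1f)
    {
        moving = false;
    }
}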

 

Create random positions for the flowers

 

My beautiful flowers/cylinders are being instantiated, but I'm still not happy with the interval, since it sometimes instantiates two flowers in one frame. But it is fine for now.

Another problem is that the “flowers” always appear at the same position relative to the character. I want this position to be a little more random. I'll have to figure out how to randomize the position while still keeping some kind of limit.

Here’s the solution:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class FlowerPath : MonoBehaviour {

    public Transform flowers;
    public Transform origin;

    public CharController CharScript;

    private float InstantiationTimer = 0.2f;

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {
        //create condition to start drawing path/running "createPath" based on the "bool moving" in the "CharController" script located in the "Character Dummy" GameObject (for now)
        if (CharScript.moving)
        {
            StartCoroutine("createPath");
        }
    }

    //instantiate "flowers" on the position of "origin"
    IEnumerator createPath()
    {
        InstantiationTimer -= Time.deltaTime;
        if (InstantiationTimer <= 0)
        {
            yield return new WaitForSeconds(0);
            //Position origin offset
            Vector3 position = new Vector3(Random.Range(0.0f, 1.0f), -0.5f, Random.Range(0.0f, 1.0f));

            Instantiate(flowers, origin.position + position, flowers.rotation);
            InstantiationTimer = 0.2f;
        }
    }
}

Now the “flowers” are being instantiated a little apart from one another. I'd like to find a way for them to always be fixed to the ground plane. At the moment they are always relative to the character's position, and if it jumps, some “flowers” will float in order to follow its Y position. I made the Y offset a little negative to ground them.
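One way to keep them pinned to the ground regardless of the character's height would be to overwrite the Y component with a known ground height. A sketch; “groundY” is an assumed value (for example the Y of the detected AR plane), not something in the script above:

// Assumed ground height, e.g. the Y of the detected AR plane; not part of the original script.
float groundY = 0f;

Vector3 position = new Vector3(
    origin.position.x + Random.Range(0.0f, 1.0f),
    groundY,                                        // ignore the character's Y entirely
    origin.position.z + Random.Range(0.0f, 1.0f));

Instantiate(flowers, position, flowers.rotation);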

Following the moonlight.

 

For now the character moves to the touch/click coordinates. I want it to follow an object – the moonlight, to be more specific. (Still not sure of this scheme though. Touching seems so straightforward, but it doesn't tell the story I want.)

 

Options for the moonlight: I could use a real light source, a projection map or a 3D object.

Before that, I have to see if the character control still works in AR.

(Developing on a Windows PC and building to an iPhone is a pain!)

Added AR and tested….It doesn’t work as I thought!

 

Deployed to my iPhone and the game is visible, but the character is not really moving on top of the AR plane, but inside instantiations of the normal plane. It does go where I point it to go, but I don't think it is working as it should. It is also too big. I'm afraid I'll have to adapt the character control script to some ARKit sample code. The problem is that it takes forever to deploy the build from my Windows laptop to the iPhone. Well... I will have to check this tomorrow.

 

New day – a new computer

It seems that I really should have been coding from the ARKit sample scripts from the beginning. Coding a normal touch/mouse game and adding the ARKit camera and scripts later doesn't actually work. The game runs, but as an overlay on top of the camera, not as objects really interacting with the AR planes generated by ARKit. Also, the proportions are all wrong. The Character Dummy is huge, and just “scaling” it in Unity doesn't seem to be the best way to approach scale, as described in this Unity blog post on the subject.

Also, I'm installing my version of Unity (2018.2.10) on a Mac laptop to streamline development. With my Windows PC I'm unable to quickly test code.

I've been trying to start my project on a Mac and run ARKitRemote for more than an hour now. I haven't even started transferring my code to the new project. 4 hours to the deadline!

 

Rebuilt the dummy character and the flower path code in the Mac version. I still don't have a touch control to make the character move.

Adapting Character Control to ARKit

I have to add my character control, which is based on raycasting detection, to this ARKit template code

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class FlowerPath : MonoBehaviour {

    public Transform flowers;
    public Transform origin;

    public CharController CharScript;

    private float InstantiationTimer = 2f;

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {
        //create condition to start drawing path/running "createPath" based on the "bool moving" in the "CharController" script located in the "Character Dummy" GameObject (for now)
        if (CharScript.moving)
        {
            StartCoroutine("createPath");
        }
    }

    //instantiate "flowers" on the position of "origin"
    IEnumerator createPath()
    {
        InstantiationTimer -= Time.deltaTime;
        if (InstantiationTimer <= 0)
        {
            Vector3 pos = new Vector3(0.2f, 0f, 0f);
            yield return new WaitForSeconds(2);
            Instantiate(flowers, origin.position + pos, flowers.rotation);
        }
    }
}
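A hedged sketch of where I want to take that adaptation: swapping the Physics.Raycast in SetTargetPosition for an ARKit hit test against the detected planes, using the (now deprecated) Unity ARKit plugin API. The class name, the reference to the character script and the exposed method are assumptions for illustration, not the project's actual code:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // namespace of the Unity ARKit plugin

public class ARCharTargeting : MonoBehaviour
{
    public CharController character; // hypothetical reference to the character script

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // ARKit hit tests take viewport coordinates (0..1), not pixels.
            Vector3 viewportPos = Camera.main.ScreenToViewportPoint(Input.GetTouch(0).position);
            ARPoint point = new ARPoint { x = viewportPos.x, y = viewportPos.y };

            // Ask ARKit what the touch hits on the detected planes.
            List<ARHitTestResult> results = UnityARSessionNativeInterface.GetARSessionNativeInterface()
                .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

            if (results.Count > 0)
            {
                // Convert the ARKit transform into a Unity world position for the character to walk to.
                Vector3 target = UnityARMatrixOps.GetPosition(results[0].worldTransform);
                // e.g. character.SetTargetPosition(target); // would require exposing such a method on CharController
            }
        }
    }
}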

Desert of the Real - Create a VR paintbrush

This is a 3-week-old assignment that I didn’t manage to do until now. The goal is to create a VR painting program.

Every time I start to code I just freeze looking at the empty script. I just don't know how to organize my thoughts and don't have a clear idea of what is to be achieved. For this reason, I decided to write down everything I want to do in plain English.

Before coding I set up the scene using the basic SteamVR package. I set up a teleport area for locomotion; a plane to make this area visible; a cube platform to set my objects on and for orientation; three cube prefabs, each with a different material/color (red, green and blue); and finally a SteamVR player controller. I also put in some throwable objects just to test the controller responsiveness.

The class template has an empty object called “InteractiveObject” where the BrushHandler and ColorSelector scripts are kept. The ControllerHandler script is found in the Player's “RightHand”.

Now for the coding part:

Goals:

·       Have a Brush object

·       The brush is controlled with the VR controller.

·       Have colored objects in the scene that will serve as the colorPalette.

·       When the Brush touches one of these objects, it changes to that object's color.

·       When pressing the trigger button on the controller, the Brush creates many copies of itself named paintObjects, allowing the user to paint the scene with it.

·       The paintObject must have the color of the brush.

·       If the brush touches a colorPalette with a different color, it should change color, but the paintObjects already in the scene should keep their original color – once paint is set, it stays the same color (no changing the color of what was already drawn).

·       There should be a limit on how many paintObjects are in the scene at any given time: a limit on how much paint can be in the scene. If this limit is reached, the “oldest” paintObject must be deleted (see the sketch after this list).

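For that last goal, a minimal sketch of one way to cap the paint count. This is a hypothetical helper, not part of Igal's template; the limit value and the idea that every new paintObject gets registered with it are assumptions:

using System.Collections.Generic;
using UnityEngine;

// Hypothetical helper: keeps only the newest N paintObjects in the scene.
public class PaintLimiter : MonoBehaviour
{
    public int maxPaintObjects = 500;                       // assumed limit
    private Queue<GameObject> paintQueue = new Queue<GameObject>();

    // Call this right after instantiating a new paintObject.
    public void Register(GameObject paintObject)
    {
        paintQueue.Enqueue(paintObject);
        if (paintQueue.Count > maxPaintObjects)
        {
            // First in, first out: the oldest paint gets destroyed first.
            Destroy(paintQueue.Dequeue());
        }
    }
}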

 

There are three scripts already created by the class instructor Igal Nassima. They come with comments and guidelines, but I don't really understand his workflow. I'm not sure where Igal wants us to put the scripts, or whether the Brush object is created by a script or will already be in the scene.

I will try to do it somehow…

·       ControllerHandler

o   Makes the brush paint

o   Checks what color is in the brush

o   Get the brush color

o   Creates instances of the paintObject that has the current color of the brush.

·       ColorSelector

o   “Create a public Color Type that can be set from the inspector to be used in each menu Item” – (comment by Igal) – Don’t understand what he means by “color type” and why it must be set from the inspector.

o   “Setup Collider Trigger to send the color of this object to the Brush Handler” – Checks if there is a collision between the brush and another object and send the color of this other object back to the Brush Handler.

o   “Find Object of Type → then call the function that sets the currentColor” –

·       BrushHandler

o   Store the current color on the brush

o   Sets what gameobject will be drawn/created/instantiated when activated by the trigger on the controllerHandler.

o      //create a public function that SETS UP the color of the currentColor (hint, function must have a Color variable)

o       //create a public function that RETURNS the Color of the currentColor

§  Don’t know the difference between Set UP the color and “Returns the Color” and what it is supposed to do..

§  Also, Don’t know how to make scripts communicate with one another.

Found an example in my colleague Ellen Nickles' documentation of this assignment:

//Inside the ControllerHandler script, this sets the color on the BrushHandler:
private void OnTriggerEnter(Collider other)
{
    FindObjectOfType<BrushHandler>().currentColor = myColor;
}
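Building on that pattern, a rough sketch of how the two scripts could communicate. The currentColor and myColor fields come from the notes above, but the method names, the “Brush” tag and the overall structure are my assumptions rather than Igal's actual template (and in Unity each class would live in its own file):

using UnityEngine;

// Sketch of a BrushHandler: stores the current color and exposes set/get functions.
public class BrushHandler : MonoBehaviour
{
    public Color currentColor = Color.white;

    // "SETS UP the color": other scripts call this to change the brush color.
    public void SetCurrentColor(Color newColor)
    {
        currentColor = newColor;
    }

    // "RETURNS the Color": used when instantiating a paintObject so it copies the brush's color.
    public Color GetCurrentColor()
    {
        return currentColor;
    }
}

// Sketch of a ColorSelector placed on each colorPalette object.
public class ColorSelector : MonoBehaviour
{
    public Color myColor;   // set from the Inspector for each menu item

    private void OnTriggerEnter(Collider other)
    {
        // Only react to the brush (assumes the brush object is tagged "Brush").
        if (other.CompareTag("Brush"))
        {
            FindObjectOfType<BrushHandler>().SetCurrentColor(myColor);
        }
    }
}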

 

 

 

Creating the Brush

The Brush doesn’t paint, but stores its own color and changes color when

·       The brush is a Sphere prefab named ‘Brush’.

·       It will have a color.

·       It will get the color of the object it touches

·       It will return the color it has

·       It has a script called BrushHandler. It will set the following behaviors:

o   Stores the currentColor

o   Creates/Returns the prefab that will be drawn (the “paint”)

 

Starting to write the code itself

              Don’t really know where to start. I look at the three template scripts, but don’t know which function to make. I don’t even know if I should create the brush object of if it is instantiated by one of the scripts.

 

Instantiation Tests

In order to practice basic coding, I also created a Test script to try things out.

I created a transform function that makes a cube move, and it asks for a GameObject. I added a different object in the Inspector, but the object the script was attached to – the cube – was still the one that moved. Why?
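For reference, a minimal sketch of that distinction (the field name is assumed): “transform” always refers to the GameObject the script is attached to, so the object referenced in the Inspector only moves if its own transform is used explicitly.

using UnityEngine;

public class MoveTest : MonoBehaviour
{
    public GameObject other;   // the different object assigned in the Inspector

    void Update()
    {
        // Moves the object this script is attached to (the cube), no matter what "other" is.
        transform.position += Vector3.right * Time.deltaTime;

        // To move the referenced object instead, its own transform has to be used explicitly:
        // other.transform.position += Vector3.right * Time.deltaTime;
    }
}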

1 Creating – Instantiating an object

I will create a brush and make it instantiate copies of itself when the user presses the Oculus controller trigger button. I'll figure out the other actions (changing color) later.

We managed to run a basic instantiate example copied from the Unity API manual. At first it didn't work, because we added the script to the prefab we wanted to instantiate. This freezes Unity, because it tries to instantiate the object WITH the instantiate script, which then instantiates another, and so on. Once we added a different prefab to the public GameObject slot in the Inspector, the instantiate code worked fine.

Basic Unity Instantiate example 1:

//The Manual didn't use "public".
public Transform prefab;

void Start()
{
    for (int i = 0; i < 30; i++)
    {
        Instantiate(prefab, new Vector3(i * 2.0F, i * 1, 0), Quaternion.identity);
    }
}

Basic Unity Instantiate example 2:

public class TestScript : MonoBehaviour
{
    // Instantiate a rigidbody then set the velocity

    public Rigidbody projectile;

    void Update()
    {
        // Ctrl was pressed, launch a projectile
        if (Input.GetButtonDown("Fire1"))
        {
            // Instantiate the projectile at the position and rotation of this transform
            Rigidbody clone;
            clone = Instantiate(projectile, transform.position, transform.rotation);

            // Give the cloned object an initial velocity along the current
            // object's Z axis
            clone.velocity = transform.TransformDirection(Vector3.forward * 10);
        }
    }
}

STARTING FROM SCRATCH (?!!)

For some reason it seems that the SteamVR plugin is no longer functional in my project. I can't find the Steam Controller Input inside the “Window” menu. I downloaded the template project again and it seems to be working fine there. I'll have to set up my scene again.

Update: Maybe the SteamVR plugin wasn't turned on (for some reason). I started a new project anyway.

Spheres everywhere! Instantiating is a success (and out of control)

Making the paint brush:

Now we have two scripts that allow us to create spheres in the scene. One creates a sphere every frame and the other creates one sphere when “Fire1”/Ctrl is pressed on the keyboard.

By changing the prefab instantiated by these scripts, it was possible to create a static sphere in the scene – one that has no gravity and doesn't fly away. By pressing “Fire1”/Ctrl it was possible to create flying spheres and ‘paint' the scene with them.

Goal: Create static instances of a sphere while the Oculus controller trigger button is pressed, and stop creating them when it is not pressed.

This is the code created:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class DrawTest : MonoBehaviour
{
    public Transform paint;

    void Update()
    {
        if (Input.GetButtonDown("Fire1"))
        {
            createPaint();
        }
    }

    void createPaint()
    {
        for (int i = 0; i < 1; i++)
        {
            Instantiate(paint, transform.position, transform.rotation);
        }
    }
}

This script works like the first one: it creates one sphere every time “Fire1”/Ctrl is pressed, but it doesn't keep creating more balls if the button stays pressed. Maybe I need to change “Input.GetButtonDown” to something else…

Changed from “Input.GetButtonDown” – which returns true only on the frame the button is pressed down and resets when released – to “Input.GetButton” – which returns true every frame the button is held. And it works!

Nice smooth sphere paint created on the scene.

3D Sound for Immersive Experiences

For my week 5 assignment, I had to create my own sounds using VCV Rack - an open source modular synth simulator.

I had a blast creating more than an hour of blips, bloops and other original sounds.

Snippets of the track were edited and mastered in Reaper to be turned into spatial sound objects for a future VR experience created with the Unreal Engine.

Using VCV sounds in Unreal – 3D audio spatialization

I selected some of my favorite and more relevant sounds created in VCV Rack and added them to the Unreal Engine as 3D spatial sound objects.

The sound files were edited for “looping” and placed in areas around this environment I built using basic template objects.

The goal was to create distinct transitions between areas with the help of careful placement of 3D sound.

For some of the effects I created sound cues – programmable sound objects. These allow Unreal to change sound parameters in many ways. In this case I combined two different sound files and added a random filter and several different EQ modulators. This allowed for the creation of different sounds inside the “temple” area every time the user entered it. It almost works; for now it only creates a few simple variations of the original sounds.

The next goal is to add some “head locked sounds”: ambient sound that covers the entire area, creating a more cohesive experience as well as amplifying the sense of being in an open, living world.


Magic Windows - Live VR Body Painting

Exploring the boundaries of augmented and virtual reality with live body painting projection.

Recently I've been exploring drawing software like Tilt Brush and Quill, which allow you to use VR as a creative medium while inside VR.

For this projection mapping/augmented reality assignment, I decided to do a live VR painting that would be projected onto my body.


The projector setup was very basic, with it standing on a tripod facing me. For the time being I didn't use any live mapping software, since the challenge was how to make me – the artist – aware of my surroundings and of myself being projected upon while immersed inside the VR painting program.

For this I found an app called OVRDrop. Together with Virtual Desktop, this software re-creates your desktop inside any VR environment.


At first I used a webcam and later a DSLR camera connected to my computer to feed OVRDrop with external footage of myself. This allowed me to carefully move my head and arms in order to properly align my drawings with the projection.

Projector connected to the YouTube stream. Webcam taped on top of it – later replaced by a DSLR camera.

I also experimented with using a YouTube live stream to connect the projector with what I was seeing/drawing inside the headset. I was also able to see the streamed video inside the VR painting app. This created some very interesting, albeit disconcerting, experiences as I tried to synchronize my movements with the delay of the streamed video. One possibility is to further extrapolate this multi-body/multi-temporal perspective, allowing several streams and projectors to be connected with my VR point of view.

Having someone to help set up and position myself was essential. Thanks Dan!

As for whether this is VR or AR, the whole goal was to play with the gradient between the two. Artist and viewers are both seeing the same animation, but they are at different positions on the AR/VR gradient. I'm immersed in VR while I see and manipulate my “augmented” self.


Next Steps

For future installments, I’d like to allow the viewers to interact further with the experience, either on location or through streaming services.

I'd also like to use computer vision to track my body and features, allowing me to move and interact with the live VR drawings being created.

Bodies in Motion - Studio day Week 2

Our second studio day