Work

Cognitive Explorers - VR tool for Neuroscience

Cognitive Explorers is part of a neuroscience research project that aims to detect the neurophysiological signs related to complex emotions using Virtual Reality and biofeedback sensors.

In 2017 I concluded my master's thesis in "Arts and Neurosciences" at the Universidade Federal do Rio de Janeiro with this project, developed in conjunction with Professor José Otávio and the Limbisseen lab team.

The VR environment was modeled in Maya and deployed in Unity.

In it, the subject starts in an environment modeled after the real lab where they are located. Throughout the experience, the subject travels through increasingly bizarre environments. This is deliberate: the goal is to elicit anxious but still ambiguous emotional reactions from the users. Jump scares were avoided.
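As a rough illustration (not the study's actual code), here is a minimal Unity sketch of how such a timed progression through environments could be sequenced; the scene names and durations below are hypothetical:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sequencer: moves the subject through a list of
// increasingly unsettling environments on a fixed schedule.
public class EnvironmentSequence : MonoBehaviour
{
    // Placeholder scene names; the actual study used its own environments.
    [SerializeField] private string[] sceneNames = { "LabReplica", "StrangeForest", "AbstractSpace" };
    [SerializeField] private float secondsPerScene = 120f;

    private void Awake()
    {
        // Keep the sequencer alive across scene loads.
        DontDestroyOnLoad(gameObject);
    }

    // Unity runs a Start() that returns IEnumerator as a coroutine.
    private IEnumerator Start()
    {
        foreach (string scene in sceneNames)
        {
            yield return SceneManager.LoadSceneAsync(scene);
            Debug.Log($"Entered environment: {scene} at t={Time.time:F1}s");
            yield return new WaitForSeconds(secondsPerScene);
        }
    }
}
```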

After the experience, the subjects were interviewed and the data was analyzed.

The final conclusions stated that, although some events inside virtual reality generated similar reactions from most subjects, the follow-up interviews showed that a wide range of emotional responses could be generated by the same VR environment. The use of VR as a tool inside the lab made it possible to closely follow the subjects' reactions within its digitally controlled environment.

As a VR developer, I concluded that this medium is capable of deep, nuanced experiences, far richer than most stimuli commonly used in laboratory settings.

Abstract from the original Thesis - Gabriel Brasil

"This work explores new methodologies for the use of Virtual Reality equipment in conjunction with sensors that monitor the peripheral physiological functions, with the goal of creating a better understanding of the role of our emotional states and intuition in the cognitive construction of reality.

It starts from the idea that we are explorers of our cognitive processes and that, from this search, the universe we live in is created. The advent of Virtual Reality is but another chapter in the use of technology as a prosthesis, dictating how we move our physical and embodied minds through the intersections between realities created by our sense of presence. Some enthusiasts of these new sensory tools become adepts of cyborgism, surgically implanting electronic prostheses in order to create new senses, new panoramas.

After meeting one of these exploring cyborgs, Cognitive Explorers is born: an experiment seeking to understand the immersion processes of virtual reality through the measurement of emotional and physiological states. The current work introduces the history of Virtual Reality and its technologies, as well as presenting best practices in the use of Virtual Reality visors and the creation of virtual environments for laboratory use.

During Cognitive Explorers, two Empatica E4 sensors were used to monitor the emotional and physiological states of 22 volunteers while they experienced a virtual environment created exclusively for this purpose.

The data analysis shows that there is great correlational potential between the emotional states and the experiences lived by the volunteers. Virtual moments created to generate stress or contemplation are identifiable in the graphs of cardiological, electrodermal, and temperature arousal. The immersion allowed by accessible Virtual Reality technologies made this pilot project possible, and understanding the methodologies behind its use opens the way to new advances in cognition and the neurosciences."
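As a hypothetical aside (not the thesis's actual analysis pipeline), the kind of event-window reading described in the abstract boils down to averaging a physiological signal around scripted VR moments; the file name, sample rate, and event times below are assumptions for illustration only:

```csharp
using System;
using System.IO;
using System.Linq;

// Illustrative sketch: average an electrodermal (EDA) signal in a short
// window after each scripted VR event, to compare arousal across moments.
class EventWindowArousal
{
    static void Main()
    {
        const double sampleRateHz = 4.0;                  // assumed EDA sample rate
        double[] eda = File.ReadLines("eda_samples.csv")  // assumed format: one value per line
                           .Select(double.Parse)
                           .ToArray();

        double[] eventTimesSec = { 60, 180, 300 };        // hypothetical event timestamps
        const double windowSec = 10;

        foreach (double t in eventTimesSec)
        {
            int start = (int)(t * sampleRateHz);
            if (start >= eda.Length) continue;            // event falls outside the recording

            int count = Math.Min((int)(windowSec * sampleRateHz), eda.Length - start);
            double mean = eda.Skip(start).Take(count).Average();
            Console.WriteLine($"Event at {t}s: mean EDA over next {windowSec}s = {mean:F3}");
        }
    }
}
```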

 

Teaching Innovation, Technology and Design for Children

vivaRio2017.jpg

I take great joy in teaching technology, art, and design to kids and young adults, and I have taken every opportunity to do so since 2015.

In 2015 I was invited to be a consultant for the Estacio Tunnel Lab, an innovation accelerator aimed at young students from a group of private universities. There I worked together with the team behind *Favela Games*, a social venture aimed at teaching kids from the poor favelas of Rio de Janeiro how to make games and gain important technical and design skills valued by today's market.

The project won prizes and media attention, with our two main designers making appearances on national news and TV shows.

After this experience, I was invited to teach at other schools, often to 8th graders from lower-income households. From the end of 2016 to the beginning of 2017, I was hired as a teacher and consultant at one of Viva Rio's main innovation hubs, in the Cantagalo favela in the heart of Rio de Janeiro. There I showed students how to make their own 360° videos, taught drawing as a design tool, and together we made music videos and games.

Favela Games Estacio.jpg

The students were also invited to visit the Neurosciences and Arts lab at the Federal University of Rio de Janeiro, where I concluded my first master's degree. There they could see the latest in virtual reality and have their first contact with a scientific organization and academic professionals.

Through my consulting company - Imaginaria - our team set up the "Makerzinhos" (Little Makers) stand at the high-end Jockey Club summer camp exhibition. We used Makey Makey boards, tin foil, and cardboard to create banana-operated games and piano stairs.

makerzinhos 2015 01.jpg
makerzinhos 2015 02.jpg
Viva Rio 2017 02.jpg

Toys that Make Noise! XR - MIT VR Hackathon 2018

Augmented Reality interface for the Microsoft HoloLens. Created by Barak Chamo, Maï Iszak and Gabriel Brasil during the MIT Reality Virtually Hackathon - October 2017.

"Toys that make Noise" was developed by combining Vuforia's Augmented Reality platform with the Hololens - Microsoft "self-contained, holographic computer" - to bring personal objects and memories to life. 

In the real world we surround ourselves with things that have meaning and are attached to personal memories: pictures in frames, little souvenirs, gifts, and one-of-a-kind objects. But in our increasingly digital lives, all of our dearest memories are just files in folders like any other.

With "Toys that make wow" we use Augmented Reality to connect both our physical cherished mementos, with our digital memories. Just need to stare at a picture on a frame, and a video of that event is played; look at a hand drawn illustration and 3D objects appear. 

Physical objects can also talk to one another. In our experience, each souvenir played part of a song, which plays as a whole once all the objects have been activated by staring at them in order.
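A minimal sketch of that sequencing logic in Unity; the class and method names are hypothetical, and in the actual project the trigger came from the AR image-target tracking events rather than the stub shown here:

```csharp
using UnityEngine;

// Hypothetical sketch of the "objects talk to one another" logic:
// each memento plays its own fragment when looked at, and the full
// song plays once every memento has been activated in order.
public class MementoSequence : MonoBehaviour
{
    [SerializeField] private AudioSource[] fragments; // one clip per souvenir, in order
    [SerializeField] private AudioSource fullSong;    // plays when the sequence completes

    private int nextIndex = 0;

    // Call this from the AR "target found" callback of memento `index`.
    public void OnMementoFound(int index)
    {
        if (index != nextIndex) return;      // mementos must be activated in order

        fragments[index].Play();             // play this souvenir's fragment
        nextIndex++;

        if (nextIndex == fragments.Length)   // all mementos activated
            fullSong.Play();
    }
}
```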

toyswow_team_web.jpg

Team behind "Toys that make Noise": Maï Iszak, Gabriel Brasil and Barack Chamo.

Transcription Exercise - My 2014 ITS lecture on Virtual Reality

LINK TO TRANSCRIPTION FILE HERE

The goal of this first assignment for Reading and Writing Electronic Text was to transcribe something, anything, that wasn't already available as electronic text. I chose to transcribe the following YouTube video of a lecture I gave in 2014, together with Franey Nogueira, about this (not so) new thing called Virtual Reality.

About the Video

The lecture took place at the Institute for Technology and Society of Rio de Janeiro, where every week experts in different fields would talk about the latest and most pressing issues. That year, the Oculus Rift headset had just been released for developers and researchers, and I took it upon myself to spread the word about this new medium, while also pointing out its challenges and shortcomings. Meanwhile, the visual artist Franey Nogueira became interested in exploring the role of VR in her work and in contemporary art. Franey and I were very excited to be invited there, and, after carefully watching the recording for this assignment, I realized that maybe I was too excited.

Thoughts on the transcription process

First, the transcription is in Portuguese, since that is the language of the original video. It took me about an hour to type 1,701 words, covering less than 12 minutes of the full 1 hour and 33 minute video.

During the transcription, I became painfully aware of my shortcomings as a speaker back then: I skipped words, mumbled, and was often unable to finish structured sentences. I remember the audience's final reactions and comments being quite positive, but watching and listening to all of my sentences, over and over again, while also typing them, was like holding a microscope to my own words.

Interestingly enough, I realized that I was inadvertently correcting myself, fixing errors. If I mumbled something or finished a sentence abruptly, I would simply not transcribe it, or I would make small corrections. As I kept writing - and became more conscious of my speech - I actually became more faithful to what was said. The second half of the transcription has more "..." and mumblings than the first half.

When Franey, my colleague, spoke, I felt more at ease being precise with the transcription, and I believe I made barely any unconscious changes or improvements. This made it even clearer to me how much I had been editing the transcription without even realizing it.

Plain Text format politics

After finishing and uploading the transcription to this website (here), I realized that the plain text format, when opened in my Chrome browser, transformed basic Portuguese characters (like the "ão" in "não"/"no") into a barely readable garble. The teacher, Allison Parrish, warned about using "weird signs" in plain text, but I didn't realize that this would include some of the most important signs in my native Portuguese language. I understand the technical and historical reasons the encoding does not recognize "ão", but seeing my transcription transformed into such a mess made me wonder about the political aspects of not having your own language recognized as "plain text". What does it mean when your language is not... "plain"?
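For illustration (not part of the original assignment), here is a small sketch of what was most likely happening: the UTF-8 bytes for "não" being decoded as Latin-1 produce exactly that kind of garble.

```csharp
using System;
using System.Text;

// Illustrative only: shows how the UTF-8 bytes for "não" turn into garble
// when a page (or browser) decodes them as Latin-1 / ISO-8859-1 instead.
class PlainTextGarble
{
    static void Main()
    {
        string original = "não";                          // "no" in Portuguese
        byte[] utf8Bytes = Encoding.UTF8.GetBytes(original);

        // Misinterpreting the same bytes as Latin-1 yields "nÃ£o".
        string garbled = Encoding.GetEncoding("ISO-8859-1").GetString(utf8Bytes);

        Console.WriteLine($"UTF-8 read as UTF-8:   {original}");
        Console.WriteLine($"UTF-8 read as Latin-1: {garbled}");
    }
}
```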

Syllabus and first-week assignment:

http://rwet.decontextualize.com/schedule/