Week 4 Update

This week I started the production phase of the project by creating a series of levels to mimic the flow of the experience, from the Main Menu up to lobby selection. I spent some time watching a Unity live-stream tutorial on how to create, save, and load persistent data, since I will need to implement something similar to record user preferences, like their favorite animal, things that make them uncomfortable, etc.
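For reference, the pattern the tutorial covers boils down to serializing a plain class to JSON under Application.persistentDataPath. A minimal sketch of what I have in mind (the class and field names here are made up for illustration, not the project's actual code):

```csharp
using System.IO;
using UnityEngine;

// Hypothetical preference data; field names are placeholders.
[System.Serializable]
public class UserPreferences
{
    public string favoriteAnimal;
    public string[] discomfortTriggers; // things that make the user uncomfortable
}

public static class PreferenceStore
{
    // persistentDataPath survives app restarts on the device.
    static string FilePath
    {
        get { return Path.Combine(Application.persistentDataPath, "preferences.json"); }
    }

    public static void Save(UserPreferences prefs)
    {
        File.WriteAllText(FilePath, JsonUtility.ToJson(prefs));
    }

    public static UserPreferences Load()
    {
        if (!File.Exists(FilePath)) return new UserPreferences();
        return JsonUtility.FromJson<UserPreferences>(File.ReadAllText(FilePath));
    }
}
```

JsonUtility only serializes public (or [SerializeField]) fields of [System.Serializable] classes, which is why the data class is kept flat.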

During this time, there were some level design changes as a result of rapid prototyping and user testing. The most constrained setting, the lying-down position, had a very small viewing range. During several tests on the previous room level, I noticed that it was hard to access the menu placed on a wall to the side. I really had to strain my neck to select the menu items, and it was really uncomfortable. This needed to change. I also realized the lobby level can double as a relaxation space. And I started to design around adding downloadable content (DLC) to the project.

This week was super busy in terms of professional meetups in the evenings, so I had to cut some development time.

On Tuesday, Unity 2017 was released. After backing up the project, I decided to upgrade. To my surprise, I had almost no issues. Usually when I upgrade to a newer Unity version, a lot of scripts and sometimes prefabs break. I was expecting tons of errors, but so far there have been only a couple of issues.

The first one was in the OVROverlay.cs script from Oculus Utilities 1.16.0 beta. The method Cubemap.CreateExternalTexture takes 4 parameters, but somehow the script passed 6 arguments. Once I fixed that particular line, the error disappeared.
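For anyone hitting the same error: if I recall the Unity 2017 API correctly, the 4-parameter form takes a face size, a texture format, a mipmap flag, and the native texture pointer. A hedged sketch (the wrapper class and names are mine, not OVROverlay's):

```csharp
using System;
using UnityEngine;

static class CubemapFix
{
    // Unity 2017: Cubemap.CreateExternalTexture(width, format, mipmap, nativeTex)
    static Cubemap FromNativeTexture(int faceSize, IntPtr nativeTexPtr)
    {
        return Cubemap.CreateExternalTexture(
            faceSize, TextureFormat.RGBA32, false, nativeTexPtr);
    }
}
```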

The second issue came from a custom outline shader that I used before the upgrade. Somehow the upgrade added dark blue tints to all the outline materials. Since I don't know much about writing custom shaders from scratch, I had to look for another. I found three different scripts online, tested each of them, and found one that was a lot easier to implement than the old one.

In the evening, I went to an Indie Game Developer social meetup, which was held monthly. I met several other VR content creators there.

On Wednesday, I created the different levels, played around with different UI elements, and changed the design several times to ensure user comfort. Things were good. In the evening, I went to a local Hololens meetup. I wasn’t planning to go, but then I heard they would demonstrate Microsoft’s new Acer VR HMD there. I’m glad I went, since the talks were really interesting. The first speaker, Sean Ong, shared his experience creating virtual apartment tours for a company in Dubai. The second speaker, Thomas Wester, shared his experience capturing dance motion into VR and AR experiences in his team’s project, “Heroes – a Duet in Mixed Reality”, which was created as a 2D film, a 360 video (available on GearVR), and for the Hololens. I really liked this talk, since it was the first time I had seen the Hololens used for something other than business purposes. You can view these talks here.

My impression of the Acer HMD: it was very light, even lighter than a GearVR + phone. However, the demo used a PC with an older graphics card, so graphics-wise it was similar to PSVR. In terms of room scale, it felt like the Oculus Rift: a seated experience. We could move a little, but there was not much room to walk around, unlike the HTC Vive. They didn’t have the motion controllers yet, so we had to use an Xbox controller instead. I personally dislike VR experiences that use a gamepad. For the moment I was unimpressed with the HMD; I will have to try the device again with a better PC and the hand controllers in the future.

On Thursday, I did some code and asset cleanup. I then noticed that the new outline shader was acting strange: it would work fine for a bit, then not at all. I wasn’t able to dig into it further since I had to go to the Seattle Unity user group meetup to learn about a real-time geospatial plugin called Mapbox. Having to drop what I was doing while it was still unresolved actually gave me a lot of anxiety. Those who are friends with me on Facebook probably saw me complaining about it. Thankfully the meetup talk was really interesting, and I also got to meet some old friends I hadn’t seen in a while. My anxiety was reduced a lot afterward.

The next day, instead of jumping straight into the issue, I started the day with some me-time: a long hot shower, a delicious breakfast (I usually forget to eat breakfast), and organizing the house a bit so it didn’t look like a typhoon had just passed through. When working solo, it is really easy to get burned out, and I suspect the anxiety might be the first sign. So after I felt more relaxed, I opened up the project and tried to figure out what was causing the shader issue. After several tests, I noticed that during runtime there was an extra camera under OVRCameraRig. In my project, I had modified the CenterEyeAnchor game object by adding children for UI and gaze interaction, along with their supporting scripts.

During runtime, CenterEyeAnchor is normally placed under LeftEyeAnchor automatically. In my testing, my custom CenterEyeAnchor was placed under LeftEyeAnchor as usual, but then an extra CenterEyeAnchor sub-object was created with a camera attached to it, and that camera prevented the new shader from displaying correctly.

After I modified the OVRCameraRig.cs script to disable this extra camera, everything worked again, hooray. Let’s hope this won’t create new wonky behavior in the future.
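The gist of my fix, sketched from memory (the guard component below is my own invention; only OVRCameraRig and CenterEyeAnchor come from the Oculus Utilities):

```csharp
using UnityEngine;

// Attach to the OVRCameraRig: after the rig rebuilds its anchors each frame,
// disable any camera in the hierarchy that isn't the one we want rendering.
public class ExtraCameraGuard : MonoBehaviour
{
    public Camera centerEyeCamera; // assign the intended CenterEyeAnchor camera

    void LateUpdate()
    {
        foreach (Camera cam in GetComponentsInChildren<Camera>(true))
        {
            if (cam != centerEyeCamera && cam.enabled)
                cam.enabled = false; // the auto-created duplicate goes dark
        }
    }
}
```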

In the evening, I went to an event hosted by TPCast, a device that transforms the HTC Vive into a wireless HMD. I was skeptical about it beforehand, thinking it would have latency issues and that the battery pack would be uncomfortable. However, I had a good experience with it. It didn’t have any noticeable latency, and I forgot all about the battery pack. It felt a lot more comfortable than having the long, heavy cable, that’s for sure.

[Now, for a bit of rambling.]
This week I met a couple of people who made me think more about the VR industry and where we are heading. Both encounters happened before and during the TPCast event.

We arrived about 45 minutes early for TPCast. The lobby had some nice sofas, so I let my husband try the latest build of EnlivenVR while killing some time. We also met another 2017 OLP member and chatted for a bit. Then an older gentleman approached us. He looked somewhat disturbed. He had seen my husband using the GearVR a while back, and wanted to share his concern. The gentleman was a senior composer working in the movie industry, and some people had told him to look into VR. This was his first time going to any VR event, and he noticed all the demos were of ‘violent’ games. Then, when he talked to one of the organizers, he was told that that was what VR was all about, and that really upset him. I was really surprised, since I have seen many interesting projects, games and non-games alike. I told him there was a lot more than violence in VR, and that I was in the middle of making a relaxation VR experience. He seemed happy to hear that. However, he seemed uninterested in looking further into VR after this first experience, which made me pretty sad. The gentleman left before the event even started.

As we made our way to the event room, one of the organizers was curious about what had happened. We explained it to him, and he seemed shocked as well. It turned out they were showing Space Pirate Trainer and a bow-and-arrow game, which most gamers would consider non-violent. I was expecting to see something like Raw Data, Arizona Sunshine, or other zombie survival shooters instead. The organizers then tried to catch up with the gentleman to talk to him, but he was long gone.

At the event, there were a total of 6 ladies: four attendees and two organizers. I chatted with them. Like at most VR events, the attendees were mostly men. One of the ladies, just like me, has been working in tech for a while, so she was fine. But another, who was there with her mom, was very new to VR. Just like the gentleman from before, this was her first VR event. She was a business school student, curious about this new technology but feeling really intimidated as a minority. We talked for a while, and I shared my experience: although it is very common to be a minority at these kinds of tech events, the ladies in Seattle and all over the world are trying to make the VR/AR industry more inclusive for women and other minorities. She seemed relieved to hear that, and interested in coming to more local meetups.

When it was my turn to try the device, I asked the organizers if I could try a different app, a non-game one. They had Tilt Brush installed, so I went with that. I had a good time being wireless, able to move around and draw from different angles without needing to teleport around or worry about stepping on a tangled cable. When my turn was over, the two female staff members came to me and shared that they had never tried Tilt Brush before, and now they were really interested in what else VR is capable of aside from gaming.

At the end of the day, I was left pondering. As a gamer, am I desensitized to violent content? I don’t feel disturbed by shooting zombies or slashing monsters. To me, they’re no different from the fruits we slice in Fruit Ninja, just objects to interact with. But to those who are not familiar with games, do we look like violent people for enjoying them? As a content creator, what considerations should I make when creating content, so that people like that gentleman don’t stay away from VR?
[End of rambling]

Back to the project talk. To do list for next week:
– Create and test saving/loading custom data.
– Use custom data to drive object generation in room and garden level.
– New menu design that won’t hurt my neck too much.

See you next week! And don’t forget to take a break and treat yourself once in a while. It really helps.

Week 3 Update

I did a few more view-range tests early in the week to continue gathering data. Then, based on this data, I created a white-box level design of the bedroom, using Unity’s basic 3D geometry.

In the beginning, I used real-world measurements for the bed and the room size, but somehow it felt really small. Afterward I tweaked the dimensions to make the room more comfortable. Once I found something that felt right, I added more objects, applied flat materials, and set up early lighting.

Plan for next week:
– add gaze interaction modules
– create more white box for other levels
– prototype and test game play flow
– look into save setting system
– if time permits, start working on 3D assets

See you next week!


Week 2 Update

Earlier this week, I managed to figure out several scripts for gaze-click interactions, such as changing scenes, hiding/showing objects, and turning animation on and off. This is a big victory for me, since this interaction is the main mechanic of my project. Now the fun part begins! Below is a capture from Unity, with a hide/show object on the left, an animated kitten on the right, and a scene-changer cube in the center.
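The core of the gaze-click mechanic is a forward raycast from the camera plus an input check. A simplified sketch of the pattern (the tag and scene name are placeholders; on GearVR, a touchpad tap registers as mouse button 0):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attach to the VR camera. "SceneChanger" and "NextScene" are placeholder names.
public class GazeClicker : MonoBehaviour
{
    public float maxDistance = 10f;

    void Update()
    {
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance) &&
            Input.GetMouseButtonDown(0)) // GearVR touchpad tap
        {
            if (hit.collider.CompareTag("SceneChanger"))
            {
                SceneManager.LoadScene("NextScene");          // change scene
            }
            else
            {
                Animator anim = hit.collider.GetComponent<Animator>();
                if (anim != null)
                    anim.enabled = !anim.enabled;             // toggle animation
                else
                    hit.collider.gameObject.SetActive(false); // hide object
            }
        }
    }
}
```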


Art direction:
In the middle of the week, I spent one whole day collecting reference pictures for art direction. I like the art style of old Disney movies like Pinocchio and Dumbo, as well as Studio Ghibli movies. For the demo submission, I decided to create two different levels:

1. Bedroom, the starting level or lobby, where users can share their likes and dislikes by selecting items around them. I find this level super important, as everyone has different preferences, especially when it comes to dislikes. I want users to be as comfortable as they can be in this experience, and hopefully we can identify things that trigger their discomfort before sending them into the next level.

2. Secret garden, or a hidden spot on a hill. This will be an example of an area where users can spend 5 to 15 minutes relaxing, surrounded by their favorite animal companions. There will be interactive objects as well, to keep users entertained if they decide to be more proactive instead of just passive. These interactive objects will be populated in the level based on the user’s preferences. For example, if the level normally has butterflies but the user is afraid of them, we can disable the butterflies and replace them with birds.
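That swap can be as simple as choosing a prefab at spawn time based on the saved dislikes. A sketch under assumed names (none of these prefab or field names are from the actual project):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class CompanionSpawner : MonoBehaviour
{
    public GameObject butterflyPrefab; // default companion
    public GameObject birdPrefab;      // fallback if butterflies are a trigger
    public Transform[] spawnPoints;

    // dislikes would come from the user's saved preferences.
    public void Populate(HashSet<string> dislikes)
    {
        GameObject prefab =
            dislikes.Contains("butterfly") ? birdPrefab : butterflyPrefab;
        foreach (Transform point in spawnPoints)
            Instantiate(prefab, point.position, point.rotation);
    }
}
```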

In the reference pictures above, I also listed a secret cave / hidden beach, which I would like to implement if I have extra time before demo submission. Right now this level and about 3-4 others are on the implement-later list for the final product.

By the end of the week, I started doing some user experience testing to figure out comfortable viewing ranges. In my design, I want users to have the option to run the experience in three different resting positions:
1. lying flat on a bed or sofa, with a pillow under the head
2. reclining, for example with 2-3 pillows behind the back
3. sitting up straight

For the initial testing, I created an array of interactive blocks in Unity, as shown below. The blocks are about 1x1x1 m, and their locations range from about 2 m behind the user to about 6 m in front of them.

Then we tried to interact with these cubes in the three different resting positions. Over several sessions, we found that position 1 (lying flat) has the smallest range, while position 3 (sitting up straight) has the largest.
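For anyone who wants to reproduce the test rig, the block array can be generated in a few lines instead of placing cubes by hand. A sketch (the exact spacing and heights here are my guesses, not the values I used):

```csharp
using UnityEngine;

public class ViewRangeTestGrid : MonoBehaviour
{
    void Start()
    {
        // Depth runs from ~2 m behind the user (negative z) to ~6 m in front.
        for (float z = -2f; z <= 6f; z += 2f)
        {
            for (float x = -4f; x <= 4f; x += 2f)
            {
                GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
                cube.transform.position = new Vector3(x, 1f, z);
                cube.transform.localScale = Vector3.one; // 1x1x1 m blocks
            }
        }
    }
}
```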


Next week to do list:
1) More testing – based on this week’s data, try smaller objects at various distances.
2) Create white-boxed levels based on the gaze data.
3) Test these levels.
4) Start creating custom art assets for each level
5) Implement audio (temp background music, sound effects) within the test levels
6) Figure out how to capture and record VR test sessions.

That’s all the update I have for this week. Happy fourth, everyone. I know everyone’s working hard, but don’t forget to take a break once in a while. :smile:


Week 1 Progress

As a requirement to qualify for the Oculus Launch Pad scholarship, each participant is required to post a weekly blog on the Oculus forum. Since it’s not visible to non-participants, I will keep a copy of it on this site as well. Here it is for this week:

Hello, I am a 3D environment artist who has also been teaching myself Unity development so I can make my own games and VR/AR experiences. I met so many brilliant and wonderful people at the Oculus Launch Pad (OLP) boot camp, and learned a lot. There were many great ideas, and I’m looking forward to seeing them come to fruition.

About my project:
EnlivenVR (working title) is an interactive relaxation experience for mobile VR platforms, where users can take a break from their busy lives and escape to a virtual safe space, surrounded by their favorite creatures and soothing sounds, while lying on their back in a comfortable place like a bed or a sofa.
Users interact with the environment mainly through gaze selection and head movement. Touchpad tap and the Gear VR controller will be implemented as well to give the audience more input options.
I am hoping that after 10-15 minutes of using the app, users will feel more relaxed, refreshed, and in a better mood to face the rest of their day.

Why this (VR) project:
Gaze-based interaction in mobile VR is something I have been wanting to explore. The first VR system I tried to build for was Google Cardboard, since I only had access to an Android phone at the time. I got myself a cheap paper Cardboard viewer, and for some reason the clicker did not work with my phone. As a result, I had to put my finger through the viewer’s nose opening and tap the phone screen that way. It was really uncomfortable, and I wished I had a superpower to click with only my gaze.

During a local VR meetup, I had a conversation about this gaze-click interaction with a lady who was really interested in the idea. She has spinal muscular atrophy, a genetic disease affecting the part of the nervous system that controls voluntary muscle movement. On bad days she is unable to use her arms and legs. This type of control would let her interact with the virtual world even when her limbs refuse to listen to her brain.

At another VR meetup, there was a conversation about accessibility. One of the problems mentioned was how to accommodate people who are bedridden. We talked about ways to set up different cameras to make people lying on a bed feel like they’re standing up or sitting straight.

As I started thinking of a product while writing the pitch document in preparation for the OLP Bootcamp, these things came back to me. I thought: why don’t we make an experience for users who lie flat on the ground, a bed, or a sofa, without using their hands? Instead of creating complex camera systems, why can’t we just make the interaction happen above the users, where their gaze can reach? This way, I can target both people who need accessibility and those who are just too lazy to move their arms.

Why a relaxation experience? I believe mental health is important. In our current political climate, I feel anxious and depressed more often than usual, and I know a lot of people who have similar issues. As someone who has been through severe depression, I know that sometimes it takes a lot of willpower just to get out of bed in the morning. I am hoping to create a tool for people like me, to help them find that extra energy, that extra push to face the day. I’ve read articles about successful people who also spend some time in their daily lives relaxing and meditating before starting their day.

Progress of week 1:
1) Hardware & software setup
The first thing I did with the Samsung phone was enable developer mode and remove unnecessary bloatware, then install the apps needed for development. I already have the latest Unity and Android/Java SDKs installed on my work machine from previous projects, so I just downloaded the Oculus Utilities for Unity and VR examples.

2) Learn how to build simple scene
Before going crazy building levels and behaviors in Unity, I needed to learn how to deploy a GearVR app to the phone first. So I created a simple scene with a cube and an OVR camera, and built it. I noticed that, unlike a Google Cardboard app, a GearVR app actually tries to detect the hardware before it will load.

3) Playtest and analyze different existing GearVR apps, especially ones with a similar theme
At first I downloaded several free apps and 360 videos, especially the ones mentioned during Launchpad presentations, like Zero Days VR, In the Eyes of the Animal, Face Your Fears, and Strangers with Patrick Watson. Then the Oculus summer sale happened, yay! For experiences, I downloaded Relax VR, Jurassic World: Apatosaurus, Singularity, Land’s End, A Night Sky, and Bait!. I found design elements similar to my project in Relax VR, A Night Sky, and Bait!, which is a good thing.

4) Explore Oculus and Unity toolkits
So far I have spent the largest portion of my time on this. Oculus development is new to me. The Oculus Utilities for Unity had a very small number of example scenes, and the Oculus Sample Framework for Unity project ended up giving tons of errors that prevented me from even building a scene. After 2 days of struggling with the sample framework, I tossed it out the window and downloaded the Unity VR Samples from the Asset Store. This one didn’t give me errors, and its gaze interaction is similar to what I have in mind.

5) Create initial design
To prevent feature creep during development, I created an initial design document, mainly identifying what needs to be in the experience, what can be added during polish time, and what is nice to have for future content. I left out a lot of the details, since this document will most likely change a lot in the next few weeks, and those details will be added as we build and test the prototype.

6) Create deliverable timeline & consider public events for testing
This one goes hand in hand with the initial design document. To plan for public testing of this product, I started listing when local VR meetups usually happen. In September, one week before the Launchpad submission deadline, there will also be PAX Dev and PAX Prime. At PAX, I can probably get some feedback while waiting in line for panels or in the safe space room. The great thing about mobile VR is that I don’t need to lug around a laptop or a beefy machine to showcase my app. All I need are the phone, the viewer, cleaning supplies, and a charger.
Knowing the dates of these events helps in planning deliverables. Based on all this, I pretty much need a working prototype in a month, then I can spend the rest of the time on more testing and polishing.

7) Start prototyping, tackling mechanics first
As a 3D artist, programming will be my greatest challenge. So, instead of concentrating on art, this week I started by tackling mechanics first, to make sure the building blocks work properly. I’m using a lot of art placeholders: cubes and other Unity 3D objects. The pretty stuff will be added later.

8) Setup streaming using Chromecast
People say that mobile VR is an isolated experience, but guess what: GearVR is compatible with Google Chromecast. Last night we went to Best Buy and got one, then tested it right away. I was able to stream my session to a TV. The frame rate was pretty choppy, but this will be great for testing later, to watch user interaction. I will write up my streaming setup in a later post.

Things to consider during development cycle:
1) Keep it simple. Since I will be working on this project solo, it’s important to know my own abilities and not go for crazy ideas. From my experience with hackathons and game jams, if I’m struggling with the same problem for more than 1-2 days, I should find and try another solution.

2) Why VR? This question keeps popping into my mind after going through the boot camp, and it really affects my design decisions.

3) Think of user needs first. Since I’m creating a safe space for people, I want to make sure user comfort is the main priority. For example: no jump scares, a way to exit the app quickly if needed, and a way to remove any triggering or uncomfortable objects from the environment.

4) Test ideas through prototype. Fail fast and fail often.
Sometimes, especially in VR, ideas look good on paper, but once implemented they turn into really bad experiences. Many questions can be answered through a prototype as well. Don’t know what’s considered close or far? Put several cubes at different distances, put on the HMD, and check for yourself.




Setting up for Samsung GearVR

In the past, I have only worked with PC VR/AR devices such as the HTC Vive and Microsoft Hololens. Mobile VR development is totally new to me. I attempted to create a Google Cardboard experience last year during Ludum Dare, but after making too little progress, I dumped the idea.

To prepare for the Oculus Launchpad bootcamp, I went through some tutorials from the Udemy course Make Mobile VR Games in Unity with C# for Google Cardboard one week before the event. I only had access to Google Cardboard and Daydream at the time, but it gave me some idea of how to build and deploy VR apps from Unity onto the phone.

Then yesterday, while fighting a nasty sore throat/cold from the bootcamp trip, I managed to successfully build and deploy a simple test scene to the new Samsung S7 phone and the GearVR, yay.

I work on a Windows 10 system, so here are the steps for my setup:


  • Install the latest Unity. At the moment, that is Unity 5.6.1f1, the latest release version.

Android development software:

  • Install the Java Development Kit (JDK). Most people probably already have Java installed on their machine, but only the JRE (Java Runtime Environment), not the JDK. Make sure to find and install the JDK as well, or else Unity will complain when building to an Android phone. Also, make note of the JDK installation path. You will need it later in Unity.
  • Install the Android Studio Development Bundle or the standalone Android SDK Tools. I went with the standalone SDK tools, since I will be building mostly in Unity. It doesn’t hurt to install Android Studio as well; I just don’t have much free space on my hard drive, so I went with the minimal requirement. If you do get Android Studio, make sure you download the version that includes the SDK, or else there’s no point. To download just the SDK, scroll all the way down the page to see the different software options. Make note of the Android installation path; you will need it later in Unity and when identifying the phone’s device ID. Also, make sure to run the Android SDK Manager and install the corresponding tools and API version.


Oculus development software:

  • Install Oculus Utilities for Unity 5. I downloaded OVR Unity utilities version 1.15.0, since I don’t feel like writing everything from scratch.
  • Optional: download and install the Oculus Sample Framework for Unity 5 Project. Do be careful when importing this package though. Make sure not to overwrite the latest Oculus Utilities files, or else you will get tons of Unity errors, which will prevent you from even running any scene.

We’re almost there! To deploy to our shiny Samsung S7 phone, we need to identify its device ID, then use that ID to create an Oculus signature (osig) file on the Oculus developer site. Here’s what to do:

  • Make sure your phone has developer mode enabled. To do this, go to your phone’s Settings -> System -> About device -> tap on Build Number 7 times. Just keep tapping until it finally says you’re a developer. Then go back to System -> Developer Options and turn it on. I also turn on Stay awake, USB debugging, and Verify apps via USB. Not sure what the last one does, but it sounds safer. Better safe than sushi.. err, sorry.
  • Connect your phone to your working PC through a USB cable.
  • Bring up a command prompt window (hit the search icon on the desktop, type cmd, and hit enter).
  • Go to the Android SDK installation folder. (Use cd.. to go up a folder, cd foldername to go into a folder. My installation is at c:\Program Files (x86)\Android\)
  • Go into the android-sdk\platform-tools\ folder (my full path is c:\Program Files (x86)\Android\android-sdk\platform-tools\)
  • Type ‘adb devices’ (without the ‘ ’). This will give you the device ID.

Most of the prep work is done. Now in Unity, do these steps:

  • Go to Edit -> Preferences -> External Tools
  • Copy and paste the path to Android SDK tool (mine is C:/Program Files (x86)/Android/android-sdk )
  • Copy and paste the path to Java JDK tool (mine is C:\Program Files\Java\jdk1.8.0_92 )


  • Go to File -> Build Settings -> Player Settings
    • Under Other Settings, enable Virtual Reality Supported. Use Oculus Virtual Reality SDK
  • If you’re using spatial audio, go to Edit -> Project Settings -> Audio. On Spatializer Plugin, pick Oculus Spatializer
  • Remember the signature file you created? Place a copy of it under Project/Assets/Plugins/Android/assets/


Now create your test scene

  • Remove the current Main Camera, then drag and drop the OVRCameraRig prefab from the OVR -> Prefabs folder (part of the Oculus Utilities package; import it if you haven’t done so).
  • Create a new 3D Object -> Cube and place it 2-3 units in front of the camera.
  • Save this scene as ‘test’.
  • Go to File -> Build Settings and hit the Add Open Scenes button to add the scene to the list.
  • Still in Build Settings, under Platform, select Android and hit Switch Platform. Let Unity do its thing for a minute or so.
  • Once Unity finishes, and your phone is still connected, hit Build And Run. You will be prompted to name and save the apk file. I usually create a new folder called Build and save the apk there.
  • Once Unity finishes building, the phone will load the app right away. Just place it in the GearVR and give it a test.

Hopefully this guide helps. See you next time.

Oculus Launchpad Bootcamp

As the first phase of this year’s Launchpad program, Oculus held a two-day bootcamp at the Facebook campus in Menlo Park, CA on June 10 and 11. One hundred brilliant participants of various backgrounds, ethnicities, and skin colors, all with a common passion for Virtual Reality, gathered in this space. It was a lovely sight. I would say more than 50% of the participants were women of color, and the rest were men of color. I didn’t meet any transgender people, but then again, I did not have the chance to talk to all 100 participants.

On the first day, the event started with breakfast and mingling among participants, followed by a short ice-breaking session to get to know each other. Robin Hunicke from Funomena came next with her inspirational talk about failing. The following talk was from a Facebook lawyer, about law and legal matters. From then on, the participants were divided into two groups: 360 filmmaking and games/experiences. I was interested in VR experiences, so I stayed to listen to Bernie Yee from Oculus share his knowledge and experience as a producer in the video game and VR industries. The session consisted of lectures as well as group exercises, with a lunch break in between. After this track, all participants gathered in the main room again to listen to a Storytelling presentation. The last talk of the day was an open Q&A session hosted by Jason Rubin. A lot of people asked good, tough questions, ranging from upcoming tech and challenges in VR to how to overcome unconscious bias.

At the end of day one, Oculus held a social mixer in the rooftop garden of a Facebook building. If you’ve ever watched The Kingdom of Dreams and Madness, the documentary about Studio Ghibli that shows the garden on the studio’s rooftop, it was similar, except Facebook’s rooftop garden is a lot larger. Up there they set up some VR demo stations featuring Google Earth and Facebook Spaces, where participants could try them with the Rift & Touch controllers or on GearVR. I had tried Google Earth before, and I’m really not interested in social VR at this time since personal space is still an issue, so I spent my time grabbing snacks for dinner, having a drink from the open wine bar, and getting to know the others.

The second day started with breakfast and mingling, followed by the first talk from last year’s Launchpad participants, who shared their experience of the program and the projects they are working on. It gave us an idea of what to look forward to in the next few weeks as we go through this program. Chris Pruett, head of Oculus Mobile Dev Engineering, gave the second talk, about VR user comfort issues, design, and motion controls, based on his experiences.

Then we had a Unity tutorial session, which divided the attendees into two groups based on their Unity knowledge: beginner and advanced. I consider myself intermediate, so I was not sure where to go. The screen for the beginner session showed the Viking tutorial, which had been taught at a couple of Unity Roadshow workshops in Seattle. However, I noticed that Sarah Stumbo was giving the presentation, so I decided to stick around, since I like her teaching style, where she goes over and explains the scripts used in the project. One of the attendees had an issue where her laptop was not powerful enough to run Unity, and since I have done this tutorial in the past, I lent her my laptop. I was also able to spend time helping an old friend get used to Unity while catching up on her stories, and to tinker with the GearVR headset, learning how to set it up with the help of the others at my table.

After lunch and a tutorial session on 360 video, everyone gathered to listen to a talk on VR for Good, another Oculus program similar to Launchpad but focused on creating products that promote empathy and good. The following talk, about creating a compelling pitch, was by Isabel Tewes of Developer Strategy and Dorian Dargon, a producer. Dorian was actually one of the Launchpad participants from last year who then got hired by Facebook. The session included a presentation and a short pitching exercise with a partner. I had only pitched in my Game Design course a few years ago, so this workshop was really useful. I learned a lot from my partner’s feedback, especially about giving enough information and explaining why I am the right person for this project. Promoting myself is still a hard thing for me to do.

The last talk was the journey of one of last year’s Launchpad participants, Jewel Lim. She shared her life story and what VR and the program have meant for her. Her talk was really touching and inspirational.

Overall, the bootcamp was a great inspirational and motivational experience. I got to create new bonds with other creative people, as well as renew bonds with people I have met in the past. To the mentors and speakers who shared their experiences: thank you. It helps to see the process and the human side of things, instead of just an end product. Also, a big thank you to the organizers of this event, like Ebony. You gave us the opportunity to meet and learn from so many people. Oh, and for the free hardware too!

In the future, I will post my notes with more details from these talks. Stay tuned!

Hardware from the bootcamp
Pitching session


Hello everyone!

This weekend I will be visiting Facebook HQ to attend the two-day 2017 Oculus Launchpad boot camp. I’m so excited to meet the other participants and mentors. In the weeks following, I will share the progress of this Virtual Reality experience on this blog.

See you soon!