Week 6 Update

I started this week by tinkering with the pet swapping module: I created three anchor points in space tied to the user’s position (sitting, reclining or lying down flat), placed planes to represent the pets, and wrote scripts to instantiate and destroy objects at runtime based on user preference. I was a bit worried about performance at this point, but since there will be only one pet at a time in the level, the instantiate/destroy approach shouldn’t affect performance much.
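The swapping itself is simple. Below is a rough sketch of how such a module can look; the class and field names (PetSwapper, petPrefabs, anchorPoints) are illustrative assumptions, not the actual scripts:

```csharp
using UnityEngine;

// Hypothetical sketch of the pet swapping module: only one pet exists at a
// time, spawned at the anchor point that matches the user's resting position.
public class PetSwapper : MonoBehaviour
{
    public GameObject[] petPrefabs;   // one prefab per pet choice
    public Transform[] anchorPoints;  // 0 = sitting, 1 = reclining, 2 = lying flat

    private GameObject currentPet;

    // Called when the user picks a pet from the menu.
    public void SwapPet(int petIndex, int positionIndex)
    {
        // Destroy the previous pet so only one is ever in the level.
        if (currentPet != null)
            Destroy(currentPet);

        Transform anchor = anchorPoints[positionIndex];
        currentPet = Instantiate(petPrefabs[petIndex], anchor.position, anchor.rotation);
    }
}
```

With a single pet alive at a time, the garbage from Destroy is negligible, which is why the simple instantiate/destroy approach is fine here.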

Afterward, I modified the menu system a bit. For easy access, I want the menu to be available at all times, probably floating somewhere the user can gaze at. However, I don’t want it to obstruct the user’s view of the world. After pondering for a while, I used the menu system from my favorite mobile game, Neko Atsume, as a reference. I like how its menu is always available at the top left corner, and it is really easy to use.

Here is a sample of the pet swapping module and the new menu system.

For the position menu, I decided to line the buttons up vertically instead of horizontally like before. This way, it is much more comfortable for a user in any resting position to pick the menu button matching their current position.

As I tested the opening scene in space, I realized this level could be pretty relaxing as well. So instead of just having the user float in space, I started creating a mock-up space pod. For the art direction of this project, I try to avoid square or sharp-edged geometry in favor of curvier shapes, so I made the pod a transparent sphere.

For the pod seat, I looked up some reference pictures of sci-fi captain seats and found that most of the designs look really uncomfortable, like the ones from Star Trek. Then I turned my search to pool floats, which look really comfortable. However, having a pool float inside a bubble felt a bit weird, so I added some structure to the pod, like curvy metal beams. The beams help create the feeling of being inside a personal space pod, and interestingly enough, I felt safer inside this room than when I was just floating in space.

Below is a video of the space pod experience. I haven’t had the time to implement the space as its own relaxation level.

I had my husband test this build last night. Unfortunately, due to time constraints, I wasn’t able to create custom materials for the mock-up space pod, so I used an existing glass material from my previous project, a survival horror game, and the glass was cracked. As my husband tried the demo, he felt very uncomfortable in the space area, since the cracked glass made him nervous.

For next week, I most likely won’t get much done in terms of development, since I will be at DevGAMM, Casual Connect Seattle and a 48-hour game jam hosted by Seattle Indies. I will definitely bring my latest build and GearVR, and demo it to whoever wants to give it a try. We’ll see how it goes.

Here is the video of the latest build. See you next week!


Week 5 Update

My anxiety problem continued from last week, and I figured it came from a really dense schedule. I took a few steps back, looked at my current schedule and milestones, and started rearranging priorities to be more reasonable. It is great to have goals, but it’s a problem if we break down before reaching them. So at the beginning of the week, I dedicated a couple of days away from Unity development and concentrated on paper design. I also spent those days relaxing with the latest Enliven VR build, which actually worked pretty well with the few features available right now. I only ran into trouble when I started analyzing things while using the application instead of simply relaxing, which triggered the anxiety. I had to keep reminding myself to take off the developer hat while I was in the headset.

In the middle of the week, I experimented with menu range and rotation based on user position. The short video below shows the mock-up I built, where the user starts in the reclined position, then selects the sitting and lying-down positions. The five buttons per row represent the user’s comfortable horizontal viewing range. I could actually go up to seven buttons, but the extra two might cause neck discomfort over time. In the actual experience, this menu will be hidden until the user decides to activate it.
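For the curious, fanning the buttons out across a comfort arc is only a few lines. A rough sketch, where the button count, the 15-degree spacing and the 2 m radius are my illustrative assumptions, not measured values:

```csharp
using UnityEngine;

// Sketch: spread menu buttons across the user's comfortable horizontal
// viewing arc, centered on where the head is facing.
public class GazeMenuLayout : MonoBehaviour
{
    public GameObject buttonPrefab;
    public int buttonCount = 5;        // 7 fits the range, but strains the neck
    public float degreesPerButton = 15f;
    public float radius = 2f;          // distance from the user's head

    public void Layout(Transform head)
    {
        float start = -(buttonCount - 1) * degreesPerButton / 2f;
        for (int i = 0; i < buttonCount; i++)
        {
            // Rotate the forward direction around the vertical axis.
            Quaternion rot = Quaternion.Euler(0f, start + i * degreesPerButton, 0f);
            Vector3 pos = head.position + rot * (head.forward * radius);
            // Face each button back toward the user.
            Instantiate(buttonPrefab, pos, Quaternion.LookRotation(pos - head.position));
        }
    }
}
```

Changing the arc per resting position is then just a matter of feeding different parameter values.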

The user’s position is then carried over as persistent data from one scene to another, using a level manager with DontDestroyOnLoad() applied to it. Below is a mock-up video of the latest build of Enliven VR. The music is a temporary placeholder, composed by Niki Wonoto, a talented friend from Indonesia. He’s given me permission to use his pieces in my free projects; for Enliven VR, I’m requesting a couple of new pieces from him.
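For anyone curious about the persistence trick, Unity’s DontDestroyOnLoad keeps an object alive across scene loads. A minimal sketch of such a level manager (names are illustrative, not my actual script):

```csharp
using UnityEngine;

// Sketch of a level manager that survives scene loads and carries
// the user's resting position from one scene to the next.
public class LevelManager : MonoBehaviour
{
    public static LevelManager Instance { get; private set; }

    // 0 = sitting, 1 = reclining, 2 = lying flat
    public int UserPosition = 1;

    void Awake()
    {
        // Keep exactly one instance alive across scene changes.
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject);
            return;
        }
        Instance = this;
        DontDestroyOnLoad(gameObject);
    }
}
```

Any scene can then read LevelManager.Instance.UserPosition to place menus and pets correctly.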

Plan for next week:
– modify the persistent data to drive object instancing based on user options
– implement menu hide/show
– start creating art assets

See you next week!

Week 4 Update

This week I started the production phase of the project by creating a series of levels to mimic the flow of the experience, from the main menu up to lobby selection. I spent some time watching a Unity live stream tutorial on how to create, save and load persistent data, since I will need to implement something similar to record user preferences, like their favorite animal, things that make them uncomfortable, etc.
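One common approach from those tutorials is to wrap the preferences in a serializable class and persist it as JSON with JsonUtility. A sketch under assumed field names (favoriteAnimal, dislikes), not my final schema:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical user preference data, saved to disk as JSON.
[System.Serializable]
public class UserPrefsData
{
    public string favoriteAnimal = "cat";
    public string[] dislikes = new string[0];
}

public static class PrefsStore
{
    static string PathFor() =>
        Path.Combine(Application.persistentDataPath, "enliven_prefs.json");

    public static void Save(UserPrefsData data) =>
        File.WriteAllText(PathFor(), JsonUtility.ToJson(data));

    public static UserPrefsData Load()
    {
        string path = PathFor();
        if (!File.Exists(path))
            return new UserPrefsData(); // defaults on first run
        return JsonUtility.FromJson<UserPrefsData>(File.ReadAllText(path));
    }
}
```

Application.persistentDataPath survives app updates, which makes it a reasonable home for this kind of file.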

During this time, there were some level design changes as a result of rapid prototyping and user testing. The most constrained setting, the lying-down position, had a very small viewing range. During several tests in the previous room level, I noticed that it was hard to access the menu placed on the wall to the side: I really had to strain my neck to select the menu items, and it was really uncomfortable. That needed to change. I also realized the lobby level could be used as a relaxation place as well, and I started designing around adding downloadable content (DLC) to the project.

This week was super busy in terms of professional meetups in the evenings, so I had to cut some development time.

On Tuesday, Unity 2017 was released. After backing up the project, I decided to upgrade. To my surprise, I had almost no issues. Usually when I upgrade to a newer Unity version, a lot of scripts and sometimes prefabs break. I was expecting tons of errors, but so far there have been only a couple of issues.

The first one was in the OVROverlay.cs script from Oculus Utilities 1.16.0 beta. The method Cubemap.CreateExternalTexture takes four arguments, but somehow the script passed six. Once I fixed that particular line, the error disappeared.

The second issue came from a custom outline shader that I had used before the upgrade. Somehow, after upgrading, it added a dark blue tint to all the outline materials. Since I don’t know much about writing custom shaders from scratch, I had to look for a replacement. I found three different scripts online, tested each of them, and found one that was a lot easier to implement than the old one.

In the evening, I went to an Indie Game Developer social meetup, which was held monthly. I met several other VR content creators there.

On Wednesday, I created the different levels, played around with different UI elements and changed the design several times to ensure user comfort. Things were good. In the evening, I went to a local HoloLens meetup. I wasn’t planning to go, but then I heard they would demonstrate Microsoft’s new Acer VR HMD there. I’m glad I went, since the talks were really interesting. The first speaker, Sean Ong, shared his experience creating a virtual apartment tour for a company in Dubai. The second speaker, Thomas Wester, shared his experience capturing dance motion for VR and AR in his team’s project, “Heroes – a Duet in Mixed Reality”, which was created as a 2D film, a 360 video (available on GearVR) and for the HoloLens. I really liked this talk, since it was the first time I had seen the HoloLens used for something other than business purposes. You can view these talks here.

My impression of the Acer HMD: it was very light, even lighter than a GearVR + phone. However, the demo used a PC with an older graphics card, so graphics-wise it was similar to PSVR. In terms of room-scale, it felt like the Oculus Rift, a seated experience; we could move a little, but there was not much room to walk around, unlike the HTC Vive. They didn’t have the VR controllers yet, so we had to use an Xbox controller instead, and I personally dislike VR experiences that use a game controller. For the moment I was unimpressed with the HMD; I will have to try the device again in the future with a better PC and the hand controllers.

On Thursday, I did some code and asset cleanup. I then noticed that the new outline shader was acting strange: it would work fine for a bit, then stop working entirely. I wasn’t able to dig into it further, since I had to go to the Seattle Unity user group meetup, where I learned about a real-time geospatial plugin called Mapbox. Having to drop what I was doing while it was still unresolved actually gave me a lot of anxiety; those who are friends with me on Facebook probably saw me complaining about it. Thankfully the meetup talk was really interesting, and I also got to meet some old friends I hadn’t seen in a while. My anxiety was reduced a lot afterward.

The next day, instead of jumping straight into the issue, I started the day with some me-time: taking extra time in a hot shower, making a delicious breakfast (usually I forget to eat breakfast) and organizing the house a bit so it didn’t look like a typhoon had just passed through. When working solo, it is really easy to get burned out, and I suspect the anxiety might be the first sign. Once I felt more relaxed, I opened the project and tried to figure out what was causing the shader issue. After several tests, I noticed that during runtime there was an extra camera under OVRCameraRig. In my project, I had modified the CenterEyeAnchor game object by adding children for UI and gaze interaction, along with their supporting scripts.

During runtime, CenterEyeAnchor is normally placed under LeftEyeAnchor automatically. In my tests, my custom CenterEyeAnchor was placed under LeftEyeAnchor as usual, but then an extra CenterEyeAnchor sub-object was created with a camera attached to it, and that camera prevented the new shader from displaying correctly.

After I modified the OVRCameraRig.cs script to disable this extra camera, everything worked again, hooray. Let’s hope this won’t create new wonky behavior in the future.

In the evening, I went to an event hosted by TPCast, a device that transforms the HTC Vive into a wireless HMD. I was skeptical about it beforehand, thinking it would have latency issues and that the battery pack would be uncomfortable. However, I had a good experience with it: I didn’t notice any latency, and I forgot all about the battery pack. It felt a lot more comfortable than the long, heavy cable, that’s for sure.

[Now, for a bit of rambling.]
This week I met a couple of people who made me think more about the VR industry and where we are heading. Both encounters happened before and during the TPCast event.

We arrived about 45 minutes early for the TPCast event. The lobby had some nice sofas, so I let my husband try the latest build of EnlivenVR while we killed some time. We also met another 2017 OLP member and chatted for a bit. Then an older gentleman approached us, looking somewhat disturbed. He had seen my husband using the GearVR a while back and wanted to share his concern. The gentleman was a senior composer working in the movie industry, and some people had told him to look into VR. This was his first time at any VR event, and he noticed all the demos were of ‘violent’ games. When he talked to one of the organizers, he was told that that was what VR was all about, and that really upset him. I was really surprised, since I have seen many interesting projects, both games and non-games. I told him there was a lot more to VR than violence, and that I was in the middle of making a relaxation VR experience. He seemed happy to hear that. However, he seemed uninterested in looking further into VR after this first experience, which made me pretty sad. The gentleman left before the event even started.

As we made our way to the event room, one of the organizers was curious about what had happened. We explained, and he seemed shocked as well. It turned out they were showing Space Pirate Trainer and a bow-and-arrow game, which most gamers would consider non-violent. I was expecting to see something like Raw Data, Arizona Sunshine or other zombie survival shooters instead. The organizers then tried to catch up with the gentleman to talk to him, but he was long gone.

At the event, there were a total of six women: four attendees and two organizers. I chatted with them. Like at most VR events, the attendees were mostly men. One of the ladies, just like me, has been working in tech for a while, so she was fine. But another, who was there with her mom, was very new to VR. Just like the gentleman from before, this was her first VR event. She was a business school student, curious about this new technology but feeling really intimidated as a minority. We talked for a while, and I shared my experience: although it is very common to be a minority at these kinds of tech events, women in Seattle and all over the world are trying to make the VR/AR industry more inclusive of women and other minorities. She seemed relieved to hear that, and interested in coming to more local meetups.

When it was my turn to try the device, I asked the organizers if I could try a different, non-game app. They had Tilt Brush installed, so I went with that. I had a great time being wireless, able to move around and draw from different angles without needing to teleport or worry about stepping on a tangled cable. When my turn was over, the two female staff members came to me and shared that they had never tried Tilt Brush before, and that now they were really interested in what else VR is capable of aside from gaming.

At the end of the day, I was left pondering. As a gamer, am I desensitized to violent content? I don’t feel disturbed shooting zombies or slashing monsters; to me, they’re no different from the fruits we slice in Fruit Ninja, just objects to interact with. But to those who are unfamiliar, do we look like violent people for enjoying these kinds of games? As a content creator, what should I consider when creating content, so that people like that gentleman don’t stay away from VR?
[End of rambling]

Back to the project talk. To do list for next week:
– Create and test saving/loading custom data.
– Use custom data to drive object generation in room and garden level.
– New menu design that won’t hurt my neck too much.

See you next week! And don’t forget to take a break and treat yourself once in a while. It really helps.

Week 3 Update

I did a few more view-range tests early in the week to continue gathering data. Based on that data, I created a white-box level design of the bedroom using Unity’s basic 3D geometry.

In the beginning, I used real-world measurements for the bed and the room size, but somehow the room felt really small, so I tweaked the dimensions to make it more comfortable. Once I found something that felt right, I added more objects, applied flat materials and set up early lighting.

Plan for next week:
– add gaze interaction modules
– create more white box for other levels
– prototype and test game play flow
– look into save setting system
– if time permits, start working on 3D assets

See you next week!


Week 2 Update

Earlier this week, I managed to figure out several scripts for gaze-click interactions, such as changing scenes, hiding/showing objects and toggling animations on and off. This is a big victory for me, since this interaction is the main mechanic of my project. Now the fun part begins! Below is a capture from Unity, with a hide/show object on the left, a kitten with a toggleable animation on the right, and a scene-changer cube in the center.
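As a rough illustration of the gaze-click pattern (not my exact scripts): a raycast from the gaze camera, plus a small handler on each target. The class names, the 10 m ray length and the "Fire1" button are my assumptions:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: cast a ray from the gaze camera each frame and, on click,
// tell whatever gaze target we hit to react.
public class GazeCaster : MonoBehaviour
{
    public Camera gazeCamera; // e.g. the camera on CenterEyeAnchor

    void Update()
    {
        var ray = new Ray(gazeCamera.transform.position, gazeCamera.transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f) && Input.GetButtonDown("Fire1"))
        {
            var target = hit.collider.GetComponent<GazeTarget>();
            if (target != null)
                target.Activate();
        }
    }
}

// One handler per interactive object; the action is chosen in the Inspector.
public class GazeTarget : MonoBehaviour
{
    public enum Action { ToggleVisibility, ToggleAnimation, ChangeScene }
    public Action action;
    public string sceneName; // used only for ChangeScene

    public void Activate()
    {
        switch (action)
        {
            case Action.ToggleVisibility:
                var rend = GetComponent<Renderer>();
                rend.enabled = !rend.enabled; // hide/show the object
                break;
            case Action.ToggleAnimation:
                var anim = GetComponent<Animator>();
                anim.enabled = !anim.enabled; // pause/resume the animation
                break;
            case Action.ChangeScene:
                SceneManager.LoadScene(sceneName);
                break;
        }
    }
}
```

Splitting the caster from the target keeps new interactions cheap: each new object just gets a GazeTarget and a collider.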


Art direction:
In the middle of the week, I spent one whole day collecting reference pictures for art direction. I like the art style of old Disney movies like Pinocchio and Dumbo, as well as Studio Ghibli films. For the demo submission, I decided to create two different levels:

1. Bedroom, the starting level or lobby, where users can share their likes and dislikes by selecting items around them. I find this level to be super important, as everyone has different preferences, especially when it comes to dislikes. I want users to be as comfortable as they can in this experience, and hopefully we can identify things that trigger their discomfort before sending them into the next level.

2. Secret garden or hidden spot on a hill. This will be an example of an area where the user can spend 5 to 15 minutes relaxing, surrounded by their favorite animal companions. There will be interactive objects as well, to keep users entertained if they decide to be more proactive rather than passive. These interactive objects will be populated in the level based on the user’s preferences. For example, if we normally have butterflies in this level but the user is afraid of them, we can disable the butterflies and replace them with birds.
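The butterfly-to-bird substitution can be driven directly by the saved dislikes. A hedged sketch, where CreatureOption and all the field names are my assumptions rather than the real data model:

```csharp
using System.Linq;
using UnityEngine;

// Sketch: populate the garden, swapping out any creature the user dislikes.
public class GardenPopulator : MonoBehaviour
{
    [System.Serializable]
    public class CreatureOption
    {
        public string name;               // e.g. "butterfly"
        public GameObject prefab;
        public GameObject fallbackPrefab; // e.g. birds, used when disliked
    }

    public CreatureOption[] options;
    public string[] userDislikes; // loaded from the saved preferences

    void Start()
    {
        foreach (var option in options)
        {
            // Spawn the fallback creature instead of a disliked one.
            GameObject chosen = userDislikes.Contains(option.name)
                ? option.fallbackPrefab
                : option.prefab;
            Instantiate(chosen, transform.position, Quaternion.identity);
        }
    }
}
```

Keeping the substitution data-driven means new dislikes only require editing the options list, not the code.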

In the reference pictures above, I also listed a secret cave / hidden beach, which I would like to implement if I have extra time before the demo submission. Right now this level and about 3-4 others are on the implement-later list for the final product.

By the end of the week, I started doing user experience testing to figure out comfortable viewing ranges. In my design, I want users to have the option to run the experience in three different resting positions:
1. lying flat on a bed or sofa, with a pillow under the head
2. reclining, for example with 2-3 pillows behind the back
3. sitting up straight

For the initial testing, I created an array of interactive blocks in Unity, as shown below. The blocks are about 1x1x1 m, and their locations range from about 2 m behind the user to about 6 m in front of them.
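Generating such a block array takes only a few lines. A sketch, with the 30-degree steps and the specific distances being my assumptions based on the description above:

```csharp
using UnityEngine;

// Sketch: spawn a ring of 1 m test cubes around the user, covering
// roughly 2 m behind to 6 m in front, for gaze-range testing.
public class GazeRangeTestGrid : MonoBehaviour
{
    public GameObject blockPrefab; // a 1x1x1 m interactive cube

    void Start()
    {
        // Place blocks every 30 degrees around the user, at a few distances.
        for (int angle = 0; angle < 360; angle += 30)
        {
            foreach (float distance in new[] { 2f, 4f, 6f })
            {
                Vector3 dir = Quaternion.Euler(0f, angle, 0f) * Vector3.forward;
                Instantiate(blockPrefab, transform.position + dir * distance,
                            Quaternion.identity);
            }
        }
    }
}
```

Attaching this script to an object at the user’s head position makes the ring re-usable for each resting-position session.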

Then we tried to interact with these cubes in each of the three resting positions. Over several sessions, we found that position 1 (lying down flat) has the smallest range, while position 3 (sitting up straight) has the largest.


Next week to do list:
1) More testing – based on this week’s data, try smaller objects at various distances.
2) Create white-boxed levels based on the gaze data.
3) Test these levels.
4) Start creating custom art assets for each level
5) Implement audio (temp background music, sound effects) within the test levels
6) Figure out how to capture and record VR test sessions.

That’s all the update I have for this week. Happy Fourth, everyone. I know everyone’s working hard, but don’t forget to take a break once in a while. :smile: