Nicole Kennedy Design

Ideation, Creation & Everything In Between

Unity & Adventure Creator Experimentation

Today has been a pretty productive day so far. This morning we worked to get a few things out of the way and off the doom list so that we can make some decent tracks with this project.

Greg had been talking to us last week and mentioned that Nigel McAlpine at the BBC may be in possession of an Oculus Rift DK2, the same as the one we have made use of already. It was recommended that we get in contact with Nigel and request the use of his Oculus so that we can further troubleshoot the issues we are having with the tracker not functioning. We will then be able to figure out whether it is our hardware setup or our software setup that is the problem: if we have another DK2 to swap components with, we can pinpoint where the fault lies and know what we need.

This morning Fiona was able to show me how she fixed the current sound issue we have been having, a constant buzz running throughout the sound clips.

After a discussion with her flatmate Chris, she showed me how she used Adobe Audition to create a sound profile of the buzz, which then allowed her to remove the buzz from the entire audio clip.

Today I wanted to experiment in Unity, as there were quite a few things that needed to be figured out involving Adventure Creator: not exactly game logic but user experience logic, so to speak, so that navigation can be set up to let the participant choose whether they want to play the interview or visit the explorable gallery.

[Screenshot: ice_screenshot_20160314-122949.png]

In order for this to be achieved I spent the majority of the day problem solving in Unity. First of all, the Virtual Reality Supported option within Unity always needs to be turned on so that the game screen can be seen in the headset, otherwise we just get a black screen. We do not want the user navigating from the computer monitor to the Oculus; we want them to make that choice within the Oculus so that the experience is immersive right from the beginning.
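
In our project this option is simply ticked in Player Settings, but for reference it can also be checked from code. This is a minimal sketch, assuming a Unity 5.x project where Virtual Reality Supported has already been enabled in Player Settings; VRSettings.enabled then controls whether the game view is rendered to the headset at runtime:

```csharp
using UnityEngine;
using UnityEngine.VR;

// Hypothetical helper: makes sure VR rendering is switched on when the scene loads.
// Assumes Virtual Reality Supported is ticked in Player Settings; without that
// (and this runtime flag) we only see a black screen in the Rift.
public class EnsureVREnabled : MonoBehaviour
{
    void Awake()
    {
        if (!VRSettings.enabled)
        {
            VRSettings.enabled = true;
        }
    }
}
```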

The OVRPlayerController is what allows the user to have movement whenever they are in the experience. However, I do not want this enabled all the time, and I was having issues turning it on and off: it was causing the main camera to appear a little disorientating and off position to begin with. So I decided to break the project into three different scenes and then give the user the option, from the intro screen, to decide which experience they would like to enter through the use of hotspots.
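
For context, this is a rough sketch of the kind of toggle I was attempting, assuming the OVRPlayerController component from the Oculus Utilities package sits on the player object; disabling the component stops its movement updates while head tracking on the camera rig carries on:

```csharp
using UnityEngine;

// Hypothetical helper: lets other scripts switch player movement on or off,
// so the user can look around at the navigation UI without walking away from it.
public class MovementToggle : MonoBehaviour
{
    private OVRPlayerController playerController;

    void Awake()
    {
        playerController = GetComponent<OVRPlayerController>();
    }

    public void SetMovementEnabled(bool allowMovement)
    {
        // Disabling the component stops its movement update loop;
        // the camera rig continues to track head rotation.
        if (playerController != null)
        {
            playerController.enabled = allowMovement;
        }
    }
}
```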

One problem I have encountered when using hotspots is that the in-game menu elements, including the cursor, do not show inside the Oculus. This makes it more difficult for a selection to be made; at present it requires dual use of the Oculus and the monitor, which is not the behaviour we want.

We did notice, however, that a glow is still displayed from the placeholder buttons with a highlight script, and perhaps if we could make this glow more apparent then the user would know which option they are selecting. I also read an article about gaze dynamics, which allows the user to make a selection based on what they look at the longest, so this could be something else to experiment with when it comes to navigation.
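
To make that idea concrete, here is a rough sketch of timed gaze selection, not the article's implementation: a ray is cast from the centre of the headset camera every frame, and looking at the same object for long enough counts as a selection. The message name is hypothetical; whatever hotspot object receives it would decide what "selected" means.

```csharp
using UnityEngine;

// Attach to the VR camera. Hypothetical sketch of dwell-based gaze selection.
public class GazeSelector : MonoBehaviour
{
    public float dwellTime = 2f;       // seconds the user must hold their gaze

    private Transform currentTarget;
    private float gazeTimer;

    void Update()
    {
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit))
        {
            if (hit.transform == currentTarget)
            {
                gazeTimer += Time.deltaTime;
                if (gazeTimer >= dwellTime)
                {
                    // Hypothetical message name; the receiving object decides
                    // what happens (e.g. glow brighter, then change scene).
                    currentTarget.SendMessage("OnGazeSelected", SendMessageOptions.DontRequireReceiver);
                    gazeTimer = 0f;
                }
            }
            else
            {
                currentTarget = hit.transform;
                gazeTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            gazeTimer = 0f;
        }
    }
}
```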

Here is a video to show the kind of experience we are making progress on. You can see that when the user selects Play Interview, a new scene opens where they can view the interview, which then plays automatically. The same goes for the Gallery experience: the user is transported to a new scene that shows the explorable experience. In order to make this possible we made use of the following elements –

  • Hotspot placeholders with a highlight script to show what would be buttons, also with a glow.
  • Hotspots and interactions for the user to click on these buttons and for an action to happen.
  • ActionLists to define the behaviour I want these actions to have.
  • Scenes added to the build settings so that they can be navigated using the Engine: Change scene command, with a number being called that references each scene within the build settings (roughly equivalent to the plain-Unity sketch after this list). If a scene is not in the build settings then you cannot change to that scene using Adventure Creator.
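
For reference, this is roughly what that scene change amounts to in plain Unity; a minimal sketch assuming Unity 5.3 or newer (with its SceneManager API) and that the scenes have already been added under File > Build Settings, in an assumed order of intro, interview, gallery:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical helper mirroring Adventure Creator's "Engine: Change scene" action:
// scenes are referenced by their index in the build settings.
public class SceneSwitcher : MonoBehaviour
{
    // Assumed order: 0 = intro / navigation UI, 1 = interview, 2 = gallery.
    public void LoadSceneByBuildIndex(int buildIndex)
    {
        // Unity logs an error if the scene at this index was never added to
        // Build Settings, which is why Adventure Creator requires it to be listed.
        SceneManager.LoadScene(buildIndex);
    }
}
```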

Adventure Creator triggers were also experimented with to enable and disable interview sequences. Object send message for visibility works, but object turn on/off will not function. I want the object itself to be enabled or disabled, as I want the audio component attached to it to turn off as well. I am still working on a way of solving this problem or finding an alternative way of doing it. The important thing is that we now know it is possible to enable and disable sequences of the interview so that they do not all play at the same time.
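
One fallback I am considering is a small script that a trigger or send-message action could call. This is only a sketch, with hypothetical names, under the assumption that deactivating the whole GameObject also stops the AudioSource attached to it (which it does, since inactive objects stop playing audio); the controller lives on an always-active object so it can still be reached after the sequence is switched off:

```csharp
using UnityEngine;

// Hypothetical controller, kept on an always-active object, that turns an
// interview sequence object (and therefore its AudioSource) on or off.
public class InterviewSequenceController : MonoBehaviour
{
    public GameObject sequenceObject;   // the interview object with its audio component

    // Called from a trigger / send-message action to start the sequence.
    public void EnableSequence()
    {
        sequenceObject.SetActive(true);
    }

    // Deactivating the object also stops the AudioSource attached to it.
    public void DisableSequence()
    {
        sequenceObject.SetActive(false);
    }
}
```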

Here are a few more things we have been working with –

  • The camera, by default, does not need to give controllable movement to the player. Upon entry to the experience I do not want the player to be able to walk around on the navigation UI; perhaps they can look around but not actually move.
  • Movement needs to be removed from the player until they are within the explorable. If the user moves too far away from the UI they may lose interest, or even lose sight of the UI and not be able to make their selection of what they would like to view, so in terms of user experience it makes more sense to steer the user and have them begin at the navigation UI without movement.
  • Object send message is not playing ball and enabling and disabling objects as it should.
  • A way needs to be apparent for the user to navigate through the sequences (explorable or interview) and make a choice; I am thinking hotspots and interactions may come into play here.
  • The glow of the hotspots needs to be larger.
  • Different scenes may be required for navigation.
  • Why do the menu elements not show within the Oculus? Fiona has provided me with a Digital Tutors tutorial to view so that we can see how this is possible. It has been done before, which is comforting to know, and we may be able to integrate this knowledge into the Adventure Creator elements.
  • Is the transition from Play Interview / Gallery experience too harsh for the user?