Ideation, Creation & Everything In Between
Since last Friday we have been consistently working on tweaks and fine details to improve the project based on the feedback we were given. We have been trying to get the fades of each sequence to run a little smoother than they have been, so that the transitions are as pleasing to the eye as we can manage.
Fading Script Restriction –
We are using a script on two differently shaped pieces of geometry to make the fades happen, and we can send an integer to set the duration of each fade to control how quickly it fades in or out. However, now that we are going through things with a fine-tooth comb we want to be very particular here. I am stuck with using integers, but I want to be able to use floats so that I can set these times down to fractions of a second and get the timing as close as possible.
To do this I will need to edit the script, as it is not currently possible with the script we have. We are pretty restricted at the moment, so I'm going to see what we can do.
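The change we are after can be sketched roughly as below. This is not our actual fade script, just a minimal Unity C# illustration of the idea: declaring the duration parameter as a float rather than an int means a fade can be asked to run for, say, 2.5 seconds instead of only whole seconds.

```csharp
using System.Collections;
using UnityEngine;

// Sketch only: a simplified stand-in for our geometry fade script.
public class GeometryFade : MonoBehaviour
{
    // Duration is a float rather than an int, so sub-second values work.
    public IEnumerator FadeAlpha(Renderer target, float from, float to, float duration)
    {
        Material mat = target.material;
        Color c = mat.color;
        float elapsed = 0f;
        while (elapsed < duration)
        {
            elapsed += Time.deltaTime;
            c.a = Mathf.Lerp(from, to, elapsed / duration);
            mat.color = c;
            yield return null; // wait one frame
        }
        c.a = to; // snap to the final alpha so the fade always completes
        mat.color = c;
    }
}
```

The names here (`GeometryFade`, `FadeAlpha`) are hypothetical; the point is simply the float-typed `duration` parameter.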
Extended Fade Challenge –
One of our interview sequences has the audio fade out as Colin talks about dreaming, and we want the participant to have a surreal, dream-like feeling, almost like fading out of reality. However, this would mean fading out 15 seconds into the sequence, and at the moment I am restricted to a 5-second fade, so I am also going to try to edit the script to allow a longer fade.
Editing The Script –
I have been working on editing the script, and it already has a declaration for floats, but I am still not able to enter a decimal value. As far as my programming skills go, I don't know why I am not allowed to use half values for the fades, so this is still an ongoing issue.
With regards to the extended fade, I have edited values within the script to allow longer fade times, but when I increase the value in the action and press play, the fade does not appear to be extended. Such a long fade also had a strange knock-on effect on the rest of the cutscene: the fade seemed to cut into the fade-in of the next sequence even though it was set as the fade-out of the previous sequence, so we may have to look at alternative options for this stage of the interview experience.
Particles Alternative To Extended Fade –
We got together to come up with an alternative plan for the audio fade-out, since we are restricted by not being able to extend the fade for this part of the interview. The plan we have come up with uses the atmospheric cloud particle effect that comes in during the next sequence: we have decided to start it a little earlier, so it appears as the audio begins to fade.
Colin is usually in the sequence when the audio fades, but we have now removed him, and we will focus on making the particles more aesthetically pleasing while paying attention to their behaviour.
In the video below you can see how, when the audio fades, a particle event now happens instead of seeing Colin with an extended fade. We are going to test how this works with people, as we do not really know how well a part of the sequence works until we see people try it.
Paint Stroke Updates –
We have been constantly working up the paint strokes and trying to get their behaviour right, because the way the paint strokes come into the scene guides the direction of the user's eye, so they are led to watch the Seamus Heaney sequence come into play.
The paint stroke animations have been updated to new versions of the FBX files, new textures and materials have been put into play, and the alpha scripts now control how the paint strokes appear.
In the video below you can see how the new paint strokes work in the scene. They seem pretty effective at guiding the user from left to right to see the Seamus Heaney sequence come in. However, we know this is going to happen, so I automatically look to the left when viewing this scene; it will be interesting to see how people who have never seen the experience, or who haven't seen it in a while, react and behave.
Value Of Feedback –
At this stage the value of feedback is golden. It always is, but particularly for us now, as we need to know how people respond to what we have made. With every change we can only assume it is effective; the true test is when people try it, and that is when we really learn what our next move should be.
Explorable Character Controller Issue –
During the explorable sequence the participant will be able to move through the gallery and explore the scene while using the Oculus to be in this virtual world. To give the participant movement we have to use a character controller to make the camera they view the experience through moveable.
At the moment the experience uses a custom character controller that allows footstep noises to be heard throughout, as we have been playing about to see whether this is weird or not. Nonetheless it is not functional: we have been experiencing a sliding issue where the participant feels like they are walking on a slope, and that should not be the feeling at all.
To fix this issue we have changed to an OVR player controller, the Unity-supported Oculus controller, as correct, functional movement is more important to us than footstep sounds at the moment.
Changing to the OVR player controller has fixed the sliding issue. The height and acceleration fields still need to be set, but this is all done based on how it feels inside the experience.
The OVR player controller has been put on to an Adventure Creator camera, because when we begin to play a cutscene Adventure Creator will use the main camera. We must therefore include the player control on this camera by dragging the OVR player controller on to the main camera hierarchy.
The acceleration is on point, but the height has been difficult to get right. We have set it based on the general height the paintings will be viewed at, but this led to a conflict at the entrance archway: the user now feels too tall to walk through it, and it feels like you might hit your head when walking into the gallery. To fix this we have increased the height of the archway; we now have a consistent height throughout and the experience feels more real.
In the images below you can see the explorable area and the general height things are viewed at, though as with anything we do you can only really get a sense of it when using the Oculus.
Challenge Of A Virtual Reality Camera Switch –
As with the majority of things we are figuring out, it is hard to research what has not been done before, so technically we are creating the research for others. Basically, we want a virtual reality camera switch, but the key issue is that we don't want to make the user ill while carrying out the switch.
The whole point of this switch is that at the end of the interview sequence we want the camera to switch to a close-up of Colin after a fade has taken place. As we are using Adventure Creator, we can carry out the switch to another camera within a cutscene.
To do this we have had to duplicate the main camera and rename the two cameras primary and secondary. The cameras must have different Constant IDs for this to work, otherwise they will share the same transform and camera properties and no closer camera settings will be saved.
The camera then switches from the primary main camera to the secondary main camera, which is slightly closer to Colin's face. The switch has to be a pretty small movement closer to the face, as anything more than that feels horrible inside the Oculus and would easily make the user feel ill.
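In our project Adventure Creator performs the switch inside the cutscene, but the core of what happens can be sketched in plain Unity C# like this. The class and field names are hypothetical; the point is that only one camera renders at a time, and the secondary camera's transform already sits the small step closer to Colin.

```csharp
using UnityEngine;

// Sketch only: a hand-rolled version of the two-camera switch.
public class VRCameraSwitch : MonoBehaviour
{
    public Camera primary;   // the main interview camera
    public Camera secondary; // a duplicate, positioned slightly closer to Colin

    // Called after the fade completes, e.g. from a cutscene event.
    public void SwitchToCloseUp()
    {
        primary.enabled = false;  // stop rendering from the wide position
        secondary.enabled = true; // start rendering from the close-up position
    }
}
```

Doing the swap behind a fade, as described above, hides the instantaneous jump in viewpoint, which is what keeps the switch comfortable in the headset.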
In the video below you can see the camera switch taking place, though you will not get the true feeling of it unless you try it in the Oculus. We now know it is possible and can play about with making it more effective.
Landscape Edits –
Since last Friday the landscape scripts and textures have been edited to make them richer in colour and to have them enter the scene in a more effective way. The new scripts and textures are now in Unity, and in the screenshot below you can see how well they work with the paint strokes and even with the colour of Colin's clothing.
During the interview we have had atmospheric cloud particles enter the scene, and we have been editing when these particle effects show up. However, now that we are using the particle effects for an extended duration, because we cannot fade out of a particular sequence as early as we need to, the particles are getting a lot of on-screen time.
This led us to edit the particles and make them better, so that the user would not lose interest in what they were doing. The behaviour of the particles has been edited so that they appear to grow and enter the scene in different bursts. This is a very gradual process and will not appear massively impactful, but we must be cautious with particles inside a virtual reality experience, as we do not want to create any vomit comets.
The gradients of the particle effects have been edited using the colour scheme from the rest of the interview, to keep consistency throughout the experience. However, we have realised that the colours displayed in the Oculus are much richer than they appear on screen, so we must keep checking the Oculus to see the true version of what we are creating.
Particles can also be edited to change the colour of their gradients over time, so this will be worth experimenting with to get the best response possible out of the particles. In the videos below you can see examples of the old particles as well as the new ones, but please note these look much better in the Oculus and this is only an example.
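For anyone curious, the colour-over-time behaviour mentioned above is exposed in Unity through the particle system's Color over Lifetime module. The sketch below is an illustration rather than our production setup, and the specific colour values are placeholders, not our actual interview palette.

```csharp
using UnityEngine;

// Sketch only: driving a particle's colour through a gradient over its lifetime.
public class ParticleGradientSetup : MonoBehaviour
{
    void Start()
    {
        ParticleSystem ps = GetComponent<ParticleSystem>();
        var col = ps.colorOverLifetime;
        col.enabled = true;

        // Placeholder colours standing in for the interview colour scheme.
        Gradient grad = new Gradient();
        grad.SetKeys(
            new GradientColorKey[] {
                new GradientColorKey(new Color(0.6f, 0.4f, 0.8f), 0f),
                new GradientColorKey(new Color(0.2f, 0.3f, 0.6f), 1f)
            },
            new GradientAlphaKey[] {
                new GradientAlphaKey(0f, 0f),   // fade in from transparent
                new GradientAlphaKey(1f, 0.3f), // fully visible
                new GradientAlphaKey(0f, 1f)    // fade out again
            }
        );
        col.color = new ParticleSystem.MinMaxGradient(grad);
    }
}
```

Keeping the alpha keys gentle like this matters for the same comfort reasons discussed above: gradual changes are far easier on the eye inside a headset than sudden ones.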
This week we have been making all the changes we know about, trying to make the experience better and to keep building week in, week out, but we really need to find out what people think of it. It is important to know how users behave in our experience and to get as much feedback as possible, so that we can make this a user-friendly, fully immersive virtual reality experience.
We are preparing for our last class of this course, and we really hope to get feedback from our classmates and lecturers trying the experience in the time we have. We will then build our next steps upon what the users say about our experience.