Ideation, Creation & Everything In Between
I had an adventurous start to the day this morning, taking a little trip to Inlifesize to return the Oculus. My team and I then got together to plan a doomlist covering the next two weeks' tasks and how we are going to tackle them.
The majority of the work we need to do involves a massive amount of testing, and we will find out what else needs doing as we encounter issues along the way. Once we had broken down everything that needed to be done, we prioritised the tasks and estimated the complexity and time each one would take.
We also wrote these tasks onto Post-its so that each one can be moved onto the day, within the next two weeks, on which we plan to work on it. The days also include the date, so the whiteboard can double as a calendar.
Mary suggested setting some goals for what we want to achieve from each task, so that we do not spend too much time on any one of them, which doesn't seem like a bad idea.
Looking at the board, there is a suitable amount of pressure on us, and our goals seem difficult but still realistic. Fiona and I took some tasks to start experimenting with today while Mary researched, so that we could make progress towards the end goal of the project.
Programmer Visit –
We are also hoping that a lot will come to light on Thursday, as we have a programmer coming in who may be able to help us out with a few issues involving the following:
Update on Nigel McAlpine –
Last week we also emailed Nigel McAlpine about the loan of an Oculus, and he was able to confirm that the BBC has one and that we may be able to borrow it if we need to test things out. We no longer need to figure out the tracker issue by switching components from the development kit, as our issues were with scale, but it's still very useful to know where an Oculus Rift lives in Belfast.
Nigel also seemed very interested in our project and said he might like to try it out sometime, so we may extend an invitation for him to do just that at the end-of-year show, if he is able to make an appearance.
To Keep In Mind –
During our planning, one thing I wanted to establish is whether, when the experience is broken into the interview and the explorable, any of the interview has a presence in the explorable at all. So far we have established that, if anything, there may be sound from the interview in the explorable, but it would not be appropriate for the DepthKit content visuals to also appear at that time.
The Oculus being returned today also highlighted that we need to set up some sort of schedule and ask when we can have the Oculus, as a lot of our testing depends on when we can obtain one. Once we know when we will have it, things will feel a lot better; we can always continue without it in the meantime, as we only need it for testing and running the experience after we have made our changes and additions.
I find it useful to test as I go along, as that way we find out what works and what doesn't as we progress. Often what works in Unity does not have the same feeling or intended effect when viewed in the Oculus. A fine example of this was last week, when we were testing an animation of Colin moving towards the participant: it did not seem so bad in Unity, but when viewed in the Oculus it was a totally different experience.
Atmospheric Skybox –
In Unity you are able to create six-sided skybox materials that are placed around the scene, enclosing its contents within an environmental skybox. What we would like to achieve is an atmospheric skybox containing stars, some sort of galaxy effect, or something else relating to space.
Six texture images are placed within the skybox material, which is then plugged into the Environment Lighting Skybox slot of the Lighting tab in Unity.
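As a rough sketch of the same process from code (the field names here are my own placeholders, not anything from our project files), a six-sided skybox material can also be built and assigned with a small Unity script rather than by hand in the Lighting tab:

```csharp
using UnityEngine;

// Builds a six-sided skybox material from six textures and assigns it
// to the scene's environment lighting, mirroring the manual Lighting
// tab setup. Texture fields are assigned in the Inspector.
public class SkyboxSetup : MonoBehaviour
{
    public Texture front, back, left, right, up, down;

    void Start()
    {
        var mat = new Material(Shader.Find("Skybox/6 Sided"));
        mat.SetTexture("_FrontTex", front);
        mat.SetTexture("_BackTex", back);
        mat.SetTexture("_LeftTex", left);
        mat.SetTexture("_RightTex", right);
        mat.SetTexture("_UpTex", up);
        mat.SetTexture("_DownTex", down);

        RenderSettings.skybox = mat;   // same slot as the Lighting tab
        DynamicGI.UpdateEnvironment(); // refresh ambient light from the new skybox
    }
}
```

This is just the scripted equivalent of dragging the material into the Lighting tab; for a one-off scene the manual route described above is simpler.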
The texture images I have used in the examples below are ones from the internet that I wanted to test, but I am now going to make some of my own to see what other kinds of effects we can get. It is nice to know that creating a skybox material is not too complex and that we can achieve a lot from this process if we go about it right.
More information on Unity Skyboxes can be found at the following link – http://docs.unity3d.com/Manual/class-Skybox.html

21/03/16
Opacity animation of DepthKit content experiments –
The next task I wanted to look into was using Unity to keyframe the opacity of the DepthKit sequences, so that we can attempt to make Colin fade in and out. This process should be possible, but we are going to test it to check its feasibility and how it looks.
A fade in and out works in line with the animatic, as the interview fades in and out at particular stages of the animatic, and we want this to happen in the experience too.
We have made a start at experimenting with opacity, and unfortunately we cannot keyframe it. I'm not sure where I got that thinking from, but it can't be done here. Alternative approaches involve using shaders and controlling the alpha channel; however, there is a slight issue with this. We cannot change the shader on Colin, as he uses a custom DepthKit shader that comes with the DepthKit content, and unfortunately it does not have an alpha to edit.
In the image below you can see that the animator's properties do not include an option for opacity, so it does not look like we can animate it from here. The shader, DepthKit/DepthKit-Surface-Shader, is also shown in this image as the default shader for this kind of content; failing to use this shader will stop the DepthKit content from working correctly.
Currently I am researching how we can achieve the opacity fade in and out that we need for the experience. This task was marked as a two for complexity and time, but I have a feeling it has just increased in complexity a little, because at present I don't know how to achieve it.
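To illustrate the shader-alpha route mentioned above: if we did end up with a modified shader that exposes an alpha property (the "_Alpha" property name below is entirely hypothetical, since the stock DepthKit shader has no such property), a small script could drive the fade from code instead of the animator:

```csharp
using System.Collections;
using UnityEngine;

// Fades a material's alpha over time via a coroutine. This ASSUMES the
// shader exposes a float property named "_Alpha" -- the stock DepthKit
// shader does not, so this only applies to a modified shader.
public class OpacityFade : MonoBehaviour
{
    public float duration = 2f; // seconds for a full fade

    Material mat;

    void Start()
    {
        mat = GetComponent<Renderer>().material;
        StartCoroutine(Fade(0f, 1f)); // e.g. fade Colin in on start
    }

    IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            mat.SetFloat("_Alpha", Mathf.Lerp(from, to, t / duration));
            yield return null; // wait one frame
        }
        mat.SetFloat("_Alpha", to); // snap to the final value
    }
}
```

This is only a sketch of the general Unity technique, not a solution to our actual problem, which remains getting an editable alpha into the DepthKit shader in the first place.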