Ideation, Creation & Everything In Between
Recently we have been making some discoveries in relation to scaling for virtual reality. At present we are scaling a cube to 100x100x100 in Maya, which translates into Unity as a scale factor of 1, and in virtual reality is seen as one meter.
This scale appears to be correct for objects, but I'm a little confused about how to make this feasible for an entire environment space, and with regards to our Depthkit content, which is coming in at a massive scale. By default the Depthkit content comes in at a Unity scale of 0.01×0.01×0.01, and even though its translations are small and the game object the content was added to was at a 1x1x1 Unity scale, the content, or "Colin", appears to be very large.
In order to get the Depthkit content to fit within a one-meter scale he must be scaled down to 0.0001×0.0001×0.0001, but that is to fit him within one meter, and in reality he is not going to be only a meter tall, which works out at approx 3 feet. We do, however, need to take into consideration that he is sitting down for the interview.
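To keep the conversions above straight, here is a quick back-of-the-envelope sketch in Python. The factor of 100 is an assumption based on Maya's default working unit being centimeters, which would explain why a 100x100x100 Maya cube comes into Unity as a scale factor of 1 (one meter):

```python
# Assumption: Maya is using its default centimeter working units,
# so 100 Maya units correspond to 1 Unity unit, i.e. 1 meter in VR.
MAYA_UNITS_PER_METER = 100

def maya_to_unity_scale(maya_units):
    """Size in Maya units -> Unity scale factor (1 Unity unit = 1 meter)."""
    return maya_units / MAYA_UNITS_PER_METER

def meters_to_feet(meters):
    """Convert meters to feet for the rough height checks used above."""
    return meters * 3.28084

print(maya_to_unity_scale(100))       # 1.0 -> one meter in the Oculus
print(round(meters_to_feet(1.0), 1))  # 3.3 -> the "approx 3 feet" mentioned above
```

This is only a sanity-check sketch, not part of the project itself, but it makes it easier to see why fitting Colin inside one meter makes him roughly 3 feet tall.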
Knowing that the cubes I have in Unity are approx 3 feet tall allows me to better work out the height of the participant and the height that Colin should be. We need to decide what we would prefer for the setup, whether the person should be sitting or standing for the experience, so that the heights are all within the same range.
We also feel that it would perhaps be safer to have the participants sit down to ensure the safe use of the Oculus; when watching people wear the device it has become apparent that they forget about their surroundings and seem to think the Oculus is wireless. We don't want anyone using the experience to get hurt at the End of Year show or anywhere else that we may have the setup.
I currently have a setup in Unity that has two one-meter cubes, one on top of the other, and a ground plane to keep the player within this space. Beside the cubes I have sized the Depthkit content to a Unity scale of 0.01, which, with the content featuring Colin sitting down, appears to be approximately the correct height, as he is around 5ft sitting on a high stool.
Within the scene I have added an OVRPlayerController, as it allows me to control the height of the camera that the participant looks through to view this experience. This is why I needed to know whether the participant would be standing or sitting: the height of this camera needs to be adapted based on the decided position.
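The sitting-versus-standing decision could be reduced to a single camera-height value. The sketch below is illustrative only: the eye heights are assumed round numbers, not measured values, and the real adjustment would of course be made on the OVRPlayerController in Unity rather than in Python:

```python
# Illustrative eye heights in meters for positioning the player camera.
# These are assumed values for the sketch, not measurements.
ASSUMED_EYE_HEIGHT_M = {
    "standing": 1.60,
    "sitting": 1.20,
}

def player_camera_height(position):
    """Return the camera height in meters for the chosen participant position."""
    if position not in ASSUMED_EYE_HEIGHT_M:
        raise ValueError("position must be 'standing' or 'sitting'")
    return ASSUMED_EYE_HEIGHT_M[position]

# If we decide participants will sit (safer with the wired Oculus),
# the camera would sit at roughly:
print(player_camera_height("sitting"))  # 1.2
```

The point of framing it this way is that once we pick a position, the camera height becomes one agreed number everyone builds against, the same way the 100x100x100 cube acts as our scale reference.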
You can see in the screenshots the setup of both the Colin Depthkit content and the OVRPlayerController, and how this looks in the Unity scene. What makes this whole process more difficult is that in Maya I am able to set the scale of everything to 100x100x100, which comes through to Unity as a scale factor of 1, and in real life one meter. However, with the Colin Depthkit content we have no control over the scale before it is imported into Unity, and even though I place it on an empty game object at a Unity scale of 1x1x1, I still have to scale it down massively as it is not true to meter scale. What I have done is use the two stacked boxes, which are approximately 6ft high, as a reference, and sized the Depthkit content to 0.01×0.01×0.01, which appears to make the content the correct height for him being in a sitting position.
Also, one thing we noticed on Wednesday when we got the tracker working is that when an extremely tall person uses the experience it completely alters how the experience feels and the perspective of everything. Obviously a taller person will have a far different experience from a smaller person, and height does change things, but this is something we need to think about as we are looking to accommodate a universal experience for anyone.
This is what I mean when I talk about the other factors that need to be taken into account when working with virtual reality; there always seems to be a domino effect when building for it, and we need to think about absolutely everything.
In the video below you can see an example of the scene and how it looks in Unity when playing; the actual feeling that you get from using it in the Oculus is not something I can capture. It feels as if I, as the participant, am sitting right in front of Colin, as if I am having a conversation with him, and this also feels better because I am sitting while using the Oculus. We need to test not only how things look but how they feel to the participant, so we all tested this scene to ensure that it did feel real to us and true to size, which we can agree it does.
One key factor to note is that now that the scale is correct for Unity and within the Oculus, the tracker is also working and feels very true to reality. The movement towards Colin is realistic and does not appear to be much larger than the actual movement, or too fast for the speed the participant is moving.
Getting the scale right is crucial for the rest of the experience, as we want the tracker movement to feel real and not work in large or small leaps or at incredibly fast or slow speeds. Further testing of the scale will be required in order to get the best results, but for now we are making steady progress, and I would feel comfortable for us all to use a 100x100x100 cube in Maya as a reference when we are modelling content for the experience.
I feel like we are really getting somewhere with scale, and now we can figure out particles a lot more, as we have the correct scale to work from and they will now make sense; the previous particle effects we have been working on will be much larger than the correct scale.
Further information on Unity scale can be found at –