Nicole Kennedy Design

Ideation, Creation & Everything In Between

Problem Solving & Encoding Video Renders

Today we have been working on figuring out how to use custom scripts in Adventure Creator: how to call these scripts from an action within an actionlist, and how to access the parameters of the script being called.

The first port of call was to research the Adventure Creator manual and find out if there were instructions on how to create and call custom scripts. The manual advised that we use the action Object: Send message to call an object with a custom script on it; however, our issue is not calling an object. We want to call a certain parameter within the script that is on the object.

Custom Scripts –

Here is the information that the manual provided, extracted from the following link – (25/03/16)

[Screenshot of the Adventure Creator manual's Custom Scripts section]

Whilst looking through the options available to us in Adventure Creator, we came across the Object: Send message action, which allows you to call a method. I am unsure what method this calls and whether we would be able to declare a fade in or fade out that would then be called. However, earlier experimentation with the Object: Send message action showed that it does not function correctly, so it would be ideal to avoid it if we can, as it seems to be bugged.

The next action we found is Object: Call event, which allows us to call the object that has the script on it and then call its parameters, such as fade in or fade out.
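Object: Call event works by invoking a public method on a component through a UnityEvent, so the fade logic has to be exposed as public methods. Below is a minimal sketch of what such a component could look like; the class name FadeScript, the useFade field and the coroutine body are illustrative assumptions, not our actual script:

```csharp
using UnityEngine;
using System.Collections;

// Hypothetical fade component. FadeIn/FadeOut are the kind of public
// methods an Adventure Creator "Object: Call event" UnityEvent can target.
public class FadeScript : MonoBehaviour
{
    public bool useFade;            // toggled externally (e.g. by a custom action)
    public float fadeDuration = 1f;

    public void FadeIn ()
    {
        useFade = true;
        StartCoroutine (FadeTo (1f));
    }

    public void FadeOut ()
    {
        useFade = false;
        StartCoroutine (FadeTo (0f));
    }

    private IEnumerator FadeTo (float targetAlpha)
    {
        // Assumes the object has a Renderer whose material supports alpha
        Renderer rend = GetComponent<Renderer> ();
        Color start = rend.material.color;
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            Color c = start;
            c.a = Mathf.Lerp (start.a, targetAlpha, t / fadeDuration);
            rend.material.color = c;
            yield return null;
        }
    }
}
```

With the methods public like this, Object: Call event can select FadeIn or FadeOut directly from the component's dropdown.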


Custom Actions –

Custom actions are something I thought it would be worthwhile to learn, because if we got the fade in/out script working we would need control over it. The main logic is controlled in Adventure Creator using actions and actionlists, so it makes sense to integrate any custom scripts into an action so that they can be controlled more easily. I have embarked on a few experiments and enclosed my attempts below to show a rough idea of how an Adventure Creator action is made.

  • Attempts to make a custom action


The snippet of code below shows my attempt at creating an action for Adventure Creator. I followed the tutorial but was unable to get the action to recognise the fade parameters from the other script – the fade in and fade out we have been working on.

using UnityEngine;
using System.Collections;

#if UNITY_EDITOR
using UnityEditor;
#endif

namespace AC
{
    public class ActionFade : Action
    {
        // Declare variables here
        public GameObject objectToAffect;
        public bool newFadeState;

        public ActionFade ()
        {
            this.isDisplayed = true;
            category = ActionCategory.Object;
            title = "Fade In";
            description = "Fades In The Game Object";
        }

        override public float Run ()
        {
            // "FadeScript" is an assumed type name here – the blog's
            // formatting stripped whatever was inside the angle brackets
            if (objectToAffect && objectToAffect.GetComponent<FadeScript> ())
            {
                objectToAffect.GetComponent<FadeScript> ().useFade = newFadeState;
            }
            return 0f;
        }

        #if UNITY_EDITOR

        override public void ShowGUI ()
        {
            objectToAffect = (GameObject) EditorGUILayout.ObjectField ("GameObject to affect:", objectToAffect, typeof (GameObject), true);
            newFadeState = EditorGUILayout.Toggle ("New Fade State:", newFadeState);

            AfterRunningOption ();
        }

        #endif

        public override string SetLabel ()
        {
            // Return a string used to describe the specific action's job.
            if (objectToAffect)
            {
                return (" (" + objectToAffect.name + " – " + newFadeState.ToString () + ")");
            }
            return "";
        }
    }
}


Although so far I have been unsuccessful at getting the action to function, it is useful to know that it is possible, and that there is a template action included with Adventure Creator that can be edited to make your own. The issue I am having is that the functions I am trying to call are not recognised by the action, so nothing happens.

Terminal Renders (Encoding videos) from visualise content –

The next stage, after the DepthKit takes had been rendered from the Visualise app, was to encode videos from the renders. To do this we had to use ffmpeg through the Terminal on the Mac. In the video below, which James provided us with, he shows how to do this step by step so that we can get the content into a suitable format to bring into Unity.

Tutorial provided by James George that shows us how to encode the rendered out images into a video using ffmpeg.

When using Terminal, a few stages are required before you can encode the video. The first is to cd into the location of the DepthKit scripts that James provided us with.

The next stage is to drag the script into the Terminal window. Finally, the take folder itself, which comes from the Visualise app, is dragged into the Terminal window. There are a few more details to this stage: in the example below you can see the numbers 1320 and 1584, which are the start and end frames of the take you wish to encode. The 23.976 is the frames per second for the video, and Scene_001 is the name you want the output video to have.
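We don't have the contents of James's script, but an image-sequence encode of this kind typically wraps an ffmpeg command along the lines of the sketch below. The frame filename pattern frame_%05d.png and the codec flags are assumptions for illustration; the real script may differ. This version just computes the frame count and prints the command it would run:

```shell
START=1320         # first frame of the take to encode
END=1584           # last frame of the take to encode
FPS=23.976         # playback frame rate
NAME=Scene_001     # output video name

# number of frames between start and end, inclusive
FRAMES=$((END - START + 1))

# the kind of ffmpeg invocation the encode script is assumed to run
CMD="ffmpeg -framerate $FPS -start_number $START -i frame_%05d.png -frames:v $FRAMES -c:v libx264 -pix_fmt yuv420p ${NAME}.mp4"
echo "$CMD"
```

Running it for the take above would encode 265 frames into Scene_001.mp4.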

Nicoles-MBP:~ Nicole$ cd /Users/Nicole/Documents/DepthKit_UnityTesting/Assets/DepthKit/Scripts

Nicoles-MBP:~ Nicole$ /Users/Nicole/Documents/DepthKit_UnityTesting/Assets/DepthKit/Scripts/  /Users/Nicole/Desktop/TAKE_02_26_14_44_09_14\:44_03_08_15_17_02 1320 1584 23.976 Scene_001

To make it easy to obtain the in and out frames of the relevant take, we used the production timesheet, which has proved useful throughout the project ever since the interview was captured. This enabled me to quickly extract the start and end frames I needed, making the encoding process faster.

We made the production timesheet after the interview. Once Mary had created a transcript of the content we were going to use, I was able to create this sheet based on which sound files were relevant to each take, as well as how long each take lasted and which frames it started and ended with. The organisation of the team is making things run a lot smoother, and we have learned a lot from the previous project.


Production – Timesheet used to obtain frames in/out and to note output name chosen for the video.

After these commands have been entered, pressing enter begins the process of encoding the video, and you will see output like this appear –


Terminal code to begin the encoding of videos.

In the image below you can see how the process was done and the full output that is displayed. This process was repeated 14 times to get all of the movie texture and poster frame files we needed, ready to be brought into Unity.


This process encodes not only an mp4 video file, known as the movie texture, but also a png image called the poster frame. The final file required is the properties file, called _depthproperties.xml, which can be found already made by the Visualise app in each of the take folders.

Folder Structure – 

So that the encoded videos and all of the relevant content could be collected in an organised fashion, a folder structure was created using the production timesheet. We used the scene name rather than the take name, since sometimes more than one scene was taken from an individual take.

A folder was then created for each scene within the DepthKit content folder, and the movie texture files and poster frame png images were sent to the relevant folders. The _depthproperties file for each take had already been created by Visualise, so it was dragged from the take folder into each of the scene folders. Scenes that shared a take had the _depthproperties xml file duplicated from that take, as each scene would need its own properties file when it came to the Unity side of things.
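The folder layout described above can be sketched in shell form. The take and scene names here are examples rather than the project's real names, and stand-in files are created with touch so the sketch is self-contained:

```shell
# One folder per scene inside the DepthKit content folder
mkdir -p TAKE_01 DepthKitContent/Scene_001 DepthKitContent/Scene_002

# Stand-ins for the files Visualise and the encode step produce
touch TAKE_01/_depthproperties.xml        # made by Visualise per take
touch Scene_001.mp4 Scene_001.png         # movie texture + poster frame

# Movie texture and poster frame go into the scene's own folder
mv Scene_001.mp4 Scene_001.png DepthKitContent/Scene_001/

# Scenes that share a take each get their own copy of the properties file
cp TAKE_01/_depthproperties.xml DepthKitContent/Scene_001/
cp TAKE_01/_depthproperties.xml DepthKitContent/Scene_002/
```

Each scene folder then holds the three files Unity will need: the mp4, the png and the xml.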

Now that the poster frames, movie texture files and depth properties files have been created and stored in the appropriate locations, each of the 14 scenes is ready to have its sound added. The movie texture mp4 for each scene will be brought into Premiere and have the audio synced up before the content is ready to be imported and set up in Unity.


Reflection –

I am glad that we experimented with encoding videos earlier in the project, as I don't think this process would have gone as smoothly had we not had the practice. The video that Fiona obtained from James George was crucial to this process, and I am so thankful to have had access to it.

I found this process long and tedious, but the next stage of adding sound and importing to Unity is something I have been looking forward to, so I don't mind fighting my way through this stage in order to get there.

The scripts are something I am happy to have explored and will hopefully explore more throughout the project. Although I am far from a programmer, I feel like I am starting to understand how some things work in scripting, and I hope to keep increasing that knowledge.

Working within such an organised team is making things run smoother, and we are making significant progress, getting to experiment with and reach out to new skills that we may not have had time for in the previous project. We are being ambitious, and the challenges that come with that are encouraging us to stay on top of things so that each task does not take longer than it should.

I am finding that the earlier research into and practice with the tasks we are now working on is paying off, making the project nicer to work on and more fun. The majority of my research is about processes – finding out if and how things are possible – and my experimentation guides me to my next research topic and pushes me to find out all I can.





This entry was posted on March 25, 2016 in Final Year - Major Project.