Alpha Video in Studio

Alpha video requires the latest version of ZapWorks Studio. Please make sure you download the latest version from the ZapWorks website.

In this beta tutorial we will take you through how to prepare and display alpha video using ZapWorks Studio. We’ll be using Premiere Pro as our video editing software of choice, but the process should be similar in other tools.


Video Preparation

While this demonstration outputs a video with a resolution of 3840x1080, we highly recommend that you keep your resolution beneath 2048 pixels in both dimensions, as some lower-end devices may have trouble playing larger videos.

First we start with our video, here we have some frankly excellent footage of Zappar’s Co-Founder, Dr Simon Taylor pretending to fly around like a familiar superhero who has jets in his feet and hands…

We’ve got this footage, which has a green screen background, in a sequence in a Premiere Pro project.


  1. The first thing to do is copy and paste this clip, and place the copy directly above the original in the timeline.


  2. Select the top clip and search for ‘Ultra Key’ in the Effects panel.


  3. Drag ‘Ultra Key’ onto the top clip and double click to open the Effect Controls panel.


  4. Use the Eyedropper tool to select the Key Color - click on the green in the Program Monitor.


  5. Copy and paste the whole ‘Ultra Key’ effect onto the clip below.


  6. Now, reselect the top clip, and in the Ultra Key Effect Controls, select ‘Alpha Channel’ as the Output.

You will see that Simon becomes a solid white shape.


  7. Next, we need to make the sequence we’ve been working in twice as wide. Select Sequence > Sequence Settings and change the horizontal Frame Size to double the width of your original video (3840 in our case, as our original video was 1920 pixels wide).


  8. Next, we need to position the clips so they are side by side. This needs to be pixel perfect, so don’t move the clips by eye. Double click the top clip in the timeline to open the Effect Controls panel. Change the horizontal position to 75% of the width of the sequence (2880 in our case). You will see the solid white version of Simon move over to the right in the Program Monitor.


  9. Now double click the bottom clip, and change the horizontal position to 25% of the width of the sequence (960 in our case).

You will see the full color version move over to the left.
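As a sanity check, the doubled sequence width and both horizontal positions can all be derived from the original clip width. Here is a minimal sketch of the arithmetic (plain TypeScript for illustration, not a Premiere script; the 1920 value is just our example width):

```typescript
// Derive the side-by-side sequence layout from the original clip width.
function sideBySideLayout(originalWidth: number) {
  const sequenceWidth = originalWidth * 2;    // sequence doubled in width
  const maskX = sequenceWidth * 0.75;         // top (alpha) clip centre, right half
  const colorX = sequenceWidth * 0.25;        // bottom (color) clip centre, left half
  return { sequenceWidth, maskX, colorX };
}

const layout = sideBySideLayout(1920);
// layout is { sequenceWidth: 3840, maskX: 2880, colorX: 960 }
```

If your source video is a different width, just substitute it in and use the resulting values in the Sequence Settings and Effect Controls panels.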


  10. Now, we’re ready to export the video as an H.264 MP4 file.

Below you can see the before and after of this process.

Before

https://zapworksforumuploads.s3-eu-west-1.amazonaws.com/original/1X/5bd117e276a3edf3bfe758d3b9045a6bb31c7909.mp4

After

https://zapworksforumuploads.s3-eu-west-1.amazonaws.com/original/1X/b5005654b4bf9e25e8c53c205832e97c82dbc924.mp4


  11. The final preparation step is to grab a single-frame snapshot of the video for reference later on when creating our experience. In Premiere you can create a snapshot by clicking the camera icon in the Program Monitor or using the appropriate keyboard shortcut (for us, on a Mac, it’s Shift and E).

Another option is to use the VLC Snapshot feature.

Snapshot


Studio Project

  1. Start a new project and import your exported side-by-side MP4 video along with your snapshot image into the Media Library.


  2. Drag both the video and image into the Hierarchy.


  3. Select the snapshot image node in the Hierarchy and scroll down to the scale property within the properties panel. Make a note of the first value (in our case 3.52) as we’ll be using it in a following step.


  4. The next step is to add two TransformedTextures, which will allow us to separate out the left and right sides of our video as if they were two separate video textures. We add TransformedTextures by right clicking on the root node and selecting ‘New > Textures > TransformedTexture’. Do this twice to get two TransformedTexture nodes.


  5. Rename one of the TransformedTextures to “colorTexture” and the other to “maskTexture” to help differentiate them.


  6. Select the colorTexture and, in the properties panel, set its texture to be that of the snapshot image. This will help us line things up for the final video texture, which at this point has no preview (which is why we are using the snapshot). After doing this you should see the snapshot appear in a small window in the top right of the 3D view.


  7. Now we want to position and scale the colorTexture so that it fits perfectly over the color side of our video. To do this we recall the scale number from before (3.52) and use that as the first (x-axis) scale value here. We then need to move the texture over to the color side of the snapshot, so we take half of the scale value (1.76) and use that to set the first (x-axis) position. With this done we can now see the color portion of the video in the preview box in the top right of the 3D view.


  8. Now we want to set up our maskTexture in the same way, following steps 6 and 7, however this time we want the texture to cover the other side of the snapshot, so we set the first (x-axis) position value to -1.76 instead of 1.76. We now see the mask portion of the snapshot in the preview box.


  9. With the TransformedTextures set up using the snapshot for guidance, we can now switch both of their textures to the video. This will cause a default preview image to appear in the preview box, but the video will be used when the experience is run.


  10. Next we want to create a new material that will use our TransformedTextures to create the alpha video effect. We do this by right clicking the root node in the Hierarchy and selecting ‘New > Material’. Name it videoMaterial or something similar for clarity.


  11. Selecting the material node in the Hierarchy, we set the skin property to the color TransformedTexture, the mask property to the mask TransformedTexture, and the maskAlphaChannel to either r, g or b (we’ll pick r). The choice doesn’t matter because our mask is black and white, so any of the color channels holds the same values.


  12. Now we need a plane to display our video on, so we drag one from the Media Library into the Hierarchy, renaming it “videoPlane”.


  13. With our plane selected in the Hierarchy, we change its material property to the video material we created in step 10.


  14. We now just need to tell our video to play at some point during our experience; for ease of use we’ll use the existing ‘show’ script. With it selected in the Hierarchy we’re presented with an editor view into which we can drag our video node (not the video plane or material, but our original video node). We drag it in above the parent show function and insert it as a variable, allowing us to call the start function on it within the show function.
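For reference, once the video node has been inserted as a variable, the ‘show’ script ends up looking something like the sketch below. This only runs inside ZapWorks Studio, the variable name (Simon_Flying_masked_mp40 in our project) depends on the name of your video node in the Hierarchy, and the inserted variable line may differ slightly from what Studio generates for you:

```typescript
// Studio inserts this variable when you drag the video node into the script.
var Simon_Flying_masked_mp40 = symbol.nodes.Simon_Flying_masked_mp40;

parent.on("show", () => {
	// Runs when show occurs on the parent node - start the video playing.
	Simon_Flying_masked_mp40.start();
});
```

The start call itself (e.g. Simon_Flying_masked_mp40.start();) is the only line you need to type by hand inside the show handler.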


  15. The final thing we want to do before previewing our experience is to hide the snapshot image we used for guidance earlier so it doesn’t appear within our experience. We do this by selecting it in the Hierarchy, navigating to the visible property in the Properties panel and unchecking it.


  16. We can now preview our experience by selecting the preview option in the top left and giving the resulting code a scan with the Zappar App!


You can take a look at our completed project here - AlphaVideo.zpp (1.6 MB)


Considerations

  • It may take some fiddling around with the scale and positioning of the TransformedTextures to get them lined up correctly; it’s helpful to click between the mask and color nodes to check their alignment.

  • The snapshot can be removed from the Hierarchy and project altogether once you have positioned the TransformedTextures, as it’s just used as a guide.


Any questions or issues with the tutorial let us know :slight_smile:


Part 2: Streaming Alpha Video

Given the nature of alpha video in ZapWorks Studio (double-width video), file size can quickly become a constraint. Luckily Zappar’s streaming video implementation helps alleviate this somewhat, allowing the video to be streamed instead.

This follow up post will build upon the project created above and make use of the video player symbol to help give us a head start on the streamed video side of things.


Project

  1. Carrying on from the same project as last time, create a video player symbol by clicking on the plus icon in the Symbol Definitions panel, selecting ‘Video Player’ and then clicking ‘Create’.


  2. Drag the video player symbol from the Symbol Definitions panel into the Hierarchy beneath the videoPlane.


  3. Double click on the videoplayer symbol definition to enter it.


  4. Within the videoplayer symbol, locate the backing plane within the controlsGroup group, select it, and untick the visible property in the Properties panel.


  5. Do the same as the step above, but this time for the bufferingBacking plane within the bufferingGroup group node.

And finally for the Plane0 plane.

The reason we are making these planes invisible is that we want to use the videoPlane in our parent symbol to display the video, and not have the ones set up by default in the videoplayer symbol get in the way.


  6. Return to the parent symbol by double clicking the cog in the top left, just above the Hierarchy.


  7. Back in the parent symbol, select the colorTexture node and set its texture to none. Do the same for the maskTexture as well.

We are doing this because we will set the texture in a script node in the following steps.


  8. Right click on the root node in the Hierarchy and select ‘New > Script > blank’ to create a new blank script.

Feel free to rename this script to something more suitable by right clicking the newly created script and selecting ‘Rename’. This script will be used to set the texture of the transformed textures so we will call it ‘textureAssignment’.


  9. With the script editor open, click and drag the videoplayer node from the Hierarchy to the editor and select ‘Insert Local Variable’. This will let us access the properties of the videoplayer, which we will need in the following steps.


  10. Then click and drag the root node into the script editor and select ‘Add an event handler > show’ to create an event handler function that executes when the show event is fired. This is where we will place our texture assignment code.


  11. Now we can drag both the color and mask texture nodes into the editor and insert them as local variables, like we did with the videoplayer node.


  12. Finally, we can assign the textures to the transformed textures by typing out their variable names (the text between the var keyword and the = sign), calling their texture function (used for setting their texture) and passing in some code to access the videoTexture from the video player symbol.

The code used to access the texture might be different depending on the name of your nodes but the default should be videoplayer0.nodes.show0.myvid.

With this code we are accessing the nodes within the video player symbol, getting the show node specifically, and asking for the myvid object, which is the texture in the video player symbol. It’s not vital you understand everything that’s going on in this step, just enough to make sure you are accessing the right component.

The complete code as shown above is as follows:

parent.on("show", () => {
	// Runs when show occurs on the parent node
	colorTexture.texture(videoplayer0.nodes.show0.myvid);
	maskTexture.texture(videoplayer0.nodes.show0.myvid);
});

  13. The last thing to do before previewing the experience is to select the videoplayer node and give it a URL to the video you wish to stream. For this example we can use the video we have here on the forum, the URL being:

https://d2j4z507ms5wl7.cloudfront.net/forum/simon_flying_-_masked.mp4

We can paste this URL into the videoURL property in the Properties panel. We can also set video to auto play and to show controls such as pause and play when tapping on the video. We do this by ticking the autoPlay and showControls checkboxes respectively.

  14. We are now ready to preview the experience and give it a scan!


You can take a look at the completed project here - AlphaStreamedVideo.zpp (2.1 MB)


Considerations

  • When using your own video, make sure that it is a publicly hosted MP4 or M3U8 (HLS streaming) file hosted on a server that supports range requests.

  • If starting from scratch, make sure you still grab a screenshot of the video for aligning the transformed textures.

  • While we used the video player symbol to help speed things up, you can use the underlying Z.Video technology for a more slimline solution.

I am having trouble with step #14 in Part 1. I cannot get my editor to display a start function. I have dragged the video node into the editor above the parent show function, but cannot enable the “Start the video” function.

Any guidance will be appreciated.


Thanks for this, was trying to figure out how to pull off something similar :grinning:

Cheers

Hi There,

For step 14, after dragging the video node into the script and selecting it to insert as a variable you will then need to type the code out yourself in the parent_show function. The exact code depends on what the name of your video node is in the Hierarchy but in this instance the code I manually typed at this stage was:

Simon_Flying_masked_mp40.start();

Hope this helps :slight_smile:

Mark


This is absolutely amazing. Works great!

However, I want a button to turn my introduction video off when clicked. I’ve created 2 states (visible and invisible) which hide the video on pointerdown, but the audio keeps going. Can you tell me of a better way to stop and hide the video and audio together?

Thank you!

Hi There,

The best way to stop a video from playing when using the video player symbol is to call the exported pause function as mentioned in the Video player symbol article.

Currently, this has to be done within a script but if you have already set up a state for hiding the video then you can create an event handler function for the active event of that state simply by dragging your state into a script and selecting ‘Add an event handler > active’.

Place the pause code inside that generated function and it should work. :slight_smile:

Mark

re: alpha video

Have you considered adding VP8 support into Zappar? The codec itself supports an alpha channel (note: VP8, not VP9). My workflow for creating such videos is: create the video, export as a series of PNG frames, glue together in VP8 format using ffmpeg, play back in Chrome/up to date versions of Firefox (VLC doesn’t support it properly).
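For anyone wanting to try that workflow, the ffmpeg step looks roughly like the command below. This is a sketch rather than a tested recipe; the frame-name pattern, frame rate and output filename are placeholders you would adapt to your own export:

```shell
# Glue alpha-channel PNG frames into a VP8 WebM that preserves transparency.
# yuva420p keeps the alpha plane; -auto-alt-ref 0 is required for alpha with libvpx.
ffmpeg -framerate 30 -i frame_%04d.png \
       -c:v libvpx -pix_fmt yuva420p -auto-alt-ref 0 \
       output_alpha.webm
```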

Hi Lee,

Thanks for the suggestion.

The reason we’ve decided to go with mp4 is because it’s supported by hardware video decoders across basically all devices and VP8 won’t have the same level of support.

In addition software decoding would be too slow to do with high-resolution videos alongside all the other CPU stuff we’re trying to do.

Mark

Hello, how can I use your AlphaStreamedVideo.zpp as a template, just changing the video?

Unfortunately, it’s not as simple as a template where you can just change the URL as the video content will be different.

You will have to take the steps as outlined in the tutorial to make sure you map the left and right hand side of the videos correctly.

Thanks,
Mark

I have an important question.
In your tutorial there are 3 parts:
1- prepare video
2- studio project
3- streaming alpha video
Which parts do I need to follow for the studio project, 2 or 3?

It depends on what form the video you want to use is currently in.

If you already have a video that has a color version side by side with a mask (as seen at the end of the video setup section) then you just need to follow the Studio project section. If your video is in another format then you’ll have to follow the video preparation steps as well.

The ‘streaming’ video part is an alternative step to the studio project that shows how you can stream an alpha video as opposed to embedding it, it’s not a necessary step.

Thanks,
Mark

That means if I have the video ready side by side,
I can go straight to the streaming video section and skip the Studio Project section?

If you want to use the streaming video implementation and already have your video setup then yes :slight_smile:

I did section 1, Video Preparation,
and section 3, Streaming Alpha Video.
I left out section 2, Studio Project.

But it is not working, because section 3 uses the colorTexture and maskTexture that you created in section 2.
Does that mean I need to do some parts from section 2 as well?
I cannot make it work! :(…

Hi Elias,

My apologies, the streaming video implementation does indeed build upon the embedded implementation.

The steps you’ll need to follow from the embedded section (Studio Project) are step 1 and 2 regarding the snapshot and then steps 3-8 to create the color and mask textures.

From there you just need to follow the streaming steps (Part 2: Streaming Alpha Video).

Hope this helps.

Thanks,
Mark

Ah, that is great, thank you.
May I ask, please, what software do you use to decrease the size of the MP4?
I see your Simon_Flying_masked video is very small in size.

Hi Elias,

We use and recommend HandBrake for compressing videos :slight_smile:

Thanks,
Mark

Thanks for all the feedback, this tutorial is now live on our documentation - https://docs.zap.works/studio/alpha-video/ :slight_smile:

Mark