My first 40 Hour Project (40H) is Thick. It’s an adult project, so please don’t continue unless you’re over the age of 18.

Thick is an iOS app that plays adult videos in ARKit, so that the talent in the film is presented like a hologram. My earlier post about Blade Runner was the precursor to this.

It only took me 26 hours to get it to a demoable state, which is less than I expected.

(Chart: hours to complete)

The majority of the time wasn’t spent on ARKit-related work; it actually went to video-related work (getting the right video, editing, exporting, etc.). That was really hard.

I tried three methods to achieve transparency in ARKit: chroma keying, 32-bit video, and masking.

I found the chroma key and masking methods in this Stack Overflow post. I never got 32-bit video to work.

The first is chroma keying, or green screen. You write a shader that marks green pixels as transparent. Your shader would look something like this:

uniform vec3 c_colorToReplace = vec3(0, 1, 0);
uniform float c_thresholdSensitivity = 0.05;
uniform float c_smoothing = 0.0;

#pragma transparent
#pragma body

vec3 textureColor = _surface.diffuse.rgb;

// Convert the key color to YCrCb so the comparison ignores brightness.
float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

// Convert the current pixel the same way.
float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
float Cr = 0.7132 * (textureColor.r - Y);
float Cb = 0.5647 * (textureColor.b - Y);

// Pixels whose chroma is close to the key color blend toward transparent.
float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

_surface.transparent.a = blendValue;
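
This is a SceneKit surface shader modifier. As a minimal sketch of how you’d attach it, assuming the GLSL above lives in a string called chromaKeyShader and videoURL points at the green-screen clip (both names are mine, not the app’s actual code):

import SceneKit
import AVFoundation

// Sketch only: chromaKeyShader holds the GLSL above; videoURL is hypothetical.
let player = AVPlayer(url: videoURL)
let plane = SCNPlane(width: 0.5, height: 0.9)
let material = plane.firstMaterial!
material.diffuse.contents = player                      // AVPlayer is valid material contents
material.shaderModifiers = [.surface: chromaKeyShader]  // run the chroma key per fragment
let node = SCNNode(geometry: plane)
// Add node to the ARSCNView's scene, then call player.play().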

Unfortunately you get fairly imprecise results. I think this is something I could still do, and it would make a live-cam version of the app possible.

The second is 32-bit video, meaning a video with an alpha channel. It doesn’t appear to be possible to play these on iOS devices; I could never get one to work in an ARSCNView.

The last method is to create two separate videos: one with the color channels, and one with the alpha channel. This is the method I’m currently using.

The white represents opaque pixels, and black represents where we want the video to be transparent.

(Image: the alpha channel video)

You play both videos using AVPlayer.
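
Here’s a rough sketch of how that can look, assuming the two exports are named talent_color.mp4 and talent_mask.mp4 (hypothetical names). The mask video is sampled through the material’s transparent property, and a small surface modifier, in the same spirit as the chroma key shader above, copies its luminance into the alpha channel:

import SceneKit
import AVFoundation

// Hypothetical file names for the color and mask exports.
let colorPlayer = AVPlayer(url: Bundle.main.url(forResource: "talent_color", withExtension: "mp4")!)
let maskPlayer  = AVPlayer(url: Bundle.main.url(forResource: "talent_mask",  withExtension: "mp4")!)

let plane = SCNPlane(width: 0.5, height: 0.9)
let material = plane.firstMaterial!
material.diffuse.contents = colorPlayer     // RGB video
material.transparent.contents = maskPlayer  // grayscale mask video

// Route the mask's red channel (white = opaque, black = transparent)
// into the surface's alpha.
material.shaderModifiers = [
    .surface: """
    #pragma transparent
    #pragma body
    _surface.transparent.a = _surface.transparent.r;
    """
]

let node = SCNNode(geometry: plane)
// Add node to the ARSCNView's scene, then start both clips;
// keeping the two players in sync is the fiddly part.
colorPlayer.play()
maskPlayer.play()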

Here’s the breakdown of what I spent time on. The (number) at the end of each entry is the GitHub issue number the work corresponds to.

(Images: hours breakdown, parts 1 and 2)