🌎

Week 5. Augmented reality.

Today, we are going to build a simple video overlay in an augmented reality scene on Android.

Here is a brief outline of steps:

  1. Add the camera permission and AR feature
  2. Add the Sceneform library to build.gradle
  3. Add an ArFragment
  4. Create a raw resources directory
  5. Paste the video and model files into the new directory
  6. Define variables and objects
  7. Set up the ArFragment and MediaPlayer
  8. Create the ModelRenderable
  9. Set the on-tap action for the ArFragment
  10. Release the MediaPlayer

Step 1. Add the camera permission and AR feature

AndroidManifest.xml

<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera.ar" android:required="true"/>

<application
        ...
        <meta-data android:name="com.google.ar.core" android:value="required"/>
        ...
</application>
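
Setting the ARCore meta-data value to "required" (together with android:required="true" on the camera.ar feature) tells Google Play to offer the app only on ARCore-supported devices. ArFragment also requests the CAMERA permission at runtime for you, so no extra permission-handling code is needed.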

Step 2. Add the Sceneform library to build.gradle (Module: app)

android {
    ...
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
    ...
}

dependencies {
    ...
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.17.1'
    ...
}
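
Sceneform additionally requires a minimum API level of 24, so defaultConfig should reflect that. A minimal sketch, with the rest of the block elided:

android {
    defaultConfig {
        ...
        // Sceneform requires Android 7.0 (API level 24) or newer
        minSdkVersion 24
    }
}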

Step 3. Add an ArFragment to activity_main.xml

<fragment
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:id="@+id/fragment"
    android:name="com.google.ar.sceneform.ux.ArFragment"/>

Step 4. Create a raw resources directory:

Right-click on the app module: New → Android Resource Directory → set Resource type to raw (the Directory name fills in automatically) → OK. This creates app/src/main/res/raw.

Step 5. Paste the required files into the newly created raw directory: the video clip (referenced below as R.raw.unist) and the video screen model (referenced as R.raw.video_screen).

Step 6. Define variables in MainActivity.java:

// Elements
private ArFragment arFragment;
private ModelRenderable videoRenderable;
private MediaPlayer mediaPlayer;

// The chroma key color to filter out of the video (a known working combination)
private static final Color CHROMA_KEY_COLOR = new Color(0.1843f, 1.0f, 0.098f);

// Controls the height of the video in world space: adjustable
private static final float VIDEO_HEIGHT_METERS = 0.85f;
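
For reference, the types above come from the following packages (a minimal import list, assuming the Sceneform 1.17 package layout; note that this is Sceneform's Color class, not android.graphics.Color):

import android.media.MediaPlayer;

import com.google.ar.sceneform.rendering.Color;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;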

Step 7. Set up the ArFragment and MediaPlayer inside onCreate() of MainActivity.java:

// bind fragment
arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.fragment);

// Create an ExternalTexture for displaying the contents of the video.
ExternalTexture texture = new ExternalTexture();

// Create an Android MediaPlayer to capture the video on the external texture's surface.
mediaPlayer = MediaPlayer.create(this, R.raw.unist);
mediaPlayer.setSurface(texture.getSurface());
mediaPlayer.setLooping(true);
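
Keep texture as a local variable in onCreate(): it is effectively final, so the lambdas in Steps 8 and 9 can capture it. The remaining classes used in Steps 7-9 would be imported like this (again assuming the standard ARCore/Sceneform packages):

import android.graphics.SurfaceTexture;
import android.view.Gravity;
import android.view.MotionEvent;
import android.widget.Toast;

import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.math.Vector3;
import com.google.ar.sceneform.rendering.ExternalTexture;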

Step 8. Create the ModelRenderable object inside onCreate():

// Create a renderable with a material that has a parameter of type 'samplerExternal' so that
// it can display an ExternalTexture. The material also has an implementation of a chroma key
// filter.
ModelRenderable.builder()
        .setSource(this, R.raw.video_screen)
        .build()
        .thenAccept(
                renderable -> {
                    videoRenderable = renderable;
                    renderable.getMaterial().setExternalTexture("videoTexture", texture);
                    renderable.getMaterial().setFloat4("keyColor", CHROMA_KEY_COLOR);
                })
        .exceptionally(
                throwable -> {
                    Toast toast =
                            Toast.makeText(this, "Unable to load video renderable", Toast.LENGTH_LONG);
                    toast.setGravity(Gravity.CENTER, 0, 0);
                    toast.show();
                    return null;
                });
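
The parameter names "videoTexture" and "keyColor" are not arbitrary: they must match the material parameters defined in the video_screen model's custom material, which implements the chroma key filter. If you use a different model, adjust the names accordingly.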

Step 9. Set the on-tap action for the ArFragment inside onCreate():

arFragment.setOnTapArPlaneListener(
    (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
        if (videoRenderable == null) {
            return;
        }

        // Create the Anchor.
        Anchor anchor = hitResult.createAnchor();
        AnchorNode anchorNode = new AnchorNode(anchor);
        anchorNode.setParent(arFragment.getArSceneView().getScene());

        // Create a node to render the video and add it to the anchor.
        Node videoNode = new Node();
        videoNode.setParent(anchorNode);

        // Set the scale of the node so that the aspect ratio of the video is correct.
        float videoWidth = mediaPlayer.getVideoWidth();
        float videoHeight = mediaPlayer.getVideoHeight();
        videoNode.setLocalScale(
                new Vector3(
                        VIDEO_HEIGHT_METERS * (videoWidth / videoHeight), VIDEO_HEIGHT_METERS, 1.0f));

        // Start playing the video when the first node is placed.
        if (!mediaPlayer.isPlaying()) {
            mediaPlayer.start();

            // Wait to set the renderable until the first frame of the video becomes available.
            // This prevents the renderable from briefly appearing as a black quad before the video
            // plays.
            texture
                    .getSurfaceTexture()
                    .setOnFrameAvailableListener(
                            (SurfaceTexture surfaceTexture) -> {
                                videoNode.setRenderable(videoRenderable);
                                texture.getSurfaceTexture().setOnFrameAvailableListener(null);
                            });
        } else {
            videoNode.setRenderable(videoRenderable);
        }
    });
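
Note that the listener captures texture and mediaPlayer from Step 7, which compiles because texture is effectively final. Every tap on a detected plane anchors a new video node, but all of them share the single MediaPlayer instance, so each placed screen shows the same video stream.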

Step 10. Release the MediaPlayer in onDestroy():

@Override
public void onDestroy() {
    super.onDestroy();

    if (mediaPlayer != null) {
        mediaPlayer.release();
        mediaPlayer = null;
    }
}
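
Optionally, playback can also be paused while the activity is in the background. This override is an addition, not part of the original sample:

@Override
protected void onPause() {
    super.onPause();

    // Pause playback so the video (and its audio) does not keep running
    // while the activity is not visible.
    if (mediaPlayer != null && mediaPlayer.isPlaying()) {
        mediaPlayer.pause();
    }
}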

Full Code: