Monday, 21 April 2014

Capture video in Unity3d using Intel INDE Media Pack for Android




In one of the comments on the article about capturing video in OpenGL applications, a reader asked whether the same could be done in applications created with Unity3d. The question interested us: indeed, why limit ourselves to "plain" OpenGL applications when many developers build their games with various frameworks and engines? Today we are pleased to present a turnkey solution: video capture in applications written with Unity3d for Android. From this article you will learn not only how to embed video capture in Unity3d, but also how to create Android plugins for Unity in general.

We will consider two ways to implement video capture in Unity3d:

1. A fullscreen post effect. This method works only in the Pro version, and Unity GUI objects are not captured in the video.
2. A frame buffer (FrameBuffer). This works in all versions of Unity3d, free and paid, and Unity GUI objects are recorded in the video as well.

What we need:

    
Unity3d version 4.3: the Pro version for the first method; the free version supports only the frame-buffer method
An installed Android SDK
An installed Intel INDE Media Pack for Android
Apache Ant (to build the plugin for Android)
Create a project

Open the Unity editor and create a new project. In the Assets folder, create a folder named Plugins, and inside it a folder named Android. From the libs directory of the folder where you installed Intel INDE Media Pack for Android, copy the two jar files (android-<version>.jar and domain-<version>.jar) into the Android folder of your project.
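The same folder layout can be created programmatically; here is a small Java sketch (the project name MyUnityProject and the class ProjectSetup are example names of ours, not part of the tutorial):

```java
import java.io.File;

public class ProjectSetup {
    // Creates Assets/Plugins/Android inside the given Unity project folder
    // and reports where the Media Pack jars should be copied.
    public static File createPluginDirs(File projectRoot) {
        File pluginDir = new File(projectRoot, "Assets/Plugins/Android");
        pluginDir.mkdirs();  // no-op if the folders already exist
        return pluginDir;
    }

    public static void main(String[] args) {
        File dir = createPluginDirs(new File("MyUnityProject"));
        System.out.println("Copy android-<version>.jar and domain-<version>.jar into: " + dir);
    }
}
```

In practice you would simply create the folders in the editor or file manager; the sketch only makes the expected layout explicit.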




image
In the same folder, create a new file named Capturing.java and copy the following code into it:

Capturing.java

package com.intel.inde.mp.samples.unity;

import com.intel.inde.mp.android.graphics.FullFrameTexture;

import android.os.Environment;

import java.io.IOException;
import java.io.File;

public class Capturing
{
    private static FullFrameTexture texture;

    public Capturing()
    {
        texture = new FullFrameTexture();
    }

    // Path to the folder where the video will be saved
    public static String getDirectoryDCIM()
    {
        return Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM) + File.separator;
    }

    // Configure the video settings
    public void initCapturing(int width, int height, int frameRate, int bitRate)
    {
        VideoCapture.init(width, height, frameRate, bitRate);
    }

    // Start the video capture process
    public void startCapturing(String videoPath)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture)
        {
            try
            {
                capture.start(videoPath);
            }
            catch (IOException e)
            {
            }
        }
    }

    // Called for each captured frame
    public void captureFrame(int textureID)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture)
        {
            capture.beginCaptureFrame();
            texture.draw(textureID);
            capture.endCaptureFrame();
        }
    }

    // Stop the video capture process
    public void stopCapturing()
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture)
        {
            if (capture.isStarted())
            {
                capture.stop();
            }
        }
    }
}

Add another Java file, this time named VideoCapture.java:

VideoCapture.java

package com.intel.inde.mp.samples.unity;

import com.intel.inde.mp.*;
import com.intel.inde.mp.android.AndroidMediaObjectFactory;
import com.intel.inde.mp.android.AudioFormatAndroid;
import com.intel.inde.mp.android.VideoFormatAndroid;

import java.io.IOException;

public class VideoCapture
{
    private static final String TAG = "VideoCapture";

    private static final String Codec = "video/avc";
    private static int IFrameInterval = 1;

    private static final Object syncObject = new Object();
    private static volatile VideoCapture videoCapture;

    private static VideoFormat videoFormat;
    private static int videoWidth;
    private static int videoHeight;
    private GLCapture capturer;

    private boolean isConfigured;
    private boolean isStarted;
    private long framesCaptured;

    private VideoCapture()
    {
    }

    public static void init(int width, int height, int frameRate, int bitRate)
    {
        videoWidth = width;
        videoHeight = height;

        videoFormat = new VideoFormatAndroid(Codec, videoWidth, videoHeight);
        videoFormat.setVideoFrameRate(frameRate);
        videoFormat.setVideoBitRateInKBytes(bitRate);
        videoFormat.setVideoIFrameInterval(IFrameInterval);
    }

    public static VideoCapture getInstance()
    {
        if (videoCapture == null)
        {
            synchronized (syncObject)
            {
                if (videoCapture == null)
                {
                    videoCapture = new VideoCapture();
                }
            }
        }
        return videoCapture;
    }

    public void start(String videoPath) throws IOException
    {
        if (isStarted())
        {
            throw new IllegalStateException(TAG + " already started!");
        }

        capturer = new GLCapture(new AndroidMediaObjectFactory());
        capturer.setTargetFile(videoPath);
        capturer.setTargetVideoFormat(videoFormat);

        AudioFormat audioFormat = new AudioFormatAndroid("audio/mp4a-latm", 44100, 2);
        capturer.setTargetAudioFormat(audioFormat);

        capturer.start();

        isStarted = true;
        isConfigured = false;
        framesCaptured = 0;
    }

    public void stop()
    {
        if (!isStarted())
        {
            throw new IllegalStateException(TAG + " not started or already stopped!");
        }

        try
        {
            capturer.stop();
            isStarted = false;
        }
        catch (Exception ex)
        {
        }

        capturer = null;
        isConfigured = false;
    }

    private void configure()
    {
        if (isConfigured())
        {
            return;
        }

        try
        {
            capturer.setSurfaceSize(videoWidth, videoHeight);
            isConfigured = true;
        }
        catch (Exception ex)
        {
        }
    }

    public void beginCaptureFrame()
    {
        if (!isStarted())
        {
            return;
        }

        configure();

        if (!isConfigured())
        {
            return;
        }

        capturer.beginCaptureFrame();
    }

    public void endCaptureFrame()
    {
        if (!isStarted() || !isConfigured())
        {
            return;
        }

        capturer.endCaptureFrame();

        framesCaptured++;
    }

    public boolean isStarted()
    {
        return isStarted;
    }

    public boolean isConfigured()
    {
        return isConfigured;
    }
}

Important: note the package name, com.intel.inde.mp.samples.unity. It must match the name in the project settings (Player Settings / Other Settings / Bundle Identifier):


image

Moreover, you must use the same name in the C# script that calls the Java class. If these names do not match, your game will crash at startup.

Add some dynamic content to your scene. You can also integrate Intel INDE Media Pack for Android with any existing project instead of creating one from scratch; either way, try to have something dynamic in the scene. Otherwise it will not be very interesting to watch videos in which nothing changes.

Now, as in any other Android application, we must configure the manifest. Create the file AndroidManifest.xml in the /Plugins/Android folder and copy the following into it:

AndroidManifest.xml

<?xml version="1.0" encoding="utf-8"?>
<manifest
    xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.intel.inde.mp.samples.unity"
    android:installLocation="preferExternal"
    android:theme="@android:style/Theme.NoTitleBar"
    android:versionCode="1"
    android:versionName="1.0">

    <uses-sdk android:minSdkVersion="18" />

    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
    <uses-permission android:name="android.permission.INTERNET"/>

    <!-- Uses the microphone -->
    <uses-permission android:name="android.permission.RECORD_AUDIO" />

    <!-- Requires OpenGL ES >= 2.0 -->
    <uses-feature
        android:glEsVersion="0x00020000"
        android:required="true" />

    <application
        android:icon="@drawable/app_icon"
        android:label="@string/app_name"
        android:debuggable="true">
        <activity android:name="com.unity3d.player.UnityPlayerNativeActivity"
                  android:label="@string/app_name">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
            <meta-data android:name="unityplayer.UnityActivity" android:value="true" />
            <meta-data android:name="unityplayer.ForwardNativeEventsToDalvik" android:value="false" />
        </activity>
    </application>
</manifest>

Pay attention to the line:

package="com.intel.inde.mp.samples.unity"

The package name must match the one you specified earlier.

Now we have everything we need. Since Unity cannot compile our Java files on its own, we will create an Ant script. Note: if you use other classes and libraries, you will need to adjust the Ant script accordingly (see the documentation for details). The following Ant script is intended only for this tutorial. Create the file build.xml in the /Plugins/Android folder:

build.xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="UnityCapturing">
    <!-- Change this to match your configuration -->
    <property name="sdk.dir" value="C:\Android\sdk"/>
    <property name="target" value="android-18"/>
    <property name="unity.androidplayer.jarfile" value="C:\Program Files (x86)\Unity\Editor\Data\PlaybackEngines\androiddevelopmentplayer\bin\classes.jar"/>
    <!-- Source directory -->
    <property name="source.dir" value="\ProjectPath\Assets\Plugins\Android" />
    <!-- Output directory for .class files -->
    <property name="output.dir" value="\ProjectPath\Assets\Plugins\Android\classes"/>
    <!-- Name of the jar to be created. Note that the name should match the class name
         and the name placed in AndroidManifest.xml -->
    <property name="output.jarfile" value="Capturing.jar"/>
    <!-- Creates the output directories if they don't exist yet -->
    <target name="-dirs" depends="message">
        <echo>Creating output directory: ${output.dir}</echo>
        <mkdir dir="${output.dir}" />
    </target>
    <!-- Compiles this project's .java files into .class files -->
    <target name="compile" depends="-dirs"
            description="Compiles project's .java files into .class files">
        <javac encoding="ascii" target="1.6" debug="true" destdir="${output.dir}" verbose="${verbose}" includeantruntime="false">
            <src path="${source.dir}" />
            <classpath>
                <pathelement location="${sdk.dir}\platforms\${target}\android.jar"/>
                <pathelement location="${source.dir}\domain-1.0.903.jar"/>
                <pathelement location="${source.dir}\android-1.0.903.jar"/>
                <pathelement location="${unity.androidplayer.jarfile}"/>
            </classpath>
        </javac>
    </target>
    <target name="build-jar" depends="compile">
        <zip zipfile="${output.jarfile}"
             basedir="${output.dir}" />
    </target>
    <target name="clean-post-jar">
        <echo>Removing post-build-jar-clean</echo>
        <delete dir="${output.dir}"/>
    </target>
    <target name="clean" description="Removes output files created by other targets.">
        <delete dir="${output.dir}" verbose="${verbose}" />
    </target>
    <target name="message">
        <echo>Android Ant Build for Unity Android Plugin</echo>
        <echo>   message: Displays this message.</echo>
        <echo>   clean: Removes output files created by other targets.</echo>
        <echo>   compile: Compiles project's .java files into .class files.</echo>
        <echo>   build-jar: Compiles project's .class files into .jar file.</echo>
    </target>
</project>

Pay attention to the paths source.dir and output.dir and, of course, to the name of the output jar file, output.jarfile.

At the command prompt, navigate to the /Plugins/Android folder of the project and start the plugin build:

ant build-jar clean-post-jar

If you did everything as described above, then a few seconds later you will see a message that the build completed successfully!


image

A new file, Capturing.jar, containing our plugin code, should appear in the folder. The plugin is ready; it remains to make the necessary changes on the Unity3d side. First of all, we create a wrapper linking Unity and our Android plugin. To do this, create a file named Capture.cs in the project:

Capture.cs

using UnityEngine;
using System.Collections;
using System.IO;
using System;

[RequireComponent(typeof(Camera))]
public class Capture : MonoBehaviour
{
    public int videoWidth = 720;
    public int videoHeight = 1094;
    public int videoFrameRate = 30;
    public int videoBitRate = 3000;

    private string videoDir;
    public string fileName = "game_capturing-";

    private float nextCapture = 0.0f;
    public bool inProgress { get; private set; }

    private static IntPtr constructorMethodID = IntPtr.Zero;
    private static IntPtr initCapturingMethodID = IntPtr.Zero;
    private static IntPtr startCapturingMethodID = IntPtr.Zero;
    private static IntPtr captureFrameMethodID = IntPtr.Zero;
    private static IntPtr stopCapturingMethodID = IntPtr.Zero;

    private static IntPtr getDirectoryDCIMMethodID = IntPtr.Zero;

    private IntPtr capturingObject = IntPtr.Zero;

    void Start()
    {
        if (!Application.isEditor)
        {
            // Get a pointer to our class
            IntPtr classID = AndroidJNI.FindClass("com/intel/inde/mp/samples/unity/Capturing");

            // Find the constructor
            constructorMethodID = AndroidJNI.GetMethodID(classID, "<init>", "()V");

            // Register the methods implemented by the class
            initCapturingMethodID = AndroidJNI.GetMethodID(classID, "initCapturing", "(IIII)V");
            startCapturingMethodID = AndroidJNI.GetMethodID(classID, "startCapturing", "(Ljava/lang/String;)V");
            captureFrameMethodID = AndroidJNI.GetMethodID(classID, "captureFrame", "(I)V");
            stopCapturingMethodID = AndroidJNI.GetMethodID(classID, "stopCapturing", "()V");
            getDirectoryDCIMMethodID = AndroidJNI.GetStaticMethodID(classID, "getDirectoryDCIM", "()Ljava/lang/String;");

            jvalue[] args = new jvalue[0];

            videoDir = AndroidJNI.CallStaticStringMethod(classID, getDirectoryDCIMMethodID, args);

            // Create an object
            IntPtr local_capturingObject = AndroidJNI.NewObject(classID, constructorMethodID, args);
            if (local_capturingObject == IntPtr.Zero)
            {
                Debug.LogError("Can't create Capturing object");
                return;
            }

            // Save the pointer to the object
            capturingObject = AndroidJNI.NewGlobalRef(local_capturingObject);
            AndroidJNI.DeleteLocalRef(local_capturingObject);

            AndroidJNI.DeleteLocalRef(classID);
        }

        inProgress = false;
        nextCapture = Time.time;
    }

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        if (inProgress && Time.time > nextCapture)
        {
            CaptureFrame(src.GetNativeTextureID());
            nextCapture += 1.0f / videoFrameRate;
        }

        Graphics.Blit(src, dest);
    }

    public void StartCapturing()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] videoParameters = new jvalue[4];

        videoParameters[0].i = videoWidth;
        videoParameters[1].i = videoHeight;
        videoParameters[2].i = videoFrameRate;
        videoParameters[3].i = videoBitRate;

        AndroidJNI.CallVoidMethod(capturingObject, initCapturingMethodID, videoParameters);

        DateTime date = DateTime.Now;

        string fullFileName = fileName + date.ToString("ddMMyy-hhmmss.fff") + ".mp4";
        jvalue[] args = new jvalue[1];
        args[0].l = AndroidJNI.NewStringUTF(videoDir + fullFileName);
        AndroidJNI.CallVoidMethod(capturingObject, startCapturingMethodID, args);

        inProgress = true;
    }

    private void CaptureFrame(int textureID)
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[1];
        args[0].i = textureID;

        AndroidJNI.CallVoidMethod(capturingObject, captureFrameMethodID, args);
    }

    public void StopCapturing()
    {
        inProgress = false;

        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];

        AndroidJNI.CallVoidMethod(capturingObject, stopCapturingMethodID, args);
    }
}
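The StartCapturing() method above builds a timestamped file name with date.ToString("ddMMyy-hhmmss.fff"). The same pattern can be reproduced on the Java side; here is a small sketch (the class FileNameExample is ours, for illustration; SimpleDateFormat's "SSS" plays the role of C#'s "fff" milliseconds):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class FileNameExample {
    // Mirrors the C# expression: fileName + date.ToString("ddMMyy-hhmmss.fff") + ".mp4"
    public static String buildVideoFileName(String prefix, Date date) {
        SimpleDateFormat fmt = new SimpleDateFormat("ddMMyy-hhmmss.SSS");
        return prefix + fmt.format(date) + ".mp4";
    }

    public static void main(String[] args) {
        System.out.println(buildVideoFileName("game_capturing-", new Date()));
    }
}
```

Including milliseconds in the name is a simple way to avoid collisions when recordings are started in quick succession.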

Assign this script to the main camera. Before capturing video, you must configure the video format. You can do this directly in the editor by changing the corresponding parameters (videoWidth, videoHeight, etc.).

The methods Start(), StartCapturing() and StopCapturing() are fairly trivial: they are wrappers for calling the plugin code from Unity. More interesting is the method OnRenderImage(). It is called after all rendering is finished, just before the result is shown on the screen. The input image is contained in the texture src, and we must write the result into the texture dest.

This mechanism allows us to modify the final image by applying various effects, but that is outside our interest here; we want the picture as it is. To capture a frame into the video, we pass the texture Id to the Capturing object by calling captureFrame() with the texture Id as an input parameter. To draw to the screen, we simply copy src to dest:

Graphics.Blit(src, dest);
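The JNI signature strings used in Start(), such as "(IIII)V", "(Ljava/lang/String;)V" and "()Ljava/lang/String;", follow a simple encoding: the parameter type descriptors between parentheses, then the return type. A tiny Java helper (hypothetical, for illustration only; the class JniSig is not part of the tutorial) makes the scheme explicit:

```java
import java.util.Map;

public class JniSig {
    // JNI type descriptors for a few common Java types
    private static final Map<String, String> DESC = Map.of(
        "void", "V", "boolean", "Z", "int", "I", "long", "J",
        "float", "F", "double", "D", "String", "Ljava/lang/String;"
    );

    // Builds a JNI method descriptor, e.g. sig("void", "int", "int") -> "(II)V"
    public static String sig(String ret, String... params) {
        StringBuilder sb = new StringBuilder("(");
        for (String p : params) sb.append(DESC.get(p));
        return sb.append(')').append(DESC.get(ret)).toString();
    }
}
```

This is exactly how the descriptors passed to AndroidJNI.GetMethodID are composed: initCapturing(int, int, int, int) returning void becomes "(IIII)V".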
For convenience, let's create a button that turns game video recording on and off. To do this, create a GUI object and assign it a handler. The handler will live in the file CaptureGUI.cs:

CaptureGUI.cs

using UnityEngine;
using System.Collections;

public class CaptureGUI : MonoBehaviour
{
    public Capture capture;
    private GUIStyle style = new GUIStyle();

    void Start()
    {
        style.fontSize = 48;
        style.alignment = TextAnchor.MiddleCenter;
    }

    void OnGUI()
    {
        style.normal.textColor = capture.inProgress ? Color.red : Color.green;

        if (GUI.Button(new Rect(10, 200, 350, 100), capture.inProgress ? "[Stop Recording]" : "[Start Recording]", style))
        {
            if (capture.inProgress)
            {
                capture.StopCapturing();
            }
            else
            {
                capture.StartCapturing();
            }
        }
    }
}

Do not forget to initialize the capture field with an instance of the Capture class. Clicking the object starts or stops the video capture process; the result is saved in the folder /mnt/sdcard/DCIM/.

As I mentioned earlier, this method works only in the Pro version (the free version cannot use OnRenderImage() or call Graphics.Blit), and the final video will not contain Unity GUI objects. Both restrictions are removed by the second approach, which uses a FrameBuffer.

Capture video using the frame buffer

Edit Capturing.java; this time simply replace its contents:

Capturing.java

package com.intel.inde.mp.samples.unity;

import com.intel.inde.mp.android.graphics.FullFrameTexture;
import com.intel.inde.mp.android.graphics.FrameBuffer;

import android.os.Environment;

import java.io.IOException;
import java.io.File;

public class Capturing
{
    private static FullFrameTexture texture;
    private FrameBuffer frameBuffer;

    public Capturing(int width, int height)
    {
        frameBuffer = new FrameBuffer();
        frameBuffer.create(width, height);

        texture = new FullFrameTexture();
    }

    public static String getDirectoryDCIM()
    {
        return Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM) + File.separator;
    }

    public void initCapturing(int width, int height, int frameRate, int bitRate)
    {
        VideoCapture.init(width, height, frameRate, bitRate);
    }

    public void startCapturing(String videoPath)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture)
        {
            try
            {
                capture.start(videoPath);
            }
            catch (IOException e)
            {
            }
        }
    }

    public void beginCaptureFrame()
    {
        frameBuffer.bind();
    }

    public void captureFrame(int textureID)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture)
        {
            capture.beginCaptureFrame();
            texture.draw(textureID);
            capture.endCaptureFrame();
        }
    }

    public void endCaptureFrame()
    {
        frameBuffer.unbind();

        int textureID = frameBuffer.getTexture();

        captureFrame(textureID);
        texture.draw(textureID);
    }

    public void stopCapturing()
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture)
        {
            if (capture.isStarted())
            {
                capture.stop();
            }
        }
    }
}

As you can see, there are not many changes. The most important is the appearance of a new member:

FrameBuffer frameBuffer;

The constructor now accepts the width and height of the frame as parameters, which it needs to create a framebuffer of the required size. There are three new public methods: frameBufferTexture(), beginCaptureFrame() and endCaptureFrame(). Their purpose will become clearer when we get to the C# code.

We leave the file VideoCapture.java unchanged. Next, rebuild the Android plugin the same way as discussed above. Now we can switch to Unity. Open the script Capture.cs and replace its contents:

Capture.cs

using UnityEngine;
using System.Collections;
using System.IO;
using System;

[RequireComponent(typeof(Camera))]
public class Capture : MonoBehaviour
{
    public int videoWidth = 720;
    public int videoHeight = 1094;
    public int videoFrameRate = 30;
    public int videoBitRate = 3000;

    private string videoDir;
    public string fileName = "game_capturing-";

    private float nextCapture = 0.0f;
    public bool inProgress { get; private set; }
    private bool finalizeFrame = false;
    private Texture2D texture = null;

    private static IntPtr constructorMethodID = IntPtr.Zero;
    private static IntPtr initCapturingMethodID = IntPtr.Zero;
    private static IntPtr startCapturingMethodID = IntPtr.Zero;
    private static IntPtr beginCaptureFrameMethodID = IntPtr.Zero;
    private static IntPtr endCaptureFrameMethodID = IntPtr.Zero;
    private static IntPtr stopCapturingMethodID = IntPtr.Zero;

    private static IntPtr getDirectoryDCIMMethodID = IntPtr.Zero;

    private IntPtr capturingObject = IntPtr.Zero;

    void Start()
    {
        if (!Application.isEditor)
        {
            // Get a pointer to our class
            IntPtr classID = AndroidJNI.FindClass("com/intel/inde/mp/samples/unity/Capturing");

            // Find the constructor
            constructorMethodID = AndroidJNI.GetMethodID(classID, "<init>", "(II)V");

            // Register the methods implemented by the class
            initCapturingMethodID = AndroidJNI.GetMethodID(classID, "initCapturing", "(IIII)V");
            startCapturingMethodID = AndroidJNI.GetMethodID(classID, "startCapturing", "(Ljava/lang/String;)V");
            beginCaptureFrameMethodID = AndroidJNI.GetMethodID(classID, "beginCaptureFrame", "()V");
            endCaptureFrameMethodID = AndroidJNI.GetMethodID(classID, "endCaptureFrame", "()V");
            stopCapturingMethodID = AndroidJNI.GetMethodID(classID, "stopCapturing", "()V");

            getDirectoryDCIMMethodID = AndroidJNI.GetStaticMethodID(classID, "getDirectoryDCIM", "()Ljava/lang/String;");
            jvalue[] args = new jvalue[0];
            videoDir = AndroidJNI.CallStaticStringMethod(classID, getDirectoryDCIMMethodID, args);

            // Create an object
            jvalue[] constructorParameters = new jvalue[2];

            constructorParameters[0].i = Screen.width;
            constructorParameters[1].i = Screen.height;

            IntPtr local_capturingObject = AndroidJNI.NewObject(classID, constructorMethodID, constructorParameters);

            if (local_capturingObject == IntPtr.Zero)
            {
                Debug.LogError("Can't create Capturing object");
                return;
            }

            // Save the pointer to the object
            capturingObject = AndroidJNI.NewGlobalRef(local_capturingObject);
            AndroidJNI.DeleteLocalRef(local_capturingObject);

            AndroidJNI.DeleteLocalRef(classID);
        }

        inProgress = false;
        nextCapture = Time.time;
    }

    void OnPreRender()
    {
        if (inProgress && Time.time > nextCapture)
        {
            finalizeFrame = true;
            nextCapture += 1.0f / videoFrameRate;
            BeginCaptureFrame();
        }
    }

    public IEnumerator OnPostRender()
    {
        if (finalizeFrame)
        {
            finalizeFrame = false;
            yield return new WaitForEndOfFrame();
            EndCaptureFrame();
        }
        else
        {
            yield return null;
        }
    }

    public void StartCapturing()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] videoParameters = new jvalue[4];

        videoParameters[0].i = videoWidth;
        videoParameters[1].i = videoHeight;
        videoParameters[2].i = videoFrameRate;
        videoParameters[3].i = videoBitRate;

        AndroidJNI.CallVoidMethod(capturingObject, initCapturingMethodID, videoParameters);

        DateTime date = DateTime.Now;

        string fullFileName = fileName + date.ToString("ddMMyy-hhmmss.fff") + ".mp4";
        jvalue[] args = new jvalue[1];

        args[0].l = AndroidJNI.NewStringUTF(videoDir + fullFileName);
        AndroidJNI.CallVoidMethod(capturingObject, startCapturingMethodID, args);

        inProgress = true;
    }

    private void BeginCaptureFrame()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];
        AndroidJNI.CallVoidMethod(capturingObject, beginCaptureFrameMethodID, args);
    }

    private void EndCaptureFrame()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];
        AndroidJNI.CallVoidMethod(capturingObject, endCaptureFrameMethodID, args);
    }

    public void StopCapturing()
    {
        inProgress = false;

        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];
        AndroidJNI.CallVoidMethod(capturingObject, stopCapturingMethodID, args);
    }
}

There are many more changes in this code, but the logic remains simple. First, we pass the frame dimensions to the constructor of Capturing. Note the new constructor signature: (II)V. On the Java side, we create a FrameBuffer object and pass it the specified parameters.

The method OnPreRender() is called before the camera starts rendering the scene. This is where we switch to our FrameBuffer; thus, all rendering is done into the texture attached to the FrameBuffer. The method OnPostRender() is called after rendering. We wait for the end of the frame, disable the FrameBuffer and copy its texture directly to the screen by means of the Media Pack (see the method endCaptureFrame() in Capturing.java).

Performance

Developers often ask how video capture affects performance and how much the FPS drops. The result always depends on the specific application, the complexity of the scene and the device on which the application runs. So that you have a means of assessing performance, let's add a simple FPS counter. To do this, add a GUI object to the Unity scene and attach the following code to it:

FPS.cs

using UnityEngine;
using System.Collections;

public class FPSCounter : MonoBehaviour
{
    public float updateRate = 4.0f;  // 4 updates per sec.

    private int frameCount = 0;
    private float nextUpdate = 0.0f;
    private float fps = 0.0f;
    private GUIStyle style = new GUIStyle();

    void Start()
    {
        style.fontSize = 48;
        style.normal.textColor = Color.white;

        nextUpdate = Time.time;
    }

    void Update()
    {
        frameCount++;

        if (Time.time > nextUpdate)
        {
            nextUpdate += 1.0f / updateRate;
            fps = frameCount * updateRate;
            frameCount = 0;
        }
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 110, 300, 100), "FPS: " + fps, style);
    }
}
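The update rule in FPSCounter is easy to check with simulated timestamps; here is a plain Java sketch of the same logic (the class name FpsCounter is ours, for illustration; times are in seconds):

```java
public class FpsCounter {
    private final float updateRate;  // fps updates per second
    private int frameCount = 0;
    private float nextUpdate;
    private float fps = 0.0f;

    public FpsCounter(float updateRate, float startTime) {
        this.updateRate = updateRate;
        this.nextUpdate = startTime;
    }

    // Call once per rendered frame with the current time in seconds.
    public void onFrame(float time) {
        frameCount++;
        if (time > nextUpdate) {
            nextUpdate += 1.0f / updateRate;
            // Frames counted over a 1/updateRate window, scaled to frames per second
            fps = frameCount * updateRate;
            frameCount = 0;
        }
    }

    public float getFps() { return fps; }
}
```

Feeding it one timestamp per frame at a steady 60 frames per second converges on an fps reading of 60 after the first update window.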

With that, our work can be considered finished. Run the project and experiment with recording.
