Creating Frame Processor Plugins

Overview

Frame Processor Plugins are native functions which can be directly called from a JS Frame Processor. (See "Frame Processors")

They receive a frame from the Camera as an input and can return any kind of output. For example, a detectFaces function returns an array of detected faces in the frame:

function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    const faces = detectFaces(frame)
    console.log(`Faces in Frame: ${faces}`)
  }, [])

  return (
    <Camera frameProcessor={frameProcessor} {...cameraProps} />
  )
}

For maximum performance, the detectFaces function is written in a native language (e.g. Objective-C), but it is called directly from the VisionCamera Frame Processor JavaScript Runtime through JSI.

Types

Similar to a TurboModule, the Frame Processor Plugin Registry API automatically manages type conversion between JS and native. Values are converted into the most efficient data structures, as seen here:

| JS Type | Objective-C/Swift Type | Java/Kotlin Type |
|---|---|---|
| number | NSNumber* (double) | Double |
| boolean | NSNumber* (boolean) | Boolean |
| string | NSString* | String |
| [] | NSArray* | List<Object> |
| {} | NSDictionary* | Map<String, Object> |
| undefined / null | nil | null |
| (any, any) => void | RCTResponseSenderBlock | (Object, Object) -> void |
| ArrayBuffer | SharedArray* | SharedArray |
| Frame | Frame* | Frame |
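
For example, a plugin can return a structured result as a Map with mixed value types, which arrives in JS as a plain object. A minimal Java sketch (the confidence and labels values are made up for illustration):

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  // The Map arrives in JS as a plain object, the List as a JS array,
  // and the double as a JS number (per the table above).
  Map<String, Object> result = new HashMap<>();
  result.put("confidence", 0.97);
  result.put("labels", Arrays.asList("cat", "dog"));
  return result;
}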

Return values

Return values are automatically converted to JS values, as long as they are representable per the "Types" table above. So the following Java Frame Processor Plugin:

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  return "cat";
}

Returns a string in JS:

export function detectObject(frame: Frame): string {
  'worklet'
  const result = FrameProcessorPlugins.detectObject(frame)
  console.log(result) // <-- "cat"
  return result
}

You can also manipulate the buffer and return it (or a copy of it) by returning a Frame instance:

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  Frame resizedFrame = new Frame(/* ... */);
  return resizedFrame;
}

Which returns a Frame in JS:

const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // creates a new `Frame` that's 720x480
  const resizedFrame = resize(frame, 720, 480)

  // by downscaling the frame, the `detectObjects` function runs faster.
  const objects = detectObjects(resizedFrame)
  console.log(objects)
}, [])

Parameters

Frame Processor Plugins can also accept parameters, which follow the same type conventions as return values:

const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const faces = scanFaces(frame, { accuracy: 'fast' })
}, [])
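
On the native side, those parameters arrive as the arguments map of your plugin's callback. A minimal Java sketch of reading the accuracy option from the call above (defaulting to the slower mode when the key is missing is an illustrative choice, not prescribed by the API):

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  // "accuracy" is the key passed from JS: scanFaces(frame, { accuracy: 'fast' })
  Object accuracy = arguments != null ? arguments.get("accuracy") : null;
  boolean fastMode = "fast".equals(accuracy);
  // ... run the face detector in the selected mode
  return null;
}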

Exceptions

To let the user know that something went wrong, you can throw an Exception:

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  if (arguments != null && arguments.get("codeType") instanceof String) {
    // ... scan the frame and return the detected codes
    return null;
  } else {
    throw new RuntimeException("codeType property has to be a string!");
  }
}

Which will throw a JS error:

const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  try {
    const codes = scanCodes(frame, { codeType: 1234 })
  } catch (e) {
    console.log(`Error: ${e.message}`)
  }
}, [])

What's possible?

You can run any native code you want in a Frame Processor Plugin. Just like in the native iOS and Android Camera APIs, you receive a frame (a CMSampleBuffer on iOS, an ImageProxy on Android) which you can use however you want. In other words: everything is possible.
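
As a rough sketch, an Android plugin could pull the pixel buffer out of the frame and hand it to any native library. This assumes your VisionCamera version exposes the underlying android.media.Image via frame.getImage(), as recent versions do; check the Frame API of the version you are using:

import android.media.Image;

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  // Assumed accessor: recent VisionCamera versions expose the underlying Image.
  Image image = frame.getImage();
  int width = image.getWidth();
  int height = image.getHeight();
  // Hand the buffer to any native library (OpenCV, MLKit, TensorFlow Lite, ...)
  return width + "x" + height;
}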

Implementations

Long-running Frame Processors

If your Frame Processor takes longer than a single frame interval to execute, or runs asynchronously, you can create a copy of the frame and dispatch the actual frame processing to a separate thread.

For example, a realtime video chat application might use WebRTC to send frames to a server. Since I/O operations (networking) are asynchronous and we don't need to wait for the upload to succeed before pushing the next frame, we copy the frame and perform the upload on another thread:

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  if (arguments == null) {
    return null;
  }

  String serverURL = (String)arguments.get("serverURL");
  Frame frameCopy = new Frame(/* ... */);

  uploaderQueue.runAsync(() -> {
    WebRTC.uploadImage(frameCopy, serverURL);
    frameCopy.close();
  });

  return null;
}

Async Frame Processors with Event Emitters

You might also want to run a very complex AI algorithm which is not fast enough to run smoothly at 30 FPS (~33ms per frame). To avoid dropping any frames, you can create a custom "frame queue" which processes the copied frames and calls back into JS via a React event emitter. For this you'll have to create a Native Module that handles the asynchronous native → JS communication; see "Sending events to JavaScript" (Android) and "Sending events to JavaScript" (iOS).

For the user, this might look like this:

function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    SomeAI.process(frame) // does not block frame processor, runs async
  }, [])

  useEffect(() => {
    SomeAI.addListener((results) => {
      // gets called asynchronously, goes through the React Event Emitter system
      console.log(`AI results: ${results}`)
    })
  }, [])

  return (
    <Camera frameProcessor={frameProcessor} {...cameraProps} />
  )
}

This way you can handle queueing up the frames yourself and asynchronously call back into JS at some later point in time using event emitters.
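
On Android, "Sending events to JavaScript" boils down to emitting through React Native's DeviceEventManagerModule. A minimal sketch of the Native Module part (the SomeAI-results event name and the sendResultsToJS helper are hypothetical):

import com.facebook.react.bridge.Arguments;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.modules.core.DeviceEventManagerModule;

// Hypothetical helper inside your Native Module: called from the frame queue
// whenever a processed result is ready.
private void sendResultsToJS(ReactApplicationContext reactContext, String results) {
  WritableMap params = Arguments.createMap();
  params.putString("results", results);
  reactContext
      .getJSModule(DeviceEventManagerModule.RCTDeviceEventEmitter.class)
      .emit("SomeAI-results", params);
}

The SomeAI.addListener call in the example above would then subscribe to that event name.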

Benchmarking Frame Processor Plugins

Your Frame Processor Plugins have to be fast. Use the FPS Graph (enableFpsGraph) to see how fast your Camera is running; if it is not running at the target FPS, your Frame Processor is too slow.
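
If you want a rough per-call number in addition to the FPS Graph, you can time the plugin body yourself. A minimal Java sketch (the runDetector call stands in for your actual processing):

import android.util.Log;

@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) throws Throwable {
  long start = System.nanoTime();

  Object result = runDetector(frame); // hypothetical: your actual processing

  long durationMs = (System.nanoTime() - start) / 1_000_000;
  // At 30 FPS, the budget is ~33ms per frame.
  Log.d("MyPlugin", "processing took " + durationMs + "ms");
  return result;
}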


🚀 Create your first Frame Processor Plugin for iOS or Android!