# Creating Frame Processor Plugins

## Expose your Frame Processor Plugin to JS
To make the Frame Processor Plugin available to the Frame Processor Worklet Runtime, create the following wrapper function in JS/TS:
```ts
import { VisionCameraProxy, Frame } from 'react-native-vision-camera'

const plugin = VisionCameraProxy.getFrameProcessorPlugin('scanFaces')

/**
 * Scans faces.
 */
export function scanFaces(frame: Frame): object {
  'worklet'
  if (plugin == null) throw new Error('Failed to load Frame Processor Plugin "scanFaces"!')
  return plugin.call(frame)
}
```
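If your native plugin accepts parameters, you can pass them as a plain object in the second argument of `plugin.call(...)`. The following is a minimal sketch reusing the `plugin` and `Frame` from the snippet above; the `maxFaces` option is hypothetical — use whatever keys your native implementation actually reads:

```ts
/**
 * Scans faces, passing options through to the native plugin.
 * NOTE: `maxFaces` is a made-up example parameter, not part of VisionCamera.
 */
export function scanFacesWithOptions(frame: Frame, maxFaces: number): object {
  'worklet'
  if (plugin == null) throw new Error('Failed to load Frame Processor Plugin "scanFaces"!')
  return plugin.call(frame, { maxFaces: maxFaces })
}
```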
## Test it!
Simply call the wrapper Worklet in your Frame Processor:
```tsx
import { Camera, useFrameProcessor } from 'react-native-vision-camera'
// Import the wrapper Worklet created above — adjust the path to where you saved it
import { scanFaces } from './scanFaces'

function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    const faces = scanFaces(frame)
    console.log('Faces in Frame:', faces)
  }, [])

  return (
    <Camera frameProcessor={frameProcessor} {...cameraProps} />
  )
}
```
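Frame Processors run for every frame, so an expensive plugin call can drop your camera's FPS. If you don't need results on every single frame, you can throttle the call with VisionCamera's `runAtTargetFps` helper. A minimal sketch, assuming the `runAtTargetFps` export from react-native-vision-camera v3:

```tsx
import { useFrameProcessor, runAtTargetFps } from 'react-native-vision-camera'

const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // The Frame Processor itself still runs for every frame,
  // but the expensive plugin call only runs ~5 times per second.
  runAtTargetFps(5, () => {
    'worklet'
    const faces = scanFaces(frame)
    console.log('Faces in Frame:', faces)
  })
}, [])
```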
## Next Steps
If you want to distribute your Frame Processor Plugin, simply publish it to npm:
- Create a blank Native Module using `bob` or `create-react-native-module`
- Name it `vision-camera-plugin-xxxxx` where `xxxxx` is the name of your plugin
- Remove the generated template code from the Example Native Module
- Add VisionCamera to `peerDependencies`: `"react-native-vision-camera": ">= 3"` (see the `package.json` sketch after this list)
- Implement the Frame Processor Plugin in the iOS, Android and JS/TS codebases using the guides above
- Publish the plugin to npm. Users will only have to install the plugin using `npm i vision-camera-plugin-xxxxx` and add it to their `babel.config.js` file.
- Add the plugin to the official VisionCamera plugin list for more visibility
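For reference, the relevant part of your plugin's `package.json` could look like the sketch below. The package name is a placeholder following the naming convention above; the peer dependency version range is the one from the list:

```json
{
  "name": "vision-camera-plugin-xxxxx",
  "peerDependencies": {
    "react-native-vision-camera": ">= 3"
  }
}
```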