Mobile Monocular
About
Make an inference in your browser. Thanks to TensorFlow.js, the network runs entirely on your device: the image is never sent to a server.
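As a rough illustration of what an in-browser inference can look like with the TensorFlow.js API, here is a minimal sketch. The model path (model/model.json) and the 256x256 input size are assumptions for illustration, not this project's actual values:

```js
import * as tf from '@tensorflow/tfjs';

// Hypothetical location of the converted model; the real path
// depends on where the project hosts its weights.
const MODEL_URL = 'model/model.json';

let model;

async function estimateDepth(imgElement) {
  // Fetch the model once and keep it in memory afterwards.
  if (!model) {
    model = await tf.loadGraphModel(MODEL_URL);
  }
  // Preprocess: resize to the network input size (256x256 is an
  // assumption) and normalize pixel values to [0, 1].
  const input = tf.tidy(() =>
    tf.browser.fromPixels(imgElement)
      .resizeBilinear([256, 256])
      .toFloat()
      .div(255)
      .expandDims(0));
  const depthMap = model.predict(input); // e.g. shape [1, h, w, 1]
  input.dispose();
  return depthMap;
}
```

Because everything above runs against local tensors, the picture itself stays in the browser; only the model weights are downloaded.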
Run

- Use landscape pictures.
- The first inference is slower, since the model is not cached yet; a warm-up call helps (see the sketch after this list).
- The interface is blocked during inference.
- If possible, enable WebGL: it is much faster than the CPU backend (see the sketch after this list).
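The last three notes map to concrete TensorFlow.js calls. A minimal sketch, assuming the same tf namespace as above and a hypothetical 1x256x256x3 input shape; this illustrates the general technique, not this project's actual code:

```js
import * as tf from '@tensorflow/tfjs';

// Prefer the WebGL backend when available; fall back to CPU.
async function setupBackend() {
  if (!(await tf.setBackend('webgl'))) {
    await tf.setBackend('cpu');
  }
  await tf.ready();
  console.log('Active backend:', tf.getBackend());
}

// One dummy inference forces shader compilation and weight upload,
// so the first real inference no longer pays that one-time cost.
// The input shape here is an assumption.
async function warmUp(model) {
  const dummy = tf.zeros([1, 256, 256, 3]);
  const out = model.predict(dummy);
  await out.data(); // wait until the work actually finishes
  dummy.dispose();
  out.dispose();
}

// Give the browser one frame to repaint (e.g. show a spinner)
// before predict() occupies the main thread.
async function runWithSpinner(model, input) {
  await tf.nextFrame();
  return model.predict(input);
}
```

Moving inference into a Web Worker would unblock the interface entirely, at the cost of a more involved setup.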