Lower latency on FaceTime HD camera

I'm trying to use the built-in camera on my MacBook Pro as part of an application I'm building. I'm wondering two things:

  1. What is the latency of the camera? (How many milliseconds after I perform an action does the camera capture it?)

  2. Is there any way to decrease latency?

I tried decreasing the default image resolution (using OpenCV's cv2 in Python) from 1280x720 to 640x480, but I can't tell whether this reduces or increases latency.

The application needs to run in as close to real time as possible, so any reduction in latency, no matter how small, would be great.


Solution 1:

The latency of the camera depends on many factors, including which exact macOS version you're running, which other programs are running at the same time and what they're doing, and what priority/QoS/nice values you have set for the camera software.

The best way to determine latency is to actually measure it on your exact equipment. You can do that in various ways depending on your application. A simple way (if you're displaying the captured video) is to film the display with an iPhone set to slow-motion capture, then quickly slide a black piece of paper (or anything opaque) in front of the camera. By examining the video filmed on the iPhone, you can count the number of frames from the moment you slide the paper over the camera until the display goes black. If your phone captures at 240 fps, each frame is about 4.17 milliseconds.
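The arithmetic behind the frame-counting method above is simple enough to sketch; the frame count of 20 in the usage line is just a hypothetical example:

```python
def frame_duration_ms(fps):
    """Duration of one slow-motion frame in milliseconds."""
    return 1000.0 / fps

def latency_estimate_ms(frames_counted, fps):
    """Latency implied by counting frames of slow-motion footage
    between covering the lens and the display going dark."""
    return frames_counted * frame_duration_ms(fps)

# At 240 fps each frame is ~4.17 ms, so e.g. counting 20 frames
# would imply roughly 83 ms of end-to-end latency.
print(round(frame_duration_ms(240), 2))        # → 4.17
print(round(latency_estimate_ms(20, 240), 1))  # → 83.3
```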

You can decrease latency (and jitter) by giving the involved software high priority (appropriate nice values, QoS settings, and so on) and by ensuring that no unnecessary software is running at the same time.