USB Camera
Introduction
The USB standards effort created a device class for video capture devices, called UVC (USB Video Class), and a similar class for audio, called UAC (USB Audio Class). RidgeRun offers a Linux-based USB camera framework that supports UVC and UAC using GStreamer as the streaming media framework.
USB is not a peer-to-peer protocol. One side is called the host and the other side is the device. In Linux, the USB gadget sub-system is used to expose a device class over a USB device interface. This allows the embedded Linux based camera to appear to a host computer (Windows, Mac, Linux, tablet, smartphone, etc.) as a webcam.
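As a rough sketch of how the gadget sub-system is driven from user space, the example below creates a skeleton UVC gadget through configfs. The gadget name, VID/PID values, and UDC controller name are placeholders, and a working UVC function additionally needs its control/streaming descriptor directories populated, which is omitted here.

```c
/* Minimal sketch: create a skeleton UVC gadget via configfs.
 * Paths, VID/PID and the UDC name are placeholders; a usable UVC
 * function also needs its control/streaming descriptor directories
 * populated, which is omitted for brevity. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/stat.h>

#define GADGET "/sys/kernel/config/usb_gadget/g1"

static void write_attr(const char *path, const char *value)
{
    FILE *f = fopen(path, "w");
    if (!f) { perror(path); exit(1); }
    fprintf(f, "%s\n", value);
    fclose(f);
}

int main(void)
{
    char path[256];

    mkdir(GADGET, 0755);
    write_attr(GADGET "/idVendor", "0x1d6b");   /* placeholder VID */
    write_attr(GADGET "/idProduct", "0x0104");  /* placeholder PID */

    mkdir(GADGET "/strings/0x409", 0755);
    write_attr(GADGET "/strings/0x409/product", "RidgeRun USB Camera");

    mkdir(GADGET "/configs/c.1", 0755);
    mkdir(GADGET "/functions/uvc.0", 0755);     /* UVC function instance */

    /* Bind the function into the configuration (a symlink in configfs). */
    snprintf(path, sizeof(path), "%s/configs/c.1/uvc.0", GADGET);
    if (symlink(GADGET "/functions/uvc.0", path))
        perror("symlink");

    /* Attach the gadget to a UDC; the controller name is board specific. */
    write_attr(GADGET "/UDC", "musb-hdrc.0");   /* placeholder UDC name */
    return 0;
}
```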
UVC and UAC challenges
The key assumption driving the UVC and UAC specifications was a smart host talking to a dumb device. This is a reasonable assumption if the goal is to create low-cost, simple webcams to hang on a computer monitor. However, it makes UVC and UAC harder to support when the device can buffer frames and simultaneously stream audio/video over the network, all while appearing to the USB host as a simple webcam. Specifically, common challenges when supporting UVC and UAC in a smart device include:
- Synchronizing audio and video
- Endpoint size setup and bandwidth
- Handling USB connect and disconnect events
- Supporting non-standard interaction required by some host operating systems
- PTZ hardware control
- Working around USB hardware interrupt, DMA, chaining and other limitations
- Integrating hardware accelerators into the streaming audio video pipeline
Linux USB camera technical overview
The RidgeRun USB camera framework uses the standard Linux V4L2, ALSA, and USB gadget sub-systems to capture A/V data and pass the encoded A/V data out over USB. In addition, the framework uses GStreamer to access hardware accelerators and provide A/V synchronization.
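As a hedged illustration of how these pieces fit together, a device-side pipeline along the following lines captures raw video from a V4L2 sensor, encodes it to MJPEG, and writes it to the video node exposed by the UVC gadget driver. The device paths, caps, and the software jpegenc element are placeholders; a real system would typically swap in the platform's hardware encoder element.

```c
/* Sketch of a device-side video pipeline feeding the UVC gadget node.
 * Device nodes, caps and the encoder element are placeholders. */
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 ! "
        "video/x-raw,format=YUY2,width=1280,height=720,framerate=30/1 ! "
        "jpegenc ! "                       /* swap for the platform HW encoder */
        "v4l2sink device=/dev/video2",     /* UVC gadget video node (example) */
        &err);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", err->message);
        g_clear_error(&err);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Run until an error or end-of-stream is posted on the bus. */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                                 GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg)
        gst_message_unref(msg);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(bus);
    gst_object_unref(pipeline);
    return 0;
}
```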
When the host first interacts with the USB camera, it asks for the list of supported resolutions and encodings. The host picks from the list and configures the USB camera. Once configured, the selected audio and video data can be streamed over the USB isochronous endpoints. To support enumeration and configuration, RidgeRun created a uvc-gadget-daemon that waits for a USB host to attach. The video configuration requests are received as V4L2 events from the V4L2 UVC driver.
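A daemon of this kind typically subscribes to the gadget-specific events through the V4L2 event API. The sketch below shows the general shape under that assumption; the device node is an example, and the event handling is reduced to a stub (a full daemon also answers the UVC control requests carried in UVC_EVENT_SETUP/UVC_EVENT_DATA and starts or stops streaming on STREAMON/STREAMOFF).

```c
/* Sketch: subscribe to UVC gadget events on the V4L2 node exposed by the
 * UVC function driver. The device path is an example and event handling
 * is reduced to a stub. */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <linux/usb/g_uvc.h>   /* UVC_EVENT_* definitions (kernel UAPI) */

int main(void)
{
    int fd = open("/dev/video2", O_RDWR);   /* UVC gadget node (example) */
    if (fd < 0) { perror("open"); return 1; }

    /* Ask the driver to deliver the gadget-specific events. */
    const unsigned int events[] = {
        UVC_EVENT_CONNECT, UVC_EVENT_DISCONNECT,
        UVC_EVENT_SETUP, UVC_EVENT_DATA,
        UVC_EVENT_STREAMON, UVC_EVENT_STREAMOFF,
    };
    for (size_t i = 0; i < sizeof(events) / sizeof(events[0]); i++) {
        struct v4l2_event_subscription sub;
        memset(&sub, 0, sizeof(sub));
        sub.type = events[i];
        if (ioctl(fd, VIDIOC_SUBSCRIBE_EVENT, &sub) < 0)
            perror("VIDIOC_SUBSCRIBE_EVENT");
    }

    /* Wait for host activity; a real daemon would parse the control
     * requests and drive the streaming pipeline accordingly. */
    for (;;) {
        struct v4l2_event ev;
        if (ioctl(fd, VIDIOC_DQEVENT, &ev) < 0) {
            perror("VIDIOC_DQEVENT");
            break;
        }
        printf("UVC event 0x%08x received\n", ev.type);
    }

    close(fd);
    return 0;
}
```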
On the other hand, the UAC driver simply looks like a normal sound card on both sides (the gadget and the host computer), so it can be used by redirecting any audio stream to it on the device and configuring the audio system on the host PC. This is straightforward with GStreamer, which sends the audio stream to the audio interface on the device. The UAC driver does not require any special audio encoding since it works with PCM audio and supports sample rates up to 48 kHz.
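As a sketch of that redirection, a device-side pipeline like the one below forwards PCM from a local capture device into the ALSA card created by the UAC gadget. Both ALSA device strings are placeholders that depend on the board and kernel configuration.

```c
/* Sketch: forward local PCM capture into the ALSA card exposed by the
 * UAC gadget. The ALSA device strings are placeholders. */
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "alsasrc device=hw:1,0 ! "                          /* local microphone (example) */
        "audio/x-raw,format=S16LE,rate=48000,channels=2 ! "
        "audioconvert ! audioresample ! "
        "alsasink device=hw:2,0",                           /* UAC gadget card (example) */
        &err);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", err->message);
        g_clear_error(&err);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Stream until an error or end-of-stream is posted on the bus. */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                                 GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg)
        gst_message_unref(msg);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(bus);
    gst_object_unref(pipeline);
    return 0;
}
```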
Once streaming starts, audio and video synchronization is maintained by having both GStreamer pipelines share the same base clock. The dwPresentationTime field defined in the UVC specification is used so the host can align video frames with the audio stream. The UAC stream does not require timestamp information since the host can calculate it from the audio samples.
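The GStreamer mechanism for keeping two pipelines on one timebase is to force both to use the same clock. The sketch below shows that call pattern with stand-in test pipelines rather than the actual UVC/UAC pipelines, so treat it as an illustration of the general technique rather than the framework's exact implementation.

```c
/* Sketch: run two pipelines off the same clock so their buffer timestamps
 * share a timebase. The test pipelines stand in for the UVC video and UAC
 * audio pipelines. */
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstElement *video = gst_parse_launch("videotestsrc ! fakesink", NULL);
    GstElement *audio = gst_parse_launch("audiotestsrc ! fakesink", NULL);

    /* Override automatic clock selection so both pipelines time their
     * buffers against the same clock. */
    GstClock *clock = gst_system_clock_obtain();
    gst_pipeline_use_clock(GST_PIPELINE(video), clock);
    gst_pipeline_use_clock(GST_PIPELINE(audio), clock);
    gst_object_unref(clock);

    gst_element_set_state(video, GST_STATE_PLAYING);
    gst_element_set_state(audio, GST_STATE_PLAYING);

    g_usleep(5 * G_USEC_PER_SEC);   /* stream for a few seconds */

    gst_element_set_state(video, GST_STATE_NULL);
    gst_element_set_state(audio, GST_STATE_NULL);
    gst_object_unref(video);
    gst_object_unref(audio);
    return 0;
}
```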
Tested video resolutions
H.264 support can be added, but wasn't required for the solution being developed.
Frame Rate | Encoding | Bit Rate | Resolution
-----------|----------|----------|-----------
30 | MJPEG | 55296000 | 720P (1280x720)
30 | MJPEG | 18432000 | 640x480
30 | MJPEG | 13824000 | 640x360
30 | MJPEG | 4608000 | 320x240
30 | MJPEG | 3456000 | 320x180
30 | MJPEG | 1152000 | 160x120
30 | MJPEG | 864000 | 160x90
30 | YUY2 | 4608000 | 320x240
30 | YUY2 | 3456000 | 320x180
30 | YUY2 | 1152000 | 160x120
30 | YUY2 | 864000 | 160x90