25 releases (13 breaking)

0.14.0  May 12, 2023
0.13.1  December 8, 2022
0.13.0  May 19, 2022
0.12.1  May 1, 2021
0.7.1   July 30, 2020

#61 in Video


11,835 downloads per month
Used in 13 crates (11 directly)

MIT license

125KB
3K SLoC

Safe video4linux (v4l) bindings


This crate provides safe bindings to the Video for Linux (V4L) stack. Modern device drivers will usually implement the v4l2 API, while older drivers may depend on the legacy v4l API. Such legacy devices can be used with this crate by choosing its libv4l feature.
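As a hedged sketch of how that feature selection might look in a consumer's Cargo.toml (the version number is illustrative; check crates.io for the current release):

```toml
[dependencies]
# Illustrative version. Since the v4l2 backend is the crate's default,
# selecting libv4l likely requires opting out of default features.
v4l = { version = "0.14", default-features = false, features = ["libv4l"] }
```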

Goals

This crate shall provide the v4l-sys package to enable full (but unsafe) access to libv4l*. On top of that, a high-level, more idiomatic API is provided for working with video capture devices on Linux.

Simple utility applications to list devices and capture frames are provided. A simple OpenGL/Vulkan viewer to display frames is planned for the future.

Changelog

See CHANGELOG.md

Dependencies

There are two dependency options to choose from (both are provided internally by this crate):

  • libv4l-sys

    Links against the libv4l* stack, including libv4l1, libv4l2, and libv4lconvert. The advantage is that common capture formats such as RGB3 can be emulated in userspace via libv4lconvert. However, some features such as userptr buffers are not supported by libv4l.

  • v4l2-sys

    Uses only the v4l2 API provided by the Linux kernel via videodev2.h. You get support for all v4l2 features such as userptr buffers, but you may have to do format conversion yourself if you need, for example, RGB/BGR buffers, which a generic device such as a webcam may not support.

Enable either the libv4l or the v4l2 backend by choosing it as a feature of this crate.
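If you pick the raw v4l2 backend and your device delivers only YUYV, a userspace conversion is needed. Below is a minimal, self-contained sketch using fixed BT.601 integer coefficients; the helper name and the hard-coded coefficients are our assumptions, and real code should honor the colorspace the driver actually reports:

```rust
/// Convert a YUYV (YUV 4:2:2) buffer to packed RGB24.
/// Each 4-byte chunk [Y0, U, Y1, V] encodes two pixels sharing chroma.
fn yuyv_to_rgb(src: &[u8]) -> Vec<u8> {
    let clamp = |v: i32| v.clamp(0, 255) as u8;
    let mut rgb = Vec::with_capacity(src.len() / 2 * 3);
    for chunk in src.chunks_exact(4) {
        let (y0, u, y1, v) =
            (chunk[0] as i32, chunk[1] as i32, chunk[2] as i32, chunk[3] as i32);
        let (d, e) = (u - 128, v - 128);
        for y in [y0, y1] {
            let c = 298 * (y - 16); // BT.601 limited-range luma scaling
            rgb.push(clamp((c + 409 * e + 128) >> 8)); // R
            rgb.push(clamp((c - 100 * d - 208 * e + 128) >> 8)); // G
            rgb.push(clamp((c + 516 * d + 128) >> 8)); // B
        }
    }
    rgb
}

fn main() {
    // Two pixels: pure black (Y=16) and pure white (Y=235), neutral chroma.
    let frame = [16u8, 128, 235, 128];
    let rgb = yuyv_to_rgb(&frame);
    assert_eq!(rgb, vec![0, 0, 0, 255, 255, 255]);
    println!("{:?}", rgb);
}
```

For production use you would more likely reach for libv4lconvert (via the libv4l backend) or a dedicated conversion crate rather than hand-rolling this.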

Usage

Below is a quick example of using this crate. It covers the basics of capturing frames from a streaming device, e.g. a webcam.

use v4l::buffer::Type;
use v4l::io::mmap::Stream;
use v4l::io::traits::CaptureStream;
use v4l::video::Capture;
use v4l::Device;
use v4l::FourCC;

fn main() {
    // Create a new capture device with a few extra parameters
    let mut dev = Device::new(0).expect("Failed to open device");

    // Let's say we want to explicitly request another format
    let mut fmt = dev.format().expect("Failed to read format");
    fmt.width = 1280;
    fmt.height = 720;
    fmt.fourcc = FourCC::new(b"YUYV");
    let fmt = dev.set_format(&fmt).expect("Failed to write format");

    // The actual format chosen by the device driver may differ from what we
    // requested! Print it out to get an idea of what is actually used now.
    println!("Format in use:\n{}", fmt);

    // Now we'd like to capture some frames!
    // First, we need to create a stream to read buffers from. We choose a
    // mapped buffer stream, which uses mmap to directly access the device
    // frame buffer. No buffers are copied nor allocated, so this is actually
    // a zero-copy operation.

    // To achieve the best possible performance, you may want to use a
    // UserBufferStream instance, but this is not supported on all devices,
    // so we stick to the mapped case for this example.
    // Please refer to the rustdoc docs for a more detailed explanation about
    // buffer transfers.

    // Create the stream, which will internally 'allocate' (as in map) the
    // number of requested buffers for us.
    let mut stream = Stream::with_buffers(&mut dev, Type::VideoCapture, 4)
        .expect("Failed to create buffer stream");

    // At this point, the stream is ready and all buffers are setup.
    // We can now read frames (represented as buffers) by iterating through
    // the stream. Once an error condition occurs, the iterator will return
    // None.
    loop {
        let (buf, meta) = stream.next().unwrap();
        println!(
            "Buffer size: {}, seq: {}, timestamp: {}",
            buf.len(),
            meta.sequence,
            meta.timestamp
        );

        // To process the captured data, you can pass it somewhere else.
        // If you want to modify the data or extend its lifetime, you have to
        // copy it. This is a best-effort tradeoff solution that allows for
        // zero-copy readers while enforcing a full clone of the data for
        // writers.
    }
}

Have a look at the provided examples for more sample applications.
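The zero-copy trade-off described in the capture loop's comments (borrowed buffers for readers, a full clone for writers) can be sketched with a hypothetical worker thread that receives owned copies of each frame over a channel. `spawn_worker` and the channel design here are illustrative, not part of this crate's API:

```rust
use std::sync::mpsc;
use std::thread;

/// Spawn a worker that owns its frames. In real capture code the sender
/// would clone `buf` from `stream.next()` before handing it off, since the
/// mapped buffer is only borrowed until the next capture call.
fn spawn_worker() -> (mpsc::Sender<Vec<u8>>, thread::JoinHandle<usize>) {
    let (tx, rx) = mpsc::channel::<Vec<u8>>();
    let handle = thread::spawn(move || {
        let mut frames = 0;
        while let Ok(frame) = rx.recv() {
            // ... decode, encode, or analyze the owned frame here ...
            let _ = frame.len();
            frames += 1;
        }
        frames // channel closed: report how many frames were processed
    });
    (tx, handle)
}

fn main() {
    let (tx, worker) = spawn_worker();
    // Stand-in for the capture loop: clone a dummy buffer three times.
    for _ in 0..3 {
        let buf: &[u8] = &[0u8; 16]; // would be `buf` from the stream
        tx.send(buf.to_vec()).expect("worker alive"); // full clone, as required
    }
    drop(tx); // closing the channel lets the worker exit
    println!("processed {} frames", worker.join().unwrap());
}
```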

Dependencies

~75–250KB