# Linux Kernel Documentation Translation — V4L2-framework.txt / V4L2 API Specification

http://blog.csdn.net/jmq_0000/article/details/7530575

Overview of the V4L2 driver framework
=====================================


V4L2 drivers tend to be very complex due to the complexity of the hardware: most devices have multiple ICs, export multiple device nodes in /dev, and also create non-V4L2 devices such as DVB, ALSA, FB, I2C and input (IR) devices.

Due to the lack of a framework, a lot of common code could never be refactored.

Structure of a driver
---------------------

All drivers have the following structure:

1) A struct for each device instance containing the device state.

2) A way of initializing and commanding sub-devices (if any).

3) Creating V4L2 device nodes (/dev/videoX, /dev/vbiX and /dev/radioX)
and keeping track of device-node specific data.

4) Filehandle-specific structs containing per-filehandle data.

5) Video buffer handling.

device instances
  |
  +-sub-device instances
  |
  \-V4L2 device nodes
      |
      \-filehandle instances

Structure of the framework
--------------------------

The framework closely resembles the driver structure: the v4l2_device struct holds the device instance data, the v4l2_subdev struct represents a sub-device instance, and the video_device struct stores the data of a V4L2 device node.

If the mdev field of v4l2_device is set, sub-devices and video nodes will automatically appear in the media framework as entities.
struct v4l2_device
------------------

You must register the device instance:

v4l2_device_register(struct device *dev, struct v4l2_device *v4l2_dev);

Registration will initialize the v4l2_device struct. If the name field is empty, it will be set to a value derived from dev (driver name followed by the bus_id, to be precise). If you set it up before calling v4l2_device_register, then it will be left untouched.

You can use v4l2_device_set_name() to set the name based on a driver name and a driver-global atomic_t instance counter. This will generate names like ivtv0, ivtv1, etc. If the name ends with a digit, then a dash is inserted: cx18-0, cx18-1, etc. This function returns the instance number.
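The dash-insertion rule described above can be sketched in plain C. This is an illustrative userspace re-implementation, not the kernel's v4l2_device_set_name; `make_v4l2_name` is a hypothetical helper name:

```c
#include <assert.h>
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Illustrative model of the naming rule: append the instance number to
 * the driver name, inserting a dash when the name already ends in a
 * digit, so "ivtv" -> "ivtv0" but "cx18" -> "cx18-0". */
static void make_v4l2_name(char *buf, size_t len,
                           const char *basename, int instance)
{
	size_t n = strlen(basename);

	if (n > 0 && isdigit((unsigned char)basename[n - 1]))
		snprintf(buf, len, "%s-%d", basename, instance);
	else
		snprintf(buf, len, "%s%d", basename, instance);
}
```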

You unregister with:

v4l2_device_unregister(struct v4l2_device *v4l2_dev);

For hotpluggable devices (e.g. USB), the parent device becomes invalid on disconnect. Since v4l2_device holds a pointer to that parent device, the pointer has to be cleared to mark that the parent is gone. To do this call:

v4l2_device_disconnect(struct v4l2_device *v4l2_dev);

static int callback(struct device *dev, void *p)
{
	struct v4l2_device *v4l2_dev = dev_get_drvdata(dev);

	/* test if this device was inited */
	if (v4l2_dev == NULL)
		return 0;
	...
	return 0;
}

int iterate(void *p)
{
	struct device_driver *drv;
	int err;

	/* Find driver 'ivtv' on the PCI bus.
	   pci_bus_type is a global. For USB busses use usb_bus_type. */
	drv = driver_find("ivtv", &pci_bus_type);
	/* iterate over all ivtv device instances */
	err = driver_for_each_device(drv, NULL, p, callback);
	put_driver(drv);
	return err;
}

static atomic_t drv_instance = ATOMIC_INIT(0);

static int __devinit drv_probe(struct pci_dev *pdev,
			       const struct pci_device_id *pci_id)
{
	...
	state->instance = atomic_inc_return(&drv_instance) - 1;
}

If you have multiple device nodes, then for hotpluggable devices it can be hard to know when it is safe to unregister the v4l2_device. v4l2_device has refcounting support for this purpose. The refcount is increased whenever video_register_device is called and decreased whenever a device node is released. You can also increase and decrease it manually:

void v4l2_device_get(struct v4l2_device *v4l2_dev);

or:

int v4l2_device_put(struct v4l2_device *v4l2_dev);

struct v4l2_subdev
------------------

Many drivers need to communicate with sub-devices, which can do all sorts of tasks. A sub-device is represented by a struct v4l2_subdev (v4l2-subdev.h).

The v4l2_subdev struct also provides host private data, accessible with v4l2_get_subdev_hostdata() and v4l2_set_subdev_hostdata().

struct v4l2_subdev_core_ops {
	int (*g_chip_ident)(struct v4l2_subdev *sd, struct v4l2_dbg_chip_ident *chip);
	int (*log_status)(struct v4l2_subdev *sd);
	int (*init)(struct v4l2_subdev *sd, u32 val);
	...
};

struct v4l2_subdev_tuner_ops {
	...
};

struct v4l2_subdev_audio_ops {
	...
};

struct v4l2_subdev_video_ops {
	...
};

struct v4l2_subdev_ops {
	const struct v4l2_subdev_core_ops  *core;
	const struct v4l2_subdev_tuner_ops *tuner;
	const struct v4l2_subdev_audio_ops *audio;
	const struct v4l2_subdev_video_ops *video;
};

A sub-device driver initializes the v4l2_subdev struct using:

v4l2_subdev_init(sd, &ops);

If integration with the media framework is needed, the media_entity embedded in the sub-device must be initialized, and cleaned up before the sub-device is destroyed:

media_entity_cleanup(&sd->entity);

At some point the sub-device must be registered with the v4l2_device:

int err = v4l2_device_register_subdev(v4l2_dev, sd);

After this call succeeds, the subdev's dev field points to the v4l2_device. If the v4l2_device parent device has a non-NULL mdev field, the sub-device entity will be automatically registered with the media device as well.

You can unregister a sub-device using:

v4l2_device_unregister_subdev(sd);

You can call an op directly:

err = sd->ops->core->g_chip_ident(sd, &chip);

but it is better and easier to use this macro:

err = v4l2_subdev_call(sd, core, g_chip_ident, &chip);

The macro will do the right NULL pointer checks and returns -ENODEV if sd is NULL, -ENOIOCTLCMD if either sd->core or sd->core->g_chip_ident is NULL, or the actual result of the sd->ops->core->g_chip_ident op otherwise.

It is also possible to call all or a subset of the sub-devices:

v4l2_device_call_all(v4l2_dev, 0, core, g_chip_ident, &chip);

Any sub-device that does not support this op is skipped and error results are ignored. If you want to check for errors use this instead:

err = v4l2_device_call_until_err(v4l2_dev, 0, core, g_chip_ident, &chip);

Any error except -ENOIOCTLCMD will exit the loop with that error. If no errors (except -ENOIOCTLCMD) occurred, then 0 is returned.
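The error-aggregation rule of v4l2_device_call_until_err can be modeled in plain C. This is a sketch of the semantics only, not the kernel implementation; the names and the local ENOIOCTLCMD stand-in value are assumptions for illustration:

```c
#include <assert.h>
#include <stddef.h>

#define ENOIOCTLCMD_SIM 515  /* local stand-in for the kernel's ENOIOCTLCMD */

/* Model of the v4l2_device_call_until_err loop: take each sub-device's
 * op result in turn, treat -ENOIOCTLCMD ("op not implemented") as
 * success, and stop at the first real error. */
static int call_until_err_model(const int *results, size_t n)
{
	size_t i;

	for (i = 0; i < n; i++) {
		int err = results[i];

		if (err && err != -ENOIOCTLCMD_SIM)
			return err;
	}
	return 0;
}
```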

The second argument of these calls is a group ID. If 0, then all sub-devices are called. If non-zero, then only those whose grp_id matches that value will be called.

V4L2 sub-device userspace API
-----------------------------

Beside the kernel API, V4L2 sub-devices can also be controlled directly by userspace applications through v4l-subdevX device nodes. Setting the V4L2_SUBDEV_FL_HAS_DEVNODE flag makes the framework create a device node for the registered sub-device. These device nodes are automatically removed when the sub-device is unregistered.

VIDIOC_QUERYCTRL
VIDIOC_G_CTRL
VIDIOC_S_CTRL
VIDIOC_G_EXT_CTRLS
VIDIOC_S_EXT_CTRLS
VIDIOC_TRY_EXT_CTRLS

VIDIOC_DQEVENT

VIDIOC_SUBSCRIBE_EVENT

VIDIOC_UNSUBSCRIBE_EVENT

The event ioctls are only available if the V4L2_SUBDEV_FL_HAS_EVENTS flag is set in v4l2_subdev::flags. To properly support events, the poll() file operation is implemented as well.

Private ioctls not in the list above are passed directly to the sub-device driver through the core::ioctl operation.

I2C sub-device drivers

----------------------

The recommended method of adding v4l2_subdev support to an I2C driver is to embed the v4l2_subdev struct into the state struct that is created for each I2C device instance:

struct chipname_state {
	struct v4l2_subdev sd;
	...  /* additional state fields */
};

Then set up the v4l2_subdev:

v4l2_i2c_subdev_init(&state->sd, client, subdev_ops);


static inline struct chipname_state *to_state(struct v4l2_subdev *sd)
{
return container_of(sd, struct chipname_state, sd);
}
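The container_of pattern used by to_state() above can be exercised in plain, self-contained C. This is a userspace rendition for illustration; `container_of_demo` and the fake structs are assumptions standing in for the kernel macro and the driver's real types:

```c
#include <assert.h>
#include <stddef.h>

/* Userspace rendition of the kernel's container_of: recover a pointer
 * to the enclosing struct from a pointer to one of its members. */
#define container_of_demo(ptr, type, member) \
	((type *)((char *)(ptr) - offsetof(type, member)))

struct fake_sd { int dummy; };

struct fake_state {
	int instance;
	struct fake_sd sd;	/* embedded, as in chipname_state */
};

/* Mirrors to_state(): from the embedded member back to the state. */
static struct fake_state *to_state_demo(struct fake_sd *sd)
{
	return container_of_demo(sd, struct fake_state, sd);
}
```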

And to go from a v4l2_subdev to the i2c_client, and vice versa:

struct i2c_client *client = v4l2_get_subdevdata(sd);

struct v4l2_subdev *sd = i2c_get_clientdata(client);

Make sure to call v4l2_device_unregister_subdev(sd) when the remove() callback is called. This unregisters the sub-device from the bridge driver, and it is always safe to do.

The bridge driver loads the module and instantiates the sub-device with:

struct v4l2_subdev *sd = v4l2_i2c_new_subdev(v4l2_dev, adapter,
		"module_foo", "chipid", 0x36, NULL);

The chipid is usually the same as the module name, but sometimes it differs, e.g. when a module supports multiple chip variants such as "saa7114" or "saa7115". In general, though, the I2C driver will auto-detect this.

v4l2_i2c_new_subdev_cfg: this function adds new irq and platform_data
arguments and has both 'addr' and 'probed_addrs' arguments: if addr is non-zero
then that will be used (non-probing variant), otherwise the probed_addrs
are probed.

For example: this will probe for address 0x10:

struct v4l2_subdev *sd = v4l2_i2c_new_subdev_cfg(v4l2_dev, adapter,
"module_foo", "chipid", 0, NULL, 0, I2C_ADDRS(0x10));

v4l2_i2c_new_subdev_board uses an i2c_board_info struct which is passed
to the i2c driver and replaces the irq, platform_data and addr arguments.

If the subdev supports the s_config core ops, then that op is called with
the irq and platform_data arguments after the subdev was setup. The older
v4l2_i2c_new_(probed_)subdev functions will call s_config as well, but with
irq set to 0 and platform_data set to NULL.

struct video_device
-------------------

The actual device nodes in the /dev directory are created using the video_device struct (v4l2-dev.h). This struct can either be allocated dynamically or embedded in a larger struct.

To allocate it dynamically use:

struct video_device *vdev = video_device_alloc();

if (vdev == NULL)
	return -ENOMEM;

vdev->release = video_device_release;

If you embed it in a larger struct, then you must set the release() callback to your own function:

struct video_device *vdev = &my_vdev->vdev;

vdev->release = my_vdev_release;

You should also set these fields:

- v4l2_dev: set to the parent v4l2_device.
- name: set to something descriptive and unique.
- fops: set to the v4l2_file_operations struct.
- ioctl_ops: if you use the v4l2_ioctl_ops to simplify ioctl maintenance
  (highly recommended to use this and it might become compulsory in the
  future!), then set this to your v4l2_ioctl_ops struct.
- lock: leave to NULL if you want to do all the locking in the driver.
  Otherwise you give it a pointer to a struct mutex_lock and before any
  of the v4l2_file_operations is called this lock will be taken by the
  core and released afterwards.
- prio: keeps track of the priorities. Used to implement VIDIOC_G/S_PRIORITY.
  If left to NULL, then it will use the struct v4l2_prio_state in v4l2_device.
  If you want to have a separate priority state per (group of) device node(s),
  then you can point it to your own struct v4l2_prio_state.
- parent: you only set this if v4l2_device was registered with NULL as
  the parent device struct. This only happens in cases where one hardware
  device has multiple PCI devices that all share the same v4l2_device core.

  The cx88 driver is an example of this: one core v4l2_device struct, but
  it is used by both a raw video PCI device (cx8800) and an MPEG PCI device
  (cx8802). Since the v4l2_device cannot be associated with a particular
  PCI device it is set up without a parent device. But when the struct
  video_device is set up you do know which parent PCI device to use.
- flags: optional. Set to V4L2_FL_USE_FH_PRIO if you want to let the framework
  handle the VIDIOC_G/S_PRIORITY ioctls. This requires that you use struct
  v4l2_fh. Eventually this flag will disappear once all drivers use the core
  priority handling. But for now it has to be set explicitly.

If you use v4l2_ioctl_ops, then you should set .unlocked_ioctl to video_ioctl2 in your v4l2_file_operations struct.

The v4l2_file_operations struct is a subset of file_operations. The main difference is that the inode argument is omitted since it is never used.

If integration with the media framework is needed, you must initialize the media_entity struct embedded in the video_device struct (entity field):

struct media_pad *pad = &my_vdev->pad;
int err;

err = media_entity_init(&vdev->entity, 1, pad, 0);

A reference to the entity will be automatically acquired/released when the
video device is opened/closed.

v4l2_file_operations and locking
--------------------------------

video_device registration
-------------------------

Next you register the video device: this will create the character device for you.

err = video_register_device(vdev, VFL_TYPE_GRABBER, -1);
if (err) {
	video_device_release(vdev); /* or kfree(my_vdev); */
	return err;
}

Which device is registered depends on the type argument. The following types exist:

VFL_TYPE_GRABBER: videoX for video input/output devices
VFL_TYPE_VBI: vbiX for vertical blank data (i.e. closed captions, teletext)
VFL_TYPE_RADIO: radioX for radio tuners
VFL_TYPE_VTX: vtxX for teletext devices (deprecated, don't use)

Another use case is if a driver creates many devices. In that case it can be useful to place different video devices in separate ranges.

Whenever a device node is created, some device data is created as well:

- vfl_type: the device type passed to video_register_device.
- minor: the assigned device minor number.
- num: the device node number (i.e. the X in videoX).
- index: the device index number.

If the registration failed, then you need to call video_device_release() to free the allocated video_device struct, or free your own struct if the video_device was embedded in it.

video_device cleanup
--------------------

When the video device nodes have to be removed, either during driver unload or because the USB device was disconnected, then you should unregister them:

video_unregister_device(vdev);

This will remove the device nodes from sysfs (causing udev to remove them
from /dev).

After video_unregister_device() returns no new opens can be done. However,
in the case of USB devices some application might still have one of these
device nodes open. So after the unregister all file operations (except
release, of course) will return an error as well.

media_entity_cleanup(&vdev->entity);

This can be done from the release callback.

video_device helper functions
-----------------------------

There are a few useful helper functions:

- file/video_device private data

You can set/get driver private data in the video_device struct using:

void *video_get_drvdata(struct video_device *vdev);
void video_set_drvdata(struct video_device *vdev, void *data);

Note that you can safely call video_set_drvdata() before calling
video_register_device().

And this function:

struct video_device *video_devdata(struct file *file);

returns the video_device belonging to the file struct.

The video_drvdata function combines video_get_drvdata with video_devdata:

void *video_drvdata(struct file *file);

You can go from a video_device struct to the v4l2_device struct using:

struct v4l2_device *v4l2_dev = vdev->v4l2_dev;

- Device node name

The video_device node kernel name can be retrieved using

const char *video_device_node_name(struct video_device *vdev);

The device node name is used as a hint by userspace tools such as udev. This function should be used where possible.

video buffer helper functions
-----------------------------

The v4l2 core API provides a set of standard methods (called "videobuf")
for dealing with video buffers. Those methods allow a driver to implement
read(), mmap() and overlay() in a consistent way.  There are currently
methods for using video buffers on devices that supports DMA with
scatter/gather method (videobuf-dma-sg), DMA with linear access
(videobuf-dma-contig), and vmalloced buffers, mostly used on USB drivers
(videobuf-vmalloc).

Please see Documentation/video4linux/videobuf for more information on how to use the videobuf layer.

struct v4l2_fh
--------------

struct v4l2_fh provides a way to easily keep file handle specific data that is used by the V4L2 framework. New drivers must use struct v4l2_fh, since it is also used to implement priority handling (VIDIOC_G/S_PRIORITY) if the video_device flag V4L2_FL_USE_FH_PRIO is set as well.

The users of v4l2_fh (in the V4L2 framework, not the driver) know whether a driver uses v4l2_fh as its file->private_data pointer by testing the V4L2_FL_USES_V4L2_FH bit in video_device->flags.

struct v4l2_fh is allocated as a part of the driver's own file handle
structure and file->private_data is set to it in the driver's open
function by the driver.

In many cases the struct v4l2_fh will be embedded in a larger structure.
In that case you should call v4l2_fh_init+v4l2_fh_add in open() and
v4l2_fh_del+v4l2_fh_exit in release().

Drivers can extract their own file handle structure by using the container_of
macro. Example:

struct my_fh {
	int blah;
	struct v4l2_fh fh;
};

...

int my_open(struct file *file)
{
	struct my_fh *my_fh;
	struct video_device *vfd;
	int ret;

	...

	my_fh = kzalloc(sizeof(*my_fh), GFP_KERNEL);

	...

	ret = v4l2_fh_init(&my_fh->fh, vfd);
	if (ret) {
		kfree(my_fh);
		return ret;
	}

	...

	file->private_data = &my_fh->fh;
	return 0;
}

int my_release(struct file *file)
{
	struct v4l2_fh *fh = file->private_data;
	struct my_fh *my_fh = container_of(fh, struct my_fh, fh);

	...
	v4l2_fh_del(&my_fh->fh);
	v4l2_fh_exit(&my_fh->fh);
	kfree(my_fh);
	return 0;
}

int v4l2_fh_init(struct v4l2_fh *fh, struct video_device *vdev)

Initialise the file handle. This *MUST* be performed in the driver's
v4l2_file_operations->open() handler.

void v4l2_fh_del(struct v4l2_fh *fh)

Unassociate the file handle from video_device(). The file handle
exit function may now be called.

void v4l2_fh_exit(struct v4l2_fh *fh)

Uninitialise the file handle. After uninitialisation the v4l2_fh
memory can be freed.

Several drivers need to do something when the first file handle is opened and when the last file handle closes. Two helper functions exist to check whether a file handle is the only open file handle of the associated device node:

int v4l2_fh_is_singular(struct v4l2_fh *fh)

Returns 1 if the file handle is the only open file handle, else 0.

int v4l2_fh_is_singular_file(struct file *filp)

Same, but it calls v4l2_fh_is_singular with filp->private_data.
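The first-open/last-close bookkeeping these helpers enable can be modeled with a toy counter. This is only a sketch of the semantics, assuming the framework tracks open handles per device node; the names are illustrative, not kernel APIs:

```c
#include <assert.h>

/* Toy model: the framework keeps the open file handles of a device
 * node on a list; a handle is "singular" when it is the only one. */
static int open_count;

static void model_open(void)       { ++open_count; }
static void model_release(void)    { --open_count; }
static int  model_is_singular(void) { return open_count == 1; }
```

A driver would run its "first open" setup when the freshly added handle is singular, and its "last close" teardown when the handle being released is singular.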

V4L2 events

-----------

V4L2 events provide a generic way to pass events to user space. The driver must use struct v4l2_fh to be able to support V4L2 events.

Useful functions：

- v4l2_event_alloc()

To use events, the driver must allocate events for the file handle. By calling the function more than once, the driver may assure that at least n events in total have been allocated.

- v4l2_event_queue()

Queue events to the video device. The driver's only responsibility is to fill in the type and the data fields. The other fields will be filled in by V4L2.

- v4l2_event_subscribe()

The video_device->ioctl_ops->vidioc_subscribe_event must check whether the driver is able to produce events with the specified event id, and then call v4l2_event_subscribe() to subscribe the event.

- v4l2_event_unsubscribe()

vidioc_unsubscribe_event in struct v4l2_ioctl_ops. A driver may use v4l2_event_unsubscribe() directly, unless it wants to be involved in the unsubscription process.

Events are delivered to user space through the poll system call. The driver
can use v4l2_fh->events->wait wait_queue_head_t as the argument for
poll_wait().

There are standard and private events. New standard events must use the
smallest available event type. The drivers must allocate their events from
their own class starting from class base. Class base is
V4L2_EVENT_PRIVATE_START + n * 1000 where n is the lowest available number.
The first event type in the class is reserved for future use, so the first
available event type is 'class base + 1'.
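The private event numbering rule above can be written out as a small arithmetic sketch, assuming the V4L2_EVENT_PRIVATE_START value from videodev2.h (0x08000000); the function names are illustrative:

```c
#include <assert.h>

/* Assumed value of V4L2_EVENT_PRIVATE_START from videodev2.h. */
#define EVENT_PRIVATE_START_DEMO 0x08000000u

/* Class base for the n-th private event class: base + n * 1000. */
static unsigned int class_base(unsigned int n)
{
	return EVENT_PRIVATE_START_DEMO + n * 1000;
}

/* First usable event type in a class: the class base itself is
 * reserved for future use, so the first event is base + 1. */
static unsigned int first_event_type(unsigned int n)
{
	return class_base(n) + 1;
}
```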

An example on how the V4L2 events may be used can be found in the OMAP
3 ISP driver available at <URL:http://gitorious.org/omap3camera> as of
writing this.

The source document V4L2-framework.txt lives in the kernel's Documentation/video4linux directory.

Video for Linux Two API Specification

Revision 0.24

Michael H Schimek

<mschimek@gmx.at>

Bill Dirks

Hans Verkuil

Martin Rubli

Copyright © 1999, 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008 Bill Dirks, Michael H. Schimek, Hans Verkuil, Martin Rubli

This document is copyrighted © 1999-2008 by Bill Dirks, Michael H. Schimek, Hans Verkuil and Martin Rubli.

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.1 or any later version published by the Free Software Foundation; with no Invariant Sections, with no Front-Cover Texts, and with
no Back-Cover Texts. A copy of the license is included in the appendix entitled "GNU Free Documentation License".

Programming examples can be used and distributed without restrictions.

Introduction

1.
Common API Elements

1.1.
Opening and Closing Devices

1.1.1.
Device Naming

1.1.2.
Related Devices

1.1.3.
Multiple Opens

1.1.4.
Shared Data Streams

1.1.5.
Functions

1.2.
Querying Capabilities

1.3.
Application Priority

1.4.
Video Inputs and Outputs

1.5.
Audio Inputs and Outputs

1.6.
Tuners and Modulators

1.6.1.
Tuners

1.6.2.
Modulators

1.6.3.

1.6.4.

1.7.
Video Standards

1.8.
User Controls

1.9.
Extended Controls

1.9.1.
Introduction

1.9.2.
The Extended Control API

1.9.3.
Enumerating Extended Controls

1.9.4.
Creating Control Panels

1.9.5.
MPEG Control Reference

1.9.6.
Camera Control Reference

1.10.
Data Formats

1.10.1.
Data Format Negotiation

1.10.2.
Image Format Enumeration

1.11.
Image Cropping, Insertion and Scaling

1.11.1.
Cropping Structures

1.11.2.

1.11.3.
Examples

1.12.
Streaming Parameters

2.
Image Formats

2.1.
Standard Image Formats

2.2.
Colorspaces

2.3.
Indexed Format

2.4.
RGB Formats

Packed RGB formats -- Packed RGB formats

V4L2_PIX_FMT_SBGGR8 ('BA81') -- Bayer RGB format

V4L2_PIX_FMT_SBGGR16 ('BA82') -- Bayer RGB format

2.5.
YUV Formats

Packed YUV formats -- Packed YUV formats

V4L2_PIX_FMT_GREY ('GREY') -- Grey-scale image

V4L2_PIX_FMT_Y16 ('Y16 ') -- Grey-scale image

V4L2_PIX_FMT_YUYV ('YUYV') -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2

V4L2_PIX_FMT_UYVY ('UYVY') -- Variation of
V4L2_PIX_FMT_YUYV with different order of samples in memory

V4L2_PIX_FMT_Y41P ('Y41P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1

V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12') -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0

V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9') -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0

V4L2_PIX_FMT_YUV422P ('422P') -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to
V4L2_PIX_FMT_YUYV

V4L2_PIX_FMT_YUV411P ('411P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to
V4L2_PIX_FMT_Y41P

V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21') -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to
V4L2_PIX_FMT_YVU420

2.6.
Compressed Formats

2.7.
Reserved Format Identifiers

3.
Input/Output

3.1.

3.2.
Streaming I/O (Memory Mapping)

3.3.
Streaming I/O (User Pointers)

3.4.
Asynchronous I/O

3.5.
Buffers

3.5.1.
Timecodes

3.6.
Field Order

4.
Interfaces

4.1.
Video Capture Interface

4.1.1.
Querying Capabilities

4.1.2.
Supplemental Functions

4.1.3.
Image Format Negotiation

4.1.4.

4.2.
Video Overlay Interface

4.2.1.
Querying Capabilities

4.2.2.
Supplemental Functions

4.2.3.
Setup

4.2.4.
Overlay Window

4.2.5.
Enabling Overlay

4.3.
Video Output Interface

4.3.1.
Querying Capabilities

4.3.2.
Supplemental Functions

4.3.3.
Image Format Negotiation

4.3.4.
Writing Images

4.4.
Video Output Overlay Interface

4.4.1.
Querying Capabilities

4.4.2.
Framebuffer

4.4.3.
Overlay Window and Scaling

4.4.4.
Enabling Overlay

4.5.
Codec Interface

4.6.
Effect Devices Interface

4.7.
Raw VBI Data Interface

4.7.1.
Querying Capabilities

4.7.2.
Supplemental Functions

4.7.3.
Raw VBI Format Negotiation

4.7.4.

4.8.
Sliced VBI Data Interface

4.8.1.
Querying Capabilities

4.8.2.
Supplemental Functions

4.8.3.
Sliced VBI Format Negotiation

4.8.4.
Reading and writing sliced VBI data

4.9.
Teletext Interface

4.10.

4.10.1.
Querying Capabilities

4.10.2.
Supplemental Functions

4.10.3.
Programming

4.11.
RDS Interface

I.
Function Reference

V4L2 close() -- Close a V4L2 device

V4L2 ioctl() -- Program a V4L2 device

ioctl VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities

ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers

ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command

ioctl VIDIOC_ENUMAUDIO -- Enumerate audio inputs

ioctl VIDIOC_ENUMAUDOUT -- Enumerate audio outputs

ioctl VIDIOC_ENUM_FMT -- Enumerate image formats

ioctl VIDIOC_ENUM_FRAMESIZES -- Enumerate frame sizes

ioctl VIDIOC_ENUM_FRAMEINTERVALS -- Enumerate frame intervals

ioctl VIDIOC_ENUMINPUT -- Enumerate video inputs

ioctl VIDIOC_ENUMOUTPUT -- Enumerate video outputs

ioctl VIDIOC_ENUMSTD -- Enumerate supported video standards

ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO -- Query or select the current audio input and its attributes

ioctl VIDIOC_G_AUDOUT, VIDIOC_S_AUDOUT -- Query or select the current audio output

ioctl VIDIOC_G_CHIP_IDENT -- Identify the chips on a TV card

ioctl VIDIOC_G_CROP, VIDIOC_S_CROP -- Get or set the current cropping rectangle

ioctl VIDIOC_G_CTRL, VIDIOC_S_CTRL -- Get or set the value of a control

ioctl VIDIOC_G_ENC_INDEX -- Get meta data about a compressed video stream

ioctl VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS, VIDIOC_TRY_EXT_CTRLS -- Get or set the value of several controls, try control values

ioctl VIDIOC_G_FBUF, VIDIOC_S_FBUF -- Get or set frame buffer overlay parameters

ioctl VIDIOC_G_FMT, VIDIOC_S_FMT, VIDIOC_TRY_FMT -- Get or set the data format, try a format

ioctl VIDIOC_G_FREQUENCY, VIDIOC_S_FREQUENCY -- Get or set tuner or modulator radio frequency

ioctl VIDIOC_G_INPUT, VIDIOC_S_INPUT -- Query or select the current video input

ioctl VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP --

ioctl VIDIOC_G_MODULATOR, VIDIOC_S_MODULATOR -- Get or set modulator attributes

ioctl VIDIOC_G_OUTPUT, VIDIOC_S_OUTPUT -- Query or select the current video output

ioctl VIDIOC_G_PARM, VIDIOC_S_PARM -- Get or set streaming parameters

ioctl VIDIOC_G_PRIORITY, VIDIOC_S_PRIORITY -- Query or request the access priority associated with a file descriptor

ioctl VIDIOC_G_SLICED_VBI_CAP -- Query sliced VBI capabilities

ioctl VIDIOC_G_STD, VIDIOC_S_STD -- Query or select the video standard of the current input

ioctl VIDIOC_G_TUNER, VIDIOC_S_TUNER -- Get or set tuner attributes

ioctl VIDIOC_LOG_STATUS -- Log driver status information

ioctl VIDIOC_OVERLAY -- Start or stop video overlay

ioctl VIDIOC_QBUF, VIDIOC_DQBUF -- Exchange a buffer with the driver

ioctl VIDIOC_QUERYBUF -- Query the status of a buffer

ioctl VIDIOC_QUERYCAP -- Query device capabilities

ioctl VIDIOC_QUERYSTD -- Sense the video standard received by the current input

ioctl VIDIOC_REQBUFS -- Initiate Memory Mapping or User Pointer I/O

ioctl VIDIOC_STREAMON, VIDIOC_STREAMOFF -- Start or stop streaming I/O

V4L2 mmap() -- Map device memory into application address space

V4L2 munmap() -- Unmap device memory

V4L2 open() -- Open a V4L2 device

V4L2 poll() -- Wait for some event on a file descriptor

V4L2 select() -- Synchronous I/O multiplexing

V4L2 write() -- Write to a V4L2 device

5.
V4L2 Driver Programming

6.
Changes

6.1.
Differences between V4L and V4L2

6.1.1.
Opening and Closing Devices

6.1.2.
Querying Capabilities

6.1.3.
Video Sources

6.1.4.
Tuning

6.1.5.
Image Properties

6.1.6.
Audio

6.1.7.
Frame Buffer Overlay

6.1.8.
Cropping

6.1.9.

6.1.10.

6.1.11.
Miscellaneous

6.2.
Changes of the V4L2 API

6.2.1.
Early Versions

6.2.2.
V4L2 Version 0.16 1999-01-31

6.2.3.
V4L2 Version 0.18 1999-03-16

6.2.4.
V4L2 Version 0.19 1999-06-05

6.2.5.
V4L2 Version 0.20 (1999-09-10)

6.2.6.
V4L2 Version 0.20 incremental changes

6.2.7.
V4L2 Version 0.20 2000-11-23

6.2.8.
V4L2 Version 0.20 2002-07-25

6.2.9.
V4L2 in Linux 2.5.46, 2002-10

6.2.10.
V4L2 2003-06-19

6.2.11.
V4L2 2003-11-05

6.2.12.
V4L2 in Linux 2.6.6, 2004-05-09

6.2.13.
V4L2 in Linux 2.6.8

6.2.14.
V4L2 spec erratum 2004-08-01

6.2.15.
V4L2 in Linux 2.6.14

6.2.16.
V4L2 in Linux 2.6.15

6.2.17.
V4L2 spec erratum 2005-11-27

6.2.18.
V4L2 spec erratum 2006-01-10

6.2.19.
V4L2 spec erratum 2006-02-03

6.2.20.
V4L2 spec erratum 2006-02-04

6.2.21.
V4L2 in Linux 2.6.17

6.2.22.
V4L2 spec erratum 2006-09-23 (Draft 0.15)

6.2.23.
V4L2 in Linux 2.6.18

6.2.24.
V4L2 in Linux 2.6.19

6.2.25.
V4L2 spec erratum 2006-10-12 (Draft 0.17)

6.2.26.
V4L2 in Linux 2.6.21

6.2.27.
V4L2 in Linux 2.6.22

6.2.28.
V4L2 in Linux 2.6.24

6.2.29.
V4L2 in Linux 2.6.25

6.3.
Relation of V4L2 to other Linux multimedia APIs

6.3.1.
X Video Extension

6.3.2.
Digital Video

6.3.3.
Audio Interfaces

6.4.
Experimental API Elements

6.5.
Obsolete API Elements

A.
Video For Linux Two Header File

B.
Video Capture Example

C.

C.1.
0. PREAMBLE

C.2.
1. APPLICABILITY AND DEFINITIONS

C.3.
2. VERBATIM COPYING

C.4.
3. COPYING IN QUANTITY

C.5.
4. MODIFICATIONS

C.6.
5. COMBINING DOCUMENTS

C.7.
6. COLLECTIONS OF DOCUMENTS

C.8.
7. AGGREGATION WITH INDEPENDENT WORKS

C.9.
8. TRANSLATION

C.10.
9. TERMINATION

C.11.
10. FUTURE REVISIONS OF THIS LICENSE

C.12.

List of Types

References

List of Figures

1-1.
Image Cropping, Insertion and Scaling

3-1.
Field Order, Top Field First Transmitted

3-2.
Field Order, Bottom Field First Transmitted

4-1.
Line synchronization

4-2.
ITU-R 525 line numbering (M/NTSC and M/PAL)

4-3.
ITU-R 625 line numbering

List of Examples

1-1.
Information about the current video input

1-2.
Switching to the first video input

1-3.
Information about the current audio input

1-4.
Switching to the first audio input

1-5.
Information about the current video standard

1-6.
Listing the video standards supported by the current input

1-7.
Selecting a new video standard

1-8.
Enumerating all controls

1-9.
Changing controls

1-10.
Resetting the cropping parameters

1-11.
Simple downscaling

1-12.
Selecting an output area

1-13.
Current scaling factor and pixel aspect

2-1.
ITU-R Rec. BT.601 color conversion

2-1.
V4L2_PIX_FMT_BGR24 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_SBGGR8 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_SBGGR16 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_GREY 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_Y16 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_YUYV 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_UYVY 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_Y41P 8 × 4 pixel image

2-1.
V4L2_PIX_FMT_YVU420 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_YVU410 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_YUV422P 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_YUV411P 4 × 4 pixel image

2-1.
V4L2_PIX_FMT_NV12 4 × 4 pixel image

3-1.
Mapping buffers

3-2.
Initiating streaming I/O with user pointers

4-1.
Finding a framebuffer device for OSD

Introduction

Video For Linux Two is the second version of the Video For Linux API, a kernel interface for analog radio and video capture and output drivers.

Early drivers used ad-hoc interfaces. These were replaced in Linux 2.2 by Alan Cox' V4L API, based on the interface of the bttv driver. In 1999 Bill Dirks started the development of V4L2 to fix some shortcomings of V4L and to support a wider range of devices.
The API was revised again in 2002 prior to its inclusion in Linux 2.5/2.6, and work continues on improvements and additions while maintaining compatibility with existing drivers and applications. In 2006/2007 efforts began on FreeBSD drivers with a V4L2 interface.

This book documents the V4L2 API. The intended audience are driver and application writers.

If you have questions or ideas regarding the API, please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list.
For inquiries about the V4L2 specification contact the maintainer mschimek@gmx.at.

Chapter 1. Common API Elements

Programming a V4L2 device consists of these steps:

Opening the device

Changing device properties, selecting a video and audio input, video standard, picture brightness a. o.

Negotiating a data format

Negotiating an input/output method

The actual input/output loop

Closing the device

In practice most steps are optional and can be executed out of order. It depends on the V4L2 device type; you can read about the details in Chapter 4. In this chapter we will discuss the basic concepts applicable to all devices.

1.1. Opening and Closing Devices

1.1.1. Device Naming

V4L2 drivers are implemented as kernel modules, loaded manually by the system administrator or automatically when a device is first opened. The driver modules plug into the "videodev" kernel module. It provides helper functions and a common application interface
specified in this document.

Each driver thus loaded registers one or more device nodes with major number 81 and a minor number between 0 and 255. Assigning minor numbers to V4L2 devices is entirely up to the system administrator; this is primarily intended to solve conflicts between devices.[1] The
module options to select minor numbers are named after the device special file with a "_nr" suffix. For example "video_nr" for /dev/video video capture devices. The number is an offset to the base minor number associated with the device
type.[2] When the driver supports multiple devices of the same type more than one minor number can
be assigned, separated by commas:

In /etc/modules.conf this may be written as:

alias char-major-81-0 mydriver
alias char-major-81-1 mydriver
alias char-major-81-64 mydriver

When an application attempts to open a device special file with major number 81 and minor number 0, 1, or 64, load "mydriver" (and the "videodev" module it depends upon).

Register the first two video capture devices with minor number 0 and 1 (base number is 0), the first two radio devices with minor number 64 and 65 (base 64).
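The offset scheme described here can be written out as trivial arithmetic. A sketch, assuming the base minor numbers used in the examples of this chapter (video 0, radio 64, VBI 224); the names are illustrative:

```c
#include <assert.h>

/* Base minor numbers per device type, as used in this chapter's
 * examples (an assumption; Chapter 4 gives the recommended values). */
enum { BASE_VIDEO = 0, BASE_RADIO = 64, BASE_VBI = 224 };

/* The "_nr" module option is an offset from the type's base minor. */
static int minor_for(int base, int nr)
{
	return base + nr;
}
```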

When no minor number is given as module option the driver supplies a default.
Chapter 4 recommends the base minor numbers
to be used for the various device types. Obviously minor numbers must be unique. When the number is already in use the
offending device will not be registered.

By convention system administrators create various character device special files with these major and minor numbers in the /dev directory. The names recommended for the different V4L2 device types are listed in Chapter 4.

The creation of character special files (with mknod) is a privileged operation and devices cannot be opened by major and minor number. That means applications cannot reliably scan
for loaded or installed drivers. The user must enter a device name, or the application can try the conventional device names.

Under the device filesystem (devfs) the minor number options are ignored. V4L2 drivers (or by proxy the "videodev" module) automatically create the required device files in the /dev/v4l directory using the conventional device names above.

1.1.2. Related Devices

Devices can support several related functions. For example video capturing, video overlay and VBI capturing are related because these functions share, amongst other, the same video input and tuner frequency. V4L and earlier versions of V4L2 used the same device
name and minor number for video capturing and overlay, but different ones for VBI. Experience showed this approach has several problems[3],
and to make things worse the V4L videodev module used to prohibit multiple opens of a device.

As a remedy the present version of the V4L2 API relaxed the concept of device types with specific names and minor numbers. For compatibility with old applications drivers must still register different minor numbers to assign a default function to the device. But
if related functions are supported by the driver they must be available under all registered minor numbers. The desired function can be selected after opening the device as described in Chapter 4.

Imagine a driver supporting video capturing, video overlay, raw VBI capturing, and FM radio reception. It registers three devices with minor number 0, 64 and 224 (this numbering scheme is inherited from the V4L API). Regardless if /dev/video (81,
0) or /dev/vbi (81, 224) is opened the application can select any one of the video capturing, overlay or VBI capturing functions. Without programming (e. g. reading from the device with dd or cat) /dev/video captures
video images, while /dev/vbi captures raw VBI data. /dev/radio (81, 64) is invariably a radio device, unrelated to the video functions. Being unrelated does not imply the devices can be used at the same time,
however. The open() function may very well return an EBUSY error
code.

Besides video input or output the hardware may also support audio sampling or playback. If so, these functions are implemented as OSS or ALSA PCM devices and eventually an OSS or ALSA audio mixer. The V4L2 API makes no provisions yet to find these related devices.
If you have an idea please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list.

1.1.3. Multiple Opens

In general, V4L2 devices can be opened more than once. When this is supported by the driver, users can for example start a "panel" application to change controls like brightness or audio volume, while another application captures video and audio. In other words, panel
applications are comparable to an OSS or ALSA audio mixer application. When a device supports multiple functions like capturing and overlay simultaneously, multiple opens
allow concurrent use of the device by forked processes or specialized applications.

Multiple opens are optional, although drivers should permit at least concurrent accesses without data exchange, i. e. panel applications. This implies open() can
return an EBUSY error code when the device is already in use, as well as the ioctl() functions
initiating data exchange (namely the VIDIOC_S_FMT ioctl), and the read() and write() functions.

Merely opening a V4L2 device does not grant exclusive access.[4] Initiating data exchange however assigns
the right to read or write the requested type of data, and to change related properties, to this file descriptor. Applications can request additional access privileges using the priority mechanism described in Section 1.3.

1.1.4. Shared Data Streams

V4L2 drivers should not support multiple applications reading or writing the same data stream on a device by copying buffers, time multiplexing or similar means. This is better handled by a proxy application in user space. When the driver supports stream sharing
anyway it must be implemented transparently. The V4L2 API does not specify how conflicts are solved.

1.1.5. Functions

To open and close V4L2 devices applications use the open() and close() functions,
respectively. Devices are programmed using the ioctl() function as explained in the following
sections.

1.2. Querying Capabilities

Because V4L2 covers a wide variety of devices not all aspects of the API are equally applicable to all types of devices. Furthermore devices of the same type have different capabilities and this specification permits the omission of a few complicated and less important
parts of the API.

The VIDIOC_QUERYCAP ioctl is available to check if the kernel device is compatible
with this specification, and to query the functions and I/O methods supported
by the device. Other features can be queried by calling the respective ioctl, for example VIDIOC_ENUMINPUT to
learn about the number, types and names of video connectors on the device. Although abstraction is a major objective of this API, the ioctl also allows driver specific applications to reliably identify the driver.

All V4L2 drivers must support VIDIOC_QUERYCAP. Applications should always call this ioctl after opening the device.

1.3. Application Priority

When multiple applications share a device it may be desirable to assign them different priorities. Contrary to the traditional "rm -rf /" school of thought a video recording application could for example block other applications from changing video controls or
switching the current TV channel. Another objective is to permit low priority applications working in background, which can be preempted by user controlled applications and automatically regain control of the device at a later time.

Since these features cannot be implemented entirely in user space V4L2 defines the VIDIOC_G_PRIORITY and VIDIOC_S_PRIORITY ioctls
to request and query the access priority associated with a file descriptor. Opening a device assigns a medium priority, compatible with earlier versions of V4L2 and drivers not supporting these ioctls. Applications requiring a different priority will usually call VIDIOC_S_PRIORITY after
verifying the device with the VIDIOC_QUERYCAP ioctl.

Ioctls changing driver properties, such as VIDIOC_S_INPUT, return an EBUSY error
code after another application obtained higher priority. An event mechanism to notify applications about asynchronous property changes has been proposed but not added yet.

1.4. Video Inputs and Outputs

Video inputs and outputs are physical connectors of a device. These can be for example RF connectors (antenna/cable), CVBS a.k.a. Composite Video, S-Video or RGB connectors. Only video and VBI capture devices have inputs, output devices have outputs, at least
one each. Radio devices have no video inputs or outputs.

To learn about the number and attributes of the available inputs and outputs applications can enumerate them with the VIDIOC_ENUMINPUT and VIDIOC_ENUMOUTPUT ioctls,
respectively. The struct v4l2_input returned by the VIDIOC_ENUMINPUT ioctl also contains
signal status information applicable when the current video input is queried.

The VIDIOC_G_INPUT and VIDIOC_G_OUTPUT ioctls
return the index of the current video input or output. To select a different input or output applications call the VIDIOC_S_INPUT and VIDIOC_S_OUTPUT ioctls.
Drivers must implement all the input ioctls when the device has one or more inputs, all the output ioctls when the device has one or more outputs.

Example 1-1. Information about the current video input

struct v4l2_input input;
int index;

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &index)) {
perror ("VIDIOC_G_INPUT");
exit (EXIT_FAILURE);
}

memset (&input, 0, sizeof (input));
input.index = index;

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
perror ("VIDIOC_ENUMINPUT");
exit (EXIT_FAILURE);
}

printf ("Current input: %s\n", input.name);

Example 1-2. Switching to the first video input

int index;

index = 0;

if (-1 == ioctl (fd, VIDIOC_S_INPUT, &index)) {
perror ("VIDIOC_S_INPUT");
exit (EXIT_FAILURE);
}

1.5. Audio Inputs and Outputs

Audio inputs and outputs are physical connectors of a device. Video capture devices have inputs, output devices have outputs, zero or more each. Radio devices have no audio inputs or outputs. They have exactly one tuner which in fact is an
audio source, but this API associates tuners with video inputs or outputs only, and radio devices have none of these.[5] A
connector on a TV card to loop back the received audio signal to a sound card is not considered an audio output.

Audio and video inputs and outputs are associated. Selecting a video source also selects an audio source. This is most evident when the video and audio source is a tuner. Further audio connectors can combine with more than one video input or output. Assuming two composite
video inputs and two audio inputs exist, there may be up to four valid combinations. The relation of video and audio connectors is defined in the audioset field of the respective struct v4l2_input or
struct v4l2_output, where each bit represents the index number, starting at zero, of one audio input or output.

To learn about the number and attributes of the available inputs and outputs applications can enumerate them with the VIDIOC_ENUMAUDIO and VIDIOC_ENUMAUDOUT ioctls,
respectively. The struct v4l2_audio returned by the VIDIOC_ENUMAUDIO ioctl also contains
signal status information applicable when the current audio input is queried.

The VIDIOC_G_AUDIO and VIDIOC_G_AUDOUT ioctls
report the current audio input and output, respectively. Note that, unlike VIDIOC_G_INPUT and VIDIOC_G_OUTPUT, these
ioctls return a structure as VIDIOC_ENUMAUDIO and VIDIOC_ENUMAUDOUT do, not just an index.

To select an audio input and change its properties applications call the VIDIOC_S_AUDIO ioctl.
To select an audio output (which presently has no changeable properties) applications call the VIDIOC_S_AUDOUT ioctl.

Drivers must implement all input ioctls when the device has one or more inputs, all output ioctls when the device has one or more outputs. When the device has any audio inputs or outputs the driver must set the V4L2_CAP_AUDIO flag
in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl.

Example 1-3. Information about the current audio input

struct v4l2_audio audio;

memset (&audio, 0, sizeof (audio));

if (-1 == ioctl (fd, VIDIOC_G_AUDIO, &audio)) {
perror ("VIDIOC_G_AUDIO");
exit (EXIT_FAILURE);
}

printf ("Current input: %s\n", audio.name);

Example 1-4. Switching to the first audio input

struct v4l2_audio audio;

memset (&audio, 0, sizeof (audio)); /* clear audio.mode, audio.reserved */

audio.index = 0;

if (-1 == ioctl (fd, VIDIOC_S_AUDIO, &audio)) {
perror ("VIDIOC_S_AUDIO");
exit (EXIT_FAILURE);
}

1.6. Tuners and Modulators

1.6.1. Tuners

Video input devices can have one or more tuners demodulating a RF signal. Each tuner is associated with one or more video inputs, depending on the number of RF connectors on the tuner. The type field of the respective struct v4l2_input returned
by the VIDIOC_ENUMINPUT ioctl is set to V4L2_INPUT_TYPE_TUNER and
its tuner field contains the index number of the tuner.

Radio devices have exactly one tuner with index zero, and no video inputs.

To query and change tuner properties applications use the VIDIOC_G_TUNER and VIDIOC_S_TUNER ioctls,
respectively. The struct v4l2_tuner returned by VIDIOC_G_TUNER also contains signal status
information applicable when the tuner of the current video input, or a radio tuner, is queried. Note that VIDIOC_S_TUNER does not switch the current tuner when there is more than one; the tuner is solely determined by the current
video input. Drivers must support both ioctls and set the V4L2_CAP_TUNER flag in the struct v4l2_capability returned
by the VIDIOC_QUERYCAP ioctl when the device has one or more tuners.

1.6.2. Modulators

Video output devices can have one or more modulators, modulating a video signal for radiation or connection to the antenna input of a TV set or video recorder. Each modulator is associated with one or more video outputs, depending on the number of RF connectors
on the modulator. The type field of the respective struct v4l2_output returned by
the VIDIOC_ENUMOUTPUT ioctl is set to V4L2_OUTPUT_TYPE_MODULATOR and
its modulator field contains the index number of the modulator. This specification does not define radio output devices.

To query and change modulator properties applications use the VIDIOC_G_MODULATOR and VIDIOC_S_MODULATOR ioctls.
Note that VIDIOC_S_MODULATOR does not switch the current modulator when there is more than one; the modulator is solely determined by the current video output. Drivers must support both ioctls and set the V4L2_CAP_TUNER (sic)
flag in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl
when the device has one or more modulators.

To get and set the tuner or modulator radio frequency applications use the VIDIOC_G_FREQUENCY and VIDIOC_S_FREQUENCY ioctls,
which both take a pointer to a struct v4l2_frequency. These ioctls are used for TV and radio devices alike. Drivers
must support both ioctls when the tuner or modulator ioctls are supported, or when the device is a radio device.

To be discussed. See also proposals by Peter Schlaf, video4linux-list@redhat.com on 23 Oct 2002, subject: "Re: [V4L] Re: v4l2 api".

1.7. Video Standards

Video devices typically support one or more different video standards or variations of standards. Each video input and output may support another set of standards. This set is reported by the std field of struct v4l2_input and struct v4l2_output returned
by the VIDIOC_ENUMINPUT and VIDIOC_ENUMOUTPUT ioctls,
respectively.

V4L2 defines one bit for each analog video standard currently in use worldwide, and sets aside bits for driver defined standards, e. g. hybrid standards to watch NTSC video tapes on PAL TVs and vice versa. Applications can use the predefined bits to select a particular
standard, although presenting the user a menu of supported standards is preferred. To enumerate and query the attributes of the supported standards applications use the VIDIOC_ENUMSTD ioctl.

Many of the defined standards are actually just variations of a few major standards. The hardware may in fact not distinguish between them, or do so internally and switch automatically. Therefore enumerated standards also contain sets of one or more standard bits.

Assume a hypothetical tuner capable of demodulating B/PAL, G/PAL and I/PAL signals. The first enumerated standard is a set of B and G/PAL, switched automatically depending on the selected radio frequency in the UHF or VHF band. Enumeration gives a "PAL-B/G" or "PAL-I" choice.
Similarly a Composite input may collapse standards, enumerating "PAL-B/G/H/I", "NTSC-M" and "SECAM-D/K".[6]

To query and select the standard used by the current video input or output applications call the VIDIOC_G_STD and VIDIOC_S_STD ioctls,
respectively. The received standard can be sensed with the VIDIOC_QUERYSTD ioctl.
Note that the parameter of all these ioctls is a pointer to a v4l2_std_id type (a standard set), not an
index into the standard enumeration.[7] Drivers must implement all video standard ioctls when the device
has one or more video inputs or outputs.

Special rules apply to USB cameras where the notion of video standards makes little sense. More generally they apply to any capture device (output devices accordingly) which is

incapable of capturing fields or frames at the nominal rate of the video standard, or

where timestamps refer to the instant the field or frame was received by the driver, not the capture time, or

where sequence numbers refer to the frames received by the driver, not the captured frames.

Here the driver shall set the std field of struct v4l2_input and struct v4l2_output to zero, and the VIDIOC_G_STD, VIDIOC_S_STD, VIDIOC_QUERYSTD and VIDIOC_ENUMSTD ioctls shall return the EINVAL error code.[8]

Example 1-5. Information about the current video standard

v4l2_std_id std_id;
struct v4l2_standard standard;

if (-1 == ioctl (fd, VIDIOC_G_STD, &std_id)) {
/* Note when VIDIOC_ENUMSTD always returns EINVAL this
is no video device or it falls under the USB exception,
and VIDIOC_G_STD returning EINVAL is no error. */

perror ("VIDIOC_G_STD");
exit (EXIT_FAILURE);
}

memset (&standard, 0, sizeof (standard));
standard.index = 0;

while (0 == ioctl (fd, VIDIOC_ENUMSTD, &standard)) {
if (standard.id & std_id) {
printf ("Current video standard: %s\n", standard.name);
exit (EXIT_SUCCESS);
}

standard.index++;
}

/* EINVAL indicates the end of the enumeration, which cannot be
empty unless this device falls under the USB exception. */

if (errno == EINVAL || standard.index == 0) {
perror ("VIDIOC_ENUMSTD");
exit (EXIT_FAILURE);
}

Example 1-6. Listing the video standards supported by the current input

struct v4l2_input input;
struct v4l2_standard standard;

memset (&input, 0, sizeof (input));

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &input.index)) {
perror ("VIDIOC_G_INPUT");
exit (EXIT_FAILURE);
}

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
perror ("VIDIOC_ENUM_INPUT");
exit (EXIT_FAILURE);
}

printf ("Current input %s supports:\n", input.name);

memset (&standard, 0, sizeof (standard));
standard.index = 0;

while (0 == ioctl (fd, VIDIOC_ENUMSTD, &standard)) {
if (standard.id & input.std)
printf ("%s\n", standard.name);

standard.index++;
}

/* EINVAL indicates the end of the enumeration, which cannot be
empty unless this device falls under the USB exception. */

if (errno != EINVAL || standard.index == 0) {
perror ("VIDIOC_ENUMSTD");
exit (EXIT_FAILURE);
}

Example 1-7. Selecting a new video standard

struct v4l2_input input;
v4l2_std_id std_id;

memset (&input, 0, sizeof (input));

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &input.index)) {
perror ("VIDIOC_G_INPUT");
exit (EXIT_FAILURE);
}

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
perror ("VIDIOC_ENUM_INPUT");
exit (EXIT_FAILURE);
}

if (0 == (input.std & V4L2_STD_PAL_BG)) {
fprintf (stderr, "Oops. B/G PAL is not supported.\n");
exit (EXIT_FAILURE);
}

/* Note this is also supposed to work when only B
or G/PAL is supported. */

std_id = V4L2_STD_PAL_BG;

if (-1 == ioctl (fd, VIDIOC_S_STD, &std_id)) {
perror ("VIDIOC_S_STD");
exit (EXIT_FAILURE);
}

1.8. User Controls

Devices typically have a number of user-settable controls such as brightness, saturation and so on, which would be presented to the user on a graphical user interface. But, different devices will have different controls available, and furthermore, the range of possible
values, and the default value will vary from device to device. The control ioctls provide the information and a mechanism to create a nice user interface for these controls that will work correctly with any device.

All controls are accessed using an ID value. V4L2 defines several IDs for specific purposes. Drivers can also implement their own custom controls using V4L2_CID_PRIVATE_BASE and higher values. The pre-defined control IDs have the prefix V4L2_CID_,
and are listed in Table 1-1. The ID is used when querying the attributes of a control, and when getting or setting
the current value.

Generally applications should present controls to the user without assumptions about their purpose. Each control comes with a name string the user is supposed to understand. When the purpose is non-intuitive the driver writer should provide a user manual, a user interface
plug-in or a driver specific panel application. Predefined IDs were introduced to change a few controls programmatically, for example to mute a device during a channel switch.

Drivers may enumerate different controls after switching the current video input or output, tuner or modulator, or audio input or output. Different in the sense of other bounds, another default and current value, step size or other menu items. A control with a certain
custom ID can also change name and type.[9] Control
values are stored globally, they do not change when switching except to stay within the reported bounds. They also do not change e. g. when the device is opened or closed, when the tuner radio frequency is changed or generally never without application request.
Since V4L2 specifies no event mechanism, panel applications intended to cooperate with other panel applications (be they built into a larger application, as a TV viewer) may need to regularly poll control values to update their user interface.[10]

Table 1-1. Control IDs

Applications can enumerate the available controls with the VIDIOC_QUERYCTRL and VIDIOC_QUERYMENU ioctls,
and get and set a control value with the VIDIOC_G_CTRL and VIDIOC_S_CTRL ioctls. Drivers
must implement VIDIOC_QUERYCTRL, VIDIOC_G_CTRL and VIDIOC_S_CTRL when the device has one or more controls, and VIDIOC_QUERYMENU when it has one or more menu type controls.

Example 1-8. Enumerating all controls

struct v4l2_queryctrl queryctrl;
struct v4l2_querymenu querymenu;

static void
enumerate_menu (void)
{
printf ("  Menu items:\n");

memset (&querymenu, 0, sizeof (querymenu));
querymenu.id = queryctrl.id;

for (querymenu.index = queryctrl.minimum;
querymenu.index <= queryctrl.maximum;
querymenu.index++) {
if (0 == ioctl (fd, VIDIOC_QUERYMENU, &querymenu)) {
printf ("  %s\n", querymenu.name);
} else {
perror ("VIDIOC_QUERYMENU");
exit (EXIT_FAILURE);
}
}
}

memset (&queryctrl, 0, sizeof (queryctrl));

for (queryctrl.id = V4L2_CID_BASE;
queryctrl.id < V4L2_CID_LASTP1;
queryctrl.id++) {
if (0 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
continue;

printf ("Control %s\n", queryctrl.name);

if (queryctrl.type == V4L2_CTRL_TYPE_MENU)
enumerate_menu ();
} else {
if (errno == EINVAL)
continue;

perror ("VIDIOC_QUERYCTRL");
exit (EXIT_FAILURE);
}
}

for (queryctrl.id = V4L2_CID_PRIVATE_BASE;;
queryctrl.id++) {
if (0 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
continue;

printf ("Control %s\n", queryctrl.name);

} else {
if (errno == EINVAL)
break;

perror ("VIDIOC_QUERYCTRL");
exit (EXIT_FAILURE);
}
}

Example 1-9. Changing controls

struct v4l2_queryctrl queryctrl;
struct v4l2_control control;

memset (&queryctrl, 0, sizeof (queryctrl));
queryctrl.id = V4L2_CID_BRIGHTNESS;

if (-1 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
if (errno != EINVAL) {
perror ("VIDIOC_QUERYCTRL");
exit (EXIT_FAILURE);
} else {
printf ("V4L2_CID_BRIGHTNESS is not supported\n");
}
} else if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED) {
printf ("V4L2_CID_BRIGHTNESS is not supported\n");
} else {
memset (&control, 0, sizeof (control));
control.id = V4L2_CID_BRIGHTNESS;
control.value = queryctrl.default_value;

if (-1 == ioctl (fd, VIDIOC_S_CTRL, &control)) {
perror ("VIDIOC_S_CTRL");
exit (EXIT_FAILURE);
}
}

memset (&control, 0, sizeof (control));
control.id = V4L2_CID_CONTRAST;

if (0 == ioctl (fd, VIDIOC_G_CTRL, &control)) {
control.value += 1;

/* The driver may clamp the value or return ERANGE, ignored here */

if (-1 == ioctl (fd, VIDIOC_S_CTRL, &control)
&& errno != ERANGE) {
perror ("VIDIOC_S_CTRL");
exit (EXIT_FAILURE);
}
/* Ignore if V4L2_CID_CONTRAST is unsupported */
} else if (errno != EINVAL) {
perror ("VIDIOC_G_CTRL");
exit (EXIT_FAILURE);
}

control.id = V4L2_CID_AUDIO_MUTE;
control.value = 1; /* silence */

/* Errors ignored */
ioctl (fd, VIDIOC_S_CTRL, &control);

1.9. Extended Controls

1.9.1. Introduction

The control mechanism as originally designed was meant to be used for user settings (brightness, saturation, etc). However, it turned out to be a very useful model for implementing more complicated driver APIs where each driver implements only a subset of a larger
API.

The MPEG encoding API was the driving force behind designing and implementing this extended control mechanism: the MPEG standard is quite large and the currently supported hardware MPEG encoders each only implement a subset of this standard. Furthermore, many
parameters relating to how the video is encoded into an MPEG stream are specific to the MPEG encoding chip, since the MPEG standard only defines the format of the resulting MPEG stream, not how the video is actually encoded into that format.

Unfortunately, the original control API lacked some features needed for these new uses and so it was extended into the (not terribly originally named) extended control API.

1.9.2. The Extended Control API

Three new ioctls are available: VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS and VIDIOC_TRY_EXT_CTRLS.
These ioctls act on arrays of controls (as opposed to the VIDIOC_G_CTRL and VIDIOC_S_CTRL ioctls
that act on a single control). This is needed since it is often required to atomically change several controls at once.

Each of the new ioctls expects a pointer to a struct v4l2_ext_controls. This structure contains a pointer to
the control array, a count of the number of controls in that array and a control class. Control classes are used to group similar controls into a single class. For example, control class V4L2_CTRL_CLASS_USER contains all user controls (i. e.
all controls that can also be set using the old VIDIOC_S_CTRL ioctl). Control class V4L2_CTRL_CLASS_MPEG contains all controls relating to MPEG encoding, etc.

All controls in the control array must belong to the specified control class. An error is returned if this is not the case.

It is also possible to use an empty control array (count == 0) to check whether the specified control class is supported.

The control array is a struct v4l2_ext_control array. The v4l2_ext_control structure
is very similar to struct v4l2_control, except for the fact that it also allows for 64-bit values and pointers to
be passed (although the latter is not yet used anywhere).

It is important to realize that due to the flexibility of controls it is necessary to check whether the control you want to set actually is supported by the driver, and what the valid range of values is. So use the VIDIOC_QUERYCTRL and VIDIOC_QUERYMENU ioctls
to check this. Also note that it is possible that some of the menu indices in a control of type V4L2_CTRL_TYPE_MENU may not be supported (VIDIOC_QUERYMENU will return an error). A good example is the list
of supported MPEG audio bitrates. Some drivers only support one or two bitrates, others support a wider range.

1.9.3. Enumerating Extended Controls

The recommended way to enumerate over the extended controls is by using VIDIOC_QUERYCTRL in
combination with the V4L2_CTRL_FLAG_NEXT_CTRL flag:

struct v4l2_queryctrl qctrl;

qctrl.id = V4L2_CTRL_FLAG_NEXT_CTRL;
while (0 == ioctl (fd, VIDIOC_QUERYCTRL, &qctrl)) {
/* ... */
qctrl.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
}

The initial control ID is set to 0 ORed with the V4L2_CTRL_FLAG_NEXT_CTRL flag. The VIDIOC_QUERYCTRL ioctl will return the first control with a higher ID than the specified one. When no such controls are
found an error is returned.

If you want to get all controls within a specific control class, then you can set the initial qctrl.id value to the control class and add an extra check to break out of the loop when a control of another control class is found:

qctrl.id = V4L2_CTRL_CLASS_MPEG | V4L2_CTRL_FLAG_NEXT_CTRL;
while (0 == ioctl (fd, VIDIOC_QUERYCTRL, &qctrl)) {
if (V4L2_CTRL_ID2CLASS (qctrl.id) != V4L2_CTRL_CLASS_MPEG)
break;
/* ... */
qctrl.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
}

The 32-bit qctrl.id value is subdivided into three bit ranges: the top 4 bits are reserved for flags (e. g. V4L2_CTRL_FLAG_NEXT_CTRL) and are not actually part of the ID. The remaining 28 bits form
the control ID, of which the most significant 12 bits define the control class and the least significant 16 bits identify the control within the control class. It is guaranteed that these last 16 bits are always non-zero for controls. The range of 0x1000 and up
is reserved for driver-specific controls. The macro V4L2_CTRL_ID2CLASS(id) returns the control class ID based on a control ID.

If the driver does not support extended controls, then VIDIOC_QUERYCTRL will fail when used in combination with V4L2_CTRL_FLAG_NEXT_CTRL. In that case the old method of enumerating controls should be used
(see 1.8). But if it is supported, then it is guaranteed to enumerate over all controls, including driver-private controls.

1.9.4. Creating Control Panels

It is possible to create control panels for a graphical user interface where the user can select the various controls. Basically you will have to iterate over all controls using the method described above. Each control class starts with a control of type V4L2_CTRL_TYPE_CTRL_CLASS. VIDIOC_QUERYCTRL will
return the name of this control class which can be used as the title of a tab page within a control panel.

The flags field of struct v4l2_queryctrl also contains hints on the behavior of the control. See the VIDIOC_QUERYCTRL documentation for
more details.

1.9.5. MPEG Control Reference

Below all controls within the MPEG control class are described. First the generic controls, then controls specific to certain hardware.

1.9.5.1. Generic MPEG Controls

Table 1-2. MPEG Control IDs
ID / Type / Description

V4L2_CID_MPEG_CLASS (class)
The MPEG class descriptor. Calling VIDIOC_QUERYCTRL for this control will return a description of this control class. This description can be used as the caption of a tab page in a GUI, for example.

V4L2_CID_MPEG_STREAM_TYPE (enum)
The MPEG-1, -2 or -4 output stream type. One cannot assume anything here. Each hardware MPEG encoder tends to support different subsets of the available MPEG stream types. The currently defined stream types are:
V4L2_MPEG_STREAM_TYPE_MPEG2_PS - MPEG-2 program stream
V4L2_MPEG_STREAM_TYPE_MPEG2_TS - MPEG-2 transport stream
V4L2_MPEG_STREAM_TYPE_MPEG1_SS - MPEG-1 system stream
V4L2_MPEG_STREAM_TYPE_MPEG2_DVD - MPEG-2 DVD-compatible stream
V4L2_MPEG_STREAM_TYPE_MPEG1_VCD - MPEG-1 VCD-compatible stream
V4L2_MPEG_STREAM_TYPE_MPEG2_SVCD - MPEG-2 SVCD-compatible stream

V4L2_CID_MPEG_STREAM_PID_PMT (integer)
Program Map Table Packet ID for the MPEG transport stream (default 16).

V4L2_CID_MPEG_STREAM_PID_AUDIO (integer)
Audio Packet ID for the MPEG transport stream (default 256).

V4L2_CID_MPEG_STREAM_PID_VIDEO (integer)
Video Packet ID for the MPEG transport stream (default 260).

V4L2_CID_MPEG_STREAM_PID_PCR (integer)
Packet ID for the MPEG transport stream carrying PCR fields (default 259).

V4L2_CID_MPEG_STREAM_PES_ID_AUDIO (integer)
Audio ID for MPEG PES.

V4L2_CID_MPEG_STREAM_PES_ID_VIDEO (integer)
Video ID for MPEG PES.

V4L2_CID_MPEG_STREAM_VBI_FMT (enum)
Some cards can embed VBI data (e. g. Closed Caption, Teletext) into the MPEG stream. This control selects whether VBI data should be embedded, and if so, what embedding method should be used. The list of possible VBI formats depends on the driver. The currently defined VBI format types are:
V4L2_MPEG_STREAM_VBI_FMT_NONE - no VBI in the MPEG stream
V4L2_MPEG_STREAM_VBI_FMT_IVTV - VBI in private packets, IVTV format (documented in the kernel sources in the file Documentation/video4linux/cx2341x/README.vbi)

V4L2_CID_MPEG_AUDIO_SAMPLING_FREQ (enum)
MPEG audio sampling frequency. Possible values are V4L2_MPEG_AUDIO_SAMPLING_FREQ_44100 (44.1 kHz), _48000 (48 kHz) and _32000 (32 kHz).

V4L2_CID_MPEG_AUDIO_ENCODING (enum)
MPEG audio encoding. Possible values are V4L2_MPEG_AUDIO_ENCODING_LAYER_1, _LAYER_2 and _LAYER_3 for MPEG Layer I, II and III encoding.

V4L2_CID_MPEG_AUDIO_L1_BITRATE (enum)
Layer I bitrate. Possible values are V4L2_MPEG_AUDIO_L1_BITRATE_32K through _448K: 32, 64, 96, 128, 160, 192, 224, 256, 288, 320, 352, 384, 416 and 448 kbit/s.

V4L2_CID_MPEG_AUDIO_L2_BITRATE (enum)
Layer II bitrate. Possible values are V4L2_MPEG_AUDIO_L2_BITRATE_32K through _384K: 32, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256, 320 and 384 kbit/s.

V4L2_CID_MPEG_AUDIO_L3_BITRATE (enum)
Layer III bitrate. Possible values are V4L2_MPEG_AUDIO_L3_BITRATE_32K through _320K: 32, 40, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256 and 320 kbit/s.

V4L2_CID_MPEG_AUDIO_MODE (enum)
MPEG audio mode. Possible values are V4L2_MPEG_AUDIO_MODE_STEREO (Stereo), _JOINT_STEREO (Joint Stereo), _DUAL (Bilingual) and _MONO (Mono).

V4L2_CID_MPEG_AUDIO_MODE_EXTENSION (enum)
Joint Stereo audio mode extension. In Layer I and II it indicates which subbands are in intensity stereo; all other subbands are coded in stereo. Layer III is not (yet) supported. Possible values are V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_4, _BOUND_8, _BOUND_12 and _BOUND_16: subbands 4-31, 8-31, 12-31 or 16-31 in intensity stereo.

V4L2_CID_MPEG_AUDIO_EMPHASIS (enum)
Audio emphasis. Possible values are V4L2_MPEG_AUDIO_EMPHASIS_NONE (none), _50_DIV_15_uS (50/15 microsecond emphasis) and _CCITT_J17 (CCITT J.17).

V4L2_CID_MPEG_AUDIO_CRC (enum)
CRC method. Possible values are V4L2_MPEG_AUDIO_CRC_NONE (none) and _CRC16 (16 bit parity check).

V4L2_CID_MPEG_AUDIO_MUTE (bool)
Mutes the audio when capturing. This is not done by muting audio hardware, which can still produce a slight hiss, but in the encoder itself, guaranteeing a fixed and reproducible audio bitstream. 0 = unmuted, 1 = muted.

V4L2_CID_MPEG_VIDEO_ENCODING (enum)
MPEG video encoding method. Possible values are V4L2_MPEG_VIDEO_ENCODING_MPEG_1 and _MPEG_2.

V4L2_CID_MPEG_VIDEO_ASPECT (enum)
Video aspect. Possible values are V4L2_MPEG_VIDEO_ASPECT_1x1, _4x3, _16x9 and _221x100.

V4L2_CID_MPEG_VIDEO_B_FRAMES (integer)
Number of B-frames (default 2).

V4L2_CID_MPEG_VIDEO_GOP_SIZE (integer)
GOP size (default 12).

V4L2_CID_MPEG_VIDEO_GOP_CLOSURE (bool)
GOP closure (default 1).

V4L2_CID_MPEG_VIDEO_PULLDOWN (bool)
Enable 3:2 pulldown (default 0).

V4L2_CID_MPEG_VIDEO_BITRATE_MODE (enum)
Video bitrate mode. Possible values are V4L2_MPEG_VIDEO_BITRATE_MODE_VBR (variable bitrate) and _CBR (constant bitrate).

V4L2_CID_MPEG_VIDEO_BITRATE (integer)
Video bitrate in bits per second.

V4L2_CID_MPEG_VIDEO_BITRATE_PEAK (integer)
Peak video bitrate in bits per second. Must be larger than or equal to the average video bitrate. It is ignored if the video bitrate mode is set to constant bitrate.

V4L2_CID_MPEG_VIDEO_TEMPORAL_DECIMATION (integer)
For every captured frame, skip this many subsequent frames (default 0).

V4L2_CID_MPEG_VIDEO_MUTE (bool)
"Mutes" the video to a fixed color when capturing. This is useful for testing, to produce a fixed video bitstream. 0 = unmuted, 1 = muted.

V4L2_CID_MPEG_VIDEO_MUTE_YUV (integer)
Sets the "mute" color of the video. The supplied 32-bit integer is interpreted as follows (bit 0 = least significant bit):
Bits 0:7 - V chrominance information
Bits 8:15 - U chrominance information
Bits 16:23 - Y luminance information
Bits 24:31 - must be zero

1.9.5.2. CX2341x MPEG Controls

The following MPEG class controls deal with MPEGencoding settings that are specific to the Conexant CX23415 andCX23416 MPEG encoding chips.

Table 1-3. CX2341x Control IDs
ID (Type)  Description

V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE (enum)  Sets the Spatial Filter mode (default MANUAL). Possible values are:

V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_MANUAL  Choose the filter manually
V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_AUTO    Choose the filter automatically

V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER (integer 0-15)  The setting for the Spatial Filter. 0 = off, 15 = maximum. (Default is 0.)

V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE (enum)  Select the algorithm to use for the Luma Spatial Filter (default 1D_HOR). Possible values:

V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_OFF                   No filter
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_1D_HOR                One-dimensional horizontal
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_1D_VERT               One-dimensional vertical
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_HV_SEPARABLE       Two-dimensional separable
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_SYM_NON_SEPARABLE  Two-dimensional symmetrical non-separable

V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE (enum)  Select the algorithm for the Chroma Spatial Filter (default 1D_HOR). Possible values are:

V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_OFF     No filter
V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_1D_HOR  One-dimensional horizontal

V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE (enum)  Sets the Temporal Filter mode (default MANUAL). Possible values are:

V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_MANUAL  Choose the filter manually
V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_AUTO    Choose the filter automatically

V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER (integer 0-31)  The setting for the Temporal Filter. 0 = off, 31 = maximum. (Default is 8 for full-scale capturing and 0 for scaled capturing.)

V4L2_CID_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE (enum)  Median Filter Type (default OFF). Possible values are:

V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_OFF       No filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR       Horizontal filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_VERT      Vertical filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR_VERT  Horizontal and vertical filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_DIAG      Diagonal filter

V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_BOTTOM (integer 0-255)  Threshold above which the luminance median filter is enabled (default 0).

V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_TOP (integer 0-255)  Threshold below which the luminance median filter is enabled (default 255).

V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_BOTTOM (integer 0-255)  Threshold above which the chroma median filter is enabled (default 0).

V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_TOP (integer 0-255)  Threshold below which the chroma median filter is enabled (default 255).

V4L2_CID_MPEG_CX2341X_STREAM_INSERT_NAV_PACKETS (bool)  The CX2341X MPEG encoder can insert one empty MPEG-2 PES packet into the stream between every four video frames. The packet size is 2048 bytes, including the packet_start_code_prefix and stream_id fields. The stream_id is 0xBF (private stream 2). The payload consists of 0x00 bytes, to be filled in by the application. 0 = do not insert, 1 = insert packets.

1.9.6. Camera Control Reference

The Camera class includes controls for mechanical (or equivalent digital) features of a device such as controllable lenses or sensors.

Table 1-4. Camera Control IDs

ID (Type)  Description

V4L2_CID_CAMERA_CLASS (class)  The Camera class descriptor. Calling VIDIOC_QUERYCTRL for this control will return a description of this control class.

V4L2_CID_EXPOSURE_AUTO (integer)  Enables automatic adjustments of the exposure time and/or iris aperture. The effect of manual changes of the exposure time or iris aperture while these features are enabled is undefined, drivers should ignore such requests. Possible values are:

V4L2_EXPOSURE_AUTO               Automatic exposure time, automatic iris aperture.
V4L2_EXPOSURE_MANUAL             Manual exposure time, manual iris.
V4L2_EXPOSURE_SHUTTER_PRIORITY   Manual exposure time, auto iris.
V4L2_EXPOSURE_APERTURE_PRIORITY  Auto exposure time, manual iris.

V4L2_CID_EXPOSURE_ABSOLUTE (integer)  Determines the exposure time of the camera sensor. The exposure time is limited by the frame interval. Drivers should interpret the values as 100 µs units, where the value 1 stands for 1/10000th of a second, 10000 for 1 second and 100000 for 10 seconds.

V4L2_CID_EXPOSURE_AUTO_PRIORITY (boolean)  When V4L2_CID_EXPOSURE_AUTO is set to AUTO or SHUTTER_PRIORITY, this control determines if the device may dynamically vary the frame rate. By default this feature is disabled (0) and the frame rate must remain constant.

V4L2_CID_PAN_RELATIVE (integer)  This control turns the camera horizontally by the specified amount. The unit is undefined. A positive value moves the camera to the right (clockwise when viewed from above), a negative value to the left. A value of zero does not cause motion.

V4L2_CID_TILT_RELATIVE (integer)  This control turns the camera vertically by the specified amount. The unit is undefined. A positive value moves the camera up, a negative value down. A value of zero does not cause motion.

V4L2_CID_PAN_RESET (boolean)  When this control is set to TRUE (1), the camera moves horizontally to the default position.

V4L2_CID_TILT_RESET (boolean)  When this control is set to TRUE (1), the camera moves vertically to the default position.

V4L2_CID_PAN_ABSOLUTE (integer)  This control turns the camera horizontally to the specified position. Positive values move the camera to the right (clockwise when viewed from above), negative values to the left. Drivers should interpret the values as arc seconds, with valid values between -180 * 3600 and +180 * 3600 inclusive.

V4L2_CID_TILT_ABSOLUTE (integer)  This control turns the camera vertically to the specified position. Positive values move the camera up, negative values down. Drivers should interpret the values as arc seconds, with valid values between -180 * 3600 and +180 * 3600 inclusive.

V4L2_CID_FOCUS_ABSOLUTE (integer)  This control sets the focal point of the camera to the specified position. The unit is undefined. Positive values set the focus closer to the camera, negative values towards infinity.

V4L2_CID_FOCUS_RELATIVE (integer)  This control moves the focal point of the camera by the specified amount. The unit is undefined. Positive values move the focus closer to the camera, negative values towards infinity.

V4L2_CID_FOCUS_AUTO (boolean)  Enables automatic focus adjustments. The effect of manual focus adjustments while this feature is enabled is undefined, drivers should ignore such requests.

1.10. Data Formats

1.10.1. Data Format Negotiation

Different devices exchange different kinds of data with applications, for example video images, raw or sliced VBI data, RDS datagrams. Even within one kind many different formats are possible, in particular an abundance of image formats. Although drivers must provide a default and the selection persists across closing and reopening a device, applications should always negotiate a data format before engaging in data exchange. Negotiation means the application asks for a particular format and the driver selects and reports the best the hardware can do to satisfy the request. Of course applications can also just query the current selection.

A single mechanism exists to negotiate all data formats using the aggregate struct v4l2_format and the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls. Additionally the VIDIOC_TRY_FMT ioctl can be used to examine what the hardware could do, without actually selecting a new data format. The data formats supported by the V4L2 API are covered in the respective device section in Chapter 4. For a closer look at image formats see Chapter 2.

The VIDIOC_S_FMT ioctl is a major turning-point in the initialization sequence. Prior to this point multiple panel applications can access the same device concurrently to select the current input, change controls or modify other properties. The first VIDIOC_S_FMT assigns a logical stream (video data, VBI data etc.) exclusively to one file descriptor.

Exclusive means no other application, more precisely no other file descriptor, can grab this stream or change device properties inconsistent with the negotiated parameters. A video standard change for example, when the new standard uses a different number of scan lines, can invalidate the selected image format. Therefore only the file descriptor owning the stream can make invalidating changes. Accordingly multiple file descriptors which grabbed different logical streams prevent each other from interfering with their settings. When for example video overlay is about to start or already in progress, simultaneous video capturing may be restricted to the same cropping and image size.

When applications omit the VIDIOC_S_FMT ioctl its locking side effects are implied by the next step, the selection of an I/O method with the VIDIOC_REQBUFS ioctl, or implicitly with the first read() or write() call.

Generally only one logical stream can be assigned to a file descriptor, the exception being drivers permitting simultaneous video capturing and overlay using the same file descriptor for compatibility with V4L and earlier versions of V4L2. Switching the logical stream or returning into "panel mode" is possible by closing and reopening the device. Drivers may support a switch using VIDIOC_S_FMT.

All drivers exchanging data with applications must support the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl. Implementation of the VIDIOC_TRY_FMT is highly recommended but optional.

1.10.2. Image Format Enumeration

Apart from the generic format negotiation functions, a special ioctl to enumerate all image formats supported by video capture, overlay or output devices is available.[11]

The VIDIOC_ENUM_FMT ioctl must be supported by all drivers exchanging image data with applications.

Important: Drivers are not supposed to convert image formats in kernel space. They must enumerate only formats directly supported by the hardware. If necessary driver writers should publish an example conversion routine or library for integration into applications.

1.11. Image Cropping, Insertion and Scaling

Some video capture devices can sample a subsection of the picture and shrink or enlarge it to an image of arbitrary size. We call these abilities cropping and scaling. Some video output devices can scale an image up or down and insert it at an arbitrary scan line and horizontal offset into a video signal.

Applications can use the following API to select an area in the video signal, query the default area and the hardware limits. Despite their name, the VIDIOC_CROPCAP, VIDIOC_G_CROP and VIDIOC_S_CROP ioctls apply to input as well as output devices.

Scaling requires a source and a target. On a video capture or overlay device the source is the video signal, and the cropping ioctls determine the area actually sampled. The target is the images read by the application or overlaid onto the graphics screen. Their size (and position for an overlay) is negotiated with the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls.

On a video output device the source is the images passed in by the application, and their size is again negotiated with the VIDIOC_G/S_FMT ioctls, or may be encoded in a compressed video stream. The target is the video signal, and the cropping ioctls determine the area where the images are inserted.

Source and target rectangles are defined even if the device does not support scaling or the VIDIOC_G/S_CROP ioctls. Their size (and position where applicable) will be fixed in this case. All capture and output devices must support the VIDIOC_CROPCAP ioctl such that applications can determine if scaling takes place.

1.11.1. Cropping Structures

Figure 1-1. Image Cropping, Insertion and Scaling

For capture devices the coordinates of the top left corner, width and height of the area which can be sampled is given by the bounds substructure of the struct v4l2_cropcap returned by the VIDIOC_CROPCAP ioctl. To support a wide range of hardware this specification does not define an origin or units. However by convention drivers should horizontally count unscaled samples relative to 0H (the leading edge of the horizontal sync pulse, see Figure 4-1). Vertically ITU-R line numbers of the first field (Figure 4-2, Figure 4-3), multiplied by two if the driver can capture both fields.

The top left corner, width and height of the source rectangle, that is the area actually sampled, is given by struct v4l2_crop using the same coordinate system as struct v4l2_cropcap. Applications can use the VIDIOC_G_CROP and VIDIOC_S_CROP ioctls to get and set this rectangle. It must lie completely within the capture boundaries and the driver may further adjust the requested size and/or position according to hardware limitations.

Each capture device has a default source rectangle, given by the defrect substructure of struct v4l2_cropcap. The center of this rectangle shall align with the center of the active picture area of the video signal, and cover what the driver writer considers the complete picture. Drivers shall reset the source rectangle to the default when the driver is first loaded, but not later.

For output devices these structures and ioctls are used accordingly, defining the target rectangle where the images will be inserted into the video signal.

1.11.2. Scaling Adjustments

Video hardware can have various cropping, insertion and scaling limitations. It may only scale up or down, support only discrete scaling factors, or have different scaling abilities in horizontal and vertical direction. Also it may not support scaling at all. At the same time the struct v4l2_crop rectangle may have to be aligned, and both the source and target rectangles may have arbitrary upper and lower size limits. In particular the maximum width and height in struct v4l2_crop may be smaller than the struct v4l2_cropcap.bounds area. Therefore, as usual, drivers are expected to adjust the requested parameters and return the actual values selected.

Applications can change the source or the target rectangle first, as they may prefer a particular image size or a certain area in the video signal. If the driver has to adjust both to satisfy hardware limitations, the last requested rectangle shall take priority, and the driver should preferably adjust the opposite one. The VIDIOC_TRY_FMT ioctl however shall not change the driver state and therefore only adjust the requested rectangle.

Suppose scaling on a video capture device is restricted to a factor 1:1 or 2:1 in either direction and the target image size must be a multiple of 16 × 16 pixels. The source cropping rectangle is set to defaults, which are also the upper limit in this example, of 640 × 400 pixels at offset 0, 0. An application requests an image size of 300 × 225 pixels, assuming video will be scaled down from the "full picture" accordingly. The driver sets the image size to the closest possible values 304 × 224, then chooses the cropping rectangle closest to the requested size, that is 608 × 224 (224 × 2:1 would exceed the limit 400). The offset 0, 0 is still valid, thus unmodified. Given the default cropping rectangle reported by VIDIOC_CROPCAP the application can easily propose another offset to center the cropping rectangle.

Now the application may insist on covering an area using a picture aspect ratio closer to the original request, so it asks for a cropping rectangle of 608 × 456 pixels. The present scaling factors limit cropping to 640 × 384, so the driver returns the cropping size 608 × 384 and adjusts the image size to closest possible 304 × 192.

1.11.3. Examples

Source and target rectangles shall remain unchanged across closing and reopening a device, such that piping data into or out of a device will work without special preparations. More advanced applications should ensure the parameters are suitable before starting I/O.

Example 1-10. Resetting the cropping parameters

(A video capture device is assumed; change V4L2_BUF_TYPE_VIDEO_CAPTURE for other devices.)

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));
crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
crop.c = cropcap.defrect;

/* Ignore if cropping is not supported (EINVAL). */

if (-1 == ioctl (fd, VIDIOC_S_CROP, &crop)
    && errno != EINVAL) {
        perror ("VIDIOC_S_CROP");
        exit (EXIT_FAILURE);
}

Example 1-11. Simple downscaling

(A video capture device is assumed.)

struct v4l2_cropcap cropcap;
struct v4l2_format format;

reset_cropping_parameters ();

/* Scale down to 1/4 size of full picture. */

memset (&format, 0, sizeof (format)); /* defaults */

format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

format.fmt.pix.width = cropcap.defrect.width >> 1;
format.fmt.pix.height = cropcap.defrect.height >> 1;
format.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;

if (-1 == ioctl (fd, VIDIOC_S_FMT, &format)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}

/* We could check the actual image size now, the actual scaling factor
or if the driver can scale at all. */

Example 1-12. Selecting an output area

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));

crop.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
crop.c = cropcap.defrect;

/* Scale the width and height to 50 % of their original size
and center the output. */

crop.c.width /= 2;
crop.c.height /= 2;
crop.c.left += crop.c.width / 2;
crop.c.top += crop.c.height / 2;

/* Ignore if cropping is not supported (EINVAL). */

if (-1 == ioctl (fd, VIDIOC_S_CROP, &crop)
    && errno != EINVAL) {
        perror ("VIDIOC_S_CROP");
        exit (EXIT_FAILURE);
}

Example 1-13. Current scaling factor and pixel aspect

(A video capture device is assumed.)

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;
struct v4l2_format format;
double hscale, vscale;
double aspect;
int dwidth, dheight;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));
crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_CROP, &crop)) {
        if (errno != EINVAL) {
                perror ("VIDIOC_G_CROP");
                exit (EXIT_FAILURE);
        }

        /* Cropping not supported. */
        crop.c = cropcap.defrect;
}

memset (&format, 0, sizeof (format));
format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_FMT, &format)) {
        perror ("VIDIOC_G_FMT");
        exit (EXIT_FAILURE);
}

/* The scaling applied by the driver. */

hscale = format.fmt.pix.width / (double) crop.c.width;
vscale = format.fmt.pix.height / (double) crop.c.height;

aspect = cropcap.pixelaspect.numerator /
(double) cropcap.pixelaspect.denominator;
aspect = aspect * hscale / vscale;

/* Devices following ITU-R BT.601 do not capture
square pixels. For playback on a computer monitor
we should scale the images to this size. */

dwidth = format.fmt.pix.width / aspect;
dheight = format.fmt.pix.height;

1.12. Streaming Parameters

Streaming parameters are intended to optimize the video capture process as well as I/O. Presently applications can request a high quality capture mode with the VIDIOC_S_PARM ioctl.

The current video standard determines a nominal number of frames per second. If less than this number of frames is to be captured or output, applications can request frame skipping or duplicating on the driver side. This is especially useful when using the read() or write() functions, which are not augmented by timestamps or sequence counters, and to avoid unnecessary data copying.

Finally these ioctls can be used to determine the number of buffers used internally by a driver in read/write mode. For implications see the section discussing the read() function.

To get and set the streaming parameters applications call the VIDIOC_G_PARM and VIDIOC_S_PARM ioctl, respectively. They take a pointer to a struct v4l2_streamparm, which contains a union holding separate parameters for input and output devices.

These ioctls are optional, drivers need not implement them. If not implemented, they return the EINVAL error code.

Chapter 2. Image Formats

The V4L2 API was primarily designed for devices exchanging image data with applications. The v4l2_pix_format structure defines the format and layout of an image in memory. Image formats are negotiated with the VIDIOC_S_FMT ioctl. (The explanations here focus on video capturing and output, for overlay frame buffer formats see also VIDIOC_G_FBUF.)

Table 2-1. struct v4l2_pix_format

__u32 width, __u32 height
        Image width and height in pixels. Applications set these fields to request an image size, drivers return the closest possible values. In case of planar formats the width and height apply to the largest plane. To avoid ambiguities drivers must return values rounded up to a multiple of the scale factor of any smaller planes. For example when the image format is YUV 4:2:0, width and height must be multiples of two.

__u32 pixelformat
        The pixel format or type of compression, set by the application. This is a little endian four character code. V4L2 defines standard RGB formats in Table 2-1, YUV formats in Section 2.5, and reserved codes in Table 2-8.

enum v4l2_field field
        Video images are typically interlaced. Applications can request to capture or output only the top or bottom field, or both fields interlaced or sequentially stored in one buffer or alternating in separate buffers. Drivers return the actual field order selected. For details see Section 3.6.

__u32 bytesperline
        Distance in bytes between the leftmost pixels in two adjacent lines. Both applications and drivers can set this field to request padding bytes at the end of each line. Drivers however may ignore the value requested by the application, returning width times bytes per pixel or a larger value required by the hardware. That implies applications can just set this field to zero to get a reasonable default. Video hardware may access padding bytes, therefore they must reside in accessible memory. Consider cases where padding bytes after the last line of an image cross a system page boundary. Input devices may write padding bytes, the value is undefined. Output devices ignore the contents of padding bytes. When the image format is planar the bytesperline value applies to the largest plane and is divided by the same factor as the width field for any smaller planes. For example the Cb and Cr planes of a YUV 4:2:0 image have half as many padding bytes following each line as the Y plane. To avoid ambiguities drivers must return a bytesperline value rounded up to a multiple of the scale factor.

__u32 sizeimage
        Size in bytes of the buffer to hold a complete image, set by the driver. Usually this is bytesperline times height. When the image consists of variable length compressed data this is the maximum number of bytes required to hold an image.

enum v4l2_colorspace colorspace
        This information supplements the pixelformat and must be set by the driver, see Section 2.2.

__u32 priv
        Reserved for custom (driver defined) additional information about formats. When not used drivers and applications must set this field to zero.

2.1. Standard Image Formats

In order to exchange images between drivers and applications, it is necessary to have standard image data formats which both sides will interpret the same way. V4L2 includes several such formats, and this section is intended to be an unambiguous specification of the standard image data formats in V4L2.

V4L2 drivers are not limited to these formats, however. Driver-specific formats are possible. In that case the application may depend on a codec to convert images to one of the standard formats when needed. But the data can still be stored and retrieved in the proprietary format. For example, a device may support a proprietary compressed format. Applications can still capture and save the data in the compressed format, saving much disk space, and later use a codec to convert the images to the X Windows screen format when the video is to be displayed.

Even so, ultimately, some standard formats are needed, so the V4L2 specification would not be complete without well-defined standard formats.

The V4L2 standard formats are mainly uncompressed formats. The pixels are always arranged in memory from left to right, and from top to bottom. The first byte of data in the image buffer is always for the leftmost pixel of the topmost row. Following that is the pixel immediately to its right, and so on until the end of the top row of pixels. Following the rightmost pixel of the row there may be zero or more bytes of padding to guarantee that each row of pixel data has a certain alignment. Following the pad bytes, if any, is data for the leftmost pixel of the second row from the top, and so on. The last row has just as many pad bytes after it as the other rows.

In V4L2 each format has an identifier which looks like PIX_FMT_XXX, defined in the videodev.h header file. These identifiers represent four character codes which are also listed below, however they are not the same as those used in the Windows world.

2.2. Colorspaces

[intro]

Gamma Correction

[to do]
E'R = f(R)
E'G = f(G)
E'B = f(B)

Construction of luminance and color-difference signals

[to do]
E'Y = CoeffR E'R + CoeffG E'G + CoeffB E'B
(E'R - E'Y) = E'R - CoeffR E'R - CoeffG E'G - CoeffB E'B
(E'B - E'Y) = E'B - CoeffR E'R - CoeffG E'G - CoeffB E'B

Re-normalized color-difference signals

The color-difference signals are scaled back to unity range [-0.5;+0.5]:

KB = 0.5 / (1 - CoeffB)
KR = 0.5 / (1 - CoeffR)

PB = KB (E'B - E'Y) = 0.5 (CoeffR / (CoeffB - 1)) E'R + 0.5 (CoeffG / (CoeffB - 1)) E'G + 0.5 E'B
PR = KR (E'R - E'Y) = 0.5 E'R + 0.5 (CoeffG / (CoeffR - 1)) E'G + 0.5 (CoeffB / (CoeffR - 1)) E'B

Quantization

[to do]
Y' = (Lum. Levels - 1) · E'Y + Lum. Offset
CB = (Chrom. Levels - 1) · PB + Chrom. Offset
CR = (Chrom. Levels - 1) · PR + Chrom. Offset

Rounding to the nearest integer and clamping to the range [0;255] finally yields the digital color components Y'CbCr stored in YUV images.

Example 2-1. ITU-R Rec. BT.601 color conversion

Forward Transformation

int ER, EG, EB;         /* gamma corrected RGB input [0;255] */
int Y1, Cb, Cr;         /* output [0;255] */

double r, g, b;         /* temporaries */
double y1, pb, pr;

int
clamp (double x)
{
        int r = (int)(x + 0.5);      /* round to nearest */

        if (r < 0)         return 0;
        else if (r > 255)  return 255;
        else               return r;
}

r = ER / 255.0;
g = EG / 255.0;
b = EB / 255.0;

y1  =  0.299  * r + 0.587 * g + 0.114  * b;
pb  = -0.169  * r - 0.331 * g + 0.5    * b;
pr  =  0.5    * r - 0.419 * g - 0.081  * b;

Y1 = clamp (219 * y1 + 16);
Cb = clamp (224 * pb + 128);
Cr = clamp (224 * pr + 128);

/* or shorter */

y1 = 0.299 * ER + 0.587 * EG + 0.114 * EB;

Y1 = clamp ( (219 / 255.0)                    *       y1  + 16);
Cb = clamp (((224 / 255.0) / (2 - 2 * 0.114)) * (EB - y1) + 128);
Cr = clamp (((224 / 255.0) / (2 - 2 * 0.299)) * (ER - y1) + 128);

Inverse Transformation

int Y1, Cb, Cr;         /* gamma pre-corrected input [0;255] */
int ER, EG, EB;         /* output [0;255] */

double r, g, b;         /* temporaries */
double y1, pb, pr;

int
clamp (double x)
{
        int r = (int)(x + 0.5);      /* round to nearest */

        if (r < 0)         return 0;
        else if (r > 255)  return 255;
        else               return r;
}

y1 = (255 / 219.0) * (Y1 - 16);
pb = (255 / 224.0) * (Cb - 128);
pr = (255 / 224.0) * (Cr - 128);

r = 1.0 * y1 + 0     * pb + 1.402 * pr;
g = 1.0 * y1 - 0.344 * pb - 0.714 * pr;
b = 1.0 * y1 + 1.772 * pb + 0     * pr;

ER = clamp (r * 255); /* [ok? one should prob. limit y1,pb,pr] */
EG = clamp (g * 255);
EB = clamp (b * 255);

Table 2-2. enum v4l2_colorspace

V4L2_COLORSPACE_SMPTE170M (1)
        NTSC/PAL according to SMPTE 170M, ITU BT.601.
        Chromaticities[a]: red x = 0.630, y = 0.340; green x = 0.310, y = 0.595; blue x = 0.155, y = 0.070.
        White point: x = 0.3127, y = 0.3290, Illuminant D65.
        Gamma correction: E' = 4.5 I for I <= 0.018, 1.099 I^0.45 - 0.099 for 0.018 < I.
        Luminance: E'Y = 0.299 E'R + 0.587 E'G + 0.114 E'B.
        Quantization: Y' = 219 E'Y + 16; Cb, Cr = 224 PB,R + 128.

V4L2_COLORSPACE_SMPTE240M (2)
        1125-Line (US) HDTV, see SMPTE 240M.
        Chromaticities: red x = 0.630, y = 0.340; green x = 0.310, y = 0.595; blue x = 0.155, y = 0.070.
        White point: x = 0.3127, y = 0.3290, Illuminant D65.
        Gamma correction: E' = 4 I for I <= 0.0228, 1.1115 I^0.45 - 0.1115 for 0.0228 < I.
        Luminance: E'Y = 0.212 E'R + 0.701 E'G + 0.087 E'B.
        Quantization: Y' = 219 E'Y + 16; Cb, Cr = 224 PB,R + 128.

V4L2_COLORSPACE_REC709 (3)
        HDTV and modern devices, see ITU BT.709.
        Chromaticities: red x = 0.640, y = 0.330; green x = 0.300, y = 0.600; blue x = 0.150, y = 0.060.
        White point: x = 0.3127, y = 0.3290, Illuminant D65.
        Gamma correction: E' = 4.5 I for I <= 0.018, 1.099 I^0.45 - 0.099 for 0.018 < I.
        Luminance: E'Y = 0.2125 E'R + 0.7154 E'G + 0.0721 E'B.
        Quantization: Y' = 219 E'Y + 16; Cb, Cr = 224 PB,R + 128.

V4L2_COLORSPACE_BT878 (4)
        Broken Bt878 extents[b], ITU BT.601.
        Chromaticities, white point and gamma correction: unknown.
        Luminance: E'Y = 0.299 E'R + 0.587 E'G + 0.114 E'B.
        Quantization: Y' = 237 E'Y + 16; Cb, Cr = 224 PB,R + 128 (probably).

V4L2_COLORSPACE_470_SYSTEM_M (5)
        M/NTSC[c] according to ITU BT.470, ITU BT.601.
        Chromaticities: red x = 0.67, y = 0.33; green x = 0.21, y = 0.71; blue x = 0.14, y = 0.08.
        White point: x = 0.310, y = 0.316, Illuminant C.
        Gamma correction: unknown.
        Luminance: E'Y = 0.299 E'R + 0.587 E'G + 0.114 E'B.
        Quantization: Y' = 219 E'Y + 16; Cb, Cr = 224 PB,R + 128.

V4L2_COLORSPACE_470_SYSTEM_BG (6)
        625-line PAL and SECAM systems according to ITU BT.470, ITU BT.601.
        Chromaticities: red x = 0.64, y = 0.33; green x = 0.29, y = 0.60; blue x = 0.15, y = 0.06.
        White point: x = 0.313, y = 0.329, Illuminant D65.
        Gamma correction: unknown.
        Luminance: E'Y = 0.299 E'R + 0.587 E'G + 0.114 E'B.
        Quantization: Y' = 219 E'Y + 16; Cb, Cr = 224 PB,R + 128.

V4L2_COLORSPACE_JPEG (7)
        JPEG Y'CbCr, see JFIF, ITU BT.601.
        Chromaticities, white point and gamma correction: unknown.
        Luminance: E'Y = 0.299 E'R + 0.587 E'G + 0.114 E'B.
        Quantization: Y' = 256 E'Y + 16[d]; Cb, Cr = 256 PB,R + 128.

V4L2_COLORSPACE_SRGB (8)
        [?]
        Chromaticities: red x = 0.640, y = 0.330; green x = 0.300, y = 0.600; blue x = 0.150, y = 0.060.
        White point: x = 0.3127, y = 0.3290, Illuminant D65.
        Gamma correction: E' = 4.5 I for I <= 0.018, 1.099 I^0.45 - 0.099 for 0.018 < I.
        Quantization: n/a.

Notes:
a. The coordinates of the color primaries are given in the CIE system (1931).
b. The ubiquitous Bt878 video capture chip quantizes E'Y to 238 levels, yielding a range of Y' = 16 … 253, unlike Rec. 601 Y' = 16 … 235. This is not a typo in the Bt878 documentation, it has been implemented in silicon. The chroma extents are unclear.
c. No identifier exists for M/PAL which uses the chromaticities of M/NTSC, the remaining parameters are equal to B and G/PAL.
d. Note JFIF quantizes Y'PBPR in range [0;+1] and [-0.5;+0.5] to 257 levels, however Y'CbCr signals are still clamped to [0;255].

2.3. Indexed Format

In this format each pixel is represented by an 8 bit index into a 256 entry ARGB palette. It is intended for Video Output Overlays only. There are no ioctls to access the palette, this must be done with ioctls of the Linux framebuffer API.

Table 2-3. Indexed Image Format

V4L2_PIX_FMT_PAL8 ('PAL8')
        Byte 0, bit 7..0: i7 i6 i5 i4 i3 i2 i1 i0 (palette index)

2.4. RGB Formats

Packed RGB formats -- Packed RGB formats

V4L2_PIX_FMT_SBGGR8 ('BA81') -- Bayer RGB format

V4L2_PIX_FMT_SBGGR16 ('BA82') -- Bayer RGB format

Packed RGB formats

Name
Packed RGB formats -- Packed RGB formats

Description

These formats are designed to match the pixel formats of typical PC graphics frame buffers. They occupy 8, 16, 24 or 32 bits per pixel. These are all packed-pixel formats, meaning all the data for a pixel lie next to each other in memory.

When one of these formats is used, drivers shall report the colorspace V4L2_COLORSPACE_SRGB.

Table 2-1. Packed RGB Image Formats

Bytes are listed in memory order; within each byte, bit 7 is listed first.

V4L2_PIX_FMT_RGB332 ('RGB1')
        Byte 0: b1 b0 g2 g1 g0 r2 r1 r0
V4L2_PIX_FMT_RGB444 ('R444')
        Byte 0: g3 g2 g1 g0 b3 b2 b1 b0;  Byte 1: a3 a2 a1 a0 r3 r2 r1 r0
V4L2_PIX_FMT_RGB555 ('RGBO')
        Byte 0: g2 g1 g0 r4 r3 r2 r1 r0;  Byte 1: a b4 b3 b2 b1 b0 g4 g3
V4L2_PIX_FMT_RGB565 ('RGBP')
        Byte 0: g2 g1 g0 r4 r3 r2 r1 r0;  Byte 1: b4 b3 b2 b1 b0 g5 g4 g3
V4L2_PIX_FMT_RGB555X ('RGBQ')
        Byte 0: a b4 b3 b2 b1 b0 g4 g3;  Byte 1: g2 g1 g0 r4 r3 r2 r1 r0
V4L2_PIX_FMT_RGB565X ('RGBR')
        Byte 0: b4 b3 b2 b1 b0 g5 g4 g3;  Byte 1: g2 g1 g0 r4 r3 r2 r1 r0
V4L2_PIX_FMT_BGR24 ('BGR3')
        Byte 0: b7..b0;  Byte 1: g7..g0;  Byte 2: r7..r0
V4L2_PIX_FMT_RGB24 ('RGB3')
        Byte 0: r7..r0;  Byte 1: g7..g0;  Byte 2: b7..b0
V4L2_PIX_FMT_BGR32 ('BGR4')
        Byte 0: b7..b0;  Byte 1: g7..g0;  Byte 2: r7..r0;  Byte 3: a7..a0
V4L2_PIX_FMT_RGB32 ('RGB4')
        Byte 0: r7..r0;  Byte 1: g7..g0;  Byte 2: b7..b0;  Byte 3: a7..a0

Bit 7 is the most significant bit. The value of a = alpha bits is undefined when reading from the driver, ignored when writing to the driver, except when alpha blending has been negotiated for a Video Overlay or Video Output Overlay.

Example 2-1. V4L2_PIX_FMT_BGR24 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  B00 G00 R00 B01 G01 R01 B02 G02 R02 B03 G03 R03
start + 12: B10 G10 R10 B11 G11 R11 B12 G12 R12 B13 G13 R13
start + 24: B20 G20 R20 B21 G21 R21 B22 G22 R22 B23 G23 R23
start + 36: B30 G30 R30 B31 G31 R31 B32 G32 R32 B33 G33 R33

Important: Drivers may interpret these formats differently.

Some RGB formats above are uncommon and were probably defined in error. Drivers may interpret them as in Table 2-2.

Table 2-2. Packed RGB Image Formats (corrected)

Identifier            Code    Byte 0 in memory       Byte 1                 Byte 2              Byte 3
                              (bits 7..0)            (bits 7..0)            (bits 7..0)         (bits 7..0)
V4L2_PIX_FMT_RGB332   'RGB1'  r2r1r0 g2g1g0 b1b0
V4L2_PIX_FMT_RGB444   'R444'  g3g2g1g0 b3b2b1b0      a3a2a1a0 r3r2r1r0
V4L2_PIX_FMT_RGB555   'RGBO'  g2g1g0 b4b3b2b1b0      a r4r3r2r1r0 g4g3
V4L2_PIX_FMT_RGB565   'RGBP'  g2g1g0 b4b3b2b1b0      r4r3r2r1r0 g5g4g3
V4L2_PIX_FMT_RGB555X  'RGBQ'  a r4r3r2r1r0 g4g3      g2g1g0 b4b3b2b1b0
V4L2_PIX_FMT_RGB565X  'RGBR'  r4r3r2r1r0 g5g4g3      g2g1g0 b4b3b2b1b0
V4L2_PIX_FMT_BGR24    'BGR3'  b7b6b5b4b3b2b1b0       g7g6g5g4g3g2g1g0       r7r6r5r4r3r2r1r0
V4L2_PIX_FMT_RGB24    'RGB3'  r7r6r5r4r3r2r1r0       g7g6g5g4g3g2g1g0       b7b6b5b4b3b2b1b0
V4L2_PIX_FMT_BGR32    'BGR4'  b7b6b5b4b3b2b1b0       g7g6g5g4g3g2g1g0       r7r6r5r4r3r2r1r0    a7a6a5a4a3a2a1a0
V4L2_PIX_FMT_RGB32    'RGB4'  a7a6a5a4a3a2a1a0       r7r6r5r4r3r2r1r0       g7g6g5g4g3g2g1g0    b7b6b5b4b3b2b1b0

A test utility to determine which RGB formats a driver actually supports is available from the LinuxTV v4l-dvb repository. See http://linuxtv.org/repo/ for access instructions.
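To make the bit layouts concrete, here is a small sketch in C of per-pixel addressing for the 24-bit formats and of unpacking a V4L2_PIX_FMT_RGB565 pixel as interpreted in Table 2-2. The helper names are illustrative, not part of the V4L2 API.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Byte offset of pixel (x, y) in a BGR24/RGB24 image;
   bytesperline accounts for any row padding. */
size_t rgb24_offset(unsigned x, unsigned y, size_t bytesperline)
{
    return (size_t)y * bytesperline + (size_t)x * 3;
}

/* Unpack one RGB565 ('RGBP') pixel, read as a little-endian 16-bit
   word, into 8-bit channels, replicating high bits into low bits. */
void rgb565_unpack(uint16_t px, uint8_t *r, uint8_t *g, uint8_t *b)
{
    unsigned r5 = (px >> 11) & 0x1f;  /* r4..r0 in bits 15..11 */
    unsigned g6 = (px >> 5) & 0x3f;   /* g5..g0 in bits 10..5  */
    unsigned b5 = px & 0x1f;          /* b4..b0 in bits 4..0   */

    *r = (uint8_t)((r5 << 3) | (r5 >> 2));
    *g = (uint8_t)((g6 << 2) | (g6 >> 4));
    *b = (uint8_t)((b5 << 3) | (b5 >> 2));
}
```

Treating the two bytes as one little-endian 16-bit word puts red in bits 15..11, green in bits 10..5 and blue in bits 4..0 under the Table 2-2 interpretation.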

V4L2_PIX_FMT_SBGGR8 ('BA81')

Name

V4L2_PIX_FMT_SBGGR8 -- Bayer RGB format

Description

This is commonly the native format of digital cameras, reflecting the arrangement of sensors on the CCD device. Only one red, green or blue value is given for each pixel. Missing components must be interpolated from neighbouring pixels. From left to right the first row consists of a blue and green value, the second row of a green and red value. This scheme repeats to the right and down for every two columns and rows.

Example 2-1. V4L2_PIX_FMT_SBGGR8 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  B00 G01 B02 G03
start + 4:  G10 R11 G12 R13
start + 8:  B20 G21 B22 G23
start + 12: G30 R31 G32 R33
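The BGBG/GRGR tiling above can be captured in a few lines of C. This helper only illustrates the sampling pattern; it is not a V4L2 interface.

```c
#include <assert.h>

/* Which color component an SBGGR8 sensor samples at pixel (x, y),
   following the tiling shown in the example: even rows are B G B G,
   odd rows are G R G R. */
char sbggr8_component(unsigned x, unsigned y)
{
    if (y % 2 == 0)                     /* even rows: B G B G ... */
        return (x % 2 == 0) ? 'B' : 'G';
    else                                /* odd rows:  G R G R ... */
        return (x % 2 == 0) ? 'G' : 'R';
}
```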

V4L2_PIX_FMT_SBGGR16 ('BA82')

Name

V4L2_PIX_FMT_SBGGR16 -- Bayer RGB format

Description

This format is similar to V4L2_PIX_FMT_SBGGR8, except each pixel has a depth of 16 bits. The least significant byte is stored at lower memory addresses (little-endian). Note the actual sampling precision may be lower than 16 bits, for example 10 bits per pixel with values in range 0 to 1023.

Example 2-1. V4L2_PIX_FMT_SBGGR16 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  B00low B00high G01low G01high B02low B02high G03low G03high
start + 8:  G10low G10high R11low R11high G12low G12high R13low R13high
start + 16: B20low B20high G21low G21high B22low B22high G23low G23high
start + 24: G30low G30high R31low R31high G32low G32high R33low R33high

2.5. YUV Formats

Packed YUV formats -- Packed YUV formats

V4L2_PIX_FMT_GREY ('GREY') -- Grey-scale image

V4L2_PIX_FMT_Y16 ('Y16 ') -- Grey-scale image

V4L2_PIX_FMT_YUYV ('YUYV') -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2

V4L2_PIX_FMT_UYVY ('UYVY') -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory

V4L2_PIX_FMT_Y41P ('Y41P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1

V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12') -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0

V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9') -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0

V4L2_PIX_FMT_YUV422P ('422P') -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV

V4L2_PIX_FMT_YUV411P ('411P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P

V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21') -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420

YUV is the format native to TV broadcast and composite video signals. It separates the brightness information (Y) from the color information (U and V or Cb and Cr). The color information consists of red and blue color difference signals, this way the green component can be reconstructed by subtracting from the brightness component. See Section 2.2 for conversion examples. YUV was chosen because early television would only transmit brightness information. To add color in a way compatible with existing receivers a new signal carrier was added to transmit the color difference signals. Second, in the YUV format the U and V components usually have lower resolution than the Y component. This is an analog video compression technique taking advantage of a property of the human visual system, being more sensitive to brightness information.

Packed YUV formats

Name
Packed YUV formats -- Packed YUV formats

Description

Similar to the packed RGB formats these formats store the Y, Cb and Cr component of each pixel in one 16 or 32 bit word.

Table 2-1. Packed YUV Image Formats

Identifier           Code    Byte 0 in memory            Byte 1                      Byte 2                    Byte 3
                             (bits 7..0)                 (bits 7..0)                 (bits 7..0)               (bits 7..0)
V4L2_PIX_FMT_YUV444  'Y444'  Cb3Cb2Cb1Cb0 Cr3Cr2Cr1Cr0   a3a2a1a0 Y'3Y'2Y'1Y'0
V4L2_PIX_FMT_YUV555  'YUVO'  Cb2Cb1Cb0 Cr4Cr3Cr2Cr1Cr0   a Y'4Y'3Y'2Y'1Y'0 Cb4Cb3
V4L2_PIX_FMT_YUV565  'YUVP'  Cb2Cb1Cb0 Cr4Cr3Cr2Cr1Cr0   Y'4Y'3Y'2Y'1Y'0 Cb5Cb4Cb3
V4L2_PIX_FMT_YUV32   'YUV4'  a7a6a5a4a3a2a1a0            Y'7Y'6Y'5Y'4Y'3Y'2Y'1Y'0    Cb7Cb6Cb5Cb4Cb3Cb2Cb1Cb0  Cr7Cr6Cr5Cr4Cr3Cr2Cr1Cr0

Bit 7 is the most significant bit. The value of a = alpha bits is undefined when reading from the driver, ignored when writing to the driver, except when alpha blending has been negotiated for a Video Overlay or Video Output Overlay.

V4L2_PIX_FMT_GREY ('GREY')

Name

V4L2_PIX_FMT_GREY -- Grey-scale image

Description

This is a grey-scale image. It is really a degenerate Y'CbCr format which simply contains no Cb or Cr data.

Example 2-1. V4L2_PIX_FMT_GREY 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00 Y'01 Y'02 Y'03
start + 4:  Y'10 Y'11 Y'12 Y'13
start + 8:  Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33

V4L2_PIX_FMT_Y16 ('Y16 ')

Name

V4L2_PIX_FMT_Y16 -- Grey-scale image

Description

This is a grey-scale image with a depth of 16 bits per pixel. The least significant byte is stored at lower memory addresses (little-endian). Note the actual sampling precision may be lower than 16 bits, for example 10 bits per pixel with values in range 0 to 1023.

Example 2-1. V4L2_PIX_FMT_Y16 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00low Y'00high Y'01low Y'01high Y'02low Y'02high Y'03low Y'03high
start + 8:  Y'10low Y'10high Y'11low Y'11high Y'12low Y'12high Y'13low Y'13high
start + 16: Y'20low Y'20high Y'21low Y'21high Y'22low Y'22high Y'23low Y'23high
start + 24: Y'30low Y'30high Y'31low Y'31high Y'32low Y'32high Y'33low Y'33high

V4L2_PIX_FMT_YUYV ('YUYV')

Name

V4L2_PIX_FMT_YUYV -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2

Description

In this format each four bytes is two pixels. Each four bytes is two Y's, a Cb and a Cr. Each Y goes to one of the pixels, and the Cb and Cr belong to both pixels. As you can see, the Cr and Cb components have half the horizontal resolution of the Y component. V4L2_PIX_FMT_YUYV is known in the Windows environment as YUY2.

Example 2-1. V4L2_PIX_FMT_YUYV 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00 Cb00 Y'01 Cr00 Y'02 Cb01 Y'03 Cr01
start + 8:  Y'10 Cb10 Y'11 Cr10 Y'12 Cb11 Y'13 Cr11
start + 16: Y'20 Cb20 Y'21 Cr20 Y'22 Cb21 Y'23 Cr21
start + 24: Y'30 Cb30 Y'31 Cr30 Y'32 Cb31 Y'33 Cr31

Color Sample Location.

    0 1 2 3
0   YCY YCY
1   YCY YCY
2   YCY YCY
3   YCY YCY
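A sketch of how an application might fetch the components of one pixel from a YUYV buffer; the helper name and the bytesperline parameter are illustrative assumptions, not part of the API.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Fetch Y', Cb and Cr for pixel (x, y) from a YUYV buffer.
   Every pair of horizontally adjacent pixels shares one Cb/Cr
   sample, so four bytes cover two pixels. */
void yuyv_sample(const uint8_t *buf, size_t bytesperline,
                 unsigned x, unsigned y,
                 uint8_t *yv, uint8_t *cb, uint8_t *cr)
{
    const uint8_t *pair = buf + (size_t)y * bytesperline + (x / 2) * 4;

    *yv = pair[(x % 2) * 2];  /* Y'0 at byte 0, Y'1 at byte 2 */
    *cb = pair[1];
    *cr = pair[3];
}
```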

V4L2_PIX_FMT_UYVY ('UYVY')

Name

V4L2_PIX_FMT_UYVY -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory

Description

In this format each four bytes is two pixels. Each four bytes is two Y's, a Cb and a Cr. Each Y goes to one of the pixels, and the Cb and Cr belong to both pixels. As you can see, the Cr and Cb components have half the horizontal resolution of the Y component.

Example 2-1. V4L2_PIX_FMT_UYVY 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Cb00 Y'00 Cr00 Y'01 Cb01 Y'02 Cr01 Y'03
start + 8:  Cb10 Y'10 Cr10 Y'11 Cb11 Y'12 Cr11 Y'13
start + 16: Cb20 Y'20 Cr20 Y'21 Cb21 Y'22 Cr21 Y'23
start + 24: Cb30 Y'30 Cr30 Y'31 Cb31 Y'32 Cr31 Y'33

Color Sample Location.

    0 1 2 3
0   YCY YCY
1   YCY YCY
2   YCY YCY
3   YCY YCY

V4L2_PIX_FMT_Y41P ('Y41P')

Name

V4L2_PIX_FMT_Y41P -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1

Description

In this format each 12 bytes is eight pixels. In the twelve bytes are two CbCr pairs and eight Y's. The first CbCr pair goes with the first four Y's, and the second CbCr pair goes with the other four Y's. The Cb and Cr components have one fourth the horizontal resolution of the Y component.

Do not confuse this format with V4L2_PIX_FMT_YUV411P. Y41P is derived from "YUV 4:1:1 packed", while YUV411P stands for "YUV 4:1:1 planar".

Example 2-1. V4L2_PIX_FMT_Y41P 8 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Cb00 Y'00 Cr00 Y'01 Cb01 Y'02 Cr01 Y'03 Y'04 Y'05 Y'06 Y'07
start + 12: Cb10 Y'10 Cr10 Y'11 Cb11 Y'12 Cr11 Y'13 Y'14 Y'15 Y'16 Y'17
start + 24: Cb20 Y'20 Cr20 Y'21 Cb21 Y'22 Cr21 Y'23 Y'24 Y'25 Y'26 Y'27
start + 36: Cb30 Y'30 Cr30 Y'31 Cb31 Y'32 Cr31 Y'33 Y'34 Y'35 Y'36 Y'37

Color Sample Location.

    0 1 2 3 4 5 6 7
0   Y YCY Y Y YCY Y
1   Y YCY Y Y YCY Y
2   Y YCY Y Y YCY Y
3   Y YCY Y Y YCY Y

V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12')

Name

V4L2_PIX_FMT_YVU420, V4L2_PIX_FMT_YUV420 -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0

Description

These are planar formats, as opposed to a packed format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_YVU420, the Cr plane immediately follows the Y plane in memory. The Cr plane is half the width and half the height of the Y plane (and of the image). Each Cr belongs to four pixels, a two-by-two square of the image. For example, Cr0 belongs to Y'00, Y'01, Y'10, and Y'11. Following the Cr plane is the Cb plane, just like the Cr plane. V4L2_PIX_FMT_YUV420 is the same except the Cb plane comes first, then the Cr plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have half as many pad bytes after their rows. In other words, two Cx rows (including padding) is exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YVU420 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00 Y'01 Y'02 Y'03
start + 4:  Y'10 Y'11 Y'12 Y'13
start + 8:  Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cr00 Cr01
start + 18: Cr10 Cr11
start + 20: Cb00 Cb01
start + 22: Cb10 Cb11

Color Sample Location.

    0 1 2 3
0   Y Y Y Y
     C   C
1   Y Y Y Y
2   Y Y Y Y
     C   C
3   Y Y Y Y
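The plane arithmetic above can be sketched as follows, assuming no row padding; the helper names are hypothetical.

```c
#include <assert.h>
#include <stddef.h>

/* Byte offset of the Y' sample for pixel (x, y) in a YV12 buffer. */
size_t yv12_y_offset(unsigned x, unsigned y, unsigned width)
{
    return (size_t)y * width + x;
}

/* Byte offset of the Cr sample covering pixel (x, y): the Cr plane
   follows the Y plane and is half the width and half the height. */
size_t yv12_cr_offset(unsigned x, unsigned y, unsigned width, unsigned height)
{
    return (size_t)width * height + (size_t)(y / 2) * (width / 2) + x / 2;
}

/* Byte offset of the Cb sample: the Cb plane follows the Cr plane. */
size_t yv12_cb_offset(unsigned x, unsigned y, unsigned width, unsigned height)
{
    return yv12_cr_offset(x, y, width, height)
         + (size_t)(width / 2) * (height / 2);
}
```

For the 4 × 4 example, yv12_cr_offset(0, 0, 4, 4) yields start + 16 and yv12_cb_offset(0, 0, 4, 4) yields start + 20, matching the byte order shown.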

V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9')

Name

V4L2_PIX_FMT_YVU410, V4L2_PIX_FMT_YUV410 -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0

Description

These are planar formats, as opposed to a packed format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_YVU410, the Cr plane immediately follows the Y plane in memory. The Cr plane is ¼ the width and ¼ the height of the Y plane (and of the image). Each Cr belongs to 16 pixels, a four-by-four square of the image. Following the Cr plane is the Cb plane, just like the Cr plane. V4L2_PIX_FMT_YUV410 is the same, except the Cb plane comes first, then the Cr plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have ¼ as many pad bytes after their rows. In other words, four Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YVU410 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00 Y'01 Y'02 Y'03
start + 4:  Y'10 Y'11 Y'12 Y'13
start + 8:  Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cr00
start + 17: Cb00

Color Sample Location.

    0 1 2 3
0   Y Y Y Y
1   Y Y Y Y
       C
2   Y Y Y Y
3   Y Y Y Y

V4L2_PIX_FMT_YUV422P ('422P')

Name

V4L2_PIX_FMT_YUV422P -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV

Description

This format is not commonly used. This is a planar version of the YUYV format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. The Cb plane immediately follows the Y plane in memory. The Cb plane is half the width of the Y plane (and of the image). Each Cb belongs to two pixels. For example, Cb0 belongs to Y'00, Y'01. Following the Cb plane is the Cr plane, just like the Cb plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have half as many pad bytes after their rows. In other words, two Cx rows (including padding) is exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YUV422P 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00 Y'01 Y'02 Y'03
start + 4:  Y'10 Y'11 Y'12 Y'13
start + 8:  Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00 Cb01
start + 18: Cb10 Cb11
start + 20: Cb20 Cb21
start + 22: Cb30 Cb31
start + 24: Cr00 Cr01
start + 26: Cr10 Cr11
start + 28: Cr20 Cr21
start + 30: Cr30 Cr31

Color Sample Location.

    0 1 2 3
0   YCY YCY
1   YCY YCY
2   YCY YCY
3   YCY YCY

V4L2_PIX_FMT_YUV411P ('411P')

Name

V4L2_PIX_FMT_YUV411P -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P

Description

This format is not commonly used. This is a planar format similar to the 4:2:2 planar format except with half as many chroma. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. The Cb plane immediately follows the Y plane in memory. The Cb plane is ¼ the width of the Y plane (and of the image). Each Cb belongs to 4 pixels all on the same row. For example, Cb0 belongs to Y'00, Y'01, Y'02 and Y'03. Following the Cb plane is the Cr plane, just like the Cb plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have ¼ as many pad bytes after their rows. In other words, four Cx rows (including padding) is exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YUV411P 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00 Y'01 Y'02 Y'03
start + 4:  Y'10 Y'11 Y'12 Y'13
start + 8:  Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00
start + 17: Cb10
start + 18: Cb20
start + 19: Cb30
start + 20: Cr00
start + 21: Cr10
start + 22: Cr20
start + 23: Cr30

Color Sample Location.

    0 1 2 3
0   Y YCY Y
1   Y YCY Y
2   Y YCY Y
3   Y YCY Y

V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21')

Name

V4L2_PIX_FMT_NV12, V4L2_PIX_FMT_NV21 -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420

Description

These are two-plane versions of the YUV 4:2:0 format. The three components are separated into two sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_NV12, a combined CbCr plane immediately follows the Y plane in memory. The CbCr plane is the same width, in bytes, as the Y plane (and of the image), but is half as tall in pixels. Each CbCr pair belongs to four pixels. For example, Cb0/Cr0 belongs to Y'00, Y'01, Y'10, Y'11. V4L2_PIX_FMT_NV21 is the same except the Cb and Cr bytes are swapped, the CrCb plane starts with a Cr byte.

If the Y plane has pad bytes after each row, then the CbCr plane has as many pad bytes after its rows.

Example 2-1. V4L2_PIX_FMT_NV12 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0:  Y'00 Y'01 Y'02 Y'03
start + 4:  Y'10 Y'11 Y'12 Y'13
start + 8:  Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00 Cr00 Cb01 Cr01
start + 20: Cb10 Cr10 Cb11 Cr11

Color Sample Location.

    0 1 2 3
0   Y Y Y Y
     C   C
1   Y Y Y Y
2   Y Y Y Y
     C   C
3   Y Y Y Y
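The combined-plane layout can be expressed as offset arithmetic, again assuming no row padding and a hypothetical helper name.

```c
#include <assert.h>
#include <stddef.h>

/* Byte offset of the interleaved Cb/Cr pair covering pixel (x, y)
   in an NV12 buffer with no row padding. The returned offset is the
   Cb byte; the matching Cr byte follows at offset + 1. */
size_t nv12_cbcr_offset(unsigned x, unsigned y, unsigned width, unsigned height)
{
    return (size_t)width * height          /* skip the Y plane           */
         + (size_t)(y / 2) * width         /* CbCr rows are full width   */
         + (x / 2) * 2;                    /* one Cb/Cr pair per 2x2 block */
}
```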

2.6. Compressed Formats

Table 2-7. Compressed Image Formats

Identifier         Code    Details
V4L2_PIX_FMT_JPEG  'JPEG'  TBD. See also VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP.
V4L2_PIX_FMT_MPEG  'MPEG'  MPEG stream. The actual format is determined by the extended control V4L2_CID_MPEG_STREAM_TYPE, see Table 1-2.

2.7. Reserved Format Identifiers

These formats are not defined by this specification, they are just listed for reference and to avoid naming conflicts. If you want to register your own format, send an e-mail to the V4L mailing list https://listman.redhat.com/mailman/listinfo/video4linux-list for inclusion in the videodev.h file. If you want to share your format with other developers add a link to your documentation and send a copy to the maintainer of this document, Michael Schimek <mschimek@gmx.at>, for inclusion in this section. If you think your format should be listed in a standard format section please make a proposal on the V4L mailing list.

Table 2-8. Reserved Image Formats

Identifier             Code    Details
V4L2_PIX_FMT_DV        'dvsd'  unknown
V4L2_PIX_FMT_ET61X251  'E625'  Compressed format of the ET61X251 driver.
V4L2_PIX_FMT_HI240     'HI24'  8 bit RGB format used by the BTTV driver, http://bytesex.org/bttv/
V4L2_PIX_FMT_HM12      'HM12'  YUV 4:2:0 format used by the IVTV driver, http://www.ivtvdriver.org/ The format is documented in the kernel sources in the file Documentation/video4linux/cx2341x/README.hm12
V4L2_PIX_FMT_MJPEG     'MJPG'  Compressed format used by the Zoran driver
V4L2_PIX_FMT_PWC1      'PWC1'  Compressed format of the PWC driver.
V4L2_PIX_FMT_PWC2      'PWC2'  Compressed format of the PWC driver.
V4L2_PIX_FMT_SN9C10X   'S910'  Compressed format of the SN9C102 driver.
V4L2_PIX_FMT_WNVA      'WNVA'  Used by the Winnov Videum driver, http://www.thedirks.org/winnov/
V4L2_PIX_FMT_YYUV      'YYUV'  unknown

Chapter 3. Input/Output

The V4L2 API defines several different methods to read from or write to a device. All drivers exchanging data with applications must support at least one of them.

The classic I/O method using the read() and write() function is automatically selected after opening a V4L2 device. When the driver does not support this method attempts to read or write will fail at any time.

Other methods must be negotiated. To select the streaming I/O method with memory mapped or user buffers applications call the VIDIOC_REQBUFS ioctl. The asynchronous I/O method is not defined yet.

Video overlay can be considered another I/O method, although the application does not directly receive the image data. It is selected by initiating video overlay with the VIDIOC_S_FMT ioctl.

Generally exactly one I/O method, including overlay, is associated with each file descriptor. The only exceptions are applications not exchanging data with a driver ("panel applications", see Section 1.1) and drivers permitting simultaneous video capturing and overlay using the same file descriptor, for compatibility with V4L and earlier versions of V4L2.

VIDIOC_S_FMT and VIDIOC_REQBUFS would permit this to some degree, but for simplicity drivers need not support switching the I/O method (after first switching away from read/write) other than by closing and reopening the device.

The following sections describe the various I/O methods in more detail.

3.1. Read/Write

Input and output devices support the read() and write() function, respectively, when the V4L2_CAP_READWRITE flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set.

Drivers may need the CPU to copy the data, but they may also support DMA to or from user memory, so this I/O method is not necessarily less efficient than other methods merely exchanging buffer pointers. It is considered inferior though because no meta-information like frame counters or timestamps are passed. This information is necessary to recognize frame dropping and to synchronize with other data streams. However this is also the simplest I/O method, requiring little or no setup to exchange data. It permits command line stunts like this (the vidctrl tool is fictitious):

> vidctrl /dev/video --input=0 --format=YUYV --size=352x288
> dd if=/dev/video of=myimage.422 bs=202752 count=1

To read from the device applications use the read() function, to write the write() function. Drivers must implement one I/O method if they exchange data with applications, but it need not be this.[12] When reading or writing is supported, the driver must also support the select() and poll() function.[13]
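Since read() may return short counts or be interrupted by signals, a capture application typically wraps it in a retry loop. A sketch follows; the helper name is an assumption, not part of the API.

```c
#include <assert.h>
#include <errno.h>
#include <stddef.h>
#include <unistd.h>

/* Read exactly one frame of 'size' bytes from a V4L2 device opened
   for read() I/O, retrying on EINTR and accumulating short reads. */
ssize_t read_frame(int fd, void *buf, size_t size)
{
    size_t done = 0;

    while (done < size) {
        ssize_t r = read(fd, (char *)buf + done, size - done);

        if (r == -1) {
            if (errno == EINTR)
                continue;       /* interrupted by a signal, retry */
            return -1;          /* real error, e.g. EAGAIN or EIO */
        }
        if (r == 0)
            break;              /* end of stream */
        done += (size_t)r;
    }
    return (ssize_t)done;
}
```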

3.2. Streaming I/O (Memory Mapping)

Input and output devices support this I/O method when the V4L2_CAP_STREAMING flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set. There are two streaming methods; to determine if the memory mapping flavor is supported applications must call the VIDIOC_REQBUFS ioctl.

Streaming is an I/O method where only pointers to buffers are exchanged between application and driver, the data itself is not copied. Memory mapping is primarily intended to map buffers in device memory into the application's address space. Device memory can be for example the video memory on a graphics card with a video capture add-on. However, being the most efficient I/O method available for a long time, many other drivers support streaming as well, allocating buffers in DMA-able main memory.

A driver can support many sets of buffers. Each set is identified by a unique buffer type value. The sets are independent and each set can hold a different type of data. To access different sets at the same time different file descriptors must be used.[14]

To allocate device buffers applications call the VIDIOC_REQBUFS ioctl with the desired number of buffers and buffer type, for example V4L2_BUF_TYPE_VIDEO_CAPTURE. This ioctl can also be used to change the number of buffers or to free the allocated memory, provided none of the buffers are still mapped.

Before applications can access the buffers they must map them into their address space with the mmap() function. The location of the buffers in device memory can be determined with the VIDIOC_QUERYBUF ioctl. The m.offset and length returned in a struct v4l2_buffer are passed as sixth and second parameter to the mmap() function. The offset and length values must not be modified. Remember the buffers are allocated in physical memory, as opposed to virtual memory which can be swapped out to disk. Applications should free the buffers as soon as possible with the munmap() function.

Example 3-1. Mapping buffers

struct v4l2_requestbuffers reqbuf;
struct {
        void *start;
        size_t length;
} *buffers;
unsigned int i;

memset (&reqbuf, 0, sizeof (reqbuf));
reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
reqbuf.memory = V4L2_MEMORY_MMAP;
reqbuf.count = 20;

if (-1 == ioctl (fd, VIDIOC_REQBUFS, &reqbuf)) {
        if (errno == EINVAL)
                printf ("Video capturing or mmap-streaming is not supported\n");
        else
                perror ("VIDIOC_REQBUFS");

        exit (EXIT_FAILURE);
}

/* We want at least five buffers. */

if (reqbuf.count < 5) {
        /* You may need to free the buffers here. */
        printf ("Not enough buffer memory\n");
        exit (EXIT_FAILURE);
}

buffers = calloc (reqbuf.count, sizeof (*buffers));
assert (buffers != NULL);

for (i = 0; i < reqbuf.count; i++) {
        struct v4l2_buffer buffer;

        memset (&buffer, 0, sizeof (buffer));
        buffer.type = reqbuf.type;
        buffer.memory = V4L2_MEMORY_MMAP;
        buffer.index = i;

        if (-1 == ioctl (fd, VIDIOC_QUERYBUF, &buffer)) {
                perror ("VIDIOC_QUERYBUF");
                exit (EXIT_FAILURE);
        }

        buffers[i].length = buffer.length; /* remember for munmap() */

        buffers[i].start = mmap (NULL, buffer.length,
                                 PROT_READ | PROT_WRITE, /* recommended */
                                 MAP_SHARED,             /* recommended */
                                 fd, buffer.m.offset);

        if (MAP_FAILED == buffers[i].start) {
                /* If you do not exit here you should unmap() and free()
                   the buffers mapped so far. */
                perror ("mmap");
                exit (EXIT_FAILURE);
        }
}

/* Cleanup. */

for (i = 0; i < reqbuf.count; i++)
        munmap (buffers[i].start, buffers[i].length);

Conceptually streaming drivers maintain two buffer queues, an incoming and an outgoing queue. They separate the synchronous capture or output operation locked to a video clock from the application which is subject to random disk or network delays and preemption by other processes, thereby reducing the probability of data loss. The queues are organized as FIFOs, buffers will be output in the order enqueued in the incoming FIFO, and were captured in the order dequeued from the outgoing FIFO.

The driver may require a minimum number of buffers enqueued at all times to function, apart from this no limit exists on the number of buffers applications can enqueue in advance, or dequeue and process. They can also enqueue in a different order than buffers have been dequeued, and the driver can fill enqueued empty buffers in any order.[15] The index number of a buffer (struct v4l2_buffer index) plays no role here, it only identifies the buffer.

Initially all mapped buffers are in dequeued state, inaccessible by the driver. For capturing applications it is customary to first enqueue all mapped buffers, then to start capturing and enter the read loop. Here the application waits until a filled buffer can be dequeued, and re-enqueues the buffer when the data is no longer needed. Output applications fill and enqueue buffers, when enough buffers are stacked up the output is started with VIDIOC_STREAMON. In the write loop, when the application runs out of free buffers, it must wait until an empty buffer can be dequeued and reused.

To enqueue and dequeue a buffer applications use the VIDIOC_QBUF and VIDIOC_DQBUF ioctl. The status of a buffer being mapped, enqueued, full or empty can be determined at any time using the VIDIOC_QUERYBUF ioctl. Two methods exist to suspend execution of the application until one or more buffers can be dequeued. By default VIDIOC_DQBUF blocks when no buffer is in the outgoing queue. When the O_NONBLOCK flag was given to the open() function, VIDIOC_DQBUF returns immediately with an EAGAIN error code when no buffer is available. The select() or poll() function are always available.
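A sketch of the poll()-based variant; wait_for_buffer is a hypothetical wrapper, and any readable file descriptor can stand in for the device when trying it out.

```c
#include <assert.h>
#include <poll.h>
#include <unistd.h>

/* Wait until a filled buffer can be dequeued from fd, or until
   timeout_ms elapses. Returns 1 when fd is readable (a buffer can
   be dequeued), 0 on timeout, -1 on error; see poll(2). */
int wait_for_buffer(int fd, int timeout_ms)
{
    struct pollfd pfd = { .fd = fd, .events = POLLIN };

    return poll(&pfd, 1, timeout_ms);
}
```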

To start and stop capturing or output applications call the VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl. Note VIDIOC_STREAMOFF removes all buffers from both queues as a side effect. Since there is no notion of doing anything "now" on a multitasking system, if an application needs to synchronize with another event it should examine the struct v4l2_buffer timestamp of captured buffers, or set the field before enqueuing buffers for output.

Drivers implementing memory mapping I/O must support the VIDIOC_REQBUFS, VIDIOC_QUERYBUF, VIDIOC_QBUF, VIDIOC_DQBUF, VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl, the mmap(), munmap(), select() and poll() function.[16]

[capture example]

3.3. Streaming I/O (User Pointers)

Input and output devices support this I/O method when the V4L2_CAP_STREAMING flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set. Whether the particular user pointer method (not only memory mapping) is supported must be determined by calling the VIDIOC_REQBUFS ioctl.

This I/O method combines advantages of the read/write and memory mapping methods. Buffers are allocated by the application itself, and can reside for example in virtual or shared memory. Only pointers to data are exchanged, these pointers and meta-information are passed in struct v4l2_buffer. The driver must be switched into user pointer I/O mode by calling the VIDIOC_REQBUFS ioctl with the desired buffer type. No buffers are allocated beforehand, consequently they are not indexed and cannot be queried like mapped buffers with the VIDIOC_QUERYBUF ioctl.

Example 3-2. Initiating streaming I/O with user pointers

struct v4l2_requestbuffers reqbuf;

memset (&reqbuf, 0, sizeof (reqbuf));
reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
reqbuf.memory = V4L2_MEMORY_USERPTR;

if (ioctl (fd, VIDIOC_REQBUFS, &reqbuf) == -1) {
        if (errno == EINVAL)
                printf ("Video capturing or user pointer streaming is not supported\n");
        else
                perror ("VIDIOC_REQBUFS");

        exit (EXIT_FAILURE);
}

Buffer addresses and sizes are passed on the fly with the VIDIOC_QBUF ioctl. Although buffers are commonly cycled, applications can pass different addresses and sizes at each VIDIOC_QBUF call. If required by the hardware the driver swaps memory pages within physical memory to create a continuous area of memory. This happens transparently to the application in the virtual memory subsystem of the kernel. When buffer pages have been swapped out to disk they are brought back and finally locked in physical memory for DMA.[17]

Filled or displayed buffers are dequeued with the VIDIOC_DQBUF ioctl. The driver can unlock the memory pages at any time between the completion of the DMA and this ioctl. The memory is also unlocked when VIDIOC_STREAMOFF is called, VIDIOC_REQBUFS, or when the device is closed. Applications must take care not to free buffers without dequeuing. For one, the buffers remain locked until further notice, wasting physical memory. Second the driver will not be notified when the memory is returned to the application's free list and subsequently reused for other purposes, possibly completing the requested DMA and overwriting valuable data.

For capturing applications it is customary to enqueue a number of empty buffers, to start capturing and enter the read loop. Here the application waits until a filled buffer can be dequeued, and re-enqueues the buffer when the data is no longer needed. Output applications fill and enqueue buffers, when enough buffers are stacked up output is started. In the write loop, when the application runs out of free buffers it must wait until an empty buffer can be dequeued and reused. Two methods exist to suspend execution of the application until one or more buffers can be dequeued. By default VIDIOC_DQBUF blocks when no buffer is in the outgoing queue. When the O_NONBLOCK flag was given to the open() function, VIDIOC_DQBUF returns immediately with an EAGAIN error code when no buffer is available. The select() or poll() function are always available.

To start and stop capturing or output applications call the VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl. Note VIDIOC_STREAMOFF removes all buffers from both queues and unlocks all buffers as a side effect. Since there is no notion of doing anything "now" on a multitasking system, if an application needs to synchronize with another event it should examine the struct v4l2_buffer timestamp of captured buffers, or set the field before enqueuing buffers for output.

Drivers implementing user pointer I/O must support the VIDIOC_REQBUFS, VIDIOC_QBUF, VIDIOC_DQBUF, VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl, the select() and poll() function.[18]

3.4. Asynchronous I/O

This method is not defined yet.

3.5. Buffers

A buffer contains data exchanged by application and driver using one of the Streaming I/O methods. Only pointers to buffers are exchanged, the data itself is not copied. These pointers, together with meta-information like timestamps or field parity, are stored in a struct v4l2_buffer, argument to the VIDIOC_QUERYBUF, VIDIOC_QBUF and VIDIOC_DQBUF ioctl.

Nominally timestamps refer to the first data byte transmitted. In practice however the wide range of hardware covered by the V4L2 API limits timestamp accuracy. Often an interrupt routine will sample the system clock shortly after the field or frame was stored completely in memory. So applications must expect a constant difference up to one field or frame period plus a small (few scan lines) random error. The delay and error can be much larger due to compression or transmission over an external bus when the frames are not properly stamped by the sender. This is frequently the case with USB cameras. Here timestamps refer to the instant the field or frame was received by the driver, not the capture time. These devices identify by not enumerating any video standards, see Section 1.7.

Similar limitations apply to output timestamps. Typically the video hardware locks to a clock controlling the video timing, the horizontal and vertical synchronization pulses. At some point in the line sequence, possibly the vertical blanking, an interrupt routine samples the system clock, compares against the timestamp and programs the hardware to repeat the previous field or frame, or to display the buffer contents.

Apart from limitations of the video device and natural inaccuracies of all clocks, it should be noted system time itself is not perfectly stable. It can be affected by power saving cycles, warped to insert leap seconds, or even turned back or forth by the system administrator, affecting long term measurements. [19]

Table 3-1. struct v4l2_buffer

__u32 index
    Number of the buffer, set by the application. This field is only used for memory mapping I/O and can range from zero to the number of buffers allocated with the VIDIOC_REQBUFS ioctl (struct v4l2_requestbuffers count) minus one.

enum v4l2_buf_type type
    Type of the buffer, same as struct v4l2_format type or struct v4l2_requestbuffers type, set by the application.

__u32 bytesused
    The number of bytes occupied by the data in the buffer. It depends on the negotiated data format and may change with each buffer for compressed variable size data like JPEG images. Drivers must set this field when type refers to an input stream, applications when an output stream.

__u32 flags
    Flags set by the application or driver, see Table 3-3.

enum v4l2_field field
    Indicates the field order of the image in the buffer, see Table 3-8. This field is not used when the buffer contains VBI data. Drivers must set it when type refers to an input stream, applications when an output stream.

struct timeval timestamp
    For input streams this is the system time (as returned by the gettimeofday() function) when the first data byte was captured. For output streams the data will not be displayed before this time, secondary to the nominal frame rate determined by the current video standard in enqueued order. Applications can for example zero this field to display frames as soon as possible. The driver stores the time at which the first data byte was actually sent out in the timestamp field. This permits applications to monitor the drift between the video and system clock.

struct v4l2_timecode timecode
    When type is V4L2_BUF_TYPE_VIDEO_CAPTURE and the V4L2_BUF_FLAG_TIMECODE flag is set in flags, this structure contains a frame timecode. In V4L2_FIELD_ALTERNATE mode the top and bottom field contain the same timecode. Timecodes are intended to help video editing and are typically recorded on video tapes, but also embedded in compressed formats like MPEG. This field is independent of the timestamp and sequence fields.

__u32 sequence
    Set by the driver, counting the frames in the sequence. In V4L2_FIELD_ALTERNATE mode the top and bottom field have the same sequence number. The count starts at zero and includes dropped or repeated frames. A dropped frame was received by an input device but could not be stored due to lack of free buffer space. A repeated frame was displayed again by an output device because the application did not pass new data in time. Note this may count the frames received e.g. over USB, without taking into account the frames dropped by the remote hardware due to limited compression throughput or bus bandwidth. These devices identify by not enumerating any video standards, see Section 1.7.

enum v4l2_memory memory
    This field must be set by applications and/or drivers in accordance with the selected I/O method.

union m

    __u32 offset
        When memory is V4L2_MEMORY_MMAP this is the offset of the buffer from the start of the device memory. The value is returned by the driver and apart from serving as parameter to the mmap() function not useful for applications. See Section 3.2 for details.

    unsigned long userptr
        When memory is V4L2_MEMORY_USERPTR this is a pointer to the buffer (cast to unsigned long type) in virtual memory, set by the application. See Section 3.3 for details.

__u32 length
    Size of the buffer (not the payload) in bytes.

__u32 input
    Some video capture drivers support rapid and synchronous video input changes, a function useful for example in video surveillance applications. For this purpose applications set the V4L2_BUF_FLAG_INPUT flag, and this field to the number of a video input as in struct v4l2_input field index.

__u32 reserved
    A place holder for future extensions and custom (driver defined) buffer types V4L2_BUF_TYPE_PRIVATE and higher.
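To make the division of labor concrete, here is a sketch of how an application fills the few fields it owns before VIDIOC_QBUF with memory mapping I/O. The struct below is a simplified stand-in for struct v4l2_buffer (real code uses <linux/videodev2.h>); the constant values are those listed in Table 3-2 and Table 3-4.

```c
#include <string.h>

/* Values as listed in Table 3-2 and Table 3-4. */
#define V4L2_BUF_TYPE_VIDEO_CAPTURE 1
#define V4L2_MEMORY_MMAP            1

/* Simplified stand-in for struct v4l2_buffer; real code uses the
   definition from <linux/videodev2.h>. */
struct buf_stub {
        unsigned int index;
        unsigned int type;
        unsigned int memory;
};

/* Prepare a buffer descriptor for VIDIOC_QBUF with memory mapping I/O:
   the application sets only index, type and memory, the driver fills
   in the remaining fields. */
static void
init_mmap_buffer (struct buf_stub *buf, unsigned int index)
{
        memset (buf, 0, sizeof (*buf));
        buf->index  = index;  /* 0 .. count - 1 from VIDIOC_REQBUFS */
        buf->type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf->memory = V4L2_MEMORY_MMAP;
}
```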

Table 3-2. enum v4l2_buf_type

V4L2_BUF_TYPE_VIDEO_CAPTURE (1)
    Buffer of a video capture stream, see Section 4.1.
V4L2_BUF_TYPE_VIDEO_OUTPUT (2)
    Buffer of a video output stream, see Section 4.3.
V4L2_BUF_TYPE_VIDEO_OVERLAY (3)
    Buffer for video overlay, see Section 4.2.
V4L2_BUF_TYPE_VBI_CAPTURE (4)
    Buffer of a raw VBI capture stream, see Section 4.7.
V4L2_BUF_TYPE_VBI_OUTPUT (5)
    Buffer of a raw VBI output stream, see Section 4.7.
V4L2_BUF_TYPE_SLICED_VBI_CAPTURE (6)
    Buffer of a sliced VBI capture stream, see Section 4.8.
V4L2_BUF_TYPE_SLICED_VBI_OUTPUT (7)
    Buffer of a sliced VBI output stream, see Section 4.8.
V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY (8)
    Buffer for video output overlay (OSD), see Section 4.4. Status: Experimental.
V4L2_BUF_TYPE_PRIVATE (0x80)
    This and higher values are reserved for custom (driver defined) buffer types.

Table 3-3. Buffer Flags

V4L2_BUF_FLAG_MAPPED (0x0001)
    The buffer resides in device memory and has been mapped into the application's address space, see Section 3.2 for details. Drivers set or clear this flag when the VIDIOC_QUERYBUF, VIDIOC_QBUF or VIDIOC_DQBUF ioctl is called. Set by the driver.
V4L2_BUF_FLAG_QUEUED (0x0002)
    Internally drivers maintain two buffer queues, an incoming and outgoing queue. When this flag is set, the buffer is currently on the incoming queue. It automatically moves to the outgoing queue after the buffer has been filled (capture devices) or displayed (output devices). Drivers set or clear this flag when the VIDIOC_QUERYBUF ioctl is called. After (successfully) calling the VIDIOC_QBUF ioctl it is always set, and after VIDIOC_DQBUF always cleared.
V4L2_BUF_FLAG_DONE (0x0004)
    When this flag is set, the buffer is currently on the outgoing queue, ready to be dequeued from the driver. Drivers set or clear this flag when the VIDIOC_QUERYBUF ioctl is called. After calling VIDIOC_QBUF or VIDIOC_DQBUF it is always cleared. Of course a buffer cannot be on both queues at the same time, the V4L2_BUF_FLAG_QUEUED and V4L2_BUF_FLAG_DONE flags are mutually exclusive. They can both be cleared however, then the buffer is in "dequeued" state, in the application domain so to speak.
V4L2_BUF_FLAG_KEYFRAME (0x0008)
    Drivers set or clear this flag when calling the VIDIOC_DQBUF ioctl. It may be set by video capture devices when the buffer contains a compressed image which is a key frame (or field), i. e. can be decompressed on its own.
V4L2_BUF_FLAG_PFRAME (0x0010)
    Similar to V4L2_BUF_FLAG_KEYFRAME this flags predicted frames or fields which contain only differences to a previous key frame.
V4L2_BUF_FLAG_BFRAME (0x0020)
    Similar to V4L2_BUF_FLAG_PFRAME this is a bidirectional predicted frame or field. [ooc tbd]
V4L2_BUF_FLAG_TIMECODE (0x0100)
    The timecode field is valid. Drivers set or clear this flag when the VIDIOC_DQBUF ioctl is called.
V4L2_BUF_FLAG_INPUT (0x0200)
    The input field is valid. Applications set or clear this flag before calling the VIDIOC_QBUF ioctl.

Table 3-4. enum v4l2_memory

V4L2_MEMORY_MMAP (1)
    The buffer is used for memory mapping I/O.
V4L2_MEMORY_USERPTR (2)
    The buffer is used for user pointer I/O.
V4L2_MEMORY_OVERLAY (3)
    [to do]

3.5.1. Timecodes

The v4l2_timecode structure is designed to hold a SMPTE 12M or similar timecode. (struct timeval timestamps are stored in the struct v4l2_buffer timestamp field.)

Table 3-5. struct v4l2_timecode

__u32 type
    Frame rate the timecodes are based on, see Table 3-6.
__u32 flags
    Timecode flags, see Table 3-7.
__u8 frames
    Frame count, 0 ... 23/24/29/49/59, depending on the type of timecode.
__u8 seconds
    Seconds count, 0 ... 59. This is a binary, not BCD number.
__u8 minutes
    Minutes count, 0 ... 59. This is a binary, not BCD number.
__u8 hours
    Hours count, 0 ... 23. This is a binary, not BCD number.
__u8 userbits[4]
    The "user group" bits from the timecode.

Table 3-6. Timecode Types

V4L2_TC_TYPE_24FPS (1)
    24 frames per second, i. e. film.
V4L2_TC_TYPE_25FPS (2)
    25 frames per second, i. e. PAL or SECAM video.
V4L2_TC_TYPE_30FPS (3)
    30 frames per second, i. e. NTSC video.
V4L2_TC_TYPE_50FPS (4)
    50 frames per second.
V4L2_TC_TYPE_60FPS (5)
    60 frames per second.

Table 3-7. Timecode Flags

V4L2_TC_FLAG_DROPFRAME (0x0001)
    Indicates "drop frame" semantics for counting frames in 29.97 fps material. When set, frame numbers 0 and 1 at the start of each minute, except minutes 0, 10, 20, 30, 40, 50, are omitted from the count.
V4L2_TC_FLAG_COLORFRAME (0x0002)
    The "color frame" flag.
V4L2_TC_USERBITS_field (0x000C)
    Field mask for the "binary group flags".
V4L2_TC_USERBITS_USERDEFINED (0x0000)
    Unspecified format.
V4L2_TC_USERBITS_8BITCHARS (0x0008)
    8-bit ISO characters.
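The timecode fields and the drop frame rule combine into a simple conversion from a timecode to a running frame number. This is an illustrative sketch, not part of the API; it assumes 24, 25 or 30 fps material and applies the drop-frame rule from the table above.

```c
/* Values as listed in Tables 3-6 and 3-7. */
#define V4L2_TC_TYPE_24FPS      1
#define V4L2_TC_TYPE_25FPS      2
#define V4L2_TC_TYPE_30FPS      3
#define V4L2_TC_FLAG_DROPFRAME  0x0001

/* Convert a struct v4l2_timecode-style HH:MM:SS:FF reading to a
   running frame count. Illustrative helper, not part of the API. */
static unsigned long
timecode_to_frames (unsigned type, unsigned flags,
                    unsigned hours, unsigned minutes,
                    unsigned seconds, unsigned frames)
{
        unsigned fps = (type == V4L2_TC_TYPE_24FPS) ? 24
                     : (type == V4L2_TC_TYPE_25FPS) ? 25
                     : 30;  /* simplified: 50/60 fps not handled here */
        unsigned long total_minutes = 60ul * hours + minutes;
        unsigned long n = fps * (3600ul * hours + 60ul * minutes + seconds)
                        + frames;

        /* Drop-frame: frame numbers 0 and 1 of every minute not
           divisible by ten were omitted from the count (Table 3-7). */
        if (flags & V4L2_TC_FLAG_DROPFRAME)
                n -= 2 * (total_minutes - total_minutes / 10);
        return n;
}
```

For example, drop-frame timecode 00:01:00;02 corresponds to frame 1800, the first frame counted in that minute.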

3.6. Field Order

We have to distinguish between progressive and interlaced video. Progressive video transmits all lines of a video image sequentially. Interlaced video divides an image into two fields, containing only the odd and even lines of the image, respectively. The so called odd and even fields are transmitted alternately, and due to a small delay between fields a cathode ray TV displays the lines interleaved, yielding the original frame. This curious technique was invented because at refresh rates similar to film the image would fade out too quickly. Transmitting fields reduces the flicker without the necessity of doubling the frame rate and with it the bandwidth required for each channel.

It is important to understand a video camera does not expose one frame at a time, merely transmitting the frames separated into fields. The fields are in fact captured at two different instances in time. An object on screen may well move between one field and the next. For applications analysing motion it is of paramount importance to recognize which field of a frame is older, the temporal order.

When the driver provides or accepts images field by field rather than interleaved, it is also important applications understand how the fields combine to frames. We distinguish between top and bottom fields, the spatial order: The first line of the top field is the first line of an interlaced frame, the first line of the bottom field is the second line of that frame.

However because fields were captured one after the other, arguing whether a frame commences with the top or bottom field is pointless. Any two successive top and bottom, or bottom and top fields yield a valid frame. Only when the source was progressive to begin with, e. g. when transferring film to video, two fields may come from the same frame, creating a natural order.

Counter to intuition the top field is not necessarily the older field. Whether the older field contains the top or bottom lines is a convention determined by the video standard. Hence the distinction between temporal and spatial order of fields. The diagrams below should make this clearer.

All video capture and output devices must report the current field order. Some drivers may permit the selection of a different order, to this end applications initialize the field field of struct v4l2_pix_format before calling the VIDIOC_S_FMT ioctl. If this is not desired it should have the value V4L2_FIELD_ANY (0).

Table 3-8. enum v4l2_field

V4L2_FIELD_ANY (0)
    Applications request this field order when any one of the V4L2_FIELD_NONE, V4L2_FIELD_TOP, V4L2_FIELD_BOTTOM, or V4L2_FIELD_INTERLACED formats is acceptable. Drivers choose depending on hardware capabilities or e. g. the requested image size, and return the actual field order. struct v4l2_buffer field can never be V4L2_FIELD_ANY.
V4L2_FIELD_NONE (1)
    Images are in progressive format, not interlaced. The driver may also indicate this order when it cannot distinguish between V4L2_FIELD_TOP and V4L2_FIELD_BOTTOM.
V4L2_FIELD_TOP (2)
    Images consist of the top field only.
V4L2_FIELD_BOTTOM (3)
    Images consist of the bottom field only. Applications may wish to prevent a device from capturing interlaced images because they will have "comb" or "feathering" artefacts around moving objects.
V4L2_FIELD_INTERLACED (4)
    Images contain both fields, interleaved line by line. The temporal order of the fields (whether the top or bottom field is first transmitted) depends on the current video standard. M/NTSC transmits the bottom field first, all other standards the top field first.
V4L2_FIELD_SEQ_TB (5)
    Images contain both fields, the top field lines are stored first in memory, immediately followed by the bottom field lines. Fields are always stored in temporal order, the older one first in memory. Image sizes refer to the frame, not fields.
V4L2_FIELD_SEQ_BT (6)
    Images contain both fields, the bottom field lines are stored first in memory, immediately followed by the top field lines. Fields are always stored in temporal order, the older one first in memory. Image sizes refer to the frame, not fields.
V4L2_FIELD_ALTERNATE (7)
    The two fields of a frame are passed in separate buffers, in temporal order, i. e. the older one first. To indicate the field parity (whether the current field is a top or bottom field) the driver or application, depending on data direction, must set struct v4l2_buffer field to V4L2_FIELD_TOP or V4L2_FIELD_BOTTOM. Any two successive fields pair to build a frame. Whether fields are successive, without any dropped fields between them (fields can drop individually), can be determined from the struct v4l2_buffer sequence field. Image sizes refer to the frame, not fields. This format cannot be selected when using the read/write I/O method.
V4L2_FIELD_INTERLACED_TB (8)
    Images contain both fields, interleaved line by line, top field first. The top field is transmitted first.
V4L2_FIELD_INTERLACED_BT (9)
    Images contain both fields, interleaved line by line, top field first. The bottom field is transmitted first.
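In V4L2_FIELD_ALTERNATE mode the application reassembles frames itself from field buffers. The rules above (opposite field parity, sequence numbers showing no dropped field in between) can be sketched as a predicate. The constants are the enum values from Table 3-8; the pairing heuristic itself is ours, for illustration only.

```c
/* Values as listed in Table 3-8. */
#define V4L2_FIELD_TOP    2
#define V4L2_FIELD_BOTTOM 3

/* Decide whether two buffers dequeued in V4L2_FIELD_ALTERNATE mode
   pair into a frame: their fields must have opposite parity, and
   their sequence numbers must be equal (two fields of one frame
   share a sequence number) or adjacent (fields of successive frames).
   Illustrative sketch, not part of the API. */
static int
fields_pair (unsigned field_a, unsigned long seq_a,
             unsigned field_b, unsigned long seq_b)
{
        if (field_a == field_b)
                return 0;  /* same parity, cannot build a frame */
        return seq_b == seq_a || seq_b == seq_a + 1;
}
```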

Figure 3-1. Field Order, Top Field First Transmitted

Figure 3-2. Field Order, Bottom Field First Transmitted

Chapter 4. Interfaces

4.1. Video Capture Interface

Video capture devices sample an analog video signal and store the digitized images in memory. Today nearly all devices can capture at full 25 or 30 frames/second. With this interface applications can control the capture process and move images from the driver into user space.

Conventionally V4L2 video capture devices are accessed through character device special files named /dev/video and /dev/video0 to /dev/video63 with major number 81 and minor numbers 0 to 63. /dev/video is typically a symbolic link to the preferred video device. Note the same device files are used for video output devices.

4.1.1. Querying Capabilities

Devices supporting the video capture interface set the V4L2_CAP_VIDEO_CAPTURE flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. As secondary device functions they may also support the video overlay (V4L2_CAP_VIDEO_OVERLAY) and the raw VBI capture (V4L2_CAP_VBI_CAPTURE) interface. At least one of the read/write or streaming I/O methods must be supported. Tuners and audio inputs are optional.
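Testing the capabilities bitmask is plain bit arithmetic. A minimal sketch; the flag values are those defined in <linux/videodev2.h>, the helper function is ours:

```c
/* Capability flag values as defined in <linux/videodev2.h>. */
#define V4L2_CAP_VIDEO_CAPTURE 0x00000001
#define V4L2_CAP_VIDEO_OVERLAY 0x00000004
#define V4L2_CAP_VBI_CAPTURE   0x00000010

/* After VIDIOC_QUERYCAP, the capabilities field of struct
   v4l2_capability tells which interfaces the device supports. */
static int
has_cap (unsigned int capabilities, unsigned int flag)
{
        return (capabilities & flag) == flag;
}
```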

4.1.2. Supplemental Functions

Video capture devices shall support audio input, tuner, controls, cropping and scaling and streaming parameter ioctls as needed. The video input and video standard ioctls must be supported by all video capture devices.

4.1.3. Image Format Negotiation

The result of a capture operation is determined by cropping and image format parameters. The former select an area of the video picture to capture, the latter how images are stored in memory, i. e. in RGB or YUV format, the number of bits per pixel or width and height. Together they also define how images are scaled in the process.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then reading from it as if it was a plain file. Well written V4L2 applications ensure they really get what they want, including cropping and scaling.

Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

To query the current image format applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_CAPTURE and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_pix_format pix member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_pix_format pix member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers may adjust the parameters and finally return the actual parameters as VIDIOC_G_FMT does.
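The adjust-and-return convention can be illustrated from the driver's side: a driver handling VIDIOC_S_FMT or VIDIOC_TRY_FMT conceptually clamps the requested size to what the hardware supports and writes the actual values back, rather than failing. A sketch under hypothetical hardware limits; the names and limits below are ours, not from the API.

```c
/* Hypothetical hardware limits, for illustration only. */
#define MIN_W  48
#define MAX_W 720
#define MIN_H  32
#define MAX_H 576

static unsigned int
clamp_u32 (unsigned int v, unsigned int lo, unsigned int hi)
{
        return v < lo ? lo : v > hi ? hi : v;
}

/* What a driver conceptually does with the width/height fields of the
   struct v4l2_pix_format passed to VIDIOC_S_FMT or VIDIOC_TRY_FMT:
   adjust the request and return the values actually used, as
   VIDIOC_G_FMT would report them. */
static void
adjust_size (unsigned int *width, unsigned int *height)
{
        *width  = clamp_u32 (*width,  MIN_W, MAX_W);
        *height = clamp_u32 (*height, MIN_H, MAX_H);
}
```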

Like VIDIOC_S_FMT the VIDIOC_TRY_FMT ioctl can be used to learn about hardware limitations without disabling I/O or possibly time consuming hardware preparations.

The contents of struct v4l2_pix_format are discussed in Chapter 2. See also the specification of the VIDIOC_G_FMT, VIDIOC_S_FMT and VIDIOC_TRY_FMT ioctls for details. Video capture devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.

A video capture device may support the read() function and/or streaming (memory mapping or user pointer) I/O. See Chapter 3 for details.

4.2. Video Overlay Interface

Also known as Framebuffer Overlay or Previewing

Video overlay devices have the ability to genlock (TV-)video into the (VGA-)video signal of a graphics card, or to store captured images directly in video memory of a graphics card, typically with clipping. This can be considerably more efficient than capturing images and displaying them by other means. In the old days when only nuclear power plants needed cooling towers this used to be the only way to put live video into a window.

Video overlay devices are accessed through the same character special files as video capture devices. Note the default function of a /dev/video device is video capturing. The overlay function is only available after calling the VIDIOC_S_FMT ioctl.

The driver may support simultaneous overlay and capturing using the read/write and streaming I/O methods. If so, operation at the nominal frame rate of the video standard is not guaranteed. Frames may be directed away from overlay to capture, or one field may be used for overlay and the other for capture if the capture parameters permit this.

Applications should use different file descriptors for capturing and overlay. This must be supported by all drivers capable of simultaneous capturing and overlay. Optionally these drivers may also permit capturing and overlay with a single file descriptor for compatibility with V4L and earlier versions of V4L2.[20]

4.2.1. Querying Capabilities

Devices supporting the video overlay interface set the V4L2_CAP_VIDEO_OVERLAY flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. The overlay I/O method specified below must be supported. Tuners and audio inputs are optional.

4.2.2. Supplemental Functions

Video overlay devices shall support audio input, tuner, controls, cropping and scaling and streaming parameter ioctls as needed. The video input and video standard ioctls must be supported by all video overlay devices.

4.2.3. Setup

Before overlay can commence applications must program the driver with frame buffer parameters, namely the address and size of the frame buffer and the image format, for example RGB 5:6:5. The VIDIOC_G_FBUF and VIDIOC_S_FBUF ioctls are available to get and set these parameters, respectively. The VIDIOC_S_FBUF ioctl is privileged because it allows setting up DMA into physical memory, bypassing the memory protection mechanisms of the kernel. Only the superuser can change the frame buffer address and size. Users are not supposed to run TV applications as root or with SUID bit set. A small helper application with suitable privileges should query the graphics system and program the V4L2 driver at the appropriate time.

Some devices add the video overlay to the output signal of the graphics card. In this case the frame buffer is not modified by the video device, and the frame buffer address and pixel format are not needed by the driver. The VIDIOC_S_FBUF ioctl is not privileged. An application can check for this type of device by calling the VIDIOC_G_FBUF ioctl.

A driver may support any (or none) of five clipping/blendingmethods:

Chroma-keying displays the overlaid image only where pixels in the primary graphics surface assume a certain color.

A bitmap can be specified where each bit corresponds to a pixel in the overlaid image. When the bit is set, the corresponding video pixel is displayed, otherwise a pixel of the graphics surface.

A list of clipping rectangles can be specified. In these regions no video is displayed, so the graphics surface can be seen here.

The framebuffer has an alpha channel that can be used to clip or blend the framebuffer with the video.

A global alpha value can be specified to blend the framebuffer contents with video images.

When simultaneous capturing and overlay is supported and the hardware prohibits different image and frame buffer formats, the format requested first takes precedence. The attempt to capture (VIDIOC_S_FMT) or overlay (VIDIOC_S_FBUF) may fail with an EBUSY error code or return accordingly modified parameters.

4.2.4. Overlay Window

The overlaid image is determined by cropping and overlay window parameters. The former select an area of the video picture to capture, the latter how images are overlaid and clipped. Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

The overlay window is described by a struct v4l2_window. It defines the size of the image, its position over the graphics surface and the clipping to be applied. To get the current parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OVERLAY and call the VIDIOC_G_FMT ioctl. The driver fills the v4l2_window substructure named win. It is not possible to retrieve a previously programmed clipping list or bitmap.

To program the overlay window applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OVERLAY, initialize the win substructure and call the VIDIOC_S_FMT ioctl. The driver adjusts the parameters against hardware limits and returns the actual parameters as VIDIOC_G_FMT does. Like VIDIOC_S_FMT, the VIDIOC_TRY_FMT ioctl can be used to learn about driver capabilities without actually changing driver state. Unlike VIDIOC_S_FMT this also works after the overlay has been enabled.

The scaling factor of the overlaid image is implied by the width and height given in struct v4l2_window and the size of the cropping rectangle.

When simultaneous capturing and overlay is supported and the hardware prohibits different image and window sizes, the size requested first takes precedence. The attempt to capture or overlay as well (VIDIOC_S_FMT) may fail with an EBUSY error code or return accordingly modified parameters.

Table 4-1. struct v4l2_window

struct v4l2_rect w
    Size and position of the window relative to the top, left corner of the frame buffer defined with VIDIOC_S_FBUF. The window can extend the frame buffer width and height, the x and y coordinates can be negative, and it can lie completely outside the frame buffer. The driver clips the window accordingly, or if that is not possible, modifies its size and/or position.

enum v4l2_field field
    Applications set this field to determine which video field shall be overlaid, typically one of V4L2_FIELD_ANY (0), V4L2_FIELD_TOP, V4L2_FIELD_BOTTOM or V4L2_FIELD_INTERLACED. Drivers may have to choose a different field order and return the actual setting here.

__u32 chromakey
    When chroma-keying has been negotiated with VIDIOC_S_FBUF applications set this field to the desired pixel value for the chroma key. The format is the same as the pixel format of the framebuffer (struct v4l2_framebuffer fmt.pixelformat field), with bytes in host order. E. g. for V4L2_PIX_FMT_BGR24 the value should be 0xRRGGBB on a little endian, 0xBBGGRR on a big endian host.

struct v4l2_clip *clips
    When chroma-keying has not been negotiated and VIDIOC_G_FBUF indicated this capability, applications can set this field to point to an array of clipping rectangles. Like the window coordinates w, clipping rectangles are defined relative to the top, left corner of the frame buffer. However clipping rectangles must not extend the frame buffer width and height, and they must not overlap. If possible applications should merge adjacent rectangles. Whether this must create x-y or y-x bands, or the order of rectangles, is not defined. When clip lists are not supported the driver ignores this field. Its contents after calling VIDIOC_S_FMT are undefined.

__u32 clipcount
    When the application set the clips field, this field must contain the number of clipping rectangles in the list. When clip lists are not supported the driver ignores this field, its contents after calling VIDIOC_S_FMT are undefined. When clip lists are supported but no clipping is desired this field must be set to zero.

void *bitmap
    When chroma-keying has not been negotiated and VIDIOC_G_FBUF indicated this capability, applications can set this field to point to a clipping bit mask. It must be of the same size as the window, w.width and w.height. Each bit corresponds to a pixel in the overlaid image, which is displayed only when the bit is set. Pixel coordinates translate to bits like:

        ((__u8 *) bitmap)[w.width * y + x / 8] & (1 << (x & 7))

    where 0 ≤ x < w.width and 0 ≤ y < w.height.[a] When a clipping bit mask is not supported the driver ignores this field, its contents after calling VIDIOC_S_FMT are undefined. When a bit mask is supported but no clipping is desired this field must be set to NULL. Applications need not create a clip list or bit mask. When they pass both, or despite negotiating chroma-keying, the results are undefined. Regardless of the chosen method, the clipping abilities of the hardware may be limited in quantity or quality. The results when these limits are exceeded are undefined.[b]

__u8 global_alpha
    The global alpha value used to blend the framebuffer with video images, if global alpha blending has been negotiated (V4L2_FBUF_FLAG_GLOBAL_ALPHA, see VIDIOC_S_FBUF, Table 3). Note this field was added in Linux 2.6.23, extending the structure. However the VIDIOC_G/S/TRY_FMT ioctls, which take a pointer to a v4l2_format parent structure with padding bytes at the end, are not affected.

Notes:
a. Should we require w.width to be a multiple of eight?
b. When the image is written into frame buffer memory it will be undesirable if the driver clips out fewer pixels than expected, because the application and graphics system are not aware these regions need to be refreshed. The driver should clip out more pixels or not write the image at all.
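The bit mask addressing expression given in Table 4-1 can be wrapped directly in a helper. A minimal sketch; the function name is ours:

```c
/* Test whether pixel (x, y) of the overlay window is displayed
   according to a clipping bit mask, using the expression from
   Table 4-1:
       ((__u8 *) bitmap)[w.width * y + x / 8] & (1 << (x & 7))
   Illustrative helper, not part of the V4L2 API. */
static int
pixel_visible (const unsigned char *bitmap, unsigned int width,
               unsigned int x, unsigned int y)
{
        return (bitmap[width * y + x / 8] & (1 << (x & 7))) != 0;
}
```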

Table 4-2. struct v4l2_clip[21]

struct v4l2_rect c
    Coordinates of the clipping rectangle, relative to the top, left corner of the frame buffer. Only window pixels outside all clipping rectangles are displayed.
struct v4l2_clip *next
    Pointer to the next clipping rectangle, NULL when this is the last rectangle. Drivers ignore this field, it cannot be used to pass a linked list of clipping rectangles.

Table 4-3. struct v4l2_rect

__s32 left
    Horizontal offset of the top, left corner of the rectangle, in pixels.
__s32 top
    Vertical offset of the top, left corner of the rectangle, in pixels. Offsets increase to the right and down.
__s32 width
    Width of the rectangle, in pixels.
__s32 height
    Height of the rectangle, in pixels. Width and height cannot be negative, the fields are signed for hysterical reasons.
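Table 4-1 requires that clipping rectangles in a clip list do not overlap. An overlap test over this rectangle representation is straightforward: two rectangles overlap when they intersect in both dimensions. A sketch with a simplified stand-in for struct v4l2_rect; the helper is ours, not part of the API.

```c
/* Simplified stand-in for struct v4l2_rect (Table 4-3). */
struct rect {
        int left, top, width, height;
};

/* Two rectangles overlap when their horizontal and vertical extents
   both intersect; rectangles that merely touch at an edge do not
   overlap. Illustrative helper, not part of the V4L2 API. */
static int
rects_overlap (const struct rect *a, const struct rect *b)
{
        return a->left < b->left + b->width  && b->left < a->left + a->width
            && a->top  < b->top  + b->height && b->top  < a->top  + a->height;
}
```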

4.2.5. Enabling Overlay

To start or stop the frame buffer overlay applications callthe VIDIOC_OVERLAY ioctl.

4.3. Video Output Interface

Video output devices encode stills or image sequences as analog video signal. With this interface applications can control the encoding process and move images from user space to the driver.

Conventionally V4L2 video output devices are accessed through character device special files named /dev/video and /dev/video0 to /dev/video63 with major number 81 and minor numbers 0 to 63. /dev/video is typically a symbolic link to the preferred video device. Note the same device files are used for video capture devices.

4.3.1. Querying Capabilities

Devices supporting the video output interface set the V4L2_CAP_VIDEO_OUTPUT flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. As secondary device functions they may also support the raw VBI output (V4L2_CAP_VBI_OUTPUT) interface. At least one of the read/write or streaming I/O methods must be supported. Modulators and audio outputs are optional.

4.3.2. Supplemental Functions

Video output devices shall support audio output, modulator, controls, cropping and scaling and streaming parameter ioctls as needed. The video output and video standard ioctls must be supported by all video output devices.

4.3.3. Image Format Negotiation

The output is determined by cropping and image format parameters. The former select an area of the video picture where the image will appear, the latter how images are stored in memory, i. e. in RGB or YUV format, the number of bits per pixel or width and height. Together they also define how images are scaled in the process.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then writing to it as if it was a plain file. Well written V4L2 applications ensure they really get what they want, including cropping and scaling.

Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

To query the current image format applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_pix_format pix member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_pix_format pix member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers may adjust the parameters and finally return the actual parameters as VIDIOC_G_FMT does.

Like VIDIOC_S_FMT the VIDIOC_TRY_FMT ioctl can be used to learn about hardware limitations without disabling I/O or possibly time consuming hardware preparations.

The contents of struct v4l2_pix_format are discussed in Chapter 2. See also the specification of the VIDIOC_G_FMT, VIDIOC_S_FMT and VIDIOC_TRY_FMT ioctls for details. Video output devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.

4.3.4. Writing Images

A video output device may support the write() function and/or streaming (memory mapping or user pointer) I/O. See Chapter 3 for details.

4.4. Video Output Overlay Interface

Also known as On-Screen Display (OSD)

Experimental: This is an experimentalinterface and may change in the future.

Some video output devices can overlay a framebuffer image onto the outgoing video signal. Applications can set up such an overlay using this interface, which borrows structures and ioctls of the Video Overlay interface.

The OSD function is accessible through the same character special file as the Video Output function. Note the default function of such a /dev/video device is video capturing or output. The OSD function is only available after calling the VIDIOC_S_FMT ioctl.

4.4.1. Querying Capabilities

Devices supporting the Video Output Overlay interface set the V4L2_CAP_VIDEO_OUTPUT_OVERLAY flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl.

4.4.2. Framebuffer

Contrary to the Video Overlay interface the framebuffer is normally implemented on the TV card and not the graphics card. On Linux it is accessible as a framebuffer device (/dev/fbN). Given a V4L2 device, applications can find the corresponding framebuffer device by calling the VIDIOC_G_FBUF ioctl. It returns, amongst other information, the physical address of the framebuffer in the base field of struct v4l2_framebuffer. The framebuffer device ioctl FBIOGET_FSCREENINFO returns the same address in the smem_start field of struct fb_fix_screeninfo. The FBIOGET_FSCREENINFO ioctl and struct fb_fix_screeninfo are defined in the linux/fb.h header file.

The width and height of the framebuffer depend on the current video standard. A V4L2 driver may reject attempts to change the video standard (or any other ioctl which would imply a framebuffer size change) with an EBUSY error code until all applications have closed the framebuffer device.

Example 4-1. Finding a framebuffer device for OSD

#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/fb.h>
#include <linux/videodev2.h>

struct v4l2_framebuffer fbuf;
unsigned int i;
int fb_fd;

/* Query the overlay framebuffer parameters of the V4L2 device fd. */
if (-1 == ioctl (fd, VIDIOC_G_FBUF, &fbuf)) {
        perror ("VIDIOC_G_FBUF");
        exit (EXIT_FAILURE);
}

/* Probe the framebuffer devices for one with a matching base address. */
for (i = 0; i < 30; ++i) {
        char dev_name[16];
        struct fb_fix_screeninfo si;

        snprintf (dev_name, sizeof (dev_name), "/dev/fb%u", i);

        fb_fd = open (dev_name, O_RDWR);
        if (-1 == fb_fd) {
                switch (errno) {
                case ENOENT: /* no such file */
                case ENXIO:  /* no driver */
                        continue;

                default:
                        perror ("open");
                        exit (EXIT_FAILURE);
                }
        }

        if (0 == ioctl (fb_fd, FBIOGET_FSCREENINFO, &si)) {
                if (si.smem_start == (unsigned long) fbuf.base)
                        break;
        } else {
                /* Apparently not a framebuffer device. */
        }

        close (fb_fd);
        fb_fd = -1;
}

/* fb_fd is the file descriptor of the framebuffer device
   for the video output overlay, or -1 if no device was found. */

4.4.3. Overlay Window and Scaling

The overlay is controlled by source and target rectangles. The source rectangle selects a subsection of the framebuffer image to be overlaid, the target rectangle an area in the outgoing video signal where the image will appear. Drivers may or may not support scaling, and arbitrary sizes and positions of these rectangles. Further drivers may support any (or none) of the clipping/blending methods defined for the Video Overlay interface.

A struct v4l2_window defines the size of the source rectangle, its position in the framebuffer and the clipping/blending method to be used for the overlay. To get the current parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY and call the VIDIOC_G_FMT ioctl. The driver fills the v4l2_window substructure named win. It is not possible to retrieve a previously programmed clipping list or bitmap.

To program the source rectangle applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY, initialize the win substructure and call the VIDIOC_S_FMT ioctl. The driver adjusts the parameters against hardware limits and returns the actual parameters as VIDIOC_G_FMT does. Like VIDIOC_S_FMT, the VIDIOC_TRY_FMT ioctl can be used to learn about driver capabilities without actually changing driver state. Unlike VIDIOC_S_FMT this also works after the overlay has been enabled.

A struct v4l2_crop defines the size and position of the target rectangle. The scaling factor of the overlay is implied by the width and height given in struct v4l2_window and struct v4l2_crop. The cropping API applies to Video Output and Video Output Overlay devices in the same way as to Video Capture and Video Overlay devices, merely reversing the direction of the data flow. For more information see Section 1.11.

4.4.4. Enabling Overlay

There is no V4L2 ioctl to enable or disable the overlay, however the framebuffer interface of the driver may support the FBIOBLANK ioctl.

4.5. Codec Interface

Suspended: This interface has been suspended from the V4L2 API implemented in Linux 2.6 until we have more experience with codec device interfaces.

A V4L2 codec can compress, decompress, transform, or otherwise convert video data from one format into another format, in memory. Applications send data to be converted to the driver through a write() call, and receive the converted data through a read() call. For efficiency a driver may also support streaming I/O.

[to do]

4.6. Effect Devices Interface

Suspended: This interface has been suspended from the V4L2 API implemented in Linux 2.6 until we have more experience with effect device interfaces.

A V4L2 video effect device can do image effects, filtering, or combine two or more images or image streams, for example video transitions or wipes. Applications send data to be processed and receive the result data either with read() and write() functions, or through the streaming I/O mechanism.

[to do]

4.7. Raw VBI Data Interface

VBI is an abbreviation of Vertical Blanking Interval, a gap in the sequence of lines of an analog video signal. During VBI no picture information is transmitted, allowing some time while the electron beam of a cathode ray tube TV returns to the top of the screen. Using an oscilloscope you will find here the vertical synchronization pulses and short data packages ASK modulated [22] onto the video signal. These are transmissions of services such as Teletext or Closed Caption.

Subject of this interface type is raw VBI data, as sampled off a video signal, or to be added to a signal for output. The data format is similar to uncompressed video images, a number of lines times a number of samples per line; we call this a VBI image.

Conventionally V4L2 VBI devices are accessed through character device special files named /dev/vbi and /dev/vbi0 to /dev/vbi31 with major number 81 and minor numbers 224 to 255. /dev/vbi is typically a symbolic link to the preferred VBI device. This convention applies to both input and output devices.

To address the problems of finding related video and VBI devices VBI capturing and output is also available as device function under /dev/video. To capture or output raw VBI data with these devices applications must call the VIDIOC_S_FMT ioctl. Accessed as /dev/vbi, raw VBI capturing or output is the default device function.

4.7.1. Querying Capabilities

Devices supporting the raw VBI capturing or output API set the V4L2_CAP_VBI_CAPTURE or V4L2_CAP_VBI_OUTPUT flags, respectively, in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. At least one of the read/write, streaming or asynchronous I/O methods must be supported. VBI devices may or may not have a tuner or modulator.

4.7.2. Supplemental Functions

VBI devices shall support video input or output, tuner or modulator, and controls ioctls as needed. The video standard ioctls provide information vital to program a VBI device, therefore they must be supported.

4.7.3. Raw VBI Format Negotiation

Raw VBI sampling abilities can vary, in particular the sampling frequency. To properly interpret the data V4L2 specifies an ioctl to query the sampling parameters. Moreover, to allow for some flexibility applications can also suggest different parameters.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then reading from it as if it was a plain file. Well written V4L2 applications should always ensure they really get what they want, requesting reasonable parameters and then checking if the actual parameters are suitable.

To query the current raw VBI capture parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VBI_CAPTURE or V4L2_BUF_TYPE_VBI_OUTPUT, and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_vbi_format vbi member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_vbi_format vbi member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers return an EINVAL error code only when the given parameters are ambiguous, otherwise they modify the parameters according to the hardware capabilities and return the actual parameters. When the driver allocates resources at this point, it may return an EBUSY error code to indicate the returned parameters are valid but the required resources are currently not available. That may happen for instance when the video and VBI areas to capture would overlap, or when the driver supports multiple opens and another process already requested VBI capturing or output. Anyway, applications must expect other resource allocation points which may return EBUSY: at the VIDIOC_STREAMON ioctl and the first read(), write() and select() call.

VBI devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.

Table 4-4. struct v4l2_vbi_format

__u32 sampling_rate
    Samples per second, i. e. unit 1 Hz.

__u32 offset
    Horizontal offset of the VBI image, relative to the leading edge of the line synchronization pulse and counted in samples: The first sample in the VBI image will be located offset / sampling_rate seconds following the leading edge. See also Figure 4-1.

__u32 samples_per_line

__u32 sample_format
    Defines the sample format as in Chapter 2, a four-character-code.[a] Usually this is V4L2_PIX_FMT_GREY, i. e. each sample consists of 8 bits with lower values oriented towards the black level. Do not assume any other correlation of values with the signal level. For example, the MSB does not necessarily indicate if the signal is 'high' or 'low' because 128 may not be the mean value of the signal. Drivers shall not convert the sample format by software.

__u32 start[2]
    This is the scanning system line number associated with the first line of the VBI image, of the first and the second field respectively. See Figure 4-2 and Figure 4-3 for valid values. VBI input drivers can return start values 0 if the hardware cannot reliably identify scanning lines; VBI acquisition may not require this information.

__u32 count[2]
    The number of lines in the first and second field image, respectively. Drivers should be as flexible as possible. For example, it may be possible to extend or move the VBI capture window down to the picture area, implementing a 'full field mode' to capture data service transmissions embedded in the picture. An application can set the first or second count value to zero if no data is required from the respective field; count[1] if the scanning system is progressive, i. e. not interlaced. The corresponding start value shall be ignored by the application and driver. Anyway, drivers may not support single field capturing and return both count values non-zero.
    Both count values set to zero, or line numbers outside the bounds depicted in Figure 4-2 and Figure 4-3, or a field image covering lines of two fields, are invalid and shall not be returned by the driver.
    To initialize the start and count fields, applications must first determine the current video standard selection. The v4l2_std_id or the framelines field of struct v4l2_standard can be evaluated for this purpose.

__u32 flags
    See Table 4-5 below. Currently only drivers set flags, applications must set this field to zero.

__u32 reserved[2]
    This array is reserved for future extensions. Drivers and applications must set it to zero.

Notes:
a. A few devices may be unable to sample VBI data at all but can extend the video capture window to the VBI region.

Table 4-5. Raw VBI Format Flags

V4L2_VBI_UNSYNC (0x0001)
    This flag indicates hardware which does not properly distinguish between fields. Normally the VBI image stores the first field (lower scanning line numbers) first in memory. This may be a top or bottom field depending on the video standard. When this flag is set the first or second field may be stored first, however the fields are still in correct temporal order with the older field first in memory.[a]

V4L2_VBI_INTERLACED (0x0002)
    By default the two field images will be passed sequentially; all lines of the first field followed by all lines of the second field (compare Section 3.6, V4L2_FIELD_SEQ_TB and V4L2_FIELD_SEQ_BT; whether the top or bottom field is first in memory depends on the video standard). When this flag is set, the two fields are interlaced (cf. V4L2_FIELD_INTERLACED): the first line of the first field followed by the first line of the second field, then the two second lines, and so on. Such a layout may be necessary when the hardware has been programmed to capture or output interlaced video images and is unable to separate the fields for VBI capturing at the same time. For simplicity setting this flag implies that both count values are equal and non-zero.

Notes:
a. Most VBI services transmit on both fields, but some have different semantics depending on the field number. These cannot be reliably decoded or encoded when V4L2_VBI_UNSYNC is set.

Figure 4-1. Line synchronization

Figure 4-2. ITU-R 525 line numbering (M/NTSC and M/PAL)

(1) For the purpose of this specification field 2 starts in line 264 and not 263.5 because half line capturing is not supported.

Figure 4-3. ITU-R 625 line numbering

(1) For the purpose of this specification field 2 starts in line 314 and not 313.5 because half line capturing is not supported.

Remember the VBI image format depends on the selected video standard, therefore the application must choose a new standard or query the current standard first. Attempts to read or write data ahead of format negotiation, or after switching the video standard which may invalidate the negotiated VBI parameters, should be refused by the driver. A format change during active I/O is not permitted.

4.7.4. Reading and writing VBI images

To assure synchronization with the field number and easier implementation, the smallest unit of data passed at a time is one frame, consisting of two fields of VBI images immediately following in memory.

The total size of a frame computes as follows:

(count[0] + count[1]) * samples_per_line * sample size in bytes

The sample size is most likely always one byte; applications must check the sample_format field though, to function properly with other drivers.

A VBI device may support read/write and/or streaming (memory mapping or user pointer) I/O. The latter bears the possibility of synchronizing video and VBI data by using buffer timestamps.

Remember the VIDIOC_STREAMON ioctl and the first read(), write() and select() call can be resource allocation points returning an EBUSY error code if the required hardware resources are temporarily unavailable, for example the device is already in use by another process.

4.8. Sliced VBI Data Interface

VBI stands for Vertical Blanking Interval, a gap in the sequence of lines of an analog video signal. During VBI no picture information is transmitted, allowing some time while the electron beam of a cathode ray tube TV returns to the top of the screen.

Sliced VBI devices use hardware to demodulate data transmitted in the VBI. V4L2 drivers shall not do this by software, see also the raw VBI interface. The data is passed as short packets of fixed size, covering one scan line each. The number of packets per video frame is variable.

Sliced VBI capture and output devices are accessed through the same character special files as raw VBI devices. When a driver supports both interfaces, the default function of a /dev/vbi device is raw VBI capturing or output, and the sliced VBI function is only available after calling the VIDIOC_S_FMT ioctl as defined below. Likewise a /dev/video device may support the sliced VBI API, however the default function here is video capturing or output. Different file descriptors must be used to pass raw and sliced VBI data simultaneously, if this is supported by the driver.

4.8.1. Querying Capabilities

Devices supporting the sliced VBI capturing or output API set the V4L2_CAP_SLICED_VBI_CAPTURE or V4L2_CAP_SLICED_VBI_OUTPUT flag respectively, in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. At least one of the read/write, streaming or asynchronous I/O methods must be supported. Sliced VBI devices may have a tuner or modulator.

4.8.2. Supplemental Functions

Sliced VBI devices shall support video input or output and tuner or modulator ioctls if they have these capabilities, and they may support control ioctls. The video standard ioctls provide information vital to program a sliced VBI device, therefore they must be supported.

4.8.3. Sliced VBI Format Negotiation

To find out which data services are supported by the hardware applications can call the VIDIOC_G_SLICED_VBI_CAP ioctl. All drivers implementing the sliced VBI interface must support this ioctl. The results may differ from those of the VIDIOC_S_FMT ioctl when the number of VBI lines the hardware can capture or output per frame, or the number of services it can identify on a given line are limited. For example on PAL line 16 the hardware may be able to look for a VPS or Teletext signal, but not both at the same time.

To determine the currently selected services applications set the type field of struct v4l2_format to V4L2_BUF_TYPE_SLICED_VBI_CAPTURE or V4L2_BUF_TYPE_SLICED_VBI_OUTPUT, and the VIDIOC_G_FMT ioctl fills the fmt.sliced member, a struct v4l2_sliced_vbi_format.

Applications can request different parameters by initializing or modifying the fmt.sliced member and calling the VIDIOC_S_FMT ioctl with a pointer to the v4l2_format structure.

The sliced VBI API is more complicated than the raw VBI API because the hardware must be told which VBI service to expect on each scan line. Not all services may be supported by the hardware on all lines (this is especially true for VBI output where Teletext is often unsupported and other services can only be inserted in one specific line). In many cases, however, it is sufficient to just set the service_set field to the required services and let the driver fill the service_lines array according to hardware capabilities. Only if more precise control is needed should the programmer set the service_lines array explicitly.

The VIDIOC_S_FMT ioctl returns an EINVAL error code only when the given parameters are ambiguous, otherwise it modifies the parameters according to hardware capabilities. When the driver allocates resources at this point, it may return an EBUSY error code if the required resources are temporarily unavailable. Other resource allocation points which may return EBUSY can be the VIDIOC_STREAMON ioctl and the first read(), write() and select() call.

Table 4-6. struct v4l2_sliced_vbi_format

__u32 service_set
    If service_set is non-zero when passed with VIDIOC_S_FMT or VIDIOC_TRY_FMT, the service_lines array will be filled by the driver according to the services specified in this field. For example, if service_set is initialized with V4L2_SLICED_TELETEXT_B | V4L2_SLICED_WSS_625, a driver for the cx25840 video decoder sets lines 7-22 of both fields[a] to V4L2_SLICED_TELETEXT_B and line 23 of the first field to V4L2_SLICED_WSS_625. If service_set is set to zero, then the values of service_lines will be used instead.
    On return the driver sets this field to the union of all elements of the returned service_lines array. It may contain less services than requested, perhaps just one, if the hardware cannot handle more services simultaneously. It may be empty (zero) if none of the requested services are supported by the hardware.

__u16 service_lines[2][24]
    Applications initialize this array with sets of data services the driver shall look for or insert on the respective scan line. Subject to hardware capabilities drivers return the requested set, a subset, which may be just a single service, or an empty set. When the hardware cannot handle multiple services on the same line the driver shall choose one. No assumptions can be made on which service the driver chooses.
    Data services are defined in Table 4-7. Array indices map to ITU-R line numbers (see also Figure 4-2 and Figure 4-3) as follows:

    Element                  525 line systems    625 line systems
    service_lines[0][1]      1                   1
    service_lines[0][23]     23                  23
    service_lines[1][1]      264                 314
    service_lines[1][23]     286                 336

    Drivers must set service_lines[0][0] and service_lines[1][0] to zero.

__u32 io_size
    Maximum number of bytes passed by one read() or write() call, and the buffer size in bytes for the VIDIOC_QBUF and VIDIOC_DQBUF ioctl. Drivers set this field to the size of struct v4l2_sliced_vbi_data times the number of non-zero elements in the returned service_lines array (that is the number of lines potentially carrying data).

__u32 reserved[2]
    This array is reserved for future extensions. Applications and drivers must set it to zero.

Notes:
a. According to ETS 300 706 lines 6-22 of the first field and lines 5-22 of the second field may carry Teletext data.

Table 4-7. Sliced VBI services

V4L2_SLICED_TELETEXT_B (Teletext System B), 0x0001
    Reference: ETS 300 706, ITU BT.653. Lines, usually: PAL/SECAM line 7-22, 320-335 (second field 7-22). Payload: Last 42 of the 45 byte Teletext packet, that is without clock run-in and framing code, lsb first transmitted.

V4L2_SLICED_VPS, 0x0400
    Reference: ETS 300 231. Lines, usually: PAL line 16. Payload: Byte number 3 to 15 according to Figure 9 of ETS 300 231, lsb first transmitted.

V4L2_SLICED_CAPTION_525, 0x1000
    Reference: EIA 608-B. Lines, usually: NTSC line 21, 284 (second field 21). Payload: Two bytes in transmission order, including parity bit, lsb first transmitted.

V4L2_SLICED_WSS_625, 0x4000
    Reference: ITU BT.1119, EN 300 294. Lines, usually: PAL/SECAM line 23. Payload:
        Byte         0                 1
        msb         lsb  msb           lsb
        Bit  7 6 5 4 3 2 1 0  x x 13 12 11 10 9

V4L2_SLICED_VBI_525, 0x1000
    Set of services applicable to 525 line systems.

V4L2_SLICED_VBI_625, 0x4401
    Set of services applicable to 625 line systems.

Drivers may return an EINVAL error code when applications attempt to read or write data without prior format negotiation, after switching the video standard (which may invalidate the negotiated VBI parameters) and after switching the video input (which may change the video standard as a side effect). The VIDIOC_S_FMT ioctl may return an EBUSY error code when applications attempt to change the format while I/O is in progress (between a VIDIOC_STREAMON and VIDIOC_STREAMOFF call, and after the first read() or write() call).

4.8.4. Reading and writing sliced VBI data

A single read() or write() call must pass all data belonging to one video frame. That is an array of v4l2_sliced_vbi_data structures with one or more elements and a total size not exceeding io_size bytes. Likewise in streaming I/O mode one buffer of io_size bytes must contain data of one video frame. The id of unused v4l2_sliced_vbi_data elements must be zero.

Table 4-8. struct v4l2_sliced_vbi_data

__u32 id
    A flag from Table 2 identifying the type of data in this packet. Only a single bit must be set. When the id of a captured packet is zero, the packet is empty and the contents of other fields are undefined. Applications shall ignore empty packets. When the id of a packet for output is zero the contents of the data field are undefined and the driver must no longer insert data on the requested field and line.

__u32 field
    The video field number this data has been captured from, or shall be inserted at. 0 for the first field, 1 for the second field.

__u32 line
    The field (as opposed to frame) line number this data has been captured from, or shall be inserted at. See Figure 4-2 and Figure 4-3 for valid values. Sliced VBI capture devices can set the line number of all packets to 0 if the hardware cannot reliably identify scan lines. The field number must always be valid.

__u32 reserved
    This field is reserved for future extensions. Applications and drivers must set it to zero.

__u8 data[48]
    The packet payload. See Table 2 for the contents and number of bytes passed for each data type. The contents of padding bytes at the end of this array are undefined; drivers and applications shall ignore them.

Packets are always passed in ascending line number order, without duplicate line numbers. The write() function and the VIDIOC_QBUF ioctl must return an EINVAL error code when applications violate this rule. They must also return an EINVAL error code when applications pass an incorrect field or line number, or a combination of field, line and id which has not been negotiated with the VIDIOC_G_FMT or VIDIOC_S_FMT ioctl. When the line numbers are unknown the driver must pass the packets in transmitted order. The driver can insert empty packets with id set to zero anywhere in the packet array.

To assure synchronization and to distinguish from frame dropping, when a captured frame does not carry any of the requested data services drivers must pass one or more empty packets. When an application fails to pass VBI data in time for output, the driver must output the last VPS and WSS packet again, and disable the output of Closed Caption and Teletext data, or output data which is ignored by Closed Caption and Teletext decoders.

A sliced VBI device may support read/write and/or streaming (memory mapping and/or user pointer) I/O. The latter bears the possibility of synchronizing video and VBI data by using buffer timestamps.

4.9. Teletext Interface

This interface aims at devices receiving and demodulating Teletext data [ETS 300 706, ITU BT.653], evaluating the Teletext packages and storing formatted pages in cache memory. Such devices are usually implemented as microcontrollers with serial interface (I2C) and can be found on older TV cards, dedicated Teletext decoding cards and home-brew devices connected to the PC parallel port.

The Teletext API was designed by Martin Buck. It is defined in the kernel header file linux/videotext.h, the specification is available from http://home.pages.de/~videotext/. (Videotext is the name of the German public television Teletext service.) Conventional character device file names are /dev/vtx and /dev/vttuner, with device number 83, 0 and 83, 16 respectively. A similar interface exists for the Philips SAA5249 Teletext decoder [specification?] with character device file names /dev/tlkN, device number 102, N.

Eventually the Teletext API was integrated into the V4L API with character device file names /dev/vtx0 to /dev/vtx31, device major number 81, minor numbers 192 to 223. For reference the V4L Teletext API specification is reproduced here in full: "Teletext interfaces talk the existing VTX API." Teletext devices with major number 83 and 102 will be removed in Linux 2.6.

There are no plans to replace the Teletext API or to integrate it into V4L2. Please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list when the need arises.

4.10. Radio Interface

4.10.1. Querying Capabilities

Devices supporting the radio interface set the V4L2_CAP_RADIO and V4L2_CAP_TUNER flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. Other combinations of capability flags are reserved for future extensions.

4.10.2. Supplemental Functions

Radio devices can support controls, and must support the tuner ioctls.

They do not support the video input or output, audio input or output, video standard, cropping and scaling, compression and streaming parameter, or overlay ioctls. All other ioctls and I/O methods are reserved for future extensions.

4.10.3. Programming

Radio devices may have a couple of audio controls (as discussed in Section 1.8) such as a volume control, possibly custom controls. Further all radio devices have one tuner (these are discussed in Section 1.6) with index number zero to select the radio frequency and to determine if a monaural or FM stereo program is received. Drivers switch automatically between AM and FM depending on the selected frequency. The VIDIOC_G_TUNER ioctl reports the supported frequency range.

4.11. RDS Interface

The Radio Data System transmits supplementary information in binary format, for example the station name or travel information, on an inaudible audio subcarrier of a radio program. This interface aims at devices capable of receiving and decoding RDS information.

The V4L API defines its RDS API as follows.

Data is packed in groups of three, as follows:

First Octet Least Significant Byte of RDS Block

Second Octet Most Significant Byte of RDS Block

Third Octet Bit 7: Error bit. Indicates that an uncorrectable error occurred during reception of this block. Bit 6: Corrected bit. Indicates that an error was corrected for this data block. Bits 5-3: Received Offset. Indicates the offset received by the sync system. Bits 2-0: Offset Name. Indicates the offset applied to this data.

It was argued the RDS API should be extended before integration into V4L2; no new API has been devised yet. Please write to the Video4Linux mailing list for discussion: https://listman.redhat.com/mailman/listinfo/video4linux-list. Meanwhile no V4L2 driver should set the V4L2_CAP_RDS_CAPTURE capability flag.

I. Function Reference

V4L2 close() -- Close a V4L2 device

V4L2 ioctl() -- Program a V4L2 device

ioctl VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities

ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers

ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command

ioctl VIDIOC_ENUMAUDIO -- Enumerate audio inputs

ioctl VIDIOC_ENUMAUDOUT -- Enumerate audio outputs

ioctl VIDIOC_ENUM_FMT -- Enumerate image formats

ioctl VIDIOC_ENUM_FRAMESIZES -- Enumerate frame sizes

ioctl VIDIOC_ENUM_FRAMEINTERVALS -- Enumerate frame intervals

ioctl VIDIOC_ENUMINPUT -- Enumerate video inputs

ioctl VIDIOC_ENUMOUTPUT -- Enumerate video outputs

ioctl VIDIOC_ENUMSTD -- Enumerate supported video standards

ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO -- Query or select the current audio input and its attributes

ioctl VIDIOC_G_AUDOUT, VIDIOC_S_AUDOUT -- Query or select the current audio output

ioctl VIDIOC_G_CHIP_IDENT -- Identify the chips on a TV card

ioctl VIDIOC_G_CROP, VIDIOC_S_CROP -- Get or set the current cropping rectangle

ioctl VIDIOC_G_CTRL, VIDIOC_S_CTRL -- Get or set the value of a control

ioctl VIDIOC_G_ENC_INDEX -- Get meta data about a compressed video stream

ioctl VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS, VIDIOC_TRY_EXT_CTRLS -- Get or set the value of several controls, try control values

ioctl VIDIOC_G_FBUF, VIDIOC_S_FBUF -- Get or set frame buffer overlay parameters

ioctl VIDIOC_G_FMT, VIDIOC_S_FMT, VIDIOC_TRY_FMT -- Get or set the data format, try a format

ioctl VIDIOC_G_FREQUENCY, VIDIOC_S_FREQUENCY -- Get or set tuner or modulator radio frequency

ioctl VIDIOC_G_INPUT, VIDIOC_S_INPUT -- Query or select the current video input

ioctl VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP --

ioctl VIDIOC_G_MODULATOR, VIDIOC_S_MODULATOR -- Get or set modulator attributes

ioctl VIDIOC_G_OUTPUT, VIDIOC_S_OUTPUT -- Query or select the current video output

ioctl VIDIOC_G_PARM, VIDIOC_S_PARM -- Get or set streaming parameters

ioctl VIDIOC_G_PRIORITY, VIDIOC_S_PRIORITY -- Query or request the access priority associated with a file descriptor

ioctl VIDIOC_G_SLICED_VBI_CAP -- Query sliced VBI capabilities

ioctl VIDIOC_G_STD, VIDIOC_S_STD -- Query or select the video standard of the current input

ioctl VIDIOC_G_TUNER, VIDIOC_S_TUNER -- Get or set tuner attributes

ioctl VIDIOC_LOG_STATUS -- Log driver status information

ioctl VIDIOC_OVERLAY -- Start or stop video overlay

ioctl VIDIOC_QBUF, VIDIOC_DQBUF -- Exchange a buffer with the driver

ioctl VIDIOC_QUERYBUF -- Query the status of a buffer

ioctl VIDIOC_QUERYCAP -- Query device capabilities

ioctl VIDIOC_QUERYSTD -- Sense the video standard received by the current input

ioctl VIDIOC_REQBUFS -- Initiate Memory Mapping or User Pointer I/O

ioctl VIDIOC_STREAMON, VIDIOC_STREAMOFF -- Start or stop streaming I/O

V4L2 mmap() -- Map device memory into application address space

V4L2 munmap() -- Unmap device memory

V4L2 open() -- Open a V4L2 device

V4L2 poll() -- Wait for some event on a file descriptor

V4L2 select() -- Synchronous I/O multiplexing

V4L2 write() -- Write to a V4L2 device

V4L2 close()

Name
v4l2-close -- Close a V4L2 device

Synopsis

#include <unistd.h>

int close(int fd);

Arguments

fd

File descriptor returned by open().

Description

Closes the device. Any I/O in progress is terminated and resources associated with the file descriptor are freed. However data format parameters, current input or output, control values or other properties remain unchanged.

Return Value

The function returns 0 on success, -1 on failure and errno is set appropriately. Possible error codes:

EBADF

fd is not a valid open file descriptor.

V4L2 ioctl()

Name
v4l2-ioctl -- Program a V4L2 device

Synopsis

#include <sys/ioctl.h>

int ioctl(int fd, int request, void *argp);

Arguments

fd

File descriptor returned by open().

request

V4L2 ioctl request code as defined in the videodev.h header file, for example VIDIOC_QUERYCAP.

argp

Pointer to a function parameter, usually a structure.

Description

The ioctl() function is used to program V4L2 devices. The argument fd must be an open file descriptor. An ioctl request has encoded in it whether the argument is an input, output or read/write parameter, and the size of the argument argp in bytes. Macros and defines specifying V4L2 ioctl requests are located in the videodev.h header file. Applications should use their own copy, not include the version in the kernel sources on the system they compile on. All V4L2 ioctl requests, their respective function and parameters are specified in Reference I, Function Reference.

Return Value

On success the ioctl() function returns 0 and does not reset the errno variable. On failure -1 is returned; when the ioctl takes an output
or read/write parameter it remains unmodified, and the errno variable is set appropriately. See below for possible error codes. Generic errors like EBADF or EFAULT are not
listed in the sections discussing individual ioctl requests.

Note: ioctls may return undefined error codes. Since errors may have side effects such as a driver reset, applications should abort on unexpected errors.

EBADF

fd is not a valid open file descriptor.

EBUSY

The property cannot be changed right now. Typically this error code is returned when I/O is in progress or the driver supports multiple opens and another process locked the property.

EFAULT

argp references an inaccessible memory area.

ENOTTY

fd is not associated with a character special device.

EINVAL

The request or the data pointed to by argp is not valid. This is a very common error code, see the individual ioctl requests listed in Reference I, Function Reference for actual causes.

ENOMEM

Not enough physical or virtual memory was available to complete the request.

ERANGE

The application attempted to set a control with the VIDIOC_S_CTRL ioctl to a value which is out of bounds.
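Because an ioctl may be interrupted by a signal (EINTR), a retry wrapper is a common idiom in V4L2 applications. The sketch below is not part of the API; the name xioctl is our own convention.

```c
#include <sys/ioctl.h>
#include <errno.h>
#include <stddef.h>

/* Retry an ioctl that was interrupted by a signal. The request codes
 * themselves come from the application's own copy of videodev.h, as
 * the spec recommends. */
int xioctl(int fd, unsigned long request, void *argp)
{
        int r;

        do
                r = ioctl(fd, request, argp);
        while (r == -1 && errno == EINTR);

        return r;
}
```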

ioctl VIDIOC_CROPCAP

Name
VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities

Synopsis

int ioctl(int fd, int request, struct v4l2_cropcap*argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_CROPCAP

argp

Description

Applications use this function to query the cropping limits, the pixel aspect of images and to calculate scale factors. They set the type field of a v4l2_cropcap structure to the respective buffer (stream) type and call the VIDIOC_CROPCAP ioctl
with a pointer to this structure. Drivers fill the rest of the structure. The results are constant except when switching the video standard. Remember this switch can occur implicitly when switching the video input or output.

Table 1. struct v4l2_cropcap

enum v4l2_buf_type  type
    Type of the data stream, set by the application. Only these types are valid here: V4L2_BUF_TYPE_VIDEO_CAPTURE, V4L2_BUF_TYPE_VIDEO_OUTPUT, V4L2_BUF_TYPE_VIDEO_OVERLAY, and custom (driver defined) types with code V4L2_BUF_TYPE_PRIVATE and higher.
struct v4l2_rect  bounds
    Defines the window within which capturing or output is possible; this may exclude for example the horizontal and vertical blanking areas. The cropping rectangle cannot exceed these limits. Width and height are defined in pixels; the driver writer is free to choose origin and units of the coordinate system in the analog domain.
struct v4l2_rect  defrect
    Default cropping rectangle; it shall cover the "whole picture". Assuming pixel aspect 1/1 this could be for example a 640 × 480 rectangle for NTSC, a 768 × 576 rectangle for PAL and SECAM centered over the active picture area. The same coordinate system as for bounds is used.
struct v4l2_fract  pixelaspect
    This is the pixel aspect (y / x) when no scaling is applied, the ratio of the actual sampling frequency and the frequency required to get square pixels. When cropping coordinates refer to square pixels, the driver sets pixelaspect to 1/1. Other common values are 54/59 for PAL and SECAM, 11/10 for NTSC sampled according to [ITU BT.601].

Table 2. struct v4l2_rect

__s32  left
    Horizontal offset of the top, left corner of the rectangle, in pixels.
__s32  top
    Vertical offset of the top, left corner of the rectangle, in pixels.
__s32  width
    Width of the rectangle, in pixels.
__s32  height
    Height of the rectangle, in pixels. Width and height cannot be negative; the fields are signed for hysterical reasons.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The struct v4l2_cropcap type is invalid or the ioctl is not supported. This is not permitted for video capture, output and overlay devices, which must support VIDIOC_CROPCAP.

ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER

Name
VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers

Synopsis

int ioctl(int fd, int request, struct v4l2_register *argp);

int ioctl(int fd, int request, const struct v4l2_register*argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER

argp

Description

Experimental: This is an experimental interface and may change in the future.

For driver debugging purposes these ioctls allow test applications to access hardware registers directly. Regular applications should not use them.

Since writing or even reading registers can jeopardize the system security, its stability and damage the hardware, both ioctls require superuser privileges. Additionally the Linux kernel must be compiled with the CONFIG_VIDEO_ADV_DEBUG option to
enable these ioctls.

To write a register applications must initialize all fields of a struct v4l2_register and call VIDIOC_DBG_S_REGISTER with
a pointer to this structure. The match_type and match_chip fields select a chip on the TV card, the reg field specifies a register number and the val field
the value to be written into the register.

To read a register applications must initialize the match_type, match_chip and reg fields, and call VIDIOC_DBG_G_REGISTER with a pointer
to this structure. On success the driver stores the register value in the val field. On failure the structure remains unchanged.

When match_type is V4L2_CHIP_MATCH_HOST, match_chip selects the nth non-I2C chip on the TV card. Drivers may also interpret match_chip as
a random ID, but we recommend against that. The number zero always selects the host chip, e. g. the chip connected to the PCI bus. You can find out which chips are present with the VIDIOC_G_CHIP_IDENT ioctl.

When match_type is V4L2_CHIP_MATCH_I2C_DRIVER, match_chip contains a driver ID as defined in the linux/i2c-id.h header file. For instance I2C_DRIVERID_SAA7127 will
match any chip supported by the saa7127 driver, regardless of its I2C bus address. When multiple chips supported by the same driver are present, the effect of these ioctls is undefined. Again with the VIDIOC_G_CHIP_IDENT ioctl
you can find out which I2C chips are present.

When match_type is V4L2_CHIP_MATCH_I2C_ADDR, match_chip selects a chip by its 7 bit I2C bus address.

Success not guaranteed: Due to a flaw in the Linux I2C bus driver these ioctls may return successfully without actually reading or writing a register. To catch the most likely failure we recommend a VIDIOC_G_CHIP_IDENT call confirming the presence of the selected I2C chip.

These ioctls are optional, not all drivers may support them. However when a driver supports these ioctls it must also support VIDIOC_G_CHIP_IDENT.
Conversely it may support VIDIOC_G_CHIP_IDENT but not these ioctls.

VIDIOC_DBG_G_REGISTER and VIDIOC_DBG_S_REGISTER were introduced in Linux 2.6.21.

We recommend the v4l2-dbg utility over calling these ioctls directly. It is available from the LinuxTV v4l-dvb repository; see http://linuxtv.org/repo/ for access
instructions.

Table 1. struct v4l2_register

__u32  match_type
    See Table 2 for a list of possible types.
__u32  match_chip
    Match a chip by this number, interpreted according to the match_type field.
__u64  reg
    A register number.
__u64  val
    The value read from, or to be written into the register.

Table 2. Chip Match Types

V4L2_CHIP_MATCH_HOST  0
    Match the nth chip on the card, zero for the host chip. Does not match I2C chips.
V4L2_CHIP_MATCH_I2C_DRIVER  1
    Match an I2C chip by its driver ID from the linux/i2c-id.h header file.
V4L2_CHIP_MATCH_I2C_ADDR  2
    Match a chip by its 7 bit I2C bus address.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The driver does not support this ioctl, or the kernel was not compiled with the CONFIG_VIDEO_ADV_DEBUG option, or the match_type is invalid, or the selected chip or register does not exist.

EPERM

Insufficient permissions. Root privileges are required to execute these ioctls.

ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD

Name
VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command

Synopsis

int ioctl(int fd, int request, struct v4l2_encoder_cmd *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD

argp

Description

Experimental: This is an experimental interface and may change in the future.

These ioctls control an audio/video (usually MPEG-) encoder. VIDIOC_ENCODER_CMD sends a command to the encoder, VIDIOC_TRY_ENCODER_CMD can be used to try a command without actually executing it.

To send a command applications must initialize all fields of a struct v4l2_encoder_cmd and call VIDIOC_ENCODER_CMD or VIDIOC_TRY_ENCODER_CMD with
a pointer to this structure.

The cmd field must contain the command code. The flags field is currently only used by the STOP command and contains one bit: If the V4L2_ENC_CMD_STOP_AT_GOP_END flag
is set, encoding will continue until the end of the current Group Of Pictures, otherwise it will stop immediately.

A read() call sends a START command to the encoder if it has not been started yet. After a STOP command, read() calls will read the remaining data buffered by the driver. When the buffer is empty, read()
will return zero and the next read() call will restart the encoder.

A close() call sends an immediate STOP to the encoder, and all buffered data is discarded.

These ioctls are optional, not all drivers may support them. They were introduced in Linux 2.6.21.

Table 1. struct v4l2_encoder_cmd

__u32  cmd
    The encoder command, see Table 2.
__u32  flags
    Flags to go with the command, see Table 3. If no flags are defined for this command, drivers and applications must set this field to zero.
__u32  data[8]
    Reserved for future extensions. Drivers and applications must set the array to zero.

Table 2. Encoder Commands

V4L2_ENC_CMD_START  0
    Start the encoder. When the encoder is already running or paused, this command does nothing. No flags are defined for this command.
V4L2_ENC_CMD_STOP  1
    Stop the encoder. When the V4L2_ENC_CMD_STOP_AT_GOP_END flag is set, encoding will continue until the end of the current Group Of Pictures, otherwise encoding will stop immediately. When the encoder is already stopped, this command does nothing.
V4L2_ENC_CMD_PAUSE  2
    Pause the encoder. When the encoder has not been started yet, the driver will return an EPERM error code. When the encoder is already paused, this command does nothing. No flags are defined for this command.
V4L2_ENC_CMD_RESUME  3
    Resume encoding after a PAUSE command. When the encoder has not been started yet, the driver will return an EPERM error code. When the encoder is already running, this command does nothing. No flags are defined for this command.

Table 3. Encoder Command Flags

V4L2_ENC_CMD_STOP_AT_GOP_END  0x0001
    Stop encoding at the end of the current Group Of Pictures, rather than immediately.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The driver does not support this ioctl, or the cmd field is invalid.

EPERM

The application sent a PAUSE or RESUME command whenthe encoder was not running.

ioctl VIDIOC_ENUMAUDIO

Name
VIDIOC_ENUMAUDIO -- Enumerate audio inputs

Synopsis

int ioctl(int fd, int request, struct v4l2_audio *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUMAUDIO

argp

Description

To query the attributes of an audio input applications initialize the index field and zero out the reserved array of a struct v4l2_audio and
call the VIDIOC_ENUMAUDIO ioctl with a pointer to this structure. Drivers fill the rest of the structure or return an EINVAL error code when the index is out of bounds. To enumerate all audio inputs
applications shall begin at index zero, incrementing by one until the driver returns EINVAL.

See ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO(2) for a description of struct v4l2_audio.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The number of the audio input is out of bounds, or there are no audio inputs at all and this ioctl is not supported.

ioctl VIDIOC_ENUMAUDOUT

Name
VIDIOC_ENUMAUDOUT -- Enumerate audio outputs

Synopsis

int ioctl(int fd, int request, struct v4l2_audioout *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUMAUDOUT

argp

Description

To query the attributes of an audio output applications initialize the index field and zero out the reserved array of a struct v4l2_audioout and call
the VIDIOC_ENUMAUDOUT ioctl with a pointer to this structure. Drivers fill the rest of the structure or return an EINVAL error code when the index is out of bounds. To enumerate all audio outputs applications
shall begin at index zero, incrementing by one until the driver returns EINVAL.

Note: connectors on a TV card to loop back the received audio signal to a sound card are not audio outputs in this sense.

See ioctl VIDIOC_G_AUDOUT, VIDIOC_S_AUDOUT(2) for a description of struct v4l2_audioout.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The number of the audio output is out of bounds, or there are no audio outputs at all and this ioctl is not supported.

ioctl VIDIOC_ENUM_FMT

Name
VIDIOC_ENUM_FMT -- Enumerate image formats

Synopsis

int ioctl(int fd, int request, struct v4l2_fmtdesc*argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUM_FMT

argp

Description

To enumerate image formats applications initialize the type and index fields of struct v4l2_fmtdesc and
call the VIDIOC_ENUM_FMT ioctl with a pointer to this structure. Drivers fill the rest of the structure or return an EINVAL error code. All formats are enumerable by beginning at index zero and incrementing
by one until EINVAL is returned.

Table 1. struct v4l2_fmtdesc

__u32  index
    Number of the format in the enumeration, set by the application. This is in no way related to the pixelformat field.
enum v4l2_buf_type  type
    Type of the data stream, set by the application. Only these types are valid here: V4L2_BUF_TYPE_VIDEO_CAPTURE, V4L2_BUF_TYPE_VIDEO_OUTPUT, V4L2_BUF_TYPE_VIDEO_OVERLAY, and custom (driver defined) types with code V4L2_BUF_TYPE_PRIVATE and higher.
__u32  flags
    See Table 2.
__u8  description[32]
    Description of the format, a NUL-terminated ASCII string. This information is intended for the user, for example: "YUV 4:2:2".
__u32  pixelformat
    The image format identifier. This is a four character code as computed by the v4l2_fourcc() macro:

#define v4l2_fourcc(a,b,c,d) (((__u32)(a)<<0)|((__u32)(b)<<8)|((__u32)(c)<<16)|((__u32)(d)<<24))

    Several image formats are already defined by this specification in Chapter 2. Note these codes are not the same as those used in the Windows world.
__u32  reserved[4]
    Reserved for future extensions. Drivers must set the array to zero.

Table 2. Image Format Description Flags

V4L2_FMT_FLAG_COMPRESSED  0x0001
    This is a compressed format.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The struct v4l2_fmtdesc type is not supported or the index is out of bounds.

ioctl VIDIOC_ENUM_FRAMESIZES

Name
VIDIOC_ENUM_FRAMESIZES -- Enumerate frame sizes

Synopsis

int ioctl(int fd, int request, struct v4l2_frmsizeenum *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUM_FRAMESIZES

argp

Pointer to a struct v4l2_frmsizeenum that contains an index and pixel format and receives a frame width and height.

Description

Experimental: This is an experimental interface and may change in the future.

This ioctl allows applications to enumerate all frame sizes (i. e. width and height in pixels) that the device supports for the given pixel format.

The supported pixel formats can be obtained by using the VIDIOC_ENUM_FMT function.

The return value and the content of the v4l2_frmsizeenum.type field depend on the type of frame sizes the device supports. Here are the semantics of the function for the different cases:

Discrete: The function returns success if the given index value (zero-based) is valid. The application should increase the index by one for each call until EINVAL is returned. The v4l2_frmsizeenum.type field is set to V4L2_FRMSIZE_TYPE_DISCRETE by the driver. Of the union only the discrete member is valid.

Step-wise: The function returns success if the given index value is zero and EINVAL for any other index value. The v4l2_frmsizeenum.type field is set to V4L2_FRMSIZE_TYPE_STEPWISE by the driver. Of the union only the stepwise member is valid.

Continuous: This is a special case of the step-wise type above. The function returns success if the given index value is zero and EINVAL for any other index value. The v4l2_frmsizeenum.type field is set to V4L2_FRMSIZE_TYPE_CONTINUOUS by the driver. Of the union only the stepwise member is valid and the step_width and step_height values are set to 1.

When the application calls the function with index zero, it must check the type field to determine the type of frame size enumeration the device supports. Only for the V4L2_FRMSIZE_TYPE_DISCRETE type
does it make sense to increase the index value to receive more frame sizes.

Note that the order in which the frame sizes are returned has no special meaning. In particular it does not say anything about potential default format sizes.

Applications can assume that the enumeration data does not change without any interaction from the application itself. This means that the enumeration data is consistent if the application does not perform any other ioctl calls while it runs the frame size enumeration.

Structs

In the structs below, IN denotes a value that has to be filled in by the application, OUT denotes
values that the driver fills in. The application should zero out all members except for the IN fields.

Table 1. struct v4l2_frmsize_discrete

__u32  width
    Width of the frame [pixel].
__u32  height
    Height of the frame [pixel].

Table 2. struct v4l2_frmsize_stepwise

__u32  min_width
    Minimum frame width [pixel].
__u32  max_width
    Maximum frame width [pixel].
__u32  step_width
    Frame width step size [pixel].
__u32  min_height
    Minimum frame height [pixel].
__u32  max_height
    Maximum frame height [pixel].
__u32  step_height
    Frame height step size [pixel].

Table 3. struct v4l2_frmsizeenum

__u32  index
    IN: Index of the given frame size in the enumeration.
__u32  pixel_format
    IN: Pixel format for which the frame sizes are enumerated.
__u32  type
    OUT: Frame size type the device supports.
union
    OUT: Frame size with the given index.
    struct v4l2_frmsize_discrete  discrete
    struct v4l2_frmsize_stepwise  stepwise
__u32  reserved[2]
    Reserved space for future use.

Enums

Table 4. enum v4l2_frmsizetypes

V4L2_FRMSIZE_TYPE_DISCRETE  1
    Discrete frame size.
V4L2_FRMSIZE_TYPE_CONTINUOUS  2
    Continuous frame size.
V4L2_FRMSIZE_TYPE_STEPWISE  3
    Step-wise defined frame size.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

See the description section above for a list of return values that errno can have.

ioctl VIDIOC_ENUM_FRAMEINTERVALS

Name
VIDIOC_ENUM_FRAMEINTERVALS -- Enumerate frame intervals

Synopsis

int ioctl(int fd, int request, struct v4l2_frmivalenum *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUM_FRAMEINTERVALS

argp

Pointer to a struct v4l2_frmivalenum structure that contains a pixel format and size and receives a frame interval.

Description

This ioctl allows applications to enumerate all frame intervals that the device supports for the given pixel format and frame size.

The supported pixel formats and frame sizes can be obtained by using the VIDIOC_ENUM_FMT and VIDIOC_ENUM_FRAMESIZES functions.

The return value and the content of the v4l2_frmivalenum.type field depend on the type of frame intervals the device supports. Here are the semantics of the function for the different cases:

Discrete: The function returns success if the given index value (zero-based) is valid. The application should increase the index by one for each call until EINVAL is returned. The v4l2_frmivalenum.type field is set to V4L2_FRMIVAL_TYPE_DISCRETE by the driver. Of the union only the discrete member is valid.

Step-wise: The function returns success if the given index value is zero and EINVAL for any other index value. The v4l2_frmivalenum.type field is set to V4L2_FRMIVAL_TYPE_STEPWISE by the driver. Of the union only the stepwise member is valid.

Continuous: This is a special case of the step-wise type above. The function returns success if the given index value is zero and EINVAL for any other index value. The v4l2_frmivalenum.type field is set to V4L2_FRMIVAL_TYPE_CONTINUOUS by the driver. Of the union only the stepwise member is valid and the step value is set to 1.

When the application calls the function with index zero, it must check the type field to determine the type of frame interval enumeration the device supports. Only for the V4L2_FRMIVAL_TYPE_DISCRETE type
does it make sense to increase the index value to receive more frame intervals.

Note that the order in which the frame intervals are returned has no special meaning. In particular it does not say anything about potential default frame intervals.

Applications can assume that the enumeration data does not change without any interaction from the application itself. This means that the enumeration data is consistent if the application does not perform any other ioctl calls while it runs the frame interval enumeration.

Notes

Frame intervals and frame rates: The V4L2 API uses frame intervals instead of frame rates. Given the frame interval the frame rate can be computed as follows:
frame_rate = 1 / frame_interval

Structs

In the structs below, IN denotes a value that has to be filled in by the application, OUT denotes
values that the driver fills in. The application should zero out all members except for the IN fields.

Table 1. struct v4l2_frmival_stepwise

struct v4l2_fract  min
    Minimum frame interval [s].
struct v4l2_fract  max
    Maximum frame interval [s].
struct v4l2_fract  step
    Frame interval step size [s].

Table 2. struct v4l2_frmivalenum

__u32  index
    IN: Index of the given frame interval in the enumeration.
__u32  pixel_format
    IN: Pixel format for which the frame intervals are enumerated.
__u32  width
    IN: Frame width for which the frame intervals are enumerated.
__u32  height
    IN: Frame height for which the frame intervals are enumerated.
__u32  type
    OUT: Frame interval type the device supports.
union
    OUT: Frame interval with the given index.
    struct v4l2_fract  discrete
        Frame interval [s].
    struct v4l2_frmival_stepwise  stepwise
__u32  reserved[2]
    Reserved space for future use.

Enums

Table 3. enum v4l2_frmivaltypes

V4L2_FRMIVAL_TYPE_DISCRETE  1
    Discrete frame interval.
V4L2_FRMIVAL_TYPE_CONTINUOUS  2
    Continuous frame interval.
V4L2_FRMIVAL_TYPE_STEPWISE  3
    Step-wise defined frame interval.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

See the description section above for a list of return values that errno can have.

ioctl VIDIOC_ENUMINPUT

Name
VIDIOC_ENUMINPUT -- Enumerate video inputs

Synopsis

int ioctl(int fd, int request, struct v4l2_input*argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUMINPUT

argp

Description

To query the attributes of a video input applications initialize the index field of struct v4l2_input and
call the VIDIOC_ENUMINPUT ioctl with a pointer to this structure. Drivers fill the rest of the structure or return an EINVAL error code when the index is out of bounds. To enumerate all inputs applications
shall begin at index zero, incrementing by one until the driver returns EINVAL.

Table 1. struct v4l2_input

__u32  index
    Identifies the input, set by the application.
__u8  name[32]
    Name of the video input, a NUL-terminated ASCII string, for example: "Vin (Composite 2)". This information is intended for the user, preferably the connector label on the device itself.
__u32  type
    Type of the input, see Table 2.
__u32  audioset
    Drivers can enumerate up to 32 video and audio inputs. This field shows which audio inputs were selectable as audio source if this was the currently selected video input. It is a bit mask. The LSB corresponds to audio input 0, the MSB to input 31. Any number of bits can be set, or none.
    When the driver does not enumerate audio inputs no bits must be set. Applications shall not interpret this as lack of audio support. Some drivers automatically select audio sources and do not enumerate them since there is no choice anyway.
    For details on audio inputs and how to select the current input see Section 1.5.
__u32  tuner
    Capture devices can have zero or more tuners (RF demodulators). When the type is set to V4L2_INPUT_TYPE_TUNER this is an RF connector and this field identifies the tuner. It corresponds to struct v4l2_tuner field index. For details on tuners see Section 1.6.
v4l2_std_id  std
    Every video input supports one or more different video standards. This field is a set of all supported standards. For details on video standards and how to switch see Section 1.7.
__u32  status
    This field provides status information about the input. See Table 3 for flags. status is only valid when this is the current input.
__u32  reserved[4]
    Reserved for future extensions. Drivers must set the array to zero.

Table 2. Input Types

V4L2_INPUT_TYPE_TUNER  1
    This input uses a tuner (RF demodulator).
V4L2_INPUT_TYPE_CAMERA  2
    Analog baseband input, for example CVBS / Composite Video, S-Video, RGB.

Table 3. Input Status Flags

General
V4L2_IN_ST_NO_POWER  0x00000001
    Attached device is off.
V4L2_IN_ST_NO_SIGNAL  0x00000002
V4L2_IN_ST_NO_COLOR  0x00000004
    The hardware supports color decoding, but does not detect color modulation in the signal.
Analog Video
V4L2_IN_ST_NO_H_LOCK  0x00000100
    No horizontal sync lock.
V4L2_IN_ST_COLOR_KILL  0x00000200
    A color killer circuit automatically disables color decoding when it detects no color modulation. When this flag is set the color killer is enabled and has shut off color decoding.
Digital Video
V4L2_IN_ST_NO_SYNC  0x00010000
    No synchronization lock.
V4L2_IN_ST_NO_EQU  0x00020000
    No equalizer lock.
V4L2_IN_ST_NO_CARRIER  0x00040000
    Carrier recovery failed.
VCR and Set-Top Box
V4L2_IN_ST_MACROVISION  0x01000000
    Macrovision is an analog copy prevention system mangling the video signal to confuse video recorders. When this flag is set Macrovision has been detected.
V4L2_IN_ST_NO_ACCESS  0x02000000
    Conditional access denied.
V4L2_IN_ST_VTR  0x04000000
    VTR time constant. [?]

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The struct v4l2_input index is out of bounds.

ioctl VIDIOC_ENUMOUTPUT

Name
VIDIOC_ENUMOUTPUT -- Enumerate video outputs

Synopsis

int ioctl(int fd, int request, struct v4l2_output *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUMOUTPUT

argp

Description

To query the attributes of a video output applications initialize the index field of struct v4l2_output and
call the VIDIOC_ENUMOUTPUT ioctl with a pointer to this structure. Drivers fill the rest of the structure or return an EINVAL error code when the index is out of bounds. To enumerate all outputs applications
shall begin at index zero, incrementing by one until the driver returns EINVAL.

Table 1. struct v4l2_output

__u32  index
    Identifies the output, set by the application.
__u8  name[32]
    Name of the video output, a NUL-terminated ASCII string, for example: "Vout". This information is intended for the user, preferably the connector label on the device itself.
__u32  type
    Type of the output, see Table 2.
__u32  audioset
    Drivers can enumerate up to 32 video and audio outputs. This field shows which audio outputs were selectable as the current output if this was the currently selected video output. It is a bit mask. The LSB corresponds to audio output 0, the MSB to output 31. Any number of bits can be set, or none.
    When the driver does not enumerate audio outputs no bits must be set. Applications shall not interpret this as lack of audio support. Drivers may automatically select audio outputs without enumerating them.
    For details on audio outputs and how to select the current output see Section 1.5.
__u32  modulator
    Output devices can have zero or more RF modulators. When the type is V4L2_OUTPUT_TYPE_MODULATOR this is an RF connector and this field identifies the modulator. It corresponds to struct v4l2_modulator field index. For details on modulators see Section 1.6.
v4l2_std_id  std
    Every video output supports one or more different video standards. This field is a set of all supported standards. For details on video standards and how to switch see Section 1.7.
__u32  reserved[4]
    Reserved for future extensions. Drivers must set the array to zero.

Table 2. Output Type

V4L2_OUTPUT_TYPE_MODULATOR  1
    This output is an analog TV modulator.
V4L2_OUTPUT_TYPE_ANALOG  2
    Analog baseband output, for example Composite / CVBS, S-Video, RGB.
V4L2_OUTPUT_TYPE_ANALOGVGAOVERLAY  3
    [?]

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The struct v4l2_output index is out of bounds.

ioctl VIDIOC_ENUMSTD

Name
VIDIOC_ENUMSTD -- Enumerate supported video standards

Synopsis

int ioctl(int fd, int request, struct v4l2_standard *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUMSTD

argp

Description

To query the attributes of a video standard, especially a custom (driver defined) one, applications initialize the index field of struct v4l2_standard and
call the VIDIOC_ENUMSTD ioctl with a pointer to this structure. Drivers fill the rest of the structure or return an EINVAL error code when the index is out of bounds. To enumerate all standards applications
shall begin at index zero, incrementing by one until the driver returns EINVAL. Drivers may enumerate a different set of standards after switching the video input or output.[23]

Table 1. struct v4l2_standard

__u32  index
    Number of the video standard, set by the application.
v4l2_std_id  id
    The bits in this field identify the standard as one of the common standards listed in Table 3, or if bits 32 to 63 are set as custom standards. Multiple bits can be set if the hardware does not distinguish between these standards, however separate indices do not indicate the opposite. The id must be unique. No other enumerated v4l2_standard structure, for this input or output anyway, can contain the same set of bits.
__u8  name[24]
    Name of the standard, a NUL-terminated ASCII string, for example: "PAL-B/G", "NTSC Japan". This information is intended for the user.
struct v4l2_fract  frameperiod
    The frame period (not field period) is numerator / denominator. For example M/NTSC has a frame period of 1001 / 30000 seconds.
__u32  framelines
    Total lines per frame including blanking, e. g. 625 for B/PAL.
__u32  reserved[4]
    Reserved for future extensions. Drivers must set the array to zero.

Table 2. struct v4l2_fract

__u32  numerator
__u32  denominator

Table 3. typedef v4l2_std_id

__u64  v4l2_std_id
    This type is a set, each bit representing another video standard as listed below and in Table 4. The 32 most significant bits are reserved for custom (driver defined) video standards.

#define V4L2_STD_PAL_B          ((v4l2_std_id)0x00000001)
#define V4L2_STD_PAL_B1         ((v4l2_std_id)0x00000002)
#define V4L2_STD_PAL_G          ((v4l2_std_id)0x00000004)
#define V4L2_STD_PAL_H          ((v4l2_std_id)0x00000008)
#define V4L2_STD_PAL_I          ((v4l2_std_id)0x00000010)
#define V4L2_STD_PAL_D          ((v4l2_std_id)0x00000020)
#define V4L2_STD_PAL_D1         ((v4l2_std_id)0x00000040)
#define V4L2_STD_PAL_K          ((v4l2_std_id)0x00000080)

#define V4L2_STD_PAL_M          ((v4l2_std_id)0x00000100)
#define V4L2_STD_PAL_N          ((v4l2_std_id)0x00000200)
#define V4L2_STD_PAL_Nc         ((v4l2_std_id)0x00000400)
#define V4L2_STD_PAL_60         ((v4l2_std_id)0x00000800)

V4L2_STD_PAL_60 is a hybrid standard with 525 lines, 60 Hz refresh rate, and PAL color modulation with a 4.43 MHz color subcarrier. Some PAL video recorders can play back NTSC tapes in this mode for display on a 50/60 Hz agnostic PAL
TV.

#define V4L2_STD_NTSC_M         ((v4l2_std_id)0x00001000)
#define V4L2_STD_NTSC_M_JP      ((v4l2_std_id)0x00002000)
#define V4L2_STD_NTSC_443       ((v4l2_std_id)0x00004000)

V4L2_STD_NTSC_443 is a hybrid standard with 525 lines, 60 Hz refresh rate, and NTSC color modulation with a 4.43 MHz color subcarrier.

#define V4L2_STD_NTSC_M_KR      ((v4l2_std_id)0x00008000)

#define V4L2_STD_SECAM_B        ((v4l2_std_id)0x00010000)
#define V4L2_STD_SECAM_D        ((v4l2_std_id)0x00020000)
#define V4L2_STD_SECAM_G        ((v4l2_std_id)0x00040000)
#define V4L2_STD_SECAM_H        ((v4l2_std_id)0x00080000)
#define V4L2_STD_SECAM_K        ((v4l2_std_id)0x00100000)
#define V4L2_STD_SECAM_K1       ((v4l2_std_id)0x00200000)
#define V4L2_STD_SECAM_L        ((v4l2_std_id)0x00400000)
#define V4L2_STD_SECAM_LC       ((v4l2_std_id)0x00800000)

/* ATSC/HDTV */
#define V4L2_STD_ATSC_8_VSB     ((v4l2_std_id)0x01000000)
#define V4L2_STD_ATSC_16_VSB    ((v4l2_std_id)0x02000000)

V4L2_STD_ATSC_8_VSB and V4L2_STD_ATSC_16_VSB are U.S. terrestrial digital TV standards. Presently the V4L2 API does not support digital TV. See also the Linux DVB API at http://linuxtv.org.

#define V4L2_STD_PAL_BG         (V4L2_STD_PAL_B         |\
                                 V4L2_STD_PAL_B1        |\
                                 V4L2_STD_PAL_G)
#define V4L2_STD_B              (V4L2_STD_PAL_B         |\
                                 V4L2_STD_PAL_B1        |\
                                 V4L2_STD_SECAM_B)
#define V4L2_STD_GH             (V4L2_STD_PAL_G         |\
                                 V4L2_STD_PAL_H         |\
                                 V4L2_STD_SECAM_G       |\
                                 V4L2_STD_SECAM_H)
#define V4L2_STD_PAL_DK         (V4L2_STD_PAL_D         |\
                                 V4L2_STD_PAL_D1        |\
                                 V4L2_STD_PAL_K)
#define V4L2_STD_PAL            (V4L2_STD_PAL_BG        |\
                                 V4L2_STD_PAL_DK        |\
                                 V4L2_STD_PAL_H         |\
                                 V4L2_STD_PAL_I)
#define V4L2_STD_NTSC           (V4L2_STD_NTSC_M        |\
                                 V4L2_STD_NTSC_M_JP     |\
                                 V4L2_STD_NTSC_M_KR)
#define V4L2_STD_MN             (V4L2_STD_PAL_M         |\
                                 V4L2_STD_PAL_N         |\
                                 V4L2_STD_PAL_Nc        |\
                                 V4L2_STD_NTSC)
#define V4L2_STD_SECAM_DK       (V4L2_STD_SECAM_D       |\
                                 V4L2_STD_SECAM_K       |\
                                 V4L2_STD_SECAM_K1)
#define V4L2_STD_DK             (V4L2_STD_PAL_DK        |\
                                 V4L2_STD_SECAM_DK)

#define V4L2_STD_SECAM          (V4L2_STD_SECAM_B       |\
                                 V4L2_STD_SECAM_G       |\
                                 V4L2_STD_SECAM_H       |\
                                 V4L2_STD_SECAM_DK      |\
                                 V4L2_STD_SECAM_L       |\
                                 V4L2_STD_SECAM_LC)

#define V4L2_STD_525_60         (V4L2_STD_PAL_M         |\
                                 V4L2_STD_PAL_60        |\
                                 V4L2_STD_NTSC          |\
                                 V4L2_STD_NTSC_443)
#define V4L2_STD_625_50         (V4L2_STD_PAL           |\
                                 V4L2_STD_PAL_N         |\
                                 V4L2_STD_PAL_Nc        |\
                                 V4L2_STD_SECAM)

#define V4L2_STD_UNKNOWN        0
#define V4L2_STD_ALL            (V4L2_STD_525_60        |\
                                 V4L2_STD_625_50)

Table 4. Video Standards (based on [ITU BT.470])

Characteristics, by system (columns of the original table):

Frame lines:
    M/NTSC [a], M/PAL: 525; all other systems: 625

Frame period (s):
    525-line systems: 1001/30000; 625-line systems: 1/25

Chrominance sub-carrier frequency (Hz):
    M/NTSC:                          3579545 ± 10
    M/PAL:                           3579611.49 ± 10
    N/PAL [b]:                       4433618.75 ± 5 (3582056.25 ± 5)
    B, B1, G/PAL; D, D1, K/PAL;
    H/PAL:                           4433618.75 ± 5
    I/PAL:                           4433618.75 ± 1
    SECAM systems:                   fOR = 4406250 ± 2000,
                                     fOB = 4250000 ± 2000

Nominal radio-frequency channel bandwidth (MHz):
    M/NTSC, M/PAL, N/PAL:            6
    B/PAL: 7; B1, G/PAL:             8
    all other systems:               8

Sound carrier relative to vision carrier (MHz):
    M/NTSC, M/PAL, N/PAL:            + 4.5
    B, B1, G/PAL:                    + 5.5 ± 0.001 [c] [d] [e] [f]
    D, D1, K/PAL:                    + 6.5 ± 0.001
    H/PAL:                           + 5.5
    I/PAL:                           + 5.9996 ± 0.0005
    B, G/SECAM:                      + 5.5 ± 0.001
    D, K/SECAM:                      + 6.5 ± 0.001
    K1/SECAM:                        + 6.5
    L/SECAM:                         + 6.5 [g]

Notes:

a. Japan uses a standard similar to M/NTSC (V4L2_STD_NTSC_M_JP).

b. The values in brackets apply to the combination N/PAL a.k.a. Nc used in
   Argentina (V4L2_STD_PAL_Nc).

c. In the Federal Republic of Germany, Austria, Italy, the Netherlands,
   Slovakia and Switzerland a system of two sound carriers is used, the
   frequency of the second carrier being 242.1875 kHz above the frequency
   of the first sound carrier. For stereophonic sound transmissions a
   similar system is used in Australia.

d. New Zealand uses a sound carrier displaced 5.4996 ± 0.0005 MHz from the
   vision carrier.

e. In Denmark, Finland, New Zealand, Sweden and Spain a system of two sound
   carriers is used. In Iceland, Norway and Poland the same system is being
   introduced. The second carrier is 5.85 MHz above the vision carrier and
   is DQPSK modulated with 728 kbit/s sound and data multiplex. (NICAM
   system)

f. In the United Kingdom, a system of two sound carriers is used. The
   second sound carrier is 6.552 MHz above the vision carrier and is DQPSK
   modulated with a 728 kbit/s sound and data multiplex able to carry two
   sound channels. (NICAM system)

g. In France, a digital carrier 5.85 MHz away from the vision carrier may
   be used in addition to the main sound carrier. It is modulated in
   differentially encoded QPSK with a 728 kbit/s sound and data multiplexer
   capable of carrying two sound channels. (NICAM system)

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The struct v4l2_standard index is out of bounds.

ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO

Name
VIDIOC_G_AUDIO, VIDIOC_S_AUDIO -- Query or select the current audio input and its attributes

Synopsis

int ioctl(int fd, int request, struct v4l2_audio *argp);

int ioctl(int fd, int request, const struct v4l2_audio *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_G_AUDIO, VIDIOC_S_AUDIO

argp

Description