Media controller pipeline for image acquisition on SAM MPU products
Introduction
Starting with kernel version linux-5.15 and mainline 6.2, the Linux software for MPU products requires the media controller paradigm for user-space configuration.
This page describes how to use and understand the media controller pipeline on SAM MPU products.
To understand the media controller, we take the SAMA7G5 as an example. The SAMA7G5 supports both serial and parallel camera image acquisition.
Note: You can find out more about the SAMA7G5 image acquisition pipeline on this page.
Note: In this page we use a setup with a Sony IMX219 sensor connected to the board. It is only an example; if you have a different sensor, the pipeline will be slightly different.
Hardware blocks
Please refer to the IMAGE SUBSYSTEM chapters of the product datasheet, where each hardware block is described.
Software pipeline
- The software pipeline is formed out of specific pipeline _entities_.
- Entities are software constructs that perform a single video task, for example acquisition, conversion, scaling, or outputting video.
- Each entity can have a variable number of _pads_. Pads are ways to send or receive data to or from an entity. Pads can be either _sink_ pads, or _source_ pads.
- A sink pad by definition is a pad where you can send data to (_sink_ data to).
- A source pad by definition is a pad that produces (it's a _source_ of) data.
- Pads can be linked together through a _link_.
- You can imagine links as a flow of data. It is then logical to assume that a link is always between a _source_ pad and a _sink_ pad.
Note: Entities may or may not correspond to real hardware blocks that process data. Some of them may be pure software implementations.
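The graph below can also be generated directly on the target: media-ctl is able to print the topology in DOT format, which can then be rendered with Graphviz on a host PC. A minimal sketch, assuming the media device is /dev/media0 and Graphviz is installed on the host:
media-ctl -d /dev/media0 --print-dot > graph.dot
dot -Tpng graph.dot > graph.png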
SAMA7G5 media graph
IMX219 media graph (MIPI CSI2)
<img src="%PUBURLPATH%/%WEB%/%TOPIC%/graph.png" alt="graph.png" width="120" height="500" />
In the graph above, you can see the entities, pads, and links for the SAMA7G5 media pipeline.
This information can be retrieved using the command-line tool media-ctl from the v4l-utils package.
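The topology dump below is typically obtained with a command of the following form (assuming the media device is registered as /dev/media0):
# media-ctl -d /dev/media0 -p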
Media controller API version 5.15.0
Media device information
------------------------
driver atmel_isc_commo
model microchip,sama7g5-isc
serial
bus info platform:microchip-sama7g5-xisc
hw revision 0x220
driver version 5.15.0
Device topology
- entity 1: atmel_isc_scaler (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev0
pad0: Sink
[fmt:SBGGR8_1X8/3264x2464 field:none colorspace:srgb
crop.bounds:(0,0)/3264x2464
crop:(0,0)/3264x2464]
<- "csi2dc":1 [ENABLED,IMMUTABLE]
pad1: Source
[fmt:SBGGR8_1X8/3264x2464 field:none colorspace:srgb]
-> "atmel_isc_common":0 [ENABLED,IMMUTABLE]
- entity 4: csi2dc (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev1
pad0: Sink
[fmt:SRGGB8_1X8/640x480 field:none colorspace:srgb]
<- "dw-csi.0":1 [ENABLED]
pad1: Source
[fmt:SRGGB8_1X8/640x480 field:none colorspace:srgb]
-> "atmel_isc_scaler":0 [ENABLED,IMMUTABLE]
- entity 7: dw-csi.0 (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev2
pad0: Sink
[fmt:SBGGR8_1X8/0x0]
<- "imx219 1-0010":0 [ENABLED]
pad1: Source
[fmt:SBGGR8_1X8/0x0]
-> "csi2dc":0 [ENABLED]
- entity 12: imx219 1-0010 (1 pad, 1 link)
type V4L2 subdev subtype Sensor flags 0
device node name /dev/v4l-subdev3
pad0: Source
[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:srgb xfer:srgb ycbcr:601 quantization:full-range
crop.bounds:(8,8)/3280x2464
crop:(8,8)/3280x2464]
-> "dw-csi.0":0 [ENABLED]
- entity 24: atmel_isc_common (1 pad, 1 link)
type Node subtype V4L flags 1
device node name /dev/video0
pad0: Sink
<- "atmel_isc_scaler":1 [ENABLED,IMMUTABLE]
- In this media-ctl output we can find all the entities in the pipeline, the pads and links between them, together with the currently configured format for each pad.
Note: It is important to notice that each entity has a type.
- We can see that one of them is of type Sensor: the imx219 entity. This is the Sony IMX219 sensor.
- We can see that one of them is of type V4L: the video4linux2 entity. This corresponds to the classic v4l2 video node /dev/video0.
Querying subdevice capabilities
- Entities can have a subdevice associated.
- A subdevice is a /dev/v4l-subdev node that can be queried for information regarding the subdevice capabilities.
- The media-ctl -p command shows, for each entity, whether a v4l2 subdevice is associated with it and which index each subdevice has.
- Let's query our subdevices for capabilities.
Enum mbus codes
- One important capability that each subdevice must have is a list of supported mbus (media bus) codes.
Note: The media bus code is a number that represents the format of the data supported by the subdevice.
- A media entity associated with a subdevice can have multiple pads, and each pad can support different media bus codes.
- For example, if an entity is a format converter, it is only logical that a different set of mbus codes is supported on the source pad compared to the sink pad.
Note: A list of mbus codes can be found here.
Query the sensor (subdevice 3):
ioctl: VIDIOC_SUBDEV_ENUM_MBUS_CODE (pad=0)
0x300f: MEDIA_BUS_FMT_SRGGB10_1X10
0x3014: MEDIA_BUS_FMT_SRGGB8_1X8
- We queried subdev3 to see the mbus codes supported by this pad (pad number 0).
- We see that there are two codes supported: 0x300f and 0x3014. These correspond to SRGGB10_1X10 and SRGGB8_1X8.
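For reference, a query of the following form produces the listing above; the subdevice node (/dev/v4l-subdev3 for the sensor in this setup) may differ on another board:
# v4l2-ctl -d /dev/v4l-subdev3 --list-subdev-mbus-codes 0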
Query the dw_csi entity (subdevice 2):
ioctl: VIDIOC_SUBDEV_ENUM_MBUS_CODE (pad=0)
0x3001: MEDIA_BUS_FMT_SBGGR8_1X8
0x3007: MEDIA_BUS_FMT_SBGGR10_1X10
0x300f: MEDIA_BUS_FMT_SRGGB10_1X10
0x3008: MEDIA_BUS_FMT_SBGGR12_1X12
0x3019: MEDIA_BUS_FMT_SBGGR14_1X14
0x301d: MEDIA_BUS_FMT_SBGGR16_1X16
0x1009: MEDIA_BUS_FMT_RGB666_1X18
0x1007: MEDIA_BUS_FMT_RGB565_2X8_BE
0x1008: MEDIA_BUS_FMT_RGB565_2X8_LE
0x1003: MEDIA_BUS_FMT_RGB555_2X8_PADHI_BE
0x1004: MEDIA_BUS_FMT_RGB555_2X8_PADHI_LE
0x1001: MEDIA_BUS_FMT_RGB444_2X8_PADHI_BE
0x1002: MEDIA_BUS_FMT_RGB444_2X8_PADHI_LE
0x100c: MEDIA_BUS_FMT_RGB888_2X12_LE
0x100b: MEDIA_BUS_FMT_RGB888_2X12_BE
0x100a: MEDIA_BUS_FMT_RGB888_1X24
0x2010: MEDIA_BUS_FMT_VYUY8_1X16
0x2019: MEDIA_BUS_FMT_VYUY10_2X10
0x2001: MEDIA_BUS_FMT_Y8_1X8
0x200a: MEDIA_BUS_FMT_Y10_1X10
# v4l2-ctl -d /dev/v4l-subdev2 --list-subdev-mbus-codes 1
ioctl: VIDIOC_SUBDEV_ENUM_MBUS_CODE (pad=1)
0x3001: MEDIA_BUS_FMT_SBGGR8_1X8
0x3007: MEDIA_BUS_FMT_SBGGR10_1X10
0x300f: MEDIA_BUS_FMT_SRGGB10_1X10
0x3008: MEDIA_BUS_FMT_SBGGR12_1X12
0x3019: MEDIA_BUS_FMT_SBGGR14_1X14
0x301d: MEDIA_BUS_FMT_SBGGR16_1X16
0x1009: MEDIA_BUS_FMT_RGB666_1X18
0x1007: MEDIA_BUS_FMT_RGB565_2X8_BE
0x1008: MEDIA_BUS_FMT_RGB565_2X8_LE
0x1003: MEDIA_BUS_FMT_RGB555_2X8_PADHI_BE
0x1004: MEDIA_BUS_FMT_RGB555_2X8_PADHI_LE
0x1001: MEDIA_BUS_FMT_RGB444_2X8_PADHI_BE
0x1002: MEDIA_BUS_FMT_RGB444_2X8_PADHI_LE
0x100c: MEDIA_BUS_FMT_RGB888_2X12_LE
0x100b: MEDIA_BUS_FMT_RGB888_2X12_BE
0x100a: MEDIA_BUS_FMT_RGB888_1X24
0x2010: MEDIA_BUS_FMT_VYUY8_1X16
0x2019: MEDIA_BUS_FMT_VYUY10_2X10
0x2001: MEDIA_BUS_FMT_Y8_1X8
0x200a: MEDIA_BUS_FMT_Y10_1X10
- We queried both pads of the subdev2 and obtained the possible mbus codes for this subdevice on each of its pads.
- It's good to see that the source pad of the subdevice3 has codes that are also found in the list of codes for the sink pad of the subdevice2.
- This means that most likely a common format can be found for the link between them.
Query the csi2dc entity (subdevice 1):
ioctl: VIDIOC_SUBDEV_ENUM_MBUS_CODE (pad=0)
0x3014: MEDIA_BUS_FMT_SRGGB8_1X8
0x3001: MEDIA_BUS_FMT_SBGGR8_1X8
0x3002: MEDIA_BUS_FMT_SGRBG8_1X8
0x3013: MEDIA_BUS_FMT_SGBRG8_1X8
0x300f: MEDIA_BUS_FMT_SRGGB10_1X10
0x3007: MEDIA_BUS_FMT_SBGGR10_1X10
0x300a: MEDIA_BUS_FMT_SGRBG10_1X10
0x300e: MEDIA_BUS_FMT_SGBRG10_1X10
0x2008: MEDIA_BUS_FMT_YUYV8_2X8
# v4l2-ctl -d /dev/v4l-subdev1 --list-subdev-mbus-codes 1
ioctl: VIDIOC_SUBDEV_ENUM_MBUS_CODE (pad=1)
0x3014: MEDIA_BUS_FMT_SRGGB8_1X8
0x3001: MEDIA_BUS_FMT_SBGGR8_1X8
0x3002: MEDIA_BUS_FMT_SGRBG8_1X8
0x3013: MEDIA_BUS_FMT_SGBRG8_1X8
0x300f: MEDIA_BUS_FMT_SRGGB10_1X10
0x3007: MEDIA_BUS_FMT_SBGGR10_1X10
0x300a: MEDIA_BUS_FMT_SGRBG10_1X10
0x300e: MEDIA_BUS_FMT_SGBRG10_1X10
0x2008: MEDIA_BUS_FMT_YUYV8_2X8
#
- We queried both pads of the subdev1 and obtained the possible mbus codes for this subdevice on each of its pads.
Query the atmel_isc_scaler (subdevice 0):
ioctl: VIDIOC_SUBDEV_ENUM_MBUS_CODE (pad=0)
0x3001: MEDIA_BUS_FMT_SBGGR8_1X8
0x3013: MEDIA_BUS_FMT_SGBRG8_1X8
0x3002: MEDIA_BUS_FMT_SGRBG8_1X8
0x3014: MEDIA_BUS_FMT_SRGGB8_1X8
0x3007: MEDIA_BUS_FMT_SBGGR10_1X10
0x300e: MEDIA_BUS_FMT_SGBRG10_1X10
0x300a: MEDIA_BUS_FMT_SGRBG10_1X10
0x300f: MEDIA_BUS_FMT_SRGGB10_1X10
0x2008: MEDIA_BUS_FMT_YUYV8_2X8
# v4l2-ctl -d /dev/v4l-subdev0 --list-subdev-mbus-codes 1
ioctl: VIDIOC_SUBDEV_ENUM_MBUS_CODE (pad=1)
0x3001: MEDIA_BUS_FMT_SBGGR8_1X8
0x3013: MEDIA_BUS_FMT_SGBRG8_1X8
0x3002: MEDIA_BUS_FMT_SGRBG8_1X8
0x3014: MEDIA_BUS_FMT_SRGGB8_1X8
0x3007: MEDIA_BUS_FMT_SBGGR10_1X10
0x300e: MEDIA_BUS_FMT_SGBRG10_1X10
0x300a: MEDIA_BUS_FMT_SGRBG10_1X10
0x300f: MEDIA_BUS_FMT_SRGGB10_1X10
0x2008: MEDIA_BUS_FMT_YUYV8_2X8
- We queried both pads of the subdev0 and obtained the possible mbus codes for this subdevice on each of its pads.
Enum frame sizes
- Another important capability that a subdevice can have is a list of supported frame sizes, or in other words, resolutions.
Note: Resolutions can be either fixed discrete sizes, or a continuous range in which a minimum and a maximum frame size are specified.
- A media entity associated with a subdevice can have multiple pads, and each pad can have different frame sizes supported.
- For example, if an entity is a scaler, it is only logical that a different set of frame sizes is supported on the source pad compared to the sink pad.
Query the sensor (subdevice 3):
ioctl: VIDIOC_SUBDEV_ENUM_FRAME_SIZE (pad=0)
Size Range: 3280x2464 - 3280x2464
Size Range: 1920x1080 - 1920x1080
Size Range: 1640x1232 - 1640x1232
Size Range: 640x480 - 640x480
- We queried the subdevice3 to display the frame sizes for the given pad and the given mbus code.
- Depending on the mbus code, a subdevice can have different possible frame sizes.
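Such a frame-size query takes a pad number and an mbus code as arguments. A sketch of the command behind the listing above, assuming the sensor is /dev/v4l-subdev3 and using the 0x300f code enumerated earlier:
# v4l2-ctl -d /dev/v4l-subdev3 --list-subdev-framesizes pad=0,code=0x300f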
If we query the sensor for an mbus code that is not supported:
ioctl: VIDIOC_SUBDEV_ENUM_FRAME_SIZE (pad=0)
#
- We queried the subdevice3 to display the frame sizes for the given pad and an unsupported mbus code.
- No frame size was returned.
Note: Some subdevices do not support querying for frame sizes. This means that the supported frame sizes are unknown. When configuring such a subdevice, one has to check whether the configuration with a given frame size was successful or not.
Query the subdevice2 for frame size:
ioctl: VIDIOC_SUBDEV_ENUM_FRAME_SIZE (pad=0)
Size Range: 16x16 - 4000x3000
#
- We queried the subdevice2 to display the frame sizes for the given pad.
- We obtained a range of frame sizes from 16x16 to 4000x3000. This means any frame size that fits is supported.
Get current subdevice format (per pad)
- It is important to distinguish between an mbus code and a format. A format _includes_ an mbus code, but a format also specifies a frame size, a colorspace, and other types of information.
- We can query the subdevice to return the current configured format per pad.
Query the sensor (subdevice3) for the current configured format:
ioctl: VIDIOC_SUBDEV_G_FMT (pad=0)
Width/Height : 3280/2464
Mediabus Code : 0x300f (MEDIA_BUS_FMT_SRGGB10_1X10)
Field : None
Colorspace : sRGB
Transfer Function : sRGB
YCbCr/HSV Encoding: ITU-R 601
Quantization : Full Range
#
- We notice the pad number, the mbus code, and the frame size that are currently configured on the subdevice.
Note: A summary of this format can be seen in the media-ctl -p output.
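For reference, the sensor format above can be read back with a command of this form (pad 0 of /dev/v4l-subdev3 in this setup):
# v4l2-ctl -d /dev/v4l-subdev3 --get-subdev-fmt 0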
Query the dw_csi (subdevice2) for the current configured format:
ioctl: VIDIOC_SUBDEV_G_FMT (pad=0)
Width/Height : 0/0
Mediabus Code : 0x3001 (MEDIA_BUS_FMT_SBGGR8_1X8)
Field : Any
Colorspace : Default
Transfer Function : Default (maps to Rec. 709)
YCbCr/HSV Encoding: Default (maps to ITU-R 601)
Quantization : Default (maps to Full Range)
# v4l2-ctl -d /dev/v4l-subdev2 --get-subdev-fmt 1
ioctl: VIDIOC_SUBDEV_G_FMT (pad=1)
Width/Height : 0/0
Mediabus Code : 0x3001 (MEDIA_BUS_FMT_SBGGR8_1X8)
Field : Any
Colorspace : Default
Transfer Function : Default (maps to Rec. 709)
YCbCr/HSV Encoding: Default (maps to ITU-R 601)
Quantization : Default (maps to Full Range)
#
- We notice that the frame size is not yet configured (0/0). It has to be set on this subdevice for _streaming_ to work.
Configuring subdevice format
- Once we know how to query the subdevices for their settings, we also need to know how to change them according to what we want to do.
To configure a subdevice's pad format, we need to use media-ctl:
media-ctl -d /dev/media0 --set-v4l2 '"dw-csi.0":0[fmt:SRGGB10_1X10/3280x2464]'
media-ctl -d /dev/media0 --set-v4l2 '"csi2dc":0[fmt:SRGGB10_1X10/3280x2464]'
media-ctl -d /dev/media0 --set-v4l2 '"atmel_isc_scaler":0[fmt:SRGGB10_1X10/3280x2464]'
The above sequence of commands configures the given entities with the specified format.
Note: The format string and specification are identical to the ones printed by media-ctl -p.
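Depending on the sensor defaults, the sensor source pad format may also need to be set explicitly. With the same media-ctl syntax, and using the imx219 entity name from the topology above, such a command would look like this (shown only as an illustration):
media-ctl -d /dev/media0 --set-v4l2 '"imx219 1-0010":0[fmt:SRGGB10_1X10/3280x2464]'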
We can observe the difference in media-ctl output:
Media controller API version 5.15.0
Media device information
------------------------
driver atmel_isc_commo
model microchip,sama7g5-isc
serial
bus info platform:microchip-sama7g5-xisc
hw revision 0x220
driver version 5.15.0
Device topology
- entity 1: atmel_isc_scaler (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev0
pad0: Sink
[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:srgb
crop.bounds:(0,0)/3264x2464
crop:(0,0)/3264x2464]
<- "csi2dc":1 [ENABLED,IMMUTABLE]
pad1: Source
[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:srgb]
-> "atmel_isc_common":0 [ENABLED,IMMUTABLE]
- entity 4: csi2dc (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev1
pad0: Sink
[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:srgb]
<- "dw-csi.0":1 [ENABLED]
pad1: Source
[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:srgb]
-> "atmel_isc_scaler":0 [ENABLED,IMMUTABLE]
- entity 7: dw-csi.0 (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev2
pad0: Sink
[fmt:SRGGB10_1X10/3280x2464]
<- "imx219 1-0010":0 [ENABLED]
pad1: Source
[fmt:SRGGB10_1X10/3280x2464]
-> "csi2dc":0 [ENABLED]
- entity 12: imx219 1-0010 (1 pad, 1 link)
type V4L2 subdev subtype Sensor flags 0
device node name /dev/v4l-subdev3
pad0: Source
[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:srgb xfer:srgb ycbcr:601 quantization:full-range
crop.bounds:(8,8)/3280x2464
crop:(8,8)/3280x2464]
-> "dw-csi.0":0 [ENABLED]
- entity 24: atmel_isc_common (1 pad, 1 link)
type Node subtype V4L flags 1
device node name /dev/video0
pad0: Sink
<- "atmel_isc_scaler":1 [ENABLED,IMMUTABLE]
Note: All links now have the frame size and mbus code configured.
- We can configure the pipeline with a different frame size and the same mbus code. Here is an example:
media-ctl -d /dev/media0 --set-v4l2 '"dw-csi.0":0[fmt:SRGGB10_1X10/1920x1080]'
media-ctl -d /dev/media0 --set-v4l2 '"csi2dc":0[fmt:SRGGB10_1X10/1920x1080]'
media-ctl -d /dev/media0 --set-v4l2 '"atmel_isc_scaler":0[fmt:SRGGB10_1X10/1920x1080]'
- And the corresponding media-ctl -p output:
Media device information
------------------------
driver atmel_isc_commo
model microchip,sama7g5-isc
serial
bus info platform:microchip-sama7g5-xisc
hw revision 0x220
driver version 5.15.0
Device topology
- entity 1: atmel_isc_scaler (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev0
pad0: Sink
[fmt:SRGGB10_1X10/1920x1080 field:none colorspace:srgb
crop.bounds:(0,0)/3264x2464
crop:(0,0)/3264x2464]
<- "csi2dc":1 [ENABLED,IMMUTABLE]
pad1: Source
[fmt:SRGGB10_1X10/1920x1080 field:none colorspace:srgb]
-> "atmel_isc_common":0 [ENABLED,IMMUTABLE]
- entity 4: csi2dc (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev1
pad0: Sink
[fmt:SRGGB10_1X10/1920x1080 field:none colorspace:srgb]
<- "dw-csi.0":1 [ENABLED]
pad1: Source
[fmt:SRGGB10_1X10/1920x1080 field:none colorspace:srgb]
-> "atmel_isc_scaler":0 [ENABLED,IMMUTABLE]
- entity 7: dw-csi.0 (2 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
device node name /dev/v4l-subdev2
pad0: Sink
[fmt:SRGGB10_1X10/1920x1080]
<- "imx219 1-0010":0 [ENABLED]
pad1: Source
[fmt:SRGGB10_1X10/1920x1080]
-> "csi2dc":0 [ENABLED]
- entity 12: imx219 1-0010 (1 pad, 1 link)
type V4L2 subdev subtype Sensor flags 0
device node name /dev/v4l-subdev3
pad0: Source
[fmt:SRGGB10_1X10/1920x1080 field:none colorspace:srgb xfer:srgb ycbcr:601 quantization:full-range
crop.bounds:(8,8)/3280x2464
crop:(688,700)/1920x1080]
-> "dw-csi.0":0 [ENABLED]
- entity 24: atmel_isc_common (1 pad, 1 link)
type Node subtype V4L flags 1
device node name /dev/video0
pad0: Sink
<- "atmel_isc_scaler":1 [ENABLED,IMMUTABLE]
Configuring top video driver
- In v4l2, the top video driver is the driver that registers the /dev/video node.
- This driver also registers the /dev/media node, which is the main device of the associated media controller.
- The video device and the media device cannot be intertwined; they perform different tasks.
- You can consider the media device and the video device as two devices that complement each other, providing full access to, and tweaking of, the video capture pipeline.
- The video node is used to configure the video device. This is usually the _top driver_ that performs the video capture and provides the way to access the video frames.
- The media node is the node that gives access to all entities, pads, and links; it informs user space about the topology and its status, and allows topology configuration and reconfiguration.
Note: The device tree is still needed to describe the hardware on the SoC/board; the device tree is a description of the hardware. The drivers themselves may or may not register media entities for their hardware devices, may register entities that have no specific hardware device associated (pure software entities), or a single piece of hardware could translate into multiple entities. It is the drivers' choice what to _expose_ or _not expose_ to user space. Thus, the media topology is a view from user space of the available video pipeline.
- The top video driver, /dev/video, can be queried for information and configured using the v4l2-ctl tool.
Querying the top video driver
- We can query the top video driver using v4l2-ctl.
- Let's see what formats the top video driver can output to user space:
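The enumeration below is typically obtained with:
# v4l2-ctl -d /dev/video0 --list-formats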
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'BG10' (10-bit Bayer BGBG/GRGR)
[1]: 'GB10' (10-bit Bayer GBGB/RGRG)
[2]: 'BA10' (10-bit Bayer GRGR/BGBG)
[3]: 'RG10' (10-bit Bayer RGRG/GBGB)
[4]: 'AR12' (16-bit ARGB 4-4-4-4)
[5]: 'AR15' (16-bit ARGB 1-5-5-5)
[6]: 'RGBP' (16-bit RGB 5-6-5)
[7]: 'AR24' (32-bit BGRA 8-8-8-8)
[8]: 'XR24' (32-bit BGRX 8-8-8-8)
[9]: 'YU12' (Planar YUV 4:2:0)
[10]: 'UYVY' (UYVY 4:2:2)
[11]: 'VYUY' (VYUY 4:2:2)
[12]: 'YUYV' (YUYV 4:2:2)
[13]: '422P' (Planar YUV 4:2:2)
[14]: 'GREY' (8-bit Greyscale)
[15]: 'Y10 ' (10-bit Greyscale)
[16]: 'Y16 ' (16-bit Greyscale)
#
- We can also query the video node for the possible resolutions:
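The extended listing below, which also includes the supported frame sizes, is typically obtained with:
# v4l2-ctl -d /dev/video0 --list-formats-ext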
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'BG10' (10-bit Bayer BGBG/GRGR)
Size: Continuous 16x16 - 3264x2464
[1]: 'GB10' (10-bit Bayer GBGB/RGRG)
Size: Continuous 16x16 - 3264x2464
[2]: 'BA10' (10-bit Bayer GRGR/BGBG)
Size: Continuous 16x16 - 3264x2464
[3]: 'RG10' (10-bit Bayer RGRG/GBGB)
Size: Continuous 16x16 - 3264x2464
[4]: 'AR12' (16-bit ARGB 4-4-4-4)
Size: Continuous 16x16 - 3264x2464
[5]: 'AR15' (16-bit ARGB 1-5-5-5)
Size: Continuous 16x16 - 3264x2464
[6]: 'RGBP' (16-bit RGB 5-6-5)
Size: Continuous 16x16 - 3264x2464
[7]: 'AR24' (32-bit BGRA 8-8-8-8)
Size: Continuous 16x16 - 3264x2464
[8]: 'XR24' (32-bit BGRX 8-8-8-8)
Size: Continuous 16x16 - 3264x2464
[9]: 'YU12' (Planar YUV 4:2:0)
Size: Continuous 16x16 - 3264x2464
[10]: 'UYVY' (UYVY 4:2:2)
Size: Continuous 16x16 - 3264x2464
[11]: 'VYUY' (VYUY 4:2:2)
Size: Continuous 16x16 - 3264x2464
[12]: 'YUYV' (YUYV 4:2:2)
Size: Continuous 16x16 - 3264x2464
[13]: '422P' (Planar YUV 4:2:2)
Size: Continuous 16x16 - 3264x2464
[14]: 'GREY' (8-bit Greyscale)
Size: Continuous 16x16 - 3264x2464
[15]: 'Y10 ' (10-bit Greyscale)
Size: Continuous 16x16 - 3264x2464
[16]: 'Y16 ' (16-bit Greyscale)
Size: Continuous 16x16 - 3264x2464
- We see that the video device can output any resolution from 16x16 up to 3264x2464.
- This means that any frame size that _fits_ is fine from the video node perspective.
- We will see below how to get the current frame size.
Note: The actual frame size is usually given by the sensor. The top video device cannot create frames; frames are created by sensors. However, entities in the pipeline can scale, adjust, crop, and compose the frame into a new frame with a different frame size.
- How do we see the complete format that is currently configured?
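The currently configured capture format can be read back with:
# v4l2-ctl -d /dev/video0 --get-fmt-video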
Format Video Capture:
Width/Height : 640/480
Pixel Format : 'BG10' (10-bit Bayer BGBG/GRGR)
Field : None
Bytes per Line : 1280
Size Image : 614400
Colorspace : sRGB
Transfer Function : Default (maps to sRGB)
YCbCr/HSV Encoding: Default (maps to ITU-R 601)
Quantization : Default (maps to Full Range)
Flags :
#
- This is the format that the video node will output to user space.
- We notice the pixel format (FOURCC code), the resolution, the bytes per line, and the total image size.
Configuring the top video driver
- How do we change the video format?
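A set command of the following form would typically be used before the read-back below; width, height, and the FOURCC pixel format are passed together (the values here match the output shown underneath, adjust them to your needs):
# v4l2-ctl -d /dev/video0 --set-fmt-video=width=3264,height=2464,pixelformat=RGBP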
# v4l2-ctl -d /dev/video0 --get-fmt-video
Format Video Capture:
Width/Height : 3264/2464
Pixel Format : 'RGBP' (16-bit RGB 5-6-5)
Field : None
Bytes per Line : 6528
Size Image : 16084992
Colorspace : sRGB
Transfer Function : Default (maps to sRGB)
YCbCr/HSV Encoding: Default (maps to ITU-R 601)
Quantization : Default (maps to Full Range)
Flags :
#
- The format was changed as requested.
Note: A list of possible pixelformat FOURCC values can be obtained from the output of the command v4l2-ctl --list-formats.
Capture a frame
- Once the pipeline is configured correctly, we can use different tools to capture a frame from the video device.
- If one of the elements in the pipeline is not configured correctly, we will get errors.
- The top video driver checks the pipeline configuration at the start-streaming step.
- If a subdevice is not configured to a compatible format, the capture will fail.
Using v4l2-ctl to capture a frame
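A raw frame can be captured with the streaming options of v4l2-ctl. A minimal sketch, assuming the RGBP 3264x2464 format configured in the previous section and the file name listed underneath:
# v4l2-ctl -d /dev/video0 --stream-mmap --stream-count=1 --stream-to=RGBP_3264_2464.raw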
# ls -la RGBP_3264_2464.raw
-rw-r--r-- 1 root root 16084992 Jan 1 00:00 RGBP_3264_2464.raw
#
- The raw frame has been captured with v4l2-ctl.
Converting raw frame to png
- v4l2-ctl only captures raw frames.
- We can use another tool like ffmpeg to convert the raw frame into a common image format.
- For example, we can convert from the 8-bit Bayer RGGB raw format, from the packed 16-bit YUYV format, or from RGB565 (see the sketch below).
Note: To see the pixel formats supported by ffmpeg, use the listing command included in the sketch below.
Warning: The resolution must match the one configured at the top video driver. ffmpeg cannot guess the resolution from a raw file.
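The sketches below show typical ffmpeg invocations for these conversions. They assume a 640x480 raw capture stored in a hypothetical file named frame.raw and an output file out.png; the pixel format names are ffmpeg's, not V4L2 FOURCCs, so adjust them to the format you actually captured:
# convert an 8-bit Bayer RGGB raw frame
ffmpeg -f rawvideo -pix_fmt bayer_rggb8 -s 640x480 -i frame.raw out.png
# convert a packed 16-bit YUYV frame
ffmpeg -f rawvideo -pix_fmt yuyv422 -s 640x480 -i frame.raw out.png
# convert an RGB565 frame (use rgb565be if the data is big-endian)
ffmpeg -f rawvideo -pix_fmt rgb565le -s 640x480 -i frame.raw out.png
# list the pixel formats known to ffmpeg
ffmpeg -pix_fmts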
Using fswebcam to capture a frame
- fswebcam can still be used to take a photo, once the pipeline is correctly configured.
fswebcam -p RGB565 -r 3280x2464 -S 20 bigRGB565.png
fswebcam -p YUYV -r 3280x2464 -S 20 bigYUYV.png
fswebcam -p UYVY -r 3280x2464 -S 20 bigUYVY.png
fswebcam -p VYUY -r 3280x2464 -S 20 bigVYUY.png
fswebcam -p ABGR32 -r 3280x2464 -S 20 bigABGR32.png
fswebcam -p Y16 -r 3280x2464 -S 20 bigY16.png
fswebcam -p GREY -r 3280x2464 -S 20 bigGREY.png
fswebcam -p YUV420P -r 3280x2464 -S 20 bigYUV420P.png
# reconfigure pipeline to 1920x1080 first
fswebcam -p RGB565 -r 1920x1080 -S 20 smallRGB565.png
fswebcam -p YUYV -r 1920x1080 -S 20 smallYUYV.png
fswebcam -p UYVY -r 1920x1080 -S 20 smallUYVY.png
fswebcam -p VYUY -r 1920x1080 -S 20 smallVYUY.png
fswebcam -p ABGR32 -r 1920x1080 -S 20 smallABGR32.png
fswebcam -p Y16 -r 1920x1080 -S 20 smallY16.png
fswebcam -p GREY -r 1920x1080 -S 20 smallGREY.png
fswebcam -p YUV420P -r 1920x1080 -S 20 smallYUV420P.png
# reconfigure pipeline to 1640x1232 first
fswebcam -p RGB565 -r 1640x1232 -S 20 panoRGB565.png
fswebcam -p YUYV -r 1640x1232 -S 20 panoYUYV.png
fswebcam -p UYVY -r 1640x1232 -S 20 panoUYVY.png
fswebcam -p VYUY -r 1640x1232 -S 20 panoVYUY.png
fswebcam -p ABGR32 -r 1640x1232 -S 20 panoABGR32.png
fswebcam -p Y16 -r 1640x1232 -S 20 panoY16.png
fswebcam -p GREY -r 1640x1232 -S 20 panoGREY.png
fswebcam -p YUV420P -r 1640x1232 -S 20 panoYUV420P.png
# reconfigure pipeline to 640x480 first
fswebcam -p RGB565 -r 640x480 -S 20 tinyRGB565.png
fswebcam -p YUYV -r 640x480 -S 20 tinyYUYV.png
fswebcam -p UYVY -r 640x480 -S 20 tinyUYVY.png
fswebcam -p VYUY -r 640x480 -S 20 tinyVYUY.png
fswebcam -p ABGR32 -r 640x480 -S 20 tinyABGR32.png
fswebcam -p Y16 -r 640x480 -S 20 tinyY16.png
fswebcam -p GREY -r 640x480 -S 20 tinyGREY.png
fswebcam -p YUV420P -r 640x480 -S 20 tinyYUV420P.png
Note: Even if we request 3280x2464 while the maximum resolution is 3264x2464, the driver will adjust the output frame size and we will still obtain 3264x2464. The requested resolution is automatically adjusted to the best matching one.
Using GStreamer to capture a frame
- GStreamer can still be used to capture a frame.
- The GStreamer _classic_ v4l2src plugin can be used with GStreamer 1.0 (see the example below).
Note: The pipeline must be configured before using GStreamer.
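A minimal sketch of such a capture, assuming the pipeline was configured for YUYV at 640x480 and that the jpegenc and filesink elements are available in your GStreamer installation:
gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=1 ! video/x-raw,format=YUY2,width=640,height=480 ! jpegenc ! filesink location=frame.jpg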
libcamera
- libcamera is a userspace library that configures the media pipeline and frees higher-level applications from dealing with the media controller configuration themselves.
- libcamera is a library, so other applications have to link against libcamera in order to use it.
- Basically, libcamera is a layer between the /dev/video and /dev/media nodes and the user.
- libcamera also provides a few test applications that are linked with libcamera and can be used to validate the functionality.
libcamera cam app
- The cam application is a basic libcamera app that can capture a frame.
Identifying the pipeline using cam
- Use the cam application to test whether the camera is available:
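The log below was produced by asking cam to list the detected cameras; a typical invocation (option names may vary slightly between libcamera versions) is:
# cam -l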
[3:09:29.602118600] [272] INFO IPAManager ipa_manager.cpp:138 libcamera is not installed. Adding '//src/ipa' to the IPA search path
[3:09:29.602612000] [272] WARN IPAManager ipa_manager.cpp:149 No IPA found in '/usr/lib/libcamera'
[3:09:29.602773400] [272] INFO Camera camera_manager.cpp:293 libcamera v0.0.0+58645-2021.08-dirty (2021-10-08T16:10:25+03:00)
[3:09:29.610875800] [274] WARN CameraSensor camera_sensor.cpp:197 'imx219 1-0010': Recommended V4L2 control 0x009a0922 not supported
[3:09:29.611025000] [274] WARN CameraSensor camera_sensor.cpp:249 'imx219 1-0010': The sensor kernel driver needs to be fixed
[3:09:29.611092200] [274] WARN CameraSensor camera_sensor.cpp:251 'imx219 1-0010': See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information
[3:09:29.617962400] [274] WARN CameraSensor camera_sensor.cpp:414 'imx219 1-0010': Failed to retrieve the camera location
[3:09:29.620663000] [274] WARN V4L2 v4l2_pixelformat.cpp:283 Unsupported V4L2 pixel format Y10
[3:09:29.620766200] [274] WARN V4L2 v4l2_pixelformat.cpp:283 Unsupported V4L2 pixel format Y16
[3:09:29.620828600] [274] WARN V4L2 v4l2_pixelformat.cpp:283 Unsupported V4L2 pixel format AR12
[3:09:29.620886800] [274] WARN V4L2 v4l2_pixelformat.cpp:283 Unsupported V4L2 pixel format AR15
Available cameras:
1: 'imx219' (/base/soc/flexcom@e2818000/i2c@600/camera@10)
#
- The cam application has identified the camera and displays it to the user with the name of the video sensor (imx219 in our case).
- Camera number 1 is the _imx219_.
Using cam to capture a frame
- An example capture invocation is shown below.
- We can then use ffmpeg to convert the resulting raw file to a PNG file if we wish to.
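A minimal sketch of such a capture, assuming camera index 1 from the listing above and a hypothetical output file name (check cam --help for the exact options of your libcamera version):
# cam -c 1 --capture=1 --file=frame.raw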
Video-capture-at91
- video-capture-at91 is a collection of scripts hosted on GitHub and included in the Linux4SAM distribution; it offers examples and scripts that automatically configure the media controller pipeline.
Using video-capture-at91 scripts with sama7g5
- The scripts are available in your home directory, and are organized in directories corresponding to each sensor supported.
- There is a directory for _imx219_ and another directory for _imx274_.
- There is a script that prepares the media controller pipeline for capturing at each supported resolution.
- There are also demo scripts for each resolution, showing how to use fswebcam to capture frames in different formats and how to configure the video format accordingly.
- Example of setting the 3264x2464 resolution with the imx219:
# ./3264x2464.sh
Ready to capture at 3264x2464
#
# ./fswebcam_3264x2464.sh
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigRGB565.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigYUYV.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigUYVY.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigVYUY.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigABGR32.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigY16.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigGREY.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 1.33 seconds. (15 fps)
--- Processing captured image...
Unable to load font 'sans': Could not find/open font
Disabling the the banner.
Writing JPEG image to 'bigYUV420P.png'.
Using video-capture-at91 scripts with sama5d2
- The scripts are available in your home directory, and are organized in directories corresponding to each sensor supported.
- There is a directory for each of the supported sensors:
total 24
drwxr-xr-x 6 root root 4096 May 10 2022 .
drwx------ 3 root root 4096 Jan 1 02:08 ..
drwxr-xr-x 2 root root 4096 May 10 2022 mt9v022
drwxr-xr-x 2 root root 4096 May 10 2022 ov5640
drwxr-xr-x 2 root root 4096 May 10 2022 ov7670
drwxr-xr-x 2 root root 4096 May 10 2022 ov7740
- There is a script that prepares the media controller pipeline for capturing at each supported resolution.
- There are also demo scripts for each resolution, showing how to use fswebcam to capture frames in different formats and how to configure the video format accordingly.
- Example of how to use the scripts for sama5d2 + mt9v022 sensor:
752x480_Y10.sh fswebcam_752x480.sh
# ./video-capture-at91/mt9v022/752x480_Y10.sh
Preparing MT9V022 in Y10 10bits mode
Ready to capture at 752x480
#
#
# ./video-capture-at91/mt9v022/fswebcam_752x480.sh
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 0.37 seconds. (56 fps)
--- Processing captured image...
Writing JPEG image to 'tinyGREY.png'.
#
- Example of how to use the scripts for sama5d2 + ov7740 sensor in raw Bayer mode:
Preparing OV7440 in RAW BAYER MODE
Ready to capture at 640x480
# ./video-capture-at91/ov7740/fswebcam_640x480.sh
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 0.38 seconds. (55 fps)
--- Processing captured image...
Writing JPEG image to 'tinyRGB565.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 0.38 seconds. (55 fps)
--- Processing captured image...
Writing JPEG image to 'tinyYUYV.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 0.38 seconds. (54 fps)
--- Processing captured image...
Writing JPEG image to 'tinyABGR32.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 0.38 seconds. (54 fps)
--- Processing captured image...
Writing JPEG image to 'tinyGREY.png'.
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Skipping 20 frames...
Capturing 1 frames...
Captured 21 frames in 0.38 seconds. (54 fps)
--- Processing captured image...
Writing JPEG image to 'tinyYUV420P.png'.
#
Capturing video
Using ffmpeg
- ffmpeg can be used to capture video from the v4l2 device.
- To list the possible pixel formats, use the command shown below.
Warning: The media controller pipeline must be properly configured before using ffmpeg.
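The pixel formats known to ffmpeg can be listed with:
ffmpeg -pix_fmts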
Example at 640x480, RGB565 format
ffmpeg -s vga -pix_fmt rgb565be -f video4linux2 -i /dev/video0 video_vga_rgb565be.avi
Example at 640x480, YUYV format
ffmpeg -s vga -pix_fmt yuyv422 -f video4linux2 -i /dev/video0 video_vga_yuyv422.avi
Related Topics
Boards
Sam9x75Curiosity
Sama7g5-ek
Sama5d27WLSom1EK
Sama5d27Som1EK
Sama5d2Xplained
Components
Summary
What the media controller is and how to use it with SAM products.