September 29, 2012

GStreamer camera streaming on i.MX6

This is just a quick post highlighting how a few simple components can be used to stream video from the camera on an i.MX6 over the network.

While working with one of our i.MX53 customers on a video streaming application, we had reason to test the camera interface, video encoding, and streaming on i.MX6. Our customer brought us a couple of command lines that worked well on his laptop:

Laptop server command-line

On the server side, the pipeline reads from the camera (v4l2src), uses jpegenc and then routes it to a TCP server on port 5000:

~/$ gst-launch v4l2src device=/dev/video0 \
   ! videoscale \
   ! video/x-raw-yuv,width=640,height=480 \
   ! ffmpegcolorspace \
   ! jpegenc \
   ! tcpserversink host= port=5000

Laptop client command-line

On the client side, the tcpclientsrc reads from the network port, decodes using jpegdec and sends the output to the display (autovideosink):

~/$ gst-launch tcpclientsrc host= port=5000 ! jpegdec ! autovideosink

I just finished participating in a training session by Timesys for distributor FAEs, where they gave a primer on gstreamer and its use on the SABRE Lite.

Amply armed, I was able to take this information and translate it into a fully accelerated pipeline on the SABRE Lite:

i.MX6 server command-line

# gst-launch mfw_v4lsrc capture-width=1280 capture-height=720 capture-mode=4 \
    ! vpuenc codec=12 \
    ! tcpserversink host= port=5000

Note the use of the two Freescale-provided elements:

  • mfw_v4lsrc – Video For Linux source provided by Freescale for camera input, and
  • vpuenc – Freescale-provided VPU-accelerated encoder

By using this hardware acceleration, CPU utilization on the i.MX6 stays below 5% for this 720P stream, and when piping the output over 1Gbps ethernet, there is no noticeable latency.
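A rough back-of-the-envelope calculation shows why 1Gbps ethernet has plenty of headroom here. The 30 fps frame rate and 10:1 JPEG compression ratio below are assumptions for illustration, not measured values:

```python
# Rough bandwidth estimate for a 1280x720 MJPEG stream.
# Assumptions (not measurements): 30 fps, ~10:1 JPEG compression.
WIDTH, HEIGHT, FPS = 1280, 720, 30
BYTES_PER_PIXEL = 1.5          # I420 (YUV 4:2:0) carries 12 bits per pixel

raw_bytes_per_sec = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
raw_mbps = raw_bytes_per_sec * 8 / 1e6   # uncompressed bit rate in Mbit/s
jpeg_mbps = raw_mbps / 10                # assumed 10:1 compression

print(f"raw: {raw_mbps:.0f} Mbit/s, MJPEG: {jpeg_mbps:.0f} Mbit/s")
```

Even uncompressed, the stream is around a third of a gigabit; after JPEG compression it's a small fraction of the wire speed.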

To examine the parameters for gstreamer elements, use the gst-inspect utility:

# gst-inspect mfw_v4lsrc
MFW_GST_V4LSRC_PLUGIN 2.0.8 build on Aug 15 2012 14:09:19.
Factory Details:
  Long name:	v4l2 based camera src
  Class:	Src/Video
  Description:	Capture videos by using csi camera
  Author(s):	Multimedia Team
  Rank:		primary (256)

Plugin Details:
  Name:			v4lsrc.imx
  Description:		v4l2-based csi camera video src
  Filename:		/usr/lib/gstreamer-0.10/
  Version:		2.0.8
  License:		LGPL
  Source module:	gst-fsl-plugins
  Binary package:	Freescle Gstreamer Multimedia Plugins
  Origin URL:


Element Properties:
  capture-mode        : set the capture mode of camera, please check
                        the bsp release notes to decide which value can be applied,
                        for example ov5640:
                          ov5640_mode_VGA_640_480 = 0,
                          ov5640_mode_QVGA_320_240 = 1,
                          ov5640_mode_NTSC_720_480 = 2,
                          ov5640_mode_PAL_720_576 = 3,
                          ov5640_mode_720P_1280_720 = 4,
                          ov5640_mode_1080P_1920_1080 = 5
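Those capture-mode values map directly to sensor resolutions, so a small lookup table (transcribed from the gst-inspect listing above) makes a command line like capture-mode=4 easier to read:

```python
# OV5640 capture-mode values as listed by gst-inspect mfw_v4lsrc
OV5640_MODES = {
    0: ("VGA", 640, 480),
    1: ("QVGA", 320, 240),
    2: ("NTSC", 720, 480),
    3: ("PAL", 720, 576),
    4: ("720P", 1280, 720),
    5: ("1080P", 1920, 1080),
}

name, width, height = OV5640_MODES[4]
print(f"capture-mode=4 -> {name} ({width}x{height})")
```

So capture-mode=4 in the server command line above selects the sensor's 1280x720 mode.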

I hope this miniature example helps kick-start your efforts to build real-world applications using i.MX6 and gstreamer. Contact us, your local distributor FAE, or Timesys for details on how you can see this in action.

20 Comments

  1. Tom Morrison

    Hi Eric,

    I have Freescale’s ER6-1206-Beta booted from an SD card.

    When I do gst-inspect, I am not seeing any of the element properties that I would expect.

    I booted back to the original Timesys microSD and I am seeing all the properties I am interested in.

    Is there something wrong with Freescale's version?

  2. Tarek

    Hi Eric,

    Very impressive! It would also be helpful if you could give us some examples in C of how to use the gstreamer accelerated decoding APIs.


    1. Post

      Hi Tarek,

      This is a bit out of my wheelhouse, but the folks at Timesys provide the sources for their very nice media player in their demo package.
      I've taken a few peeks inside, and it's pretty cool. They show how you can translate a string, as used with gst-launch, directly into a gstreamer pipeline.
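      The pipeline-from-string trick is easy to sketch. In C you would hand the string to GStreamer's gst_parse_launch(); the toy parser below (plain Python, no GStreamer required, all names hypothetical) only illustrates the structure such a string encodes:

```python
def parse_pipeline(description):
    """Split a gst-launch-style string into (element, properties) pairs.

    Illustration only: real code should call GStreamer's
    gst_parse_launch(), which also handles caps filters, quoting,
    and bin syntax that this sketch ignores.
    """
    stages = []
    for stage in description.split("!"):
        parts = stage.split()
        if not parts:
            continue
        element, props = parts[0], {}
        for token in parts[1:]:       # remaining tokens are key=value pairs
            key, _, value = token.partition("=")
            props[key] = value
        stages.append((element, props))
    return stages

pipeline = parse_pipeline(
    "v4l2src device=/dev/video0 ! jpegenc ! tcpserversink port=5000"
)
print(pipeline)
```

      Each stage between the "!" separators becomes an element plus its property settings, which is essentially what GStreamer builds internally.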

      1. Gary Thomas

        I’m not using the TimeSys stuff – where can I get the sources to build these gstreamer components myself?

        1. Post
  3. Post
  4. Post

    Benoît Thébaudeau was kind enough to share a tip about how to get 1080P working.

    Without some fancy gstreamer-fu, gst-launch will just say 'failure to negotiate' because of a test for the video height not being a multiple of 16, as I mentioned in this comment on i.MX Community.

    Benoît has a no-patch way of doing this:

    root@boundary ~$ gst-launch mfw_v4lsrc capture-mode=5 fps-n=15 \
      ! video/x-raw-yuv,width=1920,height=1080,format='(fourcc)'I420,framerate='(fraction)'15/1 \
      ! mfw_ipucsc \
      ! video/x-raw-yuv,width=1920,height=1088,format='(fourcc)'I420,framerate='(fraction)'15/1 \
      ! vpuenc codec=12 \
      ! tcpserversink host= port=5000
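    The 1088 in the second caps filter is not arbitrary: the height check exists presumably because the VPU works on 16x16 macroblocks, so the height must be rounded up to the next multiple of 16. A quick sketch of the padding (the macroblock size is the only assumption here):

```python
MACROBLOCK = 16  # assumed VPU macroblock size (16x16)

def padded_height(height, block=MACROBLOCK):
    """Round height up to the next multiple of the macroblock size."""
    return -(-height // block) * block  # ceiling division

print(padded_height(1080))  # -> 1088: 1080 is not a multiple of 16
print(padded_height(720))   # -> 720: already aligned, no padding needed
```

    This is why 720P streams work directly while 1080P needs the mfw_ipucsc step to scale the frame to a 1088-line buffer first.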
  5. MichalR


    I am planning to build a radio-controlled drone with first-person-view capabilities using WIFI.
    I calculated that I need a real 16Mbit/s WIFI link, with about an 8Mbit/s bitrate for h.264 streaming video.
    The WIFI needs to be configured manually (i.e. only BPSK modulation, disable B/G).
    So recently I was testing a 2.0Mpx 720p@30fps WIFI IP camera (OVISLINK Airlive WN-200HD).
    But the latency was about 700ms using WPA2 and 300ms when unsecured, even on wired LAN.

    So I finally found the Sabre Lite/Nitrogen6x SBCs, which are capable of 720p @ 60fps h.264 encoding and decoding. With the OV5642 camera sensor, this could deliver a low-latency solution.
    But I do not have an i.MX6 SBC, and I do not know if this project will work, because latency above 150ms won't do, even with high fps.

    So could you provide some simple tests for latency over LAN cable and WIFI?
    i.e. display a timer on the laptop/i.MX6 SBC screen (client), and next to it a video window from the i.MX6 SBC (server). At the same time, point the OV5642 camera (server) at the screen (client). The latency value would then be easy to calculate from a screenshot.
    Any data you can give will be greatly appreciated.

    Thanks in advance

    1. Post

      Hi Michal,

      We don’t have cycles to spare to run this test, but can share that we’ve had customers do similar things (i.e. low latency video to network), and there are a lot of settings to get right.

      Circumventing buffering on the receive side is probably the hardest, since most playback code wants to favor smooth playback over low latency.
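      As a rough illustration of why buffering dominates: every frame the player keeps queued adds one frame time of delay, so at an assumed 30 fps each buffered frame costs about 33 ms (queue depths below are just examples):

```python
def buffering_latency_ms(queued_frames, fps):
    """Latency added by holding queued_frames in a playout buffer."""
    return queued_frames * 1000.0 / fps

for frames in (1, 5, 10):
    ms = buffering_latency_ms(frames, 30)
    print(f"{frames} frames @ 30 fps -> {ms:.0f} ms")
```

      Even a modest 5-frame playout queue adds roughly 167 ms, which alone would exceed a 150 ms budget before network and encode delays are counted.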

      1. MichalR

        Hi Eric,

        Thank you for the prompt answer.
        I'm convinced that the i.MX6 platform can be very flexible with current hardware and Linux support.

        About the tests: it is very hard for an end client like me to decide which solution/product to use. Raw specifications are often very misleading and most of the time have nothing to do with reality. They also lead to spending thousands of dollars on products that never materialize (like a "30fps" webcam that really delivers a blurry 10fps). It is only a suggestion from my side: find a couple of hours every month for some simple real-world performance tests and document them (even on the forum).
        For example, I do not know what kind of performance to expect from the WIFI embedded in the Nitrogen6x, or whether it would be better to use WIFI over USB.

        Sorry for going a little off-topic, and thanks for understanding.

  6. Andrew

    I have an OV5642 camera I bought from your site.
    This is an auto-focus camera. How do I use the auto-focus functions? I am able to stream using gstreamer plugins, but I can't control the auto-focus functionality. Is this under software control? Does it need to be enabled in software? How do I do this? Thanks

    1. Post

      Hi Andrew,

      Auto-focus is a software issue.

      The Freescale drivers did not include the OV5642 firmware needed to control the voice coil motor (VCM) that drives focus changes, nor the controls needed to invoke it.

      There’s a preliminary update in this commit. A newer version and blog post will be available shortly.

  7. Andrew

    Nice! Looking forward to your blog post. Is it possible to combine the auto-focus feature with digital zoom? Looking at the ov5642.c code in the kernel, there is mention of digital zoom.

    I could use the videocrop plugin in gstreamer, but I'm wondering if it's possible to restrict auto-focus to a digitally zoomed part of an image, and whether this can be done in software (by modifying the ov5642.c kernel driver). Thanks

    1. Post

      I’m not sure about the licensing for it, but if you Google for OV5642 Firmware User’s Guide, you’ll find a document that describes how things function.

      The short answer is that you can focus on up to five programmable zones, though the description is a bit opaque.

      This implementation only supports “Center mode”.

  8. Andrew

    Hi Eric, I would like to request clarification on some of the camera (CSI0) signals on the sabrelite. OV5642 uses 8 bits. The sabrelite schematic shows 12 data bits (CSI0_DAT8 thru CSI0_DAT19). What is the mapping between these and the OV5642 interface signals Y2-Y9?

    Also, what is the GPIO_3_CLKO2 on the sabrelite schematic used for?

    And I assume that GPIO[6]:11 is unused in the parallel camera case? If it is unused, then what happens with this expression? Does it evaluate to false? Thanks

  9. Andrew

    Hi, any clarification on the previous question of the CSI0 signals on the sabrelite? Thanks

  10. Matthew

    Thank you, guys at Boundary Devices, for the article. I tried this out myself with the Sabre SD kit and impressed some coworkers. On my PC, I found it was necessary to turn off visual effects in the compiz control panel. That resolved the error "X Error of failed request: BadAlloc (insufficient resources for operation)".

  11. Dipen Patel

    Hi Eric,

    Is it possible to capture two simultaneous 1280x720P streams from two cameras on the i.MX6Q?

    My goal is simultaneous capture from two cameras at 1280x720P resolution, performing H.264 encoding and streaming.

    Appreciate your earliest response.

    Dipen Patel

Leave a Reply