This is just a quick post highlighting how a few simple components can be used to stream video from the camera on an i.MX6 over the network.
While working with one of our i.MX53 customers on a video streaming application, we had reason to test the camera interface, video encoding, and streaming on i.MX6. Our customer brought us a couple of command lines that worked well on his laptop:
Laptop server command-line
On the server side, the pipeline reads from the camera (v4l2src), uses jpegenc and then routes it to a TCP server on port 5000:
~/$ gst-launch v4l2src device=/dev/video0 ! videoscale ! video/x-raw-yuv,width=640,height=480 ! ffmpegcolorspace ! jpegenc ! tcpserversink host=127.0.0.1 port=5000
Laptop client command-line
On the client side, the tcpclientsrc reads from the network port, decodes using jpegdec and sends the output to the display (autovideosink):
~/$ gst-launch tcpclientsrc host=127.0.0.1 port=5000 ! jpegdec ! autovideosink
I just finished participating in a training session by Timesys for distributor FAEs, where they gave a primer on gstreamer and its use on the SABRE Lite.
Amply armed, I was able to take this information and translate it into a fully accelerated pipeline on the SABRE Lite:
i.MX6 server command-line
# gst-launch mfw_v4lsrc capture-width=1280 capture-height=720 capture-mode=4 ! vpuenc codec=12 ! tcpserversink host=127.0.0.1 port=5000
Note the use of the two Freescale-provided elements:
- mfw_v4lsrc – Video For Linux source provided by Freescale for camera input, and
- vpuenc – Freescale-provided VPU-accelerated encoder
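For completeness, the receiving side can be accelerated the same way. The sketch below is untested and hedged: it assumes a second i.MX6 board running the same Freescale gst-fsl-plugins package, uses vpudec (the VPU-accelerated decoder) and mfw_v4lsink (the Freescale video sink), and the address 192.168.0.100 is just a placeholder for the server board's IP:

```shell
# Hedged sketch of an accelerated client on a second i.MX6 board.
# 192.168.0.100 is a placeholder for the server board's IP address;
# vpudec and mfw_v4lsink come from the Freescale gst-fsl-plugins package.
gst-launch tcpclientsrc host=192.168.0.100 port=5000 ! vpudec ! mfw_v4lsink
```

Note that for a remote client to connect at all, the server pipeline's tcpserversink would need to bind an externally reachable address rather than 127.0.0.1.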
With this hardware acceleration, CPU utilization on the i.MX6 is less than 5% for this 720P stream, and when piping the output over 1 Gbps Ethernet there is no noticeable latency.
To examine the parameters for gstreamer elements, use the gst-inspect utility:
# gst-inspect mfw_v4lsrc
MFW_GST_V4LSRC_PLUGIN 2.0.8 build on Aug 15 2012 14:09:19.
Factory Details:
  Long name:    v4l2 based camera src
  Class:        Src/Video
  Description:  Capture videos by using csi camera
  Author(s):    Multimedia Team
  Rank:         primary (256)
Plugin Details:
  Name:            v4lsrc.imx
  Description:     v4l2-based csi camera video src
  Filename:        /usr/lib/gstreamer-0.10/libmfw_gst_v4lsrc.so
  Version:         2.0.8
  License:         LGPL
  Source module:   gst-fsl-plugins
  Binary package:  Freescale Gstreamer Multimedia Plugins
  Origin URL:      http://www.freescale.com
...
Element Properties:
  ...
  capture-mode : set the capture mode of camera, please check the bsp
                 release notes to decide which value can be applied,
                 for example ov5640:
                   ov5640_mode_VGA_640_480    = 0,
                   ov5640_mode_QVGA_320_240   = 1,
                   ov5640_mode_NTSC_720_480   = 2,
                   ov5640_mode_PAL_720_576    = 3,
                   ov5640_mode_720P_1280_720  = 4,
                   ov5640_mode_1080P_1920_1080 = 5
I hope this miniature example helps kick-start your efforts to build real-world applications using the i.MX6 and gstreamer. Contact us, your local distributor FAE, or Timesys for details on how you can see this in action.
Comments 22
Hi Eric,
I have Freescale’s ER6-1206-Beta booted from an SD card.
When I run gst-inspect, I am not seeing any of the element properties that I would expect.
I booted back to the original Timesys microSD and I see all the properties I am interested in.
Is there something wrong with Freescale’s version?
Hi Eric,
Very impressive! It would also be helpful if you could give us some examples in C on how to use the gstreamer accelerated decoding APIs.
Thanks
Author
Hi Tarek,
This is a bit out of my wheel-house, but the folks at Timesys provide the sources for their very nice media player in their demo package.
I’ve taken a few peeks inside, and it’s pretty cool. They show how you can translate a string as used in gst-launch directly into a gstreamer pipeline.

I’m not using the TimeSys stuff – where can I get the sources to build these gstreamer components myself?
Author
Hi Gary,
The sources are only available through Freescale. Please contact us at info@boundarydevices.com if you don’t have a local FAE and we can help the process.
Author
As a quick camera test, you can also just get a preview window like this:
root@nitrogen6x: gst-launch mfw_v4lsrc ! mfw_v4lsink
Author
Benoît Thébaudeau (benoit.thebaudeau@advansee.com) was kind enough to share a tip about how to get 1080P working.
Without some fancy gstreamer-fu, gst-launch will just say ‘failure to negotiate’ because of a check that the video height is a multiple of 16 (1080 is not), as I mentioned in this comment on i.MX Community.
Benoît has a no-patch way of doing this:
Hi,
I am planning to build a radio-controlled drone with first-person-view capabilities using WiFi.
I calculated that I need a real 16 Mbit/s WiFi link, with about an 8 Mbit/s bitrate for the H.264 streaming video.
The WiFi needs to be configured manually (i.e. only BPSK modulation, B/G disabled).
So recently I was testing a 2.0 Mpx 720p@30fps WiFi IP camera (OVISLINK AirLive WN-200HD).
But the latency was about 700ms using WPA2, and 300ms when unsecured, even on a wired LAN.
So I finally found the SABRE Lite/Nitrogen6x SBCs, which are capable of 720p @ 60fps H.264 encoding and decoding. With the OV5642 camera sensor they could deliver a low-latency solution.
But I do not have the i.MX6 SBC. And I do not know if this project might work, because latency above 150ms won’t do even with high fps.
So could you provide some simple tests for latency over LAN cable and WIFI?
I.e. display a timer on the laptop/i.MX6 SBC screen (client), and next to it a video window from the i.MX6 SBC (server). At the same time, point the OV5642 camera (server) at the screen (client). The latency value could then be easily calculated from a “Print Screen”.
Any data you can give will be greatly appreciated.
Thanks in advance
Michal
Author
Hi Michal,
We don’t have spare cycles to run this test, but we can share that we’ve had customers do similar things (i.e. low-latency video over the network), and there are a lot of settings to get right.
Circumventing buffering on the receive side is probably the hardest, since most playback code wants to favor smooth playback over low latency.
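One knob worth knowing about (a hedged sketch, not something we’ve benchmarked): most GStreamer sinks accept a sync property, and setting sync=false tells the sink to render buffers as soon as they arrive instead of waiting on pipeline timestamps, trading smooth playback for lower latency. Applied to the laptop client pipeline from the post, that would look like:

```shell
# Hedged low-latency variant of the laptop client pipeline:
# sync=false renders frames on arrival rather than per-timestamp.
gst-launch tcpclientsrc host=127.0.0.1 port=5000 ! jpegdec ! autovideosink sync=false
```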
Hi Eric,
Thank you for the prompt answer.
I am convinced that the i.MX6 platform can be very flexible with current hardware and Linux support.
About the tests: it is very hard for an end client like me to decide which solution/product to use. Raw specifications are often misleading and most of the time have nothing to do with reality. They also lead to spending lots of money (thousands of dollars) on products that never deliver (like a “30fps” webcam with a real, blurred 10fps). It is only a suggestion from my side to find a couple of hours every month for some simple real-world performance tests and to document them (even on the forum).
For example, I do not know what kind of performance to expect from the WiFi embedded in the Nitrogen6x, or whether it would be better to use WiFi over USB.
Sorry for going slightly off-topic, and thanks for understanding.
Michal
Hi MichalR,
Did you manage to build this system? I want to build an FPV system using i.MX6 (SABRE Lite) boards on both the server and client side, with a Wi-Fi ad-hoc connection between them. The idea is to build an RTP streaming server (with H.264 encoding) mounted on the board and transmit via WiFi. On the receiver side, another i.MX6 board decodes the H.264 packets and renders the video to a screen. I do not expect HD video quality, so I can build my system at around 10fps over 640×480 resolution or less.
It would be helpful, if you could let me know your experiences with these processors for a FPV system. What latency could you achieve on this processor?
Regards
I have an OV5642 camera I bought from your site (http://boundarydevice.wpengine.com/products/nit6x_5mp/)
This is an auto-focus camera. How do I use the auto-focus functions? I am able to stream using gstreamer plugins but I can’t control the auto focus functionality. Is this under software control? Does it need to be enabled in software? How do I do this? Thanks
Author
Hi Andrew,
Auto-focus is a software issue.
The Freescale drivers did not include the OV5642 firmware needed to control the voice coil motor (VCM) that drives focus changes or the controls needed to invoke it.
There’s a preliminary update in this commit. A newer version and blog post will be available shortly.
on a related note:
Can you please provide schematic for the small interface board that comes with the ov5642 camera on your website (http://boundarydevice.wpengine.com/products/nit6x_5mp/)? I have an alternate fixed focus ov5642 camera from here: http://www.kailaptech.com/Product.aspx?id=762&l1=711 and I want to experiment using the same interface board with this camera. Thanks
Nice! Looking forward to your blog post. Is it possible to combine the auto-focus feature with digital zoom? Looking at the ov5642.c code in the kernel, there is a mention of digital zoom here:
https://github.com/boundarydevices/linux-imx6/blob/5d942785c92d501cba0622c13442808539b6d6eb/drivers/media/video/mxc/capture/mxc_v4l2_capture.c#L1495
I could use the videocrop plugin in gstreamer, but I’m wondering if it’s possible to restrict auto-focus to a digitally zoomed part of the image, and whether this can be done in software (by modifying the ov5642.c kernel driver). Thanks
Author
I’m not sure about the licensing for it, but if you Google for OV5642 Firmware User’s Guide, you’ll find a document that describes how things function.
The short answer is that you can focus on up to five programmable zones, though the description is a bit opaque.
This implementation only supports “Center mode”.
Is it possible to get schematic to this interface board? Thanks
Hi Eric, I would like to request clarification on some of the camera (CSI0) signals on the sabrelite. The OV5642 uses 8 data bits. The sabrelite schematic shows 12 data bits (CSI0_DAT8 through CSI0_DAT19). What is the mapping between these and the OV5642 interface signals Y2-Y9?
Also, what is the GPIO_3_CLKO2 on the sabrelite schematic used for?
And I assume that GPIO[6]:11 is unused in the parallel camera case? If it is unused, then what happens with this expression? Does it evaluate to false? Thanks
Hi, any clarification on the previous question of the CSI0 signals on the sabrelite? Thanks
Thank you, guys at Boundary Devices, for the article. I tried this out myself with the SABRE SD kit and impressed some coworkers. On my PC I found it was necessary to turn off visual effects in the compiz control panel, which resolved the error “X Error of failed request: BadAlloc (insufficient resources for operation)”.
Hi Eric,
Is it possible to capture two simultaneous 1280x720P streams from two cameras on the i.MX6Q?
My goal is simultaneous capture from two cameras at 1280x720P resolution, followed by H.264 encoding and streaming.
Appreciate your earliest response.
Regards,
Dipen Patel
Hi,
I have tried out a couple of camera-related gstreamer plugins on i.MX6. All of the gstreamer examples start a pipeline immediately when the application is run. I am trying to figure out whether I can trigger a gstreamer pipeline based on an external signal.
For example: I have custom hardware interfaced to the i.MX6 that outputs data at 60Hz. This hardware also provides a trigger signal, which I want to use to trigger the camera to record video at 30fps, which means approximately two sets of data for each frame. I am basically trying to synchronize the custom hardware data with the captured video frames. If the triggering can be done vice-versa, I am fine with that as well.
I do not understand how to use the gstreamer command-line tools to achieve this triggering of the camera. Could you provide some hints on how to approach this?
Regards