So I essentially scrapped all the previous work, keeping it only as notes on what I needed, and decided to get my own Angstrom build running.
Getting the Angstrom Image running
First, to get a build up and running, I followed the instructions at
http://elinux.org/BeagleBoardAndOpenEmbeddedGit, which gave me a directory set up to build the recipes.
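For reference, the setup from that page boiled down to roughly this at the time (the repository URL and oebb.sh subcommands are from memory and may have changed since):
git clone git://gitorious.org/angstrom/angstrom-setup-scripts.git
cd angstrom-setup-scripts
./oebb.sh config beagleboard
./oebb.sh update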
Installing TI DSP tools
However, before this will work you need to download and install the TI DSP tools. This was a painful process. There were two main parts, though. One is finding the XDCTools and Code Generation Tools on the TI web pages and placing them in
~/angstrom-setup-scripts/sources/downloads (
http://software-dl.ti.com/dsps/dsps_public_sw/sdo_sb/targetcontent/rtsc/index.html
https://www-a.ti.com/downloads/sds_support/TICodegenerationTools/download.htm)
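Once downloaded, the installers just need to sit in that downloads directory where bitbake can find them; something like the following, where the installer filenames are placeholders for whatever versions you grab:
# installer filenames below are placeholders, not the exact versions
cp xdctools_setuplinux_3_xx.bin ~/angstrom-setup-scripts/sources/downloads/
cp TI_CGT_C6000_x.x.xx_setup_linux_x86.bin ~/angstrom-setup-scripts/sources/downloads/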
For Ubuntu 10.10 x86_64, though, these downloads would not execute initially. I had to run
sudo apt-get install ia32-libs
to install a 32-bit compatibility layer. Then, to test that the tools work, run
bitbake gstreamer-ti
before trying to compile your image. The openembedded repository moves fast; between when I started the process and the next day, all the problems (aside from ia32-libs) were patched. This meant I kept deleting my build directory whenever I noticed a change upstream (for some reason simply recompiling didn't work).
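When that happened, the fix that actually worked for me was to blow away the temporary build output and start fresh (the tmp path here matches where my images were deployed, below):
rm -rf ~/angstrom-setup-scripts/build/tmp-angstrom_2008_1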
Creating a custom image
The right way to do this is to create an overlay for all your recipes, and I'll do that at some point, but my brief attempt didn't work. Instead I copied the x11-image recipe in
sources/openembedded/recipes/images/ to
op-image and added the packages I wanted:
EXTRAS = "wireless-tools \
kernel-module-uvcvideo \
ti-dmai \
ti-cmem-module \
ffmpeg \
gstreamer-ti \
gst-plugin-avi \
gst-plugin-video4linux2 \
gst-plugin-videorate \
task-gstreamer-ti \
gst-plugin-udp"
DEPENDS = "task-base"
IMAGE_INSTALL = "\
${XSERVER} \
task-base-extended \
angstrom-x11-base-depends \
angstrom-gpe-task-base \
angstrom-gpe-task-settings \
${SPLASH} \
${ANGSTROM_EXTRA_INSTALL} \
${EXTRAS}"
Then to compile your image and get the boot scripts you'll need:
bitbake op-image
bitbake angstrom-uboot-scripts
Following the instructions on the first page, I unpacked my rootfs image
angstrom-setup-scripts/build/tmp-angstrom_2008_1/deploy/glibc/images/beagleboard/Angstrom-op-image-glibc-ipk-2010.7-test-20110106-beagleboard.rootfs.tar.bz2 to the second partition on my sdcard. The MLO, u-boot and uImage files were copied to the first partition.
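In shell terms that step looked roughly like this; /media/boot and /media/rootfs are just where my desktop happened to mount the two partitions, so substitute your own mount points:
sudo tar -xj -C /media/rootfs -f angstrom-setup-scripts/build/tmp-angstrom_2008_1/deploy/glibc/images/beagleboard/Angstrom-op-image-glibc-ipk-2010.7-test-20110106-beagleboard.rootfs.tar.bz2
sudo cp MLO /media/boot/   # copy MLO first so the OMAP boot ROM can find it on the FAT partition
sudo cp u-boot.bin uImage /media/boot/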
For a boot.scr I used
uboot-dsplink-512MB.cmd.scr, which got all the DSP stuff working (the 99M was critical: it caps how much RAM the kernel claims, leaving a reserved region for DSPLink/CMEM to use).
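Note the script has to end up named boot.scr on the first partition for u-boot to pick it up; assuming it deploys alongside the images (which is where I found mine):
cp angstrom-setup-scripts/build/tmp-angstrom_2008_1/deploy/glibc/images/beagleboard/uboot-dsplink-512MB.cmd.scr /media/boot/boot.scr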
Now pop the sdcard in the BB-xM and give it a shot (I was using the HDMI connector, which worked fine).
Getting Wi-Fi working
These commands worked when testing manually:
modprobe zd1211rw
iwconfig wlan0 mode managed
iwconfig wlan0 essid "ESSID"
iwconfig wlan0 key "s:WIFIKEY"
iwconfig wlan0 channel auto
iwconfig wlan0 ap auto
udhcpc -i wlan0
But to make it happen automatically, add the
zd1211rw module to
/etc/modules (which is normally autogenerated) and then put this in
/etc/network/interfaces:
auto wlan0
iface wlan0 inet dhcp
    wireless-essid jimmyc
    wireless-key s:BSWBBROWNBSWB
    wireless-mode managed
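You can check the stanza without a reboot using plain ifupdown:
ifdown wlan0 ; ifup wlan0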
Getting the webcam working
This was my big achievement of the last few days. For reference, I'm using a Logitech C910. Getting it detected wasn't a problem; it shows up as /dev/video0 automagically. What took a lot of hunting was 1) finding the combination of settings to get the DSP stuff compiled and working and 2) finding the gstreamer pipeline that would compress the video.
Playing it on the monitor wasn't too hard using mplayer:
mplayer -vo fbdev tv://
and this command tests whether you can encode video:
gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc1 codecName=h264enc engineName=codecServer ! filesink location=sample.mp4
Afterwards, check that the file size is non-zero to confirm it worked.
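For example:
ls -lh sample.mp4   # a zero-byte file means the encoder never produced output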
Now the real trick with this webcam was an element I had seen around in examples but had started dropping -- ffmpegcolorspace. Unfortunately the webcam doesn't pack the pixels in a way compatible with the TI codecs. This command, though, plays my webcam on the monitor through gstreamer:
gst-launch -v v4l2src device=/dev/video0 ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=640,height=480,framerate=30/1' ! videorate ! ffmpegcolorspace ! omapdmaifbsink
Breaking down what I know:
- the video/x-raw-yuv caps string forces the video source to output that format (which the camera can do)
- videorate makes the stream run at the correct rate and tags it accordingly
- ffmpegcolorspace repacks the pixels
- omapdmaifbsink plays it
Finally, we can replace playback with encoding:
gst-launch -v v4l2src device=/dev/video0 always-copy=FALSE num-buffers=2000 ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=640,height=480,framerate=15/1' ! videorate ! ffmpegcolorspace ! dmaiperf print-arm-load=true engine-name=codecServer ! TIVidenc1 codecName=h264enc engineName=codecServer bitRate=100000 rateControlPreset=2 encodingPreset=2 ! dmaiperf ! filesink location=sample.mp4
where the dmaiperf elements are just there to print statistics.
And that, in the words of Louis CK, is all the things I know. Currently the colorspace conversion is eating up a lot of CPU -- apparently ffmpegcolorspace is notoriously slow. av500 in the beagleboard IRC channel helped me figure out what it was doing, and I'm going to look into writing a NEON-accelerated gstreamer plugin to do the repacking more efficiently.
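To give an idea of why NEON should help: if the mismatch really is just byte ordering (say the camera's YUY2 versus UYVY on the codec side, which is my guess at what's going on, not something I've confirmed), the whole conversion is a byte swap within each 16-bit pair, and NEON does that for 8 pixels at a time:

#include <arm_neon.h>
#include <stdint.h>
#include <stddef.h>

/* Hypothetical inner loop for such a plugin: repack YUY2 (Y0 U Y1 V) into
 * UYVY (U Y0 V Y1) by swapping the bytes of every 16-bit pair.
 * Assumes len is a multiple of 16 and that UYVY is what the encoder wants. */
void yuy2_to_uyvy_neon(const uint8_t *src, uint8_t *dst, size_t len)
{
    size_t i;
    for (i = 0; i < len; i += 16) {
        uint8x16_t v = vld1q_u8(src + i);  /* load 16 bytes = 8 pixels */
        v = vrev16q_u8(v);                 /* swap bytes within each 16-bit lane */
        vst1q_u8(dst + i, v);              /* store the repacked pixels */
    }
}

A 640x480 YUY2 frame is 614400 bytes, so that's 38400 loop iterations per frame, versus the generic per-pixel C path ffmpegcolorspace takes.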