Monday, November 21, 2011

ESC Development

Note: this is a copy of some of my notes from other forums about the development of the ESC

So my pet back-burner project since Portugal has been an ESC, after seeing how ESCs are essentially the limiting factor in flight performance. We had the big prototypes flying a while back, but I just got back the latest revision of the ESCs, which will probably be the production version. I'm extremely excited about the performance:

Initial prototype:

More OpenPilot ESC testing from James Cotton on Vimeo.

And some of the first flight tests:


  • Compact form factor
  • Probably good for 40-50A (needs testing, I haven't got the transistors hot yet)
  • 2S-4S capable
  • Very high speed response
  • Self testing with diagnostic codes for failure
  • Tunable control loops to optimize for various motors and props. I'll have good values for some common ones.
  • Accurate current monitoring with soft limit (slows RPM to keep at the current limit) and emergency shut down for higher limit
  • Closed loop control so input determines precise RPM, regardless of prop size, etc
  • PWM control for standard compatibility. I2C/Serial/CANBus support for bidirectional communications for monitoring.
  • Configurable settings for current limit, control loop coefficients, RPM range
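To give a feel for how the closed-loop RPM control and the soft current limit can interact, here's a toy simulation. Everything in it is invented for illustration — the motor model, gains, and governor scheme are not the actual ESC firmware:

```python
# Illustrative sketch only: a toy first-order motor model with made-up
# constants. A PI loop drives duty cycle toward the RPM setpoint, and a
# slower "governor" integrator shrinks the allowed duty cycle whenever
# measured current exceeds the soft limit, so the loop settles at the
# current limit instead of shutting down.

def step_motor(duty, rpm):
    """Toy motor: RPM chases duty cycle, current is proportional to duty."""
    rpm += 0.2 * (duty * 10000.0 - rpm)
    return rpm, duty * 30.0  # (rpm, amps)

def run(setpoint, current_limit, steps=400):
    rpm = integ = duty = 0.0
    duty_cap = 1.0
    for _ in range(steps):
        rpm, current = step_motor(duty, rpm)
        # soft current limit: integrate the cap down while over the limit
        duty_cap = max(0.0, min(1.0, duty_cap + 0.001 * (current_limit - current)))
        err = setpoint - rpm
        u = 1e-4 * err + 1e-5 * integ          # PI controller
        duty = max(0.0, min(u, duty_cap))
        if duty == u:                          # anti-windup: only integrate
            integ += err                       # when the output isn't clamped
    return rpm, current

low_rpm, low_current = run(3000.0, 20.0)    # stays well under the limit
high_rpm, high_current = run(8000.0, 20.0)  # would need ~24 A unconstrained
```

The point is the qualitative behaviour: requests under the limit track the setpoint exactly, while requests that would exceed the limit settle at the highest RPM the current budget allows.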
I'll be sending some out to testers and working on the GCS to make it easy to configure. More updates and characterization data to come. Overall, though, it flies great. The quad was really locked in, and because I could push the gains much higher it reacted very quickly when asked to, without any wobble. I haven't actually tuned this yet, so it's not at the limit of how well it can perform.



Also, I've been tuning up the controller for my 8" props on KEDA 20-22L motors, and I'm quite pleased with the response. The time from setpoint change to 90% is around 30 ms for rises and 35 ms for small falls. Larger falls aren't quite as fast yet - I suspect that's the integral term, so a better-tuned feedforward model will help.

I'm using a transmitter for control, and the response is fast enough that you can hear the individual 50 ms updates when you move the stick slowly!

We'll definitely have to build up a database of settings for different motor/prop/battery combinations (although I think the battery can be normalized out). One complaint I've heard about MK ESCs is that only certain combinations work well with them; I'd like to avoid that for our system.

And here's another run with the feedforward model tuned up: even faster - more like 25 ms most of the time.
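To illustrate why the feedforward term buys speed: with pure PI the duty cycle only builds up as error accumulates, while a model-based feedforward term jumps most of the way to the required duty immediately. This is a toy simulation with made-up constants, not the real controller:

```python
# Toy illustration of feedforward speeding up the step response; the motor
# model, gains, and the "inverse plant" feedforward are all invented here.

def rise_steps(use_ff):
    """Control steps until a toy motor reaches 90% of a setpoint change."""
    rpm = integ = duty = 0.0
    setpoint = 6000.0
    for k in range(500):
        rpm += 0.2 * (duty * 10000.0 - rpm)          # toy first-order motor
        if rpm >= 0.9 * setpoint:
            return k
        err = setpoint - rpm
        integ += err
        ff = setpoint / 10000.0 if use_ff else 0.0   # inverse-model feedforward
        duty = max(0.0, min(1.0, ff + 1e-4 * err + 1e-6 * integ))
    return 500

with_ff = rise_steps(True)
without_ff = rise_steps(False)
```

With these invented numbers the feedforward run reaches 90% in a handful of control steps, while the PI-only run has to wait for the integrator to wind up.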

And here it is characterizing the system response:

I managed to get the bandwidth up to 35 Hz or so. This is the response to sinusoidal inputs:
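The idea behind that characterization, sketched in miniature: drive the system with sines of increasing frequency, measure the steady-state output amplitude, and read the bandwidth off as the frequency where the gain drops to about 0.707. Here the "system" is just a one-pole low-pass with a 35 Hz corner standing in for the real motor-plus-controller:

```python
import math

def gain_at(freq_hz, fs=1000.0, seconds=2.0):
    """Steady-state amplitude ratio of a one-pole low-pass (fc = 35 Hz)."""
    fc = 35.0
    w = 2.0 * math.pi * fc / fs
    alpha = w / (1.0 + w)          # backward-Euler one-pole coefficient
    y, peak = 0.0, 0.0
    n = int(fs * seconds)
    for i in range(n):
        x = math.sin(2.0 * math.pi * freq_hz * i / fs)
        y += alpha * (x - y)       # filter the input sine
        if i > n // 2:             # skip the start-up transient
            peak = max(peak, abs(y))
    return peak

low = gain_at(5.0)      # well below the corner: gain near 1
at_fc = gain_at(35.0)   # at the corner: gain roughly 1/sqrt(2)
```

Sweeping freq_hz over a range and plotting the gain gives the kind of frequency-response curve described above.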

And here's some code that visualizes the voltages on all the motor phases:

Friday, February 25, 2011

Getting the Nook Color DSP working on Gingerbread is proving quite difficult.

Essentially, playing video goes through this chain in Gingerbread:
  • Android level (haven't explored yet)
  • libstagefright
  • libOMX (OpenMAX)
  • libbridge
  • ti dsp bridge driver (kernel driver)
  • physical DSP
In stock CM7 these libraries seem to have incompatibilities. In addition, their sources and versions are poorly documented, which doesn't help - nor does the difficulty of finding "official" compatible releases.

Dalingrin has backported the 2.6.32 dspbridge driver (from the cm-nook branch, I believe) to the nook 2.6.29 branch, and this lets us load the DSP image using the libbridge present in CM7.

I would like to know which version of OMX is meant to be running with Gingerbread.  Supposedly the hardware-ti-omap3 branch in CM7 has working DSP on an OMAP3 platform, but I would love to see hard confirmation of this.  If that's the case, then our problem likely exists in the kernel driver, as everything else would be the same on either sholes or the n900.

The main problem is that we need a more systematic way of testing where in this chain things break, beyond simply observing that the DSP image loads but video doesn't play.  In particular, we need to find out whether the userspace errors trace back to a kernel error deep down, which would be consistent with the previous paragraph.

Other branches
OMAPZoom has a lot of development going on in their OMAP3/OMX/kernel-dspbridge/userspace-dspbridge branches.  However, the ported hardware-ti-omap3 branch doesn't compile because it's coupled to other changes in the base framework.

omapzoom gingerbread omx: after changing some libbridge/inc paths it compiles fine.  Needs testing.
omapzoom remotes/origin/dspbridge: hard to port to 2.6.29.  A lot of the definitions it uses are in files that have been shuffled around between mach-omap2 and plat-omap/include/mach.

Friday, January 7, 2011

BB-xM running custom Angstrom build

So I essentially scrapped all the previous work, except as notes on what I needed, and decided to get my own Angstrom build running.

Getting the Angstrom Image running

First, to get a build up and running, I followed the instructions here, which gave me a directory set up to build the recipes.

Installing TI DSP tools

However, before this will work you need to download and install the DSP tools.  This was a painful process with two main parts.  One is finding the XDCTools and Code Generation Tools on the TI web page and placing them in ~/angstrom-setup-scripts/sources/downloads.

For Ubuntu 10.10 x86_64, though, these downloads would not execute initially.  I had to run
sudo apt-get install ia32-libs
to install a compatibility layer. Then to test this run 
bitbake gstreamer-ti
before trying to compile your image.  The openembedded repository moves fast: between when I started the process and the next day, all the problems (aside from ia32-libs) were patched.  This meant I kept deleting my build directory whenever I noticed a change upstream (for some reason simply recompiling didn't work).

Creating custom image
The right way to do this is to create an overlay for all your recipes, and I'll do that at some point, but my brief attempt didn't work. Instead I copied the sources/openembedded/recipes/images/x11-image recipe to op-image and added the packages I wanted:
EXTRAS = "wireless-tools \
  kernel-module-uvcvideo \
  ti-dmai \
  ti-cmem-module \
  ffmpeg \
  gstreamer-ti \
  gst-plugin-avi \
  gst-plugin-video4linux2 \
  gst-plugin-videorate \
  task-gstreamer-ti \
  "

DEPENDS = "task-base"

IMAGE_INSTALL = "${XSERVER} \
    task-base-extended \
    angstrom-x11-base-depends \
    angstrom-gpe-task-base \
    angstrom-gpe-task-settings \
    ${SPLASH} \
    ${EXTRAS} \
    "

Then to compile your image and get the boot scripts you'll need:
bitbake op-image
bitbake angstrom-uboot-scripts
Following the instructions on the first page, I unpacked my rootfs image angstrom-setup-scripts/build/tmp-angstrom_2008_1/deploy/glibc/images/beagleboard/Angstrom-op-image-glibc-ipk-2010.7-test-20110106-beagleboard.rootfs.tar.bz2 onto the second partition of my SD card.  The u-boot, MLO, and uImage files (among others) were copied to the first partition.

For a boot.scr I used uboot-dsplink-512MB.cmd.scr which got all the DSP stuff working (the 99M was critical).

Now pop the SD card in the BB-xM and give it a shot (I was using the HDMI connector, which worked fine).

Getting Wi-Fi working

This command worked while manually testing:
modprobe zd1211rw
iwconfig wlan0 mode managed
iwconfig wlan0 essid "ESSID"
iwconfig wlan0 key "s:WIFIKEY"
iwconfig wlan0 channel auto

iwconfig wlan0 ap auto
udhcpc -i wlan0 
But to make it happen automatically, add the zd1211rw module to /etc/modules (which is normally autogenerated) and then put this in /etc/network/interfaces:
auto wlan0
iface wlan0 inet dhcp 
        wireless-essid jimmyc
        wireless-key s:BSWBBROWNBSWB
        wireless-mode managed        
Getting webcam working

This was my big achievement of the last few days.  For reference, I'm using a Logitech C910.  Getting it to detect wasn't a problem; it shows up as /dev/video0 automagically.  It took a lot of hunting to find 1) the combination of settings to get the DSP stuff compiled and working, and 2) the gstreamer pipeline that would compress the video.

Playing it on the monitor wasn't too hard using mplayer:
mplayer -vo fbdev tv://
and this code tests if you can encode video:
gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc1 codecName=h264enc engineName=codecServer ! filesink location=sample.mp4
Check afterwards that the file size is not zero to confirm it worked.

Now the real trick with this webcam was an element I had seen around but had been dropping -- ffmpegcolorspace.  Unfortunately the webcam doesn't pack the pixels in a way compatible with the TI codecs.  This command, though, plays my webcam on the monitor through gstreamer:
gst-launch -v v4l2src device=/dev/video0  ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=640,height=480,framerate=30/1' ! videorate ! ffmpegcolorspace ! omapdmaifbsink
Breaking down what I know:
  • the video/x-raw-yuv caps filter forces the video source to output that format (which the camera can do)
  • videorate makes it get the correct rate and tag the stream with it
  • ffmpegcolorspace repacks the pixels
  • omapdmaifbsink plays it
Finally we can replace playback with encoding.
gst-launch -v v4l2src device=/dev/video0 always-copy=FALSE  num-buffers=2000 ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=640,height=480,framerate=15/1' ! videorate ! ffmpegcolorspace ! dmaiperf print-arm-load=true engine-name=codecServer ! TIVidenc1 codecName=h264enc engineName=codecServer bitRate=100000 rateControlPreset=2 encodingPreset=2 ! dmaiperf ! filesink location=sample.mp4
Where the dmaiperf commands are just for showing statistics.

And that, in the words of Louis CK, is all the things I know.  Currently the colorspace conversion is eating up a lot of CPU -- apparently ffmpegcolorspace is notoriously slow.  av500 in the beagleboard IRC channel helped me figure out what it does, and I'm going to look into writing a NEON-accelerated gstreamer plugin to do this more efficiently.