Jack Crossfire's blog
Archive for February, 2013
Posted by Jack Crossfire | Feb 28, 2013 @ 02:34 AM | 9,282 Views
With the PX4Flow connected to USB & QGroundControl fired up on a Mac, it was immediately clear that it has a very long, sharp macro lens, allowing it to resolve texture from far away but also limiting its useful altitude range to 0.5m through infinity & requiring a very high framerate. The image is a 64x64 window in the middle of the sensor.

Setting it to low light mode produced a noisier image. In any mode, the position updates only came at 90Hz, even though it's supposed to do 250fps. Maybe it accumulates readings. There was no obvious update rate parameter.

It didn't seem to compensate for angular movement. According to the mavlink source code, flow_comp_m_* is the angular rate & altitude compensated motion in floating point meters, while flow_* is the raw number of pixels moved. The Maxbotix sonar showed its usual limited range over carpet.
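For reference, the fields in that OPTICAL_FLOW message (id 100) look something like this — a sketch of the payload contents from the message definition, not an exact wire-format memory map:

#include <stdint.h>

/* MAVLink OPTICAL_FLOW message, id 100.  The wire order is sorted by
   field size, so this struct is for reading, not for casting packets. */
typedef struct
{
	uint64_t time_usec;       /* timestamp in microseconds */
	float    flow_comp_m_x;   /* angular rate & altitude compensated, meters */
	float    flow_comp_m_y;
	float    ground_distance; /* sonar altitude in meters */
	int16_t  flow_x;          /* raw pixels moved */
	int16_t  flow_y;
	uint8_t  sensor_id;
	uint8_t  quality;         /* 0 = bad, 255 = best */
} optical_flow_t;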

It didn't need any calibration. Just connect it & it outputs Mavlink position packets. The lens was already focused. It outputs packets at 100Hz on USART3 but not USART2. It seems to run as low as 3.3V at 120mA. Below that, the current drops & the LEDs start going out. Above that, up to the 5V maximum, the current is constant.

Not having the right variation of micro USB connector is a drag, but there's nothing to do in QGroundControl besides test the focus & convince yourself low light mode is worthless.

The same thing could probably be done for a lot less weight with the TCM8230, windowing a 64x64 area to get some texture. It would require a large grid pattern on the floor, while the PX4Flow can get away with linoleum.

Flying with such a large lens pointing down is an expensive disaster in the making, but it's not clear if optical flow is going to be important enough in the long term to justify doing it from scratch.
Posted by Jack Crossfire | Feb 27, 2013 @ 09:30 PM | 10,067 Views
The PX4Flow is definitely a brick, compared to the size of modern aircraft. Most of the heft is in the lens. That seems to be the price of its reported accuracy. It's definitely not the ideal solution.

Weight: 17.5g

The Syma X1 can get it off the ground with degraded maneuverability, but would probably overheat after 2 minutes. Its bare HRLV sonar module is 4.5g. It's a real buster to mount on the Syma X1. Mounting it with the flight computer & battery is probably a bigger problem than implementing the algorithm from scratch.

The clearances between the components & the screw holes are the tiniest possible. Any crash is going to destroy it.

There is no source code for the PX4flow, & reflashing it would probably require soldering a header to its JTAG pads, but there is source code to parse its output:

https://github.com/cvg/px-ros-pkg/bl...flow_node/src/

Modern C programs are now full of google & boost dependencies, the same way they once depended on corba & com, the same way they once depended on gobject & glib. The iterations of coding conventions don't seem to go anywhere & are determined more by whose stock price is the highest than by what improves on the last conventions.

Whether any vehicle can navigate reliably without a hard floor can now be answered: fuggedaboudit. The Walkera Infra X uses IR for boundary detection but requires sonar on a hard surface for altitude. The IR must be extremely limited in its accuracy to require sonar for altitude.

The AR drone uses optical flow but requires sonar on a hard surface for altitude. The difference is the AR Drone needs details on its floor.

IR could be an intriguing project for many years, but in the next 60 days, you have to delegate that research to the academic community, which has only shown optical flow to do what you need.
Posted by Jack Crossfire | Feb 27, 2013 @ 02:03 AM | 9,929 Views
For all the talk of the drone revolution, there remanes but 1 vehicle which is truly a plug & play drone for the masses: the AR Drone. In true dot com tradition, ifixit.com made zillions of dollars in venture capital & never did a teardown of the 2.0, so it was time for the unemployed masses to step in.

The entire electronics assembly & battery are supported by a piece of foam. The boards still have many wires connecting to the frame, subjected to the full vibration.

A common 640x480 camera does the optical flow & seems to send some kind of compressed data. It may be a 30fps USB cam.


It has a mane brain board & sensor board, connected by just 8 pins. The 2 cameras connect to the mane board. The front cam may output uncompressed data.

A common PIC serializes the data from all the sensors. Maybe it also fuses the sensor data.

The air pressure sensor is under a piece of foam designed to not touch it.

The radio has a much smaller trace antenna than the AR Drone 1.0.

Under the can, it's a power management & USB chip from TI, some kind of flash & RAM from Micron. In usual modern fashion, the mane processor would be sandwiched under the big chip, so there would be no way to see it.

AR Drone optical flow test (1 min 49 sec)



The 1st test ever done to see how accurately the optical flow holds its position. Over carpet, it's hopeless & seeks the nearest line. With a target, it's decent at the default altitude,...
Posted by Jack Crossfire | Feb 26, 2013 @ 05:39 AM | 8,378 Views
So the STM32F4 had another linker anomaly where an object file had to be near the start of the link order, or 3 of the PWM pins would stop working after a subroutine call. In the past, complete crashes were caused by 32 bit jumps being encoded the wrong way as 16 bit ARM Thumb instructions. There seem to be more cases where the compiler isn't handling 32 bit jumps properly.

Despite not being an ARM Thumb expert, I do know a 32 bit jump requires loading a register 1st, then jumping to the register, while a 16 bit jump can be done to a literal.
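For what it's worth, this is roughly what the 2 forms look like — a hedged sketch, with GCC inline asm & an arbitrary register:

#include <stdint.h>

/* a 16 bit Thumb branch encodes its offset as a literal in the
   instruction, so its reach is tiny:
       b.n  label          @ +-2KB
   a 32 bit jump loads the address into a register 1st, then jumps:
       ldr  r3, =label
       bx   r3
   in C, the long form amounts to a call through a pointer: */
void far_jump(uint32_t target)
{
	target |= 1;  /* bit 0 set keeps the CPU in Thumb state */
	__asm__ volatile ("bx %0" : : "r" (target));  /* never returns */
}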



Simply integrating the gyros was how Mikrokopter originally worked & it seemed to be really stable & fast. So the same thing was done on Marcy2, with pretty bad results on the bench. At least it updates at over 1000Hz. Doing the full quaternion rotation would be quite unpleasant & slow. The theory is a fast, inaccurate AHRS would be better for an indoor vehicle than the complete quaternion. This is all because of the lack of time to bring up the full floating point functionality & the desire to have something that would work on a simpler micro.
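The integration itself is trivial — a minimal sketch assuming calibrated rates in radians/sec & a fixed 1000Hz loop, with none of the cross-axis coupling the quaternion rotation would capture:

#define DT (1.0f / 1000.0f)  /* loop period for a 1000Hz update */

typedef struct { float roll, pitch, yaw; } attitude_t;

/* accumulate raw gyro rates into Euler angles.  Only valid for small
   angles & gentle maneuvers, which may be why the bench results are bad. */
void integrate_gyros(attitude_t *att, float gx, float gy, float gz)
{
	att->roll  += gx * DT;
	att->pitch += gy * DT;
	att->yaw   += gz * DT;
}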



Finally, the bench is made of wood & wifi doesn't work when the antenna is near wood. Not surprisingly, the board can't be used at all while lying on the bench.
Posted by Jack Crossfire | Feb 25, 2013 @ 07:03 AM | 7,711 Views
Having found no personal use for an optical flow guided UAV, the desire still returned to have wifi for developing it. With the code re-enabled, I2C started glitching out again at 400kHz, despite traces only 6mm long. Back to 200kHz it went & the glitches became manageable. Any glitch will crash the vehicle.

Wifi continues to be really problematic. It could be lousy performance from the rtl8192, but it's the only thing small enough. It needs 200mA, requiring 3 MIC5205's to power the entire board, so far.
Posted by Jack Crossfire | Feb 23, 2013 @ 05:19 AM | 8,337 Views
With a miniature AR drone now the goal, a new board rises from the remanes of the old, with provisions for an optical flow camera & the return of wifi, all the processing to be done on the board. Rather than wait 2 weeks for digikey, it was easier to rip the MOSFETs off the Syma X1.

Parrot is most definitely going to release a mini AR drone in the near future. That's the easiest prediction ever made, but one doesn't take chances. The mane unknowns are how much current the PX4flow board uses & how much it weighs. If it's out of bounds, the camera goes on the mane board & the algorithm needs to be written from scratch. This would definitely be less accurate than the PX4.

Either way, the Maxbotix sonar is required. The difference would be the large PX4 camera lens & 2nd board.

No-one has ever taken apart an AR Drone 2.0 & documented it. Judging from the 1 blurry video Parrot provided, it uses an inferior camera to the PX4.
Posted by Jack Crossfire | Feb 22, 2013 @ 12:13 AM | 9,532 Views
After years of dicking around with charge pumps, pullup resistors, & BJT's to drive MOSFETs that require 5V to turn on, finally got around to probing the Syma X1's board. It uses a MOSFET that turns on at 2.5V: the 9926A, an extremely popular dual MOSFET made by many manufacturers.

Almost 1/2 the last board was for generating the high gate voltage needed by the MOSFETs. The 9926A should greatly simplify matters. The Syma X1 turns it on with 3.4V PWM at 800Hz. The back EMF spikes to 7V at 800Hz, even with the snubber diodes.
Posted by Jack Crossfire | Feb 21, 2013 @ 05:01 AM | 8,969 Views
So the next vehicle needs to use optical flow + sonar & the last board needs to be scrapped. The PX4FLOW module would be the best solution, but it's probably too heavy, necessitating a new board with smaller camera, motor control, IMU, & radio.

There are some optical flow algorithms on

https://github.com/ArduEye/ArduEyeLi...rduEye_OFO.cpp

They're algorithms from 1984, intended for movement of 1 pixel. The wisdom of the time said your best bet is to scan every pixel in the image for movement of 1 pixel, so any ideas about logarithmic macroblock searches, blob detection, or edge detection probably won't work.

PX4FLOW says it scans a binned 188x120 image at 250Hz. The algorithm in the PX4FLOW is described in

https://pixhawk.ethz.ch/px4/_media/m...flow_paper.pdf

It's assembly language intensive. Their camera supports greyscale, allowing the fastest DMA capture.

It does basic, exhaustive absolute difference comparisons of 8x8 macroblocks in a search radius of 4 pixels. It scans 64 8x8 macroblocks, so only 1/5 of the entire image is scanned, but the scanned regions are at high enough resolution to resolve texture.

Instead of averaging all 64 motion vectors, it takes the most frequently occurring vector. Then it does a subpixel absolute difference comparison using the closest macroblock to the final result to get subpixel motion.
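The core operation is simple enough — a hedged sketch of 1 block's search, with only the 8x8 block size & 4 pixel radius taken from the paper; the function names & layout are made up, & the caller has to keep blocks at least 4 pixels from the frame edges:

#include <stdint.h>
#include <stdlib.h>

#define BLOCK  8
#define RADIUS 4

/* sum of absolute differences between a block in the previous frame &
   a shifted candidate position in the current frame */
static uint32_t sad8x8(const uint8_t *prev, const uint8_t *cur,
	int stride, int bx, int by, int dx, int dy)
{
	uint32_t sum = 0;
	for(int y = 0; y < BLOCK; y++)
		for(int x = 0; x < BLOCK; x++)
			sum += abs(prev[(by + y) * stride + (bx + x)] -
				cur[(by + y + dy) * stride + (bx + x + dx)]);
	return sum;
}

/* exhaustive +-4 pixel search for 1 block's motion vector.  The 64
   per-block vectors then get histogrammed & the most frequent wins. */
static void block_flow(const uint8_t *prev, const uint8_t *cur,
	int stride, int bx, int by, int *best_dx, int *best_dy)
{
	uint32_t best = UINT32_MAX;
	for(int dy = -RADIUS; dy <= RADIUS; dy++)
		for(int dx = -RADIUS; dx <= RADIUS; dx++)
		{
			uint32_t s = sad8x8(prev, cur, stride, bx, by, dx, dy);
			if(s < best) { best = s; *best_dx = dx; *best_dy = dy; }
		}
}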

The trick is using a gyro to cancel out picture movement from banking & turning, yielding only movement from horizontal motion.
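Presumably the compensated output amounts to something like this — a guess at the arithmetic, not the actual PX4Flow code, with the focal length in pixels as an assumed calibration constant:

/* convert pixel flow over 1 frame interval to metric ground velocity,
   cancelling the rotation the gyro saw during the same interval */
float flow_comp(float flow_px, float gyro_rad_s, float dt,
	float focal_px, float ground_distance_m)
{
	float flow_rad = flow_px / focal_px;            /* pixels -> radians */
	float motion_rad = flow_rad - gyro_rad_s * dt;  /* subtract rotation */
	return motion_rad * ground_distance_m / dt;     /* radians -> m/s */
}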

The PX4Flow source code would be nice, especially for the assembly language routines. Using the smaller, easily obtained TCM8230, the nearest resolution drops to 160x120, but software has to convert its output to greyscale & it can't go higher than some 70fps.
Posted by Jack Crossfire | Feb 20, 2013 @ 03:22 AM | 8,868 Views
Finally got I2C completely functional on the STM32F4. The problem is the example code needed to be rewritten as a state machine to be useful.

It checks for BUSY being true at every step except after STOP, where it checks for BUSY being false. Also, if it times out from a legitimate glitch, the necessary sequence is to DeInit & reinitialize the peripheral to get it to work again. You won't likely catch that nugget of information in your porting. That got it to the full 400kHz: 1600 gyro samples/sec, 400 accel samples/sec, 120 mag samples/sec.
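The recovery path boils down to something like this — a sketch using the standard peripheral library names, with the port, timeout threshold, & init settings as placeholders:

#include "stm32f4xx.h"

#define I2C_TIMEOUT 10000  /* placeholder spin count */

extern I2C_InitTypeDef i2c_config;  /* filled in at boot */

/* spin on a flag.  On timeout from a glitch, DeInit & reinitialize the
   peripheral, since it stays wedged otherwise. */
static int i2c_wait(uint32_t flag, FlagStatus expected)
{
	uint32_t t = I2C_TIMEOUT;
	while(I2C_GetFlagStatus(I2C2, flag) != expected)
	{
		if(--t == 0)
		{
			I2C_DeInit(I2C2);
			I2C_Init(I2C2, &i2c_config);
			I2C_Cmd(I2C2, ENABLE);
			return -1;
		}
	}
	return 0;
}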

Would say no-one else does this; everyone just relies on a preemptive OS & threading. Maybe they have the right idea in relying on threading, but it's a lot of overhead. The college love affair with threading has been replaced by maximizing every clock cycle & getting the lowest latency possible.

Next, in true microcontroller fashion, letting the UART overflow crashes it, so you have an interrupt handler draining the UART. Writing to the flash prevents the UART interrupt handler from working & crashes it, so you need to disable all the UARTs before writing to the flash. Another nugget taken care of by an OS.
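In code, the ordering is just this — a sketch with the SPL flash calls & an assumed single UART:

#include "stm32f4xx.h"

/* write 1 word to flash with the UART interrupt masked, since the
   flash stall starves the receive handler & overflows the UART */
void flash_write_word(uint32_t address, uint32_t data)
{
	NVIC_DisableIRQ(USART1_IRQn);  /* disable every UART in use */
	FLASH_Unlock();
	FLASH_ProgramWord(address, data);
	FLASH_Lock();
	NVIC_EnableIRQ(USART1_IRQn);
}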

Anyways, during the process of STM32F4 debugging, it was amusing to look through the PX4 source code, finding it was written on the NuttX OS, which rewrote all the SDK functions from scratch, with their own deeply nested abstraction layer. It's the kind of dedication to the study of a single chip that you see from college students trying to prove...
Posted by Jack Crossfire | Feb 18, 2013 @ 04:46 PM | 8,421 Views
It really felt like Marcy's domain, since its sole purpose was for flying Marcy class aircraft. It was a buster to move all the gear required for flying, but once set up & debugged, it was her domain. Those aircraft enjoyed having enough room to fly around instead of being confined to a single point.

There were only 3 flying campaigns in there. Google requires an internet connection to compile any Android software, so any changes to that section required driving home. Then there were the wifi connection problems.

But the rarity & the fact that it was during the holidays probably enhanced the nostalgia. The energy required for setting up demanded having something new to test, which was usually just 2 days out of 2 months.

Being in the UAV business for 6 years showed all too well how hard it is to make enough money on something that flies to afford a flying space. The energy storage & position sensing technology doesn't exist to make a mass consumer product out of an indoor UAV. Your target market can at most be RC pilots or collectors.

Eventually, the technology will catch up to the problem, but the 1st consumer products are going to be from the traditional high rollers with enough money & time to pull it off.
Posted by Jack Crossfire | Feb 17, 2013 @ 07:05 AM | 7,947 Views
Night run on the gopro (1 min 38 sec)


Making things go fast. Intriguing what the gopro can see at night. Cinelerra actually managed to stabilize it. When in timelapse mode, it locks exposure to the 1st frame. The longest exposure is 1/2 second, allowing it to see some stars. The hardware can certainly do better, but the software is crippled.

Mane motion (10 min 30 sec)
Posted by Jack Crossfire | Feb 15, 2013 @ 08:25 PM | 8,502 Views
As a means of embedding all the machine vision computations in a self contained ground station, a few better alternatives to the raspberry pi are slowly emerging.

http://www.toradex.com/products/apal...les/apalis-t30

Quad 1.4GHz for $175, equivalent to 6 raspberry pi's on a single, slightly smaller board, for a lot less money. There seems to be a 1 year lag from the time a new performance level reaches phones to the time it's available as a standalone board. Using a rooted phone itself for the same computing power remanes insanely expensive.

The raspberry pi has never been very stable at any clockspeed. It's certainly unusable at 1GHz, but it's not bulletproof at any lower speed, despite getting power from a 5A power supply & having a fan cooling it.
Posted by Jack Crossfire | Feb 14, 2013 @ 02:36 AM | 8,617 Views
Gopro run timelapse (2 min 26 sec)


A sports camera is something never used by attainable women. There's no documentation for it, since it's only been around for 2 months. Some nuggets of information learned:

It works as a USB mass storage device, but it's a low speed USB device. You need to remove the SD card & plug it into a card reader to transfer any normal sized files. If you got a 64GB card, the slot is going to wear out from swapping long before you ever transfer 64GB over USB.

It's a miracle of bad software, since the Ambarella chips have high speed USB.

To mount the SD card in Linux, you need a kernel compiled for user space filesystems, scons, fuse, & fuse-exfat. They actually all compiled on an ancient 3 year old Fedora.

To get wifi to work, you need to download an update from gopro.com, but more importantly, you need to create an essid & password. The default essid & password don't work. You can't change the essid & password without downloading & reinstalling the update.

It can actually update from a VFat formatted card, but can't store anything on it until reformatting it as exfat.

Maximum video is 3840x2160 at 50 megabits/sec. Maximum still photos are 4000x3000.

The wifi app doesn't do much. It has a delayed, low quality viewfinder mode & allows some settings to be changed, but can't download files from the card. Being able to download & play files from the camera is a no brainer. The viewfinder doesn't work...
Posted by Jack Crossfire | Feb 13, 2013 @ 03:01 AM | 8,439 Views
So there was another bug somewhere in the software I2C library of the past several years, causing lots of problems with the MPU9150. It was also dreadfully slow at reading modern chips, which increasingly use I2C for high speed broadband.

With the ARM at 168MHz, consuming 60mA, doing nothing else, it only got 348 readings/sec out of the MPU9150 & managed to bit bang I2C at only 169kHz. That's equivalent to what the Ladybird's piss poor Xmega16D4 did for just the gyros, so the Ladybird probably needed 1000 readings/sec to be stable.

Fortunately, a hardware I2C port happened to be available on the ARM without enlarging the board. The hardware only managed 200kHz on the I2C bus, but thanks to not having any bugs, it could burst transfer out 736 readings/sec with much less CPU time. The MPU9150 bypass register worked.

Just reading the gyros, it can do 1400 readings/sec. There are ways to throttle bandwidth to the gyros to get a 1300Hz gyro reading, 50Hz accel reading, & 50Hz mag reading, but too bad I2C can't go above 200kHz.
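The throttling can be a simple round robin in the polling loop — a sketch of the idea with made-up helper names, ratios picked to land near the rates above:

/* read the gyros on every pass, but only grab the accel & mag on every
   26th pass, so nearly all the 200kHz bus bandwidth feeds the gyros */
void sensor_loop(void)
{
	int pass = 0;
	while(1)
	{
		read_gyros();        /* ~1300 readings/sec */
		if(pass % 26 == 0)
			read_accel();    /* ~50 readings/sec */
		if(pass % 26 == 13)
			read_mag();      /* ~50 readings/sec, offset phase */
		pass++;
	}
}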

Software I2C is still needed for the slower sensors, to minimize board space. That library has had endless bugs, but it always worked well enough & it came from a goog search.

Even with all this CPU power, the trend is to continue doing AHRS on the ground instead of the air. There are so many autopilots & each new implementation of AHRS on a microcontroller is still a major operation. It still needs a lot of either fancy fixed point or floating point compiler magic.
Posted by Jack Crossfire | Feb 11, 2013 @ 11:54 PM | 8,656 Views
http://www.securitycamera2000.com/pr....6mm-Lens.html

was not the camera used in

Walkera Ladybird QR FPV build (8 min 22 sec)


He never gave out the camera model, but he got it to run on 3V. The 420TVL camera requires at least 5V, so it won't work on any of the toy copters without a 2nd battery. Its picture quality was a lot better than the 3V camera's, when it worked. It required a few power cycles to work.

It seemed to have loose components. It worked better after reflowing some random components. Hot gluing the wires to it broke it.

The web page said CMOS. The package said CCD. It didn't have any rolling shutter effect & the voltage was too high for CMOS. The overlay circuit did a better job on it, since it has very high impedance. The sync pulses only go to 0V, so the overlay circuit could omit the LM1881 & use the microprocessor's comparator, if it didn't have to work with any other camera.

The video is interlaced 59.94 fields per second, so there's potential for tight synchronization with a monocopter.
Posted by Jack Crossfire | Feb 11, 2013 @ 03:30 AM | 9,076 Views
The new Syma X1 board was fully loaded. It weighed 5.6g; the old Syma X1 board was 5.3g. The original board obviously wasn't optimized for size. The new board also has a much more powerful CPU & a full IMU. The Ladybird board was 2.7g, of course.

Test ran a motor at 1A. The MOSFETs do indeed need 5V to turn on. The mane unknowns are if the high current is going to interfere with the CPU, if the MOSFETs are going to blow, if the charge pump can bias all 4 MOSFETs, & if the voltage regulator can power it at the required clockspeed.




Anyways, discovered ganging 4 GPIOs improves the video overlay. Still not as good as a multiplexer, but probably a better deal for the size & the application.
Posted by Jack Crossfire | Feb 10, 2013 @ 04:11 AM | 8,099 Views
Some tests with an ancient camcorder's composite out were disappointing. There's just not enough GPIO drive strength to make an easily discernible overlay. There was no blank area to put a code in, as hoped, either. Unsharp masking brings out the code only if there's no nearby detail.

It's good enough for an initial test, but the analog multiplexer is required for reliable use. Hard to believe this horrible camcorder video was the highest quality most of the world saw until after 2005. Just 10 years after the 1st sign of an HDTV appeared in a home, there are now no more low definition TV's anywhere, at all. The transition seemed to mostly happen abruptly, after 2008.

Anyways, progress continues towards an indoor quad controller. It does little more than the Ladybird but takes much more space. Apartment manufacturing technology isn't quite up to Chinese levels. There is hope that combining the brain of a Ladybird with the passive stability of the Syma X1 frame will bring more stability than either alone.
Posted by Jack Crossfire | Feb 09, 2013 @ 01:36 AM | 8,348 Views
After all that speculation & rework, the video overlay was implemented. The only way to get black pixels was shorting the composite signal directly to a GPIO. The input protection diode didn't raise the HSYNC pulses as much as feared. Getting the whitest pixels required shorting directly to a PNP transistor fed by 3.3V. Both the HSYNC & VSYNC outputs of the LM1881 were required to get the best sync.

Using the PIC's RC oscillator wasn't good enough to show any text, but good enough for showing a code. The PIC at 32MHz was fast enough to draw at the limit of the video resolution. Ordinary telemetry could have been sent this way years ago, when the 1st autopilot was being developed with nothing but a 72MHz radio & $80 for a set of XBees was unbelievable.

To minimize size, the PNP transistor was eliminated & the overlay was done by toggling a single pin, shorted to the composite signal. The whites weren't whitest, but good enough. The PIC probably can't drive enough current to damage the DSLR.
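The whole trick is switching 1 pin between output low & high impedance — a sketch in XC8 style, with the pin assignment & line numbers as placeholders & the output latch preset to 0:

#include <xc.h>

volatile unsigned int line;  /* scanline counter, reset by VSYNC */

/* output low pulls the composite signal toward black; input mode lets
   the video pass untouched.  RB0 is an assumed pin. */
#define OVERLAY_ON()  (TRISBbits.TRISB0 = 0)
#define OVERLAY_OFF() (TRISBbits.TRISB0 = 1)

/* called from the LM1881 HSYNC interrupt */
void draw_code_line(void)
{
	if(line >= 40 && line < 48)  /* placeholder lines carrying the code */
	{
		OVERLAY_ON();
		/* delay to the horizontal position & toggle the pin per bit */
		OVERLAY_OFF();
	}
	line++;
}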

The circuit added some noise to the video. While you could probably get the sync pulses with discrete parts, the LM1881 is the most compact way.
Posted by Jack Crossfire | Feb 08, 2013 @ 05:25 AM | 7,704 Views
Surprised to find the el cheapo USB video capture dongles don't use any compression. They send the full uncompressed 720x480 30fps frames through isochronous USB packets. There are still too many latency points to rely on it for monocopter synchronization.

Given the complexity of today's video codecs, it's amazing that for the entire generation which lived from 1950-2000, all their sights came down a single wire whose voltage was just the sequential scanlines of 1 frame after another.

Only 1 gadget in the apartment still produces a composite signal: an ancient DSLR. Image features are visible in the waveform. The DC voltage is the luminance: -0.5V is a sync pulse & 0-1V is the picture. The magnitude of a 3.58MHz sine wave on top of the DC voltage is the saturation. The phase of the sine wave, relative to the color burst, is the hue. It's amazing this technology from 1953 could resolve 720x480 pixels & transport all our 1980's memories.
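In equation form, 1 scanline is roughly

V(t) = Y(t) + S(t) * sin(2 * pi * f_sc * t + phi(t))

where Y is the luminance, S the saturation, phi the hue relative to the color burst, & f_sc the 3.58MHz subcarrier. A simplification — the real signal adds sync, blanking, & the quadrature details of the 2 color components, but it's the general idea.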

The easiest way to overlay a monocopter synchronization code on the video is to pull the voltage up or down, but the composite output & input have very low impedances, so pulling the full signal range would draw enough current from the composite output to fry it. It's probably OK to pull the current for a small sync code.

The right way is to use an analog multiplexer. http://electronics-home-projects.tripod.com/ uses a 74hct4052.

http://garydion.com/projects/videoverlay/ gets away with directly pulling the voltage. It uses the...
Posted by Jack Crossfire | Feb 06, 2013 @ 07:53 PM | 7,904 Views
Human vs computer (5 min 38 sec)


Human copter tries to interfere with computer copter without causing any damage. The mane problem with having the computer fight back is the computer knowing the position of the human. Machine vision can only locate 1 copter; the 2nd would have to rely on less accurate sonar position sensing.

It's sort of magic that the Vicon system can differentiate 2 copters when they're close to each other, but then I haven't scrutinized all of the Vicon demos to see if they ever get near each other or systematically have a minimum distance to avoid confusing the machine vision.

The problem with making a dogfight game between a computer controlled copter & a human has always been the cost of actually shooting down the models. There was really only 1 useful idea:

Shoot the opponent with a laser to score damage points. After a certain number of damage points, the computer would go into a landing & the human would get an uncontrolled throttle reduction. It would require light sensors on the copters. Having just 1 light sensor & imposing limitations on the laser firings, like a delay & a maximum duration, would make it more challenging.

The laser would interfere with the machine vision. An algorithm would need to exclude all new blobs during the laser firing.