Posted by Jack Crossfire | Oct 26, 2015 @ 11:22 PM | 3,690 Views
Autonomous driving begins (3 min 56 sec)

As it was when copters 1st hovered on their own, it wasn't a sudden ability to assertively drive itself but an incremental journey towards being able to drive down the path for short distances. Recorded the webcam footage at 30fps. The SD card has enough room for 2 hours. The Odroid seems to get 2 hours of runtime from the 4Ah battery.

Quite nihilistic to see such a high frame rate come from such a cheap webcam, but it was made possible by the enormous computing power applied to it, compared to what was around in its day.

By the end of the session, it was assertively steering towards the vanishing point without any damping or lead compensation. It had a hard time with shadows. It couldn't stay laterally on the path. Software for lateral control is ready to go, during the next time off.
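The steering at this stage amounts to a bare proportional controller on the vanishing point offset. A minimal sketch in Python (the function name & the [-1, 1] command range are assumptions, not the actual autopilot code):

```python
def steer_to_vanishing_point(vp_x, frame_width, gain=1.0):
    """Proportional steering toward the vanishing point.

    vp_x: detected vanishing point x coordinate in pixels.
    Returns a steering command in [-1, 1]. No damping or lead
    compensation, matching these first test drives.
    """
    # Error is the horizontal offset from frame center, normalized
    # to half the frame width.
    error = (vp_x - frame_width / 2.0) / (frame_width / 2.0)
    command = gain * error
    # Clamp to the assumed servo range.
    return max(-1.0, min(1.0, command))
```

Damping or lead compensation would be layered on top of this same error term once oscillation shows up.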

It slowly emerged that the super chroma key was similar to the chroma keying Goog had in 2005. The Goog used chroma keying in its 1st autonomous car. Like super chroma keying, it applied many different color keys to the same image. Unlike super chroma keying, it took the color keys from the same coordinates in many points in time, rather than taking many different coordinates in the current frame. It had to drive a certain distance to gather all the possible colors in the path, but it didn't gather colors from off the path.
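The 2005 style temporal keying, as described, can be sketched like this (a hypothetical illustration, not anyone's actual code: frames are modeled as dicts mapping pixel coordinates to YUV tuples, & the distance metric & threshold are assumptions):

```python
def collect_temporal_keys(frames, sample_coord, max_keys=16):
    """Gather color keys from the SAME coordinate across many
    frames: drive a distance & sample the path color at a fixed
    image location over time."""
    keys = []
    for frame in frames:
        color = frame[sample_coord]     # (y, u, v) tuple
        if color not in keys:
            keys.append(color)
        if len(keys) >= max_keys:
            break
    return keys

def is_path_pixel(color, keys, threshold=30):
    """A pixel is path if it is near ANY of the collected keys."""
    def dist(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return any(dist(color, k) <= threshold for k in keys)
```

The super chroma key inverts this: it takes many coordinates from the current frame instead of one coordinate from many frames.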

Like copter autopilots 10 years ago, there's no free cookbook for making an autonomous car, so every step takes a long period of discovery. It took a lot of experimenting with new edge detecting methods to discover old fashioned chroma keying was the best method. It would be impossible for 1 person to make it handle every situation, handle turns at 10mph, or avoid obstacles.

Like autonomous copters, an accepted way of doing it will eventually become an easy download. The accepted method is going to take a lot more sensors. The key sensor is LIDAR while with copters it was GPS. There's still something novel in being the 1st & doing it with just a camera.
Posted by Jack Crossfire | Oct 24, 2015 @ 09:08 PM | 3,580 Views
It was another day of rearranging components & debugging steering before any attempt at machine vision could happen. The massive steering changes required for autopilot were a long way from working. Had to move the Odroid to make room for its cables. Captured some video. Added a variable frame rate for recording the input video. Wifi was much more reliable from the macbook than the phone. Couldn't connect long enough to start the program from the phone. Once started, it didn't crash. Managed to access the STM32 just enough to do the required maintenance. Rick Hunter never had to deal with component accessibility. Such is the difference between Japanese cartoons & reality.
Posted by Jack Crossfire | Oct 24, 2015 @ 01:49 AM | 3,941 Views
After some fabrication & debugging, the 'droid was sending geometry to the autopilot & the autopilot was successfully taking configuration parameters from the phone to tune the path following. Tuning the vision algorithm would involve very difficult text editing over the unreliable wifi.

Despite every effort, there was no way to allow access to the STM32 without a major teardown. Only a serial port was extended. The vision algorithm got a slight improvement by converting the geometry detection to a polar line equation & using logarithmic searching.
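The polar line form & the logarithmic search can be sketched roughly like this (a hypothetical Python illustration, not the actual vision code; the score function, tolerance, & angular interval are assumptions):

```python
import math

def line_score(theta, rho, points, tol=1.5):
    """Count edge points within tol pixels of the polar line
    x*cos(theta) + y*sin(theta) = rho."""
    c, s = math.cos(theta), math.sin(theta)
    return sum(1 for (x, y) in points if abs(x * c + y * s - rho) <= tol)

def refine_theta(points, rho, lo, hi, iterations=20):
    """Logarithmic search: repeatedly halve the angular interval,
    keeping the half whose quarter point scores better."""
    for _ in range(iterations):
        mid = (lo + hi) / 2.0
        left = (lo + mid) / 2.0
        right = (mid + hi) / 2.0
        if line_score(left, rho, points) >= line_score(right, rho, points):
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0
```

Each halving discards half the remaining angles, so the cost grows with the log of the angular resolution instead of linearly.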

Finally, there was a desire to record raw video from the camera, to reproduce the path detection offline. After struggling for many years to record 720x480 TV on dual Athlons, it was quite humbling to find the Odroid could record 640x480 in 90% quality JPEG at 30fps, in a single thread, while simultaneously performing path detection. The path detection still can't go above 160x120 without a massive slowdown.

Once fixed on the car, the Odroid has proven much more reliable than the PI. It wasn't reliable when sitting on the bench because its 0201 caps were quite sensitive. It can stay on idle forever & hasn't had any issues in extended machine vision runs using 300% CPU.

With the amount of money & time invested, there's every hope path following will work, but it probably won't.
Posted by Jack Crossfire | Oct 21, 2015 @ 08:06 PM | 3,905 Views
It came & went. Nothing happened. A lot of us were expecting someone from 1985 to show up in a flying car.
Posted by Jack Crossfire | Oct 18, 2015 @ 10:55 PM | 120,540 Views
After weeks of baby steps, the Odroid finally connected to the phone as an access point. It was just as unreliable as the Pi. This didn't involve a kernel recompile, since trying to compile the kernel failed.

The old RTL8192 wifi dongle once again reappeared for the access point role. This time, one must download the mysterious hostapd_realtek.tar from somewhere. This contains a mysterious version of hostapd which Realtek modified to actually work. Plugging in the dongle got the right kernel module to load, the 8192cu.

The hostapd command was:

/root/hostapd /root/rtl_minimal.conf&

The dhcp command was:

The hostapd config file was:

# wlan interface. Check ifconfig to find your interface

# Network SSID (Your network name)

# Channel to be used! preferred 6 or 11

# Your network Password

# Only change below if you are sure of what you are doing
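The values were stripped from this post, so for reference, a hedged reconstruction of the config (interface name, driver string, SSID, & channel are assumptions; the wpa lines are commented out since encryption had to stay off):

```conf
# wlan interface. Check ifconfig to find your interface
interface=wlan0
driver=rtl871xdrv
# Network SSID (Your network name)
ssid=rover
# Channel to be used! preferred 6 or 11
channel=6
# Your network Password (disabled: enabling encryption broke dhcp)
#wpa=2
#wpa_passphrase=password
# Only change below if you are sure of what you are doing
hw_mode=g
```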

Key to getting hostapd to work was not using encryption. Commenting out wpa=2 disabled it. Enabling encryption caused dhcp to fail. Another requirement was deleting /usr/sbin/NetworkManager. There was no other way to disable NetworkManager than removing the program.

The dnsmasq config file was /etc/dnsmasq.conf
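The contents didn't survive in this post, but the dhcp side was presumably just dnsmasq reading this file. A minimal sketch (interface name & address range are assumptions):

```conf
# Hand out addresses only on the access point interface
interface=wlan0
dhcp-range=192.168.2.2,192.168.2.20,255.255.255.0,12h
```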



Since the access point's sole purpose was aligning the camera, there was no need for encryption.
Posted by Jack Crossfire | Oct 18, 2015 @ 03:20 PM | 119,110 Views
So the Odroid has a 2nd UART on its 30 pin header, the STM32 board has another unused UART, but it isn't exposed on any pads & no-one was in the mood to make a new board. Back to SPI it was. Based on goog searches, getting SPI to work on the Odroid is a big deal. It wasn't expected to be as easy as the PI because of the small userbase & expectations didn't disappoint. Where the PI has a gootube video on every minute subject, in plain English, the Odroid documentation consists solely of gibberish like "download code" or "just change the dtb". The PI would officially come nowhere close to the required processing power.

Finally, ended up bit banging the SPI using simple writes to the /proc filesystem to change GPIOs. This achieved 3.33kbit while all 4 CPUs were processing video at 50%. Since each bit contains 3 sleep statements, it's a sign the kernel was switching tasks at 10khz. Without sleep statements, it went at 38kbit with spurious gaps of 100us. 100us is a 10khz frequency.

38kbit is too fast for the STM32, which also uses software defined SPI. 3.33kbit can send only 13 bytes of information per frame at 30fps. All problems would be solved by making a new STM32 board with another UART. Just this small amount of bit banging made a 1% increase in CPU usage on all of the Cortex-A7's.
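The bit banging logic itself is simple enough to sketch. Here the GPIO write is abstracted into a callback so the logic is testable; the real version wrote levels to GPIO value files in the /proc (or /sys) filesystem, with sleeps between edges (pin names & mode 0 framing are assumptions):

```python
def bitbang_byte(byte, set_pin, mosi="mosi", sck="sck"):
    """Shift one byte out MSB first, SPI mode 0: data is valid
    on the rising clock edge. set_pin(name, level) performs the
    actual GPIO write (a filesystem write on the Odroid)."""
    for i in range(7, -1, -1):
        set_pin(mosi, (byte >> i) & 1)  # present the data bit
        set_pin(sck, 1)                  # receiver samples here
        set_pin(sck, 0)                  # return clock low

# Capture the pin transitions instead of touching real GPIOs.
trace = []
bitbang_byte(0xA5, lambda pin, level: trace.append((pin, level)))
```

Three writes per bit is where the 3 sleep statements per bit come from, hence the 10khz task switching showing through as 3.33kbit.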

Finally, the GPIO voltage was only 1.8V, so level conversion transistors had to be soldered to bring it up to 3.3V. The main problem is now fixing the Odroid to the vehicle.
Posted by Jack Crossfire | Oct 10, 2015 @ 12:22 AM | 4,556 Views
With all its dependencies resolved, the Odroid finally underwent its most challenging task ever documented. The vision program compiled & processed the 160x120 test footage at slightly over 30fps. It was way below the predicted 72fps from nbench results. Fortunately, at 160x120, it had no problem consuming the webcam's maximum framerate. The latency would be 4 frame periods at whatever the framerate was, since it used a 4 stage pipeline to parallelize an algorithm that was otherwise not parallelizable.

It sucked 2.2A at 7.4V with the fan whirring at full speed & webcam on. The Turnigy converter got scorching, even converting this meager voltage down to 5V.

At 320x240, it was hopelessly slow again at 4fps from the webcam. The CPUs maxed out & used 2.2A at 7.4V. Idle current was 0.65A at 7.4V or 1A at 5V. There was no sign of any frequency scaling. The A7's were always at 1.4Ghz & the A15's were always 2Ghz.

As predicted, it only used 4 CPUs at a time, no matter how many the vision program was configured to use. The kernel automatically allocated the 4 Cortex-A15's to the vision program, only occasionally using the Cortex-A7's. It was unknown whether 4 CPUs was the maximum of both A7's & A15's in use simultaneously, or whether it had to stop the A15's to use any A7.

CFLAGS were hard to come by, but the following seemed to work well:

-O3 -pipe -mcpu=cortex-a15 -mfloat-abi=hard -mfpu=neon-vfpv4

It was still an astounding step up, compared to what the PI or the Gumstixs brought. It was of course the power of billions of premium contracts subsidizing phones that led to such huge gains in the Odroid's Samsung processor. The set top box market that led to the PI's Broadcom processor wasn't nearly lucrative enough to generate such powerful chips. Set top boxes were a race to the bottom while phones were a race to the top. Because cell phones are a vital business expense, they benefited from tax writeoffs & corporate budgets while set top boxes were squarely funded by what meager budgets consumers had after taxes.
Posted by Jack Crossfire | Oct 08, 2015 @ 11:55 PM | 3,971 Views
After a quick goog search revealing nothing, decided to be the 1st to document the interior for you the viewers. 15mm on a full frame proved to be the right camera system, but didn't have as wide a depth of field as hoped.

Once inside, the futuristic hull fades away & it looks like any other ship from 30 years ago, if maybe a bit cleaner. There is no evidence inside the maze of corridors of any futuristic hull except for the strangeness of all aluminum walls.

The same exposed wiring, steep ladders, cramped spaces, & spartan rooms of the old days still abound. Can't imagine getting around there during a storm or a battle. On the bridge, it seemed incredibly expensive to have so much equipment specifically made for the military instead of using commercial boating gear.

Like all its other scandals, false advertising by the contractor about the required crew led to another 15 people being crammed into a space designed for 40. Fortunately, the 3 hulls, low draft, & aluminum frame can get them home faster than any other ship. Everything was labeled & organized for transitory crews that come & go with their scholarship obligations.

They didn't show the engine room with its twin LM2500's, any debriefing room, or any closeup of the main gun. Suspect all other areas besides what we saw were crammed with equipment. All the copter & UAV goodies were gone. There were just a couple dinghies.
Posted by Jack Crossfire | Oct 07, 2015 @ 12:16 AM | 4,340 Views
There it is. Much to the disappointment of unboxing fans, it'll never run Android or a desktop. It could play a mean game of Asphalt 8. Its only destiny is to ingest video from a webcam & output path geometry. Even the Raspberry Pi saw very little use besides a short stint as an access point.

Much has to be done before it starts driving a vehicle. An operating system has to be installed. A serial console needs to be exposed. A cross compiler needs to build the vision system. The vision system needs to be made parallel. SPI has to be ported from the PI. Wifi needs to be installed. The DC-DC converter needs to be connected. It needs to be configured to use the Cortex-A15.

It's built on a very fragile 1/32" board. Removing the headers takes a bit of care to avoid lifting pads or cracking the board. Probably cracked the board anyway. It has a 30 pin 2mm surface mount & a 4 pin 0.100" through hole header which need to be removed. It's not worth removing any other headers since the vertical profile is dictated by the fan, unlike the PI.

The bottom is very fragile compared to the PI. The power pins can be accessed from the bottom. It's disappointing that it's a very old cell phone processor repurposed for hobbyists at great cost compared to a phone containing the same processor. There's no way to have a standard single board computer take whatever the latest phone processor is or repurpose a much cheaper phone as a single board computer.
Posted by Jack Crossfire | Oct 05, 2015 @ 11:41 PM | 5,142 Views
The decision was made to drop $110 on an Odroid XU4. Reality was a bit higher than the $75 price quote because of $10 shipping, $7 tax, $7 for an SD card, & $11 to ship & tax the SD card. It was already known the droid could only use the 4 Cortex-A15 cores, with the 4 A7 cores inoperable.

There are no definitive comparisons between the PI 1, PI 2, & the Odroid XU4 or any clear descriptions of what benchmarks they used. You can only estimate a single core of the Odroid is 3x faster than a single core of the PI 2, which is 2x faster than the PI 1. With all 4 cores, the Odroid would probably give 24x the framerate.
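The arithmetic behind that estimate, spelled out (the per-core ratios are the rough guesses above, not benchmarks):

```python
# Rough speedup estimate, chaining the per-core comparisons.
pi2_core_vs_pi1 = 2      # one PI 2 core vs one PI 1 core
odroid_core_vs_pi2 = 3   # one Odroid core vs one PI 2 core
cores = 4                # usable cores on the Odroid

speedup_vs_pi1_core = pi2_core_vs_pi1 * odroid_core_vs_pi2 * cores
print(speedup_vs_pi1_core)  # 24x a single PI 1 core
```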

Based on the last optimizations, 160x120 would go at 72fps. 240x180 would go exactly at 30fps. 320x240 would go at 15fps. It brings back memories of spending a fortune on a 600Mhz single core Gumstix, just to use a neural network. It was so fast for its size, but less than a PI.

The chroma key & line detection were confirmed as the slowest parts. Since the algorithm can't be done in parallel, a frame pipeline with a 100ms latency would be best. The main benefit is from averaging multiple frames rather than lowest latency. Changes in lighting & camera angle may provide more benefit than higher resolution from a single position, so you want to maximize the framerate before maximizing resolution.

An attempt to use linear regression to get the chroma key pixel didn't work. The simplest average algorithm here continued to be the best.
Posted by Jack Crossfire | Oct 04, 2015 @ 05:42 PM | 4,037 Views
It's a good time to reveal to the world that the Accucell 8150 doesn't really balance anything. It appears to spend most of its time with the balancing resistor network on, then briefly float the balancing pins every few seconds, presumably to determine what cell is high. It can't determine the high cell when the resistor network is active, even though the LCD clearly shows it. The Astroblinky used the same cycling algorithm, for some reason. 1 cell usually ends up over 4.20V during the floating.

It might actually work if the difference is < 0.01V, but it lacks the effectiveness to correct a 0.05V difference. It would be worth trying the Astroblinky with a bunch of voltmeters to see if it was equally ineffective. The Accucell's cell readout revealed severe imbalance in a 2S & a 3S.

The main cause of unbalancing seems to be the temperature differential between the cells. In high current applications, the inner cells get hotter than the outer cells & discharge faster. This causes the outer cells to discharge less & overcharge.

In low current applications in hot weather, the outer cells heat up from ambient air before the inner cell. This causes the inner cell to discharge less & overcharge.

In the 2S hand controller, body heat heats up 1 cell faster than the other. Ideally the current usage & enclosure would heat all the cells evenly.

The only way to manage the unbalancing is independently charging the cells instead of relying on a resistor network as modern chargers do. This would be very expensive, requiring independent MOSFETs.
Posted by Jack Crossfire | Oct 03, 2015 @ 09:40 PM | 3,429 Views
It was the longest any vehicle ever went on a single battery. Set it to 6mph. Had the headlights off. Speed was somewhat variable because of the low accuracy of the inverter. Not sure driving the ESC using I2C would make the speed more accurate, since the bottom line is the number of bits in the 8khz waveform.

This achievement took 4Ah or 90% of the remaining capacity in the last full discharge. It would have a hard time going any farther or any faster. A fresh 5Ah battery would probably go 17 miles.
Posted by Jack Crossfire | Oct 03, 2015 @ 01:41 PM | 3,163 Views
In the hardest test video, 2.9825% of the frames failed. Shadows are the hardest. Cloudy days are the easiest, even working quite well at 160x120. Higher resolution improved the results. 4k proved too slow to scan.

Normalization slightly degraded the results. Suspect the best results would come from manually setting the exposure & manually tweaking constants based on the exposure & path. The saturation is still manually set. However, the goal is capturing the most information, & the auto exposure already chooses the exposure that captures the most information.

The key computational user is the chroma key step, with every other step mainly free. A different chroma key needs to be applied for each color of the path. Then, the threshold needs to be manually tweaked until the path has the right shape.

Tried dynamically detecting the optimum threshold by applying many thresholds to the same image & measuring the average distance of every mask pixel from the mean mask pixel. The optimum path shape should always have a similar average distance of mask pixels from the center mask pixel. This was slow indeed, but always produced very tight masks. The problem was the threshold converged on a constant value & didn't change in any frames. The return on the investment of clock cycles was minimal.
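The spread measurement can be sketched like this (a hypothetical Python illustration, not the actual code; the frame is modeled as a dict mapping pixel coordinates to chroma values, & the target spread is an assumption):

```python
def mask_spread(mask):
    """Average distance of mask pixels from their centroid."""
    if not mask:
        return 0.0
    cx = sum(x for x, y in mask) / len(mask)
    cy = sum(y for x, y in mask) / len(mask)
    return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
               for x, y in mask) / len(mask)

def pick_threshold(image, thresholds, target_spread):
    """Apply many thresholds to the same frame & keep the one
    whose mask spread is closest to the expected path shape."""
    best, best_err = None, float("inf")
    for t in thresholds:
        mask = [(x, y) for (x, y), v in image.items() if v >= t]
        err = abs(mask_spread(mask) - target_spread)
        if err < best_err:
            best, best_err = t, err
    return best
```

Rescanning the frame once per candidate threshold is where the clock cycles go.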

Also tried finding the thresholds where the total number of masked pixels jumped a certain amount due to a sudden bleeding. This didn't produce consistent...
Posted by Jack Crossfire | Oct 01, 2015 @ 11:08 PM | 3,376 Views
Testing continued to try to improve the algorithm & reduce the computation requirements. On some test videos, the algorithm successfully tracked over 99% of the frames. It was still a game of chance, with the threshold, search radius, maximum offset distance, & edge kernel size just happening to be the right values. More difficult lighting naturally degraded the results.

Camera noise contributed to variability even when the position didn't change. It would benefit from a high frame rate. Smaller frame size degraded results. The computational requirements are still way beyond what can be reasonably done on the vehicle.

Normalizing the YUV channels subjectively improved results. A false color image built from the normalized channels has a priori information about the dynamic range which wasn't available when the camera took the image, yet intuitively the algorithm should need the dynamic range of the color channels to match reality.

It didn't work at all when either converted to RGB or using luminance only. It did best with all YUV channels.
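The per-channel normalization amounts to stretching each YUV channel to the full range independently (a minimal sketch; representing channels as plain lists is an assumption for illustration):

```python
def normalize_channel(values, lo=0, hi=255):
    """Stretch one channel to the full dynamic range, so the
    chroma key sees the same range regardless of lighting."""
    vmin, vmax = min(values), max(values)
    if vmax == vmin:
        return [lo] * len(values)
    scale = (hi - lo) / (vmax - vmin)
    return [lo + round((v - vmin) * scale) for v in values]

def normalize_yuv(y, u, v):
    """Normalize each YUV channel independently."""
    return normalize_channel(y), normalize_channel(u), normalize_channel(v)
```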

It became possible to estimate the center of the path from the 2 sides. Tracking the center would be still more robust than tracking a side.
Posted by Jack Crossfire | Oct 01, 2015 @ 12:54 AM | 2,546 Views
Back at 200m intervals with the brushless motor, it took 2Ah on 3S to do this 6.2 mile drive, with 10 10mph sprints down & a steady 6.66 mph drive back up, with headlights on. The ESC has proven less accurate than directly driving the H bridge, as expected. GPS rated the sprints at 10.5mph & the fastest sprint at 11mph. At 10mph downhill, the ESC errored high. At 6.66mph uphill, it was right on. Still, it could go 12 miles on 5Ah even with these demanding speeds. The current increases as the voltage decreases, so current usage needs to be padded.
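The padding amounts to treating the motor as a roughly constant power load, so the current rises as the pack voltage sags (a sketch; the ~30W figure & the 10% pad are assumptions for illustration, not measurements from this drive):

```python
def padded_current(power_w, voltage_v, pad=1.1):
    """Constant power load: current scales inversely with pack
    voltage, so pad the capacity estimate by a safety factor."""
    return pad * power_w / voltage_v

# The same ~30W load at full & sagged 3S pack voltage, unpadded:
full = padded_current(30.0, 12.6, pad=1.0)    # lower current
sagged = padded_current(30.0, 11.1, pad=1.0)  # higher current
```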

Heart rate was 176, dropping to 173 as the intervals wore on.
Posted by Jack Crossfire | Sep 27, 2015 @ 09:19 PM | 2,486 Views
It was a long time coming, but after nearly a year & probably no more than 100 miles, the brushed motor was done. Finding the right motor was quite an ordeal. Apparently either all ground vehicles use the same size motor or the bolt holes are in standard positions, because there is no data for the bolt holes, only random posts. An obscure post recommended the Tacon 2400kV & it was the cheapest.

After converting the computer from direct H bridge driving to 50Hz PWM, the RPM constants all stayed the same. Drove it 6.8 miles at 5mph on 3S, consuming 1928mAh. The brushed motor used 3518 mAh to go 8.5 miles, so a 31% efficiency gain. Considering the brushless motor is 90% efficient & the brushed motor is 50% efficient, it was a very good translation of motor gains to mileage. The transmission didn't contribute a lot of power loss. It has a good chance of doing a full 13.1 miles on under 4Ah.
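The arithmetic behind the 31% figure, spelled out from the two measured drives above:

```python
# Per-mile consumption from the two drives.
brushless_mah_per_mile = 1928 / 6.8   # ~283 mAh/mile
brushed_mah_per_mile = 3518 / 8.5     # ~414 mAh/mile
gain = 1 - brushless_mah_per_mile / brushed_mah_per_mile

# Range check for a full 13.1 mile half marathon:
marathon_mah = 13.1 * brushless_mah_per_mile  # comfortably under 4Ah
```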

Starting was identical to the brushed motor. It didn't look anything like the banging of an AC electric train.
Posted by Jack Crossfire | Sep 26, 2015 @ 01:12 AM | 2,966 Views
Previously, the best result came from the RALPH algorithm. After nearly a year of intermittent ideas & dead end Goog searches, an algorithm emerged which made RALPH look utterly terrible. The idea came from chroma keying, but unlike normal chroma keying, this was a super chroma key.

The super chroma key worked like a magic wand, but instead of cutting off when pixels deviated away from the color key, it learned new color keys based on where the previous mask was. It differentiated between good & bad candidates for new color keys. Then it did an old fashioned greatest edge detection to find the edges of the path. The edge detection eliminated many false areas which didn't belong to the path.

While far from perfect, the super chroma key was extremely robust when tested against very shaded sections. Most sections which weren't heavily obscured by shadows were bulletproof. It could detect both sides of the path, between which a center line could be drawn & the rover programmed to drive just on the right of the center line.

It only needed 640x480 video. Processing is extremely slow. It could be some time before the super chroma key is fast enough for a test drive.
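The core of the super chroma key, as described, can be sketched like this (a hypothetical illustration, not the actual code: the frame is a dict mapping pixel coordinates to YUV tuples, the neighbor based learning rule & thresholds are assumptions, & the edge detection pass is omitted):

```python
def grow_color_keys(image, keys, prev_mask, threshold=20, max_keys=32):
    """One super chroma key iteration: learn new color keys from
    pixels adjacent to the previous mask, then re-key the frame."""
    def dist(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))

    def neighbors(p):
        x, y = p
        return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

    # Learn: colors just outside the previous mask become new keys
    # if they sit near an existing key (good candidates only).
    for p in list(prev_mask):
        for n in neighbors(p):
            c = image.get(n)
            if c is None or n in prev_mask:
                continue
            if len(keys) < max_keys and c not in keys and \
               any(dist(c, k) <= threshold for k in keys):
                keys.append(c)

    # Re-key the whole frame against the grown key set.
    return {p for p, c in image.items()
            if any(dist(c, k) <= threshold for k in keys)}
```

Unlike a plain magic wand, the grown key set lets the mask follow gradual color drift across the path instead of cutting off at a fixed distance from one key.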

Path detection with super chroma key (0 min 58 sec)

Posted by Jack Crossfire | Sep 25, 2015 @ 12:19 AM | 2,042 Views
Lions were never crazy about basic inorganic chemistry, but propellant densification is the latest thing. Like any improvement in rocketry, it's a major event only happening once in a lifetime. Like ion engines, it was something that spent most of our lives in science fiction, taken up & put away through the years. There's no record of it ever going beyond testing. SpaceX did yet another test, with plans to actually use it, someday.

At least the oxygen side will be refrigerated to 1 deg above the point at which it becomes a solid, providing a 7% increase in the amount of fluid mass in the same volume. Someday, maybe the kerosene side will be densified. It has the same effect as fitting a bigger rocket in a smaller space. It doesn't increase the efficiency or the thrust.
Posted by Jack Crossfire | Sep 20, 2015 @ 10:31 PM | 2,640 Views
The decision was made to convert the lunchbox to brushless. None of the flying ESC's in the apartment could support reverse, so it was time for a SimonK flashing of a Supersimple 30A. None of the blog posts about Arduino programming contained the pinout used to program the last ESC, so here it is:

black - ground
sck - pin 13
miso - pin 12
mosi - pin 11
SS - pin 10

The key parameter for reverse mode was RC_PULS_REVERSE in tgy.asm

Then came make tp_8khz.hex to generate a high efficiency 8khz target. Then came running arduino to install the Arduino ISP sketch. Then finally came the avrdude command:

/root/arduino-1.6.0/hardware/tools/avr/bin/avrdude -C/root/arduino-1.6.0/hardware/tools/avr/etc/avrdude.conf -v -e -patmega8 -cstk500v1 -P/dev/ttyACM0 -b19200 -Uflash:w:tp_8khz.hex:i -Ulock:w:0x0F:m

Then came the dreaded stk500_recv(): programmer is not responding

It erased successfully, read the device ID, but wouldn't take the program. There was no evidence of the chip being fried besides reset being 3.8V instead of 5V. A power test showed it was indeed erased. It was a brick. Fortunately, no-one buys just 1 Hobbyking ESC. It may have to be run without reverse.

According to the range test, it takes 2.5A at 12V to go 6mph. It was a major investment to make the TBLE-02s precise enough. Another stock ESC probably won't be precise enough.

A useful command to erase the mega8:

/root/arduino-1.6.0/hardware/tools/avr/bin/avrdude -C/root/arduino-1.6.0/hardware/tools/...
Posted by Jack Crossfire | Sep 19, 2015 @ 03:08 PM | 2,999 Views
Drove an uneventful 3.9 miles on slight uphill pavement. After .35 miles of ascending the dirt, the motor overheated & stalled erratically. Then after .35 miles of descending, the transmission fell apart. Had to walk 3.6 miles home with the truck. The smell of baking cow manure wafted. It took 3600mAh to go just 4.5 miles at 5mph, so the motor is probably near its end of life.

It's quite clear a brushless motor is required for any attempt at the ridge trail. It's going to take a new control system to generate the different PWM control. A method of threadlocking which doesn't crack plastic is also required.

Either RC cars always operate near the ragged edge of failure, or going continuously uphill for 1.5 miles is way beyond the normal regime.