Jack Crossfire's blog
Posted by Jack Crossfire | Aug 25, 2015 @ 01:37 AM | 942 Views
 Running at 10mph (11 min 49 sec)

6 * 200m downhill intervals at 10mph
2 * 400m uphill intervals at 8.5mph
1 * 400m uphill interval at 8mph
Had to slow down due to nausea.

Steering gets real hard at 10mph. That's almost twice the cruising speed. Lately gave more thought to horizon detection & using only that for optical flow.
Posted by Jack Crossfire | Aug 22, 2015 @ 06:51 PM | 1,194 Views
The problem with convnets is they only work as classifiers, not position detectors. There are examples of another algorithm being used to find feature positions, then the convnet being used to identify the features. Even then, there was an example of road sign detection which was a lot worse than least squared differences.

The driving algorithm does pretty well finding the car's position in time, but falls over when looking for the path offset in the frame. Even 2 drives at the same time of day can't be aligned because the shadows changed slightly.

There was an idea of using a convnet as an expensive replacement for the least squared difference calculator. Use it to classify an image offset at multiple points. Pick the position which gives the best match. Convnets are supposed to be position invariant, so they probably wouldn't work in this test.
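
For reference, the least squared difference calculator the convnet would replace is just an exhaustive offset search. A minimal sketch in numpy, assuming same-size greyscale frames & a purely horizontal offset:

```python
import numpy as np

def best_offset(ref, cur, max_shift=8):
    """Least squared difference search: slide `cur` against `ref`
    horizontally & return the offset with the smallest mean squared
    error over the overlapping columns."""
    best, best_err = 0, float("inf")
    for dx in range(-max_shift, max_shift + 1):
        if dx >= 0:
            a, b = ref[:, dx:], cur[:, :cur.shape[1] - dx]
        else:
            a, b = ref[:, :dx], cur[:, -dx:]
        err = np.mean((a.astype(float) - b.astype(float)) ** 2)
        if err < best_err:
            best, best_err = dx, err
    return best
```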

Another idea is using a convnet to classify a material as path or not path. This might be more feasible. For every frame on a drive, a new set of weights for the kernel is calculated. The kernel could be passed over an unknown image, creating an output mask based on the material. The resolution might be too poor. The camera resolution isn't high enough to give much difference between materials.

Materials classification leads to the idea of horizon detection. The horizon is more invariant than the nearby path, so using just that or masking everything below it might give better optical flow. A variance calculation of each line might yield the horizon, or it might be discernible to a convnet.
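
The per-line variance idea can be sketched in a few lines. This assumes the sky is a large low-variance region at the top of the frame, which is only sometimes true:

```python
import numpy as np

def find_horizon(grey):
    """Guess the horizon row of a greyscale image: treat the top quarter
    as sky, then return the 1st row whose variance jumps well above the
    sky's baseline."""
    row_var = grey.astype(float).var(axis=1)
    sky = row_var[: len(row_var) // 4]          # assumption: top quarter is sky
    threshold = sky.mean() + 6 * sky.std() + 1e-6
    for y, v in enumerate(row_var):
        if v > threshold:
            return y
    return len(row_var) - 1
```
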
Posted by Jack Crossfire | Aug 19, 2015 @ 12:51 AM | 1,636 Views
 Making Robots (1 hr 3 min 58 sec)

Good coverage of current topics in robotics. Most interesting was the lack of any work being done on flying machines. 5 years ago, Vicon guided quad copters were the only topic. Now, they're all gone.

The speed & strength of modern actuators have come a long way compared to years ago. The gas driven hydraulic actuators driving Boston Dynamics' creations are a lot faster than construction equipment. The electrically driven hydraulic actuators were amazingly strong for their speed. Much has been done to get powerful hydraulic pumps & valves from brushless motors. We're not far from lifelike human motion, but the problem of safety is going to keep any robot as strong & fast as a human from going near any human.

A way to make a robot aware of when it could break something must be found before it can interact with a human. There was some promise in robotic ability to sense resistance, but it has to know which way a joint can bend.

The mane find was a blurb about convolutional neural networks. They became prominent only after 2012. In 2008, I became disillusioned with neural networks because to truly become learning machines rather than expensive lookup tables, they needed to be recurrent. There was no easy way to train a recurrent network or model a pilot in a neural network. Lookup tables would do a better job.

Convolutional neural networks seem to be used purely for image recognition. The fundamental unit of the convnet is the convolutional kernel, a small matrix which is multiplied by every region of an image to produce effects like blurring, edge detection, & sharpening.

Now, they apply several kernels with different parameters on the entire area of the same image. The different parameters of each kernel are trained using backpropagation. The output of an image which has been processed with these kernels can be fed into another set of kernels to process higher level features. Fortunately, there are already libraries which implement all this.
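
A minimal sketch of the convolutional kernel, the fundamental unit described above. This is the unoptimized textbook version; convnet libraries do the same multiply-&-sum (without flipping the kernel) over many kernels at once:

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide a small kernel over every position of a 2D image ('valid'
    region only) & sum the elementwise products at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# a classic hand-made kernel: horizontal edge detector
edge = np.array([[-1, -1, -1],
                 [ 0,  0,  0],
                 [ 1,  1,  1]], dtype=float)
```

In a convnet, the kernel entries aren't hand-made like `edge`; they're the parameters trained by backpropagation.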
Posted by Jack Crossfire | Aug 13, 2015 @ 11:13 PM | 1,356 Views
The quest for machine vision moved to optimization, because it most likely will have to run on the 900MHz Raspberry Pi in a 1st test.

A quick test at 160x120 showed complete loss of synchronization at the lower resolution, with different lighting. There was definitely an advantage to 640x480 & above, with the bare minimum at 320x240. Using color for matching was hopeless, whether the color was 160x120 or 320x240. Color was required for decent motion searching. The optimum source image was 640x480, with downsampling to 320x240 for all processing, & further downsampling to 160x120 for the 1st pass of motion searching.
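
The downsampling chain can be sketched with simple 2x2 block averaging, a stand-in for whatever pyramid filter the real code uses:

```python
import numpy as np

def downsample2x(img):
    """Halve a 2D image by averaging each 2x2 block."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w].astype(float)
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

# 640x480 source -> 320x240 for all processing -> 160x120 for the
# 1st pass of motion searching
full = np.zeros((480, 640))
proc = downsample2x(full)
coarse = downsample2x(proc)
```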

Synchronization of any kind between 2 videos during the shady time of day was impossible. The shadows changed position too much. Synchronization between a shady video & a full daylight video was still quite good, though optical flow was hopeless. Since the full daylight video had some compatibility with all the other videos, all the reference videos should be in full daylight. FLANN pair matching was identical but much slower than brute force matching.

Another drive slightly after full daylight but not sunset was pretty awful at optical flow, though it nailed keypoint matching. Reduced the reference frame window to 10 & reduced the reference frame rate to 1 frame every second. This didn't affect the keypoint matching.

Made a logarithmic motion search use a 2x downsampled image, which gave better results than either...
Posted by Jack Crossfire | Aug 09, 2015 @ 12:59 PM | 1,714 Views
The rumors that computer vision takes enormous amounts of clock cycles have proven true. Last week's theory that matching the homography of keypoints would yield the most similar frames was absolutely wrong. The best match was revealed by the frames with simply the highest number of matched keypoints.
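
Counting matched keypoints as the similarity score can be sketched like this, assuming descriptors are already extracted (the ratio test threshold is a guess):

```python
import numpy as np

def count_matches(desc_a, desc_b, ratio=0.75):
    """Brute-force descriptor matching with a nearest/2nd-nearest ratio
    test; return the number of descriptors in A with a good match in B."""
    count = 0
    for d in desc_a:
        dist = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dist)
        if len(dist) >= 2 and dist[order[0]] < ratio * dist[order[1]]:
            count += 1
    return count

def best_frame(query_desc, reference_frames):
    """Pick the reference frame whose descriptors match the query best."""
    counts = [count_matches(query_desc, d) for d in reference_frames]
    return int(np.argmax(counts))
```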

Next came detecting the optical flow of the matched frames. The 1st attempt computed the average position change of the matched keypoints. This was terribly rough, because the keypoints were terribly mismatched.

Better results came from old-fashioned macroblock searches. It might be faster to apply the Lucas Kanade method to the keypoints, but the brute force method proved the feasibility.

Not sure Lucas Kanade would handle the large differences in position between the 2 frames. The objects don't move incrementally, but occupy totally different parts of the image & don't match well enough for feature matching to produce any points to track.

The optical flow got better when the search for best macroblock was exhaustive instead of logarithmic. Narrow objects like poles got missed in a logarithmic search. It got even better when the motion search used full color instead of greyscale. But using full color instead of greyscale for the keypoint matching actually made it worse.
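
A minimal sketch of the exhaustive macroblock search, scoring blocks by the sum of absolute differences (the real code may use squared differences). Passing 3D arrays instead of 2D sums the error over all color channels, which is what made full color help:

```python
import numpy as np

def motion_search(ref, cur, bx, by, size=16, radius=8):
    """Exhaustive search for the displacement of one macroblock of `cur`
    within `ref`.  Returns the (dx, dy) with the lowest SAD."""
    block = cur[by:by + size, bx:bx + size].astype(float)
    best, best_sad = (0, 0), float("inf")
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + size > ref.shape[0] or x + size > ref.shape[1]:
                continue
            sad = np.abs(ref[y:y + size, x:x + size].astype(float) - block).sum()
            if sad < best_sad:
                best, best_sad = (dx, dy), sad
    return best
```

A logarithmic search would only probe a shrinking pattern of candidate offsets, which is how it can step over a narrow pole entirely.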

 Synchronizing video using SURF (4 min 59 sec)

A nifty video of the current & reference videos being synchronized by feature point matching but not optical flow emerged.

Drove another 6 miles, recording several drives of the same section of test track in smoky conditions. The battery can't go any farther. An SD card glitch caused the recording to stop. The web server was still running, but showed a 404 error, a sign the filesystem was gone. The lighting change did make a few more frames glitch, but it was still pretty good for no color correction. Color correction will add still more clock cycles.

The video had to be downscaled to 320x240 to get any reasonable processing speed. This didn't seem to degrade the results.
Posted by Jack Crossfire | Aug 06, 2015 @ 11:10 PM | 1,544 Views
The day job had no useful resistors for measuring current, so tried to add a 10mV/mA range by doing this.

Trimmed it to nearly 0.1R, but the result was a very noisy waveform which spilled noise into the other ranges. The wire was an antenna. The day job had a 1R 5%, so whacked that in. It was still noisy. Left the 10k tacked on the end of it & it acted like another antenna. Removed the 10k completely, which gave a useful but still noisy signal. The 0805 10R wasn't a perfect fit for the 10k position. Fortunately, the day job has many 0402 10R's.

The result was 1mV/1uA, 90mV/1mA, or 0.99mV/1mA, very useful ranges for measuring the coveted 2mA-10mA range. The full range was 0mA-20mA. The oscilloscope seemed to do better with 90mV than 9mV, & it skewed the other ranges less. Could finally see how much an interrupt handler was using.

Also, the switches started interrupting the circuit with more usage, so had to go into SHORT mode to change ranges. There was now little point in dual contact switches with all the problems of requiring parallel resistors & not being able to have more ranges. It's something Dave probably couldn't foresee. Fortunately the OFF mode still passes current while not powering the LED, so you needn't drain the battery to keep the DUT going.

After much testing, the mane advantages are changing ranges without interrupting the circuit & plotting transients. The mane limitations are lack of ranges...

# Images

Posted by Jack Crossfire | Aug 06, 2015 @ 01:36 AM | 1,623 Views
Reviewing the schematic for the uCurrent, the key to its ability to change settings without interrupting the circuit is switches with 2 contact points. The resistor network is always in the circuit path. Only the SHORT option gives the current a shorter path, so avoiding accidentally switching to nA requires being in SHORT mode.

The sacrifice in using dual contact switches is 2 resistors always have to be in parallel. Working around that requires the ranges to be powers of 1000. The resistances have to be in ascending order & all 3 resistors have to be populated. The output of each resistor is always amplified by 100x using the same amplifier chain.

It really is a bit of sleight of hand that lets the uCurrent work the way it does. It was one of those unique inventions born from a once in a lifetime observation, developed over a lifetime of experience.

The easiest way to get the coveted 2mA-10mA range is replacing the 10R with 0.1R & replacing the 10k with the 10R. This would create a new 10mA range with 9.9mV per mA. The mA range would then give 0.9mV per mA & the uA range would still give 1mV per uA. It might be good enough, but hard to read in the 10mA range.
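
Checking the numbers, assuming (as described above) the dual contact switches always leave the selected shunt in parallel with the rest of the resistor chain, ahead of the fixed 100x amplifier:

```python
def parallel(a, b):
    """Two resistors in parallel."""
    return a * b / (a + b)

GAIN = 100  # the fixed 100x amplifier chain

# proposed mod: replace the 10R with 0.1R & the 10k with 10R
r_10ma = parallel(0.1, 10)           # new 10mA range shunt
r_ma = parallel(0.01, 0.1 + 10)      # mA shunt sees the rest in series

mv_per_ma_10ma = r_10ma * GAIN       # ~9.9 mV per mA
mv_per_ma_ma = r_ma * GAIN           # ~0.99 mV per mA
```

The parallel combination is why the ranges degrade slightly from round numbers: 0.1R in parallel with 10R is 0.099R, not 0.1R.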

Unfortunately, the 0.01R resistor can't be trimmed to get 1mV per mA again. It's a special resistor with sense terminals on the substrate.
Posted by Jack Crossfire | Aug 05, 2015 @ 01:02 AM | 1,701 Views

A nice bit of kit arrived at the day job. Modern gadgets live & die by their power management firmware. It's not like the old days when a power switch was the final word in whether the batteries drained or lived. Now, the typical gadget has a full power mode, low power mode, off mode, charging mode, all controlled by software with no direct user control. If the software doesn't work, the gadget bricks with no way out for the user.

Many a gadget has been instantly wiped off the face of the Earth because it didn't turn off properly, killing its batteries, or didn't turn on. History is littered with GoPros draining their batteries in standby mode, iPads going dark & never turning on again, LiPos drained past their self destruct point or overcharged.

Keeping the day job gadget from becoming a statistic consumed a lot of testing. The mane problem was measuring current through the gadget's full range of hundreds of mA to tens of uA. The traditional "mowti"meter couldn't do it without shutting down to change ranges & faking the battery voltage with a bench supply set 2V higher than the battery voltage. Burden voltage was a big issue. There was no way to test transitions into charging mode with the bench supply.

As simple as it is, the uCurrent was the easiest way to do the job. It was a lot more reasonable than picking out the right resistors, op-amps, & switches, building up a circuit to test current ourselves. The...
Posted by Jack Crossfire | Aug 02, 2015 @ 11:13 PM | 2,089 Views

Swapped all the steering buttons after 1 had been seriously degraded for a while & they were all quite corroded. Looks like swapping buttons is going to be a standard routine in a sweaty environment. Moved the low speed steering buttons closer together to aid in tactile movement. The corroded ones were 25 years old. Surprisingly little progress in tact buttons was made in the last 25 years, compared to the progress made after 1960.

Posted by Jack Crossfire | Aug 02, 2015 @ 06:06 PM | 1,927 Views
A test of SURF image matching with different focal lengths & lighting was a complete failure.

Then did 4 x 8 minute miles on the same section of track, gathering 2 runs in both directions, in equivalent lighting & focal length, high quality JPEG. Wanted to do more repeats, but sunburn became an issue for the human.

The old raspberry pi was still putting out 10fps. Gave up trying to configure it over wifi. The USB port has enough power for a webcam & wifi.

The picture quality was still awful. Stepping up the saturation revealed it was probably using an analog NTSC signal in 1 stage. The proprietary Logitech chip still takes in a digital signal.

Posted by Jack Crossfire | Jul 30, 2015 @ 11:00 PM | 1,758 Views
Not to be converted into a project.

# Images

Posted by Jack Crossfire | Jul 25, 2015 @ 06:30 PM | 2,005 Views
The Feiyu FY-G3 is what everyone is using to make running videos, nowadays. It's about as cheap as it can be made, when the cost of the motors & the custom compact frame are factored in. If any money was left over after current rent prices, it would be time to make an order, but there is no money & no plan for next year's rent increase.

It's 1 of the new crop of gadgets whose gyros calibrate while it's in motion. Can only imagine it correlates the motion of multiple accelerometers with the gyro motion.

The mane limitation is the god damn Chinese didn't think of putting in a joystick to control it, so there's no way to lock position while retaining the ability to override it. There is a single button which switches between position lock & hand tracking. The button must be held down for 1 second to lock position & tapped to unlock position, not practical for constant trimming over 20 miles.

The biggest limitation is it's not as stable as the home made gimbal in 2013. Ginger Runner confirmed the silky smooth Gootube videos are all software stabilized. Without stabilization, it's as rocky as the home made gimbal after reworking to use UART converters. Still no clues why the rework made it unstable, but might go back to I2C.

It might be worth getting a Feiyu just for its parts, while adding a joystick & using custom firmware. The same parts from rctimer.com are insanely expensive.

It was theorized that a gimbal should stay very close to its starting position using only rate feedback. Going over the source code, rate-only feedback seemed to have never actually been tested. It always used total angle feedback at a very high gain, but this stopped working after the UART conversion.

The leading theory is no matter how hard you try, there's going to be more latency in converting I2C to UART than reading I2C directly. The UART has buffering delays on both ends, which could be important. It might be worth testing just rate feedback on 1 gyro, with & without UART mode.
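
A toy simulation shows what's at stake in the rate-only test: with no angle term, nothing ever pulls the gimbal back toward its starting position. This is a sketch with made-up gains & unit inertia, not the actual firmware:

```python
def simulate(rate_only=True, kp=0.0, kd=5.0, steps=2000, dt=0.001):
    """Toy 1-axis gimbal, starting 0.3 radians off target.  The rate
    term only damps motion; the angle term is what pulls it back."""
    angle, rate = 0.3, 0.0
    for _ in range(steps):
        torque = -kd * rate - (0.0 if rate_only else kp * angle)
        rate += torque * dt      # unit moment of inertia
        angle += rate * dt
    return angle

drift = simulate(rate_only=True)            # angle never comes back
held = simulate(rate_only=False, kp=50.0)   # angle pulled back near 0
```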
Posted by Jack Crossfire | Jul 24, 2015 @ 11:18 PM | 2,679 Views
Amazon.com is transitioning from being known as a store to being known as a web services platform. For the 1st time, there was no new phone, no new kindle, no Amazon Watch, not a single new consumer product to get anyone excited, yet the stock exploded. AWS had always been a growing monster that no-one talked about, but it’s come out of nowhere in the last 6 months to account for all their net revenue.

Of course, stock gains nowadays are mainly driven by share buybacks, algorithms, & the government. Too bad their employees will never have even 1/1000th of Jeff’s 83,921,121 shares, but that’s the new economy. A typical CEO owns 99.9% of the company & the employees own nothing.

When they're bought out by Google for \$500 billion, Jeff will invest it in real estate. The employees will end up homeless because they can no longer afford the rent increases caused by the buyout package they earned their CEO.

Every employee must keep their fate in mind, as their CEO frantically tries to sell the company to their nearest competitor. At the same time, employees can't jump ship to join their nearest competitor because that would violate their non compete agreement. Only the CEO can sell the company to their nearest competitor.
Posted by Jack Crossfire | Jul 23, 2015 @ 11:16 PM | 1,848 Views
It requires a Macbook with MacOS already on it. There have been tall tales of pirated MacOS working on a Windows box, but don't believe everything you read on the internet. In this installation, MacOS & Linux have their own bootable partitions, so they don't need a virtual machine to run separately & can get the full machine resources when needed. The trick is getting them running simultaneously, so the development environments of both sides are going at the same time.

Step 1, get Virtualbox to boot the MacOS installation on its existing partition. The magic commands were buried in http://www.virtualbox.org/manual/ch09.html#rawdisk

VBoxManage internalcommands createrawvmdk -filename /path/to/file.vmdk -rawdisk /dev/sda

This creates a virtual disk which is really a symbolic link to the entire real disk.

Attach the virtual disk to the virtual machine in the Virtualbox settings. It worked on the 1st try.

Step 2, fix the screen resolution.

VBoxManage setextradata "VM name" VBoxInternal2/EfiGopMode N

Where N can be one of 0,1,2,3,4,5, referring to the 640x480, 800x600, 1024x768, 1280x1024, 1440x900, & 1920x1200 screen resolutions respectively. This only worked with MacOS.

The MacOS booted easily, most of the time. The issues with hidden serial numbers & esoteric bootloaders on the internet were resolved years ago.

Step 3, make the virtual MacOS share the laptop's ethernet so you can access the virtual Mac desktop over VNC & share files over SMB...
Posted by Jack Crossfire | Jul 23, 2015 @ 01:06 AM | 2,009 Views

Comparing edge detected frames from 2 drives with 2 different lighting conditions revealed a slight chance of matching the frames from 2 different drives as a means of lane keeping. With frames matched, it might detect horizontal offset.

An initial test would try synchronizing 2 identical videos of the same drive, purely based on feature matching the exact same frames. Then it would move to 2 different videos. The initial test would match the starting frame with a window of starting frames in another video. It would advance the window of possible matches based on the previous match. It would later use GPS position to aid the frame matching.
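
The windowed matching described above can be sketched independent of the actual feature matcher. Here `similarity` stands in for whatever scores a frame pair (e.g. matched keypoint count), & the window sizes are made up:

```python
def synchronize(similarity, n_cur, n_ref, window=10):
    """Align every frame of the current video against a reference video.
    Only a window of reference frames starting near the previous match
    is searched, so the alignment can only advance, never jump."""
    matches = []
    last = 0
    for i in range(n_cur):
        lo = max(0, last - 1)
        hi = min(n_ref, last + window)
        best = max(range(lo, hi), key=lambda j: similarity(i, j))
        matches.append(best)
        last = best
    return matches
```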

Then would come tagging the part of each frame which contains the desired heading, probably manually. Maybe they could be tagged by playing backwards, matching a later frame to a point in an earlier frame to deduce the desired path. Finally would come adding the newest drive to a database of known drives to keep the data recent & to refine the localization.

It's a small step towards everyone's ultimate goal of being able to precisely locate position by searching an exhaustive set of images from every possible location, then refining the database of images with every new localization. It's always been a dream for quad copters, but quite difficult because of the 3D space & the precision required. A path following rover might be a feasible application. The Goog will be the 1st to acquire a startup that can do it.

It's a much more complicated algorithm than ideal & nothing so far has worked, but the initial proof of concept would be bearable. Calif* is predicted to have the wettest rainy season on record. July was already quite wet, by normal standards. There won't be much driving, between commutes & rain.
Posted by Jack Crossfire | Jul 19, 2015 @ 08:14 PM | 1,636 Views
Continuing the story,

Any robot is an endless series of glitches that must be lived with to make any progress. In the intervening years, there were some attempts to make the 8192CU work more reliably as an access point.

To compile a scratch built kernel for the PI, there was a good document on http://elinux.org/Raspberry_Pi_Kerne..._kernel_source

Then, there was a good rt8192 module compiling document on http://bogeskov.dk/UsbAccessPoint.html

Merely change the Makefile to generate ARM objects using the path to the ARM kernel. Helas, this module was no better than the stock one in 4.0.8+.

There was a nugget about compiling a new hostapd, but the cross compiler didn't have a lot of required header files.

The RTL8192 worked well with the LG Optimus F3 in root mode, but was intermittent at best with the LG Tribute. The only way to start the vision program was using ethernet to ssh in & run

nohup ./vision &

That ran it detached from the console.

Wifi didn't stay connected long enough to do that. The phone was definitely part of the problem. Nothing has ever worked as well as manually running iwconfig every second to reconnect in the clear. If only wpa_supplicant could poll like that.

There are signs of a way Android apps can reconnect the wifi connection when it dies.
Posted by Jack Crossfire | Jul 19, 2015 @ 12:22 AM | 2,436 Views

Modified the RC truck cam to make it slightly higher & record uncompressed frames without the wide angle attachment.

Truck cam from February, with wide angle lens & JPEG compression.

Posted by Jack Crossfire | Jul 18, 2015 @ 01:35 AM | 2,050 Views

The answer is yes. Developing for iOS is expensive. A pile of I gadgets came from the day job, with plans to support the iWatch. It gets more expensive as the gadget gets smaller, with the iWatch the most expensive, since it requires the phone, which requires the mac.

The 13" macbook from 2012 lasted longer than any other laptop, with no issues in 12 months of daily usage. To be sure, duct taping the keyboard to stop it glaring in sunlight gummed up the touch pad & the screen. Once the gum wore off, it was useable again.

The typical product nowadays is a gadget that communicates with a phone. The ideal situation is MacOS for phone development & Linux for embedded development running simultaneously on a single computer. This was very slow on the 13" mac with 4GB RAM & 500Gig of platters. It couldn't run Android developer studio when running XCode.

The day job provided a 15" mac. It was a modern marvel, 2880x1800 display, 16GB of RAM, 1TB of flash, quad core 2.5GHz, thin as a razor, costing almost as much as a month of rent. To be sure, more powerful laptops for a lot less money exist, but they never lasted more than 6 months & weighed a ton.

Unfortunately, Linux is a long way from supporting the Macbook Pro 11,5. The 13" continues to run the development environment. The main issues were no wifi & no suspend. It will be obsolete before it's ever supported. Attempts to run Linux on a raw partition in a virtual machine have failed. It might work with Linux on a virtual disk, but this would only be temporary.
Posted by Jack Crossfire | Jul 12, 2015 @ 06:03 PM | 2,467 Views

What the day job lacked in money, it had in bluetooth modules, so it was time to whack together a sliding door lock. Of course, the actual bluetooth board was a trade secret. The servo was a 10 year old piece from the DVD robot. The H bridge was the remanes of the laser projector from 2011, which itself was made from the 72MHz remote control from 2007.