With all its dependencies resolved, the Odroid finally underwent its most challenging task ever documented. The vision program compiled & processed the 160x120 test footage at slightly over 30fps, well below the 72fps predicted from the nbench results. Fortunately, at 160x120, it had no problem consuming the webcam's maximum framerate. The latency would be 4 frame periods at whatever the framerate was, since it used a pipeline to parallelize an algorithm that otherwise couldn't be parallelized.
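The pipeline idea sketches out like this, with one thread & one queue per stage. The stage functions & depth here are placeholders, not the actual vision code:

```python
import queue, threading

# Minimal frame pipeline sketch: each stage runs in its own thread, so
# throughput stays near 1 frame per slowest-stage time while latency grows
# with pipeline depth. Stages are placeholder functions, not the vision code.
def run_pipeline(frames, stages):
    qs = [queue.Queue() for _ in range(len(stages) + 1)]

    def worker(stage, q_in, q_out):
        while True:
            frame = q_in.get()
            if frame is None:        # poison pill: pass it on & exit
                q_out.put(None)
                return
            q_out.put(stage(frame))

    threads = [threading.Thread(target=worker, args=(s, qs[i], qs[i + 1]))
               for i, s in enumerate(stages)]
    for t in threads:
        t.start()
    for f in frames:                 # feed every frame, then the pill
        qs[0].put(f)
    qs[0].put(None)
    out = []
    while (r := qs[-1].get()) is not None:
        out.append(r)
    for t in threads:
        t.join()
    return out
```

Latency in frames equals the number of stages, while throughput is set only by the slowest stage, which is the whole point of pipelining a serial algorithm.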
It sucked 2.2A at 7.4V with the fan whirring at full speed & the webcam on. The Turnigy converter got scorching hot, even converting this meager voltage down to 5V.
At 320x240, it was hopelessly slow again at 4fps from the webcam. The CPUs maxed out & used 2.2A at 7.4V. Idle current was 0.65A at 7.4V or 1A at 5V. There was no sign of any frequency scaling. The A7's were always at 1.4GHz & the A15's were always at 2GHz.
As predicted, it only used 4 CPUs at a time, no matter how many the vision program was configured to use. The kernel automatically allocated the 4 Cortex-A15's to the vision program, only occasionally using the Cortex-A7's. It was unknown whether 4 CPUs was the maximum of both A7's & A15's in use simultaneously, or whether it had to stop the A15's to use any A7.
CFLAGS were hard to come by, but the following seemed to work well:
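A plausible set for GCC targeting the XU4's Cortex-A15 would look like this (an assumption, not the verified flags):

```shell
# Assumed GCC flags for the Exynos 5422's Cortex-A15, not the exact set used
CFLAGS="-O3 -mcpu=cortex-a15 -mfpu=neon-vfpv4 -mfloat-abi=hard -ftree-vectorize"
```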
It was still an astounding step up compared to what the PI or the Gumstix boards brought. It was of course the power of billions of premium contracts subsidizing phones that led to such huge gains in the Odroid's Samsung processor. The set top box market that led to the PI's Broadcom processor wasn't nearly lucrative enough to generate such powerful chips. Set top boxes were a race to the bottom while phones were a race to the top. Because cell phones are a vital business expense, they benefited from tax write-offs & corporate budgets while set top boxes were squarely funded by what meager budgets consumers had after taxes.
After a quick goog search revealed nothing, decided to be the 1st to document the interior for you the viewers. 15mm on a full frame proved to be the right camera system, but didn't have as wide a depth of field as hoped.
Once inside, the futuristic hull fades away & it looks like any other ship from 30 years ago, if maybe a bit cleaner. There is no evidence inside the maze of corridors of any futuristic hull except for the strangeness of all aluminum walls.
The same exposed wiring, steep ladders, cramped spaces, & spartan rooms of the old days still abound. Can't imagine getting around there during a storm or a battle. On the bridge, it seemed incredibly expensive to have so much equipment specifically made for the military instead of using commercial boating gear.
Like all its other scandals, false advertising by the contractor about the required crew led to another 15 people being crammed into a space designed for 40. Fortunately, the 3 hulls, low draft, & aluminum frame can get them home faster than any other ship. Everything was labeled & organized for transitory crews that come & go with their scholarship obligations.
They didn't show the engine room with its twin LM2500's, any debriefing room, or any closeup of the mane gun. Suspect all other areas besides what we saw were crammed with equipment. All the copters & UAV goodies were gone. There were just a couple dinghies.
There it is. Much to the disappointment of unboxing fans, it'll never run Android or a desktop. It could play a mean game of Asphalt 8. Its only destiny is to ingest video from a webcam & output path geometry. Even the raspberry pi saw very little use besides a short stint as an access point.
Much has to be done before it starts driving a vehicle. An operating system has to be installed. A serial console needs to be exposed. A cross compiler needs to build the vision system. The vision system needs to be made parallel. SPI has to be ported from the PI. Wifi needs to be installed. The DC-DC converter needs to be connected. It needs to be configured to use the Cortex-A15.
It consists of a very fragile 1/32" board. Removing the headers takes a bit of care to avoid lifting pads or cracking the board. Probably cracked the board anyway. It has a 30 pin 2mm surface mount & a 4 pin 0.100" through hole header which need to be removed. It's not worth removing any other headers since the vertical profile is dictated by the fan, unlike the PI.
The bottom is very fragile compared to the PI. The power pins can be accessed from the bottom. It's disappointing that it's a very old cell phone processor repurposed for hobbyists at great cost compared to a phone containing the same processor. There's no way to have a standard single board computer take whatever the latest phone processor is or repurpose a much cheaper phone as a single board computer.
The decision was made to pump $110 into an Odroid XU4. Reality was a bit higher than the $75 price quote because of $10 shipping, $7 tax, $7 for an SD card, & $11 to ship & tax the SD card. It was already known the droid could only use the 4 Cortex A15 cores with the 4 A7 cores inoperable.
There are no definitive comparisons between the PI 1, PI 2, & the Odroid XU4 or any clear descriptions of what benchmarks they used. You can only estimate a single core of the Odroid is 3x faster than a single core of the PI 2, which is 2x faster than the PI 1. With all 4 cores, the Odroid would probably give 24x the framerate.
Based on the last optimizations, 160x120 would go at 72fps. 240x180 would go exactly at 30fps. 320x240 would go at 15fps. It brings back memories of spending a fortune on a 600MHz single core Gumstix, just to use a neural network. It was so fast for its size, but still slower than a PI.
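The estimates above chain together, using the ratios from the comparison as given:

```python
# Chain the estimated speedups: an Odroid core is 3x a PI 2 core, a PI 2
# core is 2x a PI 1 core, & 4 cores run the pipeline simultaneously.
odroid_vs_pi2_core = 3
pi2_vs_pi1_core = 2
cores = 4
odroid_vs_pi1 = odroid_vs_pi2_core * pi2_vs_pi1_core * cores
print(odroid_vs_pi1)  # 24
```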
The chroma key & line detection were confirmed as the slowest parts. Since the algorithm can't be done in parallel, a frame pipeline with a 100ms latency would be best. The mane benefit is from averaging multiple frames rather than lowest latency. Changes in lighting & camera angle may provide more benefit than higher resolution from a single position, so you want to maximize the framerate before maximizing resolution.
An attempt to use linear regression to get the chroma key pixel didn't work. The simplest average algorithm here continued to be the best.
It's a good time to reveal to the world that the Accucell 8150 doesn't really balance anything. It appears to spend most of its time with the balancing resistor network on, then briefly float the balancing pins every few seconds, presumably to determine which cell is high. It can't determine the high cell when the resistor network is active, even though the LCD clearly shows it. The Astroblinky used the same cycling algorithm, for some reason. 1 cell usually ends up over 4.20V during the floating.
It might actually work if the difference is < 0.01V, but lacks the effectiveness for a 0.05V difference. It would be worth trying the Astroblinky with a bunch of voltmeters to see if it was equally ineffective. The Accucell's cell readout revealed severe imbalance in a 2S & a 3S.
The mane cause of unbalancing seems to be the temperature differential between the cells. In high current applications, the inner cells get hotter than the outer cells & discharge faster. This causes the outer cells to discharge less & overcharge.
In low current applications in hot weather, the outer cells heat up from ambient air before the inner cell. This causes the inner cell to discharge less & overcharge.
In the 2S hand controller, body heat heats up 1 cell faster than the other. Ideally the current usage & enclosure would heat all the cells evenly.
The only way to manage the unbalancing is independently charging the cells instead of relying on a resistor network like modern chargers do. This would be very expensive, requiring independent MOSFETs.
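The decision step of a bleed balancer can be sketched like this. The function name & threshold are assumptions, & the key detail is that the voltages have to be sampled with the bleed resistors off, since the resistor drop masks which cell is actually high, which is the failure described above:

```python
# Sketch of a bleed-balancer decision step (hypothetical name & threshold).
# Voltages must be sampled with the bleed resistors OFF; otherwise the
# resistor network's drop hides which cell is high.
def cells_to_bleed(voltages, threshold=0.01):
    """Return indices of cells more than `threshold` volts above the lowest."""
    low = min(voltages)
    return [i for i, v in enumerate(voltages) if v - low > threshold]

print(cells_to_bleed([4.21, 4.15, 4.16]))  # [0]
```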
It was the longest any vehicle ever went on a single battery. Set it to 6mph. Had the headlights off. Speed was somewhat variable because of the low accuracy of the inverter. Not sure driving the ESC using I2C would make the speed more accurate, since the bottom line is the number of bits in the 8kHz waveform.
This achievement took 4Ah or 90% of the remaining capacity in the last full discharge. It would have a hard time going any farther or any faster. A fresh 5Ah battery would probably go 17 miles.
In the hardest test video, 2.9825% of the frames failed. Shadows are the hardest. Cloudy days are the easiest, even working quite well at 160x120. Higher resolution improved the results. 4k proved too slow to scan.
Normalization slightly degraded the results. Suspect the best results would come from manually setting the exposure, then setting manually tweaked constants based on the exposure & path. The saturation is still manually set. However, the goal is capturing the most information, & the auto exposure already chooses the exposure that captures the most information.
The key computational user is the chroma key step, with every other step manely free. A different chroma key needs to be applied for each color of the path. Then, the threshold needs to be manually tweaked until the path has the right shape.
Tried dynamically detecting the optimum threshold by applying many thresholds to the same image & measuring the average distance of every mask pixel from the mean mask pixel. The optimum path shape should always have a similar average distance of mask pixels from the center mask pixel. This was slow indeed, but always produced very tight masks. The problem was the threshold converged on a constant value & didn't change in any frames. The return on the investment of clockcycles was minimal.
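The threshold search sketches out like this. `key_distance` (per-pixel distance from the color key) & `expected_spread` are assumptions standing in for the real intermediates:

```python
import numpy as np

# Sketch of the dynamic threshold search: for each candidate threshold, mask
# the keyed image & measure the mean distance of mask pixels from the mask
# centroid, then keep the threshold whose spread best matches the expected
# path shape. `key_distance` & `expected_spread` are assumed inputs.
def best_threshold(key_distance, candidates, expected_spread):
    best, best_err = None, float("inf")
    for t in candidates:
        ys, xs = np.nonzero(key_distance < t)   # pixels close to the color key
        if len(xs) == 0:
            continue
        cy, cx = ys.mean(), xs.mean()
        spread = np.hypot(ys - cy, xs - cx).mean()
        err = abs(spread - expected_spread)
        if err < best_err:
            best, best_err = t, err
    return best
```

A tight threshold gives a compact mask with a small spread; bleeding gives a large one, so matching the spread against the expected path shape rejects the bled thresholds.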
Also tried finding the thresholds where the total number of masked pixels jumped a certain amount due to a sudden bleeding. This didn't produce consistent results.
Testing continued to try to improve the algorithm & reduce the computation requirements. On some test videos, the algorithm successfully tracked over 99% of the frames. It was still a game of chance, with the threshold, search radius, maximum offset distance, & edge kernel size just happening to be the right values. More difficult lighting naturally degraded the results.
Camera noise contributed to variability even when the position didn't change. It would benefit from a high frame rate. Smaller frame size degraded results. The computational requirements are still way beyond what can be reasonably done on the vehicle.
Normalizing the YUV channels subjectively improved results. A false color image with the normalized channels has a priori information about the dynamic range which wasn't available when the camera took the image, but intuitively it should need the dynamic range of the color channels to match reality.
It didn't work at all when either converted to RGB or using luminance only. It did best with all YUV channels.
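The normalization amounts to stretching each plane to its full range before keying. A minimal sketch, assuming a planar HxWx3 YUV layout:

```python
import numpy as np

# Stretch each YUV plane to [0, 1] independently before keying.
# The HxWx3 channel layout is an assumption.
def normalize_channels(yuv):
    out = np.empty_like(yuv, dtype=np.float32)
    for c in range(yuv.shape[2]):
        plane = yuv[:, :, c].astype(np.float32)
        lo, hi = plane.min(), plane.max()
        out[:, :, c] = (plane - lo) / (hi - lo) if hi > lo else 0.0
    return out
```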
It became possible to estimate the center of the path from the 2 sides. Tracking the center would still be more robust than tracking a side.
Back at 200m intervals with the brushless motor, it took 2Ah on 3S to do this 6.2 mile drive, with 10 10mph sprints down & a steady 6.66 mph drive back up, with headlights on. The ESC has proven less accurate than directly driving the H bridge, as expected. GPS rated the sprints at 10.5mph & the fastest sprint at 11mph. At 10mph downhill, the ESC errored high. At 6.66mph uphill, it was right on. Still, it could go 12 miles on 5Ah even with these demanding speeds. The current increases as the voltage decreases, so current usage needs to be padded.
Heartrate was 176, dropping to 173 as the intervals wore on.
It was a long time coming, but after nearly a year & probably no more than 100 miles, the brushed motor was done. Finding the right motor was quite an ordeal. Either all ground vehicles use the same size motor or the bolt holes are in standard positions. There is no data for the bolt holes, only random posts. An obscure post recommended the Tacon 2400kV & it was the cheapest.
After converting the computer from direct H bridge driving to 50Hz PWM, the RPM constants all stayed the same. Drove it 6.8 miles at 5mph on 3S, consuming 1928mAh. The brushed motor used 3518 mAh to go 8.5 miles, so a 31% efficiency gain. Considering the brushless motor is 90% efficient & the brushed motor is 50% efficient, it was a very good translation of motor gains to mileage. The transmission didn't contribute a lot of power loss. It has a good chance of doing a full 13.1 miles on under 4Ah.
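The 31% figure falls out of the mAh per mile numbers given above:

```python
# mAh per mile, brushed vs brushless, from the drives described above.
brushed = 3518 / 8.5     # brushed motor: 3518mAh over 8.5 miles
brushless = 1928 / 6.8   # brushless motor: 1928mAh over 6.8 miles
gain = 1 - brushless / brushed
print(round(gain * 100))  # 31
```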
Starting was identical to the brushed motor. It didn't look anything like the banging of an AC electric train.
Previously, the best result came from the RALPH algorithm. After nearly a year of intermittent ideas & dead end Goog searches, an algorithm emerged which made RALPH look utterly terrible. The idea came from chroma keying, but unlike normal chroma keying, this was a super chroma key.
The super chroma key worked like a magic wand, but instead of cutting off when pixels deviated away from the color key, it learned new color keys based on where the previous mask was. It differentiated between good & bad candidates for new color keys. Then it did an old fashioned greatest edge detection to find the edges of the path. The edge detection eliminated many false areas which didn't belong to the path.
While far from perfect, the super chroma key was extremely robust when tested against very shaded sections. Most sections which weren't heavily obscured by shadows were bulletproof. It could detect both sides of the path, between which a center line could be drawn & the rover programmed to drive just on the right of the center line.
It only needed 640x480 video. Processing is extremely slow. It could be some time before the super chroma key is fast enough for a test drive.
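The adaptive keying loop sketches out like this. The thresholds & the reduction of the good/bad candidate filter to a single average are assumptions, & the edge detection step is omitted:

```python
import numpy as np

# Minimal sketch of the adaptive "super chroma key": mask pixels near any
# known key color, then learn a new key from the masked region for the next
# frame. Thresholds & the key-candidate filter are assumptions; the greatest
# edge detection step is omitted.
def super_chroma_key(frame, keys, key_thresh=30.0, max_keys=16):
    pix = frame.reshape(-1, 3).astype(np.float32)
    karr = np.array(keys, dtype=np.float32)
    # distance of every pixel to its nearest key color
    d = np.sqrt(((pix[:, None, :] - karr[None, :, :]) ** 2).sum(axis=2)).min(axis=1)
    mask = d < key_thresh
    # learn a new key: the mean color of the masked pixels (the real
    # candidate filtering is reduced to this single average)
    if mask.any() and len(keys) < max_keys:
        keys = keys + [tuple(pix[mask].mean(axis=0))]
    return mask.reshape(frame.shape[:2]), keys
```

Calling it frame after frame lets the key set follow the path into differently lit sections, which is what makes it robust against shading where a fixed key cuts off.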
Lions were never crazy about basic inorganic chemistry, but propellant densification is the latest thing, like any improvement in rocketry, a major event only happening once in a lifetime. Like ion engines, it was something that spent most of our lives in science fiction, taken up & put away through the years. There's no record of it ever going beyond testing. SpaceX did yet another test, with plans to actually use it, someday.
At least the oxygen side will be refrigerated to 1 deg above the point at which it becomes a solid, providing a 7% increase in the amount of fluid mass in the same volume. Someday, maybe the kerosene side will be densified. It has the same effect as fitting a bigger rocket in a smaller space. It doesn't increase the efficiency or the thrust.
The decision was made to convert the lunchbox to brushless. None of the flying ESC's in the apartment could support reverse, so it was time for a SimonK flashing of a Supersimple 30A. None of the blog posts about Arduino programming contained the pinout used to program the last ESC, so here it is:
black - ground
- SCK pin 13
- MISO pin 12
- MOSI pin 11
- SS pin 10
The key parameter for reverse mode was RC_PULS_REVERSE in tgy.asm
Then came make tp_8khz.hex to generate a high efficiency 8khz target. Then came running arduino to install the Arduino ISP sketch. Then finally came the avrdude command:
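A typical invocation for an ArduinoISP programmer & the ESC's ATmega8 looks like this (the port, baud rate, & hex name are assumptions, not the exact command used):

```shell
# Assumed avrdude invocation for ArduinoISP + the ATmega8 on the ESC;
# the serial port & baud rate depend on the Arduino board.
avrdude -c avrisp -P /dev/ttyUSB0 -b 19200 -p m8 -U flash:w:tp_8khz.hex:i
```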
Then came the dreaded stk500_recv(): programmer is not responding
It erased successfully, read the device ID, but wouldn't take the program. There was no evidence of the chip being fried besides reset being 3.8V instead of 5V. A power test showed it was indeed erased. It was a brick. Fortunately, no-one buys just 1 Hobbyking ESC. It may have to be run without reverse.
According to the range test, it takes 2.5A at 12V to go 6mph. It was a major investment to make the TBLE-02s precise enough. Another stock ESC probably won't be precise enough.
Drove an uneventful 3.9 miles on slight uphill pavement. After .35 miles of ascending the dirt, the motor overheated & stalled erratically. Then after .35 miles of descending, the transmission fell apart. Had to walk 3.6 miles home with the truck. The smell of baking cow maneur wafted. It took 3600mAh to go just 4.5 miles at 5mph, so the motor is probably near its end of life.
It's quite clear a brushless motor is required for any attempt at the ridge trail. It's going to take a new control system to generate the different PWM control. A method of threadlocking which doesn't crack plastic is also required.
RC cars must either always operate near the ragged edge of failure or going continuously uphill for 1.5 miles is way beyond the normal regime.
The haze of the atmosphere & the rise of the mountains above the horizon sell it, like a miniature Earth made of solid nitrogen. It was a miracle of a lucky shot, taken in total silence, as a completely silent world quickly passed below, 4.6 billion miles from the nearest human.
It could have just as easily been a featureless plain. There was no way to redirect the camera toward something interesting or plan for mountains to be in the shot. 1 brief flyby in 60 years of spaceflight was the limit of human capability.
A simple surface classifier using variance didn't look any different than any other method. Among basic classifiers for surfaces, there's also comparing high frequency energy. Neural networks fall over when given data outside their training database. When looking at 16x16 blocks without the benefit of anything besides edges, it's hard to differentiate between materials.
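The variance classifier amounts to a per-block statistic. A minimal sketch, with the block size from the text & a cutoff that is purely an assumption:

```python
import numpy as np

# Sketch of the variance-based surface classifier: split a grayscale image
# into 16x16 blocks & label a block "path" when its variance is below a
# cutoff (smooth pavement vs rough dirt). The cutoff is an assumption.
def classify_blocks(gray, block=16, cutoff=100.0):
    h, w = gray.shape
    rows, cols = h // block, w // block
    labels = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            tile = gray[r*block:(r+1)*block, c*block:(c+1)*block]
            labels[r, c] = tile.var() < cutoff   # True = smooth = path
    return labels
```

High frequency energy or edge counts would slot into the same per-block loop, which is why the methods all end up looking alike at 16x16.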
Another idea for using multiple chromakey colors appeared. Regardless of the lighting conditions, the path has only 2 colors: the sunlit color & the shaded color. The trick is identifying color & threshold.
Interesting comparison between the same window size from the 640x480 webcam & the 4k 'pro. That makes the idea of surface detection seem more practical. Unfortunately, a practical system would have to get by with a crummy webcam. It would have to use some kind of still photo mode that could go higher than 1080p.
There was a theory that texture matching could be a better path detector, but it would require much higher resolution. Recorded 7 miles of 4k. Then the battery died. Did it at higher speed than the range test, with extra cargo. That's the only explanation. The battery only took 4444mAh, so it's lost a lot of capacity.
After walking the final 1.25 miles, finally viewed the footage.
The nearest part of the image had well defined differences between path & dirt.
Edge detecting makes it resistant to shadows, still leaving a well defined boundary. It would merely need to compare the amount of noise to find the path.
It was a really brutal RC truck drive in the 95F heat, set to 10 min/mile. Got cramps, nausea, & dehydration. The 1st charge with the Accucell revealed it could probably go 12 miles on 5Ah. Consuming constant power instead of constant current would give slightly less range for the last 2 Ah.
The accucell uses the main power leads to measure voltage, so it measures slightly above the voltage from the balancing leads. The resistance in the balancing leads doesn't seem to impair it, though it could use a test to see how much current is going through the balancing leads.
The passive matrix display brings back memories of the very 1st laptops, 30 years ago. That was as good as it got.