Searching far & wide for a marker system which might enable a follow cam revealed this:
Previous experience showed the classic international pink was the best chroma keying material & a doughnut was the easiest shape for current cameras to isolate. When chroma keying is combined with a unique shape, it gives 2 layers of redundancy.
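The chroma keying layer of that redundancy can be sketched as a simple color-distance threshold. The reference pink value & tolerance here are illustrative assumptions, not measured values; a blob/shape test for the doughnut would run as the second stage on the resulting mask:

```python
import numpy as np

# Hypothetical "international pink" reference in RGB; the exact
# key color is an assumption, not from the original post.
PINK = np.array([255, 105, 180], dtype=np.int16)

def chroma_mask(frame, tolerance=60):
    """Return a boolean mask of pixels within `tolerance` of the key color.

    frame: HxWx3 uint8 RGB image. This is only the first redundancy
    layer; the doughnut shape test would run on this mask.
    """
    diff = frame.astype(np.int16) - PINK
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return dist < tolerance

# Tiny synthetic frame: one near-pink pixel in a black image.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[2, 2] = (250, 100, 175)
mask = chroma_mask(frame)
```

A Euclidean distance in RGB is the crudest possible keyer; a real one would work in a chroma plane like YCbCr to tolerate lighting changes.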
A 4k camera detecting a full body, international pink, digital camouflage suit would be ideal.
Next would be digital camouflage markers, but given the reality of only having enough money for 640x480, we have only the chroma keyed doughnut.
Despite many flips & spinouts, it still managed to do it in 1h59m. Fastest mile was 8 minutes. Stopped the clock only for the battery change, but counted all the red lights & restarts. The battery died at 12 miles. Its speed rapidly dropped to 9m33s/mile. The new battery was back to 8m50s/mile. Battery life was 1h50m, giving a current consumption of 490mA with the headlights on. The right headlight constantly got bent down.
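The quoted current consumption can be backed out of the battery life, assuming the 900mAh pack used in the other rover runs:

```python
# Back out average current draw from the quoted 1h50m battery life,
# assuming a 900 mAh pack (the capacity used in the other runs).
capacity_ah = 0.9            # 900 mAh
runtime_h = 1 + 50 / 60      # 1h50m
avg_current_ma = capacity_ah / runtime_h * 1000
# ~490 mA average with the headlights on, matching the quoted figure
```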
The human was powered by a salad & some sugar sticks. It made quite a bowel movement, but no crash.
So the bottleneck in making a viable follow cam is a super high resolution camera transferring realtime video to a really small computer, where it's scanned for small, finely detailed markers. Interfacing HDMI has become the requirement for getting any kind of realtime video. You can probably get it into the Raspberry Pi's CSI bus, a parallel bus of differential pairs, but the Pi isn't fast enough to do anything with it.
The only way to do the job is an FPGA implementation of the marker tracker, doing the full SDRAM, HDMI ingestion, & marker detection in hardware. For all the hype about software, OpenCV, & running Pixhawk on Linux, hardware is still the only way to get anywhere near realizing the promises the UAV industry is making.
It's surprising no-one is focusing on hardware implementations of higher end object detection. There is a slow increase in hobbyist attention to FPGAs, but only for controlling LEDs, software radio, or very limited chroma keying from a low end VGA cam.
For all the different platforms they're trying to get Pixhawk to run on, they might as well port the autopilot to hardware. Everyone wants to be bought by Google & Google is a software company, so there's not a lot of thinking outside the box.
Another rover test ended when a motor fell off. Every part that wasn't threadlocked is revealing itself by falling off. Nothing was originally threadlocked because it wasn't supposed to work. It did 8m40s per mile when it worked. It was extremely hard to keep up. It flipped over twice on pavement cracks.
Accessing the offending bolt revealed all the sensors for the 1st time. The alignment is so critical that it didn't work when reassembled. Another sensor for the left motor had to be shifted ever so slightly closer. Fortunately, the error was a visible amount.
At 8.8V, with no headlights, the right motor now used 60mA more when going forward than when going backward, without load. Left motor used 20mA more when going forward.
At 15V, with no headlights, the no load current of a single motor was previously 250mA. The left motor was now 230mA going forward & 200mA going backward. The right motor was now 430mA going forward & 220mA going backward.
After reseating a bearing & another sensor reconfiguration, the right motor used 250mA going forward & 260mA going backward. This should have a significant impact on range. Another test run revealed it may now be too fast. So any direction problem is decidedly the mechanics rather than the sensors.
Key to the follow cam concept is object tracking & distance measurement. Just doing that without flying anything would be huge.
It's a little easier than full motion capture, since it only needs distance & position. It's a little harder, since there's only 1 camera. Doing that in daylight is still a hard problem. The last work Vicon showed was in 2011. It wasn't very convincing.
Lately, the Goog found a photo of a guy in a full body suit, covered in pattern markers. So obviously IR keying didn't work well enough. They replaced the reflective balls or LED emitters with more unique patterns a blob tracker could find.
Thresholding that image didn't look good. The head & body might resolve, but the arm is gone. A really good piece of software could combine multiple cameras to get all the blobs, but a UAV only has 1 camera.
They also tried LED emitters on the previous sequel. What's needed is a better marker & high enough resolution camera so the marker can have enough unique features. Some bands of different types of hieroglyphics on multiple body parts might work. If 1 band is covered, it knows which body part another band belongs to, but not the distance between the bands. It may be that a tether is good enough to measure distance.
It's a fascinating problem on the cusp of being solved, but best left to the pros. There are so many people now pursuing UAVs, so many experts in machine vision, navigation, & Kalman filters compared to 3 years ago, it's solvable in short order.
You see it every time you boot up Linux. Along with Firefox & OpenOffice, it's 1 of the 3 programs that have always defined the Linux desktop. It's The Gimp. It was truly extraordinary when it 1st came out. The user interface was better designed, tighter & more polished than any other free program at the time, by a wide margin.
It still crashed a lot, but it was your best hope of avoiding paying $300 for Photoshop. It couldn't swap images to disk for many years, making it require extremely large amounts of RAM, for the time. Other image editing programs like XV & ImageMagick were really hard to use.
The GUI toolkit it was written on became the standard for all the major desktop programs since, from Firefox to Google Chrome. It was ported to every platform. There was an extreme frenzy of development from 1996-1999 when Gimp, GTK, & most of the modern software world were being brought into existence.
The original authors of Gimp receded from the project but continued to be coworkers ever since & never had a rainy day, though their later work was never as famous as Gimp. They're now staff members at Square, which is considered the next Paypal & perhaps the next SpaceX.
Square was founded by the founder of Twitter. 1st encountered a Square terminal in 2012. It seemed like an obvious necessity that many banks were offering, but apparently Square was the only one. It was started in 2009, the optimum time to start a monopoly.
A better metric never discussed on the internets is how much a rocket can launch in a given timespan, rather than the amount it can launch in a single flight or the cost per mass in a single flight. The ground crew has to be paid constantly, no matter how many launches they send off.
Today's launch rates are glacially slow, compared to just 5 years ago. For all the difficulty in getting Falcon 9 to launch 3 times per year or Antares to launch once per year, it's hard to believe they were able to launch the monstrous shuttle 5 times in 2009.
The amount of mass launched in a fixed timespan is much lower today than it was 30 years ago. 30 years ago, they were putting 240,000 lbs of payload in low Earth orbit per year. It wasn't very good for the crew, but it was the highest capacity ever achieved. At $4 billion/year, it was $17,000/lb.
Today, Falcon 9 can put around 22,000 lbs per year in low Earth orbit. Antares can deliver 4400lbs per year. There aren't any figures for how much it's costing. If it was equivalent to the shuttle, Falcon 9 would be burning roughly $374 million/year. Antares would be burning roughly $74 million/year.
It actually doesn't sound far off what the rumors have been. There isn't any of the disclosure under the commercial programs that there was under the shuttle program. $57 million per flight could be a significant loss.
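The arithmetic behind those burn rates is straightforward, using the post's rounded $17,000/lb shuttle figure:

```python
# Sanity-check the shuttle-era $/lb figure, then project it onto
# today's flight rates, as the post does.
shuttle_lbs_per_year = 240_000
shuttle_cost_per_year = 4e9
cost_per_lb = shuttle_cost_per_year / shuttle_lbs_per_year  # ~$16,700/lb

# At the rounded $17,000/lb rate:
falcon9_burn = 22_000 * 17_000   # ~$374 million/year
antares_burn = 4_400 * 17_000    # ~$75 million/year
```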
The 1st run with the brushless direct drive rover went way beyond predicted range, yet again. Range on 2S 900mAh was over 8 miles. Maximum downhill speed was 8 min/mile. Uphill speed sagged to 8m50s/mile. It was hopeless on rough terrain. It needed D feedback & shorter battery cables.
It was finally documented with the latest improvements.
The most expensive apartment in town is equivalent to the current situation, but 10 years behind in price. It has only 1 wall with windows, has open space, has a really solid front door. The most reasonable cheap place is another $300 cheaper, has 2 sides with windows, no open space. It's in a dump, has bars on the door with a separate lock, no bars on the windows, & has a really dumpy pool visible from the street. The difference is $3660/year. Between those 2 are other places which are extremely dumpy or not reasonable.
There's definitely a correlation between price & standard of living. It's not easy to endure a drop in standard of living & it seems worth spending zillions of dollars on happiness, which brings up the blond hottie paradox. Do unattainable middle age blond hotties spend zillions of dollars on expensive stuff because they can or because they're miserable? The other factor is if the price was significant, wouldn't you already have downgraded? Another choice of standard of living over price happened in 1998. It still seems impossible to say it wasn't worth it.
The mortgage & student loan booms have turned the west half of Bakersfield into an extension of LA, while the east half descends into the abyss. New construction, valley girls, & shiny black Audis straight off the 405 abound. The boom is far from the freeways, so it's not obvious to the Las Vegas trade show travelers, but a new superhighway seems destined to connect the new area to the 5.
The weather is difficult, but everywhere is difficult compared to the bay area. Eliminating sources of stress seems to be a noble priority.
The 1st road test with motor sensors worked perfectly. It definitely needed brakes & slightly faster startup. The stalls were gone. 7.4V got it to the 7.5mph range, without getting hot. Based on the rolling distance, it probably was near the no load current of 0.3A. It was virtually silent. Radio range was only 10ft.
It easily rolled over rough terrain. The rolling distance without brakes was surprisingly long. Even with no power, it rolled into curbs hard enough to flip over, but didn't flip over on rough road. The sensors seemed to withstand it. The castor wheel assemblies crumpled, reducing the impact.
Startup was a hard coded stepper acceleration, transitioning to commutating mode, to keep the wheels from slipping. That failed miserably. It slipped when rotating to the 1st step. The sensors couldn't make it transition into commutating mode seamlessly. Tried ramping PWM without a stepper state & it still lost steering. Carefully setting it to wait for the right stepper phase would probably solve the transition problem, but not the 1st step.
The only practical way to get straight driving was to go straight into full power. It didn't slip as much as feared. It used less current. Stepper acceleration is a dead topic. It consumes a lot of current for purely visual appeal.
After implementing heading hold steering & all the manual controls, it worked perfectly. All the issues with bang bang steering were gone.
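A heading hold loop of the kind described can be sketched as a proportional controller on heading error driving a differential-drive speed split. The gain, PWM scale, & sign convention here are illustrative assumptions, not the rover's firmware:

```python
def heading_hold(target, heading, base_pwm, kp=2.0):
    """Proportional heading hold for a differential-drive rover.

    Replaces bang-bang steering: the wheel speed difference is
    proportional to the heading error instead of slamming full
    left or full right. Gain & PWM scale are placeholders.
    """
    error = (target - heading + 180) % 360 - 180  # wrap to [-180, 180)
    correction = kp * error
    return base_pwm + correction, base_pwm - correction

# No error -> both wheels at base speed; 20 degrees of error
# across the 360/0 wrap -> a proportional speed split.
straight = heading_hold(0.0, 0.0, 100.0)
turning = heading_hold(10.0, 350.0, 100.0)
```

The wrap to [-180, 180) is what keeps it from spinning the long way around when the heading crosses 0.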
The original sensor alignment was too widely spaced. The alignment had to be spot on to get reliable commutation. The sensors had to be right on the edge of the magnets. The misaligned sensors gave 111, 000, or just 2 patterns, with very small ranges for the other patterns. The aligned sensors gave all 6 patterns.
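3 hall sensors yield 8 possible codes, of which only 6 correspond to valid rotor positions; 111 & 000 are exactly the misalignment symptoms described. A lookup table like this is the usual decode, though the particular hall-code-to-phase assignment below is an illustrative convention, not the rover's actual wiring:

```python
# Map each valid 3-bit hall code to a commutation step.
# 0b000 & 0b111 never occur with aligned sensors; seeing them
# means misalignment or a disconnected sensor.
HALL_TO_PHASE = {
    0b001: 0, 0b011: 1, 0b010: 2,
    0b110: 3, 0b100: 4, 0b101: 5,
}

def commutation_step(hall):
    """Return the commutation step for a hall code, rejecting the
    2 invalid codes a misaligned sensor produces."""
    if hall in (0b000, 0b111):
        raise ValueError("invalid hall code: sensor misaligned or dead")
    return HALL_TO_PHASE[hall]
```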
An experiment with derating the commutation time revealed very little derating could be done before the commutation time was too long to get useful back EMF. The working speed of the motor is just too low to use back EMF of any kind, even though it's fast enough to start it. It only starts because the commutation time is lowpass filtered & speeds up fast enough to get it into the sustainable range. The only solution is a sensor.
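The lowpass filtering of the commutation time mentioned above is typically a single-pole IIR filter; the smoothing constant here is an illustrative value, not the one used:

```python
def lowpass(prev, measured, alpha=0.1):
    """Single-pole IIR lowpass on the commutation time.

    Smaller alpha filters harder but reacts slower to speed
    changes; 0.1 is a placeholder, not a tuned value.
    """
    return prev + alpha * (measured - prev)

# Filtered commutation time converging toward a steady measurement,
# which is what lets the startup accelerate into the sustainable range.
t = 1000.0
for _ in range(50):
    t = lowpass(t, 500.0)
```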
The loss of steering during startup was caused by the motors not changing state at the same time. Having the phase & state changes synchronized solved that problem until they stalled. At 15V, they still occasionally stalled when hitting rough pavement. A sensor would solve that & decrease the current.
There are no teardowns of a sensored motor, but blurry, shaky videos appear to show home made attempts with 3 hall effect sensors outside the motor casing. 1 guy used them in a modified e-bike. E-bikes are as common in the bay area today as Priuses were, 10 years ago. Of course, no-one has a clue how they work. That requires going to Hong Kong.
There are no examples of 2 sensored motors being used. It's a very complicated ordeal that makes people give up after converting 1.
A single A1321 sensor is applied to the outside & powered by 3.3V.
Decided to timelapse everything until the grand finale. Ran down with the guerilla pod, which was useless for video. At least the traffic was better than driving down with the video tripod. Being many years since the last manually exposed fireworks video, forgot everything.
1/30 F/2.8 ISO 1600 was a decent exposure. Stopping down would have sharpened the dots, but added noise. Desperately needed a video of a past show to know where to point the camera, but Sprint was completely down. Had to reposition during the show. 15mm with 1.6x cropping was just wide enough to get everything.
Audio was at the lowest setting. Clapping produced a decent approximation of the maximum loudness. Should have used a simple pair of outboard electret condensers, at minimum. Anything would have had better stereo separation than the Canon's microphone. Amazingly, it still managed to differentiate the guy on the right.
Now, they're doing it at slightly closer ranges, with modern wide angle, stabilized cameras. It has always depended on a very accurate attitude estimation with GPS coordinate triangulation.
The athlete still looks like a tiny dot. The newest videos are a lot more edited for when the athlete goes out of frame. It has the feel of trying to get a lot more mileage out of the same old capability.
Getting a closeup is really hard. The easiest way is to have a very long lens camera with many levels of stabilization & chroma key detection of the athlete. The last of the Sony Handycams had excellent stabilization. So far, they're all marketing GoPro cameras, so this method is not being demonstrated any time soon.
The HDR-PJ540 is the current optically stabilized one. It's quite large & expensive. Stabilization on that level is now a novelty feature, since no-one cares. It's a huge investment, just for the follow me mode.
Another way is to fly up close. The advantage is a much easier time initializing it, much smaller vehicle, getting the athlete started in frame, & ability to fly in confined spaces.
GPS is no good. The athlete has to be the navigation reference. Time of flight cameras & structured light cameras don't work in daylight. The movie camera alone can resolve position, within strict limits. It would take having the athlete wear multiple chroma key markers & having 2 markers in frame at all times. The markers either have to be a constant distance apart or they need a way to tell how far apart they are, maybe by some electromagnetic sensor which has been demonstrated. Single camera autopilots have been done before.
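With 2 markers a known, constant distance apart, a single camera can recover range from the pinhole model: distance = focal length x marker separation / pixel separation. The numbers below are illustrative assumptions, not calibrated values:

```python
def marker_distance(separation_m, pixel_separation, focal_px):
    """Pinhole range estimate from 2 markers of known spacing.

    separation_m: known physical distance between the markers.
    pixel_separation: measured distance between them in the image.
    focal_px: focal length in pixels, from camera calibration.
    All values here are assumed for illustration.
    """
    return focal_px * separation_m / pixel_separation

# Markers 0.5 m apart, 100 px apart in the image, with an assumed
# 600 px focal length -> athlete is 3 m away.
d = marker_distance(0.5, 100, 600)
```

The strict limit is obvious from the formula: range error scales directly with pixel measurement error, which is why a 640x480 camera runs out of resolution quickly.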
It still would need GPS at long range. Any method using chroma keying without a GPS aid is going to be prone to false positives & flying away.
The most amusing part is that biology majors still make the least, even when they work in engineering. Coworkers who studied physics did indeed have a much easier time getting jobs. The study probably only looked at people working in engineering.
After taxes, the highest bracket only makes $1.5 million in his lifetime. The average house in a dot com area is $2 million.
The author made a big deal about eliminating selection bias, basically people getting engineering degrees because they're smarter in the 1st place. Wish studies that showed married women live longer than single women would correct for selection bias. Men pick women in better health. Marriage has the opposite effect on men's health.
So the blog is moving to Bakersfield because Tampa is completely unaffordable. The out of state tuition alone is bleak. The rent is equivalent to Silicon Valley, 10 years ago. Bakersfield has the same temperature as Death Valley, but the heat index is the same as Tampa. Who knew all those tourist attractions in Fl*rida had the same heat index as Death Valley.
Decided to kick it up to 15V with fixed commutation mode & it actually exceeded the required speed at 8.5mph. It sucked 2.8A to start & 1.9A to run. Keeping it going straight during the start was a problem. After commutation begins, it's quite stable. The commutation was much slower than the optimum speed, but required to get enough torque. The coils got too hot.
Some more debugging & speed regulation could slightly reduce the current. The idea is to use back EMF, but add a certain delay so the voltage is regulating speed, yet it's less sensitive to stalling. This has been tried before, without success. Its range on 4S 900mAh would probably be 20 min or equivalent to the brushless Losi micro T on 2S 200mAh.
Another idea would be to measure the back EMF time. If it was too far ahead of commutation, reduce the voltage in the next phase according to a PID controller. If it was too far behind, increase the voltage in the next phase. That would give better torque, but not slow down on hills. Normal rovers use flywheels or slip clutches to avoid stalling.
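The idea above is a PID loop on back-EMF timing error: if the back EMF lags commutation, raise the next phase voltage; if it leads, lower it. A minimal sketch, with placeholder gains rather than anything tuned for this motor:

```python
class PID:
    """PID on back-EMF timing error, adjusting phase voltage.

    Positive error = back EMF too far behind commutation, so the
    output raises the next phase voltage; negative error lowers it.
    Gains are illustrative placeholders.
    """
    def __init__(self, kp=0.5, ki=0.05, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

pid = PID()
# Back EMF arriving 1 unit behind commutation -> positive voltage bump.
dv = pid.update(1.0, 0.01)
```

As noted, this gives better torque but no speed droop on hills, which is what the flywheels & slip clutches in normal rovers are compensating for.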
The time for this has probably run out. There probably is a future in direct drive brushless rovers. They offer a way to get a wide range of speeds or more biomorphic movement. A direct drive robot can slowly rotate to scan a scene, then dart forward. A conventional robot is geared to go either extremely slowly or extremely fast.
So the VIIRS was built in LA. There's no data on how it works, whether it's based on a large aperture or an extremely radiation hardened DSLR sensor. It's sensitive enough to detect a flashlight in the ocean. Connoisseurs of its data have noticed a large patch of lights in N Dakota which didn't exist 15 years ago.
It was worth documenting the last bit of data from the back EMF motor control fiasco. It begins with the full power applied.
Oscilloscope plots of the full power voltage are never shown, but they have the clearest view of the 2 powered phases with the back EMF clearly visible in the floating phase. There's a glitch where it goes from 9V to floating. Software detects the halfway point to determine the commutation time.
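The halfway-point detection is simple in software: sample the floating phase & find where the back EMF crosses the midpoint between the rails. The sampled ramp below is synthetic, & the 9V rail matches the plot described:

```python
def halfway_crossing(samples, low=0.0, high=9.0):
    """Return the index where the back EMF first reaches the
    midpoint between the rail voltages, or None if it never does."""
    mid = (low + high) / 2
    for i, v in enumerate(samples):
        if v >= mid:
            return i
    return None

# Synthetic linear back-EMF ramp from 0 V to 9 V over 10 intervals;
# the crossing index sets the commutation time for the next phase.
ramp = [0.9 * i for i in range(11)]
idx = halfway_crossing(ramp)
```

A real implementation would also have to blank out the glitch where the phase goes from 9V to floating, or it would false-trigger on it.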
The normal oscilloscope plot shows PWM modulated voltages. The back EMF phase is less visible. The writers for EETimes tend to be more interested in showing they fully understand the concept personally, with the most obtuse diagrams, rather than conveying it to someone else.