Jack Crossfire's blog
Posted by Jack Crossfire | Apr 22, 2013 @ 07:28 PM | 9,075 Views
http://www.fpvhub.com/index.php?topic=12150.0

Unlike a brushless ESC, the brushless gimbal motor controller doesn't use any back EMF or Hall effect sensors. It just steps through the hard coded waveform of a 3 phase motor by a certain amount, every certain number of PWM cycles.

The amount & direction the waveform is stepped depends on where the IMU is. If it's way off, it continuously steps through the waveform real fast. If it's dead on, the waveform doesn't step & the 3 motor wires get a fixed current.

Each wire powering the motor gets the same sine wave, offset by 120 degrees. The magnitude of the sine wave determines the torque & is hard coded. If it's a large number, your gimbal is going to suck a lot of current, regardless of how much the gimbal has to move. Only the phase changes, based on the direction & speed commanded. A more advanced controller could probably vary the magnitude, but the open source one currently doesn't.

They all use the L6234 motor controller, which is just a bag of N MOSFETs & 1 pin controlling each motor wire. If the pin is high, current goes into the motor wire. If the pin is low, current goes out of the motor wire. The open source controller never turns off a motor wire. There's no fancy hardware phasing of the PWM. The 3 turning waveforms are all created by software setting 3 PWM duty cycles.
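
As an illustration of the software setting 3 PWM duty cycles, here's a minimal sketch of the stepping scheme. The table size, power constant & set_pwm_duty() function are made up, not from the actual open source firmware:

```c
#include <stdint.h>
#include <math.h>

#define SINE_STEPS 256  /* hypothetical table length: 1 electrical revolution */
#define POWER      80   /* hard coded magnitude: sets torque & current draw */

static uint8_t sine_table[SINE_STEPS];
static uint16_t phase = 0;          /* current position in the waveform */

extern void set_pwm_duty(int channel, uint8_t duty);  /* stands in for writing a timer compare register */

void init_sine_table(void)
{
    for (int i = 0; i < SINE_STEPS; i++)
        sine_table[i] = (uint8_t)(128 + POWER * sin(2 * M_PI * i / SINE_STEPS));
}

/* Called every few PWM cycles.  step comes from the IMU error:
 * large error -> big step -> fast rotation, zero error -> fixed current. */
void update_gimbal_motor(int16_t step)
{
    phase = (phase + step) & (SINE_STEPS - 1);

    /* Same sine wave on all 3 wires, offset by 120 degrees (1/3 of the table). */
    set_pwm_duty(0, sine_table[phase]);
    set_pwm_duty(1, sine_table[(phase + SINE_STEPS / 3) & (SINE_STEPS - 1)]);
    set_pwm_duty(2, sine_table[(phase + 2 * SINE_STEPS / 3) & (SINE_STEPS - 1)]);
}
```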

To get a smooth sine wave, they're toggling the current between in & out...Continue Reading
Posted by Jack Crossfire | Apr 21, 2013 @ 07:21 PM | 6,548 Views
So you want the smallest possible brushless gimbal. The knowledge base about supported motors & payloads is increasing. People are lifting DSLRs now. There's a video of the inventor flying a small quad with one.


SimpleBGC test in wind (3 min 48 sec)



All the notes are in Russian. The days of the future being written in English are over. You can reverse engineer the following:

ROLL- RCTimer CF2805-14
PITCH - Turnigy L2205-1350

60 turns of 0.2 something wire in each motor. Presumably 0.2 millimeters.

BrushlessGimbal DSLR01 (3 min 38 sec)
...Continue Reading
Posted by Jack Crossfire | Apr 21, 2013 @ 04:43 AM | 6,210 Views

The Oculus Rift was intriguing, until you realized its limitations. It has a 1280x720 screen which divides into 640x720 for each eye. It has no sound, so the sound doesn't follow your head. Like all first person 3D graphics, it's not perfectly realistic & causes nausea. Convergence point & focus don't follow your eyes.

Of course, their future plan is to upgrade it to 1920x1080 or 960x1080 for each eye. We all know how those future plans always come true. HD resolution without lag is still not possible for FPV, so forget about FPV with these.





Also, their future plan is to sell it with an omnidirectional treadmill, involving lots of innovation & of course an undetermined release date. It's really just a low friction foot platform with a baby walker to keep you from slipping off. It's the kind of imagination-intensive plastic toy we had in the 1970s.

...Continue Reading
Posted by Jack Crossfire | Apr 21, 2013 @ 01:27 AM | 5,099 Views
What a joy to have this thing flying itself again, using 40fps, PC based, 640x240 vision. Sharing a common vision system between all aircraft will never be tweak free. It's down to 1 day for reconfiguring & halfway to a computer controlled dogfight game. Just need to get another MCX to lift a sonar package & rebuild a sonar array.

No obvious way to get it to lift a camera has emerged. 5V 70mA has never been so far off. It has 5 minutes of endurance with no payload.

At this point in the history of ground based vision, 1 huge problem has emerged. It has a hard time telling the direction of movement, leading to a toilet bowl effect.
Posted by Jack Crossfire | Apr 20, 2013 @ 12:57 AM | 5,666 Views
Had a go at using 2 cameras for a monocopter, but averaging 2 position readouts instead of getting a stereo readout. It was subjectively slightly better. Whether or not it really made any difference, it's a common camera platform for all vehicles, so it might as well use 2 cameras for a monocopter.

The favored vehicle for indoor flying is the Blade MCX, so the future of monocopters is unknown. The flying is so erratic & the payload with the 4100kv motor so limited that a camera wouldn't be any use.
Posted by Jack Crossfire | Apr 19, 2013 @ 03:26 AM | 5,233 Views
Finally did 26.8 miles. The fake test pilot did not go quietly into the night, after Boston.


There's no way to accurately measure 26.2, so the nearest landmark required going 26.8, which would make it an ultramarathon. It was real hard. Stopped for around 1 hour at the halfway point to rehydrate & eat salted plums. The entire experience took 6 hours.


The stomach was upset & wouldn't take any food or water without real nausea. After mile 20, the glycogen ran out & the transition to burning fat happened, surprisingly late considering there was no food besides salted plums. That was real hard, but normal. There was no pain or blisters, only the great mental challenge required to keep going on an empty stomach.


There was cramping & loss of circulation in the head, in the last mile. Stopping & waiting for traffic became really hard. Had to stop at the last .5 mile to eat another plum & that reduced the cramping. Could have stopped & eaten a hamburger at any point, but pressed on.



After all 26.8 miles, it took an hour before the stomach would take any food. The next step would be to find a way to maintain salt & food levels for the entire distance. The high temperature was 77F.


Of course, there was but 1 runner on the trail who no-one could best.
Posted by Jack Crossfire | Apr 18, 2013 @ 01:19 AM | 5,495 Views
So there were many comments in the config file where changes were required for POV & the takeoff algorithm was different than expected. The takeoff is the hardest thing for an autopilot. Sadly, this 4100kv motor is too hot & needs to be replaced by a 3600 to fly the complete program last flown in 2012. 3600kv 12mm motors are no longer made. They have to be wound from scratch.

Takeoff got converted to hard coded, somewhere after 2012. It may have been a sacrifice for ambient light. Then, there was the forgotten matter of battery balancing. Since there was only 1 copter in 2012 which always had POV, there was no dual configuration.
Posted by Jack Crossfire | Apr 17, 2013 @ 06:03 PM | 5,625 Views
Poor Tenergy is the latest RC brand indicted in a terrorist attack. RC technology has gotten so much more advanced than military technology that it's more common to see your familiar brand names in a war zone than those of the military industrial complex.
Posted by Jack Crossfire | Apr 17, 2013 @ 01:31 AM | 5,679 Views
After much fabrication & debugging, webcam vision was busted again. Machine vision in ambient light is a high shutter speed affair. Monocopters need a high frame rate to reconstruct an oval from a high shutter speed. The webcams aren't fast enough to do it in ambient light, although they could do slow shutter speed tracking in total darkness.

Since it was about a common vision system for all vehicles, that eliminated webcams from any vehicle. For the low frame rate vehicles, the boardcams can still do 640x240 to get the same depth perception as a webcam. 640x240 wasn't possible for the raspberry pi, but is possible for a PC.

1 week of development was necessary to discover what wouldn't work. Far from being a simple recompile of the board cam code, the code which worked in January suddenly can't fly anymore.

The ground based autopilot is a 7 year old ocean of benign bugs, like why does video only come in at 40fps if the beacon rate is 25Hz? Why is the current position scale way off? Why does target altitude get offset after takeoff? Why does the pitch detection sometimes invert or show a blob at 0,0? They have no bearing on why it doesn't fly.





In the single board computer sighting department, it's a credit card sized quad core 1.7GHz board:

http://www.hardkernel.com/renewal_20.../prdt_info.php

For all the attention arducopter is giving to augmenting GPS with the accelerometer, extremely tight vibration isolation, & extremely tight PID calibration, that...Continue Reading
Posted by Jack Crossfire | Apr 16, 2013 @ 02:23 AM | 5,305 Views
The problem with daylight monocopter tracking is you need to stack a lot of frames to get a useful oval. The longer the accumulation buffer, the longer it takes the accumulation buffer to reflect the current camera angle, so the camera angle overshoots. At 30fps daylight shutter angle, the accumulation buffer needs to be 1/2 second.

1 solution is to transfer every frame to a complete hemisphere buffer, where it would be placed according to the current camera angle. It would take a lot of processing to record the buffer for debugging. Another solution is to shift the accumulation buffer, every time the camera moves. That would accumulate massive errors but consume the least memory.

All solutions require correlating the camera angle to a pixel count.
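
Shifting the accumulation buffer by that pixel count might look something like this sketch, where the buffer size, field of view & function name are all guesses:

```c
#include <stdint.h>
#include <string.h>

#define W 640
#define H 240
#define PIX_PER_DEG (W / 60.0f)   /* assumed 60 degree horizontal field of view */

static uint8_t accum[H][W];        /* frames summed over ~1/2 second */

/* Slide the accumulation buffer sideways when the pan angle changes,
 * so the old frames stay registered to the new camera angle. */
void shift_accumulator(float pan_change_deg)
{
    int shift = (int)(pan_change_deg * PIX_PER_DEG);
    if (shift == 0) return;
    if (shift >= W || shift <= -W) { memset(accum, 0, sizeof(accum)); return; }

    for (int y = 0; y < H; y++) {
        if (shift > 0) {
            memmove(&accum[y][shift], &accum[y][0], W - shift);
            memset(&accum[y][0], 0, shift);
        } else {
            memmove(&accum[y][0], &accum[y][-shift], W + shift);
            memset(&accum[y][W + shift], 0, -shift);
        }
    }
}
```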
Posted by Jack Crossfire | Apr 15, 2013 @ 04:00 AM | 5,454 Views
1) An unpowered vehicle which is shot out of a gun as a shell, then unfolds wings or a parachute & glides down under control. It would be for quick situational awareness.


2) A pocket size vehicle which has a parachute which is unfolded for a wing.


3) A vehicle which uses optical flow to augment GPS.


4) A micro Marcy 1 using a brushed motor & 4.2V battery.


Converting Marcy 1 to dual webcams was busted. After spending a week moving the radios to the latest protocol, you start implementing stereo vision for measuring distance. It doesn't work as well as measuring size. The distance between cameras is far smaller than the size of the copter, making it much less accurate.


The webcams aren't synchronized, so you can't detect distance in each frame. You stack frames in lieu of synchronizing cameras, but at 30fps in ambient light, you don't get a smooth oval. You get a number of constantly undulating blobs. The smallest conceivable monocopter would have a bigger diameter than the camera spacing.
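
The 2 distance formulas show why size beats stereo here. A sketch, assuming f is the focal length in pixels; for a given 1 pixel measurement error, the formula with the bigger baseline in the numerator loses less range accuracy, & the copter diameter is the bigger baseline:

```c
/* Distance from apparent size: the "baseline" is the copter's real diameter. */
float distance_from_size(float f_pixels, float diameter_m, float diameter_pixels)
{
    return f_pixels * diameter_m / diameter_pixels;
}

/* Distance from stereo disparity: the baseline is the camera spacing, which is
 * much smaller than the copter, so a 1 pixel error costs far more range. */
float distance_from_stereo(float f_pixels, float baseline_m, float disparity_pixels)
{
    return f_pixels * baseline_m / disparity_pixels;
}
```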
Posted by Jack Crossfire | Apr 11, 2013 @ 06:39 PM | 5,646 Views
Optical flow on Posterboard vs Carpet (2 min 24 sec)


Flying a simple path by optical flow, the carpet barely does it at low altitude & has no hope at high altitude. The high altitude failed because sonar died, but given a more powerful transducer, the camera would have been next to fail. Still, it's smaller than any previous optical flow copter that navigated waypoints, took off & landed on its own.

Centeye never published the algorithm it used in the 2009 demo of a copter hovering in a fixed position, using only a bag of cameras. Presumably, it merely measured the X & Y translation of the images in each camera to deduce motion in 3 dimensions.

It would have been hard coded with the distance from the cameras to the features. It couldn't have measured depth, because the cameras weren't high enough resolution & weren't spaced far enough apart.

The next step in optical flow is 2 high resolution, horizontal cameras, on a panning servo. That would give the minimum amount of information required to sense 3D position, but be more robust than a downward camera with sonar. It wouldn't be as accurate as a downward camera with sonar.

It would search for the most detailed area by panning, then deduce position through depth & translation. 1 scanning pass would match the offset between both cameras. It could use the entire frame with a large scanning radius. The next pass would match the offset between frames on the same cameras. It would use a small scanning area with small scanning radius. It would need an FPGA with SRAM.

The cameras would have to be spaced 1 ft apart. The panning servo could be eliminated by relying on turning the aircraft to face the most detail or having more cameras.
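
A rough sketch of the 2 scanning passes, using sum of absolute differences for the matching. Image sizes, block size & names are all assumptions, not a real implementation:

```c
#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

#define W 640
#define H 480
#define BLOCK 16

/* Sum of absolute differences between a block in image a at (ax,ay)
 * and a block in image b at (bx,by). */
static long block_sad(const uint8_t a[H][W], int ax, int ay,
                      const uint8_t b[H][W], int bx, int by)
{
    long sad = 0;
    for (int y = 0; y < BLOCK; y++)
        for (int x = 0; x < BLOCK; x++)
            sad += labs((long)a[ay + y][ax + x] - (long)b[by + y][bx + x]);
    return sad;
}

/* Pass 1: large horizontal scanning radius between the left & right cameras
 * to find the stereo offset (depth).  Pass 2 would be the same search with a
 * small radius between successive frames of 1 camera (translation).
 * Caller must keep x, y & BLOCK inside the frame. */
int match_offset(const uint8_t left[H][W], const uint8_t right[H][W],
                 int x, int y, int radius)
{
    long best_sad = LONG_MAX;
    int best_dx = 0;
    for (int dx = -radius; dx <= radius; dx++) {
        if (x + dx < 0 || x + dx + BLOCK > W) continue;
        long sad = block_sad(left, x, y, right, x + dx, y);
        if (sad < best_sad) { best_sad = sad; best_dx = dx; }
    }
    return best_dx;
}
```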
Posted by Jack Crossfire | Apr 10, 2013 @ 01:51 AM | 6,196 Views
So what you want is a common ground station that works with all the vehicles & provides the best possible navigation. There is no practical use for dedicated ground stations based on the raspberry pi for each aircraft. There isn't enough capital to make enough of a dedicated product like that to pay for itself.

The best navigation is from a desktop computer & dual webcams at the highest resolution they can do at 30fps. There's a bag of webcams which max out at 640x480. Higher resolution would be nice. The ground station is ready for a sonar extension, which would support position detection for a 2nd aircraft in a dogfight game.


The monocopters can use 1 webcam. The Blade MCX uses 2 webcams. The problem discovered with the monocopters was how the diameter changed when they got off the ground, skewing the distance measurement. They'll have to use 2 webcams for highest accuracy.


When autonomous, flying, indoor robots become as common as microwaves, they'll use local RF positioning. For now, the camera station is the best.
Posted by Jack Crossfire | Apr 08, 2013 @ 10:59 PM | 4,480 Views
Manually controlled optical flow (3 min 31 sec)


It's not the same as the ground based camera. It drifts a lot, more with increasing altitude. At least it only requires a tablet.


It actually works reasonably well when over carpet & low enough.
Posted by Jack Crossfire | Apr 08, 2013 @ 03:10 AM | 4,799 Views
Posted by Jack Crossfire | Apr 07, 2013 @ 02:59 AM | 4,824 Views
There are now a few hundred photos of this thing on the hard drive.






...Continue Reading
Posted by Jack Crossfire | Apr 04, 2013 @ 01:10 AM | 4,974 Views
Damped Harmonic Oscillators | MIT 18.03SC Differential Equations, Fall 2011 (9 min 30 sec)


Control systems is the most intimidating course in an engineering program. It's extremely intensive on symbolic calculus, yet you encounter the same nonlinear differential equation notation in any research paper about control systems.
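
For reference, the equation in the video is the damped harmonic oscillator, the canonical 2nd order system the whole subject builds on. In the underdamped case (b^2 < 4mk):

```latex
m\ddot{x} + b\dot{x} + kx = 0,\qquad
x(t) = e^{-\frac{b}{2m}t}\bigl(A\cos\omega_d t + B\sin\omega_d t\bigr),\qquad
\omega_d = \sqrt{\frac{k}{m} - \frac{b^2}{4m^2}}
```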

That's the difference between someone who has made zillions of flying things intelligent & someone who made zillions of flying things intelligent for a living. In a down economy, there are so many people who can do it, another means of weeding out candidates is needed. It comes down to who has paid their dues in formal education.


Whether the future will be dominated by an economy of self made makers building businesses in their garages or corporations building on formal college education & strict roles is a contentious issue. The fact is making enough money working for yourself to leave your day job has been the dream for centuries, whether it took the form of web site designing 15 years ago, flipping houses 7 years ago, or the maker revolution now.


There has always been a new tool, promising to bring the self employment dream to the masses. For years, Sparkfun & diydrones had declared the maker revolution would free everyone from their day jobs, just by buying their products. Guys like Dean Goedde & Jordi Munoz were posterboys on how new products like batchpcb, sparkfun breakout boards, & 3D printers...Continue Reading
Posted by Jack Crossfire | Apr 01, 2013 @ 11:46 PM | 4,603 Views
Optical flow odometry (1 min 9 sec)


The copter flies away & returns to the starting point at 2 different altitudes, using only optical flow odometry for position. The fact that it landed at nearly the takeoff position was probably coincidence, but optical flow has proven very accurate in a 30 second hover. It may depend more on time than movement.


The takeoff is hard coded for a certain battery charge. This battery may have put out a lot more than the battery it was hard coded for, causing it to bounce during the takeoff. Nothing optical flow guided besides the AR Drone has been shown doing autonomous takeoffs & landings. It takes a lot of magic.

It was a lot more stable than the ground based camera. Ground based vision had a hard time seeing the angle of movement, relative to the copter. Optical flow has precise X & Y detection in copter frame.

That finishes the hardest steps for this vehicle. The next step is mainly writing software.
Posted by Jack Crossfire | Mar 31, 2013 @ 09:01 PM | 5,532 Views
Optical flow duel: AR Drone vs Marcy 2 (0 min 55 sec)


How long can they stay in bounds? The home made micro copter did surprisingly well, compared to the commercially successful AR Drone. The degradation was reasonable for a camera with 1/3 the resolution & 2/3 the frame rate. It's not likely to get more accurate. The main problem is glitches in the Maxbotix.

Removing the 10R from the Maxbotix power filter made no difference. It now has a bare 470uF. It's not likely wifi interference. It seems to glitch over sections of paper which aren't flat. Unfolding some garbage bags might work better.

Measuring all the navigation in mm * 256 was the final leap in accuracy. Comparing it to the AR Drone showed it wasn't the math, anymore. The academic way to convert pixels to ground distance involves a tan of the pixel shift to compensate for parallax. The pixels are so small, this was neglected & the pixel shift was just multiplied by altitude.
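
A sketch of both versions, with the angular scale per pixel being a made up number:

```c
#include <math.h>

#define RAD_PER_PIXEL 0.003f   /* assumed camera angular resolution */

/* Academic version: compensate for parallax with a tangent. */
float ground_shift_exact(float pixel_shift, float altitude_m)
{
    return altitude_m * tanf(pixel_shift * RAD_PER_PIXEL);
}

/* What the firmware does: the pixel angles are so small that tan(x) ~= x,
 * so the pixel shift is just multiplied by altitude & a fixed scale. */
float ground_shift_approx(float pixel_shift, float altitude_m)
{
    return altitude_m * pixel_shift * RAD_PER_PIXEL;
}
```

With pixel angles this small, the 2 differ by a tiny fraction of a percent, which is why the tangent was dropped.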
Posted by Jack Crossfire | Mar 31, 2013 @ 03:35 AM | 6,099 Views
As bad as optical flow is at absolute position, it's hovering for around 90 seconds at 0.5m before slipping off the floor pattern. Above 1m, it slips off after 1 minute. At 1.5m, it slips off after 30 seconds. Most of the slippage seems attributable to fixed point math errors. The Maxbotix falls apart at 1.5m over posterboard.

Foaming the IMU & applying some fixed point integral magic made the attitude real stable. It's unbelievable how stable that thing became, comparable to the Blade MCX.

If scaling pixels to distance based on altitude must be done in fixed point, it's another area which would be best done with a lookup table instead of getting the angle, getting the tangent & multiplying by the altitude. Horizontal distance also needs a lot more precision than 1/256 meters. There are many navigation routines where fixed point dictates combining many steps in high level lookup tables instead of using discrete trig functions.
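
A sketch of what that lookup table might look like, with the table resolution & angular scale being assumptions:

```c
#include <stdint.h>
#include <math.h>

#define MAX_ALT_MM 4000      /* table covers 0..4 m of sonar altitude */
#define ALT_STEP_MM 16       /* 1 entry per 16 mm */
#define TABLE_SIZE (MAX_ALT_MM / ALT_STEP_MM)
#define RAD_PER_PIXEL 0.003  /* assumed camera angular resolution */

/* mm * 256 of ground distance per pixel of shift, indexed by altitude.
 * Folds angle -> tangent -> altitude multiply into 1 table read. */
static int32_t mm256_per_pixel[TABLE_SIZE];

void init_scale_table(void)
{
    for (int i = 0; i < TABLE_SIZE; i++) {
        double alt_mm = i * ALT_STEP_MM;
        mm256_per_pixel[i] = (int32_t)(alt_mm * tan(RAD_PER_PIXEL) * 256.0);
    }
}

/* Scale a pixel shift to horizontal distance in mm * 256 with no trig at runtime. */
int32_t pixels_to_mm256(int32_t pixel_shift, int32_t altitude_mm)
{
    int idx = altitude_mm / ALT_STEP_MM;
    if (idx < 0) idx = 0;
    if (idx >= TABLE_SIZE) idx = TABLE_SIZE - 1;
    return pixel_shift * mm256_per_pixel[idx];
}
```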

Stability is still comparable to the 20 second hover in the px4flow demo video. Time for some waypoints.