Jack Crossfire's blog
Archive for April, 2013
Posted by Jack Crossfire | Apr 30, 2013 @ 04:22 AM | 8,236 Views
Saturn's closest approach was today, so it was time for some T4I goodness.

Averaging 37 frames. ISO 1600 1/100 F/8 800mm

Single frame. ISO 800 1/100 F/8 800mm
Posted by Jack Crossfire | Apr 30, 2013 @ 12:57 AM | 8,371 Views
Laser Power over Fiber demo - by LaserMotive (1 min 6 sec)


The intriguing part was that they used everyone's favorite indoor quad, the Syma X1, to show it could carry the required current. Unfortunately, the solar cell was too heavy. They didn't disclose the amount of power lost in the fiber, either.

To fly a Blade MCX using copper wire, it would take 30V on the ground to get 4.2V to the aircraft.
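
The arithmetic behind that figure, as a rough sketch in C. The wire gauge, tether length & current draw are assumed values, not anything from the original test:

#include <stdio.h>

int main(void)
{
    /* hypothetical numbers: 38AWG magnet wire is roughly 2.1 ohms/meter */
    double ohms_per_meter = 2.1;          /* assumed 38AWG copper */
    double length_m = 6.0;                /* assumed tether length */
    double round_trip_m = 2.0 * length_m; /* current flows up & back */
    double current_a = 1.0;               /* assumed Blade MCX draw */
    double v_aircraft = 4.2;              /* voltage needed at the aircraft */

    double r_wire = ohms_per_meter * round_trip_m;
    double v_ground = v_aircraft + current_a * r_wire;

    printf("wire resistance: %.1f ohms\n", r_wire);
    printf("required ground voltage: %.1f V\n", v_ground);
    return 0;
}

With those guesses, the required ground voltage comes out near the quoted 30V.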

Anyways, some more tests with the Towerpro 2410-9 showed you can get more current by running the L6234 at its minimum voltage, 7V. That got it delivering 0.9A, but even temporarily feeding 2A wasn't enough to get smooth motion out of the Towerpro.

A brushless gimbal motor controller is never going to have the full capacity of the MOSFETs, since the motor is always stalled. It's always going to be dropping a huge voltage in the MOSFETs to generate a slice of a sine wave. 0.5A is probably the most you can expect, judging by how these controllers are only ever photographed without heatsinks & only have small Molex connectors. They're better off running at 7V & higher PWM than 12V & lower PWM.
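
A back-of-the-envelope C sketch of the stalled motor math. The winding resistance is a guess, not a measurement:

#include <stdio.h>

int main(void)
{
    double r_winding = 2.0;  /* assumed stalled winding resistance, ohms */
    double i_target = 0.9;   /* the current observed at 7V */
    double v_avg = i_target * r_winding; /* average voltage the stalled winding needs */
    double supplies[] = { 7.0, 12.0 };

    for (int i = 0; i < 2; i++) {
        double duty = v_avg / supplies[i];
        printf("%4.1fV supply: %2.0f%% duty for %.1fA stalled\n",
               supplies[i], duty * 100.0, i_target);
    }
    return 0;
}

The same target current takes a higher duty cycle at 7V, which presumably means less ripple current in the winding for the same average, & may be why 7V worked better.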
Posted by Jack Crossfire | Apr 29, 2013 @ 12:46 AM | 7,750 Views

So the MacBook has ended up useful for testing programs that are too hard to get working on Linux. It's even easier than Windows. The main program was Hugin. After a day of compiling it, the Linux build would always prove unusable. The Mac version showed it has finally reached a barely useful point, after 14 years.


Some of the aerial panoramas can finally be stitched. With only 4GB of RAM, it takes many hours, but now you can finally begin to see what the original copter saw. The Linux boxes also have only 4GB, but run Hugin a lot faster.

It's an extremely clunky program. Most of the time, it can't align the images. It doesn't render them in the order it previews them, so you have to constantly do test renders. Dialog boxes don't close without manually killing processes.


Posted by Jack Crossfire | Apr 28, 2013 @ 04:15 AM | 7,621 Views
So EDFs don't make a multirotor lift more in a reduced area. They require much more battery mass than the increased thrust they produce. The comments for the mighty EDF70 tell a tale of 60A consumed, just about the limit of a 3Ah battery.

https://www.hobbyking.com/hobbyking/s...Assembled.html

They require a 70A ESC.

https://www.hobbyking.com/hobbyking/s...k_70A_ESC.html


Each one would need its own battery.

The easiest way to get more out of a multirotor the size of a Syma X1 is to stack 2 on top of each other, connect 2 motors in series, & power it off 8.4V. You also want higher voltage for sonar range & a CCD camera.
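
The series hookup arithmetic, sketched in C with an assumed per-motor current. 2 stacked frames give 8 motors, wired as 4 series pairs:

#include <stdio.h>

int main(void)
{
    double v_pack = 8.4;  /* 2S lipo */
    double i_motor = 2.5; /* assumed per-motor draw, amps */
    int pairs = 4;        /* 8 motors as 4 series pairs */

    printf("per-motor voltage: %.1fV\n", v_pack / 2.0);  /* stock 4.2V */
    printf("2S pack current: %.1fA\n", pairs * i_motor);
    printf("1S parallel equivalent: %.1fA\n", 2 * pairs * i_motor);
    return 0;
}

Each motor still sees its stock 4.2V, while the pack current halves compared to wiring all 8 in parallel off 1S.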


The next step is to build a custom, brushless one. The brushless one would require custom ESCs to be as stable as the brushed one. It's quite involved.
Posted by Jack Crossfire | Apr 27, 2013 @ 12:26 AM | 7,551 Views
There seems to be a gigantic number of useful tools in the Ardu universe. Andropilot, Qgroundcontrol, Arducopter, Arduplane, vrbrain, PX4, 3DR radio, all independent weekend hobbies, developed to the point of seeming to fit together in a cohesive solution. 3Drobotics has managed to harvest the myriad of random volunteer projects, get the hardware adopted by enough people to become standardized, make the tools interoperable, provide support for all the projects, & make them accessible enough to become widely used.


The old days of open source were dominated by someone making the beginning of a GUI, then giving up. Someone would start writing a library for later use in a program, then give up. Someone would design the architecture for a massive program, but never get past an initial function.


The Ardu universe has a lot of projects actually getting to the useful state, even if they're still never completely polished. Being in the Ardu universe gives you a lot more functionality than you can implement on your own, which wasn't always true for open source.


The difference is the rise of agile development since 1999, the motivation to finish something you paid hundreds of dollars in hardware for, & the immediate results you get from a robotics project that you don't always get in a pure software project.


The only drawback is that being in the Ardu universe means being tied to the standard Arduino-sized board & the size of vehicle it takes to lift it. There are a few blimp & rover autopilots tied to the Ardu universe, but nothing micro-sized.
Posted by Jack Crossfire | Apr 26, 2013 @ 03:43 AM | 8,203 Views
Brushless gimbal busted (1 min 11 sec)


The Towerpro/Turnigy 2410-9 motor doesn't work in a brushless gimbal. Though properly wound in a star pattern, it needs 3A to smooth out the cogging. The L6234 could only do 0.3A before overheating. At this current, cogging is a real problem & the snapping movement saturates the gyros. Maybe a higher power motor driver could do it, but it would take 6A to power 2 motors.

Other than the motor problem, the PID calibration seemed real close. The minimal software for level pointing was quite simple, but it would take some doing to get it to point down.

Prewound brushless gimbal motors have only been available for around a month, but are $30 to $50, so that ends the short experiment in a brushless gimbal. The fascination was mostly in how a 3 phase motor can be made to point like a very fast servo. They might be useful in astronomy or a large monocopter actuator. They would definitely have come in handy in 2011.
Posted by Jack Crossfire | Apr 25, 2013 @ 04:53 AM | 8,215 Views
Didn't take long to cave into that project. There's no point to it, other than the chance of having 3 axes before any product is sold. There's barely enough clearance for the motor chips & no way to solder the thermal pads. They'll need heat sinks. Also, the software is mostly redone, still in fixed point. At least it's simple.
Posted by Jack Crossfire | Apr 22, 2013 @ 07:28 PM | 11,338 Views
http://www.fpvhub.com/index.php?topic=12150.0

Unlike the brushless ESC motor controller, the brushless gimbal motor controller doesn't use any back EMF or hall effect sensors. It just steps through the hard coded waveform of a 3 phase motor by a certain amount, every certain number of PWM cycles.

The amount & direction the waveform is stepped depends on where the IMU is. If it's way off, it continuously steps through the waveform real fast. If it's dead on, the waveform doesn't step & the 3 motor wires get a fixed current.

Each wire powering the motor gets the same sine wave, offset by 120 degrees. The magnitude of the sine wave determines the torque & is hard coded. If it's a large number, your gimbal is going to suck a lot of current, regardless of how much the gimbal has to move. Only the phase changes, based on the direction & speed commanded. A more advanced controller could probably vary the magnitude, but the open source one currently doesn't.

They all use the L6234 motor controller, which is just a bag of N MOSFETs & 1 pin controlling each motor wire. If the pin is high, current goes into the motor wire. If the pin is low, current goes out of the motor wire. The open source controller never turns off a motor wire. There's no fancy hardware phasing of the PWM. The 3 turning waveforms are all created by software setting 3 PWM duty cycles.
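
A minimal C sketch of that scheme: a hard coded sine table with 3 taps 120 degrees apart, stepped by the IMU error every update. The table size, 8 bit PWM range & error scaling are made up numbers, not anything from the fpvhub firmware:

#include <stdint.h>
#include <math.h>

#define TABLE_SIZE 256            /* 1 electrical revolution */
static uint8_t sine_table[TABLE_SIZE];
static uint16_t phase;            /* current position in the waveform */

/* magnitude 0..1 is the hard coded torque setting */
void table_init(double magnitude)
{
    for (int i = 0; i < TABLE_SIZE; i++)
        sine_table[i] = (uint8_t)(127.5 + 127.5 * magnitude *
                                  sin(2.0 * M_PI * i / TABLE_SIZE));
}

/* called every N PWM cycles; writes the 3 PWM duty cycles */
void gimbal_update(int16_t imu_error, volatile uint8_t *pwm_a,
                   volatile uint8_t *pwm_b, volatile uint8_t *pwm_c)
{
    phase += imu_error;  /* big error: steps fast; zero error: holds a fixed current */
    *pwm_a = sine_table[phase % TABLE_SIZE];
    *pwm_b = sine_table[(phase + TABLE_SIZE / 3) % TABLE_SIZE];
    *pwm_c = sine_table[(phase + 2 * TABLE_SIZE / 3) % TABLE_SIZE];
}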

To get a smooth sine wave, they're toggling the current between in & out...
Posted by Jack Crossfire | Apr 21, 2013 @ 07:21 PM | 8,640 Views
So you want the smallest possible brushless gimbal. The knowledge base about supported motors & payloads is increasing. People are lifting DSLRs now. There's a video of the inventor flying a small quad with one.


SimpleBGC test in wind (3 min 48 sec)



All the notes are in Russian. The days of the future being written in English are over. You can reverse engineer the following:

ROLL- RCTimer CF2805-14
PITCH - Turnigy L2205-1350

60 turns of 0.2 something wire in each motor. Presumably 0.2 millimeters.

BrushlessGimbal DSLR01 (3 min 38 sec)
Posted by Jack Crossfire | Apr 21, 2013 @ 04:43 AM | 8,498 Views

The Oculus Rift was intriguing, until you realized its limitations. It has a 1280x800 screen which divides into 640x800 for each eye. It has no sound, so the sound doesn't follow your head. Like all first person 3D graphics, it's not perfectly realistic & causes nausea. Convergence point & focus don't follow your eyes.

Of course, their future plan is to upgrade it to 1920x1080, or 960x1080 for each eye. We all know how those future plans always come true. HD resolution without lag is still not possible for FPV, so forget about FPV with these.

Also, their future plan is to sell it with an omnidirectional treadmill, involving lots of innovation & of course an undetermined release date. It's really just a low friction foot platform with a baby walker to keep you from slipping off. It's the kind of imagination-intensive plastic toy we had in the 1970s.

Posted by Jack Crossfire | Apr 21, 2013 @ 01:27 AM | 7,113 Views
What a joy to have this thing flying itself again, using 40fps, PC-based, 640x240 vision. Sharing a common vision system between all aircraft will never be tweak-free. It's down to 1 day for reconfiguring & halfway to a computer-controlled dogfight game. Just need to get another MCX to lift a sonar package & rebuild a sonar array.

No obvious way to get it to lift a camera has emerged. 5V 70mA has never been so far off. It has 5 minutes of endurance with no payload.

At this point in the history of ground based vision, 1 huge problem has emerged. It has a hard time telling the direction of movement, leading to a toilet bowl effect.
Posted by Jack Crossfire | Apr 20, 2013 @ 12:57 AM | 7,741 Views
Had a go at using 2 cameras for a monocopter, but averaging 2 position readouts instead of getting a stereo readout. It was subjectively slightly better. Whether or not it really made any difference, it's a common camera platform for all vehicles, so it might as well use 2 cameras for a monocopter.

The favored vehicle for indoor flying is the Blade MCX, so the future of monocopters is unknown. The flying is so erratic & the payload with the 4100kv motor so limited that a camera wouldn't be any use.
Posted by Jack Crossfire | Apr 19, 2013 @ 03:26 AM | 7,275 Views
Finally did 26.8 miles. The fake test pilot did not go quietly into the night after Boston.


There's no way to accurately measure 26.2, so the nearest landmark required going 26.8, which would make it an ultramarathon. It was real hard. Stopped for around 1 hour at the halfway point to rehydrate & eat salted plums. The entire experience took 6 hours.


The stomach was upset & wouldn't take any food or water without real nausea. After mile 20, the glycogen ran out & the transition to burning fat happened, surprisingly late considering there was no food besides salted plums. That was real hard, but normal. There was no pain & no blisters, only the great mental challenge required to keep going on an empty stomach.


There was cramping & loss of circulation in the head in the last mile. Stopping & waiting for traffic became really hard. Had to stop in the last .5 mile to eat another plum & that reduced the cramping. Could have stopped & eaten a hamburger at any point, but pressed on.



After all 26.8 miles, it took an hour before the stomach would take any food. The next step would be to find a way to maintain salt & food levels for the entire distance. The high temperature was 77F.


Of course, there was but 1 runner on the trail who no one could best.
Posted by Jack Crossfire | Apr 18, 2013 @ 01:19 AM | 7,546 Views
So there were many comments in the config file where changes were required for POV & the takeoff algorithm was different than expected. The takeoff is the hardest thing for an autopilot. Sadly, this 4100kv motor is too hot & needs to be replaced by a 3600 to fly the complete program last flown in 2012. 3600kv 12mm motors are no longer made. They have to be wound from scratch.

Takeoff got converted to hard coded sometime after 2012. It may have been a sacrifice for ambient light. Then there was the forgotten matter of battery balancing. Since there was only 1 copter in 2012, which always had POV, there was no dual configuration.
Posted by Jack Crossfire | Apr 17, 2013 @ 06:03 PM | 7,642 Views
Poor Tenergy is the latest RC brand indicted in a terrorist attack. RC technology has gotten so much more advanced than military technology that it's more common to see familiar hobby brand names in a war zone than military industrial complex ones.
Posted by Jack Crossfire | Apr 17, 2013 @ 01:31 AM | 7,864 Views
After much fabrication & debugging, webcam vision was busted again. Machine vision in ambient light is a high shutter speed affair. Monocopters need a high frame rate to reconstruct an oval from a high shutter speed. The webcams aren't fast enough to do it in ambient light, although they could do slow shutter speed tracking in total darkness.

Since it was about a common vision system for all vehicles, that eliminated webcams from any vehicle. For the low frame rate vehicles, the boardcams can still do 640x240 to get the same depth perception as a webcam. 640x240 wasn't possible on the Raspberry Pi, but is possible on a PC.

1 week of development was necessary to discover what wouldn't work. A far cry from simply recompiling the board cam code, the code which worked in January suddenly can't fly anymore.

The ground-based autopilot is a 7-year-old ocean of benign bugs, like why does video only come in at 40fps if the beacon rate is 25Hz? Why is the current position scale way off? Why does target altitude get offset after takeoff? Why does the pitch detection sometimes invert or show a blob at 0,0? They have no bearing on why it doesn't fly.


In the single board computer sighting department, it's a credit card sized, quad core, 1.7GHz board:

http://www.hardkernel.com/renewal_20.../prdt_info.php

For all the attention arducopter is giving to augmenting GPS with the accelerometer, extremely tight vibration isolation, & extremely tight PID calibration, that...
Posted by Jack Crossfire | Apr 16, 2013 @ 02:23 AM | 7,373 Views
The problem with daylight monocopter tracking is you need to stack a lot of frames to get a useful oval. The longer the accumulation buffer, the longer it takes to reflect the current camera angle, so the camera angle overshoots. At a 30fps daylight shutter angle, the accumulation buffer needs to be 1/2 second.

1 solution is to transfer every frame to a complete hemisphere buffer, where it would be placed according to the current camera angle. It would take a lot of processing to record the buffer for debugging. Another solution is to shift the accumulation buffer, every time the camera moves. That would accumulate massive errors but consume the least memory.

All solutions require correlating the camera angle to a pixel count.
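
A C sketch of the buffer shifting option, using a made up pixels-per-degree number for the camera angle to pixel correlation:

#include <string.h>
#include <stdint.h>

#define W 640
#define H 240
#define PIXELS_PER_DEGREE 9.0   /* assumed lens calibration */

static uint16_t accum[H][W];    /* running sum of recent frames */

/* shift the accumulation buffer to match a horizontal pan */
void shift_for_pan(double degrees)
{
    int dx = (int)(degrees * PIXELS_PER_DEGREE);
    if (dx == 0) return;
    if (dx >= W || -dx >= W) {  /* panned past the whole frame */
        memset(accum, 0, sizeof(accum));
        return;
    }
    for (int y = 0; y < H; y++) {
        if (dx > 0) {           /* camera panned right: scene slides left */
            memmove(&accum[y][0], &accum[y][dx], (W - dx) * sizeof(uint16_t));
            memset(&accum[y][W - dx], 0, dx * sizeof(uint16_t));
        } else {                /* camera panned left: scene slides right */
            memmove(&accum[y][-dx], &accum[y][0], (W + dx) * sizeof(uint16_t));
            memset(&accum[y][0], 0, -dx * sizeof(uint16_t));
        }
    }
}

The vacated columns start over empty, which is where the accumulated error comes from.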
Posted by Jack Crossfire | Apr 15, 2013 @ 04:00 AM | 7,563 Views
1) An unpowered vehicle which is shot out of a gun as a shell, then unfolds wings or a parachute & glides down under control. It would be for quick situational awareness.


2) A pocket sized vehicle with a parachute which unfolds into a wing.


3) A vehicle which uses optical flow to augment GPS.


4) A micro Marcy 1 using a brushed motor & 4.2V battery.


Converting Marcy 1 to dual webcams was busted. After spending a week moving the radios to the latest protocol, you start implementing stereo vision for measuring distance. It doesn't work as well as measuring size. The distance between cameras is far smaller than the size of the copter, making it much less accurate.


The webcams aren't synchronized, so you can't detect distance in each frame. You stack frames in lieu of synchronizing cameras, but at 30fps in ambient light, you don't get a smooth oval. You get a number of constantly undulating blobs. The smallest conceivable monocopter would have a bigger diameter than the camera spacing.
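
A rough C comparison of the 2 range methods, with assumed values for focal length, camera spacing & copter diameter:

#include <stdio.h>

int main(void)
{
    double f = 600.0;        /* assumed focal length, pixels */
    double baseline = 0.06;  /* assumed 6cm webcam spacing, meters */
    double diameter = 0.5;   /* assumed monocopter diameter, meters */
    double z = 3.0;          /* true range, meters */

    /* pixels observed at range z */
    double disparity = f * baseline / z;
    double size_px = f * diameter / z;

    /* range error caused by a 1 pixel measurement error */
    double z_err_stereo = f * baseline / (disparity - 1.0) - z;
    double z_err_size = f * diameter / (size_px - 1.0) - z;

    printf("disparity %5.1fpx: 1px error = %+.2fm\n", disparity, z_err_stereo);
    printf("size      %5.1fpx: 1px error = %+.2fm\n", size_px, z_err_size);
    return 0;
}

With a 6cm baseline against a 50cm copter, a 1 pixel error costs roughly 10 times more range accuracy in the stereo case.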
Posted by Jack Crossfire | Apr 11, 2013 @ 06:39 PM | 7,779 Views
Optical flow on Posterboard vs Carpet (2 min 24 sec)


Flying a simple path by optical flow, it barely manages over carpet at low altitude & has no hope at high altitude. The high altitude run failed because sonar died, but given a more powerful transducer, the camera would have been next to fail. Still, it's smaller than any previous optical flow copter that navigated waypoints, took off, & landed on its own.

Centeye never published the algorithm it used in the 2009 demo of a copter hovering in a fixed position, using only a bag of cameras. Presumably, it merely measured the X & Y translation of the images in each camera to deduce motion in 3 dimensions.

It would have been hard coded with the distance from the cameras to the features. It couldn't have measured depth, because the cameras weren't high enough resolution & weren't spaced far enough apart.
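
What that presumed algorithm would boil down to, sketched in C with an assumed focal length & hard coded feature distance:

#include <stdio.h>

int main(void)
{
    double f = 200.0;    /* assumed focal length, pixels */
    double z = 1.5;      /* hard coded distance to the features, meters */
    double fps = 60.0;   /* assumed frame rate */
    double flow_x = 4.0; /* measured image translation, pixels/frame */
    double flow_y = -1.5;

    /* small angle approximation: meters = pixels * distance / focal length */
    double vx = flow_x * z / f * fps;
    double vy = flow_y * z / f * fps;
    printf("velocity: %+.2f m/s x, %+.2f m/s y\n", vx, vy);
    return 0;
}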

The next step in optical flow is 2 high resolution, horizontal cameras, on a panning servo. That would give the minimum amount of information required to sense 3D position, but be more robust than a downward camera with sonar. It wouldn't be as accurate as a downward camera with sonar.

It would search for the most detailed area by panning, then deduce position through depth & translation. 1 scanning pass would match the offset between both cameras. It could use the entire frame with a large scanning radius. The next pass would match the offset between frames on the same cameras. It would use a small scanning area with small scanning radius. It would need an FPGA with SRAM.
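
A minimal C sketch of 1 scanning pass as sum-of-absolute-differences matching. Frame size & scan radius are arbitrary; the 2nd pass would be the same routine run on consecutive frames from the same camera with a much smaller radius:

#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

#define W 640
#define H 480

/* best horizontal offset of the right image relative to the left, in pixels */
int match_offset(uint8_t left[H][W], uint8_t right[H][W], int radius)
{
    int best_offset = 0;
    long best_sad = LONG_MAX;
    for (int d = -radius; d <= radius; d++) {
        long sad = 0;   /* sum of absolute differences at this offset */
        for (int y = 0; y < H; y++)
            for (int x = radius; x < W - radius; x++)
                sad += abs(left[y][x] - right[y][x + d]);
        if (sad < best_sad) { best_sad = sad; best_offset = d; }
    }
    return best_offset;
}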

The cameras would have to be spaced 1 ft apart. The panning servo could be eliminated by relying on turning the aircraft to face the most detail or having more cameras.
Posted by Jack Crossfire | Apr 10, 2013 @ 01:51 AM | 8,326 Views
So what you want is a common ground station that works with all the vehicles & provides the best possible navigation. There is no practical use for dedicated ground stations based on the raspberry pi for each aircraft. There isn't enough capital to make enough of a dedicated product like that to pay for itself.

The best navigation is from a desktop computer & dual webcams at the highest resolution they can do at 30fps. There's a bag of webcams which max out at 640x480. Higher resolution would be nice. The ground station is ready for a sonar extension, which would support position detection for a 2nd aircraft in a dogfight game.


The monocopters can use 1 webcam. The Blade MCX uses 2 webcams. The problem discovered with the monocopters was how the diameter changed when they got off the ground, skewing the distance measurement. They'll have to use 2 webcams for highest accuracy.


When autonomous, flying, indoor robots become as common as microwaves, they'll use local RF positioning. For now, the camera station is the best.