In exchange for expanding your executable by a large chunk, the Arduino toolchain for ARM does compile most of the floating point operations for the STM32F4, but some trig functions don't work & neither does USB. It uses software floating point, of course. Getting hardware floating point would take some hunting for the right compiler flags.
You'd be amazed what can be accomplished with software fixed point routines. All your favorite trig functions can be performed with lookup tables, in a reasonable amount of memory. Bringing up a fixed point navigation routine takes a lot less time than bringing up a floating point compiler toolchain & it's still going to be faster than software floating point.
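As a minimal sketch of the lookup table idea, a quarter-wave sine table with linear interpolation covers sine & cosine in a few dozen bytes. The 15 degree table resolution & Q10 scaling here are arbitrary choices for illustration, not any particular navigation routine's format:

```c
#include <stdint.h>

/* sin(x) * 1024 for x = 0, 15, ..., 90 degrees, plus a pad entry
 * so interpolation never reads past the end. */
static const int16_t sin_table[8] = { 0, 265, 512, 724, 887, 989, 1024, 1024 };

/* Fixed point sine: angle in whole degrees, result in Q10 (-1024..1024). */
int16_t fsin(int angle)
{
    int sign = 1;
    angle = ((angle % 360) + 360) % 360;      /* wrap to 0..359 */
    if (angle >= 180) { angle -= 180; sign = -1; }
    if (angle > 90) angle = 180 - angle;      /* fold the 2nd quadrant */
    int idx = angle / 15;
    int rem = angle % 15;
    int32_t a = sin_table[idx];
    int32_t b = sin_table[idx + 1];
    return (int16_t)(sign * (a + (b - a) * rem / 15));
}

/* Cosine falls out for free. */
int16_t fcos(int angle) { return fsin(angle + 90); }
```

More table entries buy more accuracy; 1 degree steps still only cost 182 bytes.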
There's basically execution time, development time, & longevity of the microprocessor to weigh. If your application doesn't require the fastest floating point, all your work on the toolchain has no effect on the output. The average microprocessor is end-of-lifed after 1 year, so all your investment in bringing it up has to be disposable. Bringing up the most advanced gcc optimizations for a new microprocessor can expand the development time by weeks & months beyond implementing fixed point routines.
If another new, ground-up autopilot is written for ARM, it's going to be fixed point. More likely, the next outdoor vehicle would use Arducopter. An outdoor vehicle hasn't flown in over 1 year. The 1 vehicle would take a lot of debugging to fly again & it still uses a ground based...
Arducopter is attractive for all types of vehicles because the Arduino framework has all the compilers, libraries, & floating point ready to go. Arducopter contains all the navigation routines any vehicle would need.
It would save a lot of time, but wouldn't be the bleeding edge fastest processing with scalability to video. Navigation routines of any kind require floating point.
Now that was a disappointing rover. There are too many bits & not enough bandwidth, but instead of dropping packets, it buffers packets when the reception goes down. It seems to buffer on the tablet side only. Controls lag more & more, but video never seems to lag. Disabling video improves matters, slightly. Trying to disable buffering with java.net.DatagramSocket.setSendBufferSize didn't do anything.
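For what it's worth, the equivalent experiment in C would shrink the kernel socket buffers & drain stale datagrams in user space, keeping only the newest control packet. `make_control_socket` & `recv_latest` are made-up names for a sketch; whether a tablet's network stack actually honors the requested buffer sizes is exactly the open question:

```c
#include <fcntl.h>
#include <sys/socket.h>

/* A UDP control socket with minimal kernel buffering, so stale
 * packets get dropped instead of queued during a dropout. */
int make_control_socket(void)
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) return -1;
    int small = 2048;                 /* room for only a few datagrams */
    setsockopt(fd, SOL_SOCKET, SO_SNDBUF, &small, sizeof(small));
    setsockopt(fd, SOL_SOCKET, SO_RCVBUF, &small, sizeof(small));
    fcntl(fd, F_SETFL, O_NONBLOCK);   /* never block the control loop */
    return fd;
}

/* Drain everything pending & keep only the newest datagram.
 * Returns its size, or -1 if nothing was waiting. */
int recv_latest(int fd, void *buf, int len)
{
    int got = -1, n;
    while ((n = (int)recv(fd, buf, len, 0)) > 0)
        got = n;
    return got;
}
```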
The AR Drone solved this issue. It could be a more powerful, custom radio, but they still have to make do with the tablet buffering.
This has a lot of problems, which is why Rovio gets a $42 million investment on a ball that does nothing while a rover that tries to stream video gets nothing. The mane problem seems to be the high current used by the Rover 5 motors. They take 1A, causing serious interference. Despite being UDP, someone is buffering when the connection is lost, causing driving commands to get delayed or stuck.
This rover looks dead, unless another communications method can be devised, the buffering can be disabled, or the interference can be traced to a fixable source, like voltage ripple in the 3.3V line. At least some autonomy might need to be done onboard instead of over wireless.
Also should note conventional wood glue takes 2 days to set, in a 65F apartment. After waiting 1 day for the 1st stage of wood parts to set, gave up & hot glued it together.
Finally converted the intervalometer yet again to beefier switches & an enclosure. Using lessons learned, it's now using the very reliable dip switches & slide switches from the 1st intervalometer + a very reliable battery connector to hopefully eliminate all the problems of the last 8 years of intervalometers.
Obviously, it's insane to invest all this energy in home brewing something that's $15, more reliable, & much smaller, but the parts are getting recycled instead of thrown away & it does exactly what it needs to do instead of requiring workarounds.
The dip switches have proven the most reliable way of setting mirror lockup delay, exposure time, & storage delay. LCD's had the problem of vanishing in cold weather.
Together with the super reliable wireless control, that gives all the shutter control required. The only next step would be a remote controlled pan/tilt mount, but there's no obvious need for it.
Now for a rousing XMas working on a UDP ground station interface. XMas was rained out again, so there was no point in driving anywhere.
After discovering the annual $100 fee for compiling iOS code, the tablet interface made its introduction on Android. It's too hard to fly by tablet in the tiny apartment, so all it's good for is starting the autopilot & viewing status.
Not being on iOS is not being in M.M.'s Apple world.
The real problem is the wifi dropouts. The ground station protocol always used TCP, which tends to lock up on wifi. It's programmed to crash if the connection drops out. What it needs is a conversion to UDP, a major rewrite that would have been a no brainer in the days before programming for someone else.
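A sketch of what the UDP rewrite would buy: a bounded wait instead of a blocking read, so a wifi dropout degrades into a "link down" indicator instead of a lockup. The function name is made up for illustration:

```c
#include <sys/select.h>
#include <sys/socket.h>
#include <sys/time.h>

/* Wait up to timeout_ms for a telemetry datagram.  Returns 1 if data
 * is readable, 0 on timeout, -1 on error.  Unlike a blocking TCP
 * read, a wifi dropout can only ever stall the GUI for timeout_ms. */
int wait_telemetry(int fd, int timeout_ms)
{
    fd_set rd;
    struct timeval tv = { timeout_ms / 1000, (timeout_ms % 1000) * 1000 };
    FD_ZERO(&rd);
    FD_SET(fd, &rd);
    return select(fd + 1, &rd, 0, 0, &tv);
}
```

On timeout the GUI can grey out the telemetry & keep running, then resume whenever packets reappear.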
There's also the need for more telemetry display & configuration from the ground station. Configuration always required restarting the flight control software.
When it came time to show the video preview, Java fell over. It can't get nearly the framerate of the C ground station.
Finally got the macbook running on VNC, XCode to add some widgets to a screen, keychain access to manage all the passwords Apple requires, & then ran into the dreaded "Access denied" when trying to create a provisioning profile to run the program.
So developing anything for the 'pad requires either paying $100 every year to belong to the iOS developer program or getting someone you know who's already paying up to give you a provisioning profile for every device & app you develop. It's unfathomable to have to pay $100 every year to run your own code, so that just about finishes off having an aircraft that belongs to M.M.'s Apple universe. It needs to run on Android.
There are ways of rooting it, but now we're back to writing software that no-one else can run without rooting their own device with a program that requires someone somewhere to root every new version of iOS.
Don't know what the future of this macbook & iPad is. They're basically worthless except for writing software for a paying customer that won't be runnable after a year. Have found the pad interface more tedious to navigate than Android, even though it's faster. The pad is also heavier than the Android.
After deciding on continuing with just Android development, started the 4th incarnation of the ground station interface. It has now been done once in Java with a HUD, once in Java without a HUD, once in C++ without a HUD, & once in Android. The ground station is a...
Another day of soldering & the preamp once again emerged undefeated. This was the 1st rework since the M.M. era. It was an attempt to reduce noise by running it on 5V from a linear regulator. The regulator should smooth out low frequency noise, but this idea failed. It still needs a separate battery. The next idea still remanes an isolation transformer.
The preamp finally got a 2nd channel installed, for whenever a 2nd microphone arrives. The price of the CAD GLX-2400 increased $10 since last year, typical for fiat money.
The 2nd microphone was supposed to arrive whenever the final Delta IV heavy launched from Vandenburger. That was supposed to happen by now but never did.
Once again, a small enclosure makes life unnecessarily hard. Having everything on a single board, in a rack enclosure, would be a lot easier. The key components are the 5V regulator for the preamps, the 48V phantom power regulator, & the preamps.
In other news, it occurred to me that hardly any RC pilots using 1W video transmitters actually have the required license. They've gotten so ubiquitous & the unlicensed power level has been increased over the years that no-one cares about a license anymore. Anything under 1W seems to be unlicensed nowadays. Rangevideo no longer offers a 1W 900MHz transmitter.
The best of the best Spektrum radios still dick around in the...
After many months, the next remote shutter was fabricated. It was the most expensive remote shutter ever made, with an $80 radio set capable of reaching 450 ft, but it was about recycling instead of buying a $10 remote control. The 900MHz radios actually arrived in the 1st 3 months of M.M.
The radios could probably do all the functions by themselves, but it was easier to use some microcontrollers instead of figuring out the XBee API mode. It takes 100mA when on, giving it only 8 hours on a battery charge. The next step is figuring out where to put the batteries.
Recycling all those parts which were used in projects during the 1st 5 months of M.M. showed just how much was going on, back then.
Now some ancient microcontrollers. The last major project of note using the Blackfin was the http://surveyor.com/ robot. Looking over the product matrix, they're still only 400MHz with leads or 600MHz in a BGA, with a vaporware 2 core version.
The open hardware/hobby space emphasizes the 168MHz ARM solutions for 32 bit or the Atmel AVR solutions for 8 bit. The consumer electronics space emphasizes the 1.7GHz ARM solutions. You never see a Blackfin or MSP430 anywhere.
The Procerus autopilot used the Blackfin, but it's not clear if anyone ever used all those hand coded DSP instructions for anything or if it only did basic navigation. The Blackfin might be viable in a hand soldered project requiring embedded image processing, but it would require external SRAM, making it approach a raspberry pi in size, yet be twice as slow.
The next item of concern is iPad support. Basically, there's J2ObjC which tries to convert back end Android Java to iOS Objective C but doesn't touch the GUI code. Some kind of preprocessor that would generate both Android & iOS back end code from Java would be ideal, but we seem to be back in the 1980's, with independent code bases & people dedicated to porting to every incompatible platform.
The 2nd Marcy 1 took to the air. This one is loaded down with the POV LEDs.
Visually the same, but mostly redesigned. The POV processor & mane processor now read the radio directly instead of the mane processor passing on data to the POV processor. It was 1 more wire but much simpler code. Even though it was built in July, this was the 1st time this airframe did POV. The motor immediately gets too hot, melting through the propeller.
The fancy blob detection had to go for POV to work. It's back to a simple threshold, exactly like the nighttime only version. IR vision would allow it to use blob detection.
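The simple threshold amounts to a brightness cut & a centroid over the luma plane, something like this sketch (the function name & 8 bit luma layout are assumptions for illustration):

```c
#include <stdint.h>

/* Find the centroid of all pixels above a brightness threshold,
 * the simple detector used instead of blob detection.
 * Returns 1 if any pixel passed, with the centroid in *cx, *cy. */
int threshold_centroid(const uint8_t *luma, int w, int h,
                       uint8_t threshold, int *cx, int *cy)
{
    long sx = 0, sy = 0, count = 0;
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            if (luma[y * w + x] > threshold) {
                sx += x; sy += y; count++;
            }
    if (!count) return 0;
    *cx = (int)(sx / count);
    *cy = (int)(sy / count);
    return 1;
}
```

The weakness is obvious: any lightbulb reflection above the threshold drags the centroid away from the aircraft, which is what blob detection was filtering out.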
There was supposed to be a 3rd Marcy 1, with an onboard camera. It's possible to get a very small, wireless camera, but the only fabrication possible in the apartment is a large wireless camera.
Also, if it has 802.11 on it, it's a drag to have to carry around a ground station to control it instead of controlling it directly from a tablet. The ground station is necessary because 802.11 isn't reliable enough to control it directly & having 1 system that supports both autopilot & manual control is easier than having 2 unique systems.
The trick with Marcy 1 is that once she's flying with the lights on & the thrill of such a strange device hovering subsides, she gets real boring.
So the 70fps framerate, the lack of need for a USB hub, & the lower computational load made the board cam irresistible. Despite everything flying perfectly, it was time to rebuild it again.
The board cam immediately had new problems. The radio & camera don't always initialize. It helps to leave the board powered off for a while before restarting it, but it's a real problem if it is to automatically boot on the raspberry pi without a command line to drop kick it. There would only be power cycling after observing the failure on a tablet. This didn't happen with the last build of exactly the same board, but the last build had 4 more PWM's & no camera.
Also, there are a lot of tiny cables flexing. There wasn't any notable improvement in flight. The picture had a much more refined oval & probably more accurate coordinates but the flight was equally unstable.
With all the knowledge gleaned over the last year, the ultimate vision system would now use visible light, dual board cameras on separate turrets & separate USB connections producing 320x240 at 70fps. That would give the best velocity measurements. It would be nice if an IR board cam was easily obtained, but the lack of such a camera & the reduced power needs of visible LED's make IR impractical.
Without the instant velocity measurement of doppler shift that GPS provided, all indoor vehicles have suffered from delayed velocity measurement. The only solution is to increase the framerate...
Finally got her to fly with the lights on. There are a lot of problems with lightbulbs reflecting on reflective surfaces. These problems never happened with IR, but IR should be no different.
Detecting the XYZ of a spinning object with 1 camera in high shutter speed mode is really hard. It would be easier with 2 cameras.
There is a new takeoff algorithm, relying on hard coded throttle for a given battery voltage to instantly leap off the stand. It's a very unstable system. Technically, it should be more stable than the previous throttle ramping algorithm.
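The hard coded throttle idea can be sketched as a lookup from battery voltage to an initial throttle, interpolated between 2 calibration points. The endpoint numbers below are entirely made up for illustration, not the real calibration:

```c
/* Takeoff throttle (0..255 PWM) as a linear function of battery
 * voltage in millivolts.  A sagging battery needs more throttle to
 * leap off the stand.  All constants are hypothetical. */
int takeoff_throttle(int battery_mv)
{
    const int mv_lo = 3500, mv_hi = 4200;   /* assumed 1S lipo range */
    const int pwm_lo = 220, pwm_hi = 170;   /* more throttle when sagging */
    if (battery_mv <= mv_lo) return pwm_lo;
    if (battery_mv >= mv_hi) return pwm_hi;
    return pwm_lo + (pwm_hi - pwm_lo) * (battery_mv - mv_lo) / (mv_hi - mv_lo);
}
```

The instability comes from the open loop leap: the feedback only takes over after the vehicle is already airborne.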
Finally got sound from the Canon T4I. It's not always in focus, but it's a real narrow depth of field & real expensive lens.
Many years ago, the feedback algorithm went to a fixed integral step for all error. Today, it went back to a variable integral step & got a lot more stable. The Syma X1 is the most unstable vehicle in the indoor fleet. It usually can't take off successfully.
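A minimal sketch of the difference: a fixed step adds the same increment for any error, while a variable step scales the increment with the error, so small errors stop pumping the accumulator. The gains & clamps below are illustrative, not the flight values:

```c
#include <stdint.h>

typedef struct { int32_t i_accum; } feedback_t;

/* One update of the integral term with a variable step: the
 * accumulator moves in proportion to the error instead of by a
 * fixed increment.  Clamps limit windup.  All scaling is made up. */
int32_t integral_step(feedback_t *f, int32_t error)
{
    int32_t step = error / 4;               /* step scales with error */
    if (step > 16) step = 16;               /* clamp the step */
    if (step < -16) step = -16;
    f->i_accum += step;
    if (f->i_accum > 1000) f->i_accum = 1000;  /* clamp the accumulator */
    if (f->i_accum < -1000) f->i_accum = -1000;
    return f->i_accum;
}
```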
Also, the award winning follower amplifier http://www.rcgroups.com/forums/showthread.php?t=1778814
failed. It appeared to have a resonant oscillation when plugged into the Syma remote control. It oscillated at 40kHz. The Syma may have drawn current at just the right frequency to resonate with the slow op-amp. So it's back to no follower amplifier & it works perfectly. There's a risk of any active component creating an oscillation.
The T4I took some comparison photos with the 5D. The 5D's only use now is the larger frame.
An 800mm F/8, 50mm F/1.4, 200mm F/2.8, & 15mm F/2.8 are shown.
For night vision & astrophotography, it's quite a step up from any previous camera in this apartment. The live view focusing magnification makes possible what was either impossible or extremely time consuming on the 5D. However, while the footage is extremely bright, it's still extremely grainy.
Considering the noise, the T4I is probably only incrementally better than a 5D in raw mode with software brightening. The 5D is still king of landscapes & portraits. If a beautiful heroine ever talked to me, I would take the 5D instead of the T4I to document the occasion. It hasn't happened in almost 40 years, but you never know.
The live view focusing magnification & incrementally better sensor of modern cameras would make the 6D the best camera ever made, but it would be hard to justify upgrading from the 5D for the level of improvement. Those features are manely used in the single application of astrophotography.
Anyways, it's about time someone employed in video software for many years & writing video software for many years before that finally came into the world of DSLR video.
It ain't 30fps of DSLR quality still photos. The color latitude isn't there. It's more like an incremental improvement in color latitude over the cheap Sanyo, with a narrower depth of field & higher bitrate. The problem seems to be lack of processing power & continuing dependence on compression.
Of course, getting good results on the Canon is easier than the Sanyo. The Canon should be incrementally less noisy in the dark, though all videos on this blog get as much light as possible.
The thought of going back to board cams has occurred. Boardcams do visible light only. Webcams do infrared. That's the only reason for using webcams.
Boardcams can do useful resolutions from 40fps to 70fps. Webcams can only do 30fps. Boardcams can probably do higher resolution on the raspberry pi than webcams, since they do RLE compression & luma keying on the board. Decompressing webcam JPEG video & luma keying takes over 30% of the pi.
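The on-board reduction might look like this sketch: key each pixel against a luma threshold, then run-length encode the resulting 1 bit line. The (flag, length) byte pair format is an assumption for illustration, not the actual boardcam protocol:

```c
#include <stdint.h>

/* Luma-key a scanline to 1 bit, then run-length encode it as
 * (flag, length) byte pairs, the kind of on-board reduction that
 * lets a boardcam send far less data than a JPEG webcam.
 * Returns the number of output bytes. */
int rle_encode_line(const uint8_t *luma, int w, uint8_t threshold,
                    uint8_t *out)
{
    int n = 0, x = 0;
    while (x < w) {
        uint8_t key = luma[x] > threshold;
        int run = 0;
        while (x < w && (luma[x] > threshold) == key && run < 255) {
            run++;
            x++;
        }
        out[n++] = key;          /* 1 = above threshold, 0 = below */
        out[n++] = (uint8_t)run; /* run length, capped at 255 */
    }
    return n;
}
```

A mostly dark frame with a few bright lights collapses to a handful of bytes per line, which is why the radio link can keep up without a USB hub.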
Boardcams can probably do lower latency, since they transfer much less data than webcams. They might have higher latency, since they have to wait for beacons to transmit.
Boardcams are a pain to program, since all the firmware is done in house. They can be daisychained & multiplexed with a 900Mhz radio, eliminating a USB hub completely.
The next step is 2 cameras on independent pan/tilt mounts, spaced more than 1ft apart. That would give the depth perception of a multicam vicon system, but use the pan/tilt mounts to emulate more than 2 cameras.
Spacing multiplexed cameras would require LVD buffer chips & reduce the frame rate. Multiplexing was manely required for synchronizing the cameras. Dual USB cables continue to be the optimum way.
If Marcy 1 proves successful with visible light, she would definitely go back to a board cam, eliminating a USB hub & making the last 2 days of PIC debugging irrelevant.
Once again, the problem isn't flying the aircraft but PIC assembly language nuances. BANKSEL tracking, pointer tracking, buffer tracking, byte order tracking, branch statement tracking, & low pass filtering using mult & add operations are all steps that could be done away with in C compiled for Atmel.
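That low pass filter is 1 line of C using only a multiply & an add, versus pages of PIC assembly tracking banks & byte order. The Q8 coefficient scaling is an arbitrary choice for illustration:

```c
#include <stdint.h>

/* First-order low pass using only a multiply & an add.
 * alpha is Q8 (0..256); larger alpha tracks the input faster. */
int16_t lowpass(int16_t state, int16_t input, int16_t alpha)
{
    return (int16_t)(state + (((int32_t)(input - state) * alpha) >> 8));
}
```

Each call replaces the BANKSEL-laden multiply-accumulate sequence with something the compiler tracks for free.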
It just needs a window of time to port the USB driver or, even better, port everything to the Arduino libraries. Development would be a lot faster, since the microcontroller isn't doing anything revolutionary.
GCC for AVR is so mainstream, it's now tracked in the GCC changelogs. No mention of GCC for PIC anywhere.