Jack Crossfire's blog
Posted by Jack Crossfire | Oct 22, 2008 @ 02:48 AM | 3,239 Views
Looks like India is the aerospace champion today. Also looks like we have another round of IMU alignment to do. The PID equations did no better than the static neural networks: lots of underdamped oscillation & nasty rate damping oscillations on landing.

In other news, the video of Lockheed's head of spacecraft survivability & engineering at http://www.flightglobal.com/blogs/hy...orce.html#more drove home the point, & the point is this: knowing that the smart people in college would become accomplished spacecraft designers at Lockheed by the same age U were a flat broke, low paid programmer would have made no difference.

U still would have goofed off & daydreamed, & become a flat broke, low paid programmer.
Posted by Jack Crossfire | Oct 21, 2008 @ 02:13 AM | 3,341 Views
Looks like the in-flight training problem is getting bigger & bigger. Most people probably ramp up the gains in flight, compare a small period of tracking with recorded movement, & ramp up the gains some more, until the guidance error is within a certain target. Good enough for a fixed wing.

The mane problem is we're controlling angular rates, not the absolute angles the controller reads in. Humans go from seeing absolute angles to commanding angular rates, based on already knowing how fast they can turn the servos without losing control. Natural frequencies as it were.

Our feedforward-impaired computer may just have to focus on angular rate feedback, ramping up different natural frequencies for the angular rates until it gets ideal tracking.
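Something like this minimal sketch, with a hypothetical read_tracking_error() standing in for flying a short period & comparing it with recorded movement. The targets & step sizes R made up.

#include <stdio.h>

#define ERROR_TARGET 0.05f
#define GAIN_STEP 0.1f
#define GAIN_LIMIT 4.0f

/* placeholder: the real version flies a short period & compares the
   tracking with recorded movement */
float read_tracking_error(float gain)
{
    return 1.0f / gain;
}

/* ramp the angular rate gain until tracking is within the target */
float ramp_rate_gain(void)
{
    float gain = 1.0f;
    while (gain < GAIN_LIMIT && read_tracking_error(gain) > ERROR_TARGET)
        gain += GAIN_STEP;
    return gain;
}

int main(void)
{
    printf("settled on gain %f\n", ramp_rate_gain());
    return 0;
}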

Finally found that paper on the neural network in GA Tech's copter: Eric N. Johnson and Suresh K. Kannan, "Adaptive Flight Control for an Autonomous Unmanned Helicopter." Lots of crazy symbols. The author's resume looks like the aeronautics program of a small country & he's just an assistant professor. At least he's overweight. He got an MS from MIT. The top software architect of Sony's BluRay player also got an MS from MIT. Might also be a few heads of state in there.

Full professors at GA Tech R indeed a real small club, each of whom probably accounts for 1/5th of the current human knowledge of control systems. Maybe there's a way in from the private sector. Maybe not.

The "NN" seems to be substituting for the integral in a PID equation. He calls integral terms the "trim" & has the remaining feedback as just "PD". The neural network accepts detected position, velocity, linear acceleration, attitude, angular velocity, & angular acceleration. It outputs some kind of offset to convert the output from conventional PD equations to nonlinear output.

Our first use of NN's was assisting the navigation estimate. That seems 2 B what the GA Tech NN is doing.
Posted by Jack Crossfire | Oct 18, 2008 @ 08:33 PM | 3,607 Views
Well, did our mandatory test flight with 100% neural control but no adaptation. The networks were trained for linear feedback.

Attitude control was definitely underpowered & underdamped. Forgot what we were doing with the throttle & may have had it on a low value for stable air. No surprises.

Was just about to revert to PID equations for comparison when the nose dove. Starboard cyclic servo failed after under an hour. Definitely an act of Chinese terrorism.

Also, the MaxAmps 3.3Ah did puff slightly after flight & recede after it cooled off. What we're seeing may be a natural event most people live with. As for the servo budget, not flying again until an adaptation algorithm is in place.
Posted by Jack Crossfire | Oct 18, 2008 @ 01:58 AM | 3,003 Views
The delusions of being able to fix the shutter quickly vanished as we dug deeper into the 20D. Then the delusions of being able to reuse the sensor vanished when we saw the number of pins & that they were on extremely fine pitch ribbon cables.

As for reusing the hot shoe & lens mount on their own, the electrical contacts are integral parts of the plastic body. Would have to grind the body up & that still wouldn't give up the lens protocol or provide the plug for a flash extension cord. The lens mount is also for a 62% sized sensor.

U laugh at that dinky sensor nowadays & the sacrifice in coverage once required to get digital pictures out of 35mm lenses. Most people still buy those 62% sized sensors.

Going to miss the 20D. Have a lot of fond memories of it. It recorded a lot of great moments in history, an elegant camera for a more civilized age...
Posted by Jack Crossfire | Oct 17, 2008 @ 02:54 AM | 2,911 Views
After mandatory procrastinating, decided to put down the recurrent neural network & use less sexy feedforward neural networks. There will B neural networks 4 proportional & rate feedback & linear equations for very small integral feedback. Add the neural & integral outputs to get the total feedback. The neural networks are modeled after a PID equation initially & use backpropagation to adapt in flight.
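A minimal sketch of the plan, with made-up network sizes & gains: 2 tiny feedforward nets handle proportional & rate feedback, a plain linear equation handles the very small integral, & the outputs get added. The nets start out trained to match the P & D terms of the old PID equation, then backpropagation nudges the weights in flight.

#include <math.h>
#include <stdio.h>

#define HIDDEN 4

typedef struct {
    float w1[HIDDEN], b1[HIDDEN];  /* input to hidden */
    float w2[HIDDEN], b2;          /* hidden to output */
} net_t;

/* single input feedforward network with a tanh hidden layer */
float net_eval(const net_t *n, float input)
{
    float out = n->b2;
    for (int i = 0; i < HIDDEN; i++)
        out += n->w2[i] * tanhf(n->w1[i] * input + n->b1[i]);
    return out;
}

/* neural proportional + neural rate + linear integral */
float total_feedback(const net_t *p_net, const net_t *rate_net,
                     float error, float error_rate,
                     float *integral, float dt)
{
    const float ki = 0.01f;        /* very small integral feedback */
    *integral += error * dt;
    return net_eval(p_net, error) +
           net_eval(rate_net, error_rate) +
           ki * *integral;
}

int main(void)
{
    net_t p = {{1, 1, 1, 1}, {0}, {0.5f, 0.5f, 0.5f, 0.5f}, 0};
    net_t r = p;
    float integral = 0;
    printf("%f\n", total_feedback(&p, &r, 0.1f, 0.0f, &integral, 0.02f));
    return 0;
}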

The weeks of recurrent neural networks were necessary to know they couldn't be done. Got all the way to a recurrent network which could give proportional feedback on a graph, was stable in real life, & showed some memory, but could not offer effective proportional feedback in real life.

Getting the integral part to work seems to need a much longer training pattern than we can afford. The amount of computing time & hand tweaking to get evolution out of it wasn't justified. The implementation is on the hard drive for when a CUDA budget appears.
Posted by Jack Crossfire | Oct 16, 2008 @ 02:01 AM | 2,926 Views
Discovered X-Y charts in OpenOffice were broken & that the error of the neural network should be a Euclidean distance instead of an absolute difference. That, along with more evolution tweaks, fewer neurons, & more processing time, & the genetic algorithm is staying alive.
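In code terms the change is just this, for a network with n outputs. A sketch, not the actual trainer.

#include <math.h>

/* old error: summed absolute differences */
float abs_error(const float *out, const float *target, int n)
{
    float sum = 0;
    for (int i = 0; i < n; i++)
        sum += fabsf(out[i] - target[i]);
    return sum;
}

/* new error: Euclidean distance between output & target vectors */
float euclidean_error(const float *out, const float *target, int n)
{
    float sum = 0;
    for (int i = 0; i < n; i++) {
        float d = out[i] - target[i];
        sum += d * d;
    }
    return sqrtf(sum);
}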

With the evolved neural networks, there is an optimum neuron count which is not too high & not too low & an optimum range for the starting weights. The chance of hitting a dead end with this algorithm is too high to build extra capacity into the structure.



Tomorrow's news today:

CNN: Ubama wins election

Fox: Ubama's 3rd nephew's friend's cat's sister attacks hiker with hair ball.

CNN: President Ubama awards a free SUV to every home owner over $1 million underwater.

CNN: Ubama...

Fox: Hugo Chavez accuses Ubama of being too liberal & cuts off oil.

CNN: Treasury secretary Cramer pumps another $700 trillion into bank stocks.

Doomberg: Investors pull $700 trillion out of bank stocks & buy treasury notes.

Fox: Humans lose ability to walk without stepping on their own ****s.

CNN: Ubama promises bailout.

Fox: Ubama bans teaching evolution that requires carefully calibrated starting weights & neuron counts.
Posted by Jack Crossfire | Oct 15, 2008 @ 12:51 AM | 3,107 Views
Discovered rand() is not thread safe. U need to use rand_r(&seed) in threads. rand() was sucking huge amounts of CPU time updating a global variable. rand_r() keeps the CPUs in their own memory spaces & is much faster.
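A minimal example of the difference, using only the standard pthread & rand_r() calls:

#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

/* each thread owns its seed, so rand_r() never touches shared state */
void *worker(void *arg)
{
    unsigned int seed = (unsigned int)(long)arg;
    long sum = 0;
    for (int i = 0; i < 1000000; i++)
        sum += rand_r(&seed);      /* no global variable, no contention */
    printf("thread %ld done\n", (long)arg);
    return (void *)sum;
}

int main(void)
{
    pthread_t threads[4];
    for (long i = 0; i < 4; i++)
        pthread_create(&threads[i], 0, worker, (void *)(i + 1));
    for (int i = 0; i < 4; i++)
        pthread_join(threads[i], 0);
    return 0;
}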

Well, after many hours & a lot of electricity, concluded evolved neural networks seem to require inhuman amounts of computing power to be effective. Maybe our genetic algorithm isn't the best: it just randomizes weights & doesn't do crossovers. Even if the networks were trained offline on flight recordings every hour that we didn't fly, they probably wouldn't evolve much.
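For reference, the crossover step we're skipping would just splice the weight vectors of 2 parents at a random point, something like this (weight count made up):

#include <stdlib.h>

#define WEIGHTS 32

/* splice mom & dad at a random point to make a child */
void crossover(float *child, const float *mom, const float *dad,
               unsigned int *seed)
{
    int split = rand_r(seed) % WEIGHTS;
    for (int i = 0; i < WEIGHTS; i++)
        child[i] = i < split ? mom[i] : dad[i];
}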

Have some other ideas, all going back to backpropagation in flight:

Feedforward network taking an integrated error as one of the inputs.

Feedforward network that generates changes in feedback instead of absolute feedback. Current state & target state would replace the current error as the inputs. (Both ideas R sketched below.)
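Minimal sketches of the 2 input layouts, with a made-up net_eval() standing in for the trained network:

#include <stddef.h>

/* placeholder for the trained feedforward network */
float net_eval(const float *inputs, size_t n)
{
    float sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += inputs[i];
    return 0.1f * sum;
}

/* idea 1: the integrated error rides along as an extra input */
float idea1(float error, float integrated_error)
{
    float in[2] = { error, integrated_error };
    return net_eval(in, 2);               /* absolute feedback out */
}

/* idea 2: current & target state replace the error, & the network
   outputs a change in feedback instead of absolute feedback */
float idea2(float feedback, float current_state, float target_state)
{
    float in[2] = { current_state, target_state };
    return feedback + net_eval(in, 2);    /* accumulate the delta */
}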

Well, PBS is defending the parallel universe theory. Our obsession with infinite universes began when we asked: if humans ever succeeded in simulating a human brain on a computer & the contents of a real human brain were transferred to the computer, would the real human have sensations from the computer?

That led to defining human sensation as a function of the rate of increase in complexity of a localized packet of information. Similar packets of evolving information in different brains & different universes could share sensations like ESP. Our packets of...
Posted by Jack Crossfire | Oct 13, 2008 @ 12:15 AM | 2,701 Views
Good news: got a genetically trained feedforward neural network to control attitude feedback. Took very carefully designed training data with no clipped regions. Not sure large populations add anything.

The recurrent network is still a disaster...
Posted by Jack Crossfire | Oct 11, 2008 @ 12:38 PM | 2,934 Views
The genetic algorithm has been a disaster. So far, been using 1 member of the population as the seed for the next population. Very cache efficient & fast, but can lead to entire worthless populations. Also been feeding random inputs to it for the training set. This hasn't worked at all.

The algorithms that work use constant sets of training data & large populations as the seeds for the next populations. No cache efficiency, but faster training, since the random number generator is slow & a constant training set doesn't need it. Would have to test each population member on the entire training set.
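A minimal sketch of that approach, population sizes & structures made up: score every member on the whole constant training set, keep the best, & mutate them into the next population.

#include <stdlib.h>

#define POP       1024
#define SURVIVORS 64
#define WEIGHTS   32

typedef struct {
    float weights[WEIGHTS];
    float error;
} member_t;

/* placeholder: the real version runs the whole constant training set */
float test_on_training_set(const member_t *m)
{
    float e = 0;
    for (int i = 0; i < WEIGHTS; i++)
        e += m->weights[i] * m->weights[i];
    return e;
}

/* copy a parent & randomize 1 weight */
void mutate(member_t *child, const member_t *parent, unsigned int *seed)
{
    *child = *parent;
    child->weights[rand_r(seed) % WEIGHTS] +=
        (float)rand_r(seed) / RAND_MAX - 0.5f;
}

int compare(const void *a, const void *b)
{
    const member_t *x = a, *y = b;
    return (x->error > y->error) - (x->error < y->error);
}

void next_generation(member_t *pop, unsigned int *seed)
{
    for (int i = 0; i < POP; i++)
        pop[i].error = test_on_training_set(&pop[i]);
    qsort(pop, POP, sizeof(member_t), compare);
    /* the best SURVIVORS members seed the rest of the next population */
    for (int i = SURVIVORS; i < POP; i++)
        mutate(&pop[i], &pop[i % SURVIVORS], seed);
}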
Posted by Jack Crossfire | Oct 10, 2008 @ 01:44 AM | 2,616 Views
Dual layer DVD+R DL:

A much cheaper step on the way to BD, with 1/3 of BD's capacity instead of 1/5. Have had 50% coasters with these. Very few dual layer burners actually support them & none of the first dual layer burners do. The "RW" on the packaging does not mean rewritable.

Micron:

finally had to face the music & admit it can't pay off Steve Appletree's executive mansion & 23,500 Idaho employees simultaneously. 3,525 Idaho workers must go. Another state fails to create jobs outside Calif*.

The end of the world:

The end of the world is Nov 5. Election complete, no need for more bailouts & no more money. Banks foreclose on everyone because their property is worth 10% of their loans. Paulson & Bernanke catch the last flight to Antarctica. McCain & Ubama take the escape pods to Canadia. Congress hides in the WV bunker. Dubya wanders around looking for a beer. Dogs & cats live together. Dead rise from the grave. 30 years of darkness, earthquakes, volcanoes.

Infinite universes:

Used to think there were infinite universes & our sensations were in a Gaussian curve of many related universes. That's depressing because it means no matter what U do & who U meet, there's always another version of U who succeeded or failed & the people change every week, so nothing matters. A more likely story is there is only 1 universe & our sensations R in a Gaussian curve of many beings in the same universe. Dying simply causes U to have sensations from another being in another part of the same universe.

Fl*rida timelapses:

Comca$t increased its upload bandwidth to 1.5Mbit. Finished cutting all the timelapse movies from Fl*rida & now U get to reap the rewards with Fl*rida timelapses in HD. This is the timelapse footage we collected on the Fl*rida farm, on the mighty EOS 5D.

Fl*rida timelapses (3 min 47 sec)

Posted by Jack Crossfire | Oct 09, 2008 @ 06:51 AM | 2,954 Views
Well, got a recurrent network for cyclic feedback trained to where it was pretty close to the PID equation in the training algorithm. Unfortunately, it was completely erratic when run on the airframe. Training the recurrent network is definitely hit or miss. Most of the time, the evolution gets stuck & U need to restart it.

Now on to algorithm 2: comparing a sequence of steps from each mutation to a sequence of PID equation steps. The 2.4GHz dual Opteron from 4 years ago is coming out much faster than the 2.6GHz Athlon X2s & Core Duos of today. Maybe the current stagnation in clock speed means more assembly language jobs.

There's a limit to how accurate these networks can be & a definite dependency on the structure of the network.
Posted by Jack Crossfire | Oct 08, 2008 @ 01:32 PM | 2,575 Views
The 4Ah maxamps.com packs we got for $94 in 2006 R now $150. 60% inflation. Even the 3.3Ah cheapbatterypacks.com ones R $130, easily 100% more than they charged last year. Interest rates just fell to 1.5%, so it's time to buy.
Posted by Jack Crossfire | Oct 07, 2008 @ 11:37 PM | 2,134 Views
Like any return of the neurons, it's not going well. The mission is to model a PID equation in a recurrent neural network. Once all our PID equations R modeled in neural networks & flying, the next step is to have the networks evolve from the starting PID equations in flight.

Training the recurrent neural network to act like a PID equation has 2 genetic routes.

1) Feed 1 random input into a PID equation & solve it once with a random neural network. Throw the network out if the single solution deteriorates. The previous output of the PID equation is fed into the neural network & the integral in the PID equation is carried over to each step. This gives the neural result which would have occurred if the network was predicting correctly & carrying an integral. Very slow & less likely to give a good integral part.

2) Feed a sequence of random inputs into a PID equation & compare with a sequence of solutions from a random neural network. Reset the PID integral before every sequence. Throw the network out if the solution sequence deteriorates. Very, very slow & more likely to work. (A sketch of this route is below.)
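A minimal sketch of route #2's fitness test, with a toy PID & a toy recurrent unit standing in for the real ones:

#include <math.h>
#include <stdlib.h>

#define STEPS 256

/* toy P + I reference, gains made up */
float pid_step(float input, float *integral)
{
    *integral += input;
    return 2.0f * input + 0.1f * *integral;
}

/* toy recurrent unit: 3 weights, 1 state variable */
float net_step(const float *w, float input, float *state)
{
    *state = tanhf(w[0] * input + w[1] * *state);
    return w[2] * *state;
}

/* feed a sequence of random inputs to both & sum the squared error.
   The PID integral & the network state R reset before every sequence.
   Throw the mutation out if this deteriorates. */
float sequence_error(const float *weights, unsigned int *seed)
{
    float integral = 0, net_state = 0, error = 0;
    for (int i = 0; i < STEPS; i++) {
        float input = (float)rand_r(seed) / RAND_MAX - 0.5f;
        float d = pid_step(input, &integral) -
                  net_step(weights, input, &net_state);
        error += d * d;
    }
    return error;
}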

It's taking populations of about 1,000,000 & around 4,000 generations for genetic algorithm #1 to arrive at reasonable error rates. Nothing to do but procrastinate & have some more Fl*rida.
Posted by Jack Crossfire | Oct 06, 2008 @ 01:01 PM | 2,300 Views
Fixed some more photos 4 U. Had to relearn how to survive in the dumpy apartment, what to do when heroineclock goes off, how to use the dumpy light switches, where to sit, where the shower head is. Waking up the first time after a long trip, U have to wait a while to figure out where U R. Especially confusing because it looks exactly like Fl*rida outside the dumpy window. U actually still think you're where U were. Who knows how a neural machine would relearn its long term memory...
Posted by Jack Crossfire | Oct 05, 2008 @ 09:01 AM | 2,455 Views
For your last transmission from Fl*rida, U get some star trails over the farm. Time to start heading back to Silicon Valley.
Posted by Jack Crossfire | Oct 02, 2008 @ 11:11 PM | 2,424 Views
The answer is yes. There R lots & lots & lots of alligators in Fl*rida. On the Hillsborough river, they're as numerous as the mosquitos. 240 million year old scale technology triumphs over newer fur & skin technology.

Make no mistake, Tampon is the home of New York stock brokers after they make millions on mortgage bailouts. They all have New York accents & the Tampon flag is the Lexus SUV. They've created a virtual New York complete with traffic jams, a virtual Manhattan island, lots of bridges, & New York driving techniques. It had a population boom in the 80's & is a few mortgage bailouts from another, so the alligators have fresh meat to look forward to...
Posted by Jack Crossfire | Oct 01, 2008 @ 09:54 PM | 2,512 Views
Small enough to fit in a roadster, smarter than Dubya, & coming to a management office near U.
Posted by Jack Crossfire | Sep 29, 2008 @ 11:13 AM | 3,290 Views
So we crammed $280 of gas consumption into 1 week, & now we have only 1 beach & some timelapse movies left. Need some fair weather clouds. Star movies over a few minutes R impossible because we don't have the lens heater to defeat condensation.