Jack Crossfire's blog - RC Groups
Posted by Jack Crossfire | Sep 22, 2017 @ 12:45 PM | 1,276 Views
The answer is no, you can't align the rear wheels. They're doomed to wear out from being toed in. The tires can't be removed. H King sells an alternative set of wheels with removable tires. The rear wheels can't be swapped with the front wheels.
Posted by Jack Crossfire | Sep 16, 2017 @ 10:26 PM | 959 Views
It was assumed that once the treads wore down, the bald tires wouldn't wear any further. A quick examination revealed the bald parts were still wearing down & were starting to look like Mars rover wheels. So they were moved to the back, which had shown very little wear. It would have been better to rotate them sooner, but it only means a new vehicle is going to be required before 2 of the tires are worn out.


After another 15 miles, it became clear that the front tires wore down because they have a slight toe-in. The good news is the hubs didn't slip inside the rear tires anymore. Slippage is a function of the rubber wearing out over time, as the tire wiggles ever so slightly when the motor starts.

http://solarxploration.wixsite.com/concepts/images

has some more space artwork. There are a few disasters, like the ITS being made of riveted aluminum instead of carbon fiber, & wind surfing & jet suits on Mars.
Posted by Jack Crossfire | Sep 16, 2017 @ 03:06 PM | 1,003 Views
Flickr has a reasonable viewer for spheres, but none of them can zoom like the QTVR player we had 25 years ago.

https://www.flickr.com/photos/152533...posted-public/

This one has the moon overlaid on the 60 frame average, giving the best approximation of what it was like. Unfortunately, flickr doesn't allow fullscreen viewing.

https://www.flickr.com/photos/152533...posted-public/

Made another eclipse video where footage of the moon from another camera was inserted where the Gear 360 couldn't resolve anything. It's far from a perfect spherical mapping, but it's quite effective in 4k. The bitrate was also doubled.

Eclipse in spherical mode with moon visible (4 min 18 sec)

Posted by Jack Crossfire | Sep 14, 2017 @ 03:23 PM | 1,138 Views
Musk dealt a crushing blow to the 5 guys who read about the space program. A lot of us were hoping the current wave of civilization would go beyond the Saturn V, but in 1 tweet, the Saturn V became the largest rocket we would ever get.

The revised BFR would have 21 instead of 42 engines. The work invested in building a mock 12 meter tank last year goes away. The good news is waves of civilization only take thousands of years instead of millions. The successor to Saturn V could be 4000 years away.

The only savings are the factory & the assembly building. A new barge & launchpad still have to be built. He would find a way to stack it horizontally. It's obvious that in order to free up room to build the mini BFR in the existing factory, they're going to dramatically slow down Falcon 9 production & reuse the existing inventory. Suspect block 5 will be the start of almost every customer flying on a used booster.

The 21 engines would make 10 million lbs of thrust instead of 29 million, slightly more than the Saturn V, in a smaller package.
Posted by Jack Crossfire | Sep 13, 2017 @ 12:04 AM | 1,609 Views
A fusing of the very 1st Gear360 video was quite accurate, all the way to the bottom. This had the camera horizontal.
A fusing of the camera rotated sideways was also nearly perfect, all the way to the bottom.


The leading theory is grinding the lens down may have tilted the plane of focus so objects on 1 side are farther away than objects on the other side. This can probably be corrected with a pile of new parameters.

Another problem was vignetting causing the boundaries to look dark.

Eliminated the radius parameter from the Bourke equations, since it only scaled the field of view. Fusing now took only field of view, input XY, & Z rotation.


Uploading it to the goo tube requires tagging it. The goog has source code for tagging it & reading the tags. It's a python program which works on all operating systems. Merely run 'python spatialmedia -h' in the root directory.


https://github.com/google/spatial-media/releases
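For reference, the documented injection invocation can be wrapped in a few lines. This is a sketch under the assumption that the CLI takes `-i input output` as shown in the repo's README; verify the flags against the release actually downloaded:

```python
import subprocess

def spatialmedia_args(src, dst):
    # Build the documented injection command:
    #   python spatialmedia -i input.mp4 output.mp4
    # (-i writes a new file with the spherical metadata added)
    return ["python", "spatialmedia", "-i", src, dst]

def inject_spherical(src, dst):
    # Run from the repository root, where the spatialmedia package lives
    subprocess.run(spatialmedia_args(src, dst), check=True)
```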


Unfortunately, it couldn't handle files containing mp4 audio. Despite being written 16 years after libraries for reading Quicktime/mp4 started appearing, he still wrote a custom parser which doesn't work. Such is life in a world with every program written in a different language.

The decision was made to hack the lion kingdom's 19 year old make_streamable program to inject the header while also moving the headers to the start of the file. It'll never pass a Goog job screening, but it works. It was hard coded for equirectangular with 2 channel audio. The hardware is around for making 4 channel audio, but the standards are very complex.

Eclipse in spherical mode (4 min 18 sec)


It has some potential on the day job's ipad pro. It's almost worth buying VR goggles to experience that moment again. The Samsung's inability to resolve the moon was a disaster.


To get flickr & facebook to show spherical photos in a viewer, use exiftool to add the magic tag.

exiftool -ProjectionType="equirectangular" photo.jpg
Posted by Jack Crossfire | Sep 10, 2017 @ 07:20 PM | 1,148 Views
After many days of searching, a legit-looking source appeared.

http://paulbourke.net/dome/dualfish2sphere/

http://paulbourke.net/dome/fish2/#fish2pano


Of course, like most things these days, there's no source code for a complete converter. There's just enough source code to develop a converter from scratch. More clues come from a list of command line arguments required to stitch the images.


Using standard image processing techniques, the equations finally yielded a proper equirectangular projection. Without the benefit of automated control points, a lot of manual tweaking got the horizons to line up, but failed to align anything above or below. It definitely wasn't parallax distortion. The right eye needed to be rotated in the opposite direction for the bottom to line up. It would be impossible to align the images without a GUI giving instant feedback.


The field of view can be adjusted by getting objects on the horizon to line up horizontally. It's not clear how to adjust the radius. The center XY parameters he provided are the input coordinates of the fisheye. He then provided XYZ rotation parameters which appeared to transform the input coordinates. Because the rotation & center parameters feed back into each other, it was impossible to get very far that way.


It was easier to make the X & Y rotation apply to the output & make center X & Y where to capture the input. Z rotation still worked by transforming the input coordinates.
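The core of the mapping the Bourke pages describe can be sketched in Python. The axis convention (lens looking along +Y) and the linear-with-angle fisheye model are assumptions here; the real lenses need the extra correction parameters discussed above:

```python
import math

def equirect_to_fisheye(u, v, out_w, out_h, fov_deg, in_w, in_h):
    """Map one equirectangular output pixel (u, v) back to a source
    pixel in a fisheye image with the given field of view."""
    # output pixel -> longitude/latitude
    lon = (u / out_w - 0.5) * 2.0 * math.pi       # -pi .. pi
    lat = (v / out_h - 0.5) * math.pi             # -pi/2 .. pi/2
    # longitude/latitude -> 3D direction (lens assumed to look along +Y)
    px = math.cos(lat) * math.sin(lon)
    py = math.cos(lat) * math.cos(lon)
    pz = math.sin(lat)
    # 3D direction -> fisheye polar coordinates
    theta = math.atan2(pz, px)                    # angle around the lens axis
    phi = math.atan2(math.hypot(px, pz), py)      # angle off the lens axis
    r = 2.0 * phi / math.radians(fov_deg)         # normalized radius, 1.0 at the FOV edge
    # polar -> input pixel
    x = in_w / 2.0 + r * math.cos(theta) * in_w / 2.0
    y = in_h / 2.0 + r * math.sin(theta) * in_h / 2.0
    return x, y
```

The center of the output lands on the center of the fisheye, and a point 90 degrees off axis lands on the rim of a 180 degree lens, which is a quick sanity check on the equations.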


The next step would be finding a decent viewer besides the goo tubes. Lions prefer watching spherical videos as equirectangular projections, but there's no way to know if the projection works without a viewer. There's also a new class of effects which apply just to spherical video.
Posted by Jack Crossfire | Sep 06, 2017 @ 01:14 AM | 1,229 Views
The closest empirical equations got was this render. The problem is tangents to the sphere should be horizontal. Here, they only converge. Any horizontal lines are purely from extreme stair stepping of pixels rather than coordinates. Nearby objects don't overlap the way they should. The lenses have to be pretty well crushed, horizontally. Some horizontal compression makes sense, since the eyes overlap.

Having failed the empirical idea, that leaves installing Windows, Actiondirector, & photographing test patterns to derive other empirical equations, or resorting to Möbius transforms. It's the kind of thing vihart would have made a video about, 10 years ago. The generation which used to make videos about math graduated to making families. They were like an offshoot of the generation which wrote open source software as a hobby or to get jobs.

Surprising how times have changed: the entire VR craze passed without anyone documenting the math behind it just for the spirit of discovery, the way Alan Cox or someone from that generation would have.
Posted by Jack Crossfire | Sep 04, 2017 @ 04:05 PM | 937 Views
Figured no-one would get a 4k video of it, because it was too difficult for the current generation. No-one did, except for 1 Korean professor who came halfway around the world with a very large telescope.


https://petapixel.com/2017/09/02/tot...eid=37b806db54


The panning shot was fully automated. It claims to have an acoustic reduction gear but uses a DC motor, so it could be a bit of marketing wank. In the automated pan, it's remarkable to see the detail of the moon's mountains & the entire fiery surface of the sun. Only with a shade in space can these details be seen. The atmosphere scatters too much light from the sun's center for a ground based shade to work. It makes it almost worth getting a 4k monitor.



Among other predictions, no-one got a locked down timelapse from a quad copter. No-one got a 360 video from a mountain top. In the coming weeks, a few other timelapses of Mount Jefferson appeared which were better than the lion kingdom's.


Solar Eclipse 2017 - Mt Jefferson Oregon. Time lapse (0 min 23 sec)


For some reason, people don't know how to upload higher than 320x240.
Posted by Jack Crossfire | Sep 03, 2017 @ 06:02 PM | 1,053 Views
The stitching plugin is an ongoing debate. Actiondirector just works. The original idea was not creating proper spherical projections but something more visually appealing which the user could watch without panning. Then, spherical projection became necessary because that's what every program should do.

Doing it properly requires Möbius transforms. Cinelerra can do it empirically, using just trig functions, but it lacks the more advanced manipulations. So far, just scaling X coordinates by a secant function gets close to the spherical projection, but is still pretty bad.
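The secant approximation mentioned above can be sketched in a few lines. This is an illustration of the idea, not Cinelerra's actual code: each row is stretched about the image center by the secant of its latitude, which blows up near the poles, so the extreme rows are clamped:

```python
import math

def secant_stretch_x(x, y, width, height):
    """Stretch the X coordinate about the image center by sec(latitude)
    of the row. A crude approximation of the spherical projection."""
    lat = (y / height - 0.5) * math.pi       # row -> latitude, -pi/2 .. pi/2
    lat = max(-1.45, min(1.45, lat))         # clamp so cos(lat) stays away from 0
    cx = width / 2.0
    return cx + (x - cx) / math.cos(lat)     # sec(lat) = 1 / cos(lat)
```

Rows at the equator pass through unchanged, while rows near the top & bottom get pulled outward, which is roughly the behavior an equirectangular projection needs.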



In the years since VR lost popularity, much progress has been made in spherical anaglyphs, virtual camera blocking, & rotating spherical projections. The Möbius transform is implemented with some kind of ray tracing. It would have to be optimized into trig functions or made into a lookup table.
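The lookup-table idea can be sketched like this, a hypothetical illustration rather than any existing implementation: the expensive per-pixel math is evaluated once at startup, and every video frame afterwards is remapped by table lookups only:

```python
def build_remap_table(out_w, out_h, mapping):
    """Precompute the source pixel for every output pixel once, so the
    per-pixel trig (or Mobius math) is paid at startup, not per frame.
    `mapping` takes output (x, y) and returns integer source (x, y)."""
    return [[mapping(x, y) for x in range(out_w)] for y in range(out_h)]

def remap(frame, table):
    # frame is a 2D list of pixels; out-of-range lookups clamp to the edge
    h, w = len(frame), len(frame[0])
    return [[frame[min(max(sy, 0), h - 1)][min(max(sx, 0), w - 1)]
             for (sx, sy) in row] for row in table]
```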

Only Actiondirector can apply the IMU data. Samsung didn't write Actiondirector. It was marketed by Cyberlink, the same Cyberlink which marketed the 1st fully licensed, region locked, menu supporting DVD player for computers, in 1999.

On Linux, there's an expensive KartaVR plugin for Blackmagic Fusion. There's PTGui, which is an empirical process but doesn't seem to work. There's the expensive Autopano for Windows.
Posted by Jack Crossfire | Aug 28, 2017 @ 11:06 PM | 1,524 Views
It was almost good enough to land back on the launch mount. Suspect landing it back on some kind of mount other than the launch mount & stripping the landing gear is coming. It would only need some beefier metal bumpers. It may be the missing link to recovering the 2nd stage. The key to landing on Mars may be sending a launch mount 1st, then landing the rocket on it. It would allow a smaller rocket to send the same payload if every rocket used the same landing/launch mount at the destination.


There's serious math behind those landings. Lars Blackmore had to spend 8 years at Cambridge & MIT, and another 4 years at NASA, before they would hire him to design that algorithm. This evolution was basically driven by the education system. Hard to imagine working on algorithms for all those years with almost no chance of being used in a mission, then being tasked with Grasshopper software with almost no chance of succeeding, then suddenly having those 16 years of unseen work become headline news around the world.
Posted by Jack Crossfire | Aug 26, 2017 @ 02:16 PM | 1,435 Views
There's been a rise in concept art based on modern, constrained hardware. These barely plausible designs rooted in modern constraints are a bit more interesting than the purely fictional designs of the past. In the old days, there was no way the hardware of the time could ever have gotten us to other planets, so it led to the rise of purely fictional Star Trek/Star Wars type artwork. There were no elements from everyday life. They were completely smooth spaceships, flying saucers, levitating & powered by concept engines which didn't exist, relying on warp drives, hyper drives, & the force.


All that changed when an old fashioned chemical rocket showed modern technology could be just good enough to open the door to the stars. Now, the machines of the future are based on the machines of today.


Computer generated art:

http://deadvertex1.blogspot.com/2017...-concepts.html


Hand drawn art:


http://spacethatneverwas.tumblr.com/


Random, but had the word SpaceX on something which could pass as a methalox plant:

https://www.artstation.com/beeple


Finally, there's the legendary Robert McCall:


http://www.mccallstudios.com/

While manely fictional, there was 1 piece he did in 1996 which looked a bit like the XB-70 & Concorde designs of the old days.

http://www.mccallstudios.com/include...c-airliner.jpg
Posted by Jack Crossfire | Aug 26, 2017 @ 12:38 AM | 1,671 Views
The last video was a 10:1 timelapse of the 360 view. The shadow movement is a bit clearer, but not so fast that the details are missed.


Eclipse panorama timelapse (1 min 38 sec)



Overall, interest in the eclipse disappeared as fast as any other news story. There wasn't even a speech about unification of the world like there was during the moon landings. The only interest came from the same small group which reads every nasaspaceflight.com story & wants to go to Mars.


Lions read about total eclipses many times in childhood, but never thought they would see one.
Posted by Jack Crossfire | Aug 24, 2017 @ 11:37 PM | 1,326 Views
Posted by Jack Crossfire | Aug 24, 2017 @ 12:06 AM | 1,784 Views
Based on the camera's orientation, this is how it was oriented. Strangely remember it rotated 90 deg clockwise & the 2 large prominences going down & right, but it may have been an optical illusion from the sun being on the car's right side. Used a single exposure to bring out the flares.

Also in this issue, we see just how good the Sanyo's footage was. It might be the most exciting footage of all the cameras, clearly showing the shadow traveling from Mount Jefferson to the crowd.

Mount Jefferson eclipse timelapse (1 min 7 sec)

Posted by Jack Crossfire | Aug 23, 2017 @ 12:35 AM | 1,719 Views
After finding nothing which captured the size lions actually saw, the lion kingdom decided to stack all 62 photos the T4I got & rotate it with the horizon down. It was configured to make a movie rather than a still photo, so it got only a tiny amount of light. In reality, the prominences stretched many diameters & were fuller. At least this starts to show the X with 2 large arms on the left & 2 smaller ones on the right. Unsharp masking gives mixed results.
Posted by Jack Crossfire | Aug 22, 2017 @ 08:48 PM | 1,898 Views
5971420 eclipse_5d
6503204 eclipse_android
10632900 eclipse_gear360
11955400 eclipse_gopro
26555836 eclipse_mark3
3536040 eclipse_sanyo
38355472 eclipse_t4i
103510272 total

It was 1 of the things you thought would never happen in your lifetime. 103 GB of data was captured in 3 hours by 7 cameras. The T4I shot a CU timelapse at 1/200, F5.6, ISO 200, 200mm. It got 62 frames of totality, manely limited by the slow SD card.


The hour before totality was very busy. The amount of checklist rechecking, camera blocking, camera reconfiguring, reframing of the drift shots was intense. 10 minutes before totality was a race to start all the cameras. The Mark III went into an excessively fast shutter bracket. It required a faster shutter speed to focus, but this wasn't reset to the slower shutter speed needed for exposing the stars. Magic Lantern wouldn't stop shooting, but fortunately there was enough time between shots to adjust the shutter without stopping it.


As the 3 DSLR's clicked furiously, the lion now turned to using his own eyes. The lion just stood & stared at the sun through the glasses. The sliver of light quickly got smaller & smaller. After the episode with Magic Lantern, it all happened extremely fast. There was no time to gather someone to hug or tweak any cameras, now.


Then the mass shouting began, including "glasses off!" After a careful move to pull the filters off the cameras without shifting focus, it was time to stare &...
Posted by Jack Crossfire | Aug 18, 2017 @ 08:32 PM | 2,037 Views
This arrangement can slide into the defroster button when the screws get loose, but should be good enough for 2 drives. The long term requires removing more material & shifting the phone right.
Posted by Jack Crossfire | Aug 17, 2017 @ 11:52 PM | 1,858 Views
Those vent mounts are $1 at the dollar store.

A simulation of the corona was released

http://www.businessinsider.com/solar...natural-blue-4





showing a unique pattern, distinct from all the past coronas. It made the lion kingdom wonder if it was worth F5.6 instead of F2.8, but F2.8 is very sharp on the wide lens. The corona will be captured by many men & the lion kingdom does mean men. So it won't be worth sacrificing the timelapse to try to get it on the CU lens. It would have been nice to have a 9th camera for making a bracketed CU, anyways.


As we approach zero hour, the mane reason for preparing 8 cameras & having 4 pointing at the sun was to increase the chance of at least 1 getting a minimal shot. The absolute worst thing that could happen is all the cameras failing & your only memory being what you saw with your own eyes. Some of them are definitely expected to get nothing.