Jack Crossfire's blog
Archive for September, 2017
Posted by Jack Crossfire | Sep 29, 2017 @ 03:08 PM | 3,132 Views
"We will start construction of the 1st ship around the 2nd quarter of next year."

Like something from a movie about averting the destruction of the human race. Pretty intense stuff, but lions are pretty sure you can't test the in situ propellant plant for the 1st time on the same mission the 1st astronauts are going on. As expected, it was lighter than last year's show because of the renewed focus on their mission backlog instead of the BFR.

Delta wings are back. It has assumed the familiar shuttle shape. At least the shuttle designers got that part right. Unfortunately, he couldn't make the heat shield reusable. They're not going to fly around with enough material to make it reusable, so they're going to spend a huge amount of money replacing it for every flight.

He didn't show the engine layout of the new 1st stage. Perhaps the BFR guy focused on just the new 2nd stage & he only has a rough idea of the new 1st stage.


Unlike last year's design, the current design looks like it could actually happen & he was clean shaven which added more credibility. The payload is now 1/2 of last year's. The timeline is the same as last year, but he's able to do it in the existing factory.

Using a BFR to fly around Earth is a bit silly. The Concorde already proved no-one cares about high speed air travel, so he's not going to make money on that. The security lines & boat trip to the launch pad will comprise most of the time. At least it would...
Posted by Jack Crossfire | Sep 27, 2017 @ 11:57 PM | 2,711 Views
In the quest for a use for the Gear 360 & all the work that went into it, some more tweaks to the pointless spherical camera software have revealed some documentation of a vehicle to be retired next year. The next step is spherical images of the interior, which weren't possible with the mob.

C130P #66-0219 (0 min 30 sec)

Posted by Jack Crossfire | Sep 22, 2017 @ 12:45 PM | 3,248 Views
The answer is no, you can't align the rear wheels. They're doomed to wear out from being toed in. The tires can't be removed. H King sells an alternative set of wheels with removable tires. The rear wheels can't be swapped with the front wheels.
Posted by Jack Crossfire | Sep 16, 2017 @ 10:26 PM | 2,824 Views
It was assumed that once the treads wore down, the bald tires wouldn't wear any more. A quick examination revealed the bald parts were still wearing down & starting to look like a Mars rover's. So they were moved to the back, which had shown very little wear. It would have been better if they were rotated sooner, but it only means a new vehicle is going to be required before 2 of the tires are worn out.


After another 15 miles, it became clear that the front tires wore down because they have a slight toe in. The good news is the hubs didn't slip inside the rear tires anymore. Slippage is a function of the rubber wearing out over time, as the tire wiggles ever so slightly when the motor starts.

http://solarxploration.wixsite.com/concepts/images

has some more space artwork. There are a few disasters like the ITS being made of riveted aluminum instead of carbon fiber, wind surfing & jet suits on Mars.
Posted by Jack Crossfire | Sep 16, 2017 @ 03:06 PM | 2,790 Views
Flickr has a reasonable viewer for spheres, but it can't zoom like the QTVR player we had 25 years ago.

https://www.flickr.com/photos/152533...posted-public/

This one has the moon overlaid on the 60 frame average, giving the best approximation of what it was like. Unfortunately, Flickr doesn't allow fullscreen viewing.

https://www.flickr.com/photos/152533...posted-public/

Made another eclipse video where footage of the moon from another camera was inserted where the Gear 360 couldn't resolve anything. It's far from a perfect spherical mapping, but it's quite effective in 4k. The bitrate was also doubled.

Eclipse in spherical mode with moon visible (4 min 18 sec)

Posted by Jack Crossfire | Sep 14, 2017 @ 03:23 PM | 2,915 Views
Musk dealt a crushing blow to the 5 guys who read about the space program. A lot of us were hoping the current wave of civilization would go beyond the Saturn V, but in 1 tweet, the Saturn V became the largest rocket we would ever get.

The revised BFR would have 21 instead of 42 engines. The work invested in building a mock 12 meter tank last year goes away. The good news is waves of civilization only take thousands of years instead of millions. The successor to Saturn V could be 4000 years away.

The only savings are the factory & the assembly building. A new barge & launchpad still have to be built. He would have to find a way to stack it horizontally. It's obvious that in order to free up room to build the mini BFR in the existing factory, they're going to dramatically slow down Falcon 9 production & reuse the existing inventory. Suspect block 5 will be the start of almost every customer flying on a used booster.

The 21 engines would make 10 million lbs of thrust instead of 29 million, slightly more than the Saturn V, in a smaller package.
Posted by Jack Crossfire | Sep 13, 2017 @ 12:04 AM | 3,406 Views
A fusing of the very 1st Gear 360 video was quite accurate, all the way to the bottom. This had the camera horizontal.
A fusing of the camera rotated sideways was also nearly perfect, all the way to the bottom.


The leading theory is grinding the lens down may have tilted the plane of focus so objects on 1 side are farther away than objects on the other side. This can probably be corrected with a pile of new parameters.

Another problem was vignetting causing the boundaries to look dark.

Eliminated the radius parameter from the Bourke equations, since it only scaled the field of view. Fusing now took only field of view, input XY, & Z rotation.
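The redundancy is easy to see in code. Here's a minimal Python sketch (the function name & signature are mine, not from the actual fuser) of why a radius parameter folds into the field of view:

```python
import math

def fisheye_radius(theta, fov, radius=1.0):
    """Normalized distance from the fisheye image center for a ray at
    angle theta off the optical axis (a Bourke-style sketch; 'fov' is
    the lens aperture in radians)."""
    return radius * theta / (fov / 2.0)

# Scaling radius by k is identical to dividing the field of view by k,
# so radius is redundant & can be folded into the FOV parameter.
theta = math.radians(45)
fov = math.radians(190)
assert abs(fisheye_radius(theta, fov, radius=2.0)
           - fisheye_radius(theta, fov / 2.0)) < 1e-12
```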


Uploading it to the goo tube requires tagging it. The goog has source code for tagging it & reading the tags. It's a python program which works on all operating systems. Merely run 'python spatialmedia -h' in the root directory.


https://github.com/google/spatial-media/releases


Unfortunately, it couldn't handle files containing mp4 audio. Despite being written 16 years after libraries for reading Quicktime/mp4 started appearing, he still wrote a custom parser which doesn't work. Such is life in a world with every program written in a different language.

The decision was made to hack the lion kingdom's 19 year old make_streamable program to inject the header while also moving the headers to the start of the file. It'll never pass a Goog job screening, but it works. It was hard coded for equirectangular with 2 channel audio. The hardware is around for making 4 channel audio, but the standards are very complex.
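For the curious, the top-level box walk a make_streamable-style tool does can be sketched in a few lines of Python. This is a toy parser with made-up names, skipping 64-bit sizes & nested boxes, not the actual 19 year old program:

```python
import struct

def top_level_boxes(data):
    """Yield (box_type, offset, size) for each top-level MP4/QuickTime
    box. Minimal sketch: ignores 64-bit 'largesize' boxes."""
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack(">I4s", data[offset:offset + 8])
        if size < 8:
            break  # malformed or 64-bit size; a real parser handles this
        yield box_type.decode("latin-1"), offset, size
        offset += size

def is_streamable(data):
    """A file streams (& its tags read without a full download) when the
    'moov' header precedes the 'mdat' payload."""
    order = [t for t, _, _ in top_level_boxes(data) if t in ("moov", "mdat")]
    return order.index("moov") < order.index("mdat") if len(order) == 2 else False

# Synthetic file: ftyp, then mdat before moov -> not streamable.
fake = (struct.pack(">I4s", 16, b"ftyp") + b"isom\x00\x00\x02\x00"
        + struct.pack(">I4s", 12, b"mdat") + b"asdf"
        + struct.pack(">I4s", 8, b"moov"))
assert not is_streamable(fake)
```

Moving moov ahead of mdat is the whole trick: the player sees the index before the media, so the file plays while it downloads.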

Eclipse in spherical mode (4 min 18 sec)


It has some potential on the day job's iPad Pro. It's almost worth buying VR goggles to experience that moment again. The Samsung's inability to resolve the moon was a disaster.


To get flickr & facebook to show spherical photos in a viewer, use exiftool to add the magic tag.

exiftool -ProjectionType="equirectangular" photo.jpg
Posted by Jack Crossfire | Sep 10, 2017 @ 07:20 PM | 4,270 Views
After many days of searching, a legit looking source appeared.

http://paulbourke.net/dome/dualfish2sphere/

http://paulbourke.net/dome/fish2/#fish2pano


Of course, like most things these days, there's no source code for a complete converter. There's just enough source code to develop a converter from scratch. More clues come from a list of command line arguments required to stitch the images.


Using standard image processing techniques, the equations finally yielded a proper equirectangular projection. Without the benefit of automated control points, a lot of manual tweaking got the horizons to line up, but failed to align anything above or below. It definitely wasn't parallax distortion. The right eye needed to be rotated in the opposite direction for the bottom to line up. It would be impossible to align the images without a GUI giving instant feedback.
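The core of the Bourke mapping can be sketched in Python: for each output equirectangular pixel, compute a ray on the sphere, then find where that ray lands on the fisheye. This is a minimal version assuming the lens looks down +Y & ignoring per-lens centering, rotation & blending (all names are mine):

```python
import math

def equirect_to_fisheye(x, y, width, height, aperture):
    """Map an output equirectangular pixel to normalized fisheye input
    coordinates (a sketch of Paul Bourke's dualfish2sphere equations).
    Returns (u, v) in [0,1], or None if the ray falls outside the lens."""
    # Output pixel -> longitude & latitude.
    lon = (x / width - 0.5) * 2.0 * math.pi
    lat = (0.5 - y / height) * math.pi
    # Longitude & latitude -> 3D ray (lens looks down +Y).
    px = math.cos(lat) * math.sin(lon)
    py = math.cos(lat) * math.cos(lon)
    pz = math.sin(lat)
    # Ray -> fisheye polar coordinates.
    theta = math.atan2(math.hypot(px, pz), py)   # angle off the optical axis
    if theta > aperture / 2.0:
        return None                              # outside the lens
    r = theta / (aperture / 2.0)                 # normalized radius, 0..1
    phi = math.atan2(pz, px)
    return 0.5 + 0.5 * r * math.cos(phi), 0.5 - 0.5 * r * math.sin(phi)

# The center of the output (lon=0, lat=0) hits the center of the lens.
assert equirect_to_fisheye(500, 250, 1000, 500, math.radians(190)) == (0.5, 0.5)
```

A real stitcher runs this once per output pixel into a lookup table, then blends the 2 lenses where their 190 degree apertures overlap.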


The field of view can be adjusted by getting objects on the horizon to line up horizontally. It's not clear how to adjust the radius. The center XY parameters he provided are the input coordinates of the fisheye. He then provided XYZ rotation parameters which appeared to transform the input coordinates. Because the rotation & center parameters feed back into each other, it was impossible to get very far that way.


It was easier to make the X & Y rotation apply to the output & make center X & Y where to capture the input. Z rotation still worked by transforming the input coordinates.


The next step would be finding a decent viewer besides the goo tubes. Lions prefer watching spherical videos as equirectangular projections, but there's no way to know if the projection works without a viewer. There's also a new class of effects which apply just to spherical video.
Posted by Jack Crossfire | Sep 06, 2017 @ 01:14 AM | 2,865 Views
The closest empirical equations got was this render. The problem is tangents to the sphere should be horizontal. Here, they only converge. Any horizontal lines are purely from extreme stair stepping of pixels rather than coordinates. Nearby objects don't overlap the way they should. The lenses have to be pretty well crushed, horizontally. Some horizontal compression makes sense, since the eyes overlap.

With the empirical idea having failed, that leaves installing Windows, Actiondirector, & photographing test patterns to derive other empirical equations, or resorting to Möbius transforms. It's the kind of thing vihart would have made a video about, 10 years ago. The generation which used to make videos about math graduated to making families. They were like an offshoot of the generation which wrote open source software as a hobby or to get jobs.

Surprising how much times have changed: the entire VR craze passed without anyone documenting the math behind it just for the spirit of discovery, the way Alan Cox or someone from that generation would have.
Posted by Jack Crossfire | Sep 04, 2017 @ 04:05 PM | 2,556 Views
Figured no-one would get a 4k video of it, because it was too difficult for the current generation. No-one did, except for 1 Korean professor who came from around the world with a very large telescope.


https://petapixel.com/2017/09/02/tot...eid=37b806db54


The panning shot was fully automated. It claims to have an acoustic reduction gear but uses a DC motor, so it could be a bit of marketing wank. In the automated pan, it's remarkable to see the detail of the moon's mountains & the entire fiery surface of the sun. Only with a shade in space can these details be seen. The atmosphere scatters too much light from the sun's center for a ground based shade to work. It makes it almost worth getting a 4k monitor.



Among other predictions, no-one got a locked down timelapse from a quad copter. No-one got a 360 video from a mountain top. In the coming weeks, a few other timelapses of Mount Jefferson appeared which were better than the lion kingdom's.


Solar Eclipse 2017 - Mt Jefferson Oregon. Time lapse (0 min 23 sec)


For some reason, people don't know how to upload higher than 320x240.
Posted by Jack Crossfire | Sep 03, 2017 @ 06:02 PM | 2,767 Views
The stitching plugin is an ongoing debate. Actiondirector just works. The original idea was not creating proper spherical projections but something more visually appealing which the user could watch without panning. Then, spherical projection became necessary because that's what every program should do.

Doing it properly requires Möbius transforms. Cinelerra can do it empirically, using just trig functions, but it lacks the more advanced manipulations. So far, just scaling X coordinates by a secant function gets close to the spherical projection, but is still pretty bad.
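The secant scaling can be sketched as a per-row stretch factor: rows near the poles of an equirectangular frame cover fewer input pixels, so the input gets sampled sec(latitude) times wider. A toy Python version, with my own names & a clamp near the poles:

```python
import math

def secant_x_scale(y, height):
    """Horizontal stretch factor for an output row, as a sketch of the
    'scale X by a secant' approximation to a spherical projection."""
    lat = (0.5 - y / height) * math.pi   # row -> latitude, -pi/2..pi/2
    lat = max(min(lat, 1.5), -1.5)       # clamp to keep sec() finite
    return 1.0 / math.cos(lat)

# The equator is untouched; rows toward the poles stretch rapidly,
# which is why the approximation falls apart above & below the horizon.
assert secant_x_scale(250, 500) == 1.0
assert secant_x_scale(100, 500) > secant_x_scale(200, 500)
```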



In the years since VR lost popularity, much progress has been made in spherical anaglyphs, virtual camera blocking, & rotating spherical projections. The Möbius transform is implemented with some kind of ray tracing. It would have to be optimized into trig functions or made into a lookup table.
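For reference, one way a Möbius transform acts on a sphere is through stereographic projection: map each direction to the complex plane, apply f(z) = (az + b)/(cz + d), & map back. A toy Python sketch (names & conventions are mine, not from any of these programs):

```python
import cmath
import math

def mobius_transform(lon, lat, a, b, c, d):
    """Apply the Mobius transform f(z) = (a*z + b)/(c*z + d) to one
    direction on the sphere via stereographic projection (a sketch; a
    real renderer would bake this into a per-pixel lookup table)."""
    lat = min(lat, math.pi / 2 - 1e-9)   # the projection pole needs special care
    # Stereographic projection from the north pole onto the complex plane.
    z = cmath.rect(math.cos(lat) / (1.0 - math.sin(lat)), lon)
    w = (a * z + b) / (c * z + d)
    # Inverse stereographic projection back to longitude & latitude.
    rw2 = abs(w) ** 2
    return cmath.phase(w), math.asin((rw2 - 1.0) / (rw2 + 1.0))

# Identity coefficients (a=d=1, b=c=0) leave every direction in place.
lon2, lat2 = mobius_transform(1.0, 0.3, 1, 0, 0, 1)
assert abs(lon2 - 1.0) < 1e-9 and abs(lat2 - 0.3) < 1e-9
```

Running this per pixel is the "ray tracing"; precomputing (lon, lat) -> (lon2, lat2) once per frame size is the lookup table version.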

Only Actiondirector can apply the IMU data. Samsung didn't write Actiondirector. It was marketed by Cyberlink, the same Cyberlink which marketed the 1st fully licensed, region locked, menu supporting DVD player for computers, in 1999.

On Linux, there's an expensive KartaVR plugin for Blackmagic Fusion. There's PTGui, which is an empirical process but doesn't seem to work. There's the expensive Autopano for Windows.