This blog is for nonprofit, educational purposes - media is incorporated for educational purposes as outlined in § 107 of the U.S. Copyright Act.

Wednesday, June 14, 2017


I suppose this strategy is successful enough with people who don't really know just how much data we get from satellites, but let's jump down the rabbit hole...

Gosh, if only...  And what does he mean by "composite" exactly, since EVERY COLOR IMAGE IS A COMPOSITE?  I'll assume he means it's not strips of data, so I've given numerous examples here of images that are FULL FRAME images of the Earth. They are ALL composites because all digital images start out as grayscale data, and color is applied to at least 3 layers of that data to give a color image.  That is how your P900 works also.  It has Red/Green/Blue filters in a matrix; each filter passes only that color of light and records grayscale data on the sensor, which is then COMPOSITED to form the final color image.
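Just to make the point concrete, here's a minimal sketch of what that "compositing" step actually is. The pixel values below are made up for illustration; a real camera pipeline also does demosaicing and color correction, but the core idea is just stacking three grayscale layers:

```python
# Three tiny 2x2 grayscale "sensor readouts" (0-255), one per color filter.
# Values are invented for this example.
red   = [[200,  10], [ 30, 120]]
green = [[ 40, 220], [ 60, 130]]
blue  = [[ 10,  50], [240, 140]]

# The "composite" is nothing more than stacking the layers pixel by pixel.
rgb = [[(red[y][x], green[y][x], blue[y][x]) for x in range(2)]
       for y in range(2)]

print(rgb[0][0])  # top-left pixel as an (R, G, B) triple: (200, 40, 10)
```

Every color image you've ever seen from a digital camera went through some version of this step.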

Because we don't have about a MILLION 11000 x 11000 pixel images from Himawari-8, the Japanese weather satellite (with Himawari-9 coming online now) -- click for full resolution image, prepare for a big download:


The idea that all this data is faked and yet matches actual weather patterns is just ridiculous.

Then we have DSCOVR/EPIC (primary mission is solar observations), which has a unique perspective on the Earth, being the furthest Earth-imaging satellite we have -- it hovers around Lagrange point 1, a stable orbital area located about 930,000 miles from Earth.  Most "full frame" weather satellites are geosynchronous, which is "only" about 22,240 miles away (so EPIC is almost 42 times farther away).
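You can check that ratio yourself from the two distances quoted above:

```python
# Distances in miles, as quoted in the post.
l1_distance  = 930_000   # DSCOVR/EPIC, near Lagrange point 1
geo_distance = 22_240    # typical geosynchronous weather satellite

print(round(l1_distance / geo_distance, 1))  # 41.8 -- "almost 42 times farther"
```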

At this point the satellite isn't orbiting the Earth but is co-orbiting the Sun along with the Earth, so it always faces the sunlit side of the Earth -- but since it isn't perfectly centered on the Earth-Sun line, you can usually see a slight shadow on one edge (which shifts over time).

The Lagrange Points; DSCOVR/EPIC is at L1

EPIC is a little bit older technology now so the CCD is only 2048x2048.

But it takes an image every hour (in 10 spectra) and you can watch a full year timelapse:
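Taking that "every hour, in 10 spectra" figure at face value (the actual cadence varies a bit over the mission), the frame count adds up quickly:

```python
# Rough count of EPIC frames per year, assuming one capture per hour
# and 10 spectral bands per capture (per the figures above).
hours_per_year = 24 * 365
single_band_frames = hours_per_year * 10

print(hours_per_year)       # 8760 hourly captures
print(single_band_frames)   # 87600 single-band frames per year
```

And that's just ONE satellite with a relatively slow cadence.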

Another new satellite coming online now is GOES-16 which isn't operational yet but is already providing unprecedented views of the Earth.

GOES-16 Comes Online

If you add all the weather satellite data from the past 20 years we have BILLIONS of images and hundreds of thousands of hours of video.

The idea that all of these are simple photoshop images or even the idea that there are millions of planes and balloons scouring the Earth every minute of the day and night so we get accurate image data we can render into globes is just absurd.

What Flat Earth is doing here is taking the fact that NASA themselves explained how they made SOME of the Blue Marble images FROM low-earth-orbit data (~500 miles up) and claiming that shows some grand conspiracy.

If NASA is "lying" why did THEY publish the details of how the images were put together?

NASA: VIIRS Eastern Hemisphere Image - Behind the Scenes

People often balk at this image, but this is simply HOW the Earth looks from just 500 miles up: you only see a FRACTION of the Globe -- you can recreate this in Google Earth.  THIS IS HOW SPHERES WORK.

From the ISS, 250 miles up, this is all you can see of the Globe because you cannot see past where your line-of-sight is tangent to the surface:
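The "fraction of the Globe" point falls straight out of a little spherical geometry. From altitude h your line of sight is tangent to the sphere where cos(θ) = R/(R+h), so the visible cap is a fraction (1 − R/(R+h))/2 = h/(2(R+h)) of the total surface. A quick sketch (R is the Earth's mean radius in miles):

```python
def visible_fraction(h, R=3959.0):
    """Fraction of a sphere's surface visible from altitude h (miles).

    The line of sight is tangent where cos(theta) = R/(R+h), so the
    visible spherical cap is h / (2*(R+h)) of the whole surface.
    """
    return h / (2 * (R + h))

print(f"{visible_fraction(250):.1%}")  # ISS, ~250 miles up -> about 3%
print(f"{visible_fraction(500):.1%}")  # ~500 miles up -> about 5.6%
```

So even at 500 miles up you see barely a twentieth of the surface -- exactly what those "balked at" images show.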

As for his second claim, that we see "Top to Bottom" of distant mountains -- that is also just a flat-out lie, and I've posted a dozen or more of these examples along with my full analysis on my blog:
