From my understanding, any time DG instructs the satellite to “look backward” over its shoulder, the images will be “stretched” and so won’t line up with how we normally envision the Earth. The algorithms don’t cause the stretching; the high off-nadir angle of “looking backward” is what does it.
I think of it like putting a coin face up on the floor, walking 25–50 feet away, standing with your back to the coin, and then looking over your shoulder (or between your legs): how much of the coin’s face can you still make out? Now blow up a balloon, set the coin on top, and look backward at it again. How far away you stand changes the off-nadir angle, and how “high” or “low” the object sits on the “surface” changes how we see it. So they run an algorithm to “smooth out” the image into how we expect to see objects on Earth (a rough sketch of the geometry is below).
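If it helps to see the “stretch” in numbers: here’s a back-of-the-envelope sketch of how a single pixel’s ground footprint grows with off-nadir angle. It assumes a flat Earth and a simple pinhole model; the altitude and IFOV values are made up for illustration and are not real DigitalGlobe/WorldView parameters, and this is not DG’s actual processing, just the basic geometry.

```python
import math

def ground_footprint(altitude_km: float, ifov_rad: float, off_nadir_deg: float):
    """Approximate the ground footprint of one detector element.

    Looking straight down (0 deg off-nadir), the footprint is roughly
    altitude * IFOV on each side. Tilting the view stretches it:
    ~1/cos(theta) across the look direction (longer slant range) and
    ~1/cos(theta)^2 along the look direction (longer slant range plus
    the oblique intersection with the ground).
    """
    theta = math.radians(off_nadir_deg)
    nadir_gsd = altitude_km * 1000 * ifov_rad        # meters per pixel at nadir
    across = nadir_gsd / math.cos(theta)             # perpendicular to the look
    along = nadir_gsd / math.cos(theta) ** 2         # in the look direction
    return across, along

# Illustrative numbers only: ~600 km orbit, IFOV chosen to give ~0.5 m at nadir.
for angle in (0, 20, 40):
    a, b = ground_footprint(600, 0.5 / 600_000, angle)
    print(f"{angle:2d} deg off-nadir: {a:.2f} m x {b:.2f} m per pixel")
```

Run it and the footprint goes from a 0.5 m square at nadir to roughly 0.65 m × 0.85 m at 40° off-nadir, i.e. the coin looks progressively more smeared the farther “over the shoulder” the satellite looks, which is exactly what the resampling algorithms then try to smooth back out.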
You know @Mel_Nod, it might help if folks could see one “untreated” strip from a backward-looking, high off-nadir pass.
It’s interesting that DG has been working on “cloud thinning”. Please do instruct the algorithm to watch for one black cat, yours truly, before thinning the cloud upon which I lie. Please also watch out for my whiskers, as I’ve grown quite fond of the ‘sensitivities’ upon my face. I just don’t want a buzz cut, so tell Server Johnny, k?