Views: 518 · Replies: 6 · Status: Closed
A blurred picture is basically nothing more than a crisp picture passed through a 2-D low-pass filter of a certain characteristic.
I’m not an engineer, so take that into consideration. I’m not certain that the statement above is 100% true. It seems to imply that no matter how badly a picture is out of focus, there should be a "counter" algorithm to bring it back into focus.
If I’ve got that right, and if it is true, I don’t see how it helps us at all. Consider your typical non-digital camera: the possibilities in the "math" behind why a shot is out of focus seem infinite. We use our eyes, with a little help from the TTL viewfinder, to confirm an image is in focus, but that value is (or seems like it would be) impossible to reproduce twice in a row unless the focus was never changed.
With that in mind: you snap a pic of Aunt Judy at 3 feet away, and that produces a "math value" of X. If you never touch the focus again, it *might* be possible to calculate the algorithm that would counter any out-of-focus issues. However, if you put the camera down and reposition Aunt Judy to get a profile, the photographer will re-focus, thereby changing the algorithm needed to correct any slight blur.
Taking that a step further: even if you always shot out of focus and never changed the camera's focus, a single step forward or backward by the subject would again change whatever value you were trying to apply, from an algorithm perspective, thereby obviating the whole exercise.
So, I’m not saying I understand exactly what you’re after, but as an engineer I’m sure you know: theory is WAY different from real life.
Once the transfer function of this filter is found, it should be easy to calculate an inverse transformation to bring back the sharpness.
In consideration of what I was saying above, I’m not sure that you can EVER find the inverse transformation.
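Both sides of this can be made concrete with a toy sketch. The thread doesn't mention any tools, so the following (Python with NumPy, a Gaussian blur kernel, and a random array standing in for a photo — all illustrative assumptions) just demonstrates the mechanics: blurring is multiplication by the filter's transfer function in the frequency domain, and the "inverse transformation" is division by that same function. With the exact kernel known, recovery works; add even a little sensor noise and the division blows up at the high frequencies the blur suppressed, which is roughly the practical objection raised above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64

# A toy 64x64 "image" standing in for a photo.
img = rng.random((n, n))

# A Gaussian low-pass kernel: the hypothetical blur "characteristic".
y, x = np.mgrid[0:n, 0:n]
sigma = 1.0
kernel = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / (2 * sigma**2))
kernel /= kernel.sum()
kernel = np.fft.ifftshift(kernel)  # move the center to (0, 0) for circular convolution

# Blurring = multiplying the spectra (circular convolution).
K = np.fft.fft2(kernel)
blurred = np.fft.ifft2(np.fft.fft2(img) * K).real

# Naive inverse filter: divide by the SAME transfer function.
restored = np.fft.ifft2(np.fft.fft2(blurred) / K).real
err_clean = np.max(np.abs(restored - img))  # essentially zero: exact kernel known

# Real sensors always add noise; the same division amplifies it enormously
# wherever the transfer function is small (the frequencies the blur removed).
noisy = blurred + 1e-3 * rng.standard_normal((n, n))
restored_noisy = np.fft.ifft2(np.fft.fft2(noisy) / K).real
err_noisy = np.max(np.abs(restored_noisy - img))

print(err_clean, err_noisy)  # err_noisy is orders of magnitude worse
```

So the quoted claim is right in the idealized case (known, invertible filter, no noise), while the objection is right about practice: the kernel varies with focus and subject distance, and naive inversion amplifies noise — which is why real deblurring uses regularized methods (e.g. Wiener filtering) rather than plain division.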
<shrug>
Peace,
Tony