Photography is based on the detection of photons. Photons are easy to misinterpret, because they are a special kind of quantum particle: they are not like other quantum mechanical particles, and in particular they are NOT electrons. How, then, is a photon different from an electron, when both are dual entities, to be realized as waves as well as particles?

Here is the most basic elucidation of their properties. A photon is more wave-like, even though it is both wave and particle. An electron is more particle-like, even though it too is both wave and particle. That is essentially because photons never carry mass, and as a consequence their speed is always as high as it can be, about 300,000 km per second. For that reason it is erroneous to call a photon's REST frame into consideration. Classically, a particle must carry mass, and through its mass, momentum; yet the photon, while massless, does have momentum. (This property is described in an article on my website, which I will find and link if you are interested.) The electron, to the contrary, has some mass even when at rest, so one can bring an electron to rest in some way. A photon can never attain rest and can never attain mass; it has only momentum and energy, as long as it is a single photon traveling in vacuum.

How does that affect photography? The basic laws of nature take different forms for electrons and photons for this reason. The very uncertainty principle that we use to describe electrons must first be changed in a special way before it can be applied to the photon. The electron, being more particle-like because it can move slowly, does not describe the photon, which is never a slower candidate. Hence the non-relativistic form of the uncertainty relation must be changed into a relativistic uncertainty relation. Only then can photography be properly understood.
In this special case of the photon, the regular momentum-position uncertainty relation is no longer valid. How can you describe the photon, which never comes to rest, by "its" position? It does not have a well-defined position. Hence the position-momentum uncertainty relation must be changed: it is recastable into a speed-momentum form (and a position-energy form, and so on), a form I have worked out in much detail in one of my research papers, available on my website (mdashf.org). A constant speed then results in a blurred momentum, a blurred energy, and a blurred position. Depending on various other parameters such as time, the probability patterns of a camera image change with the relative motion between observer and observed (the camera and the object whose image is taken). Due to relative motion between camera and object (such as a bird), one is definitely going to get a blurred image. This is the reason the moving parts of a body being photographed may produce a fuzzy image, while the parts that are still always produce a sharp image.
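The motion blur described above can be put in rough numbers. Below is a minimal sketch; the bird's speed, the exposure time, and the magnification are my own illustrative assumptions, not values from the text.

```python
# Motion blur from relative motion between camera and object: the blur
# streak on the sensor is roughly object speed x exposure time, scaled
# by the optical magnification. All numbers here are hypothetical.

def motion_blur_mm(speed_m_s, exposure_s, magnification):
    """Length of the blur streak on the sensor, in millimetres."""
    return speed_m_s * exposure_s * magnification * 1000.0

# A bird flying at 10 m/s, shot at 1/50 s with 0.01x magnification:
blur = motion_blur_mm(10.0, 1 / 50, 0.01)
print(f"blur streak ~ {blur:.2f} mm on the sensor")
```

A still subject (speed zero) gives a zero-length streak, which is why the stationary parts of the scene stay sharp.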
Uncertainty Principle and Photography. See mdashf.org/2015/06/08/
1. September 27, 2008 Manmohan Dash, Virginia Tech
Uncertainty Principle
A natural effect of the Quantum World
2.
A shot taken when the camera is focused on the grass.
A shot taken when the camera is focused on the flower.
3.
How do we see what we see in a camera shot !!
The camera gets a bunch of photons [particulate light] or a flash of light beam [light in wave form].
To get a clear image of a spot, this bunch must be registered with precision, i.e. the spread in the energy of the photons must be narrow.
The spread of the energy is the same as the "error" of the energy measurement the camera is capable of performing: the less the error, the sharper the image.
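The energy spread here can be made concrete with Planck's relation E = hν, so the spread is ΔE = hΔν. A minimal sketch, with an illustrative 1 THz frequency spread of my own choosing:

```python
# Spread in photon energy from a spread in frequency, via Planck's
# relation E = h * nu, so delta_E = h * delta_nu. The 1 THz spread
# below is an illustrative number, not a value from the slides.

PLANCK_H = 6.62607015e-34  # Planck constant, J*s (exact in SI)

def energy_spread_joules(delta_nu_hz):
    """Energy spread of a photon bunch with frequency spread delta_nu."""
    return PLANCK_H * delta_nu_hz

# A narrower frequency spread means a narrower energy spread:
assert energy_spread_joules(0.5e12) < energy_spread_joules(1e12)
print(f"1 THz frequency spread -> {energy_spread_joules(1e12):.3e} J")
```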
4.
How do we see what we see in a camera shot !! ……
Light travels at a constant speed, and a small spread of the energy is the same as a precise frequency of the light beam.
This is not independent of the corresponding spread in the wavelength of the light beam.
The spread, or the error, of the wavelength is large if the corresponding error in the frequency/energy is small.
For a sharper image at a specific spot, the position of the other spots is "blurred".
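One standard way to quantify this trade-off is the coherence length of the beam: a wave train with frequency spread Δν extends over roughly c/Δν in space, so a more precise frequency means a more spread-out wave train. This framing is mine, not the slides'.

```python
# Coherence length of a light beam: a wave train with frequency spread
# delta_nu has a length of roughly c / delta_nu, so precise frequency
# (small delta_nu) means a long, spread-out wave train.

C = 299_792_458.0  # speed of light, m/s

def coherence_length_m(delta_nu_hz):
    """Approximate length of a wave train with frequency spread delta_nu."""
    return C / delta_nu_hz

# Halving the frequency spread doubles the extent of the wave train:
assert coherence_length_m(0.5e12) == 2 * coherence_length_m(1e12)
print(f"1 THz spread -> wave train ~ {coherence_length_m(1e12) * 1e6:.0f} micron")
```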
6.
The image of the flower is blurred because its position along the direction vertical to the picture (say Z) is not known precisely. In fact it is a "stack" of images from different Z-locations. Since the camera has measured energy precisely, it has lost the position to the extent of a wide Z.
So far so good. Only the smartest readers would immediately point out what is "wrong" with this explanation. Their question would be: the precision of energy would also mean a lack of precision for the Z position of the grass, so why is the grass not blurred?
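The trade-off the slide argues for can be sketched as an order-of-magnitude bound: treating a photon's flight as z = ct, the energy-time relation ΔE·Δt ≥ ħ/2 suggests Δz ≳ ħc/(2ΔE). This is my rough framing of the slide's argument, not a rigorous relativistic uncertainty relation.

```python
# Rough lower bound on position spread paired with an energy spread,
# obtained by treating the photon's flight as z = c * t so that
# delta_E * delta_t >= hbar / 2 becomes delta_z >= hbar * c / (2 * delta_E).
# An order-of-magnitude sketch only; the energy spreads are hypothetical.

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 299_792_458.0       # speed of light, m/s

def min_position_spread_m(delta_e_joules):
    """Lower bound on position spread for a given energy spread."""
    return HBAR * C / (2.0 * delta_e_joules)

# The tighter the energy measurement, the wider the position spread:
assert min_position_spread_m(1e-25) > min_position_spread_m(1e-22)
```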
7.
In fact, this would seem to mean the Uncertainty Principle is not valid.
The answer is that the Uncertainty Principle is still valid. The grass image would have been blurred, except that we focused our camera on the grass. "So what? Does that mean focusing has over-ruling power and the uncertainty principle is a joke?" NO. Focusing has provided an independent way of obtaining precise knowledge of the Z position of the grass. The camera has focused to a precise position, and this knowledge is inherent in the information content of the camera. When the camera reconstructs the image from all the available photons, then despite the error in the wavelength of the bunch coming from the grass, we know its position clearly.
8.
A principle of Nature.
We could even have discovered this effect with the naked eye: trying to focus on a distant object, other objects have a blurry image. Nonetheless this is not an easy task with the eye. It does not mean we are super powerful, but rather that our eye works like a camera/detector, and our perception of vision is the same as our perception of an image shot by a camera. In the case of a camera we have better optical and digital power.
In any case the Uncertainty Principle is simply there in Nature, like there is gravity, or the sensation of warmth and cold.
9.
Merging the QM and the non-QM information ?
I haven't done this experiment with a film camera or even with the naked eye. It was done with a recent digital model, and I don't know its exact mechanism. I haven't reviewed how exactly focusing gives us this inherent precision about position. In any case it is clear that this is independent of the camera's other measurement. By focusing in between the grass and the flower, we would have gotten a blurry grass as well. Does that mean a digital camera merges its optical measurement with its digital, or quantum mechanical, measurement and gives us a better result?? In that case, can we merge another focusing component into the "camera" to get a sharper image of the flower as well??
10.
Merging the QM and the non-QM information ?
If not for commercial purposes, it may have implications for measurements in astronomy !!
A "star-planet !!", recently [15th Sept 2008] found by the Gemini telescope.
[Image: Gemini adaptive optics image of 1RSX J160929.1-210524 and its likely ~8 Jupiter-mass companion (within red circle). The image is a composite of J-, H- and K-band near-infrared images, all obtained with the Gemini Altair adaptive optics system and the Near-Infrared Imager (NIRI) on the Gemini North telescope. Photo credit: Gemini Observatory.]
11.
Adaptive Optics System at Gemini Observatory
Distortion in the optical image is corrected by calibration of the atmospheric effects.
12.
Infrared wavelength reveals more
Gemini works better at infrared wavelengths, so it would be easier to study the energy-wavelength distributions.
A sophisticated energy-wavelength analysis can be incorporated by introducing an optical "correction" to the quantum mechanical information.
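As a small illustration of the energy-wavelength relation in the near infrared, the photon energies in the bands used for the Gemini image follow from E = hc/λ. The band-centre wavelengths below (roughly 1.25, 1.65 and 2.2 microns for J, H and K) are my approximate values, not figures from the slides.

```python
# Photon energy across the J, H and K near-infrared bands via E = h*c / lambda.
# Band-centre wavelengths are approximate, illustrative values.

PLANCK_H = 6.62607015e-34  # Planck constant, J*s
C = 299_792_458.0          # speed of light, m/s
EV = 1.602176634e-19       # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Photon energy in eV for a given wavelength."""
    return PLANCK_H * C / wavelength_m / EV

for band, wl in [("J", 1.25e-6), ("H", 1.65e-6), ("K", 2.2e-6)]:
    print(f"{band}-band photon ~ {photon_energy_ev(wl):.2f} eV")
```

Longer infrared wavelengths carry lower-energy photons, which is part of why an energy-wavelength analysis looks different there than in the visible.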