Limitations of the eye and how to overcome them

Human eyes have spatial, temporal, and spectral limitations which impose constraints on our perception. With appropriate optical devices and cameras, the limitations can be easily overcome. As a consequence, a huge variety of physical phenomena can be made accessible for teaching.


Introduction
Estimates of the proportion of information content perceived by the human sensory organs usually range between about 81 and 87% for the eyes and 10-11% for the ears, with the remainder attributed to taste, smell, and touch.
Our eye is therefore probably the most important of our senses regarding perception of our environment. Unfortunately, it is limited in terms of spatial and temporal resolution as well as spectral sensitivity with regard to the detection of physical objects and processes. However, these limitations can be overcome by suitable optical devices with camera sensors that replace the eye (figure 1). With such sensors, the infrared (IR), near-infrared (NIR), and ultraviolet (UV) parts of the spectrum can be observed. As a result, the number of observable technical and natural phenomena, and the information content obtained from them, can be increased significantly. This enables a better understanding of physical phenomena.
Such cameras are relatively inexpensive, and many simple experiments can be successfully demonstrated with them. Hence, their use can enrich the teaching of physics at schools and in introductory lectures at universities. Furthermore, it can also motivate students to study related fields such as infrared astronomy.

Space: from microscopes to telescopes
Without optical aids, the human eye has limited spatial resolution. Observation of nearby objects much smaller than 1 mm requires microscopes. These are usually available in all physics departments at school or university. As an example of modern physics devices, inexpensive semiconductor lasers (figures 2a,b) can be observed under the microscope (figure 2c), even while operating (figure 2d) [1]. This allows discussion of optical waveguide effects, light attenuation through the metallic contact layers, etc. In combination with recording the I-U characteristics of the laser diodes, as well as the change of the emission spectra when the laser threshold is exceeded, the subject is also suitable as a simple practical experiment. The eye's limited angular resolution also hampers observation of distant objects. Here, observation with telescopes helps. Figure 2e shows Saturn with rings and auroral oval as an example, recorded by the Hubble Space Telescope. There is a multitude of fascinating astronomical images for use in physics education (here, e.g., discussion of solar wind, magnetic fields, and aurorae), which can be found on the excellent website Astronomy Picture of the Day [2].
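As a quick numerical illustration of limited angular resolution, the following sketch estimates the smallest feature the unaided eye can resolve at a given distance. The assumed angular resolution of about one arcminute is a typical textbook value, not a number from this article; the function name is illustrative.

```python
import math

# Assumed typical angular resolution of the unaided human eye (~1 arcminute).
EYE_ANGULAR_RES_RAD = math.radians(1 / 60)  # ~2.9e-4 rad

def smallest_resolvable(distance_m, angular_res_rad=EYE_ANGULAR_RES_RAD):
    """Smallest feature size (m) resolvable at a distance (small-angle approx.)."""
    return distance_m * angular_res_rad

# At reading distance (0.25 m): ~0.07 mm, consistent with features "much
# smaller than 1 mm" requiring microscopes; at 1 km, details below ~0.3 m blur.
print(smallest_resolvable(0.25) * 1e3)  # ~0.073 mm
print(smallest_resolvable(1000))        # ~0.29 m
```

The same proportionality (feature size = distance × angular resolution) underlies the camera-resolution estimates for figure 7 discussed later.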

Time: from slow motion to time lapse
Our eyes are also very limited as detectors in terms of time resolution. Transient changes on the scale of seconds are easily detectable by our eyes. However, very fast short-time processes with time scales well below one second, say 1 ms, 1 µs, or even less, elude our perception. Conversely, very slow changes with time scales of hours or days also cause problems, as it is difficult to detect changes just by observing over the usual periods of seconds or minutes.
These restrictions can be overcome with suitable cameras. On the one hand, fast processes are recorded with high frame rates and short exposure times; the individual images are then played back as a slow-motion video. On the other hand, slow processes are captured as single frames recorded at large time intervals; the images are then used to create a time-lapse video [3,4].
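The apparent speed of the resulting video follows from the ratio of playback to capture frame rate. A minimal sketch (function name and the 30 fps playback default are illustrative assumptions, not values from the text):

```python
def playback_speed_factor(capture_fps, playback_fps=30.0):
    """Apparent speed of the final video: <1 is slow motion, >1 is time lapse."""
    return playback_fps / capture_fps

# Slow motion: 1000 fps played back at 30 fps appears ~33x slower than real time.
print(playback_speed_factor(1000))    # 0.03
# Time lapse: one frame per minute (1/60 fps) played at 30 fps -> ~1800x faster.
print(playback_speed_factor(1 / 60))  # ~1800
```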
Figures 3a,b show two still images from a high-speed video of an exploding red balloon, recorded with a commercial high-speed camera. The skin was punctured by absorption of light from a green laser. Within a few milliseconds, the crack in the rubber propagates at the speed of sound. The reflections of the two illuminating lamps are initially visible on the outer skin in front. After 8 ms they are also visible on the still-stretched rear inner skin. Obviously, the information of the bursting had not yet arrived there, due to the finite propagation speed of the crack. This is a nice example of retardation effects in mechanics. For links to video sequences, see [5,6].
Such images with time resolutions in the millisecond range can nowadays also be recorded with modern smartphone cameras, some of them operating with up to 1000 frames per second.
In contrast, figures 3c,d show two stills from a time-lapse video of a very slow thermodynamic process, the evaporation of water at room temperature. Such an experiment, running on a time scale of several weeks, can obviously be nicely integrated into teaching as a time-lapse video. This is much more impressive than a live demonstration in the lecture hall with occasional observations every other day over a period of several weeks. Many examples of slow-motion and time-lapse videos for use in physics teaching have been reported [5,6], often with detailed explanations of the underlying physics.

Spectral domain: from UV to IR
Maybe the most obvious limitation of the human eye, at least with regard to physics, is its spectral sensitivity. Light, i.e. visible electromagnetic radiation (abbreviated as VIS), comprises only a tiny fraction of the spectrum of electromagnetic radiation which is used technically or detectable in nature [7]. This is because our eyes, with the cones and rods located on the retina as radiation sensors, can only detect radiation in the wavelength range from about 380 nm to 780 nm. At the beginning of the 19th century, however, it was proven that electromagnetic waves also transmit energy beyond these spectral limits. In the short-wave range, i.e. for smaller wavelengths, UV radiation is followed by X-rays and γ radiation. Beyond the long-wave limit of 780 nm, infrared (IR) radiation and, later, microwaves and terahertz radiation as well as radio waves could be detected.
Obviously, we just need to replace our eyes with detectors sensitive to the respective wavelength ranges. This changes our perception of physical phenomena in technology and the environment enormously, and a lot of new information is gained.
In the following, only the UV and near-IR ranges directly adjacent to visible light will be discussed. The slightly longer wavelengths of the thermal IR, from around 3 µm to 14 µm, of course have many additional applications, in particular for teaching physics [8]. This has, however, already been treated in detail elsewhere, regarding both the basics of IR imaging and experiments for the classroom (e.g. [9,10]).
First, figure 4 depicts a comparison of a UV and a visual (VIS) image of a flower. The UV image (figure 4a) reveals a sight familiar to bees: the landing sites around the flower center appear clearly enlarged, which is not at all visible in the VIS image with its yellow hue of the petals (figure 4b). This is another example of the occasional surprises when familiar objects are observed in other spectral regions. There is always interesting physics behind it, here UV-absorbing pigments. As another example of recording photos in the UV, VIS, and NIR spectral ranges, figure 5 shows three images of the same landscape scene in sunny weather with partial cloud cover, recorded with the same commercial SLR camera with Si-FPA sensor in the near UV, VIS, and near IR. Because the respective spectral filters had to be changed, the images were recorded within a short time interval of a few minutes; this manifests itself in slight shifts of the clouds. For better comparison, all images were converted into grayscale photos (using Adobe Photoshop).
Analysis of the differences observable in these images reveals many interesting physical insights. Perhaps the most striking feature at first glance, known as the Wood effect, is the very bright vegetation in the NIR compared to the VIS and UV images. It is caused by the different scattering and absorption behavior of the leaves [11]. For more examples of NIR images in nature, see [12] and figure 9.
A second feature, which shall be discussed in more detail, is the successively improved contrast between clouds and sky with increasing wavelength from the UV to the NIR. This contrast improvement is evident in all details of the images and has many applications. How does it come about?

Contrast and contrast enhancement in the near infrared
Contrast in vision refers to the extent to which objects can be distinguished from their surroundings, e.g. whether the clouds in figure 5 stand out from the sky. Of course, in the real world we always perceive differences in color and brightness simultaneously. For the sake of simplicity, only perceived differences in brightness will be discussed here. These are described by the photometric quantity luminance B.
According to the laws of the psychology of perception, the contrast C is defined by the relative luminance difference between an object and its surroundings:

C = (BObj - Bsurr)/Bsurr (1)
This definition is based on the Weber-Fechner law. It states that a perceived sensory change, here brightness, is always related to the relative change of the stimulus, here the incident radiance or perceived luminance.
A prerequisite for detecting an object against its surroundings is that a certain contrast threshold Cthresh must be exceeded. Its exact value depends not only on the specific observation, but also on the observer. In atmospheric observations, e.g. visibility with contrast of objects relative to the sky close to the horizon, one usually assumes a threshold value of 2%, i.e. Cthresh = 0.02 [13,14]. At finite distances, contrast below this threshold means that corresponding details, i.e. differences between neighboring pixels, cannot be perceived.
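A minimal sketch of this threshold criterion, using the Weber-type relative luminance difference described above (function names are illustrative, not from the text):

```python
C_THRESH = 0.02  # 2% threshold commonly assumed for atmospheric observations

def weber_contrast(b_obj, b_surr):
    """Relative luminance difference between object and surroundings."""
    return (b_obj - b_surr) / b_surr

def is_detectable(b_obj, b_surr, threshold=C_THRESH):
    """An object stands out only if |C| exceeds the contrast threshold."""
    return abs(weber_contrast(b_obj, b_surr)) >= threshold

print(is_detectable(103, 100))  # True  (C = 0.03 > 0.02)
print(is_detectable(101, 100))  # False (C = 0.01 < 0.02)
```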
To apply equation (1), the relevant luminance values have to be calculated first. These usually consist of two parts, which is explained here for an object irradiated by sunlight (figure 6). First, one must consider the radiance emitted by the object and attenuated in the air on the way to the eye. Second, there is an additional radiance which is generated along the line of sight between object and eye. It is due to scattering of sunlight by the air molecules. Light from such individual scattering events is then, of course, also attenuated again on its way to the observer. The combined radiance is then detected by the eye according to its spectral sensitivity; thereby, the radiance is ultimately converted into luminance.

Figure 6. The luminance perceived by the eye consists of two parts. First, solar radiation is scattered from the object towards the observer; on its way, this radiance is attenuated by scattering processes. Second, irradiation of the air along the line of sight also generates scattered radiation, which propagates towards the eye while again being attenuated.

In this respect, the perception of objects depends on:
- how the object is irradiated,
- how much of this radiation is scattered towards the eye,
- how much of this object radiation is lost on the way to the eye by scattering processes, and
- how much additional scattered radiation is formed along the line of sight, which is subsequently again attenuated.

Obviously, attenuation plays a crucial role. For an object at distance d, the radiance BObj which passes through a homogeneous medium like air is attenuated according to the Lambert-Beer-Bouguer law

BObj(d) = BObj(0) · exp(-ε·d) (2)

The extinction coefficient ε contains all single scattering and absorption components. Multiple scattering processes (e.g., [15]) are not considered here. In addition to the attenuated object luminance, one must also compute the attenuated contributions of the scattered radiation within the line of sight. This contribution is often called air light. With regard to the contrast of an object to its surroundings, we also need the luminance of neighboring objects (the surroundings), characterized by the outgoing radiation Bsurr(0). Before entering the eye, it is also attenuated according to equation (2).
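A small sketch of the attenuation law, which also makes explicit a point implicit in the model: if only attenuation acted (no air light), object and surroundings would be dimmed by the same factor, and the relative contrast would not change with distance. It is the air light that degrades contrast. Variable names and the sample numbers are illustrative.

```python
import math

def attenuate(b0, eps_per_km, d_km):
    """Lambert-Beer-Bouguer attenuation: B(d) = B(0) * exp(-eps * d)."""
    return b0 * math.exp(-eps_per_km * d_km)

# Attenuation alone leaves the relative (Weber) contrast unchanged, since
# object and surround radiances shrink by the same factor exp(-eps*d):
b_obj0, b_surr0, eps, d = 0.5, 1.0, 0.0114, 50.0
c_0 = (b_obj0 - b_surr0) / b_surr0
c_d = (attenuate(b_obj0, eps, d) - attenuate(b_surr0, eps, d)) / attenuate(b_surr0, eps, d)
print(round(c_0, 9) == round(c_d, 9))  # True
```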
From both the luminance of the object and that of the surroundings, the contrast can be calculated. According to equation (2), it depends on distance, i.e. C = C(d). More generally speaking, it depends on four input variables: the extinction coefficient ε and the three radiances of the object, the surroundings, and the air light. In a simplified model, the ratios of the three radiances are wavelength independent, and the influence of the observation wavelength on the contrast can be easily analyzed [16].
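Such a simplified model can be sketched as follows. The saturating form of the air-light term, B_air·(1 − exp(−εd)), is an assumption consistent with the description above, not an equation quoted from the text; the numerical radiances are illustrative.

```python
import math

def luminance_at(b0, b_air, eps, d):
    """Attenuated source luminance plus air light built up along the path."""
    t = math.exp(-eps * d)           # transmission over distance d
    return b0 * t + b_air * (1 - t)

def contrast_at(b_obj0, b_surr0, b_air, eps, d):
    """Weber contrast of object vs. surroundings as seen from distance d."""
    b_obj = luminance_at(b_obj0, b_air, eps, d)
    b_surr = luminance_at(b_surr0, b_air, eps, d)
    return (b_obj - b_surr) / b_surr

# Dark object (0.2) against a bright surround (1.0), air light 1.0, seen at
# 42.5 km, with the Rayleigh extinction values for 560 nm and 900 nm quoted
# in the text:
print(round(contrast_at(0.2, 1.0, 1.0, 0.0114, 42.5), 3))   # -0.493 (VIS)
print(round(contrast_at(0.2, 1.0, 1.0, 0.00163, 42.5), 3))  # -0.746 (NIR)
```

The larger NIR contrast magnitude in this toy calculation anticipates why the NIR image of figure 7 shows so much more detail.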
For pure air, Rayleigh scattering [17] from air molecules dominates. It gives ε = 0.0114/km for T = 0 °C and p = 1013.25 hPa in the middle of the visible spectral range at λ = 560 nm [18]. This means that any detected luminance is attenuated to 89.2% and 32.0% of its initial value after 10 km and 100 km, respectively, while traversing the air along the line of sight. Since Rayleigh scattering varies approximately in proportion to 1/λ⁴, ε decreases in the near infrared (NIR) at λ = 900 nm to the smaller value ε = 0.00163/km, i.e. to only about 1/7 of the value in the VIS. Less scattering means less attenuation of the object and surrounding radiances, but also less scattered light along the line of sight. The respective changes in detected luminance significantly increase the NIR contrast as well as the visual range corresponding to the contrast threshold. The same is true when scattered radiation from aerosols is added to Rayleigh scattering. Aerosols scatter mostly independently of wavelength; therefore, the Rayleigh scattering component remains essential for the contrast differences between VIS and NIR.
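The quoted numbers can be checked in a few lines. The 1/λ⁴ scaling and the Koschmieder-type visual-range relation V = ln(1/Cthresh)/ε are standard results assumed here, not formulas given explicitly in the text:

```python
import math

eps_560 = 0.0114                             # 1/km, Rayleigh extinction at 560 nm
print(round(math.exp(-eps_560 * 10), 3))     # 0.892 -> 89.2% after 10 km
print(round(math.exp(-eps_560 * 100), 3))    # 0.32  -> 32.0% after 100 km

# 1/lambda^4 scaling from 560 nm to 900 nm:
eps_900_est = eps_560 * (560 / 900) ** 4
print(round(eps_900_est, 5))                 # 0.00171/km, near the quoted 0.00163/km
print(round(eps_560 / 0.00163, 1))           # 7.0 -> about 1/7 of the VIS value

# Visual range for a 2% contrast threshold (Koschmieder-type estimate):
def visual_range_km(eps_per_km, c_thresh=0.02):
    return math.log(1 / c_thresh) / eps_per_km

print(round(visual_range_km(eps_560)))       # ~343 km (VIS)
print(round(visual_range_km(0.00163)))       # ~2400 km (NIR)
```

These idealized visual ranges apply to pure Rayleigh air; real aerosol loads reduce both values considerably.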
Figure 7 shows an illustrative example, comparing a visible photo and a NIR one, both recorded with the same converted SLR camera [11] from Bozeman/MT. The distant mountain is Gallatin Peak, recorded from a distance of 42.5 km. For better comparison, both photos were converted to grayscale images [16]. The theoretical spatial resolution of the camera at the distance of 42.5 km is between 30 cm and 40 cm. At smaller distances, e.g. at around 20 km in the area of the front hill chains, the smallest resolvable feature size shrinks by about a factor of two. There, the resolution should be sufficient, in principle, to distinguish individual trees.
The visible image (figure 7a) seems, however, to depict one or two more or less homogeneous hill chains, except for the deforestation area in the foreground (C < Cthresh). Individual trees can just be resolved in the closest regions. In contrast, the NIR image (figure 7b), recorded at about the same time, shows significantly more structure and detail due to the increased contrast (C > Cthresh). The second hill chain is also forested, and there is a lowered shaded area which was hardly perceptible in the VIS. In addition, individual trees can still be clearly seen here.
Similar analyses were used to study maximum visual ranges in clear atmospheres, as well as changes of visual range during the abrupt illumination changes of solar eclipses [19,20].
The contrast increase at larger wavelengths has many technical and scientific applications which can be discussed while teaching physics. Examples, at even longer wavelengths in the NIR and thermal IR ranges [5], include detection through haze (from satellites) and light fog. For forest firefighters, airborne cameras also allow looking through smoke and thus identifying critical hot spots (see [9]).

NIR astronomy
The lower attenuation of radiation at NIR compared to VIS wavelengths while passing through gases or dust clouds also has many important applications in astronomy. Therefore, the present topic is also suitable as preparation for infrared astronomy.
On Earth, the radiation of terrestrial objects has to pass through the Earth's atmosphere. Thereby it is scattered by its components: atoms, molecules, aerosols, water droplets, ice crystals, and also dust. Although the radiation is attenuated, the contrast at longer wavelengths is increased.
Of course, extraterrestrial objects can also be observed. With earthbound telescopes, the light is then attenuated once more at the very end of its journey, when passing vertically through the atmosphere. Space-based telescopes provide better images. On the one hand, they eliminate the resolution-limiting effect of air turbulence (seeing). On the other hand, frequency ranges outside the atmospheric windows can also be used.
However, even for space-based telescopes, there is attenuation of the observed radiation. This is due to the fact that radiation from stars and galaxies does not only propagate in nearly ideal vacuum. It is occasionally weakened by absorption and scattering, especially by cosmic gas and dust clouds in areas of new star formation. In this case, similar considerations apply as on Earth. For small dust particle sizes, the extinction in the near infrared is much lower than in the visible spectral region. Lower scattering means higher contrast, i.e. effectively a larger visual range into space.

Figure 1. Schematic representation of the three parameters space, time, and the spectral domain, which restrict visual perception. For each parameter, there are technological solutions to overcome the respective limitations.

Figure 3. (a), (b) Bursting red balloon recorded with a commercial NAC Hotshot camera at 1000 fps, i.e. with one-millisecond time resolution: (a) at the start, i.e. at 0 ms, and (b) 8 ms later. (c), (d) Evaporation of 500 ml of colored water within 7 weeks at room temperature (camera: Brinno TLC200). For links to video sequences, see [5,6].

Figure 4. UV (a) and VIS (b) images of a flower. Due to pigments absorbing in the UV only, the UV image shows enlarged structures around the center which attract insects.

Figure 7. Visible (a) and NIR (b) images of Gallatin Peak, recorded from Bozeman/MT (USA) at a distance of 42.5 km in the evening sun (f = 600 mm; images converted to grayscale via Photoshop).

Figure 8 shows this impressively for the Carina Nebula, imaged by the Hubble Space Telescope [21]. The columns of gas and dust at a distance of about 7500 light-years were imaged in 2009 in the visible and NIR ranges. The VIS image shows dense dust clouds in which new stars are forming. The stars behind the optically dense dust region remain hidden. In contrast, they show up impressively in the NIR image. The NIR radiation is much less attenuated, and a large part can pass through the dust cloud unhindered on its way to Earth, providing us with valuable information.

Figure 8. Comparison of visible (top) and near-infrared (bottom) images of the Carina Nebula. VIS image (16 MPixel): composed of 3 narrowband images at λ = 502 nm, 656 nm, and 673 nm (false colors blue, green, and red). NIR image (1 MPixel): composed of 2 images at λ = 1.26 µm and 1.64 µm (false colors cyan and orange). Image courtesy NASA, ESA, and the Hubble SM4 ERO Team.