2. Once again, one could make an evolutionary argument here. Given that the human voice falls in this middle range and the importance of communication among humans, one could argue that it is quite adaptive to have an audible range that centers on this particular type of stimulus.

and then you can do things like siemensStars["BottomRight", "Center"] or siemensStars // Dataset. And you can convince yourself of the correctness of the above calculation like so:

Here comes a final remark - hopefully not too off-topic! I just want to mention probably the most basic and shortest way to determine the MTF:

Two physical characteristics of a wave are amplitude and wavelength (figure below). The amplitude of a wave is the height of a wave as measured from the highest point on the wave (peak or crest) to the lowest point on the wave (trough). Wavelength refers to the length of a wave from one peak to the next.

The visible spectrum is the portion of the larger electromagnetic spectrum that we can see. As the figure below shows, the electromagnetic spectrum encompasses all of the electromagnetic radiation that occurs in our environment and includes gamma rays, x-rays, ultraviolet light, visible light, infrared light, microwaves, and radio waves. These waves are all around us at all times, but for some of them we need sophisticated instruments to translate the information into a form we are able to see. The visible spectrum in humans is associated with wavelengths that range from 380 to 740 nm, a very small distance, since a nanometer (nm) is one billionth of a meter. Other species can detect other portions of the electromagnetic spectrum. For instance, honeybees can see light in the ultraviolet range (Wakakuwa, Stavenga, & Arikawa, 2007), and some snakes can detect infrared radiation in addition to more traditional visual light cues (Chen, Deng, Brauth, Ding, & Tang, 2012; Hartline, Kass, & Loop, 1978).

A test chart has 5 Siemens stars, one larger one in the centre and four smaller in each corner. [...] I have attached a real camera export ...

The image of a "Delta peak" is the point spread function (PSF) - and its Fourier transform is the sought-after MTF. Making the realistic assumption of a rotationally symmetric PSF, one can make use of the Fourier slice theorem: a projection of the 2D PSF gives its Abel transform (1D), and the Fourier transform of the Abel transform gives the radial data of the (rotationally symmetric) MTF:
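
To make this chain concrete, here is a small numerical sketch, in Python/NumPy rather than the Wolfram Language, with a toy Gaussian PSF standing in for a measured one (all names and parameters are illustrative): projecting the 2D PSF onto one axis and Fourier-transforming that projection reproduces the analytically known Gaussian MTF.

```python
import numpy as np

# Numerical check of the Fourier-slice chain: for a rotationally
# symmetric PSF, the 1D Fourier transform of its projection (its Abel
# transform) equals a radial slice of the 2D MTF. A toy Gaussian PSF
# is used because its MTF is known in closed form.

n, sigma = 256, 3.0
x = np.arange(n) - n // 2
X, Y = np.meshgrid(x, x)
psf = np.exp(-(X**2 + Y**2) / (2 * sigma**2))   # toy rotationally symmetric PSF

projection = psf.sum(axis=0)                    # numerical Abel transform
mtf = np.abs(np.fft.rfft(projection))
mtf /= mtf[0]                                   # normalize so that MTF(0) = 1

# the MTF of a Gaussian PSF is itself Gaussian; compare radially:
freqs = np.fft.rfftfreq(n)                      # cycles per pixel
analytic = np.exp(-2 * (np.pi * sigma * freqs) ** 2)
print(np.max(np.abs(mtf - analytic)))           # agreement to numerical precision
```

For a measured PSF the same two lines (project, then `rfft`) replace the closed-form comparison; that is the whole algorithm.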

opponent process: Perception of color derives from a special group of neurons that respond to opponent colors (red-green, blue-yellow)

I understand you have an image (as a bitmap). This is an (n, m) array of bytes, and as a first step you want to find those regions which show Siemens stars?

After light passes through the cornea, pupil, and lens, light waves travel through the jelly-like vitreous fluid in the eye and land on the retina, a dense collection of neurons covering the back wall of the eye. The retina is where millions of specialized neurons called photoreceptors absorb light waves and turn this information into chemical and electrical signals, which are processed in the primary visual cortex of the occipital lobe and the lateral geniculate nucleus of the thalamus. Rods and cones are the two types of photoreceptors in the retina, and they get their names from their characteristic shapes. Rods are extremely sensitive to (fire in response to) single photons (quanta of light, the smallest packets of light; Rieke & Baylor, 1998). Rods create scotopic vision, which encodes less intense light, and are mainly responsible for humans' ability to see at night. Rods are much more common in the human retina than cones, with about 100 million rod cells compared to about seven million cone cells (Williamson & Cummins, 1983). Cone receptors, on the other hand, allow us to experience the vivid diversity of different wavelength reflections from objects, which creates our perception of colors. It is important to note that color is not an innate property of objects in the world; it is created by the way our receptors respond to the light reflected off objects. That one organism perceives an object as blue while another experiences the same object as gray does not mean that either organism's perception is wrong; it just means that they have receptors tuned to send different signals to the color-processing areas of their brains when experiencing the light reflected off that object. Color is an interpretation created by mixing the activation of the specific receptors we have and the signals those receptors send to higher processing areas of the brain.
In addition to allowing us to see color, cones also process fine details and allow for visual acuity.

As was the case with the visible spectrum, other species show differences in their audible ranges. For instance, chickens have a very limited audible range, from 125 to 2000 Hz. Mice have an audible range from 1000 to 91000 Hz, and the beluga whale’s audible range is from 1000 to 123000 Hz. Our pet dogs and cats have audible ranges of about 70–45000 Hz and 45–64000 Hz, respectively (Strain, 2003).

Although wave amplitude is generally associated with loudness, there is some interaction between frequency and amplitude in our perception of loudness within the audible range. For example, a 10 Hz sound wave is inaudible no matter the amplitude of the wave. A 1000 Hz sound wave, on the other hand, would vary dramatically in terms of perceived loudness as the amplitude of the wave increased.

Thank you Hans, thank you Henrik. The last response from Henrik seems to be what I am after, and it is very promising. I am not sure why you didn't see the original image I had attached in my last response, as I used "Add file to the post". But I am now attaching the original real export of a CCTV camera (typically they are HD or 4K resolution) and the measurements that I have made using Matlab. As you can see, I am also measuring other parameters of the reproduced test chart, like the colour, gamma and noise. I am not very happy with the Siemens star measurements, as they are not consistent with what I measure manually using Photoshop, for example. This is the reason I intend to try Mathematica, which I have no experience with but am willing to try. I am assuming that finding the colour coordinates of the colour patches would be easy, and so would the random pixel noise of a grey patch representing 50% grey. Thank you very much for this hint Henrik, I will try to reproduce it myself on the real test chart and calculate the Depth of Modulation.

Hi Wolframians, I am new to the Wolfram Language, and I have only used a trial version of Mathematica to see if there is a way to replace my Siemens sine-wave star measurement for my test chart, which is currently done in Matlab. I am under the impression that Mathematica is much more powerful, but as with anything new, it is hard for me to find a function or algorithm for measuring the resolution of a Siemens sine-wave star, which I use in my ViDi Labs test charts (https://vidilabs.com/testcharts.html). I developed a little Matlab program to find the circles in an image and then measure resolution (or MTF = Modulation Transfer Function) based on the 10% Depth of Modulation, as described by the IEC 62676-5 standard. If anybody can help me, either by giving me a list of functions to look at, or if somebody knows exactly how to calculate the above-mentioned resolution, I would gladly purchase a full licence of Mathematica and investigate this further. I am happy to consider even paying somebody for their time if they can do what I have described in my video at the above-mentioned URL. I am also adding an attachment, a simple explanation of how this resolution is measured. Thank you so much.

Both light and sound can be described in terms of wave forms with physical characteristics like amplitude, wavelength, and timbre. Wavelength and frequency are inversely related so that longer waves have lower frequencies, and shorter waves have higher frequencies. In the visual system, a light wave’s wavelength is generally associated with color, and its amplitude is associated with brightness. In the auditory system, a sound’s frequency is associated with pitch, and its amplitude is associated with loudness.

The loudness of a given sound is closely associated with the amplitude of the sound wave. Higher amplitudes are associated with louder sounds. Loudness is measured in terms of decibels (dB), a logarithmic unit of sound intensity. A typical conversation would correlate with 60 dB; a rock concert might check in at 120 dB (figure below). A whisper 5 feet away or rustling leaves are at the low end of our hearing range; sounds like a window air conditioner, a normal conversation, and even heavy traffic or a vacuum cleaner are within a tolerable range. However, there is the potential for hearing damage from about 80 dB to 130 dB: These are sounds of a food processor, power lawnmower, heavy truck (25 feet away), subway train (20 feet away), live rock music, and a jackhammer. The threshold for pain is about 130 dB, a jet plane taking off or a revolver firing at close range (Dunkle, 1982).
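
Because decibels are logarithmic, the arithmetic behind these comparisons is easy to check. A minimal Python sketch (the reference intensity of 1e-12 W/m², the conventional threshold of hearing, is the only assumed constant; it is not taken from this section):

```python
import math

# Decibels are a logarithmic unit: every +10 dB means a tenfold
# increase in sound intensity. I0 is the conventional reference
# intensity at the threshold of hearing.

I0 = 1e-12  # W/m^2

def intensity_to_db(intensity_w_m2):
    """Sound intensity level in dB relative to the hearing threshold."""
    return 10 * math.log10(intensity_w_m2 / I0)

def db_to_intensity(db):
    """Inverse: physical intensity in W/m^2 for a given dB level."""
    return I0 * 10 ** (db / 10)

# A 120 dB rock concert is a million times more intense than a 60 dB
# conversation, although the dB numbers differ only by a factor of two:
print(db_to_intensity(120) / db_to_intensity(60))   # ~1e6
```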

The human retina is a fascinating structure because light is actually processed seemingly in reverse, beginning with the pigment epithelium, which is organized into receptive fields on the outside layer of the retina, and continuing toward the front of the eye through the rods and cones. The rods and cones transmit information to bipolar cells, which transmit signals to ganglion cells located at the front of the retina; these bundle together and relay information to deeper structures of the brain by way of the optic nerve. The bundled ganglion cell axons that form the optic nerve exit the retina at the optic disc, which creates a natural blind spot in each eye. However, the blind spot created by the exiting optic nerve is not perceived, due to compensation from receptors surrounding the blind spot as well as information from the other eye, which can perceive the missing region because the light hits its retina in a different location. This will be reviewed further in the following section on vision.

5.2 Waves & Wavelengths by Kathryn Dumper, William Jenkins, Arlene Lacombe, Marilyn Lovett, and Marion Perlmutter is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.

1. Other species have evolved to best suit their particular environmental niches. For example, the honeybee relies on flowering plants for survival. Seeing in the ultraviolet light might prove especially helpful when locating flowers. Once a flower is found, the ultraviolet rays point to the center of the flower where the pollen and nectar are contained. Similar arguments could be made for infrared detection in snakes as well as for the differences in audible ranges of the species described in this section.

1. Why do you think other species have such different ranges of sensitivity for both visual and auditory stimuli compared to humans?

2. Why do you think humans are especially sensitive to sounds with frequencies that fall in the middle portion of the audible range?

1. Which of the following correctly matches the pattern in our perception of color as we move from short wavelengths to long wavelengths?

Visual and auditory stimuli both occur in the form of waves. Although the two stimuli are very different in terms of composition, wave forms share similar characteristics that are especially important to our visual and auditory perceptions. Waveforms of different types surround us at all times; however, we only have receptors that are sensitive to specific ranges of wavelengths. In this section, we describe the physical properties of the waves as well as the perceptual experiences associated with them.

Wavelength is directly related to the frequency of a given wave form. Frequency refers to the number of waves that pass a given point in a given time period and is often expressed in terms of hertz (Hz), or cycles per second. Longer wavelengths will have lower frequencies, and shorter wavelengths will have higher frequencies (figure below).
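
This inverse relationship is just frequency = wave speed / wavelength, which can be verified numerically. A minimal sketch (the wave speeds are standard textbook values, not taken from this section):

```python
# The inverse relationship between wavelength and frequency:
# frequency = wave speed / wavelength, so doubling the wavelength
# halves the frequency. Wave speeds are standard textbook values.

SPEED_OF_SOUND = 343.0   # m/s, in air at about 20 C
SPEED_OF_LIGHT = 3.0e8   # m/s, in a vacuum (rounded)

def frequency_hz(wave_speed_m_s, wavelength_m):
    """Frequency in hertz (cycles per second) for a given wavelength."""
    return wave_speed_m_s / wavelength_m

print(frequency_hz(SPEED_OF_SOUND, 1.0))     # a 1 m sound wave: 343 Hz
print(frequency_hz(SPEED_OF_SOUND, 2.0))     # twice the wavelength: 171.5 Hz
print(frequency_hz(SPEED_OF_LIGHT, 550e-9))  # 550 nm green light: ~5.5e14 Hz
```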

Like light waves, the physical properties of sound waves are associated with various aspects of our perception of sound. Sound waves are created by vibrations and can be thought of as ripples in the gases that constantly surround us. This is why sound does not exist in space or in a complete vacuum: without air or another gas to transmit the signal, sound cannot exist. The frequency of a sound wave is associated with our perception of that sound's pitch. High-frequency sound waves are perceived as high-pitched sounds, while low-frequency sound waves are perceived as low-pitched sounds. The audible range of sound frequencies is between 20 and 20000 Hz, with greatest sensitivity to those frequencies that fall in the middle of this range.
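
A quick way to see frequency as "cycles per second" is to synthesize two tones and count their zero crossings. A sketch (44.1 kHz is the common audio sample rate, and the specific frequencies are arbitrary examples, not from the text):

```python
import numpy as np

# Synthesize sine tones and count zero crossings: each cycle crosses
# zero twice, so crossings/2 over one second recovers the frequency,
# the quantity we perceive as pitch.

RATE = 44100  # samples per second, the common audio sample rate

def tone(freq_hz, seconds=1.0):
    """Samples of a sine tone at the given frequency."""
    t = np.arange(int(RATE * seconds)) / RATE
    return np.sin(2 * np.pi * freq_hz * t)

def zero_crossings(signal):
    """Number of sign changes in the sampled signal."""
    s = np.signbit(signal)
    return int(np.sum(s[1:] != s[:-1]))

# an octave up doubles the frequency, and hence the crossings:
print(zero_crossings(tone(220)) / 2)   # ~220
print(zero_crossings(tone(440)) / 2)   # ~440
```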

Whatis wavelength

Your impression is absolutely correct! Here comes an attempt showing some intermediate results which hopefully point in the right direction:

Here is a first attempt to find a (simple) circular structure in an image. First make the image; define its limits and whatever else is needed:

Hi Vlado, sorry for the delay! Here is a sketch of my approach for finally calculating the MTF (according to the formula you gave). I am assuming we have defined the above siemensStars association, then:

Of course, different musical instruments can play the same musical note at the same level of loudness, yet they still sound quite different. This is known as the timbre of a sound. Timbre refers to a sound’s purity, and it is affected by the complex interplay of frequency, amplitude, and timing of sound waves.

In humans, light wavelength is associated with perception of color (figure above). Within the visible spectrum, our experience of red is associated with longer wavelengths, greens are intermediate, and blues and violets are shorter in wavelength. (An easy way to remember this is the mnemonic ROYGBIV: red, orange, yellow, green, blue, indigo, violet.) The amplitude of light waves is associated with our experience of brightness or intensity of color, with larger amplitudes appearing brighter. Animals that are able to see visible light have different ranges of color perception. Humans have three different types of color receptors (cones), resulting in a trichromatic organization of color, whereas most birds have four different types of cones, resulting in a tetrachromatic experience including ultraviolet, blue, green and red. Dogs, commonly thought to see in black and white, actually do see in color; however, their perception is limited to a narrower arrangement of colors including black, yellow, gray and blue. Humans and animals perceive color by way of an opponent-process model of color vision, in which a small number of primary color receptors mix their signals to create the perception of a variety of other colors (Hering, 1924). Behavioral methods have been designed to better understand how many different colors animals are able to differentiate between (how many different colors are perceived) compared to how many different types of receptors they have (see Gregg, Jamison, Wilkie & Radinsky, 1924, for an example of color differentiation among dogs, cats and raccoons). Whereas human vision appears to operate on an opponent-process model, some animals with more diverse varieties of color receptors have been shown to operate on different methods of color perception.
Ironically, the mantis shrimp, the animal that could have the broadest, most detailed perception of color with 12 different color receptors, may not see in as vivid an arrangement as was previously thought. Recent research has demonstrated that although the mantis shrimp has 12 different types of color receptors (thus far the most known in the animal kingdom), its visual system appears to operate on a completely different, previously unknown color-vision processing model based on temporal signaling combined with scanning eye movements, enabling a type of color recognition as opposed to color discrimination as in other animals and humans (Thoen, How, Chiou & Marshall, 2014).

But my remark here is basically about processing the test image you sent, and the main thing is a hopefully more robust function for finding the star centers:

Wow Henrik, I am impressed, although I am yet to start using your inspirational suggestions :-). Do you mind writing me a direct e-mail to vlado@vidilabs.com so I can give you some other material to look at which would be difficult to include here as an attachment? But only if you have time and are willing to continue with your suggestions :-)

Ok Hans, this is how I would summarise it: 1. A test chart has 5 Siemens stars, one larger one in the centre and four smaller ones in the corners. An image is taken by a camera and exported as JPG or BMP. How would you find the five circles and their centres? 2. Once the circles and their respective centres are found, one needs to measure the Depth of Modulation, as per the image I attached previously. Basically, find how close to the centre the B/W rays are distinguishable with 10% contrast. 3. Based on the measurement under 2 (in pixels), calculate the MTF expressed in Line Pairs per Picture Height. This is admittedly very easy if 1 and 2 are automated. I have attached a real camera export, which means there could always be slight variations in star positions, their geometry could be distorted (not always perfect circles) and they could have different resolution (which is to be measured).
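
For what it is worth, steps 2 and 3 can be sketched in a few lines of Python/NumPy. This is a hypothetical illustration, not the poster's Matlab code or the Wolfram solution discussed in this thread: a synthetic blurred star with a known centre stands in for the camera export (so step 1 is skipped), and the LP/PH conversion is the plain frequency-at-radius formula, not anything quoted from IEC 62676-5.

```python
import numpy as np

# Sketch of steps 2 and 3: sample a Siemens star along circles of
# shrinking radius, compute the depth of modulation on each circle,
# and find the radius where it first drops below 10%.

n, n_pairs = 512, 36                  # image size, line pairs per revolution
y, x = np.mgrid[:n, :n] - n // 2
star = 0.5 + 0.5 * np.sin(n_pairs * np.arctan2(y, x))   # ideal sine star

# blur the star with a Gaussian MTF (sigma in pixels) to mimic a lens:
sigma = 2.0
fx = np.fft.fftfreq(n)
FX, FY = np.meshgrid(fx, fx)
lens_mtf = np.exp(-2 * (np.pi * sigma) ** 2 * (FX**2 + FY**2))
img = np.real(np.fft.ifft2(np.fft.fft2(star) * lens_mtf))

def modulation(image, cx, cy, radius, samples=720):
    """Depth of modulation (max-min)/(max+min) sampled on a circle."""
    a = np.linspace(0, 2 * np.pi, samples, endpoint=False)
    px = np.round(cx + radius * np.cos(a)).astype(int)
    py = np.round(cy + radius * np.sin(a)).astype(int)
    vals = image[py, px]
    return (vals.max() - vals.min()) / (vals.max() + vals.min())

# walk inward from the rim; the first radius below 10% modulation
# marks the resolution limit:
c = n // 2
r10 = next(r for r in range(n // 2 - 4, 4, -1)
           if modulation(img, c, c, r) < 0.10)

# spatial frequency at that radius, expressed in line pairs per
# picture height (picture height = n pixels here):
lp_ph = n_pairs * n / (2 * np.pi * r10)
print(r10, lp_ph)
```

On a real export, the same `modulation` function would simply be evaluated at each of the five centres found in step 1.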

Thank you Henrik, you certainly are one of a kind. Your suggestions are definitely putting my thoughts in the right direction. All I need to do now is study Mathematica myself and see how I can proceed from your suggestions. I am happy to even pay for your time if you wish to complete my test chart evaluation, of which the Siemens star is probably the most convoluted part, if I may say. But I leave this up to you; I know this is not the place to discuss such outcomes, but I wanted to show you my appreciation. I appreciate your time Henrik, and thank you again for the good pointers and examples.

There are three main features of light waves which allow us to objectively define differences between what we experience as colors. The first, hue, is what we are usually talking about when we refer to color (a red shirt has a red hue). The hue is basically the specific name for the specific wavelength reflected by an object. Violet has the shortest wavelength in the visible spectrum (~400 nm), and red has the longest (~700 nm). Brightness refers to the intensity of the color and depends on the amplitude, the distance between the midpoint and the peak of the wave. The higher the amplitude of the waveform, the more intense and bright the color. Finally, saturation refers to color purity, which is determined by the uniformity of the wavelengths. Higher saturations occur when the component wavelengths have the same size and shape. Most colors we experience are not pure, meaning that many wavelengths of different shapes and sizes enter the eye. Due to differences in hue, amplitude and saturation, the average human is able to perceive some 2.3 million different colors (Linhares, Pinto & Nascimento, 2008).

1. If you grew up with a family pet, then you have surely noticed that they often seem to hear things that you don’t hear. Now that you’ve read this section, you probably have some insight as to why this may be. How would you explain this to a friend who never had the opportunity to take a class like this?