If the blurring from the optical system and sensor is significantly smaller than a pixel, then you can get spatial aliasing*. For instance, if you've got a perfectly crisp pair of lines that are projected onto one pixel column, you'll just see a 50% gray vertical line. Many inexpensive optical systems these days tend to have pixel counts high enough that the pixels are smaller than the blur spot. This means that the optical blur forms an anti-aliasing filter of sorts, and you don't need to worry about the sensor geometry. Really good cameras and esoteric sensors with low pixel counts still tend to be limited by the sensor's pixel size.
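To make that concrete, here's a minimal numpy sketch (the pixel counts and pattern sizes are made-up numbers, not anything from your book): it averages a crisp line-pair pattern over pixel-sized bins, once with exactly one line pair per pixel (everything collapses to 50% gray) and once with a slightly coarser pattern that aliases into a slow false variation.

```python
import numpy as np

sub = 100                     # sub-samples per sensor pixel (assumed)
n_pixels = 8                  # a short row of pixels (assumed)
x = np.arange(n_pixels * sub)

# Case 1: exactly one line pair (one dark + one light line) per pixel width.
scene = ((x // (sub // 2)) % 2).astype(float)
pixels = scene.reshape(n_pixels, sub).mean(axis=1)   # each pixel averages its light
print(pixels)                 # every pixel reads ~0.5: a uniform 50% gray

# Case 2: a slightly coarser pattern, still finer than 2 pixels per line pair.
period = int(sub * 1.25)      # about 0.8 line pairs per pixel
scene2 = ((x // (period // 2)) % 2).astype(float)
pixels2 = scene2.reshape(n_pixels, sub).mean(axis=1)
print(pixels2)                # values drift up and down: a low-frequency alias
```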
The same is true for a camera system: its spatial resolution is limited by two factors, first the resolution limit of the lens (its sharpness), and then the resolution limit of the CMOS / CCD sensor grid (its sampling density).
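As a rough sketch of the sensor half of that limit (the sensor width and element count below are invented, not the ones from your book's problem): one line pair needs at least two pixels, so the sensor's limit in lp/mm follows directly from the pixel pitch.

```python
# Sensor-limited resolution from pixel pitch (all numbers are assumed examples).
sensor_width_mm = 6.4         # hypothetical CCD width
pixels_across = 3200          # hypothetical number of horizontal elements

pitch_mm = sensor_width_mm / pixels_across     # width of one pixel
nyquist_lp_per_mm = 1.0 / (2.0 * pitch_mm)     # one line pair needs >= 2 pixels
print(f"pitch = {pitch_mm * 1000:.1f} um, sensor limit ~ {nyquist_lp_per_mm:.0f} lp/mm")
```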
The whole electrooptical imaging thing is complicated, and there are always more details to stumble over. In this case, I think the following three points will clear things up for you:
I hope this question makes sense. I am a CS student, so I don't do much of anything with signal processing, but I would like to have a better understanding of how images are formed with respect to their spatial resolution capabilities.
What does a line pair actually correspond to or represent in the context of image formation, specifically in a camera? There is a question in my book about how many line pairs/mm a camera will be able to resolve when taking an image of a subject. It gives the size of the CCD in the camera, as well as how many elements there are vertically and horizontally in the CCD. What does line pairs/mm of the CCD mean? Is this something like the number of pixels that will be resolved per CCD element? It just isn't clear to me how line pairs are realized in the camera.
Those resolution charts are designed to test optical equipment and systems. Consider two lines viewed from a fixed distance, separated by $d$ from each other. It's known that as $d$ is made smaller and smaller, after some threshold you can no longer tell that there are two lines: they will look like a single one. That line separation distance $d_{min}$ is the limit of your optical resolution at that viewing condition. To get rid of the viewing distance dependence, resolution is stated in angular units.
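For example, here's a small sketch (with an assumed line separation and viewing distance) that converts the separation $d$ at distance $D$ into the angle it subtends; viewing the same chart from twice the distance halves the angle, which is exactly the dependence that stating resolution in angular units removes.

```python
import math

d_mm = 0.5        # assumed separation between the two lines
D_mm = 2000.0     # assumed viewing distance

theta_rad = 2.0 * math.atan(d_mm / (2.0 * D_mm))   # angle subtended by the separation
theta_arcmin = math.degrees(theta_rad) * 60.0
print(f"{theta_arcmin:.2f} arcminutes")
```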
I am learning about digital image processing, and the book I am reading is discussing spatial resolution. They discuss spatial resolution in terms of line pairs. They talk about constructing a chart with pairs of dark and light lines, and how the width of a line pair is the width of a light and dark line. This is all of the information the book gives about line pairs, and I don't really understand what role these play in digital image formation and resolution.
* Spatial aliasing is a long subject and there's information out there, so Google for it. Or look in your book -- it really ought to be there.
Given an optical system with perfect resolution, an image will be projected onto the face of the sensor chip. If you know the lens characteristics (focal length, f number, distortion) and the distance to the target, then you can calculate exactly what this image should be just using geometry. At this point, there's no loss of signal because we're in Plato's land of perfect things. I mention it because you're trying to think in mm in object space, but ultimately you need to translate that to $\mu\mathrm m$ on the sensor.
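For instance, here's a quick thin-lens sketch of that object-space-to-sensor translation (the focal length, distance, and line spacing are assumed numbers, not values from the question):

```python
# Map an object-space line spacing onto the sensor, thin-lens approximation.
focal_length_mm = 50.0            # assumed lens
object_distance_mm = 2000.0       # assumed distance to the target
spacing_object_mm = 1.0           # a 1 lp/mm pattern in object space

magnification = focal_length_mm / (object_distance_mm - focal_length_mm)
spacing_sensor_um = spacing_object_mm * magnification * 1000.0
print(f"magnification = {magnification:.4f}, spacing on sensor = {spacing_sensor_um:.1f} um")
```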
The optical system (and to some extent the sensor) will blur the image. This is usually what people think of when they think of optical resolution: just imagine looking at a set of white and black lines and telling them apart depending on whether they're crisp or blurred. The spot that's formed when the system images a point source of light is called the "blur spot", and the size of the blur spot is an important system parameter.
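As a minimal sketch of what the blur spot does to a line-pair target (the blur size and target frequency below are assumed numbers): convolve a crisp square-wave pattern with a Gaussian blur spot and see how much contrast survives.

```python
import numpy as np

x = np.arange(2000) * 0.001                 # position in mm, 1 um steps
pattern = ((x // 0.05) % 2).astype(float)   # 10 lp/mm square-wave target

sigma_mm = 0.02                             # assumed Gaussian blur-spot size
kx = np.arange(-100, 101) * 0.001
kernel = np.exp(-0.5 * (kx / sigma_mm) ** 2)
kernel /= kernel.sum()

blurred = np.convolve(pattern, kernel, mode="same")
core = blurred[200:-200]                    # ignore convolution edge effects
contrast = (core.max() - core.min()) / (core.max() + core.min())
print(f"contrast after blur: {contrast:.2f}")   # 1.00 would be perfectly crisp
```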
The spatial resolving capability of an optical system refers to its ability to distinguish between closely spaced (and possibly very small) details. The spacing of a pair of lines gives a measure of how small the distance between them can be while the lines are still seen as separate. Resolution depends on several factors such as brightness level, color, and viewing environment.