Vision and imaging systems consist of several components that work together to provide a solution for the end user. For the system to function properly, all components must be correctly matched. The camera and its image sensor are critical components of a vision system, but if they are mismatched with the lens, the vision system will ultimately fail. In project work, time is of the essence, and lens selection is no exception: the earlier you select lenses and optics during the design of a vision or imaging system, the better.

General Standard Parameters
Image quality is the starting point when selecting a lens for your vision system. Engineers must first carefully determine the image quality their imaging system requires. The lens is a fairly complex analog component, and with the continuous development of sensors toward smaller pixels and larger active areas, its role in the imaging system is more important than ever. Therefore, we should allow adequate time for the lens selection process to ensure optimal performance.

Sensor size also helps determine the choice of lens and optics. Choose your camera first so that the sensor size is fixed, then establish the required field of view (FOV) and working distance. Together, FOV, working distance, and sensor size determine the focal length your lens needs. It is also crucial that the camera and lens combination provides the required depth of field (DOF) for your image.
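The relationship between sensor size, FOV, working distance, and focal length can be sketched with a thin-lens approximation. The function name and the example numbers here (a roughly 7.2 mm wide sensor, 100 mm FOV, 300 mm working distance) are illustrative assumptions, not values from the source:

```python
def focal_length_mm(working_distance_mm, sensor_width_mm, fov_width_mm):
    """Thin-lens estimate: magnification m = sensor / FOV, and with the
    working distance taken as the object distance, f = WD * m / (m + 1),
    i.e. f ~= WD * sensor / (FOV + sensor).  For WD >> f this is often
    simplified further to f ~= WD * sensor / FOV."""
    return working_distance_mm * sensor_width_mm / (fov_width_mm + sensor_width_mm)

# Hypothetical example: ~7.2 mm wide sensor, 100 mm FOV, 300 mm working distance
f = focal_length_mm(300, 7.2, 100)
print(round(f, 1))  # a lens near 20 mm focal length
```

In practice you would round to the nearest standard focal length (e.g. 16 mm or 25 mm) and recompute the actual FOV or working distance from there.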

Of course, as with any vision system implementation, the priorities depend on the application. For some applications, size and weight matter most; for others, it is resolution at all costs. Some applications have very strict requirements for distortion or color balance, so it is crucial to define the primary application first. While the exact criteria vary by application, some parameters apply almost universally. We call these the fundamental parameters of an imaging system.

First, FOV: the area of the object being inspected that the camera must image. If the end user cannot specify this, the FOV needs to be estimated; it is generally derived from the aspect ratio of the camera sensor and the object's size and position.

Second, working distance, which is the distance from the front of the lens to the object under inspection.

Third, resolution, the most frequently discussed parameter. It refers to the smallest resolvable feature size on the object, which depends on both the camera's resolution and the lens's ability to reproduce that detail (often characterized by the lens's modulation transfer function, MTF). Closely related is the camera's sensor size, which, combined with the working distance and FOV, determines the focal length required for the application. We also consider the size of the individual pixels on the sensor and their compatibility with the lens.
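A common back-of-the-envelope check for the sensor side of this resolution budget is the Nyquist sampling rule: a feature must span at least about two pixels in object space to be resolved. This is only a sketch and assumes the lens MTF is not the bottleneck; the helper name and numbers are illustrative:

```python
def min_resolvable_feature_mm(fov_mm, pixels, pixels_per_feature=2):
    """Object-space sampling limit: the smallest feature the sensor can
    resolve spans ~2 pixels (Nyquist), assuming the lens out-resolves
    the sensor.  Verify the lens MTF at the pixel frequency separately."""
    pixel_footprint = fov_mm / pixels  # one pixel projected onto the object
    return pixel_footprint * pixels_per_feature

# Hypothetical example: 100 mm FOV across a 2448-pixel-wide sensor
print(round(min_resolvable_feature_mm(100, 2448), 3))  # 0.082 (mm)
```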

Fourth, DOF, or depth of field, which describes how lens sharpness varies with distance: the object remains acceptably in focus only within a certain range of working distances. In some applications, such as inspecting flat objects, this parameter is not very important.
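A widely used first-order estimate of total DOF ties it to the working f-number, the magnification, and an acceptable circle of confusion (often taken as about one pixel). This is a rough sketch with assumed example values, not a substitute for the lens datasheet:

```python
def depth_of_field_mm(f_number, coc_mm, magnification):
    """First-order total depth of field for close-range imaging:
    DOF ~= 2 * N * c * (m + 1) / m^2
    N = working f-number, c = circle of confusion (mm), m = magnification."""
    return 2 * f_number * coc_mm * (magnification + 1) / magnification**2

# Hypothetical example: f/8, a 3.45 um pixel as the circle of confusion, 0.1x magnification
print(round(depth_of_field_mm(8, 0.00345, 0.1), 2))  # 6.07 (mm of usable depth)
```

Note the trade-off the formula makes explicit: stopping down (larger N) buys depth of field at the cost of light and, eventually, diffraction-limited sharpness.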

Fifth, the size of the target (image) surface: the camera sensor must match the lens's rated image format so that the lens's image circle covers the entire sensor; otherwise the captured image will show vignetting in the corners.
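This check reduces to comparing the lens's image circle against the sensor diagonal. The dimensions below use the nominal 2/3" format size (about 8.8 x 6.6 mm); the function names are illustrative:

```python
import math

def sensor_diagonal_mm(width_mm, height_mm):
    """Diagonal of the sensor's active area."""
    return math.hypot(width_mm, height_mm)

def lens_covers_sensor(image_circle_mm, width_mm, height_mm):
    """The lens image circle must be at least the sensor diagonal,
    or the corners of the image will vignette."""
    return image_circle_mm >= sensor_diagonal_mm(width_mm, height_mm)

# A 2/3" sensor is nominally 8.8 x 6.6 mm, i.e. an 11 mm diagonal
print(lens_covers_sensor(11.0, 8.8, 6.6))  # True: an 11 mm image circle just covers it
print(lens_covers_sensor(8.0, 8.8, 6.6))   # False: a smaller-format lens would vignette
```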

Sixth, the minimum focusing distance of the lens. Lenses with the same focal length can have different minimum focusing distances, so check whether the lens's minimum focusing distance is compatible with the working distance the application requires.

Seventh, the working spectrum of the lens. Determine the spectral band in which the system will operate (visible, near-infrared, ultraviolet, and so on) and select lenses that perform well under those lighting conditions.

Eighth, lens distortion. Distortion is an important issue in precision measurement, because geometric error in the image translates directly into error in the measurement; low-distortion lenses or software correction may be required.
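Distortion is conventionally quantified by comparing where a point actually lands in the image against where an ideal, distortion-free lens would place it. A minimal sketch of that percentage metric, with hypothetical numbers:

```python
def distortion_percent(actual_height_mm, predicted_height_mm):
    """Geometric distortion in percent: (AD - PD) / PD * 100, where AD is
    the measured image height of a field point and PD the height predicted
    by an ideal lens.  Negative values indicate barrel distortion,
    positive values pincushion distortion."""
    return (actual_height_mm - predicted_height_mm) / predicted_height_mm * 100

# Hypothetical example: a corner point lands at 4.90 mm instead of the ideal 5.00 mm
print(round(distortion_percent(4.90, 5.00), 2))  # -2.0 (% barrel distortion)
```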

Ninth, relative illumination. When imaging a wide field of view, the edges of the image are generally darker than the center. A lens with good relative illumination keeps the brightness of the edges close to that of the center.
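The natural falloff behind this effect is often approximated by the cosine-fourth law for a simple lens; real lens designs can do considerably better (or worse), and datasheets usually plot measured relative illumination across the image circle. A sketch of the idealized model:

```python
import math

def cos4_relative_illumination(field_angle_deg):
    """Cosine-fourth approximation of illumination falloff for a simple
    lens: I(theta) / I(0) = cos^4(theta).  Treat this only as a lower
    bound on what a well-corrected lens can achieve."""
    return math.cos(math.radians(field_angle_deg)) ** 4

# At 30 degrees off-axis, a simple lens passes only about half the center brightness
print(round(cos4_relative_illumination(30), 2))  # 0.56
```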