A CMOS image sensor is a semiconductor device that converts light into electricity via the photoelectric effect, but on its own it simply turns light into an electrical signal without any further processing. To produce red, green, and blue from it, light must first pass through colored filters, and the light passing through each filter is then converted into electricity separately. The filters used for this purpose are called color filters.

Color filter arrays come in several varieties: the Bayer array of red, green, and blue (named after Dr. Bayer of Kodak, who devised it); the CMYe array composed of cyan, magenta, and yellow; the RGBC array, which adds a transparent pixel (Clear or White) to the red, green, and blue of the Bayer array; diagonally arranged double Bayer arrays; and so on, with the choice depending on the application and purpose.

Incidentally, so-called monochrome CMOS image sensors tend to be thought of as having no color filters, but they in fact carry transparent filters in place of the colored ones, so their structure is the same as that of sensors with color filters.
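
To make the Bayer arrangement concrete, here is a minimal Python sketch that splits a RAW Bayer mosaic into its red, green, and blue samples. It assumes an RGGB tile order, which is only one of several orders used by real sensors, and uses a dummy 4x4 array in place of real RAW data:

    import numpy as np

    # Assume an RGGB Bayer mosaic: every 2x2 tile is [[R, G], [G, B]].
    raw = np.arange(16, dtype=np.uint16).reshape(4, 4)  # stand-in RAW frame

    r  = raw[0::2, 0::2]   # red samples
    g1 = raw[0::2, 1::2]   # green samples on the red rows
    g2 = raw[1::2, 0::2]   # green samples on the blue rows
    b  = raw[1::2, 1::2]   # blue samples

    # A full demosaic would interpolate the two missing colors at every
    # pixel; this only shows why green has twice as many samples as red
    # or blue in a Bayer array.
    print(r.shape, g1.shape, g2.shape, b.shape)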

In recent years, it has become commonplace for the CMOS image sensors used in digital cameras to also appear in mobile phones, smartphones, and even automobiles. Yet the terms used around CMOS image sensors, commonplace as the devices have become, are highly technical and, to put it bluntly, rather geeky, so much about them can be hard to follow. Here, basic terms often used with CMOS image sensors are briefly explained together with illustrative diagrams.

Frame rate is an indicator of how many images can be output per second. Many recent image sensors are equipped with serial interfaces such as SLVS-EC and MIPI, and the data rate, an indicator of how fast data can be output, is closely related to the frame rate. As a concrete calculation, consider sending a 1280x1024 image (1280 pixels horizontally, 1024 lines vertically) as 12-bit RAW data. One frame amounts to 1280 x 1024 x 12 bit = 15,728,640 bits. Sent at 30 fps, that is 15,728,640 bit x 30 fps = 471,859,200 bit/s, roughly 472 Mbit/s. Since this amount of data must be transmitted every second, the I/F must be specified to withstand this transmission capacity, for example a speed of 600 Mbps.
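
The calculation above is plain arithmetic and can be written out directly in Python, using the same figures as the example:

    width, height = 1280, 1024   # pixels per line, lines per frame
    bits_per_pixel = 12          # 12-bit RAW
    fps = 30

    bits_per_frame = width * height * bits_per_pixel   # 15,728,640 bits
    bits_per_second = bits_per_frame * fps             # 471,859,200 bit/s

    print(f"{bits_per_frame:,} bits per frame")
    print(f"{bits_per_second / 1e6:.1f} Mbit/s required")   # ~471.9 Mbit/s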

CMOS image sensors exploded in popularity some 20 years ago, when cameras were first installed in mobile phones, and their applications have continued to increase ever since. I hope that this article helps you deepen your understanding of CMOS image sensors.

As shown in the left figure below, the structure called front-side illumination stacks microlenses, color filters, the wiring layer, and the substrate (photodiodes) in that order from the top. The disadvantage of this structure lies in the wiring layer: light that has passed through the microlens must also pass through the wiring layer before entering the photodiode, so some of the light is inevitably lost there, and the disadvantage becomes more pronounced as the wiring becomes finer. Back-side illumination overcomes this. As shown in the right figure below, the microlenses and color filters are the same as in front-side illumination, but the substrate (photodiodes) sits directly below the color filters, with the wiring layer underneath. Most recently manufactured image sensors have a back-illuminated structure; ONSEMI's AR0233 and AR0521 image sensors, for example, are back-illuminated.

The output I/F represents how an image sensor outputs its data. There are generally two types of output, parallel and serial, and serial output further includes methods such as LVDS and MIPI. The data rate shown earlier is a figure that concerns the serial output.
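
As a rough sketch of how the required data rate maps onto a serial output, the snippet below estimates how many lanes a stream would occupy. The 800 Mbit/s-per-lane figure is an assumed illustrative value, not the specification of any particular interface:

    import math

    required_mbps = 471.86        # from the frame rate example above
    assumed_lane_mbps = 800.0     # hypothetical per-lane capacity

    lanes = math.ceil(required_mbps / assumed_lane_mbps)
    print(f"{lanes} lane(s) needed")   # 1 lane at these assumed rates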

As mentioned above, a CMOS image sensor has fine microlenses arranged in an array, and light is delivered to the light-receiving part (the photodiode) through these microlenses. A microlens may sit straight above its photodiode or be offset so that light arrives at a certain angle; this angle is called the chief ray angle (CRA). The advantage of offsetting the microlenses relative to the photodiodes is that the focal length can be shortened, which in turn allows a smaller camera module.
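
The geometry behind the CRA can be sketched in a couple of lines: if the chief ray arrives at an angle, the microlens must be shifted sideways by the stack height times the tangent of that angle for the light to still land on the photodiode. Both numbers below are assumed illustrative values, not figures for any real sensor:

    import math

    cra_deg = 25.0    # assumed chief ray angle at the edge of the array
    stack_um = 3.0    # assumed stack height from microlens to photodiode

    # Lateral microlens shift that keeps the tilted chief ray on target.
    shift_um = stack_um * math.tan(math.radians(cra_deg))
    print(f"microlens shift ~= {shift_um:.2f} um")   # ~1.40 um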

Optical size is an important parameter expressing the physical size of the sensor; knowing the optical size makes it easy to picture how big the sensor is. For example, commercially available mirrorless interchangeable-lens cameras in "full size" (the 35mm or Leica format, a holdover from the film camera era) use a 24 x 36 mm sensor. There are also other optical sizes such as 1 inch and 1/3 inch, which indicate the size in the diagonal direction; note that these inch designations are nominal, inherited from camera-tube conventions, so a "1 inch" sensor's diagonal is about 16 mm rather than 25.4 mm. The lenses that can be selected also change depending on this optical size.
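
Since the inch-based optical sizes refer to the diagonal, the diagonal is easy to compute from a sensor's width and height. A minimal sketch, using the commonly quoted 4.8 x 3.6 mm dimensions for the 1/3-inch type:

    import math

    def diagonal_mm(width_mm, height_mm):
        # Sensor diagonal from its width and height.
        return math.hypot(width_mm, height_mm)

    print(f"full frame: {diagonal_mm(36.0, 24.0):.1f} mm")        # ~43.3 mm
    print(f"1/3-inch type: {diagonal_mm(4.8, 3.6):.1f} mm")       # ~6.0 mm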