An image sensor includes a first element separation film inside a substrate and having a mesh shape, pixel regions on the substrate defined by the first element separation film and including at least first and second pixel regions, and a second element separation film inside the substrate and partitioning the first pixel region into sub-pixel regions, the second element separation film not being in the second pixel region, wherein the first pixel region includes first photoelectric conversion elements, and a first color filter on the first photoelectric conversion elements, the first color filter being one of white, green, and blue color filters, and wherein the second pixel region includes second photoelectric conversion elements, and a second color filter on the second photoelectric conversion elements, the second color filter being different from the first color filter and one of red and white color filters.
1. An image sensor, comprising:
a substrate;
a first element separation film inside the substrate, the first element separation film having a mesh shape;
pixel regions on the substrate defined by the first element separation film, the pixel regions including at least a first pixel region and a second pixel region; and
a second element separation film inside the substrate, the second element separation film partitioning the first pixel region into sub-pixel regions, and the second element separation film not being in the second pixel region,
wherein the first pixel region includes:
first photoelectric conversion elements, and
a first color filter on the first photoelectric conversion elements, the first color filter being one of a white color filter, a green color filter, and a blue color filter, and
wherein the second pixel region includes:
second photoelectric conversion elements, and
a second color filter on the second photoelectric conversion elements, the second color filter being one of a red color filter and the white color filter, and the second color filter being different from the first color filter.

2. The image sensor as claimed in

3. The image sensor as claimed in the pixel regions further include a third pixel region defined by the first element separation film, the third pixel region including third photoelectric conversion elements, and a third color filter on the third photoelectric conversion elements, the third pixel region is in contact with the second pixel region, and the third pixel region is not in contact with the first pixel region.

4. The image sensor as claimed in the third color filter is one of the white color filter, the green color filter, and the blue color filter, and the image sensor further comprises a third element separation film inside the substrate, the third element separation film partitioning the third pixel region into sub-pixel regions.

5. The image sensor as claimed in the third color filter is one of the red color filter and the white color filter, and the second element separation film is not placed in the third pixel region.

6. The image sensor as claimed in

7. The image sensor as claimed in

8. The image sensor as claimed in

9. The image sensor as claimed in

10. The image sensor as claimed in

11. The image sensor as claimed in

12. The image sensor as claimed in

13. The image sensor as claimed in

14. The image sensor as claimed in

15. An image sensor, comprising:
a substrate including a first pixel region and a second pixel region;
first photoelectric conversion elements inside the substrate in the first pixel region;
second photoelectric conversion elements inside the substrate in the second pixel region;
a boundary separation film at a boundary between the first pixel region and the second pixel region;
a pattern film inside a portion of the substrate of the first pixel region, the pattern film not being in the second pixel region;
a first color filter on the first pixel region of the substrate, the first color filter being one of a white color filter, a green color filter, and a blue color filter; and
a second color filter on the second pixel region of the substrate, the second color filter being a red color filter.

16. The image sensor as claimed in

17. The image sensor as claimed in

18. The image sensor as claimed in

19. The image sensor as claimed in

20. An image sensing system, comprising:
an image sensor configured to output an image signal; and
an image signal processor connected to the image sensor, the image signal processor being configured to receive and process the image signal,
wherein the image sensor includes
a substrate,
a first element separation film inside the substrate and having a mesh shape,
pixel regions defined by the first element separation film, the pixel regions including at least a first pixel region, a second pixel region, and a third pixel region,
a second element separation film inside the substrate, the second element separation film partitioning the first pixel region into sub-pixel regions, and the second element separation film not being in the second pixel region, and
a third element separation film inside the substrate, the third element separation film partitioning the third pixel region into sub-pixel regions, and the third element separation film not being in the second pixel region,
wherein:
the second pixel region is adjacent to the first pixel region with the first element separation film as a boundary therebetween, and the second pixel region is adjacent to the third pixel region with the first element separation film as a boundary therebetween,
the first pixel region includes first photoelectric conversion elements, and a first color filter on the first photoelectric conversion elements,
the second pixel region includes second photoelectric conversion elements, and a second color filter on the second photoelectric conversion elements,
the third pixel region includes third photoelectric conversion elements, and a third color filter on the third photoelectric conversion elements, and
the second color filter is different from the first and third color filters, the first color filter being one of a white color filter, a green color filter, and a blue color filter, the second color filter being one of a red color filter and the white color filter, and the third color filter being one of the white color filter, the green color filter, and the blue color filter.
Korean Patent Application No. 10-2020-0160706, filed on Nov. 26, 2020, in the Korean Intellectual Property Office and entitled "Image Sensor and Image Sensing System," is incorporated by reference herein in its entirety.

The present disclosure relates to an image sensor and an image sensing system.

An image sensing device is a semiconductor element that converts optical information into electric signals. Such image sensing devices include the charge coupled device (CCD) image sensing device and the complementary metal-oxide semiconductor (CMOS) image sensing device. The CMOS image sensor may be abbreviated as CIS. The CIS may be equipped with a plurality of pixels arranged two-dimensionally. Each of the pixels may include, e.g., a photo diode, which converts incident light into electrical signals. Recently, with the development of the computer and telecommunications industries, demand for image sensors with improved performance is increasing in various fields, e.g., digital cameras, video cameras, smart phones, game consoles, security cameras, medical micro cameras, and robots.
According to an embodiment of the present disclosure, the image sensor includes a first element separation film placed inside a substrate in a form of a mesh, a plurality of pixel regions defined by the first element separation film, and including a first pixel region and a second pixel region, and a second element separation film which is disposed inside the substrate and partitions the first pixel region into a plurality of sub-pixel regions, wherein the first pixel region includes a plurality of first photoelectric conversion elements, and a first color filter on the plurality of first photoelectric conversion elements, the second pixel region includes a plurality of second photoelectric conversion elements, and a second color filter on the plurality of second photoelectric conversion elements, the second color filter is different from the first color filter, the first color filter is one of a white color filter, a green color filter, and a blue color filter, the second color filter is one of a red color filter and a white color filter, and the second element separation film is not placed inside the substrate of the second pixel region. 
According to the aforementioned and other embodiments of the present disclosure, an image sensor includes a substrate including a first pixel region and a second pixel region, a plurality of first photoelectric conversion elements formed inside the substrate in the first pixel region, a plurality of second photoelectric conversion elements formed inside the substrate in the second pixel region, a boundary separation film which defines a boundary between the first and second pixel regions, a pattern film formed inside the substrate of the first pixel region, a first color filter which is formed on the first pixel region of the substrate, and a second color filter which is formed on the second pixel region of the substrate and different from the first color filter, wherein the first color filter is one of a white color filter, a green color filter, and a blue color filter, the second color filter is a red color filter, and the pattern film is not placed inside the substrate of the second pixel region. 
According to the aforementioned and other embodiments of the present disclosure, the image sensing system includes an image sensor which outputs an image signal, and an image signal processor which is connected to the image sensor and receives and processes the image signal, wherein the image sensor includes a substrate, a first element separation film placed inside the substrate in a form of a mesh, a plurality of pixel regions defined by the first element separation film, and including a first pixel region, a second pixel region, and a third pixel region, a second element separation film which is disposed inside the substrate and partitions the first pixel region into a plurality of sub-pixel regions, and a third element separation film which is disposed inside the substrate and partitions the third pixel region into a plurality of sub-pixel regions, the second pixel region is in contact with the first pixel region with the first element separation film as a boundary, and is in contact with the third pixel region with the first element separation film as a boundary, the first pixel region includes a plurality of first photoelectric conversion elements, and a first color filter on the plurality of first photoelectric conversion elements, the second pixel region includes a plurality of second photoelectric conversion elements, and a second color filter on the plurality of second photoelectric conversion elements, the third pixel region includes a plurality of third photoelectric conversion elements, and a third color filter on the plurality of third photoelectric conversion elements, the second color filter is different from the first and third color filters, the first color filter is one of a white color filter, a green color filter, and a blue color filter, the second color filter is one of a red color filter and a white color filter, the third color filter is one of the white color filter, the green color filter, and the blue color filter, and the second and 
third element separation films are not placed inside the substrate of the second pixel region.

Features will become apparent to those of skill in the art by describing in detail exemplary embodiments with reference to the attached drawings, in which:

An image sensing system 1 including an image sensor 100 with a pixel array PA will be described below referring to

Referring to The image sensor 100 may generate an image signal IS by sensing an image to be sensed, using incident light. In some embodiments, the generated image signal IS may be, e.g., a digital signal, although the embodiments are not limited thereto. The image signal IS may be provided to the application processor 180 and processed. That is, the image signal IS may be provided to an image signal processor (ISP) 181 included in the application processor 180 and processed. The ISP 181 may process or treat the image signal IS so that it is easily displayed.

In some embodiments, the image sensor 100 and the application processor 180 may be placed separately as shown. For example, the image sensor 100 may be mounted on a first chip, the application processor 180 may be mounted on a second chip, and they may communicate with each other through an interface. However, example embodiments are not limited thereto, e.g., the image sensor 100 and the application processor 180 may be implemented as a single package (e.g., a multi-chip package (MCP)).

The image sensor 100 may include a control register block 110, a timing generator 120, a row driver 130, the pixel array PA, a readout circuit 150, a ramp signal generator 160, and a buffer 170. The control register block 110 may generally control the operation of the image sensor 100. In particular, the control register block 110 may directly transmit an operating signal to the timing generator 120, the ramp signal generator 160, and the buffer 170.
The timing generator 120 may generate a signal that serves as a reference for the operation timing of various components of the image sensor 100. The operation timing reference signal generated by the timing generator 120 may be transferred to the row driver 130, the readout circuit 150, the ramp signal generator 160, and the like.

The ramp signal generator 160 may generate and transmit the ramp signal to be used in the readout circuit 150. For example, the readout circuit 150 may include a correlated double sampler (CDS), a comparator, and the like. The ramp signal generator 160 may generate and transmit the ramp signal to be used in the CDS, the comparator, or the like.

The buffer 170 may include, e.g., a latch. The buffer 170 may temporarily store the image signal IS to be provided to the outside, and may transmit the image signal IS to an external memory or an external device.

The pixel array PA may sense external images. The pixel array PA may include a plurality of pixels (or unit pixels). The row driver 130 may selectively activate a row of the pixel array PA. The readout circuit 150 may sample the pixel signal provided from the pixel array PA, compare the pixel signal with the ramp signal, and then convert the analog image signal (data) into a digital image signal (data) on the basis of the result of the comparison.

Referring to Although not shown in the drawing, a third region in which the memory is placed may be placed below the second region S2. At this time, the memory placed in the third region may receive the image data from the first region S1 and the second region S2, store or process the image data, and transmit the image data to the first region S1 and the second region S2 again. In this case, the memory may include a memory element, e.g., a dynamic random access memory (DRAM) element, a static random access memory (SRAM) element, a spin transfer torque magnetic random access memory (STT-MRAM) element, or a flash memory element.
When the memory includes, e.g., the DRAM element, the memory may receive the image data at a relatively high speed and process the image data. Also, in some embodiments, the memory may be placed in the second region S2.

The first region S1 may include the pixel array PA and a first peripheral region PH1, and the second region S2 may include a logic circuit region LC and a second peripheral region PH2. The first region S1 and the second region S2 may be sequentially stacked and placed one above the other.

In the first region S1, the pixel array PA may be the same as the pixel array PA described referring to The first peripheral region PH1 may include a plurality of pads, and may be placed around the pixel array PA. The plurality of pads may transmit and receive electrical signals from an external device or the like.

In the second region S2, the logic circuit region LC may include electronic elements including a plurality of transistors. The electronic elements included in the logic circuit region LC are electrically connected to the pixel array PA to provide a constant signal to each unit pixel of the pixel array PA or to control the output signal. For example, the control register block 110, the timing generator 120, the row driver 130, the readout circuit 150, the ramp signal generator 160, the buffer 170, and the like described referring to Although a second peripheral region PH2 may also be placed in a region of the second region S2 which corresponds to the first peripheral region PH1 of the first region S1, the embodiments are not limited thereto.

Referring to Each pixel region PX may include a micro lens ML. Each micro lens ML may be placed above each pixel region PX. That is, when viewed from above, the micro lens ML may be placed on an upper surface of the pixel array PA. Each micro lens ML may correspond to each pixel region PX, e.g., in a one-to-one correspondence.
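The ramp-compare conversion performed by the readout circuit 150 described above can be illustrated with a small behavioral sketch. This is a minimal model of single-slope conversion with correlated double sampling, not the actual circuit of the disclosure; the millivolt units, ramp range, and step size are illustrative assumptions.

```python
# Behavioral sketch of ramp-compare readout with correlated double sampling.
# Levels are integers in millivolts and the ramp falls one step per clock;
# the units and ramp parameters are illustrative assumptions.

def count_until_cross(level_mv, ramp_start_mv=1000, ramp_step_mv=1, max_counts=4096):
    """Count clock cycles until the falling ramp crosses the sampled level."""
    for count in range(max_counts):
        ramp_mv = ramp_start_mv - count * ramp_step_mv
        if ramp_mv <= level_mv:  # the comparator output flips here
            return count
    return max_counts

def cds_convert(reset_level_mv, signal_level_mv):
    """Digitize the reset/signal difference; a common offset cancels out."""
    return count_until_cross(signal_level_mv) - count_until_cross(reset_level_mv)

# Two pixels with the same illumination but different fixed offsets
# produce the same digital code, which is the point of CDS.
print(cds_convert(900, 500))  # 400
print(cds_convert(800, 400))  # 400
```

Comparing the count difference rather than a single crossing is what suppresses the fixed-pattern offset that would otherwise differ from pixel to pixel.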
Referring to Although two pixel arrays PA are shown in

In the pixel array PA, each of the first to twelfth white pixel regions W1 to W12 may be surrounded by the first to sixth green pixel regions G1 to G6, the first to fourth red pixel regions R1 to R4, and the first and second blue pixel regions B1 and B2. In addition, in the pixel array PA, each of the first to sixth green pixel regions G1 to G6, the first to fourth red pixel regions R1 to R4, and the first and second blue pixel regions B1 and B2 may be surrounded by the first to twelfth white pixel regions W1 to W12. That is, each pixel region may be defined in a mesh-shaped pixel array PA.

For example, the third white pixel region W3 may be surrounded by the first and second red pixel regions R1 and R2, and the second and third green pixel regions G2 and G3. That is, the third white pixel region W3 may share a boundary with the first and second red pixel regions R1 and R2, and the second and third green pixel regions G2 and G3. Also, for example, the third green pixel region G3 may be surrounded by the third, fifth, sixth and seventh white pixel regions W3, W5, W6 and W7. That is, the third green pixel region G3 may share a boundary with the third, fifth, sixth and seventh white pixel regions W3, W5, W6 and W7.

Each pixel region may be defined by a first element separation film 222. The first element separation film 222 may be placed in the form of a mesh, e.g., the solid grid lines in

Each pixel region PX may include a plurality of photoelectric conversion elements. For example, each pixel region PX may include a first photoelectric conversion element PD1 and a second photoelectric conversion element PD2. The first and second photoelectric conversion elements PD1 and PD2 may be placed side by side in the first direction X within each pixel region PX.
However, example embodiments are not limited thereto, e.g., the pixel region PX may include only one photoelectric conversion element or may include four photoelectric conversion elements.

Referring to The pixel array PA according to some embodiments may include a semiconductor substrate 220 having a first surface BS and a second surface FS opposite to each other. For example, the semiconductor substrate 220 may include silicon, germanium, silicon-germanium, a group IV compound semiconductor, a group III-V compound semiconductor, and the like. The semiconductor substrate 220 may be a silicon substrate into which P-type or N-type impurities are injected. Hereinafter, an example in which P-type impurities are injected into the semiconductor substrate 220 will be described. Also, the semiconductor substrate 220 may include a floating diffusion node region doped with N-type impurities.

The pixel region PX may include a plurality of photoelectric conversion elements PD1 and PD2, color filters 231, 232 and 233, antireflection films 271, 272 and 273, and the like. The plurality of photoelectric conversion elements PD1 and PD2 may be placed in, e.g., inside, the semiconductor substrate 220. For example, the plurality of photoelectric conversion elements PD1 and PD2 may not be exposed to the first surface BS and the second surface FS of the semiconductor substrate 220, e.g., the photoelectric conversion elements PD1 and PD2 may be vertically spaced apart from each of the first and second surfaces BS and FS of the semiconductor substrate 220.

Each of the photoelectric conversion elements PD1 and PD2 may be formed by a PN junction. The photoelectric conversion elements PD1 and PD2 may include impurities having a conductivity type opposite to that of the semiconductor substrate 220. For example, the photoelectric conversion elements PD1 and PD2 may be formed by ion-implantation of N-type impurities into the semiconductor substrate 220.
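The mesh-like arrangement described earlier, in which every white pixel region borders only color pixel regions and every color pixel region borders only white ones, amounts to a checkerboard of white and color sites. The sketch below builds such a grid and checks that neighbor property; the specific cycling order of the R, G, and B sites is an illustrative assumption, not the layout from the drawings.

```python
# Sketch of the mesh-shaped white/color checkerboard described above.
# The repeating order of the R/G/B sites is an illustrative assumption.

def build_pattern(rows, cols):
    colors = "RGBG"  # assumed cycling order for the non-white sites
    grid, k = [], 0
    for y in range(rows):
        row = []
        for x in range(cols):
            if (x + y) % 2 == 0:
                row.append("W")  # white pixel regions sit on one parity
            else:
                row.append(colors[k % len(colors)])
                k += 1
        grid.append(row)
    return grid

def neighbors(grid, x, y):
    """Yield the edge-sharing neighbors of cell (x, y)."""
    rows, cols = len(grid), len(grid[0])
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx < cols and 0 <= y + dy < rows:
            yield grid[y + dy][x + dx]

grid = build_pattern(6, 6)
# Every white region shares boundaries only with color regions and vice
# versa -- the property the text describes as a mesh-shaped definition.
for y in range(6):
    for x in range(6):
        for n in neighbors(grid, x, y):
            assert (grid[y][x] == "W") != (n == "W")
```

The checkerboard parity check is exactly the "W3 surrounded by R1, R2, G2, G3" and "G3 surrounded by W3, W5, W6, W7" examples from the text, generalized to the whole array.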
The photoelectric conversion elements PD1 and PD2 may be formed in a form in which a plurality of doping regions are stacked.

The mesh-shaped first element separation film 222 may be formed in the semiconductor substrate 220. The first element separation film 222 may be formed to penetrate, e.g., an entire thickness of, the semiconductor substrate 220. That is, the first element separation film 222 may be exposed to, e.g., in direct contact with, the first surface BS and the second surface FS. However, example embodiments are not limited thereto, e.g., the first element separation film 222 may be exposed only to the first surface BS and may not be exposed to the second surface FS.

A second element separation film 224 The second element separation film 224 The second element separation films 224 The first element separation film 222 and the second element separation films 224

The first element separation film 222 may prevent a charge transfer between adjacent pixel regions PX to prevent an electrical crosstalk between the adjacent pixel regions PX. In addition, the first element separation film 222 may refract light obliquely incident on the pixel region PX to prevent an optical crosstalk that may occur when light penetrates the adjacent pixel region PX. The second element separation films 224 The second element separation films 224

Color filters 231, 232 and 233, and the micro lens ML may be placed on the first surface BS of the semiconductor substrate 220. A white color filter 231 and the micro lens ML may be placed on the first surface BS in the ninth white pixel region W9. A red color filter 232 and the micro lens ML may be placed on the first surface BS in the third red pixel region R3. A white color filter 233 and the micro lens ML may be placed on the first surface BS in the tenth white pixel region W10. The color filters 231, 232 and 233 may select and transmit light of different colors.
For example, the white color filters 231 and 233 are transparent and may transmit light of all wavelengths. The red color filter 232 may transmit red light. Although not shown, a blue color filter may transmit blue light, and a green color filter may transmit green light. That is, the wavelength of the light transmitted through the red color filter 232 may be longer than the wavelength of the light transmitted through the other color filters.

Antireflection films 271, 272 and 273 may be formed on an insulating layer 240. The antireflection films 271, 272 and 273 may vertically overlap the first element separation film 222. For example, the antireflection film 271 may be placed at the edge of the ninth white pixel region W9, the antireflection film 272 at the edge of the third red pixel region R3, and the antireflection film 273 at the edge of the tenth white pixel region W10. The thickness of the antireflection films 271, 272 and 273 may be smaller than the thickness of the color filters 231, 232 and 233, e.g., along the third direction Z.

The antireflection films 271, 272 and 273 may prevent photons reflected or scattered at an interface between the color filters 231, 232 and 233 and the insulating layer 240 from moving to other sensing regions. The antireflection films 271, 272 and 273 may also prevent the incident light passing through the color filters 231, 232 and 233 from being reflected or scattered to the side surfaces. The antireflection films 271, 272 and 273 may include metals, e.g., at least one of tungsten (W), aluminum (Al), and copper (Cu).

The wiring layer 210 may be placed on the second surface FS of the semiconductor substrate 220. The wiring layer 210 may include a plurality of transistors of the pixel region PX, and a plurality of wires connected to the transistors.
The wiring layer 210 is electrically connected to the first and second photoelectric conversion elements PD1 and PD2 and may receive analog signals. The insulating layer 240 may be placed between the first surface BS of the semiconductor substrate 220 and the color filters 231, 232 and 233. The insulating layer 240 may prevent the incident light from being reflected and may efficiently transmit the incident light, thereby improving the performance of the image sensor 100.

Referring to Referring to Here, the third red pixel region R3 may be defined by the first element separation film 222 The ninth white pixel region W9 may be in contact with, e.g., immediately adjacent to, the third red pixel region R3 with the first element separation film 222 The second element separation film 224 The second element separation films 224 A third element separation film 226 The fourth element separation film 228

The element separation film that partitions each pixel region into two sub-pixel regions may not be placed in the third red pixel region R3 and the fourth red pixel region R4. That is, element separation films such as the second element separation films 224 For example, referring to

Referring to Here, since red light, which has a long wavelength, is refracted more than blue light, which has a short wavelength, the crosstalk occurring in the first red pixel region R1 may be greater than the crosstalk occurring in the first green pixel region G1, the third green pixel region G3, and the first blue pixel region B1. As a result, the crosstalk occurring in the second white pixel region W2 may be greater than the crosstalk occurring in the sixth white pixel region W6. That is, since the crosstalk occurring in the second white pixel region W2 differs from the crosstalk occurring in the sixth white pixel region W6, even though both correspond to the same white channel, there may be a problem of a difference in sensitivity.
In contrast, as shown in As a result, there may be no difference between the optical crosstalk on the second white pixel region W2 and the optical crosstalk on the sixth white pixel region W6. That is, the optical crosstalk on the second white pixel region W2 and the optical crosstalk on the sixth white pixel region W6 may be substantially the same. Accordingly, the difference in sensitivity between the second white pixel region W2 and the sixth white pixel region W6 may not occur, and the image quality of the image signal IS that is output from the pixel array PA1 can be further improved.

A pixel array PA2 according to some other embodiments will be described below referring to

Referring to Second element separation films 224 The element separation film that partitions each pixel region into two sub-pixel regions may not be placed in the third and fourth red pixel regions R3 and R4. Also, the element separation film that partitions each pixel region into two sub-pixel regions may not be placed in the first and second blue pixel regions B1 and B2. That is, the fourth element separation film 228

Referring to The element separation film that partitions each pixel region into two sub-pixel regions may not be placed in the third and fourth red pixel regions R3 and R4. Also, the element separation film that partitions each pixel region into two sub-pixel regions may not be placed in the first and second blue pixel regions B1 and B2. That is, the fourth element separation film 228 Also, the element separation film that partitions each pixel region into two sub-pixel regions may not be placed in the third to sixth green pixel regions G3 to G6.
That is, the third element separation films 226

Referring to The second element separation films 224

Referring to The second element separation films 224

Referring to

A pixel array PA3 according to some other embodiments will be described below referring to

Referring to The third element separation film 236 The fourth element separation films 238 The element separation film that partitions each pixel region into multiple sub-pixel regions may not be placed in the third and fourth red pixel regions R3 and R4. Since the element separation film is not placed in the third and fourth red pixel regions R3 and R4, the image quality of the image signal IS of the image sensor 100 can be further improved.

Referring to The element separation film that partitions each pixel region into multiple sub-pixel regions may not be placed in the third and fourth red pixel regions R3 and R4. In addition, the fourth element separation films 238

Referring to The element separation film that partitions each pixel region into multiple sub-pixel regions may not be placed in the third and fourth red pixel regions R3 and R4. In addition, the fourth element separation films 238

Referring to The second element separation films 234

Referring to The second element separation films 234

Referring to

A pixel array PA4 according to some other embodiments will be described below referring to

Referring to The third element separation films 246 The element separation film that partitions each pixel region into multiple sub-pixel regions may not be placed in the third and fourth red pixel regions R3 and R4. Since the element separation film is not placed in the third and fourth red pixel regions R3 and R4, the image quality of the image signal IS of the image sensor 100 can be further improved.
Referring to Referring to The second element separation films 224

Referring to The element separation film that partitions each pixel region into multiple sub-pixel regions may not be placed in the third and fourth red pixel regions R3 and R4. In addition, the fourth element separation films 228

Hereinafter, the structure and operation of the pixel array PA and the pixel region PX according to some embodiments will be described referring to

Referring to The power supply voltage line VDL, the transfer signal lines TGL1, TGL2 and TGL3, the reset signal line RGL, and the selection signal line SELL may generally extend in the first direction X. The output signal lines RL1 and RL2 may extend in the second direction Y.

The power supply voltage line VDL transfers a constant power supply voltage, and the plurality of transfer signal lines TGL1, TGL2 and TGL3 placed in one row independently transfer first through third transfer signals, respectively, to transfer the charge generated by the photoelectric conversion elements PD1 and PD2 of the pixel region PX to the readout element. The reset signal line RGL may transfer a reset signal for resetting a pixel, and the selection signal line SELL may transfer a selection signal instructing the row selection. The first through third transfer signals, the reset signal, and the selection signal may be output from the row driver 130 described above. The row driver 130 may output the first through third transfer signals, the reset signal, and the selection signal for each row sequentially or non-sequentially.

In some embodiments, the ninth white pixel region W9 may be connected to two transfer signal lines TGL1 and TGL2, and the third red pixel region R3 may be connected to one transfer signal line TGL3. The transfer signal line TGL3 may be a transfer signal line different from the two transfer signal lines TGL1 and TGL2.
The pixel regions PX arranged in the same row may be connected to the same reset signal line RGL and the same selection signal line SELL.

Referring to Each of the first and second photoelectric conversion elements PD1 and PD2 may be a photo diode with an anode connected to a common voltage VSS. The cathodes of the photo diodes may be connected to the first and second transfer transistors TX1 and TX2, respectively. The charge generated when the first and second photoelectric conversion elements PD1 and PD2 receive light may be transmitted to the floating diffusion node FD through the first and second transfer transistors TX1 and TX2.

Gates of the first and second transfer transistors TX1 and TX2 may be connected to the transfer signal lines TGL1, TGL2 and TGL3 to receive the first through third transfer signals. For example, as described above, the gates of the first and second transfer transistors TX1 and TX2 of the ninth white pixel region W9 may be connected to different transfer signal lines TGL1 and TGL2. Further, gates of the first and second transfer transistors TX1 and TX2 of the third red pixel region R3 may be connected to the same transfer signal line TGL3.

Therefore, the charges generated by each of the first and second photoelectric conversion elements PD1 and PD2 of the ninth white pixel region W9 may be transmitted to the floating diffusion node FD through the first and second transfer transistors TX1 and TX2, which are turned on at different times from each other. In contrast, the charges generated by each of the first and second photoelectric conversion elements PD1 and PD2 of the third red pixel region R3 may be transmitted together to the floating diffusion node FD through the first and second transfer transistors TX1 and TX2, which are turned on at the same time as each other.
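The contrast above between the independently gated white pixel and the commonly gated red pixel can be sketched as a minimal simulation. This is an illustrative model, not the disclosed circuit; the function name and charge values are assumptions.

```python
def read_pixel(pd_charges, shared_transfer_gate):
    """Model charge transfer from two photodiodes (PD1, PD2) to the
    floating diffusion node FD.

    shared_transfer_gate=True models the third red pixel region R3, whose
    two transfer transistors share one transfer signal line (TGL3) and turn
    on together, so both charges arrive at FD in a single summed sample.
    shared_transfer_gate=False models the ninth white pixel region W9,
    whose transfer transistors are driven by separate lines (TGL1, TGL2)
    and turn on at different times, yielding one sample per photodiode."""
    if shared_transfer_gate:
        return [sum(pd_charges)]       # one binned readout at FD
    return list(pd_charges)            # one readout per photodiode

# White pixel W9: PD1 and PD2 read out at different times.
assert read_pixel([100, 120], shared_transfer_gate=False) == [100, 120]
# Red pixel R3: PD1 and PD2 transferred together and summed at FD.
assert read_pixel([100, 120], shared_transfer_gate=True) == [220]
```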
The floating diffusion node FD may store the transmitted charges in a cumulative manner, and the drive transistor DX may be controlled depending on the amount of charge stored in the floating diffusion node FD.

The gate of the reset transistor RX may be connected to the reset signal line RGL. The reset transistor RX may be controlled by the reset signal transferred on the reset signal line RGL to periodically reset the floating diffusion node FD to the power supply voltage.

The drive transistor DX may output a voltage that changes in response to the voltage of the floating diffusion node FD. The drive transistor DX may function as a source follower buffer amplifier in combination with a constant current source, and may generate a source-drain current that is proportional to the amount of charge applied to its gate.

A gate of the selection transistor SX is connected to the selection signal line SELL. The selection transistor SX, which is turned on according to the activation of the selection signal transferred on the selection signal line SELL, may output the current generated by the drive transistor DX to the output signal line RL1 as a pixel signal. The selection signal, which selects the row that outputs a pixel signal, may be applied sequentially or non-sequentially on a line basis.
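The readout chain just described can be summarized with a toy behavioral model. All numeric values (supply voltage, conversion gain) and class/method names are illustrative assumptions, not taken from the disclosure; the sketch only mirrors the stated roles of FD, RX, DX, and SX.

```python
VDD = 2.8  # power supply voltage (illustrative value)

class FourTransistorPixel:
    """Toy model of the readout chain described above: a floating
    diffusion node FD, a reset transistor RX, a drive (source-follower)
    transistor DX, and a selection transistor SX."""

    def __init__(self, conversion_gain=0.01):
        self.fd_voltage = VDD              # FD starts reset to the supply
        self.conversion_gain = conversion_gain

    def reset(self):
        # RX, driven by the reset signal on RGL, periodically resets FD
        # to the power supply voltage.
        self.fd_voltage = VDD

    def transfer(self, charge):
        # Transferred charge accumulates on FD, lowering its voltage in
        # proportion to the amount of charge.
        self.fd_voltage -= charge * self.conversion_gain

    def read(self, select):
        # DX follows the FD voltage; SX passes the result to the output
        # signal line only while the row's selection signal is active.
        return self.fd_voltage if select else None

px = FourTransistorPixel()
px.transfer(50)                           # charge from the photodiodes
assert px.read(select=False) is None      # row not selected: no output
assert abs(px.read(select=True) - (VDD - 0.5)) < 1e-9
px.reset()
assert px.read(select=True) == VDD        # FD reset back to the supply
```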
Hereinafter, an electronic device 1000 according to some other embodiments will be described with reference to

Referring to The camera module group 1100 may include first to third camera modules 1100 At least two camera modules (e.g., 1100 In some embodiments, viewing angles of each of the first to third camera modules 1100 In some embodiments, one camera module (e.g., 1100 In some embodiments, one camera module (e.g., 1100 The first to third camera modules 1100 For example, one camera module of the first to third camera modules 1100

The application processor 1200 may include an image processor 1210, a memory controller 1220, and an internal memory 1230. The application processor 1200 may be implemented as a semiconductor chip separate from the plurality of camera modules 1100

The image processor 1210 may include first to third sub-image processors 1212 For example, the image data generated from the first camera module 1100 In another example, a plurality of sub-image processors corresponding to a plurality of camera modules may be implemented as a single sub-image processor. For example, although the first to third sub-image processors 1212 Further, in some embodiments, the image data generated from the first camera module 1100 Each of the first to third sub-image processors 1212 In some embodiments, the remosaic signal processing may be performed on the respective camera modules 1100

The image generator 1214 may generate a target image, using the image data provided from the respective first to third sub-image processors 1212 The plurality of modes may control the first to third camera modules 1100 The modes adopted in some embodiments include a plurality of still image modes and a plurality of moving image modes, and the camera module group 1100 of the electronic device 1000 according to the present embodiment may operate differently depending on the selected mode among such modes.
In some embodiments, the plurality of modes may include first to third still image modes and first and second moving image modes. The plurality of modes may be described by the operation (particularly, output) of the second camera module 1100

On the other hand, the image generating information may include, e.g., a zoom signal (or a zoom factor). The zoom signal may be, e.g., a signal selected by the user.

If the image generating information is a zoom signal (or zoom factor), and the first to third camera modules 1100 When the zoom signal is a second signal different from the first signal, the image generator 1214 may generate the output image using the image data output from the third sub-image processor 1212 When the zoom signal is still another third signal, the image generator 1214 may select any one of the image data output from the respective first to third sub-image processors 1212

The camera control signal according to the mode selection may be provided to each of the camera modules 1100 On the other hand, any one of the first to third camera modules 1100

The PMIC 1300 may supply power, e.g., a power supply voltage, to each of the first to third camera modules 1100 The PMIC 1300 may generate power corresponding to each of the first to third camera modules 1100

By way of summation and review, aspects of the present disclosure provide an image sensor that prevents crosstalk between pixels to improve image quality, and an image sensing system that includes the same. That is, according to example embodiments, an, e.g., RGBW, image sensor may not include an element separation film or a separation trench between photo diodes within a same red pixel, thereby reducing crosstalk of the image sensor.

Example embodiments have been disclosed herein, and although specific terms are employed, they are used and are to be interpreted in a generic and descriptive sense only and not for purpose of limitation.
In some instances, as would be apparent to one of ordinary skill in the art as of the filing of the present application, features, characteristics, and/or elements described in connection with a particular embodiment may be used singly or in combination with features, characteristics, and/or elements described in connection with other embodiments unless otherwise specifically indicated. Accordingly, it will be understood by those of skill in the art that various changes in form and details may be made without departing from the spirit and scope of the present invention as set forth in the following claims.

CROSS-REFERENCE TO RELATED APPLICATION
BACKGROUND
1. Field
2. Description of the Related Art
SUMMARY
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION