Researchers at Harvard University are developing an image sensor made of silicon nanowires of different radii, each of which absorbs a specific wavelength (color) of light. No intervening color filters are needed, since each nanowire captures light of its own color. Because no light is lost to filtering, the higher absorption efficiency allows for higher pixel densities and higher resolution.
In principle, the device could absorb all incoming light and convert it directly into photocurrent. By contrast, filter-based devices absorb roughly half of the incoming light before it reaches the image sensor. That greater efficiency could pave the way for cameras with higher resolutions.
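The efficiency gap described above can be made concrete with a toy calculation. This sketch is illustrative only: the 50% filter loss is the figure quoted in the article, while the 100% nanowire capture rate is the idealized "in principle" case, not a measured value.

```python
# Toy comparison of light reaching the photodetector in a conventional
# filter-based sensor versus an idealized filterless nanowire sensor.

FILTER_TRANSMISSION = 0.5   # filters absorb ~half the incoming light (per the article)
NANOWIRE_CAPTURE = 1.0      # idealized: all incoming light converted to photocurrent

def light_captured(incoming_photons, efficiency):
    """Photons converted to photocurrent at a given capture efficiency."""
    return incoming_photons * efficiency

photons = 1_000_000
filtered = light_captured(photons, FILTER_TRANSMISSION)
filterless = light_captured(photons, NANOWIRE_CAPTURE)

print(f"filter-based sensor: {filtered:,.0f} photons")    # 500,000
print(f"nanowire sensor:     {filterless:,.0f} photons")  # 1,000,000
print(f"gain: {filterless / filtered:.1f}x")              # 2.0x
```

Under these assumptions the filterless design converts twice as much light into signal, which is the basis for the claimed gains in pixel density and resolution.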
Read more at: physorg.