Bayer Image Enlargement Using Correlated Color Components

S. Battiato, M. Guarnera, M. Mancuso, A. Bruna
{sebastiano.battiato, mirko.guarnera, massimo.mancuso, arcangelo.bruna}@st.com
STMicroelectronics – AST Catania Lab
Digital Still Camera Multimedia Mobile Group

Abstract

The paper presents a method for enlarging images acquired by single-chip digital still cameras, using the correlation among the color components. The enlargement is done directly on the Bayer pattern, obtaining better performance in terms of quality and power consumption.

Introduction

Imaging sensors (CCD/CMOS) acquire images through a Color Filtering Array (CFA), also called Bayer Pattern [1]. Each pixel preserves the intensity of just one of the color separations. In the Bayer Pattern the green pixels are twice as numerous as the red and blue pixels, because our eyes are most sensitive to green light. The CFA filtering scheme allows us to capture color images by using powerful and smart algorithms applied in a sort of chain known as the Image Generation Pipeline (Figure 1).

In order to enlarge the image dimensions, traditional upsampling methods use interpolating functions. The simplest interpolation algorithm is the nearest-neighbor algorithm, which produces an unpleasant blocky appearance.

Figure 1 - Typical Image Generation Pipeline

More satisfactory results can be obtained with bilinear interpolation or with cubic convolution techniques [3]. Other related works can be found in [2],[6],[7],[8]. All previous methods work after the IGP block, directly on fully interpolated data; they do not improve the image quality, producing sub-optimal results.

The proposed approach realizes a suitable enlargement of digital images directly on the Bayer data coming from the CCD/CMOS sensor. Compared with classical approaches, the amount of computation required is considerably reduced, while the final perceived quality is fully comparable.

The Algorithm

The algorithm exploits the correlation among the chromatic channels of the Bayer data.
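For reference, the two classical full-color upsampling schemes recalled in the introduction (nearest neighbor and bilinear interpolation) can be sketched on a single channel as follows. This is a minimal illustration, not the paper's implementation: the function names and the clamped-edge handling are our own choices.

```python
def nearest_neighbor_2x(img):
    """2x enlargement by pixel replication: each source pixel
    becomes a 2x2 block, which causes the blocky appearance."""
    out = []
    for row in img:
        up = [v for v in row for _ in (0, 1)]  # duplicate each column
        out.append(up)
        out.append(list(up))                   # duplicate each row
    return out

def bilinear_2x(img):
    """2x enlargement by bilinear interpolation: original samples land
    on even coordinates; the in-between samples are averages of their
    nearest originals (edges are clamped)."""
    h, w = len(img), len(img[0])

    def px(i, j):  # clamped access beyond the last row/column
        return img[min(i, h - 1)][min(j, w - 1)]

    out = [[0.0] * (2 * w) for _ in range(2 * h)]
    for i in range(h):
        for j in range(w):
            out[2*i][2*j] = float(px(i, j))                        # original sample
            out[2*i][2*j+1] = (px(i, j) + px(i, j+1)) / 2.0        # horizontal average
            out[2*i+1][2*j] = (px(i, j) + px(i+1, j)) / 2.0        # vertical average
            out[2*i+1][2*j+1] = (px(i, j) + px(i, j+1)
                                 + px(i+1, j) + px(i+1, j+1)) / 4.0  # diagonal average
    return out
```

Both operate on a fully interpolated channel, i.e. after the IGP block; the method proposed below works on the Bayer data instead.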
Starting from an N×M Bayer image, the original image is magnified by a factor of 2. Each pixel R, G, or B of the input image is split into four sub-pixels, as shown in Figure 2.

Figure 2 - Correlated Color Components Retrieval.

When the sensor acquires a scene, the lightness of a single pixel is given by the average lightness of the real scene portion corresponding to the given pixel's size, so the high-frequency details inside the corresponding pixel area are lost. The considered splitting tries to recover such details. Our algorithm is mainly based on the Green components. We assume that, locally, around the central pixel of an n×n window, there is a high correlation among the three color signals. Figure 3 pictorially shows the retrieval of a color component by correlation among channels (the error introduced by a simple linear interpolation is also shown). For example, considering the low-frequency components (G, R, B), the central pixel G (if the central