
A New Perceptually Uniform Color Space with Associated Color Similarity Measure for Content-Based Image and Video Retrieval

M. Sarifuddin
Département d'informatique et d'ingénierie,
Université du Québec en Outaouais
C.P. 1250, Succ. B, Gatineau, Québec, Canada, J8X 3X7
[email protected]

Rokia Missaoui
Département d'informatique et d'ingénierie,
Université du Québec en Outaouais
C.P. 1250, Succ. B, Gatineau, Québec, Canada, J8X 3X7
[email protected]

ABSTRACT
Color analysis is frequently used in image/video retrieval. However, many existing color spaces and color distances fail to correctly capture color differences usually perceived by the human eye. The objective of this paper is to first highlight the limitations of existing color spaces and similarity measures in representing human perception of colors, and then to propose (i) a new perceptual color space model called HCL, and (ii) an associated color similarity measure denoted DHCL. Experimental results show that using DHCL on the new color space HCL leads to a solution very close to human perception of colors and hence to a potentially more effective content-based image/video retrieval. Moreover, the application of the similarity measure DHCL to other spaces like HSV leads to a better retrieval effectiveness.

A comparison of HCL against the L*C*H and CIECAM02 spaces, using color histograms and a similarity distance based on the Dirichlet distribution, illustrates the good performance of HCL for a collection of 3500 images of different kinds.

Keywords
Color spaces, content-based image retrieval, similarity measures.

1. INTRODUCTION
Challenges in content-based image retrieval (CBIR) consist not only in bridging the semantic gap (i.e., the mismatch between the capabilities of CBIR techniques and the semantic needs of the users) but also in exploiting different models of human image perception, and in managing large image collections and incomplete query/image specifications [12]. The human visual system does not perceive a given image as a mere and random collection of colors and pixels, but rather as a layout of homogeneous objects and regions with respect to visual features like color, shape and texture. Given a large range of images such as landscape, satellite, and medical images, the human visual system has the capacity to distinguish, recognize and interpret different types of objects in images. However, computer programs can hardly recognize image objects even in a simple scene. In image processing and computer vision, color analysis (e.g., dominant color identification, color-based object detection) is a low-level operation which plays an important role in image/video retrieval.

A variety of color spaces have been developed for color representation such as RGB, perceptual color spaces HSL (hue, saturation, luminance), HSV/HSB (hue, saturation, value or brightness) [13, 14] and HSI (hue, saturation, intensity), as well as perceptually uniform color spaces like L*u*v* and L*a*b* (luminance L*, chrominance u*, v*, a*, and b*) and CIECAM02 [7, 15]. We recall that perceptual uniformity in a given color space means that the perceptual similarity of two colors is measured by the distance between the two color points.

The objective of this paper is to first illustrate the limitations of existing color spaces and similarity measures in representing human perception of colors, and then to propose (i) a new color space model which aims at capturing the real color difference as perceived by the human eye, and (ii) a new color similarity measure. The proposed space is inspired from HSV (or HSL) and L*a*b*.

The paper is organized as follows. Section 2 is a brief description of color spaces, their strengths and limitations. Section 3 presents a new color space called HCL, while Section 4 presents a set of existing color distances, proposes a new similarity measure and provides a performance analysis of color distances applied to a set of color spaces. A conclusion is given in Section 5.

2. COLOR SPACES
The most commonly used and popular color space is RGB. However, this space presents some limitations: (i) the presence of a negative part in the spectra, which does not allow the representation of certain colors by a superposition of the three spectra, (ii) the difficulty to determine color features like the presence or the absence of a given color, and (iii) the inability of the Euclidean distance to correctly capture color differences in the RGB space. Figure 4 illustrates the latter fact.

Color spaces like HSV and HSL are also commonly used in image processing. As opposed to the RGB model, HSL and HSV are considered natural color representation models (i.e., close to the physiological perception of the human eye). In these models, color is decomposed according to physiological criteria like hue, saturation and luminance. Hue refers to the pure spectrum colors and corresponds to the dominant color as perceived by a human. Saturation corresponds to the relative purity or the quantity of white light that is mixed with hue, while luminance refers to the amount of light in a color [2].

A great advantage of the HSL/HSV models over the RGB model lies in their capacity to recognize the presence/absence of colors in a given image. However, the main drawback of the HSL and HSV models concerns their luminance variation, which does not correspond to human perception. Visually, a color with a great amount of white shows a smaller variation of luminosity than a fully saturated color. Such a situation is not correctly captured in these models.

In the HSV model, saturated colors have the same intensity as colors with 100% of white. However, this is not the case for the HSL model, since there is a great luminosity gap between saturated colors and colors with a great amount of white. Therefore, using metric distances such as the Euclidean distance (see Equation 6) and the cylindric distance (see Equation 10) with the HSV and HSL models does not capture color differences as the human eye does.

The CIE (Commission Internationale de l'Éclairage) has defined two perceptually uniform or approximately uniform color spaces, L*a*b* and L*u*v*. Further, the L*C*H* (Lightness, Chroma, and Hue) and L*t*θ* (t = Chroma and θ = Hue) color spaces have been defined as derivatives of L*u*v* and L*a*b* [3]. The L*a*b* and L*C*H* color models are represented in Figure 1. Figure 1-a shows the color distribution in these models, while Figure 1-b illustrates the variation of chroma C* and luminance L* for six different hue values H* (red, yellow, green, cyan, blue and purple). One can see that the luminosity of a hue (respectively the chroma) grows (respectively decreases) slowly according to the increase in the percentage of white. This variation corresponds to human perception and hence represents a good feature of the L*a*b* and L*C*H* color models.

As pointed out by [7, 8], the L*a*b* and L*C*H* spaces have a significant deficiency since they have weak hue constancy for blues, as illustrated by Figure 1-a, which shows that the blue hue angle varies between 290° and 306°. Hue constancy means that a color object created by varying the encoding values to obtain different sensations in lightness or chroma should still lead to the same hue over the entire object. Moreover, simple nonlinear channel editing should not have an impact on the hue of a color. In order to get such constancy, another color space called the "CIE Color Appearance Model" (CIECAM02) has been proposed in [7]. However, CIECAM02 improves hue constancy for almost all colors except blue, as illustrated in Figure 2-b, which shows the variation of hue angles for red, yellow, green, cyan, blue and purple. One can notice that the hue angle for blue varies between 257° and 274°.

Figure 1: a) L*a*b* and L*C*H* color space models. b) Chroma and luminance variations for six hue values.

Figure 2: a) CIECAM02 color space model. b) Chroma and luminance variations for six hue values.

3. A NEW COLOR SPACE
While in [6] we proposed new similarity semi-metric distances based on color histograms, the present paper investigates color pixel similarity analysis in a new perceptually uniform color space that we call HCL (Hue, Chroma and Luminance). This new color space exploits the advantages of each of the color spaces HSL/HSV and L*a*b* and discards their drawbacks.

We assume that the chroma and the hue of any color can be defined as a blend of the three chrominance elemental sensations: R-G (from red to green), G-B (from green to blue) and B-R (from blue to red). Based on this assumption and on the Munsell color system, with the three color attributes close to human perception, namely hue (H), chroma (C) and luminance (L), we define below a mapping from the RGB space to the HCL space.

We recall that a color containing a lot of white is brighter than one with less white. A saturated color contains 0% of white and has a maximum value of chroma. An increasing value of white leads to a decreasing value of chroma and a less saturated color. Concretely, a color is saturated if Max(R, G, B) is equal to R, G, or B, and Min(R, G, B) = 0. The saturation of a color is null (i.e., chroma = 0) when Min(R, G, B) = Max(R, G, B). Therefore, we will use the expressions Max(R, G, B) and Min(R, G, B) to compute the luminance L.

Human vision reacts in a non-linear (logarithmic) manner to color intensity. For example, a 20% reduction of luminosity is perceived as a 50% reduction. Based on the proportionality law of von Kries, the luminance L can be expressed as Q·Y, where Y corresponds to the luminosity captured by a photo-receptor. The color spaces YIQ, YUV, YCrCb, L*u*v* and L*a*b* express Y as Y = 0.299R + 0.587G + 0.114B, while the HSI, HSV, and HSL spaces use Y = I = (R+G+B)/3, Y = L = Max(R, G, B) and Y = L = (Max(R, G, B) + Min(R, G, B))/2, respectively.

We define the luminance L as a linear combination of Max(R, G, B) and Min(R, G, B) as follows:

L = ( Q · Max(R, G, B) + (1 − Q) · Min(R, G, B) ) / 2    (1)

where Q = e^(α·γ) is a parameter that allows a tuning of the variation of luminosity between a saturated hue (color) and a hue containing a great amount of white, with α = ( Min(R, G, B) / Max(R, G, B) ) · (1 / Y0) and Y0 = 100. γ is a correction factor whose value (= 3) coincides with the one used in the L*a*b* space. It should be noted that when Min(R, G, B) = 0 and Max(R, G, B) varies between 0 and 255, the luminance L takes a value between 0 (black) and 128. When Max(R, G, B) = 255 and Min(R, G, B) varies between 0 and 255, the luminance takes a value between 128 and 135.

In a similar way, we define the chroma C = Q · Cn, where Cn represents a mixture of three different combinations of the R, G, and B components: red-green, green-blue and blue-red. The proposed formula for C (Equation 2) ensures linearity within lines/planes of hue (see Figure 3-d).

C = Q · ( |R − G| + |G − B| + |B − R| ) / 3    (2)

The hue value can be computed using the following equation:

H = arctan( (G − B) / (R − G) )    (3)

However, the hue values given by Equation 3 vary between −90° and +90° only. To allow hue values to vary in a larger interval going from −180° to 180°, we propose the following alternate formulas (see Figures 3-a and 3-c):

if (R − G) < 0 and (G − B) ≥ 0, then H = 180° + H
if (R − G) < 0 and (G − B) < 0, then H = H − 180°    (4)

or

if (R − G) ≥ 0 and (G − B) ≥ 0, then H = (2/3) · H
if (R − G) ≥ 0 and (G − B) < 0, then H = (4/3) · H
if (R − G) < 0 and (G − B) ≥ 0, then H = 180° + (4/3) · H
if (R − G) < 0 and (G − B) < 0, then H = (3/4) · H − 180°    (5)
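To make the mapping concrete, the sketch below assembles Equations 1, 2, 3 and 5 into a single RGB-to-HCL conversion. It is a minimal Python illustration rather than the authors' reference implementation: the function name, the handling of the achromatic and R = G cases, and the choice of the Equation 5 variant for the hue fold are our assumptions; Y0 = 100 and γ = 3 are taken from the text.

```python
import math

def rgb_to_hcl(r, g, b, y0=100.0, gamma=3.0):
    """Minimal sketch of the RGB -> HCL mapping of Equations 1, 2, 3 and 5.

    r, g, b are assumed to be in [0, 255]; returns (H, C, L) with the hue H
    in degrees, folded into [-180, 180] as in Equation 5.
    """
    mx, mn = max(r, g, b), min(r, g, b)

    # Equation 1: tuning factor Q and luminance L.
    alpha = (mn / mx) / y0 if mx > 0 else 0.0
    q = math.exp(alpha * gamma)
    lum = (q * mx + (1.0 - q) * mn) / 2.0

    # Equation 2: chroma as a scaled mix of the pairwise channel differences.
    chroma = q * (abs(r - g) + abs(g - b) + abs(b - r)) / 3.0

    # Equations 3 and 5: raw arctan hue plus a quadrant-dependent correction.
    if chroma == 0:
        hue = 0.0                                  # achromatic pixel: hue undefined
    else:
        if r == g:                                 # limit of arctan((G-B)/(R-G))
            h = 90.0 if g >= b else -90.0
        else:
            h = math.degrees(math.atan((g - b) / (r - g)))
        if r >= g and g >= b:
            hue = (2.0 / 3.0) * h
        elif r >= g and g < b:
            hue = (4.0 / 3.0) * h
        elif r < g and g >= b:
            hue = 180.0 + (4.0 / 3.0) * h
        else:                                      # r < g and g < b
            hue = (3.0 / 4.0) * h - 180.0
    return hue, chroma, lum
```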

Figure 3: a) and c) HCL color space model with H computed using Equations 4 and 5, respectively. b) and d) Variation of chroma C and luminance L for six different hue values.

Figure 3 shows the HCL color model, where Figures 3-a and 3-c are obtained using the formulas for L and C as well as H computed using Equations 4 and 5, respectively. We can notice that the two variants of the HCL model (according to the two ways the hue H is computed) have a uniform hue angle. The chroma C decreases while the luminance L increases according to an increase of the white color. In Figure 3-b, the colors red, yellow, green, cyan, blue and purple have a unique angle whose value is 0°, 90°, 135°, 180°, 270° and 315°, respectively. In Figure 3-d, the angles are 0°, 60°, 120°, 180°, 240° and 300°, respectively. Such a result shows that the HCL model offers a better hue constancy than the L*C*H and CIECAM02 models.
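As a quick, informal check of this claim, the conversion sketched above can be evaluated on the six fully saturated colors. Since Equation 5 folds the hue into [−180°, 180°], the 240° and 300° angles of Figure 3-d appear as −120° and −60°.

```python
# Informal check of the Figure 3-d hue angles using the rgb_to_hcl sketch above.
pure_colors = {
    "red": (255, 0, 0), "yellow": (255, 255, 0), "green": (0, 255, 0),
    "cyan": (0, 255, 255), "blue": (0, 0, 255), "purple": (255, 0, 255),
}
for name, rgb in pure_colors.items():
    h, c, l = rgb_to_hcl(*rgb)
    print(f"{name:7s} H = {h:7.1f}  C = {c:6.1f}  L = {l:6.1f}")
# Expected hue angles: 0, 60, 120, 180, -120 (i.e., 240) and -60 (i.e., 300) degrees.
```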

4. COLOR SIMILARITY MEASURES
The notion of uniform color perception is an important criterion for classification and discrimination between color spaces. In order to capture perceptual uniformity in a color representation space, it is crucial to rely on the distance criterion, which states that the distance D(c1, c2) between two colors c1 and c2 is correct if and only if the distance value is close to the difference perceived by the human eye [9].

Many distances have been proposed based on the existing color models. The Euclidean distance (denoted by ΔE) is frequently used in cubic representation spaces such as RGB and L*a*b*, and occasionally in cylindric spaces like L*C*H* (see Equations 6 to 8). Another Euclidean-like distance (Equation 9) was proposed specifically for L*C*H* [1]. In Equation 10, a cylindric distance (denoted by Dcyl) [10] is used for cylindric and conic spaces like HSL, HSV and L*C*H*. Recently, another formula for computing color difference (denoted by ΔE00 in Equation 11) has been proposed in [5].

ΔE_RGB = √( ΔR² + ΔG² + ΔB² )    (6)

ΔE_ab = √( ΔL*² + Δa*² + Δb*² )    (7)

ΔE_CH = √( ΔL*² + ΔC*² + ΔH*² )    (8)

ΔE94 = √( (ΔL* / (kL·SL))² + (ΔC* / (kC·SC))² + (ΔH* / (kH·SH))² )    (9)

where kL = kC = kH = 1, SL = 1, SC = 0.045·√(C1·C2) + 1 and SH = 0.015·√(C1·C2) + 1.

Dcyl = √( ΔL*² + C1*² + C2*² − 2·C1*·C2*·cos(ΔH) )    (10)

ΔE00 = √( (ΔL* / (kL·SL))² + (ΔC* / (kC·SC))² + (ΔH* / (kH·SH))² + ΔR )    (11)
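For reference, the cylindric distance of Equation 10, which is the one applied to HSV and HCL in the experiments below, can be sketched as follows. This is a generic illustration, not code from the paper; it assumes the two colors are given as (H, C, L) triples with the hue in degrees.

```python
import math

def d_cyl(color1, color2):
    """Sketch of the cylindric distance Dcyl of Equation 10.

    color1 and color2 are (H, C, L) triples; the hue is assumed to be in degrees.
    """
    h1, c1, l1 = color1
    h2, c2, l2 = color2
    dl = l1 - l2
    dh = math.radians(h1 - h2)       # hue difference, converted for the cosine
    return math.sqrt(dl ** 2 + c1 ** 2 + c2 ** 2 - 2.0 * c1 * c2 * math.cos(dh))
```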

We have conducted an experimental study to first analyze the compatibility between these distances and the color spaces HSV, L*C*H* and CIECAM02, and then contrast these distances against human perception. To that end, we have selected ten different colors as reference (target) colors. Each one of them is compared to a collection of randomly generated colors using each one of the proposed similarity measures. Colors are generated automatically by a variation of the R, G and B values (0 ≤ R, G, B ≤ 255) using an increment equal to 15. This leads to a set of 4913 colors for each color space.

To illustrate the potential of the new color space HCL defined earlier, Figures 4 through 12 show an experimental case using a fully saturated and pure yellow color (R=255, G=255, B=0). This reference color appears in the leftmost top cell of each figure. The most similar colors returned by the selected distances (e.g., Euclidean, ΔE94, Dcyl) are displayed in decreasing order of similarity from left to right and top to bottom. Figures 4 to 6 give the sequences of colors returned by the Euclidean distance applied to RGB, L*a*b* and L*C*H*, respectively. Figures 7 and 9 show the list of colors returned by the application of ΔE94 to the L*C*H* and CIECAM02 spaces. Figures 8 and 10 show the list of colors returned by the application of ΔE00 to the L*C*H* and CIECAM02 spaces, while Figures 11 and 12 exhibit the colors returned by the cylindric distance applied to HSV and HCL, respectively.

From these figures, one can see that the application of the Euclidean distance to the L*a*b* and L*C*H* spaces provides the worst answers, i.e., most of the returned colors are not close to the target color. Such a distance is appropriate for the RGB space, but is far from being uniform like human perception. However, using the ΔE94 and ΔE00 distances for color spaces like L*C*H* and CIECAM02, and the cylindric distance for color spaces like HSV and HCL, offers good results, with a slight superiority of the HCL space (see Figure 12) that we defined in this paper. However, none of the provided results is completely compatible with human perception.

4.1 A New Color Similarity Measure
In the following, we define a new color similarity measure called DHCL, which is based on the cylindric model with parameters AL and ACH. This measure is particularly adapted to the new color space defined in this paper.

DHCL = √( (AL·ΔL)² + ACH·( C1² + C2² − 2·C1·C2·cos(ΔH) ) )    (12)

where AL is a constant of linearization for the luminance from the conic color model to the cylindric model, and ACH is a parameter which helps reduce the distance between colors having the same hue as the target (reference) color.

In order to determine these two parameters, we consider a slice of the HCL model. For example, let us take a reference pixel Pr of saturated purple (see Figure 3). We can see that a pixel Pa with the same hue (ΔH = 0) and the same luminance (ΔL = 0) but with a difference in chroma equal to ΔC = 50 is more similar to the pixel Pr than a pixel Pb having ΔL = 0, ΔC = 0 and ΔH close to 8°. We can then determine ACH as ACH = ΔH + 8/50 = ΔH + 0.16. Moreover, the pixel Pb is more similar to the pixel Pr than a pixel Pc having ΔH = 0 and ΔC = 50 and being darker (ΔL = 37). However, a pixel Pd with ΔH = 0, ΔC = 50 and a greater luminance (ΔL = 25) is more similar to the pixel Pr than the pixel Pb is. Due to this luminance effect, we proceed to a triangulation computation which leads to a correction factor equal to AL = 1.4456.
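A direct reading of Equation 12 and of the parameter derivation above can be sketched as follows. This is an illustrative fragment under our own assumptions: the (H, C, L) triples follow the conversion sketched in Section 3, AL = 1.4456 is the value reported above, and treating ACH as |ΔH| + 0.16 with ΔH in degrees is our interpretation of the derivation.

```python
import math

def d_hcl(color1, color2, a_l=1.4456):
    """Sketch of the DHCL similarity measure of Equation 12.

    color1 and color2 are (H, C, L) triples with H in degrees.
    a_l is the luminance linearization constant A_L reported in the text;
    A_CH is derived from the hue difference as |dH| + 0.16 (our reading).
    """
    h1, c1, l1 = color1
    h2, c2, l2 = color2
    dl = l1 - l2
    a_ch = abs(h1 - h2) + 0.16                 # assumption: dH taken in degrees
    dh = math.radians(h1 - h2)
    return math.sqrt((a_l * dl) ** 2
                     + a_ch * (c1 ** 2 + c2 ** 2 - 2.0 * c1 * c2 * math.cos(dh)))
```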

Figure 4: Euclidean distance applied to RGB space.
Figure 5: Euclidean distance applied to L*a*b* space.
Figure 6: Euclidean distance applied to L*C*H* space.
Figure 7: Distance ΔE94 applied to L*C*H* space.
Figure 8: Distance ΔE00 applied to L*C*H* space.
Figure 9: Distance ΔE94 applied to CIECAM02 space.
Figure 10: Distance ΔE00 applied to CIECAM02 space.
Figure 11: Cylindric distance Dcyl applied to HSV space.
Figure 12: Cylindric distance Dcyl applied to HCL space.

Figure 13 illustrates the output provided by the new similarity measure DHCL when it is applied to the HCL color space. One can notice that the returned colors are closer to the reference color (leftmost top cell) than those obtained using existing color distances and spaces (see Figures 4 to 11) or using Dcyl with the new HCL color space (see Figure 12). Experimental results on reference colors other than yellow confirm that the application of the new color distance DHCL to the new color space HCL leads to a better perceptual uniformity than HSV, HSL, L*a*b* and L*C*H*, for which existing distances are used (see Equations 6 to 10).

Figure 13: New distance DHCL applied to HCL space.

4.2 Empirical Analysis
In order to compare the sequence of colors returned by the computer system (according to different color spaces and distances) with the list returned by the human system, seven subjects were asked to evaluate the output. For each one of the ten cases (see Figures 4 to 13) corresponding to pairs of a given color space and a color distance, there are 48 cells: the reference color cell (leftmost top cell) and 47 (returned) color cells. Every subject has to choose and rank the ten colors that are most similar to the reference color. If fewer than ten colors are selected by a subject for a given combination of color distance and space (e.g., the Euclidean distance and RGB), then the rank of the missing colors is given the value 48. At the end of the experiment, all subjects concluded that using DHCL on HCL leads to better results than the other combinations of distance and space. Indeed, the combination of DHCL and HCL returns many more colors that are similar to the reference color than any one of the other combinations.

Figure 14 exhibits five rows corresponding to different colors. The first cell in each row identifies the reference color (red, yellow, green, blue and purple), while the remaining cells have a rank from 1 to 12, where rank 1 corresponds to the color which is the most similar to the reference color. The ranking is computed as the mean of the judgments of seven subjects, three of whom are experts in image processing.

Figure 14: Five reference colors with the average ranking of similar colors (from 1 to 12).

Figure 15 provides the ranking for the purple color. The first row corresponds to the ranking (from the most similar to the least similar) using the distance Dcyl and the HCL space defined in the paper. The remaining rows give the ranking returned by the pairs Dcyl and HSV, ΔE and L*a*b*, ΔE and L*C*H*, ΔE94 and L*C*H*, ΔE00 and L*C*H*, ΔE94 and CIECAM02, and ΔE00 and CIECAM02, respectively.

To quantify the potential of each distance to return the colors that are close to human perception, we have applied the following effectiveness measure (see [6] for more details).

Effsys = ( 1 / (1 + log(R / Rc)) ) · ( Σ_{i=1..Rc} i ) / ( Σ_{i=1..Rc} i + Σ_{i=1..Rc} |i − ri| )    (13)

where Rc is the total number of relevant colors (according to the user's judgment) in the color set, R is the total number of retrieved colors (R ≥ Rc), i (= 1, 2, ..., Rc) is the similarity ranking given by human judgment, and ri corresponds to the ranking given by the system (in decreasing relevance order).
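For illustration only, Equation 13 can be computed as below. The function name, the interpretation of ri as the system rank of the i-th color in the human ranking, and the use of a base-10 logarithm are our assumptions.

```python
import math

def eff_sys(human_ranks, total_retrieved):
    """Sketch of the retrieval effectiveness measure of Equation 13.

    human_ranks : system ranks r_i of the Rc relevant colors, listed in the
                  order of the human ranking (i = 1, 2, ..., Rc).
    total_retrieved : R, the total number of retrieved colors (R >= Rc).
    """
    rc = len(human_ranks)
    ideal = rc * (rc + 1) // 2                          # sum of i for i = 1..Rc
    mismatch = sum(abs(i - r) for i, r in enumerate(human_ranks, start=1))
    # "log" is assumed to be base 10; a perfect ranking with R == Rc gives 1.0.
    return (1.0 / (1.0 + math.log10(total_retrieved / rc))) * ideal / (ideal + mismatch)

# Example: 10 relevant colors, the system agrees exactly with the human ranking
# and returns 47 colors in total.
print(eff_sys(list(range(1, 11)), 47))
```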

The curves in Figure 16 illustrate the retrieval effectiveness ratio of color distance and space combinations for five reference colors, where the ordinate represents the average effectiveness computed from the judgments of seven subjects. One can see that the combination of DHCL and the HCL color space outperforms the other combinations of color distances and spaces. The pair ΔE00 and CIECAM02 provides good results for yellow and green but the worst effectiveness ratio for the three other colors. The pair ΔE94 and L*C*H* gives the worst retrieval effectiveness for all the selected colors.

Moreover, we conducted additional empirical studies to compare the proposed color space HCL against L*C*H* and CIECAM02 on an image data set of 3500 images representing photographs and paintings of small, medium or high resolution. This includes 500 images from the database of the Info-Muse network [4], containing museum collections in Quebec (Canada), as well as images from different web sites [11]. The first set contains art images related to paintings, statues, medals and ancient clothing items. The whole collection is grouped under four overlapping semantic classes: painting, close-up, indoor and outdoor images. Each class (e.g., outdoor) is further split into subgroups (e.g., city, landscape, etc.).

Figure 15: Ranking according to eight pairs of distances and color spaces.

Figure 16: Retrieval effectiveness of six combinations of distances and color spaces.

Based on our previous work on similarity analysis [6], the comparison between two images makes use of color histograms and a similarity distance involving the Dirichlet distribution. Figures 17 through 19 illustrate the retrieval output provided by the system when the CIECAM02, L*C*H* and HCL color spaces are used, respectively. When an image query (leftmost top image) is submitted, the system returns images in decreasing order of similarity. A careful look at the three figures indicates that HCL outperforms the two other spaces. For example, one can see that the first two rows in Figure 19 contain images with colors closer to those in the image query than the images in the same rows of Figures 17 (CIECAM02) and 18 (L*C*H*).


Figure 17: Image retrieval using CIECAM02 color space.

Figure 18: Image retrieval using L*C*H* color space.

Figure 19: Image retrieval using HCL color space.

5. CONCLUSION
In order to overcome the limitations of existing color spaces and color distances in correctly capturing color differences perceived by the human visual system, we have presented a new color space called HCL, inspired from the HSL/HSV and L*a*b* spaces, as well as a new similarity measure labelled DHCL and tailored to the HCL space. Experimental results show that using DHCL on HCL leads to a solution very close to human perception of colors and hence to a potentially more effective content-based image/video retrieval.

We are currently studying the potential of our findings in three fields of image/video processing, namely: image segmentation, object edge extraction, and content-based image (or sub-image) retrieval.

Acknowledgments
The authors would like to thank the anonymous reviewers for their valuable comments and suggestions for improvement. This work is part of the CoRIMedia research projects that are financially supported by Valorisation Recherche Québec, Canadian Heritage and the Canada Foundation for Innovation.


6. REFERENCES
[1] D. Alman. Industrial color difference evaluation. Color Research and Application, no. 3:137–139, 1993.
[2] R. C. Gonzalez and R. E. Woods. Digital Image Processing. Prentice Hall, second edition, 2002.
[3] B. Hill, T. Roger, and F. Vorhagen. Comparative analysis of the quantization of color spaces on the basis of the CIELAB color-difference formula. ACM Trans. on Graphics, 16:109–154, April 1997.
[4] Info-Muse Network. Société des Musées Québécois (SMQ); http://www.smq.qc.ca/publicsspec/smq/services/infomuse/index.phtml, 2004.
[5] M. R. Luo, G. Cui, and B. Rigg. The development of the CIE 2000 colour-difference formula: CIEDE2000. Color Research and Application, 26:340–350, 2001.
[6] R. Missaoui, M. Sarifuddin, and J. Vaillancourt. An effective approach towards content-based image retrieval. In Proceedings of the International Conference on Image and Video Retrieval (CIVR 2004), Dublin, Ireland, pages 335–343, July 2004.
[7] N. Moroney. The CIECAM02 color appearance model. In Proceedings of the Tenth Color Imaging Conference: Color Science, Systems and Applications, pages 23–27, 2002.
[8] N. Moroney. A hypothesis regarding the poor blue constancy of CIELAB. Color Research and Application, 28, no. 3:371–378, 2003.
[9] G. Paschos. Perceptually uniform color spaces for color texture analysis: An experimental evaluation. IEEE Trans. on Image Processing, 10, no. 6:932–937, 2001.
[10] K. Plataniotis and A. Venetsanopoulos. Color Image Processing and Applications. Springer, Ch. 1, pp. 268–269, 2000.
[11] Web sites: http://www.hemera.com; http://www.corbis.com; http://www.webshots.com; http://www.freefoto.com, 2004.
[12] A. W. M. Smeulders, M. Worring, S. Santini, A. Gupta, and R. Jain. Content-based image retrieval at the end of the early years. IEEE Trans. Pattern Anal. Mach. Intell., 22(12):1349–1380, 2000.
[13] A. R. Smith. Color gamut transform pairs. Computer Graphics, 12, no. 3:12–19, 1978.
[14] J. R. Smith. Integrated Spatial and Feature Image System: Retrieval, Compression and Analysis. Ph.D. dissertation, Columbia University, New York, 1997.
[15] G. Wyszecki and W. S. Stiles. Color Science: Concepts and Methods, Quantitative Data and Formulae. John Wiley and Sons, second edition, 1982.