ASSESSMENT OF FACIAL EXPRESSIONS IN PRODUCT APPRECIATION

Mirela Carmia Popa, Leon J.M. Rothkrantz, Pascal Wiggers, Caifeng Shan

Abstract


In the marketing area, new trends are emerging, as customers are not
only interested in the quality of products or delivered services, but also in a stimulating shopping experience. Creating and influencing customers' experiences has become a valuable differentiation strategy for retailers. Therefore, understanding and assessing the customers' emotional response to products and services represents an important asset. This paper investigates
whether the facial expressions customers show during product appreciation are positive or negative, and also which types of emotions are related to product appreciation. We collected a database of emotional facial expressions by presenting a set of forty product-related pictures to a number of test subjects. Next, we analysed the obtained facial expressions by extracting both geometric and appearance
features. Furthermore, we modelled them in both an unsupervised and a supervised manner. Clustering techniques proved efficient at differentiating between positive and negative facial expressions in 78% of the cases. Next, we performed a more refined analysis of the different types of emotions by employing different classification methods, achieving 84% accuracy for seven emotional classes and 95%
for the binary positive vs. negative distinction.
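
The two-stage analysis described above — unsupervised clustering into positive/negative groups followed by supervised classification — can be sketched as follows. This is an illustrative outline only, not the authors' code: the feature vectors here are synthetic stand-ins for the concatenated geometric and appearance features, and the choice of KMeans and an SVM is an assumption for the sketch.

```python
# Illustrative two-stage pipeline: cluster facial-expression feature
# vectors into two groups, then train and cross-validate a classifier.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in data: 100 samples of 20-dimensional feature vectors
# (in the real setting, geometric + appearance features per face).
features = np.vstack([
    rng.normal(loc=-1.0, scale=0.5, size=(50, 20)),  # "negative" expressions
    rng.normal(loc=+1.0, scale=0.5, size=(50, 20)),  # "positive" expressions
])
labels = np.array([0] * 50 + [1] * 50)

# Unsupervised stage: two clusters, for positive vs. negative.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Supervised stage: an RBF-kernel SVM evaluated by 5-fold cross-validation.
accuracy = cross_val_score(SVC(kernel="rbf"), features, labels, cv=5).mean()
print(round(accuracy, 2))
```

In the paper's setting the same scheme extends from the binary positive/negative case to seven emotional classes by training a multi-class classifier on the same features.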


Keywords


Product Emotions; Facial Expression Analysis; Geometric Features; Appearance Features; Unsupervised Learning; Supervised Learning


