The impact of recommender type and recommendation framing on consumer attitudes and purchase intentions
DOI: https://doi.org/10.5585/2025.29503
Keywords: AI recommendation, preferences, frames, evaluations
Abstract
Purpose: In an increasingly personalized marketplace, understanding the dynamics between different types of recommenders and their influence on consumer behavior is necessary for businesses seeking to optimize their engagement strategies. This research therefore analyzes the impact of recommender type (AI vs. human) and recommendation framing (material vs. experiential) on consumers' attitudes and purchase intentions.
Design/methodology: We adopted a factorial manipulation approach in our experimental studies to examine the main effect, serial mediation effects, and a moderation effect. Specifically, we conducted three experiments with different scenarios to test the hypotheses.
Findings: The results show that AI recommendations generated more positive evaluations and greater purchase intention than human recommendations (Studies 1a and 1b). Regarding experiential vs. material framing, however, AI recommendations were evaluated more favorably in an experiential context (Study 2), whereas recommendations from a human expert were preferred in a material context.
Originality: This study is the first to apply a serial mediation model (Hayes PROCESS Model 6) relating AI recommendations to consumers' evaluations. The research contributes to Choice Process Theory and to the literature on Artificial Intelligence and its applicability to consumer relations.
Managerial implications: This study can guide companies in developing personalized recommendation strategies, adapted to different user profiles, aimed at improving the user experience.
Research limitations: The data are limited to recommendations made in three different scenarios; other products/services still need to be tested.
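The serial mediation analysis named in the methodology (PROCESS Model 6, a chain of the form X → M1 → M2 → Y with a bootstrapped indirect effect) can be sketched with simulated data. The variable names, mediators, and coefficients below are illustrative assumptions for the sketch, not values or measures taken from the studies.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Hypothetical simulated data: X = recommender type (0 = human, 1 = AI),
# M1 and M2 = two assumed serial mediators, Y = purchase intention.
# The path coefficients (0.5, 0.4, 0.6, ...) are illustrative only.
x = rng.integers(0, 2, n).astype(float)
m1 = 0.5 * x + rng.normal(0, 1, n)
m2 = 0.4 * m1 + 0.1 * x + rng.normal(0, 1, n)
y = 0.6 * m2 + 0.2 * m1 + 0.1 * x + rng.normal(0, 1, n)

def ols(dep, *preds):
    """OLS coefficients (intercept first) via least squares."""
    X = np.column_stack([np.ones(len(dep))] + list(preds))
    beta, *_ = np.linalg.lstsq(X, dep, rcond=None)
    return beta

def serial_indirect(x, m1, m2, y):
    """Serial indirect effect a1 * d21 * b2 for X -> M1 -> M2 -> Y."""
    a1 = ols(m1, x)[1]            # X -> M1
    d21 = ols(m2, x, m1)[2]       # M1 -> M2, controlling for X
    b2 = ols(y, x, m1, m2)[3]     # M2 -> Y, controlling for X and M1
    return a1 * d21 * b2

# Percentile bootstrap CI for the indirect effect, as PROCESS does
# (PROCESS defaults to 5,000 resamples; 1,000 used here for brevity).
boot = np.empty(1000)
for i in range(1000):
    idx = rng.integers(0, n, n)
    boot[i] = serial_indirect(x[idx], m1[idx], m2[idx], y[idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {serial_indirect(x, m1, m2, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

An indirect effect whose bootstrap confidence interval excludes zero is taken as evidence of serial mediation; with the simulated coefficients above the true indirect effect is 0.5 × 0.4 × 0.6 = 0.12.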
Copyright (c) 2025 Elielton dos Santos Oliveira, Danielle Mantovani

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

