Abstract
This article examines the integration of generative artificial intelligence, specifically the ChatGPT chatbot, in higher education. It explores how the prescribed use of ChatGPT in an educational setting influences students’ acceptance of this technology. Drawing on Bobillier Chaumon’s (2016) “situated acceptance” approach, adapted to the learning context, the article analyzes technological acceptance across four dimensions: individual, interpersonal, organizational, and transpersonal. The methodology is qualitative, analyzing 31 reflective accounts from students who experimented with ChatGPT during a specific educational activity conducted from December 2023 to January 2024. The results reveal varied student perceptions and underscore the importance of maintaining students’ active engagement and critical thinking toward emerging technologies, in order to maximize the educational potential of these tools while managing the challenges they present.
Keywords:
- Generative artificial intelligence (GenAI)
- artificial intelligence (AI)
- generative AI
- situated acceptance
- qualitative research
- higher education
References
- Baidoo-Anu, D., & Owusu Ansah, L. (2023). Education in the era of generative artificial intelligence (AI): Understanding the potential benefits of ChatGPT in promoting teaching and learning. Journal of AI, 7(1), 52-62. https://doi.org/10.61969/jai.1337500
- Barcenilla, J., & Bastien, C. (2009). L’acceptabilité des nouvelles technologies : quelles relations avec l’ergonomie, l’utilisabilité et l’expérience utilisateur? [Acceptability of innovative technologies: relationship between ergonomics, usability, and user experience]. Le travail humain, 72(4), 311-331. https://doi.org/10.3917/th.724.0311
- Bobillier Chaumon, M.-E. (2003). Évolutions techniques et mutations du travail : émergence de nouveaux modèles d’activité [Technological advances and mutations in the work environment: Emergence of new models of activity]. Le travail humain, 66(2), 161-192. https://doi.org/10.3917/th.662.0161
- Bobillier Chaumon, M.-E. (2016). L’acceptation située des technologies dans et par l’activité : premiers étayages pour une clinique de l’usage [The situated acceptance of ICT in/for the activity: Towards a clinical use]. Psychologie du travail et des organisations, 22(1), 4-21. https://doi.org/10.1016/j.pto.2016.01.001
- Bobillier Chaumon, M.-E. (2021). Exploring the situated acceptance of emerging technologies in and concerning activity: Approaches and processes. In M.-E. Bobillier Chaumon (dir.), Digital transformations in the challenge of activity and work: Understanding and supporting technological changes (pp. 237-256). Wiley. https://doi.org/pbxm
- Bobillier Chaumon, M.-E., & Dubois, M. (2009). L’adoption des technologies en situation professionnelle : quelles articulations possibles entre acceptabilité et acceptation? [The adoption of technologies in professional settings: What possible connections between acceptability and acceptance?]. Le travail humain, 72(4), 355-382. https://doi.org/10.3917/th.724.0355
- Brangier, É., Hammes-Adelé, S., & Bastien, J.-M. C. (2010). Analyse critique des approches de l’acceptation des technologies : de l’utilisabilité à la symbiose humain-technologie-organisation [Critical analysis of technology acceptance approaches: From usability to human-technology-organization symbiosis]. European Review of Applied Psychology, 60(2), 129-146. https://doi.org/10.1016/j.erap.2009.11.002
- Cahour, B., & Lancry, A. (2011). Émotions et activités professionnelles et quotidiennes [Emotions in professional and daily activities]. Le travail humain, 74(2), 97-106. https://doi.org/10.3917/th.742.0097
- Chan, C. K. Y., & Hu, W. (2023). Students’ voices on generative AI: Perceptions, benefits, and challenges in higher education. International Journal of Educational Technology in Higher Education, 20(1), Article 43. https://doi.org/gshsfg
- Chi, M. T. (1997). Quantifying qualitative analyses of verbal data: A practical guide. Journal of the Learning Sciences, 6(3), 271-315. https://doi.org/b4gknx
- Cotton, D. R. E., Cotton, P. A., & Shipway, J. R. (2023). Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International, 61(2), 228-239. https://doi.org/grzhk7
- Crittenden, W. F., Biel, I. K., & Lovely, W. A. (2019). Embracing digitalization: Student learning and new technologies. Journal of Marketing Education, 41(1), 5-14. https://doi.org/gh33kk
- Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319‑340. https://doi.org/10.2307/249008
- Dillon, A., & Morris, M. G. (1999). Power, perception and performance: From usability engineering to technology acceptance with the P3 model of user response. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 43, No. 19, pp. 1017-1021). Sage. https://doi.org/fzsr84
- Doshi, A. R., & Hauser, O. P. (2024). Generative AI enhances individual creativity but reduces the collective diversity of novel content. Science Advances, 10(28). https://doi.org/10.1126/sciadv.adn5290
- Duarte, F. (2025, March 25). Number of ChatGPT Users (March 2025). Exploding Topics. https://explodingtopics.com/blog/chatgpt-users
- Epley, N., Waytz, A., Akalis, S., & Cacioppo, J. T. (2008). When we need a human: Motivational determinants of anthropomorphism. Social Cognition, 26(2), 143-155. https://doi.org/10.1521/soco.2008.26.2.143
- Farrelly, T., & Baker, N. (2023). Generative artificial intelligence: Implications and considerations for higher education practice. Education Sciences, 13(11), Article 1109. https://doi.org/10.3390/educsci13111109
- García Peñalvo, F. J., Llorens-Largo, F., & Vidal, J. (2024). La nueva realidad de la educación ante los avances de la inteligencia artificial generativa [The new reality of education in the face of advances in generative artificial intelligence]. RIED – Revista Iberoamericana de Educación a Distancia, 27(1), 9-39. https://doi.org/10.5944/ried.27.1.37716
- Gašević, D., Siemens, G., & Sadiq, S. (2023). Empowering learners for the age of artificial intelligence. Computers and Education: Artificial Intelligence, 4, Article 100130. https://doi.org/10.1016/j.caeai.2023.100130
- Goudey, A., & Bonnin, G. (2016). Must smart objects look human? Study of the impact of anthropomorphism on the acceptance of companion robots. Recherche et applications en marketing (English Edition), 31(2), 2-20. https://doi.org/ghw4r5
- Guthrie, S. E. (1997). Anthropomorphism: A definition and a theory. In R. W. Mitchell, N. S. Thompson, & H. L. Miles (Eds.), Anthropomorphism, anecdotes, and animals (pp. 50‑58). State University of New York Press.
- Institut Le Sphinx, & Compilatio (2023, November 7). Survey results: Teachers and students confront their views on AI [Press release]. https://compilatio.net/...
- Janis, I. L. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Houghton Mifflin. https://archive.org/...
- KPMG (2024, October 21). Students using generative AI confess they’re not learning as much [News release]. https://kpmg.com/...
- Miao, F., & Holmes, W. (2023). Guidance for generative AI in education and research. UNESCO. https://doi.org/10.54675/EWZM9535
- Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Sage.
- Mori, M. (2012). The uncanny valley [from the field] (K. F. MacDorman & N. Kageki, Trans.). IEEE Robotics & Automation Magazine, 19(2), 98-100. https://doi.org/10.1109/MRA.2012.2192811 (Original work published 1970)
- Nielsen, J. (1994). Usability engineering. Morgan Kaufmann.
- Nyholm, S. (2018). Attributing agency to automated systems: Reflections on human-robot collaborations and responsibility-loci. Science and Engineering Ethics, 24(4), 1201-1219. https://doi.org/gd4sw3
- Nunamaker, J. F., Jr., Dennis, A. R., Valacich, J. S., & Vogel, D. R. (1991). Information technology for negotiating groups: Generating options for mutual gain. Management Science, 37(10), 1325-1346. https://doi.org/10.1287/mnsc.37.10.1325
- Paillé, P., & Mucchielli, A. (2016). L’analyse qualitative en sciences humaines et sociales [Qualitative analysis in the humanities and social sciences] (4th ed.). Armand Colin.
- Pickering, J. B., Engen, V., & Walland, P. (2017). The interplay between human and machine agency. In M. Kurosu (Ed.), Human-computer interaction. User interface design, development and multimodality – HCI 2017 (Lecture notes in computer science, vol. 10271, pp. 47-59). Springer. https://doi.org/pbzq
- Popenici, S. A. D., & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching and learning in higher education. Research and Practice in Technology Enhanced Learning, 12, Article 22. https://doi.org/gdvnnf
- Proust-Androwkha, S. (2022). Description de la mise en œuvre d’une démarche inductive pour caractériser les perceptions de présence des pairs-apprenants dans le cadre de la réalisation d’activités collectives à distance [Description of the implementation of an inductive approach to characterize peer-learners’ perceptions of presence in the context of collective distance activities]. Distances et médiations des savoirs, (38). https://doi.org/10.4000/dms.7812
- Proust-Androwkha, S., & Jézégou, A. (2019). Présence socio-cognitive lors d’une activité collective et à distance synchrone : une étude empirique réalisée auprès de trois groupes d’enseignants en situation de formation [Socio-cognitive presence during a distance, synchronous, group learning task: An empirical study conducted with three groups of teachers]. Revue internationale des technologies en pédagogie universitaire, 16(3), 22-38. https://doi.org/10.18162/ritpu-2019-v16n3-02
- Rabardel, P., & Béguin, P. (2005). Instrument mediated activity: From subject development to anthropocentric design. Theoretical Issues in Ergonomics Science, 6(5), 429-461. https://doi.org/fgbzgr
- Rosenberg, A. L. (2023). Institutional ethnographies on digital technologies: Investigating and developing critical digital literacy practices with high school students [Ph.D. dissertation, McGill University, Canada]. eScholarship. https://escholarship.mcgill.ca/concern/theses/m613n403g
- Shachak, A., Kuziemsky, C., & Petersen, C. (2019). Beyond TAM and UTAUT: Future directions for HIT implementation research. Journal of Biomedical Informatics, 100, Article 103315. https://doi.org/10.1016/j.jbi.2019.103315
- Shaer, O., Cooper, A., Mokryn, O., Kun, A. L., & Shoshan, H. B. (2024). AI-augmented brainwriting: Investigating the use of LLMs in group ideation. In F. Floyd Mueller, P. Kyburz, J. R. Williamson, C. Sas, M. L. Wilson, P. Toups Dugas & I. Shklovski (Eds.), CHI ’24 – Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (article 150). ACM. https://doi.org/gtxzrm
- Thellman, S., de Graaf, M., & Ziemke, T. (2022). Mental state attribution to robots: A systematic review of conceptions, methods, and findings. ACM Transactions on Human-Robot Interaction, 11(4), Article 41. https://doi.org/10.1145/3526112
- Toma, R. B., & Yánez-Pérez, I. (2024). Effects of ChatGPT use on undergraduate students’ creativity: A threat to creative thinking? Discover Artificial Intelligence, 4, Article 74. https://doi.org/pb3h
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. In I. Guyon, U. Von Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan & R. Garnett (Eds.), Advances in neural information processing systems 30 (NIPS 2017). https://papers.nips.cc/...
- Venkatesh, V., & Bala, H. (2008). Technology acceptance model 3 and a research agenda on interventions. Decision Sciences, 39(2), 273-315. https://doi.org/bpkdfj
- Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186-204. https://doi.org/10.1287/mnsc.46.2.186.11926
- Venkatesh, V., Morris, M., Davis, G., & Davis, F. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478. https://doi.org/10.2307/30036540
- Venkatesh, V., Thong, J. Y., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly, 36(1), 157-178. https://doi.org/10.2307/41410412
- Waytz, A., Heafner, J., & Epley, N. (2014). The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. Journal of Experimental Social Psychology, 52, 113-117. https://doi.org/10.1016/j.jesp.2014.01.005
- Zhou, T., & Zhang, C. (2024). Examining generative AI user addiction from a C-A-C perspective. Technology in Society, 78, Article 102653. https://doi.org/10.1016/j.techsoc.2024.102653

