Abstract
The authors share the sentiment of other researchers that conducting a literature review and creating a review matrix is a cumbersome task, so the development of research tools that support researchers in their work is always interesting and welcome. The objectives of this study are: (1) to evaluate the features and functionalities of Elicit, SciSpace, and Consensus in facilitating literature review and research, and to recommend the applicability of each tool to a type of research; (2) to assess and identify the strengths of these tools and relate their impact to research workflows and efficiency; and (3) to draw on documented user feedback and opinions to help researchers select the appropriate AI tool. This study adopts a mixed-methods approach. First, a systematic literature review is conducted to gather relevant studies, user feedback, and expert opinions on the AI tools; these same studies serve as the sample for this study. Second, a comparative evaluation of each AI tool, against the original document and against one another, is conducted using established evaluation criteria: search capabilities, document retrieval, summarization accuracy, citation analysis, and integration with existing research workflows. Last, the strengths and weaknesses of each tool are identified in relation to these criteria, and each tool's effectiveness is determined against the original content of the sample material. The findings of this study are that (a) the identified AI tools (Elicit, SciSpace, and Consensus) offer valuable contributions to the research community through productivity-boosting features; (b) each tool has strengths and weaknesses in search capabilities, document retrieval, summarization accuracy, citation analysis, and integration with existing research workflows; and (c) documented user feedback indicates positive experiences with the tools' usability and effectiveness, highlighting their potential to enhance research workflows. This study acknowledges potential limitations, including the reliance on user feedback and the subjective nature of user experiences. The evaluation is based on a specific set of criteria, and results may vary with individual research needs and preferences. The study offers practical implications for researchers, students, and professionals seeking efficient and effective tools for conducting literature reviews: the analysis of Elicit, SciSpace, and Consensus provides insight into their strengths, weaknesses, and potential applications, and the findings support informed decisions about adopting and utilizing these AI tools in research. Unlike existing research, which predominantly emphasizes the application of AI in academic research settings generally, this study examines AI tools designed explicitly for conducting literature reviews. It analyzes how various AI features can enhance the literature review process, contributing a distinct perspective to the ongoing discourse on integrating AI into research methodologies.
Keywords:
- AI tools,
- Literature Review,
- Research Tools,
- Elicit,
- Consensus,
- SciSpace
Appendices
Bibliography
- Academic English Now. (2024, April 17). Automate your literature review search with Consensus AI [Video]. YouTube. https://www.youtube.com/watch?v=ntDF70L_Yug
- Andy Stapleton. (2024a, March 19). How To Use Consensus AI - Don’t Get Left Behind! [Video]. YouTube. https://www.youtube.com/watch?v=I8VC6R7-J6M
- Andy Stapleton. (2024b, March 28). How To Use SciSpace and Copilot - Dominate Research in ONE tool! [Video]. YouTube. https://www.youtube.com/watch?v=TV1tfPzQGfU
- Andy Stapleton. (2024c, May 28). How to Use Elicit AI, Literature Reviews + More: Beginner Tutorial and Research Tips! [Video]. YouTube. https://www.youtube.com/watch?v=rJJPS-EvNfk
- Anstey, L., & Watson, G. (2018). A rubric for evaluating e-learning tools in higher education. Educause Review, 10(09).
- Bansal, G., Nushi, B., Kamar, E., Horvitz, E., & Weld, D. S. (2021, May). Is the most accurate AI the best teammate? Optimizing AI for teamwork. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 35, No. 13, pp. 11405-11414).
- Bruce, C. (2001). Interpreting the scope of their literature reviews: significant differences in research students’ concerns. New Library World, 102(4/5), 158–166. https://doi.org/10.1108/03074800110390653
- Chen, D. T. V., Wang, Y. M., & Lee, W. C. (2016). Challenges confronting beginning researchers in conducting literature reviews. Studies in Continuing Education, 38(1), 47-60.
- Choguill, C. L. (2005). The research design matrix: A tool for development planning research studies. Habitat International, 29(4), 615-626.
- Consensus. (n.d.). Consensus: AI search engine for research. Retrieved January 29, 2025, from https://consensus.app/
- Dakhi, S. (2009). Students' difficulties in reading English newspaper. Jurnal Littera, 2(1), 19-27.
- Dr Amina Yonis. (2023, November 7). How To Write A Strong Literature Review Using AI | Write In 4 Easy Steps [Video]. YouTube. https://www.youtube.com/watch?v=DDGCi-_0-iU
- Dr Amina Yonis. (2024, May 5). The HOTTEST New AI Literature Search Engine | Consensus AI [Video]. YouTube. https://www.youtube.com/watch?v=klu1CWYIitI
- Elicit: Find scientific research papers. (n.d.). Retrieved January 29, 2025, from https://elicit.com/?workflow=table-of-papers
- El Hussein, M. T., Kennedy, A., & Oliver, B. (2017). Grounded theory and the conundrum of literature review: Framework for novice researchers. The Qualitative Report, 22(4), 1198-1210.
- Grant, G. (2024, March 26). Citation needed: Putting 7 AI research tools to the test -- pure AI. Pure AI. https://pureai.com/Articles/2024/03/26/Testing-7-AI-Research-Tools.aspx
- Hyland, K. (2019). Writing in academic contexts: A genre-based approach. Writing Research Quarterly, 40(4), 12-27. https://doi.org/10.1016/j.writres.2019.03.001
- Jain, S. J., Sibbu, K., & Kuri, R. (2023). Conducting Effective Research using SciSpace: A Practical Approach. Authorea Preprints.
- Joulin, A., Grave, E., Mikolov, T., & Ranzato, M. (2017). Bag of Tricks for Efficient Text Classification. arXiv preprint arXiv:1607.01759. https://doi.org/10.48550/arXiv.1607.01759
- Knopf, J. W. (2006). Doing a literature review. PS: Political Science & Politics, 39(1), 127-132.
- Kowalski, M., Liu, X., & Roberts, D. (2021). AI in systematic reviews: Handling contradictory evidence. Research Synthesis Methods, 12(3), 42-57. https://doi.org/10.1002/jrsm.1423
- Kritandani, W., Putra, A. W., Mali, Y. C. G., & Isharyanti, N. (2024). SciSpace for Finding Relevant Literature in English Language Education Contexts: A Technology Review. Indonesian Journal of English Language Studies (IJELS), 10(2), 108-117. https://doi.org/10.24071/ijels.v10i2.9146
- Li, F., Zhang, J., & Zheng, J. (2021). Challenges of AI in the synthesis of domain-specific research: A case study on clinical trials. Journal of Machine Learning Research, 22(88), 3564-3587. https://doi.org/10.1109/JMLR.2021.3524375
- Li, M. (2024). The impact of artificial intelligence on human resource management systems - Applications and risks. Applied and Computational Engineering, 48(1), 7–16. https://doi.org/10.54254/2755-2721/48/20241060
- Mitchell, A., & Rich, M. (2022). Literature Reviews: What are the Challenges, and how can Students and new Business Researchers be Successful?. Electronic Journal of Business Research Methods, 20(3), 99-110.
- Morrison, J. L., Kim, H. S., & Kydd, C. T. (1998). Student preferences for cybersearch strategies: Impact on critical evaluation of sources. Journal of Education for Business, 73(5), 264-268.
- Norris, J., & Lichtenstein, S. (2019). The role of theory in AI-assisted literature synthesis. Journal of Artificial Intelligence Research, 63(1), 45-60. https://doi.org/10.1613/jair.2342
- Passi, S., & Vorvoreanu, M. (2022). Overreliance on AI literature review. Microsoft Research.
- Premnath, S., Arun, A., & R, D. A. (2020). A Qualitative Study of Artificial Intelligence Application Framework in Human Resource Management. https://doi.org/10.31219/osf.io/uqhn2
- Purdy, J., & Tuck, A. (2018). Evaluating AI tools for citation management in academic writing. Information Processing & Management, 55(6), 964-977. https://doi.org/10.1016/j.ipm.2018.06.004
- Ravindran, P. (2022). The need for domain-specific AI tools in evidence synthesis. AI in Health Research, 7(2), 130-145. https://doi.org/10.1145/1234567
- Sastry, M. K., & Mohammed, C. (2013, July). The summary-comparison matrix: A tool for writing the literature review. In 2013 IEEE International Professional Communication Conference (IPCC) (pp. 1-5).
- Schmidt, M., & Haase, M. (2020). Context in evidence synthesis: The challenges of applying AI to systematic reviews. Journal of Research Synthesis, 23(1), 34-56. https://doi.org/10.1016/j.jrs.2020.04.005
- Science Grad School Coach. (2024, June 12). Elicit vs. SciSpace 2024 Literature Review Tool: A Detailed Comparison Between Both Platforms [Video]. YouTube. https://www.youtube.com/watch?v=4aGT9d9FQrc
- SciSpace (Formerly Typeset). (2023, October 31). AI Literature review tool: Elicit Vs SciSpace [Video]. YouTube. https://www.youtube.com/watch?v=MuNpTIx3vSU
- Siau, K., & Wang, W. (2018). The influence of AI on decision-making and trust. Journal of Information Technology, 33(4), 1-12. https://doi.org/10.1177/0268396218785435
- Souifi, L., Khabou, N., Rodriguez, I., & Kacem, A. (2024). Towards the use of AI-Based tools for systematic literature review. Proceedings of the 14th International Conference on Agents and Artificial Intelligence, 595–603. https://doi.org/10.5220/0012467700003636
- Sundaram, G., & Berleant, D. (2023). Automating systematic literature reviews with natural language processing and text mining: A systematic literature review. In International Congress on Information and Communication Technology (pp. 73-92). Springer, Singapore. https://doi.org/10.48550/arXiv.2211.15397
- Tewari, I., & Pant, M. (2020). Artificial Intelligence Reshaping Human Resource Management: A Review. https://doi.org/10.1109/icatmri51801.2020.9398420
- Tsafnat, G., Glasziou, P., Choong, M. K., et al. (2014). Systematic review automation technologies. Systematic Reviews, 3, 1-15.
- Typeset. (n.d.). AI chat for scientific PDFs | SciSpace. Retrieved January 29, 2025, from https://typeset.io
- Vermeulen, L., Smeets, M., & Borg, J. (2020). Peer review and the collaborative writing process in AI-assisted research. Journal of Scholarly Publishing, 51(1), 98-112. https://doi.org/10.1353/scp.2020.0022
- Warburton, J., & Macauley, P. (2014). Wrangling the literature: Quietly contributing to HDR completions. Australian Academic & Research Libraries, 45(3), 159-175.
- Webster, J., & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature review. MIS quarterly, xiii-xxiii.
- Zhai, S., & Liu, Z. (2023). Artificial intelligence technology innovation and firm productivity: Evidence from China. Finance Research Letters, 58, 104437. https://doi.org/10.1016/j.frl.2023.104437

