Ghosh, S., Kamal, S., Chowdhury, L., Neogi, B., Dey, N. and Sherratt, S.
ORCID: https://orcid.org/0000-0001-7899-4445
(2023)
Explainable AI to understand study interest of engineering students.
Education and Information Technologies.
ISSN 1573-7608
doi: 10.1007/s10639-023-11943-x
Abstract/Summary
Students are the future of a nation. Personalizing student interests in higher education courses is one of the biggest challenges in higher education. Various AI and ML approaches have been used to study student behaviour. Existing AI and ML algorithms are used to identify features in various fields, such as behavioural analysis, economic analysis, image processing, and personalized medicine. However, there are major concerns about the interpretability and understandability of the decisions made by a model, because most AI algorithms are black-box models. In this study, explainable AI (XAI) is used to break the black-box nature of an algorithm and to identify engineering students' interests, and BRB and SP-LIME are used to explain which attributes are critical to their studies. We also used principal component analysis (PCA) for feature selection to identify the student cohort. Clustering the cohort helps to analyse the influential features in terms of engineering discipline selection. The results show that there are some valuable factors that influence students' study and, ultimately, the future of a nation.
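The abstract outlines a pipeline of PCA-based feature selection, cohort clustering, and SP-LIME explanations of a black-box classifier, but this record gives no implementation details. The following is a minimal sketch of what such a pipeline might look like using scikit-learn and the lime package; the synthetic data, feature names, number of components, number of clusters, and the random-forest classifier are all illustrative assumptions and not the authors' actual settings (the BRB component is omitted here).

```python
# Illustrative sketch only: dataset, model choice, and all parameters are
# assumptions, not the settings used in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer
from lime import submodular_pick

# Synthetic stand-in for a table of student attributes (hypothetical features).
X, y = make_classification(n_samples=300, n_features=10, n_informative=5,
                           random_state=0)
feature_names = [f"attr_{i}" for i in range(X.shape[1])]

# PCA for feature/dimensionality reduction, as mentioned in the abstract.
X_pca = PCA(n_components=5, random_state=0).fit_transform(X)

# Cluster the reduced data to form student cohorts (k=3 is an assumption).
cohorts = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_pca)

# A black-box classifier whose decisions the XAI step will explain.
model = RandomForestClassifier(random_state=0).fit(X, y)

# SP-LIME: submodular pick of a small, diverse set of local explanations
# that together cover the globally influential attributes.
explainer = LimeTabularExplainer(X, feature_names=feature_names,
                                 class_names=["not_interested", "interested"],
                                 mode="classification")
sp = submodular_pick.SubmodularPick(explainer, X, model.predict_proba,
                                    sample_size=50, num_features=5,
                                    num_exps_desired=3)
for exp in sp.sp_explanations:
    print(exp.as_list())  # attribute-weight pairs for each picked explanation
```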
| Item Type | Article |
| URI | https://reading-clone.eprints-hosting.org/id/eprint/112533 |
| Identification Number/DOI | 10.1007/s10639-023-11943-x |
| Refereed | Yes |
| Divisions | Life Sciences > School of Biological Sciences > Biomedical Sciences; Life Sciences > School of Biological Sciences > Department of Bio-Engineering |
| Publisher | Springer Link |