Jay Mhaiskar
Leah Bailly
CAPS 499
24 November 2025

Personalized Course Guidance: Designing an AI Elective Recommender for Capilano University

Introduction

Choosing electives can be a stressful part of university life. I’ve experienced this firsthand, struggling to figure out which courses align with my interests, improve my GPA, and match the prerequisites I’ve already completed. I would make a rough plan of courses I saw fit and visit my advisor to check whether the plan met the requirements. The primary ways I found out about courses were asking friends and checking MyCapMap, an online tool that shows an outline of the courses needed to graduate. This made the process tougher: word of mouth was not a credible way to determine whether a course matched my interests, and although MyCapMap has a “Plan Ahead” feature, it does not show the prerequisites needed. So a plan can go astray if the student doesn’t check the requirements beforehand. Many students at Capilano University (CapU) follow a similar path and often pick courses based on word of mouth or short descriptions that rarely tell the full story. My position is that an AI-driven elective recommendation bot can streamline this process by analyzing students’ transcripts and suggesting suitable electives based on past performance and interests specified by the student. This bot would enable academic advisors to recommend courses that the student is interested in and, more importantly, already eligible for. Alternatively, students can use this tool to develop an informed course plan before meeting with their academic advisor, making the conversations more focused and productive.

Background and Context

At CapU, students often struggle to choose electives that fit their degree and personal goals. MyCapMap shows program requirements, but it doesn’t guide students on which electives align with their interests, strengths, or future plans.
Academic advisors help fill this gap, but limited access and course availability make it difficult for every student to get tailored advice. As a result, many end up selecting electives by trial and error rather than through informed planning.

As students continue to face this issue, the conversation around a potential AI solution is shaped by a broader academic climate where AI is often met with skepticism. Many fear that automation will replace human labour, especially in roles like advising that rely on structured information. As Stephanie Marshall notes, when technology merely automates human roles, “individuals lose both income and political power, risking destabilisation of democratic institutions” (Marshall). These concerns are understandable. However, critics overlook how overworked advising teams already are. Advisors at CapU routinely manage high student demand and tight scheduling windows and perform manual elective searches. An AI tool would not eliminate these roles; instead, it would help reduce repetitive tasks and free advisors to focus on conversations that require human nuance. By positioning AI as a support rather than a substitute, universities can ease workload pressure while improving the student experience without creating privacy risks or removing staff from essential decision-making.

Algarni and Sheldon’s systematic review maps the rise of personalized recommendation systems in education, noting that “personalized recommendation systems (PRS) are becoming more and more common… and they are now making their way into the educational space” (Algarni and Sheldon 561). Their review shows how universities around the world are experimenting with recommendation tools to make academic planning more manageable for students. Cha and colleagues extend this by studying how learners interact with AI-based course-recommender systems during the search process.
Their findings show that students expect AI to help them organize information and narrow options, explaining that AI tools can “retrieve, filter, and visualize high-quality course-related information” in ways that support personalized decision-making (Cha et al. 7). Other studies highlight how students respond to these systems once they use them. Akbar and co-authors surveyed students who had interacted with AI-supported advising and found that most trusted the system’s suggestions, with 80% of respondents reporting confidence in the recommendations. At the same time, scholars like Potapoff argue for a balanced approach, noting that “AI should act as an augmentation tool, providing reasoning humans can incorporate into their decision-making rather than replacing it” (Potapoff).

Taken together, these studies point to a clear pattern. Students benefit from structured, personalized guidance, and AI can help deliver it. But successful adoption requires human oversight and transparent design. In an advising environment where staff face heavy workloads and many students feel unsure about their elective choices, an AI tool can ease pressure without replacing professional judgment. Instead, it can offer the first layer of organization and clarity, allowing advisors to focus on the parts of the conversation that matter most.

Stakeholder Insights

To understand how an AI-driven elective recommendation tool could fit into existing advising structures, I interviewed three key stakeholders: Urmila Jangra, Alex Zhang, and Jason Madar. Together, their insights provide a well-rounded perspective that combines advising practice, student experience, and technical design.

Image 1 description: A photo taken after my interview with Urmila Jangra (right).

Urmila Jangra, the Bachelor of Science coordinator at CapU, described academic advising as a process centered on both efficiency and accuracy.
Her priority is ensuring that students complete their program requirements and prerequisites without wasting credits. She emphasized that while grades are often seen as indicators of ability, they play a limited role in advising decisions once prerequisites are met. Instead, she focuses on student interests and professional aspirations, such as medical or pharmacy school preparation. Urmila supported the concept of an AI system that filters electives by prerequisites and interests rather than GPA, calling it a tool that could save time for students and advisors alike. However, she cautioned that any AI system must remain transparent and require students to verify results, since even official advising platforms like MyCapMap sometimes make errors.

Image 2 description: A photo taken during my interview with Alex Zhang (right).

Alex Zhang, an academic advisor at CapU, provided a complementary viewpoint focused on the student experience. He explained that course advising is a relational process that starts with understanding a student’s goals, availability, and interests. His approach prioritizes factors such as class schedule, course availability, and personal motivation. Alex believed that students perform better when they take courses that genuinely interest them, rather than those perceived as GPA boosters. He viewed an AI elective recommender as a useful support tool, especially if it is institution-specific and regularly updated. Like Urmila, he stressed that the system should assist rather than replace advisors. He also noted that many advisors are open to using new technologies but often hesitate because of outdated information or a lack of integration between departments. For AI to be trusted, it must be accurate, current, and clearly linked to CapU’s internal systems.

Jason Madar, a computing instructor with AI expertise, contributed a more technical and ethical analysis.
He described AI as a machine that should act as a time-saving assistant, not a “partner.” In his view, universities must implement safeguards that make the tool’s automated nature clear to users. He advised against giving AI a human tone and suggested building in disclaimers that remind students it is not a substitute for professional advising. Technically, he identified challenges such as maintaining data reliability, filtering out irrelevant outputs, and handling cases where no valid course options exist. He recommended a hybrid model: rule-based filtering first to ensure factual accuracy, followed by machine learning to adapt recommendations based on user behavior and preferences.

Across these interviews, a common theme emerged: AI has clear potential to streamline elective planning, but only when balanced with human oversight. Urmila and Alex viewed it as a time-saving, student-preparatory tool, while Jason positioned it as a system that must be transparent and tightly controlled. Together, their perspectives reinforce that AI can enhance advising efficiency and personalization, provided it operates under ethical, well-monitored conditions that maintain the human element in academic decision-making.

The AI Bot Project

The goal of this project was to create a tool that helps students navigate elective selection with more clarity and less frustration. The system takes the student’s unofficial transcript in PDF format as input and extracts the course code, course name, and letter grade. I chose to use the unofficial transcript because it avoids student data concerns: it doesn’t include a name or student ID, it’s free, and it can be downloaded online without restrictions. I also don’t store any processed information in a database. Working this way helps me avoid ethical issues, since the transcript contains no identifiers and none of the processed data is collected.
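As a rough illustration of the extraction step, the sketch below assumes a course row on the transcript looks like “COMP 115  Introduction to Computer Science  A-”; the actual layout of CapU’s unofficial transcript will differ, and a real implementation would first pull raw text out of the PDF with a library such as pypdf before matching lines.

```python
import re

# Hypothetical shape of one course row from the unofficial transcript,
# e.g. "COMP 115  Introduction to Computer Science  A-".
# The pattern and grade scale are illustrative, not CapU's real format.
COURSE_ROW = re.compile(
    r"^(?P<code>[A-Z]{3,4}\s\d{3})\s+"   # course code, e.g. "COMP 115"
    r"(?P<name>.+?)\s+"                  # course title (non-greedy)
    r"(?P<grade>[A-F][+-]?|W|P)$"        # letter grade, or W/P notation
)

def parse_transcript(lines):
    """Return (code, name, grade) for every line that looks like a course row;
    non-course lines (GPA summaries, headers) are simply skipped."""
    records = []
    for line in lines:
        m = COURSE_ROW.match(line.strip())
        if m:
            records.append((m["code"], m["name"], m["grade"]))
    return records
```

Skipping anything that does not match the pattern is deliberate: it lets the parser ignore headers and GPA summary lines without any transcript-specific special cases.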
The bot also asks students to enter their areas of interest, which helps narrow the suggestions to courses they are more likely to enjoy. The prototype currently runs from a terminal environment, with a Python FastAPI backend and a React Native frontend that delivers a clean user interface where students can upload transcripts, enter interests, and receive a structured list of eligible electives.

Image 3 description: Bot interface showing a transcript upload section and 18 interest categories for filtering elective options.

Image 4 description: Five course recommendations generated by the bot, with reasoning for each recommended course.

The logic follows a hybrid approach that mixes rule-based filtering with AI-generated reasoning. In simple terms, the rule-based layer handles data extraction and eligibility filtering, while the AI layer generates the reasoning for each course the student can take. This structure keeps the system accurate and personalized, following Jason’s advice that reliable educational tools should prioritize correctness and use hybrid designs. The results above show the output for my own transcript after selecting “science” and “humanities” as my interests. The bot clearly explains why each recommended course is a good fit, which completed courses satisfy the requirement, and whether the grade threshold has been met.

Implications for Students and Advisors

For advisors, the bot can reduce repetitive tasks, such as scanning transcripts for eligibility or manually cross-checking prerequisites. It allows them to spend more time on the human side of advising, which includes clarifying long-term goals, addressing concerns, and helping students make choices that feel meaningful rather than random. At the same time, all three interviewees warned that accuracy and oversight matter. The bot should never replace advisors and should always encourage students to verify results before making decisions.
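The rule-based eligibility layer described in the project section can be sketched as below. All course codes, interest tags, prerequisite rules, and grade thresholds here are invented for illustration; the real bot would draw on CapU’s actual catalogue, and the AI layer would then phrase the per-course reasoning for whatever survives this filter.

```python
# Invented mini-catalogue standing in for CapU's real course data.
CATALOGUE = {
    "PHIL 208": {"tags": {"humanities"}, "prereqs": [], "min_grade": None},
    "BIOL 210": {"tags": {"science"}, "prereqs": ["BIOL 110"], "min_grade": "C"},
    "CMNS 220": {"tags": {"humanities"}, "prereqs": ["CMNS 120"], "min_grade": "C+"},
}

# Letter grades ordered lowest to highest, for threshold comparisons.
GRADE_ORDER = ["F", "D", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+"]

def meets_grade(earned, required):
    """True if the earned grade meets the (optional) minimum threshold."""
    if required is None:
        return True
    return GRADE_ORDER.index(earned) >= GRADE_ORDER.index(required)

def eligible_electives(completed, interests):
    """Rule-based layer: keep catalogue courses the student has not yet
    taken, that overlap the chosen interest tags, and whose prerequisites
    are all completed at or above the course's grade threshold."""
    results = []
    for code, info in CATALOGUE.items():
        if code in completed:
            continue                       # already taken
        if not info["tags"] & interests:   # no interest overlap
            continue
        if all(pre in completed and meets_grade(completed[pre], info["min_grade"])
               for pre in info["prereqs"]):
            results.append(code)
    return sorted(results)
```

For example, a student with a B in BIOL 110 and a C in CMNS 120 who selects “science” and “humanities” would be offered BIOL 210 and PHIL 208, but not CMNS 220, whose C+ threshold is unmet; in the full pipeline the AI layer would then write the explanation shown in the screenshots.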
Future improvements could include adding real-time course availability, better handling of edge cases where no valid options exist, and a more advanced interest-mapping system. A tool like this can make a real difference for students who feel overwhelmed by the number of electives available or unsure about how prerequisites and credit structures work. Instead of scrolling through hundreds of course descriptions, students can arrive at an advising meeting with a small list of classes that fit their goals and academic background. Both Urmila and Alex noted that this kind of preparation would make advising conversations more focused and productive.

Conclusion

This project shows that AI can support academic decision-making in a practical and responsible way. Elective selection is often confusing, and students benefit from tools that organize information and highlight options that match their interests and academic history. By combining rule-based checks with AI reasoning, the bot provides structure without trying to act as a replacement for human advisors. The interviews and research make it clear that the most effective use of AI in academic settings is as an enhancement. When accuracy, transparency, and human oversight remain at the center, AI can reduce confusion, save time, and help students feel more confident in the choices they make. As universities continue exploring how technology can improve student experiences, tools like this offer a realistic blueprint for integrating AI into everyday academic life.

Works Cited

Akbar, Zulfikri, et al. “The Role of Artificial Intelligence-Based Recommendation Systems in Selection of Courses for Students.” Journal of Social Science Utilizing Technology, vol. 1, no. 4, Dec. 2023, pp. 249–60, https://doi.org/10.70177/jssut.v1i4.671.

Alfredo, Riordan, et al. “Human-Centred Learning Analytics and AI in Education: A Systematic Literature Review.” Computers and Education: Artificial Intelligence, vol. 6, June 2024, p. 100215, https://doi.org/10.1016/j.caeai.2024.100215.

Algarni, Shrooq, and Frederick Sheldon. “Systematic Review of Recommendation Systems for Course Selection.” Machine Learning and Knowledge Extraction, vol. 5, no. 2, June 2023, pp. 560–96, https://doi.org/10.3390/make5020033.

Cha, Seungeon, et al. “The Impact of AI-Based Course-Recommender System on Students’ Course-Selection Decision-Making Process.” Applied Sciences, vol. 14, no. 9, Jan. 2024, p. 3672, https://doi.org/10.3390/app14093672.

Marshall, Stephanie. “Universities Must Promote AI Augmentation, Not Automation.” Times Higher Education (THE), 7 Apr. 2025, https://www.timeshighereducation.com/opinion/universities-must-promote-ai-augmentation-not-automation.

Potapoff, Julia. “AI in the Driver’s Seat? Research Examines Human-AI Decision-Making Dynamics.” Foster Business Magazine, https://magazine.foster.uw.edu/insights/ai-decision-making-leonard-boussioux/. Accessed 17 Oct. 2025.