AI Chatbots: An Answer to the High School Counselor Shortage, or a Barrier?

In response to the scarcity of high school counselors, Jon Siapno, a seasoned college and career counselor from the Bay Area, began developing a chatbot during the pandemic. He initially built it on IBM’s Watson for question answering, but with the advent of generative artificial intelligence he recognized the technology’s transformative potential and anticipated a paradigm shift in the education landscape.

Siapno envisioned a scenario where students at Making Waves Academy in Richmond, California, could engage with an AI Copilot to discuss their academic and career pathways. This AI-powered chatbot, armed with knowledge about colleges and professions, aimed to assist students in addressing fundamental queries before delving into more personalized conversations with counselors like Siapno, as detailed by CalMatters.

Notably, nearly a quarter of U.S. schools lack a dedicated counselor, highlighting the pressing need for innovative solutions. Although Californian high schools exhibit a relatively better counselor-to-student ratio, the figures still fall short of the recommended standards set by the American School Counselor Association.

Siapno was not alone in recognizing the scalability of generative AI in counseling services. Numerous AI bots, bearing human-like monikers such as Ava, Kelly, and Ethan, have emerged to aid individuals in navigating educational and career choices. Despite the proliferation of such tools, concerns have been raised about the potential repercussions of students forming bonds with AI counselors instead of human counterparts.

Julia Freeland Fisher, the education director at the Clayton Christensen Institute, cautioned against underestimating the impact of AI-powered counseling on students’ social and academic development. Highlighting the significance of weak ties in fostering success, she emphasized the need to evaluate the consequences of relying extensively on AI for guidance.

As the discussion around regulating AI companions for students gains traction in California, stakeholders must consider the unintended consequences of an overreliance on AI-driven counseling services. While AI chatbots offer a novel approach to supporting students, there is a need to tread cautiously to prevent the emergence of unforeseen challenges in the realm of education and career guidance.

Making Waves Academy ensures that all its graduates meet the minimum admissions requirements for California’s four-year public colleges. Nine in 10 graduates pursue higher education, and the Making Waves Education Foundation provides them with personalized coaching, scholarships, budget planning, and career guidance to help them graduate on time, debt-free, and with job offers. Patrick O’Donnell, CEO of Making Waves, acknowledges the scarcity of counselors in schools and the challenge of providing personalized guidance; the Making Waves AI Copilot has been especially helpful for younger students seeking career information.

CareerVillage.org has also used generative AI to scale career advice, offering its ChatGPT-powered AI career coach, Coach, for free online and in educational institutions. Coach for Nurses, a specialized version, provides round-the-clock career exploration support tailored to users’ career stages, interests, and goals.

While acknowledging AI’s limitations in offering empathetic guidance, educators like Shakira Henderson see value in AI tools as supplements to human advisors. Marcus Strother of MENTOR California highlights the importance of tools like Coach for young people, especially those in underserved communities who may lack access to mentorship opportunities; he likens Coach to having a mentor in one’s pocket.

San Diego Democrat Sen. Steve Padilla has introduced Senate Bill 243, aimed at safeguarding children from the potential harms of chatbots. The legislation would restrict companies from designing chatbots that use psychological tactics to manipulate users into engaging more often, responding faster, or chatting longer. Such design features can foster addictive behavior, keeping individuals from healthy activities or leading them to form unhealthy emotional attachments to the bots.

The addictive qualities of certain apps have long been a concern with social media, particularly among young users. Research by Freeland Fisher at the Clayton Christensen Institute highlighted the issue and quoted Vinay Bhaskara, co-founder of CollegeVine, which launched a free AI counselor named Ivy in 2023. Bhaskara shared anecdotes of students forming close bonds with the chatbot, leading to both heartwarming and concerning situations.

Bhaskara emphasized that CollegeVine’s chatbot was designed to be friendly and approachable so that students would feel at ease using it. Millions of students have benefited from the tool, but its heavy use outside of normal hours has raised concerns about potential risks.

The bill proposed by Senator Padilla aims to address cases where chatbots have influenced children to engage in dangerous behaviors. By establishing guidelines, the legislation intends to balance the benefits of chatbot assistance with the need to protect young users.

Freeland Fisher warned that AI companions used for companionship and romantic interactions pose a greater risk than AI advisors for educational or career purposes. She suggested that caution should be exercised by schools and developers when integrating AI solutions to supplement counseling services.

The potential danger lies in the possibility of these tools gradually replacing meaningful interactions with human advisors and mentors, which are crucial for students’ social development and well-being. Organizations like Making Waves and CareerVillage are acknowledging the risks associated with chatbots and taking steps to address them, such as retiring certain AI programs to prioritize their mission of providing support and guidance to those in need.

The challenge is finding ways to use technology to help children develop social connections, not merely to answer their questions about college and career.

CareerVillage has implemented measures that speak to Padilla’s concerns. Coach informs users that the more they engage with it, the more personalized its recommendations become, but the chatbot is designed to stick to career development: if users try to steer the conversation off topic, Coach redirects them. Chung, CareerVillage’s executive director, emphasized that the company has built in safeguards to keep users from forming emotional attachments to the chatbot. It is a challenging task, but Chung believes it is achievable with effort.

Data reporter Erica Yee contributed to this article, which was produced by CalMatters and reviewed and distributed by Stacker.
