
CHATBOT CHAOS
-
Issue: Addiction/Dependency on Technology. A student begins to rely on the AI chatbot for emotional support to an unhealthy degree, believing it “understands them more than any real friend or adult.” This matches “Addiction/Dependency on Technology,” where excessive AI use leads to problems or distress.
-
Human in the Loop: Ensure that AI counseling is supplemented by human counselors. For example, schedule regular check-ins with a trained counselor so that students do not rely solely on the bot. This keeps human care in the loop alongside the AI tool.
-
Risk Level: High. A vulnerable student’s well-being is at stake; relying on a generic AI response to signs of self-harm is extremely dangerous.
-
Decision: Mitigate. We cannot accept this outcome. We would immediately integrate human oversight (trained counselors) into the support process so that at-risk students are never left talking only to an impersonal bot.