After running an onboarding session for the Digital Learning Designer programmes, it became apparent that new apprentices were struggling with the sheer volume of information. They often had many questions afterwards, and although they could reach out to coaches via Slack, the coaches were sometimes busy and instant answers were desirable.
FAQs researched and tracked over time, stored in Supabase
Captures metadata of user interactions through Slack
Agent searches the database and provides answers to post-onboarding questions
Periodic analysis agent identifies pain points and suggests improvements
Human approval guardrail: responses are escalated to humans if AI confidence is low (see the sketch after this list)
Users can opt out of AI and request a coach directly
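To make the flow concrete, here is a minimal sketch of how the answering agent could wire these pieces together. It assumes a Supabase project with a `faqs` table (`question`, `answer` columns) and an `interactions` table for metadata, a Slack bot token, a hypothetical `#dld-coaches` escalation channel, and a crude word-overlap score standing in for the real confidence measure; the names and the 0.7 threshold are illustrative, not the actual implementation.

```python
import os

from slack_sdk import WebClient
from supabase import create_client

# Hypothetical configuration; table names, channel and threshold are assumptions.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])
slack = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
CONFIDENCE_THRESHOLD = 0.7           # assumed cut-off for escalating to a coach
ESCALATION_CHANNEL = "#dld-coaches"  # hypothetical coaches channel


def match_score(query: str, faq_question: str) -> float:
    """Crude word-overlap score standing in for a real relevance/confidence measure."""
    q_words = set(query.lower().split())
    f_words = set(faq_question.lower().split())
    return len(q_words & f_words) / max(len(q_words), 1)


def log_interaction(question: str, user_id: str, answered: bool) -> None:
    """Store interaction metadata in Supabase for the periodic analysis agent."""
    supabase.table("interactions").insert(
        {"question": question, "user_id": user_id, "answered": answered}
    ).execute()


def escalate(question: str, channel: str, user_id: str, reason: str) -> None:
    """Human-approval guardrail: hand the question to a coach and tell the learner."""
    slack.chat_postMessage(
        channel=ESCALATION_CHANNEL,
        text=f"Escalated question from <@{user_id}> ({reason}): {question}",
    )
    slack.chat_postMessage(
        channel=channel,
        text="I've passed this on to a coach, who will get back to you shortly.",
    )
    log_interaction(question, user_id, answered=False)


def answer_question(user_question: str, channel: str, user_id: str) -> None:
    """Reply with the best-matching FAQ, escalating when confidence is low."""
    # Learners can opt out of AI support and request a coach directly.
    if "coach please" in user_question.lower():
        escalate(user_question, channel, user_id, reason="learner opted out of AI")
        return

    faqs = supabase.table("faqs").select("question, answer").execute().data
    best = max(faqs, key=lambda row: match_score(user_question, row["question"]), default=None)

    if best and match_score(user_question, best["question"]) >= CONFIDENCE_THRESHOLD:
        slack.chat_postMessage(channel=channel, text=best["answer"])
        log_interaction(user_question, user_id, answered=True)
    else:
        escalate(user_question, channel, user_id, reason="low confidence match")
```

In practice the confidence score would come from the retrieval or LLM step rather than simple word overlap, but the escalation and opt-out paths stay the same.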
Ensuring FAQ accuracy requires continuous research and tracking, and strong Evals
Balancing automation with human oversight (knowing when to escalate)
Maintaining user trust when AI responses aren't confident enough
Managing metadata collection without creating privacy concerns, as it was of paramount importance to let learners know their data was being processed by AI
Keeping the FAQ database current as onboarding processes evolve (see the analysis sketch after this list)
Educating users about when/how to opt out of AI for human support
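The periodic analysis agent listed earlier addresses this last pair of challenges: it reads the logged interaction metadata and flags questions that repeatedly failed to get a confident answer, so coaches know where the FAQ table or the onboarding materials need updating. A rough sketch follows, reusing the assumed `interactions` table and column names from the earlier example; real runs would cluster similar questions rather than count exact duplicates.

```python
import os
from collections import Counter

from supabase import create_client

# The "interactions" table and its columns are assumptions carried over
# from the earlier answering-agent sketch.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])


def pain_point_report(min_count: int = 3) -> list[str]:
    """Flag questions that were repeatedly escalated so coaches can update the FAQs."""
    rows = supabase.table("interactions").select("question, answered").execute().data
    escalated = Counter(
        row["question"].strip().lower() for row in rows if not row["answered"]
    )
    return [
        f"'{question}' was escalated {count} times - consider adding or updating an FAQ"
        for question, count in escalated.most_common()
        if count >= min_count
    ]


if __name__ == "__main__":
    # Run on a schedule (e.g. weekly) and post the results wherever coaches will see them.
    for suggestion in pain_point_report():
        print(suggestion)
```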
The agent had a positive effect and brought to light a number of core challenges with the onboarding process, particularly around the terminology used and the amount of information delivered across the morning and afternoon. Slack's interface was useful; however, it seemed more appropriate to embed the agent within the onboarding eLearning package rather than run it standalone in Slack, where it risked degrading the social interaction Slack was intended to foster. The agent fielded over 50 queries in its first month, highlighting challenges with the volume of information provided and serving as the catalyst for changes.