# Developing Chat Functions: Getting Started

## What is a Chat Function?
A chat function is a function that calls Large Language Models (LLMs) to respond to student messages, given contextual data such as:

- question data
- user data, such as past responses to the problem

Chat functions host a chatbot. Chatbots capture and automate the process of assisting students with their learning outside the classroom.
## Getting Set Up for Development
- Get the code on your local machine (using GitHub Desktop or the git CLI):
    - For new functions: clone the chat-function-boilerplate template repository. Make sure the new repository is set to public (it needs access to organisation secrets).
    - For existing functions: please make your changes on a new, separate branch.
- If you are creating a new chatbot, you can either edit `src/agents/base_agent` or copy it and rename it based on the name of your chatbot.
- You are now ready to start making changes and implementing features by editing each of the main function-logic files:
    - `src/agents/{base_agent}/{base}_agent.py`: This file contains the main LLM pipeline using LangGraph and LangChain.

      The chat function expects the following arguments when it is called.

      Body with required params:

      ```json
      {
        "message": "hi",
        "params": {
          "conversation_id": "12345Test",
          "conversation_history": [{ "type": "user", "content": "hi" }]
        }
      }
      ```

      Body with optional params:

      ```json
      {
        "message": "hi",
        "params": {
          "conversation_id": "12345Test",
          "conversation_history": [{ "type": "user", "content": "hi" }],
          "summary": " ",
          "conversational_style": " ",
          "question_response_details": "",
          "include_test_data": true,
          "agent_type": {agent_name}
        }
      }
      ```
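To make the schema above concrete, here is a short Python sketch that assembles a request body of that shape. `build_request_body` is a hypothetical helper for illustration only, not part of the boilerplate:

```python
import json

# Hypothetical helper (not part of the boilerplate): build a request
# body matching the schema described above.
def build_request_body(message, conversation_id, conversation_history,
                       **optional_params):
    """Return a request body dict with the required params, plus any
    optional ones (e.g. summary, include_test_data, agent_type)."""
    return {
        "message": message,
        "params": {
            "conversation_id": conversation_id,
            "conversation_history": conversation_history,
            **optional_params,
        },
    }

body = build_request_body(
    "hi",
    "12345Test",
    [{"type": "user", "content": "hi"}],
    include_test_data=True,
)
print(json.dumps(body))
```

Any keys not listed in the schema above should be left out rather than guessed; the function ignores or rejects unknown params depending on its implementation.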
    - `src/agents/{base_agent}/{base}_prompts.py`: This is where you write the system prompts that describe how your AI Assistant should behave and respond to the user.
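As a rough illustration of what a prompts file can contain (the variable names here are assumptions; check the boilerplate's own `base_prompts.py` for the names the pipeline actually imports):

```python
# {base}_prompts.py -- illustrative sketch; the boilerplate's actual
# variable names may differ.
system_prompt = (
    "You are a patient tutor on the Lambda Feedback platform. "
    "Guide the student towards the answer with hints; do not simply "
    "give the solution away."
)

# Hypothetical style modifier, e.g. appended when the request supplies
# a `conversational_style` param.
style_suffix = "Keep responses concise and encouraging."
```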
- If you edited the chatbot agent file name, make sure to add your chatbot's `invoke()` function to the `module.py` file.
- Update the `config.json` file with the name of the chat function.
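For illustration only, a `config.json` entry for a chatbot named `myTutorBot` (a hypothetical name) might look like the fragment below; the exact keys the deployment workflows expect may differ, so check the boilerplate's existing `config.json`:

```json
{
  "name": "myTutorBot"
}
```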
- Please add a `README.md` file to describe the use and behaviour of your chatbot.
- Changes can be tested locally by running the pipeline tests (see Running and Testing Chat Functions Locally):

  ```shell
  pytest src/module_test.py
  ```
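As a sketch of the kind of check such a pipeline test can make, the snippet below uses a stub in place of the real `invoke()`; the real entry point's signature and return shape (including the `chat_response` key used here) are assumptions, so mirror what the boilerplate actually returns:

```python
# Illustrative pytest-style check. `invoke` is a stub standing in for
# the real agent entry point; a real agent would run the LangGraph
# pipeline, while the stub just echoes a response of the assumed shape.
def invoke(message, params):
    return {"chat_response": f"You said: {message}"}

def test_invoke_returns_chat_response():
    result = invoke("hi", {
        "conversation_id": "12345Test",
        "conversation_history": [{"type": "user", "content": "hi"}],
    })
    assert "chat_response" in result
    assert isinstance(result["chat_response"], str)

test_invoke_returns_chat_response()
```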
- Merging commits into the `dev` branch will trigger the `dev.yml` workflow, which builds the Docker image, pushes it to a shared `dev` ECR repository, and deploys an AWS Lambda function reachable over HTTP. In order to make your new chatbot available on the `dev` environment of the Lambda Feedback platform, you will have to get in contact with the ADMINS on the platform.
- You can now test the deployed chat function using your preferred request client (such as Insomnia, Postman, or simply `curl` from a terminal). `DEV` functions are made available at:

  `https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/<function name as defined in config.json>`

  Example request to chatFunctionBoilerplate-dev:

  ```shell
  curl --location 'https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/chatFunctionBoilerplate-dev' \
    --header 'Content-Type: application/json' \
    --data '{
      "message": "hi",
      "params": {
        "conversation_id": "12345Test",
        "conversation_history": [
          { "type": "user", "content": "hi" }
        ]
      }
    }'
  ```
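The same request can be issued from Python's standard library. This sketch only builds the request object; actually sending it requires the real endpoint host, which is elided (`<***>`) here:

```python
import json
import urllib.request

# Build (but do not send) the same request as the curl example.
# The endpoint host is elided; substitute the real URL from the ADMINS.
url = ("https://<***>.execute-api.eu-west-2.amazonaws.com"
       "/default/chat/chatFunctionBoilerplate-dev")
body = {
    "message": "hi",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [{"type": "user", "content": "hi"}],
    },
}
req = urllib.request.Request(
    url,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment with a real URL
print(req.get_method(), req.get_header("Content-type"))
```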
- Once the `dev` chat function is fully tested, you can merge the code into the default branch (`main`). This triggers the `main.yml` workflow, which deploys the `staging` and `prod` versions of your chat function. Please contact the ADMINS to provide you with the URLs for the `staging` and `prod` versions of your chat function.
In order to make your new chat function available on any of the environments of the Lambda Feedback platform, you will have to get in contact with the ADMINS on the platform.