
Developing Chat Functions: Getting Started

What is a Chat Function?

A chat function is a function that calls Large Language Models (LLMs) to respond to students' messages, given contextual data:

  • question data
  • user data such as past responses to the problem

Chat functions host a chatbot. Chatbots capture and automate the process of assisting students with their learning outside the classroom.

Getting Set Up for Development

  1. Get the code on your local machine (using GitHub Desktop or the git CLI).

    • For new functions: clone the chat-function-boilerplate template repo. Make sure the new repository is set to public (it needs access to organisation secrets).
    • For existing functions: please make your changes on a new, separate branch.
  2. If you are creating a new chatbot, you can either edit src/agents/base_agent directly or copy it and rename it after your chatbot.

  3. You are now ready to start making changes and implementing features by editing each of the main function-logic files:

    1. src/agents/{base_agent}/{base}_agent.py: This file contains the main LLM pipeline using LangGraph and LangChain.
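
    For orientation, here is a minimal sketch of what such a pipeline might look like. The state fields, the node name, and the use of ChatOpenAI are illustrative assumptions, not the boilerplate's actual code:

    from typing import TypedDict

    from langchain_core.messages import HumanMessage, SystemMessage
    from langchain_openai import ChatOpenAI
    from langgraph.graph import StateGraph, START, END

    class State(TypedDict):
        message: str    # incoming student message
        response: str   # the chatbot's reply

    llm = ChatOpenAI(model="gpt-4o-mini")  # assumed model; use whatever your agent is configured with

    def call_model(state: State) -> dict:
        # single node that sends the student message to the LLM
        result = llm.invoke([
            SystemMessage(content="You are a helpful tutor."),
            HumanMessage(content=state["message"]),
        ])
        return {"response": result.content}

    graph = StateGraph(State)
    graph.add_node("call_model", call_model)
    graph.add_edge(START, "call_model")
    graph.add_edge("call_model", END)
    agent = graph.compile()  # agent.invoke({"message": "hi"}) returns the updated state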

    2. The chat function expects the following arguments when it is called:

    Body with necessary Params:

    {
        "message":"hi",
        "params":{
                "conversation_id":"12345Test",
                "conversation_history": [{"type":"user","content":"hi"}]
        }
    }
    

    Body with optional Params:

    {
        "message":"hi",
        "params":{
                "conversation_id":"12345Test",
                "conversation_history":[{"type":"user","content":"hi"}],
                "summary":" ",
                "conversational_style":" ",
                "question_response_details": "",
                "include_test_data": true,
                "agent_type": {agent_name}
        }
    }
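
    Because the optional params may be absent, a handler will typically unpack them with defaults before passing them to the agent. The function below is only an illustrative sketch; the boilerplate's actual entry point may differ:

    def parse_params(body: dict) -> dict:
        # required fields
        params = body["params"]
        parsed = {
            "message": body["message"],
            "conversation_id": params["conversation_id"],
            "conversation_history": params["conversation_history"],
        }
        # optional fields fall back to sensible defaults
        parsed.update({
            "summary": params.get("summary", ""),
            "conversational_style": params.get("conversational_style", ""),
            "question_response_details": params.get("question_response_details", ""),
            "include_test_data": params.get("include_test_data", False),
            "agent_type": params.get("agent_type", "base_agent"),
        })
        return parsed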
    
  4. src/agents/{base_agent}/{base}_prompts.py: This is where you can write the system prompts that describe how your AI Assistant should behave and respond to the user.
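
    An illustrative example of what a prompts file might contain (the constant name and wording are assumptions, not the boilerplate's actual prompt):

    # {base}_prompts.py -- illustrative only
    SYSTEM_PROMPT = (
        "You are a patient teaching assistant for Lambda Feedback. "
        "Guide the student towards the answer with hints and questions; "
        "do not give the full solution straight away."
    )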

  5. If you renamed the chatbot agent file, make sure to add your chatbot's invoke() function to the module.py file; a hypothetical wiring sketch is shown below.

    1. Update the config.json file with the name of the chat function.
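
    A hypothetical sketch of how a renamed agent's invoke() could be wired into module.py, keyed on the agent_type param; the actual boilerplate may organise this differently:

    # module.py -- hypothetical wiring, adapt to the boilerplate's structure
    from src.agents.base_agent.base_agent import invoke as base_invoke
    from src.agents.my_tutor.my_tutor_agent import invoke as my_tutor_invoke  # your renamed agent

    # map the "agent_type" param to the matching invoke() function
    AGENTS = {
        "base_agent": base_invoke,
        "my_tutor": my_tutor_invoke,
    }

    def chat_module(message: str, params: dict):
        # route to the agent named by the optional "agent_type" param
        invoke = AGENTS[params.get("agent_type", "base_agent")]
        return invoke(message, params)
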
  6. Please add a README.md file to describe the use and behaviour of your chatbot.

  7. Changes can be tested locally by running the pipeline tests using:

    pytest src/module_test.py
    
    For more detail, see Running and Testing Chat Functions Locally.
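
    If you want to add a test of your own alongside the existing ones, it could look roughly like this (the chat_module entry point and its signature are assumptions carried over from the sketch above):

    # illustrative extra test in the style of src/module_test.py
    from src.module import chat_module  # hypothetical entry point

    def test_agent_returns_a_reply():
        params = {
            "conversation_id": "12345Test",
            "conversation_history": [{"type": "user", "content": "hi"}],
        }
        response = chat_module("hi", params)
        assert response  # the pipeline should produce a non-empty reply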

  8. Merge commits into the dev branch trigger the dev.yml workflow, which builds the Docker image, pushes it to a shared dev ECR repository, and deploys an AWS Lambda function reachable via HTTP requests. To make your new chatbot available on the dev environment of the Lambda Feedback platform, you will have to get in contact with the ADMINS of the platform.

  9. You can now test the deployed chat function using your preferred request client (such as Insomnia, Postman, or simply curl from a terminal). Dev functions are made available at:

    https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/<function name as defined in config.json>
    

    Example Request to chatFunctionBoilerplate-dev

    curl --location 'https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/chatFunctionBoilerplate-dev' \
    --header 'Content-Type: application/json' \
    --data '{
            "message": "hi",
            "params": {
                    "conversation_id": "12345Test",
                    "conversation_history": [
                            {
                                    "type": "user",
                                    "content": "hi"
                            }
                    ]
            }
    }'
    
  10. Once the dev chat function is fully tested, you can merge the code into the default branch (main). This will trigger the main.yml workflow, which deploys the staging and prod versions of your chat function. Please contact the ADMINS for the URLs of the staging and prod versions of your chat function.

  11. To make your new chat function available on any environment of the Lambda Feedback platform, you will have to get in contact with the ADMINS of the platform.