# Create Your Own Language Model

This guide shows how to bring your own language model to byLLM. This is helpful if you have a self-hosted language model or use a service that byLLM does not currently support.
IMPORTANT
This guide assumes that you understand how to run inference with your language model. If you are not sure, refer to your language model's documentation.
## Steps
- Create a new class that inherits from the `BaseLLM` class. This can be done in Python or in Jaclang.
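As a starting point, here is a minimal sketch of such a subclass. The hook name `__infer__`, its signature, and the `endpoint` parameter are illustrative assumptions, not byLLM's confirmed API; check the byLLM source for the exact methods your version expects. A stand-in `BaseLLM` is defined inline so the snippet is self-contained (in real code, import it with `from byllm.llms.base import BaseLLM`).

```python
class BaseLLM:  # stand-in for byllm.llms.base.BaseLLM
    def __init__(self, verbose: bool = False, max_tries: int = 10, **kwargs):
        self.verbose = verbose
        self.max_tries = max_tries


class MyLLM(BaseLLM):
    def __init__(self, endpoint: str, **kwargs):
        super().__init__(**kwargs)
        # Hypothetical parameter: the URL of your self-hosted model server.
        self.endpoint = endpoint

    def __infer__(self, meaning_in: str, **kwargs) -> str:
        # Here you would call your model's inference API (for example, an
        # HTTP request to self.endpoint) and return the raw text completion.
        # This placeholder just echoes where the request would have gone.
        return f"[response from {self.endpoint} for prompt of length {len(meaning_in)}]"


llm = MyLLM(endpoint="http://localhost:8000/generate")
print(llm.__infer__("Hello"))
```

The key point is that your class wraps whatever inference mechanism you already have behind the interface byLLM expects.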
- Initialize your model with the required parameters.
```jac
import from my_llm {MyLLM}

# Initialize as global variable
glob llm = MyLLM();

# Initialize as local variable
with entry {
    llm = MyLLM();
}
```
## Changing the Prompting Techniques

You can change the prompting techniques by overriding the following class attributes in your subclass.
```python
from byllm.llms.base import BaseLLM

class MyLLM(BaseLLM):
    byLLM_SYSTEM_PROMPT = 'Your System Prompt'
    byLLM_PROMPT = 'Your Prompt'  # Not recommended to change this
    _METHOD_PROMPTS = {
        "Normal": 'Your Normal Prompt',
        "Reason": 'Your Reason Prompt',
        "Chain-of-Thoughts": 'Your Chain-of-Thought Prompt',
        "ReAct": 'Your ReAct Prompt',
    }
    OUTPUT_FIX_PROMPT = 'Your Output Fix Prompt'
    OUTPUT_CHECK_PROMPT = 'Your Output Check Prompt'

    # Rest of the code
```
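These prompts are plain class attributes, so values set on your subclass simply shadow the defaults defined on `BaseLLM`, and anything you do not override is inherited. A self-contained illustration of that mechanism (using a stand-in base class with made-up default prompts, since the real defaults live inside byllm):

```python
class BaseLLM:  # stand-in for byllm.llms.base.BaseLLM
    # Made-up defaults, standing in for byLLM's real default prompts.
    byLLM_SYSTEM_PROMPT = 'Default System Prompt'
    OUTPUT_FIX_PROMPT = 'Default Output Fix Prompt'


class MyLLM(BaseLLM):
    # Override only the system prompt; everything else is inherited.
    byLLM_SYSTEM_PROMPT = 'Your System Prompt'


print(MyLLM.byLLM_SYSTEM_PROMPT)  # the overridden value
print(MyLLM.OUTPUT_FIX_PROMPT)    # inherited default
```

This is why you only need to override the attributes you actually want to change.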
That's it! You have successfully created your own language model for use with byLLM.
NOTICE
We are constantly adding new language models to the library. If you want a new language model supported, please open an issue on the byLLM issue tracker.