
GenAI Models

To incorporate a Large Language Model (LLM) into your code, initialize it by importing the relevant model interface from the mtllm.llms module.

Below is the list of models and model providers available out of the box with jac-lang.

Cloud Hosted LLMs (API Clients)

Note:

  • These LLMs require an API key and the relevant Python libraries to be installed:
pip install mtllm[openai]
pip install mtllm[anthropic]
pip install mtllm[groq]
pip install mtllm[together]

Running Local LLMs

  • Ollama

    Download and install Ollama, then start the server by running ollama serve. Pull and run your model of choice by running ollama run <model_name> in a new terminal.

  • HuggingFace

    Download and run open-source LLMs from the plethora of models available on the Hugging Face website.

Note:

  • Running local LLMs can be demanding on your hardware: an underpowered machine may either fail to run the model or suffer degraded inference performance. Check that your system meets the model's requirements before running it locally.

In the Jac program where you need to run LLM inference, initialize the model as shown in the following template snippets.

import from mtllm.llms {OpenAI}

glob llm = OpenAI(
            model_name = "gpt-4o"
            );
import from mtllm.llms {Anthropic}

glob llm = Anthropic(
            model_name = "claude-3-sonnet-20240229"
            );
import from mtllm.llms { Gemini }

glob llm = Gemini(
            model_name="gemini-2.0-flash"
            );
import from mtllm.llms {Groq}

glob llm = Groq(
            model_name = "llama3-8b-8192", # See the Groq website for available models
            );
import from mtllm.llms {TogetherAI}

glob llm = TogetherAI(
            model_name = "meta-llama/Llama-2-70b-chat-hf" # See the Together AI website for available models
            );
import from mtllm.llms {Ollama}

glob llm = Ollama(
            model_name = "llama3:8b" # Pulls the model if it does not exist
            );
import from mtllm.llms {Huggingface}

glob llm = Huggingface(
            model_name = "mistralai/Mistral-7B-v0.3" # Pulls the model if it does not exist
            );

In each of these examples, an llm model is defined; it can be initialized with model-specific attributes.

Note:

  • To visualize the prompts during inference, enable verbose mode by adding verbose = True as an argument when defining the LLM.
  • Model hyperparameters such as temperature can be passed to the LLM as keyword arguments.
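
For example, both options can be combined when defining the model. This is a minimal sketch; the exact set of supported hyperparameters depends on the provider's API.

```jac
import from mtllm.llms { OpenAI }

# verbose=True prints the constructed prompts during inference;
# temperature is forwarded to the underlying API as a keyword argument.
glob llm = OpenAI(
            model_name="gpt-4o",
            verbose=True,
            temperature=0.7
            );
```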

This approach initializes the desired model as a named code construct (in this case, llm), making it easy to reference throughout your program.
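
As a usage sketch, the named llm object can then be attached to a function with mtllm's by syntax, so the LLM generates the function's output at call time. The translate_to_french function below is a hypothetical example, not part of the mtllm API.

```jac
import from mtllm.llms { OpenAI }

glob llm = OpenAI(model_name="gpt-4o");

# Hypothetical example: the LLM produces this function's result
# from its signature and argument values.
can translate_to_french(text: str) -> str by llm();

with entry {
    print(translate_to_french("Good morning"));
}
```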