# AI-Integrated Programming with byLLM
This guide covers different ways to use byLLM for AI-integrated software development in Jaclang. byLLM provides language-level abstractions for integrating Large Language Models into applications, from basic AI-powered functions to complex multi-agent systems. For agentic behavior capabilities, byLLM includes the ReAct method with tool integration.
## Supported Models
byLLM uses LiteLLM to provide integration with a wide range of models.
> **Note:** Additional models and model-serving platforms are supported through LiteLLM. Refer to the LiteLLM documentation for the full list of model names.
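As a sketch of how a model is typically configured (the import path and model names below are illustrative, not prescriptive), a `Model` instance is declared once and reused by every LLM-integrated function:

```jac
import from byllm { Model }

# Any LiteLLM-compatible model name can be passed; "gpt-4o" is illustrative.
glob llm = Model(model_name="gpt-4o");

# Other serving platforms follow LiteLLM's naming convention, e.g.:
# glob llm = Model(model_name="ollama/llama3");
```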
## MTP for Functions

### Basic Functions

Functions gain LLM capabilities by adding the `by llm` declaration, which eliminates manual API calls and prompt engineering. Such functions accept natural language inputs and generate contextually appropriate outputs.
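A minimal sketch of a basic LLM-integrated function (the function name and model are assumed for illustration):

```jac
import from byllm { Model }

glob llm = Model(model_name="gpt-4o");

# The signature and type hints are all the model receives;
# no prompt string is written by hand.
def translate(text: str, target_language: str) -> str by llm();

with entry {
    print(translate("Good morning!", "French"));
}
```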
### Functions with Reasoning

The `method='Reason'` parameter enables step-by-step reasoning for complex tasks:
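For example (function name and prompt are illustrative):

```jac
# method='Reason' asks the model to work through the problem
# step by step before producing the typed answer.
def solve_word_problem(problem: str) -> int by llm(method="Reason");

with entry {
    print(solve_word_problem(
        "A train travels 60 km/h for 2.5 hours. How many km does it cover?"
    ));
}
```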
### Structured Output Functions
byLLM supports generation of structured outputs. Functions can return complex types:
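A sketch of a function returning a structured type (the `ReviewAnalysis` schema here is invented for illustration):

```jac
enum Sentiment { POSITIVE, NEGATIVE, NEUTRAL }

obj ReviewAnalysis {
    has summary: str;
    has sentiment: Sentiment;
    has rating: int;  # e.g. 1-5
}

# byLLM converts the model's response into the declared return type.
def analyze_review(review: str) -> ReviewAnalysis by llm();
```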
A more complex example using object schema for context and structured output generation is demonstrated in the game level generation example.
## Context-Aware MTP Methods

Methods can also be integrated with LLM capabilities, allowing them to process object state and context. When a method of a class is delegated to an LLM, MTP automatically includes the attributes of the receiving object in the prompt, giving the model extra context.
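A minimal sketch of an LLM-integrated method (object and attribute names are illustrative):

```jac
obj Person {
    has name: str;
    has age: int;
    has hobbies: list[str];

    # The object's attributes (name, age, hobbies) are automatically
    # included in the prompt, so no arguments are needed here.
    def write_bio() -> str by llm();
}
```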
## Adding Explicit Context for Functions, Methods and Objects

Providing appropriate context is essential for good LLM performance. byLLM offers several ways to add context to functions and objects.
### Adding Context with Docstrings
Docstrings provide context for LLM-integrated functions. byLLM uses docstrings to understand function purpose and expected behavior.
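For instance, a docstring can steer behavior that the signature alone does not capture (function name and wording are illustrative):

```jac
"""Translate the text into the target language, preserving tone and idioms."""
def translate(text: str, target_language: str) -> str by llm();
```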
### Adding Context with Semantic Strings (Semstrings)

Jaclang provides semantic strings via the `sem` keyword for describing object attributes and function parameters. This is useful for:
- Describing object attributes with domain-specific meaning
- Adding context to parameters
- Providing semantic information while maintaining clean code
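A sketch of `sem` attaching domain meaning to terse attribute names (the object and descriptions are invented for illustration):

```jac
obj Patient {
    has bp_sys: int;
    has bp_dia: int;
}

# The declarations stay short; the semantics live alongside them.
sem Patient.bp_sys = "Systolic blood pressure in mmHg";
sem Patient.bp_dia = "Diastolic blood pressure in mmHg";
```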
### Additional Context with `incl_info`

The `incl_info` parameter provides additional context to LLM methods for context-aware processing:
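As a hedged sketch (the exact `incl_info` argument syntax is assumed here), a method can expose only the attributes that matter:

```jac
obj Account {
    has owner: str;
    has balance: float;
    has history: list[str];

    # Only the balance is shared with the model (argument form assumed).
    def spending_advice() -> str by llm(incl_info=(self.balance));
}
```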
### When to Use Each Approach

- **Docstrings**: Use for function-level context and behavior description
- **Semstrings**: Use for attribute-level descriptions and domain-specific terminology
- **`incl_info`**: Use to selectively include relevant object state in method calls
The `sem` keyword can also be used in separate implementation files, improving code organization and maintainability.
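A sketch of this split (the file names and schema are assumed for illustration):

```jac
# main.jac -- the interface stays clean
obj Invoice {
    has vendor: str;
    has total: float;
}

def categorize(inv: Invoice) -> str by llm();
```

```jac
# main.impl.jac (file name assumed) -- semantic details live separately
sem Invoice.vendor = "Legal name of the company issuing the invoice";
sem Invoice.total = "Invoice total in USD, including tax";
```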
## Tool-Calling Agents with ReAct

The ReAct (Reasoning and Acting) method enables agentic behavior by allowing functions to reason about problems and use external tools. Functions can be made agentic by adding the `by llm(tools=[...])` declaration.
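A minimal sketch of a tool-calling function (the tool is a stub and all names are illustrative):

```jac
import from byllm { Model }

glob llm = Model(model_name="gpt-4o");

# A plain Jac function exposed to the LLM as a tool (stubbed here).
def get_weather(city: str) -> str {
    return f"It is sunny in {city}.";
}

# The model reasons about the request and may call get_weather
# before composing its final answer.
def plan_day(request: str) -> str by llm(method="ReAct", tools=[get_weather]);
```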
A comprehensive tutorial on building an agentic application is available in the byLLM documentation.
## Streaming Outputs
The streaming feature enables real-time token reception from LLM functions, useful for generating content where results should be displayed as they are produced.
Set `stream=True` in the invoke parameters to enable streaming:
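A sketch of consuming a streamed result (the iteration pattern and function name are assumed):

```jac
def write_story(topic: str) -> str by llm(stream=True);

with entry {
    # With stream=True the call yields tokens as they arrive,
    # so output can be displayed incrementally.
    for token in write_story("a lighthouse keeper who befriends a whale") {
        print(token, end="");
    }
}
```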
> **Note:** The `stream=True` parameter currently supports only the `str` output type. Tool calling is not yet supported in streaming mode but is planned for a future release.