# Examples Gallery
Check out these runnable examples to learn how to use Maticlib for different use cases. You can find all these scripts in the examples/ directory of the repository.
## Basic Chat & Multi-Provider
Demonstrates how to initialize and use LLM clients from different providers.
- Basic Sync Chat: A simple script to get text responses from OpenAI, Mistral, and Gemini.
- Focused OpenAI Example: Detailed usage of OpenAI-specific features like reasoning tokens.
- Async Concurrent Chat: Sending requests to multiple models at the same time using `asyncio.gather`.
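The concurrency pattern behind the async example can be sketched in plain Python. The `ask_model` coroutine below is a stand-in for Maticlib's async clients — the real client names and method signatures are assumptions, not Maticlib's actual API — but the `asyncio.gather` fan-out is exactly what the example demonstrates:

```python
import asyncio

# Stand-in for a Maticlib async client call; the real API will differ.
async def ask_model(model: str, prompt: str) -> str:
    await asyncio.sleep(0.1)  # simulate network latency of an LLM request
    return f"{model}: response to {prompt!r}"

async def main() -> list[str]:
    prompt = "Summarize asyncio in one sentence."
    # Fire all three requests concurrently; gather preserves input order.
    return await asyncio.gather(
        ask_model("openai", prompt),
        ask_model("mistral", prompt),
        ask_model("gemini", prompt),
    )

results = asyncio.run(main())
for line in results:
    print(line)
```

Because the requests run concurrently, total wall time is roughly one request's latency rather than the sum of all three.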
## Graph Workflows
Learn how to build intelligent agents using the MaticGraph engine.
- Sequential Graph: A basic A → B → C pipeline.
- Conditional Routing: Branching logic based on model outputs (e.g., sentiment analysis).
- Parallel Node Execution: Running multiple independent analysis nodes in parallel for maximum performance.
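The first two graph ideas — a sequential pipeline and a conditional edge — can be sketched in plain Python. This is a conceptual illustration only; MaticGraph's real node/edge registration API and state model are not shown here and will differ:

```python
from typing import Callable

# A graph "state" is just a dict in this sketch.
State = dict

def node_a(state: State) -> State:
    # Normalize the input text.
    state["text"] = state["input"].strip().lower()
    return state

def node_b(state: State) -> State:
    # Toy sentiment check standing in for a model call.
    state["sentiment"] = "positive" if "great" in state["text"] else "negative"
    return state

def praise(state: State) -> State:
    state["reply"] = "Glad you liked it!"
    return state

def apologize(state: State) -> State:
    state["reply"] = "Sorry to hear that."
    return state

def route(state: State) -> Callable[[State], State]:
    # Conditional edge: choose the next node from the current state.
    return praise if state["sentiment"] == "positive" else apologize

def run(state: State) -> State:
    for step in (node_a, node_b):   # sequential A -> B
        state = step(state)
    return route(state)(state)      # conditional branch to C or C'

result = run({"input": "This library is GREAT "})
print(result["reply"])
```

Parallel node execution is the same idea with independent nodes dispatched via `asyncio.gather` or a thread pool instead of the sequential loop.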
## Structured Output & Parsing
- Pydantic Response Parsing: Force models to generate validated JSON that maps directly to your Pydantic schemas.
- Typed Graph State: Using Pydantic models to enforce strict state schemas throughout your graph workflows.
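The parsing idea can be shown with the standard library alone. The actual examples use Pydantic models; this dataclass sketch (schema name and fields are invented for illustration) demonstrates the same contract — model output is parsed as JSON and rejected unless it matches the schema:

```python
import json
from dataclasses import dataclass

# Hypothetical schema; the real examples define Pydantic models instead.
@dataclass
class Review:
    product: str
    rating: int

def parse_review(raw: str) -> Review:
    data = json.loads(raw)    # raises if the model emitted invalid JSON
    review = Review(**data)   # raises TypeError on missing/extra fields
    if not 1 <= review.rating <= 5:
        raise ValueError(f"rating out of range: {review.rating}")
    return review

# A model response constrained to the schema (hypothetical output).
raw_output = '{"product": "headphones", "rating": 4}'
review = parse_review(raw_output)
print(review)
```

Pydantic adds type coercion and richer error reporting on top of this, but the validate-or-raise boundary is the core idea.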
## Native Tool Calling
- Tool Definition & Usage: Use the `@tool` decorator to let LLMs call your Python functions directly.
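A decorator like this typically works by registering the function plus a machine-readable description that the LLM can see. The sketch below is a toy re-implementation of the pattern, not Maticlib's actual `@tool` decorator or schema format:

```python
import inspect

# Registry of callable tools exposed to the model (illustrative only).
TOOLS: dict[str, dict] = {}

def tool(fn):
    """Register fn with a schema derived from its signature and docstring."""
    TOOLS[fn.__name__] = {
        "fn": fn,
        "description": (fn.__doc__ or "").strip(),
        "params": list(inspect.signature(fn).parameters),
    }
    return fn

@tool
def get_weather(city: str) -> str:
    """Return a short weather report for a city."""
    return f"It is sunny in {city}."

# When the model emits a tool call, dispatch it by name:
call = {"name": "get_weather", "arguments": {"city": "Oslo"}}
result = TOOLS[call["name"]]["fn"](**call["arguments"])
print(result)
```

The registry's descriptions and parameter lists are what get serialized into the provider's tool-calling request; the dispatch step runs the chosen function and feeds the result back to the model.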
## Running the Examples
1. Set up your environment: Make sure you have your API keys in a `.env` file.
2. Execute a script.
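A `.env` file for the providers above would look something like this — the exact variable names Maticlib reads are an assumption here, based on each provider's common convention:

```
OPENAI_API_KEY=sk-...
MISTRAL_API_KEY=...
GEMINI_API_KEY=...
```

Then run any script from the `examples/` directory with your Python interpreter, e.g. `python examples/<script_name>.py` (substitute the actual file name from the repository).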