
Sample Usage: Smolagents

[TOC]

Overview

This project demonstrates the use of smolagents from Hugging Face to create AI agents. The samples were created on Windows and write files to the C:\tmp directory.

Requirements

Running the Ollama Server

To start the Ollama server, execute the following command:

ollama serve

The server listens on port 11434 by default.
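Before running the samples, it can help to confirm the server is actually reachable. The following is a minimal stdlib-only sketch that queries Ollama's `/api/tags` endpoint (the endpoint that lists locally available models), assuming the default port 11434 mentioned above:

```python
# Sketch: check that the local Ollama server is up by listing its models.
# Assumes the default base URL (port 11434) from the section above.
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434"

def parse_model_names(tags_payload: dict) -> list[str]:
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in tags_payload.get("models", [])]

def list_ollama_models(base_url: str = OLLAMA_BASE) -> list[str]:
    """Fetch and parse the models the local Ollama server knows about."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```

If the call raises a connection error, the server is not running.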

Running the Qwen2.5 Model with Ollama

While the Ollama server is running, download the model using the following command:

ollama run qwen2.5

This only needs to be done once.

Note: qwen2.5 is a relatively small model and may get confused easily. For a more robust agent, use a model with more parameters.

The .env File

Create a .env file with the following content:

HF_TOKEN=<HUGGING_FACE_TOKEN>

Replace <HUGGING_FACE_TOKEN> with your actual Hugging Face token. This is required because all the samples use the Hugging Face Inference API with the following model by default:

Qwen/QwQ-32B

Switch to the Ollama model if needed by changing the active_model to local_model in the model.ModelMgr module.
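A hypothetical sketch of that switch is shown below; the actual `model.ModelMgr` module in the repository may be structured differently. The model identifiers come from this README, but the dict layout is assumed:

```python
# Hypothetical ModelMgr-style switch between the default Inference API
# model and the local Ollama model. The dict shape is an assumption.
inference_model = {"provider": "hf_inference", "model_id": "Qwen/QwQ-32B",
                   "token_env": "HF_TOKEN"}
local_model = {"provider": "ollama", "model_id": "qwen2.5",
               "api_base": "http://localhost:11434"}

# Point active_model at local_model to run against Ollama instead of the
# Hugging Face Inference API.
active_model = inference_model

def get_active_model() -> dict:
    """Return the model configuration currently in effect."""
    return active_model
```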

Installation

  1. Clone the repository:

    git clone https://github.com/rcw3bb/sample-smolagents.git
    cd sample-smolagents
  2. Install the dependencies:

    If Poetry is not yet installed, use the following command to install it:

    python -m pip install poetry

    After installation, make poetry available on the command line. On Windows, add the following directory to the PATH environment variable:

    %LOCALAPPDATA%\Programs\Python\Python313\Scripts

    If your system Python is older than 3.13, have Poetry install it first:

    poetry python install 3.13

    Then install the dependencies:

    poetry install

Non-MCP Samples

Simple File Management with AI Agent

poetry run python -m sample.simple.file_management_sample

Observe whether it uses the provided tools.

Simple File Management with AI Manager Agent

poetry run python -m sample.simple.file_management_managed_sample

Observe whether it uses the provided tools.

MCP Stdio Server Samples

Simple File Management with AI Agent

poetry run python -m sample.mcp.stdio.file_management_sample

Observe whether it uses the provided tools.

Simple File Management with AI Manager Agent

poetry run python -m sample.mcp.stdio.file_management_managed_sample

Observe whether it uses the provided tools.

Using Just the MCP Server

poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_stdio

Where <ROOT_DIR> is the directory that contains the mcp_servers directory.

Use the following prompt to test the server:

Write the text "Hello, World!" to "C:/tmp/mcp-stdio-hello.txt" and show me its content.

Expect to see that the write_file and read_file tools were utilized.
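As a plain-Python illustration, tools like the `write_file` and `read_file` named above might behave as follows; the actual implementations in `mcp_servers` may differ:

```python
# Sketch of write_file / read_file tools (names from the expected output
# above; implementations assumed, not taken from the repository).
from pathlib import Path

def write_file(path: str, content: str) -> str:
    """Write `content` to `path`, creating parent directories as needed."""
    p = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(content, encoding="utf-8")
    return f"Wrote {len(content)} characters to {path}"

def read_file(path: str) -> str:
    """Return the text content of `path`."""
    return Path(path).read_text(encoding="utf-8")
```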

MCP SSE Server Samples

Starting the MCP SSE Server

The server must be running before running any sample from this section.

poetry run python -m mcp_servers.file_manager_server_sse

Simple File Management with AI Agent

poetry run python -m sample.mcp.sse.file_management_sample

Observe whether it uses the provided tools.

Simple File Management with AI Manager Agent

poetry run python -m sample.mcp.sse.file_management_managed_sample

Observe whether it uses the provided tools.

Using Just the MCP Server

  1. Run the server using the following command:

    poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_sse

    Where <ROOT_DIR> is the directory that contains the mcp_servers directory. This will run a server on port 8000.

  2. Use the following address to attach to an agent:

    http://localhost:8000/sse

Use the following prompt to test the server:

Write the text "Hello, World!" to "C:/tmp/mcp-sse-hello.txt" and show me its content.

Expect to see that the write_file and read_file tools were utilized.
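When attaching the SSE endpoint to an agent, the connection detail that matters is the URL from step 2 above. The sketch below builds it; the exact parameter shape an MCP client expects varies, so this dict layout is an assumption:

```python
# Sketch: build the SSE endpoint parameters for the file-manager MCP server
# started above (default port 8000). The dict shape is an assumption.
def sse_server_parameters(host: str = "localhost", port: int = 8000) -> dict:
    """Return connection parameters for the MCP SSE endpoint."""
    return {"url": f"http://{host}:{port}/sse"}
```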

Author

Ronaldo Webb
