🤖 LLMIntegrationBundle


LLMIntegrationBundle is a powerful Symfony bundle that seamlessly integrates Large Language Models (LLMs) into your Symfony applications. With support for multiple AI providers and a flexible architecture, it's designed for easy extension and customization.

✨ Features

  • 🌐 Support for multiple AI providers
  • ⚙️ Flexible configuration
  • 🛡️ Exception handling with custom exceptions
  • 🖥️ CLI integration for generating new AI service classes
  • 🧩 Extensible architecture
  • 🧪 Comprehensive unit testing

📦 Installation

Install the bundle using Composer:

```bash
composer require saqqal/llm-integration-bundle
```

🛠️ Configuration

1. Register the bundle in `config/bundles.php`:

```php
<?php

return [
    // ...
    Saqqal\LlmIntegrationBundle\LlmIntegrationBundle::class => ['all' => true],
];
```

2. Create `config/packages/llm_integration.yaml`:

```yaml
llm_integration:
    llm_provider: 'api_together'
    llm_api_key: '%env(LLM_PROVIDER_API_KEY)%'
    llm_model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'
```

3. Set the API key in your `.env` file:

```dotenv
LLM_PROVIDER_API_KEY=your_api_key_here
```

🚀 Usage

Injecting the AI Service

Inject AiServiceInterface into your services or controllers:

```php
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;

class YourService
{
    private AiServiceInterface $aiService;

    public function __construct(AiServiceInterface $aiService)
    {
        $this->aiService = $aiService;
    }

    // ...
}
```

Generating Responses

Use the generate method to send prompts and receive responses:

```php
public function generateResponse(string $prompt): string
{
    $response = $this->aiService->generate($prompt);
    return $response->getData()['content'];
}
```

Changing Output Type

You can change the output type to DynamicAiResponse for more flexible access to API responses:

```php
public function generateDynamicResponse(string $prompt): mixed
{
    $response = $this->aiService->generate($prompt, [], true);
    return $response->choices[0]->message->content;
}
```
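Putting the pieces together, a minimal controller sketch might look like the following. The controller class, route, and payload handling are illustrative and not part of the bundle; `Request::getPayload()` assumes Symfony 6.3+.

```php
<?php

namespace App\Controller;

use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Routing\Attribute\Route;

class ChatController
{
    public function __construct(private AiServiceInterface $aiService)
    {
    }

    #[Route('/chat', methods: ['POST'])]
    public function chat(Request $request): JsonResponse
    {
        // Read the prompt from the JSON request body and forward it to the LLM
        $prompt = $request->getPayload()->getString('prompt');
        $response = $this->aiService->generate($prompt);

        return new JsonResponse(['reply' => $response->getData()['content']]);
    }
}
```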

🤝 Available AI Clients

LLMIntegrationBundle supports the following AI clients:

  1. API Together (ApiTogetherClient)
  2. OpenAI (OpenAiClient)
  3. Anthropic (AnthropicClient)
  4. Arliai (ArliaiClient)
  5. Deepinfra (DeepinfraClient)
  6. Groq (GroqClient)
  7. HuggingFace (HuggingFaceClient)
  8. Mistral (MistralClient)
  9. OpenRouter (OpenRouterClient)
  10. Tavily (TavilyClient)

To use a specific client, set the llm_provider in your configuration to the corresponding provider name.
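For example, switching the whole application to another provider is a one-line configuration change. The provider key and model name below are assumptions for illustration; check the bundle's documentation (or `llm:list-ai-services`) for the exact key each client registers under.

```yaml
llm_integration:
    llm_provider: 'openai'                  # assumed provider key for OpenAiClient
    llm_api_key: '%env(OPENAI_API_KEY)%'
    llm_model: 'gpt-4o-mini'                # illustrative model name
```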

💻 CLI Commands

Generate a new AI service class

```bash
php bin/console llm:create-ai-service
```

Follow the prompts to enter the provider name and API endpoint.

List available AI clients

```bash
php bin/console llm:list-ai-services
```

This command lists all available AI clients tagged with the `#[AiClient]` attribute.

🔧 Extending the Bundle

To add a new AI provider:

1. Create a new client class extending `AbstractAiClient`:

```php
use Saqqal\LlmIntegrationBundle\Attribute\AiClient;
use Saqqal\LlmIntegrationBundle\Client\AbstractAiClient;

#[AiClient('your_provider')]
class YourProviderClient extends AbstractAiClient
{
    protected function getApiUrl(): string
    {
        return 'https://api.yourprovider.com/v1/chat/completions';
    }

    protected function getAdditionalRequestData(string $prompt, ?string $model): array
    {
        return [
            // Add provider-specific options here
        ];
    }
}
```

2. Update your configuration to use the new provider:

```yaml
llm_integration:
    llm_provider: 'your_provider'
    llm_api_key: '%env(YOUR_PROVIDER_API_KEY)%'
    llm_model: 'your-default-model'
```
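As a sketch of what `getAdditionalRequestData` might return, a provider that accepts OpenAI-style request parameters could be configured like this (the parameter names are assumptions about your provider's API, not bundle requirements):

```php
protected function getAdditionalRequestData(string $prompt, ?string $model): array
{
    return [
        // Sampling parameters merged into the request body;
        // adjust names and values to match your provider's API.
        'temperature' => 0.7,
        'max_tokens'  => 1024,
    ];
}
```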

🚦 Exception Handling

Create an event subscriber to handle LlmIntegrationExceptionEvent:

```php
use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class LlmIntegrationExceptionSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [
            LlmIntegrationExceptionEvent::class => 'onLlmIntegrationException',
        ];
    }

    public function onLlmIntegrationException(LlmIntegrationExceptionEvent $event): void
    {
        $exception = $event->getException();
        // Handle the exception
    }
}
```
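With Symfony's default autoconfiguration, implementing `EventSubscriberInterface` is enough to register the subscriber. If autoconfiguration is disabled in your project, tag the service explicitly (the class namespace below is an assumption about where you placed the subscriber):

```yaml
# config/services.yaml
App\EventSubscriber\LlmIntegrationExceptionSubscriber:
    tags: ['kernel.event_subscriber']
```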

🧪 Testing

Run the test suite:

```bash
./vendor/bin/phpunit
```

📄 License

This bundle is released under the MIT License. See the LICENSE file for details.

👨‍💻 Author

Abdelaziz Saqqal - LinkedIn - Portfolio

🤝 Contributing

Contributions are welcome! Please fork the repository and submit a pull request with your changes.

📚 Documentation

For more detailed documentation, please visit our Wiki.