# LLMIntegrationBundle

LLMIntegrationBundle is a Symfony bundle that integrates Large Language Models (LLMs) into your Symfony applications. It supports multiple AI providers out of the box and is built on a flexible architecture designed for easy extension and customization.

## Table of Contents
- [Features](#features)
- [Installation](#installation)
- [Configuration](#configuration)
- [Usage](#usage)
- [Available AI Clients](#available-ai-clients)
- [CLI Commands](#cli-commands)
- [Extending the Bundle](#extending-the-bundle)
- [Exception Handling](#exception-handling)
- [Testing](#testing)
- [License](#license)
- [Author](#author)
- [Contributing](#contributing)
- [Documentation](#documentation)
- [Acknowledgements](#acknowledgements)
## Features

- 🌐 Support for multiple AI providers
- ⚙️ Flexible configuration
- 🛡️ Exception handling with custom exceptions
- 🖥️ CLI integration for generating new AI service classes
- 🧩 Extensible architecture
- 🧪 Comprehensive unit testing
## Installation

Install the bundle using Composer:

```bash
composer require saqqal/llm-integration-bundle
```
- Register the bundle in `config/bundles.php`:

```php
<?php

return [
    // ...
    Saqqal\LlmIntegrationBundle\LlmIntegrationBundle::class => ['all' => true],
];
```
## Configuration

- Create `config/packages/llm_integration.yaml`:

```yaml
llm_integration:
    llm_provider: 'api_together'
    llm_api_key: '%env(LLM_PROVIDER_API_KEY)%'
    llm_model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'
```
- Set the API key in your `.env` file:

```dotenv
LLM_PROVIDER_API_KEY=your_api_key_here
```
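To verify that the bundle picked up your settings, you can dump the resolved configuration. This assumes the bundle's extension alias matches the `llm_integration` config root, which is the Symfony convention:

```bash
php bin/console debug:config llm_integration
```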
## Usage

Inject `AiServiceInterface` into your services or controllers:

```php
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;

class YourService
{
    private AiServiceInterface $aiService;

    public function __construct(AiServiceInterface $aiService)
    {
        $this->aiService = $aiService;
    }

    // ...
}
```
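On PHP 8.1+, the same injection can be written more compactly with constructor property promotion; a minimal, equivalent sketch:

```php
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;

class YourService
{
    // Promoted constructor property: declares and assigns in one step.
    public function __construct(private readonly AiServiceInterface $aiService)
    {
    }
}
```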
Use the `generate` method to send prompts and receive responses:

```php
public function generateResponse(string $prompt): string
{
    $response = $this->aiService->generate($prompt);

    return $response->getData()['content'];
}
```
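Since provider calls can fail at runtime, you may want a defensive wrapper around `generate`. The sketch below is illustrative only: the method name, the null-on-failure policy, and the broad `\Throwable` catch are choices of this example, not part of the bundle's API:

```php
public function generateSafely(string $prompt): ?string
{
    try {
        $response = $this->aiService->generate($prompt);

        // Guard against a missing 'content' key in the response payload.
        return $response->getData()['content'] ?? null;
    } catch (\Throwable $e) {
        // The bundle also dispatches LlmIntegrationExceptionEvent for its own
        // exceptions (see "Exception Handling" below); here we simply fail soft.
        return null;
    }
}
```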
You can change the output type to `DynamicAiResponse` for more flexible access to API responses:

```php
public function generateDynamicResponse(string $prompt): mixed
{
    $response = $this->aiService->generate($prompt, [], true);

    return $response->choices[0]->message->content;
}
```
## Available AI Clients

LLMIntegrationBundle supports the following AI clients:

- API Together (`ApiTogetherClient`)
- OpenAI (`OpenAiClient`)
- Anthropic (`AnthropicClient`)
- Arliai (`ArliaiClient`)
- Deepinfra (`DeepinfraClient`)
- Groq (`GroqClient`)
- HuggingFace (`HuggingFaceClient`)
- Mistral (`MistralClient`)
- OpenRouter (`OpenRouterClient`)
- Tavily (`TavilyClient`)

To use a specific client, set `llm_provider` in your configuration to the corresponding provider name.
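For example, switching to the Mistral client would only touch the configuration. The provider string (`'mistral'` below) and model ID are assumptions for illustration; the exact string is whatever the client declares in its `#[AiClient]` attribute:

```yaml
llm_integration:
    llm_provider: 'mistral'           # must match the client's #[AiClient] name
    llm_api_key: '%env(MISTRAL_API_KEY)%'
    llm_model: 'mistral-small-latest' # model IDs are provider-specific
```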
## CLI Commands

Generate a new AI service class:

```bash
php bin/console llm:create-ai-service
```

Follow the prompts to enter the provider name and API endpoint.

List the available AI services:

```bash
php bin/console llm:list-ai-services
```

This command lists all available AI clients that are tagged with the `#[AiClient]` attribute.
## Extending the Bundle

To add a new AI provider:

- Create a new client class extending `AbstractAiClient`:

```php
use Saqqal\LlmIntegrationBundle\Attribute\AiClient;
use Saqqal\LlmIntegrationBundle\Client\AbstractAiClient;

#[AiClient('your_provider')]
class YourProviderClient extends AbstractAiClient
{
    protected function getApiUrl(): string
    {
        return 'https://api.yourprovider.com/v1/chat/completions';
    }

    protected function getAdditionalRequestData(string $prompt, ?string $model): array
    {
        return [
            // Add provider-specific options here
        ];
    }
}
```
- Update your configuration to use the new provider:

```yaml
llm_integration:
    llm_provider: 'your_provider'
    llm_api_key: '%env(YOUR_PROVIDER_API_KEY)%'
    llm_model: 'your-default-model'
```
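As an illustration, `getAdditionalRequestData()` is the hook for provider-specific request parameters. The keys below (`temperature`, `max_tokens`) are common chat-completion options but are assumptions here; consult your provider's API reference for the actual field names:

```php
protected function getAdditionalRequestData(string $prompt, ?string $model): array
{
    return [
        // Hypothetical provider-specific options; names and values depend
        // on the target API, not on the bundle itself.
        'temperature' => 0.7,
        'max_tokens'  => 1024,
    ];
}
```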
## Exception Handling

Create an event subscriber to handle `LlmIntegrationExceptionEvent`:

```php
use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class LlmIntegrationExceptionSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [
            LlmIntegrationExceptionEvent::class => 'onLlmIntegrationException',
        ];
    }

    public function onLlmIntegrationException(LlmIntegrationExceptionEvent $event): void
    {
        $exception = $event->getException();
        // Handle the exception
    }
}
```
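With Symfony's default autoconfiguration, any class implementing `EventSubscriberInterface` is registered automatically. As a concrete but purely illustrative handler, the subscriber could log failures through an injected PSR-3 logger (the class name and logging policy below are this example's choices, not the bundle's):

```php
use Psr\Log\LoggerInterface;
use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class LoggingLlmExceptionSubscriber implements EventSubscriberInterface
{
    public function __construct(private readonly LoggerInterface $logger)
    {
    }

    public static function getSubscribedEvents(): array
    {
        return [
            LlmIntegrationExceptionEvent::class => 'onLlmIntegrationException',
        ];
    }

    public function onLlmIntegrationException(LlmIntegrationExceptionEvent $event): void
    {
        // Record the provider failure; how you react (retry, alert, fallback)
        // is application policy, not something the bundle prescribes.
        $this->logger->error('LLM request failed: '.$event->getException()->getMessage());
    }
}
```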
## Testing

Run the test suite:

```bash
./vendor/bin/phpunit
```
## License

This bundle is released under the MIT License. See the LICENSE file for details.
## Author

Abdelaziz Saqqal - LinkedIn - Portfolio
## Contributing

Contributions are welcome! Please fork the repository and submit a pull request with your changes.
## Documentation

For more detailed documentation, please visit our Wiki.