Various docs improvements #809

Merged · 8 commits · Jan 30, 2025
Changes from 2 commits
18 changes: 9 additions & 9 deletions docs/index.md
@@ -13,32 +13,32 @@ We built PydanticAI with one simple aim: to bring that FastAPI feeling to GenAI

## Why use PydanticAI

:material-account-group:{ .md .middle .team-blue }&nbsp;<strong class="vertical-middle">Built by the Pydantic Team</strong><br>
* __Built by the Pydantic Team__:
Built by the team behind [Pydantic](https://docs.pydantic.dev/latest/) (the validation layer of the OpenAI SDK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more).

:fontawesome-solid-shapes:{ .md .middle .shapes-orange }&nbsp;<strong class="vertical-middle">Model-agnostic</strong><br>
* __Model-agnostic__:
Supports OpenAI, Anthropic, Gemini, Ollama, Groq, and Mistral, and provides a simple interface to implement support for [other models](models.md).

:logfire-logo:{ .md .middle }&nbsp;<strong class="vertical-middle">Pydantic Logfire Integration</strong><br>
* __Pydantic Logfire Integration__:
Seamlessly [integrates](logfire.md) with [Pydantic Logfire](https://pydantic.dev/logfire) for real-time debugging, performance monitoring, and behavior tracking of your LLM-powered applications.

:material-shield-check:{ .md .middle .secure-green }&nbsp;<strong class="vertical-middle">Type-safe</strong><br>
* __Type-safe__:
Designed to make [type checking](agents.md#static-type-checking) as powerful and informative as possible for you.

:snake:{ .md .middle }&nbsp;<strong class="vertical-middle">Python-centric Design</strong><br>
* __Python-centric Design__:
Leverages Python's familiar control flow and agent composition to build your AI-driven projects, making it easy to apply standard Python best practices you'd use in any other (non-AI) project.

:simple-pydantic:{ .md .middle .pydantic-pink }&nbsp;<strong class="vertical-middle">Structured Responses</strong><br>
* __Structured Responses__:
Harnesses the power of [Pydantic](https://docs.pydantic.dev/latest/) to [validate and structure](results.md#structured-result-validation) model outputs, ensuring responses are consistent across runs.

:material-puzzle-plus:{ .md .middle .puzzle-purple }&nbsp;<strong class="vertical-middle">Dependency Injection System</strong><br>
* __Dependency Injection System__:
Offers an optional [dependency injection](dependencies.md) system to provide data and services to your agent's [system prompts](agents.md#system-prompts), [tools](tools.md) and [result validators](results.md#result-validators-functions).
This is useful for testing and eval-driven iterative development.

:material-sine-wave:{ .md .middle }&nbsp;<strong class="vertical-middle">Streamed Responses</strong><br>
* __Streamed Responses__:
Provides the ability to [stream](results.md#streamed-results) LLM outputs continuously, with immediate validation, ensuring rapid and accurate results.

:material-graph:{ .md .middle .graph-green }&nbsp;<strong class="vertical-middle">Graph Support</strong><br>
* __Graph Support__:
[Pydantic Graph](graph.md) provides a powerful way to define graphs using type hints; this is useful in complex applications where standard control flow can degrade to spaghetti code.

!!! example "In Beta"
22 changes: 21 additions & 1 deletion docs/logfire.md
@@ -59,7 +59,9 @@ import logfire
logfire.configure()
```

The [logfire documentation](https://logfire.pydantic.dev/docs/) has more details on how to use logfire, including how to instrument other libraries like Pydantic, HTTPX and FastAPI.
The [logfire documentation](https://logfire.pydantic.dev/docs/) has more details on how to use logfire,
including how to instrument other libraries like [Pydantic](https://logfire.pydantic.dev/docs/integrations/pydantic/),
[HTTPX](https://logfire.pydantic.dev/docs/integrations/http-clients/httpx/) and [FastAPI](https://logfire.pydantic.dev/docs/integrations/web-frameworks/fastapi/).
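
For example, a minimal sketch of instrumenting these libraries alongside PydanticAI (assuming the `logfire.instrument_*` helpers described in those integration guides) might look like this:

```py
import logfire

logfire.configure()
logfire.instrument_pydantic()  # record Pydantic model validations
logfire.instrument_httpx()     # record outgoing HTTPX requests
# for a FastAPI app you would similarly call logfire.instrument_fastapi(app)
```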

Since Logfire is built on [OpenTelemetry](https://opentelemetry.io/), you can use the Logfire Python SDK to send data to any OpenTelemetry collector.

@@ -79,3 +81,21 @@ To demonstrate how Logfire can let you visualise the flow of a PydanticAI run, h
We can also query data with SQL in Logfire to monitor the performance of an application. Here's a real-world example of using Logfire to monitor PydanticAI runs inside Logfire itself:

![Logfire monitoring PydanticAI](img/logfire-monitoring-pydanticai.png)

### Monitoring HTTPX Requests

To monitor HTTPX requests made by models, you can use `logfire`'s [HTTPX](https://logfire.pydantic.dev/docs/integrations/http-clients/httpx/) integration.

Instrumentation is as easy as adding the following three lines to your application:

```py {title="instrument_httpx.py" test="skip" lint="skip"}
...
import logfire
logfire.configure() # (1)!
logfire.instrument_httpx() # (2)!
...
```

In particular, this can help you trace specific requests, responses, and headers, which is especially useful
if you're using a custom `httpx` client in your model.
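
As a sketch of how this fits together (the model name and the `http_client` argument here are illustrative; check the [Models](models.md) page for the options your model class actually accepts):

```py
import httpx
import logfire

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

logfire.configure()
logfire.instrument_httpx()  # HTTPX requests made by the model now appear as spans

# hypothetical: pass a custom httpx client to the model so its requests are traced
custom_client = httpx.AsyncClient(timeout=30)
model = OpenAIModel('gpt-4o', http_client=custom_client)

agent = Agent(model)
result = agent.run_sync('What is the capital of France?')
print(result.data)  # print the validated output
```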
6 changes: 6 additions & 0 deletions docs/troubleshooting.md
@@ -19,3 +19,9 @@ Note: This fix also applies to Google Colab.
### `UserError: API key must be provided or set in the [MODEL]_API_KEY environment variable`

If you're running into issues with setting the API key for your model, visit the [Models](models.md) page to learn more about how to set an environment variable and/or pass in an `api_key` argument.

## Monitoring HTTPX Requests

You can use custom `httpx` clients in your models to access specific requests, responses, and headers at runtime.

`logfire`'s [HTTPX integration](logfire.md#monitoring-httpx-requests) is particularly helpful for monitoring these requests.
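
If you want to inspect traffic directly rather than (or in addition to) using Logfire, a sketch of a custom client using `httpx` event hooks might look like this (how you pass the client to your model depends on the model class; see the [Models](models.md) page):

```py
import httpx

async def log_request(request: httpx.Request) -> None:
    print(f'{request.method} {request.url} headers={dict(request.headers)}')

async def log_response(response: httpx.Response) -> None:
    print(f'{response.status_code} {response.reason_phrase} from {response.request.url}')

# event hooks on an AsyncClient must be async callables
custom_client = httpx.AsyncClient(
    event_hooks={'request': [log_request], 'response': [log_response]},
)
```

Event hooks run for every request and response the client makes, so this gives you the raw headers without any extra dependencies.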