An interactive application showcasing advanced prompting techniques for legal professionals working with Large Language Models (LLMs).
This application provides a comprehensive guide to various legal-focused prompting techniques for LLMs, helping legal professionals get better responses for their work. The app includes:
- Interactive examples with bad and good prompts for legal contexts
- Detailed explanations of why each technique works
- Real-time LLM response testing through a direct API connection
- Multilingual support (English and Portuguese)
- Role-Based Prompting (Persona Priming) - Assigning the AI a specific legal role or expertise
- Context-Rich Prompting - Including detailed legal background and jurisdictional information
- Constraint-Based Prompting - Setting conditional and focused instructions for legal analysis
- Example-Based Prompting (Few-Shot Learning) - Providing examples of desired outputs for consistent formats
- Step-by-Step Prompting (Chain-of-Thought Legal Reasoning) - Breaking down complex legal analysis
- Extracting Key Provisions and Data from Contracts - Targeted extraction techniques
- Master Service Agreement Clause Drafting and Refinement - Specialized contract drafting methods
- Handling Ambiguity and Multiple Interpretations - Analyzing legal uncertainties
- Comparative Law Analysis Across Jurisdictions - Cross-jurisdictional legal analysis methods
- Recency Bias (Le Gran Finale) - Strategic placement of critical instructions at the end of prompts
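Several of the techniques above can be combined in a single prompt. A minimal sketch, assuming a hypothetical role, constraints, and clause text (none of which come from the app itself), showing persona priming, constraint-based instructions, and recency-bias placement:

```python
# Hypothetical example: composing several techniques into one prompt.
# The role, constraints, and clause below are illustrative only.
ROLE = "You are a senior M&A attorney licensed in Delaware."  # persona priming
CONSTRAINTS = [
    "Cite the specific clause number for every conclusion.",
    "Flag any term that conflicts with Delaware law.",
]
CLAUSE = "Section 4.2: Either party may terminate upon 30 days' written notice."

def build_prompt(role: str, constraints: list[str], clause: str) -> str:
    """Assemble a prompt: persona first, constraints and context in the
    middle, and the critical instruction last (recency bias)."""
    numbered = "\n".join(f"{i}. {c}" for i, c in enumerate(constraints, 1))
    return (
        f"{role}\n\n"
        f"Constraints:\n{numbered}\n\n"
        f"Clause under review:\n{clause}\n\n"
        # Recency bias: the most important instruction goes at the very end.
        "IMPORTANT: Answer only about the clause above; do not speculate."
    )

prompt = build_prompt(ROLE, CONSTRAINTS, CLAUSE)
print(prompt.splitlines()[-1])
```

The same builder pattern extends naturally to few-shot prompting: prepend example input/output pairs before the clause under review.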
You can run the application in several ways:
# Using make (recommended)
# installs uv and builds the virtual environment from pyproject.toml
make run
# Or using uv directly (assuming it is installed)
uv venv
uv sync --all-packages
uv run marimo run app.py
This repository includes a DevContainer configuration for VS Code and GitHub Codespaces:
- Open the repository in VS Code with the DevContainer extension or GitHub Codespaces
- The environment will be automatically set up
- Run the application with:
make run
This application uses Marimo's official Docker container, making deployment simple and reliable:
# Build and start the container
docker-compose up -d
# View logs
docker-compose logs -f
# Build the Docker image
docker build -t legal-prompting-app .
# Run the container
docker run -p 8080:8080 legal-prompting-app
After pushing to GitHub, the application is automatically built and published to GitHub Container Registry:
# Pull the latest image
docker pull ghcr.io/arthrod/prompting:main
# Run the container
docker run -p 8080:8080 ghcr.io/arthrod/prompting:main
Access the application at https://prompting.synthetic.lawyer after deployment.
Note: The port exposure (`-p 8080:8080`, or the `ports` section in docker-compose.yml) is required for browser access to the application. The container runs on port 8080 internally, and this mapping makes it accessible at the same port on your host machine.
For development with live code changes, use the volume mount in docker-compose.yml:
# Start the container with mounted volumes
docker-compose up -d
# Make changes to your local files and they'll be reflected in the running app
- `app.py`: Main application file with all cells defining the interactive UI
- `layouts/app.slides.json`: Layout definition for the UI
- `custom.css`: Custom styling for the application
- `Dockerfile`: Uses Marimo's official container as a base
- `docker-compose.yml`: Multi-container Docker configuration with development support
This project requires:
- Python 3.12+
- Marimo 0.11.17+
- HTTPX for API communication
- Various libraries for data visualization and manipulation
All dependencies are managed through `pyproject.toml` and installed with `uv sync --all-packages`.
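HTTPX handles the direct API connection used for real-time response testing. A hedged sketch of how such a call is typically made: the endpoint URL, model name, and OpenAI-style payload shape below are assumptions for illustration, not the app's actual code.

```python
# Hypothetical endpoint and payload shape (OpenAI-style chat API);
# the real app may target a different provider or schema.
API_URL = "https://api.example.com/v1/chat/completions"

def make_payload(prompt: str, model: str = "example-model") -> dict:
    """Build a chat-completion request body for a single user prompt."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,  # low temperature for more deterministic analysis
    }

def send_prompt(prompt: str, api_key: str) -> str:
    """Send the prompt and return the model's reply text."""
    import httpx  # deferred so make_payload works without httpx installed

    response = httpx.post(
        API_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json=make_payload(prompt),
        timeout=30.0,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

payload = make_payload("Summarize the indemnification clause.")
print(payload["messages"][0]["content"])
```

The synchronous `httpx.post` call is the simplest form; an async client (`httpx.AsyncClient`) is the usual choice inside a reactive Marimo cell.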
- Complete Portuguese translations for all techniques
- Add advanced prompting templates specific to different legal practice areas
- Integrate with more LLM providers
- Add export functionality for generated prompts
- Create a companion CLI tool for quick prompt generation
A special thank you to Thomas Schmelzer for his invaluable contributions and for helping me fix some of my "noob" mistakes. His expertise and guidance have been instrumental in improving this project.
- Arthur Souza Rodrigues ([email protected])
MIT License