A modern chat application built with Vue.js, FastAPI, and .NET Aspire, leveraging LangChain for advanced conversational AI capabilities. The application supports both local development with Ollama and production deployment with Azure OpenAI.
.NET Aspire provides several key benefits for this application:
- **Unified Orchestration**:
  - Single command to start all services (`dotnet run`)
  - Automatic service discovery and networking
  - Consistent development experience across different platforms
- **Built-in Monitoring**:
  - Real-time health monitoring through the Aspire dashboard
  - Integrated logging and diagnostics
  - Performance metrics and telemetry
- **Multi-Language Support**:
  - Seamlessly manages both Python (FastAPI) and Node.js (Vue) applications
  - Handles cross-service communication
  - Environment variable management across services
- **Production Readiness**:
  - Container orchestration for deployment
  - Load balancing and scaling capabilities
  - Infrastructure as code for cloud deployments
- **Developer Experience**:
  - Interactive dashboard at http://localhost:15888
  - Live reload and hot module replacement
  - Simplified debugging across services
📝 .NET Aspire requires the .NET SDK, much as Jupyter Notebooks require Python. Once installed, Aspire provides a centralized dashboard for monitoring and managing all the moving parts of your distributed application, which makes debugging and troubleshooting across services much easier.
This application uses LangChain to provide:
- **Conversation Memory**: Maintains context across messages using `ConversationBufferMemory`, enabling more coherent and contextual responses.
- **Flexible Model Integration**:
  - Local development: Uses Ollama with the Mistral model
  - Production: Seamlessly switches to Azure OpenAI
- **Advanced Chain Management**: Implements `ConversationChain` for sophisticated conversation handling and state management.
- **Future Extensibility**: Ready for advanced features like:
  - Document processing and QA
  - Semantic search with embeddings
  - Custom agents for complex tasks
  - Multiple memory types for different use cases
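Conceptually, `ConversationBufferMemory` just accumulates the full transcript and replays it as context on every turn. The plain-Python sketch below illustrates that idea only; the class and method names here are illustrative stand-ins, not LangChain's actual API:

```python
class BufferMemory:
    """Illustrative stand-in for LangChain's ConversationBufferMemory:
    it accumulates every exchange and replays it as prompt context."""

    def __init__(self):
        self.history = []  # list of (speaker, message) tuples

    def save_context(self, user_msg, ai_msg):
        self.history.append(("Human", user_msg))
        self.history.append(("AI", ai_msg))

    def load_context(self):
        # The full transcript is prepended to each new prompt,
        # which is how earlier turns stay visible to the model.
        return "\n".join(f"{who}: {msg}" for who, msg in self.history)


memory = BufferMemory()
memory.save_context("Hi, I'm Ada.", "Hello Ada! How can I help?")
memory.save_context("What's my name?", "Your name is Ada.")
print(memory.load_context())
```

Because the buffer grows with every exchange, long conversations eventually hit the model's context limit, which is one reason LangChain also offers other memory types (see "Multiple memory types" above).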
- Node.js 20+
- Python 3.11+
- .NET 9.0 SDK
- Docker
- Ollama (for local development)
- Azure OpenAI Service (for production)
```
fastAspire/                # Root project directory
├── aspire/                # .NET Aspire orchestration
├── backend/               # Python FastAPI backend
│   ├── src/               # FastAPI application code
│   └── requirements.txt
└── frontend/              # Vue.js frontend (current directory)
    ├── src/               # Vue source code
    └── public/            # Static assets
```
- Install the .NET Aspire workload:

  ```bash
  dotnet workload install aspire
  ```

- Install backend dependencies:

  ```bash
  cd backend
  pip install -r requirements.txt
  ```

- Install frontend dependencies:

  ```bash
  npm install
  ```
- Configure environment variables:

  ```bash
  cp backend/.env.example backend/.env
  ```

  Edit the `.env` file with your configuration:
  - For local development: use `ENVIRONMENT=local`
  - For production: set your Azure OpenAI credentials
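For reference, a minimal `.env` for local development might look like the sketch below. `ENVIRONMENT=local` comes from the step above; the commented-out Azure variable names are illustrative placeholders, so check `backend/.env.example` for the names the backend actually reads:

```ini
# Local development (Ollama + Mistral)
ENVIRONMENT=local

# Production (Azure OpenAI); variable names below are illustrative,
# use the ones defined in backend/.env.example
# ENVIRONMENT=production
# AZURE_OPENAI_API_KEY=<your-key>
# AZURE_OPENAI_ENDPOINT=https://<resource>.openai.azure.com/
```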
- Start Ollama and pull the Mistral model:

  ```bash
  ollama pull mistral
  ```

- Start the application with Aspire:

  ```bash
  cd fastAspire/fastAspire.AppHost
  dotnet run
  ```
- Update environment variables for Azure OpenAI
- Build and deploy using Docker containers
- Real-time chat interface with memory-enabled responses
- Intelligent conversation management via LangChain
- Environment-aware model selection (Ollama/Azure OpenAI)
- .NET Aspire orchestration for service management
- Health monitoring and logging
- Service discovery and networking
The application leverages several key LangChain components:
- **Memory Management**
  - Uses `ConversationBufferMemory` for maintaining chat history
  - Enables contextual responses based on previous interactions
- **Model Integration**
  - Local: Ollama with Mistral model for development
  - Production: Azure OpenAI for scalable deployment
- **Conversation Chain**
  - Manages message flow and conversation state
  - Handles context and memory updates automatically
- **Environment Awareness**
  - Automatically switches between local and production models
  - Maintains consistent API regardless of the underlying model
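The environment switch can be as simple as reading one variable and returning the matching model settings; everything downstream then talks to the same chat interface regardless of provider. A minimal sketch, with illustrative function and key names (the real backend's configuration may differ):

```python
import os


def select_model_config():
    """Pick model settings based on the ENVIRONMENT variable.
    Key names here are illustrative, not the backend's actual schema."""
    if os.environ.get("ENVIRONMENT", "local") == "local":
        # Local development: Ollama serving the Mistral model
        return {"provider": "ollama", "model": "mistral"}
    # Anything else is treated as production: Azure OpenAI behind
    # the same chat interface, so callers don't change.
    return {
        "provider": "azure_openai",
        "deployment": os.environ.get("AZURE_OPENAI_DEPLOYMENT", ""),
    }


os.environ["ENVIRONMENT"] = "local"
print(select_model_config())
```

Centralizing the switch in one factory function keeps the rest of the backend provider-agnostic, which is what lets the same code run against Ollama locally and Azure OpenAI in production.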