graph TD
    %% Main interactions between the user and the system
    A[User] -->|Model upload request| B[Nginx]
    B -->|Proxy request| C[FastAPI server]
    C -->|Save file| D[models/ directory]
    C -->|Load model| E[ModelManager]
    E -->|Register model| F[available_models]
    A -->|Model list request| B
    B -->|Proxy response| A
    A -->|Prediction request| B
    B -->|Proxy request| C
    C -->|Fetch model| E
    E -->|Run model prediction| G[Prediction]
    G -->|Return prediction result| C
    C -->|Proxy response| A

    %% Sensor data upload and MongoDB interaction
    H[Sensor] -->|Upload data| I[MongoDB]
    J[Model Training Script] -->|Fetch data| I
    J -->|Preprocess data| K[app/utils.py]
    J -->|Train and save model| D
    D -->|Load model| E

    %% Infrastructure components
    subgraph Infrastructure
        L[EC2 instance] --> B
        L --> H
        L --> J
    end

    %% Style definitions
    classDef api fill:#f9f,stroke:#333,stroke-width:2px;
    class A,B,C,G api;
PLKIT-AI.analysis is a FastAPI-based application designed for managing and deploying time series forecasting models using TSMixer. It provides functionalities to upload, manage, and perform predictions with TSMixer models, specifically tailored for single data point predictions. The application ensures secure cross-origin requests and maintains robust logging for monitoring and debugging purposes.
PLKIT-AI.analysis/
├── app/
│   ├── __init__.py
│   ├── config.py
│   ├── main.py
│   ├── model_manager.py
│   ├── model.py
│   ├── schemas.py
│   ├── train_model.py
│   ├── train.log
│   └── utils.py
├── data/
│   ├── __init__.py
│   ├── get_from_mongodb.py
│   ├── mongo_utils.py
│   └── upload_to_mongodb.py
├── models/
│   └── # TSMixer model files (.pt)
├── server.log
├── requirements.txt
├── run.sh
├── pyproject.toml
├── config.conf
├── .gitignore
├── .env
└── README.md
- app/: Contains the main application code.
  - main.py: FastAPI application with API endpoints.
  - model.py: Defines the TSMixerModel class.
  - model_manager.py: Manages model loading, training, and prediction.
  - schemas.py: Pydantic models for request and response validation.
  - train_model.py: Script for training and saving models.
  - utils.py: Utility functions for data loading and preprocessing.
- models/: Directory where TSMixer model files are stored.
- server.log: Log file for application logging.
- requirements.txt: Lists all Python dependencies.
- README.md: Project documentation.
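The role of model_manager.py can be pictured with a minimal sketch: scan the models/ directory for .pt files and expose them by name. ModelManager, available_models, and models/ are names from this project, but this implementation is an illustrative assumption, not the actual app/model_manager.py:

```python
# Minimal sketch of the model-registry idea behind model_manager.py.
# An assumption for illustration, not the project's real implementation.
from pathlib import Path


class ModelManager:
    def __init__(self, model_dir="models"):
        self.model_dir = Path(model_dir)
        # Maps model name (file stem) to the path of its .pt file.
        self.available_models = {}

    def register_models(self):
        """Register every .pt file found in the model directory."""
        for path in self.model_dir.glob("*.pt"):
            self.available_models[path.stem] = path
        return sorted(self.available_models)
```

The real manager would additionally deserialize each file into a TSMixer model object before serving predictions.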
- Model Uploading: Upload TSMixer models (.pt files) via the /upload-model/ endpoint.
- Model Management: List all available models using the /models/ endpoint.
- Time Series Prediction: Perform predictions using TSMixer models with single data point inputs via the /predict/ endpoint.
- CORS Support: Configurable Cross-Origin Resource Sharing to allow secure requests from different origins.
- Robust Logging: Comprehensive logging for monitoring requests, errors, and model operations.
- Scalable Architecture: Designed to easily add support for additional model types in the future.
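The endpoints above can be exercised with a small client sketch. The paths come from this README, but the payload fields and the model_name query parameter are assumptions about the actual schemas.py, not its real definitions:

```python
# Hypothetical client helpers for the endpoints listed above. Endpoint paths
# are from the README; payload fields and query parameters are assumptions.
import json
from urllib.parse import urlencode

BASE_URL = "http://localhost:8000"


def models_url():
    """URL for listing available models (GET /models/)."""
    return f"{BASE_URL}/models/"


def predict_request(model_name, data_point):
    """Build the URL and JSON body for a single-data-point prediction."""
    query = urlencode({"model_name": model_name})
    return f"{BASE_URL}/predict/?{query}", json.dumps(data_point)
```

An actual request could then be sent with, for example, requests.post(url, data=body, headers={"Content-Type": "application/json"}).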
- Python 3.8+
- MongoDB: For storing and retrieving sensor data.
- Git: To clone the repository.
- Virtual Environment: (Optional but recommended) To manage project dependencies.
- Clone the Repository
git clone https://github.com/PLKIT-AI/analysis.git
cd analysis
- Create a Virtual Environment
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install Dependencies
pip install --upgrade pip
pip install -r requirements.txt
- Configure MongoDB
Ensure MongoDB is running and accessible. Update the MongoDB URI in app/utils.py if necessary.
- Start the FastAPI Server
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
The --reload flag enables auto-reloading during development.
- Access Swagger UI
Navigate to http://localhost:8000/docs to access the interactive API documentation.
The project relies on the following major dependencies:
- FastAPI: Web framework for building APIs.
- Uvicorn: ASGI server for running FastAPI applications.
- Darts: Time series library for forecasting.
- PyMongo: MongoDB driver for Python.
- Pydantic: Data validation and settings management.
- Torch: PyTorch library for tensor computations.
All dependencies are listed in requirements.txt. Install them using:
pip install -r requirements.txt
Cross-Origin Resource Sharing (CORS) is configured in app/main.py using FastAPI's CORSMiddleware. Adjust the origins list to include the domains that should be allowed to access the API.
# app/main.py
from fastapi.middleware.cors import CORSMiddleware
origins = [
"http://localhost",
"http://localhost:3000",
"http://43.201.54.104",
# Add additional allowed origins here
]
app.add_middleware(
CORSMiddleware,
allow_origins=origins, # Allowed origins
allow_credentials=True, # Allow credentials
allow_methods=["*"], # Allowed HTTP methods
allow_headers=["*"], # Allowed HTTP headers
)
Note: For development purposes, you can set allow_origins=["*"] to allow all origins. However, this is not recommended for production due to security risks.
You may need to set environment variables for sensitive configurations such as MongoDB URI or server settings. Consider using a .env file and packages like python-dotenv to manage them securely.
Example .env file:
MONGODB_URI=mongodb://localhost:27017/
Update app/utils.py to load environment variables accordingly.
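A minimal sketch of that idea using only the standard library (the real app might instead call python-dotenv's load_dotenv() at startup); get_mongodb_uri is a hypothetical helper, not a function from app/utils.py:

```python
# Read MONGODB_URI from the environment, falling back to a local default.
# Hypothetical helper for illustration; the real app/utils.py may differ.
import os

DEFAULT_URI = "mongodb://localhost:27017/"


def get_mongodb_uri():
    """Return the configured MongoDB URI, or the local default."""
    return os.getenv("MONGODB_URI", DEFAULT_URI)
```

Keeping the URI out of source code lets the same build run against local and production databases.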
Contributions are welcome! Follow these steps to contribute to the project:
- Fork the Repository
Click the "Fork" button on the repository page to create a copy under your GitHub account.
- Clone Your Fork
git clone https://github.com/your-username/analysis.git
cd analysis
- Create a New Branch
git checkout -b feature/your-feature-name
- Make Your Changes
Implement your feature or bug fix.
- Commit Your Changes
git add .
git commit -m "Add your commit message"
- Push to Your Fork
git push origin feature/your-feature-name
- Create a Pull Request
Go to the original repository and create a pull request from your fork's branch.
Guidelines
- Code Style: Follow PEP 8 guidelines for Python code.
- Commit Messages: Use clear and descriptive commit messages.
- Documentation: Update the README or other documentation if necessary.
- Testing: Ensure that your changes do not break existing functionality. Add tests if applicable.
This project is licensed under the MIT License.
Β© 2024 PLKIT-AI.analysis. All rights reserved.