PLKIT-AI.analysis

Overview

```mermaid
graph TD
    %% Main interactions between the user and the system
    A[User] -->|Model upload request| B[Nginx]
    B -->|Proxy request| C[FastAPI server]
    C -->|Save file| D[models/ directory]
    C -->|Load model| E[ModelManager]
    E -->|Register model| F[available_models]

    A -->|Model list request| B
    B -->|Proxy response| A

    A -->|Prediction request| B
    B -->|Proxy request| C
    C -->|Fetch model| E
    E -->|Run prediction| G[Prediction]
    G -->|Return prediction result| C
    C -->|Proxy response| A

    %% Sensor data upload and interaction with MongoDB
    H[Sensor] -->|Upload data| I[MongoDB]
    J[Model Training Script] -->|Fetch data| I
    J -->|Preprocess data| K[app/utils.py]
    J -->|Train and save model| D
    D -->|Load model| E

    %% Infrastructure components
    subgraph Infrastructure
        L[EC2 instance] --> B
        L --> H
        L --> J
    end

    %% Style definitions
    classDef api fill:#f9f,stroke:#333,stroke-width:2px;
    class A,B,C,G api;
```

PLKIT-AI.analysis is a FastAPI-based application designed for managing and deploying time series forecasting models using TSMixer. It provides functionalities to upload, manage, and perform predictions with TSMixer models, specifically tailored for single data point predictions. The application ensures secure cross-origin requests and maintains robust logging for monitoring and debugging purposes.

Project Structure


```
PLKIT-AI.analysis/
├── app/
│   ├── __init__.py
│   ├── config.py
│   ├── main.py
│   ├── model_manager.py
│   ├── model.py
│   ├── schemas.py
│   ├── train_model.py
│   ├── train.log
│   └── utils.py
├── data/
│   ├── __init__.py
│   ├── get_from_mongodb.py
│   ├── mongo_utils.py
│   └── upload_to_mongodb.py
├── models/
│   └── # TSMixer model files (.pt)
├── server.log
├── requirements.txt
├── run.sh
├── pyproject.toml
├── config.conf
├── .gitignore
├── .env
└── README.md
```


  • app/: Contains the main application code.
    • main.py: FastAPI application with API endpoints.
    • model.py: Defines the TSMixerModel class.
    • model_manager.py: Manages model loading, training, and prediction.
    • schemas.py: Pydantic models for request and response validation.
    • train_model.py: Script for training and saving models.
    • utils.py: Utility functions for data loading and preprocessing.
  • models/: Directory where TSMixer model files are stored.
  • server.log: Log file for application logging.
  • requirements.txt: Lists all Python dependencies.
  • README.md: Project documentation.
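As a rough illustration of the load-and-register flow that model_manager.py is responsible for, a minimal model registry might look like the sketch below. The class shape, method names, and the `available_models` attribute are assumptions based on the diagram above, not the project's actual API.

```python
# Hypothetical sketch of the model registry pattern in model_manager.py.
# Names and signatures are assumptions, not the real implementation.
from pathlib import Path
from typing import Dict, List


class ModelManager:
    """Tracks TSMixer model files (.pt) found in the models/ directory."""

    def __init__(self, model_dir: str = "models") -> None:
        self.model_dir = Path(model_dir)
        self.available_models: Dict[str, Path] = {}

    def scan(self) -> List[str]:
        """Register every .pt file in the model directory by stem name."""
        for path in self.model_dir.glob("*.pt"):
            self.available_models[path.stem] = path
        return sorted(self.available_models)
```

In the real application, loading would additionally deserialize each file with PyTorch before registering it; this sketch only shows the discovery/registration step.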

Features

  • Model Uploading: Upload TSMixer models (.pt files) via the /upload-model/ endpoint.

  • Model Management: List all available models using the /models/ endpoint.

  • Time Series Prediction: Perform predictions using TSMixer models with single data point inputs via the /predict/ endpoint.

  • CORS Support: Configurable Cross-Origin Resource Sharing to allow secure requests from different origins.

  • Robust Logging: Comprehensive logging for monitoring requests, errors, and model operations.

  • Scalable Architecture: Designed to easily add support for additional model types in the future.
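To make the model-upload feature concrete, the sketch below builds a multipart/form-data POST for the /upload-model/ endpoint using only the standard library. The form field name `"file"` and the endpoint's exact request shape are assumptions; check the Swagger UI for the real schema.

```python
# Hypothetical client sketch for uploading a .pt model file to
# /upload-model/. The "file" field name is an assumption.
import urllib.request
import uuid


def build_upload_request(base_url, filename, file_bytes):
    """Build a multipart/form-data POST request for a model file."""
    boundary = uuid.uuid4().hex
    head = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="file"; filename="{filename}"\r\n'
        f"Content-Type: application/octet-stream\r\n\r\n"
    ).encode("utf-8")
    tail = f"\r\n--{boundary}--\r\n".encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/upload-model/",
        data=head + file_bytes + tail,
        headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
        method="POST",
    )


req = build_upload_request("http://localhost:8000", "tsmixer_v1.pt", b"\x00\x01")
# To actually send it (requires a running server):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```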

Getting Started

Prerequisites

  • Python 3.8+

  • MongoDB: For storing and retrieving sensor data.

  • Git: To clone the repository.

  • Virtual Environment: (Optional but recommended) To manage project dependencies.

Installation

  1. Clone the Repository

```bash
git clone https://github.com/PLKIT-AI/analysis.git
cd analysis
```

  2. Create a Virtual Environment

```bash
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

  3. Install Dependencies

```bash
pip install --upgrade pip
pip install -r requirements.txt
```

  4. Configure MongoDB

Ensure MongoDB is running and accessible. Update the MongoDB URI in app/utils.py if necessary.

Running the Application

  1. Start the FastAPI Server

```bash
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```

The --reload flag enables auto-reloading during development.

  2. Access Swagger UI

Navigate to http://localhost:8000/docs to access the interactive API documentation.
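Once the server is running, a prediction request can be sent to the /predict/ endpoint. The sketch below builds such a request with only the standard library; the payload field names (`"model_name"`, `"data"`) and values are assumptions — consult app/schemas.py or the Swagger UI for the actual request schema.

```python
# Hypothetical client sketch for the /predict/ endpoint. Payload field
# names are assumptions; see app/schemas.py for the real schema.
import json
import urllib.request


def build_predict_request(base_url, model_name, data_point):
    """Build a JSON POST request for the /predict/ endpoint."""
    payload = json.dumps({"model_name": model_name, "data": data_point})
    return urllib.request.Request(
        url=f"{base_url}/predict/",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_predict_request(
    "http://localhost:8000", "tsmixer_v1", {"temperature": 23.5}
)
# To actually send it (requires a running server):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```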

Dependencies

The project relies on the following major dependencies:

  • FastAPI: Web framework for building APIs.

  • Uvicorn: ASGI server for running FastAPI applications.

  • Darts: Time series library for forecasting.

  • PyMongo: MongoDB driver for Python.

  • Pydantic: Data validation and settings management.

  • Torch: PyTorch library for tensor computations.

All dependencies are listed in requirements.txt. Install them using:


```bash
pip install -r requirements.txt
```

Configuration

CORS Settings

Cross-Origin Resource Sharing (CORS) is configured in app/main.py using FastAPI's CORSMiddleware. Adjust the origins list to include the domains that should be allowed to access the API.


```python
# app/main.py
from fastapi.middleware.cors import CORSMiddleware

origins = [
    "http://localhost",
    "http://localhost:3000",
    "http://43.201.54.104",
    # Add additional allowed origins here
]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,    # Allowed origins
    allow_credentials=True,   # Allow credentials
    allow_methods=["*"],      # Allowed HTTP methods
    allow_headers=["*"],      # Allowed HTTP headers
)
```

Note: For development purposes, you can set allow_origins=["*"] to allow all origins. However, this is not recommended for production due to security risks.

Environment Variables

You may need to set environment variables for sensitive configurations such as MongoDB URI or server settings. Consider using a .env file and packages like python-dotenv to manage them securely.

Example .env file:


MONGODB_URI=mongodb://localhost:27017/

Update app/utils.py to load environment variables accordingly.
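One way to do this, sketched below with only the standard library, is a small helper that reads `KEY=VALUE` lines from .env into the process environment. In practice you may prefer python-dotenv as suggested above; the fallback default URI here is an assumption, not the project's actual configuration.

```python
# Stdlib-only sketch of loading MONGODB_URI from a .env file in
# app/utils.py. The fallback default is an assumption.
import os
from pathlib import Path


def load_env(path=".env"):
    """Read simple KEY=VALUE lines into os.environ (without overwriting)."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())


load_env()
MONGODB_URI = os.environ.get("MONGODB_URI", "mongodb://localhost:27017/")
```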

Contributing

Contributions are welcome! Follow these steps to contribute to the project:

  1. Fork the Repository

Click the "Fork" button on the repository page to create a copy under your GitHub account.

  2. Clone Your Fork

```bash
git clone https://github.com/your-username/analysis.git
cd analysis
```

  3. Create a New Branch

```bash
git checkout -b feature/your-feature-name
```

  4. Make Your Changes

Implement your feature or bug fix.

  5. Commit Your Changes

```bash
git add .
git commit -m "Add your commit message"
```

  6. Push to Your Fork

```bash
git push origin feature/your-feature-name
```

  7. Create a Pull Request

Go to the original repository and create a pull request from your fork's branch.

Guidelines

  • Code Style: Follow PEP 8 guidelines for Python code.

  • Commit Messages: Use clear and descriptive commit messages.

  • Documentation: Update the README or other documentation if necessary.

  • Testing: Ensure that your changes do not break existing functionality. Add tests if applicable.

License

This project is licensed under the MIT License.

© 2024 PLKIT-AI.analysis. All rights reserved.
