The `Importer` class is a Python utility designed to scan a network for IoT devices using the `TapoNetworkScanner` and store the information in a local SQLite database. This tool can also generate device configuration in YAML format for further use.
- Scans the local network for IoT devices using Tapo API credentials.
- Stores device information in an SQLite database.
- Generates JSON or YAML configuration of devices for easy integration.
- Supports both importing devices into the database and exporting configuration.
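The following is a minimal, hypothetical sketch of driving the scan-and-store workflow from Python; the module path, constructor arguments, and `import_devices` coroutine are illustrative assumptions and may not match the actual `Importer` API.

```python
import asyncio

from importer import Importer  # hypothetical import path

async def main():
    # The Importer is assumed to wrap TapoNetworkScanner and the SQLite-backed
    # IoTDeviceDatabase; the constructor arguments shown here are illustrative.
    importer = Importer(
        username="tapo-user@example.com",
        password="change-me",
        database="iot_devices.db",
    )
    # Scan the given range and persist any discovered devices.
    await importer.import_devices(ip_range="192.168.0.0/24")

if __name__ == "__main__":
    asyncio.run(main())
```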
- Python 3.7+
- `asyncio` for asynchronous operations
- `yaml` for exporting data to YAML format
- `sqlite3` via the `IoTDeviceDatabase` for device data storage
- Docker (optional, for containerization)
- Clone the repository:

  ```bash
  git clone <repository_url>
  cd <repository_directory>
  ```

- Install the required dependencies:

  ```bash
  pip install -r requirements.txt
  ```
Ensure that your `requirements.txt` file includes packages such as `pyyaml`; `sqlite3` is part of the Python standard library and does not need to be listed.
A `Dockerfile` is provided to containerize the application for easier deployment.

To build the Docker image locally, use the `build.sh` script as follows:

```bash
./build.sh
```
This script builds the Docker image for multiple platforms, such as `amd64` and `arm64`, and optionally pushes it to a registry if specified.
Alternatively, you can build the Docker image manually:

```bash
docker build -t iot-device-importer .
```
To run the Docker container:

```bash
docker run -d \
  -p 4667:4667 \
  -e CONFIG_PATH=/data/config.yaml \
  -v /path/to/your/data:/data \
  -v /path/to/your/rest:/rest \
  iot-device-importer
```
- Replace `/path/to/your/data` and `/path/to/your/rest` with the actual paths on your host machine.
The `Dockerfile` included in this project is set up to create a lightweight container for the IoT Device Importer. It uses a `python:3.9-slim` base image to keep the container small and efficient.

- Base Image: `python:3.9-slim`, a minimal Python image.
- Working Directory: `/app`, where all files are copied.
- Dependencies: Requirements are installed using `pip` from `requirements.txt`.
- Environment Variables:
  - `FLASK_PORT`: The port that Flask listens on; defaults to `4667`.
  - `CONFIG_PATH`: Set to `/data/config.yaml`, allowing easy configuration changes.
  - `FLASK_ENV` and `FLASK_DEBUG`: Production settings.
- Volumes: Two volumes are defined:
  - `/data`: For configuration and logging.
  - `/rest`: For storing REST configuration files.
- Run Command: Uses `gunicorn` to run the Flask server, ensuring production-level performance.
The Dockerfile supports multi-platform builds. You can use the Docker `buildx` plugin to create images for multiple platforms (e.g., `amd64`, `arm64`).

Example build command:

```bash
docker buildx build --platform linux/amd64,linux/arm64 -t iot-device-importer:latest .
```
To simplify running multiple services, you can create a `docker-compose.yml` file. Here is an example to run the IoT Device Importer:

```yaml
version: '3.8'
services:
  iot-importer:
    image: iot-device-importer:latest
    ports:
      - "4667:4667"
    environment:
      CONFIG_PATH: /data/config.yaml
    volumes:
      - /path/to/your/data:/data
      - /path/to/your/rest:/rest
```
Run the application with:

```bash
docker-compose up -d
```
The server provides an API endpoint to list all devices stored in the database.
- Endpoint: `/api/devices`
- Method: `GET`
- Query Parameters:
  - `page` (optional, int): The page number for pagination (default is `1`).
  - `page_size` (optional, int): The number of devices per page (default is `10`).
  - `state` (optional, str): Filter devices by their state (e.g., `up`).

Example request:

```bash
curl -X GET "http://localhost:4667/api/devices?page=1&page_size=5"
```
This will return a paginated list of devices stored in the database.
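The same request can also be made programmatically. Below is a small sketch using the third-party `requests` library (an assumption, since it is not listed among the project's dependencies); the shape of the returned JSON depends on the server implementation:

```python
import requests

# List the first page of devices that are currently "up", five per page.
response = requests.get(
    "http://localhost:4667/api/devices",
    params={"page": 1, "page_size": 5, "state": "up"},
    timeout=10,
)
response.raise_for_status()
print(response.json())
```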
The server provides an API endpoint to import devices by scanning the network.
- Endpoint: `/api/import_devices`
- Method: `POST`
- Request Body (JSON):
  - `ip_range` (str): The IP range to scan (e.g., `192.168.0.0/24`).

Example request:

```bash
curl -X POST "http://localhost:4667/api/import_devices" \
  -H "Content-Type: application/json" \
  -d '{"ip_range": "192.168.0.0/24"}'
```
This will initiate a network scan and import detected devices into the database. If a `path_to_rest_config` is specified in the configuration, the scanned data will also be stored in a YAML file.
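From Python, the same import can be triggered with a sketch like the one below (again assuming the `requests` library; the response message format is determined by the server):

```python
import requests

# Ask the server to scan the given IP range and import any detected devices.
response = requests.post(
    "http://localhost:4667/api/import_devices",
    json={"ip_range": "192.168.0.0/24"},
    timeout=120,  # scanning a /24 range can take some time
)
response.raise_for_status()
print(response.json())  # expected to contain the import status message
```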
The Importer expects the following configuration:
```yaml
tapo:
  username: "<redacted>"                  # Tapo account username (anonymized for security)
  password: "<redacted>"                  # Tapo account password (anonymized for security)
database: "iot_devices.db"                # Path to the database file
log_file: "/data/server.log"              # Path to the log file
# https://github.com/hanibalsk/tapo-rest-crossplatform
path_to_rest_config: "/rest/config.yaml"  # Path to the REST configuration file (optional)
```
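A minimal sketch of reading this configuration with PyYAML is shown below; the key layout mirrors the example above, so adjust the lookups if your actual file nests the keys differently:

```python
import yaml

# CONFIG_PATH defaults to /data/config.yaml inside the container.
with open("/data/config.yaml", "r", encoding="utf-8") as handle:
    config = yaml.safe_load(handle)

tapo_username = config["tapo"]["username"]
tapo_password = config["tapo"]["password"]
database_path = config.get("database", "iot_devices.db")
log_file = config.get("log_file", "/data/server.log")
rest_config_path = config.get("path_to_rest_config")  # optional
```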
`GET /api/devices`: Lists all devices in the database.

- Query Parameters:
  - `page` (optional, int): The page number for pagination.
  - `page_size` (optional, int): The number of devices per page.
  - `state` (optional, str): Filter devices by their state.
- Returns: A list of devices in the database.
`POST /api/import_devices`: Imports devices by scanning the network.

- Request Body (JSON):
  - `ip_range` (str): The IP range to scan.
- Returns: A message indicating the import status.
Logs are written to the log file specified in the configuration (`log_file`), with INFO-level logging used throughout the script to track progress and operations.
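A minimal sketch of configuring that logging with Python's standard `logging` module, assuming the log path has already been read from the configuration:

```python
import logging

# Send INFO-level messages to the configured log file (example path shown).
logging.basicConfig(
    filename="/data/server.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)

logger = logging.getLogger("iot_device_importer")
logger.info("Starting network scan for 192.168.0.0/24")
```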
This project is licensed under the MIT License. See the LICENSE file for more information.
Contributions are welcome! Please submit a pull request or open an issue for suggestions or improvements.
Currently, the API does not have any authentication mechanism. Implementing authentication is necessary to ensure that only authorized users can access and modify IoT device data. Possible approaches include:
- Token-based authentication (e.g., JWT).
- OAuth2 for more advanced use cases.
- Basic API key authentication.
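As an illustration of the simplest of these options, the sketch below adds API-key checking to a Flask route. It is not part of the current codebase; the header name, environment variable, and route body are assumptions:

```python
import os
from functools import wraps

from flask import Flask, jsonify, request

app = Flask(__name__)

# The expected key is taken from the environment so it never lives in the code.
API_KEY = os.environ.get("IMPORTER_API_KEY")

def require_api_key(view):
    """Reject requests that do not carry the expected X-API-Key header."""
    @wraps(view)
    def wrapper(*args, **kwargs):
        if API_KEY is None or request.headers.get("X-API-Key") != API_KEY:
            return jsonify({"error": "unauthorized"}), 401
        return view(*args, **kwargs)
    return wrapper

@app.route("/api/devices")
@require_api_key
def list_devices():
    # Placeholder: the real handler would query the SQLite database.
    return jsonify([])
```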
The Tapo credentials (`username` and `password`) are sensitive. Always handle them securely and avoid hardcoding them in public repositories.
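One common way to keep the credentials out of the repository is to read them from environment variables and fall back to the YAML file only in trusted environments. A sketch, with the variable names being assumptions:

```python
import os
import yaml

def load_tapo_credentials(config_path="/data/config.yaml"):
    """Prefer environment variables over values stored in the config file."""
    username = os.environ.get("TAPO_USERNAME")
    password = os.environ.get("TAPO_PASSWORD")
    if username and password:
        return username, password

    # Fall back to the YAML configuration (keep this file out of version control).
    with open(config_path, "r", encoding="utf-8") as handle:
        config = yaml.safe_load(handle)
    return config["tapo"]["username"], config["tapo"]["password"]
```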