This repository contains a PyTorch implementation for generating synthetic Korean licence plate (LP) registration numbers and translating them into realistic ones. First, synthetic Korean LP images are generated; these are then used as input to a Generative Adversarial Network (GAN) model that turns them into real-looking LP images with a certain amount of distortion.
conda create -n <ENV_NAME> python=3.9
conda activate <ENV_NAME>
pip install -r requirements.txt
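To confirm that the environment was set up correctly (in particular that PyTorch can see your GPU), a quick check such as the following can be run:

```python
import torch

# Print the installed PyTorch version and whether a CUDA device is visible.
print(torch.__version__, torch.cuda.is_available())
```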
The synthetic LP numbers are generated based on the latest available online information. According to this information, there are six widely used car LP types in South Korea:
- Private (European-sized);
- Private (North American-sized);
- Commercial (European-sized);
- Commercial (North American-sized);
- Private Cars Old-style (1973~2003);
- Private Cars Old-style (2004~2006);
| Private European-sized (3 digit) | Private North American-sized (3 digit) | Private European-sized (2 digit) | Private North American-sized (2 digit) |
|---|---|---|---|

| Commercial European-sized | Commercial North American-sized | Private Cars Old-style (1973~2003) | Private Cars Old-style (2004~2006) |
|---|---|---|---|
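If it helps to see how these six types can be organised programmatically, below is a minimal sketch of a plate-type registry. The dictionary keys, template file names, and canvas sizes are illustrative assumptions for this sketch, not values taken from the repository.

```python
# Illustrative registry of the six supported plate types.
# Keys, template paths, and canvas sizes (width, height) are assumptions,
# not the actual assets or dimensions used by generate.py.
PLATE_TYPES = {
    "private_eu":       {"template": "templates/private_eu.png",       "size": (520, 110)},
    "private_na":       {"template": "templates/private_na.png",       "size": (335, 170)},
    "commercial_eu":    {"template": "templates/commercial_eu.png",    "size": (520, 110)},
    "commercial_na":    {"template": "templates/commercial_na.png",    "size": (335, 170)},
    "private_old_1973": {"template": "templates/private_old_1973.png", "size": (335, 170)},
    "private_old_2004": {"template": "templates/private_old_2004.png", "size": (335, 170)},
}

def get_plate_spec(plate_type: str) -> dict:
    """Return the template path and canvas size for a plate type."""
    if plate_type not in PLATE_TYPES:
        raise ValueError(f"Unknown plate type: {plate_type}")
    return PLATE_TYPES[plate_type]
```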
python generate.py --data_path "path/to/csv_file" --save_path "path/to/save/synthetic_lps" --random=False --transformations=False --save=True
This script reads LP information from a pre-defined CSV file, generates synthetic LPs, and saves them to the save_path.
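As a rough illustration of what this CSV-driven step amounts to, here is a minimal sketch using Pillow. The CSV column names (`plate_type`, `plate_number`), the font file, and the drawing details are assumptions made for the example and do not reflect the actual implementation of generate.py.

```python
import csv
from PIL import Image, ImageDraw, ImageFont

def generate_from_csv(csv_path: str, save_path: str) -> None:
    """Render one synthetic plate image per CSV row (illustrative sketch)."""
    # Assumed columns: plate_type, plate_number (e.g. "123가4567").
    font = ImageFont.truetype("fonts/korean_plate.ttf", 72)  # hypothetical font file
    with open(csv_path, newline="", encoding="utf-8") as f:
        for i, row in enumerate(csv.DictReader(f)):
            canvas = Image.new("RGB", (520, 110), "white")   # European-sized canvas
            draw = ImageDraw.Draw(canvas)
            draw.text((30, 15), row["plate_number"], font=font, fill="black")
            canvas.save(f"{save_path}/{i:05d}_{row['plate_type']}.png")
```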
python generate.py --save_path "path/to/save/synthetic_lps" --random=True --transformations=False --save=True --number_of_plates 100
This script randomly creates LP information, generates synthetic LPs from it, and saves them to the save_path.
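For intuition, a standard Korean plate number consists of two or three leading digits, one Hangul syllable, and four trailing digits (e.g. 123가4567). A minimal sketch of sampling such numbers is shown below; the Hangul subset used here is only an illustrative selection, not the full set handled by the repository.

```python
import random

# Illustrative subset of the Hangul syllables that appear on plates.
HANGUL = ["가", "나", "다", "라", "마", "거", "너", "더", "러", "머"]

def random_plate_number(three_digit: bool = True) -> str:
    """Sample a plate number such as '123가4567' (sketch only)."""
    lead = random.randint(100, 999) if three_digit else random.randint(10, 99)
    return f"{lead}{random.choice(HANGUL)}{random.randint(0, 9999):04d}"

print(random_plate_number())       # e.g. 537나0291
print(random_plate_number(False))  # e.g. 42머8810
```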
python make_dataset.py --in_im_paths "path/to/generated/synthetic_lps" --out_im_paths "path/to/real-life/images" --trainA "path/to/copy/synthetic/images" --trainB "path/to/copy/real-life/images" --type "train or test depending on dataset type"
This script copies the generated synthetic LPs into the domain A folder and the real-life images into the domain B folder, building the two-domain dataset ("--type" selects whether it is used for training or testing) that the GAN training step below expects.
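Conceptually, this step just places the two image domains side by side; a minimal sketch of such a dataset-building routine is below. The function name and file handling are assumptions, not the actual contents of make_dataset.py.

```python
import shutil
from pathlib import Path

def build_dataset(synthetic_dir: str, real_dir: str, domain_a: str, domain_b: str) -> None:
    """Copy synthetic images into domain A and real-life images into domain B (sketch)."""
    for src_dir, dst_dir in ((synthetic_dir, domain_a), (real_dir, domain_b)):
        dst = Path(dst_dir)
        dst.mkdir(parents=True, exist_ok=True)
        for img in Path(src_dir).glob("*.png"):
            shutil.copy(img, dst / img.name)

# e.g. build_dataset("synthetic_lps", "real_lps", "datasets/lp/trainA", "datasets/lp/trainB")
```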
After generating the synthetic images, we train a modified (faster and more efficient) CUT GAN model on them as follows:
python train.py --dataroot path/to/the/dataset --name name/of/the/trained/model --CUT_mode CUT/FastCUT
This script trains the model selected by the "--CUT_mode" argument (CUT or FastCUT) on the given dataroot (the root should contain two folders, trainA and trainB) and saves the model outputs under the name given by "--name" (this name is later used for testing).
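Before launching training, it can be useful to verify that the dataroot really contains the two expected domain folders. The helper below is a small sketch for that check and is not part of the repository.

```python
from pathlib import Path

def check_dataroot(dataroot: str, phase: str = "train") -> None:
    """Verify that <dataroot>/<phase>A and <dataroot>/<phase>B exist and report their sizes."""
    for domain in ("A", "B"):
        folder = Path(dataroot) / f"{phase}{domain}"
        if not folder.is_dir():
            raise FileNotFoundError(f"Expected folder is missing: {folder}")
        print(f"{folder}: {len(list(folder.iterdir()))} files")

# e.g. check_dataroot("datasets/lp")          # before training
#      check_dataroot("datasets/lp", "test")  # before testing
```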
python test.py --dataroot path/to/the/dataset --name name/of/the/trained/model --CUT_mode CUT/FastCUT --phase test
This script runs inference with the pretrained model (selected via the "--name" argument) in the mode given by "--CUT_mode" (CUT or FastCUT) on the given test dataroot (the root should contain two folders, testA and testB). The inference results can be found at ./results/name/train_latest/...
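To inspect the inference outputs programmatically, a small sketch like the following can gather them; the exact sub-directory layout under ./results and the file extensions are assumptions that may differ in practice.

```python
from pathlib import Path
from PIL import Image

def load_results(results_dir: str):
    """Yield (path, image) pairs for every PNG found under the results directory (sketch)."""
    for img_path in sorted(Path(results_dir).rglob("*.png")):
        yield img_path, Image.open(img_path)

# e.g. for path, im in load_results("./results/my_model/train_latest"): print(path, im.size)
```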
Generated sample LPs can be seen below: