
Damo Yolo pth to coreml #139

Open · 4 tasks done
adkbbx opened this issue May 17, 2024 · 0 comments
Labels: question (Further information is requested)

adkbbx commented May 17, 2024

Before Asking

  • I have read the README carefully.

  • I want to train my custom dataset; I have read the tutorials for fine-tuning on custom data carefully and organized my dataset with the correct directory structure.

  • I have pulled the latest code from the main branch and run it again, and the problem still exists.

Search before asking

  • I have searched the DAMO-YOLO issues and found no similar questions.

Question

I am trying to convert my custom object detection model, built on DAMO-YOLO with 4 classes, into Core ML so I can run it in my Swift application, and I want to understand what the outputs mean.

[Screenshot: outputs of the converted Core ML model, showing "var_1262" and "var_1298"]

The shapes of "var_1262" and "var_1298" are ([1, 8400, 4]) and ([1, 8400, 4]) respectively. I want to know what these outputs mean and how I can add an NMS layer to the current model so that I can run predictions on my iOS device.

I adapted the ONNX conversion code to convert my .pth model into Core ML using coremltools; the code is as follows.

import torch
import coremltools as ct
from torch import nn
from loguru import logger
from damo.base_models.core.end2end import End2End
from damo.base_models.core.ops import RepConv, SiLU
from damo.config.base import parse_config
from damo.detectors.detector import build_local_model
from damo.utils.model_utils import get_model_info, replace_module


device = torch.device('cpu')
config_file = "./configs/damoyoloT.py"
config = parse_config(config_file)

# build model
model = build_local_model(config, device)
model.eval()

ckpt_file = "./latest_ckpt.pth"
# load model parameters
ckpt = torch.load(ckpt_file, map_location=device)

if 'model' in ckpt:
    ckpt = ckpt['model']
model.load_state_dict(ckpt, strict=True)
logger.info(f'loading checkpoint from {ckpt_file}.')

# replace nn.SiLU activations with the export-friendly SiLU implementation
model = replace_module(model, nn.SiLU, SiLU)

# fuse RepConv branches into their single-conv deploy form
for layer in model.modules():
    if isinstance(layer, RepConv):
        layer.switch_to_deploy()

info = get_model_info(model, (640, 640))
logger.info(info)

# disable the head's built-in NMS so the exported model returns raw scores and boxes
model.head.nms = False

# trace the model with a dummy input at the export resolution
inputs = torch.randn(1, 3, 640, 640)
traced_model = torch.jit.trace(model, inputs)

# Core ML image input: pixel values are scaled from [0, 255] to [0, 1]
input_image = ct.ImageType(name='inputs', shape=(1, 3, 640, 640),
                           scale=1 / 255, bias=[0, 0, 0])
coreml_model = ct.convert(traced_model, inputs=[input_image])
coreml_model.save('./latest_ckpt_test.mlmodel')
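
For context, here is a minimal post-processing sketch I am experimenting with in Python. It assumes (unconfirmed on my side) that one output is the per-class scores with shape [1, 8400, num_classes] and the other is the boxes in (x1, y1, x2, y2) format with shape [1, 8400, 4], and it uses torchvision's batched NMS as a stand-in for the NMS step I would like to bake into the Core ML model or port to Swift:

import torch
import torchvision

def postprocess(scores, boxes, conf_thres=0.25, iou_thres=0.45):
    # scores: [1, 8400, num_classes]; boxes: [1, 8400, 4] in (x1, y1, x2, y2)
    # (the output names/order are my assumption, not confirmed from the model)
    scores, boxes = scores[0], boxes[0]
    cls_conf, cls_idx = scores.max(dim=1)          # best class and its confidence per anchor
    keep = cls_conf > conf_thres                   # confidence filter
    boxes, cls_conf, cls_idx = boxes[keep], cls_conf[keep], cls_idx[keep]
    # class-aware NMS: boxes of different classes are never suppressed against each other
    keep = torchvision.ops.batched_nms(boxes, cls_conf, cls_idx, iou_thres)
    return boxes[keep], cls_conf[keep], cls_idx[keep]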

Thanks in advance for any answer.

Additional

No response

adkbbx added the question (Further information is requested) label on May 17, 2024