Problems loading torchscript #795

Open
davave opened this issue Jun 21, 2021 · 2 comments

davave commented Jun 21, 2021

Background

python version: 3.7.3
pytorch version: 1.8.1
torchvision version: 0.9.1

Issue

I'm trying to run a TorchScript model from Ultralytics/YOLOv5 in RedisAI using Python.
I exported the model using export.py:

# TorchScript export -----------------------------------------------------------------------------------------------
if 'torchscript' in include or 'coreml' in include:
    prefix = colorstr('TorchScript:')
    try:
        print(f'\n{prefix} starting export with torch {torch.__version__}...')
        f = weights.replace('.pt', '.torchscript.pt')  # filename
        ts = torch.jit.trace(model, img, strict=False)
        (optimize_for_mobile(ts) if optimize else ts).save(f)
        print(f'{prefix} export success, saved as {f} ({file_size(f):.1f} MB)')
    except Exception as e:
        print(f'{prefix} export failure: {e}')

and the file weights.torchscript.pt is generated. When I try to run

    import redis
    from urllib.parse import urlparse
    url = urlparse('redis://127.0.0.1:6379')
    conn = redis.Redis(host=url.hostname, port=url.port)
    with open('./weights.torchscript.pt', 'rb') as f:
        model = f.read()
        # args.device is the target device string for RedisAI, e.g. 'CPU'
        res = conn.execute_command('AI.SCRIPTSET', 'yolo:model', args.device, model)

the following error appears:

Traceback (most recent call last):
  File "init.py", line 45, in <module>
    res = conn.execute_command('AI.SCRIPTSET', 'yolo:model', args.device, model)
  File ".../.pyenv/versions/3.7.3/envs/3_7_3_venv/lib/python3.7/site-packages/redis/client.py", line 775, in execute_command
    return self.parse_response(connection, command_name, **options)
  File ".../.pyenv/versions/3.7.3/envs/3_7_3_venv/lib/python3.7/site-packages/redis/client.py", line 789, in parse_response
    response = connection.read_response()
  File ".../.pyenv/versions/3.7.3/envs/3_7_3_venv/lib/python3.7/site-packages/redis/connection.py", line 642, in read_response
    raise response
redis.exceptions.ResponseError: expected def but found 'ident' here: at <string>:1:0 PK ~~ <--- HERE 

Could anyone help me, please? I am starting to use RedisAI within the RedisEdge framework, but I can't figure out whether the problem is in the TorchScript export, in the RedisAI loading, or whether I am missing something important.
Thank you!

gkorland (Contributor) commented

@davave are you sure it's a TorchScript script and not a PyTorch model?
Can you please try to load the model with AI.MODELSET and not AI.SCRIPTSET?
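
For context, AI.SCRIPTSET expects TorchScript source text (a script containing def ... functions), whereas AI.MODELSET expects a serialized module blob like the one torch.jit.trace(...).save() writes; handing a .pt archive to AI.SCRIPTSET is presumably what produces the "expected def" parse error above. A minimal sketch of the intended AI.SCRIPTSET usage, with a made-up key and function name, 'CPU' as a placeholder device, the same conn object as above, and the same argument form as the call above:

    from textwrap import dedent

    # AI.SCRIPTSET takes TorchScript *source code*, not a serialized .pt file;
    # untyped arguments default to Tensor in TorchScript.
    script_source = dedent('''
        def add_two(a, b):
            return a + b
    ''')
    conn.execute_command('AI.SCRIPTSET', 'yolo:script', 'CPU', script_source)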


davave commented Jun 22, 2021

Thank you @gkorland.

@davave are you sure it's a TorchScript script and not a PyTorch model?

I am not really sure, as I am quite a newbie with these tools, but I looked around and found that there are two different ways to create a TorchScript script from a PyTorch model: jit.trace and jit.script. I chose the former since it was the one proposed by the authors and I did not have complete knowledge of the model.
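
Roughly, the difference looks like this (a toy module for illustration only, not the YOLOv5 network):

    import torch

    class TinyNet(torch.nn.Module):
        def forward(self, x):
            # data-dependent branch: trace bakes in whichever path the example
            # input takes, while script compiles the source and keeps both
            if x.sum() > 0:
                return x * 2
            return x

    net = TinyNet().eval()
    example = torch.zeros(1, 3)

    traced = torch.jit.trace(net, example)   # records the ops run for `example`
    scripted = torch.jit.script(net)         # compiles the Python source directly

    traced.save('traced.pt')                 # both produce loadable ScriptModules
    scripted.save('scripted.pt')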

Can you please try to load the model with AI.MODELSET and not AI.SCRIPTSET?

When I run:

    with open('./weights.torchscript.pt', 'rb') as f:
        model = f.read()
        res = conn.execute_command('AI.MODELSET', 'yolo:model', 'TORCH', args.device, model)

the result is

Loading model - Traceback (most recent call last):
  File "init.py", line 45, in <module>
    res = conn.execute_command('AI.MODELSET', 'yolo:model', 'TORCH', args.device, model)
  File "/home/davide/.pyenv/versions/3.7.3/envs/3_7_3_venv/lib/python3.7/site-packages/redis/client.py", line 775, in execute_command
    return self.parse_response(connection, command_name, **options)
  File "/home/davide/.pyenv/versions/3.7.3/envs/3_7_3_venv/lib/python3.7/site-packages/redis/client.py", line 789, in parse_response
    response = connection.read_response()
  File "/home/davide/.pyenv/versions/3.7.3/envs/3_7_3_venv/lib/python3.7/site-packages/redis/connection.py", line 642, in read_response
    raise response
redis.exceptions.ResponseError: version_number <= kMaxSupportedFileFormatVersion INTERNAL ASSERT FAILED at /pytorch/caffe2/serialize/inline_container.cc:131,
 please report a bug to PyTorch. Attempted to read a PyTorch file with version 3, but the maximum supported version for reading is 1. Your PyTorch installation may be too old. (init at /pytorch/caffe2/serialize/inline_container.cc:131) 
frame #0: c10::Error::Error(c10::SourceLocation, std::string const&) + 0x33 (0x7fc6eddd2273 in /usr/lib/redis/modules/backends/redisai_torch/lib/libc10.so) 
frame #1: caffe2::serialize::PyTorchStreamReader::init() + 0x1e9a (0x7fc6efc2c40a in /usr/lib/redis/modules/backends/redisai_torch/lib/libtorch.so) 
frame #2: caffe2::serialize::PyTorchStreamReader::PyTorchStreamReader(std::unique_ptr<caffe2::serialize::ReadAdapterInterface, std::default_delete<caffe2::serialize::ReadAdapterInterface> >) + 0x53 (0x7fc6efc2d5c3 in /usr/lib/redis/modules/backends/redisai_torch/lib/libtorch.so)
frame #3: <unknown function> + 0x2d120d2 (0x7fc6f0d0b0d2 in /usr/lib/redis/modules/backends/redisai_torch/lib/libtorch.so) 
frame #4: torch::jit::load(std::unique_ptr<caffe2::serialize::ReadAdapterInterface, std::default_delete<caffe2::serialize::ReadAdapterInterface> >, c10::optional<c10::Device>, std::unordered_map<std::string, std::string, std::hash<std::string>, std::equal_to<std::string>, std::allocator<std::pair<std::string const, std::string> > >&) + 0x27 (0x7fc6f0d0a027 in /usr/lib/redis/modules/backends/redisai_torch/lib/libtorch.so)
frame #5: torch::jit::load(std::istream&, c10::optional<c10::Device>, std::unordered_map<std::string, std::string, std::hash<std::string>, std::equal_to<std::string>, std::allocator<std::pair<std::string const, std::string> > >&) + 0x69 (0x7fc6f0d0a2a9 in /usr/lib/redis/modules/backends/redisai_torch/lib/libtorch.so) 
frame #6: torchLoadModel + 0x1f7 (0x7fc6fa8ec8b7 in /usr/lib/redis/modules/backends/redisai_torch/redisai_torch.so)
frame #7: RAI_ModelCreateTorch + 0x9d (0x7fc6fa8e8ebd in /usr/lib/redis/modules/backends/redisai_torch/redisai_torch.so) 
frame #8: RedisAI_ModelSet_RedisCommand + 0x242 (0x7fc73ca82782 in /usr/lib/redis/modules/redisai.so)
frame #9: RedisModuleCommandDispatcher + 0x56 (0x564b52a9b6d6 in redis-server *:6379)
frame #10: call + 0xa7 (0x564b52a2b2b7 in redis-server *:6379) 
frame #11: processCommand + 0x51e (0x564b52a2bb5e in redis-server *:6379) 
frame #12: processInputBuffer + 0x171 (0x564b52a3bd51 in redis-server *:6379) 
frame #13: aeProcessEvents + 0x101 (0x564b52a252e1 in redis-server *:6379) 
frame #14: aeMain + 0x2b (0x564b52a256eb in redis-server *:6379) 
frame #15: main + 0x4b9 (0x564b52a22569 in redis-server *:6379) 
frame #16: __libc_start_main + 0xeb (0x7fc73cabf09b in /lib/x86_64-linux-gnu/libc.so.6) 
frame #17: _start + 0x2a (0x564b52a227aa in redis-server *:6379) 

I also tried with AI.MODELSTORE, but the result is the same.
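
For what it's worth, this second error points to a serialization-version mismatch rather than a broken export: the file was written by PyTorch 1.8.1 with serialized-file-format version 3, while the libtorch bundled with this RedisAI build can only read up to version 1. A quick local check, reusing the path above, that would separate an export problem from a backend-version problem (a sketch, not a fix):

    import torch

    # If this load succeeds locally, the exported file itself is fine and the
    # failure comes from the older libtorch shipped with the RedisAI module;
    # re-exporting with a PyTorch version matching the backend, or upgrading
    # RedisAI, would be the usual way out.
    model = torch.jit.load('./weights.torchscript.pt', map_location='cpu')
    print(torch.__version__, type(model))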
