Streaming model docker not working #1906

Hamlet626 opened this issue Sep 14, 2024 · 0 comments
I tried to dockerize a streaming model by following the example streaming server. I built and ran the image with something like "mlserver build streaming_model/ -t stream_ml_service" followed by "docker run -it --rm -p 8080:8080 stream_ml_service", but it raised:

Traceback (most recent call last):
  File "/opt/conda/bin/mlserver", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 269, in main
    root()
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 24, in wrapper
    return asyncio.run(f(*args, **kwargs))
  File "/opt/conda/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 47, in start
    server = MLServer(settings)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/server.py", line 32, in __init__
    self._metrics_server = MetricsServer(self._settings)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/metrics/server.py", line 26, in __init__
    self._app = self._get_app()
  File "/opt/conda/lib/python3.10/site-packages/mlserver/metrics/server.py", line 30, in _get_app
    app.add_route(self._settings.metrics_endpoint, self._endpoint.handle_metrics)
  File "/opt/conda/lib/python3.10/site-packages/starlette/applications.py", line 166, in add_route
    self.router.add_route(
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 833, in add_route
    route = Route(
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 226, in __init__
    assert path.startswith("/"), "Routed paths must start with '/'"
AssertionError: Routed paths must start with '/'

Does anyone know how to dockerize a streaming model? Thanks!
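Not a confirmed fix, but given where the assertion fires, one thing worth checking is whether the settings.json baked into the image ends up with a metrics_endpoint that is empty or missing its leading slash. A sketch of a settings.json that spells the endpoint out explicitly (the "/metrics" value is an assumption about the intended path, not taken from the issue):

```json
{
  "metrics_endpoint": "/metrics"
}
```

Some MLServer versions also accept setting metrics_endpoint to null to disable the metrics route entirely, which would sidestep the registration, though I haven't verified that against the version in this traceback.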
