Merge pull request #10 from viam-modules/inline-docs
RSDK-9945: edits for inline docs
bhaney authored Feb 11, 2025
2 parents cda5ae4 + ccfd915 commit e09ebc1
Showing 4 changed files with 23 additions and 6 deletions.
7 changes: 7 additions & 0 deletions README.md
@@ -56,7 +56,14 @@ The following attributes are available to configure your module:
| `model_path` | string | **Required** | | Path to **standalone** model file |
| `label_path` | string | Optional | | Path to file with class labels. |

+### Example configuration

+```json
+{
+"model_path": "${packages.ml_model.myMLModel}/my_model.pt",
+"label_path": "${packages.ml_model.myMLModel}/labels.txt"
+}
+```


# Methods
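A note on the `model_path` attribute documented above: "standalone" presumably means the file can be loaded on its own (a TorchScript archive or a fully pickled model), rather than a bare state dict that needs the model class. The sketch below is illustrative only and not taken from this module's code; the local path stands in for the resolved `${packages.ml_model.myMLModel}` location.

```python
# Illustrative check, not part of this commit: confirm that model_path
# points at a standalone model file that PyTorch can load on CPU.
import torch

model_path = "/path/to/my_model.pt"  # placeholder for the resolved package path

try:
    # TorchScript archive produced by torch.jit.save
    model = torch.jit.load(model_path, map_location="cpu")
except RuntimeError:
    # Fallback: a fully pickled model saved with torch.save(model, ...)
    model = torch.load(model_path, map_location="cpu")
model.eval()
```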
8 changes: 5 additions & 3 deletions meta.json
@@ -1,12 +1,14 @@
{
"module_id": "viam:torch-cpu",
"visibility": "public",
"url": "https://github.com/viam-labs/torch",
"url": "https://github.com/viam-modules/torch",
"description": "Viam ML Module service serving PyTorch models.",
"models": [
{
"api": "rdk:service:mlmodel",
"model": "viam:mlmodel:torch-cpu"
"model": "viam:mlmodel:torch-cpu",
"markdown_link": "README.md#example-configuration",
"short_description": "An ML Model Service that can run PyTorch models in a standard format"
}
],
"build": {
@@ -19,4 +21,4 @@
]
},
"entrypoint": "dist/main"
-}
+}
1 change: 0 additions & 1 deletion src/test_local.py
@@ -180,4 +180,3 @@ def test_infer_method(self):

if __name__ == "__main__":
unittest.main()

13 changes: 11 additions & 2 deletions src/torch_mlmodel_module.py
@@ -95,7 +95,11 @@ def get_attribute_from_config(attribute_name: str, default, of_type=None):
self._metadata = self.inspector.find_metadata(label_file)

async def infer(
-self, input_tensors: Dict[str, NDArray], *, timeout: Optional[float]
+self,
+input_tensors: Dict[str, NDArray],
+*,
+extra: Optional[Mapping[str, ValueTypes]],
+timeout: Optional[float],
) -> Dict[str, NDArray]:
"""Take an already ordered input tensor as an array,
make an inference on the model, and return an output tensor map.
@@ -110,7 +114,12 @@ async def infer(
"""
return self.torch_model.infer(input_tensors)

-async def metadata(self, *, timeout: Optional[float]) -> Metadata:
+async def metadata(
+self,
+*,
+extra: Optional[Mapping[str, ValueTypes]],
+timeout: Optional[float],
+) -> Metadata:
"""Get the metadata (such as name, type, expected tensor/array shape,
inputs, and outputs) associated with the ML model.
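The two signature changes above add a keyword-only `extra` parameter alongside `timeout`, presumably to line the methods up with the Viam SDK's MLModel service interface. A rough caller-side sketch follows, assuming the Viam Python SDK's `MLModelClient`; the service name, tensor name, and shape are placeholders rather than anything taken from this commit.

```python
# Rough usage sketch, not part of this commit. Assumes a connected
# RobotClient with an MLModel service configured as "my-torch-service".
import numpy as np
from viam.robot.client import RobotClient
from viam.services.mlmodel import MLModelClient


async def run_inference(machine: RobotClient):
    mlmodel = MLModelClient.from_robot(machine, "my-torch-service")
    # Check the model's expected tensor names and shapes first.
    md = await mlmodel.metadata()
    # Placeholder input; the real tensor names/shapes come from the metadata.
    input_tensors = {"input": np.zeros((1, 3, 224, 224), dtype=np.float32)}
    outputs = await mlmodel.infer(input_tensors)
    return md, outputs
```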
