diff --git a/.github/FUNDING.yml b/.github/FUNDING.yml
index 57c33ba..8d5eae0 100644
--- a/.github/FUNDING.yml
+++ b/.github/FUNDING.yml
@@ -1,3 +1,6 @@
# FUNDING.yml
-github: VigneshVSV
\ No newline at end of file
+github: VigneshVSV
+buy_me_a_coffee: vigneshvsv
+thanks_dev: gh/vigneshvsv
+open_collective: hololinked-dev
diff --git a/.github/workflows/python-publish-pypi.yml b/.github/workflows/python-publish-pypi.yml
index 27de753..8f3264c 100644
--- a/.github/workflows/python-publish-pypi.yml
+++ b/.github/workflows/python-publish-pypi.yml
@@ -30,7 +30,7 @@ jobs:
python -m pip install --upgrade pip
pip install build
- name: Build package
- run: python -m build --wheel
+ run: python -m build
- name: Publish package
uses: pypa/gh-action-pypi-publish@release/v1
with:
diff --git a/.github/workflows/python-publish-testpypi.yml b/.github/workflows/python-publish-testpypi.yml
index b2a88b4..98a29c4 100644
--- a/.github/workflows/python-publish-testpypi.yml
+++ b/.github/workflows/python-publish-testpypi.yml
@@ -30,7 +30,7 @@ jobs:
python -m pip install --upgrade pip
pip install build
- name: Build package
- run: python -m build --wheel
+ run: python -m build
- name: Publish package
uses: pypa/gh-action-pypi-publish@release/v1
with:
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 5b98bdf..ac099f5 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -9,9 +9,29 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
✓ means ready to try
+New:
- cookie auth & its specification in TD (cookie auth branch)
-- image event handlers (develop branch) for streaming live video as JPEG and PNG ✓
-- pydantic support for property models (develop branch) ✓
+- custom handlers for each property, action and event to override default behaviour
+- pydantic support for property models
+
+Bug Fixes:
+- composed sub`Thing`s exposed with correct URL path ✓
+
+## [v0.2.7] - 2024-10-22
+
+- HTTP SSE connections previously remained open when a client disconnected abruptly (like closing a browser tab); they now close correctly
+- retrieve unserialized data from events with `ObjectProxy` (like JPEG images) by setting `deserialize=False` in `subscribe_event()`
+
+## [v0.2.6] - 2024-09-09
+
+- bug fix for events when multiple serializers are used
+- events support custom HTTP handlers (not polished yet, use as last resort, not auto-added to TD)
+- image event handlers for streaming live video as JPEG and PNG (not polished yet, not auto-added to TD)
+
+## [v0.2.5] - 2024-09-09
+
+- released to Anaconda; it can take a while to appear. A badge will be added to the README when successful.
## [v0.2.4] - 2024-09-09
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index f1c59e6..2f506d9 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -11,16 +11,15 @@ All types of contributions are encouraged and valued.
> - Mention the project at local meetups/conferences and tell your friends/colleagues
> - Donate to cover the costs of maintaining it
-
## I Have a Question
-Do feel free to reach out to me at vignesh.vaidyanathan@hololinked.dev. I will try my very best to respond.
+Do feel free to reach out to me at vignesh.vaidyanathan@hololinked.dev or on Discord. I will try my very best to respond.
Nevertheless, one may also refer the available how-to section of the [Documentation](https://hololinked.readthedocs.io/en/latest/index.html).
If the documentation is insufficient for any reason including being poorly documented, one may open a new discussion in the [Q&A](https://github.com/VigneshVSV/hololinked/discussions/categories/q-a) section of GitHub discussions.
For questions related to workings of HTTP, JSON schema, basic concepts of python like descriptors, decorators etc., it is also advisable to search the internet for answers first.
-For generic questions related to web of things standards or its ideas, I recommend to join web of things [discord](https://discord.com/invite/RJNYJsEgnb) group and [community](https://www.w3.org/community/wot/) group.
+For generic questions related to Web of Things standards or its ideas, it is recommended to join the Web of Things [discord](https://discord.com/invite/RJNYJsEgnb) group and [community](https://www.w3.org/community/wot/) group.
If you believe your question might also be a bug, you might want to search for existing [Issues](https://github.com/VigneshVSV/hololinked/issues) that might help you.
In case you have found a suitable issue and still need clarification, you can write your question in this issue. If an issue is not found:
@@ -44,13 +43,21 @@ Otherwise, I will then take care of the issue as soon as possible.
> ### Legal Notice
> When contributing to this project, you must agree that you have authored 100% of the content or that you have the necessary rights to the content. For example, you copied code from projects with MIT/BSD License. Content from GPL-related licenses may be maintained in a separate repository as an add-on.
-Developers are always welcome to contribute to the code base. If you want to tackle any issues, un-existing features, let me know (at my email), I can create some open issues and features which I was never able to solve or did not have the time. You can also suggest what else can be contributed functionally or conceptually or also simply code-refactoring. The lack of issues or features in the [Issues](https://github.com/VigneshVSV/hololinked/issues) section of github does not mean the project is considered feature complete or I dont have ideas what to do next. On the contrary, there is tons of work to do.
+Developers are always welcome to contribute to the code base. If you want to tackle existing issues or missing features, let me know (at my email) and I can create open issues for features I was never able to solve or did not have time for. You can also suggest what else can be contributed functionally or conceptually, or simply code refactoring.
-There are also repositories which can use your skills:
+There are also other repositories which can use your skills:
- An [admin client](https://github.com/VigneshVSV/thing-control-panel) in react
- [Documentation](https://github.com/VigneshVSV/hololinked-docs) in sphinx which needs significant improvement in How-To's, beginner level docs which may teach people concepts of data acquisition or IoT, Docstring or API documentation of this repository itself
- [Examples](https://github.com/VigneshVSV/hololinked-examples) in nodeJS, Dashboard/PyQt GUIs or server implementations using this package. Hardware implementations of unexisting examples are also welcome, I can open a directory where people can search for code based on hardware and just download your code.
+## Git Branching
+
+A simpler model is used, roughly based on [this article](https://www.bitsnbites.eu/a-stable-mainline-branching-model-for-git/):
+- main branch is where all stable developments are merged; all your branches must merge here
+- main branch is merged to the release branch when it is decided to create a release.
+- A specific release is tagged and not created as its own branch. Instead, the release branch simply follows the main branch at release time. People should clone the main branch for the latest (mostly) stable code base and the release branch for the released code base.
+- other branches are feature or bug-fix branches. A develop branch may be used to make general improvements as the package is constantly evolving, but it's not a specific philosophy to use a develop branch.
+- Bug fixes on releases must proceed from the tag of that release. A new release can then be made after fixing the bug by merging the bug-fix branch into the main branch.
## Attribution
This guide is based on the **contributing-gen**. [Make your own](https://github.com/bttger/contributing-gen)!
diff --git a/README.md b/README.md
index f9b9301..53bd63f 100644
--- a/README.md
+++ b/README.md
@@ -3,16 +3,23 @@
### Description
`hololinked` is a beginner-friendly server side pythonic tool suited for instrumentation control and data acquisition over network, especially with HTTP. If you have a requirement to control and capture data from your hardware/instrumentation, show the data in a browser/dashboard, provide a GUI or run automated scripts, `hololinked` can help. Even for isolated applications or a small lab setup without networking concepts, one can still separate the concerns of the tools that interact with the hardware & the hardware itself.
+
+For those familiar with the terminology, this package is a ZMQ/HTTP-RPC.
-[![Documentation Status](https://readthedocs.org/projects/hololinked/badge/?version=latest)](https://hololinked.readthedocs.io/en/latest/?badge=latest) [![PyPI](https://img.shields.io/pypi/v/hololinked?label=pypi%20package)](https://pypi.org/project/hololinked/) [![PyPI - Downloads](https://img.shields.io/pypi/dm/hololinked)](https://pypistats.org/packages/hololinked) [![codecov](https://codecov.io/gh/VigneshVSV/hololinked/graph/badge.svg?token=JF1928KTFE)](https://codecov.io/gh/VigneshVSV/hololinked)
+[![Documentation Status](https://readthedocs.org/projects/hololinked/badge/?version=latest)](https://hololinked.readthedocs.io/en/latest/?badge=latest) [![PyPI](https://img.shields.io/pypi/v/hololinked?label=pypi%20package)](https://pypi.org/project/hololinked/) [![Anaconda](https://anaconda.org/conda-forge/hololinked/badges/version.svg)](https://anaconda.org/conda-forge/hololinked)
+[![codecov](https://codecov.io/gh/VigneshVSV/hololinked/graph/badge.svg?token=JF1928KTFE)](https://codecov.io/gh/VigneshVSV/hololinked)
[![email](https://img.shields.io/badge/email%20me-brown)](mailto:vignesh.vaidyanathan@hololinked.dev) [![ways to contact me](https://img.shields.io/badge/ways_to_contact_me-brown)](https://hololinked.dev/contact)
+
+[![PyPI - Downloads](https://img.shields.io/pypi/dm/hololinked?label=pypi%20downloads)](https://pypistats.org/packages/hololinked)
+[![Conda Downloads](https://img.shields.io/conda/d/conda-forge/hololinked)](https://anaconda.org/conda-forge/hololinked)
### To Install
-From pip - ``pip install hololinked``
+From pip - ``pip install hololinked``
+
+From conda - ``conda install -c conda-forge hololinked``
-Or, clone the repository (develop branch for latest codebase) and install `pip install .` / `pip install -e .`. The conda env ``hololinked.yml`` can also help to setup all dependencies.
+Or, clone the repository (main branch for latest codebase) and install with `pip install .` / `pip install -e .`. The conda env file ``hololinked.yml`` can also help to set up all dependencies.
### Usage/Quickstart
@@ -77,6 +84,7 @@ class OceanOpticsSpectrometer(Thing):
super().__init__(instance_name=instance_name, serial_number=serial_number, **kwargs)
```
+> There is ongoing work to remove the HTTP API from the property API and move it completely to the HTTP server
In non-expert terms, properties look like class attributes however their data containers are instantiated at object instance level by default.
For example, the `integration_time` property defined above as `Number`, whenever set/written, will be validated as a float or int, cropped to bounds and assigned as an attribute to each instance of the `OceanOpticsSpectrometer` class with an internally generated name. It is not necessary to know this internally generated name as the property value can be accessed again in any python logic, say, `print(self.integration_time)`.
@@ -103,9 +111,9 @@ class OceanOpticsSpectrometer(Thing):
```
-In this case, instead of generating a data container with an internal name, the setter method is called when `integration_time` property is set/written. One might add the hardware device driver (say, supplied by the manufacturer) logic here to apply the property onto the device. In the above example, there is not a way provided by lower level library to read the value from the device, so we store it in a variable after applying it and supply the variable back to the getter method. Normally, one would also want the getter to read from the device directly.
+In this case, instead of generating a data container with an internal name, the setter method is called when the `integration_time` property is set/written. One might add the hardware device driver logic here (say, supplied by the manufacturer), or a protocol that talks directly to the device, to apply the property onto the device. In the above example, there is no way provided by the device driver library to read the value back from the device, so we store it in a variable after applying it and supply the variable back to the getter method. Normally, one would also want the getter to read from the device directly.
-Those familiar with Web of Things (WoT) terminology may note that these properties generate the property affordance schema. An example of autogenerated property affordance for `integration_time` is as follows:
+Those familiar with Web of Things (WoT) terminology may note that these properties generate the property affordance. An example for `integration_time` is as follows:
```JSON
"integration_time": {
@@ -128,8 +136,8 @@ Those familiar with Web of Things (WoT) terminology may note that these properti
},
```
If you are not familiar with Web of Things or the term "property affordance", consider the above JSON as a description of
-what the property represents and how to interact with it from somewhere else. Such a JSON is both human-readable, yet consumable
-by a client provider to create a client object to interact with the property. For example, the Eclipse ThingWeb [node-wot](https://github.com/eclipse-thingweb/node-wot) supports this feature to produce a HTTP(s) client that can issue `readProperty("integration_time")` and `writeProperty("integration_time", 1000)` to read and write this property.
+what the property represents and how to interact with it from somewhere else. Such a JSON is human-readable, yet consumable by any application that may use the property, say, a client provider that creates a client object to interact with the property, or a GUI application that autogenerates a suitable input field for it.
+For example, the Eclipse ThingWeb [node-wot](https://github.com/eclipse-thingweb/node-wot) supports this feature to produce a HTTP(s) client that can issue `readProperty("integration_time")` and `writeProperty("integration_time", 1000)` to read and write this property.
The URL path segment `../spectrometer/..` in href field is taken from the `instance_name` which was specified in the `__init__`.
This is a mandatory key word argument to the parent class `Thing` to generate a unique name/id for the instance. One should use URI compatible strings.
@@ -271,7 +279,7 @@ what the event represents and how to subscribe to it) with subprotocol SSE (HTTP
Events follow a pub-sub model with '1 publisher to N subscribers' per `Event` object, both through ZMQ and HTTP SSE.
-Although the code is the very familiar & age-old RPC server style, one can directly specify HTTP methods and URL path for each property, action and event. A configurable HTTP Server is already available (from `hololinked.server.HTTPServer`) which redirects HTTP requests to the object according to the specified HTTP API on the properties, actions and events. To plug in a HTTP server:
+To start the `Thing`, a configurable HTTP server is available (from `hololinked.server.HTTPServer`) which redirects HTTP requests to the object:
```python
import ssl, os, logging
@@ -293,6 +301,7 @@ if __name__ == '__main__':
# or O.run(zmq_protocols=['IPC', 'TCP'], tcp_socket_address='tcp://*:9999')
# both interprocess communication & TCP, no HTTP
```
+> There is ongoing work to remove the HTTP API from the API of all of properties, actions and events and move it completely to the HTTP server for a more accurate syntax. The functionality will not change, though.
Here one can see the use of `instance_name` and why it turns up in the URL path. See the detailed example of the above code [here](https://gitlab.com/hololinked-examples/oceanoptics-spectrometer/-/blob/simple/oceanoptics_spectrometer/device.py?ref_type=heads).
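+
+For a quick test from a separate Python console, the package's own ZMQ client can interact with the running `Thing` directly. A minimal sketch - the event name is illustrative, and it assumes the `Thing` above is running with a ZMQ transport (e.g. `O.run()`, IPC by default) on the same machine:
+
+```python
+from hololinked.client import ObjectProxy
+
+spectrometer = ObjectProxy(instance_name='spectrometer')  # connects over ZMQ (IPC by default)
+spectrometer.integration_time = 2000                      # write property
+print(spectrometer.integration_time)                      # read property
+# subscribe to an event; deserialize=False delivers raw bytes (e.g. JPEG frames) to the callbacks
+spectrometer.subscribe_event('intensity_measurement_event', callbacks=lambda data: print(len(data)), deserialize=False)
+```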
@@ -305,7 +314,20 @@ See a list of currently supported possibilities while using this package [below]
> You may use a script deployment/automation tool to remote stop and start servers, in an attempt to remotely control your hardware scripts.
-One may use the HTTP API according to one's beliefs (including letting the package auto-generate it), but it is mainly intended for web development and cross platform clients like the [node-wot](https://github.com/eclipse-thingweb/node-wot) HTTP(s) client. If your plan is to develop a truly networked system, it is recommended to learn more and use [Thing Descriptions](https://www.w3.org/TR/wot-thing-description11) to describe your hardware. A Thing Description will be automatically generated if absent as shown in JSON examples above or can be supplied manually.
+### Looking for sponsorships
+
+Kindly read my message [in my README](https://github.com/VigneshVSV#sponsor)
+
+### A little more about Usage
+
+One may use the HTTP API according to one's beliefs (including letting the package auto-generate it), but it is mainly intended for web development and cross platform clients
+like the interoperable [node-wot](https://github.com/eclipse-thingweb/node-wot) HTTP(s) client. If your plan is to develop a truly networked system, it is recommended to learn more and
+use [Thing Descriptions](https://www.w3.org/TR/wot-thing-description11) to describe your hardware. A Thing Description will be automatically generated if absent, as shown in the JSON examples above, or can be supplied manually. The default endpoint to
+fetch Thing Descriptions is:
`http(s)://<address>/<instance_name>/resources/wot-td`
+If there are errors in the generation of the Thing Description
+(mostly due to JSON non-compliant types), one could use:
`http(s)://<address>/<instance_name>/resources/wot-td?ignore_errors=true`
+
+(client docs will be updated here next)
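+
+For example, the generated Thing Description can be fetched and inspected from Python itself. A minimal sketch, assuming a plain HTTP (non-SSL) server on `localhost:8080` and `instance_name='spectrometer'`:
+
+```python
+import json
+from urllib.request import urlopen
+
+# fetch the autogenerated Thing Description from the running HTTP server
+with urlopen('http://localhost:8080/spectrometer/resources/wot-td') as response:
+    td = json.load(response)
+print(list(td.get('properties', {})), list(td.get('actions', {})))
+```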
### Currently Supported
@@ -315,22 +337,19 @@ One may use the HTTP API according to one's beliefs (including letting the packa
- use serializer of your choice (except for HTTP) - MessagePack, JSON, pickle etc. & extend serialization to suit your requirement. HTTP Server will support only JSON serializer to maintain comptibility with Javascript (MessagePack may be added later). Default is JSON serializer based on msgspec.
- asyncio compatible - async RPC server event-loop and async HTTP Server - write methods in async
- choose from multiple ZeroMQ transport methods which offers some possibilities like the following without changing the code:
- - run HTTP Server & python object in separate processes or the same process
- - serve multiple objects with the same HTTP server
- - run direct ZMQ-TCP server without HTTP details
- expose only a dashboard or web page on the network without exposing the hardware itself
+ - run direct ZMQ-TCP server without HTTP details
+ - serve multiple objects with the same HTTP server, run HTTP Server & python object in separate processes or the same process
Again, please check examples or the code for explanations. Documentation is being activety improved.
### Currently being worked
-- improving accuracy of Thing Descriptions
-- separation of HTTP protocol specification like URL path and HTTP verbs from the API of properties, actions and events and move their customization completely to the HTTP server
- unit tests coverage
+- separation of HTTP protocol specification (URL path and HTTP verbs) from the API of properties, actions and events, moving their customization completely to the HTTP server
+- serve multiple things with the same server (due to a small oversight, it is currently somewhat difficult for the end user to serve multiple things with the same server, although it's possible; this will be fixed)
+- improving accuracy of Thing Descriptions
- cookie credentials for authentication - as a workaround until credentials are supported, use `allowed_clients` argument on HTTP server which restricts access based on remote IP supplied with the HTTP headers. This wont still help you in public networks or modified/non-standard HTTP clients.
-### Internals
-
-This package is an implementation of a ZeroMQ-based Object Oriented RPC with customizable HTTP end-points. A dual transport in both ZMQ and HTTP is provided to maximize flexibility in data type, serialization and speed, although HTTP is preferred for networked applications. If one is looking for an object oriented approach towards creating components within a control or data acquisition system, or an IoT device, one may consider this package.
diff --git a/examples b/examples
index 3484e58..e4db921 160000
--- a/examples
+++ b/examples
@@ -1 +1 @@
-Subproject commit 3484e58d9b6f166f977b80c3e267c7871bac59a4
+Subproject commit e4db9210ec15eb141db9be6c021b867384064ae9
diff --git a/hololinked/__init__.py b/hololinked/__init__.py
index 788da1f..6cd38b7 100644
--- a/hololinked/__init__.py
+++ b/hololinked/__init__.py
@@ -1 +1 @@
-__version__ = "0.2.4"
+__version__ = "0.2.7"
diff --git a/hololinked/client/proxy.py b/hololinked/client/proxy.py
index 50f8deb..692fb34 100644
--- a/hololinked/client/proxy.py
+++ b/hololinked/client/proxy.py
@@ -463,7 +463,7 @@ async def async_write_multiple_properties(self, **properties) -> None:
def subscribe_event(self, name : str, callbacks : typing.Union[typing.List[typing.Callable], typing.Callable],
- thread_callbacks : bool = False) -> None:
+ thread_callbacks : bool = False, deserialize : bool = True) -> None:
"""
Subscribe to event specified by name. Events are listened in separate threads and supplied callbacks are
are also called in those threads.
@@ -489,7 +489,7 @@ def subscribe_event(self, name : str, callbacks : typing.Union[typing.List[typin
if event._subscribed:
event.add_callbacks(callbacks)
else:
- event.subscribe(callbacks, thread_callbacks)
+ event.subscribe(callbacks, thread_callbacks, deserialize)
def unsubscribe_event(self, name : str):
@@ -567,7 +567,7 @@ def load_thing(self):
elif data.what == ResourceTypes.EVENT:
assert isinstance(data, ServerSentEvent)
event = _Event(self.zmq_client, data.name, data.obj_name, data.unique_identifier, data.socket_address,
- serializer=self.zmq_client.zmq_serializer, logger=self.logger)
+ serialization_specific=data.serialization_specific, serializer=self.zmq_client.zmq_serializer, logger=self.logger)
_add_event(self, event, data)
self.__dict__[data.name] = event
@@ -755,21 +755,24 @@ def oneway_set(self, value : typing.Any) -> None:
class _Event:
- __slots__ = ['_zmq_client', '_name', '_obj_name', '_unique_identifier', '_socket_address', '_callbacks',
- '_serializer', '_subscribed', '_thread', '_thread_callbacks', '_event_consumer', '_logger']
+ __slots__ = ['_zmq_client', '_name', '_obj_name', '_unique_identifier', '_socket_address', '_callbacks', '_serialization_specific',
+ '_serializer', '_subscribed', '_thread', '_thread_callbacks', '_event_consumer', '_logger', '_deserialize']
# event subscription
# Dont add class doc otherwise __doc__ in slots will conflict with class variable
def __init__(self, client : SyncZMQClient, name : str, obj_name : str, unique_identifier : str, socket : str,
- serializer : BaseSerializer = None, logger : logging.Logger = None) -> None:
+ serialization_specific : bool = False, serializer : BaseSerializer = None, logger : logging.Logger = None) -> None:
+ self._zmq_client = client
self._name = name
self._obj_name = obj_name
self._unique_identifier = unique_identifier
self._socket_address = socket
+ self._serialization_specific = serialization_specific
self._callbacks = None
self._serializer = serializer
self._logger = logger
self._subscribed = False
+ self._deserialize = True
def add_callbacks(self, callbacks : typing.Union[typing.List[typing.Callable], typing.Callable]) -> None:
if not self._callbacks:
@@ -780,12 +783,15 @@ def add_callbacks(self, callbacks : typing.Union[typing.List[typing.Callable], t
self._callbacks.append(callbacks)
def subscribe(self, callbacks : typing.Union[typing.List[typing.Callable], typing.Callable],
- thread_callbacks : bool = False):
- self._event_consumer = EventConsumer(self._unique_identifier, self._socket_address,
- f"{self._name}|RPCEvent|{uuid.uuid4()}", b'PROXY',
- zmq_serializer=self._serializer, logger=self._logger)
+ thread_callbacks : bool = False, deserialize : bool = True):
+ self._event_consumer = EventConsumer(
+ 'zmq-' + self._unique_identifier if self._serialization_specific else self._unique_identifier,
+ self._socket_address, f"{self._name}|RPCEvent|{uuid.uuid4()}", b'PROXY',
+ zmq_serializer=self._serializer, logger=self._logger
+ )
self.add_callbacks(callbacks)
self._subscribed = True
+ self._deserialize = deserialize
self._thread_callbacks = thread_callbacks
self._thread = threading.Thread(target=self.listen)
self._thread.start()
@@ -793,7 +799,7 @@ def subscribe(self, callbacks : typing.Union[typing.List[typing.Callable], typin
def listen(self):
while self._subscribed:
try:
- data = self._event_consumer.receive()
+ data = self._event_consumer.receive(deserialize=self._deserialize)
if data == 'INTERRUPT':
break
for cb in self._callbacks:
diff --git a/hololinked/requirements.txt b/hololinked/requirements.txt
new file mode 100644
index 0000000..09b1205
--- /dev/null
+++ b/hololinked/requirements.txt
@@ -0,0 +1,14 @@
+ConfigParser==7.1.0
+fastjsonschema==2.20.0
+ifaddr==0.2.0
+ipython==8.12.3
+jsonschema==4.23.0
+msgspec==0.18.6
+numpy==2.1.2
+pandas==2.2.3
+pydantic==2.9.2
+pyzmq==26.2.0
+serpent==1.41
+SQLAlchemy==2.0.30
+tornado==6.4.1
+uvloop==0.20.0
diff --git a/hololinked/server/HTTPServer.py b/hololinked/server/HTTPServer.py
index 5fd4b3f..72be3f5 100644
--- a/hololinked/server/HTTPServer.py
+++ b/hololinked/server/HTTPServer.py
@@ -1,4 +1,5 @@
import asyncio
+from dataclasses import dataclass
import zmq
import zmq.asyncio
import logging
@@ -14,7 +15,7 @@
from ..param import Parameterized
from ..param.parameters import (Integer, IPAddress, ClassSelector, Selector, TypedList, String)
from .constants import ZMQ_PROTOCOLS, CommonRPC, HTTPServerTypes, ResourceTypes, ServerMessage
-from .utils import get_IP_from_interface
+from .utils import get_IP_from_interface, issubklass
from .dataklasses import HTTPResource, ServerSentEvent
from .utils import get_default_logger
from .serializers import JSONSerializer
@@ -22,11 +23,26 @@
from .zmq_message_brokers import AsyncZMQClient, MessageMappedZMQClientPool
from .handlers import RPCHandler, BaseHandler, EventHandler, ThingsHandler, StopHandler
from .schema_validators import BaseSchemaValidator, JsonSchemaValidator
+from .events import Event
from .eventloop import EventLoop
from .config import global_config
+
+@dataclass
+class InteractionAffordance:
+ URL_path : str
+ obj : Event # typing.Union[Property, Action, Event]
+ http_methods : typing.Tuple[str, typing.Optional[str], typing.Optional[str]]
+ handler : BaseHandler
+ kwargs : dict
+
+ def __eq__(self, other : "InteractionAffordance") -> bool:
+ return self.obj == other.obj
+
+
+
class HTTPServer(Parameterized):
"""
HTTP(s) server to route requests to ``Thing``.
@@ -63,7 +79,7 @@ class HTTPServer(Parameterized):
Unlike pure CORS, the server resource is not even executed if the client is not
an allowed client. if None any client is served.""")
host = String(default=None, allow_None=True,
- doc="Host Server to subscribe to coordinate starting sequence of remote objects & web GUI" ) # type: str
+ doc="Host Server to subscribe to coordinate starting sequence of things & web GUI" ) # type: str
# network_interface = String(default='Ethernet',
# doc="Currently there is no logic to detect the IP addresss (as externally visible) correctly, \
# therefore please send the network interface name to retrieve the IP. If a DNS server is present, \
@@ -138,6 +154,7 @@ def __init__(self, things : typing.List[str], *, port : int = 8080, address : st
self._zmq_protocol = ZMQ_PROTOCOLS.IPC
self._zmq_inproc_socket_context = None
self._zmq_inproc_event_context = None
+ self._local_rules = dict() # type: typing.Dict[str, typing.List[InteractionAffordance]]
@property
def all_ok(self) -> bool:
@@ -147,6 +164,9 @@ def all_ok(self) -> bool:
f"{self.address}:{self.port}"),
self.log_level)
+ if self._zmq_protocol == ZMQ_PROTOCOLS.INPROC and (self._zmq_inproc_socket_context is None or self._zmq_inproc_event_context is None):
+ raise ValueError("Inproc socket context is not provided. Logic Error.")
+
self.app = Application(handlers=[
(r'/remote-objects', ThingsHandler, dict(request_handler=self.request_handler,
event_handler=self.event_handler)),
@@ -250,7 +270,7 @@ async def update_router_with_thing(self, client : AsyncZMQClient):
# Just to avoid duplication of this call as we proceed at single client level and not message mapped level
return
self._lost_things[client.instance_name] = client
- self.logger.info(f"attempting to update router with remote object {client.instance_name}.")
+ self.logger.info(f"attempting to update router with thing {client.instance_name}.")
while True:
try:
await client.handshake_complete()
@@ -272,7 +292,13 @@ async def update_router_with_thing(self, client : AsyncZMQClient):
)))
elif http_resource["what"] == ResourceTypes.EVENT:
resource = ServerSentEvent(**http_resource)
- handlers.append((instruction, self.event_handler, dict(
+ if resource.class_name in self._local_rules and any(ia.obj._obj_name == resource.obj_name for ia in self._local_rules[resource.class_name]):
+ for ia in self._local_rules[resource.class_name]:
+ if ia.obj._obj_name == resource.obj_name:
+ handlers.append((f'/{client.instance_name}{ia.URL_path}', ia.handler, dict(resource=resource, validator=None,
+ owner=self, **ia.kwargs)))
+ else:
+ handlers.append((instruction, self.event_handler, dict(
resource=resource,
validator=None,
owner=self
@@ -306,10 +332,11 @@ def __init__(
to make RPCHandler work
"""
self.app.wildcard_router.add_rules(handlers)
- self.logger.info(f"updated router with remote object {client.instance_name}.")
+ self.logger.info(f"updated router with thing {client.instance_name}.")
break
except Exception as ex:
- self.logger.error(f"error while trying to update router with remote object - {str(ex)}. " +
+ self.logger.error(f"error while trying to update router with thing - {str(ex)}. " +
"Trying again in 5 seconds")
await asyncio.sleep(5)
@@ -328,10 +355,39 @@ def __init__(
raise_client_side_exception=True
)
except Exception as ex:
- self.logger.error(f"error while trying to update remote object with HTTP server details - {str(ex)}. " +
+ self.logger.error(f"error while trying to update thing with HTTP server details - {str(ex)}. " +
"Trying again in 5 seconds")
self.zmq_client_pool.poller.register(client.socket, zmq.POLLIN)
self._lost_things.pop(client.instance_name)
+
+
+ def add_event(self, URL_path : str, event : Event, handler : typing.Optional[BaseHandler] = None,
+ **kwargs) -> None:
+ """
+ Add an event to be served by HTTP server
+
+ Parameters
+ ----------
+ URL_path : str
+ URL path to access the event
+ event : Event
+ Event to be served
+ handler : BaseHandler, optional
+ custom handler for the event
+ kwargs : dict
+ additional keyword arguments to be passed to the handler's __init__
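+
+ Example
+ -------
+ An illustrative sketch (``Camera`` and its ``live_feed`` event are hypothetical names)::
+
+ from hololinked.server.handlers import JPEGImageEventHandler
+ server = HTTPServer(things=['camera'])
+ server.add_event('/live-feed', Camera.live_feed, handler=JPEGImageEventHandler)
+
+ The given URL path is prefixed with the ``Thing``'s ``instance_name`` when the route is
+ registered, so the event above would be served at ``/camera/live-feed``.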
+ """
+ if not isinstance(event, Event):
+ raise TypeError("event should be of type Event")
+ if handler is not None and not issubklass(handler, BaseHandler):
+ raise TypeError("handler should be subclass of BaseHandler")
+ if event.owner.__name__ not in self._local_rules:
+ self._local_rules[event.owner.__name__] = []
+ obj = InteractionAffordance(URL_path=URL_path, obj=event,
+ http_methods=('GET',), handler=handler or self.event_handler,
+ kwargs=kwargs)
+ if obj not in self._local_rules[event.owner.__name__]:
+ self._local_rules[event.owner.__name__].append(obj)
__all__ = [
diff --git a/hololinked/server/dataklasses.py b/hololinked/server/dataklasses.py
index cf0f018..4ab92e8 100644
--- a/hololinked/server/dataklasses.py
+++ b/hololinked/server/dataklasses.py
@@ -290,6 +290,7 @@ class HTTPResource(SerializableDataclass):
pass the request as a argument to the callable. For HTTP server ``tornado.web.HTTPServerRequest`` will be passed.
"""
what : str
+ class_name : str # just metadata
instance_name : str
obj_name : str
fullpath : str
@@ -298,10 +299,11 @@ class HTTPResource(SerializableDataclass):
request_as_argument : bool = field(default=False)
- def __init__(self, *, what : str, instance_name : str, obj_name : str, fullpath : str,
+ def __init__(self, *, what : str, class_name : str, instance_name : str, obj_name : str, fullpath : str,
request_as_argument : bool = False, argument_schema : typing.Optional[JSON] = None,
**instructions) -> None:
self.what = what
+ self.class_name = class_name
self.instance_name = instance_name
self.obj_name = obj_name
self.fullpath = fullpath
@@ -340,6 +342,7 @@ class ZMQResource(SerializableDataclass):
argument schema of the method/action for validation before passing over the instruction to the RPC server.
"""
what : str
+ class_name : str # just metadata
instance_name : str
instruction : str
obj_name : str
@@ -350,10 +353,11 @@ class ZMQResource(SerializableDataclass):
return_value_schema : typing.Optional[JSON]
request_as_argument : bool = field(default=False)
- def __init__(self, *, what : str, instance_name : str, instruction : str, obj_name : str,
+ def __init__(self, *, what : str, class_name : str, instance_name : str, instruction : str, obj_name : str,
qualname : str, doc : str, top_owner : bool, argument_schema : typing.Optional[JSON] = None,
return_value_schema : typing.Optional[JSON] = None, request_as_argument : bool = False) -> None:
self.what = what
+ self.class_name = class_name
self.instance_name = instance_name
self.instruction = instruction
self.obj_name = obj_name
@@ -390,7 +394,9 @@ class ServerSentEvent(SerializableDataclass):
"""
name : str = field(default=UNSPECIFIED)
obj_name : str = field(default=UNSPECIFIED)
+ class_name : str = field(default=UNSPECIFIED) # just metadata
unique_identifier : str = field(default=UNSPECIFIED)
+ serialization_specific : bool = field(default=False)
socket_address : str = field(default=UNSPECIFIED)
what : str = field(default=ResourceTypes.EVENT)
@@ -404,7 +410,7 @@ def build_our_temp_TD(instance):
assert isinstance(instance, Thing), f"got invalid type {type(instance)}"
- our_TD = instance.get_thing_description()
+ our_TD = instance.get_thing_description(ignore_errors=True)
our_TD["inheritance"] = [class_.__name__ for class_ in instance.__class__.mro()]
for instruction, remote_info in instance.instance_resources.items():
@@ -470,6 +476,7 @@ def get_organised_resources(instance):
httpserver_resources[fullpath] = HTTPResource(
what=ResourceTypes.PROPERTY,
+ class_name=instance.__class__.__name__,
instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
obj_name=remote_info.obj_name,
fullpath=fullpath,
@@ -477,6 +484,7 @@ def get_organised_resources(instance):
)
zmq_resources[fullpath] = ZMQResource(
what=ResourceTypes.PROPERTY,
+ class_name=instance.__class__.__name__,
instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
instruction=fullpath,
doc=prop.__doc__,
@@ -494,6 +502,8 @@ def get_organised_resources(instance):
assert isinstance(prop._observable_event_descriptor, Event), f"observable event not yet set for {prop.name}. logic error."
evt_fullpath = f"{instance._full_URL_path_prefix}{prop._observable_event_descriptor.URL_path}"
dispatcher = EventDispatcher(evt_fullpath)
+ dispatcher._remote_info.class_name = instance.__class__.__name__
+ dispatcher._remote_info.serialization_specific = instance.zmq_serializer != instance.http_serializer
setattr(instance, prop._observable_event_descriptor._obj_name, dispatcher)
# prop._observable_event_descriptor._remote_info.unique_identifier = evt_fullpath
httpserver_resources[evt_fullpath] = dispatcher._remote_info
@@ -515,6 +525,7 @@ def get_organised_resources(instance):
# needs to be cleaned up for multiple HTTP methods
httpserver_resources[instruction] = HTTPResource(
what=ResourceTypes.ACTION,
+ class_name=instance.__class__.__name__,
instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
obj_name=remote_info.obj_name,
fullpath=fullpath,
@@ -524,6 +535,7 @@ def get_organised_resources(instance):
)
zmq_resources[instruction] = ZMQResource(
what=ResourceTypes.ACTION,
+ class_name=instance.__class__.__name__,
instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
instruction=instruction,
obj_name=getattr(resource, '__name__'),
@@ -538,13 +550,13 @@ def get_organised_resources(instance):
# Events
for name, resource in inspect._getmembers(instance, lambda o : isinstance(o, Event), getattr_without_descriptor_read):
assert isinstance(resource, Event), ("thing event query from inspect.ismethod is not an Event",
- "logic error - visit https://github.com/VigneshVSV/hololinked/issues to report")
- if getattr(instance, name, None):
- continue
+ "logic error - visit https://github.com/VigneshVSV/hololinked/issues to report")
# above assertion is only a typing convenience
fullpath = f"{instance._full_URL_path_prefix}{resource.URL_path}"
# resource._remote_info.unique_identifier = fullpath
dispatcher = EventDispatcher(fullpath)
+ dispatcher._remote_info.class_name = instance.__class__.__name__
+ dispatcher._remote_info.serialization_specific = instance.zmq_serializer != instance.http_serializer
setattr(instance, name, dispatcher) # resource._remote_info.unique_identifier))
httpserver_resources[fullpath] = dispatcher._remote_info
zmq_resources[fullpath] = dispatcher._remote_info
@@ -558,6 +570,7 @@ def get_organised_resources(instance):
# for example, a shared logger
continue
resource._owner = instance
+ resource._prepare_resources()
httpserver_resources.update(resource.httpserver_resources)
# zmq_resources.update(resource.zmq_resources)
instance_resources.update(resource.instance_resources)
diff --git a/hololinked/server/eventloop.py b/hololinked/server/eventloop.py
index f06073b..0c968e3 100644
--- a/hololinked/server/eventloop.py
+++ b/hololinked/server/eventloop.py
@@ -353,8 +353,8 @@ async def execute_once(cls, instance_name : str, instance : Thing, instruction_s
if action == "write":
if resource.state is None or (hasattr(instance, 'state_machine') and
instance.state_machine.current_state in resource.state):
- if isinstance(arguments, dict) and len(arguments) == 1 and "value" in arguments:
- return prop.__set__(owner_inst, arguments["value"])
+ if isinstance(arguments, dict) and len(arguments) == 1 and 'value' in arguments:
+ return prop.__set__(owner_inst, arguments['value'])
return prop.__set__(owner_inst, arguments)
else:
raise StateMachineError("Thing {} is in `{}` state, however attribute can be written only in `{}` state".format(
diff --git a/hololinked/server/events.py b/hololinked/server/events.py
index fafe01e..787288f 100644
--- a/hololinked/server/events.py
+++ b/hololinked/server/events.py
@@ -61,6 +61,8 @@ def __get__(self, obj, objtype) -> "EventDispatcher":
def __get__(self, obj : ParameterizedMetaclass, objtype : typing.Optional[type] = None):
try:
+ if not obj:
+ return self
return obj.__dict__[self._internal_name]
except KeyError:
raise AttributeError("Event object not yet initialized, please dont access now." +
@@ -68,14 +70,13 @@ def __get__(self, obj : ParameterizedMetaclass, objtype : typing.Optional[type]
def __set__(self, obj : Parameterized, value : typing.Any) -> None:
if isinstance(value, EventDispatcher):
- if not obj.__dict__.get(self._internal_name, None):
- value._remote_info.name = self.friendly_name
- value._remote_info.obj_name = self._obj_name
- value._owner_inst = obj
- obj.__dict__[self._internal_name] = value
- else:
- raise AttributeError(f"Event object already assigned for {self._obj_name}. Cannot reassign.")
- # may be allowing to reassign is not a bad idea
+ value._remote_info.name = self.friendly_name
+ value._remote_info.obj_name = self._obj_name
+ value._owner_inst = obj
+ current_obj = obj.__dict__.get(self._internal_name, None) # type: typing.Optional[EventDispatcher]
+ if current_obj and current_obj._publisher:
+ current_obj._publisher.unregister(current_obj)
+ obj.__dict__[self._internal_name] = value
else:
raise TypeError(f"Supply EventDispatcher object to event {self._obj_name}, not type {type(value)}.")
diff --git a/hololinked/server/handlers.py b/hololinked/server/handlers.py
index 4deaf17..9165382 100644
--- a/hololinked/server/handlers.py
+++ b/hololinked/server/handlers.py
@@ -215,6 +215,9 @@ class EventHandler(BaseHandler):
"""
handles events emitted by ``Thing`` and tunnels them as HTTP SSE.
"""
+ def initialize(self, resource, validator: BaseSchemaValidator, owner=None) -> None:
+ super().initialize(resource, validator, owner)
+ self.data_header = b'data: %s\n\n'
def set_headers(self) -> None:
"""
@@ -271,9 +274,8 @@ async def handle_datastream(self) -> None:
event_consumer = event_consumer_cls(self.resource.unique_identifier, self.resource.socket_address,
identity=f"{self.resource.unique_identifier}|HTTPEvent|{uuid.uuid4()}",
logger=self.logger, http_serializer=self.serializer,
- context=self.owner._zmq_event_context if self.resource.socket_address.startswith('inproc') else None)
+ context=self.owner._zmq_inproc_event_context if self.resource.socket_address.startswith('inproc') else None)
event_loop = asyncio.get_event_loop()
- data_header = b'data: %s\n\n'
self.set_status(200)
except Exception as ex:
self.logger.error(f"error while subscribing to event - {str(ex)}")
@@ -289,16 +291,17 @@ async def handle_datastream(self) -> None:
data = await event_loop.run_in_executor(None, self.receive_blocking_event, event_consumer)
if data:
# already JSON serialized
- self.write(data_header % data)
- await self.flush()
+ self.write(self.data_header % data)
+ await self.flush() # log after flushing just to be sure
self.logger.debug(f"new data sent - {self.resource.name}")
else:
- self.logger.debug(f"found no new data")
+ self.logger.debug(f"found no new data - {self.resource.name}")
+ await self.flush() # heartbeat - raises StreamClosedError if client disconnects
except StreamClosedError:
break
except Exception as ex:
self.logger.error(f"error while pushing event - {str(ex)}")
- self.write(data_header % self.serializer.dumps(
+ self.write(self.data_header % self.serializer.dumps(
{"exception" : format_exception_as_json(ex)}))
try:
if isinstance(self.owner._zmq_inproc_event_context, zmq.asyncio.Context):
@@ -307,42 +310,22 @@ async def handle_datastream(self) -> None:
self.logger.error(f"error while closing event consumer - {str(ex)}" )
-class ImageEventHandler(EventHandler):
+class JPEGImageEventHandler(EventHandler):
"""
handles events with images with image data header
"""
+ def initialize(self, resource, validator: BaseSchemaValidator, owner = None) -> None:
+ super().initialize(resource, validator, owner)
+ self.data_header = b'data:image/jpeg;base64,%s\n\n'
- async def handle_datastream(self) -> None:
- try:
- event_consumer = AsyncEventConsumer(self.resource.unique_identifier, self.resource.socket_address,
- f"{self.resource.unique_identifier}|HTTPEvent|{uuid.uuid4()}",
- http_serializer=self.serializer, logger=self.logger,
- context=self.owner._zmq_event_context if self.resource.socket_address.startswith('inproc') else None)
- self.set_header("Content-Type", "application/x-mpegURL")
- self.write("#EXTM3U\n")
- delimiter = "#EXTINF:{},\n"
- data_header = b'data:image/jpeg;base64,%s\n'
- while True:
- try:
- data = await event_consumer.receive(timeout=10000, deserialize=False)
- if data:
- # already serialized
- self.write(delimiter)
- self.write(data_header % data)
- await self.flush()
- self.logger.debug(f"new image sent - {self.resource.name}")
- else:
- self.logger.debug(f"found no new data")
- except StreamClosedError:
- break
- except Exception as ex:
- self.logger.error(f"error while pushing event - {str(ex)}")
- self.write(data_header % self.serializer.dumps(
- {"exception" : format_exception_as_json(ex)}))
- event_consumer.exit()
- except Exception as ex:
- self.write(data_header % self.serializer.dumps(
- {"exception" : format_exception_as_json(ex)}))
+
+class PNGImageEventHandler(EventHandler):
+ """
+ handles events carrying PNG images, streaming them over SSE with a base64 PNG data header
+ """
+ def initialize(self, resource, validator: BaseSchemaValidator, owner = None) -> None:
+ super().initialize(resource, validator, owner)
+ self.data_header = b'data:image/png;base64,%s\n\n'
diff --git a/hololinked/server/properties.py b/hololinked/server/properties.py
index 3fe33ba..17f487d 100644
--- a/hololinked/server/properties.py
+++ b/hololinked/server/properties.py
@@ -614,7 +614,7 @@ class Tuple(Iterable):
__slots__ = ['accept_list']
- def __init__(self, default : typing.Any, *, bounds : typing.Optional[typing.Tuple[int, int]] = None,
+ def __init__(self, default : typing.Any = None, *, bounds : typing.Optional[typing.Tuple[int, int]] = None,
length: typing.Optional[int] = None, item_type : typing.Optional[typing.Tuple] = None,
accept_list : bool = False, deepcopy_default : bool = False,
doc : typing.Optional[str] = None, constant : bool = False,
@@ -674,7 +674,7 @@ class List(Iterable):
__slots__ = ['accept_tuple']
- def __init__(self, default: typing.Any, *, bounds : typing.Optional[typing.Tuple[int, int]] = None,
+ def __init__(self, default: typing.Any = None, *, bounds : typing.Optional[typing.Tuple[int, int]] = None,
length : typing.Optional[int] = None, item_type : typing.Optional[typing.Tuple] = None,
accept_tuple : bool = False, deepcopy_default : bool = False,
doc : typing.Optional[str] = None, constant : bool = False,
@@ -834,7 +834,7 @@ class Selector(SelectorBase):
# Selector is usually used to allow selection from a list of
# existing objects, therefore instantiate is False by default.
- def __init__(self, *, objects : typing.List[typing.Any], default : typing.Any, empty_default : bool = False,
+ def __init__(self, *, objects : typing.List[typing.Any], default : typing.Any = None, empty_default : bool = False,
doc : typing.Optional[str] = None, constant : bool = False,
readonly : bool = False, allow_None : bool = False, label : typing.Optional[str] = None,
URL_path : str = USE_OBJECT_NAME,
diff --git a/hololinked/server/property.py b/hololinked/server/property.py
index 6bdc031..1a1fe2a 100644
--- a/hololinked/server/property.py
+++ b/hololinked/server/property.py
@@ -110,7 +110,7 @@ class Property(Parameter):
"""
- __slots__ = ['db_persist', 'db_init', 'db_commit', 'metadata', '_remote_info',
+ __slots__ = ['db_persist', 'db_init', 'db_commit', 'metadata', 'model', '_remote_info',
'_observable', '_observable_event_descriptor', 'fcomparator', '_old_value_internal_name']
# RPC only init - no HTTP methods for those who dont like
@@ -161,7 +161,7 @@ def __init__(self, default: typing.Any = None, *,
(HTTP_METHODS.GET, HTTP_METHODS.PUT, HTTP_METHODS.DELETE),
state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None,
db_persist : bool = False, db_init : bool = False, db_commit : bool = False,
- observable : bool = False, class_member : bool = False,
+ observable : bool = False, class_member : bool = False, model = None,
fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None,
fdel : typing.Optional[typing.Callable] = None, fcomparator : typing.Optional[typing.Callable] = None,
deepcopy_default : bool = False, per_instance_descriptor : bool = False, remote : bool = True,
@@ -185,6 +185,9 @@ def __init__(self, default: typing.Any = None, *,
state=state,
isproperty=True
)
+ self.model = None
+ if model:
+ self.model = wrap_plain_types_in_rootmodel(model)
def __set_name__(self, owner: typing.Any, attrib_name: str) -> None:
@@ -325,7 +328,29 @@ def webgui_info(self, for_remote_params : typing.Union[Property, typing.Dict[str
return info
+try:
+ from pydantic import BaseModel, RootModel, create_model
+ def wrap_plain_types_in_rootmodel(model : type) -> type["BaseModel"]:
+ """
+ Ensure a type is a subclass of BaseModel.
+
+ If a `BaseModel` subclass is passed to this function, we will pass it
+ through unchanged. Otherwise, we wrap the type in a RootModel.
+ In the future, we may explicitly check that the argument is a type
+ and not a model instance.
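+
+ A minimal sketch of the intended behaviour (assumes pydantic is installed; ``MyModel`` is hypothetical)::
+
+ wrap_plain_types_in_rootmodel(int)      # returns a RootModel subclass wrapping ``int``
+ wrap_plain_types_in_rootmodel(MyModel)  # returns ``MyModel`` unchanged if it subclasses BaseModel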
+ """
+ try: # This needs to be a `try` as basic types are not classes
+ assert issubclass(model, BaseModel)
+ return model
+ except (TypeError, AssertionError):
+ return create_model(f"{model!r}", root=(model, ...), __base__=RootModel)
+ except NameError:
+ raise ImportError("pydantic is not installed, please install it to use this feature") from None
+except ImportError:
+ def wrap_plain_types_in_rootmodel(model : type) -> type:
+ raise ImportError("pydantic is not installed, please install it to use this feature") from None
+
__all__ = [
Property.__name__
]
\ No newline at end of file
diff --git a/hololinked/server/serializers.py b/hololinked/server/serializers.py
index 664a47e..29bfd96 100644
--- a/hololinked/server/serializers.py
+++ b/hololinked/server/serializers.py
@@ -22,9 +22,8 @@
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
-import json
import pickle
-from msgspec import json, msgpack
+from msgspec import json as msgspecjson, msgpack
import json as pythonjson
import inspect
import array
@@ -36,6 +35,11 @@
from enum import Enum
from collections import deque
+try:
+ import numpy
+except ImportError:
+ pass
+
from ..param.parameters import TypeConstrainedList, TypeConstrainedDict, TypedKeyMappingsConstrainedDict
from .constants import JSONSerializable, Serializers
from .utils import format_exception_as_json
@@ -82,15 +86,15 @@ class JSONSerializer(BaseSerializer):
def __init__(self) -> None:
super().__init__()
- self.type = json
+ self.type = msgspecjson
def loads(self, data : typing.Union[bytearray, memoryview, bytes]) -> JSONSerializable:
"method called by ZMQ message brokers to deserialize data"
- return json.decode(self.convert_to_bytes(data))
+ return msgspecjson.decode(self.convert_to_bytes(data))
def dumps(self, data) -> bytes:
"method called by ZMQ message brokers to serialize data"
- return json.encode(data, enc_hook=self.default)
+ return msgspecjson.encode(data, enc_hook=self.default)
@classmethod
def default(cls, obj) -> JSONSerializable:
@@ -119,6 +123,8 @@ def default(cls, obj) -> JSONSerializable:
if obj.typecode == 'u':
return obj.tounicode()
return obj.tolist()
+ if 'numpy' in globals() and isinstance(obj, numpy.ndarray):
+ return obj.tolist()
replacer = cls._type_replacements.get(type(obj), None)
if replacer:
return replacer(obj)
diff --git a/hololinked/server/state_machine.py b/hololinked/server/state_machine.py
index ffee714..d0ba1a7 100644
--- a/hololinked/server/state_machine.py
+++ b/hololinked/server/state_machine.py
@@ -3,12 +3,13 @@
from types import FunctionType, MethodType
from enum import EnumMeta, Enum, StrEnum
-from ..param.parameterized import Parameterized
+from ..param.parameterized import Parameterized, edit_constant
from .utils import getattr_without_descriptor_read
+from .exceptions import StateMachineError
from .dataklasses import RemoteResourceInfoValidator
from .property import Property
from .properties import ClassSelector, TypedDict, Boolean
-from .events import Event
+
@@ -28,7 +29,7 @@ class StateMachine:
on_exit = TypedDict(default=None, allow_None=True, key_type=str,
doc="""callbacks to execute when certain state is exited;
specfied as map with state as keys and callbacks as list""") # typing.Dict[str, typing.List[typing.Callable]]
- machine = TypedDict(default=None, allow_None=True, key_type=str, item_type=(list, tuple),
+ machine = TypedDict(default=None, allow_None=True, item_type=(list, tuple), key_type=str, # i.e. its like JSON
doc="the machine specification with state as key and objects as list") # typing.Dict[str, typing.List[typing.Callable, Property]]
valid = Boolean(default=False, readonly=True, fget=lambda self: self._valid,
doc="internally computed, True if states, initial_states and the machine is valid")
@@ -85,8 +86,10 @@ def _prepare(self, owner : Parameterized) -> None:
owner_methods = [obj[0] for obj in inspect._getmembers(owner, inspect.ismethod, getattr_without_descriptor_read)]
if isinstance(self.states, list):
+ self.__class__.states.constant = False
self.states = tuple(self.states) # freeze the list of states
-
+ self.__class__.states.constant = True
+
# first validate machine
for state, objects in self.machine.items():
if state in self:
@@ -107,7 +110,7 @@ def _prepare(self, owner : Parameterized) -> None:
raise AttributeError(f"Object {resource} was not made remotely accessible," +
" use state machine with properties and actions only.")
else:
- raise AttributeError("Given state {} not in states Enum {}".format(state, self.states.__members__))
+ raise StateMachineError("Given state {} not in states Enum {}".format(state, self.states.__members__))
# then the callbacks
for state, objects in self.on_enter.items():
diff --git a/hololinked/server/td.py b/hololinked/server/td.py
index 1a549fc..c86383d 100644
--- a/hololinked/server/td.py
+++ b/hololinked/server/td.py
@@ -1,6 +1,8 @@
import typing, inspect
from dataclasses import dataclass, field
+from .eventloop import EventLoop
+
from .constants import JSON, JSONSerializable
from .utils import getattr_without_descriptor_read
@@ -13,7 +15,6 @@
-
@dataclass
class Schema:
"""
@@ -53,7 +54,9 @@ def format_doc(cls, doc : str):
if index > 0:
line = ' ' + line # add space to left in case of new line
final_doc.append(line)
- return ''.join(final_doc)
+ final_doc = ''.join(final_doc)
+ final_doc = final_doc.lstrip().rstrip()
+ return final_doc
@@ -187,14 +190,14 @@ def build(self, property : Property, owner : Thing, authority : str) -> None:
self.description = Schema.format_doc(property.doc)
if property.metadata and property.metadata.get("unit", None) is not None:
self.unit = property.metadata["unit"]
- # if property.allow_None:
- # if not hasattr(self, 'oneOf'):
- # self.oneOf = []
- # if hasattr(self, 'type'):
- # self.oneOf.append(dict(type=self.type))
- # del self.type
- # if not any(types["type"] == None for types in self.oneOf):
- # self.oneOf.append(dict(type=None))
+ if property.allow_None:
+ if not hasattr(self, 'oneOf'):
+ self.oneOf = []
+ if hasattr(self, 'type'):
+ self.oneOf.append(dict(type=self.type))
+ del self.type
+ if not any(types["type"] == "null" for types in self.oneOf):
+ self.oneOf.append(dict(type="null"))
@@ -262,6 +265,17 @@ def generate_schema(self, property : Property, owner : Thing, authority : str) -
schema = OneOfSchema()
elif self._custom_schema_generators.get(property, NotImplemented) is not NotImplemented:
schema = self._custom_schema_generators[property]()
+ elif isinstance(property, Property) and property.model is not None:
+ from .td_pydantic_extensions import GenerateJsonSchemaWithoutDefaultTitles, type_to_dataschema
+ schema = PropertyAffordance()
+ schema.build(property=property, owner=owner, authority=authority)
+ data_schema = type_to_dataschema(property.model).model_dump(mode='json', exclude_none=True)
+ final_schema = schema.asdict()
+ if schema.oneOf: # allow_None = True
+ final_schema['oneOf'].append(data_schema)
+ else:
+ final_schema.update(data_schema)
+ return final_schema
else:
raise TypeError(f"WoT schema generator for this descriptor/property is not implemented. name {property.name} & type {type(property)}")
schema.build(property=property, owner=owner, authority=authority)
@@ -518,7 +532,7 @@ class Link(Schema):
href : str
anchor : typing.Optional[str]
type : typing.Optional[str] = field(default='application/json')
- rel : typing.Optional[str] = field(default='next')
+ # rel : typing.Optional[str] = field(default='next')
def __init__(self):
super().__init__()
@@ -719,17 +733,18 @@ class ThingDescription(Schema):
'events', 'thing_description', 'GUI', 'object_info' ]
skip_actions = ['_set_properties', '_get_properties', '_add_property', '_get_properties_in_db',
- 'push_events', 'stop_events', 'get_postman_collection', 'get_thing_description']
+ 'push_events', 'stop_events', 'get_postman_collection', 'get_thing_description',
+ 'get_our_temp_thing_description']
# not the best code and logic, but works for now
def __init__(self, instance : Thing, authority : typing.Optional[str] = None,
- allow_loose_schema : typing.Optional[bool] = False) -> None:
+ allow_loose_schema : typing.Optional[bool] = False, ignore_errors : bool = False) -> None:
super().__init__()
self.instance = instance
self.authority = authority
self.allow_loose_schema = allow_loose_schema
-
+ self.ignore_errors = ignore_errors
def produce(self) -> typing.Dict[str, typing.Any]:
self.context = "https://www.w3.org/2022/wot/td/v1.1"
@@ -754,18 +769,25 @@ def produce(self) -> typing.Dict[str, typing.Any]:
def add_interaction_affordances(self):
# properties and actions
for resource in self.instance.instance_resources.values():
- if (resource.isproperty and resource.obj_name not in self.properties and
- resource.obj_name not in self.skip_properties and hasattr(resource.obj, "_remote_info") and
- resource.obj._remote_info is not None):
- if (resource.obj_name == 'state' and (not hasattr(self.instance, 'state_machine') or
- not isinstance(self.instance.state_machine, StateMachine))):
- continue
- self.properties[resource.obj_name] = PropertyAffordance.generate_schema(resource.obj,
+ try:
+ if (resource.isproperty and resource.obj_name not in self.properties and
+ resource.obj_name not in self.skip_properties and hasattr(resource.obj, "_remote_info") and
+ resource.obj._remote_info is not None):
+ if (resource.obj_name == 'state' and (not hasattr(self.instance, 'state_machine') or
+ not isinstance(self.instance.state_machine, StateMachine))):
+ continue
+ self.properties[resource.obj_name] = PropertyAffordance.generate_schema(resource.obj,
self.instance, self.authority)
- elif (resource.isaction and resource.obj_name not in self.actions and
- resource.obj_name not in self.skip_actions and hasattr(resource.obj, '_remote_info')):
- self.actions[resource.obj_name] = ActionAffordance.generate_schema(resource.obj,
- self.instance, self.authority)
+
+ elif (resource.isaction and resource.obj_name not in self.actions and
+ resource.obj_name not in self.skip_actions and hasattr(resource.obj, '_remote_info')):
+
+ self.actions[resource.obj_name] = ActionAffordance.generate_schema(resource.obj,
+ self.instance, self.authority)
+ except Exception as ex:
+ if not self.ignore_errors:
+ raise ex from None
+ self.instance.logger.error(f"Error while generating schema for {resource.obj_name} - {ex}")
# Events
for name, resource in inspect._getmembers(self.instance, lambda o : isinstance(o, Event),
getattr_without_descriptor_read):
@@ -773,15 +795,20 @@ def add_interaction_affordances(self):
continue
if '/change-event' in resource.URL_path:
continue
- self.events[name] = EventAffordance.generate_schema(resource, self.instance, self.authority)
- # for name, resource in inspect._getmembers(self.instance, lambda o : isinstance(o, Thing), getattr_without_descriptor_read):
- # if resource is self.instance or isinstance(resource, EventLoop):
- # continue
- # if self.links is None:
- # self.links = []
- # link = Link()
- # link.build(resource, self.instance, self.authority)
- # self.links.append(link.asdict())
+ try:
+ self.events[name] = EventAffordance.generate_schema(resource, self.instance, self.authority)
+ except Exception as ex:
+ if not self.ignore_errors:
+ raise ex from None
+                self.instance.logger.error(f"Error while generating schema for {name} - {ex}")
+ for name, resource in inspect._getmembers(self.instance, lambda o : isinstance(o, Thing), getattr_without_descriptor_read):
+ if resource is self.instance or isinstance(resource, EventLoop):
+ continue
+ if self.links is None or self.links == NotImplemented:
+ self.links = []
+ link = Link()
+ link.build(resource, self.instance, self.authority)
+ self.links.append(link.asdict())
def add_top_level_forms(self):
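# A minimal sketch of the ignore_errors path added above, assuming a bare Thing
# subclass (`MinimalThing` is hypothetical); the user-facing entry point is
# get_thing_description(), shown further below in thing.py.
import logging
from hololinked.server.thing import Thing

class MinimalThing(Thing):
    pass

thing = MinimalThing(instance_name="minimal-thing", log_level=logging.WARN)
# affordances whose schema generation fails are logged and skipped,
# so a partial but valid thing description is still produced
td = thing.get_thing_description(ignore_errors=True)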
diff --git a/hololinked/server/td_pydantic_extensions.py b/hololinked/server/td_pydantic_extensions.py
new file mode 100644
index 0000000..5bc3295
--- /dev/null
+++ b/hololinked/server/td_pydantic_extensions.py
@@ -0,0 +1,289 @@
+from __future__ import annotations
+
+from enum import Enum
+from pydantic import BaseModel, Field, ConfigDict, TypeAdapter, ValidationError
+from pydantic.json_schema import GenerateJsonSchema
+from pydantic._internal._core_utils import is_core_schema, CoreSchemaOrField
+from typing import Optional, Sequence, Union, Any, Mapping, List, Dict
+from .serializers import JSONSerializer
+
+JSONSchema = dict[str, Any] # A type to represent JSONSchema
+
+AnyUri = str
+Description = str
+Descriptions = Optional[Dict[str, str]]
+Title = str
+Titles = Optional[Dict[str, str]]
+Security = Union[List[str], str]
+Scopes = Union[List[str], str]
+TypeDeclaration = Union[str, List[str]]
+
+
+class Type(Enum):
+ boolean = "boolean"
+ integer = "integer"
+ number = "number"
+ string = "string"
+ object = "object"
+ array = "array"
+ null = "null"
+
+
+class DataSchema(BaseModel):
+
+ field_type: Optional[TypeDeclaration] = Field(None, alias="@type")
+ description: Optional[Description] = None
+ title: Optional[Title] = None
+ descriptions: Optional[Descriptions] = None
+ titles: Optional[Titles] = None
+ writeOnly: Optional[bool] = None
+ readOnly: Optional[bool] = None
+ oneOf: Optional[list[DataSchema]] = None
+ unit: Optional[str] = None
+ enum: Optional[list] = None
+ # enum was `Field(None, min_length=1, unique_items=True)` but this failed with
+ # generic models
+ format: Optional[str] = None
+ const: Optional[Any] = None
+ default: Optional[Any] = None
+ type: Optional[Type] = None
+ # The fields below should be empty unless type==Type.array
+ items: Optional[Union[DataSchema, List[DataSchema]]] = None
+ maxItems: Optional[int] = Field(None, ge=0)
+ minItems: Optional[int] = Field(None, ge=0)
+ # The fields below should be empty unless type==Type.number or Type.integer
+ minimum: Optional[Union[int, float]] = None
+ maximum: Optional[Union[int, float]] = None
+ exclusiveMinimum: Optional[Union[int, float]] = None
+ exclusiveMaximum: Optional[Union[int, float]] = None
+ multipleOf: Optional[Union[int, float]] = None
+ # The fields below should be empty unless type==Type.object
+ properties: Optional[Mapping[str, DataSchema]] = None
+ required: Optional[list[str]] = None
+ # The fields below should be empty unless type==Type.string
+ minLength: Optional[int] = None
+ maxLength: Optional[int] = None
+ pattern: Optional[str] = None
+ contentEncoding: Optional[str] = None
+ contentMediaType: Optional[str] = None
+
+ model_config = ConfigDict(extra="forbid")
+
+
+
+def is_a_reference(d: JSONSchema) -> bool:
+ """Return True if a JSONSchema dict is a reference
+
+ JSON Schema references are one-element dictionaries with
+ a single key, `$ref`. `pydantic` sometimes breaks this
+ rule and so I don't check that it's a single key.
+ """
+ return "$ref" in d
+
+
+def look_up_reference(reference: str, d: JSONSchema) -> JSONSchema:
+ """Look up a reference in a JSONSchema
+
+ This first asserts the reference is local (i.e. starts with #
+ so it's relative to the current file), then looks up
+ each path component in turn.
+ """
+ if not reference.startswith("#/"):
+ raise NotImplementedError(
+ "Built-in resolver can only dereference internal JSON references "
+ "(i.e. starting with #)."
+ )
+ try:
+ resolved: JSONSchema = d
+ for key in reference[2:].split("/"):
+ resolved = resolved[key]
+ return resolved
+ except KeyError as ke:
+ raise KeyError(
+ f"The JSON reference {reference} was not found in the schema "
+ f"(original error {ke})."
+ )
+
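# A short runnable sketch of the resolver above, assuming the module path of
# this new file once the patch is applied.
from hololinked.server.td_pydantic_extensions import look_up_reference

doc = {
    "$defs": {"Point": {"type": "object", "properties": {"x": {"type": "number"}}}},
    "properties": {"origin": {"$ref": "#/$defs/Point"}},
}
# only internal references ("#/...") are supported; anything else raises NotImplementedError
assert look_up_reference("#/$defs/Point", doc) == doc["$defs"]["Point"]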
+
+def is_an_object(d: JSONSchema) -> bool:
+ """Determine whether a JSON schema dict is an object"""
+ return "type" in d and d["type"] == "object"
+
+
+def convert_object(d: JSONSchema) -> JSONSchema:
+ """Convert an object from JSONSchema to Thing Description"""
+ out: JSONSchema = d.copy()
+ # AdditionalProperties is not supported by Thing Description, and it is ambiguous
+ # whether this implies it's false or absent. I will, for now, ignore it, so we
+ # delete the key below.
+ if "additionalProperties" in out:
+ del out["additionalProperties"]
+ return out
+
+
+def convert_anyof(d: JSONSchema) -> JSONSchema:
+ """Convert the anyof key to oneof
+
+    JSONSchema distinguishes between "anyOf" and "oneOf": with the former the
+    value must match at least one of the subschemas, with the latter exactly
+    one. Thing Description only supports "oneOf", so we convert anyOf to oneOf.
+ """
+ if "anyOf" not in d:
+ return d
+ out: JSONSchema = d.copy()
+ out["oneOf"] = out["anyOf"]
+ del out["anyOf"]
+ return out
+
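# Runnable sketch of the rename above (module path as in this patch): the
# `anyOf` that pydantic emits for Optional/Union fields becomes `oneOf`.
from hololinked.server.td_pydantic_extensions import convert_anyof

schema = {"anyOf": [{"type": "string"}, {"type": "null"}], "default": None}
assert convert_anyof(schema) == {"oneOf": [{"type": "string"}, {"type": "null"}], "default": None}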
+
+def convert_prefixitems(d: JSONSchema) -> JSONSchema:
+ """Convert the prefixitems key to items
+
+ JSONSchema 2019 (as used by thing description) used
+ `items` with a list of values in the same way that JSONSchema
+ now uses `prefixitems`.
+
+ JSONSchema 2020 uses `items` to mean the same as `additionalItems`
+ in JSONSchema 2019 - but Thing Description doesn't support the
+    `additionalItems` keyword. Rather than silently overwriting an existing
+    `items` key, we raise a ValueError if both keys are present.
+
+ This behaviour may be relaxed in the future.
+ """
+ if "prefixItems" not in d:
+ return d
+ out: JSONSchema = d.copy()
+ if "items" in out:
+ raise ValueError(f"Overwrote the `items` key on {out}.")
+ out["items"] = out["prefixItems"]
+ del out["prefixItems"]
+ return out
+
+
+def convert_additionalproperties(d: JSONSchema) -> JSONSchema:
+ """Move additionalProperties into properties, or remove it"""
+ if "additionalProperties" not in d:
+ return d
+ out: JSONSchema = d.copy()
+ if "properties" in out and "additionalProperties" not in out["properties"]:
+ out["properties"]["additionalProperties"] = out["additionalProperties"]
+ del out["additionalProperties"]
+ return out
+
+
+def check_recursion(depth: int, limit: int):
+ """Check the recursion count is less than the limit"""
+ if depth > limit:
+ raise ValueError(
+ f"Recursion depth of {limit} exceeded - perhaps there is a circular "
+ "reference?"
+ )
+
+
+def jsonschema_to_dataschema(
+ d: JSONSchema,
+ root_schema: Optional[JSONSchema] = None,
+ recursion_depth: int = 0,
+ recursion_limit: int = 99,
+) -> JSONSchema:
+ """remove references and change field formats
+
+ JSONSchema allows schemas to be replaced with `{"$ref": "#/path/to/schema"}`.
+ Thing Description does not allow this. `dereference_jsonschema_dict` takes a
+ `dict` representation of a JSON Schema document, and replaces all the
+ references with the appropriate chunk of the file.
+
+ JSONSchema can represent `Union` types using the `anyOf` keyword, which is
+ called `oneOf` by Thing Description. It's possible to achieve the same thing
+ in the specific case of array elements, by setting `items` to a list of
+ `DataSchema` objects. This function does not yet do that conversion.
+
+ This generates a copy of the document, to avoid messing up `pydantic`'s cache.
+ """
+ root_schema = root_schema or d
+ check_recursion(recursion_depth, recursion_limit)
+ # JSONSchema references are one-element dictionaries, with a single key called $ref
+ while is_a_reference(d):
+ d = look_up_reference(d["$ref"], root_schema)
+ recursion_depth += 1
+ check_recursion(recursion_depth, recursion_limit)
+
+ if is_an_object(d):
+ d = convert_object(d)
+ d = convert_anyof(d)
+ d = convert_prefixitems(d)
+ d = convert_additionalproperties(d)
+
+ # After checking the object isn't a reference, we now recursively check
+ # sub-dictionaries and dereference those if necessary. This could be done with a
+ # comprehension, but I am prioritising readability over speed. This code is run when
+ # generating the TD, not in time-critical situations.
+ rkwargs: dict[str, Any] = {
+ "root_schema": root_schema,
+ "recursion_depth": recursion_depth + 1,
+ "recursion_limit": recursion_limit,
+ }
+ output: JSONSchema = {}
+ for k, v in d.items():
+ if isinstance(v, dict):
+ # Any items that are Mappings (i.e. sub-dictionaries) must be recursed into
+ output[k] = jsonschema_to_dataschema(v, **rkwargs)
+ elif isinstance(v, Sequence) and len(v) > 0 and isinstance(v[0], Mapping):
+ # We can also have lists of mappings (i.e. Array[DataSchema]), so we
+ # recurse into these.
+ output[k] = [jsonschema_to_dataschema(item, **rkwargs) for item in v]
+ else:
+ output[k] = v
+ return output
+
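# A small end-to-end sketch (module path as in this patch): references are
# resolved and `anyOf` is rewritten to `oneOf` on the way down; this example
# only exercises the latter.
from typing import Optional
from pydantic import BaseModel
from hololinked.server.td_pydantic_extensions import jsonschema_to_dataschema

class Settings(BaseModel):
    gain: Optional[float] = None

flat = jsonschema_to_dataschema(Settings.model_json_schema())
assert "oneOf" in flat["properties"]["gain"]   # was `anyOf` in the raw pydantic schema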
+
+def type_to_dataschema(t: Union[type, BaseModel], **kwargs) -> DataSchema:
+ """Convert a Python type to a Thing Description DataSchema
+
+    This makes use of pydantic's JSON schema generation (`model_json_schema()`
+    or `TypeAdapter(...).json_schema()`) to create a json schema, then applies
+    some fixes to make a DataSchema
+ as per the Thing Description (because Thing Description is
+ almost but not quite compatible with JSONSchema).
+
+ Additional keyword arguments are added to the DataSchema,
+ and will override the fields generated from the type that
+ is passed in. Typically you'll want to use this for the
+ `title` field.
+ """
+ if isinstance(t, BaseModel):
+ json_schema = t.model_json_schema()
+ else:
+ json_schema = TypeAdapter(t).json_schema()
+ schema_dict = jsonschema_to_dataschema(json_schema)
+ # Definitions of referenced ($ref) schemas are put in a
+ # key called "definitions" or "$defs" by pydantic. We should delete this.
+ # TODO: find a cleaner way to do this
+ # This shouldn't be a severe problem: we will fail with a
+ # validation error if other junk is left in the schema.
+ for k in ["definitions", "$defs"]:
+ if k in schema_dict:
+ del schema_dict[k]
+ schema_dict.update(kwargs)
+ try:
+ return DataSchema(**schema_dict)
+ except ValidationError as ve:
+ print(
+ "Error while constructing DataSchema from the "
+ "following dictionary:\n"
+ + JSONSerializer().dumps(schema_dict, indent=2)
+            + "\nBefore conversion, the JSONSchema was:\n"
+ + JSONSerializer().dumps(json_schema, indent=2)
+ )
+ raise ve
+
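# Brief runnable sketch of the conversion above (module path as in this patch);
# extra keyword arguments override generated fields, e.g. the TD `title`.
from pydantic import BaseModel
from hololinked.server.td_pydantic_extensions import type_to_dataschema

class Position(BaseModel):
    x: float
    y: float

ds = type_to_dataschema(Position, title="stage position")
assert ds.title == "stage position" and ds.type.value == "object"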
+
+class GenerateJsonSchemaWithoutDefaultTitles(GenerateJsonSchema):
+ """Drops autogenerated titles from JSON Schema"""
+
+ # https://stackoverflow.com/questions/78679812/pydantic-v2-to-json-schema-translation-how-to-suppress-autogeneration-of-title
+ def field_title_should_be_set(self, schema: CoreSchemaOrField) -> bool:
+ return_value = super().field_title_should_be_set(schema)
+ if return_value and is_core_schema(schema):
+ return False
+ return return_value
diff --git a/hololinked/server/thing.py b/hololinked/server/thing.py
index 10cce80..7edbdc3 100644
--- a/hololinked/server/thing.py
+++ b/hololinked/server/thing.py
@@ -55,6 +55,7 @@ def __new__(cls, __name, __bases, __dict : TypedKeyMappingsConstrainedDict):
def __call__(mcls, *args, **kwargs):
instance = super().__call__(*args, **kwargs)
+
instance.__post_init__()
return instance
@@ -81,7 +82,7 @@ class Thing(Parameterized, metaclass=ThingMeta):
Subclass from here to expose python objects on the network (with HTTP/TCP) or to other processes (ZeroMQ)
"""
- __server_type__ = ServerTypes.THING
+ __server_type__ = ServerTypes.THING # not a server, this needs to be removed.
# local properties
instance_name = String(default=None, regex=r'[A-Za-z]+[A-Za-z_0-9\-\/]*', constant=True, remote=False,
@@ -107,7 +108,7 @@ class Thing(Parameterized, metaclass=ThingMeta):
remote=False, isinstance=False,
doc="""Validator for JSON schema. If not supplied, a default JSON schema validator is created.""") # type: BaseSchemaValidator
- # remote paramerters
+ # remote properties
state = String(default=None, allow_None=True, URL_path='/state', readonly=True, observable=True,
fget=lambda self : self.state_machine.current_state if hasattr(self, 'state_machine') else None,
doc="current state machine's state if state machine present, None indicates absence of state machine.") #type: typing.Optional[str]
@@ -127,16 +128,6 @@ class Thing(Parameterized, metaclass=ThingMeta):
URL_path='/object-info') # type: ThingInformation
- def __new__(cls, *args, **kwargs):
- obj = super().__new__(cls)
- # defines some internal fixed attributes. attributes created by us that require no validation but
- # cannot be modified are called _internal_fixed_attributes
- obj._internal_fixed_attributes = ['_internal_fixed_attributes', 'instance_resources',
- '_httpserver_resources', '_zmq_resources', '_owner', 'rpc_server', 'message_broker',
- '_event_publisher']
- return obj
-
-
def __init__(self, *, instance_name : str, logger : typing.Optional[logging.Logger] = None,
serializer : typing.Optional[JSONSerializer] = None, **kwargs) -> None:
"""
@@ -183,10 +174,10 @@ class attribute, see docs.
self._owner : typing.Optional[Thing] = None
self._internal_fixed_attributes : typing.List[str]
self._full_URL_path_prefix : str
+ self._gui = None # filler for a future feature
+ self._event_publisher = None # type : typing.Optional[EventPublisher]
self.rpc_server = None # type: typing.Optional[RPCServer]
self.message_broker = None # type : typing.Optional[AsyncPollingZMQServer]
- self._event_publisher = None # type : typing.Optional[EventPublisher]
- self._gui = None # filler for a future feature
# serializer
if not isinstance(serializer, JSONSerializer) and serializer != 'json' and serializer is not None:
raise TypeError("serializer key word argument must be JSONSerializer. If one wishes to use separate serializers " +
@@ -216,19 +207,6 @@ def __post_init__(self):
self.logger.info(f"initialialised Thing class {self.__class__.__name__} with instance name {self.instance_name}")
- def __setattr__(self, __name: str, __value: typing.Any) -> None:
- if __name == '_internal_fixed_attributes' or __name in self._internal_fixed_attributes:
- # order of 'or' operation for above 'if' matters
- if not hasattr(self, __name) or getattr(self, __name, None) is None:
- # allow setting of fixed attributes once
- super().__setattr__(__name, __value)
- else:
- raise AttributeError(f"Attempted to set {__name} more than once. " +
- "Cannot assign a value to this variable after creation.")
- else:
- super().__setattr__(__name, __value)
-
-
def _prepare_resources(self):
"""
this method analyses the members of the class which have '_remote_info' variable declared
@@ -302,7 +280,9 @@ def _get_object_info(self):
@object_info.setter
def _set_object_info(self, value):
self._object_info = ThingInformation(**value)
-
+ for name, thing in inspect._getmembers(self, lambda o: isinstance(o, Thing), getattr_without_descriptor_read):
+ thing._object_info.http_server = self._object_info.http_server
+
@property
def properties(self) -> ClassProperties:
@@ -421,8 +401,9 @@ def event_publisher(self) -> EventPublisher:
@event_publisher.setter
def event_publisher(self, value : EventPublisher) -> None:
if self._event_publisher is not None:
- raise AttributeError("Can set event publisher only once")
-
+ if value is not self._event_publisher:
+ raise AttributeError("Can set event publisher only once")
+
def recusively_set_event_publisher(obj : Thing, publisher : EventPublisher) -> None:
for name, evt in inspect._getmembers(obj, lambda o: isinstance(o, Event), getattr_without_descriptor_read):
assert isinstance(evt, Event), "object is not an event"
@@ -472,7 +453,7 @@ def get_postman_collection(self, domain_prefix : str = None):
@action(URL_path='/resources/wot-td', http_method=HTTP_METHODS.GET)
- def get_thing_description(self, authority : typing.Optional[str] = None):
+ def get_thing_description(self, authority : typing.Optional[str] = None, ignore_errors : bool = False):
# allow_loose_schema : typing.Optional[bool] = False):
"""
generate thing description schema of Web of Things https://www.w3.org/TR/wot-thing-description11/.
@@ -487,8 +468,11 @@ def get_thing_description(self, authority : typing.Optional[str] = None):
'http://my-pc:9090' or 'https://IT-given-domain-name'. If absent, a value will be automatically
given using ``socket.gethostname()`` and the port at which the last HTTPServer (``hololinked.server.HTTPServer``)
attached to this object was running.
-
- Returns:
+ ignore_errors: bool, optional, Default False
+            if True, interaction affordances that fail schema generation are skipped (and logged)
+            instead of raising an error, so that a partial but working schema is always produced.
+ Returns
+ -------
hololinked.wot.td.ThingDescription
represented as an object in python, gets automatically serialized to JSON when pushed out of the socket.
"""
@@ -498,7 +482,7 @@ def get_thing_description(self, authority : typing.Optional[str] = None):
# In other words, schema validation will always pass.
from .td import ThingDescription
return ThingDescription(instance=self, authority=authority or self._object_info.http_server,
- allow_loose_schema=False).produce() #allow_loose_schema)
+ allow_loose_schema=False, ignore_errors=ignore_errors).produce() #allow_loose_schema)
@action(URL_path='/exit', http_method=HTTP_METHODS.POST)
@@ -515,7 +499,11 @@ def exit(self) -> None:
raise BreakInnerLoop # stops the inner loop of the object
else:
warnings.warn("call exit on the top object, composed objects cannot exit the loop.", RuntimeWarning)
-
+
+ @action()
+ def ping(self) -> None:
+ """ping the Thing to see if it is alive"""
+ pass
def run(self,
zmq_protocols : typing.Union[typing.Sequence[ZMQ_PROTOCOLS],
@@ -550,6 +538,7 @@ def run(self,
# expose the associated Eventloop which executes the object. This is generally useful for remotely
# adding more objects to the same event loop.
# dont specify http server as a kwarg, as the other method run_with_http_server has to be used
+ self._prepare_resources()
context = kwargs.get('context', None)
if context is not None and not isinstance(context, zmq.asyncio.Context):
raise TypeError("context must be an instance of zmq.asyncio.Context")
@@ -583,8 +572,8 @@ def run(self,
httpserver = kwargs.pop('http_server')
assert isinstance(httpserver, HTTPServer)
httpserver._zmq_protocol = ZMQ_PROTOCOLS.INPROC
- httpserver._zmq_socket_context = context
- httpserver._zmq_event_context = self.event_publisher.context
+ httpserver._zmq_inproc_socket_context = context
+ httpserver._zmq_inproc_event_context = self.event_publisher.context
assert httpserver.all_ok
httpserver.tornado_instance.listen(port=httpserver.port, address=httpserver.address)
self.event_loop.run()
@@ -648,4 +637,3 @@ def run_with_http_server(self, port : int = 8080, address : str = '0.0.0.0',
-
diff --git a/hololinked/server/utils.py b/hololinked/server/utils.py
index 72ebf52..efe56b5 100644
--- a/hololinked/server/utils.py
+++ b/hololinked/server/utils.py
@@ -63,7 +63,8 @@ def pep8_to_URL_path(word : str) -> str:
>>> pep8_to_dashed_URL("device_type")
'device-type'
"""
- return re.sub(r'_+', '-', word.lstrip('_').rstrip('_'))
+ val = re.sub(r'_+', '-', word.lstrip('_').rstrip('_'))
+ return val.replace(' ', '-')
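# With this change, names containing spaces also map cleanly to URL paths,
# e.g. (assuming the module path of this file):
from hololinked.server.utils import pep8_to_URL_path
assert pep8_to_URL_path("stage position") == "stage-position"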
def get_default_logger(name : str, log_level : int = logging.INFO, log_file = None,
@@ -108,6 +109,7 @@ def run_coro_sync(coro : typing.Coroutine):
eventloop = asyncio.get_event_loop()
except RuntimeError:
eventloop = asyncio.new_event_loop()
+ asyncio.set_event_loop(eventloop)
if eventloop.is_running():
raise RuntimeError(f"asyncio event loop is already running, cannot setup coroutine {coro.__name__} to run sync, please await it.")
# not the same as RuntimeError catch above.
@@ -125,6 +127,7 @@ def run_callable_somehow(method : typing.Union[typing.Callable, typing.Coroutine
eventloop = asyncio.get_event_loop()
except RuntimeError:
eventloop = asyncio.new_event_loop()
+ asyncio.set_event_loop(eventloop)
if asyncio.iscoroutinefunction(method):
coro = method()
else:
@@ -231,7 +234,8 @@ def issubklass(obj, cls):
run_coro_sync.__name__,
run_callable_somehow.__name__,
get_signature.__name__,
- isclassmethod.__name__
+ isclassmethod.__name__,
+ issubklass.__name__
]
diff --git a/hololinked/server/zmq_message_brokers.py b/hololinked/server/zmq_message_brokers.py
index cc33829..2770233 100644
--- a/hololinked/server/zmq_message_brokers.py
+++ b/hololinked/server/zmq_message_brokers.py
@@ -2,6 +2,7 @@
import os
import threading
import time
+import warnings
import zmq
import zmq.asyncio
import asyncio
@@ -2188,7 +2189,21 @@ def register(self, event : "EventDispatcher") -> None:
raise AttributeError(f"event {event._name} already found in list of events, please use another name.")
self.event_ids.add(event._unique_identifier)
self.events.add(event)
-
+
+ def unregister(self, event : "EventDispatcher") -> None:
+ """
+        unregister a previously registered event
+
+ Parameters
+ ----------
+ event: ``Event``
+ ``Event`` object that needs to be unregistered.
+ """
+ if event in self.events:
+ self.events.remove(event)
+ self.event_ids.remove(event._unique_identifier)
+ else:
+            warnings.warn(f"event {event._name} not found in list of events, cannot unregister.", UserWarning)
def publish(self, unique_identifier : bytes, data : typing.Any, *, zmq_clients : bool = True,
http_clients : bool = True, serialize : bool = True) -> None:
@@ -2215,10 +2230,15 @@ def publish(self, unique_identifier : bytes, data : typing.Any, *, zmq_clients :
return
if zmq_clients:
# TODO - event id should not any longer be unique
- self.socket.send_multipart([unique_identifier, self.zmq_serializer.dumps(data)])
+ self.socket.send_multipart([b'zmq-' + unique_identifier, self.zmq_serializer.dumps(data)])
if http_clients:
self.socket.send_multipart([unique_identifier, self.http_serializer.dumps(data)])
- else:
+ elif not isinstance(self.zmq_serializer , JSONSerializer):
+ if zmq_clients:
+ self.socket.send_multipart([b'zmq-' + unique_identifier, data])
+ if http_clients:
+ self.socket.send_multipart([unique_identifier, data])
+ else:
self.socket.send_multipart([unique_identifier, data])
else:
raise AttributeError("event name {} not yet registered with socket {}".format(unique_identifier, self.socket_address))
diff --git a/licenses/labthings-fastapi-LICENSE.txt b/licenses/labthings-fastapi-LICENSE.txt
new file mode 100644
index 0000000..ee7f13e
--- /dev/null
+++ b/licenses/labthings-fastapi-LICENSE.txt
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2024 Richard William Bowman
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
diff --git a/requirements.txt b/requirements.txt
index 69595f1..0694c5a 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,8 +1,12 @@
argon2-cffi==23.1.0
ifaddr==0.2.0
msgspec==0.18.6
-pyzmq==25.1.0
+pyzmq>=26.2.0
SQLAlchemy==2.0.21
SQLAlchemy_Utils==0.41.1
tornado==6.3.3
jsonschema==4.22.0
diff --git a/setup.py b/setup.py
index 01725b9..ad97418 100644
--- a/setup.py
+++ b/setup.py
@@ -7,7 +7,7 @@
setuptools.setup(
name="hololinked",
- version="0.2.4",
+ version="0.2.7",
author="Vignesh Vaidyanathan",
author_email="vignesh.vaidyanathan@hololinked.dev",
description="A ZMQ-based Object Oriented RPC tool-kit for instrument control/data acquisition or controlling generic python objects.",
diff --git a/tests/requirements.txt b/tests/requirements.txt
index 7992173..47fbc6b 100644
--- a/tests/requirements.txt
+++ b/tests/requirements.txt
@@ -1,7 +1,7 @@
argon2-cffi==23.1.0
ifaddr==0.2.0
msgspec==0.18.6
-pyzmq==25.1.0
+pyzmq>=26.2.0
SQLAlchemy==2.0.21
SQLAlchemy_Utils==0.41.1
tornado==6.3.3
diff --git a/tests/test_events.py b/tests/test_events.py
index 27e5804..59956d1 100644
--- a/tests/test_events.py
+++ b/tests/test_events.py
@@ -27,7 +27,7 @@ def push_events(self):
def _push_worker(self):
for i in range(100):
self.test_event.push('test data')
- time.sleep(0.01)
+ time.sleep(0.01) # 10ms
diff --git a/tests/test_thing_init.py b/tests/test_thing_init.py
index f555905..8f5f83a 100644
--- a/tests/test_thing_init.py
+++ b/tests/test_thing_init.py
@@ -162,6 +162,7 @@ def test_7_servers_init(self):
def test_8_resource_generation(self):
# basic test only to make sure nothing is fundamentally wrong
thing = self.thing_cls(instance_name="test_servers_init", log_level=logging.WARN)
+ # thing._prepare_resources()
self.assertIsInstance(thing.get_thing_description(), dict)
self.assertIsInstance(thing.httpserver_resources, dict)
self.assertIsInstance(thing.zmq_resources, dict)