Hi there! I'm new to patito, but I'm excited about the possibility of using it as a link between database table definitions and runtime behavior, outside of object-per-row ORM silliness.
What I mean is the usual backend-y flow: "define tables -> query some data -> transform it -> validate it at runtime before sending to the db". The existing tooling does fine if it's a CRUD operation on a single entity, but it's completely out of its depth at the scale where polars is used.
The last step is the missing link: validating each row individually before sending it to the db is obviously a no-go, and this alone, I think, keeps data workflows away from the kind of type-safe, convenient table definitions that sqlmodel provides.
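To make the flow concrete, here is a minimal sketch of the last two steps using patito's DataFrame-level validation; the table and its fields are made up, and in my setup this definition already exists in SQLAlchemy / SQLModel and would ideally not be duplicated by hand:

import patito as pt
import polars as pl


class Product(pt.Model):
    # Hypothetical table; in practice this is the thing I'd want
    # generated from the existing database-side definition.
    product_id: int = pt.Field(unique=True)
    name: str
    price: float = pt.Field(ge=0)


# Transform with polars as usual...
df = pl.DataFrame(
    {
        "product_id": [1, 2, 3],
        "name": ["a", "b", "c"],
        "price": [1.0, 2.5, 0.99],
    }
)

# ...then validate the whole frame in one go right before the upload
# step, instead of instantiating an ORM object per row.
Product.validate(df)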
The reason I can't use patito right now to drive validation in the upload step of my batch jobs is that patito doesn't respect the fact that there is already an external source of truth for the tables - patito tries to be that source of truth itself.
The solution I see is likely quite a lot of work - implementing some way to convert sqlalchemy / sqlmodel models into patito models. The shared pydantic layer underneath seems largely irrelevant for integrating with sqlmodel, since any non-trivial use case ends up reaching for raw sa_type = MyFancyEmbeddingUserDefinedType declarations, so turning that FieldInfo into something patito can work with is probably a pain.
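To show the shape of what I mean, here is a deliberately rough sketch of such a converter; the function name and the type mapping are invented, and it ignores everything that makes this hard (user-defined types, the sa_type escape hatch, constraints, relationships):

from typing import Optional

import patito as pt
import sqlalchemy as sa

# Hypothetical, tiny SQLAlchemy-to-Python type mapping; real-world
# column types (including user-defined ones) are exactly the painful part.
_SA_TO_PY = {
    sa.Integer: int,
    sa.String: str,
    sa.Float: float,
    sa.Boolean: bool,
}


def patito_model_from_table(table: sa.Table) -> type[pt.Model]:
    """Build a patito model from an existing SQLAlchemy Table (sketch only)."""
    annotations = {}
    namespace = {}
    for column in table.columns:
        py_type = next(
            (py for sa_type, py in _SA_TO_PY.items() if isinstance(column.type, sa_type)),
            str,  # crude fallback; a real converter would need to do much better
        )
        annotations[column.name] = Optional[py_type] if column.nullable else py_type
        namespace[column.name] = pt.Field(unique=column.primary_key)
    namespace["__annotations__"] = annotations
    # pydantic's metaclass picks up __annotations__ and the Field defaults,
    # so this produces a normal patito model class named after the table.
    return type(table.name, (pt.Model,), namespace)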
Maybe, since so many things use pydantic for model definitions, it could be turned into a universal table-definition format through clever FieldInfo abuse?
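Purely hypothetically, the direction I'm imagining is something like stashing the database-side and DataFrame-side metadata next to each other on the pydantic field, and letting each library read its half; none of these keys exist today, this is just to illustrate:

from pydantic import BaseModel, Field


class Embedding(BaseModel):
    # Hypothetical convention: json_schema_extra carries both the
    # SQLAlchemy-facing and the patito-facing hints for the column.
    id: int = Field(json_schema_extra={"sa_primary_key": True, "pt_unique": True})
    vector: list[float] = Field(
        json_schema_extra={"sa_type": "MyFancyEmbeddingUserDefinedType"}
    )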
Let me know what you think about all this; I'm open to working on some POCs if you tell me what style of integration you're comfortable with in your project.