Replies: 3 comments 16 replies
-
I'm happy that you have found Polylith and also the things I've shared about monorepos! ⭐ Polylith supports most of the modern Package & Dependency Management tools out there: Poetry, uv, hatch, PDM, Maturin, Pixi ... and works very well with them.
About the code editor and IDE: VS Code will work for sure. I don't have any hard data, but I would guess the majority of Polylith users today are on VS Code, vim or PyCharm. You can also expect the common Python tooling to work well with Polylith (such as pytest, mypy, black, isort, ruff).
-
I did look deeper into python-polylith and uv workspaces. I read the documentation at https://polylith.gitbook.io/polylith (which I can recommend).
When setting up python-polylith I felt a bit lost, partly because I tried to evaluate uv workspaces at the same time. Doing both things at once didn't really work out. I found that I can replicate most of the Polylith concept via uv workspaces:

```shell
# Create a new component
uv init --package --lib components/my-component

# Create a new app
uv init --app apps/my-app

# Add my-component as a dependency of my-app
uv add --package my-app my-component
```

The resulting monorepo structure:
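A rough sketch of what this gives you, assuming the commands are run inside a root project so that uv registers the new packages as workspace members (reconstructed from uv's defaults, details may differ by version):

```
.
├── pyproject.toml            # workspace root
├── apps/
│   └── my-app/
│       ├── pyproject.toml
│       └── main.py
└── components/
    └── my-component/
        ├── pyproject.toml
        └── src/
            └── my_component/
                ├── __init__.py
                └── py.typed
```

The glue lives in the TOML files: the root pyproject.toml lists the members, and `uv add --package my-app my-component` records the dependency as a workspace source, roughly like this:

```toml
# root pyproject.toml
[tool.uv.workspace]
members = ["components/my-component", "apps/my-app"]

# apps/my-app/pyproject.toml
[project]
name = "my-app"
version = "0.1.0"
dependencies = ["my-component"]

[tool.uv.sources]
my-component = { workspace = true }
```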
There are a few things missing:
But there are multiple issues on the uv repo asking for tooling like this:
Even though we might not end up using python-polylith, I thought it was worth sharing my thoughts.
-
I've been struggling with similar issues, only in my case I'm migrating multiple independent repos, which were set up with a single implicit namespace, into the same repository. One of the biggest issues I've had was actually carving out the specific modules for packaging applications together, which Polylith does really well with the hatchling plugin. After some more testing, I'm currently looking at the following structure (investigating retaining a top-level directory for the project namespace):
The thing with this is that I can continue to use polylith to build wheel packages with the following hooks in the pyproject.toml:
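Roughly along these lines, assuming the hatch-polylith-bricks plugin (names written from memory, so double-check them against the plugin's docs):

```toml
[build-system]
requires = ["hatchling", "hatch-polylith-bricks"]
build-backend = "hatchling.build"

# the build hook resolves the bricks this project depends on
# and includes their source in the built wheel
[tool.hatch.build.hooks.polylith-bricks]
```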
(I'm seemingly unable to recreate this behavior solely with hatchling build rules!) But I lose the ability to track and sync new dependencies easily. I'm not entirely certain how helpful this observation is, but it feels similar to the problems Spenhouet has brought up. It could be a rather powerful capability to practically "skip" the implicit (?) requirement to separate the bricks into the components and bases directories.
-
Hi @DavidVujic, thanks for providing python-polylith!
You seem very passionate about the monorepo topic. Hence, I'd like to share what we are currently working on, i.e. why I'm evaluating Polylith, and some thoughts around that.
We are maintaining a larger multi-repo microservice code-base with services and packages all versioned via semantic versioning. It gives a lot of flexibility (e.g. different python and dependency versions per service) and control (e.g. individual access control per repo).
BUT it comes with a ton of overhead (per-project commits, PRs, reviews, merge orders, updating internal dependency versions, maintaining dev environments per repo, CVE analysis for multiple versions per dependency and Python version, maintaining multi-version documentation, more complex testing, ...).
Due to larger organization changes, our team can no longer produce meaningful progress with this setup.
One of the main motivators for a monorepo is our planned adoption of coding agents. Having an agent make consistent changes across many microservices and python packages, automatically test everything and update docs is magical. A monorepo makes this easier. All context in one place. Single commits for larger cross-service changes.
While evaluating our options, I have a constant fear of unknown unknowns and of running into a "the grass isn't actually greener on the other side" situation.
I started off by creating a parent "monorepo" using Git subtree. This way we could keep our separate repos (and existing CI/CD pipelines, workflows, etc.) while working with a "monorepo", and we'd have a fallback in case the new setup isn't as great as we hoped.
In addition, this allows some repos to be maintained separately by another team while our team maintains them as part of the monorepo.
However, we make heavy use of Git LFS with custom LFS repos, and it turns out that Git subtree isn't really compatible with Git LFS.
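For reference, the kind of subtree setup I mean looks roughly like this (repo URLs and prefixes are placeholders):

```shell
# pull an existing service repo into the parent "monorepo" as a subtree
git subtree add --prefix=services/service-a git@example.com:org/service-a.git main --squash

# later, sync upstream changes from the original repo into the subtree
git subtree pull --prefix=services/service-a git@example.com:org/service-a.git main --squash
```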
As our IDE we use VS Code. A supporting idea is to use VS Code multi-root workspaces: https://code.visualstudio.com/docs/editing/workspaces/multi-root-workspaces
This allows the IDE to properly handle each project's venv, tests, etc. even in a monorepo setup.
This is especially important when working with agents, as they need to be able to properly run code and tests for each project.
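To illustrate, a .code-workspace file for such a setup could look roughly like this (the folder names are made up):

```jsonc
// monorepo.code-workspace (hypothetical folder names)
{
  "folders": [
    { "path": "platform-monorepo" },
    { "path": "legacy-service-a" },
    { "path": "legacy-service-b" }
  ],
  "settings": {
    // workspace-wide settings go here; per-project settings such as the
    // Python interpreter live in each folder's own .vscode/settings.json
  }
}
```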
As Git subtree wasn't working out, I tried to simply use multi-root workspaces to aggregate all repos in a single view.
For enabling the agent to work on the whole system, this seems to already work well (I wouldn't say great).
But the dev UX is not great. With x repos, the source control view becomes unusable. We'd also keep all the Git overhead.
My current perspective is to migrate some of our repos into a monorepo and combine that with other repos via VS Code multi-root workspaces.
But I saw in your YouTube guides and GitHub issue comments that you are not using VS Code, which makes me a bit concerned about the compatibility/integration of Polylith with VS Code workspaces. I will have to test that.
Some of our core tooling/workflow is still based on 4-5 year old decisions (pyenv, pip, GitFlow), but the Python tooling has since evolved and matured.
Over time we have already updated parts of it, replacing mypy and co. with ruff and pyright. For a while I have wanted us to switch to uv, as it combines multiple tools we are using right now and also replaces several custom shell scripts we maintain for installing Python versions and managing virtual envs.
That's one of the first aspects that makes me a bit hesitant about Polylith: the docs seem to center on Poetry and mypy, which in my view are outdated in the Python ecosystem (maybe a controversial take).
In addition to, or as an alternative to, Polylith, I'm looking into uv workspaces. I found the uv workspaces + Dagger setup described in the following blog post intriguing: https://gafni.dev/blog/cracking-the-python-monorepo/
I can't tell yet how or if Polylith fits into that.
Another aspect that concerns me is that not all our projects are Python-based. Tooling like Polylith or uv workspaces is Python-only (which is okay), but it's unclear to me how I'm supposed to integrate other, non-Python projects into that monorepo structure.
What I find appealing about Polylith is its opinionated repo structure promoting reusable code, and the detection of what's affected by a change. At the moment I'm hoping Polylith simply complements uv workspaces and Dagger by providing a workflow and diff detection.
I will continue to share my findings. Hope it's helpful for someone.