Discuss installable Python package support for Python 3.8 plugin_host and beyond #1652
One of the most popular packages (SFTP, written by myself, and the impetus for Package Control in the first place) used compiled 3.8 code. Using pip itself is unlikely to be too much of a win, since you will need to:
There is existing code to do a lot of the actual package mechanics, and we have had all of the downloading code running for a long time. Trying to have package downloads use one code path and pip libraries use a completely different downloading mechanism isn't a recipe for success IMO. There is new code to handle most of the package operations and it is probably 80% done. Finishing that should take less work than learning all about pip internals and trying to get it all to work.

I think aside from the many unique technical constraints of the ST Python environment, the big issue is that this is no longer a passion project for me, and there hasn't been much work except by @deathaxe. This is probably largely my fault, but I don't see a way to really get out of this situation unless Sublime HQ decides to take over the project and decide how to do such things. They currently bankroll the vast majority of the hosting costs, etc. through a GitHub sponsorship. But really the package work could use someone who is working at least part time on it and is willing to get deep into the headspace of cross-platform, multi-version Python package management.

In terms of Python versions, that is 100% up to Sublime HQ. I personally wouldn't ditch Python 3.8 any time soon, but I no longer work in such a capacity, so I don't get to make such decisions.
I've played around with running pip directly from within ST's Python 3.8 plugin host a while ago, by copying the pip folder from a system's Python 3.8 installation and running its CLI via ST's console as follows:

```python
from pip._internal.cli.main import main

main(["install", "--only-binary", ":all:", "-t", R"<st>\Data\Lib\python38", "pyyaml"])
```

If it works, it seems to work pretty well. The downsides are:
PEP 440 version support in https://github.com/wbond/package_control/tree/wip/pep440 is nearly complete - except for some minor edge cases with regard to local versions. But I don't think we need to care about them. Parsing and resolving requirements is somewhat open. It doesn't look too difficult, but well, I am already spending a lot of spare time on ST stuff and it's not my main domain. I'd already be happy to get the current PC 4.0 released and packagecontrol.io updated to support the new 4.0.0 scheme version, so we can even move forward with using Python 3.8 packages at all.
Thanks for responding @wbond =)
If the compiled 3.8 code for SFTP is in a wheel somewhere, we could upgrade the plugin host to a newer Python, the installer could reinstall the wheel, and the plugin could keep working. If that compiled 3.8 code is vendored, then you'd have to update that plugin code manually, or PC is stuck running the 3.8 plugin host forever.
If we only support installable packages for a single "dynamic" plugin host (3.8 now, 3.X in the future), we can upgrade this plugin host frequently enough to always use a pip that's maintained. Python 3.8 is already old, but pip still works great for this version
PC could collect all the plugin deps, put them in a single requirements.txt
Thank you also @deathaxe for responding, and for working on PC.

Re 1, sounds gnarly, I'm guessing there's a fix. And again, if we use […]. That said, if you think there's a good way to do installable packages for 3.8 and future plugin host versions without depending on pip, […].

Will, I agree with you that ideally Sublime HQ takes responsibility for the future of Package Control. IMO you and PC are a huge reason Sublime is successful, and if PC isn't maintained Sublime loses a big part of its value.
To clarify: Don't mix up Sublime Text packages and Python packages! Those are totally different things which don't share anything! SFTP is a Sublime Text package and will therefore never be published as a WHEEL file, as none of the other Sublime Text packages will. pip can't be used to handle Sublime Text packages, as it is not designed to understand their structure. Even if it were theoretically possible to use the WHEEL format to publish packages/plugins for Sublime Text, it would significantly degrade the developer experience, as pip's package format is a terrible mess and creating wheels involves a build process, which is currently not required.

This also answers: "3, true, but the download mechanism is understood by a lot more people than the current mechanism". The current download mechanism will not go away, as it is crucial for handling Sublime Text packages. It may be replaced by something else in future, but not by pip.

To conclude: The current mechanisms to install/upgrade/manage Sublime Text packages are not going to change - they may just evolve. They work fine for ST3 and 4 without any reason to drop support for any older build or ST3. The whole discussion about pip, or using parts of it, is just about support for what was called "dependencies" in the ST3/PC3 world. Instead of managing our own sets of python packages in an ST compatible format, existing WHEEL files are to be supported to […]
The infrastructure (Lib/python... folder(s)) to install python libraries is supported as of ST 3143, which is therefore the minimum required ST build for PC 4.0. Supporting ST3 just means installing existing dependencies in the new format, which already works. It just keeps Package Control compatible with ST3; no py38 libs or syntax can be used.
[…] unacceptable and useless, because it's always a maintenance burden to use foreign projects, due to the requirement to keep up with their development cycles. They decide to introduce breaking changes? Good luck. Something stops working for you? Good luck. Many unexpected things can happen, which need to be debugged and cost time as well. So careful decisions need to be made about which deps to use. If it is so easy to maintain, just tell me why it's crashing ST's plugin_host. Is it so easy to find out?
Hey @deathaxe
Yeah, I know... I think there was a misunderstanding. Here's what I'm saying:
Maybe it was confusing that I called ST packages "plugins" instead of packages. I was trying to make things less confusing by doing this, because "package" is the term used for Python packages, and because the Sublime Text docs tend to mix these terms.
Installing PIP modules and all of their dependencies needed for a Sublime Text plugin is consistently the highest friction part for developing plugins. Sublime Text has been my favorite editor for over a decade, but the difficulty of developing plugins for it will be its demise. I've easily spent 60-80% of my time developing plugins manually installing/resolving PIP modules that I could've installed with PIP idiomatically in any other situation where I was developing Python code. I would vastly prefer to install Python dependencies through PIP when developing plugins with Package Control. There is significant overhead for maintaining an additional Git repo for each PIP module you want to install as a dependency through Package Control. Authors will stop maintaining these packages, after which plugin developers will end up abandoning their plugins as well. It should be possible for existing dependency management mechanisms to continue working as-is in Package Control, while supporting an additional mechanism for installing Python dependencies with PIP:
There is a concern about different Sublime Text plugins/packages having different/conflicting versions of PIP modules installed, but this is already not solved robustly. Plugins relying strictly on the Package Control Channel Repository have a consistency guarantee; however, I'm certain there are plugins that have to operate using PIP modules outside of this. I am one of them.
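For what it's worth, here is a minimal sketch, not Package Control's implementation, of the workflow being described: installing a PyPI module into ST's shared python38 library folder with an external interpreter's pip. It assumes a system python3.8 with pip on PATH; the target path mirrors the <st>/Data/Lib/python38 folder mentioned earlier in this thread.

```python
# Sketch only: install a PyPI module into ST's shared python38 library folder
# using an external interpreter's pip. Assumes a system "python3.8" with pip
# on PATH; the target path mirrors "<st>/Data/Lib/python38" from this thread.
import subprocess
import sys
from pathlib import Path

def pip_install_for_st(package: str, data_dir: str) -> None:
    target = Path(data_dir) / "Lib" / "python38"
    target.mkdir(parents=True, exist_ok=True)
    subprocess.run([
        "python3.8", "-m", "pip", "install",
        "--only-binary", ":all:",   # wheels only, no build step
        "--upgrade",
        "--target", str(target),    # treat the folder as a custom site-packages
        package,
    ], check=True)

if __name__ == "__main__":
    # usage: python install_dep.py <ST data dir> <package>
    pip_install_for_st(sys.argv[2], sys.argv[1])
```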
To clarify: Normal python packages can be installed into $data/Lib/python33 or $data/Lib/python38 for private use via pip since ST 3143, by specifying those folders as custom site-packages via pip's CLI. That's supported by ST regardless of Package Control being present. You just need to somehow ensure you install packages for the correct interpreter version. PC3 just wasn't the tool to deploy such setups globally. Package Control 4 is designed to mimic pip behavior. It...
There is no need to maintain dedicated git repos for libraries anymore, which are already deployed via pypi.org! In fact, most existing dependencies have already been migrated to be installed from pypi.org, at least for the Python 3.8 environment, while the legacy dependency format (see: https://github.com/packagecontrol/example-dependency) has been extended to support sub-directories such as […]. The only requirement for python packages to be available via Package Control currently is that they are registered at https://github.com/packagecontrol/channel. The only missing feature currently is recursively resolving requirements. There are various reasons why pip can't be used to manage libraries directly:
It was probably possible to leverage some basic packages also used by pip, but... All official python packages such as […]. What's missing, technically, is completing the functionality to interpret python requirements and build the dependency tree from them, to generate a list of packages to install. Currently all required sub-requirements need to be specified manually by a package author, which is also the state of the art in pyscript, for instance. In order to use pydantic, a package author would need to specify a dependencies.json along the lines of:

```json
{
    "*": {
        "*": [
            "annotated-types",
            "eval-type-backport",
            "pydantic",
            "pydantic-core",
            "typing-extensions"
        ]
    }
}
```

That's probably not ideal, but it is also not impossible to manage.
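As a rough, non-authoritative sketch of how such a selector structure could be flattened client-side into the set of libraries to install: only the "*" wildcard appears in the example above, so the platform names and the ">=" build selector below are assumptions for illustration.

```python
# Sketch: flatten a dependencies.json-style mapping of
# {platform-selector: {version-selector: [library names]}} into the libraries
# relevant to the running environment. Selector spellings other than "*"
# (platform names, ">=<build>") are illustrative assumptions.
def select_libraries(deps: dict, platform: str, st_build: int) -> set:
    selected = set()
    for plat_sel, by_version in deps.items():
        if plat_sel not in ("*", platform):
            continue
        for ver_sel, libraries in by_version.items():
            if ver_sel == "*":
                selected.update(libraries)
            elif ver_sel.startswith(">=") and st_build >= int(ver_sel[2:]):
                selected.update(libraries)
    return selected

deps = {
    "*": {
        "*": [
            "annotated-types",
            "eval-type-backport",
            "pydantic",
            "pydantic-core",
            "typing-extensions",
        ]
    }
}
print(sorted(select_libraries(deps, "linux", 4169)))
```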
@deathaxe Appreciate the thoughtful and detailed message here. It seems like improvement has been made on this front, and on the need to unfold transitive PIP dependencies into a dependencies.json. Given your feedback, I would update my proposal to create another Sublime Text plugin - registered in the standard channel for Package Control - called […]
I understand that the […]
I would propose that this plugin only support Python 3.8 starting out, as I suspect effectively all plugin authors will only use 3.8 at this point. Nonetheless, the plugin should be written to automatically support future versions of Python that Sublime Text will adopt. I would be interested in your feedback on this proposal. Ultimately, this could be delivered without official "blessing" by Package Control, but I would prefer to learn from everyone's experience here. You've all spent far more time supporting plugin development and developers than I have.
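For concreteness, a hypothetical sketch of what such a wrapper plugin command might look like; the command name and behavior are invented, not an existing package. It reuses the pip._internal invocation shown earlier in this thread, so the same plugin_host stability caveats apply, and it assumes pip has been made importable inside the plugin host.

```python
# Hypothetical sketch of the proposed wrapper plugin; names are invented.
# Assumes pip is importable inside the plugin_host (e.g. copied in, as
# described earlier in this thread).
import os
import sublime
import sublime_plugin

class PipInstallDependencyCommand(sublime_plugin.ApplicationCommand):
    def run(self, package="pyyaml"):
        data_dir = os.path.dirname(sublime.packages_path())
        target = os.path.join(data_dir, "Lib", "python38")
        try:
            from pip._internal.cli.main import main as pip_main
        except ImportError:
            sublime.error_message("pip is not importable in this plugin_host")
            return
        # wheels only; install into the shared python38 library folder
        rc = pip_main(["install", "--only-binary", ":all:",
                       "--target", target, package])
        sublime.status_message(
            "pip install %s %s" % (package, "succeeded" if rc == 0 else "failed"))
```

It could then be invoked with sublime.run_command("pip_install_dependency", {"package": "pyyaml"}).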
The requirement to install a dedicated python and pip next to ST, just to be able to install dependencies, on all different OSs is not desirable. ST2 relied on the Mac's python26 and therefore no longer works on modern Macs, as python2 has been removed. The same is true on Linux for python33: none of the current distribution versions supports it (by shipping packages). The same will happen soon for python 3.8. Actually, Package Control even shifted to the UnitTesting package to handle CI tests, just to avoid external python dependencies, which had caused tests to start failing once python 3.3 was no longer supported. PIP is a dead end. Package Control ships everything required to install python packages. What's missing is using pypi's index directly to look up packages, and some logic to recursively resolve dependencies.
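As a small sketch of the "use pypi's index directly" part, offered as an illustration rather than Package Control's planned implementation: PyPI's JSON API can be queried with nothing but the standard library to find the latest version of a package and its available wheels.

```python
# Sketch: look a package up on PyPI's JSON API directly, without pip.
# Uses only the standard library.
import json
import urllib.request

def pypi_wheels(name: str):
    """Return (latest version, wheel filenames) for `name` from PyPI's JSON API."""
    url = "https://pypi.org/pypi/%s/json" % name
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    version = data["info"]["version"]
    wheels = [f["filename"] for f in data["urls"]
              if f["packagetype"] == "bdist_wheel"]
    return version, wheels

if __name__ == "__main__":
    print(pypi_wheels("pyyaml"))
```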
Is my understanding correct that a Sublime Text plugin could provide its own repository JSON that references PIP modules under its […]?
Theoretically yes. Practically not with the official packagecontrol.io channel though, as it does not yet support the required scheme v4.0.0. Such a repository couldn't be registered in the default channel. That's actually the reason why we currently ship a separate channel_v4.json for libraries. That channel's infrastructure is however not designed to take packages.
@deathaxe Would I be able to add a package to the […]? It seems that we should not be waiting on plugins to adopt the new mechanism. I am willing to use this mechanism instead of PIP, but I need to be able to use it now.
Adding python packages, a.k.a. libraries, there is ok, but no (Sublime Text) packages. The crawler is not capable of handling GitHub API rate limits and will break the whole channel JSON when that happens. I haven't found the time to make it work like the original one, which is backed by a database so it only crawls a subset of registered packages at a time. Furthermore, those (normal) packages wouldn't be visible on packagecontrol.io or anywhere else except the quick panel.
What is the crawler being referred to here? Is it Sublime Text on a user's machine? Any code/issue links would be appreciated. Are there existing initiatives to migrate the Package Control Channel/Repository schemas to v4? I understand that @wbond is apparently MIA, but the community should not be structured such that the entire show rests on one person.
It refers to […]: packagecontrol.io and our community-driven libraries channel run a scheduled crawler server-side to update channel.json from registered sources (GitHub/GitLab/Bitbucket repos, or PyPI), to avoid this happening on clients.
Transition is […]. It is awaiting review and release.
You are welcome to set up an alternative packagecontrol.io server using the mentioned PR. Otherwise, Will is the only one with access to the packagecontrol.io servers, and he has already stated he would stop maintenance (just keep it running) in case other people get involved in modifying his sources. I understand the basics of the webapp, but I don't feel able to take over its maintenance or further development in my rare spare time. The same applies to signing Package Control.sublime-package and officially releasing it so ST users can directly install Package Control 4 via the installer script (see: #1661). I guess we can't expect more than packagecontrol.io keeping running as it is now, if no new actor steps in. Hence I am also not too motivated to push features such as pypi support for that client, as it will die as soon as packagecontrol.io stops working.
It seems the community-maintained GitHub workflows time out jobs after 6 hours and workflows after 35 days, so there are some scaling concerns here: https://docs.github.com/en/actions/learn-github-actions/usage-limits-billing-and-administration#usage-limits GitHub also rate limits an authenticated user to 5k requests per hour: https://docs.github.com/en/rest/using-the-rest-api/rate-limits-for-the-rest-api?apiVersion=2022-11-28

Due to the number of packages/libraries, scanning packages/libraries for Package Control needs some form of data storage. Seeing that packagecontrol.io uses a proper database, there could be some opportunity to store scanned packages directly as files/artifacts within a GitHub Action.

If GitHub Actions are limited to 1000 GitHub API calls within an hour, this severely limits the ability to scrape packages within a GitHub Action. However, if a GitHub Action can authenticate as a user/GitHub App, then this could scale to 5k requests per hour. It would be useful to know how many GitHub requests are currently made by packagecontrol.io when rebuilding its package index. Any metrics on this would be quite useful.
Those facts are the well-known reasons why normal packages are not accepted currently, besides them not being visible anywhere. The number of API calls per package depends on the number of tags associated with it. The range to expect is about 2-5 calls per package. That's why packagecontrol.io only scans 200 packages per crawl cycle.
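A small, purely illustrative sketch of how a crawler could size its batches around those limits: it checks the remaining GitHub API budget before a cycle and caps the batch accordingly. The token handling and batch sizing are assumptions, not how packagecontrol.io actually works.

```python
# Illustrative only: check the remaining GitHub API budget before a crawl
# cycle and cap the batch size (roughly 2-5 API calls per package, per the
# comment above). Token handling and sizing are assumptions.
import json
import os
import urllib.request

CALLS_PER_PACKAGE = 5     # pessimistic estimate
DEFAULT_BATCH = 200       # packagecontrol.io reportedly crawls 200 per cycle

def remaining_core_calls(token: str) -> int:
    req = urllib.request.Request(
        "https://api.github.com/rate_limit",
        headers={"Authorization": "Bearer " + token},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["resources"]["core"]["remaining"]

def crawl_batch_size(token: str) -> int:
    return min(DEFAULT_BATCH, remaining_core_calls(token) // CALLS_PER_PACKAGE)

if __name__ == "__main__":
    print(crawl_batch_size(os.environ["GITHUB_TOKEN"]))
```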
I reached out to @wbond over email, but I have not heard back from him yet. @deathaxe what if the community built a separate mechanism for crawling and re-building a channel.json?
It's not only about a channel.json, but also about the ability to discover packages via the website, which would mean building a new packagecontrol.io website as well.
Well, until such time as @wbond responds, I'm assuming there's nothing that can be done about packagecontrol.io, even if it's to transfer ownership. Additionally, I rarely use the website to install packages. I almost always do it directly from Sublime Text itself. The description in the list is usually indicative enough of what a package does for me to install it.
Such decisions are not for single users to make, though. That said, the direction of this discussion is going off topic. The scope of this issue is to discuss how to handle libraries and their dependencies. It's not about general package infrastructure.
The scope of this issue is discussing how to handle libraries and their dependencies. The standard package channel not adopting the v4 schema is an active blocker to the issue at hand. I am happy to continue discussion in the following PR if you would prefer: wbond/packagecontrol.io#157 However, we cannot pretend there aren't effective dependencies here. Either the Package Control community needs to adopt the v4 schema properly, or another mechanism - such as the […]
Handling libraries and dependencies is not necessarily associated with upgrading packagecontrol.io to the v4.0.0 schema. For any python package to be available to end users, it currently just needs to be registered at https://github.com/packagecontrol/channel. The only other downside is Package Control not yet resolving and installing their dependencies automatically; they need to be listed explicitly by a package author. This has been the case with legacy dependencies since the beginning - not saying it is ideal. The primary goal is to teach the Package Control client to recursively resolve and install python packages from pypi directly, without the need to register them in any proprietary json file. So if Package Control finds a library name in "dependencies.json" which is not registered in a channel, it would instead reach out to pypi directly to install it. It should act like pip, but we currently can't use pip itself, for the reasons stated repeatedly. The primary benefit of upgrading packagecontrol.io to the v4.0.0 scheme would be to […]
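To make the "recursively resolve ... from pypi directly" goal above concrete, here is a deliberately naive sketch, not Package Control's implementation: it walks PyPI's requires_dist metadata breadth-first to collect a flat set of package names, ignoring environment markers, extras, and version constraints, all of which a real resolver would have to handle.

```python
# Naive sketch of recursive resolution via PyPI's requires_dist metadata.
# Markers, extras and version constraints are ignored; a real resolver must
# handle them (and name normalization).
import json
import re
import urllib.request

def requires_dist(name: str):
    url = "https://pypi.org/pypi/%s/json" % name
    with urllib.request.urlopen(url) as resp:
        info = json.load(resp)["info"]
    return info.get("requires_dist") or []

def resolve(roots):
    seen, queue = set(), list(roots)
    while queue:
        name = queue.pop(0).lower()
        if name in seen:
            continue
        seen.add(name)
        for req in requires_dist(name):
            if ";" in req:      # skip marker-guarded requirements (extras etc.)
                continue
            dep = re.split(r"[\s\[<>=!~(]", req, maxsplit=1)[0]
            if dep and dep.lower() not in seen:
                queue.append(dep)
    return seen

if __name__ == "__main__":
    print(sorted(resolve(["pydantic"])))
```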
Let me be explicit about what I see as the "ideal" experience for authors vending Sublime Text packages through Package Control and schema v4, given everything discussed thus far:
Under this model, I only need to register the GitHub URL for the […]. In order to realize this future though, Package Control needs a channel that adheres to the v4 schema for packages, and not just libraries.
Well, I'm still not convinced of pip, but I may have found a reason for plugin_host being "crashed" by pip (see: sublimehq/sublime_text#6406). That said, I did the following:
I haven't, however, checked what happens if a package is loaded and the related DLL is locked on Windows. This is a common issue preventing upgrades while plugin_host is running.
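One way to probe that concern, sketched here as an illustration only (the library path is an assumption matching earlier comments): list the modules already imported from the library folder, since their files, including compiled .pyd/.dll components on Windows, may be locked while plugin_host is running.

```python
# Sketch: report modules already imported from the shared library folder,
# whose files (including compiled extensions) may be locked on Windows.
# The folder path below is an assumption for illustration.
import os
import sys

def loaded_from(lib_dir: str):
    lib_dir = os.path.abspath(lib_dir)
    hits = []
    for name, module in list(sys.modules.items()):
        path = getattr(module, "__file__", None)
        if path and os.path.abspath(path).startswith(lib_dir + os.sep):
            hits.append((name, path))
    return hits

for name, path in loaded_from(os.path.expanduser("~/.config/sublime-text/Lib/python38")):
    print(name, "->", path)
```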
Hey @wbond @deathaxe @BenjaminSchaaf @FichteFoll
I'm creating this issue to discuss PC 4.0, specifically adding support for installing Python packages that plugins can depend on, and allowing the 3.8 plugin_host to be upgraded at some point.

I know very little about the PC source code, but I've read about design decisions and constraints, and how PC has evolved. From what I can tell: […]

Here are some thoughts on installable packages, on upgrading the plugin host, and on how to build installable packages in PC:

- […] pip to do this is not […]; pip is solid, sophisticated, extremely well-tested, etc.
- pip […], e.g. pip install --only-binary
- […] pip for the same Python version as the plugin host
- […] requirements.txt […] pip can install them and PC can put them on the plugin host's Python path, so they're importable by plugin code (see the sketch below)
- […] site-packages or wherever the old pip installed its packages, then reinstall them with the new pip
- […] --only-binary, and a plugin host that doesn't use a bleeding edge Python, this is unlikely to break packages, even those that depend on compiled code

On backwards compatibility with 3.3: […] pip means packages aren't supported for the 3.3 plugin host, but that's fine.

In summary, I think installable packages and a "dynamic" plugin host dovetail nicely, and are key to the future of ST plugin development. New plugin devs don't want to write code against obsolete Python versions, and vendor & patch obsolete Python packages.
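As a rough illustration of the "put them on the plugin host's Python path" bullet above (a sketch only; the install directory is hypothetical, and ST already adds Data/Lib/python38 to the path by itself, as noted elsewhere in this thread):

```python
# Sketch: make a hypothetical install directory importable by plugin code by
# prepending it to sys.path. ST already does this for Data/Lib/python38, so
# this would only matter for a separate, plugin-managed directory.
import os
import sys

def add_install_dir(path: str) -> None:
    path = os.path.abspath(path)
    if os.path.isdir(path) and path not in sys.path:
        sys.path.insert(0, path)

add_install_dir(os.path.expanduser("~/.config/sublime-text/Lib/plugin-deps"))
```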
Either one of these features (installable packages in PC, new plugin host) could be shipped on its own. If we plan on doing both of them, we'd first need to ship a PC release with existing (legacy) code for 3.3 plugins, and code for dynamic 3.X plugins with installable packages.
Changing the plugin host version every few years is bound to break some plugins, but installable packages make this less likely and less painful. The alternatives, running more and more plugin hosts or freezing the second plugin host at 3.8, are worse.
Four-point-oh notwithstanding, PC hasn't changed much in the last 3 years. If it stops getting better, people will eventually stop writing plugins. I love ST, I've used it and pretty much only it for my entire career, but I would have switched to something else if it weren't for LSP support, and that's a plugin.