[Tech debt] pip is not properly resolving on the internal PyPI when using extras
#1115
Is this a new bug in dbt-spark?
Current Behavior

When running `pip install dbt-spark[odbc,pyhive]` against the internal PyPI server, we are not getting the latest version of `dbt-spark`. However, when we break out the extras as separate dependencies (e.g. `pip install dbt-spark pyodbc pyhive`), we get the latest version of `dbt-spark`.

Expected Behavior
We should get the latest version of `dbt-spark` when installing `dbt-spark[odbc,pyhive]` from the internal PyPI, plus the public PyPI versions of `pyodbc` and `pyhive`. We suspect it has something to do with the extras and the direct dependency not being on the same PyPI server (the direct dependency is on the internal PyPI while the transitive dependencies are on the public PyPI). Ideally we can configure our internal index to mirror the desired dependencies from the public PyPI.
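For context, the reproduce steps below assume the internal PyPI is configured as an extra index. A sketch of that setup is shown here as a pip.conf fragment; the internal index URL is a placeholder, since the real URL is not part of this issue. Note that pip's documented behavior is to give `extra-index-url` sources no priority over the primary index: it considers candidates from all configured indexes together, which may be relevant to the resolution difference described above.

```ini
# pip.conf -- sketch only; the internal index URL below is a placeholder.
[global]
# Public PyPI remains the primary index.
index-url = https://pypi.org/simple
# Internal PyPI added as an extra index; pip gives it no special priority
# when choosing among candidate versions.
extra-index-url = https://pypi.internal.example/simple
```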
Steps To Reproduce

1. Run `pip install dbt-spark[odbc,pyhive]` with the internal PyPI as an extra index and note that an old version of `dbt-spark` is installed.
2. Run `pip install dbt-spark pyodbc pyhive` with the internal PyPI as an extra index and note that the latest version of `dbt-spark` is installed.

Relevant log output
No response
Environment
Additional Context
No response