
pip install tf-nightly[and-cuda] fails for recent nightlies #62035

Open
mattdangerw opened this issue Oct 2, 2023 · 7 comments
Labels: comp:gpu:tensorrt, comp:gpu, stat:awaiting tensorflower, TF2.14, type:bug

Comments

@mattdangerw
Member
mattdangerw commented Oct 2, 2023

Issue type

Bug

Have you reproduced the bug with TensorFlow Nightly?

Yes

Source

source

TensorFlow version

nightly

Custom code

No

OS platform and distribution

Google Colab

Mobile device

No response

Python version

3.10

Bazel version

No response

GCC/compiler version

No response

CUDA/cuDNN version

No response

GPU model and memory

No response

Current behavior?

The issue

Attempting to run pip install tf-nightly[and-cuda] will download a large number of nightly candidates before settling on one from mid-September (before TF bumped to CUDA 12).

Pinning a more recent version surfaces the underlying error:

pip install tf-nightly[and-cuda]==2.15.0.dev20231002
...
ERROR: Could not find a version that satisfies the requirement tensorrt-libs==8.6.1; extra == "and-cuda" (from tf-nightly[and-cuda]) (from versions: 9.0.0.post11.dev1, 9.0.0.post12.dev1, 9.0.1.post11.dev4, 9.0.1.post12.dev4)
ERROR: No matching distribution found for tensorrt-libs==8.6.1; extra == "and-cuda"

You can work around this with pip install tf-nightly[and-cuda] --extra-index-url https://pypi.nvidia.com.
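
For anyone trying the workaround, a quick sanity check after installing (a minimal sketch, assuming a CUDA-capable runtime such as a Colab GPU instance; the exact nightly version string will differ):

# Confirm the nightly imported and that the CUDA libraries were resolved.
import tensorflow as tf

print(tf.__version__)                          # e.g. 2.15.0-dev2023xxxx
print(tf.config.list_physical_devices("GPU"))  # non-empty list if the CUDA wheels installed correctly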

What should happen
pip install tf-nightly[and-cuda] should not conflict with itself, and recent nightlies should be installable from PyPI.

Standalone code to reproduce the issue

https://colab.research.google.com/gist/mattdangerw/00acd58e43aabe7f80a74d595788bd86/tf-nightly-and-cuda.ipynb

Relevant log output

No response

@mattdangerw
Member Author

I can reproduce this on Ubuntu and Colab.

This issue looks related to NVIDIA/TensorRT#2933.

This is potentially a simple fix: I am not sure we really need to pin tensorrt-bindings and tensorrt-libs directly. pip install tensorrt==8.6.1.post1 seems to pull in the dependent libraries just fine.
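
Roughly, the shape of the change I have in mind (a hypothetical sketch of the and-cuda extras list, not TensorFlow's actual setup.py; the names here are illustrative):

EXTRAS_REQUIRE = {
    "and-cuda": [
        # Instead of pinning tensorrt-bindings==8.6.1 and tensorrt-libs==8.6.1
        # directly (those exact versions do not resolve from PyPI alone), pin the
        # tensorrt meta-package and let pip pull in the libs/bindings transitively.
        "tensorrt==8.6.1.post1",
        # ... the nvidia-* CUDA 12 wheels would stay as they are ...
    ],
}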

@SuryanarayanaY SuryanarayanaY added TF2.14 For issues related to Tensorflow 2.14.x comp:gpu GPU related issues comp:gpu:tensorrt Issues specific to TensorRT labels Oct 3, 2023
@SuryanarayanaY
Collaborator

I have seen the same package, i.e. tensorflow[and-cuda]>=2.14.0, fail in a Python 3.11 environment, mainly due to a TensorRT issue with Python 3.11. Related issue: #61986.

@picobyte
picobyte commented Oct 3, 2023

It seems tensorrt-libs==8.6.1 does not exist on PyPI; it only installs if you do it like this:
pip install --no-cache-dir --extra-index-url https://pypi.nvidia.com tensorrt-libs==8.6.1

@NikhielRahulSingh

Hi, I am unable to install the TensorFlow nightly. Any suggestions?

I am currently running TensorFlow 2.13.1.

@SuryanarayanaY
Collaborator

@NikhielRahulSingh,

Could you please share the install command and the logs? Please note that the current nightly version is 2.16.0-dev20231018.

@RocketRider
RocketRider commented Oct 31, 2023

I ran into the same issue with version 2.15.0rc0. Please fix this for the release.
The workaround from @picobyte saved the day.

ERROR: Ignored the following versions that require a different python version: 0.28.0 Requires-Python >=3.7, <3.11; 1.21.2 Requires-Python >=3.7,<3.11; 1.21.3 Requires-Python >=3.7,<3.11; 1.21.4 Requires-Python >=3.7,<3.11; 1.21.5 Requires-Python >=3.7,<3.11; 1.21.6 Requires-Python >=3.7,<3.11; 1.6.2 Requires-Python >=3.7,<3.10; 1.6.3 Requires-Python >=3.7,<3.10; 1.7.0 Requires-Python >=3.7,<3.10; 1.7.1 Requires-Python >=3.7,<3.10; 1.7.2 Requires-Python >=3.7,<3.11; 1.7.3 Requires-Python >=3.7,<3.11; 1.8.0 Requires-Python >=3.8,<3.11; 1.8.0rc1 Requires-Python >=3.8,<3.11; 1.8.0rc2 Requires-Python >=3.8,<3.11; 1.8.0rc3 Requires-Python >=3.8,<3.11; 1.8.0rc4 Requires-Python >=3.8,<3.11; 1.8.1 Requires-Python >=3.8,<3.11; 1.9.5 Requires-Python >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, <3.7; 6.0.0 Requires-Python >=3.6, <3.10; 6.0.0a1.dev1606911628 Requires-Python >=3.6, <3.10; 6.0.1 Requires-Python >=3.6, <3.10; 6.0.2 Requires-Python >=3.6, <3.10; 6.0.3 Requires-Python >=3.6, <3.10; 6.0.4 Requires-Python >=3.6, <3.10; 6.1.0 Requires-Python >=3.6, <3.10; 6.1.1 Requires-Python >=3.6, <3.10; 6.1.2 Requires-Python >=3.6, <3.10; 6.1.3 Requires-Python >=3.6, <3.10; 6.2.0 Requires-Python >=3.6, <3.11; 6.2.1 Requires-Python >=3.6, <3.11; 6.2.2 Requires-Python >=3.6, <3.11; 6.2.2.1 Requires-Python >=3.6, <3.11; 6.2.3 Requires-Python >=3.6, <3.11; 6.2.4 Requires-Python >=3.6, <3.11; 6.3.0 Requires-Python <3.11,>=3.6; 6.3.1 Requires-Python <3.11,>=3.6; 6.3.2 Requires-Python <3.11,>=3.6; 6.4.0 Requires-Python <3.11,>=3.6
ERROR: Could not find a version that satisfies the requirement tensorrt-libs==8.6.1; extra == "and-cuda" (from tensorflow[and-cuda]) (from versions: 9.0.0.post11.dev1, 9.0.0.post12.dev1, 9.0.1.post11.dev4, 9.0.1.post12.dev4, 9.1.0.post11.dev4, 9.1.0.post12.dev4)
ERROR: No matching distribution found for tensorrt-libs==8.6.1; extra == "and-cuda"

@njzjz
Contributor
njzjz commented Nov 19, 2023

This issue still exists in the stable 2.15.0 version.

@sachinprasadhs sachinprasadhs added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Nov 28, 2023