
Crash in tf.raw_ops.ResizeNearestNeighbor/ResizeNearestNeighborGrad/ResizeArea/ResizeBicubic/ResizeBilinear #69322

Open
x0w3n opened this issue Jun 6, 2024 · 3 comments
Labels: comp:ops (OPs related issues) · stat:awaiting tensorflower (Status - Awaiting response from tensorflower) · TF 2.16 · type:bug (Bug)

Comments

@x0w3n commented Jun 6, 2024

Issue type

Bug

Have you reproduced the bug with TensorFlow Nightly?

Yes

Source

source

TensorFlow version

TensorFlow Nightly

Custom code

Yes

OS platform and distribution

Linux Ubuntu 22.04.3 LTS (x86_64)

Mobile device

No response

Python version

No response

Bazel version

No response

GCC/compiler version

No response

CUDA/cuDNN version

No response

GPU model and memory

No response

Current behavior?

On specific inputs, these APIs exhaust all available RAM and crash the session (Colab reports "The session crashed because it took up all available RAM."). The affected APIs are listed below:

  1. tf.raw_ops.ResizeBilinear
  2. tf.raw_ops.ResizeBicubic
  3. tf.raw_ops.ResizeArea
  4. tf.raw_ops.ResizeNearestNeighbor
  5. tf.raw_ops.ResizeNearestNeighborGrad

Standalone code to reproduce the issue

https://colab.research.google.com/drive/1a6B_lYMfENTbO4ifQMLYe8Hq6CHcc6hP?usp=sharing
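Since the Colab may not be accessible, here is a minimal sketch of the kind of input that triggers the crash. The exact shapes and dtype used in the Colab are unknown; the values below are assumptions chosen so the requested output buffer is tens of GB, on the order of the allocator warnings in the log below:

```python
import tensorflow as tf

# Assumed shapes: a tiny input resized to an enormous output.
# 1 * 120000 * 120000 * 1 float32 elements ~= 57.6 GB, comparable to
# the 51.5 GB allocation reported by the CPU allocator in the log.
images = tf.ones([1, 2, 2, 1], dtype=tf.float32)
size = tf.constant([120000, 120000], dtype=tf.int32)

# Any of the five ops behaves the same way; ResizeBilinear shown here.
# The allocator tries to grab the whole output buffer up front and the
# process is OOM-killed instead of raising a Python-level error.
out = tf.raw_ops.ResizeBilinear(images=images, size=size)
```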

Relevant log output

Timestamp	Level	Message
Jun 6, 2024, 6:57:43 PM	WARNING	WARNING:root:kernel d0444564-3fb8-4d42-ac83-1338e6842d5a restarted
Jun 6, 2024, 6:57:43 PM	INFO	KernelRestarter: restarting kernel (1/5), keep random ports
Jun 6, 2024, 6:57:25 PM	WARNING	2024-06-06 10:57:25.421101: W external/local_tsl/tsl/framework/cpu_allocator_impl.cc:83] Allocation of 51536461872 exceeds 10% of free system memory.
Jun 6, 2024, 6:57:21 PM	WARNING	2024-06-06 10:57:21.507228: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Jun 6, 2024, 6:57:17 PM	WARNING	To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
Jun 6, 2024, 6:57:17 PM	WARNING	2024-06-06 10:57:17.995922: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
Jun 6, 2024, 6:56:13 PM	WARNING	WARNING:root:kernel d0444564-3fb8-4d42-ac83-1338e6842d5a restarted
Jun 6, 2024, 6:56:13 PM	INFO	KernelRestarter: restarting kernel (1/5), keep random ports
Jun 6, 2024, 6:55:56 PM	WARNING	2024-06-06 10:55:56.764126: W external/local_tsl/tsl/framework/cpu_allocator_impl.cc:83] Allocation of 103072923744 exceeds 10% of free system memory.
Jun 6, 2024, 6:55:52 PM	WARNING	2024-06-06 10:55:52.277507: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
@VijayBonthu commented

Isn't this because the tensor is too big for the RAM? If you reduce the size to 20000, 20000, it should run without the resource-exhaustion error.
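For comparison, a sketch of that suggestion using the same assumed input as the repro sketch above: at 20000 × 20000 the float32 output is about 1.6 GB, which a typical Colab instance can hold:

```python
import tensorflow as tf

# 1 * 20000 * 20000 * 1 float32 elements = 1.6 GB: large, but allocatable.
images = tf.ones([1, 2, 2, 1], dtype=tf.float32)
out = tf.raw_ops.ResizeBilinear(
    images=images, size=tf.constant([20000, 20000], dtype=tf.int32))
print(out.shape)  # (1, 20000, 20000, 1)
```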

@x0w3n (Author) commented Jun 7, 2024

> Isn't this because the tensor is too big for the RAM? If you reduce the size to 20000, 20000, it should run without the resource-exhaustion error.

If the size parameter is reduced to 20000, the code does not crash; however, we recommend that TensorFlow throw a runtime error when the requested memory exceeds the available limit, instead of continuing to request memory until the process crashes.
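Until TensorFlow adds such a check, a user-side guard can approximate the requested behavior. This is a hypothetical helper (safe_resize_bilinear and max_bytes are not TensorFlow APIs), sketching the kind of pre-flight check the reporter is asking the op itself to perform:

```python
import tensorflow as tf

def safe_resize_bilinear(images, size, max_bytes=8 * 1024**3):
    """Hypothetical guard: refuse resizes whose output buffer would
    exceed max_bytes, instead of letting the allocator OOM-kill the
    process. A user-side sketch, not a TensorFlow API."""
    n, _, _, c = images.shape
    h, w = int(size[0]), int(size[1])
    needed = n * h * w * c * images.dtype.size
    if needed > max_bytes:
        raise tf.errors.ResourceExhaustedError(
            None, None,
            f"resize output would need {needed} bytes (> {max_bytes})")
    return tf.raw_ops.ResizeBilinear(images=images, size=size)

# Raises ResourceExhaustedError instead of crashing the kernel:
# safe_resize_bilinear(tf.ones([1, 2, 2, 1]), [120000, 120000])
```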

Venkat6871 added the comp:ops (OPs related issues) and TF 2.16 labels and removed the TF 2.16 label on Jun 7, 2024
@Venkat6871 commented Jun 7, 2024

Hi @x0w3n ,

  • I was able to reproduce the issue on Colab using TF v2.15 and TF-nightly. Please find the gist here for reference.

Thank you!

Venkat6871 added the TF 2.16 and stat:awaiting tensorflower (Status - Awaiting response from tensorflower) labels on Jun 11, 2024