Character-level seq2seq model for translation and beam search. #56337
Comments
Hi @VallabhMahajan1, could you share reproducible code that supports your statement so that the issue can be more easily understood? Thank you!
I changed one parameter, `char_level=True`, in the tf.keras Tokenizer used inside `class NMTDataset:`.
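For context, setting `char_level=True` on the tf.keras Tokenizer makes it assign an integer id to each character rather than each word, so sequences get much longer while the vocabulary shrinks. A minimal pure-Python sketch of that behavior (an illustration, not the actual Keras implementation):

```python
def fit_char_vocab(texts):
    # Build a character-level vocabulary; index 0 is reserved,
    # mirroring how tf.keras Tokenizer reserves 0 for padding.
    vocab = {}
    for text in texts:
        for ch in text.lower():
            if ch not in vocab:
                vocab[ch] = len(vocab) + 1
    return vocab

def texts_to_char_sequences(texts, vocab):
    # Each sentence becomes a sequence of character ids instead of word ids.
    return [[vocab[ch] for ch in text.lower() if ch in vocab] for text in texts]

vocab = fit_char_vocab(["go now", "wow"])
seqs = texts_to_char_sequences(["go now"], vocab)
# vocab covers only 5 distinct characters: g, o, space, n, w
```

With the real Tokenizer this is roughly `Tokenizer(char_level=True)` followed by `fit_on_texts` and `texts_to_sequences`; the rest of the NMT pipeline (padding, embedding, decoding) is unchanged except that sequence lengths grow.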
The tutorial describes Neural Machine Translation (NMT) with a word-level sequence-to-sequence model using TF Addons. Please refer to this tutorial for character-level sequence-to-sequence modeling in TensorFlow.
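On the beam search part of the request: beam search itself is independent of whether decoding is word-level or character-level; only the step function's vocabulary changes. A minimal pure-Python sketch (toy probability table, hypothetical `step_probs` interface, not the TF Addons `BeamSearchDecoder` API):

```python
import math

def beam_search(step_probs, start_token, end_token, beam_width=3, max_len=10):
    """Generic beam search.

    step_probs(prefix) -> dict mapping each next token to its probability.
    Returns the highest-scoring sequence found (list of tokens).
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-probability)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:
                finished.append((seq, score))  # finished beams stop expanding
                continue
            for tok, p in step_probs(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        if not candidates:
            break
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]  # keep only the top beam_width prefixes
    finished.extend(b for b in beams if b[0][-1] == end_token)
    if not finished:
        finished = beams  # nothing reached end_token within max_len
    return max(finished, key=lambda c: c[1])[0]

# Toy character-level "model": next-char probabilities depend on the last char.
table = {
    "<s>": {"h": 0.6, "x": 0.4},
    "h": {"i": 0.9, "o": 0.1},
    "x": {"i": 0.5, "</s>": 0.5},
    "i": {"</s>": 1.0},
    "o": {"</s>": 1.0},
}
best = beam_search(lambda seq: table[seq[-1]], "<s>", "</s>")
# best == ["<s>", "h", "i", "</s>"]
```

In a real character-level NMT model, `step_probs` would run one decoder step and return the softmax over the character vocabulary for each beam.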
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further. |
Issue Type: Documentation Feature Request
Source: source
Tensorflow Version: tf 2.8
Custom Code: Yes
OS Platform and Distribution: Colab GPU
Mobile device: No response
Python version: No response
Bazel version: No response
GCC/Compiler version: No response
CUDA/cuDNN version: No response
GPU model and memory: No response
Current Behaviour?
Standalone code to reproduce the issue
Relevant log output