TSNE sets kl_divergence wrong #6507
Yes, that seems to be an error. It was introduced during the merge of master and the Barnes-Hut SNE branch.
Ok, then I will create a PR to fix it. :)
ssaeger pushed a commit to ssaeger/scikit-learn that referenced this issue on Mar 21, 2016.
ssaeger pushed a commit to ssaeger/scikit-learn that referenced this issue on Mar 22, 2016.
jnothman pushed a commit to jnothman/scikit-learn that referenced this issue on Mar 23, 2017.
In commit 6bf63f6, `error` was replaced by `kl_divergence`, but line 850 was missed, so the final `kl_divergence` value is actually an old value that misses the final optimization step. This is especially a problem since people can access the `kl_divergence` in the current version, as described in #6477. Even in the verbose output one can see that the error/kl_divergence value from iteration 100 is kept as the final value at iteration 200.
If this is confirmed to be a bug, I will create a PR for it.
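
For reference, a minimal reproduction sketch, assuming a scikit-learn build where `kl_divergence_` is exposed as described in #6477. With `verbose=2` the optimizer prints the error at each evaluation, so the stored attribute can be compared against the last value printed in the log; with the bug, the attribute still holds the value recorded before the final optimization run.

```python
import numpy as np
from sklearn.manifold import TSNE

# Small random dataset; enough to run both optimization phases.
rng = np.random.RandomState(0)
X = rng.randn(100, 20)

tsne = TSNE(n_components=2, verbose=2, random_state=0)
tsne.fit_transform(X)

# With the bug, this attribute holds the KL divergence from before the
# final optimization run (e.g. iteration 100) instead of the last value
# printed in the verbose output.
print(tsne.kl_divergence_)
```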