
Commit

revert
jancervenka committed Sep 23, 2022
1 parent bab3160 commit 8aafb15
Showing 1 changed file with 4 additions and 2 deletions.

site/en/guide/core/optimizers_core.ipynb
@@ -13,7 +13,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "cellView": "form",
"id": "AwOEIRJC6Une"
},
"outputs": [],
@@ -592,7 +591,10 @@
"This notebook introduced the basics of writing and comparing optimizers with the [TensorFlow Core APIs](https://www.tensorflow.org/guide/core). Although prebuilt optimizers like Adam are generalizable, they may not always be the best choice for every model or dataset. Having fine-grained control over the optimization process can help streamline ML training workflows and improve overall performance. Refer to the following documentation for more examples of custom optimizers:\n",
"\n",
"* This Adam optimizer is used in the [Multilayer perceptrons](https://www.tensorflow.org/guide/core/mlp_core) tutorial and the [Distributed training]()\n",
- "* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n"
+ "* [Model Garden](https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html) has a variety of [custom optimizers](https://github.com/tensorflow/models/tree/master/official/modeling/optimization) written with the Core APIs.\n",
+ "\n",
+ "\n",
+ "\n"
]
}
],
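The notebook text in this diff describes writing custom optimizers with the TensorFlow Core APIs by applying gradients to variables directly. As a minimal sketch of that idea (not code from the notebook itself — the `GradientDescent` class name, `learning_rate` parameter, and `apply_gradients` method shown here are illustrative assumptions), a bare-bones optimizer might look like:

```python
import tensorflow as tf


class GradientDescent(tf.Module):
    """Illustrative custom optimizer: plain gradient descent (sketch, not the notebook's code)."""

    def __init__(self, learning_rate=0.1):
        # Hypothetical hyperparameter name; the notebook may use a different signature.
        self.learning_rate = learning_rate

    def apply_gradients(self, grads, variables):
        # Update each variable in place: v <- v - lr * grad(v)
        for grad, var in zip(grads, variables):
            var.assign_sub(self.learning_rate * grad)


# Minimize f(x) = x^2 starting from x = 3.0
x = tf.Variable(3.0)
optimizer = GradientDescent(learning_rate=0.1)

for _ in range(100):
    with tf.GradientTape() as tape:
        loss = x * x
    grads = tape.gradient(loss, [x])
    optimizer.apply_gradients(grads, [x])
```

After 100 steps, `x` contracts by a factor of 0.8 per step (since the update is `x - 0.1 * 2x`), so it converges toward the minimizer at 0. The Model Garden optimizers linked above follow a similar pattern with considerably more machinery (state variables, learning-rate schedules).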
