If the accuracy does not improve after a few iterations with Adagrad, try changing the default learning rate (see https://keras.io/optimizers/).
In my case, lowering the default lr to 0.0006 worked.
For Adadelta, the default lr is fine.
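In Keras, the learning rate is passed to the optimizer's constructor (`lr=` in older versions, `learning_rate=` in recent ones), e.g. `keras.optimizers.Adagrad(lr=0.0006)`. To see why the value matters, here is a minimal pure-Python sketch of the Adagrad update rule (illustrative only, not the Keras implementation; the epsilon value is an assumption):

```python
import math

def adagrad_step(param, grad, accum, lr=0.0006, eps=1e-7):
    """One Adagrad update for a single scalar parameter.

    Adagrad accumulates the squared gradients and divides the step
    by the square root of that running sum, so the effective step
    size shrinks over time -- which is why the initial lr matters.
    """
    accum += grad * grad
    param -= lr * grad / (math.sqrt(accum) + eps)
    return param, accum

# Example: first update with grad 0.5 moves the parameter by ~lr,
# since grad / sqrt(grad**2) == 1 on the very first step.
p, acc = adagrad_step(param=1.0, grad=0.5, accum=0.0, lr=0.0006)
```

With the default lr of 0.01, that first step would be roughly 17x larger, which can overshoot and stall accuracy on some problems.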
Reposted from: https://www.cnblogs.com/loveSH/p/9724794.html