Dropout (neural networks)

Dropout is a regularization technique for reducing overfitting in neural networks by preventing complex co-adaptations of units on the training data. It is an efficient way of performing approximate model averaging with neural networks.[1] The term "dropout" refers to temporarily dropping out units (both hidden and visible) from the network during training.[2]
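
As a rough illustration, the following is a minimal NumPy sketch of the common "inverted dropout" formulation. The function name, the drop probability p, and the train-time rescaling convention are illustrative assumptions rather than details taken from the cited papers (which instead scale activations or weights at test time):

    import numpy as np

    def dropout(x, p=0.5, training=True, rng=None):
        # Inverted dropout: during training, zero each unit with
        # probability p and scale survivors by 1/(1-p) so that the
        # expected activation matches test-time behaviour.
        if not training or p == 0.0:
            return x  # at test time the full network is used unchanged
        if rng is None:
            rng = np.random.default_rng()
        mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
        return x * mask / (1.0 - p)

    # Usage: apply to a hidden-layer activation during training.
    h = np.random.default_rng(0).standard_normal((4, 8))
    h_train = dropout(h, p=0.5, training=True)
    h_test = dropout(h, p=0.5, training=False)  # identity at test time

Because a different random mask is sampled on each training step, training effectively samples a different "thinned" sub-network every time, which is what gives dropout its model-averaging interpretation.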

References

  1. Hinton, Geoffrey E.; Srivastava, Nitish; Krizhevsky, Alex; Sutskever, Ilya; Salakhutdinov, Ruslan R. (2012). "Improving neural networks by preventing co-adaptation of feature detectors". arXiv:1207.0580 [cs.NE].
  2. Srivastava, Nitish; Hinton, Geoffrey; Krizhevsky, Alex; Sutskever, Ilya; Salakhutdinov, Ruslan (2014). "Dropout: A Simple Way to Prevent Neural Networks from Overfitting". Journal of Machine Learning Research. 15: 1929–1958. jmlr.org. Retrieved July 26, 2015.
