Dilution (neural networks)

Dropout and dilution (also called DropConnect) are regularization techniques for reducing overfitting in artificial neural networks by preventing complex co-adaptations on training data. They are an efficient way of performing model averaging with neural networks. Dilution refers to randomly setting individual weights to zero (or scaling them towards zero), while dropout refers to randomly setting the outputs of hidden neurons to zero. Both are usually applied during the training process of a neural network and disabled during inference.
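A minimal NumPy sketch may help illustrate the distinction between the two techniques. The function names, the rate p, and the rescaling by 1/(1 − p) (the common "inverted dropout" convention, here also applied to dilution as a simplification of DropConnect's original inference scheme) are illustrative choices, not part of the original formulations:

import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p, training=True):
    # Dropout: zero each activation with probability p during training.
    # Survivors are scaled by 1/(1 - p) so the expected activation is
    # unchanged, allowing the network to be used as-is at inference.
    if not training or p == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

def dilute_weights(weights, p, training=True):
    # Dilution / DropConnect: zero each individual weight (connection)
    # with probability p during training. The 1/(1 - p) rescaling is an
    # illustrative simplification, not DropConnect's original inference rule.
    if not training or p == 0.0:
        return weights
    mask = rng.random(weights.shape) >= p
    return weights * mask / (1.0 - p)

# Example: one hidden layer with both techniques applied during training.
x = rng.standard_normal((4, 8))                     # batch of 4 inputs, 8 features
W = rng.standard_normal((8, 16))                    # hidden-layer weight matrix
h = np.maximum(x @ dilute_weights(W, p=0.5), 0.0)   # diluted weights, ReLU
h = dropout(h, p=0.5)                               # dropped-out hidden outputs

At inference, both functions are called with training=False, so the full weight matrix and all activations are used; the masks exist only during training.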