I am wondering whether I can simply apply dropout to convolutions in TensorFlow. How would it be applied? Are the weights of the convolution kernel randomly set to zero while it 'slides' over the input?
You can apply dropout on arbitrary input tensors. How this input was computed doesn't matter; each element of the input will simply either be kept (and scaled, see below) or set to zero.
From the documentation: with probability `keep_prob`, outputs the input element scaled up by `1 / keep_prob`, otherwise outputs `0`. The scaling is so that the expected sum is unchanged.
By default, each element is kept or dropped independently.
```python
conv = tf.nn.conv2d(...)
drop = tf.nn.dropout(conv, keep_prob=0.5)
```
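To make the keep-or-scale rule concrete, here is a minimal NumPy sketch of the same element-wise behavior (an illustration of the math, not TensorFlow's actual implementation; the `dropout` helper and its arguments are made up for this example):

```python
import numpy as np

def dropout(x, keep_prob, rng):
    # Each element is kept independently with probability keep_prob;
    # kept elements are scaled by 1 / keep_prob, dropped ones become 0.
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

rng = np.random.default_rng(0)
x = np.ones((1000, 1000))
y = dropout(x, keep_prob=0.5, rng=rng)

# Surviving elements are 2.0 (= 1 / 0.5), the rest are 0.0,
# so the mean stays close to the original mean of 1.0.
print(y.mean())
```

It makes no difference whether `x` came from a convolution, a dense layer, or anywhere else: dropout only sees a tensor of values and zeroes each one independently.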