Question

A 2013 paper by Ian Goodfellow et al. introduced a technique named analogously to dropout in which layers that perform this operation are added to a neural network. The LogSumExp, or LSE, function essentially performs a smooth version of this operation. This is the most common operation used for pooling in convolutional neural networks. The rectifier function, which is by far the most popular activation function in neural networks today, performs this operation on the two arguments of (*) 0 and the input. The L-infinity norm of a vector is equal to this operation applied to its components. Generalizing the logistic function to multiple dimensions produces a function called “soft [this operation]”. For 10 points, name the larger of the two values that are subtracted to find the range of a dataset. ■END■

ANSWER: maximum [accept descriptions like “the biggest value”; accept softmax or argmax; accept supremum]
<AW>
