Question

Raymond Mooney is known for an expletive-laden quip arguing against using these objects. A log-bilinear model named for these objects attempts to learn the logarithm of the ratio of co-occurrence probabilities. A team at Stanford led by Jeffrey Pennington created a model named for “global” examples of these objects. A model whose name includes an abbreviation of these objects uses two contrasting approaches called CBOW and skip-gram. For maximum margin (*) separators, data that lie exactly on the margin are partly named for these objects. A word embedding model partly named for these objects is usually illustrated with the example of solving analogies. A classification technique that can use the kernel trick is named for the “support” type of these objects. For 10 points, a semantic model beginning “word2” is named for what mathematical objects and might calculate similarity by taking their dot product?

ANSWER: vectors [accept support vectors; accept global vectors or GloVe; accept word2vec; accept support vector machines after “kernel trick” and reject beforehand; reject “arrays” or “lists”] (The Raymond Mooney quote is “You can't cram the meaning of a whole %&!$# sentence into a single $&!#* vector!”)
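(The giveaway's "similarity by taking their dot product" can be sketched in a few lines. This is an illustrative toy, not real word2vec output: the 3-dimensional vectors below are made up for demonstration, whereas learned embeddings typically have 100-300 dimensions, and similarity is usually the length-normalized dot product, i.e. cosine similarity.)

```python
import math

def dot(u, v):
    # Plain dot product of two equal-length vectors.
    return sum(a * b for a, b in zip(u, v))

def cosine_similarity(u, v):
    # Normalized dot product: the usual similarity measure for word embeddings.
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

# Hypothetical 3-d "embeddings" chosen so related words point in similar directions.
king  = [0.8, 0.3, 0.1]
queen = [0.7, 0.4, 0.2]
apple = [0.1, 0.9, 0.7]

print(cosine_similarity(king, queen))  # high: related words
print(cosine_similarity(king, apple))  # lower: unrelated words
```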
<JX>