With this demo, you can solve the classic analogy task, which is often used to evaluate and showcase the strength of word embeddings. An analogy query has the form "A is to B as C is to ...", and the model has to predict D (e.g. "man is to king as woman is to ... queen"). With word embeddings, this is typically solved by finding the word whose vector has the highest cosine similarity to B - A + C. Usually the input words are explicitly excluded from the output; in this demo we do not apply that restriction, so A, B, or C can also be returned. For more information, see the paper.
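As a minimal sketch of how such a query can be answered (using a tiny hypothetical embedding table rather than real pretrained vectors), the standard additive method ranks all vocabulary words by their cosine similarity to B - A + C. The `exclude_inputs` flag shows the restriction this demo deliberately turns off:

```python
import numpy as np

# Toy 3-d embedding table, for illustration only (real embeddings
# would come from a pretrained model such as word2vec).
emb = {
    "man":   np.array([1.0, 0.0, 0.0]),
    "woman": np.array([1.0, 1.0, 0.0]),
    "king":  np.array([1.0, 0.0, 1.0]),
    "queen": np.array([1.0, 1.0, 1.0]),
}

def analogy(a, b, c, exclude_inputs=False):
    """Return the word D completing "a is to b as c is to D".

    Ranks every vocabulary word by cosine similarity to b - a + c.
    With exclude_inputs=False (as in this demo), a, b, or c may be returned.
    """
    target = emb[b] - emb[a] + emb[c]
    target = target / np.linalg.norm(target)
    best_word, best_sim = None, -np.inf
    for word, vec in emb.items():
        if exclude_inputs and word in (a, b, c):
            continue
        sim = float(np.dot(target, vec / np.linalg.norm(vec)))
        if sim > best_sim:
            best_word, best_sim = word, sim
    return best_word

print(analogy("man", "king", "woman"))  # → queen (in this toy table)
```

With realistic embeddings and no input-word exclusion, the nearest neighbour of B - A + C is often B or C itself, which is exactly the behaviour this demo lets you inspect.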
We include three sets of pretrained embeddings: the GoogleNews embeddings, the Reddit embeddings from Manzini et al. (w2v_0), and embeddings retrained on the same Reddit data but with the default word2vec settings.
Pick an example, or try your own analogy below!