Monday, June 26, 2017

The Idea That Words Can Be Represented As Vectors


quora | Clarification: The idea itself isn’t recent, but a particular implementation of it (word2vec) is, and it opened the floodgates for applications in many fields involving text and speech.

For example, the word “house” may be represented as [1, 4, 2, 3], “bike” as [6, 3, 4, 7] and so on. The two papers (here and here) explain how the vectors can be built simply by using any large body of text (all of Wikipedia, for example). In practice the vectors are much longer than these toy examples, of course: typically a few hundred numbers per word.
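As a rough sketch of what such a lookup and comparison might look like (the numbers and words below are invented purely for illustration, not taken from any trained model), here is a small Python example using cosine similarity, the usual way of measuring how close two word vectors are:

```python
import numpy as np

# Toy embedding table: each word maps to a vector of numbers.
# Real models learn a few hundred dimensions from a large corpus;
# these 4-dimensional vectors are made up for illustration only.
embeddings = {
    "house": np.array([1.0, 4.0, 2.0, 3.0]),
    "home":  np.array([1.2, 3.8, 2.1, 2.9]),
    "bike":  np.array([6.0, 3.0, 4.0, 7.0]),
}

def cosine_similarity(a, b):
    # Close to 1.0 means the vectors point the same way; lower values mean less related.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["house"], embeddings["home"]))  # high
print(cosine_similarity(embeddings["house"], embeddings["bike"]))  # lower
```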
Now for the fun part. If the vectors are built correctly for every word in the English vocabulary, something amazing happens when you perform simple arithmetic operations on them:
If you compute “King” - “Man” + “Woman”, you get the vector corresponding to.. wait for it.. wait some more, because this is going to blow your mind.. “Queen”!
Similarly,
“Windows” - “Microsoft” + “Google” will give “Android”
“Scientist” - “Einstein” + “Messi” will give “Midfielder”
“Paris” - “France” + “Italy” will give “Rome”
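If you want to try these analogies yourself, one convenient route is the gensim library, which can download a set of pretrained word vectors and do the arithmetic for you. A minimal sketch, assuming gensim is installed and using its published "glove-wiki-gigaword-100" download (any pretrained word-vector set would work similarly):

```python
import gensim.downloader as api

# Download and load pretrained word vectors (lower-cased vocabulary).
vectors = api.load("glove-wiki-gigaword-100")

# "King" - "Man" + "Woman": added words go in `positive`,
# the subtracted word goes in `negative`.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

# "Paris" - "France" + "Italy"
print(vectors.most_similar(positive=["paris", "italy"], negative=["france"], topn=3))
```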


Also, synonyms end up having very similar vectors. Keep in mind that all of this is learnt without any preexisting “knowledge”, simply by looking at millions of English sentences and nothing else.
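Continuing with the vectors object loaded in the sketch above (an assumption carried over from that sketch), nearest-neighbour and pairwise-similarity queries give a quick way to see this closeness for yourself:

```python
# Nearest neighbours of a word: closely related words, often near-synonyms.
print(vectors.most_similar("house", topn=5))

# Pairwise similarity: a related pair should score higher than an unrelated one.
print(vectors.similarity("car", "automobile"))
print(vectors.similarity("car", "banana"))
```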
And this idea opened the floodgates for all kinds of applications, ranging from chatbots, personal assistants, question answering and language translation to uses in medicine, law, retail, and beyond.
It is difficult to find a field involving text or speech that cannot make use of this breakthrough idea.
