Talk:Word embedding
Re. "Most new word embedding techniques rely on a neural network architecture instead of more traditional n-gram models and unsupervised learning"--Can someone rewrite this to indicate whether it is "rely on (a neural network architecture) instead of (more traditional n-gram models and unsupervised learning)" or "rely on (a neural network architecture (instead of more traditional n-gram models) and unsupervised learning)"? Philgoetz (talk) 21:56, 21 February 2018 (UTC)
This article is or was the subject of a Wiki Education Foundation-supported course assignment. Further details are available on the course page. Assigned student editor(s): Linzhuoli. Assigned peer reviews: Akozlowski.
I am considering adding a section introducing basic approaches to word embedding. --Linzhuoli (talk) 15:29, 20 May 2016 (UTC)
I think this article would benefit from greater description of the different word embedding tools that exist and how they differ from each other. Perhaps it could be restructured such that word2vec has a subsection, GloVe has a subsection, etc. Akozlowski (talk) 22:20, 9 June 2016 (UTC)
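To make the proposed word2vec/GloVe comparison concrete, here is a minimal sketch of how the two tools differ in practice. It assumes the gensim library; the toy corpus, the parameter values, and the pretrained GloVe model name are illustrative assumptions, not content drawn from the article:

```python
# A minimal sketch of how the two tools differ in use. The corpus and all
# parameters are toy/illustrative; the pretrained model name is an assumption
# about what gensim's downloader provides.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
]

# word2vec trains embeddings by predicting words from their local context window
w2v = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, epochs=50)
print(w2v.wv["king"].shape)           # a 50-dimensional dense vector
print(w2v.wv.most_similar("king"))    # nearest neighbours in the vector space

# GloVe vectors are instead fit to global word-word co-occurrence counts and
# are typically used pretrained, e.g. via gensim's downloader
import gensim.downloader as api
glove = api.load("glove-wiki-gigaword-50")  # assumed model name
print(glove.most_similar("king"))
```

A restructured article could foreground exactly this design difference: word2vec optimizes a local context-prediction objective, while GloVe is fit directly to global co-occurrence statistics.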
I am thinking of adding more content to the thought vector section and adding some images to illustrate word embedding. Chinoyhardik (talk) 20:14, 5 October 2017 (UTC)
WikiProject Computing (Rated Stub-class, Low-importance)