Talk:Word embedding

Re. "Most new word embedding techniques rely on a neural network architecture instead of more traditional n-gram models and unsupervised learning"--Can someone rewrite this to indicate whether it is "rely on (a neural network architecture) instead of (more traditional n-gram models and unsupervised learning)" or "rely on (a neural network architecture (instead of more traditional n-gram models) and unsupervised learning)"? Philgoetz (talk) 21:56, 21 February 2018 (UTC)

I am considering adding a section introducing basic approaches to word embedding. --Linzhuoli (talk) 15:29, 20 May 2016 (UTC)

I think this article would benefit from greater description of the different word embedding tools that exist and how they differ from each other. Perhaps it could be restructured such that word2vec has a subsection, GloVe has a subsection, etc. Akozlowski (talk) 22:20, 9 June 2016 (UTC)

I am thinking of adding more content to the thought vector section and adding some images to illustrate word embedding. Chinoyhardik (talk) 20:14, 5 October 2017 (UTC)

WikiProject Computing (Rated Stub-class, Low-importance)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Stub-Class on the project's quality scale.
This article has been rated as Low-importance on the project's importance scale.