Distributional semantics using neural networks: technical report no. DCSE/TR-2016-04

dc.contributor.authorSvoboda, Lukáš
dc.date.accessioned2017-02-27T10:02:47Z
dc.date.available2017-02-27T10:02:47Z
dc.date.issued2016
dc.description.abstract-translatedIn recent years, neural networks have shown substantial improvements in capturing the semantics of words and sentences. They have also improved language modeling, which is crucial for many tasks in Natural Language Processing (NLP). One of the most widely used Artificial Neural Network (ANN) architectures in NLP is the Recurrent Neural Network (RNN), which does not rely on a fixed-size context. Through recurrent connections, information can cycle inside these networks for an arbitrarily long time. This thesis summarizes the state-of-the-art approaches to distributional semantics. The thesis also focuses on further applications of ANNs to NLP problems.en
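The recurrent connections mentioned in the abstract can be illustrated with a single hidden-state update, h_t = tanh(W_xh·x_t + W_hh·h_{t-1} + b): the hidden state is fed back at every step, so the usable context is not bounded by a fixed window. The sketch below is illustrative only; the weights and dimensions are toy values, not taken from the report.

```python
import math

def rnn_step(x, h_prev, W_xh, W_hh, b):
    """One recurrent step: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b).

    Because h_prev is produced by the previous call, information can
    cycle through the network for arbitrarily many time steps.
    """
    h = []
    for i in range(len(h_prev)):
        s = b[i]
        s += sum(W_xh[i][j] * x[j] for j in range(len(x)))
        s += sum(W_hh[i][j] * h_prev[j] for j in range(len(h_prev)))
        h.append(math.tanh(s))
    return h

# Toy 2x2 weights (hypothetical values, for illustration only).
W_xh = [[0.5, -0.2], [0.1, 0.3]]
W_hh = [[0.0, 0.4], [-0.3, 0.2]]
b = [0.0, 0.1]

h = [0.0, 0.0]                       # initial hidden state
for x in [[1.0, 0.0], [0.0, 1.0]]:   # a two-step input sequence
    h = rnn_step(x, h, W_xh, W_hh, b)
```

After the loop, `h` summarizes the whole sequence seen so far, which is what an RNN language model conditions on when predicting the next word.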
dc.format47 s.cs
dc.format.mimetypeapplication/pdf
dc.identifier.urihttp://www.kiv.zcu.cz/cz/vyzkum/publikace/technicke-zpravy/
dc.identifier.urihttp://hdl.handle.net/11025/25377
dc.language.isoenen
dc.publisherUniversity of West Bohemiaen
dc.rights© University of West Bohemia in Pilsenen
dc.rights.accessopenAccessen
dc.subjectneuronové sítěcs
dc.subjectsémantikacs
dc.subjectzpracování přirozeného jazykacs
dc.subject.translatedneural networksen
dc.subject.translatedsemanticsen
dc.subject.translatednatural language processingen
dc.titleDistributional semantics using neural networks: technical report no. DCSE/TR-2016-04en
dc.typezprávacs
dc.typereporten
dc.type.versionpublishedVersionen

Files

Original bundle
Name:
Svoboda.pdf
Size:
796.49 KB
Format:
Adobe Portable Document Format
Description:
Full text
License bundle
Name:
license.txt
Size:
1.71 KB
Format:
Item-specific license agreed upon at submission
Description:
