Alternate Low-Rank Matrix Approximation in Latent Semantic Analysis

dc.contributor.author  Horasan, Fahrettin
dc.contributor.author  Erbay, Hasan
dc.contributor.author  Varcin, Fatih
dc.contributor.author  Deniz, Emre
dc.date.accessioned  2020-06-25T18:34:38Z
dc.date.available  2020-06-25T18:34:38Z
dc.date.issued  2019
dc.department  Kırıkkale Üniversitesi
dc.description  Horasan, Fahrettin/0000-0003-4554-9083; Erbay, Hasan/0000-0002-7555-541X; Deniz, Emre/0000-0003-0546-4229; Varcin, Fatih/0000-0002-5100-3012
dc.description.abstract  The latent semantic analysis (LSA) is a mathematical/statistical way of discovering hidden concepts between terms and documents or within a document collection (i.e., a large corpus of text). Each term and each document of the corpus is expressed as a vector with elements corresponding to these concepts, and together they form a term-document matrix. Then, the LSA uses a low-rank approximation of the term-document matrix to remove irrelevant information, to extract the more important relations, and to reduce the computational time. The irrelevant information, called noise, does not have a noteworthy effect on the meaning of the document collection, and removing it is an essential step in the LSA. The singular value decomposition (SVD) has been the main tool for obtaining the low-rank approximation in the LSA. Since the document collection is dynamic (i.e., the term-document matrix is subject to repeated updates), the approximation must be renewed, either by recomputing the SVD or by updating it. However, the computational cost of recomputing or updating the SVD of the term-document matrix is very high when new terms and/or documents are added to a preexisting document collection. This issue has opened the door to using other matrix decompositions in the LSA, such as ULV- and URV-based decompositions. This study shows that the truncated ULV decomposition (TULVD) is a good alternative to the SVD in LSA modeling.  en_US
dc.description.sponsorship  Kirikkale University Scientific Research Projects (BAP), Kirikkale University [2016/150]  en_US
dc.description.sponsorship  This study was supported by Kirikkale University Scientific Research Projects (BAP) under project 2016/150.  en_US
dc.identifier.citation  Fahrettin Horasan, Hasan Erbay, Fatih Varçın, Emre Deniz, "Alternate Low-Rank Matrix Approximation in Latent Semantic Analysis", Scientific Programming, vol. 2019, pp. 1-12, 2019.  en_US
dc.identifier.doi  10.1155/2019/1095643
dc.identifier.issn  1058-9244
dc.identifier.issn  1875-919X
dc.identifier.scopus  2-s2.0-85062328189
dc.identifier.scopusquality  N/A
dc.identifier.uri  https://doi.org/10.1155/2019/1095643
dc.identifier.uri  https://hdl.handle.net/20.500.12587/7983
dc.identifier.volume  2019  en_US
dc.identifier.wos  WOS:000459097200001
dc.identifier.wosquality  Q4
dc.indekslendigikaynak  Web of Science
dc.indekslendigikaynak  Scopus
dc.language.iso  en
dc.publisher  Hindawi Ltd  en_US
dc.relation.ispartof  Scientific Programming
dc.relation.publicationcategory  Article - International Refereed Journal - Institutional Academic Staff  en_US
dc.rights  info:eu-repo/semantics/openAccess  en_US
dc.title  Alternate Low-Rank Matrix Approximation in Latent Semantic Analysis  en_US
dc.type  Article
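The abstract above describes the core LSA pipeline: build a term-document matrix, replace it with a rank-k approximation to suppress noise, and compare queries and documents in the reduced concept space. The sketch below is a minimal illustration of that pipeline using the standard truncated SVD rather than the paper's TULVD; the toy corpus, the rank k = 2, and the query string are assumptions introduced purely for illustration, not data or code from the paper.

```python
# Minimal LSA sketch: term-document matrix -> rank-k truncated SVD ->
# query/document comparison in the concept space.
# The corpus, rank, and query are illustrative assumptions.
import numpy as np

corpus = [
    "low rank matrix approximation",
    "singular value decomposition of the term document matrix",
    "latent semantic analysis finds hidden concepts",
    "updating the decomposition when new documents are added",
]

# Term-document matrix A (terms x documents) with raw term counts.
vocab = sorted({w for doc in corpus for w in doc.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(corpus)))
for j, doc in enumerate(corpus):
    for w in doc.split():
        A[index[w], j] += 1.0

# Rank-k truncated SVD: A ~= U_k diag(s_k) V_k^T (the low-rank approximation).
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
U_k, s_k, Vt_k = U[:, :k], s[:k], Vt[:k, :]

# Documents as columns in the k-dimensional concept space.
docs_k = np.diag(s_k) @ Vt_k

# Fold a query into the same space: q_k = diag(s_k)^{-1} U_k^T q.
q = np.zeros(len(vocab))
for w in "matrix decomposition".split():
    if w in index:
        q[index[w]] += 1.0
q_k = np.diag(1.0 / s_k) @ U_k.T @ q

# Cosine similarity between the query and every document.
sims = (docs_k.T @ q_k) / (
    np.linalg.norm(docs_k, axis=0) * np.linalg.norm(q_k) + 1e-12
)
print(np.round(sims, 3))
```

The decomposition step is exactly where the updating question from the abstract arises: when new terms or documents are appended to A, one can either recompute the factorization from scratch or update it, and the paper argues that a TULVD-based approach is a practical alternative to the SVD for this purpose.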

Files

Original bundle
Name:
Alternate Low-Rank Matrix Approximation in Latent.pdf
Size:
1.73 MB
Format:
Adobe Portable Document Format
Description:
Full Text