Make it cheap: learning with O(nd) complexity

Repository of Nicolaus Copernicus University

dc.contributor.author Duch, Włodzisław
dc.contributor.author Jankowski, Norbert
dc.contributor.author Maszczyk, Tomasz
dc.date.accessioned 2012-12-19T16:30:20Z
dc.date.available 2012-12-19T16:30:20Z
dc.date.issued 2012-06
dc.identifier.citation The 2012 International Joint Conference on Neural Networks (IJCNN), pp. 132-135
dc.identifier.isbn 978-1-4673-1489-3
dc.identifier.uri http://repozytorium.umk.pl/handle/item/275
dc.description.abstract Learning methods with computational complexity O(nd), linear in the number of samples n and their dimension d, often give results that are better, or at least not worse, than more sophisticated and slower algorithms. This is demonstrated for many benchmark datasets downloaded from the UCI Machine Learning Repository. The results provided in this paper should be used as a reference for estimating the usefulness of new learning algorithms. Methods with higher than linear complexity should provide significantly better results than those presented in this paper to justify their use.
dc.language.iso eng
dc.publisher Institute of Electrical and Electronics Engineers, Computational Intelligence Society
dc.rights info:eu-repo/semantics/openAccess
dc.subject computational complexity
dc.subject learning (artificial intelligence)
dc.subject pattern classification
dc.title Make it cheap: learning with O(nd) complexity
dc.type info:eu-repo/semantics/article
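
To illustrate the kind of O(nd) learner the abstract refers to, here is a minimal sketch of a nearest-centroid classifier, one classical example of a method whose training cost is a single linear pass over n samples of dimension d. This is an illustrative assumption, not a reconstruction of the specific methods benchmarked in the paper; all names below are hypothetical.

```python
def fit_centroids(X, y):
    """One pass over the n samples of dimension d: O(n*d) training cost.

    X is a list of feature vectors, y the matching list of class labels.
    Returns a dict mapping each class label to its mean feature vector.
    """
    sums, counts = {}, {}
    for xi, yi in zip(X, y):
        if yi not in sums:
            sums[yi] = [0.0] * len(xi)
            counts[yi] = 0
        for j, v in enumerate(xi):
            sums[yi][j] += v
        counts[yi] += 1
    return {c: [s / counts[c] for s in sums[c]] for c in sums}


def predict(x, centroids):
    """Assign x to the class whose centroid is nearest (squared Euclidean).

    Prediction is O(d) per class, so O(kd) per sample for k classes.
    """
    return min(centroids,
               key=lambda c: sum((a - b) ** 2
                                 for a, b in zip(x, centroids[c])))


# Toy usage: two well-separated 2-D clusters.
centroids = fit_centroids([[0, 0], [0, 1], [10, 10], [10, 11]],
                          [0, 0, 1, 1])
print(predict([1, 1], centroids))   # → 0
print(predict([9, 9], centroids))   # → 1
```

Despite its simplicity, a baseline of this kind is exactly the yardstick the abstract proposes: a slower, more sophisticated algorithm should beat it clearly before its extra cost is justified.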
