Articles 1 - 2 of 2
Classification Theorems For The C*-Algebras Of Graphs With Sinks, Iain Raeburn, Mark Tomforde, Dana Williams
Faculty of Engineering and Information Sciences - Papers: Part A
We consider graphs E which have been obtained by adding one or more sinks to a fixed directed graph G. We classify the C*-algebra of E up to a very strong equivalence relation, which insists, loosely speaking, that C*(G) is kept fixed. The main invariants are vectors W_E : G^0 -> N which describe how the sinks are attached to G; more precisely, the invariants are the classes of the W_E in the cokernel of the map A - I, where A is the adjacency matrix of the graph …
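The invariant described in the abstract can be written out in display form. This is only a paraphrase of the abstract's wording, not a precise statement of the paper's theorem; the notation (A for the adjacency matrix of G, W_E for the vector recording how the sinks attach at each vertex of G^0) follows the abstract.

```latex
% Paraphrase of the classifying datum from the abstract (not the exact theorem):
% W_E : G^0 \to \mathbb{N} records how the sinks of E are attached to G,
% and the invariant is its class in the cokernel of A - I:
\[
  [W_E] \;\in\; \operatorname{coker}(A - I)
        \;=\; \mathbb{Z}^{G^0} \big/ (A - I)\,\mathbb{Z}^{G^0}.
\]
```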
Improving Adaboost For Classification On Small Training Sample Sets With Active Learning, Lei Wang, Xuchun Li, Eric Sung
Faculty of Engineering and Information Sciences - Papers: Part A
Recently, AdaBoost has been widely used in many computer vision applications and has shown promising results. However, it is also observed that its classification performance is often poor when the size of the training sample set is small. In certain situations, there may be many unlabelled samples available, and labelling them is costly and time-consuming. Thus it is desirable to pick a few good samples to be labelled; the key question is how to choose them. In this paper, we integrate active learning with AdaBoost to attack this problem. The principal idea is to select the next unlabelled sample based on its being at …
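The abstract is truncated, but the selection principle it describes is a form of uncertainty sampling: query the unlabelled sample about which the current boosted ensemble is least confident, i.e. whose weighted-vote margin is closest to zero. The sketch below illustrates only that general criterion; the stump learners, weights, and pool are toy data, not the paper's method.

```python
# Margin-based active sample selection for a boosted ensemble (illustrative
# sketch only -- the abstract is truncated, so the exact criterion used in
# the paper is not reproduced here).

def ensemble_margin(x, weak_learners, alphas):
    """Weighted vote of the weak learners; each learner returns +1 or -1."""
    return sum(a * h(x) for h, a in zip(weak_learners, alphas))

def select_most_uncertain(unlabelled, weak_learners, alphas):
    """Index of the pool sample with the smallest absolute margin,
    i.e. the one the current classifier is least certain about."""
    margins = [abs(ensemble_margin(x, weak_learners, alphas)) for x in unlabelled]
    return min(range(len(unlabelled)), key=margins.__getitem__)

# Toy demonstration: two threshold "stumps" on scalar inputs.
stumps = [lambda x: 1 if x > 0.5 else -1,
          lambda x: 1 if x > 1.5 else -1]
alphas = [0.7, 0.3]
pool = [0.0, 1.0, 2.0]   # the sample at 1.0 lies between the two thresholds
print(select_most_uncertain(pool, stumps, alphas))  # -> 1
```

With these weights, the margins over the pool are -1.0, 0.4, and 1.0, so the middle sample (index 1) would be queried for a label next.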