Open Access. Powered by Scholars. Published by Universities.®
Physical Sciences and Mathematics Commons™
Articles 1 - 2 of 2
Full-Text Articles in Physical Sciences and Mathematics
Importance Of Verification And Validation Of Data Sources In Attaining Information Superiority, Gautham Kasinath, Leisa Armstrong
Information superiority has been defined as a state achieved when a competitive advantage is derived from the ability to exploit a superior information position. To achieve such a position, enterprises and nations alike must not only collect and record correct, accurate, timely and useful information, but also ensure that the information recorded is not lost to competitors through security gaps and leaks. Further, enterprises that aim to attain information superiority must also establish mechanisms for validating and verifying information to reduce the chances of misinformation. Although research has been carried out into ways to …
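The validation-and-verification mechanisms the abstract refers to can be sketched in miniature as record-level checks applied before data from a source is trusted. This is an illustrative sketch only, not the authors' method; the field names and rules below are hypothetical.

```python
# Minimal sketch of verifying and validating records from a data source.
# Verification: required fields are present. Validation: values are plausible.
# All field names and rules here are hypothetical examples.
from datetime import datetime

def validate_record(record):
    """Return a list of validation errors (empty list means the record passes)."""
    errors = []
    # Verification: required fields must be present.
    for field in ("source", "timestamp", "value"):
        if field not in record:
            errors.append(f"missing field: {field}")
    # Validation: values must be plausible.
    if "timestamp" in record:
        try:
            datetime.fromisoformat(record["timestamp"])
        except ValueError:
            errors.append("timestamp is not ISO 8601")
    if "value" in record and not isinstance(record["value"], (int, float)):
        errors.append("value is not numeric")
    return errors

good = {"source": "sensor-7", "timestamp": "2011-03-01T12:00:00", "value": 3.5}
bad = {"source": "sensor-7", "timestamp": "yesterday"}
print(validate_record(good))  # []
print(validate_record(bad))   # ['missing field: value', 'timestamp is not ISO 8601']
```

In practice such checks would be one layer among several (provenance tracking, cross-source corroboration), but even this simple gate keeps malformed records out of downstream decisions.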
A Wrapper-Based Feature Selection For Analysis Of Large Data Sets, Jinsong Leng, Craig Valli, Leisa Armstrong
Knowledge discovery from large data sets using classic data mining techniques has proved difficult because of their large size in both dimensions and samples. In real applications, data sets often contain many noisy, redundant, and irrelevant features, which degrade classification accuracy and increase complexity exponentially. Because of this inherent nature, analysing the quality of data sets is difficult, and very few approaches to this issue can be found in the literature. This paper presents a novel method for investigating the quality and structure of data sets, i.e., how to analyse whether there …
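The "wrapper" approach named in the title scores candidate feature subsets by the accuracy of an actual classifier rather than by a filter statistic. A minimal sketch of greedy forward wrapper selection, assuming a simple nearest-centroid classifier and a hypothetical toy data set (this is not the paper's specific algorithm):

```python
# Hedged sketch: greedy forward wrapper feature selection. Each candidate
# subset is scored by leave-one-out accuracy of a nearest-centroid
# classifier restricted to that subset. Toy data below is hypothetical.

def nearest_centroid_accuracy(X, y, features):
    """Leave-one-out accuracy of a nearest-centroid rule on the given features."""
    correct = 0
    for i in range(len(X)):
        # Class centroids computed with sample i held out.
        centroids, counts = {}, {}
        for j, row in enumerate(X):
            if j == i:
                continue
            c = y[j]
            if c not in centroids:
                centroids[c] = [0.0] * len(features)
                counts[c] = 0
            for k, f in enumerate(features):
                centroids[c][k] += row[f]
            counts[c] += 1
        for c in centroids:
            centroids[c] = [v / counts[c] for v in centroids[c]]
        # Classify the held-out sample by its nearest centroid.
        pred = min(
            centroids,
            key=lambda c: sum((X[i][f] - centroids[c][k]) ** 2
                              for k, f in enumerate(features)),
        )
        correct += pred == y[i]
    return correct / len(X)

def forward_select(X, y, n_features):
    """Greedily add the feature that most improves wrapper accuracy."""
    selected, remaining, best_acc = [], list(range(len(X[0]))), 0.0
    while remaining and len(selected) < n_features:
        acc, f = max((nearest_centroid_accuracy(X, y, selected + [f]), f)
                     for f in remaining)
        if selected and acc <= best_acc:
            break  # no candidate improves accuracy; stop early
        selected.append(f)
        remaining.remove(f)
        best_acc = acc
    return selected, best_acc

# Toy data: feature 0 separates the classes; features 1 and 2 are noise.
X = [[0.1, 5.0, 2.0], [0.2, 1.0, 9.0], [0.0, 7.0, 4.0],
     [1.9, 6.0, 3.0], [2.1, 2.0, 8.0], [2.0, 4.0, 5.0]]
y = [0, 0, 0, 1, 1, 1]
selected, acc = forward_select(X, y, 2)
print(selected, acc)  # → [0] 1.0: the discriminative feature is found first
```

The wrapper's strength, as the abstract suggests, is that irrelevant and redundant features are rejected because they fail to improve the classifier's measured accuracy; its cost is that every candidate subset requires retraining and re-evaluating the classifier.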