Open Access. Powered by Scholars. Published by Universities.®
Social and Behavioral Sciences Commons™
Articles 1 - 3 of 3
Full-Text Articles in Social and Behavioral Sciences
Φ-Divergence Loss-Based Artificial Neural Network, R. L. Salamwade, D. M. Sakate, S. K. Mathur
Journal of Modern Applied Statistical Methods
Artificial Neural Networks (ANNs) can fit non-linear functions and recognize patterns better than several standard techniques. The performance of an ANN is measured using a loss function. The phi-divergence estimator is a generalization of the maximum likelihood estimator and possesses all of its properties. A neural network trained with a phi-divergence loss is proposed.
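To make the idea concrete, a phi-divergence between a target distribution p and a predicted distribution q has the form D_phi(p||q) = Σ_j q_j · phi(p_j / q_j), and different choices of the convex function phi recover familiar losses. The following is a minimal NumPy sketch of such a loss for classification; the function names and the specific phi choices are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def phi_divergence_loss(y_true, y_pred, phi, eps=1e-12):
    """Mean phi-divergence D_phi(p||q) = sum_j q_j * phi(p_j / q_j)
    between target distributions p and predicted probabilities q."""
    p = np.clip(y_true, eps, 1.0)  # clip to avoid division by / log of zero
    q = np.clip(y_pred, eps, 1.0)
    return np.mean(np.sum(q * phi(p / q), axis=1))

# phi(t) = t log t recovers the Kullback-Leibler divergence
# (the cross-entropy family used in standard ANN training).
kl_phi = lambda t: t * np.log(t)

# phi(t) = (sqrt(t) - 1)^2 gives the squared Hellinger distance,
# another member of the phi-divergence family.
hellinger_phi = lambda t: (np.sqrt(t) - 1.0) ** 2

# Toy one-hot targets and softmax-style predictions (illustrative only).
y_true = np.array([[1.0, 0.0], [0.0, 1.0]])
y_pred = np.array([[0.9, 0.1], [0.2, 0.8]])
loss = phi_divergence_loss(y_true, y_pred, kl_phi)
```

With the KL choice of phi, the loss reduces to the average negative log-probability of the true class, which is why the phi-divergence estimator inherits the properties of maximum likelihood as a special case.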
Regularized Neural Network To Identify Potential Breast Cancer: A Bayesian Approach, Hansapani S. Rodrigo, Chris P. Tsokos, Taysseer Sharaf
Journal of Modern Applied Statistical Methods
In the current study, we exemplify the use of Bayesian neural networks for breast cancer classification using the evidence procedure. The optimal Bayesian network achieves 81% overall accuracy in classifying the true status of breast cancer patients, 59% sensitivity in detecting malignancy, and 83% specificity in detecting non-malignancy. The area under the receiver operating characteristic curve (0.7940) indicates a moderate classification model.
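The three reported metrics come from the standard confusion-matrix definitions: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), accuracy = (TP+TN)/total. A short sketch with purely illustrative counts (not the study's data) shows the arithmetic:

```python
def confusion_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)            # true-positive rate on malignant cases
    specificity = tn / (tn + fp)            # true-negative rate on benign cases
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical counts chosen only to demonstrate the formulas.
sens, spec, acc = confusion_metrics(tp=59, fn=41, tn=83, fp=17)
```

With these toy counts the formulas return 0.59, 0.83, and 0.71, mirroring the shape of the metrics quoted above.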
Selection Of Independent Binary Features Using Probabilities: An Example From Veterinary Medicine, Ludmila I. Kuncheva, Zoë S.J. Hoare, Peter D. Cockcroft
Journal of Modern Applied Statistical Methods
Supervised classification into c mutually exclusive classes based on n binary features is considered. The only information available is an n×c table of probabilities. Because the d individually best features do not necessarily form the best subset of d features, simulations were run for four feature selection methods, and an application to diagnosing BSE in cattle and scrapie in sheep is presented.
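A minimal sketch of the simplest of such selection strategies, assuming c = 2 classes: rank each binary feature individually by how far apart its class-conditional probabilities are, and keep the top d. The table values, the scoring rule, and d are all illustrative assumptions; the paper compares four selection methods, of which individual ranking is only the most naive.

```python
import numpy as np

# Hypothetical n x c table: P(feature_i = 1 | class_j) for n = 5 binary
# features and c = 2 classes (values invented for illustration).
table = np.array([
    [0.9, 0.2],
    [0.6, 0.5],
    [0.3, 0.8],
    [0.5, 0.5],   # uninformative: identical in both classes
    [0.7, 0.1],
])

# Naive individual ranking: score each feature by the absolute difference
# of its class-conditional probabilities, then keep the d highest scores.
d = 3
scores = np.abs(table[:, 0] - table[:, 1])
best_d = np.argsort(scores)[::-1][:d]
```

This is exactly the approach the "best d are not the d best" observation warns about: because the features are ranked one at a time, the selected subset ignores how features combine, which is why the subset-aware methods studied in the paper can outperform it.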