Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Articles 1 - 5 of 5

Full-Text Articles in Physical Sciences and Mathematics

Statistical And Machine Learning Approaches To Describe Factors Affecting Preweaning Mortality Of Piglets, Md Towfiqur Rahman, Tami M. Brown-Brandl, Gary A. Rohrer, Sudhendu R. Sharma, Vamsi Manthena, Yeyin Shi Oct 2023

Biological Systems Engineering: Papers and Publications

High preweaning mortality (PWM) rates for piglets are a significant concern for the pork industry worldwide, causing economic losses and well-being issues. This study focused on identifying the factors affecting PWM and overlays (piglets crushed by the sow), and on predicting PWM from historical production data using statistical and machine learning models. Data were collected from 1,982 litters at the United States Meat Animal Research Center, Nebraska, over the years 2016 to 2021. Sows were housed in a farrowing building with three rooms, each with 20 farrowing crates, and were cared for by well-trained animal caretakers. A generalized linear model was used to analyze the various sow, …
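A generalized linear model for a binary outcome like piglet mortality is typically a logistic regression (binomial family, logit link). The minimal sketch below fits one by plain gradient descent on synthetic data; the predictors (litter size, parity) and the simulated effect sizes are hypothetical illustrations, not the study's actual variables or results.

```python
import math
import random

def fit_logistic_glm(X, y, lr=0.1, epochs=2000):
    """Fit a logistic-regression GLM (binomial family, logit link)
    by batch gradient descent. X: list of feature rows, y: 0/1 labels."""
    n_feat = len(X[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * n_feat
        gb = 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # inverse logit link
            err = p - yi                      # gradient of log-loss
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b

# Hypothetical standardized predictors: litter size and sow parity.
random.seed(0)
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(400)]
# Simulate mortality risk rising with litter size (true coefs 1.5, -0.5).
y = [1 if random.random() < 1.0 / (1.0 + math.exp(-(1.5 * a - 0.5 * p))) else 0
     for a, p in X]
w, b = fit_logistic_glm(X, y)
print(w, b)  # the litter-size coefficient recovers as clearly positive
```

The fitted coefficients are log-odds effects: a positive litter-size coefficient means each standard-deviation increase in litter size multiplies the odds of mortality by exp(coefficient).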


Improving Animal Monitoring Using Small Unmanned Aircraft Systems (Suas) And Deep Learning Networks, Meilun Zhou, Jared A. Elmore, Sathishkumar Samiappan, Kristine O. Evans, Morgan Pfeiffer, Bradley F. Blackwell, Raymond B. Iglay Sep 2021

USDA Wildlife Services: Staff Publications

In recent years, small unmanned aircraft systems (sUAS) have been used widely to monitor animals because of their customizability, ease of operation, ability to access difficult-to-navigate places, and potential to minimize disturbance to animals. Automatic identification and classification of animals in images acquired by an sUAS may solve critical problems, such as monitoring large areas with high vehicle traffic to prevent animal-aircraft collisions at airports. In this research we demonstrate automated identification of four animal species using deep learning animal classification models trained on sUAS-collected images. We used an sUAS mounted with …


Estimating Wildlife Strike Costs At Us Airports: A Machine Learning Approach, Levi Altringer, Jordan Navin, Michael J. Begier, Stephanie A. Shwiff, Aaron M. Anderson Jan 2021

USDA Wildlife Services: Staff Publications

Current lower-bound estimates of the economic burden of wildlife strikes use mean cost assignment to impute missing values in the National Wildlife Strike Database (NWSD). The accuracy of these estimates, however, is undermined by the skewed nature of reported cost data, and the estimates fail to account for differences in observed strike characteristics (e.g., type of aircraft, size of aircraft, type of damage, size of animal struck). This paper applies modern machine learning techniques to provide a more accurate measure of the strike-related costs that accrue to the US civil aviation industry. We estimate that wildlife strikes …
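The core problem is that assigning the grand mean to every missing cost ignores strike characteristics, which matters greatly when costs are heavily skewed. A minimal sketch, assuming hypothetical features and synthetic skewed costs (not the NWSD data or the paper's actual model), contrasts grand-mean imputation with a simple k-nearest-neighbor regression that conditions on strike characteristics:

```python
import random

def knn_impute(known, missing_feats, k=5):
    """Impute a missing cost as the mean cost of the k most similar
    reported strikes (squared Euclidean distance in feature space)."""
    out = []
    for f in missing_feats:
        ranked = sorted(known,
                        key=lambda kc: sum((a - b) ** 2
                                           for a, b in zip(kc[0], f)))
        out.append(sum(cost for _, cost in ranked[:k]) / k)
    return out

random.seed(1)
# Hypothetical features: (aircraft size class, damage class).
# Costs are skewed: large aircraft with damage are rare but very costly.
known = []
for _ in range(300):
    size = random.choice([0, 1])       # 0 = small, 1 = large aircraft
    dmg = random.choice([0, 1])        # 0 = no damage, 1 = damage
    base = 200 + 50_000 * size * dmg
    known.append(([size, dmg], base * random.lognormvariate(0, 0.3)))

grand_mean = sum(cost for _, cost in known) / len(known)
# Impute a small, no-damage strike whose cost report is missing:
knn_est = knn_impute(known, [[0, 0]])[0]
print(round(grand_mean), round(knn_est))
```

Because the grand mean is dragged upward by a few very expensive strikes, conditioning on observed characteristics yields a far lower, and more plausible, imputed cost for minor strikes.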


Improving The Accessibility And Transferability Of Machine Learning Algorithms For Identification Of Animals In Camera Trap Images: Mlwic2, Michael A. Tabak, Mohammad S. Norouzzadeh, David W. Wolfson, Erica J. Newton, Raoul K. Boughton, Jacob S. Ivan, Eric Odell, Eric S. Newkirk, Reesa Y. Conrey, Jennifer Stenglein, Fabiola Iannarilli, John Erb, Ryan K. Brook, Amy J. Davis, Jesse Lewis, Daniel P. Walsh, James C. Beasley, Kurt C. Vercauteren, Jeff Clune, Ryan S. Miller Jan 2020

USDA Wildlife Services: Staff Publications

Motion-activated wildlife cameras (or “camera traps”) are frequently used to remotely and noninvasively observe animals. The vast number of images collected from camera trap projects has prompted some biologists to employ machine learning algorithms to automatically recognize species in these images, or at least to filter out images that do not contain animals. These approaches are often limited by model transferability, as a model trained to recognize species in one location might not work as well for the same species in different locations. Furthermore, these methods often require advanced computational skills, making them inaccessible to many biologists. We used 3 million camera …


Vowel Recognition From Continuous Articulatory Movements For Speaker-Dependent Applications, Jun Wang, Jordan R. Green, Ashok Samal, Tom D. Carrell Jan 2010

Department of Special Education and Communication Disorders: Faculty Publications

A novel approach was developed to recognize vowels from continuous tongue and lip movements. Vowels were classified based on movement patterns (rather than on derived articulatory features, e.g., lip opening) using a machine learning approach. Recognition accuracy on a single-speaker dataset was 94.02% with very short latency. Recognition accuracy was better for high vowels than for low vowels, a finding that parallels previous empirical results on tongue movements during vowels. The recognition algorithm was then used to drive an articulation-to-acoustics synthesizer. The synthesizer recognizes vowels from a continuous input stream of tongue and lip movements and plays the corresponding sound samples …
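Classifying on raw movement patterns means comparing whole trajectories rather than hand-derived features. A minimal sketch of that idea, using a nearest-centroid classifier on simulated tongue-height trajectories (the sinusoidal shapes and two-vowel setup are illustrative assumptions, not the paper's data or algorithm):

```python
import math
import random

def centroid(patterns):
    """Element-wise mean of equal-length movement patterns."""
    return [sum(vals) / len(vals) for vals in zip(*patterns)]

def classify(pattern, centroids):
    """Assign a movement pattern to the vowel with the nearest centroid."""
    return min(centroids, key=lambda v: math.dist(pattern, centroids[v]))

random.seed(2)

def trajectory(peak):
    """Hypothetical 10-sample tongue-height trajectory with sensor noise:
    high vowels peak high, low vowels peak low."""
    return [peak * math.sin(math.pi * t / 9) + random.gauss(0, 0.05)
            for t in range(10)]

train = {"/i/": [trajectory(1.0) for _ in range(20)],   # high vowel
         "/a/": [trajectory(0.3) for _ in range(20)]}   # low vowel
centroids = {v: centroid(p) for v, p in train.items()}

pred = classify(trajectory(0.95), centroids)
print(pred)  # a high-peaking trajectory is recognized as /i/
```

Because the whole trajectory is the feature vector, the classifier needs no articulatory feature extraction, which mirrors the abstract's point about classifying on movement patterns directly; latency is low because classification is a single distance computation per vowel.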