Open Access. Powered by Scholars. Published by Universities.®

Engineering Commons

Andrews University

2022

Machine Learning

Articles 1 - 3 of 3

Full-Text Articles in Engineering

Art To Influence Creativity In Algorithmic Composition, Tyler Braithwaite Apr 2022

Honors Theses

Advances in Recurrent Neural Network (RNN) techniques have driven an explosion of work on the large-scale analysis and generation of sequential data, including symbolic music. Building on Nathaniel Patterson’s Musical Autocomplete: An LSTM Approach, we extend the problem of continuing a composition by examining the creative impact that injecting latent-space-encoded image data, specifically fine art from the WikiArt Dataset, has on the musical output of RNN architectures designed for autocomplete. For comparability with Patterson’s results, we also use a corpus of Erik Satie’s piano music for training, validation, and testing.
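The conditioning scheme the abstract describes — injecting a latent-space image encoding into an RNN autocomplete model — might be sketched as below. All names and dimensions here are illustrative assumptions; the thesis does not specify how the image latent is combined with the note sequence, and concatenating it onto every timestep is just one common choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the thesis).
SEQ_LEN, NOTE_DIM, ART_LATENT_DIM = 16, 32, 8

def condition_on_artwork(note_seq: np.ndarray, art_latent: np.ndarray) -> np.ndarray:
    """Concatenate a fixed art-latent vector onto every timestep of a
    note-embedding sequence, producing a conditioned input an RNN could consume."""
    tiled = np.tile(art_latent, (note_seq.shape[0], 1))  # (SEQ_LEN, ART_LATENT_DIM)
    return np.concatenate([note_seq, tiled], axis=-1)    # (SEQ_LEN, NOTE_DIM + ART_LATENT_DIM)

notes = rng.normal(size=(SEQ_LEN, NOTE_DIM))   # stand-in for an embedded Satie excerpt
art = rng.normal(size=(ART_LATENT_DIM,))       # stand-in for an encoded WikiArt image
conditioned = condition_on_artwork(notes, art)
print(conditioned.shape)  # (16, 40)
```

The appeal of this scheme is that the image information is available at every generation step, so the "creative influence" persists across the whole continuation rather than only seeding the initial hidden state.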


Conditional Variational Autoencoder (Cvae) For The Augmentation Of Ecl Biosensor Data, Matthew Dulcich Apr 2022

Honors Theses

Machine Learning (ML) is transforming fields from computer vision to self-driving cars, letting us accomplish objectives once thought to be only dreams. Training accurate ML models, however, requires large amounts of data, and when collecting enough data is impossible we turn to data augmentation. In this project we use a conditional variational autoencoder (cVAE) to supplement an original video electrochemiluminescence (ECL) biosensor dataset, with the goal of increasing the accuracy of a future classification model. In other words, using a cVAE we will create unique realistic videos …
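The two mechanisms that make a cVAE a *conditional* VAE can be sketched in a few lines: the reparameterization trick for the stochastic latent, and concatenating a condition label onto the encoder/decoder inputs. This is a minimal illustration of the general technique only; the input shape and the meaning of the label are assumptions, not details from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu: np.ndarray, log_var: np.ndarray) -> np.ndarray:
    """Sample z = mu + sigma * eps — the reparameterization trick that lets
    gradients flow through the stochastic latent of a (c)VAE."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def conditional_input(x: np.ndarray, label: np.ndarray) -> np.ndarray:
    """A cVAE conditions both encoder and decoder by concatenating a
    label vector onto their inputs, so generation can be steered by class."""
    return np.concatenate([x, label])

# Stand-in data: a flattened biosensor frame plus a one-hot condition label.
frame = rng.normal(size=(64,))
label = np.array([0.0, 1.0])                   # e.g. target present / absent (assumed)
enc_in = conditional_input(frame, label)       # shape (66,)
z = reparameterize(np.zeros(8), np.zeros(8))   # latent sample with sigma = 1
print(enc_in.shape, z.shape)
```

For augmentation, the trained decoder is then fed fresh latent samples together with the desired label, yielding new synthetic examples of a chosen class.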


Exploring The Efficiency Of Neural Architecture Search (Nas) Modules, Joshua Dulcich Apr 2022

Honors Theses

Machine learning models are opaque and expensive to develop. Neural architecture search (NAS) algorithms automate this process by learning to construct high-performing ML networks, reducing both the bias and the necessity of human experts. In this recently emerged field, most research has focused on optimizing a single promising combination of NAS’s three segments. Despite regularly achieving state-of-the-art results, this practice sacrifices computing time and resources for slight gains in accuracy, and it also obstructs performance comparison across papers. To address this issue, we use NASLib’s modular library to test the efficiency of each module across a distinct subset of combinations. Each NAS …
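The efficiency question the abstract raises — accuracy gained per unit of compute spent — can be illustrated with a toy random-search NAS loop. Everything here is a stand-in: the operation set, the mock evaluator, and the efficiency metric are assumptions for illustration, not NASLib's actual search spaces or the thesis's experimental setup.

```python
import random

random.seed(0)

# A toy cell search space (assumed; real NASLib spaces are far richer).
OPS = ["skip", "conv3x3", "conv5x5", "maxpool"]

def mock_evaluate(arch):
    """Stand-in for training an architecture: returns (accuracy, cost).
    A real NAS run would train and evaluate each candidate on held-out data."""
    acc = 0.5 + 0.1 * arch.count("conv3x3") + 0.05 * arch.count("conv5x5")
    cost = 1 + 3 * arch.count("conv3x3") + 5 * arch.count("conv5x5")
    return min(acc, 1.0), cost

def random_search(n_samples=20, arch_len=4):
    """Sample architectures and keep the one with the best
    accuracy-per-unit-cost — one simple notion of NAS efficiency."""
    best, best_eff = None, -1.0
    for _ in range(n_samples):
        arch = [random.choice(OPS) for _ in range(arch_len)]
        acc, cost = mock_evaluate(arch)
        eff = acc / cost
        if eff > best_eff:
            best, best_eff = arch, eff
    return best, best_eff

arch, eff = random_search()
print(arch, round(eff, 3))
```

Scoring by accuracy per cost, rather than raw accuracy, is exactly the trade-off the abstract criticizes: chasing slight accuracy gains at large compute cost scores poorly under this metric.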