
Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Engineering › McKelvey School of Engineering Theses & Dissertations › Machine learning


Articles 1 - 2 of 2

Full-Text Articles in Entire DC Network

Holistic Control For Cyber-Physical Systems, Yehan Ma, Jan 2021

McKelvey School of Engineering Theses & Dissertations

The Industrial Internet of Things (IIoT) is transforming industries through emerging technologies such as wireless networks, edge computing, and machine learning. However, IIoT technologies are not yet ready for industrial-automation control systems, which demand control performance of physical processes, resiliency to both cyber and physical disturbances, and energy efficiency. To meet the challenges of IIoT-driven control, we propose holistic control as a cyber-physical system (CPS) approach to next-generation industrial automation systems. In contrast to traditional industrial automation systems, where computing, communication, and control are managed in isolation, holistic control orchestrates the management of cyber platforms (networks and computing platforms) …


Approximation And Relaxation Approaches For Parallel And Distributed Machine Learning, Stephen Tyree, Dec 2014

McKelvey School of Engineering Theses & Dissertations

Large-scale machine learning requires tradeoffs. Commonly, these tradeoffs have led practitioners to choose simpler, less powerful models, e.g., linear models, in order to process more training examples in a limited time. In this work, we introduce parallelism to the training of non-linear models by leveraging a different tradeoff: approximation. We demonstrate various techniques by which non-linear models can be made amenable to larger data sets and significantly more training parallelism by strategically introducing approximation in certain optimization steps.

For gradient boosted regression tree ensembles, we replace precise selection of tree splits with a coarse-grained, approximate split selection, yielding both faster …
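The excerpt breaks off here, but the approximation it describes, replacing exact split selection with a coarse-grained search over a small set of candidate thresholds, can be sketched in a few lines. The following is an illustrative sketch, not code from the dissertation: it assumes a quantile-binning strategy, and the names best_split_approx and n_bins are invented for the example. It shows why restricting candidates to bin edges makes the per-feature work small and easy to parallelize across features or bins.

```python
# Minimal sketch (not the thesis code): approximate split selection for one
# regression-tree node via quantile binning. Instead of evaluating every
# distinct feature value as a candidate threshold, only a small, fixed set of
# bin edges is tested. All names here (best_split_approx, n_bins) are
# illustrative assumptions.
import numpy as np

def best_split_approx(x, y, n_bins=16):
    """Pick an approximate best threshold for feature x by scanning only
    histogram bin edges, minimizing the children's summed squared error."""
    # Coarse-grained candidate thresholds: interior quantiles of the feature.
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    best_sse, best_t = np.inf, None
    for t in np.unique(edges):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        # Sum of squared errors of the two children for this candidate split.
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_t = sse, t
    return best_sse, best_t  # best_t is None if no valid split was found

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=10_000)
    y = np.where(x > 6.3, 1.0, 0.0) + rng.normal(0, 0.1, size=x.size)
    sse, t = best_split_approx(x, y, n_bins=16)
    # With only 16 candidate thresholds the chosen split lands near the true
    # cut point (6.3), while each candidate is evaluated independently,
    # which is what makes the search amenable to parallel execution.
    print(f"approximate split at x <= {t:.3f} (SSE {sse:.1f})")
```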