Mathematics Commons

Articles 1 - 30 of 37

Full-Text Articles in Mathematics

The Eyes Have It: Eye Tracking Data Visualizations Of Viewing Patterns Of Statistical Graphics, Trent Fawcett May 2016

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

As statistical graphics continue to expand to manage an ever-growing amount of diverse data, a need to evaluate the effectiveness of graphics, both basic and complex, has arisen. Technological advancements have provided a means to evaluate the effectiveness of graphs and graphical components through eye tracking systems. Eye tracking systems likewise need software that enables easy evaluation and exploration of the resulting data. The focus of this Master's Report is to evaluate both parts of this dual solution. An exploration of an eye tracker setup is made, with extensive consideration of testing statistical graphics providing a basis for continued research …


Survival Analysis For Truncated Data And Competing Risks, Michael Steelman May 2015

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The purpose of this project is to consider the problems of left truncation and competing risks in analyzing censored survival data, and to compare and contrast various approaches for handling these problems. The motivation for this work comes from an analysis of data from the Cache County Memory Study. Study investigators were interested in the association between early-life psychologically stressful events (e.g., parental or sibling death, or parental divorce, among others) and late-life risk of Alzheimer's disease (AD). While conventional methods for censored survival data can be applied, the presence of left truncation and competing risks (i.e., other adverse events …
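
A minimal sketch of the kind of model left-truncated data call for, assuming R's survival package and a hypothetical simulated data frame d with made-up columns entry_age, event_age, status, and early_loss (none of which come from the Cache County study): left truncation is handled by letting each subject enter the risk set only at his or her age at study entry, via the counting-process form of Surv(). Competing risks such as death before AD onset need additional machinery (for example, cause-specific hazard models), which this sketch does not attempt.

# Not the study's analysis: a hedged sketch of left-truncated survival data in R.
# 'd' is simulated; entry_age, event_age, status, early_loss are hypothetical names.
library(survival)

set.seed(1)
n <- 200
d <- data.frame(
  entry_age  = runif(n, 65, 80),                 # age at study entry
  early_loss = rbinom(n, 1, 0.3)                 # early-life stressful event indicator
)
d$event_age <- d$entry_age + rexp(n, rate = 0.05 + 0.02 * d$early_loss)
d$status    <- rbinom(n, 1, 0.6)                 # 1 = AD onset observed, 0 = censored

# Surv(start, stop, event): subjects are at risk only from entry_age onward
fit <- coxph(Surv(entry_age, event_age, status) ~ early_loss, data = d)
summary(fit)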


Comparing Linear Mixed Models To Meta-Regression Analysis In The Greenville Air Quality Study, Lynsie M. Daley May 2015

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The effect of air quality on public health is an important issue in need of better understanding. There are many stakeholders, especially in Utah and Cache Valley, where poor air quality, as measured by PM2.5 levels during inversions, can sometimes be the very worst in the nation. This project compares two statistical methods used to analyze an important air quality data set from the Greenville Air Quality Study, focusing on a lung function response variable. A linear mixed model, with a random factor for subject, gives slope estimates and their significance for predictor variables of …
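
A minimal sketch of the first of the two approaches, assuming the lme4 package and invented names (aq, lung_fn, pm25, subject) rather than the Greenville data: a random intercept for each subject absorbs between-subject variability while a fixed slope estimates the pollutant effect.

# Not the Greenville analysis: a hedged sketch of a linear mixed model in R.
library(lme4)

set.seed(42)
n_subj <- 30
aq <- data.frame(
  subject = factor(rep(1:n_subj, each = 10)),
  pm25    = rnorm(300, mean = 20, sd = 8)
)
subj_eff   <- rnorm(n_subj, sd = 0.3)            # subject-level random intercepts
aq$lung_fn <- 3.5 - 0.01 * aq$pm25 + subj_eff[as.integer(aq$subject)] + rnorm(300, sd = 0.2)

# Fixed slope for PM2.5, random intercept for subject
fit <- lmer(lung_fn ~ pm25 + (1 | subject), data = aq)
summary(fit)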


Modeling Asset Volatility Using Various Resources, Isaac G. Blackhurst May 2014

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Volatility is of central interest in modern financial econometrics. This thesis evaluates three different methods of measuring volatility.


Assessing Changes In The Abundance Of The Continental Population Of Scaup Using A Hierarchical Spatio-Temporal Model, Beth E. Ross May 2012

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

In ecological studies, the goal is often to describe and gain further insight into ecological processes underlying the data collected during observational studies. Because of the nature of observational data, it can often be difficult to separate the variation in the data from the underlying process or 'state dynamics.' In order to better address this issue, it is becoming increasingly common for researchers to use hierarchical models. Hierarchical spatial, temporal, and spatio-temporal models allow for the simultaneous modeling of both first and second order processes, thus accounting for underlying autocorrelation in the system while still providing insight into overall spatial …


Ignoring The Spatial Context In Intro Statistics Classes - And Some Simple Graphical Remedies, Nathan Voge May 2012

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Statistical data often have a spatial (geographic) context, be it countries of the world, states in the US, counties within a state, cities across the globe, or locations where measurements have been taken. However, most introductory statistics books do not even suggest that such data often are not independent from location, but rather are affected by some spatial association. Remedies are simple: Display data via various map views and briefly discuss which additional information can be extracted from such a graphical representation. In this report, we will visit a variety of popular introductory statistics textbooks and show how some …


Collecting, Analyzing And Interpreting Bivariate Data From Leaky Buckets: A Project-Based Learning Unit, Florence Funmilayo Obielodan May 2011

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Despite the significance and the emphasis placed on mathematics as a subject and field of study, achieving the right attitude to improve students' understanding and performance is still a challenge. Previous studies have shown that the problem cuts across nations around the world, both developing countries and developed alike. Teachers and educators of the subject have responsibilities to continuously develop innovative pedagogical approaches that will enhance students' interests and performance. Teaching approaches that emphasize real life applications of the subject have become imperative. It is believed that this will stimulate learners' interest in the subject as they will be able …


Estimation Of Beta In A Simple Functional Capital Asset Pricing Model For High Frequency Us Stock Data, Yan Zhang May 2011

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

This project applies the methods of functional data analysis (FDA) to intra-daily returns of US corporations. It focuses on an extension of the Capital Asset Pricing Model (CAPM) to such returns. The CAPM is essentially a linear regression with the slope coefficient β. Returns of an asset are regressed on index returns. We compare the estimates of β obtained for the daily and intra-daily returns. The variability of these estimates is assessed by two bootstrap methods. All computations are performed using statistical software R. Customized functions are developed to process the raw data, estimate the parameters and assess their variability. …
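
As a hedged illustration of the regression at the heart of the CAPM (not the report's code, with simulated returns standing in for real intra-daily data), beta is simply the slope from lm(), and a pair-resampling bootstrap gives one rough measure of its variability.

# Hedged sketch: CAPM beta as a regression slope, plus a basic bootstrap of the estimate.
set.seed(7)
n <- 390                                             # e.g., one-minute intra-daily returns
index_ret <- rnorm(n, 0, 0.001)
asset_ret <- 1.2 * index_ret + rnorm(n, 0, 0.002)    # true beta = 1.2 in this simulation

beta_hat <- coef(lm(asset_ret ~ index_ret))[2]

# Nonparametric bootstrap: resample (index, asset) pairs and refit
boot_beta <- replicate(1000, {
  i <- sample.int(n, replace = TRUE)
  coef(lm(asset_ret[i] ~ index_ret[i]))[2]
})
c(beta = unname(beta_hat), boot_se = sd(boot_beta))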


Controlling Error Rates With Multiple Positively-Dependent Tests, Abdullah Al Masud May 2011

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

It is a typical feature of high dimensional data analysis, for example a microarray study, that a researcher performs thousands of statistical tests at a time. All inferences for the tests are determined using the p-values; a p-value smaller than the α-level of the test signifies a statistically significant test. As the number of tests increases, the chance of observing some small p-values is very high even when all null hypotheses are true. Consequently, we may draw incorrect conclusions about the hypotheses. This type of potential problem frequently arises when we test several hypotheses simultaneously, i.e., the multiple testing problem. …
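
A small illustrative simulation (not from the report) makes the point concrete: with m independent tests and every null hypothesis true, the probability of at least one p-value below α is 1 - (1 - α)^m, and R's built-in p.adjust() provides standard corrections.

# Hedged sketch: false positives under the global null, and standard p-value adjustments.
set.seed(123)
m     <- 5000
alpha <- 0.05
p <- replicate(m, t.test(rnorm(10), rnorm(10))$p.value)   # every null hypothesis is true

sum(p < alpha)                                     # roughly alpha * m false positives
sum(p.adjust(p, method = "bonferroni") < alpha)    # family-wise error rate control
sum(p.adjust(p, method = "BH") < alpha)            # false discovery rate control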


On The Use Of Log-Transformation Vs. Nonlinear Regression For Analyzing Biological Power-Laws, Xiao Xiao Jan 2011

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Power-law relationships are among the most well-studied functional relationships in biology. Recently the common practice of fitting power-laws using linear regression on log-transformed data (LR) has been criticized, calling into question the conclusions of hundreds of studies. It has been suggested that nonlinear regression (NLR) is preferable, but no rigorous comparison of these two methods has been conducted. Using Monte Carlo simulations we demonstrate that the error distribution determines which method performs better, with LR better characterizing data with multiplicative lognormal error and NLR better characterizing data with additive normal error. Analysis of 471 biological power-laws shows that both …
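
For concreteness, a minimal sketch (with invented parameter values, not the paper's simulation design) of the two estimators for y = a x^b in R: linear regression on log-transformed data versus nonlinear least squares with nls().

# Hedged sketch: LR on the log-log scale versus NLR on the original scale.
set.seed(99)
a <- 2; b <- 0.75
x <- runif(200, 1, 100)
y <- a * x^b * exp(rnorm(200, sd = 0.3))           # multiplicative lognormal error

# LR: linear regression on log-transformed data
lr <- lm(log(y) ~ log(x))
c(a_lr = unname(exp(coef(lr)[1])), b_lr = unname(coef(lr)[2]))

# NLR: nonlinear least squares on the original scale, started near the LR fit
nlr <- nls(y ~ a0 * x^b0,
           start = list(a0 = unname(exp(coef(lr)[1])), b0 = unname(coef(lr)[2])))
coef(nlr)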


Virtual Manipulatives In The Classroom And Resulting Articles And Lesson Plans, Cheryl Juliana Aug 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Upon coming across mathematical manipulatives generated and produced by Utah State University, I, as a math teacher, conducted a classroom teaching experiment in three pre-algebra classes with students of various achievement levels. After teaching the entire year using no manipulatives in the classroom, I tested my students with a general end-of-year core criterion (cumulative) test. Their scores were noted. The students in the study group were then given opportunities to try several manipulatives offered on the "National Library of Virtual Manipulatives," both as a class, and alone, and then retested. The following paper gives the parameters of the study, …


Assessment Of Utah Bankruptcies By Census Tracts: A Spatial Statistical Approach, Kenneth Pena Jan 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

There are two questions raised when looking at the spatial pattern of the rate of bankruptcies in Utah: (i) are there similarities between the bankruptcy data in adjacent census tracts and (ii) can local clusters and outliers be identified within the data? Specifically, are there similar rates of bankruptcies in bordering census tracts and are there any localized areas of interest where we find extremely high or extremely low rates of bankruptcies? This study uses spatial statistics to perform tests for spatial autocorrelation to address these two questions. It also looks at commonalities in the clusters and differences in the …
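
A minimal sketch of the sort of tests involved, assuming the spdep package and a fully synthetic lattice standing in for the Utah census tracts (with real data the neighbor list would come from the tract polygons, e.g. via poly2nb): Moran's I addresses question (i) globally, and Local Moran's I (LISA) addresses question (ii).

# Hedged sketch: global and local spatial autocorrelation tests on a synthetic grid.
library(spdep)

set.seed(3)
nb   <- cell2nb(10, 10)                    # rook neighbors on a 10 x 10 lattice of "tracts"
lw   <- nb2listw(nb, style = "W")          # row-standardized spatial weights
rate <- rnorm(100)                         # hypothetical bankruptcy rates

# (i) Global spatial autocorrelation: Moran's I
moran.test(rate, lw)

# (ii) Local clusters and outliers: Local Moran's I (LISA)
head(localmoran(rate, lw))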


Improving Accuracy Of Large-Scale Prediction Of Forest Disease Incidence Through Bayesian Data Reconciliation, Ephraim M. Hanks Jan 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Increasing the accuracy of predictions made from ecological data typically involves replacing or replicating the data, but the cost of updating large-scale data sets can be prohibitive. Focusing resources on a small sample of locations from a large, less accurate data set can result in more reliable observations, though on a smaller scale. We present an approach for increasing the accuracy of predictions made from a large-scale ecological data set through reconciliation with a small, highly accurate data set within a Bayesian hierarchical modeling framework. This approach is illustrated through a study of incidence of eastern spruce dwarf mistletoe …


Numerical Solution Of The Five-Moment Ideal Two-Fluid Equations In One Dimension, Marcus Scott Jan 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Plasmas are frequently treated as a single conducting fluid and modeled using the equations of magnetohydrodynamics. However, this regime works better for low-frequency plasmas. High-frequency plasmas may be modeled using the principles of kinetic theory. For plasmas with frequencies between these two extremes, a two-fluid approach can yield better results. In 2006, Ammar Hakim mathematically modeled a plasma with a set of equations called the five-moment ideal two-fluid equations. An attempt is made to reproduce those results. A derivation of this set of equations by taking moments of the Boltzmann equation is presented. Electric and magnetic fields contribute to the source …


Assessing The Precision And Accuracy In A Small Sample Of Actical Devices, Peter Sherick Jan 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Actigraphy is an increasingly popular approach in medicine to assess patient activity levels in a variety of scenarios. The devices are essentially accelerometers encased in a wrist-watch type assembly. This project sought to determine the device precision and accuracy for the Actical model. In a sample of four Acticals, it was found that intra-device variability was minimal. However, one device was found to be statistically biased in comparison to the other three. This bias could have adverse effects on aggregated or magnitude dependent data analysis. Also, inter-device comparisons may be problematic.


A Comparison Of Prediction Methods Of Functional Autoregressive Time Series, Devin Didericksen Jan 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Functional data analysis (FDA) is a relatively new branch of statistics that has seen a lot of expansion recently. With the advent of computer processing power and more efficient software packages we have entered the beginning stages of applying FDA methodology and techniques to data. Part of this undertaking should include an empirical assessment of the effectiveness of some of the tools of FDA, which are sound on theoretical grounds. In a small way, this project helps advance this objective.

This work begins by introducing FDA, scalar prediction techniques, and the functional autoregressive model of order one - FAR(1). Two …


Simulating Power For One-Way Anova By Using Non-Normal Error Distributions, Caixia Xu Jan 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Usually we assume that the distribution of the additive errors in a one-way ANOVA linear model is normal. However, exceptions to this assumption about the error distribution may exist. In such cases, we might consider non-normal error distributions, but proceed with the "usual" ANOVA F-test analyses. This study focuses on simulating power for one-way ANOVA when using non-normal error distributions.
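
A minimal sketch of one simulation cell, with made-up group means and sample sizes rather than those of the report: generate non-normal (here, centered exponential) errors, run the usual F-test, and estimate power as the rejection rate.

# Hedged sketch: simulated power of the one-way ANOVA F-test with non-normal errors.
set.seed(2024)
mu    <- c(0, 0.5, 1.0)              # made-up group means
n     <- 20                          # per-group sample size
alpha <- 0.05

one_rep <- function() {
  g <- factor(rep(seq_along(mu), each = n))
  e <- rexp(length(g)) - 1           # non-normal additive errors with mean 0
  y <- mu[as.integer(g)] + e
  anova(lm(y ~ g))[1, "Pr(>F)"] < alpha
}

mean(replicate(5000, one_rep()))     # estimated power of the "usual" F-test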


Statistical Analysis Of Wastewater Remediation And Bio-Fuels Production Of Algae, Jay D. Jones Jan 2010

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The Logan city wastewater treatment system consists of a series of seven large aerated ponds (460 acres) that biologically treats 15 million gallons per day of wastewater from Logan city and six other communities. Tighter regulations of allowed phosphorus levels in the effluent have recently been implemented due to environmental concerns of a downstream reservoir. The Biological Engineering program at Utah State University, the Bio-fuels Center, the Utah Water Research Laboratory (UWRL) and the city of Logan are working together to remediate the wastewater treatment system using microalgae. Algal growth requires the uptake of phosphorus. Thus, phosphorus in the effluent …


Logistic Models With Missing Categorical Covariates, Jeremiah Rounds Jan 2009

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

We present an EM-based solution to missing categorical covariates in binomial models with logit links, using an assumption that experimental units are drawn from a multinomial population of infinite size. We further address the problem of separation of points inducing large variances on parameter estimates by the use of a novel score modification based on Firth's bias-reduction approach. We simulate to address questions about estimate bias, distribution, and appropriate parameter coverage by Wald intervals.


An Investigation Of The Ends Of Finitely Generated Groups, Daniel T. Murphree Jan 2009

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Geometric group theory is a relatively new branch of mathematics, studied as a distinct area since the 1990s. It explores invariant properties of groups based on group actions defined on topological or geometrical spaces. One of the pioneering works in geometric group theory is the article "Topological Methods in Group Theory" by Peter Scott and Terry Wall, written in 1977. This article was an overview of revised notes from an advanced course given in Liverpool in the same year. This report is an attempt to make these notes more accessible to lower-level graduate students in the fields of topology …


A Method For Finding Standard Error Estimates For Rma Expression Levels Using Bootstrap, Gabriel Nicholas May 2007

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Oligonucleotide arrays are used in many applications. Affymetrix GeneChip arrays are widely used. Before researchers can use the information from these arrays, the raw data must be transformed and summarized into a more meaningful and usable form. One of the more popular methods for doing so is RMA (Robust Multi-array Average).

A problem with RMA is that the end result (estimated gene expression levels) is based on a fairly complicated process that is unusual. Specifically, there is no closed-form estimate of standard errors for the estimated gene expression levels. The current recommendation is to use a naive estimate for the …


Special Classification Models For Lichens In The Pacific Northwest, Janeen Ardito May 2005

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

A common problem in ecological studies is that of determining where to look for rare species. This paper shows how statistical models, such as classification trees, may be used to assist in the design of probability-based surveys for rare species using information on more abundant species that are associated with the rare species. This model assisted approach to survey design involves first building models for the more abundant species. The models are then used to determine stratifications for the rare species that are associated with the more abundant species. The goal of this approach is to increase the number of …


The Robustness Of Factor Analyses When The Data Does Not Conform To Standard Parametric Requirements, Haisong Peng May 2004

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Objective: To assess the robustness of factor analyses when the data does not conform to standard parametric requirements.

Methods: Data were simulated in package R. Maximum likelihood was used to fit and assess the factor models. Chi-square statistics were obtained to test hypotheses about the correct number of factors in simulated settings where the true number of factors was known. The number of true factors varied between 1 and 3; the number of observed variables was either 6 (for 1 factor) or 3 per factor for 2 or more factors.

Results: With standard normal factor populations, and normal errors added …
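
A minimal sketch of one such simulation cell (invented loadings and sample size, not the report's settings): data are generated from a single-factor model with 6 observed variables and fit by maximum likelihood with R's factanal(), whose chi-square statistic tests whether one factor suffices.

# Hedged sketch: simulate a one-factor model and fit it by maximum likelihood.
set.seed(11)
n   <- 300
f   <- rnorm(n)                                   # latent factor scores
lam <- c(0.8, 0.7, 0.6, 0.7, 0.8, 0.6)            # invented loadings, 6 observed variables
x   <- sapply(lam, function(l) l * f + rnorm(n, sd = sqrt(1 - l^2)))

fit <- factanal(x, factors = 1)                   # ML fit; chi-square tests the 1-factor model
c(chisq = unname(fit$STATISTIC), df = unname(fit$dof), p = unname(fit$PVAL))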


Optimal Path Planning And The Fast Marching Method, J. J. Clark Aug 2002

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The problem of determining an optimal path for an object moving through some obstacle space presents several nontrivial subproblems. The foremost is the computational complexity that is involved and how best to deal with the associated large data volume. For example, a non-symmetric object moving in three dimensions possesses six degrees of freedom. This can lead to a computational grid that may easily be on the order of 10^12. Furthermore, for every point in the computational domain, several complex calculations must be performed. These include performing tests to determine if the object and obstacles intersect, and numerically solving …


Analysis Of A Non-Replicated Split-Split Plot Experiment, Emily Simmons Sim Jan 2001

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

A major obstacle in the analysis of experimental data, in many situations, is the lack of "true" or "complete" replication. In some disciplines, researchers are very aware of the importance of replication and the methods for correctly replicating an experiment. In other subject areas, however, researchers are less aware of what it means to properly replicate an experiment. Due to this lack of awareness, many non-replicated experiments are carried out every year. For many of these non-replicated experiments, there is no satisfactory statistical analysis.

The subject of this report is the analysis of two non-replicated experiments in environmental engineering. First, …


Multiple Shooting, Monique St-Maurice Jan 1985

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The purpose of this report was to study the Multiple Shooting method, a numerical method for solving boundary value problems for ordinary differential equations. A FORTRAN program was written to solve the specific problem

y'' = -y, y(0) = 0, y(π/2) = 1

on a microcomputer, using the Microsoft FORTRAN compiler and the 8087 coprocessor.
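
As a hedged illustration of the shooting idea underlying the method (a single-shooting sketch in R, not the report's FORTRAN multiple-shooting program): guess the missing initial slope y'(0) = s, integrate the ODE to π/2 with a Runge-Kutta step, and adjust s until the boundary condition y(π/2) = 1 is met. For this problem the exact solution is y = sin(x), so s should come out near 1.

# Hedged single-shooting sketch: integrate y'' = -y as the system (y, v)' = (v, -y) with RK4.
shoot <- function(s, steps = 1000) {
  h  <- (pi / 2) / steps
  st <- c(0, s)                             # (y(0), y'(0)) with guessed slope s
  deriv <- function(u) c(u[2], -u[1])
  for (i in seq_len(steps)) {
    k1 <- deriv(st)
    k2 <- deriv(st + h / 2 * k1)
    k3 <- deriv(st + h / 2 * k2)
    k4 <- deriv(st + h * k3)
    st <- st + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
  }
  st[1]                                     # y(pi/2) given initial slope s
}

# Choose s so that y(pi/2; s) = 1; the exact solution y = sin(x) has s = 1
s_hat <- uniroot(function(s) shoot(s) - 1, c(0, 2))$root
s_hat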


Monte Carlo Simulation Of The Game Of Twenty-One, Douglas E. Loer Jan 1985

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The purpose of this paper is to demonstrate the application of computer simulation to the game of Twenty-One to predict a player's expected return from the game. Twenty-One has traditionally been one of the most popular casino games and has attracted much effort to accurately estimate the house's true advantage. Probability theory has been tried, but the thousands of different combinations of cards possible in all hands throughout the entire pack make it practically impossible to apply probability theory without overlooking some possibilities. For this reason, Twenty-One is a perfect candidate for simulation. By blocking several simulations, normal theory can …


Comparison Of Bootstrap And Jacknife Statistical Procedures, Amanuel Gobena Jan 1985

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

This report compares the bootstrap and jackknife statistical procedures in terms of bias, confidence intervals, and estimation of the median. Related literature has been reviewed. The bootstrap allows a researcher to get an approximation to the distribution of possibly complicated statistical summaries. It is based on random sampling with replacement from the experimental units. The jackknife has also been in use since before the bootstrap procedure. The jackknife divides the data into subgroups and obtains partial estimates of these subgroups by omitting one subgroup at a time. When both of these statistical resampling procedures are compared, the bootstrap has less bias, more accurate …
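
A minimal sketch of the two resampling procedures applied to the standard error of a sample median, using a simulated data set rather than anything from the report.

# Hedged sketch: bootstrap and jackknife standard errors of the sample median.
set.seed(5)
x <- rexp(50)                              # hypothetical sample, not report data
n <- length(x)
theta_hat <- median(x)

# Bootstrap: resample with replacement, recompute the median each time
boot_med <- replicate(2000, median(sample(x, replace = TRUE)))
boot_se  <- sd(boot_med)

# Jackknife: leave one observation out at a time
jack_med <- sapply(seq_len(n), function(i) median(x[-i]))
jack_se  <- sqrt((n - 1) / n * sum((jack_med - mean(jack_med))^2))

c(median = theta_hat, bootstrap_se = boot_se, jackknife_se = jack_se)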


The Frobenius Theorem, Hiroshi Nagao Jan 1981

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Many theorems in differential geometry which deal with the existence of certain geometrical structures or properties depend upon various existence and uniqueness theorems for differential equations. Because of its wide range of applications, one of the most important of these theorems is the Frobenius Theorem for systems of total differential equations. There are four different forms of the Frobenius Theorem. In applications of the theorem one form is often preferable to the others. In this report we shall prove the Frobenius Theorem, establish the equivalence of these various forms, and discuss a few applications.


An Evaluation Of Bartlett's Chi-Square Approximation For The Determinant Of A Matrix Of Sample Zero-Order Correlation Coefficients, Stephen M. Hattori Jan 1975

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The single equation least-squares regression model has been extensively studied by economists and statisticians alike in order to determine the problems which arise when particular assumptions are violated. Much literature is available in terms of the properties and limitations of the model. However, on the multicollinearity problem, there has been little research, and consequently, limited literature is available when the problem is encountered. Farrar & Glauber (1967) present a collection of techniques to use in order to detect or diagnose the occurrence of multicollinearity within a regression analysis. They attempt to define multicollinearity in terms of departures from a hypothesized …
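
For reference, a hedged sketch (simulated data, not the study's) of the statistic in question: Bartlett's chi-square approximation for the determinant of a sample correlation matrix, commonly presented as a test of sphericity, computed in R.

# Hedged sketch: Bartlett's chi-square approximation for det(R) of a correlation matrix.
set.seed(8)
n <- 100; p <- 5
X <- matrix(rnorm(n * p), n, p)            # independent columns, so det(R) should be near 1
R <- cor(X)

chi2 <- -(n - 1 - (2 * p + 5) / 6) * log(det(R))
df   <- p * (p - 1) / 2
c(chi2 = chi2, df = df, p_value = pchisq(chi2, df, lower.tail = FALSE))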