Open Access. Powered by Scholars. Published by Universities.®

Statistical Models Commons

Articles 1 - 30 of 54

Full-Text Articles in Statistical Models

Modeling Biphasic, Non-Sigmoidal Dose-Response Relationships: Comparison Of Brain-Cousens And Cedergreen Models For A Biochemical Dataset, Venkat D. Abbaraju, Tamaraty L. Robinson, Brian P. Weiser Aug 2023

Rowan-Virtua School of Osteopathic Medicine Faculty Scholarship

Biphasic, non-sigmoidal dose-response relationships are frequently observed in biochemistry and pharmacology, but they are not always analyzed with appropriate statistical methods. Here, we examine curve fitting methods for “hormetic” dose-response relationships where low and high doses of an effector produce opposite responses. We provide the full dataset used for modeling, and we provide the code for analyzing the dataset in SAS using two established mathematical models of hormesis, the Brain-Cousens model and the Cedergreen model. We show how to obtain and interpret curve parameters such as the ED50 that arise from modeling, and we discuss how curve parameters might change …
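For readers who want a feel for the model before opening the SAS code, the Brain-Cousens model modifies the four-parameter log-logistic curve with a linear term f·x in the numerator, which lets low doses rise above the control response. A minimal Python sketch with illustrative parameter values (assumed for demonstration, not the paper's fitted estimates):

```python
def brain_cousens(x, b, c, d, e, f):
    """Brain-Cousens hormetic dose-response curve:
    c = lower asymptote, d = control (zero-dose) response,
    e ~ inflection dose, b = slope, f = hormesis coefficient."""
    return c + (d - c + f * x) / (1.0 + (x / e) ** b)

# Illustrative parameters chosen to produce a hormetic hump (not fitted values)
params = dict(b=2.0, c=0.0, d=1.0, e=10.0, f=0.5)

control = brain_cousens(0.0, **params)      # response with no effector
low_dose = brain_cousens(2.0, **params)     # stimulation region
high_dose = brain_cousens(200.0, **params)  # inhibition region

assert low_dose > control > high_dose  # biphasic: up, then down
```

Note that in this parameterization the ED50 is a derived quantity rather than a raw parameter, which is one reason the careful interpretation of fitted curve parameters discussed in the paper matters.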


Statistical Characteristics Of High-Frequency Gravity Waves Observed By An Airglow Imager At Andes Lidar Observatory, Alan Z. Liu, Bing Cao May 2022

Publications

The long-term statistical characteristics of high-frequency quasi-monochromatic gravity waves are presented using multi-year airglow images observed at Andes Lidar Observatory (ALO, 30.3° S, 70.7° W) in northern Chile. The distribution of primary gravity wave parameters including horizontal wavelength, vertical wavelength, intrinsic wave speed, and intrinsic wave period are obtained and are in the ranges of 20–30 km, 15–25 km, 50–100 m s⁻¹, and 5–10 min, respectively. The duration of persistent gravity wave events captured by the imager approximately follows an exponential distribution with an average duration of 7–9 min. The waves tend to propagate against the local background winds and …


A Simple Algorithm For Generating A New Two Sample Type-Ii Progressive Censoring With Applications, E. M. Shokr, Rashad Mohamed El-Sagheer, Mahmoud Mansour, H. M. Faied, B. S. El-Desouky Jan 2022

Basic Science Engineering

In this article, we introduce a simple algorithm for generating a new two-sample Type-II progressive censoring scheme. The proposed algorithm can be applied to any continuous probability distribution. Moreover, the model description and necessary assumptions are discussed. In addition, the steps of the generation algorithm, along with the programming steps, are illustrated on a real example. Inference for two Weibull-Fréchet populations is discussed under the proposed algorithm. Both classical and Bayesian inferential approaches to the distribution parameters are discussed. Furthermore, approximate confidence intervals are constructed based on the asymptotic distribution of the maximum …
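The claim that such a scheme works for any continuous distribution can be illustrated with a direct brute-force simulation of progressive Type-II censoring (an assumed illustration, not the paper's algorithm): after the i-th observed failure, a prescribed number R_i of surviving units is withdrawn at random.

```python
import random

def progressive_type2_sample(lifetimes, removals, rng=random):
    """Simulate progressive Type-II censoring: after the i-th observed
    failure, remove removals[i] surviving units at random.
    Returns the m = len(removals) observed (ordered) failure times."""
    alive = sorted(lifetimes)        # surviving units, ordered by lifetime
    observed = []
    for r in removals:
        observed.append(alive.pop(0))      # next failure is the minimum
        for _ in range(r):                 # withdraw r survivors at random
            alive.pop(rng.randrange(len(alive)))
    return observed

rng = random.Random(42)
n, m = 20, 5
removals = [3, 3, 3, 3, 3]               # n = m + sum(removals) = 20
# Any continuous lifetime distribution works; exponential is a stand-in
lifetimes = [rng.expovariate(1.0) for _ in range(n)]
sample = progressive_type2_sample(lifetimes, removals, rng)
assert len(sample) == m and sample == sorted(sample)
```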


Application Of Randomness In Finance, Jose Sanchez, Daanial Ahmad, Satyanand Singh May 2021

Publications and Research

Brownian Motion, also known as a Wiener process, can be thought of as a random walk. In our project we briefly discussed the fluctuations of financial indices and related them to Brownian Motion and the modeling of stock prices.
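A standard way to connect a Wiener process to stock prices is geometric Brownian motion; the sketch below is a generic illustration of that connection, not the project's actual model:

```python
import math, random

def gbm_path(s0, mu, sigma, dt, n_steps, rng=random):
    """Geometric Brownian motion: S_{t+dt} = S_t * exp((mu - sigma^2/2) dt
    + sigma * sqrt(dt) * Z) with Z ~ N(0, 1) -- the classic stock-price
    model driven by a Wiener process."""
    path = [s0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma**2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

rng = random.Random(1)
# One simulated year of daily prices (252 trading days); parameters assumed
path = gbm_path(s0=100.0, mu=0.05, sigma=0.2, dt=1 / 252, n_steps=252, rng=rng)
assert len(path) == 253 and all(p > 0 for p in path)  # prices stay positive
```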


Lecture 04: Spatial Statistics Applications Of Hrl, Trl, And Mixed Precision, David Keyes Apr 2021

Mathematical Sciences Spring Lecture Series

As simulation and analytics enter the exascale era, numerical algorithms, particularly implicit solvers that couple vast numbers of degrees of freedom, must span a widening gap between ambitious applications and austere architectures to support them. We present fifteen universals for researchers in scalable solvers: imperatives from computer architecture that scalable solvers must respect, strategies towards achieving them that are currently well established, and additional strategies currently being developed for an effective and efficient exascale software ecosystem. We consider recent generalizations of what it means to “solve” a computational problem, which suggest that we have often been “oversolving” them at the …


Do Metabolic Networks Follow A Power Law? A Psamm Analysis, Ryan Geib, Lubos Thoma, Ying Zhang May 2019

Senior Honors Projects

Inspired by the landmark paper “Emergence of Scaling in Random Networks” by Barabási and Albert, the field of network science has focused heavily on the power law distribution in recent years. This distribution has been used to model everything from the popularity of sites on the World Wide Web to the number of citations received by a scientific paper. Its defining feature is that many nodes (websites or papers) have few connections (internet links or citations) while a few “hubs” are connected to many nodes. These properties lead to two very important observed effects: the …
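The preferential-attachment mechanism behind the Barabási-Albert model can be simulated in a few lines; picking a target uniformly from a list of edge endpoints is the classic trick for degree-proportional sampling (an illustrative sketch, not code from the project):

```python
import random

def preferential_attachment(n_nodes, rng=random):
    """Barabasi-Albert-style growth with m = 1 edge per new node: each
    newcomer links to an existing node chosen with probability
    proportional to degree (uniform sampling from the list of edge
    endpoints achieves this)."""
    endpoints = [0, 1]               # start from a single edge 0-1
    degree = {0: 1, 1: 1}
    for new in range(2, n_nodes):
        target = rng.choice(endpoints)   # degree-proportional choice
        degree[new] = 1
        degree[target] += 1
        endpoints += [new, target]
    return degree

rng = random.Random(7)
deg = preferential_attachment(5000, rng=rng)
degs = sorted(deg.values())
# Heavy tail: the largest hub dwarfs the typical (median) node
assert degs[-1] >= 10 * degs[len(degs) // 2]
```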


Pawnee Dam Inflow Design Flood (Idf) Update And Stage-Frequency Curve Development Using Rmcrfa, Jennifer P. Christensen, Joshua J. Melliger Jan 2019

United States Geological Survey: Water Reports and Publications

Pawnee Dam is one of the ten Salt Creek Dams designed and built in the 1960s to mitigate flooding in Lincoln, Nebraska. This short paper illustrates the update of the Pawnee Dam inflow design flood (IDF) through calibration to recent high flow events and the development of its stage-frequency or hydrologic loading curve with the U.S. Army Corps of Engineers’ Risk Management Center Reservoir Frequency Analysis (RMC-RFA) model. The IDF update follows Engineering Regulation 1110-8-2, Inflow Design Flood for Dams and Reservoirs, including unit hydrograph peaking and two antecedent pool elevations. Background information on the original design of the dam …


Sabermetrics - Statistical Modeling Of Run Creation And Prevention In Baseball, Parker Chernoff Mar 2018

FIU Electronic Theses and Dissertations

The focus of this thesis was to investigate which baseball metrics are most conducive to run creation and prevention. Stepwise regression and Liu estimation were used to formulate two models for the dependent variables and also used for cross validation. Finally, the predicted values were fed into the Pythagorean Expectation formula to predict a team’s most important goal: winning.

Each model fit strongly and collinearity amongst offensive predictors was considered using variance inflation factors. Hits, walks, and home runs allowed, infield putouts, errors, defense-independent earned run average ratio, defensive efficiency ratio, saves, runners left on base, shutouts, and walks per …


On The Three Dimensional Interaction Between Flexible Fibers And Fluid Flow, Bogdan Nita, Ryan Allaire Jan 2017

Department of Mathematics Facuty Scholarship and Creative Works

In this paper we discuss the deformation of a flexible fiber clamped to a spherical body and immersed in a flow of fluid moving with a speed ranging between 0 and 50 cm/s by means of three dimensional numerical simulation developed in COMSOL. The effects of flow speed and initial configuration angle of the fiber relative to the flow are analyzed. A rigorous analysis of the numerical procedure is performed and our code is benchmarked against well established cases. The flow velocity and pressure are used to compute drag forces upon the fiber. Of particular interest is the behavior …


Hpcnmf: A High-Performance Toolbox For Non-Negative Matrix Factorization, Karthik Devarajan, Guoli Wang Feb 2016

COBRA Preprint Series

Non-negative matrix factorization (NMF) is a widely used machine learning algorithm for dimension reduction of large-scale data. It has found successful applications in a variety of fields such as computational biology, neuroscience, natural language processing, information retrieval, image processing and speech recognition. In bioinformatics, for example, it has been used to extract patterns and profiles from genomic and text-mining data as well as in protein sequence and structure analysis. While the scientific performance of NMF is very promising in dealing with high dimensional data sets and complex data structures, its computational cost is high and sometimes could be critical for …
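As a point of reference for what such a toolbox computes, the classic Lee-Seung multiplicative updates for the Frobenius-norm NMF objective fit in a few lines of NumPy (a minimal sketch, unrelated to the HPCNMF implementation itself):

```python
import numpy as np

def nmf(V, rank, n_iter=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for V ~ W @ H with W, H >= 0,
    minimizing squared Frobenius error. A reference sketch, not a
    high-performance implementation."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H, keep >= 0
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W, keep >= 0
    return W, H

rng = np.random.default_rng(1)
A = rng.random((20, 3))
B = rng.random((3, 10))
V = A @ B                        # exactly rank-3, non-negative test matrix
W, H = nmf(V, rank=3)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
assert (W >= 0).all() and (H >= 0).all() and rel_err < 0.25
```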


Gis-Integrated Mathematical Modeling Of Social Phenomena At Macro- And Micro- Levels—A Multivariate Geographically-Weighted Regression Model For Identifying Locations Vulnerable To Hosting Terrorist Safe-Houses: France As Case Study, Elyktra Eisman Nov 2015

FIU Electronic Theses and Dissertations

Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models to support the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat—it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to …


Bootstrapping Vs. Asymptotic Theory In Property And Casualty Loss Reserving, Andrew J. Difronzo Jr. Apr 2015

Honors Projects in Mathematics

One of the key functions of a property and casualty (P&C) insurance company is loss reserving, which calculates how much money the company should retain in order to pay out future claims. Most P&C insurance companies use non-stochastic (non-random) methods to estimate these future liabilities. However, future loss data can also be projected using generalized linear models (GLMs) and stochastic simulation. Two simulation methods that will be the focus of this project are: bootstrapping methodology, which resamples the original loss data (creating pseudo-data in the process) and fits the GLM parameters based on the new data to estimate the sampling …
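The bootstrapping half of the comparison can be sketched generically: resample the observed losses with replacement and read a confidence interval off the empirical quantiles (toy data and the sample mean stand in here for the paper's GLM-based reserve estimate):

```python
import random, statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000,
                 alpha=0.05, rng=random):
    """Percentile bootstrap: resample the observed losses with
    replacement, recompute the statistic, and read the CI from the
    empirical quantiles of the resampled statistics."""
    stats = sorted(
        stat([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    lo = stats[int(n_boot * alpha / 2)]
    hi = stats[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

rng = random.Random(0)
losses = [120, 95, 300, 150, 80, 210, 175, 130, 260, 110]  # toy loss data
lo, hi = bootstrap_ci(losses, rng=rng)
assert lo < statistics.mean(losses) < hi
```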


Best Practice Recommendations For Data Screening, Justin A. Desimone, Peter D. Harms, Alice J. Desimone Feb 2015

Department of Management: Faculty Publications

Survey respondents differ in their levels of attention and effort when responding to items. There are a number of methods researchers may use to identify respondents who fail to exert sufficient effort in order to increase the rigor of analysis and enhance the trustworthiness of study results. Screening techniques are organized into three general categories, which differ in impact on survey design and potential respondent awareness. Assumptions and considerations regarding appropriate use of screening techniques are discussed along with descriptions of each technique. The utility of each screening technique is a function of survey design and administration. Each technique has …


Spin Glass Reflection Of The Decoding Transition For Quantum Error Correcting Codes, Alexey Kovalev, Leonid P. Pryadko Jan 2015

Department of Physics and Astronomy: Faculty Publications

We study the decoding transition for quantum error correcting codes with the help of a mapping to random-bond Wegner spin models. Families of quantum low density parity-check (LDPC) codes with a finite decoding threshold lead to both known models (e.g., random bond Ising and random plaquette Z2 gauge models) as well as unexplored earlier generally non-local disordered spin models with non-trivial phase diagrams. The decoding transition corresponds to a transition from the ordered phase by proliferation of "post-topological" extended defects which generalize the notion of domain walls to non-local spin models. In recently discovered quantum LDPC code families with …


Light Pollution Research Through Citizen Science, John Kanemoto Aug 2014

STAR Program Research Presentations

Light pollution (LP) can disrupt and/or degrade the health of all living things, as well as their environments. The goal of my research at the NOAO was to check the accuracy of the citizen science LP reporting systems entitled Globe at Night (GaN), Dark Sky Meter (DSM), and Loss of the Night (LoN). On the GaN webpage, the darkness of the night sky (DotNS) is reported by selecting a magnitude chart. Each magnitude chart has a different density/number of stars around a specific constellation. A greater number of stars implies a darker night sky. Within the DSM iPhone application, a …


Relationships Between Elements Of Leslie Matrices And Future Growth Of The Population, Lorisha Lynn Riley Mar 2014

Honors Program Projects

Leslie matrices have been used for years to model and predict the growth of animal populations. Recently, general rules have been applied that can relatively easily determine whether an animal population will grow or decline. My mentor, Dr. Justin Brown, and I examine more specifically whether there are relationships between certain elements of a population and the dominant eigenvalue, which determines growth. We consider not only the general 3x3 Leslie matrix but also modified versions for incomplete data and migration models of Leslie matrices. We successfully found several connections within these cases; however, there is much …
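The link between a Leslie matrix and population growth runs through its dominant eigenvalue; power iteration recovers it without any linear-algebra library (illustrative vital rates, not data from the project):

```python
def dominant_eigenvalue(matrix, n_iter=500):
    """Power iteration: repeated multiplication converges to the
    dominant eigenvalue, which for a Leslie matrix is the long-run
    population growth rate (> 1 grows, < 1 declines)."""
    v = [1.0] * len(matrix)
    lam = 1.0
    for _ in range(n_iter):
        w = [sum(row[j] * v[j] for j in range(len(v))) for row in matrix]
        lam = max(abs(x) for x in w)     # rescale by the largest entry
        v = [x / lam for x in w]
    return lam

# 3x3 Leslie matrix: top row = fecundities, subdiagonal = survival rates
leslie = [[0.0, 1.5, 1.0],
          [0.5, 0.0, 0.0],
          [0.0, 0.4, 0.0]]
lam = dominant_eigenvalue(leslie)
assert 0.9 < lam < 1.0   # dominant eigenvalue below 1: population declines
```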


Modelling And Analysis On Noisy Financial Time Series, Jinsong Leng Jan 2014

Research outputs 2014 to 2021

Building prediction models from historical time series has attracted many researchers in the last few decades. For example, traders of hedge funds and experts in agriculture demand precise models to predict possible trends and cycles. Even though many statistical and machine learning (ML) models have been proposed, there are no universal solutions available to resolve this particular problem. In this paper, the powerful forward-backward non-linear filter and a wavelet-based denoising method are introduced to remove the high level of noise embedded in financial time series. With the filtered time series, the statistical …


Stochastic Dea With A Perfect Object And Its Application To Analysis Of Environmental Efficiency, Alexander Vaninsky Jul 2013

Publications and Research

The paper introduces stochastic DEA with a Perfect Object (SDEA PO). The Perfect Object (PO) is a virtual Decision Making Unit (DMU) that has the smallest inputs and greatest outputs. Including the PO in a collection of actual objects yields an explicit formula of the efficiency index. Given the distributions of DEA inputs and outputs, this formula allows us to derive the probability distribution of the efficiency score, to find its mathematical expectation, and to deliver common (group–related) and partial (object-related) efficiency components. We apply this approach to a prospective analysis of environmental efficiency of the major national and regional …


A Superposed Log-Linear Failure Intensity Model For Repairable Artillery Systems, Byeong Min Mun, Suk Joo Bae, Paul Kvam Jan 2013

Department of Math & Statistics Faculty Publications

This article investigates complex repairable artillery systems that include several failure modes. We derive a superposed process based on a mixture of nonhomogeneous Poisson processes in a minimal repair model. This allows for a bathtub-shaped failure intensity that models artillery data better than currently used methods. The method of maximum likelihood is used to estimate model parameters and construct confidence intervals for the cumulative intensity of the superposed process. Finally, we propose an optimal maintenance policy for repairable systems with bathtub-shaped intensity and apply it to the artillery-failure data.
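A superposition of log-linear intensities is easy to simulate with Lewis-Shedler thinning, which also shows why two opposing log-linear terms produce a bathtub shape (illustrative parameters, not the artillery data):

```python
import math, random

def simulate_nhpp(intensity, t_end, lam_max, rng=random):
    """Lewis-Shedler thinning: draw candidate events from a homogeneous
    Poisson process with rate lam_max, keep each with probability
    intensity(t) / lam_max."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)
        if t > t_end:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)

# Superposition of two log-linear intensities gives a bathtub shape:
# early failures decay (negative slope) while wear-out grows (positive slope)
def bathtub(t):
    return math.exp(1.0 - 2.0 * t) + math.exp(-2.0 + 0.8 * t)

rng = random.Random(3)
t_end = 5.0
# The intensity is convex, so its maximum on [0, t_end] is at an endpoint;
# summing both endpoint values gives a crude but valid envelope
lam_max = bathtub(0.0) + bathtub(t_end)
events = simulate_nhpp(bathtub, t_end, lam_max, rng=rng)
assert all(0 < e <= t_end for e in events) and events == sorted(events)
```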


A Normal Truncated Skewed-Laplace Model In Stochastic Frontier Analysis, Junyi Wang May 2012

Masters Theses & Specialist Projects

Stochastic frontier analysis is an exciting method of economic production modeling that is relevant to hospitals, stock markets, manufacturing factories, and services. In this paper, we create a new model using the normal distribution and the truncated skew-Laplace distribution, namely the normal-truncated skew-Laplace model. This is a generalization of the normal-exponential case. Furthermore, we compute the true technical efficiency and estimated technical efficiency of the normal-truncated skew-Laplace model. Also, we compare the technical efficiencies of the normal-truncated skew-Laplace model and the normal-exponential model.


Spatial And Temporal Correlations Of Freeway Link Speeds: An Empirical Study, Piotr J. Rachtan Jan 2012

Masters Theses 1911 - February 2014

Congestion on roadways and high levels of uncertainty in traffic conditions are major considerations for trip planning. The purpose of this research is to investigate the characteristics and patterns of spatial and temporal correlations and also to detect other variables that affect correlation in a freeway setting. 5-minute speed aggregates from the Performance Measurement System (PeMS) database are obtained for two directions of an urban freeway – I-10 between Santa Monica and Los Angeles, California. Observations are for all non-holiday weekdays between January 1st and June 30th, 2010. Other variables include traffic flow, ramp locations, number of lanes and the …
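The core computation, correlation between speed observations separated by a time lag, can be sketched with plain Pearson correlation on toy 5-minute aggregates (hypothetical numbers, not PeMS data):

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def lagged_correlation(speeds, lag):
    """Temporal correlation between the speed series and itself
    `lag` aggregation intervals later."""
    return pearson(speeds[:-lag], speeds[lag:])

# Toy 5-minute speed aggregates (mph) with a congestion dip mid-series
speeds = [65, 64, 66, 63, 60, 52, 41, 35, 33, 34, 38, 47, 58, 63, 65]
r1 = lagged_correlation(speeds, 1)
r5 = lagged_correlation(speeds, 5)
assert r1 > 0.5 and r1 > r5   # correlation decays as the time lag grows
```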


Flexible Distributed Lag Models Using Random Functions With Application To Estimating Mortality Displacement From Heat-Related Deaths, Roger D. Peng Dec 2011

Johns Hopkins University, Dept. of Biostatistics Working Papers

No abstract provided.


Generalized Bathtub Hazard Models For Binary-Transformed Climate Data, James Polcer May 2011

Masters Theses & Specialist Projects

In this study, we use hazard-based modeling as an alternative statistical framework to time series methods as applied to climate data. Data collected from the Kentucky Mesonet will be used to study the distributional properties of the duration of high- and low-energy wind events relative to an arbitrary threshold. Our objectives were to fit bathtub models proposed in the literature, propose a generalized bathtub model, apply these models to Kentucky Mesonet data, and make recommendations as to the feasibility of wind power generation. Using two different thresholds (1.8 and 10 mph respectively), results show that the Hjorth bathtub model consistently performed better …


Probability Models For Blackjack Poker, Charlie H. Cooke Jan 2010

Mathematics & Statistics Faculty Publications

For simplicity in calculation, previous analyses of blackjack poker have employed models that sample with replacement. In order to assess what degree of error this may induce, the purpose here is to calculate results for a typical hand where sampling without replacement is employed. It is seen that significant error can result when long runs are required to complete the hand. The hand examined is itself of particular interest, as regards both its outstanding expectations of high yield and certain implications for pair splitting of two nines against the dealer's seven. Theoretical and experimental methods are used in order …
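The direction of the with-replacement error is already visible in a two-card calculation done exactly with fractions (a generic illustration, not the specific hand analyzed in the paper):

```python
from fractions import Fraction

# A single 52-card deck has 16 ten-valued cards (10, J, Q, K)
TENS, DECK = 16, 52

# Probability that both of two cards drawn are ten-valued:
with_replacement = Fraction(TENS, DECK) ** 2
without_replacement = Fraction(TENS, DECK) * Fraction(TENS - 1, DECK - 1)

# Sampling with replacement overstates the chance of a second ten --
# the kind of error that compounds over the long runs the paper studies
assert with_replacement > without_replacement
```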


Shrinkage Estimation Of Expression Fold Change As An Alternative To Testing Hypotheses Of Equivalent Expression, Zahra Montazeri, Corey M. Yanofsky, David R. Bickel Aug 2009

COBRA Preprint Series

Research on analyzing microarray data has focused on the problem of identifying differentially expressed genes to the neglect of the problem of how to integrate evidence that a gene is differentially expressed with information on the extent of its differential expression. Consequently, researchers currently prioritize genes for further study either on the basis of volcano plots or, more commonly, according to simple estimates of the fold change after filtering the genes with an arbitrary statistical significance threshold. While the subjective and informal nature of the former practice precludes quantification of its reliability, the latter practice is equivalent to using a …


A Monte Carlo Power Analysis Of Traditional Repeated Measures And Hierarchical Multivariate Linear Models In Longitudinal Data Analysis, Hua Fang, Gordon P. Brooks, Maria L. Rizzo, Kimberly A. Espy, Robert S. Barcikowski May 2008

Developmental Cognitive Neuroscience Laboratory: Faculty and Staff Publications

The power properties of traditional repeated measures and hierarchical linear models have not been clearly determined in the balanced design for longitudinal studies in the current literature. A Monte Carlo power analysis of traditional repeated measures and hierarchical multivariate linear models are presented under three variance-covariance structures. Results suggest that traditional repeated measures have higher power than hierarchical linear models for main effects, but lower power for interaction effects. Significant power differences are also exhibited when power is compared across different covariance structures. Results also supplement more comprehensive empirical indexes for estimating model precision via bootstrap estimates and the approximate …


A Method For Visualizing Multivariate Time Series Data, Roger D. Peng Feb 2008

Johns Hopkins University, Dept. of Biostatistics Working Papers

Visualization and exploratory analysis are an important part of any data analysis and are made more challenging when the data are voluminous and high-dimensional. One such example is environmental monitoring data, which are often collected over time and at multiple locations, resulting in a geographically indexed multivariate time series. Financial data, although not necessarily containing a geographic component, present another source of high-volume multivariate time series data. We present the mvtsplot function which provides a method for visualizing multivariate time series data. We outline the basic design concepts and provide some examples of its usage by applying it to a …


Bayesian Analysis For Penalized Spline Regression Using Win Bugs, Ciprian M. Crainiceanu, David Ruppert, M.P. Wand Dec 2007

Johns Hopkins University, Dept. of Biostatistics Working Papers

Penalized splines can be viewed as BLUPs in a mixed model framework, which allows the use of mixed model software for smoothing. Thus, software originally developed for Bayesian analysis of mixed models can be used for penalized spline regression. Bayesian inference for nonparametric models enjoys the flexibility of nonparametric models and the exact inference provided by the Bayesian inferential machinery. This paper provides a simple, yet comprehensive, set of programs for the implementation of nonparametric Bayesian analysis in WinBUGS. MCMC mixing is substantially improved from the previous versions by using low-rank thin-plate splines instead of a truncated polynomial basis. Simulation time …


The Time Invariance Principle, Ecological (Non)Chaos, And A Fundamental Pitfall Of Discrete Modeling, Bo Deng Mar 2007

Department of Mathematics: Faculty Publications

This paper shows that most discrete models used for population dynamics in ecology are inherently pathological, in that their predictions cannot be independently verified by experiments, because they violate a fundamental principle of physics. The result is used to tackle an ongoing controversy regarding ecological chaos. Another implication of the result is that all continuous dynamical systems must be modeled by differential equations. As a result, it suggests that research based on discrete modeling must be closely scrutinized and that the teaching of calculus and differential equations must be emphasized for students of biology.
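The flavor of the discrete-versus-continuous contrast can be seen by comparing the discrete logistic map, which is chaotic for suitable parameters, with the continuous logistic equation, which is monotone (a standard textbook contrast, not the paper's own derivation):

```python
def logistic_map(r, x0, n):
    """Discrete logistic model x_{k+1} = r * x_k * (1 - x_k)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# At r = 3.9 the map shows sensitive dependence on initial conditions:
# two trajectories starting 1e-6 apart separate by order one
a = logistic_map(3.9, 0.200000, 60)
b = logistic_map(3.9, 0.200001, 60)
assert max(abs(u - v) for u, v in zip(a, b)) > 0.1

# The continuous logistic ODE x' = r x (1 - x), by contrast, rises
# monotonically to the carrying capacity; a small forward-Euler step
# mimics it without any chaos
x, dt = 0.2, 0.001
for _ in range(20000):
    x += dt * 3.9 * x * (1.0 - x)
assert 0.99 < x < 1.01   # settles at the carrying capacity
```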


Diffusion And Fractional Diffusion Based Models For Multiple Light Scattering And Image Analysis, Jonathan Blackledge Jan 2007

Articles

This paper considers a fractional light diffusion model as an approach to characterizing the case when intermediate scattering processes are present, i.e. the scattering regime is neither strong nor weak. In order to introduce the basis for this approach, we revisit the elements of formal scattering theory and the classical diffusion problem in terms of solutions to the inhomogeneous wave and diffusion equations respectively. We then address the significance of these equations in terms of a random walk model for multiple scattering. This leads to the proposition of a fractional diffusion equation for modelling intermediate strength scattering that is based …