Open Access. Powered by Scholars. Published by Universities.®

Statistics and Probability Commons

10,908 Full-Text Articles 13,744 Authors 1,487,536 Downloads 155 Institutions

All Articles in Statistics and Probability

Heterogeneous Responses To Viral Infection: Insights From Mathematical Modeling Of Yellow Fever Vaccine, James R. Moore 2016 Emory University

Biology and Medicine Through Mathematics Conference

No abstract provided.


Homeolog Specific Expression Bias, Ronald D. Smith 2016 College of William and Mary

Biology and Medicine Through Mathematics Conference

No abstract provided.


Imports, Unionization And Racial Wage Discrimination In The Us, Jacqueline Agesa, Richard U. Agesa 2016 Marshall University

Jacqueline Agesa

Past studies of the relationship between competition and racial wages find that domestic competition reduces racial wage discrimination among nonunion workers. This article examines the effects of foreign competition on the racial wages of union and nonunion workers, utilizing an empirical model that allows for cluster-adjusted standard errors (SEs) by industry. Such a procedure assumes independence of observations across industries but allows for correlation within industries, thereby avoiding overstating the significance of industry-invariant controls. In this analysis, clustered SEs prevent the overstatement of the significance of imports as a means to reduce earnings discrimination. We find evidence of a wage premium for nonunion white ...
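To make the clustering idea concrete, here is a minimal sketch of a wage regression with standard errors clustered by industry; the variable names and simulated data are hypothetical stand-ins, not the article's actual specification.

```python
# Hedged sketch: OLS wage regression with industry-clustered standard errors.
# The columns (log_wage, union, import_share, industry) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "log_wage": rng.normal(2.5, 0.4, n),
    "union": rng.integers(0, 2, n),
    "import_share": rng.uniform(0, 0.5, n),
    "industry": rng.integers(0, 20, n),          # cluster identifier
})

# cov_type="cluster" allows arbitrary correlation of errors within an industry
# while treating observations in different industries as independent.
model = smf.ols("log_wage ~ union + import_share", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["industry"]})
print(result.summary())
```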


Takens Theorem With Singular Spectrum Analysis Applied To Noisy Time Series, Thomas K. Torku 2016 East Tennessee State University

Electronic Theses and Dissertations

The evolution of big data has led to financial time series becoming increasingly complex, noisy, non-stationary and nonlinear. Takens' theorem can be used to analyze and forecast nonlinear time series, but even small amounts of noise can hopelessly corrupt a Takens approach. In contrast, Singular Spectrum Analysis (SSA) is an excellent tool for both forecasting and noise reduction. Fortunately, it is possible to combine the Takens approach with SSA; in fact, estimation of key parameters in Takens' theorem is performed with SSA. In this thesis, we combine the denoising abilities of SSA with the Takens ...
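As a rough sketch of the two ingredients combined in the thesis, the snippet below denoises a synthetic series with a basic SSA reconstruction and then forms a Takens delay embedding; the window length, rank, embedding dimension, and delay are illustrative choices, not the thesis's parameters.

```python
# Hedged sketch: basic singular spectrum analysis (SSA) denoising followed by a
# Takens delay-coordinate embedding, on a synthetic noisy sine wave.
import numpy as np

def ssa_denoise(x, window, rank):
    """Reconstruct x from the leading `rank` singular components of its trajectory matrix."""
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])   # window x k
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank, :]
    # Diagonal averaging back to a single series (Hankelization)
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return recon / counts

def takens_embedding(x, dim, delay):
    """Rows are delay vectors (x_t, x_{t+delay}, ..., x_{t+(dim-1)*delay})."""
    rows = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay: i * delay + rows] for i in range(dim)])

t = np.linspace(0, 20 * np.pi, 2000)
noisy = np.sin(t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
clean = ssa_denoise(noisy, window=100, rank=2)
embedded = takens_embedding(clean, dim=3, delay=10)
print(embedded.shape)
```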


Examining Cost Functionality And Optimization: A Case Study On Testing The Reasonableness Of New Aircraft Using Historical Aircraft Data, Katherine Jozefiak 2016 Washington University in St. Louis

Arts & Sciences Electronic Theses and Dissertations

When pursuing business by competing for government contracts, proving that the submitted price is reasonable is often required. This proof is called a test of reasonableness. This study analyzes data from historical aircraft programs in relation to a new aircraft program in order to demonstrate that the estimated cost of the new program is reasonable. The purpose of this study is to investigate three questions. Is the new program cost reasonable using current industry and government parameters? Is it better to look at programs from a total cost perspective or to break the total cost into subcategory levels? Finally, this study applies a ...


Classification Trees And Rule-Based Modeling Using The C5.0 Algorithm For Self-Image Across Sex And Race In St. Louis, Rohan Shirali 2016 Washington University in St. Louis

Arts & Sciences Electronic Theses and Dissertations

The study population comprised children, adolescents, and adults who were residents of the city of St. Louis at the time of data collection in 2015. The data collected include sex, age, race, measured height and weight, self-reported height and weight, zip code, educational background, exercise and diet habits, and participants' descriptions of their weight and their weight-management strategies (e.g., overweight and trying to lose weight, respectively). I use the C5.0 algorithm to create classification trees and rule-based models to analyze this population. Specifically, I model a binary self-image variable as a function of sex, age, race, zip code, and a ratio ...
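The study uses the C5.0 algorithm, which is typically run through R's C50 package; as a rough Python stand-in (not the author's actual model or data), the sketch below fits a CART-style classification tree on hypothetical variables of the same flavor and prints its rules.

```python
# Hedged sketch: a CART-style classification tree as a stand-in for C5.0.
# Variable names and simulated data are hypothetical, not the St. Louis survey.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 300
X = pd.DataFrame({
    "sex": rng.integers(0, 2, n),
    "age": rng.integers(10, 80, n),
    "race": rng.integers(0, 4, n),
    "reported_to_measured_weight_ratio": rng.normal(1.0, 0.08, n),
})
# Hypothetical binary self-image outcome (1 = positive self-image)
y = (X["reported_to_measured_weight_ratio"] + rng.normal(0, 0.05, n) > 1.0).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```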


Methods For Dealing With Death And Missing Data, And For Standardizing Different Health Variables In Longitudinal Datasets: The Cardiovascular Health Study, Paula Diehr 2016 University of Washington

UW Biostatistics Working Paper Series

Longitudinal studies of older adults usually need to account for deaths and missing data. The study databases often include multiple health-related variables, whose trends over time are hard to compare because they were measured on different scales. Here we present a unified approach to these three problems that was developed and used in the Cardiovascular Health Study. Data were first transformed to a new scale that had interval/ratio properties, and on which “dead” logically takes the value zero. Missing data were then imputed on this new scale, using each person’s own data over time. Imputation could thus be ...
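A toy sketch of the within-person imputation step described above, assuming a hypothetical table and a simple linear-interpolation rule rather than the exact Cardiovascular Health Study procedure.

```python
# Hedged sketch: impute longitudinal health scores within each person, on a
# scale where "dead" is coded 0. Linear interpolation with edge carry-over is
# an illustrative choice, not necessarily the CHS method.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "id":     [1, 1, 1, 1,  2, 2, 2, 2],
    "year":   [0, 1, 2, 3,  0, 1, 2, 3],
    # Transformed health score in [0, 100]; np.nan = missing, 0.0 = dead.
    "health": [80, np.nan, 60, 55,  70, 65, np.nan, 0.0],
})

imputed = df.sort_values(["id", "year"]).copy()
# Fill each person's gaps from their own trajectory only: linear interpolation
# between observed visits, then carry the nearest value to any edge gaps.
imputed["health"] = (
    imputed.groupby("id")["health"]
    .transform(lambda s: s.interpolate().ffill().bfill())
)
print(imputed)
```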


Stochastic Optimization Of Adaptive Enrichment Designs For Two Subpopulations, Aaron Fisher, Michael Rosenblum 2016 Johns Hopkins University Bloomberg School of Public Health

Johns Hopkins University, Dept. of Biostatistics Working Papers

An adaptive enrichment design is a randomized trial that allows enrollment criteria to be modified at interim analyses, based on preset decision rules. When there is prior uncertainty regarding treatment effect heterogeneity, these trials can provide improved power for detecting treatment effects in subpopulations. An obstacle to using these designs is that there is no general approach to determine what decision rules and other design parameters will lead to good performance for a given research problem. To address this, we present a simulated annealing approach for optimizing the parameters of an adaptive enrichment design for a given scientific application. Optimization ...
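A generic simulated annealing loop of the kind described, shown with a placeholder objective over two bounded design parameters; the objective, proposal distribution, and cooling schedule are illustrative, not the paper's trial-design criterion.

```python
# Hedged sketch: generic simulated annealing over two design parameters in [0, 1].
# The objective is a placeholder for "expected trial performance by simulation".
import math
import random

def objective(params):
    x, y = params
    return (x - 0.3) ** 2 + (y - 0.7) ** 2

def simulated_annealing(start, n_iter=5000, temp0=1.0, cooling=0.999):
    random.seed(0)
    current, best = list(start), list(start)
    f_current = f_best = objective(start)
    temp = temp0
    for _ in range(n_iter):
        proposal = [max(0.0, min(1.0, p + random.gauss(0, 0.05))) for p in current]
        f_prop = objective(proposal)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if f_prop < f_current or random.random() < math.exp((f_current - f_prop) / temp):
            current, f_current = proposal, f_prop
            if f_current < f_best:
                best, f_best = list(current), f_current
        temp *= cooling
    return best, f_best

print(simulated_annealing([0.5, 0.5]))
```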


Statistical Interpretation Including The Appropriate Statistical Tests, Olga A. Vsevolozhskaya 2016 University of Kentucky

Olga A. Vsevolozhskaya

Outline:

  1. Evaluation of treatment’s therapeutic potential after experimental stroke.
  2. Post-stroke behavioral testing and functional recovery.


Data Smoothing Techniques: Historical And Modern, Lori L. Murray 2016 The University of Western Ontario

Electronic Thesis and Dissertation Repository

Some elementary data smoothing techniques emerged during the eighteenth century. At that time, smoothing techniques consisted of simple interpolation of the data and eventually evolved into more complex modern methods. Some of the significant milestones in the smoothing, or graduation, of population data will be described, including the smoothing methods of W.F. Sheppard in the early twentieth century. Sheppard's statistical interests focused on data smoothing, the construction of mathematical tables and education. Throughout his career, Sheppard consulted Karl Pearson for advice pertaining to his statistical research. An examination of his correspondence to Pearson will be presented and his smoothing ...
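For readers unfamiliar with smoothing, a centred moving average is one of the elementary smoothers of the kind the thesis traces; the sketch below is a generic illustration, not Sheppard's graduation formula, and the data are made up.

```python
# Hedged sketch: a centred moving average applied to noisy, hypothetical
# population counts by age. Generic illustration only.
import numpy as np

def moving_average(x, window=5):
    """Centred moving average; edge values are left unsmoothed."""
    x = np.asarray(x, dtype=float)
    smoothed = x.copy()
    half = window // 2
    for i in range(half, len(x) - half):
        smoothed[i] = x[i - half:i + half + 1].mean()
    return smoothed

ages = np.arange(20, 40)
noisy_counts = 1000 - 8 * (ages - 20) + np.random.default_rng(2).normal(0, 20, ages.size)
print(moving_average(noisy_counts, window=5))
```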


Effects Of Tillage And Poultry Manure Application Rates On Salmonella And Fecal Indicator Bacteria Concentrations In Tiles Draining Des Moines Lobe Soils, C. E. Hruby, M. L. Soupir, T. B. Moorman, M. Shelley, R. S. Kanwar 2016 Iowa State University

Agricultural and Biosystems Engineering Publications

Application of poultry manure (PM) to cropland as fertilizer is a common practice in artificially drained regions of the Upper Midwest United States. Tile-waters have the potential to contribute pathogenic bacteria to downstream waters. This 3-year study (2010–2012) was designed to evaluate the impacts of manure management and tillage practices on bacteria losses to drainage tiles under a wide range of field conditions. PM was applied annually in spring, prior to planting corn, at application rates ranging from 5 to 40 kg/ha to achieve target rates of 112 and 224 kg/ha nitrogen (PM1 and PM2). Control plots ...


Does Research On Evaluation Matter? Findings From A Survey Of American Evaluation Association Members And Prominent Evaluation Theorists And Scholars, Satoshi Ozeki 2016 Western Michigan University

Research and Creative Activities Poster Day

  • Evaluation is a relatively new, practice-based field
  • Evaluation scholars lead the field by presenting their theories
  • Evaluation theories are not based on empirical evidence
  • Empirical investigation is required to establish the field of evaluation
  • There were calls for more research on evaluation (RoE)
  • The number of studies on RoE has increased in the past decade
  • It is unknown whether RoE is important in the evaluation community


Data-Adaptive Inference Of The Optimal Treatment Rule And Its Mean Reward. The Masked Bandit, Antoine Chambaz, Wenjing Zheng, Mark J. van der Laan 2016 Université Paris Ouest Nanterre

U.C. Berkeley Division of Biostatistics Working Paper Series

This article studies the data-adaptive inference of an optimal treatment rule. A treatment rule is an individualized treatment strategy in which treatment assignment for a patient is based on her measured baseline covariates. Eventually, a reward is measured on the patient. We also infer the mean reward under the optimal treatment rule. We do so in the so-called non-exceptional case, i.e., assuming that there is no stratum of the baseline covariates where treatment is neither beneficial nor harmful, and under a companion margin assumption.

Our pivotal estimator, whose definition hinges on the targeted minimum loss estimation (TMLE) principle ...
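TMLE itself is beyond a short snippet, but the target object can be illustrated with a naive plug-in rule: regress the reward on covariates within each arm and assign each patient the arm with the larger predicted mean reward. The sketch below uses hypothetical simulated data and is not the paper's TMLE-based estimator.

```python
# Hedged sketch: a naive plug-in treatment rule (arm-specific regressions, then
# pick the arm with the larger predicted reward). Illustrates the estimand only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 1000
covariates = rng.normal(size=(n, 2))
treatment = rng.integers(0, 2, n)
# Hypothetical reward with a treatment-by-covariate interaction
reward = (covariates[:, 0] * (treatment == 1)
          - 0.5 * covariates[:, 0] * (treatment == 0)
          + rng.normal(0, 0.5, n))

models = {a: LinearRegression().fit(covariates[treatment == a], reward[treatment == a])
          for a in (0, 1)}

def rule(x):
    """Assign the arm with the larger predicted mean reward for covariates x."""
    preds = {a: m.predict(np.atleast_2d(x))[0] for a, m in models.items()}
    return max(preds, key=preds.get)

print(rule(np.array([1.0, 0.0])), rule(np.array([-1.0, 0.0])))
```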


Psychological Aspects Of Slaughter: Reactions Of College Students To Killing And Butchering Cattle And Hogs, Harold A. Herzog Jr., Sandy McGee 2016 Mars Hill College

Harold Herzog

This study examined the reactions of college students involved in slaughtering cattle and hogs as part of their jobs on a college work crew. The 27 students completed a survey containing items on attitudes toward slaughtering animals and toward different uses of animals. Nineteen were later interviewed. Some aspects of slaughtering were reported to be more bothersome than others. There was a relationship between the subjects' amount of slaughtering experience and both their general attitudes toward various uses of animals and their responses to several of the items on the questionnaire. The perceived benefits of the slaughtering experience ...


Flesch-Kincaid Reading Grade Level Re-Examined: Creating A Uniform Method For Calculating Readability On A Certification Exam, Emily Neuhoff, Kristiana M. Feeser, Kayla Sutherland, Thomas Hovatter 2016 Southern Illinois University Carbondale

Online Journal for Workforce Education and Development

Abstract

Objective: This study attempted to establish a consistent technique for measuring the readability of a state-wide Certified Nursing Assistant (CNA) certification exam. Background: Monitoring the readability level of an exam helps ensure that all test versions do not exceed the exam's maximum allowable reading level and that knowledge of the subject matter, rather than reading ability, is being assessed. Method: A two-part approach was used to specify and evaluate readability. First, two methods (Microsoft Word® (MSW) software and published readability formulae) were used to calculate Flesch Reading Ease (FRE) and Flesch-Kincaid Reading Grade Level (FKRGL) for multiple ...
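For reference, the published Flesch formulas mentioned above can be computed directly from word, sentence, and syllable counts. In the sketch below the syllable counter is a crude vowel-group heuristic; tools such as Microsoft Word use their own counters, so results from different tools can differ.

```python
# Standard published formulas:
#   FRE   = 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
#   FKRGL = 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
import re

def count_syllables(word):
    # Crude heuristic: count contiguous vowel groups, minimum of one.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences          # words per sentence
    spw = syllables / len(words)          # syllables per word
    fre = 206.835 - 1.015 * wps - 84.6 * spw
    fkrgl = 0.39 * wps + 11.8 * spw - 15.59
    return fre, fkrgl

print(readability("The nurse checks the patient. The patient rests comfortably."))
```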


Recommendation To Use Exact P-Values In Biomarker Discovery Research, Margaret Sullivan Pepe, Matthew F. Buas, Christopher I. Li, Garnet L. Anderson 2016 Fred Hutchinson Cancer Rsrch Center

UW Biostatistics Working Paper Series

Background: In biomarker discovery studies, markers are ranked for validation using P-values. Standard P-value calculations use normal approximations that may not be valid for small P-values and small sample sizes common in discovery research.

Methods: We compared exact P-values, valid by definition, with normal and logit-normal approximations in a simulated study of 40 cases and 160 controls. The key measure of biomarker performance was sensitivity at 90% specificity. Data for 3000 uninformative markers and 30 true markers were generated randomly, with 10 replications of the simulation. We also analyzed real data on 2371 antibody array markers ...
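As a generic illustration of the gap the authors describe (not their specific sensitivity-at-90%-specificity calculation), the sketch below compares an exact binomial P-value with its one-sided normal approximation at small counts; the numbers are hypothetical.

```python
# Hedged sketch: exact binomial P-value versus a one-sided normal approximation
# for a small sample. Generic illustration only.
import math
from scipy.stats import binomtest, norm

k, n, p0 = 12, 40, 0.10   # e.g., 12 of 40 cases detected vs. a 10% null rate

exact_p = binomtest(k, n, p0, alternative="greater").pvalue

# Normal approximation to the same upper-tail probability
z = (k - n * p0) / math.sqrt(n * p0 * (1 - p0))
approx_p = norm.sf(z)

print(f"exact {exact_p:.2e}  normal approx {approx_p:.2e}")
```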


The Reliability Of Crowdsourcing: Latent Trait Modeling With Mechanical Turk, Matt Baucum, Steven Rouse Dr., Cindy Miller-Perrin, Elizabeth Mancuso Dr. 2016 Pepperdine University

Seaver College Research And Scholarly Achievement Symposium

Mechanical Turk, an online crowdsourcing platform, has recently received increased attention in the social sciences as studies continue to suggest its viability as a source for reliable experimental data. Given the ease with which large samples can be quickly and inexpensively gathered, it is worth examining whether Mechanical Turk can provide accurate experimental data for methodologies requiring such large samples. One such methodology is Item Response Theory, a psychometric paradigm that defines test items by a mathematical relationship between a respondent’s ability and the probability of item endorsement. To test whether Mechanical Turk can serve as a reliable source ...
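The "mathematical relationship between a respondent's ability and the probability of item endorsement" is typically an item characteristic curve; a standard two-parameter logistic (2PL) version is sketched below with hypothetical parameters, not estimates from the Mechanical Turk data.

```python
# Hedged sketch: the two-parameter logistic (2PL) item response function,
# P(endorse) = 1 / (1 + exp(-a * (theta - b))), with illustrative parameters.
import numpy as np

def item_response_2pl(theta, a, b):
    """Probability of endorsing an item given ability theta,
    discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

abilities = np.linspace(-3, 3, 7)
print(item_response_2pl(abilities, a=1.2, b=0.5))
```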


Modelling Latent Variables For Bayesian Networks, Charles Cain 2016 University of Minnesota, Morris

Undergraduate Research Symposium 2016

Bayesian Networks are networks of interconnected variables used to explain causal relationships with conditional probability. Latent variables, or hidden variables, are variables that cannot be directly measured, such as depression or physical activity. They can be used inside a Bayesian Network. This research models latent variables as weighted sums of observed variables. We use these modeled latent variables as continuous variables in a Bayesian Network. As an example, we look at a Bayesian Network of the causation of diabetes using data from the National Health and Nutrition Examination Survey (NHANES) that is publicly available from the CDC and ...
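A minimal sketch of the modeling idea, assuming hypothetical indicators and weights rather than NHANES estimates: a latent score is built as a weighted sum of standardized observed variables and then used as a continuous parent in a simple conditional probability model.

```python
# Hedged sketch: a latent "physical activity" score as a weighted sum of
# observed indicators, used as a continuous parent of a diabetes node.
# Weights and data are hypothetical, not estimates from NHANES.
import numpy as np

rng = np.random.default_rng(0)
n = 200
observed = {
    "minutes_walked":  rng.normal(30, 10, n),
    "gym_sessions":    rng.poisson(2, n).astype(float),
    "sedentary_hours": rng.normal(6, 2, n),
}
weights = {"minutes_walked": 0.4, "gym_sessions": 0.5, "sedentary_hours": -0.6}

# Latent variable as a weighted sum of standardized observed indicators
latent_activity = sum(
    w * (observed[k] - observed[k].mean()) / observed[k].std()
    for k, w in weights.items()
)

# Conditional probability of diabetes given the latent parent (logistic link)
p_diabetes = 1.0 / (1.0 + np.exp(-(-1.5 - 0.8 * latent_activity)))
print(p_diabetes[:5])
```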

