Open Access. Powered by Scholars. Published by Universities.®

Statistics and Probability

2011

Articles 61 - 90 of 525

Full-Text Articles in Physical Sciences and Mathematics

Identifying Outliers In Fuzzy Time Series, S. Suresh, K. Senthamarai Kannan Nov 2011

Journal of Modern Applied Statistical Methods

Time series analysis is often associated with the discovery of patterns and the prediction of features. Forecasting accuracy can be improved by removing outliers identified in the data set using Cook’s distance and the Studentized residual test. In this paper a modified fuzzy time series method is proposed based on a transition probability vector membership function. It is experimentally shown that the proposed method minimizes the average forecasting error compared with other existing methods.
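A minimal sketch of the outlier-screening step mentioned above, using statsmodels influence diagnostics; the simulated series, the trend-plus-lag regression, and the cutoffs (4/n for Cook's distance, |t| > 3 for studentized residuals) are illustrative assumptions, not the authors' procedure.

```python
# Flag candidate time-series outliers with Cook's distance and externally
# studentized residuals from an OLS influence analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=120))       # stand-in series
y[60] += 8.0                              # planted outlier

t = np.arange(len(y))
X = sm.add_constant(np.column_stack([t, np.roll(y, 1)]))[1:]   # trend + lag-1 regressors
res = sm.OLS(y[1:], X).fit()

infl = res.get_influence()
cooks_d, _ = infl.cooks_distance
stud = infl.resid_studentized_external

n = X.shape[0]
flag = (cooks_d > 4.0 / n) | (np.abs(stud) > 3.0)   # rule-of-thumb cutoffs
print("suspected outlier time indices:", np.where(flag)[0] + 1)
```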


Jmasm31: Manova Procedure For Power Calculations (Spss), Alan Taylor Nov 2011

Journal of Modern Applied Statistical Methods

D’Amico, Neilands & Zambarano (2001) showed how the SPSS MANOVA procedure can be used to conduct power calculations for research designs. This article demonstrates a simple way of entering data required for power calculations into SPSS and provides examples that supplement those given by D’Amico, Neilands & Zambarano.


Higher Order Markov Structure-Based Logistic Model And Likelihood Inference For Ordinal Data, Soma Chowdhury Biswas, M. Ataharul Islam, Jamal Nazrul Islam Nov 2011

Journal of Modern Applied Statistical Methods

Azzalini (1994) proposed a first-order Markov chain model for binary data. Azzalini’s model is extended to ordinal data and a second-order model is introduced. Further, test statistics are developed and the power of the test is determined. An application using real data is also presented.


Proxy Pattern-Mixture Analysis For A Binary Variable Subject To Nonresponse., Rebecca H. Andridge, Roderick J. Little Nov 2011

The University of Michigan Department of Biostatistics Working Paper Series

We consider assessment of the impact of nonresponse for a binary survey variable Y subject to nonresponse, when there is a set of covariates observed for nonrespondents and respondents. To reduce dimensionality and for simplicity we reduce the covariates to a continuous proxy variable X that has the highest correlation with Y, estimated from a probit regression analysis of respondent data. We extend our previously proposed proxy-pattern mixture analysis (PPMA) for continuous outcomes to the binary outcome using a latent variable approach. The method does not assume data are missing at random, and creates a framework for sensitivity analyses. Maximum …
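A minimal sketch of the proxy-construction step described in the abstract: fit a probit regression of Y on the covariates using respondents only and take the linear predictor as the proxy X. The DataFrame layout, column names, and synthetic data are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def build_proxy(df, covs):
    """Probit of the binary outcome on the covariates, fitted to respondents only;
    the proxy X is the resulting linear predictor for every case."""
    resp = df[df["responded"] == 1]
    Z = sm.add_constant(resp[covs])
    fit = sm.Probit(resp["y"], Z).fit(disp=0)
    Z_all = sm.add_constant(df[covs])
    return pd.Series(np.dot(Z_all, fit.params), index=df.index, name="proxy_x")

# Illustrative use with synthetic data (column names "y" and "responded" assumed)
rng = np.random.default_rng(0)
n = 500
demo = pd.DataFrame({"age": rng.normal(50, 10, n), "income": rng.normal(0, 1, n)})
latent = 0.03 * (demo["age"] - 50) + 0.8 * demo["income"]
demo["y"] = (latent + rng.normal(size=n) > 0).astype(int)
demo["responded"] = (rng.random(n) < 0.7).astype(int)
demo["proxy_x"] = build_proxy(demo, ["age", "income"])
print(demo[["y", "responded", "proxy_x"]].head())
```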


Comparison Of Several Tests For Combining Several Independent Tests, Madhusudan Bhandary, Xuan Zhang Nov 2011

Journal of Modern Applied Statistical Methods

Several tests for combining p-values from independent tests have been considered to address a particular common testing problem. A simulation study shows that Fisher’s (1932) Inverse Chi-square test is optimal based on a power comparison of several different tests.
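For reference, Fisher's (1932) inverse chi-square method combines k independent p-values via -2 Σ log p_i, which is chi-square with 2k degrees of freedom under the null. A short sketch with illustrative p-values:

```python
# Fisher's inverse chi-square method for combining independent p-values.
import numpy as np
from scipy import stats

p_values = np.array([0.08, 0.15, 0.30, 0.02])   # illustrative p-values

stat = -2.0 * np.sum(np.log(p_values))
combined_p = stats.chi2.sf(stat, df=2 * len(p_values))
print(f"Fisher statistic = {stat:.3f}, combined p = {combined_p:.4f}")

# SciPy provides the same test directly:
stat2, p2 = stats.combine_pvalues(p_values, method="fisher")
```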


A Permutation Test For Compound Symmetry With Application To Gene Expression Data, Tracy L. Morris, Mark E. Payton, Stephanie A. Santorico Nov 2011

Journal of Modern Applied Statistical Methods

The development and application of a permutation test for compound symmetry is described. In a simulation study the permutation test appears to be a level-α test and is robust to non-normality. However, it exhibits poor power, particularly for small samples.
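An illustrative sketch of a permutation test for compound symmetry (equal variances, equal covariances). The statistic below, the squared distance between the sample covariance matrix and its nearest compound-symmetric form, and the within-row column permutation scheme are assumptions for illustration, not necessarily the choices made in the paper.

```python
import numpy as np

def cs_distance(data):
    """Squared distance between the sample covariance matrix and the nearest
    compound-symmetric matrix (common variance, common covariance)."""
    S = np.cov(data, rowvar=False)
    p = S.shape[0]
    var = np.mean(np.diag(S))
    cov = (S.sum() - np.trace(S)) / (p * (p - 1))
    cs = np.full((p, p), cov)
    np.fill_diagonal(cs, var)
    return np.sum((S - cs) ** 2)

def perm_test_cs(data, n_perm=500, seed=0):
    rng = np.random.default_rng(seed)
    data = data - data.mean(axis=0)     # center so only covariance structure drives the test
    observed = cs_distance(data)
    exceed = 0
    for _ in range(n_perm):
        # Under compound symmetry the centered coordinates are exchangeable,
        # so permute the columns independently within every row.
        permuted = np.array([rng.permutation(row) for row in data])
        exceed += cs_distance(permuted) >= observed
    return (exceed + 1) / (n_perm + 1)

x = np.random.default_rng(1).multivariate_normal(
    mean=[0.0, 0.0, 0.0],
    cov=[[1.0, 0.5, 0.1], [0.5, 1.0, 0.5], [0.1, 0.5, 1.0]],
    size=50)
print("permutation p-value:", perm_test_cs(x))
```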


Discriminant Analysis For Repeated Measures Data: Effects Of Mean And Covariance Misspecification On Bias And Error In Discriminant Function Coefficients, Tolulope T. Sajobi, Lisa M. Lix, Longhai Li, William Laverty Nov 2011

Journal of Modern Applied Statistical Methods

Discriminant analysis (DA) procedures based on parsimonious mean and/or covariance structures have been proposed for repeated measures (RM) data. Bias and mean square error of discriminant function coefficients (DFCs) for DA procedures are investigated when the mean and/or covariance structures are correctly specified and when they are misspecified.


Control Balanced Designs Involving Sequences Of Treatments, Cini Varghese, Seema Jaggi Nov 2011

Journal of Modern Applied Statistical Methods

Designs involving sequences of treatments for test vs. control comparisons are suitable for research in which each experimental unit receives treatments over time in order to compare several test treatments to one (or more) control treatment(s). These designs can be advantageously used in screening experiments and bioequivalence trials. Three series of such designs are constructed in incomplete sequences wherein the first class of designs is variance balanced while the other two classes of designs are partially variance balanced for test versus test comparisons of both direct and residual effects of treatments.


Construction Of Control Charts Based On Six Sigma Initiatives For The Number Of Defects And Average Number Of Defects Per Unit, R. Radhakrishnan, P. Balamurugan Nov 2011

Journal of Modern Applied Statistical Methods

A control chart is a statistical device used for the study and control of a repetitive process. In 1931, Shewhart suggested control charts based on 3-sigma limits. Today manufacturing companies around the world apply Six Sigma initiatives, resulting in fewer product defects. Companies practicing Six Sigma initiatives are expected to produce 3.4 or fewer defects per million opportunities, a concept introduced by Motorola in the 1980s. If companies practicing Six Sigma initiatives use the control limits suggested by Shewhart, then no points will fall outside the control limits, owing to the improvement in the quality of the process. …
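For context, the classical Shewhart 3-sigma limits for the number-of-defects (c) chart and the defects-per-unit (u) chart, the starting point that the Six Sigma based charts in the article tighten, can be computed as follows; the defect counts and sample size are illustrative.

```python
# Classical 3-sigma limits for the c chart (defect counts) and u chart
# (average defects per unit). Values below are illustrative only.
import numpy as np

defects = np.array([4, 7, 5, 6, 3, 8, 5, 4, 6, 5])   # defects per inspected sample
units_per_sample = 5                                  # for the u chart

# c chart: counts of defects per sample
c_bar = defects.mean()
c_ucl = c_bar + 3 * np.sqrt(c_bar)
c_lcl = max(0.0, c_bar - 3 * np.sqrt(c_bar))

# u chart: average number of defects per unit, sample size n
u = defects / units_per_sample
u_bar = u.mean()
u_ucl = u_bar + 3 * np.sqrt(u_bar / units_per_sample)
u_lcl = max(0.0, u_bar - 3 * np.sqrt(u_bar / units_per_sample))

print(f"c chart: CL={c_bar:.2f}, UCL={c_ucl:.2f}, LCL={c_lcl:.2f}")
print(f"u chart: CL={u_bar:.2f}, UCL={u_ucl:.2f}, LCL={u_lcl:.2f}")
```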


Higher Order C(T, P, S) Crossover Designs, James F. Reed III Nov 2011

Journal of Modern Applied Statistical Methods

A crossover study is a repeated measures design in which each subject is randomly assigned to a sequence of treatments, including at least two treatments. The most damning characteristic of a crossover study is the potential for a carryover effect of one treatment into the next period. To address the first-order carryover problem characteristic of the classic AB|BA design, the design must be extended. One alternative uses additional treatment sequences in two periods; a second option is to add a third period and repeat one of the treatments. Assuming a traditional model that specifies a first-order carryover effect, this study …


Height-Diameter Relationship In Tree Modeling Using Simultaneous Equation Techniques In Correlated Normal Deviates, S. O. Oyamakin Nov 2011

Journal of Modern Applied Statistical Methods

In order to study the complex simultaneous relationships that exist in forest/tree growth modeling, six estimation methods for a simultaneous equation model are examined to determine how they cope with varying degrees of correlation between pairs of random deviates, using average parameter estimates. A two-equation simultaneous system with an assumed covariance matrix was considered. The model was structured to have a mutual correlation between pairs of random deviates: a violation of the assumption of mutual independence between pairs of such random deviates. The correlations between the pairs of normal deviates were generated using three scenarios, r = 0.0, 0.3 and 0.5. The performances …
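A minimal sketch of the deviate-generation step: drawing pairs of standard normal deviates with a target correlation r (the scenarios r = 0.0, 0.3, 0.5) from a Cholesky factor of the 2x2 correlation matrix. This is a generic construction, not the authors' simulation code.

```python
# Generate pairs of normal deviates with a specified correlation r.
import numpy as np

def correlated_deviates(r, size, seed=0):
    rng = np.random.default_rng(seed)
    corr = np.array([[1.0, r], [r, 1.0]])
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((size, 2))
    return z @ L.T                      # columns have correlation approximately r

for r in (0.0, 0.3, 0.5):
    u = correlated_deviates(r, size=10_000, seed=42)
    print(f"target r={r:.1f}, sample r={np.corrcoef(u[:, 0], u[:, 1])[0, 1]:.3f}")
```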


Identification Of Optimal Autoregressive Integrated Moving Average Model On Temperature Data, Olusola Samuel Makinde, Olusoga Akin Fasoranbaku Nov 2011

Journal of Modern Applied Statistical Methods

Autoregressive Integrated Moving Average (ARIMA) processes of various orders are presented to identify an optimal model from a class of models. Parameters of the models are estimated using an Ordinary Least Square (OLS) approach. ARIMA (p, d, q) is formulated for maximum daily temperature data in Ondo and Zaira from January 1995 to November 2005. The choice of ARIMA models of orders p and q is intended to retain persistence in a natural process. To determine the performance of models, Normalized Bayesian Information Criterion is adopted. The ARIMA (1, 1, 1) is adequate for modeling maximum daily temperature in Ondo …
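A hedged sketch of the order-selection idea: fit candidate ARIMA(p, d, q) models and keep the order with the smallest information criterion. statsmodels reports the ordinary BIC rather than the Normalized BIC used in the paper, and the synthetic series stands in for the Ondo/Zaira data, so this is illustrative only.

```python
# Grid-search ARIMA orders and keep the fit with the smallest BIC.
import itertools
import warnings

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

warnings.filterwarnings("ignore")        # silence convergence warnings in the grid search

rng = np.random.default_rng(0)
temps = 30 + np.cumsum(rng.normal(scale=0.3, size=400))   # stand-in temperature series

best = None
for p, d, q in itertools.product(range(3), range(2), range(3)):
    try:
        fit = ARIMA(temps, order=(p, d, q)).fit()
    except Exception:
        continue                          # skip orders that fail to estimate
    if best is None or fit.bic < best[0]:
        best = (fit.bic, (p, d, q))

print("order with the smallest BIC:", best[1], "BIC =", round(best[0], 2))
```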


Tests For Correlation On Bivariate Non-Normal Data, L. Beversdorf, Ping Sa Nov 2011

Journal of Modern Applied Statistical Methods

Two statistics are considered to test the population correlation for non-normally distributed bivariate data. A simulation study shows that both statistics control type I error rates well for left-tailed tests and have reasonable power performance.


Explicit Equations For Acf In Autoregressive Processes In The Presence Of Heteroscedasticity Disturbances, Samir Safi Nov 2011

Journal of Modern Applied Statistical Methods

The autocorrelation function, ACF, is an important guide to the properties of a time series. Explicit equations are derived for the ACF in the presence of heteroscedastic disturbances in pth-order autoregressive, AR(p), processes. Two cases are presented: (1) when the disturbance term follows a general covariance matrix Σ, and (2) when the diagonal elements of Σ are not all identical but σ_ij = 0 for all i ≠ j.
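The heteroscedastic equations themselves are not reproduced in the abstract. As a point of reference only, the classical homoscedastic AR(p) case satisfies the Yule-Walker recursion below (a standard result, not taken from the paper):

```latex
% Classical homoscedastic baseline that the paper generalizes: for a stationary
% AR(p) process y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t
% with white-noise disturbances, the ACF satisfies
\rho(k) = \phi_1\,\rho(k-1) + \phi_2\,\rho(k-2) + \cdots + \phi_p\,\rho(k-p),
\qquad k \ge 1, \qquad \rho(0) = 1.
```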


Lq-Moments For Regional Flood Frequency Analysis: A Case Study For The North-Bank Region Of The Brahmaputra River, India, Abhijit Bhuyan, Munindra Borah Nov 2011

Journal of Modern Applied Statistical Methods

The LQ-moment, proposed by Mudholkar et al. (1998), is used for regional flood frequency analysis of the North-Bank region of the river Brahmaputra, India. Five probability distributions are used with the LQ-moment: generalized extreme value (GEV), generalized logistic (GLO), generalized Pareto (GPA), lognormal (LN3) and Pearson Type III (PE3). The regional frequency analysis procedure proposed by Hosking (1990) for the L-moment is used for the LQ-moment. Based on the LQ-moment ratio diagram and the |Z^dist|-statistic criterion, the PE3 distribution is identified as the robust distribution for the study area. For estimation of floods of various …


Assessing The Impact Of Non-Differential Genotyping Errors On Rare Variant Tests Of Association, Scott Powers, Shyam Gopalakrishnan, Nathan L. Tintle Nov 2011

Faculty Work Comprehensive List

Background/Aims: We aim to quantify the effect of non-differential genotyping errors on the power of rare variant tests and identify those situations when genotyping errors are most harmful. Methods: We simulated genotype and phenotype data for a range of sample sizes, minor allele frequencies, disease relative risks and numbers of rare variants. Genotype errors were then simulated using five different error models covering a wide range of error rates. Results: Even at very low error rates, misclassifying a common homozygote as a heterozygote translates into a substantial loss of power, a result that is exacerbated even further as the minor …
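A minimal sketch of one error mechanism in the class the abstract describes: non-differentially misreading a common homozygote as a heterozygote with a fixed probability. The 0/1/2 minor-allele coding, error rate, and minor allele frequency are illustrative assumptions.

```python
# Simulate non-differential homozygote-to-heterozygote genotyping errors.
import numpy as np

def add_homozygote_to_het_errors(genotypes, err, seed=0):
    """genotypes: array of 0/1/2 minor-allele counts; returns an error-prone copy."""
    rng = np.random.default_rng(seed)
    g = genotypes.copy()
    common_hom = g == 0
    flip = common_hom & (rng.random(g.shape) < err)
    g[flip] = 1
    return g

maf = 0.01
g_true = np.random.default_rng(1).binomial(2, maf, size=5000)
g_obs = add_homozygote_to_het_errors(g_true, err=0.001)
print("true heterozygotes:", (g_true == 1).sum(), "observed:", (g_obs == 1).sum())
```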


A Bayesian Model For Gene Family Evolution, Liang Liu, Lili Yu, Venugopal Kalavacharla, Zhanji Liu Nov 2011

Biostatistics Faculty Publications

Background

A birth and death process is frequently used for modeling the size of a gene family that may vary along the branches of a phylogenetic tree. Under the birth and death model, maximum likelihood methods have been developed to estimate the birth and death rate and the sizes of ancient gene families (numbers of gene copies at the internodes of the phylogenetic tree). This paper aims to provide a Bayesian approach for estimating parameters in the birth and death model.

Results

We develop a Bayesian approach for estimating the birth and death rate and other parameters in the birth …
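A minimal sketch of the birth-and-death model described in the Background: along a single branch, each existing gene copy duplicates at rate `birth` and is lost at rate `death`, simulated with exponential waiting times. Rates, starting copy number, and branch length are illustrative assumptions; this is not the paper's Bayesian estimator.

```python
# Gillespie-style simulation of gene family size under a linear birth-death process.
import numpy as np

def simulate_branch(n0, birth, death, t_branch, seed=0):
    rng = np.random.default_rng(seed)
    n, t = n0, 0.0
    while n > 0:
        total_rate = n * (birth + death)
        t += rng.exponential(1.0 / total_rate)
        if t > t_branch:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
    return n

sizes = [simulate_branch(n0=3, birth=0.002, death=0.002, t_branch=100.0, seed=s)
         for s in range(5)]
print("gene family sizes at branch end:", sizes)
```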


Cagan Type Rational Expectations Model On Time Scales With Their Applications To Economics, Funda Ekiz Nov 2011

Masters Theses & Specialist Projects

Rational expectations provide people or economic agents making future decisions with available information and past experiences. The first approach to the idea of rational expectations was given approximately fifty years ago by John F. Muth. Many models in economics have been studied using the rational expectations idea. The most familiar among them is the rational expectations version of Cagan's hyperinflation model, where the expectation for tomorrow is formed using all the information available today. This model was reinterpreted by Thomas J. Sargent and Neil Wallace in 1973. After that time, many solution techniques were suggested to solve the …


Depicting Estimates Using The Intercept In Meta-Regression Models: The Moving Constant Technique, Blair T. Johnson Dr., Tania B. Huedo-Medina Dr. Oct 2011

CHIP Documents

In any scientific discipline, the ability to portray research patterns graphically often aids greatly in interpreting a phenomenon. In part to depict phenomena, the statistics and capabilities of meta-analytic models have grown increasingly sophisticated. Accordingly, this article details how to move the constant in weighted meta-analysis regression models (viz. “meta-regression”) to illuminate the patterns in such models across a range of complexities. Although it is commonly ignored in practice, the constant (or intercept) in such models can be indispensable when it is not relegated to its usual static role. The moving constant technique makes possible estimates and confidence intervals at …
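A minimal sketch of the moving-constant idea: recentering the moderator at a value x0 in an inverse-variance weighted meta-regression makes the intercept the estimated effect size at x0, with its confidence interval. The fixed-effect WLS weighting and all data values are illustrative assumptions; the article works with more general weighted meta-regression models.

```python
# Recenter the moderator so the intercept estimates the effect at a chosen value.
import numpy as np
import statsmodels.api as sm

effect = np.array([0.10, 0.35, 0.22, 0.48, 0.15, 0.40])   # study effect sizes
var = np.array([0.02, 0.03, 0.01, 0.04, 0.02, 0.03])      # sampling variances
moderator = np.array([20, 45, 30, 60, 25, 55])            # e.g., mean participant age

for x0 in (25, 40, 55):                                    # moderator values of interest
    X = sm.add_constant(moderator - x0)                    # "move" the constant to x0
    fit = sm.WLS(effect, X, weights=1.0 / var).fit()
    est = fit.params[0]
    lo, hi = fit.conf_int()[0]
    print(f"estimated effect at moderator={x0}: {est:.3f}  [{lo:.3f}, {hi:.3f}]")
```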


Estimation Of A Non-Parametric Variable Importance Measure Of A Continuous Exposure, Chambaz Antoine, Pierre Neuvial, Mark J. Van Der Laan Oct 2011

U.C. Berkeley Division of Biostatistics Working Paper Series

We define a new measure of variable importance of an exposure on a continuous outcome, accounting for potential confounders. The exposure features a reference level x0 with positive mass and a continuum of other levels. For the purpose of estimating it, we fully develop the semi-parametric estimation methodology called targeted minimum loss estimation (TMLE) [van der Laan & Rubin, 2006; van der Laan & Rose, 2011]. We cover the whole spectrum of its theoretical study (convergence of the iterative procedure which is at the core of the TMLE methodology; consistency and asymptotic normality of the estimator), practical implementation, simulation …


Interactive Real-Time Embedded Systems Education Infused With Applied Internet Telephony, Kyle Persohn, Dennis Brylow Oct 2011

Mathematics, Statistics and Computer Science Faculty Research and Publications

The transition from traditional circuit-switched phone systems to modern packet-based Internet telephony networks demands tools to support Voice over Internet Protocol (VoIP) development. In this paper, we introduce the XinuPhone, an integrated hardware/software approach for educating users about VoIP technology on a real-time embedded platform. We propose modular course topics for design-oriented, hands-on laboratory exercises: filter design, timing, serial communications, interrupts and resource budgeting, network transmission, and system benchmarking. Our open-source software platform encourages development and testing of new CODECs alongside existing standards, unlike similar commercial solutions. Furthermore, the supporting hardware features inexpensive, readily available components designed specifically for educational …


Student Fact Book, Fall 2011, Thirty-Fifth Annual Edition, Wright State University, Office Of Student Information Systems, Wright State University Oct 2011

Wright State University Student Fact Books

The student fact book has general demographic information on all students enrolled at Wright State University for Fall Quarter, 2011.


An Experimental Nexos Laboratory Using Virtual Xinu, Paul Ruth, Dennis Brylow Oct 2011

Mathematics, Statistics and Computer Science Faculty Research and Publications

The Nexos Project is a joint effort between Marquette University, the University of Buffalo, and the University of Mississippi to build curriculum materials and a supporting experimental laboratory for hands-on projects in computer systems courses. The approach focuses on inexpensive, flexible, commodity embedded hardware, freely available development and debugging tools, and a fresh implementation of a classic operating system, Embedded Xinu, that is ideal for student exploration. This paper describes an extension to the Nexos laboratory that includes a new target platform composed of Qemu virtual machines. Virtual Xinu addresses two challenges that limit the effectiveness of Nexos. First, potential …


Beyond Multiple Regression: Using Commonality Analysis To Better Understand R2 Results, Russell Warne Sep 2011

Russell T Warne

Multiple regression is one of the most common statistical methods used in quantitative educational research. Despite the versatility and easy interpretability of multiple regression, it has some shortcomings in the detection of suppressor variables and in the somewhat arbitrary assignment of values to the structure coefficients of correlated independent variables. Commonality analysis, heretofore rarely used in gifted education research, is a statistical method that partitions the explained variance of a dependent variable into nonoverlapping parts according to the independent variable(s) that are related to each portion. This Methodological Brief includes an example of commonality analysis and equations for researchers who wish to conduct their …
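A minimal sketch of a two-predictor commonality analysis: R-squared from the full and single-predictor models is partitioned into each predictor's unique contribution and their common component. The synthetic data and variable names are illustrative.

```python
# Partition R^2 into unique and common components for two correlated predictors.
import numpy as np
import statsmodels.api as sm

def r2(y, *preds):
    X = sm.add_constant(np.column_stack(preds))
    return sm.OLS(y, X).fit().rsquared

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=200)      # correlated predictors
y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=200)

r2_full, r2_x1, r2_x2 = r2(y, x1, x2), r2(y, x1), r2(y, x2)
unique_x1 = r2_full - r2_x2
unique_x2 = r2_full - r2_x1
common = r2_x1 + r2_x2 - r2_full
print(f"R2={r2_full:.3f}  unique(x1)={unique_x1:.3f}  "
      f"unique(x2)={unique_x2:.3f}  common={common:.3f}")
```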


Heterogeneity And Data Analysis, Peter J. Taylor Sep 2011

Working Papers on Science in a Changing World

This working paper is a discussion paper for a September 2011 meeting of the research group of Prof. Di Cook on data visualization and exploratory data analysis at Iowa State University. A taxonomy of eleven kinds of heterogeneity is presented, followed by a set of vignettes that illustrate some of the meanings and sketch some implications, then a series of images that illustrate the heterogeneities. Several of the vignettes speak to a broad contention about heterogeneity and control: In relation to modern understandings of heredity and development over the life course, research and application of resulting knowledge are untroubled by …


Bland-Altman Plots For Evaluating Agreement Between Solid Tumor Measurements, Chaya S. Moskowitz, Mithat Gonen Sep 2011

Memorial Sloan-Kettering Cancer Center, Dept. of Epidemiology & Biostatistics Working Paper Series

Rationale and Objectives. Solid tumor measurements are regularly used in clinical trials of anticancer therapeutic agents and in clinical practice when managing patients' care. Consequently, studies evaluating the reproducibility of solid tumor measurements are important, as a lack of reproducibility may directly affect patient management. The authors propose utilizing a modified Bland-Altman plot with a difference metric that lends itself naturally to this situation and facilitates interpretation. Materials and Methods. The modification to the Bland-Altman plot involves replacing the difference plotted on the vertical axis with the relative percent change (RC) between the two measurements. This quantity is the same one used …
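A minimal sketch of the modified plot described above: the relative percent change between the two measurements on the vertical axis against their mean, with the usual bias and limits-of-agreement reference lines. The measurement values and the choice of the first reading as the RC denominator are illustrative assumptions.

```python
# Bland-Altman style plot with relative percent change (RC) on the vertical axis.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
first = rng.uniform(10, 60, size=40)                 # first tumor reading (mm)
second = first * rng.normal(1.0, 0.08, size=40)      # second reading (mm)

rc = 100.0 * (second - first) / first                # relative percent change
mean_size = (first + second) / 2.0
bias, sd = rc.mean(), rc.std(ddof=1)

plt.scatter(mean_size, rc)
for level in (bias, bias - 1.96 * sd, bias + 1.96 * sd):
    plt.axhline(level, linestyle="--")
plt.xlabel("Mean of the two measurements (mm)")
plt.ylabel("Relative percent change (%)")
plt.title("Bland-Altman plot with RC on the vertical axis")
plt.show()
```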


A Regularization Corrected Score Method For Nonlinear Regression Models With Covariate Error, David M. Zucker, Malka Gorfine, Yi Li, Donna Spiegelman Sep 2011

Harvard University Biostatistics Working Paper Series

No abstract provided.


Negative Binomial Regression Extensions, Joseph Hilbe Sep 2011

Joseph M Hilbe

Negative Binomial Regression Extensions is an e-book extension of Negative Binomial Regression, 2nd edition, with added R and Stata code and SAS macros, all related to count models.


Supplement To Logistic Regression Models, Joseph Hilbe Sep 2011

Joseph M Hilbe

No abstract provided.


Social Networks Enabled Coordination Model For Cost Management Of Patient Hospital Admissions, Shahadat Uddin, Liaquat Hossain Sep 2011

Shahadat Uddin

In this study, we introduce a social networks enabled coordination model for exploring the effect of the network position of “patient,” “physician,” and “hospital” actors, in a patient-centered care network that evolves during the patient's hospitalization period, on the total cost of coordination. An actor is a node representing an entity, such as an individual or organization, in a social network. In our analysis of actor networks and coordination in the healthcare literature, we identified a significant gap: a number of promising hospital coordination models have been developed (e.g., Guided Care Model, Chronic Care Model) for the current healthcare …