Performance Modeling And Optimization Techniques For Heterogeneous Computing, 2014 Louisiana Tech University
Performance Modeling And Optimization Techniques For Heterogeneous Computing, Supada Laosooksathit
Doctoral Dissertations
Since Graphics Processing Units (GPUs) have increasingly gained popularity among non-graphics and computational applications, known as General-Purpose computation on GPUs (GPGPU), GPUs have been deployed in many clusters, including the world's fastest supercomputer. However, to get the most out of a GPU system, one should consider both the performance and the reliability of the system.
This dissertation makes four major contributions. First, a two-level checkpoint/restart protocol is proposed that aims to reduce checkpoint and recovery costs with a latency-hiding strategy between a CPU (Central Processing Unit) and a GPU. The experimental results and analysis reveal some benefits, …
An Exploration Of “Non-Economic” Damages In Civil Jury Awards, 2014 Duke Law School
An Exploration Of “Non-Economic” Damages In Civil Jury Awards, Herbert M. Kritzer, Guangya Liu, Neil Vidmar
Faculty Scholarship
Using three primary data sources plus three supplemental sources discussed in an appendix, this paper examines how well non-economic damages could be predicted by economic damages and how the ratio of non-economic damages to economic damages changed as the magnitude of the economic damages awarded by juries increased. We found a mixture of consistent and inconsistent patterns across our various datasets. One fairly consistent pattern was the tendency for the ratio of non-economic to economic damages to decline as the amount of economic damages increased. Moreover, the variability of the ratio also tended to decline as the amount of …
Estimating Population Treatment Effects From A Survey Sub-Sample, 2014 Johns Hopkins Bloomberg School of Public Health
Estimating Population Treatment Effects From A Survey Sub-Sample, Kara E. Rudolph, Ivan Diaz, Michael Rosenblum, Elizabeth A. Stuart
Johns Hopkins University, Dept. of Biostatistics Working Papers
We consider the problem of estimating an average treatment effect for a target population from a survey sub-sample. Our motivating example is generalizing a treatment effect estimated in a sub-sample of the National Comorbidity Survey Replication Adolescent Supplement to the population of U.S. adolescents. To address this problem, we evaluate easy-to-implement methods that account for both non-random treatment assignment and a non-random two-stage selection mechanism. We compare the performance of a Horvitz-Thompson estimator using inverse probability weighting (IPW) and two double robust estimators in a variety of scenarios. We demonstrate that the two double robust estimators generally outperform IPW in …
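As a rough, self-contained sketch of the weighting idea (simulated data, plain logistic regressions, and made-up coefficients; the double robust estimators compared in the abstract are not shown), a Horvitz-Thompson IPW estimate of a population average treatment effect from a survey sub-sample might look like this in Python:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N = 20000                                        # target population
x = rng.normal(size=(N, 1))                      # covariate driving selection and treatment
s = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 0.8 * x[:, 0]))))   # survey selection
a = rng.binomial(1, 1 / (1 + np.exp(-0.5 * x[:, 0])))            # non-random treatment
y = 2.0 * a + x[:, 0] + rng.normal(size=N)       # true population ATE = 2

# Estimate selection and treatment probabilities (in a real survey, selection weights
# would usually come from the sampling design rather than a model fit to the population)
p_sel = LogisticRegression().fit(x, s).predict_proba(x)[:, 1]
sub = s == 1
p_trt = LogisticRegression().fit(x[sub], a[sub]).predict_proba(x[sub])[:, 1]

# Horvitz-Thompson / IPW estimate of the population ATE from the survey sub-sample
w1 = a[sub] / (p_trt * p_sel[sub])
w0 = (1 - a[sub]) / ((1 - p_trt) * p_sel[sub])
ate_ipw = (w1 * y[sub]).sum() / N - (w0 * y[sub]).sum() / N
print(round(ate_ipw, 3))                         # should land near the true value of 2
```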
Meta-Analysis Of Studies Investigating The Effect Of Smoking Cessation On Impatience, 2014 University of Vermont
Meta-Analysis Of Studies Investigating The Effect Of Smoking Cessation On Impatience, Miriam Claire Dash
Graduate College Dissertations and Theses
Impulsivity is not listed as a symptom of nicotine withdrawal in the Diagnostic and Statistical Manual (DSM-IV-TR) nor in withdrawal scales. However, a related term, "impatience," is listed in some nicotine withdrawal scales (Hughes J. R., Measurements of the Effects of Abstinence from Tobacco: A Qualitative Review, 2007). Although impatience is not a synonym of impulsivity, both share the synonym "impetuous"; therefore, impatience can be considered a measure of impulsivity. Although some reviews of the effect of smoking cessation on impatience exist, we know of no quantitative review of prospective studies of whether smoking cessation increases impatience.
Purpose: To evaluate the effect of smoking cessation on impatience as measured by the Minnesota Nicotine Withdrawal Scale-Revised …
Statistical Analysis Of Unreplicated Factorial Designs Using Contrasts, 2014 Georgia Southern University
Statistical Analysis Of Unreplicated Factorial Designs Using Contrasts, Meixi Yang
Electronic Theses and Dissertations
Factorial designs can have a large number of treatments due to the number of factors and the number of levels of each factor. The number of experimental units required to conduct a factorial experiment with $k$ factors is at least the number of treatments. For such an experiment, the total number of experimental units will also depend on the number of replicates for each treatment. The more experimental units used in a study, the greater the cost to the researcher. The minimum cost is associated with the case in which there is one experimental unit per treatment. That is, …
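For instance (a hypothetical design, not one taken from the thesis): an experiment with three factors at 2, 3, and 4 levels has 2 × 3 × 4 = 24 treatments, so it needs N = 24r experimental units with r replicates per treatment; the unreplicated case r = 1 therefore needs only 24 units.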
Statistical Analysis Of Sleep Patterns In Drosophila Melanogaster, 2014 Missouri University of Science and Technology
Statistical Analysis Of Sleep Patterns In Drosophila Melanogaster, Luyang Wang
Masters Theses
Sleep is one of the most preserved and restorative behaviors of animals and is important to human health; lack of sleep may cause numerous diseases, so the study of sleep rhythms is essential, though complicated. To simplify the study of sleep, the fruit fly Drosophila melanogaster is used as a model organism for several reasons. Some of the molecular mechanisms that contribute to the circadian clock in the fruit fly were also found to generate similar cycles in mammals. In order to study sleep patterns in the fruit fly, experiments were designed and performed to …
Methods For Integrative Analysis Of Genomic Data, 2014 Virginia Commonwealth University
Methods For Integrative Analysis Of Genomic Data, Paul Manser
Theses and Dissertations
In recent years, the development of new genomic technologies has allowed for the investigation of many regulatory epigenetic marks besides expression levels, on a genome-wide scale. As the price for these technologies continues to decrease, study sizes will not only increase, but several different assays are beginning to be used for the same samples. It is therefore desirable to develop statistical methods to integrate multiple data types that can handle the increased computational burden of incorporating large data sets. Furthermore, it is important to develop sound quality control and normalization methods as technical errors can compound when integrating multiple genomic …
Methods For Clustering Mixed Data, 2014 University of South Carolina - Columbia
Methods For Clustering Mixed Data, Jeanmarie L. Hendrickson
Theses and Dissertations
We give a brief introduction to cluster analysis and then propose and discuss a few methods for clustering mixed data. In particular, a model-based clustering method for mixed data based on Everitt's (1988) work is described, and we use a simulated annealing method to estimate the parameters for Everitt's model. A penalized log likelihood with the simulated annealing method is proposed as a remedy for the parameter estimates being drawn to extremes. Everitt's approach and the proposed method are compared based on their performance in clustering simulated data. We then use the penalized log likelihood method on a heart disease …
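As a rough illustration of the estimation strategy (not the thesis's actual model for mixed data), the sketch below maximizes a penalized log likelihood for a simple two-component Gaussian mixture by simulated annealing; the penalty term, proposal scale, and cooling schedule are placeholder choices for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: a two-component Gaussian mixture standing in for real clustered data
data = np.concatenate([rng.normal(0, 1, 150), rng.normal(4, 1.5, 100)])

def penalized_loglik(params, lam=0.1):
    w, m1, m2, s1, s2 = params
    if not (0 < w < 1) or s1 <= 0 or s2 <= 0:
        return -np.inf
    dens = (w * np.exp(-0.5 * ((data - m1) / s1) ** 2) / (s1 * np.sqrt(2 * np.pi))
            + (1 - w) * np.exp(-0.5 * ((data - m2) / s2) ** 2) / (s2 * np.sqrt(2 * np.pi)))
    # Penalty discourages variance estimates from collapsing toward zero
    return np.sum(np.log(dens)) - lam * (1 / s1 ** 2 + 1 / s2 ** 2)

params = np.array([0.5, data.mean() - 1, data.mean() + 1, data.std(), data.std()])
best, best_val = params.copy(), penalized_loglik(params)
temp = 1.0
for it in range(20000):
    proposal = params + rng.normal(0, 0.05, size=params.size)   # random perturbation
    delta = penalized_loglik(proposal) - penalized_loglik(params)
    if delta > 0 or rng.random() < np.exp(delta / temp):        # Metropolis acceptance
        params = proposal
        if penalized_loglik(params) > best_val:
            best, best_val = params.copy(), penalized_loglik(params)
    temp *= 0.9995                                              # geometric cooling
print(best, best_val)
```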
Oldtimers & Newcomers In Collective Action, 2014 University of South Carolina - Columbia
Oldtimers & Newcomers In Collective Action, Stefanie R. Chamberlain
Theses and Dissertations
Most work on groups facing collective action assumes that group membership is static, or fixed. Yet static membership is rare, with members joining and leaving groups. In this thesis, I propose to explore how the presence of newcomers to groups affects group coordination. Past research has shown an overall negative effect of newcomers on group contributions. The proposed thesis attempts to further establish the effect by determining whether newcomers, oldtimers, or both are responsible for the declining cooperation in groups. While the empirical component is focused solely on establishing who is responsible for driving down cooperation rates in dynamic groups, …
Bayesian Analysis Of Continuous Curve Functions, 2014 University of South Carolina - Columbia
Bayesian Analysis Of Continuous Curve Functions, Wen Cheng
Theses and Dissertations
We consider Bayesian analysis of continuous curve functions in 1D, 2D and 3D spaces. A fundamental feature of the analysis is that it is invariant under a simultaneous warping/re-parameterization of all target curves, as well as translation, rotation, and scaling of each individual curve where necessary. We introduce Bayesian models based on a special curve representation, the Square Root Velocity Function (SRVF), introduced by Srivastava et al. (2011, IEEE PAMI). A Gaussian process model for the SRVFs of curves is proposed, and suitable prior models such as the Dirichlet distribution are employed for modeling the warping function as a cumulative distribution …
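A minimal numerical sketch of the SRVF of Srivastava et al. (2011), q(t) = f'(t) / sqrt(|f'(t)|), for a discretized curve; the sampling grid and the zero-velocity guard below are implementation choices of this sketch, not of the dissertation:

```python
import numpy as np

def srvf(curve, t):
    """Square Root Velocity Function of a discretized curve.
    curve: (n, d) array of points sampled at parameter values t (length n)."""
    deriv = np.gradient(curve, t, axis=0)          # finite-difference velocity
    speed = np.linalg.norm(deriv, axis=1)
    speed = np.where(speed < 1e-8, 1e-8, speed)    # guard against zero velocity
    return deriv / np.sqrt(speed)[:, None]

# Example: the unit circle in 2D has unit speed, so its SRVF has norm 1 everywhere
t = np.linspace(0, 2 * np.pi, 200)
circle = np.column_stack([np.cos(t), np.sin(t)])
q = srvf(circle, t)
print(np.linalg.norm(q, axis=1)[:5])               # ~1 for each sampled point
```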
Applications Of Bayesian Nonparametrics To Reliability And Survival Data, 2014 University of South Carolina - Columbia
Applications Of Bayesian Nonparametrics To Reliability And Survival Data, Li Li
Theses and Dissertations
Reliability and survival data are widely encountered across many common settings. Subjects under investigation often include machines, bioassays, patients, etc.; their reliability or survival distribution, and its association with covariate processes, are commonly of interest. Within this dissertation, the first two chapters focus on reliability data in which repairable systems fail and receive interventions (e.g., repairs) over the course of the event process. The dissertation begins with a nonparametric test of the commonly assumed "good as old" assumption for minimal repair models, and then a semi-parametric regression model is introduced for reliability data using Kijima's effective age. The third chapter focuses on survival data observed …
Repeat Sales House Price Index Methodology, 2013 Fordham University
Repeat Sales House Price Index Methodology, Chaitra Nagaraja, Lawrence Brown, Susan Wachter
Chaitra H. Nagaraja
No abstract provided.
Business Statistics In Practice, 2013 Miami University
Business Statistics In Practice, Bruce Bowerman, Julie Schermer, Andrew Johnson, Richard O'Connell, Emily Murphree
Andrew M. Johnson
No abstract provided.
A Model Averaging Approach For High-Dimensional Regression, 2013 Melbourne Business School
A Model Averaging Approach For High-Dimensional Regression, Tomohiro Ando, Ker-Chau Li
Tomohiro Ando
No abstract provided.
Approximate Bayesian Computation In State Space Models, 2013 Monash University
Approximate Bayesian Computation In State Space Models, Gael Martin, Brendan McCabe, Christian Robert, Worapree Ole Maneesoonthorn
Worapree Ole Maneesoonthorn
A new approach to inference in state space models is proposed, based on approximate Bayesian computation (ABC). ABC avoids evaluation of the likelihood function by matching observed summary statistics with statistics computed from data simulated from the true process; exact inference being feasible only if the statistics are sufficient. With finite sample sufficiency unattainable in the state space setting, we seek asymptotic sufficiency via the maximum likelihood estimator (MLE) of the parameters of an auxiliary model. We prove that this auxiliary model-based approach achieves Bayesian consistency, and that - in a precise limiting sense - the proximity to (asymptotic) sufficiency …
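To make the matching idea concrete, here is a toy ABC rejection sketch for a linear Gaussian state space model, using the lag-1 autocorrelation of the observations as a single, non-sufficient summary statistic; the auxiliary-model MLE statistics and the tolerance choices discussed in the abstract are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 300

def simulate(phi, sigma_x=1.0, sigma_y=0.5):
    """Toy state space model: x_t = phi * x_{t-1} + w_t,  y_t = x_t + v_t."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal(0, sigma_x)
    return x + rng.normal(0, sigma_y, T)

def summary(y):
    # Summary statistic: lag-1 autocorrelation of the observations
    return np.corrcoef(y[:-1], y[1:])[0, 1]

y_obs = simulate(0.7)                       # pretend these are the observed data
s_obs = summary(y_obs)

# ABC rejection: draw from the prior, simulate, keep the draws whose summaries fall
# closest to the observed summary
prior_draws = rng.uniform(-0.99, 0.99, 2000)
distances = np.array([abs(summary(simulate(phi)) - s_obs) for phi in prior_draws])
posterior = prior_draws[distances <= np.quantile(distances, 0.01)]
print(posterior.mean(), posterior.std())    # crude ABC posterior for phi
```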
Measures For The Degree Of Overlap Of Gene Signatures And Applications To Tcga, 2013 Yale University
Measures For The Degree Of Overlap Of Gene Signatures And Applications To Tcga, Shuangge Ma
Shuangge Ma
For cancer and many other complex diseases, a large number of gene signatures have been generated. In this study, we use cancer as an example and note that other diseases can be analyzed in a similar manner. For signatures generated in multiple studies on the same cancer type/outcome, and for signatures on different cancer types, it is of interest to evaluate their degree of overlap. Many of the existing studies simply count the number (or percentage) of overlapped genes shared by two signatures. Such an approach has serious limitations. In this study, as a demonstrating example, we consider cancer prognosis …
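For concreteness, the simple counting approach criticized above looks like the following (gene names are made up; this is the naive baseline, not the measures developed in the study):

```python
def overlap_count(sig_a, sig_b):
    """Number of genes shared by two signatures (the naive measure)."""
    return len(set(sig_a) & set(sig_b))

def jaccard(sig_a, sig_b):
    """Overlap expressed as a fraction of the union (one simple normalization)."""
    a, b = set(sig_a), set(sig_b)
    return len(a & b) / len(a | b)

sig1 = ["TP53", "BRCA1", "EGFR", "MYC"]
sig2 = ["TP53", "EGFR", "KRAS"]
print(overlap_count(sig1, sig2), round(jaccard(sig1, sig2), 3))
```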
Comparison Of Methods For Estimating The Effect Of Salvage Therapy In Prostate Cancer When Treatment Is Given By Indication., 2013 University of Pennsylvania
Comparison Of Methods For Estimating The Effect Of Salvage Therapy In Prostate Cancer When Treatment Is Given By Indication., Jeremy Taylor, Jincheng Shen, Edward Kennedy, Lu Wang, Douglas Schaubel
Edward H. Kennedy
For patients who were previously treated for prostate cancer, salvage hormone therapy is frequently given when the longitudinal marker prostate-specific antigen begins to rise during follow-up. Because the treatment is given by indication, estimating the effect of the hormone therapy is challenging. In a previous paper we described two methods for estimating the treatment effect, called two-stage and sequential stratification. The two-stage method involves modeling the longitudinal and survival data. The sequential stratification method involves contrasts within matched sets of people, where each matched set includes people who did and did not receive hormone therapy. In this paper, we evaluate …
Bias In Estimating The Causal Hazard Ratio Using Two-Stage Instrumental Variable Methods, 2013 University of Pennsylvania
Bias In Estimating The Causal Hazard Ratio Using Two-Stage Instrumental Variable Methods, Fei Wan, Dylan S. Small, Justin E. Bekelman, Nandita Mitra
Fei Wan
Two-stage instrumental variable methods are commonly used to determine the causal effects of treatments on survival in the presence of measured and unmeasured confounding. Two-stage residual inclusion (2SRI) has been the method of choice over two-stage predictor substitution (2SPS) in clinical studies. We directly compare the bias in the causal hazard ratio estimated by these two methods. Under a principal stratification framework, we derive a closed form solution for the asymptotic bias of the causal hazard ratio among compliers for both the 2SPS and 2SRI methods when survival time follows the Weibull distribution with random censoring. When there …
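The distinction between the two estimators is easiest to see in code. The sketch below simulates treatment by indication with an unmeasured confounder and contrasts the second-stage design of 2SPS and 2SRI; for brevity it fits a Cox model with the lifelines package and omits censoring, whereas the paper works with Weibull survival times, random censoring, and closed-form asymptotic bias:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 5000
z = rng.binomial(1, 0.5, n)                  # instrument
u = rng.normal(size=n)                       # unmeasured confounder
d = (0.8 * z + u + rng.normal(size=n) > 0).astype(float)      # treatment by indication
# Weibull-type survival times depending on treatment and the unmeasured confounder
t = rng.weibull(1.5, n) * np.exp(-(0.5 * d + 0.7 * u))
event = np.ones(n)                           # no censoring in this toy example

# First stage: linear probability model of treatment on the instrument
X = np.column_stack([np.ones(n), z])
beta, *_ = np.linalg.lstsq(X, d, rcond=None)
d_hat = X @ beta
resid = d - d_hat

# 2SPS: replace observed treatment with its first-stage prediction
df_2sps = pd.DataFrame({"time": t, "event": event, "d_hat": d_hat})
# 2SRI: keep observed treatment and add the first-stage residual
df_2sri = pd.DataFrame({"time": t, "event": event, "d": d, "resid": resid})

print(CoxPHFitter().fit(df_2sps, "time", "event").params_)
print(CoxPHFitter().fit(df_2sri, "time", "event").params_)
```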
Importance Accelerated Robbins-Monro Recursion With Applications To Parametric Confidence Limits, 2013 Melbourne Business School
Importance Accelerated Robbins-Monro Recursion With Applications To Parametric Confidence Limits, Zdravko I. Botev, Chris Lloyd
Chris J. Lloyd
Applying the stochastic approximation recursion of Robbins and Monro (1951) to calculating confidence limits leads to poor efficiency and difficulties in estimating the appropriate governing constants as well as the standard error. We suggest sampling instead from an alternative importance distribution and modifying the Robbins-Monro recursion accordingly. This can reduce the asymptotic variance by the usual importance sampling factor. It also allows the standard error and optimal step length to be estimated from the simulation. The methodology is applied to computing almost exact confidence limits in a generalised linear model.
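For orientation, here is the plain Robbins-Monro recursion for a confidence limit in a toy normal model (one observation, known variance); the starting value, step-length constant, and offset are arbitrary choices of this sketch, and the importance-sampling modification proposed in the abstract is not shown:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setting: a single observation x_obs from N(theta, 1).  The exact upper 95%
# confidence limit solves P_theta(X <= x_obs) = 0.05, i.e. theta_U = x_obs + 1.6449.
x_obs, alpha = 1.3, 0.05

theta = x_obs            # crude starting value
c = 10.0                 # step-length constant (tuned or estimated in real applications)
for n in range(1, 200001):
    x_sim = rng.normal(theta, 1.0)                              # simulate at current theta
    theta += (c / (n + 100)) * (float(x_sim <= x_obs) - alpha)  # Robbins-Monro update
print(round(theta, 3), round(x_obs + 1.6449, 3))                # recursion vs. exact limit
```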
Double Propensity-Score Adjustment: A Solution To Design Bias Or Bias Due To Incomplete Matching, 2013 Institute for Clinical Evaluative Sciences
Double Propensity-Score Adjustment: A Solution To Design Bias Or Bias Due To Incomplete Matching, Peter Austin
Peter Austin
Propensity-score matching is frequently used to reduce the effects of confounding when using observational data to estimate the effects of treatments. Matching allows one to estimate the average effect of treatment in the treated. Rosenbaum and Rubin coined the term "bias due to incomplete matching" to describe the bias that can occur when some treated subjects are excluded from the matched sample because no appropriate control subject was available. The presence of incomplete matching raises important questions around the generalizability of estimated treatment effects to the entire population of treated subjects. We describe an analytic solution to address the bias …
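A minimal sketch of the two-step idea on simulated data (the effect size, covariate model, and 1:1 greedy matching are illustrative choices, not those of the paper): match on an estimated propensity score, then run a covariate-adjusted outcome regression within the matched sample.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=(n, 2))                                     # measured confounders
p_treat = 1 / (1 + np.exp(-(-0.5 + 0.8 * x[:, 0] - 0.5 * x[:, 1])))
a = rng.binomial(1, p_treat)                                    # treatment assignment
y = 1.0 * a + x[:, 0] + 0.5 * x[:, 1] + rng.normal(size=n)      # true effect = 1

# Step 1: estimate the propensity score
ps = LogisticRegression().fit(x, a).predict_proba(x)[:, 1]

# Step 2: 1:1 greedy nearest-neighbour matching on the propensity score, without replacement
treated = np.where(a == 1)[0]
controls = list(np.where(a == 0)[0])
matched = []
for i in treated:
    j = min(controls, key=lambda c_idx: abs(ps[c_idx] - ps[i]))
    matched.extend([i, j])
    controls.remove(j)
matched = np.array(matched)

# Step 3: covariate adjustment within the matched sample (the "double" adjustment)
design = np.column_stack([a[matched], x[matched]])
fit = LinearRegression().fit(design, y[matched])
print("adjusted treatment effect:", round(fit.coef_[0], 3))     # should be near 1
```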