Predicting Future Years Of Life, Health, And Functional Ability: A Healthy Life Calculator For Older Adults, 2015 University of Washington
Predicting Future Years Of Life, Health, And Functional Ability: A Healthy Life Calculator For Older Adults, Paula Diehr, Michael Diehr, Alice M. Arnold, Laura Yee, Michelle C. Odden, Calvin H. Hirsch, Stephen Thielke, Bruce Psaty, W Craig Johnson, Jorge Kizer, Anne B. Newman
UW Biostatistics Working Paper Series
Planning for the future would be easier if we knew how long we would live and, more importantly, how many years we would be healthy and able to enjoy it. There are few well-documented aids for predicting our future health. We attempted to meet this need for persons 65 years of age and older.
Data came from the Cardiovascular Health Study, a large longitudinal study of older adults that began in 1990. Years of life (YOL) were defined by measuring time to death. Years of healthy life (YHL) were defined by an annual question about self-rated health ...
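The two outcomes can be illustrated with a simple tally. The sketch below is mine, not the study's algorithm: it assumes self-rated health is dichotomized each year into "healthy" versus "sick", so that YOL counts years survived and YHL counts the healthy subset.

```python
# Illustrative tally of years of life (YOL) and years of healthy life (YHL)
# from annual follow-up data. The dichotomization of self-rated health into
# 'healthy'/'sick' is an assumption for this sketch, not the study's coding.

def yol_yhl(annual_states):
    """annual_states: one entry per year of follow-up,
    each 'healthy', 'sick', or 'dead'."""
    alive = [s for s in annual_states if s != 'dead']
    yol = len(alive)                          # years survived
    yhl = sum(s == 'healthy' for s in alive)  # years in good self-rated health
    return yol, yhl

# Example trajectory: 4 healthy years, 2 sick years, then death.
print(yol_yhl(['healthy'] * 4 + ['sick'] * 2 + ['dead']))  # (6, 4)
```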
Some Models And Methods For The Analysis Of Observational Data, 2015 Department of Statistics, Informatics and Modelling
Some Models And Methods For The Analysis Of Observational Data, José A. Ferreira
COBRA Preprint Series
This article provides a short, concise and essentially self-contained exposition of some of the most important models and methods for the analysis of observational data, and a substantial number of illustrations of their application. Although for the most part our presentation follows P. Rosenbaum’s book, “Observational Studies”, and naturally draws on related literature, it contains original elements and simplifies and generalizes some basic results. The illustrations, based on simulated data, show the methods at work in some detail, highlighting pitfalls and emphasizing certain subjective aspects of the statistical analyses.
Image Segmentation Using Fuzzy-Spatial Taxon Cut, 2015 U.C. Berkeley
Image Segmentation Using Fuzzy-Spatial Taxon Cut, Lauren Barghout
Images convey multiple meanings that depend on the context in which the viewer perceptually organizes the scene. This presents a problem for automated image segmentation, because it adds uncertainty to the process of selecting which objects to include or not include within a segment. I’ll discuss the implementation of a fuzzy-logic natural-vision-processing engine that solves this problem by assuming the scene architecture prior to processing. The scene architecture is a standardized natural-scene-perception taxonomy comprising a hierarchy of nested spatial-taxons. Spatial-taxons are regions (pixel-sets) that are figure-like, in that they are perceived as having a contour, are either `thing-like', or a `group ...
Video Event Understanding With Pattern Theory, 2015 University of South Florida
Video Event Understanding With Pattern Theory, Fillipe Souza, Sudeep Sarkar, Anuj Srivastava, Jingyong Su
We propose a combinatorial approach built on Grenander’s pattern theory to generate semantic interpretations of video events of human activities. The basic units of representations, termed generators, are linked with each other using pairwise connections, termed bonds, that satisfy predefined relations. Different generators are specified for different levels, from (image) features at the bottom level to (human) actions at the highest, providing a rich representation of items in a scene. The resulting configurations of connected generators provide scene interpretations; the inference goal is to parse given video data and generate high-probability configurations. The probabilistic structures are imposed using energies ...
Metacognition: Using Confidence Ratings For Type 2 And Type 1 ROC Curves, 2015 UC Berkeley
Metacognition: Using Confidence Ratings For Type 2 And Type 1 ROC Curves, S A. Klein
In the past five years there has been a surge of renewed interest in metacognition ("thinking about thinking"). The typical experiment involves a binary judgment followed by a multilevel confidence rating. It is a confusing topic because the rating could be made either on one's confidence in the binary response (standard rating Type 1 ROC) or on one's confidence sorted by whether the response was correct (Type 2 ROC). Both are metacognition. After a few remarks on challenging aspects of the Type 2 approach, I will present some interesting results for Type 1 ROC for both memory and ...
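The two constructions can be made concrete with one threshold-sweeping routine: run on signed response-plus-confidence evidence against the true stimulus it yields a Type 1 ROC, and run on raw confidence against response correctness it yields a Type 2 ROC. The toy trials and variable names below are mine, not the paper's data.

```python
# Sketch: Type 1 vs Type 2 ROC points from binary judgments with confidence
# ratings. Toy data; both ROCs come from the same threshold sweep.

def roc_points(scores, labels):
    """One (false-positive rate, true-positive rate) point per threshold,
    sweeping the threshold down over the unique scores."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for s, l in zip(scores, labels) if s >= t and l)
        fp = sum(1 for s, l in zip(scores, labels) if s >= t and not l)
        pts.append((fp / neg, tp / pos))
    return pts

# Toy trials: (signal_present, said_yes, confidence 1-3)
trials = [(1, 1, 3), (1, 1, 2), (1, 0, 1), (0, 0, 3), (0, 1, 1), (0, 0, 2)]

# Type 1 ROC: confidence in the binary response, folded onto a single
# "evidence for yes" axis (high-confidence no ... high-confidence yes).
evidence = [conf if yes else -conf for _, yes, conf in trials]
type1 = roc_points(evidence, [sig for sig, _, _ in trials])

# Type 2 ROC: confidence sorted by whether the response was correct.
correct = [int(sig == yes) for sig, yes, _ in trials]
type2 = roc_points([conf for _, _, conf in trials], correct)
print(type1, type2)
```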
Binocular 3D Motion Perception As Bayesian Inference, 2015 University of Glasgow
Binocular 3D Motion Perception As Bayesian Inference, Martin Lages, Suzanne Heron
The human visual system encodes monocular motion and binocular disparity input before it is integrated into a single 3D percept. Here we propose a geometric-statistical model of human 3D motion perception that solves the aperture problem in 3D by assuming that (i) velocity constraints arise from inverse projection of local 2D velocity constraints in a binocular viewing geometry, (ii) noise from monocular motion and binocular disparity processing is independent, and (iii) slower motions are more likely to occur than faster ones. In two experiments we found that instantiation of this Bayesian model can explain perceived 3D line motion direction under ...
Adaptive Pre-Specification In Randomized Trials With And Without Pair-Matching, 2015 Division of Biostatistics, University of California, Berkeley - the SEARCH Consortium
Adaptive Pre-Specification In Randomized Trials With And Without Pair-Matching, Laura B. Balzer, Mark J. Van Der Laan, Maya L. Petersen
U.C. Berkeley Division of Biostatistics Working Paper Series
In randomized trials, adjustment for measured covariates during the analysis can reduce variance and increase power. To avoid misleading inference, the analysis plan must be pre-specified. However, it is unclear a priori which baseline covariates (if any) should be included in the analysis. Consider, for example, the Sustainable East Africa Research in Community Health (SEARCH) trial for HIV prevention and treatment. There are 16 matched pairs of communities and many potential adjustment variables, including region, HIV prevalence, male circumcision coverage and measures of community-level viral load. In this paper, we propose a rigorous procedure to data-adaptively select the adjustment set ...
Double Robust Estimation Of Encouragement-Design Intervention Effects Transported Across Sites, 2015 Johns Hopkins Bloomberg School of Public Health
Double Robust Estimation Of Encouragement-Design Intervention Effects Transported Across Sites, Kara E. Rudolph, Mark J. Van Der Laan
U.C. Berkeley Division of Biostatistics Working Paper Series
We develop double robust targeted maximum likelihood estimators (TMLE) for transporting intervention effects from one population to another. Specifically, we develop TMLE estimators for three transported estimands: intent-to-treat average treatment effect (ATE) and complier ATE, which are relevant for encouragement-design interventions and instrumental variable analyses, and the ATE of the exposure on the outcome, which is applicable to any randomized or observational study. We demonstrate finite sample performance of these TMLE estimators using simulation, including in the presence of practical violations of the positivity assumption. We then apply these methods to the Moving to Opportunity trial, a multi-site, encouragement-design intervention ...
Spatially Random Processes In One-Dimensional Maps: The Logistic Map And The Arnold Circle Map, 2015 University of Colorado Boulder
Spatially Random Processes In One-Dimensional Maps: The Logistic Map And The Arnold Circle Map, An T. Le
Applied Mathematics Graduate Theses & Dissertations
One way to model in-situ remediation of contaminated groundwater is to consider spatially random processes in nonlinear systems. Groundwater remediation often requires injecting an aquifer with treatment solution, where degradation reactions break down the toxins. As the treatment solution and contaminated water flow through the aquifer, their movement is limited by the types of sediment found in the aquifer, which act as spatial barriers to mixing. The onset of chaos in this system implies the two solutions are well mixed, and thus the contaminants are rendered inert. The spatially random processes explored in this thesis are meant to mimic the ...
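The first of the two maps studied here, the logistic map x_{n+1} = r·x_n·(1 − x_n), shows in a few lines why the onset of chaos implies mixing: at r = 4 nearby initial conditions separate exponentially. This is the plain deterministic map only; the thesis's spatially random version perturbs the process, which this sketch does not attempt.

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n) at r = 4 (chaotic regime):
# two orbits started 1e-9 apart separate to order-1 distance, the sensitive
# dependence that underlies "well mixed". Deterministic illustration only.

def logistic_orbit(x0, r=4.0, n=35):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)   # tiny perturbation of the start
gap = abs(a[-1] - b[-1])
print(gap)  # large separation despite a 1e-9 initial difference
```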
Stochastic Optimization Via Forward Slice, 2015 University of Washington - Seattle Campus
Stochastic Optimization Via Forward Slice, Bob A. Salim, Lurdes Y. T. Inoue
UW Biostatistics Working Paper Series
Optimization consists of maximizing or minimizing a real-valued objective function. In many problems, the objective function may not yield closed-form solutions. Over many decades, optimization methods, both deterministic and stochastic, have been developed to solve these problems. However, common limitations of these methods are sensitivity to the initial value and a tendency to find only a local (non-global) extremum. In this article, we propose an alternative stochastic optimization method, which we call "Forward Slice", and assess its performance relative to available optimization methods.
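The initial-value sensitivity can be seen in a few lines: a deterministic local optimizer on a multimodal objective converges to whichever minimum is nearest its start. The test function and step sizes below are hypothetical illustrations of the limitation, not anything from the paper (the Forward Slice method itself is not sketched here).

```python
# Gradient descent on a two-minimum objective: different starting values
# yield different answers, and only one start finds the global minimum.
# Toy function chosen for illustration; not from the paper.

def f(x):               # minima near x = +0.93 and x = -1.04 (global)
    return (x * x - 1.0) ** 2 + 0.3 * x

def grad_descent(x, lr=0.05, steps=500):
    for _ in range(steps):
        g = 4.0 * x * (x * x - 1.0) + 0.3   # f'(x)
        x -= lr * g
    return x

left, right = grad_descent(-0.5), grad_descent(0.5)
print(left, right, f(left), f(right))
# Same algorithm, same objective; the answer depends on where you start.
```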
A Comparison Of Population-Averaged And Cluster-Specific Approaches In The Context Of Unequal Probabilities Of Selection, 2015 University of Nebraska-Lincoln
A Comparison Of Population-Averaged And Cluster-Specific Approaches In The Context Of Unequal Probabilities Of Selection, Natalie A. Koziol
Public Access Theses and Dissertations from the College of Education and Human Sciences
Sampling designs of large-scale, federally funded studies are typically complex, involving multiple design features (e.g., clustering, unequal probabilities of selection). Researchers must account for these features in order to obtain unbiased point estimators and make valid inferences about population parameters. Single-level (i.e., population-averaged) and multilevel (i.e., cluster-specific) methods provide two alternatives for modeling clustered data. Single-level methods rely on the use of adjusted variance estimators to account for dependency due to clustering, whereas multilevel methods incorporate the dependency into the specification of the model.
Although the literature comparing single-level and multilevel approaches is vast, comparisons have been ...
Boundary Problems For One And Two Dimensional Random Walks, 2015 Western Kentucky University
Boundary Problems For One And Two Dimensional Random Walks, Miky Wright
Masters Theses & Specialist Projects
This thesis provides a study of various boundary problems for one- and two-dimensional random walks. We first consider a one-dimensional random walk that starts at an integer-valued height k > 0, with the x-axis as a lower boundary, moving downward on each step with probability q and upward with probability p, where q is greater than or equal to p. We derive the variance and the standard deviation of the number of steps T needed for the height to reach 0 from k, by first deriving the moment generating function of T. We then study two types of two-dimensional random walks with ...
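The mean of T is easy to sanity-check by simulation. Assuming p + q = 1 (my assumption; the thesis's exact setup may differ), Wald's identity gives E[T] = k/(q − p) for q > p, and a seeded Monte Carlo run lands close to that value.

```python
import random

# Simulation check of the one-dimensional walk: start at height k > 0, step
# down with probability q and up with probability p = 1 - q (assumed here),
# and let T = number of steps to first reach 0. For q > p, E[T] = k/(q - p).

def hitting_time(k, q, rng):
    t, h = 0, k
    while h > 0:
        h += -1 if rng.random() < q else 1
        t += 1
    return t

rng = random.Random(0)
k, q = 5, 0.7
p = 1.0 - q
times = [hitting_time(k, q, rng) for _ in range(20000)]
mean_t = sum(times) / len(times)
print(mean_t, k / (q - p))   # empirical mean vs theoretical 12.5
```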
Mindsets, Attitudes, And Achievement In Undergraduate Statistics Courses, 2015 Dordt College
Mindsets, Attitudes, And Achievement In Undergraduate Statistics Courses, Valorie L. Zonnefeld
Faculty Work: Comprehensive List
The purpose of this study was to determine the effects of theories of intelligence and an intervention of incremental mindset training on students’ attitudes toward statistics and their mastery of content in an introductory statistics college course. The sample was 547 undergraduate students at a small, faith-based, liberal arts college in the Midwest.
A pretest-posttest design was used for the three instruments implemented. The Comprehensive Assessment of Outcomes in a first Statistics course (CAOS) assessed students’ statistical literacy. The Student Attitudes Towards Statistics – 36© (SATS©) assessed six components of students’ attitudes toward statistics including affect, cognitive competence, difficulty, effort, interest ...
High Dimensional Model Selection And Validation: A Comparison Study, 2015 St. Cloud State University
High Dimensional Model Selection And Validation: A Comparison Study, Zhengyi Li
Culminating Projects in Applied Statistics
Model selection is a challenging issue in high-dimensional statistical analysis, and many approaches have been proposed in recent years. In this thesis, we compare the performance of three penalized logistic regression approaches (Ridge, Lasso, and Elastic Net) and three information criteria (AIC, BIC, and EBIC) on a binary response variable in a high-dimensional setting through an extensive simulation study. The models are built and selected on the training datasets, and their performance is evaluated through AUC on the validation datasets. We also display the comparison results on two real datasets (Arcene Data and University Retention Data). The performance differences among those ...
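Whatever the fitting approach, the evaluation metric is the same. AUC can be computed without any library via its Mann-Whitney form: the probability that a randomly chosen positive case scores above a randomly chosen negative one (ties counted as one half). The toy scores below are illustrative, not from the thesis's datasets.

```python
# Stand-alone AUC via the Mann-Whitney form:
# AUC = P(score of a random positive > score of a random negative),
# with ties counted as 1/2. Toy scores for illustration.

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]
labels = [1,   1,   0,   1,   0,   0  ]
print(auc(scores, labels))  # 8/9, about 0.889
```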
Extreme Value Theory And Backtest Overfitting In Finance, 2015 Bowdoin College
Extreme Value Theory And Backtest Overfitting In Finance, Daniel C. Byrnes
In order to identify potentially profitable investment strategies, hedge funds and asset managers can use historical market data to simulate a strategy's performance, a process known as backtesting. While the abundance of historical stock price data and powerful computing technologies have made it feasible to run millions of simulations in a short period of time, this process may produce statistically insignificant results in the form of false positives. As the number of configurations of a strategy increases, it becomes more likely that some of the configurations will perform well by chance alone. The phenomenon of backtest overfitting occurs when ...
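The chance-alone effect is easy to reproduce: give every "strategy" zero true edge and watch the best in-sample Sharpe ratio climb as more configurations are tried. The trial counts and return scale below are illustrative assumptions, not market data.

```python
import random, statistics

# Backtest overfitting in miniature: 500 pure-noise "strategies" with zero
# true edge. Widening the search from 5 configurations to 500 can only raise
# the best in-sample Sharpe ratio, even though no strategy has real skill.

def sharpe(returns):
    return statistics.mean(returns) / statistics.stdev(returns)

rng = random.Random(42)
sharpes = [sharpe([rng.gauss(0.0, 0.01) for _ in range(252)])  # 1 "year" each
           for _ in range(500)]

best_of_5, best_of_500 = max(sharpes[:5]), max(sharpes)
print(best_of_5, best_of_500)  # the wider search looks better by chance alone
```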
Scientific Awareness At Ursinus College, 2015 Ursinus College
Scientific Awareness At Ursinus College, Frank G. Devone
Mathematics Honors Papers
Ursinus College prides itself on creating well-rounded students, and recent initiatives, such as the Fellowships in the Ursinus Transition to the Undergraduate Research Experience Program and the Center for Science and the Common Good, suggest that science is a vital part of the Ursinus liberal arts mission. A scientific awareness pilot survey was administered to a sample of Ursinus students drawn from the Class of 2014 and students residing at Ursinus during summer 2014. Experience and data collected from this pilot were used to create a final survey, which was made available to all students at Ursinus College. The survey ...
Modeling Traffic At An Intersection, 2015 Kennesaw State University
Modeling Traffic At An Intersection, Kaleigh L. Mulkey, Saniita K. Fasenntao
Symposium of Student Scholars
The main purpose of this project is to build a mathematical model for traffic at a busy intersection. We use elements of Queueing Theory to build our model: the vehicles driving into the intersection are the “arrival process” and the stop light in the intersection is the “server.”
We collected traffic data on the number of vehicles arriving at the intersection, the duration of green and red lights, and the number of vehicles going through the intersection during a green light. We wrote a SAS macro to simulate traffic based on parameters derived from the data.
In our program ...
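The model described above can be sketched in a few lines (in Python here; the original is a SAS macro): vehicles arrive as a Poisson process per signal cycle, and a fixed number clear on each green. All parameter values below are hypothetical placeholders, not the collected data.

```python
import math, random

# Queueing sketch of the intersection: Poisson arrivals per cycle are the
# "arrival process"; the green light, clearing up to a fixed number of
# vehicles, is the "server". Parameter values are hypothetical.

def poisson(lam, rng):
    # Knuth's product method for a single Poisson draw.
    limit, prod, n = math.exp(-lam), rng.random(), 0
    while prod > limit:
        n += 1
        prod *= rng.random()
    return n

def simulate(cycles, arrivals_per_cycle=10.0, served_per_green=8, seed=1):
    rng = random.Random(seed)
    queue, max_queue = 0, 0
    for _ in range(cycles):
        queue += poisson(arrivals_per_cycle, rng)   # arrivals over red + green
        queue = max(0, queue - served_per_green)    # green light serves
        max_queue = max(max_queue, queue)
    return max_queue

# Demand (10 vehicles/cycle) exceeds capacity (8/green), so the queue grows.
print(simulate(200))
```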
Adaptive Enrichment Designs For Randomized Trials With Delayed Endpoints, Using Locally Efficient Estimators To Improve Precision, 2015 Johns Hopkins University, Johns Hopkins Bloomberg School of Public Health, Department of Biostatistics
Adaptive Enrichment Designs For Randomized Trials With Delayed Endpoints, Using Locally Efficient Estimators To Improve Precision, Michael Rosenblum, Tianchen Qian, Yu Du, Huitong Qiu
Johns Hopkins University, Dept. of Biostatistics Working Papers
Adaptive enrichment designs involve preplanned rules for modifying enrollment criteria based on accrued data in an ongoing trial. For example, enrollment of a subpopulation where there is sufficient evidence of treatment efficacy, futility, or harm could be stopped, while enrollment for the remaining subpopulations is continued. Most existing methods for constructing adaptive enrichment designs are limited to situations where patient outcomes are observed soon after enrollment. This is a major barrier to the use of such designs in practice, since for many diseases the outcome of most clinical importance does not occur shortly after enrollment. We propose a new class ...
Model Of Cost-Effectiveness Of MRI For Women Of Average Lifetime Risk Of Breast Cancer, 2015 Dominican University of California
Model Of Cost-Effectiveness Of MRI For Women Of Average Lifetime Risk Of Breast Cancer, Mckenna L. Kimball
Scholarly and Creative Works Conference
Background: Mammography is the current standard for breast cancer detection; however, magnetic resonance imaging (MRI) is a more sensitive method of breast imaging. Despite its increased sensitivity, MRI has more false positives and higher costs. The purpose of this study was to determine whether MRI, or MRI in conjunction with mammography, was a cost-effective solution for breast cancer detection in women with average lifetime risk of breast cancer.
Methods: A mathematical model was used to compare annual mammography, annual MRI, and mammography and MRI on alternate years. The model included the natural history of breast cancer, screening by mammography ...
Nested Partially-Latent Class Models For Dependent Binary Data; Estimating Disease Etiology, 2015 Department of Biostatistics, Johns Hopkins Bloomberg School of Public Health
Nested Partially-Latent Class Models For Dependent Binary Data; Estimating Disease Etiology, Zhenke Wu, Maria Deloria-Knoll, Scott L. Zeger
Johns Hopkins University, Dept. of Biostatistics Working Papers
The Pneumonia Etiology Research for Child Health (PERCH) study seeks to use modern measurement technology to infer the causes of pneumonia. The paper describes a latent variable model designed to infer from case-control data the etiology distribution for the population of cases, and for an individual case given his or her measurements, taking account of dependence among pathogen measurements due to sources other than class membership. We assume each observation is drawn from a mixture model for which each component represents one pathogen. Conditional dependence among multivariate binary measurements on a single subject is induced by nesting latent subclasses within ...