Image Segmentation Using Fuzzy-Spatial Taxon Cut, 2015 U.C. Berkeley

#### Image Segmentation Using Fuzzy-Spatial Taxon Cut, Lauren Barghout

*MODVIS Workshop*

Images convey multiple meanings that depend on the context in which the viewer perceptually organizes the scene. This presents a problem for automated image segmentation, because it adds uncertainty to the process of selecting which objects to include or exclude from a segment. I’ll discuss the implementation of a fuzzy-logic natural-vision-processing engine that solves this problem by assuming the scene architecture prior to processing. The scene architecture is a standardized natural-scene-perception taxonomy composed of a hierarchy of nested spatial-taxons. Spatial-taxons are regions (pixel-sets) that are figure-like, in that they are perceived as having a contour, and are either 'thing-like' or a 'group ...
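The nested spatial-taxon hierarchy can be sketched as a toy tree (a hypothetical illustration of the idea, not Barghout's engine): each region carries a fuzzy "figure-like" membership in [0, 1], and a child taxon's effective membership is capped by its ancestors'.

```python
# Toy sketch (assumed structure, not the paper's implementation): a nested
# spatial-taxon tree with fuzzy memberships, combined by min along the path.

def effective_membership(taxon, parent_membership=1.0):
    """Propagate fuzzy membership down a nested taxon tree (min-combination)."""
    own = min(taxon["membership"], parent_membership)
    result = {taxon["name"]: own}
    for child in taxon.get("children", []):
        result.update(effective_membership(child, own))
    return result

scene = {
    "name": "scene", "membership": 1.0,
    "children": [
        {"name": "foreground", "membership": 0.9,
         "children": [{"name": "thing", "membership": 0.7}]},
        {"name": "background", "membership": 0.4},
    ],
}

memberships = effective_membership(scene)
# a child can never be more figure-like than the taxon that contains it
```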

Video Event Understanding With Pattern Theory, 2015 University of South Florida

#### Video Event Understanding With Pattern Theory, Fillipe Souza, Sudeep Sarkar, Anuj Srivastava, Jingyong Su

*MODVIS Workshop*

We propose a combinatorial approach built on Grenander’s pattern theory to generate semantic interpretations of video events of human activities. The basic units of representations, termed generators, are linked with each other using pairwise connections, termed bonds, that satisfy predefined relations. Different generators are specified for different levels, from (image) features at the bottom level to (human) actions at the highest, providing a rich representation of items in a scene. The resulting configurations of connected generators provide scene interpretations; the inference goal is to parse given video data and generate high-probability configurations. The probabilistic structures are imposed using energies ...
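The generator-and-bond vocabulary can be made concrete with a toy configuration (hypothetical labels and energies, not the authors' code): generators are labeled nodes, bonds are pairwise links, and a configuration scores by the sum of its bond energies, with lower energy meaning higher probability.

```python
# Toy sketch of a pattern-theoretic configuration: hypothetical bond
# energies between generators at different levels (feature, object, action).

BOND_ENERGY = {  # assumed compatibility energies (lower = more compatible)
    ("feature:motion", "action:throw"): -2.0,
    ("feature:shape", "object:ball"): -1.5,
    ("object:ball", "action:throw"): -3.0,
}

def configuration_energy(bonds):
    """Sum pairwise bond energies; unknown bonds get a large penalty."""
    return sum(BOND_ENERGY.get(b, 10.0) for b in bonds)

interpretation = [("feature:motion", "action:throw"),
                  ("object:ball", "action:throw")]
energy = configuration_energy(interpretation)  # lower energy = better parse
```

Inference then amounts to searching over configurations for low-energy (high-probability) parses of the video.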

The Psychophysics Of Metacognition And Meta D', 2015 UC Berkeley

#### The Psychophysics Of Metacognition And Meta D', S. A. Klein

*MODVIS Workshop*

In the past five years there has been a surge of renewed interest in metacognition and meta d'. It is a very interesting and highly controversial area of research: thinking about subjective experiences provides new insight into decision making. The new book on the topic edited by Fleming and Frith, and the Maniscalco article in that book, provide an excellent summary of the issues. My view is that double judgment signal detection theory, plus new approaches for multinomial modeling, can provide important insights into the recent meta d' findings. I will show how improved rating scale ...
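The type-1 sensitivity d' that meta-d' analyses build on is standard signal detection theory; a minimal sketch (standard SDT formula, illustrative rates):

```python
# Minimal sketch of type-1 sensitivity d' from hit and false-alarm rates,
# the building block that meta-d' re-expresses for confidence ratings.

from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

d = d_prime(0.84, 0.16)  # symmetric rates give d' close to 2
```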

Binocular 3D Motion Perception As Bayesian Inference, 2015 University of Glasgow

#### Binocular 3D Motion Perception As Bayesian Inference, Martin Lages, Suzanne Heron

*MODVIS Workshop*

The human visual system encodes monocular motion and binocular disparity input before it is integrated into a single 3D percept. Here we propose a geometric-statistical model of human 3D motion perception that solves the aperture problem in 3D by assuming that (i) velocity constraints arise from inverse projection of local 2D velocity constraints in a binocular viewing geometry, (ii) noise from monocular motion and binocular disparity processing is independent, and (iii) slower motions are more likely to occur than faster ones. In two experiments we found that instantiation of this Bayesian model can explain perceived 3D line motion direction under ...
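The "slow motion" prior (assumption iii) has a familiar Gaussian consequence, sketched here under assumed Gaussian likelihood and prior (illustrative, not the authors' full geometric-statistical model): the MAP velocity shrinks the observed velocity toward zero, more strongly when the measurement is noisy.

```python
# Sketch of the slow-motion prior: with a Gaussian velocity measurement and
# a zero-mean Gaussian prior favoring slow motion, the posterior mean is a
# shrunken copy of the observed 3D velocity.

import numpy as np

def map_velocity(v_obs, sigma_like, sigma_prior):
    """Posterior mean of velocity under Gaussian likelihood + slow prior."""
    gain = sigma_prior**2 / (sigma_prior**2 + sigma_like**2)
    return gain * np.asarray(v_obs, dtype=float)

v = map_velocity([3.0, 0.0, 4.0], sigma_like=1.0, sigma_prior=1.0)
# equal noise and prior widths shrink the estimate to half
```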

Model Of Cost-Effectiveness Of Mri For Women Of Average Lifetime Risk Of Breast Cancer, 2015 Dominican University of California

#### Model Of Cost-Effectiveness Of Mri For Women Of Average Lifetime Risk Of Breast Cancer, Mckenna L. Kimball

*Scholarly and Creative Works Conference*

**Background:** Mammography is the current standard for breast cancer detection; however, magnetic resonance imaging (MRI) is a more sensitive method of breast imaging. Despite this increased sensitivity, MRI produces more false positives and has higher costs. The purpose of this study was to determine whether MRI, alone or in conjunction with mammography, is a cost-effective solution for breast cancer detection in women with average lifetime risk of breast cancer.

**Methods**: A mathematical model was used to compare annual mammography, annual MRI, and mammography and MRI on alternate years. The model included the natural history of breast cancer, screening by mammography ...
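The cost-effectiveness trade-off can be sketched with a toy calculation (illustrative numbers, not the study's model parameters): higher sensitivity detects more cancers, but screening cost and false-positive workups drive up the cost per cancer detected.

```python
# Toy sketch of a cost-per-detection comparison between screening
# strategies; all rates and costs below are hypothetical.

def cost_per_detection(n, prevalence, sensitivity, specificity,
                       screen_cost, workup_cost):
    cancers = n * prevalence
    detected = cancers * sensitivity
    false_pos = n * (1 - prevalence) * (1 - specificity)
    total_cost = n * screen_cost + false_pos * workup_cost
    return total_cost / detected

mammo = cost_per_detection(10_000, 0.005, 0.80, 0.90,
                           screen_cost=150, workup_cost=500)
mri = cost_per_detection(10_000, 0.005, 0.95, 0.75,
                         screen_cost=1000, workup_cost=500)
# with these assumed numbers, MRI's extra sensitivity does not offset
# its higher screening and false-positive costs in an average-risk group
```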

Introduction To Targeted Learning, 2015 University of California, Berkeley

Targeted Estimation And Inference For The Sample Average Treatment Effect, 2015 Division of Biostatistics, University of California, Berkeley - the SEARCH Consortium

#### Targeted Estimation And Inference For The Sample Average Treatment Effect, Laura B. Balzer, Maya L. Petersen, Mark J. Van Der Laan

*U.C. Berkeley Division of Biostatistics Working Paper Series*

While the population average treatment effect has been the subject of extensive methods and applied research, less consideration has been given to the sample average treatment effect: the mean difference in the counterfactual outcomes for the study units. The sample parameter is easily interpretable and is arguably the most relevant when the study units are not representative of a greater population or when the exposure's impact is heterogeneous. Formally, the sample effect is not identifiable from the observed data distribution. Nonetheless, targeted maximum likelihood estimation (TMLE) can provide an asymptotically unbiased and efficient estimate of both the population and ...
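The sample parameter itself is simple to state; a minimal sketch with hypothetical counterfactual outcomes (shown only to define the estimand, which is of course not fully observed in practice):

```python
# Sketch of the sample average treatment effect (SATE): the mean difference
# in counterfactual outcomes over the study units themselves.

y1 = [1, 0, 1, 1]   # hypothetical outcomes under treatment
y0 = [0, 0, 1, 0]   # hypothetical outcomes under control

sate = sum(a - b for a, b in zip(y1, y0)) / len(y1)
# the estimation challenge: only one of y1[i], y0[i] is observed per unit
```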

Inequality In Treatment Benefits: Can We Determine If A New Treatment Benefits The Many Or The Few?, 2015 Johns Hopkins University, Bloomberg School of Public Health, Department of Biostatistics

#### Inequality In Treatment Benefits: Can We Determine If A New Treatment Benefits The Many Or The Few?, Emily Huang, Michael Rosenblum

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

The primary analysis in many randomized controlled trials focuses on the average treatment effect and does not address whether treatment benefits are widespread or limited to a select few. This problem affects many disease areas, since it stems from how randomized trials, often the gold standard for evaluating treatments, are designed and analyzed. Our goal is to estimate the fraction who benefit from a treatment, based on randomized trial data. We consider cases where the primary outcome is continuous, discrete, or ordinal. In general, the fraction who benefit is a non-identifiable parameter, and the best that can be obtained are ...
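For a binary outcome the non-identifiability is easy to see: given only the marginal response rates under treatment and control, the fraction who benefit is only partially identified, with the classical Fréchet bounds (a sketch of the point, not the authors' estimation method):

```python
# Sketch: sharp bounds on the fraction who benefit, P(Y1=1, Y0=0), for a
# binary outcome given only the marginal response rates p_treat and p_control.

def benefit_bounds(p_treat, p_control):
    lower = max(0.0, p_treat - p_control)
    upper = min(p_treat, 1.0 - p_control)
    return lower, upper

lo, hi = benefit_bounds(0.6, 0.4)
# the average effect (0.2) is only the lower bound; up to 60% could benefit
```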

Validated Automatic Brain Extraction Of Head Ct Images, 2015 Johns Hopkins University

#### Validated Automatic Brain Extraction Of Head Ct Images, John Muschelli III, Natalie L. Ullman, Daniel F. Hanley, Paul Vespa, Ciprian M. Crainiceanu

*John Muschelli III*

Background

X-ray Computed Tomography (CT) imaging of the brain is commonly used in diagnostic settings. Although CT scans are primarily used in clinical practice, they are increasingly used in research. A fundamental processing step in brain imaging research is brain extraction – the process of separating the brain tissue from all other tissues. Methods for brain extraction have either been validated but not fully automated, or have been fully automated and informally proposed, but never formally validated.

Aim

To systematically analyze and validate the performance of FSL’s brain extraction tool (BET) on head CT images of patients with intracranial ...
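A common pre-processing step for CT brain extraction (an assumption about the general workflow, not necessarily the paper's exact pipeline) is to window the image to the brain-tissue Hounsfield-unit range before handing it to a skull-stripping tool such as BET:

```python
# Sketch: window a CT image to the approximate brain-tissue HU range,
# zeroing out air (very negative HU) and bone (very high HU).

import numpy as np

def hu_window(ct, lo=0, hi=100):
    """Keep voxels in [lo, hi] Hounsfield units; zero everything else."""
    ct = np.asarray(ct, dtype=float)
    return np.where((ct >= lo) & (ct <= hi), ct, 0.0)

slice_hu = np.array([[-1000.0, 40.0],
                     [70.0, 300.0]])  # air, gray matter, white matter, bone
masked = hu_window(slice_hu)          # keeps 40 and 70, drops air and bone
```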

Session B-2: The “Roll” Of Statistics In Modeling - It All Adds Up, 2015 Illinois Mathematics and Science Academy

#### Session B-2: The “Roll” Of Statistics In Modeling - It All Adds Up, Richard Stalmack, Janice Krouse

*Professional Learning Day*

The common core practice standards ask us to teach students to propose mathematical models and test their viability. Participants will do an experiment, collect data and use technological tools to combine modeling, analysis and basic statistics. Participants should bring a laptop, if possible; otherwise, bring a graphing calculator.

Leveraging Prognostic Baseline Variables To Gain Precision In Randomized Trials, 2015 Johns Hopkins Bloomberg School of Public Health

#### Leveraging Prognostic Baseline Variables To Gain Precision In Randomized Trials, Elizabeth Colantuoni, Michael Rosenblum

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

We focus on estimating the average treatment effect in a randomized trial. If baseline variables are correlated with the outcome, then appropriately adjusting for these variables can improve precision. An example is the analysis of covariance (ANCOVA) estimator, which applies when the outcome is continuous, the quantity of interest is the difference in mean outcomes comparing treatment versus control, and a linear model with only main effects is used. ANCOVA is guaranteed to be at least as precise as the standard unadjusted estimator, asymptotically, under no parametric model assumptions, and also is locally, semiparametric efficient. Recently, several estimators have been ...
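The ANCOVA estimator described above can be sketched on simulated data (toy data-generating process, not the paper's setting): regress the outcome on treatment plus the centered baseline covariate with main effects only, and read off the treatment coefficient.

```python
# Sketch of the ANCOVA-adjusted estimator of the average treatment effect
# on toy randomized-trial data with a prognostic baseline covariate.

import numpy as np

rng = np.random.default_rng(0)
n = 500
w = rng.normal(size=n)                  # prognostic baseline covariate
a = rng.integers(0, 2, size=n)          # randomized treatment indicator
y = 1.0 * a + 2.0 * w + rng.normal(size=n)  # true effect = 1.0

# main-effects linear model: intercept, treatment, centered covariate
X = np.column_stack([np.ones(n), a, w - w.mean()])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
ancova_effect = beta[1]                 # adjusted estimate of the effect
```

Because w is strongly prognostic here, the adjusted estimator has noticeably smaller variance than the simple difference in means, which is the precision gain the abstract refers to.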

Statistical Estimation Of T1 Relaxation Time Using Conventional Magnetic Resonance Imaging, 2015 Department of Biostatistics, Bloomberg School of Public Health, Johns Hopkins University

#### Statistical Estimation Of T1 Relaxation Time Using Conventional Magnetic Resonance Imaging, Amanda Mejia, Elizabeth M. Sweeney, Blake Dewey, Govind Nair, Pascal Sati, Colin Shea, Daniel S. Reich, Russell T. Shinohara

*UPenn Biostatistics Working Papers*

Quantitative *T*_{1} maps (*qT*_{1}) are often used to study diffuse tissue abnormalities that may be difficult to assess on standard clinical sequences. While *qT*_{1} maps can provide valuable information for studying the progression and treatment of diseases like multiple sclerosis, the additional scan time required and multi-site implementation issues have limited their inclusion in many standard clinical and research protocols. Hence, the availability of *qT*_{1} maps has historically been limited.

In this paper, we propose a new method of estimating *T*_{1} maps *retroactively* that only requires the acquisition or availability of four conventional MRI sequences ...

Simulation Of Semicompeting Risk Survival Data And Estimation Based On Multistate Frailty Model, 2015 Harvard University

#### Simulation Of Semicompeting Risk Survival Data And Estimation Based On Multistate Frailty Model, Fei Jiang, Sebastien Haneuse

*Harvard University Biostatistics Working Paper Series*

We develop a procedure to simulate semicompeting-risks survival data. In addition, we introduce an EM algorithm and a B-spline based estimation procedure to evaluate and implement Xu et al. (2010)’s nonparametric likelihood estimation approach. The simulation procedure provides a route to simulate samples from the likelihood introduced in Xu et al. (2010). Further, the EM algorithm and the B-spline methods stabilize the estimation and give accurate results. We illustrate the simulation and estimation procedures with simulation examples and a real data analysis.
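The semicompeting-risks structure can be sketched by simulating a toy illness-death model with a shared gamma frailty (illustrative rates and a simplified transition scheme, not the paper's procedure): illness (nonterminal) can be followed by death (terminal), but death precludes illness.

```python
# Toy sketch: simulate semicompeting-risks data from an illness-death model
# with a shared mean-one gamma frailty; all rates below are hypothetical.

import numpy as np

rng = np.random.default_rng(1)

def simulate(n, rate_illness=0.5, rate_death=0.3, rate_death_post=0.6,
             frailty_shape=2.0):
    frailty = rng.gamma(frailty_shape, 1.0 / frailty_shape, size=n)
    t_ill = rng.exponential(1.0 / (rate_illness * frailty))
    t_die = rng.exponential(1.0 / (rate_death * frailty))
    ill_first = t_ill < t_die
    # after illness, the death hazard changes (illness-death transition)
    t_die_post = t_ill + rng.exponential(1.0 / (rate_death_post * frailty))
    death_time = np.where(ill_first, t_die_post, t_die)
    illness_time = np.where(ill_first, t_ill, np.inf)  # death precludes illness
    return illness_time, death_time

illness, death = simulate(1000)
# whenever illness occurs, it occurs before death: the semicompeting structure
```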

Optimal Dynamic Treatments In Resource-Limited Settings, 2015 University of California, Berkeley, Division of Biostatistics

#### Optimal Dynamic Treatments In Resource-Limited Settings, Alexander R. Luedtke, Mark J. Van Der Laan

*U.C. Berkeley Division of Biostatistics Working Paper Series*

A dynamic treatment rule (DTR) is a treatment rule which assigns treatments to individuals based on (a subset of) their measured covariates. An optimal DTR is the DTR which maximizes the population mean outcome. Previous works in this area have assumed that treatment is an unlimited resource so that the entire population can be treated if this strategy maximizes the population mean outcome. We consider optimal DTRs in settings where the treatment resource is limited so that there is a maximum proportion of the population which can be treated. We give a general closed-form expression for an optimal stochastic DTR ...
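The flavor of a resource-constrained rule can be sketched as follows (a simplified deterministic version with hypothetical effect estimates, not the paper's general stochastic closed form): when at most a fraction kappa can be treated, treat the units with the largest positive conditional treatment effects.

```python
# Sketch: treat the top kappa fraction of units ranked by (estimated)
# conditional average treatment effect, among those with positive effect.

import numpy as np

def constrained_rule(cate, kappa):
    """Boolean treatment assignment under a resource constraint."""
    n = len(cate)
    k = int(np.floor(kappa * n))          # maximum number treatable
    order = np.argsort(cate)[::-1]        # largest effects first
    treat = np.zeros(n, dtype=bool)
    chosen = order[:k]
    treat[chosen[cate[chosen] > 0]] = True  # never treat a harmful effect
    return treat

cate = np.array([0.5, -0.2, 0.9, 0.1, 0.3])
rule = constrained_rule(cate, kappa=0.4)  # only two units can be treated
```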

The Game Of Thrones: A Study Of Power Networks And How They Change, 2015 Utah State University

#### The Game Of Thrones: A Study Of Power Networks And How They Change, Trevor Williams

*Research on Capitol Hill*

No abstract provided.

Negative Binomial Regression, 2nd Ed, 2nd Print, Errata And Comments, 2015 Arizona State University

#### Negative Binomial Regression, 2nd Ed, 2nd Print, Errata And Comments, Joseph Hilbe

*Joseph M Hilbe*

Errata and comments for the 2nd printing of NBR2, 2nd edition. All errata from the first printing have been corrected, and some new text has been added.

Modeling Count Data; Errata And Comments, 2015 Arizona State University

#### Modeling Count Data; Errata And Comments, Joseph M. Hilbe

*Joseph M Hilbe*

Modeling Count Data: Errata and Comments PDF. Will be updated on a continuing basis.

Applying Multiple Imputation For External Calibration To Propensity Score Analysis, 2015 Johns Hopkins University, Bloomberg School of Public Health, Department of Biostatistics

#### Applying Multiple Imputation For External Calibration To Propensity Score Analysis, Yenny Webb-Vargas, Kara E. Rudolph, D. Lenis, Peter Murakami, Elizabeth A. Stuart

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

Although covariate measurement error is likely the norm rather than the exception, methods for handling covariate measurement error in propensity score methods have not been widely investigated. We consider a multiple imputation-based approach that uses an external calibration sample with information on the true and mismeasured covariates, Multiple Imputation for External Calibration (MI-EC), to correct for the measurement error, and investigate its performance using simulation studies. As expected, using the covariate measured with error leads to bias in the treatment effect estimate. In contrast, the MI-EC method can eliminate almost all the bias. We confirm that the outcome must be ...
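The calibration-based imputation step can be sketched on toy data (a simplified regression-calibration-style draw, not the authors' exact MI-EC algorithm): fit the true covariate on its error-prone version in the external calibration sample, then impute the true covariate in the main study, adding residual noise to propagate uncertainty across imputations.

```python
# Toy sketch of imputing a true covariate X from an error-prone W using an
# external calibration sample that observes both; rates are hypothetical.

import numpy as np

rng = np.random.default_rng(2)

# external calibration sample: both true X and mismeasured W observed
x_cal = rng.normal(size=200)
w_cal = x_cal + rng.normal(scale=0.5, size=200)

slope, intercept = np.polyfit(w_cal, x_cal, 1)
resid_sd = np.std(x_cal - (intercept + slope * w_cal))

# main study observes only W; draw M imputations of X with residual noise
w_main = rng.normal(size=1000)
M = 5
imputations = [intercept + slope * w_main +
               rng.normal(scale=resid_sd, size=w_main.size)
               for _ in range(M)]
# each imputed X would then feed a separate propensity score analysis,
# with results combined by standard multiple-imputation rules
```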

Adaptive, Group Sequential Designs That Balance The Benefits And Risks Of Wider Inclusion Criteria, 2015 Johns Hopkins Bloomberg School of Public Health, Department of Biostatistics

#### Adaptive, Group Sequential Designs That Balance The Benefits And Risks Of Wider Inclusion Criteria, Michael Rosenblum, Brandon S. Luber, Richard E. Thompson, Daniel F. Hanley

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

We propose a new class of adaptive randomized trial designs aimed at gaining the advantages of wider generalizability and faster recruitment, while mitigating the risks of including a population for which there is greater a priori uncertainty. Our designs use adaptive enrichment, i.e., they have preplanned decision rules for modifying enrollment criteria based on data accrued at interim analyses. For example, enrollment can be restricted if the participants from predefined subpopulations are not benefiting from the new treatment. To the best of our knowledge, our designs are the first adaptive enrichment designs to have all of the following features ...
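The interim decision rule can be sketched in miniature (hypothetical statistics and boundary, not the paper's preplanned rules): at an interim analysis, enrollment stays open only for the predefined subpopulations whose estimated benefit clears a futility threshold.

```python
# Toy sketch of an adaptive-enrichment interim rule: close enrollment for
# subpopulations whose interim test statistic falls at or below a futility
# boundary (both the statistics and the boundary here are hypothetical).

def update_enrollment(interim_z, futility_boundary=0.0):
    """Return the set of subpopulations that remain open to enrollment."""
    return {sub for sub, z in interim_z.items() if z > futility_boundary}

open_subpops = update_enrollment({"subpop_1": 1.8, "subpop_2": -0.4})
# enrollment is restricted to the subpopulation still showing benefit
```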

Review Of Naked Statistics: Stripping The Dread From Data By Charles Wheelan, 2015 Dakota Wesleyan University

#### Review Of Naked Statistics: Stripping The Dread From Data By Charles Wheelan, Michael T. Catalano

*Numeracy*

Wheelan, Charles. *Naked Statistics: Stripping the Dread from Data* (New York, NY, W. W. Norton & Company, 2014). 282 pp. ISBN 978-0-393-07195-5

In his review of *What Numbers Say* and *The Numbers Game*, Rob Root (*Numeracy* 3(1): 9) writes “Popular books on quantitative literacy need to be easy to read, reasonably comprehensive in scope, and include examples that are thought-provoking and memorable.” Wheelan’s book certainly meets this description, and should be of interest to both the general public and those with a professional interest in numeracy. A moderately diligent learner can get a decent understanding of basic statistics from ...