Open Access. Powered by Scholars. Published by Universities.®

Statistics and Probability Commons

5,322 Full-Text Articles 7,359 Authors 1,113,892 Downloads 131 Institutions

All Articles in Statistics and Probability

5,322 full-text articles. Page 1 of 113.

Image Segmentation Using Fuzzy-Spatial Taxon Cut, Lauren Barghout 2015 U.C. Berkeley

MODVIS Workshop

Images convey multiple meanings that depend on the context in which the viewer perceptually organizes the scene. This presents a problem for automated image segmentation, because it adds uncertainty to the process of selecting which objects to include or exclude within a segment. I’ll discuss the implementation of a fuzzy-logic natural-vision-processing engine that solves this problem by assuming the scene architecture prior to processing. The scene architecture is a standardized natural-scene-perception taxonomy composed of a hierarchy of nested spatial-taxons. Spatial-taxons are regions (pixel sets) that are figure-like, in that they are perceived as having a contour, are either 'thing-like', or a 'group ...


Video Event Understanding With Pattern Theory, Fillipe Souza, Sudeep Sarkar, Anuj Srivastava, Jingyong Su 2015 University of South Florida

MODVIS Workshop

We propose a combinatorial approach built on Grenander’s pattern theory to generate semantic interpretations of video events of human activities. The basic units of representations, termed generators, are linked with each other using pairwise connections, termed bonds, that satisfy predefined relations. Different generators are specified for different levels, from (image) features at the bottom level to (human) actions at the highest, providing a rich representation of items in a scene. The resulting configurations of connected generators provide scene interpretations; the inference goal is to parse given video data and generate high-probability configurations. The probabilistic structures are imposed using energies ...


The Psychophysics Of Metacognition And Meta D', S A. Klein 2015 UC Berkeley

MODVIS Workshop

In the past five years there has been a surge of renewed interest in metacognition and meta-d'. It is a very interesting and highly controversial area of research. It is interesting because thinking about subjective experiences provides new insight into decision making. The new book on the topic edited by Fleming and Frith, and the Maniscalco article in that book, provide an excellent summary of the issues. My view is that double-judgment signal detection theory, plus new approaches to multinomial modeling, can provide important insights into the recent meta-d' findings. I will show how improved rating scale ...


Binocular 3D Motion Perception As Bayesian Inference, Martin Lages, Suzanne Heron 2015 University of Glasgow

MODVIS Workshop

The human visual system encodes monocular motion and binocular disparity input before it is integrated into a single 3D percept. Here we propose a geometric-statistical model of human 3D motion perception that solves the aperture problem in 3D by assuming that (i) velocity constraints arise from inverse projection of local 2D velocity constraints in a binocular viewing geometry, (ii) noise from monocular motion and binocular disparity processing is independent, and (iii) slower motions are more likely to occur than faster ones. In two experiments we found that instantiation of this Bayesian model can explain perceived 3D line motion direction under ...


Model Of Cost-Effectiveness Of Mri For Women Of Average Lifetime Risk Of Breast Cancer, Mckenna L. Kimball 2015 Dominican University of California

Scholarly and Creative Works Conference

Background: Mammography is the current standard for breast cancer detection; however, magnetic resonance imaging (MRI) is a more sensitive method of breast imaging. Despite this increased sensitivity, MRI has more false positives and higher costs. The purpose of this study was to determine whether MRI, or MRI in conjunction with mammography, is a cost-effective solution for breast cancer detection in women with average lifetime risk of breast cancer.

Methods: A mathematical model was used to compare annual mammography, annual MRI, and mammography and MRI on alternate years. The model included the natural history of breast cancer, screening by mammography ...
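A cost-effectiveness comparison of this kind typically reduces to an incremental cost-effectiveness ratio (ICER). The sketch below uses invented placeholder costs and effectiveness values, not figures from this study:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: the extra cost paid per
    extra unit of effectiveness (e.g. per quality-adjusted life
    year, QALY) when switching screening strategies."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical numbers only: lifetime screening cost and QALYs for
# annual MRI vs. annual mammography in an average-risk woman.
ratio = icer(cost_new=12000.0, cost_old=3000.0,
             effect_new=20.15, effect_old=20.05)
# The new strategy is judged cost-effective if this ratio falls
# below a chosen willingness-to-pay threshold.
```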


Recent Periods Of Financial Turbulence On The Russian Stock Market And Their Effect On Price Correlation And Value At Risk, Alexander Logoveev, Gregory Cherinko 2015 Financial University under the Government of the RF, Moscow

Undergraduate Economic Review

The aim of this article is to observe and analyze recent periods of financial turbulence on the Russian stock market and to determine their influence on the correlation coefficients between asset prices and on the Value at Risk measure for a portfolio. Our task was to describe the previously observed phenomenon of correlations increasing during financial crises, treated in our research as separate Black Swans. Based on analysis of up-to-date financial data, we identified correlation trends that can be useful in risk management and applied the Value at Risk method.
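The Value at Risk method referenced here can be illustrated with a minimal historical-simulation sketch; the return series, confidence level, and Pearson helper below are invented for illustration and are not the authors' data or code:

```python
def historical_var(returns, alpha=0.95):
    """One-period historical Value at Risk: the loss level exceeded
    with probability (1 - alpha) in the empirical distribution."""
    losses = sorted(-r for r in returns)              # losses, ascending
    k = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[k]

def pearson(xs, ys):
    """Pearson correlation coefficient between two return series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented daily returns for one asset; with only ten observations
# the 95% VaR is simply the worst loss in the sample.
returns = [0.01, -0.02, 0.005, -0.05, 0.03,
           -0.01, 0.02, -0.03, 0.015, -0.005]
var95 = historical_var(returns)
```

During a crisis, rising pairwise correlations between portfolio assets reduce diversification and push portfolio VaR upward, which is the phenomenon the article studies.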


Simulation Of Semicompeting Risk Survival Data And Estimation Based On Multistate Frailty Model, Fei Jiang, Sebastien Haneuse 2015 Harvard University

Harvard University Biostatistics Working Paper Series

We develop a procedure to simulate semicompeting risks survival data. In addition, we introduce an EM algorithm and a B-spline based estimation procedure to evaluate and implement the nonparametric likelihood estimation approach of Xu et al. (2010). The simulation procedure provides a route to simulate samples from the likelihood introduced by Xu et al. (2010). Further, the EM algorithm and the B-spline methods stabilize the estimation and give accurate results. We illustrate the simulation and estimation procedures with simulation examples and a real data analysis.
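As a rough illustration of what simulating semicompeting risks data involves (a sketch, not the authors' procedure), the code below draws event times from an illness-death model with a shared gamma frailty; all rate parameters and the frailty shape are arbitrary illustration values:

```python
import random

def simulate_semicompeting(n, lam01=0.1, lam02=0.1, lam12=0.2,
                           frailty_shape=2.0, seed=0):
    """Simulate semicompeting risks data from an illness-death model.
    Each subject can experience a non-terminal event (illness, rate
    lam01), a terminal event (death, rate lam02), and death after
    illness (rate lam12); every hazard is multiplied by a mean-one
    gamma frailty shared within the subject."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        g = rng.gammavariate(frailty_shape, 1.0 / frailty_shape)
        t_ill = rng.expovariate(g * lam01)
        t_death = rng.expovariate(g * lam02)
        if t_ill < t_death:
            # Illness first: the death hazard changes to lam12.
            t_death = t_ill + rng.expovariate(g * lam12)
            data.append((t_ill, t_death, True))
        else:
            # Death first: illness can never be observed, which is
            # what makes the risks "semicompeting".
            data.append((float("inf"), t_death, False))
    return data
```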


Introduction To Targeted Learning, Laura Balzer 2015 University of California, Berkeley

Laura B. Balzer

No abstract provided.


Examining The Literature On “Networks In Space And In Time.” An Introduction, Luca De Benedictis, Prosperina Vitale, Stanley Wasserman 2015 DED - University of Macerata - Italy

Luca De Benedictis

The Network science special issue of “Networks in space and in time: methods and applications” contributes to the debate on contextual analysis in network science. It includes seven research papers that shed light on the analysis of network phenomena studied within geographic space and across temporal dimensions. In these papers, methodological issues as well as specific applications are described from different fields. We take the seven papers, study their citations and texts, and relate them to the broader literature. By exploiting the bibliographic information and the textual data of these seven documents, citation analysis and lexical correspondence analysis allow us ...


Relationship Between High School Math Course Selection And Retention Rates At Otterbein University, Lauren A. Fisher 2015 Otterbein University

Honor's Papers

Binary logistic regression was used to study the relationship between high school math course selection and retention rates at Otterbein University. Graduation rates from postsecondary institutions are low in the United States and, more specifically, at Otterbein. This study is important in helping to determine what can raise retention rates and, ultimately, graduation rates. It directs focus toward high school math course selection and what should be changed before entering a postsecondary institution. Otterbein will have a better idea of what type of students to recruit and which students may be good candidates with some extra help. Recruiting is expensive ...
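A minimal sketch of the model class named in the abstract, binary logistic regression fit by gradient ascent; the predictor coding and the toy retention data below are hypothetical, not Otterbein's data:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit P(retained) = 1 / (1 + exp(-(b0 + b1 * x))) by gradient
    ascent on the log-likelihood (single predictor, no libraries)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient of log-likelihood wrt b0
            g1 += (y - p) * x    # gradient wrt b1
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical data: x codes the highest high-school math course
# taken (1 = algebra ... 4 = calculus); y is 1 if the student was
# retained. A positive b1 means higher courses raise the odds.
xs = [1, 1, 2, 2, 3, 3, 4, 4]
ys = [0, 0, 0, 1, 1, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
```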


Targeted Estimation And Inference For The Sample Average Treatment Effect, Laura B. Balzer, Maya L. Petersen, Mark J. van der Laan 2015 Division of Biostatistics, University of California, Berkeley - the SEARCH Consortium

U.C. Berkeley Division of Biostatistics Working Paper Series

While the population average treatment effect has been the subject of extensive methods and applied research, less consideration has been given to the sample average treatment effect: the mean difference in the counterfactual outcomes for the study units. The sample parameter is easily interpretable and is arguably the most relevant when the study units are not representative of a greater population or when the exposure's impact is heterogeneous. Formally, the sample effect is not identifiable from the observed data distribution. Nonetheless, targeted maximum likelihood estimation (TMLE) can provide an asymptotically unbiased and efficient estimate of both the population and ...
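The sample average treatment effect itself is simple to state; a small simulated sketch (hypothetical data, not the authors' TMLE estimator) shows the parameter as the mean unit-level difference in counterfactual outcomes:

```python
import random

def sample_ate(y1, y0):
    """Sample average treatment effect: the mean difference in the
    counterfactual outcomes over the n units actually in the study."""
    return sum(a - b for a, b in zip(y1, y0)) / len(y1)

# Simulated units with heterogeneous effects: the population effect
# used to generate them is 1.0, but the sample effect for these
# particular 50 units will generally differ from it, which is why
# the sample parameter is distinct from the population parameter.
rng = random.Random(7)
y0 = [rng.gauss(0.0, 1.0) for _ in range(50)]
y1 = [y + rng.gauss(1.0, 2.0) for y in y0]   # unit-level effects vary
sate = sample_ate(y1, y0)
```

In practice only one of the two counterfactual outcomes is observed per unit, which is exactly why the sample effect is not identifiable and must be estimated as the abstract describes.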


Inequality In Treatment Benefits: Can We Determine If A New Treatment Benefits The Many Or The Few?, Emily Huang, Michael Rosenblum 2015 Johns Hopkins University, Bloomberg School of Public Health, Department of Biostatistics

Johns Hopkins University, Dept. of Biostatistics Working Papers

The primary analysis in many randomized controlled trials focuses on the average treatment effect and does not address whether treatment benefits are widespread or limited to a select few. This problem affects many disease areas, since it stems from how randomized trials, often the gold standard for evaluating treatments, are designed and analyzed. Our goal is to estimate the fraction who benefit from a treatment, based on randomized trial data. We consider cases where the primary outcome is continuous, discrete, or ordinal. In general, the fraction who benefit is a non-identifiable parameter, and the best that can be obtained are ...


Validated Automatic Brain Extraction Of Head Ct Images, John Muschelli III, Natalie L. Ullman, Daniel F. Hanley, Paul Vespa, Ciprian M. Crainiceanu 2015 Johns Hopkins University

John Muschelli III

Background

X-ray Computed Tomography (CT) imaging of the brain is commonly used in diagnostic settings. Although CT scans are primarily used in clinical practice, they are increasingly used in research. A fundamental processing step in brain imaging research is brain extraction – the process of separating the brain tissue from all other tissues. Methods for brain extraction have either been validated but not fully automated, or have been fully automated and informally proposed, but never formally validated.

Aim

To systematically analyze and validate the performance of FSL’s brain extraction tool (BET) on head CT images of patients with intracranial ...


Session B-2: The “Roll” Of Statistics In Modeling - It All Adds Up, Richard Stalmack, Janice Krouse 2015 Illinois Mathematics and Science Academy

Professional Learning Day

The Common Core practice standards ask us to teach students to propose mathematical models and test their viability. Participants will do an experiment, collect data, and use technological tools to combine modeling, analysis, and basic statistics. Participants should bring a laptop if possible; otherwise, bring a graphing calculator.
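Assuming the session's experiment involves dice (an assumption suggested only by the "roll" pun in the title), a simulation along these lines reproduces the data-collection step participants would do by hand:

```python
import random

def roll_experiment(n_rolls, n_dice=2, seed=1):
    """Simulate rolling n_dice fair six-sided dice n_rolls times and
    return the list of sums -- data that can then be modeled and
    compared against the theoretical distribution."""
    rng = random.Random(seed)
    return [sum(rng.randint(1, 6) for _ in range(n_dice))
            for _ in range(n_rolls)]

data = roll_experiment(10000)
sample_mean = sum(data) / len(data)
# The theoretical mean of the sum of two fair dice is 7.0; with
# 10,000 rolls the sample mean should land close to that value.
```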


Statistical Estimation Of T1 Relaxation Time Using Conventional Magnetic Resonance Imaging, Amanda Mejia, Elizabeth M. Sweeney, Blake Dewey, Govind Nair, Pascal Sati, Colin Shea, Daniel S. Reich, Russell T. Shinohara 2015 Department of Biostatistics, Bloomberg School of Public Health, Johns Hopkins University

UPenn Biostatistics Working Papers

Quantitative T1 maps (qT1) are often used to study diffuse tissue abnormalities that may be difficult to assess on standard clinical sequences. While qT1 maps can provide valuable information for studying the progression and treatment of diseases like multiple sclerosis, the additional scan time required and multi-site implementation issues have limited their inclusion in many standard clinical and research protocols. Hence, the availability of qT1 maps has historically been limited.

In this paper, we propose a new method of estimating T1 maps retroactively that only requires the acquisition or availability of four conventional MRI sequences ...


Optimal Dynamic Treatments In Resource-Limited Settings, Alexander R. Luedtke, Mark J. van der Laan 2015 University of California, Berkeley, Division of Biostatistics

U.C. Berkeley Division of Biostatistics Working Paper Series

A dynamic treatment rule (DTR) is a treatment rule which assigns treatments to individuals based on (a subset of) their measured covariates. An optimal DTR is the DTR which maximizes the population mean outcome. Previous works in this area have assumed that treatment is an unlimited resource so that the entire population can be treated if this strategy maximizes the population mean outcome. We consider optimal DTRs in settings where the treatment resource is limited so that there is a maximum proportion of the population which can be treated. We give a general closed-form expression for an optimal stochastic DTR ...
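One intuitive way a resource constraint enters such a rule (a simplified sketch, not the paper's closed-form optimal stochastic rule) is to treat the units with the largest estimated individual effects until a fraction kappa of the population has been treated:

```python
def constrained_rule(effects, kappa):
    """Given estimated individual treatment effects and a resource
    constraint kappa (the maximum fraction of the population that
    can be treated), treat the units with the largest estimated
    effects until the budget is spent, and never treat a unit whose
    estimated effect is not positive. Simplified illustration only."""
    n = len(effects)
    budget = int(kappa * n)
    ranked = sorted(range(n), key=lambda i: effects[i], reverse=True)
    chosen = set(ranked[:budget])
    return [i in chosen and effects[i] > 0 for i in range(n)]

# With kappa = 0.4 only two of these five units can be treated, so
# the rule picks the two largest positive estimated effects.
rule = constrained_rule([0.5, -0.2, 0.9, 0.1, 0.3], kappa=0.4)
```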


The Game Of Thrones: A Study Of Power Networks And How They Change, Trevor Williams 2015 Utah State University

Research on Capitol Hill

No abstract provided.


Negative Binomial Regression, 2nd Ed, 2nd Print, Errata And Comments, Joseph Hilbe 2015 Arizona State University

Joseph M Hilbe

Errata and comments for the 2nd printing of NBR2, 2nd edition. All errata from the first printing have been corrected. Some added and new text as well.


Modeling Count Data; Errata And Comments, Joseph M. Hilbe 2015 Arizona State University

Joseph M Hilbe

Modeling Count Data: Errata and Comments PDF. Will be updated on a continuing basis.


Applying Multiple Imputation For External Calibration To Propensity Score Analysis, Yenny Webb-Vargas, Kara E. Rudolph, D. Lenis, Peter Murakami, Elizabeth A. Stuart 2015 Johns Hopkins University, Bloomberg School of Public Health, Department of Biostatistics

Johns Hopkins University, Dept. of Biostatistics Working Papers

Although covariate measurement error is likely the norm rather than the exception, methods for handling covariate measurement error in propensity score methods have not been widely investigated. We consider a multiple imputation-based approach that uses an external calibration sample with information on the true and mismeasured covariates, Multiple Imputation for External Calibration (MI-EC), to correct for the measurement error, and investigate its performance using simulation studies. As expected, using the covariate measured with error leads to bias in the treatment effect estimate. In contrast, the MI-EC method can eliminate almost all the bias. We confirm that the outcome must be ...


Digital Commons powered by bepress