Predicting Intraday Financial Market Dynamics Using Takens' Vectors; Incorporating Causality Testing And Machine Learning Techniques, 2015 East Tennessee State University

#### Predicting Intraday Financial Market Dynamics Using Takens' Vectors; Incorporating Causality Testing And Machine Learning Techniques, Abubakar-Sadiq Bouda Abdulai

*Electronic Theses and Dissertations*

Traditional approaches to predicting financial market dynamics tend to be linear and stationary, whereas financial time series data are increasingly nonlinear and non-stationary. Recently, advances in dynamical systems theory have enabled the extraction of complex dynamics from time series data; these developments include the theory of time delay embedding and phase space reconstruction of dynamical systems from a scalar time series. In this thesis, a time delay embedding approach for predicting intraday stock or stock index movement is developed. The approach combines methods of nonlinear time series analysis with those of causality testing, the theory of dynamical systems, and machine learning (artificial ...
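The core construction behind this approach, delay embedding into Takens' vectors, can be sketched as follows. The function name, parameters, and the sine-wave stand-in series are illustrative, not taken from the thesis:

```python
import numpy as np

def takens_vectors(x, dim=3, delay=1):
    """Delay-embed a scalar series: row t is [x_t, x_{t+delay}, ..., x_{t+(dim-1)*delay}]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay  # number of complete delay vectors
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

series = np.sin(np.linspace(0, 20, 200))  # stand-in for an intraday price series
V = takens_vectors(series, dim=3, delay=5)
print(V.shape)  # (190, 3)
```

In practice the embedding dimension and delay are usually chosen with diagnostics such as false nearest neighbors and average mutual information rather than fixed in advance.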

Addition To Pglr Chap 6, 2015 Arizona State University

#### Addition To Pglr Chap 6, Joseph M. Hilbe

*Joseph M Hilbe*

Addition to Chapter 6 in Practical Guide to Logistic Regression. Added section on Bayesian logistic regression using Stata.

Hilbe-Pglr-Errata-And-Comments, 2015 Arizona State University

#### Hilbe-Pglr-Errata-And-Comments, Joseph M. Hilbe

*Joseph M Hilbe*

Errata and Comments for Practical Guide to Logistic Regression

An Analysis Of The Characteristics And Practices Of Selected Alabama Small Livestock Producers: A Focus On Economics And Marketing, 2015 Tuskegee University

#### An Analysis Of The Characteristics And Practices Of Selected Alabama Small Livestock Producers: A Focus On Economics And Marketing, Janette R. Bartlett, Nii O. Tackie, Mst Nusrat Jahan, Akua Adu-Gyamfi

*Professional Agricultural Workers Journal*

**Abstract**

The study examined the characteristics and practices of small livestock producers, focusing on economics and marketing. Data were obtained from a convenience sample of 121 small producers from several South Central Alabama counties, and were analyzed using descriptive statistics, including chi-square tests. The socioeconomic characteristics reflected a higher proportion of part-time farmers; a higher proportion with at most a two-year/technical degree or some college education; and a higher proportion with $40,000 or less annual household income. A majority had been farming more than thirty years, and most had small herds. Also, very few made profits; many sold ...

A Nonlinear Filter For Markov Chains And Its Effect On Diffusion Maps, 2015 Yale University

#### A Nonlinear Filter For Markov Chains And Its Effect On Diffusion Maps, Stefan Steinerberger

*Yale Day of Data*

Diffusion maps are a modern mathematical tool for finding structure in large data sets. We present a new filtering technique, based on the assumption that errors in the data are intrinsically random, to isolate and filter errors and thus boost the efficiency of diffusion maps. Applications include data sets from medicine (the Cleveland Heart Disease data set and the Wisconsin Breast Cancer data set) and engineering (the Ionosphere data set).
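A minimal diffusion-map construction, for orientation only (the filtering technique described in the abstract is not reproduced here; names and parameters are ours):

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_coords=2):
    """Embed rows of X using the leading nontrivial eigenvectors of a Markov matrix."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    K = np.exp(-d2 / eps)                                # Gaussian affinity kernel
    P = K / K.sum(axis=1, keepdims=True)                 # row-stochastic transition matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)[1 : n_coords + 1]     # skip the trivial eigenvalue 1
    return vecs.real[:, order] * vals.real[order]

rng = np.random.default_rng(0)
coords = diffusion_map(rng.normal(size=(50, 3)))
print(coords.shape)  # (50, 2)
```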

A Machine Learning Approach To Post-Market Surveillance Of Medical Devices, 2015 Yale University

#### A Machine Learning Approach To Post-Market Surveillance Of Medical Devices, Jonathan Bates, Shu-Xia Li, Craig Parzynski, Ronald Coifman, Harlan Krumholz, Joseph Ross

*Yale Day of Data*

Post-market surveillance is a collection of processes and activities used by product manufacturers and regulators, such as the U.S. Food and Drug Administration (FDA) to monitor the safety and effectiveness of medical devices once they are available for use “on the market”. These activities are designed to generate information to identify poorly performing devices and other safety problems, accurately characterize real-world device performance and clinical outcomes, and facilitate the development of new devices, or new uses for existing devices. Typically, a device is monitored by comparing adverse events in the exposed population to a matched unexposed population. This research ...

K-Mer Analysis On Developmental And Housekeeping Enhancer Peaks, 2015 Yale University

#### K-Mer Analysis On Developmental And Housekeeping Enhancer Peaks, Yunsi Yang, Anurag Sethi, Mark Gerstein

*Yale Day of Data*

The regulation of gene expression involves interaction between transcriptional enhancers and core promoters. However, the separation between developmental and housekeeping gene regulation remains unknown. Here, we present a method to detect whether different core promoters exhibit specificity to certain enhancers within massively parallel assays for enhancer detection. We use k-mers of various lengths (3–8 bp) as sequence features and compare k-mer frequencies between developmental and housekeeping enhancers. This method shows promoter specificity of enhancers in *D. melanogaster*.
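K-mer frequency features of the kind described can be computed directly; a minimal sketch, with an illustrative toy sequence and k:

```python
from collections import Counter
from itertools import product

def kmer_freqs(seq, k):
    """Frequency of every DNA k-mer in seq (sliding window, overlapping)."""
    counts = Counter(seq[i : i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {"".join(km): counts["".join(km)] / total
            for km in product("ACGT", repeat=k)}

f = kmer_freqs("ACGTACGTAA", 3)
print(f["ACG"])  # 0.25
```

Comparing two enhancer classes would then reduce to comparing the resulting frequency vectors, e.g. per-k-mer enrichment between the developmental and housekeeping sets.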

Control-Group Feature Normalization For Multivariate Pattern Analysis Using The Support Vector Machine, 2015 Department of Biostatistics and Epidemiology, Perelman School of Medicine, University of Pennsylvania

#### Control-Group Feature Normalization For Multivariate Pattern Analysis Using The Support Vector Machine, Kristin A. Linn, Bilwaj Gaonkar, Jimit Doshi, Christos Davatzikos, Russell T. Shinohara

*UPenn Biostatistics Working Papers*

Normalization of feature vector values is a common practice in machine learning. Typically, each feature is scaled to the unit hypercube or standardized to zero mean and unit variance. Classification decisions based on support vector machines (SVMs) and other methods are sensitive to the specific normalization applied to the features. In the context of multivariate pattern analysis using neuroimaging data, standardization effectively up- and down-weights features based on their individual variability. Since the standard approach uses the entire data set to guide the normalization, it utilizes the total variability of these features. This total variation is inevitably ...
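The alternative the title points toward, estimating the normalization from the control group only and applying it to all subjects, can be sketched as follows (function and variable names are ours, not the paper's):

```python
import numpy as np

def control_group_standardize(X, is_control):
    """Standardize every subject's features using control-group mean and SD only."""
    ctrl = X[is_control]
    mu = ctrl.mean(axis=0)
    sd = ctrl.std(axis=0, ddof=1)
    return (X - mu) / sd

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))              # 100 subjects, 5 imaging features
is_control = np.arange(100) < 50           # first 50 subjects are controls
Z = control_group_standardize(X, is_control)
print(Z[is_control].mean(axis=0).round(6))  # ~[0 0 0 0 0]
```

By construction the controls have zero mean and unit variance on every feature, so between-group differences are preserved rather than absorbed into the scaling.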

Addressing Confounding In Predictive Models With An Application To Neuroimaging, 2015 Department of Biostatistics and Epidemiology, Perelman School of Medicine, University of Pennsylvania

#### Addressing Confounding In Predictive Models With An Application To Neuroimaging, Kristin A. Linn, Bilwaj Gaonkar, Jimit Doshi, Christos Davatzikos, Russell T. Shinohara

*UPenn Biostatistics Working Papers*

Understanding structural changes in the brain that are caused by a particular disease is a major goal of neuroimaging research. Multivariate pattern analysis (MVPA) comprises a collection of tools that can be used to understand complex disease effects across the brain. We discuss several important issues that must be considered when analyzing data from neuroimaging studies using MVPA. In particular, we focus on the consequences of confounding by non-imaging variables such as age and sex on the results of MVPA. After reviewing current practice to address confounding in neuroimaging studies, we propose an alternative approach based on inverse probability weighting ...

Factors Affecting Dimensional Precision Of Consumer 3d Printing, 2015 Embry-Riddle Aeronautical University

#### Factors Affecting Dimensional Precision Of Consumer 3d Printing, David D. Hernandez

*International Journal of Aviation, Aeronautics, and Aerospace*

This paper investigates the factors affecting dimensional precision of consumer-grade 3D printing, attempting to isolate and mitigate sources of error. The focus is on creating engineering prototypes of, tooling for, or finalized instances of mechanical devices. A specific fused deposition modeling printer – the Ultimaker 2 – is analyzed in terms of meeting precise physical dimensions, consistent shapes, and predictable surface finish. Extensive trial and error resulted in removal of several sources of bias, with square test articles exhibiting a lower-than-anticipated mean percentage error of -0.387% (*SD* = 0.559), a value comparable to other modern manufacturing techniques. A full factorial design ...

On Varieties Of Doubly Robust Estimators Under Missing Not At Random With An Ancillary Variable, 2015 Beijing University

#### On Varieties Of Doubly Robust Estimators Under Missing Not At Random With An Ancillary Variable, Wang Miao, Eric Tchetgen Tchetgen

*Harvard University Biostatistics Working Paper Series*

No abstract provided.

Generalizing Evidence From Randomized Trials Using Inverse Probability Of Sampling Weights, 2015 Harvard University

#### Generalizing Evidence From Randomized Trials Using Inverse Probability Of Sampling Weights, Ashley L. Buchanan, Michael G. Hudgens, Stephen R. Cole, Katie Mollan, Paul E. Sax, Eric Daar, Adaora A. Adimora, Joseph Eron, Michael Mugavero

*The University of North Carolina at Chapel Hill Department of Biostatistics Technical Report Series*

Results obtained in randomized trials may not generalize to specific target populations. In a randomized trial, the treatment assignment mechanism is known, but assuming participants are a random sample from the target population is often dubious. Lack of generalizability can occur when the distribution of treatment effect modifiers in trial participants differs from the distribution in the target population. We consider an inverse probability of sampling weighted (IPSW) estimator for generalizing trial results to a user-specified target population that differs in important clinical or demographic characteristics from the randomized trial. The IPSW estimator is shown to be consistent and asymptotically ...
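A toy version of the IPSW idea on simulated data, using the true sampling probabilities for simplicity (in practice these would be estimated, e.g., by logistic regression of trial participation on covariates); all names and the data-generating model are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
X = rng.normal(size=n)                      # effect modifier in the target population
ps = 1 / (1 + np.exp(-(0.5 - X)))           # probability of being sampled into the trial
S = rng.binomial(1, ps)                     # trial participation indicator
A = rng.binomial(1, 0.5, n) * S             # randomized treatment (trial only)
Y = 1 + 0.5 * X + (1 + 0.8 * X) * A + rng.normal(scale=0.5, size=n)

trial = S == 1
w = 1 / ps[trial]                           # inverse probability of sampling weights
a, y = A[trial], Y[trial]
ipsw_ate = (np.sum(w * a * y) / np.sum(w * a)
            - np.sum(w * (1 - a) * y) / np.sum(w * (1 - a)))
print(round(ipsw_ate, 2))  # close to the target-population ATE of 1.0
```

Because selection depends on the effect modifier X, the unweighted trial contrast is biased for the target population; the weights restore the target distribution of X among trial participants.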

Multiple Ratio Imputation Via A New Application Of The EMB Algorithm (Masayoshi Takahashi), 2015 National Statistics Center of Japan

#### Multiple Ratio Imputation Via A New Application Of The EMB Algorithm (Masayoshi Takahashi), Masayoshi Takahashi

*Masayoshi Takahashi*

No abstract provided.

On Partial Identification Of The Pure Direct Effect, 2015 University of California - Berkeley

#### On Partial Identification Of The Pure Direct Effect, Caleb Miles, Phyllis Kanki, Seema Meloni, Eric Tchetgen Tchetgen

*Harvard University Biostatistics Working Paper Series*

No abstract provided.

On The Estimation Of Intracluster Correlation For Time-To-Event Outcomes In Cluster Randomized Trials, 2015 The University of Western Ontario

#### On The Estimation Of Intracluster Correlation For Time-To-Event Outcomes In Cluster Randomized Trials, Sumeet Kalia

*Electronic Thesis and Dissertation Repository*

Cluster randomized trials (CRTs) involve the random assignment of intact social units, rather than independent subjects, to intervention groups. Time-to-event outcomes are often endpoints in CRTs, where the intracluster correlation coefficient (ICC) serves as a descriptive parameter to assess the similarity among outcomes within a cluster. However, estimating the ICC in CRTs with time-to-event outcomes is challenging due to the presence of censored observations. The ICC is estimated for two CRTs using the censoring indicators and observed outcomes.

A simulation study explores the effect of administrative censoring on estimating the ICC. Results show that the ICC estimators derived from ...

Methods For Dealing With Death And Missing Data, And For Standardizing Different Health Variables In Longitudinal Datasets: The Cardiovascular Health Study, 2015 University of Washington

#### Methods For Dealing With Death And Missing Data, And For Standardizing Different Health Variables In Longitudinal Datasets: The Cardiovascular Health Study, Paula Diehr

*UW Biostatistics Working Paper Series*

Longitudinal studies of older adults usually need to account for deaths and missing data. The databases often include multiple health-related variables whose trends over time are hard to compare because they were measured on different scales. Here we present the unified approach to these three problems that was developed and used in the Cardiovascular Health Study. Data were first transformed to a new scale that had interval/ratio properties, and on which “dead” logically takes the value zero. Missing data were then imputed on this new scale, using each person’s own data over time. Imputation could thus be informed ...
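The within-person imputation step can be illustrated with linear interpolation over each person's own observed values on a 0–100 scale where dead is coded 0 (a sketch under those assumptions; the CHS method itself is more elaborate):

```python
import numpy as np

def fill_within_person(traj):
    """Impute missing values (NaN) by linear interpolation over the person's
    own observed time points; values recorded after death remain 0."""
    traj = np.asarray(traj, dtype=float)
    t = np.arange(len(traj))
    obs = ~np.isnan(traj)
    return np.interp(t, t[obs], traj[obs])

# hypothetical trajectory on a 0-100 health scale, 0 = dead
person = [80, np.nan, 60, np.nan, 0, 0]
print(fill_within_person(person))  # [80. 70. 60. 30. 0. 0.]
```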

Completely Monotone And Bernstein Functions With Convexity Properties On Their Measures, 2015 The University of Western Ontario

#### Completely Monotone And Bernstein Functions With Convexity Properties On Their Measures, Shen Shan

*Electronic Thesis and Dissertation Repository*

The concepts of completely monotone and Bernstein functions were introduced nearly one hundred years ago. They find wide applications in areas ranging from stochastic Lévy processes and complex analysis to monotone operator theory. They have the well-known Bernstein and Lévy–Khintchine integral representations, through which there are one-to-one correspondences between them and Radon measures on $[0,\infty)$ or $(0,\infty)$, respectively. In this thesis, we investigate subclasses of completely monotone and Bernstein functions with various convexity properties on their measures. These subclasses have intriguing applications in probability theory and convex analysis.

The convexity properties we investigate include ...
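For orientation, the two classical representations referred to above are (standard statements of Bernstein's theorem and the Lévy–Khintchine formula, not results of this thesis): a function $f$ on $(0,\infty)$ is completely monotone, i.e. $(-1)^n f^{(n)}(x) \ge 0$ for all $n \ge 0$, if and only if

$$ f(x) = \int_{[0,\infty)} e^{-xt}\, \mu(dt) $$

for a Radon measure $\mu$ on $[0,\infty)$; and $g$ is a Bernstein function, i.e. $g \ge 0$ with $g'$ completely monotone, if and only if

$$ g(x) = a + b x + \int_{(0,\infty)} \left(1 - e^{-xt}\right) \mu(dt), \qquad a, b \ge 0, $$

for a measure $\mu$ on $(0,\infty)$ satisfying $\int \min(t,1)\, \mu(dt) < \infty$.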

Tropical Cyclone Wind Hazard Assessment For Southeast Part Of Coastal Region Of China, 2015 The University of Western Ontario

#### Tropical Cyclone Wind Hazard Assessment For Southeast Part Of Coastal Region Of China, Sihan Li

*Electronic Thesis and Dissertation Repository*

Tropical cyclone (TC), or typhoon, wind hazard and risk are significant for China. The return period value of the maximum typhoon wind speed is used to characterize the typhoon wind hazard and to assign the wind load in building design codes. Since historical surface observations of typhoon wind speed are often scarce and cover short periods, typhoon wind hazard assessment is often carried out using a wind field model and a TC track model. For a few major cities in the coastal region of mainland China, simple or approximate wind field models and a circular subregion method (CSM) have been used ...
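The return-period quantity at the center of such an assessment has a simple empirical counterpart: the T-year return level is the (1 - 1/T) quantile of annual maximum wind speeds. An illustrative sketch with synthetic data (actual assessments derive annual maxima from simulated TC tracks and often fit extreme-value distributions):

```python
import numpy as np

def return_level(annual_max, T):
    """Empirical T-year return level: exceeded on average once every T years."""
    return np.quantile(annual_max, 1 - 1.0 / T)

rng = np.random.default_rng(0)
# synthetic annual maximum wind speeds (m/s); Gumbel is a common extreme-value choice
annual_max = rng.gumbel(loc=35.0, scale=5.0, size=10000)
v50 = return_level(annual_max, 50)
print(round(v50, 1))  # 50-year return wind speed for this synthetic sample
```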

On The Dual Risk Models, 2015 The University of Western Ontario

#### On The Dual Risk Models, Chen Yang

*Electronic Thesis and Dissertation Repository*

This thesis focuses on developing and computing ruin-related quantities that are potential risk measurements for dual risk models, which were proposed in the 1950s to describe annuity-type businesses from the perspective of collective risk theory. In recent years, dual risk models have been revisited by many researchers to quantify the risk of businesses similar to annuity-type businesses. The major extensions in this thesis consist of two aspects: the first is to search for new ruin-related quantities that are potential indices of risk for well-established dual models; the other is to generalize the ...

Statistical Estimation Of T1 Relaxation Times Using Conventional Magnetic Resonance Imaging, 2015 Department of Biostatistics, Bloomberg School of Public Health, Johns Hopkins University

#### Statistical Estimation Of T1 Relaxation Times Using Conventional Magnetic Resonance Imaging, Amanda Mejia, Elizabeth M. Sweeney, Blake Dewey, Govind Nair, Pascal Sati, Colin Shea, Daniel S. Reich, Russell T. Shinohara

*UPenn Biostatistics Working Papers*

Quantitative *T*_{1} maps estimate *T*_{1} relaxation times and can be used to assess diffuse tissue abnormalities within normal-appearing tissue. *T*_{1} maps are popular for studying the progression and treatment of multiple sclerosis (MS). However, their inclusion in standard imaging protocols remains limited due to the additional scanning time and expert calibration required, as well as susceptibility to bias and noise. Here, we propose a new method of estimating *T*_{1} maps using four conventional MR images, which are intensity-normalized using cerebellar gray matter as a reference tissue and related to *T*_{1} using a smooth regression model. Using ...