Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Articles 1 - 20 of 20

Full-Text Articles in Physical Sciences and Mathematics

Statistical Inferences For The Youden Index, Haochuan Zhou Dec 2011

Statistical Inferences For The Youden Index, Haochuan Zhou

Mathematics Dissertations

In diagnostic test studies, one crucial task is to evaluate the diagnostic accuracy of a test. Currently, most studies focus on the receiver operating characteristic (ROC) curve and the area under the curve (AUC). The Youden index, widely applied in practice, is another comprehensive measure of the performance of a diagnostic test. For a continuous-scale test classifying diseased and non-diseased groups, finding the Youden index of the test is equivalent to maximizing the sum of sensitivity and specificity over all possible values of the cut-point. This dissertation concentrates on statistical inferences for the Youden index. First, an …
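
As an illustration of the quantity described above (a hedged sketch, not code from the dissertation; the data and the `youden_index` helper are hypothetical), the empirical Youden index can be computed by scanning the observed test values as candidate cut-points:

```python
# Illustrative sketch: empirical Youden index
# J = max over cut-points c of [sensitivity(c) + specificity(c) - 1],
# assuming higher test values indicate disease.

def youden_index(diseased, healthy):
    """Return (J, best_cutpoint), classifying a subject as diseased when
    its test value exceeds the cut-point."""
    candidates = sorted(set(diseased) | set(healthy))
    best_j, best_c = -1.0, None
    for c in candidates:
        sensitivity = sum(x > c for x in diseased) / len(diseased)
        specificity = sum(x <= c for x in healthy) / len(healthy)
        j = sensitivity + specificity - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_j, best_c

# Toy data: diseased subjects tend to score higher on the test.
diseased = [2.9, 3.1, 3.8, 4.2, 5.0]
healthy = [1.0, 1.5, 2.0, 2.5, 3.0]
j, c = youden_index(diseased, healthy)
```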


Assessment Of The Sustained Financial Impact Of Risk Engineering Service On Insurance Claims Costs, Bobby I. Parker Mr. Dec 2011

Assessment Of The Sustained Financial Impact Of Risk Engineering Service On Insurance Claims Costs, Bobby I. Parker Mr.

Mathematics Theses

This research paper creates a comprehensive statistical model relating the financial impact of risk engineering activity to insurance claims costs. Specifically, the model shows important statistical relationships among six variables: type of risk engineering activity, risk engineering dollar cost, duration of risk engineering service, type of customer by industry classification, dollar premium amounts, and dollar claims costs.

We accomplish this by using a large data sample of approximately 15,000 customer-years of insurance coverage and risk engineering activity. The data sample comes from an international casualty/property insurance company and covers four years of operations, 2006-2009. The choice of statistical model is …


On The 4 By 4 Irreducible Sign Pattern Matrices That Require Four Distinct Eigenvalues, Paul J. Kim Aug 2011

On The 4 By 4 Irreducible Sign Pattern Matrices That Require Four Distinct Eigenvalues, Paul J. Kim

Mathematics Theses

A sign pattern matrix is a matrix whose entries are from the set {+,-,0}. For a real matrix B, sgn(B) is the sign pattern matrix obtained by replacing each positive (respectively, negative, zero) entry of B by + (respectively, -, 0). For a sign pattern matrix A, the sign pattern class of A, denoted Q(A), is defined as {B: sgn(B) = A}.
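
As a toy illustration of these definitions (a sketch, not from the thesis; the `sgn` helper and the example matrix are made up), a real matrix can be mapped to its sign pattern like so:

```python
# Illustrative sketch: computing sgn(B) for a real matrix B, with each entry
# replaced by '+', '-', or '0' according to its sign.

def sgn(B):
    """Sign pattern of a real matrix given as a list of rows."""
    return [['+' if x > 0 else '-' if x < 0 else '0' for x in row] for row in B]

B = [[3.5, -0.2, 0.0],
     [0.0, 1.0, -7.0]]
A = sgn(B)
# Any real matrix whose entries have these same signs belongs to the class Q(A).
```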

An n by n sign pattern matrix A requires all distinct eigenvalues if every real matrix whose sign pattern is represented by A has n distinct eigenvalues. In this thesis, a number of sufficient and/or necessary conditions for a …


Analysis Of Faculty Evaluation By Students As A Reliable Measure Of Faculty Teaching Performance, Etienne Twagirumukiza Aug 2011

Analysis Of Faculty Evaluation By Students As A Reliable Measure Of Faculty Teaching Performance, Etienne Twagirumukiza

Mathematics Theses

Most American universities and colleges require students to provide faculty evaluations at the end of each academic term as a way of measuring faculty teaching performance. Although some analysts think that this kind of evaluation does not necessarily provide a good measurement of teaching effectiveness, there is growing agreement in the academic world about its reliability. This study attempts to find strong statistical evidence supporting faculty evaluation by students as a measure of faculty teaching effectiveness. Emphasis will be on analyzing relationships between instructor ratings by students and the corresponding students’ grades. Various statistical methods are applied to analyze a …


Discrimination Of High Risk And Low Risk Populations For The Treatment Of Stds, Hui Zhao Aug 2011

Discrimination Of High Risk And Low Risk Populations For The Treatment Of Stds, Hui Zhao

Mathematics Theses

An important step in clinical practice is to discriminate truly diseased patients from healthy persons. It would be valuable to make such discrimination from common information such as personal details, lifestyle, and contact with diseased patients. In this study, a score is calculated for each patient from survey responses through a generalized linear model, and the diseased status is then decided according to previous sexually transmitted disease (STD) records. This study will help clinics group patients as truly diseased or healthy, which in turn will affect the screening method clinics adopt: complete screening for …
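
A minimal sketch of the kind of scoring described above, with entirely hypothetical predictors and coefficients (the study fits its model from survey data; everything below is illustrative only):

```python
import math

# Hypothetical logistic-model coefficients for illustration only; in the study
# they would be estimated from survey data by a generalized linear model.
INTERCEPT = -2.0
COEFFS = {"age_under_25": 0.8, "prior_std": 1.6, "multiple_partners": 1.1}

def risk_score(features):
    """Risk score under the assumed logistic model: P(diseased | features)."""
    eta = INTERCEPT + sum(COEFFS[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-eta))

high = risk_score({"age_under_25": 1, "prior_std": 1, "multiple_partners": 1})
low = risk_score({"age_under_25": 0, "prior_std": 0, "multiple_partners": 0})
```

Patients whose score exceeds a chosen threshold would be routed to more complete screening.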


The Path From Foster Care To Permanence: Does Proximity Outweigh Stability?, Michael Fost Aug 2011

The Path From Foster Care To Permanence: Does Proximity Outweigh Stability?, Michael Fost

Mathematics Theses

This thesis investigates the relationship between foster care placement settings and discharges. Placement settings are where foster children live: foster homes, group homes, etc. There may be one or several placements for any individual child. In the interest of stability, federal funding to states depends in part on low numbers of placement moves. Federal reviews, however, do not consider whether the placement settings resemble permanent family life (foster homes compared to congregate care) or the direction of placement moves. Competing risks regression was used to analyze time-to-discharge data for foster children in Georgia. Discharges (competing risks) were compared …


Revisiting The Dimensions Of Residential Segregation, Harry Sharp Aug 2011

Revisiting The Dimensions Of Residential Segregation, Harry Sharp

Mathematics Theses

The first major work to analyze the dimensions of segregation, done in the late 1980s by Massey and Denton, found five dimensions that explained the phenomenon of segregation. Since the original work was done in 1988, it seems relevant to revisit the issue with new data. Massey and Denton used the technique of factor analysis to identify the latent structure underlying the phenomenon. In this research, their methodology is applied to a more complete data set from the 1980 Census to confirm their results and extend the methodology. Due to problems identified during the analysis, confirmation was not possible. However, …


Testing An Assumption Of Non-Differential Misclassification In Case-Control Studies, Qin Hui Aug 2011

Testing An Assumption Of Non-Differential Misclassification In Case-Control Studies, Qin Hui

Mathematics Theses

One of the issues regarding misclassification in case-control studies is whether the misclassification error rates are the same for cases and controls. A common practice is to assume that the rates are the same (the “non-differential” assumption). However, it is questionable whether this assumption is valid in many case-control studies. Unfortunately, no test is available so far to check the validity of the non-differential assumption when validation data are not available. We propose the first such method to test the validity of the non-differential assumption in a case-control study with a 2 × 2 contingency table. First, …


Analysis Of Dependently Truncated Sample Using Inverse Probability Weighted Estimator, Yang Liu Aug 2011

Analysis Of Dependently Truncated Sample Using Inverse Probability Weighted Estimator, Yang Liu

Mathematics Theses

Many statistical methods for truncated data rely on the assumption that the failure and truncation times are independent, which can be unrealistic in applications. Study cohorts obtained from bone marrow transplant (BMT) registry data are commonly recognized as truncated samples, in which the time to failure is truncated by the transplant time. There is clinical evidence that a longer transplant waiting time is associated with a worse prognosis for survivorship. Therefore, it is reasonable to assume dependence between the transplant and failure times. To better analyze BMT registry data, we utilize a Cox analysis in which the transplant time is both a truncation variable and …


Minimum Degree Conditions For Tilings In Graphs And Hypergraphs, Andrew Lightcap Aug 2011

Minimum Degree Conditions For Tilings In Graphs And Hypergraphs, Andrew Lightcap

Mathematics Theses

We consider tiling problems for graphs and hypergraphs. For two graphs G and H, an H-tiling of G is a subgraph of G consisting only of vertex-disjoint copies of H. By using the absorbing method, we give a short proof that in a balanced tripartite graph G, if every vertex is adjacent to a sufficiently large fraction of the vertices in each of the other vertex partitions, then G has a K_3-tiling. Previously, Magyar and Martin [11] proved the same result by using the Regularity Lemma.

In a 3-uniform hypergraph H, let the minimum co-degree denote the minimum number of edges that contain a given pair of vertices, taken over all pairs of vertices. …


Jackknife Empirical Likelihood For The Accelerated Failure Time Model With Censored Data, Maxime K. Bouadoumou Jul 2011

Jackknife Empirical Likelihood For The Accelerated Failure Time Model With Censored Data, Maxime K. Bouadoumou

Mathematics Theses

Kendall and Gehan estimating functions are used to estimate the regression parameter in the accelerated failure time (AFT) model with censored observations. The AFT model is a preferred survival analysis method because it maintains a consistent association between the covariate and the survival time. The jackknife empirical likelihood method is used because it overcomes computational difficulty by circumventing the construction of the nonlinear constraint. Jackknife empirical likelihood turns the statistic of interest into a sample mean based on jackknife pseudo-values. A U-statistic approach is used to construct confidence intervals for the regression parameter. We conduct a simulation study …


Prevalence Of Chronic Diseases And Risk Factors For Death Among Elderly Americans, Guangming Han Jul 2011

Prevalence Of Chronic Diseases And Risk Factors For Death Among Elderly Americans, Guangming Han

Mathematics Theses

The main aim of this study is to explore the effects of risk factors contributing to death in the elderly American population. To achieve this purpose, we constructed Cox proportional hazards regression models and logistic regression models with the complex survey dataset from the national Second Longitudinal Study of Aging (LSOA II) to calculate hazard ratios (HR)/odds ratios (OR) and confidence intervals (CI) for the risk factors. Our results show that in addition to chronic disease conditions, many risk factors, such as demographic factors (gender and age), social factors (interaction with friends or relatives), personal health behaviors (smoking and exercise), …


Estimation Algorithm For Mixture Of Experts Recurrent Event Model, Timesha U. Brooks Jun 2011

Estimation Algorithm For Mixture Of Experts Recurrent Event Model, Timesha U. Brooks

Mathematics Theses

This paper proposes a mixture-of-experts recurrent events model. This general model accommodates an unobservable frailty variable, intervention effects, the influence of accumulating event occurrences, and covariate effects. A latent class variable is utilized to deal with a heterogeneous population and associated covariates. A homogeneous nonparametric baseline hazard and heterogeneous parametric covariate effects are assumed. The maximum likelihood principle is employed to obtain parameter estimates. Since the frailty variable and latent classes are unobserved, an estimation procedure is derived through the EM algorithm. A simulated data set is generated to illustrate the data structure of recurrent events for a heterogeneous population.


Three Topics In Analysis: (I) The Fundamental Theorem Of Calculus Implies That Of Algebra, (Ii) Mini Sums For The Riesz Representing Measure, And (Iii) Holomorphic Domination And Complex Banach Manifolds Similar To Stein Manifolds, Panakkal J. Mathew May 2011

Three Topics In Analysis: (I) The Fundamental Theorem Of Calculus Implies That Of Algebra, (Ii) Mini Sums For The Riesz Representing Measure, And (Iii) Holomorphic Domination And Complex Banach Manifolds Similar To Stein Manifolds, Panakkal J. Mathew

Mathematics Dissertations

We look at three distinct topics in analysis. In the first, we give a direct and easy proof that the usual Newton-Leibniz rule implies the fundamental theorem of algebra: any nonconstant complex polynomial of one complex variable has a complex root. Next, we look at the Riesz representation theorem and show that the Riesz representing measure can often be given in the form of mini sums, just as in the case of the usual Lebesgue measure on a cube. Lastly, we look at the idea of holomorphic domination and use it to define a class of complex Banach manifolds …


Some Topics In Roc Curves Analysis, Xin Huang May 2011

Some Topics In Roc Curves Analysis, Xin Huang

Mathematics Dissertations

The receiver operating characteristic (ROC) curve is a popular tool for evaluating continuous diagnostic tests. The traditional definition of ROC curves implicitly incorporates the idea of "hard" thresholding, which also results in empirical curves that are step functions. The first topic introduces a novel definition of soft ROC curves, which incorporates the idea of "soft" thresholding. The softness of a soft ROC curve is controlled by a regularization parameter that can be selected suitably by a cross-validation procedure. A byproduct of soft ROC curves is that the corresponding empirical curves are smooth.
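
One common way to realize "soft" thresholding, sketched here as an assumption rather than the dissertation's exact formulation, is to replace the hard indicator 1{x > c} with a sigmoid whose steepness is governed by a bandwidth-type regularization parameter h:

```python
import math

# Hedged sketch: smoothed true/false positive rates obtained by replacing the
# hard indicator 1{x > c} with a sigmoid of bandwidth h. Illustrative only.

def soft_rate(values, c, h):
    """Smoothed proportion of values exceeding cut-point c."""
    return sum(1 / (1 + math.exp(-(x - c) / h)) for x in values) / len(values)

diseased = [2.0, 3.0, 4.0]
healthy = [0.5, 1.0, 1.5]
c = 1.8
tpr_soft = soft_rate(diseased, c, h=0.25)   # soft sensitivity at c
fpr_soft = soft_rate(healthy, c, h=0.25)    # soft (1 - specificity) at c
# As h -> 0, the soft rates recover the hard, step-function rates.
tpr_hard = sum(x > c for x in diseased) / len(diseased)
```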

The second topic is on …


Estimation Of Hazard Function For Right Truncated Data, Yong Jiang Apr 2011

Estimation Of Hazard Function For Right Truncated Data, Yong Jiang

Mathematics Theses

This thesis centers on nonparametric inference for the cumulative hazard function of a right-truncated variable. We present three variance estimators for the Nelson-Aalen estimator of the cumulative hazard function and conduct a simulation study to investigate their performance. A close match between the sampling standard deviation and the estimated standard error is observed when the estimated survival probability is not close to 1. However, the problem of poor tail performance exists due to the limitations of the proposed variance estimators. We further analyze an AIDS blood transfusion sample for which the disease latent time is right-truncated. We compute …
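
For reference, the Nelson-Aalen estimator itself can be sketched in its standard right-censored form on toy data (the thesis's right-truncation setting and its variance estimators are not reproduced here):

```python
# Illustrative sketch: Nelson-Aalen estimator of the cumulative hazard,
# H(t) = sum over event times t_i <= t of d_i / n_i, where d_i events occur
# among n_i subjects still at risk just before t_i.

def nelson_aalen(times, events):
    """Return [(t, H(t))] at each distinct event time.

    times: observation times; events: 1 for an observed event, 0 for censored.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    H, out, i = 0.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for (u, e) in data if u == t)      # events at time t
        ties = sum(1 for (u, _) in data if u == t)   # all subjects leaving at t
        if d > 0:
            H += d / n_at_risk
            out.append((t, H))
        n_at_risk -= ties
        i += ties
    return out

est = nelson_aalen([1, 2, 2, 3, 5], [1, 1, 0, 1, 1])
```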


A Review Of Cross Validation And Adaptive Model Selection, Ali R. Syed Apr 2011

A Review Of Cross Validation And Adaptive Model Selection, Ali R. Syed

Mathematics Theses

We perform a review of model selection procedures, in particular various cross validation procedures and adaptive model selection. We cover important results for these procedures and explore the connections between different procedures and information criteria.
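
A minimal sketch of plain k-fold cross validation, the basic procedure underlying the methods reviewed (the constant-predictor "models" below are toy examples, not from the thesis):

```python
# Illustrative sketch: k-fold cross validation scores a model by its average
# loss on held-out folds; the candidate with the lowest score is selected.

def k_fold_cv(data, k, fit, loss):
    """Average held-out loss over k contiguous folds."""
    n = len(data)
    total = 0.0
    for i in range(k):
        test = data[i * n // k:(i + 1) * n // k]
        train = data[:i * n // k] + data[(i + 1) * n // k:]
        model = fit(train)
        total += sum(loss(model, x) for x in test) / len(test)
    return total / k

fit_mean = lambda xs: sum(xs) / len(xs)   # candidate 1: predict the train mean
fit_zero = lambda xs: 0.0                 # candidate 2: always predict zero
sq_loss = lambda m, x: (x - m) ** 2

data = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8]
cv_mean = k_fold_cv(data, 3, fit_mean, sq_loss)
cv_zero = k_fold_cv(data, 3, fit_zero, sq_loss)
# The mean predictor should achieve the lower cross-validated loss here.
```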


A New Jackknife Empirical Likelihood Method For U-Statistics, Zhengbo Ma Apr 2011

A New Jackknife Empirical Likelihood Method For U-Statistics, Zhengbo Ma

Mathematics Theses

U-statistics generalize the concept of the mean of independent identically distributed (i.i.d.) random variables and are widely utilized in many estimation and testing problems. The standard empirical likelihood (EL) for U-statistics is computationally expensive because of its nonlinear constraint. The jackknife empirical likelihood method largely relieves the computational burden by circumventing the construction of the nonlinear constraint. In this thesis, we adopt a new jackknife empirical likelihood method to make inference for the general volume under the ROC surface (VUS), a typical kind of U-statistic. Monte Carlo simulations are conducted to show that the EL confidence intervals perform well in …
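
The pseudo-value construction behind this approach can be sketched as follows (illustrative only; the concordance statistic and data below are made up, not the VUS application from the thesis):

```python
# Illustrative sketch: jackknife pseudo-values turn a U-statistic into an
# (approximate) sample mean, the key step behind jackknife empirical likelihood.

def jackknife_pseudo_values(data, stat):
    """V_i = n*stat(all) - (n-1)*stat(all but i). For a U-statistic, the mean
    of the pseudo-values equals the statistic on the full sample."""
    n = len(data)
    full = stat(data)
    return [n * full - (n - 1) * stat(data[:i] + data[i + 1:]) for i in range(n)]

def concordance(xs):
    """Degree-2 U-statistic: proportion of ordered pairs (i < j) with x_i < x_j."""
    pairs = [(xs[i], xs[j]) for i in range(len(xs)) for j in range(i + 1, len(xs))]
    return sum(a < b for a, b in pairs) / len(pairs)

data = [0.2, 0.5, 0.1, 0.9]
pv = jackknife_pseudo_values(data, concordance)
mean_pv = sum(pv) / len(pv)
```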


Stability Selection Of The Number Of Clusters, Gabriella V. Reizer Apr 2011

Stability Selection Of The Number Of Clusters, Gabriella V. Reizer

Mathematics Theses

Selecting the number of clusters is one of the greatest challenges in cluster analysis. In this thesis, we propose a variety of stability selection criteria based on cross validation for determining the number of clusters. Clustering stability measures the agreement between clusterings obtained by applying the same clustering algorithm to multiple independent and identically distributed samples. We propose to measure clustering stability by the correlation between two clustering functions. These criteria are motivated by the concept of clustering instability proposed by Wang (2010), which is based on a form of clustering distance. In addition, the effectiveness and robustness of …
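
One simple stand-in for the agreement idea described above (a sketch, not the thesis's correlation-based criterion) is to score two clusterings of the same points by how often pairs of points are co-clustered in both:

```python
from itertools import combinations

# Illustrative sketch: pairwise co-membership agreement between two clusterings.
# Label values themselves don't matter; only which points share a cluster.

def co_membership(labels):
    """1 if a pair of points shares a cluster, else 0, for every pair."""
    idx = range(len(labels))
    return [int(labels[i] == labels[j]) for i, j in combinations(idx, 2)]

def agreement(labels_a, labels_b):
    """Proportion of point pairs on which the two clusterings agree."""
    a, b = co_membership(labels_a), co_membership(labels_b)
    return sum(int(x == y) for x, y in zip(a, b)) / len(a)

stable = agreement([0, 0, 1, 1], [1, 1, 0, 0])     # same partition, relabeled
unstable = agreement([0, 0, 1, 1], [0, 1, 0, 1])   # very different partition
```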


On The Lebesgue Integral, Jeremiah D. Kastine Mar 2011

On The Lebesgue Integral, Jeremiah D. Kastine

Mathematics Theses

We look from a new point of view at the definition and basic properties of the Lebesgue measure and integral on Euclidean spaces, on abstract spaces, and on locally compact Hausdorff spaces. We use mini sums to give all of them a unified treatment that is more efficient than the standard ones. We also give Fubini's theorem a proof that is nicer and uses much lighter technical baggage than the usual treatments.