Open Access. Powered by Scholars. Published by Universities.®

Survival Analysis Commons

2007

Articles 1 - 8 of 8

Full-Text Articles in Survival Analysis

Immediate Implant Placement In Extraction Sites With Periapical Lesions: A Retrospective Study, Yuan-Lung Hung Dec 2007

Loma Linda University Electronic Theses, Dissertations & Projects

Immediate implant placement into fresh extraction sites has become a relatively routine clinical procedure with a favorable prognosis. However, immediate placement into extraction sockets with lesions has not been extensively documented in humans. Therefore, the purpose of this study was to retrospectively determine the survival rate of implants placed into extraction sockets with visible periapical lesions.

Patient charts of 544 immediately placed implants from the Loma Linda University School of Dentistry, Center for Prosthodontics and Implant Dentistry, were examined. Eighty-six of the 544 implants had been placed immediately into extraction sockets with periapical lesions, and these were included in this study. …
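Implant survival rates of this kind are conventionally summarized with the Kaplan-Meier product-limit estimator, which treats implants still in function at last follow-up as right-censored. A minimal sketch, using hypothetical follow-up times rather than the study's actual chart data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve for right-censored data.

    times  : follow-up time for each implant (e.g. months)
    events : 1 if the implant failed at that time, 0 if censored
    Returns a list of (time, survival probability) points at failure times.
    """
    data = sorted(zip(times, events))          # sort by follow-up time
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # failures and total subjects tied at time t
        deaths = sum(e for (ti, e) in data if ti == t)
        n_t = sum(1 for (ti, e) in data if ti == t)
        if deaths > 0:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= n_t                       # remove everyone observed at t
        i += n_t
    return curve

# Hypothetical cohort: 10 implants, 2 failures (at 6 and 12 months)
times  = [3, 6, 6, 12, 12, 18, 24, 24, 30, 36]
events = [0, 1, 0, 0,  1,  0,  0,  0,  0,  0]
print(kaplan_meier(times, events))   # S(6) = 8/9, S(12) = 8/9 * 6/7
```

Censored implants still count in the risk set at their last visit, which is why simply dividing failures by the cohort size would understate survival.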


A Note On Targeted Maximum Likelihood And Right Censored Data, Mark J. Van Der Laan, Daniel Rubin Oct 2007

U.C. Berkeley Division of Biostatistics Working Paper Series

A popular way to estimate an unknown parameter is substitution: evaluating the parameter at a likelihood-based fit of the data-generating density. In many cases, such estimators have substantial bias and can fail to converge at the parametric rate. van der Laan and Rubin (2006) introduced targeted maximum likelihood learning, which removes these shackles from substitution estimators, bringing them into full agreement with the locally efficient estimating-equation procedures of Robins and Rotnitzky (1992) and van der Laan and Robins (2003). This note illustrates how targeted maximum likelihood can be applied in right censored data …
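A toy illustration of the estimating-equation side of this literature (not the targeted MLE itself): inverse probability of censoring weighting (IPCW) recovers a mean from right-censored observations. Here the censoring distribution G is assumed known for simplicity; a real analysis would estimate it, e.g. by Kaplan-Meier on the censoring times.

```python
import math
import random

# Toy IPCW sketch with a *known* exponential censoring distribution
# G(t) = P(C > t) = exp(-0.5 t). Event times T ~ Exp(1), so E[T] = 1.
random.seed(1)
n = 20000

naive_sum, ipcw_sum, n_uncens = 0.0, 0.0, 0
for _ in range(n):
    t = random.expovariate(1.0)    # event time
    c = random.expovariate(0.5)    # independent censoring time
    if t <= c:                     # uncensored: T is observed
        n_uncens += 1
        naive_sum += t
        ipcw_sum += t / math.exp(-0.5 * t)   # reweight by 1/G(t)

naive = naive_sum / n_uncens   # biased low: uncensored subjects have small T
ipcw = ipcw_sum / n            # unbiased, since E[Delta * T / G(T)] = E[T]
print(naive, ipcw)
```

The naive mean over uncensored subjects converges to about 2/3 in this setup, while the IPCW estimate recovers the true mean of 1; targeted maximum likelihood further improves on such weighted estimators by updating a substitution fit toward the parameter of interest.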


Empirical Efficiency Maximization, Daniel B. Rubin, Mark J. Van Der Laan Jul 2007

U.C. Berkeley Division of Biostatistics Working Paper Series

It has long been recognized that covariate adjustment can increase precision, even when it is not strictly necessary. The phenomenon is particularly emphasized in clinical trials, whether using continuous, categorical, or censored time-to-event outcomes. Adjustment is often straightforward when a discrete covariate partitions the sample into a handful of strata, but becomes more involved when modern studies collect copious amounts of baseline information on each subject.

The dilemma helped motivate locally efficient estimation for coarsened data structures, as surveyed in the books of van der Laan and Robins (2003) and Tsiatis (2006). Here one fits a relatively small working model …
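The precision gain from adjustment is easy to see in a toy randomized trial: when a baseline covariate W explains much of the outcome variance, subtracting its fitted contribution before comparing arms shrinks the standard error. A simulation sketch (all numbers illustrative, not from the paper):

```python
import random
import statistics

# Toy randomized trial: Y = tau*A + 2*W + noise, with W measured at baseline.
# Compare the unadjusted difference in means against a simple covariate-
# adjusted estimator (subtract the pooled OLS slope of Y on W, then compare).
random.seed(2)
tau = 1.0

def one_trial(n=200):
    A = [random.randint(0, 1) for _ in range(n)]
    W = [random.gauss(0, 1) for _ in range(n)]
    Y = [tau * a + 2 * w + random.gauss(0, 1) for a, w in zip(A, W)]

    def diff(ys):  # treated-arm mean minus control-arm mean
        n1 = sum(A)
        return (sum(y for y, a in zip(ys, A) if a) / n1
                - sum(y for y, a in zip(ys, A) if not a) / (n - n1))

    unadj = diff(Y)
    # pooled slope of Y on W; valid here because randomization makes A and W
    # independent, so adjustment cannot bias the treatment contrast
    wbar, ybar = sum(W) / n, sum(Y) / n
    slope = (sum((w - wbar) * (y - ybar) for w, y in zip(W, Y))
             / sum((w - wbar) ** 2 for w in W))
    adj = diff([y - slope * w for y, w in zip(Y, W)])
    return unadj, adj

results = [one_trial() for _ in range(400)]
sd_unadj = statistics.stdev(r[0] for r in results)
sd_adj = statistics.stdev(r[1] for r in results)
print(sd_unadj, sd_adj)   # the adjusted estimator has the smaller spread
```

With many baseline covariates, the working model replacing the single slope above is exactly where the choices surveyed by van der Laan and Robins (2003) and Tsiatis (2006) come in.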


Regression Analysis Of A Disease Onset Distribution Using Diagnosis Data, Jessica G. Young, Nicholas P. Jewell, Steven J. Samuels Jul 2007

U.C. Berkeley Division of Biostatistics Working Paper Series

We consider methods for estimating the effect of a covariate on a disease onset distribution when the observed data structure consists of right-censored data on diagnosis times and current status data on onset times amongst individuals who have not yet been diagnosed. Dunson and Baird (2001) approached this problem using maximum likelihood, under the assumption that the ratio of the diagnosis and onset distributions is monotonic non-decreasing. As an alternative, we propose a two-step estimator, an extension of the approach of van der Laan, Jewell and Petersen (1997) in the single sample setting, that is computationally much simpler and requires …


Survival Analysis With Large Dimensional Covariates: An Application In Microarray Studies, David A. Engler, Yi Li Jul 2007

Harvard University Biostatistics Working Paper Series

Use of microarray technology often leads to high-dimensional, low-sample-size data settings. Over the past several years, a variety of novel approaches have been proposed for variable selection in this context. However, only a small number of these have been adapted for time-to-event data, where censoring is present. Among the standard variable selection methods shown both to have good predictive accuracy and to be computationally efficient is the elastic net penalization approach. In this paper, adaptation of the elastic net approach is presented for variable selection both under the Cox proportional hazards model and under an accelerated failure time …
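The elastic net penalty itself, before any survival-specific adaptation, combines an l1 term (which zeroes out irrelevant covariates) with an l2 term (which stabilizes correlated ones). A minimal coordinate-descent sketch on ordinary least squares with simulated data; the paper's contribution is carrying this penalty over to the Cox and AFT models, which is not attempted here:

```python
import random

def soft_threshold(z, g):
    """Proximal operator of the l1 penalty."""
    return (z - g) if z > g else (z + g) if z < -g else 0.0

def elastic_net(X, y, lam, alpha, n_iter=100):
    """Coordinate descent for
       (1/2n)||y - Xb||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||_2^2).
    alpha=1 gives the lasso, alpha=0 gives ridge regression."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            rho, zj = 0.0, 0.0
            for i in range(n):
                # partial residual: leave feature j out of the prediction
                pred = sum(X[i][k] * b[k] for k in range(p) if k != j)
                rho += X[i][j] * (y[i] - pred)
                zj += X[i][j] ** 2
            rho, zj = rho / n, zj / n
            b[j] = soft_threshold(rho, lam * alpha) / (zj + lam * (1 - alpha))
    return b

# Simulated data: only the first of three covariates matters.
random.seed(3)
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(200)]
y = [3.0 * x[0] + random.gauss(0, 0.5) for x in X]
b = elastic_net(X, y, lam=0.1, alpha=0.9)
print(b)   # b[0] near 3 (slightly shrunk); b[1] and b[2] at or near 0
```

In the survival setting, the squared-error loss above is replaced by the (negative) Cox partial log-likelihood or an AFT loss, with the same penalty.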


Semiparametric Bivariate Quantile-Quantile Regression For Analyzing Semi-Competing Risks Data, Daniel O. Scharfstein, James M. Robins, Mark Van Der Laan Mar 2007

Johns Hopkins University, Dept. of Biostatistics Working Papers

In this paper, we consider estimation of the effect of a randomized treatment on time to disease progression and death, possibly adjusting for high-dimensional baseline prognostic factors. We assume that patients may or may not have a specific type of disease progression prior to death and those who have this endpoint are followed for their survival information. Progression and survival may also be censored due to loss to follow-up or study termination. We posit a semi-parametric bivariate quantile-quantile regression failure time model and show how to construct estimators of the regression parameters. The causal interpretation of the parameters depends on …


Pdf Submitted Jan Nineteen, Sid Twentythree Jan 2007

Sidney Twentythree Sr.

One more test.


Chess, Chance And Conspiracy, Mark Segal Dec 2006

Mark R Segal

Chess and chance are seemingly strange bedfellows. Luck and/or randomness have no apparent role in move selection when the game is played at the highest levels. However, when competition is at the ultimate level, that of the World Chess Championship (WCC), chess and conspiracy are not strange bedfellows, there being a long and colorful history of accusations levied between participants. One such accusation, frequently repeated, was that all the games in the 1985 WCC (Karpov vs Kasparov) were fixed and prearranged move by move. That this claim was advanced by a former World Champion, Bobby Fischer, argues that it ought …