Open Access. Powered by Scholars. Published by Universities.®

Longitudinal Data Analysis and Time Series Commons


414 Full-Text Articles 680 Authors 212,302 Downloads 72 Institutions

All Articles in Longitudinal Data Analysis and Time Series


414 full-text articles. Page 7 of 15.

A Comparison Of Techniques For Handling Missing Data In Longitudinal Studies, Alexander R. Bogdan 2016 University of Massachusetts Amherst


Masters Theses

Missing data are a common problem in virtually all epidemiological research, especially when conducting longitudinal studies. In these settings, clinicians may collect biological samples to analyze changes in biomarkers, which often do not conform to parametric distributions and may be censored due to limits of detection. Using complete data from the BioCycle Study (2005-2007), which followed 259 premenopausal women over two menstrual cycles, we compared four techniques for handling missing biomarker data with non-Normal distributions. We imposed increasing degrees of missing data on two non-Normally distributed biomarkers under conditions of missing completely at random, missing at random, and missing not …
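
The sketch below is a minimal, hedged illustration of the setup described above, not the BioCycle analysis: it simulates a skewed longitudinal biomarker, imposes missing-completely-at-random gaps, and contrasts three simple handling strategies (complete case, last observation carried forward, person-mean imputation). The multiple-imputation and likelihood-based techniques a full comparison would include are omitted, and all names and parameters are illustrative.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_subjects, n_visits = 259, 8
data = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_visits),
    "visit": np.tile(np.arange(n_visits), n_subjects),
    # Log-normal draws stand in for a non-Normally distributed biomarker.
    "biomarker": rng.lognormal(mean=0.5, sigma=0.8, size=n_subjects * n_visits),
})
true_mean = data["biomarker"].mean()

# Impose 30% missingness completely at random (MCAR).
mcar = data.copy()
mcar.loc[rng.random(len(mcar)) < 0.30, "biomarker"] = np.nan

# 1) Complete-case analysis: drop missing measurements.
cc_mean = mcar["biomarker"].dropna().mean()
# 2) Last observation carried forward within each woman.
locf_mean = mcar.groupby("subject")["biomarker"].ffill().mean()
# 3) Person-mean imputation: fill gaps with each woman's own observed mean.
pm = mcar.groupby("subject")["biomarker"].transform(lambda s: s.fillna(s.mean()))
pm_mean = pm.mean()

print(f"truth={true_mean:.3f} complete-case={cc_mean:.3f} "
      f"LOCF={locf_mean:.3f} person-mean={pm_mean:.3f}")
```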


Development Of Anatomical And Functional Magnetic Resonance Imaging Measures Of Alzheimer Disease, Samaneh Kazemifar 2016 The University of Western Ontario


Electronic Thesis and Dissertation Repository

Alzheimer disease is considered to be a progressive neurodegenerative condition, clinically characterized by cognitive dysfunction and memory impairments. Incorporating imaging biomarkers in the early diagnosis and monitoring of disease progression is increasingly important in the evaluation of novel treatments. The purpose of the work in this thesis was to develop and evaluate novel structural and functional biomarkers of disease to improve Alzheimer disease diagnosis and treatment monitoring. Our overarching hypothesis is that magnetic resonance imaging methods that sensitively measure brain structure and functional impairment have the potential to identify people with Alzheimer’s disease prior to the onset of cognitive decline. …


Nondestructive Testing And Structural Health Monitoring Based On Adams And Svm Techniques, Gang Jiang, Yi Ming Deng, Ji Tai Niu 2016 Southwest University of Science and Technology


The 8th International Conference on Physical and Numerical Simulation of Materials Processing

No abstract provided.


Longitudinal Tidal Dispersion Coefficient Estimation And Total Suspended Solids Transport Characterization In The James River, Beatriz Eugenia Patino 2016 Old Dominion University


Civil & Environmental Engineering Theses & Dissertations

The longitudinal dispersion coefficient is a parameter used to evaluate the effect of cross-sectional variations on substance mixing mechanisms in estuaries influenced by tide, wind and internal density variations. Using a two-dimensional approach, this study evaluates a tidal area of the lower James River approximately 19 miles upstream from the mouth at the Chesapeake Bay, in the City of Newport News, and applies an experimental procedure based on in-situ salinity concentrations to estimate the dispersion coefficient in the area that receives a discharge from the HRSD James River Wastewater Treatment Plant, and further characterizes Total Suspended …
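
The thesis's experimental procedure is not reproduced here. As a rough, hedged illustration of how in-situ salinity can yield a dispersion coefficient, the sketch below applies the classical steady-state, one-dimensional salt-balance relation Q·S = A·D·dS/dx, with all discharge, area, and salinity numbers invented.

```python
import numpy as np

Q = 200.0                               # river discharge, m^3/s (illustrative)
A = 5.0e3                               # cross-sectional area, m^2 (illustrative)
x = np.array([0., 2e3, 4e3, 6e3, 8e3])  # distance upstream, m
S = np.array([18., 14., 10., 7., 5.])   # in-situ salinity, psu

# Steady-state salt balance: seaward advection Q*S balances landward
# dispersive flux A*D*dS/dx, so D = Q*S / (A*|dS/dx|).
dSdx = np.gradient(S, x)
D = Q * S / (A * np.abs(dSdx))
print(np.round(D, 1))                   # dispersion coefficient, m^2/s, by station
```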


Advances In Portmanteau Diagnostic Tests, Jinkun Xiao 2016 The University of Western Ontario


Electronic Thesis and Dissertation Repository

The portmanteau test serves an important role in model diagnostics for Box-Jenkins modelling procedures. A large number of portmanteau tests based on the autocorrelation function have been proposed as general-purpose goodness-of-fit tests. Since the asymptotic distributions of the statistics have a complicated form that makes it hard to obtain the p-value directly, a gamma approximation is introduced to obtain the p-value. However, the approximation inevitably introduces approximation errors and requires a large number of observations to perform well. To avoid some pitfalls in the approximation, the Lin-McLeod test is further proposed to obtain a numeric solution to …
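
As a small illustration of the kind of portmanteau diagnostic discussed above (not the tests developed in the thesis), the sketch below fits an AR(1) model to simulated data and applies the Ljung-Box test to the residuals via statsmodels. A Monte Carlo p-value in the spirit of the Lin-McLeod approach could then be obtained by refitting the model to series simulated from the fitted parameters.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
# Simulate an AR(1) series: x_t = 0.6 * x_{t-1} + e_t.
e = rng.normal(size=500)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + e[t]

fit = ARIMA(x, order=(1, 0, 0)).fit()
# Ljung-Box portmanteau statistics and p-values on the residuals.
print(acorr_ljungbox(fit.resid, lags=[10, 20]))
```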


Passive Visual Analytics Of Social Media Data For Detection Of Unusual Events, Kush Rustagi, Junghoon Chae 2016 Purdue University


The Summer Undergraduate Research Fellowship (SURF) Symposium

Now that social media sites have gained substantial traction, huge amounts of unanalyzed, valuable data are being generated. Posts containing images and text also have spatiotemporal data attached, which has immense value for increasing situational awareness of local events, providing insights for investigations, and understanding the extent of incidents, their severity and consequences, and their time-evolving nature. However, the large volume of unstructured social media data hinders exploration and examination. To analyze such social media data, the S.M.A.R.T system provides the analyst with an interactive visual spatiotemporal analysis and spatial decision support environment that assists in evacuation planning …


Tornado Density And Return Periods In The Southeastern United States: Communicating Risk And Vulnerability At The Regional And State Levels, Michelle Bradburn 2016 East Tennessee State University


Electronic Theses and Dissertations

Tornado intensity and impacts vary drastically across space, thus spatial and statistical analyses were used to identify patterns of tornado severity in the Southeastern United States and to assess the vulnerability and estimated recurrence of tornadic activity. Records from the Storm Prediction Center's tornado database (1950-2014) were used to estimate kernel density to identify areas of high and low tornado frequency at both the regional- and state-scales. Return periods (2-year, 5-year, 10-year, 25-year, 50-year, and 100-year) were calculated at both scales as well using a composite score that included EF-scale magnitude, injury counts, and fatality counts. Results showed that the …
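
A hedged sketch of the two ingredients named above, using simulated touchdown coordinates and EF ratings rather than the SPC database: a Gaussian kernel density surface over tornado locations and simple empirical return periods (record length divided by the count of events at or above a threshold). The composite-score weighting used in the thesis is not reproduced.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
# Fake touchdown coordinates (lon, lat) clustered around two centers.
pts = np.vstack([
    rng.normal([-86.5, 33.5], 0.7, size=(300, 2)),
    rng.normal([-89.0, 35.0], 0.5, size=(200, 2)),
]).T                                       # shape (2, n), as gaussian_kde expects

kde = gaussian_kde(pts)
grid = np.mgrid[-92:-84:100j, 31:37:100j].reshape(2, -1)
density = kde(grid).reshape(100, 100)      # relative tornado density surface
print("peak relative density:", density.max().round(3))

# Empirical return period of events at or above each EF rating:
# T = record_length / count_at_or_above.
years = 65                                 # 1950-2014 record length
mags = rng.integers(0, 6, size=500)        # fake EF ratings
for ef in range(6):
    count = (mags >= ef).sum()
    print(f"EF{ef}+: return period ~ {years / max(count, 1):.2f} yr")
```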


Spatio-Temporal Analysis Of Point Patterns, Abdul-Nasah Soale 2016 East Tennessee State University


Electronic Theses and Dissertations

In this thesis, the basic tools of spatial statistics and time series analysis are applied to a case study of earthquakes in a certain geographical region and time frame. Some of the existing methods for joint analysis of time and space are then described and applied. Finally, additional research questions about the spatio-temporal distribution of the earthquakes are posed and explored using statistical plots and models. The focus in the last section is on the relationship between the number of events per year and the maximum magnitude, its effect on how clustered the spatial distribution is, and the relationship between …
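
As a minimal illustration of the last question posed above (not the thesis's analysis), the sketch below tabulates events per year and the yearly maximum magnitude from a simulated catalog and checks their association; the spatial clustering side of the question is not shown.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
# Simulated earthquake catalog: occurrence year and magnitude.
cat = pd.DataFrame({
    "year": rng.integers(1990, 2016, size=2000),
    "magnitude": rng.gumbel(loc=3.0, scale=0.6, size=2000),
})

# Events per year and yearly maximum magnitude.
yearly = cat.groupby("year")["magnitude"].agg(n_events="size", max_mag="max")
print(yearly.head())
print("correlation(n_events, max_mag) =",
      yearly["n_events"].corr(yearly["max_mag"]).round(3))
```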


Multilevel Models For Longitudinal Data, Aastha Khatiwada 2016 East Tennessee State University


Electronic Theses and Dissertations

Longitudinal data arise when individuals are measured several times during an ob- servation period and thus the data for each individual are not independent. There are several ways of analyzing longitudinal data when different treatments are com- pared. Multilevel models are used to analyze data that are clustered in some way. In this work, multilevel models are used to analyze longitudinal data from a case study. Results from other more commonly used methods are compared to multilevel models. Also, comparison in output between two software, SAS and R, is done. Finally a method consisting of fitting individual models for each …
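
The thesis compares SAS and R output; as a hedged, language-neutral illustration of a two-level model for repeated measures (measurements nested within individuals), the sketch below fits a random-intercept model in Python with statsmodels. Column names and effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n, t = 60, 5
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), t),
    "time": np.tile(np.arange(t), n),
    "treatment": np.repeat(rng.integers(0, 2, size=n), t),
})
# Subject-specific random intercept plus a treatment-by-time effect.
u = np.repeat(rng.normal(0, 1.0, size=n), t)
df["y"] = (2 + 0.5 * df["time"] + 0.8 * df["treatment"] * df["time"]
           + u + rng.normal(0, 0.5, size=n * t))

# Random-intercept multilevel model: y ~ time * treatment + (1 | id).
model = smf.mixedlm("y ~ time * treatment", df, groups=df["id"])
print(model.fit().summary())
```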


Regional Dynamic Price Relationships Of Distillers Dried Grains In U.S. Feed Markets, Matthew Fulton Johnson 2016 University of Tennessee, Knoxville


Masters Theses

Distillers dried grains with solubles (DDGS) is now a mainstream substitute in U.S. animal feed rations. DDGS is rich in fat and protein content and serves as a competitive feed source in livestock markets. The objective of this study is to identify dynamic price relationships among DDGS, corn, soybean meal, and livestock outputs in the context of specific livestock sectors and their geographic location. Four locations associated with a predominant livestock sector are selected for analysis by measuring the density and relative proportion of a livestock sector’s grain consumption at the county level. A vector error correction model is applied to post-mandate …
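
As a hedged sketch of the final modelling step (not the study's post-mandate price data), the code below fits a vector error correction model to three simulated, cointegrated price series with statsmodels, after selecting the cointegration rank with a Johansen-type test. Series names are placeholders.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(5)
n = 300
common = np.cumsum(rng.normal(size=n))        # shared stochastic trend
prices = pd.DataFrame({
    "ddgs": common + rng.normal(scale=0.5, size=n),
    "corn": 1.2 * common + rng.normal(scale=0.5, size=n),
    "sbm":  0.8 * common + rng.normal(scale=0.5, size=n),
})

# Johansen-style trace test to choose the cointegration rank.
rank = select_coint_rank(prices, det_order=0, k_ar_diff=2)
vecm = VECM(prices, k_ar_diff=2, coint_rank=rank.rank, deterministic="co")
print(vecm.fit().summary())
```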


Novel Methods For Analyzing Longitudinal Data With Measurement Error In The Time Variable, Caroline Munindi Mulatya 2016 University of South Carolina


Theses and Dissertations

In some longitudinal studies, the observed time points are often confounded with measurement error due to the sampling conditions, resulting in data with measurement error in the time variable. This type of data occurs mainly in observational studies, when the onset of a longitudinal process is unknown, or in clinical trials, when individual visits do not take place as specified by the study protocol but are rounded to coincide with it. The methodological and inferential implications of error in time-varying covariates for both linear and nonlinear models have been studied widely. In this dissertation, we shift attention …


Joint Modelling In Liver Transplantation, Elizabeth M. Renouf 2016 The University of Western Ontario


Electronic Thesis and Dissertation Repository

In the setting of liver transplantation, clinical trials and transplant registries regularly collect repeated measurements of clinical biomarkers which may be strongly associated with a time-to-event such as graft failure or disease recurrence. Multiple time-to-event outcomes are routinely collected. However, joint models are rarely used. This thesis will describe important considerations for joint modelling in the setting of liver transplantation. We will focus on transplant registry data from the United States. We develop a new tool for joint modelling in the context where a critical health event can be tracked in the longitudinal biomarker and often presents as a non-linear …
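
Joint models estimate the longitudinal and time-to-event submodels simultaneously; the sketch below is only a naive two-stage stand-in (a per-patient biomarker-trajectory summary fed into a Cox model via lifelines), shown to illustrate the data structures involved rather than the joint model developed in the thesis. All names and numbers are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 200
# Stage 1 stand-in: in practice the per-patient biomarker slope would come
# from a mixed model fit to the repeated measurements; here it is generated.
slopes = rng.normal(0.3, 0.15, size=n)

# Stage 2: graft-failure times that depend on the trajectory summary.
T = rng.exponential(scale=1.0 / np.exp(0.8 * slopes))
C = rng.exponential(scale=2.0, size=n)          # censoring times
surv = pd.DataFrame({
    "duration": np.minimum(T, C),
    "event": (T <= C).astype(int),
    "slope": slopes,
})

cph = CoxPHFitter()
cph.fit(surv, duration_col="duration", event_col="event")
cph.print_summary()
```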


Population Projection And Habitat Preference Modeling Of The Endangered James Spinymussel (Pleurobema Collina), Marisa Draper 2016 James Madison University


Senior Honors Projects, 2010-2019

The James Spinymussel (Pleurobema collina) is an endangered mussel species at the top of Virginia’s conservation list. The James Spinymussel plays a critical role in the environment by filtering and cleaning stream water while providing shelter and food for macroinvertebrates; however, conservation efforts are complicated by the mussels’ burrowing behavior, camouflage, and complex life cycle. The goals of the research conducted were to estimate detection probabilities that could be used to predict species presence and facilitate field work, and to track individually marked mussels to test for habitat preferences. Using existing literature and mark-recapture field data, these goals were accomplished …
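
As a textbook illustration of the mark-recapture idea mentioned above (not the study's estimates), the sketch below computes a Chapman-corrected Lincoln-Petersen abundance estimate and a naive detection probability from invented two-survey counts.

```python
# Invented two-survey counts, for illustration only.
n1 = 48          # mussels captured and marked in survey 1
n2 = 35          # mussels captured in survey 2
m2 = 14          # marked mussels recaptured in survey 2

# Chapman-corrected Lincoln-Petersen abundance estimate.
N_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
# Naive per-survey detection probability: fraction of marked mussels redetected.
p_hat = m2 / n1

print(f"estimated abundance ~ {N_hat:.0f}, detection probability ~ {p_hat:.2f}")
```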


Takens Theorem With Singular Spectrum Analysis Applied To Noisy Time Series, Thomas K. Torku 2016 East Tennessee State University


Electronic Theses and Dissertations

The evolution of big data has led to financial time series becoming increasingly complex, noisy, non-stationary and nonlinear. Takens theorem can be used to analyze and forecast nonlinear time series, but even small amounts of noise can hopelessly corrupt a Takens approach. In contrast, Singular Spectrum Analysis is an excellent tool for both forecasting and noise reduction. Fortunately, it is possible to combine the Takens approach with Singular Spectrum Analysis (SSA), and in fact, estimation of key parameters in Takens theorem is performed with Singular Spectrum Analysis. In this thesis, we combine the denoising abilities of SSA with the Takens …
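
A minimal numpy sketch of the combination described above, under illustrative choices of window length, delay, and embedding dimension: basic SSA (embed, SVD, keep the leading components, diagonal-average) denoises a toy series, and a Takens time-delay embedding is then applied to the result. This is a sketch of the generic technique, not the thesis's parameter-estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.3 * rng.normal(size=t.size)    # noisy signal

# --- SSA denoising: embed, SVD, keep leading components, diagonal-average ---
L = 50                                                 # SSA window length
K = series.size - L + 1
traj = np.column_stack([series[i:i + L] for i in range(K)])   # L x K trajectory matrix
U, s, Vt = np.linalg.svd(traj, full_matrices=False)
approx = (U[:, :2] * s[:2]) @ Vt[:2]                   # rank-2 reconstruction
denoised = np.array([np.mean(np.diag(approx[:, ::-1], k))     # diagonal averaging
                     for k in range(K - 1, -L, -1)])

# --- Takens time-delay embedding of the denoised series ---
delay, dim = 25, 3
m = denoised.size - (dim - 1) * delay
embedded = np.column_stack([denoised[i * delay: i * delay + m] for i in range(dim)])
print(embedded.shape)    # points in the reconstructed state space
```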


Effects Of Bullying And Victimization On Friendship Selection, Reciprocation, And Maintenance In Elementary School Children, Marisa Lynn Whitley 2016 University of Tennessee - Knoxville


Masters Theses

This study examined the effects of elementary school children’s bullying and victimization experiences on their friendships over time. The majority of children experience acts of aggression or bullying before the end of elementary school, and bullying and peer victimization is associated with academic, social, behavioral, and psychological difficulties. This study used social networks analysis (R SIENA 4.0) to examine whether peer reports of forms of bullying and victimization (i.e., overt and relational) affect the likelihood of friendship selection, reciprocation, and maintenance in 2nd-4th grade children. Children (N = 143) from the Midwestern region of the United …


Methods For Dealing With Death And Missing Data, And For Standardizing Different Health Variables In Longitudinal Datasets: The Cardiovascular Health Study, Paula Diehr 2016 University of Washington


UW Biostatistics Working Paper Series

Longitudinal studies of older adults usually need to account for deaths and missing data. The study databases often include multiple health-related variables, whose trends over time are hard to compare because they were measured on different scales. Here we present a unified approach to these three problems that was developed and used in the Cardiovascular Health Study. Data were first transformed to a new scale that had interval/ratio properties, and on which “dead” logically takes the value zero. Missing data were then imputed on this new scale, using each person’s own data over time. Imputation could thus be informed by …
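
A hedged, single-participant sketch of the general idea (not the CHS implementation or its actual transformation): place a health variable on a scale where “dead” logically takes the value 0, then impute that person's missing visits from their own observed values. The rescaling and the interpolation rule below are illustrative assumptions.

```python
import numpy as np
import pandas as pd

# One participant's self-rated health over 8 annual visits (1=poor ... 5=excellent),
# with NaN for missed visits; the participant dies before the last two visits.
srh = pd.Series([4, 4, np.nan, 3, np.nan, 2, np.nan, np.nan])
alive = pd.Series([1, 1, 1, 1, 1, 1, 0, 0])            # 0 after death

# Step 1: transform to a 0-100 scale on which "dead" takes the value 0.
scaled = (srh - 1) / 4 * 90 + 10                       # alive values span 10-100
scaled[alive == 0] = 0                                 # dead = 0 by construction

# Step 2: impute missing values from the person's own data over time
# (linear interpolation between that participant's observed visits).
imputed = scaled.interpolate(limit_area="inside")
print(pd.DataFrame({"raw": srh, "scaled": scaled, "imputed": imputed}))
```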


Macroconstants Of Development: A New Benchmark For The Strategic Development Of Advanced Countries And Firms, Andrey Bystrov, Vyacheslav Yusim, Tamilla Curtis 2016 Plekhanov Russian Academy of Economics


Dr. Tamilla Curtis

This research proposes a new indicator of countries’ development called “macroconstants of development”. The literature review indicates that the concept of “macroconstants of development” is currently used in neither the theory nor the practice of industrial policy. Longitudinal data on total GDP, GDP per capita and their derivatives for most countries of the world were examined, and the statistical information was analyzed using econometric methods.

Based on the analysis of the statistical data, which characterizes the development of large, technologically advanced countries in ordinary conditions, it was identified that the average acceleration …
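
As a minimal numeric illustration of the quantity in question (the average “acceleration” of development as the second difference of GDP per capita over time), the sketch below uses an invented series; the figures are not statistics for any real country.

```python
import numpy as np

gdp_per_capita = np.array([40.0, 41.1, 42.3, 43.6, 45.0, 46.5])  # thousands, yearly (invented)
growth = np.diff(gdp_per_capita)              # first difference: annual increment
acceleration = np.diff(gdp_per_capita, n=2)   # second difference: change in the increment

print("growth      :", np.round(growth, 2))
print("acceleration:", np.round(acceleration, 2))
print("average acceleration:", acceleration.mean().round(3))
```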


Evaluating The Impact Of A Hiv Low-Risk Express Care Task-Shifting Program: A Case Study Of The Targeted Learning Roadmap, Linh Tran, Constantin T. Yiannoutsos, Beverly S. Musick, Kara K. Wools-Kaloustian, Abraham Siika, Sylvester Kimaiyo, Mark J. van der Laan, Maya L. Petersen 2016 University of California-Berkeley, School of Public Health, Division of Biostatistics


U.C. Berkeley Division of Biostatistics Working Paper Series

In conducting studies on an exposure of interest, a systematic roadmap should be applied for translating causal questions into statistical analyses and interpreting the results. In this paper we describe an application of one such roadmap applied to estimating the joint effect of both time to availability of a nurse-based triage system (low-risk express care, LREC) and individual enrollment in the program among HIV patients in East Africa. Our study population comprises 16,513 subjects found eligible for this task-shifting program within 15 clinics in Kenya between 2006 and 2009, with each clinic starting the LREC program between …


Models For Hsv Shedding Must Account For Two Levels Of Overdispersion, Amalia Magaret 2016 University of Washington - Seattle Campus


UW Biostatistics Working Paper Series

We have frequently implemented crossover studies to evaluate new therapeutic interventions for genital herpes simplex virus infection. The outcome measured to assess the efficacy of interventions on herpes disease severity is the viral shedding rate, defined as the frequency of detection of HSV on the genital skin and mucosa. We performed a simulation study to ascertain whether our standard model, which we have used previously, was appropriately considering all the necessary features of the shedding data to provide correct inference. We simulated shedding data under our standard, validated assumptions and assessed the ability of 5 different models to reproduce the …
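
As a generic illustration of the consequence described above (not the crossover models compared in the paper), the sketch below simulates shedding-day counts with person-level heterogeneity and contrasts Poisson and negative binomial GLM fits in statsmodels: ignoring the overdispersion understates the standard error of the treatment effect. All parameters are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 400
arm = rng.integers(0, 2, size=n)                       # treatment indicator
swabs = np.full(n, 30)                                 # swabs per participant
# Person-level heterogeneity makes counts far more variable than Poisson.
frailty = rng.gamma(shape=1.5, scale=1 / 1.5, size=n)
rate = 0.2 * np.exp(-0.7 * arm) * frailty
shedding_days = rng.poisson(rate * swabs)

X = sm.add_constant(arm)
pois = sm.GLM(shedding_days, X, family=sm.families.Poisson(),
              offset=np.log(swabs)).fit()
nb = sm.GLM(shedding_days, X, family=sm.families.NegativeBinomial(),
            offset=np.log(swabs)).fit()
print("Poisson SE(treatment):", pois.bse[1].round(3))
print("NegBin  SE(treatment):", nb.bse[1].round(3))
```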


Online Variational Bayes Inference For High-Dimensional Correlated Data, Sylvie T. Kabisa, Jeffrey S. Morris, David Dunson 2016 Duke University


Jeffrey S. Morris

High-dimensional data with hundreds of thousands of observations are becoming commonplace in many disciplines. The analysis of such data poses many computational challenges, especially when the observations are correlated over time and/or across space. In this paper we propose flexible hierarchical regression models for analyzing such data that accommodate serial and/or spatial correlation. We address the computational challenges involved in fitting these models by adopting an approximate inference framework. We develop an online variational Bayes algorithm that works by incrementally reading the data into memory one portion at a time. The performance of the method is assessed through simulation studies. …
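
The paper's algorithm itself is not reproduced here. As a toy analogue of processing data one portion at a time, the sketch below performs streaming conjugate updates for Bayesian linear regression with known noise variance: each batch contributes to the posterior natural parameters and can then be discarded, which is the memory pattern an online variational Bayes algorithm exploits.

```python
import numpy as np

rng = np.random.default_rng(9)
p, sigma2 = 5, 0.5
beta_true = rng.normal(size=p)

# Prior N(0, I) stored in natural-parameter form: precision and precision*mean.
precision = np.eye(p)
shift = np.zeros(p)

for batch in range(20):                   # stream 20 portions, one at a time
    X = rng.normal(size=(500, p))         # this portion's covariates
    y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=500)
    precision += X.T @ X / sigma2         # accumulate sufficient statistics;
    shift += X.T @ y / sigma2             # the raw portion need not be kept

posterior_mean = np.linalg.solve(precision, shift)
print("max |posterior mean - truth| =",
      np.round(np.abs(posterior_mean - beta_true).max(), 4))
```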

