Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Articles 1 - 25 of 25

Full-Text Articles in Physical Sciences and Mathematics

Autologous Stem Cell Transplant: Factors Predicting The Yield Of CD34+ Cells, Elizabeth Anne Lawson Dec 2005

Theses and Dissertations

Stem cell transplant is often considered the last hope for survival for many cancer patients. The CD34+ cell content of a stem cell collection has emerged as the most reliable indicator of the quantity of desired cells in a peripheral blood stem cell harvest and is used as a surrogate measure of sample quality. The factors predicting the yield of CD34+ cells in a collection are not yet fully understood. Throughout the literature, there has been conflicting evidence with regard to age, gender, disease status, and prior radiation. In addition to the factors that have already been explored, …


A Reliability Case Study On Estimating Extremely Small Percentiles Of Strength Data For The Continuous Improvement Of Medium Density Fiberboard Product Quality, Weiwei Chen Dec 2005

Masters Theses

The objective of this thesis is to better estimate extremely small percentiles of strength distributions for measuring the failure process in continuous improvement initiatives. These percentiles are of great interest to companies, oversight organizations, and consumers concerned with product safety and reliability. The thesis investigates the lower percentiles of the quality of medium density fiberboard (MDF). The international industrial standard for measuring MDF quality is internal bond (IB, a tensile strength test). The results of the thesis indicate that the smaller percentiles are crucial, especially the first percentile and lower ones.

The thesis starts by introducing the background, study objectives, …


Accuracy Of The Newtom 3G™ In Measuring The Angle Of The Articular Eminence, Rehana Khan Dec 2005

Loma Linda University Electronic Theses, Dissertations & Projects

The purpose of this study was to determine the accuracy of the Newtom 3G™ in determining the angulation of the articular eminence. The benefits of conducting this study were to provide additional uses for the standard records taken for the purposes of orthodontic treatment, as well as to evaluate the Newtom 3G™ for accuracy in measuring the anatomy of the glenoid fossa. This study required 20 participants who volunteered to allow their records to be used. The records evaluated were the Newtom 3G™, impressions, and wax check bite registrations. The wax record was taken using the 'forced bite' technique to …


Autism And Parental Marital Satisfaction: The Role Of Adequacy Of Resources, Geneeta Kaliah Chambers Dec 2005

Loma Linda University Electronic Theses, Dissertations & Projects

The goal of the present study was to expand on the existing literature exploring families with children who have developmental disabilities, particularly autism. Previous studies have been constrained by univariate approaches that have failed to adequately capture the nuances of family functioning. Using an ecological/context approach, stemming from an ongoing research program conducted within a university-based treatment center, the present study attempted to improve on the conceptualization of interrelationships among family members and the role that contextual factors play within that dynamic. Specifically, the present study explored the influence of children’s level of autism on parents’ reports of their marital …


Testing Primitive Polynomials For Generalized Feedback Shift Register Random Number Generators, Guinan Lian Nov 2005

Theses and Dissertations

The class of generalized feedback shift register (GFSR) random number generators was a promising method for random number generation in the 1980s but was abandoned because of flaws such as poor performance on certain tests of randomness. The poor performance may be due to the choice of primitive polynomials used in the generators rather than to inherent flaws in the method. The original GFSR generators were all based on primitive trinomials. This project examines several alternative choices of primitive polynomials with more than one "interior" term to address this problem and, ideally, provide access to good random number generators.
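A GFSR generator of the kind described above applies a trinomial recurrence wordwise. The sketch below is illustrative only (the seed values and word size are assumptions, not taken from the thesis); with (p, q) = (7, 3), the recurrence x_n = x_{n-p} XOR x_{n-q} achieves the maximal period 2^7 - 1 = 127 per bit column because its associated trinomial is primitive:

```python
from collections import deque

def gfsr(p=7, q=3, seed_words=None, bits=8):
    """Generalized feedback shift register: x_n = x_{n-p} XOR x_{n-q},
    applied wordwise. With (p, q) = (7, 3) each nonzero bit column has
    the maximal period 2**7 - 1, since the associated trinomial is
    primitive over GF(2). Default seed words are arbitrary, illustrative
    nonzero values."""
    if seed_words is None:
        seed_words = [(37 * i + 11) % (1 << bits) or 1 for i in range(p)]
    state = deque(seed_words, maxlen=p)  # holds x_{n-p}, ..., x_{n-1}
    while True:
        new = state[0] ^ state[p - q]    # x_{n-p} XOR x_{n-q}
        state.append(new)                # maxlen=p drops the oldest word
        yield new

g = gfsr()
sample = [next(g) for _ in range(5)]
```

Replacing the trinomial with a primitive polynomial that has more interior terms simply XORs more taps into `new`, which is the kind of alternative the project investigates.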


Development Of Commercial Applications For Recycled Plastics Using Finite Element Analysis, Nanjunda Narasimhamurthy Nov 2005

Theses and Dissertations

This thesis investigates the suitability of thermo-kinetically recycled plastics for use in commercial product applications using finite element analysis and statistics. Different recycled material blends were tested and evaluated for their use in commercial product applications. Six different blends of thermo-kinetically recycled plastics were used for testing, and CATIA was used for finite element analysis. The different types of thermo-kinetically recycled plastic blends are: pop bottles made of Polyethylene Terephthalate (PET), milk jugs made of High-Density Polyethylene (HDPE), vinyl seats made of Polyvinyl Chloride (PVC) with a small amount of Polypropylene (PP) and Urethane, electronic scrap made of engineering resins …


Modeling Distributions Of Test Scores With Mixtures Of Beta Distributions, Jingyu Feng Nov 2005

Theses and Dissertations

Test score distributions are used to make important instructional decisions about students. The test scores usually do not follow a normal distribution. In some cases, the scores appear to follow a bimodal distribution that can be modeled with a mixture of beta distributions. This bimodality may be due to different levels of students' ability. The purpose of this study was to develop and apply statistical techniques for fitting beta mixtures and detecting bimodality in test score distributions. Maximum likelihood and Bayesian methods were used to estimate the five parameters of the beta mixture distribution for scores in four quizzes in a …
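The five-parameter beta mixture described above (a mixing weight plus two shape parameters per component) can be fit by maximum likelihood. The sketch below uses simulated, illustrative "scores" and a direct optimizer; it is not the thesis's own estimation code:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import beta

rng = np.random.default_rng(1)
# Illustrative bimodal "scores" on (0, 1): two simulated ability groups
scores = np.concatenate([beta.rvs(2, 8, size=150, random_state=rng),
                         beta.rvs(9, 2, size=100, random_state=rng)])

def nll(theta, x):
    """Negative log-likelihood of a two-component beta mixture.
    theta = (logit of weight, log a1, log b1, log a2, log b2)."""
    w = 1.0 / (1.0 + np.exp(-theta[0]))
    a1, b1, a2, b2 = np.exp(theta[1:])
    dens = w * beta.pdf(x, a1, b1) + (1.0 - w) * beta.pdf(x, a2, b2)
    return -np.sum(np.log(dens + 1e-300))  # guard against log(0)

start = np.array([0.0, np.log(1.5), np.log(4.0), np.log(4.0), np.log(1.5)])
fit = minimize(nll, start, args=(scores,), method="Nelder-Mead",
               options={"maxiter": 5000})
w_hat = 1.0 / (1.0 + np.exp(-fit.x[0]))   # estimated mixing weight
```

The logit/log reparameterization keeps the weight in (0, 1) and the shape parameters positive without constrained optimization.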


The Outcome Of MTA As A Root End Filling Material: A Long Term Evaluation, Christopher M. Sechrist Sep 2005

Loma Linda University Electronic Theses, Dissertations & Projects

Periradicular surgery is a viable option to save natural teeth when non-surgical treatment fails or when endodontic retreatment is not feasible or is contraindicated. Laboratory and animal studies have demonstrated that MTA is biocompatible, provides an excellent seal against penetrating bacteria, and promotes hard tissue healing. The purpose of this study was to provide long-term (>3 years) clinical evidence for its use as a root-end filling material in endodontics. The clinical records of 294 patients who had MTA used during endodontic treatment from 1996 to 2001 were reviewed. From these, 75 patients whose root end cavities had been filled …


Laser And LED Effects On The Proliferation Rate Of Periodontal Ligament Fibroblasts, Allen J. Job Sep 2005

Loma Linda University Electronic Theses, Dissertations & Projects

PURPOSE: To compare the effectiveness of a Gallium Aluminum Arsenide (GaAlAs) diode laser and a light emitting diode (LED) on periodontal ligament fibroblast cell proliferative rates.

METHODS and MATERIALS: PDLF obtained from freshly extracted permanent teeth were cultured under standard conditions until a subconfluent monolayer was present. The experimental protocol took 5 days to complete. On day 1, an initial cell concentration of 700 uL/cm2 was plated on 96-well assay plates and placed in a CO2 incubator at 37° C for 24 hours. On day 2, cell counts were first verified using hemocytometry, and the cells were then irradiated using an …


The Interquartile Range: Theory And Estimation., Dewey Lonzo Whaley Aug 2005

Electronic Theses and Dissertations

The interquartile range (IQR) is used to describe the spread of a distribution. In an introductory statistics course, the IQR might be introduced as simply the “range within which the middle half of the data points lie.” In other words, it is the distance between the two quartiles, IQR = Q3 - Q1. We will compute the population IQR, the expected value, and the variance of the sample IQR for various continuous distributions. In addition, a bootstrap confidence interval for the population IQR will be evaluated.
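A minimal sketch of both computations described above: the sample IQR as Q3 - Q1, and a percentile-bootstrap confidence interval for the population IQR (the toy data and resample count are assumptions for illustration):

```python
import numpy as np

def iqr(x):
    """Sample IQR = Q3 - Q1, with linear interpolation between order stats."""
    q1, q3 = np.percentile(x, [25, 75])
    return q3 - q1

def bootstrap_iqr_ci(x, level=0.95, n_boot=2000, seed=0):
    """Percentile bootstrap confidence interval for the population IQR:
    resample with replacement, recompute the IQR each time, and take the
    alpha/2 and 1 - alpha/2 quantiles of the bootstrap distribution."""
    rng = np.random.default_rng(seed)
    stats = np.array([iqr(rng.choice(x, size=len(x), replace=True))
                      for _ in range(n_boot)])
    alpha = 1.0 - level
    return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

x = np.arange(1, 101)   # toy sample: 1, 2, ..., 100
print(iqr(x))           # 49.5 for this sample
lo, hi = bootstrap_iqr_ci(x)
```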


Using Box-Scores To Determine A Position's Contribution To Winning Basketball Games, Garritt L. Page Aug 2005

Theses and Dissertations

Basketball is a sport that has become increasingly popular world-wide. At the professional level it is a game in which each of the five positions has a specific responsibility that requires unique skills. It seems likely that it would be valuable for coaches to know which skills for each position are most conducive to winning. Knowing which skills to develop for each position could help coaches optimize each player's ability by customizing practice to contain drills that develop the most important skills for each position that would in turn improve the team's overall ability. Through the use of Bayesian hierarchical …


Estimating The Discrepancy Between Computer Model Data And Field Data: Modeling Techniques For Deterministic And Stochastic Computer Simulators, Emily Joy Dastrup Aug 2005

Theses and Dissertations

Computer models have become useful research tools in many disciplines. In many cases a researcher has access to data from a computer simulator and from a physical system. This research discusses Bayesian models that allow for the estimation of the discrepancy between the two data sources. We fit two models to data in the field of electrical engineering. Using this data we illustrate ways of modeling both a deterministic and a stochastic simulator when specific parametric assumptions can be made about the discrepancy term.


Performance Of AIC-Selected Spatial Covariance Structures For fMRI Data, David A. Stromberg Jul 2005

Theses and Dissertations

FMRI datasets allow scientists to assess functionality of the brain by measuring the response of blood flow to a stimulus. Since the responses from neighboring locations within the brain are correlated, simple linear models that assume independence of measurements across locations are inadequate. Mixed models can be used to model the spatial correlation between observations; however, selecting the correct covariance structure is difficult. Information criteria, such as AIC, are often used to choose among covariance structures. Once the covariance structure is selected, significance tests can be used to determine whether a region of interest within the brain is significantly active. …
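The AIC comparison described above can be sketched for two candidate covariance structures, independence versus compound symmetry (exchangeable), on simulated data. Everything here (zero means, moment-style estimates, the simulated correlation) is an illustrative assumption, not the thesis's procedure:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n, p, rho = 200, 5, 0.6
cov_true = (1 - rho) * np.eye(p) + rho * np.ones((p, p))
y = rng.multivariate_normal(np.zeros(p), cov_true, size=n)

def aic(loglik, k):
    """AIC = 2k - 2 log L; smaller is better."""
    return 2 * k - 2 * loglik

# Candidate 1: independence (one parameter: a common variance)
s2 = y.var()
ll_ind = multivariate_normal(np.zeros(p), s2 * np.eye(p)).logpdf(y).sum()

# Candidate 2: compound symmetry (two parameters: variance + correlation)
S = np.cov(y, rowvar=False)
s2_cs = np.trace(S) / p                              # average variance
r = (S.sum() - np.trace(S)) / (p * (p - 1) * s2_cs)  # average correlation
cov_cs = s2_cs * ((1 - r) * np.eye(p) + r * np.ones((p, p)))
ll_cs = multivariate_normal(np.zeros(p), cov_cs).logpdf(y).sum()

best = min([("independence", aic(ll_ind, 1)),
            ("compound symmetry", aic(ll_cs, 2))], key=lambda t: t[1])
```

With strongly exchangeable data, the richer structure's likelihood gain outweighs its one-parameter penalty, so AIC selects compound symmetry.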


Rank-Based Methods For Repeated Measures Data Under Exchangeable Errors, John Kloke Jun 2005

Dissertations

Rank-based estimation methods provide alternatives to least squares. Estimators derived via least squares are generally not robust to aberrant observations. Rank-based methods for linear models generalize traditional Wilcoxon procedures in simple location models and are robust.

In the usual linear model it is assumed that the errors are independent. In the case of repeated measures data several observations are taken on each experimental unit. In the case of longitudinal data the measures are taken on the same subject over time. As such an independence assumption does not seem valid. A common solution to this is to make an assumption on …


Fit-To-Fight: Waist Vs. Waist/Height Measurements To Determine An Individual's Fitness Level A Study In Statistical Regression And Analysis, Steven J. Swiderski Jun 2005

Theses and Dissertations

Air Force members are tested for fitness by measuring their abdominal circumference, counting the number of sit-ups and push-ups they can accomplish, and timing a 1.5-mile run. The abdominal measurement is a "one-size-fits-all" fitness standard. This research determines that a person's waist-to-height ratio is a better measurement than the waist measurement alone for estimating an individual's fitness level. This research finds that all of the variables used to proxy fitness (Gender, Age, Height, Waist Circumference, Waist-to-Height Ratio, Push-Ups, and Sit-Ups) are statistically significant and represent good estimators of physical fitness. This research …


The Navigation Potential Of Signals Of Opportunity-Based Time Difference Of Arrival Measurements, Kenneth A. Fisher Jun 2005

Theses and Dissertations

This research introduces the concept of navigation potential, NP, to quantify the intrinsic ability to navigate using a given signal. NP theory is a new, information theory-like concept that provides a theoretical performance limit on estimating navigation parameters from a received signal that is modeled through a stochastic mapping of the transmitted signal and measurement noise. NP theory is applied to SOP-based TDOA systems in general as well as for the Gaussian case. Furthermore, the NP is found for a received signal consisting of the transmitted signal, multiple delayed and attenuated replicas of the transmitted signal, and measurement noise. Multipath-based …


Structure And Dynamics Of Soluble Guanylyl Cyclase, Kentaro Sugino May 2005

Theses

Soluble guanylyl cyclase (sGC) is one of the key enzymes involved in many fundamental biological processes, including vasodilatation. It can be allosterically activated by synthetic compounds such as YC-1. Recently, the 3D structure of adenylyl cyclase (AC), a homologue of sGC, was determined. Using AC as a template and homology modeling, the 3D structure of sGC is predicted. Prior experimental work has suggested two binding modes of YC-1. In the current investigation, molecular dynamics (MD) simulations were conducted to seek more detail on the molecular mechanism of sGC activation.

From these MD simulations, a tentative mechanism of sGC activation …


Determining The Optimum Number Of Increments In Composite Sampling, John Ellis Hathaway May 2005

Theses and Dissertations

Composite sampling can be more cost effective than simple random sampling. This paper considers how to determine the optimum number of increments to use in composite sampling. Composite sampling terminology and theory are outlined and a model is developed which accounts for different sources of variation in compositing and data analysis. This model is used to define and understand the process of determining the optimum number of increments that should be used in forming a composite. The blending variance is shown to have a smaller range of possible values than previously reported when estimating the number of increments in a …


Special Classification Models For Lichens In The Pacific Northwest, Janeen Ardito May 2005

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

A common problem in ecological studies is that of determining where to look for rare species. This paper shows how statistical models, such as classification trees, may be used to assist in the design of probability-based surveys for rare species using information on more abundant species that are associated with the rare species. This model assisted approach to survey design involves first building models for the more abundant species. The models are then used to determine stratifications for the rare species that are associated with the more abundant species. The goal of this approach is to increase the number of …


Novel Algorithms And Datamining For Clustering Massive Datasets, Aruna K. Buddana May 2005

Masters Theses

Clustering proteomics data is a challenging problem for any traditional clustering algorithm. Usually, the number of samples is much smaller than the number of protein peaks. A clustering algorithm that does not depend on the number of feature variables (here, the number of peaks) is needed. An innovative hierarchical clustering algorithm may be a good approach. This work proposes a new dissimilarity measure for hierarchical clustering combined with functional data analysis. It presents a specific application of functional data analysis (FDA) to a high-throughput proteomics study. The high performance of the proposed …


Statistical Analysis Of Longitudinal And Multivariate Discrete Data, Deepak Mav Apr 2005

Mathematics & Statistics Theses & Dissertations

Correlated multivariate Poisson and binary variables occur naturally in medical, biological and epidemiological longitudinal studies. Modeling and simulating such variables is difficult because the correlations are restricted by the marginal means via Fréchet bounds in a complicated way. In this dissertation we will first discuss partially specified models and methods for estimating the regression and correlation parameters. We derive the asymptotic distributions of these parameter estimates. Using simulations based on extensions of the algorithm due to Sim (1993, Journal of Statistical Computation and Simulation, 47, pp. 1–10), we study the performance of these estimates using infeasibility, coverage probabilities of the …


Customization Of Discriminant Function Analysis For Prediction Of Solar Flares, Evelyn A. Schumer Mar 2005

Theses and Dissertations

This research is an extension of work conducted by K. Leka and G. Barnes of the Colorado Research Associates Division, Northwest Research Associates, Inc. in Boulder, Colorado (CORA), in which they found that no single photospheric solar parameter they considered could sufficiently identify a flare-producing active region (AR). Their research then explored the possibility that a linear combination of parameters used in a multivariable discriminant function (DF) could adequately predict solar activity. The purpose of this research is to extend the DF research conducted by Leka and Barnes by refining the method of statistical discriminant analysis (DA) with the goal of …


2D Quantitative Structure Activity Relationship Modeling Of Methylphenidate Analogues Using Genetic Algorithm And Partial Least Square Regression, Noureen Wadhwaniya Jan 2005

Theses

Quantitative Structure-Activity Relationship (QSAR) analysis attempts to develop a predictive model of biological activity based on molecular descriptors. 2D QSAR uses descriptors, such as topological indices, that are independent of molecular conformation. A genetic algorithm-partial least squares (GA-PLS) approach was used to identify the molecular descriptors that correlate with the biological activity (binding affinity) of a set of 80 methylphenidate analogues and to construct a predictive model. The GA code was implemented using the fitness function 1 − (n − 1)(1 − q²)/(n − c), where n is the number of compounds, c is the optimal number of components, and q …
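The fitness function quoted above rewards cross-validated fit (q²) while penalizing the number of PLS components. A minimal sketch (the numeric inputs are made up for illustration):

```python
def gapls_fitness(q2, n, c):
    """GA-PLS fitness 1 - (n - 1)(1 - q^2) / (n - c), where n is the
    number of compounds, c the optimal number of PLS components, and q2
    the cross-validated correlation coefficient. For fixed q2, using
    more components shrinks (n - c) and lowers the fitness, so the GA
    favors parsimonious descriptor subsets."""
    return 1.0 - (n - 1) * (1.0 - q2) / (n - c)

# Illustrative values only: 80 compounds, same q2, different complexity
print(gapls_fitness(0.8, n=80, c=3))   # higher fitness
print(gapls_fitness(0.8, n=80, c=10))  # lower fitness
```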


Session-Based Intrusion Detection System To Map Anomalous Network Traffic, Bruce Caulkins Jan 2005

Electronic Theses and Dissertations

Computer crime is a large problem (CSI, 2004; Kabay, 2001a; Kabay, 2001b). Security managers have a variety of tools at their disposal -- firewalls, Intrusion Detection Systems (IDSs), encryption, authentication, and other hardware and software solutions -- to combat computer crime. Many IDS variants exist that allow security managers and engineers to identify attack packets primarily through the use of signature detection; i.e., the IDS recognizes attack packets by their well-known "fingerprints" or signatures as those packets cross the network's gateway threshold. On the other hand, anomaly-based ID systems determine what is normal traffic within a network and report …


Application Of Survival Analysis Methods To Pulsed Exposures: Exposure Duration, Latent Mortality, Recovery Time, And The Underlying Theory Of Survival Distribution Models, Yuan Zhao Jan 2005

Dissertations, Theses, and Masters Projects

Ecotoxicologists adopted median lethal concentration (LC50) methods from mammalian toxicology. This conventional LC50 approach has shortcomings. Fixing the exposure duration and selecting the 50% mortality level result in the loss of ecologically relevant information generated at all other times. It also ignores latent mortality that can manifest after exposure ends. As a result, it cannot adequately predict pulsed exposure effects in which the concentration, duration, and frequency of pulses change through time. The underlying theory of the dose-response models used to calculate LC50 values, stochastic versus individual effective dose (IED) theory, has not been tested rigorously either. In this study, the effects …