Open Access. Powered by Scholars. Published by Universities.®

Statistics and Probability Commons

University of Nevada, Las Vegas

Articles 181 - 208 of 208

Full-Text Articles in Statistics and Probability

Arima Models For Bank Failures: Prediction And Comparison, Fangjin Cui May 2011

UNLV Theses, Dissertations, Professional Papers, and Capstones

The number of bank failures has increased dramatically over the last twenty-two years. A common notion in economics is that some banks can become "too big to fail." Is this still a true statement? What is the relationship, if any, between bank sizes and bank failures? In this thesis, the proposed modeling techniques are applied to real bank failure data from the FDIC. In particular, quarterly data from 1989:Q1 to 2010:Q4 are used in the data analysis, which includes three major parts: 1) pairwise bank failure rate comparisons using the conditional test (Przyborowski and Wilenski, 1940); 2) development of the …
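The pairwise failure-rate comparison mentioned above rests on the conditional test of Przyborowski and Wilenski (1940), which compares two Poisson counts by conditioning on their total. A minimal sketch, assuming equal exposure in both groups and purely illustrative counts (the thesis's actual data and implementation are not reproduced here):

```python
from math import comb

def conditional_poisson_test(x1, x2, t1=1.0, t2=1.0):
    """Exact conditional test of H0: lambda1 == lambda2 for two Poisson
    counts x1, x2 observed over exposures t1, t2 (Przyborowski and
    Wilenski, 1940).  Conditional on n = x1 + x2, x1 ~ Binomial(n, p0)
    with p0 = t1 / (t1 + t2) under H0.  Returns a two-sided p-value that
    sums the probabilities of all outcomes no more likely than the
    observed one."""
    n = x1 + x2
    p0 = t1 / (t1 + t2)
    pmf = [comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    observed = pmf[x1]
    return min(1.0, sum(p for p in pmf if p <= observed + 1e-12))

# Illustrative counts only: 4 failures in one group vs. 12 in another,
# over equal exposure periods.
p_value = conditional_poisson_test(4, 12)
```

With equal exposures the test reduces to an exact binomial test of p = 1/2 on the split of the total count, which is why no Poisson rates need to be estimated.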


Statistical Inference Of A Measure For Two Binomial Variates, Serena Petersen May 2011

UNLV Theses, Dissertations, Professional Papers, and Capstones

We study measures of comparison for two independent binomial variates, which frequently occur in real situations. An estimator for the measure of reduction (MOR) is considered for two sample proportions, based on modified maximum likelihood estimation. We study the desirable properties of this estimator: its asymptotic unbiasedness and its variance. Since the measure ρ is approximately normally distributed when sample sizes are sufficiently large, one may establish approximate confidence intervals for its true value. For the numerical study, a Monte Carlo experiment is carried out for various scenarios of two …


Modeling Mortality Rates For Leukemia Between Men And Women In The United States, Blessed Quansah May 2011

UNLV Theses, Dissertations, Professional Papers, and Capstones

Leukemia-related deaths have increased dramatically over the last forty years. Leukemia is a malignant disease, or cancer, of the bone marrow and blood, characterized by the uncontrolled accumulation of blood cells. Leukemia is divided into two categories, myelogenous and lymphocytic, each of which can be acute or chronic. The terms myelogenous and lymphocytic denote the cell type involved.

In this thesis, the proposed modeling techniques are applied to leukemia deaths data from the Surveillance Epidemiology and End Results (SEER). In particular, annual deaths data from 1969 to 2007 are used in the data analysis, which includes three major …


Risk Auto Theft: Predicting Spatial Distributions Of Crime Events, Tana J. Gurule, Tamara D. Madensen Apr 2011

Graduate Research Symposium (GCUA) (2010 - 2017)

Police typically rely on retrospective hotspot maps to inform prevention strategies aimed at reducing future crime. The current study reviews environmental crime theories that help to identify causal factors associated with risk of auto theft. Map layers are created from data that operationalize these risk factors. These layers are combined using spatial analysis techniques to produce a "risk density" map. Analysis of crime data is used to determine whether our "risk density" map better predicts subsequent theft events than a traditional retrospective hotspot map.


Relationship Between Perceived And Actual Quality Of Data Checking, Hunter Speich, Sophia Karas, Dan Erosa, Kelly Grob, Kimberly A. Barchard Apr 2011

Festival of Communities: UG Symposium (Posters)

Data quality is critical to reaching correct research conclusions. Researchers attempt to ensure that they have accurate data by checking the data after it has been entered. Previous research has demonstrated that some methods of data checking are better than others, but not all researchers use the best methods. Perhaps researchers continue to use less optimal data checking methods because they mistakenly believe that they are highly accurate. The purpose of this study was to examine the relationship between perceived data quality and actual data quality. A total of 29 participants completed this study. Participants checked that letters and numbers …


Analysis Of Morris Water Maze Data With Bayesian Statistical Methods, Maxym V. Myroshnychenko, Anton Westveld, Jefferson Kinney Apr 2011

Festival of Communities: UG Symposium (Posters)

Neuroscientists commonly use a Morris Water Maze to assess learning in rodents. In this kind of maze, the subjects learn to swim toward a platform hidden in opaque water, orienting themselves by cues on the walls. This protocol presents a challenge for statistical analysis because an artificial cut-off must be set for experimental subjects that do not reach the platform, so that they do not drown from exhaustion. As a result, the data are right censored. In our experimental data, which compares learning in rodents that have chemically induced symptoms of schizophrenia to …


Unlv Enrollment Forecasting, Sabrina Beckman, Stefan Cline, Monika Neda Apr 2011

Festival of Communities: UG Symposium (Posters)

Our project investigates the future enrollment of undergraduates at UNLV in the entire university, the College of Science, and the Department of Mathematical Sciences. The method used for the forecast is the well-known least-squares method, for which a mathematical description will be presented. Studies of the numerical error are pursued as well. The study includes graphs that describe past and projected behavior for different parameter settings. The mathematical results obtained show that the university will continue to grow given the current trends of enrollment.
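As a rough illustration of the least-squares forecast described above, a straight-line trend can be fitted in closed form from the normal equations and then extrapolated. The headcount figures below are hypothetical, not actual UNLV enrollment data:

```python
def least_squares_forecast(years, enrollments, future_year):
    """Fit a line e = a + b*y by ordinary least squares (closed-form
    normal-equation solution) and extrapolate to a future year."""
    n = len(years)
    ybar = sum(years) / n
    ebar = sum(enrollments) / n
    b = sum((y - ybar) * (e - ebar) for y, e in zip(years, enrollments)) \
        / sum((y - ybar) ** 2 for y in years)
    a = ebar - b * ybar
    return a + b * future_year

# Hypothetical fall headcounts for illustration only.
years = [2006, 2007, 2008, 2009, 2010]
heads = [27000, 27600, 28100, 28600, 29200]
projected_2012 = least_squares_forecast(years, heads, 2012)
```

Higher-degree polynomial trends can be fitted the same way; the linear case keeps the normal equations to a single slope and intercept.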


Developing A Library Value Indicator For A Disciplinary Population, Jeanne M. Brown Apr 2011

Library Faculty Publications

Three different ways of documenting library value were presented to fourth year landscape architecture students in the UNLV School of Architecture: a contingent valuation survey, a library calculator, and a survey to rate importance and impact of library services and features. Students used the three approaches, then discussed their experiences with the author. Their input suggested improvements in the instruments and provided feedback on possible positive and negative consequences of inviting this kind of valuing. Working with a focused collection and population provided a relatively safe environment to explore concerns about negative consequences.


Demonstrating Library Value: Examples And Applications For Arts Libraries, Jeanne M. Brown Apr 2011

Library Faculty Publications

Demonstrating library value is of critical importance to all libraries, both to protect services and to serve patrons effectively. This paper presents suggestions for art and architecture libraries as they engage in determining what patrons value and documenting that value for library and campus administrators. Methods for calculating worth and for presenting a case are provided, as are ways of using strategic thinking and the assessment process to ensure the continuance of valuable services should budget reductions be unavoidable.


The Determinants Of Colorectal Cancer Survival Disparities In Nevada, Lucas N. Wassira Dec 2010

UNLV Theses, Dissertations, Professional Papers, and Capstones

Different population groups across Nevada and throughout the United States suffer disproportionately from colorectal cancer and its after-effects. Overcoming cancer health disparities is important for lessening the burden of cancer. There has been an overall decline in the incidence of and mortality from colorectal cancer (CRC). This is likely due, in part, to the increasing use of screening procedures such as Fecal Occult Blood Test (FOBT) and/or endoscopy, which can reduce the risk of CRC mortality by fifty percent. Nevertheless, screening procedures are routinely used by only fifty percent of Americans aged fifty years and older. Despite overall mortality decreasing …


Poisson Process Monitoring, Test And Comparison, Qing Chen Dec 2010

UNLV Theses, Dissertations, Professional Papers, and Capstones

The task of determining whether a sudden change has occurred in the generative parameters of a time series finds application in many areas. In this thesis, we aim to monitor the change point of a Poisson process with a method characterized by a forward-backward testing algorithm and several overall error control mechanisms. Applying the proposed method, we conclude that Mount Etna is not a simple Poissonian volcano, because two different regimes, divided by the change point of January 30th, 1974, are identified. The validation procedures, formal hypothesis tests and graphical methods used in a complementary fashion, will …
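A much-simplified sketch of Poisson change-point detection: maximize the two-regime likelihood over all split positions. This is not the thesis's forward-backward testing algorithm with overall error control, only a single-change-point illustration on simulated counts:

```python
from math import log

def poisson_changepoint(counts):
    """Locate the single most likely change point in a sequence of
    Poisson counts by maximizing the log-likelihood of a two-rate
    model over all split positions."""
    def loglik(seg):
        lam = sum(seg) / len(seg)     # MLE of the segment's rate
        if lam == 0:
            return 0.0
        # Poisson log-likelihood, dropping the constant log(x!) terms.
        return sum(x * log(lam) - lam for x in seg)
    best_split, best_ll = None, float("-inf")
    for s in range(1, len(counts)):
        ll = loglik(counts[:s]) + loglik(counts[s:])
        if ll > best_ll:
            best_split, best_ll = s, ll
    return best_split

# Simulated counts: rate around 1 for ten periods, then around 5.
counts = [1, 0, 2, 1, 0, 1, 2, 0, 1, 1, 5, 6, 4, 7, 5, 6, 4, 5]
split = poisson_changepoint(counts)  # index where the second regime begins
```

A formal test would compare the maximized two-rate likelihood against the single-rate fit (a likelihood-ratio statistic) before declaring a change point.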


Developing A Library Value Indicator For A Disciplinary Population, Jeanne M. Brown Oct 2010

Library Faculty Presentations

Population
- Landscape architecture studio of ten 5th-year students
- Use of the physical library ranges from 1-30 times/month
- Use of the virtual library ranges from 2-30 times/month
- Compared to others in the School of Architecture, their use is moderate
- They self-rate as average or above average on library skills, compared to their peers


Arima Model For Forecasting Poisson Data: Application To Long-Term Earthquake Predictions, Wangdong Fu Aug 2010

UNLV Theses, Dissertations, Professional Papers, and Capstones

Earthquakes that occurred worldwide during the period 1896 to 2009 with magnitude greater than or equal to 8.0 on the Richter scale are assumed to follow a Poisson process. Autoregressive Integrated Moving Average (ARIMA) models are presented to fit the empirical recurrence rates and to predict future large earthquakes. We demonstrate valuable modeling and computational techniques for point processes and time series data. Specifically, for the proposed methodology, we address the following areas: data management and graphic presentation, model fitting and selection, model validation, model and data sensitivity analysis, and forecasting.
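The empirical recurrence rate (ERR) series to which the ARIMA models are fitted can be sketched as the cumulative event count divided by elapsed observation time. The event years below are hypothetical, not the actual magnitude-8+ catalog:

```python
def empirical_recurrence_rates(event_years, start, end):
    """Empirical recurrence rate at the end of each year: cumulative
    number of events since `start`, divided by the elapsed time.
    Returns one rate per year from start+1 through end."""
    rates = []
    for year in range(start + 1, end + 1):
        count = sum(1 for y in event_years if start < y <= year)
        rates.append(count / (year - start))
    return rates

# Hypothetical large-earthquake years, for illustration only.
events = [1897, 1901, 1902, 1906]
err = empirical_recurrence_rates(events, 1896, 1906)
```

An ARIMA model would then be fitted to `err` as an ordinary time series, converting the point-process data into a form standard forecasting machinery can handle.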


General Coupon Collecting Models And Multinomial Games, James Y. Lee May 2010

UNLV Theses, Dissertations, Professional Papers, and Capstones

The coupon collection problem is one of the most studied problems in statistics. It is the problem of collecting r (r<∞) distinct coupons one by one from k different kinds (k<∞) of coupons. We note that this is equivalent to the classical occupancy problem which involves the random allocation of r distinct balls into k distinct cells. Although the problem was first introduced centuries ago, it is still actively investigated today. Perhaps its greatest features are its versatility, numerous approaches, and countless variations. For this reason, we are particularly interested in creating a classification system for the many generalizations of the coupon collection problem. In this thesis, we will introduce models that will be able to categorize these generalizations. In addition, we calculate the waiting time for the models under consideration. Our approach is to use the Dirichlet Type II integral. We compare our calculations to the ones obtained through Monte Carlo simulation. Our results will show that our models and the method used to find the waiting times are ideal for solving problems of this type.
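A minimal Monte Carlo sketch of the classical equal-probability waiting-time calculation that such simulations are checked against; the Dirichlet Type II integral approach of the thesis is not reproduced here:

```python
import random

def coupon_waiting_time(k, trials=20000, seed=1):
    """Monte Carlo estimate of the expected number of draws needed to
    collect at least one of each of k equally likely coupon types.
    The classical closed-form answer is k * H_k, where H_k is the
    k-th harmonic number."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen, draws = set(), 0
        while len(seen) < k:
            seen.add(rng.randrange(k))  # draw a coupon uniformly at random
            draws += 1
        total += draws
    return total / trials

k = 10
estimate = coupon_waiting_time(k)
exact = k * sum(1 / i for i in range(1, k + 1))  # 10 * H_10, about 29.29
```

The simulation should land within a fraction of a draw of the closed form at this trial count, which is the kind of agreement used to validate analytic waiting-time formulas.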


Insights Into The Commons On Flickr, Jason Vaughan Apr 2010

Library Faculty Publications

The Commons on Flickr, composed of an international community of select libraries, museums, and archives, was a project initially launched in 2008 by the Library of Congress and Flickr. Primary goals of The Commons are to broaden exposure to rich cultural heritage photographs and to observe and participate in the communities of engagement and dialog enabled through The Commons. A survey was administered to all The Commons institutions during summer 2009, focusing on assessment of the overall satisfaction of current members and seeking additional details on participation goals, social interactions, staff time involvement, and general statistics. Members report a very …


Research Poster: Hydrological Impacts Of Climate Change On Colorado Basin, Peng Jiang, Zhongbo Yu Feb 2010

2010 Annual Nevada NSF EPSCoR Climate Change Conference

Research poster


Research Poster: An Overview Of Progress In Nsf Epscor Project Entitled, “Reducing Cloud Uncertainties In Climate Models”, Subhashree Mishra, David L. Mitchell, W. Patrick Arnott Feb 2010

2010 Annual Nevada NSF EPSCoR Climate Change Conference

Research poster


Research Poster: Climate Prediction Downscaling Of Temperature And Precipitation In The Great Basin Region, Ramesh Vellore, Benjamin J. Hatchett, Darko Koracin Feb 2010

2010 Annual Nevada NSF EPSCoR Climate Change Conference

Research poster


In Step With Hiv Vaccines? A Content Analysis Of Local Recruitment Campaigns For An International Hiv Vaccine Study, Paula M. Frew, Wendy Macias, Kayshin Chan, Ashley Harding Jan 2009

Environmental & Occupational Health Faculty Publications

During the past two decades of the HIV/AIDS pandemic, several recruitment campaigns were designed to generate community involvement in preventive HIV vaccine clinical trials. These efforts utilized a blend of advertising and marketing strategies mixed with public relations and community education approaches to attract potential study participants to clinical trials (integrated marketing communications). Although more than 30,000 persons worldwide have participated in preventive HIV vaccine studies, no systematic analysis of recruitment campaigns exists. This content analysis study was conducted to examine several United States and Canadian recruitment campaigns for one of the largest-scale HIV vaccine trials to date (the “Step …


Statistical Inferences For Functions Of Parameters Of Several Pareto And Exponential Populations With Application In Data Traffic, Sumith Gunasekera Jan 2009

UNLV Theses, Dissertations, Professional Papers, and Capstones

In this dissertation, we discuss the usability and applicability of three statistical inferential frameworks--namely, the Classical Method, which is sometimes referred to as the Conventional or the Frequentist Method, based on the approximate large sample approach, the Generalized Variable Method based on the exact generalized p-value approach, and the Bayesian Method based on prior densities--for solving existing problems in the area of parametric estimation. These inference procedures are discussed through Pareto and exponential distributions that are widely used to model positive random variables relevant to social, scientific, actuarial, insurance, finance, investments, banking, and many other types of observable phenomena. …


Chronic Disease, Homeland Security, And Sailing To Where There Be Dragons, David M. Hassenzahl Oct 2008

Public Policy and Leadership Faculty Publications

The five papers in this special issue share the perspective that attitudes toward risk are strongly shaped by social context, and that understanding context can help us understand how risk decisions are made, and thereby how to make them better.


Implementation Of Uncertainty Propagation In Triton/Keno, Charlotta Sanders, Denis Beller Jan 2008

Reactor Campaign (TRP)

Monte Carlo methods are beginning to be used for three dimensional fuel depletion analyses to compute various quantities of interest, including isotopic compositions of used nuclear fuel. The TRITON control module, available in the SCALE 5.1 code system, can perform three-dimensional (3-D) depletion calculations using either the KENO V.a or KENO-VI Monte Carlo transport codes, as well as the two-dimensional (2-D) NEWT discrete ordinates code. To overcome problems such as spatially nonuniform neutron flux and non-uniform statistical uncertainties in computed reaction rates and to improve the fidelity of calculations using Monte Carlo methods, uncertainty propagation is needed for depletion calculations.


Monaco/Mavric Evaluation For Facility Shielding And Dose Rate Analysis, Charlotta Sanders, Denis Beller Jan 2008

Reactor Campaign (TRP)

Given the dimensions of and the large amount of shielding required for Global Nuclear Energy Partnership (GNEP) facilities, advanced radiation shielding and dose computation techniques beyond today's capabilities will certainly be required. With the Generation IV Nuclear Energy System Initiative, it will become increasingly important to accurately model advanced Boiling Water Reactor and Pressurized Water Reactor facilities, and to calculate dose rates at all locations within a containment (e.g., from radiation emitted by the reactor as well as by the primary coolant loop) and adjoining structures (e.g., from the spent fuel pool).

The MAVRIC sequence is …


Implementation Of Uncertainty Propagation In Triton/Keno: To Support The Global Nuclear Energy Partnership, Charlotta Sanders, Denis Beller Oct 2007

Reactor Campaign (TRP)

Monte Carlo methods are beginning to be used for three-dimensional fuel depletion analyses to compute various quantities of interest, including isotopic compositions of used fuel. The TRITON control module, available in the SCALE 5.1 code system, can perform three-dimensional (3-D) depletion calculations using either the KENO V.a or KENO-VI Monte Carlo transport codes, as well as the two-dimensional (2-D) NEWT discrete ordinates code. For typical reactor systems, the neutron flux is not spatially uniform. For Monte Carlo simulations, this results in non-uniform statistical uncertainties in the computed reaction rates. For spatial regions where the flux is low, e.g., …


Monaco/Mavric Evaluation For Facility Shielding And Dose Rate Analysis: To Support The Global Nuclear Energy Partnership, Charlotta Sanders, Denis Beller Oct 2007

Reactor Campaign (TRP)

Monte Carlo methods are used to compute fluxes or dose rates over large areas using mesh tallies. For problems that demand that the uncertainty in each mesh cell be less than some set maximum, computation time is controlled by the cell with the largest uncertainty. This issue becomes quite troublesome in deep-penetration problems, and advanced variance reduction techniques are required to obtain reasonable uncertainties over large areas.

In this project the MAVRIC sequence will be evaluated along with the Monte Carlo engine Monaco to investigate its effectiveness and usefulness in facility shielding and dose rate analyses. A previously MCNP-evaluated cask …


Procedure Models, C. F. Bartley, W. W. Watson Oct 2006

Publications (YM)

This procedure establishes the responsibilities and process for documenting activities that constitute scientific investigation modeling. Planning requirements for conducting modeling are contained in LP-2.29Q-BSC, Planning for Science Activities.


Ramping Up Assessment At The Unlv Libraries, Jeanne M. Brown Jan 2005

Library Faculty Publications

Purpose – Sets out to describe the development of an assessment program at UNLV Libraries and current assessment activities.

Design/methodology/approach – Assessment activities are first placed in organizational context, distinguishing between assessment initiated by departments, and assessment done library-wide. Common expressions of resistance to assessment are noted, followed by the library and campus context relating to assessment. The impact of technology and of the LibQual+ survey is discussed.

Findings – Assessment activities at UNLV Libraries have strengthened and diversified over the last several years, thanks to several factors including the guidance of its dean, the development of technology and human …


A Monte Carlo Analysis Of Hedonic Models Using Traditional And Spatial Approaches, Helen R. Neill, David M. Hassenzahl, Djeto D. Assane Jun 2003

Public Policy and Leadership Faculty Publications

Hedonic regression analysis of single family homes typically includes structural variables, locational variables and neighborhood quality characteristics. When nearby properties are related, Dubin (1988) reports that error terms are spatially autocorrelated. Estimation methods for these spatially autocorrelated error terms or hereafter, spatial approaches, include maximum likelihood estimation (MLE) and kriging techniques such as kriged maximum likelihood estimation (KMLE). Unfortunately these spatial methods require massive computer resources and are limited to significantly fewer observations than traditional ordinary least squares (OLS). This paper investigates the combination of spatial approaches and Monte Carlo analysis, a method that approximates large data sets. A question …