
Physical Sciences and Mathematics Commons


Statistics and Probability

University of Nevada, Las Vegas


Articles 181 - 208 of 208

Full-Text Articles in Physical Sciences and Mathematics

Effects Of Inlet Conditions On Diffuser Outlet Performance, Zaccary A. Poots May 2011


UNLV Theses, Dissertations, Professional Papers, and Capstones

Designers and installers of building air distribution terminal systems require accurate quantitative information on the performance of the installed system to achieve optimum efficiency and human comfort. This requires adjustment values that relate field-installed performance to published ideal pressure-loss, air-distribution, and sound-generation performance. This study documents the air output performance of different installation configurations of six types of ceiling diffusers and compares the results to the performance obtained when the diffusers are installed according to ANSI/ASHRAE Standard 70-2006. A diffuser inlet supply plenum was designed for optimum flow and used to acquire a baseline set of data covering the six types …


Calcium Supplemental Usage And Potential Health Issues Associated With The Rate Of Usage In Las Vegas, NV, Tanesha Nicole Moss May 2011


UNLV Theses, Dissertations, Professional Papers, and Capstones

Calcium is an important nutrient, and it is essential that people consume sufficient amounts. However, some calcium supplements have been known to contain small quantities of lead. This research project used a retrospective approach to explore the trade-off between the benefits of calcium and the potential lead exposure among people who take these supplements. A survey consisting of 10 questions was used to assess the rate of consumption of specific types of calcium supplements. This research project obtained lead levels in calcium supplements from previous research and applied that data into this …


Implementation Of Numerically Stable Hidden Markov Model, Usha Ramya Tatavarty May 2011


UNLV Theses, Dissertations, Professional Papers, and Capstones

A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states. HMMs are extremely flexible tools and have been successfully applied to a wide variety of stochastic modeling tasks. One of the first applications of HMMs was speech recognition; they later became widely used in handwriting recognition, part-of-speech tagging, and bioinformatics.

In this thesis, we will explain the mathematics involved in HMMs and how to efficiently perform HMM computations using dynamic programming (DP), which makes it easy to implement …
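The thesis implementation itself is not shown in the abstract, but a common way to keep the forward recursion numerically stable is to work in log space. The following is a minimal sketch of that standard idea; the two-state model, probabilities, and observation sequence are hypothetical and not taken from the thesis.

```python
import numpy as np

def log_forward(log_pi, log_A, log_B, obs):
    """Log-space forward recursion for an HMM.

    log_pi: (N,) log initial state probabilities
    log_A:  (N, N) log transition probabilities, A[i, j] = P(state j | state i)
    log_B:  (N, M) log emission probabilities, B[i, k] = P(symbol k | state i)
    obs:    sequence of observation indices
    Returns log P(obs | model).
    """
    # alpha[i] holds log P(o_1..o_t, q_t = i); working in logs avoids underflow
    alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        # log-sum-exp over predecessor states for each current state
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return np.logaddexp.reduce(alpha)

# Toy two-state example (illustrative numbers only)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.5], [0.1, 0.9]])
obs = [0, 1, 1, 0, 1]
print(log_forward(np.log(pi), np.log(A), np.log(B), obs))
```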


Risk Auto Theft: Predicting Spatial Distributions Of Crime Events, Tana J. Gurule, Tamara D. Madensen Apr 2011


Graduate Research Symposium (GCUA) (2010 - 2017)

Police typically rely on retrospective hotspot maps to inform prevention strategies aimed at reducing future crime. The current study reviews environmental crime theories that help to identify causal factors associated with risk of auto theft. Map layers are created from data that operationalize these risk factors. These layers are combined using spatial analysis techniques to produce a "risk density" map. Analysis of crime data is used to determine whether our "risk density" map better predicts subsequent theft events than a traditional retrospective hotspot map.
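The poster's GIS workflow is not detailed in the abstract; as a rough illustration of the layer-combination idea, the sketch below standardizes several gridded risk-factor layers and sums them into a composite risk surface. The layer names, weights, and random grids are purely illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a composite risk surface: each layer is a grid scoring one
# environmental risk factor; layers are standardized and combined with weights.
rng = np.random.default_rng(0)
grid_shape = (100, 100)

layers = {
    "parking_lot_density": rng.random(grid_shape),   # hypothetical layers
    "bar_proximity": rng.random(grid_shape),
    "low_lighting": rng.random(grid_shape),
}
weights = {"parking_lot_density": 1.0, "bar_proximity": 0.8, "low_lighting": 0.6}

def zscore(a):
    return (a - a.mean()) / a.std()

risk_density = sum(weights[name] * zscore(layer) for name, layer in layers.items())
# Cells in the top decile of the composite surface are flagged as high-risk areas.
high_risk = risk_density >= np.quantile(risk_density, 0.9)
print(high_risk.sum(), "high-risk cells")
```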


Relationship Between Perceived And Actual Quality Of Data Checking, Hunter Speich, Sophia Karas, Dan Erosa, Kelly Grob, Kimberly A. Barchard Apr 2011


Festival of Communities: UG Symposium (Posters)

Data quality is critical to reaching correct research conclusions. Researchers attempt to ensure that they have accurate data by checking the data after it has been entered. Previous research has demonstrated that some methods of data checking are better than others, but not all researchers use the best methods. Perhaps researchers continue to use less optimal data checking methods because they mistakenly believe that they are highly accurate. The purpose of this study was to examine the relationship between perceived data quality and actual data quality. A total of 29 participants completed this study. Participants checked that letters and numbers …


Analysis Of Morris Water Maze Data With Bayesian Statistical Methods, Maxym V. Myroshnychenko, Anton Westveld, Jefferson Kinney Apr 2011


Festival of Communities: UG Symposium (Posters)

Neuroscientists commonly use a Morris Water Maze to assess learning in rodents. In this kind of maze, the subjects learn to swim toward a platform hidden in opaque water, orienting themselves by cues on the walls. This protocol presents a challenge for statistical analysis because an artificial cut-off must be set for subjects that do not reach the platform, so that they do not drown from exhaustion. This leads to the data being right censored. In our experimental data, which compares learning in rodents that have chemically induced symptoms of schizophrenia to …
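The abstract does not specify the Bayesian model used; purely to illustrate how right censoring enters a Bayesian analysis, the sketch below assumes exponentially distributed escape latencies with a conjugate Gamma prior and a 60 s cut-off. All quantities are made up.

```python
import numpy as np

# Conjugate sketch of Bayesian inference for right-censored escape latencies.
rng = np.random.default_rng(1)

cutoff = 60.0
true_rate = 1 / 40.0                      # mean escape latency of 40 s (hypothetical)
latent = rng.exponential(1 / true_rate, size=30)
observed = np.minimum(latent, cutoff)
censored = latent >= cutoff               # True where the subject never reached the platform

# Gamma(a0, b0) prior on the rate; an exponential likelihood with right censoring is
# conjugate: each uncensored trial adds 1 to the shape, every trial adds its time.
a0, b0 = 1.0, 1.0
a_post = a0 + (~censored).sum()
b_post = b0 + observed.sum()

post_mean_latency = b_post / (a_post - 1)   # posterior mean of 1/rate (valid for a_post > 1)
print(censored.sum(), "censored trials; posterior mean latency:", round(post_mean_latency, 1), "s")
```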


UNLV Enrollment Forecasting, Sabrina Beckman, Stefan Cline, Monika Neda Apr 2011


Festival of Communities: UG Symposium (Posters)

Our project investigates the future enrollment of undergraduates at UNLV for the entire university, the College of Science, and the Department of Mathematical Sciences. The method used for the forecast is the well-known least-squares method, for which a mathematical description will be presented. Studies of the numerical error are also pursued. The study will include graphs that describe the past and future behavior for different parameter settings. Mathematical results obtained show that the university will continue to grow given the current trends of enrollment.
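The poster's data are not reproduced here; as a minimal sketch of the least-squares trend-fitting idea, the example below fits a straight line to made-up historical headcounts and extrapolates it.

```python
import numpy as np

# Least-squares trend sketch: the enrollment figures are illustrative, not UNLV data.
years = np.arange(2000, 2011)
enrollment = np.array([21000, 21800, 22500, 23400, 24100, 25000,
                       25600, 26400, 27200, 27900, 28600])

coeffs = np.polyfit(years, enrollment, deg=1)        # least-squares line (slope, intercept)
trend = np.poly1d(coeffs)

residual_rmse = np.sqrt(np.mean((trend(years) - enrollment) ** 2))
forecast_2015 = trend(2015)
print("slope:", round(coeffs[0]), "students/year; RMSE:", round(residual_rmse),
      "; 2015 forecast:", round(forecast_2015))
```

A higher polynomial degree or a piecewise fit could be substituted in the same way; the numerical error studies mentioned in the abstract would then compare residuals across those parameter settings.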


Developing A Library Value Indicator For A Disciplinary Population, Jeanne M. Brown Apr 2011


Library Faculty Publications

Three different ways of documenting library value were presented to fourth year landscape architecture students in the UNLV School of Architecture: a contingent valuation survey, a library calculator, and a survey to rate importance and impact of library services and features. Students used the three approaches, then discussed their experiences with the author. Their input suggested improvements in the instruments and provided feedback on possible positive and negative consequences of inviting this kind of valuing. Working with a focused collection and population provided a relatively safe environment to explore concerns about negative consequences.


Demonstrating Library Value: Examples And Applications For Arts Libraries, Jeanne M. Brown Apr 2011


Library Faculty Publications

Demonstrating library value is of critical importance to all libraries, both to protect services and to serve patrons effectively. This paper presents suggestions for art and architecture libraries as they engage in determining what patrons value and documenting that value for library and campus administrators. Methods for calculating worth and for presenting a case are provided, as are ways of using strategic thinking and the assessment process to ensure the continuance of valuable services should budget reductions be unavoidable.


Poisson Process Monitoring, Test And Comparison, Qing Chen Dec 2010


UNLV Theses, Dissertations, Professional Papers, and Capstones

The task of determining whether a sudden change occurred in the generative parameters of a time series arises in many application areas. In this thesis, we aim at monitoring the change point of a Poisson process with a method characterized by a forward-backward testing algorithm and several overall error control mechanisms. Applying the proposed method, we conclude that Mount Etna is not a simple Poissonian volcano, because two different regimes, separated by a change point at January 30, 1974, are identified. The validation procedures, in which formal hypothesis tests and graphical methods are used in a complementary fashion, will …
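The full forward-backward algorithm and its overall error control are beyond a short example; the sketch below shows only the basic single-change-point likelihood-ratio test for a Poisson count series, applied to simulated data rather than the Mount Etna record.

```python
import numpy as np
from scipy.stats import chi2

def poisson_loglik(counts, rate):
    # Log-likelihood up to an additive constant (log-factorials cancel in the ratio)
    if rate == 0:
        return 0.0 if counts.sum() == 0 else -np.inf
    return float((counts * np.log(rate) - rate).sum())

def change_point_lrt(counts):
    n = len(counts)
    ll_null = poisson_loglik(counts, counts.mean())
    best_tau, best_ll = None, -np.inf
    for tau in range(1, n):                           # candidate change points
        ll = (poisson_loglik(counts[:tau], counts[:tau].mean()) +
              poisson_loglik(counts[tau:], counts[tau:].mean()))
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    stat = 2 * (best_ll - ll_null)
    # Naive chi-square reference; a real analysis must correct for scanning over tau,
    # which is what the thesis's overall error control addresses.
    return best_tau, stat, chi2.sf(stat, df=1)

rng = np.random.default_rng(2)
counts = np.concatenate([rng.poisson(2.0, 40), rng.poisson(5.0, 35)])  # two simulated regimes
print(change_point_lrt(counts))
```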


The Determinants Of Colorectal Cancer Survival Disparities In Nevada, Lucas N. Wassira Dec 2010


UNLV Theses, Dissertations, Professional Papers, and Capstones

Different population groups across Nevada and throughout the United States suffer disproportionately from colorectal cancer and its after-effects. Overcoming cancer health disparities is important for lessening the burden of cancer. There has been an overall decline in the incidence of and mortality from colorectal cancer (CRC). This is likely due, in part, to the increasing use of screening procedures such as Fecal Occult Blood Test (FOBT) and/or endoscopy, which can reduce the risk of CRC mortality by fifty percent. Nevertheless, screening procedures are routinely used by only fifty percent of Americans aged fifty years and older. Despite overall mortality decreasing …


Developing A Library Value Indicator For A Disciplinary Population, Jeanne M. Brown Oct 2010


Library Faculty Presentations

Population
- Landscape architecture studio of ten fifth-year students
- Use of the physical library ranges from 1–30 times/month
- Use of the virtual library ranges from 2–30 times/month
- Compared to others in the School of Architecture, their use is moderate
- They self-rate as average or above average on library skills, compared to their peers


ARIMA Model For Forecasting Poisson Data: Application To Long-Term Earthquake Predictions, Wangdong Fu Aug 2010


UNLV Theses, Dissertations, Professional Papers, and Capstones

Earthquakes that occurred worldwide during the period 1896 to 2009 with magnitude greater than or equal to 8.0 on the Richter scale are assumed to follow a Poisson process. Autoregressive Integrated Moving Average (ARIMA) models are presented to fit the empirical recurrence rates and to predict future large earthquakes. We demonstrate modeling and computational techniques for point processes and time series data. Specifically, for the proposed methodology, we address the following areas: data management and graphic presentation, model fitting and selection, model validation, model and data sensitivity analysis, and forecasting.
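The thesis fits full ARIMA models to the actual catalog; the sketch below only illustrates the empirical recurrence-rate construction and a simple least-squares AR(1) fit on simulated annual counts, so every number is a stand-in.

```python
import numpy as np

# Empirical recurrence rate r_t = (cumulative events) / t from annual counts,
# followed by an AR(1) fit via ordinary least squares (a simplification of ARIMA).
rng = np.random.default_rng(3)
annual_counts = rng.poisson(0.6, size=114)          # stand-in for 1896-2009 counts
t = np.arange(1, len(annual_counts) + 1)
recurrence_rate = np.cumsum(annual_counts) / t      # empirical recurrence rate series

# AR(1): r_t = c + phi * r_{t-1} + e_t
y, x = recurrence_rate[1:], recurrence_rate[:-1]
X = np.column_stack([np.ones_like(x), x])
c, phi = np.linalg.lstsq(X, y, rcond=None)[0]

# One-step-ahead forecast of next year's recurrence rate
print("phi:", round(phi, 3), "; forecast:", round(c + phi * recurrence_rate[-1], 3), "events/year")
```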


General Coupon Collecting Models And Multinomial Games, James Y. Lee May 2010


UNLV Theses, Dissertations, Professional Papers, and Capstones

The coupon collection problem is one of the most studied problems in statistics. It is the problem of collecting r (r<∞) distinct coupons one by one from k different kinds (k<∞) of coupons. We note that this is equivalent to the classical occupancy problem which involves the random allocation of r distinct balls into k distinct cells. Although the problem was first introduced centuries ago, it is still actively investigated today. Perhaps its greatest feature is its versatility, numerous approaches, and countless variations. For this reason, we are particularly interested in creating a classification system for the many generalizations of the coupon collection problem. In this thesis, we will introduce models that will be able to categorize these generalizations. In addition, we calculate the waiting time for the models under consideration. Our approach is to use the Dirichlet Type II integral. We compare our calculations to the ones obtained through Monte Carlo simulation. Our results will show that our models and the method used to find the waiting times are ideal for solving problems of this type.
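As a small illustration of the Monte Carlo comparison mentioned at the end of the abstract, the sketch below simulates the waiting time to collect all k equally likely coupon types and checks it against the classical closed form k * H_k; the general models and Dirichlet Type II calculations of the thesis go well beyond this special case.

```python
import numpy as np

def waiting_time(k, rng):
    """Number of draws needed to see all k equally likely coupon types."""
    seen, draws = set(), 0
    while len(seen) < k:
        seen.add(int(rng.integers(k)))
        draws += 1
    return draws

rng = np.random.default_rng(4)
k = 10
simulated = np.mean([waiting_time(k, rng) for _ in range(20000)])
exact = k * np.sum(1.0 / np.arange(1, k + 1))       # k * (1 + 1/2 + ... + 1/k)
print("Monte Carlo:", round(simulated, 2), "; exact:", round(exact, 2))
```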


Insights Into The Commons On Flickr, Jason Vaughan Apr 2010


Library Faculty Publications

The Commons on Flickr, made up of an international community of select libraries, museums, and archives, was a project initially launched in 2008 by the Library of Congress and Flickr. Primary goals of The Commons are to broaden exposure to rich cultural heritage photographs and to observe and participate in the communities of engagement and dialog enabled through The Commons. A survey was administered to all The Commons institutions during summer 2009, focusing on assessment of the overall satisfaction of current members and seeking additional details on participation goals, social interactions, staff time involvement, and general statistics. Members report a very …


Research Poster: Hydrological Impacts Of Climate Change On Colorado Basin, Peng Jiang, Zhongbo Yu Feb 2010


2010 Annual Nevada NSF EPSCoR Climate Change Conference

Research poster


Research Poster: An Overview Of Progress In Nsf Epscor Project Entitled, “Reducing Cloud Uncertainties In Climate Models”, Subhashree Mishra, David L. Mitchell, W. Patrick Arnott Feb 2010


2010 Annual Nevada NSF EPSCoR Climate Change Conference

Research poster


Research Poster: Climate Prediction Downscaling Of Temperature And Precipitation In The Great Basin Region, Ramesh Vellore, Benjamin J. Hatchett, Darko Koracin Feb 2010


2010 Annual Nevada NSF EPSCoR Climate Change Conference

Research poster


In Step With Hiv Vaccines? A Content Analysis Of Local Recruitment Campaigns For An International Hiv Vaccine Study, Paula M. Frew, Wendy Macias, Kayshin Chan, Ashley Harding Jan 2009


Environmental & Occupational Health Faculty Publications

During the past two decades of the HIV/AIDS pandemic, several recruitment campaigns were designed to generate community involvement in preventive HIV vaccine clinical trials. These efforts utilized a blend of advertising and marketing strategies mixed with public relations and community education approaches to attract potential study participants to clinical trials (integrated marketing communications). Although more than 30,000 persons worldwide have participated in preventive HIV vaccine studies, no systematic analysis of recruitment campaigns exists. This content analysis study was conducted to examine several United States and Canadian recruitment campaigns for one of the largest-scale HIV vaccine trials to date (the “Step …


Statistical Inferences For Functions Of Parameters Of Several Pareto And Exponential Populations With Application In Data Traffic, Sumith Gunasekera Jan 2009


UNLV Theses, Dissertations, Professional Papers, and Capstones

In this dissertation, we discuss the usability and applicability of three statistical inferential frameworks--namely, the Classical Method, which is sometimes referred to as the Conventional or the Frequentist Method, based on the approximate large sample approach, the Generalized Variable Method based on the exact generalized p-value approach, and the Bayesian Method based on prior densities--for solving existing problems in the area of parametric estimation. These inference procedures are discussed through Pareto and exponential distributions that are widely used to model positive random variables relevant to social, scientific, actuarial, insurance, finance, investments, banking, and many other types of observable phenomena. …
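The generalized variable machinery for several populations is too involved for a short example; the sketch below shows only the underlying pivot-simulation idea for a single exponential mean, using the exact fact that 2 * sum(x) / theta follows a chi-square distribution with 2n degrees of freedom. The data are simulated, not from the dissertation.

```python
import numpy as np

# Pivot-simulation sketch for one exponential mean theta: 2*sum(x)/theta ~ chi2(2n).
rng = np.random.default_rng(5)
x = rng.exponential(scale=3.0, size=25)             # hypothetical inter-arrival times
n, total = len(x), x.sum()

# Draw the pivot's distribution and convert each draw into a value of theta
pivot_draws = rng.chisquare(2 * n, size=100000)
theta_draws = 2 * total / pivot_draws
lo, hi = np.percentile(theta_draws, [2.5, 97.5])
print("95% interval for the mean:", round(lo, 2), "to", round(hi, 2))
```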


Chronic Disease, Homeland Security, And Sailing To Where There Be Dragons, David M. Hassenzahl Oct 2008


Public Policy and Leadership Faculty Publications

The five papers in this special issue share the perspective that attitudes toward risk are strongly shaped by social context, and that understanding context can help us understand how risk decisions are made, and thereby how to make them better.


Implementation Of Uncertainty Propagation In Triton/Keno, Charlotta Sanders, Denis Beller Jan 2008


Reactor Campaign (TRP)

Monte Carlo methods are beginning to be used for three-dimensional fuel depletion analyses to compute various quantities of interest, including isotopic compositions of used nuclear fuel. The TRITON control module, available in the SCALE 5.1 code system, can perform three-dimensional (3-D) depletion calculations using either the KENO V.a or KENO-VI Monte Carlo transport codes, as well as the two-dimensional (2-D) NEWT discrete ordinates code. To overcome problems such as spatially non-uniform neutron flux and non-uniform statistical uncertainties in computed reaction rates, and to improve the fidelity of calculations using Monte Carlo methods, uncertainty propagation is needed for depletion calculations.
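The report's actual propagation scheme in TRITON/KENO is not described here; as a generic illustration of the concept, the sketch below samples a tallied reaction rate within its statistical uncertainty and pushes each sample through a one-nuclide depletion step, showing how tally uncertainty becomes uncertainty in the end-of-step composition. All values are hypothetical.

```python
import numpy as np

# Generic uncertainty-propagation illustration, not the TRITON/KENO implementation:
# propagate a sampled reaction rate through N(t) = N0 * exp(-R * t).
rng = np.random.default_rng(6)

N0 = 1.0e24                 # initial number density (arbitrary units)
rate_mean = 1.0e-9          # absorption rate per atom (1/s), hypothetical tally result
rate_rel_sigma = 0.05       # 5% relative statistical uncertainty from the Monte Carlo tally
dt = 3.0e7                  # depletion step length (s)

rate_samples = rng.normal(rate_mean, rate_rel_sigma * rate_mean, size=100000)
N_end = N0 * np.exp(-rate_samples * dt)

rel_unc = N_end.std() / N_end.mean()
print("end-of-step number density:", N_end.mean(), "relative uncertainty:", round(rel_unc, 4))
```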


Monaco/Mavric Evaluation For Facility Shielding And Dose Rate Analysis, Charlotta Sanders, Denis Beller Jan 2008


Reactor Campaign (TRP)

Because of the dimensions and the large amount of shielding required for Global Nuclear Energy Partnership (GNEP) facilities, advanced radiation shielding and dose computation techniques beyond today's capabilities will certainly be required. With the Generation IV Nuclear Energy System Initiative, it will become increasingly important to be able to accurately model advanced Boiling Water Reactor and Pressurized Water Reactor facilities, and to calculate dose rates at all locations within a containment (e.g., resulting from radiation from the reactor as well as from the primary coolant loop) and adjoining structures (e.g., from the spent fuel pool).

The MAVRIC sequence is …


Implementation Of Uncertainty Propagation In Triton/Keno: To Support The Global Nuclear Energy Partnership, Charlotta Sanders, Denis Beller Oct 2007


Reactor Campaign (TRP)

Monte Carlo methods are beginning to be used for three-dimensional fuel depletion analyses to compute various quantities of interest, including isotopic compositions of used fuel [1]. The TRITON control module, available in the SCALE 5.1 code system, can perform three-dimensional (3-D) depletion calculations using either the KENO V.a or KENO-VI Monte Carlo transport codes, as well as the two-dimensional (2-D) NEWT discrete ordinates code. For typical reactor systems, the neutron flux is not spatially uniform. For Monte Carlo simulations, this results in non-uniform statistical uncertainties in the computed reaction rates. For spatial regions where the flux is low, e.g., …


Monaco/Mavric Evaluation For Facility Shielding And Dose Rate Analysis: To Support The Global Nuclear Energy Partnership, Charlotta Sanders, Denis Beller Oct 2007


Reactor Campaign (TRP)

Monte Carlo methods are used to compute fluxes or dose rates over large areas using mesh tallies. For problems that demand that the uncertainty in each mesh cell be less than some set maximum, computation time is controlled by the cell with the largest uncertainty. This issue becomes quite troublesome in deep-penetration problems, and advanced variance reduction techniques are required to obtain reasonable uncertainties over large areas.

In this project the MAVRIC sequence will be evaluated along with the Monte Carlo engine Monaco to investigate its effectiveness and usefulness in facility shielding and dose rate analyses. A previously MCNP-evaluated cask …
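As a back-of-the-envelope illustration of the point made above, that the mesh cell with the largest uncertainty controls the run time, the sketch below uses the usual 1/sqrt(N) scaling of Monte Carlo relative error to estimate how many histories the worst cell would demand. The numbers are hypothetical, not from the MAVRIC evaluation.

```python
import numpy as np

# Relative error of a Monte Carlo tally scales roughly as 1/sqrt(N), so the target
# error in the worst mesh cell sets the history count for the whole problem.
current_histories = 1.0e7
cell_rel_errors = np.array([0.02, 0.05, 0.30, 0.08])   # per-cell relative errors (illustrative)
target = 0.05

worst = cell_rel_errors.max()
needed = current_histories * (worst / target) ** 2
print("worst cell at", worst, "; histories needed for target everywhere:", needed)
```

Variance reduction techniques such as those evaluated in this project aim to flatten the error distribution so that the worst cell no longer dominates.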


Procedure Models, C. F. Bartley, W. W. Watson Oct 2006


Publications (YM)

This procedure establishes the responsibilities and process for documenting activities that constitute scientific investigation modeling. Planning requirements for conducting modeling are contained in LP-2.29Q-BSC, Planning for Science Activities.


Ramping Up Assessment At The Unlv Libraries, Jeanne M. Brown Jan 2005


Library Faculty Publications

Purpose – Sets out to describe the development of an assessment program at UNLV Libraries and current assessment activities.

Design/methodology/approach – Assessment activities are first placed in organizational context, distinguishing between assessment initiated by departments, and assessment done library-wide. Common expressions of resistance to assessment are noted, followed by the library and campus context relating to assessment. The impact of technology and of the LibQual+ survey is discussed.

Findings – Assessment activities at UNLV Libraries have strengthened and diversified over the last several years, thanks to several factors including the guidance of its dean, the development of technology and human …


A Monte Carlo Analysis Of Hedonic Models Using Traditional And Spatial Approaches, Helen R. Neill, David M. Hassenzahl, Djeto D. Assane Jun 2003


Public Policy and Leadership Faculty Publications

Hedonic regression analysis of single family homes typically includes structural variables, locational variables and neighborhood quality characteristics. When nearby properties are related, Dubin (1988) reports that error terms are spatially autocorrelated. Estimation methods for these spatially autocorrelated error terms or hereafter, spatial approaches, include maximum likelihood estimation (MLE) and kriging techniques such as kriged maximum likelihood estimation (KMLE). Unfortunately these spatial methods require massive computer resources and are limited to significantly fewer observations than traditional ordinary least squares (OLS). This paper investigates the combination of spatial approaches and Monte Carlo analysis, a method that approximates large data sets. A question …
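The paper's exact experimental design is not reproduced here; in the spirit of its Monte Carlo comparison, the sketch below generates hedonic-style data with spatially autocorrelated errors and tracks how ordinary least squares behaves across replications. The weight matrix, sample size, and coefficients are illustrative assumptions.

```python
import numpy as np

# Monte Carlo sketch: hedonic-style data with a spatial-error structure, estimated by OLS.
rng = np.random.default_rng(7)
n, reps, rho, beta = 100, 500, 0.7, np.array([10.0, 2.0])

# Simple row-standardized nearest-neighbour weights on a line of "locations"
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W /= W.sum(axis=1, keepdims=True)
spatial_filter = np.linalg.inv(np.eye(n) - rho * W)   # u = (I - rho*W)^-1 * eps

estimates = []
for _ in range(reps):
    sqft = rng.uniform(1.0, 3.0, n)                   # structural variable (1000s of sq ft)
    X = np.column_stack([np.ones(n), sqft])
    u = spatial_filter @ rng.normal(0.0, 1.0, n)      # spatially autocorrelated error
    price = X @ beta + u
    estimates.append(np.linalg.lstsq(X, price, rcond=None)[0])

estimates = np.array(estimates)
print("mean OLS estimates:", estimates.mean(axis=0))   # roughly unbiased for beta
print("spread of slope across replications:", estimates[:, 1].std())
```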