
Statistics and Probability

2011


Articles 1 - 9 of 9

Full-Text Articles in Numerical Analysis and Computation

Flexible Distributed Lag Models Using Random Functions With Application To Estimating Mortality Displacement From Heat-Related Deaths, Roger D. Peng Dec 2011

Johns Hopkins University, Dept. of Biostatistics Working Papers

No abstract provided.


Energy Functional For Nuclear Masses, Michael Giovanni Bertolli Dec 2011

Doctoral Dissertations

An energy functional is formulated for mass calculations of nuclei across the nuclear chart with major-shell occupations as the relevant degrees of freedom. The functional is based on Hohenberg-Kohn theory. Motivation for its form comes from both phenomenology and relevant microscopic systems, such as the three-level Lipkin Model. A global fit of the 17-parameter functional to nuclear masses yields a root-mean-square deviation of χ = 1.31 MeV, on the order of other mass models. The construction of the energy functional includes the development of a systematic method for selecting and testing possible functional terms. Nuclear radii are computed within …
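
As a point of reference for that figure of merit, the root-mean-square deviation over N fitted masses is conventionally defined along the lines below; the exact weighting used in the dissertation is not reproduced in this listing, so treat this as a generic definition rather than the author's own formula.

    \chi = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(M_i^{\mathrm{exp}} - M_i^{\mathrm{calc}}\right)^{2}}

Here M_i^exp and M_i^calc denote the measured and calculated masses of nucleus i.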


Applying GMDH-Type Neural Network And Genetic Algorithm For Stock Price Prediction Of Iranian Cement Sector, Saeed Fallahi, Meysam Shaverdi, Vahab Bashiri Dec 2011

Applications and Applied Mathematics: An International Journal (AAM)

The cement industry is one of the most important and profitable industries in Iran, and a great deal of financial resources is invested in this sector yearly. In this paper, a GMDH-type neural network combined with a genetic algorithm is developed for stock price prediction in the cement sector. For stock price prediction with the GMDH-type neural network, we use earnings per share (EPS), predicted earnings per share (PEPS), dividend per share (DPS), the price-earnings ratio (P/E), and the earnings-price ratio (E/P) as input data and the stock price as output data. For this work, data for ten cement companies were gathered from the Tehran Stock Exchange (TSE) in …
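
The GMDH/genetic-algorithm pipeline itself is not reproduced in this listing; the sketch below only illustrates the input/output layout the abstract describes (five fundamentals in, price out), with an ordinary least-squares baseline standing in for the GMDH-type network. Function and variable names are illustrative, not taken from the paper.

    # Minimal sketch: each row of `features` is [EPS, PEPS, DPS, P/E, E/P] for one
    # firm-year observation and `price` is the observed stock price; a plain
    # least-squares fit stands in for the paper's GMDH-type neural network.
    import numpy as np

    def fit_baseline(features: np.ndarray, price: np.ndarray) -> np.ndarray:
        X = np.column_stack([np.ones(len(features)), features])  # add intercept column
        coef, *_ = np.linalg.lstsq(X, price, rcond=None)          # least-squares solve
        return coef

    def predict_price(features: np.ndarray, coef: np.ndarray) -> np.ndarray:
        X = np.column_stack([np.ones(len(features)), features])
        return X @ coef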


Real Options Models In Real Estate, Jin Won Choi Nov 2011

Electronic Thesis and Dissertation Repository

Our aim in this thesis is to investigate the usefulness of real options analysis through case studies of problems in real estate. We consider the following three problems. First, we consider the valuation and usefulness of presale contracts of condominiums, which can be viewed as similar to call options on condominiums. Second, we consider the valuation of farm land from the perspective of land developers, who may think of farm land as being similar to call options on subdivision lots. Third, we consider the valuation of opportunities to install solar panels on properties, in …
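
For readers new to the analogy, the value of a plain European call option is a useful reference point for "option-like" contracts such as presales. A standard Black-Scholes call price is sketched below purely to illustrate the analogy; it is not the valuation model developed in the thesis, and the function name and parameters are illustrative.

    # Standard Black-Scholes price of a European call, shown only to illustrate
    # the call-option analogy; the thesis's real-options models are more involved.
    from math import exp, log, sqrt
    from statistics import NormalDist

    def bs_call(spot: float, strike: float, rate: float, vol: float, maturity: float) -> float:
        """spot: current asset value; strike: agreed purchase price; rate: risk-free
        rate; vol: volatility; maturity: time to expiry in years."""
        d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * maturity) / (vol * sqrt(maturity))
        d2 = d1 - vol * sqrt(maturity)
        N = NormalDist().cdf
        return spot * N(d1) - strike * exp(-rate * maturity) * N(d2)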


A Comparison Of Spatio-Temporal Prediction Methods Of Cancer Incidence In The U.S., Michelle Hamlyn Aug 2011

UNLV Theses, Dissertations, Professional Papers, and Capstones

Cancer is the cause of one out of four deaths in the United States, and in 2009, researchers expected over 1.5 million new patients to be diagnosed with some form of cancer. People diagnosed with cancer, whether a common or rare type, need to undergo treatments, the amount and kind of which will depend on the severity of the cancer. So how do healthcare providers know how much funding is needed for treatment? What would better enable a pharmaceutical company to determine how much to allocate for research and development of drugs, the amount of each drug to manufacture, or …


Variable Importance Analysis With The multiPIM R Package, Stephan J. Ritter, Nicholas P. Jewell, Alan E. Hubbard Jul 2011

U.C. Berkeley Division of Biostatistics Working Paper Series

We describe the R package multiPIM, including statistical background, functionality and user options. The package is for variable importance analysis, and is meant primarily for analyzing data from exploratory epidemiological studies, though it could certainly be applied in other areas as well. The approach taken to variable importance comes from the causal inference field, and is different from approaches taken in other R packages. By default, multiPIM uses a double robust targeted maximum likelihood estimator (TMLE) of a parameter akin to the attributable risk. Several regression methods/machine learning algorithms are available for estimating the nuisance parameters of the models, including …
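
For orientation, with a binary exposure A, an outcome Y, and covariates W, the attributable-risk-type target parameter used in this strand of the causal variable-importance literature can be written roughly as below; this is a generic statement of the idea, and the package documentation should be consulted for multiPIM's exact parameter definition.

    \psi = E_W\bigl[\,E(Y \mid A = 0, W)\,\bigr] - E(Y)

In words, it contrasts the mean outcome expected if the exposure were set to its reference level with the observed mean outcome.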


Analysis Of ROMS Estimated Posterior Error Utilizing 4DVAR Data Assimilation, Joseph Patrick Horton Jun 2011

Mathematics

The appropriateness of the approximate error calculated by the Regional Ocean Modeling System (ROMS) is analyzed using four-dimensional variational (4DVAR) data assimilation performed on a numerical model of the San Luis Obispo Bay. An effective method of sampling data to minimize the actual error associated with the assimilated numerical model is explored by using different data sampling methods. An idealized state of the SLO Bay region ("Real Run") is created to be used as the real ocean; a numerical model of this region is then created approximating the Real Run, which is known as the "Simulated State". By taking samples from …
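
The "Real Run" versus "Simulated State" setup is a twin experiment: because the truth is known, the actual error of any assimilated estimate can be measured directly. The toy Python sketch below illustrates only that evaluation logic, with a one-dimensional field and crude interpolation standing in for ROMS and 4DVAR; all names and numbers are illustrative.

    # Toy twin experiment: a known "truth" field is sampled two different ways,
    # a crude estimate is built from each sample set, and the actual RMS error
    # of each estimate against the truth is compared. Not ROMS or 4DVAR.
    import numpy as np

    rng = np.random.default_rng(0)
    truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 200))      # stand-in "Real Run"

    def rms_error(sample_idx: np.ndarray) -> float:
        obs = truth[sample_idx] + rng.normal(0.0, 0.05, len(sample_idx))  # noisy samples
        estimate = np.interp(np.arange(len(truth)), sample_idx, obs)      # crude "analysis"
        return float(np.sqrt(np.mean((estimate - truth) ** 2)))           # actual error

    even = np.linspace(0, len(truth) - 1, 20).astype(int)                 # regular sampling
    scattered = np.sort(rng.choice(len(truth), size=20, replace=False))   # random sampling
    print(rms_error(even), rms_error(scattered))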


Generalized Bathtub Hazard Models For Binary-Transformed Climate Data, James Polcer May 2011

Masters Theses & Specialist Projects

In this study, we use hazard-based modeling as an alternative statistical framework to time series methods as applied to climate data. Data collected from the Kentucky Mesonet will be used to study the distributional properties of the duration of high- and low-energy wind events relative to an arbitrary threshold. Our objectives were to fit bathtub models proposed in the literature, propose a generalized bathtub model, apply these models to Kentucky Mesonet data, and make recommendations as to the feasibility of wind power generation. Using two different thresholds (1.8 and 10 mph, respectively), results show that the Hjorth bathtub model consistently performed better …
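
For context, the Hjorth (1980) distribution mentioned above has a hazard of the general form below, in which an increasing linear term and a decreasing term combine to produce the bathtub shape; the parameter names follow common convention and may differ from the thesis's notation.

    h(t) = \delta t + \frac{\theta}{1 + \beta t}, \qquad t \ge 0,\quad \delta, \theta, \beta \ge 0

The \delta t term drives the rising right side of the bathtub while \theta/(1+\beta t) supplies the falling left side.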


UNLV Enrollment Forecasting, Sabrina Beckman, Stefan Cline, Monika Neda Apr 2011

Festival of Communities: UG Symposium (Posters)

Our project investigates the future enrollment of undergraduates at UNLV in the entire university, the College of Science, and the Department of Mathematical Sciences. The method used for the forecast is the well-known least-squares method, for which a mathematical description will be presented. Studies of the numerical error are pursued as well. The study will include graphs that describe the past and future behavior for different parameter settings. The mathematical results obtained show that the university will continue to grow given the current enrollment trends.
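
A minimal sketch of the least-squares trend fit described above is given below. The function is generic and takes the observed years and headcounts as inputs, since the actual UNLV enrollment figures are not reproduced in this listing; the function name is illustrative.

    # Generic least-squares trend fit and extrapolation; callers supply the
    # observed years and enrollment counts (no actual UNLV data is embedded).
    import numpy as np

    def forecast_enrollment(years, enrollment, future_years, degree=1):
        """Fit a polynomial trend of the given degree by least squares and
        evaluate it at future_years to obtain the forecast."""
        coeffs = np.polyfit(years, enrollment, deg=degree)   # least-squares fit
        return np.polyval(coeffs, future_years)              # extrapolated enrollment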