Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

All Articles in Physical Sciences and Mathematics

Construction And Improvement Of A Scheffler Reflector And Thermal Storage Device, Jason Rapp 2010 California Polytechnic State University - San Luis Obispo

Physics

We constructed and successfully tested a 2 m² parabolic dish solar concentrator (Scheffler Concentrator) to focus sunlight onto a stationary target. Current efforts aim to decrease the construction complexity and cost of the concentrator. To store solar heat, we also constructed and are testing a thermal storage device made of sand (for thermal mass) and pumice (for insulation). Preliminary tests indicate thermal retention times of many hours. Current efforts aim to increase accessible power and structural integrity.


Book Review: Stars Above, Earth Below: A Guide To Astronomy In The National Parks, T. D. Oswalt 2010 Florida Institute of Technology

Publications

This document is Dr. Oswalt’s review of Stars Above, Earth Below: A Guide to Astronomy in the National Parks by Tyler Nordgren. Springer/Praxis, 2010, 444p, ISBN 9781441916488, $29.95.


An Analytic Characterization Of Model Minimization In Factored Markov Decision Processes, Guo W., Tze-Yun Leong 2010 Singapore Management University

Research Collection School Of Information Systems

Model minimization in Factored Markov Decision Processes (FMDPs) is concerned with finding the most compact partition of the state space such that all states in the same block are action-equivalent. This is an important problem because it can potentially transform a large FMDP into an equivalent but much smaller one, whose solution can be readily used to solve the original model. Previous model minimization algorithms are iterative in nature, making opaque the relationship between the input model and the output partition. We demonstrate that given a set of well-defined concepts and operations on partitions, we can express the model minimization ...
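
As a concrete illustration of the block condition described above, the following is a minimal sketch (not the paper's algorithm) that checks whether two states of a small explicit MDP are action-equivalent with respect to a candidate partition: they must yield the same immediate reward and the same aggregated transition probability into every block, for every action. The toy MDP, state names, and helper functions are invented for illustration.

```python
# Minimal sketch: checking action-equivalence of two states w.r.t. a partition.
# The toy MDP and the helpers below are illustrative, not the paper's method.

# Transition model: P[state][action] -> {next_state: probability}
P = {
    "s0": {"a": {"s2": 1.0}, "b": {"s0": 0.5, "s2": 0.5}},
    "s1": {"a": {"s3": 1.0}, "b": {"s1": 0.5, "s3": 0.5}},
    "s2": {"a": {"s2": 1.0}, "b": {"s2": 1.0}},
    "s3": {"a": {"s3": 1.0}, "b": {"s3": 1.0}},
}
# Immediate rewards: R[state][action]
R = {
    "s0": {"a": 0.0, "b": 0.0},
    "s1": {"a": 0.0, "b": 0.0},
    "s2": {"a": 1.0, "b": 1.0},
    "s3": {"a": 1.0, "b": 1.0},
}
# Candidate partition of the state space into blocks.
partition = [{"s0", "s1"}, {"s2", "s3"}]


def block_probability(state, action, block):
    """Total probability of moving from `state` under `action` into `block`."""
    return sum(p for nxt, p in P[state][action].items() if nxt in block)


def action_equivalent(s, t):
    """True if s and t have equal rewards and equal block-transition probabilities."""
    for action in P[s]:
        if R[s][action] != R[t][action]:
            return False
        for block in partition:
            if abs(block_probability(s, action, block)
                   - block_probability(t, action, block)) > 1e-12:
                return False
    return True


print(action_equivalent("s0", "s1"))  # True: the two states can share a block
print(action_equivalent("s0", "s2"))  # False: rewards differ
```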


Low Temperature Photo-Oxidation Of Chloroperoxidase Compound II, Xinting Yuan, Xin Sheng, John A. Horner, Brian Bennett, Leslie W.-M. Fung, Martin Newcomb 2010 University of Illinois at Chicago

Physics Faculty Research and Publications

Oxidation of the heme-thiolate enzyme chloroperoxidase (CPO) from Caldariomyces fumago with peroxynitrite (PN) gave the Compound II intermediate, which was photo-oxidized with 365 nm light to give a reactive oxidizing species. Cryo-solvents at pH ≈ 6 were employed, and reactions were conducted at temperatures as low as − 50 °C. The activity of CPO as evaluated by the chlorodimedone assay was unaltered by treatment with PN or by production of the oxidizing transient and subsequent reaction with styrene. EPR spectra at 77 K gave the amount of ferric protein at each stage in the reaction sequence. The PN oxidation step gave a ...


CMMI For Development, Version 1.3, CMMI Product Team 2010 Carnegie Mellon University

Software Engineering Institute

CMMI® (Capability Maturity Model® Integration) models are collections of best practices that help organizations to improve their processes. These models are developed by product teams with members from industry, government, and the Carnegie Mellon® Software Engineering Institute (SEI). This model, called CMMI for Development (CMMI-DEV), provides a comprehensive integrated set of guidelines for developing products and services.


Rapidity And Centrality Dependence Of Azimuthal Correlations In Deuteron-Gold Collisions At RHIC, Kirill Tuchin 2010 Iowa State University

Physics and Astronomy Publications

We calculate azimuthal correlations in dAu collisions at different rapidities and centralities and argue that the experimentally observed depletion of the back-to-back peak can be quantitatively explained by gluon saturation in the Color Glass Condensate of the gold nucleus.


The Not-So-Quiet Revolution: Cautionary Comments On The Rejection Of Hypothesis Testing In Favor Of A “Causal” Modeling Alternative, Daniel H. Robinson, Joel R. Levin 2010 University of Texas

Journal of Modern Applied Statistical Methods

Rodgers (2010) recently applauded a revolution involving the increased use of statistical modeling techniques. It is argued here that such use may have a downside, citing empirical evidence from educational psychology that modeling techniques are often applied in cross-sectional, correlational studies to produce unjustified causal conclusions and prescriptive statements.


Notes On Hypothesis Testing Under A Single-Stage Design In Phase II Trial, Kung-Jong Lui 2010 San Diego State University

Journal of Modern Applied Statistical Methods

A primary objective of a phase II trial is to determine whether future development of a new treatment is warranted, based on whether it has sufficient activity against a specified type of tumor. Limitations exist in the commonly-used hypothesis setting and the standard test procedure for a phase II trial. This study reformulates the hypothesis setting to mirror the clinical decision process in practice. Under the proposed hypothesis setting, the critical points and the minimum required sample size for a desired power of finding a superior treatment at a given α-level are presented. An example is provided to illustrate how ...
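
For context, the standard single-stage design the abstract refers to is typically an exact binomial test: find the smallest sample size n and critical value r such that the trial rejects the null response rate p0 with probability at most α yet detects a target rate p1 with power at least 1 − β. The sketch below implements that conventional setting, not the reformulated hypotheses proposed in the article; the rates p0, p1, α, and β are illustrative.

```python
# Conventional single-stage phase II design via an exact binomial search.
# This reflects the standard setting, not the reformulated hypotheses in the article.
from scipy.stats import binom


def single_stage_design(p0, p1, alpha=0.05, beta=0.20, max_n=200):
    """Smallest (n, r): reject H0 if responses >= r, with size <= alpha and power >= 1 - beta."""
    for n in range(1, max_n + 1):
        for r in range(n + 1):
            size = 1.0 - binom.cdf(r - 1, n, p0)    # P(X >= r | p0)
            power = 1.0 - binom.cdf(r - 1, n, p1)   # P(X >= r | p1)
            if size <= alpha and power >= 1.0 - beta:
                return n, r, size, power
    return None


n, r, size, power = single_stage_design(p0=0.20, p1=0.40)
print(f"n = {n}, reject if responses >= {r} (size {size:.3f}, power {power:.3f})")
```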


Generalized Variances Ratio Test For Comparing K Covariance Matrices From Dependent Normal Populations, Marcelo Angelo Cirillo, Daniel Furtado Ferreira, Thelma Sáfadi, Eric Batista Ferreira 2010 Federal University of Lavras, Brazil

Journal of Modern Applied Statistical Methods

New tests based on the ratio of generalized variances are presented to compare covariance matrices from dependent normal populations. Monte Carlo simulation showed that the tests considered controlled the Type I error rate, yielding empirical probabilities consistent with the stipulated nominal level.
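
By way of illustration, the generalized variance of a multivariate sample is the determinant of its covariance matrix, so the statistics the abstract refers to are built from ratios of such determinants. The sketch below, on made-up correlated data, merely computes generalized variances for two dependent samples and their ratio; it is not the authors' test procedure.

```python
# Illustrative computation of generalized variances (determinants of covariance
# matrices) and their ratio; this is not the authors' full test procedure.
import numpy as np

rng = np.random.default_rng(0)

# Two dependent trivariate samples built from the same underlying subjects.
n, p = 50, 3
base = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)
x1 = base + 0.1 * rng.standard_normal((n, p))
x2 = 0.8 * base + 0.1 * rng.standard_normal((n, p))

gv1 = np.linalg.det(np.cov(x1, rowvar=False))   # generalized variance of sample 1
gv2 = np.linalg.det(np.cov(x2, rowvar=False))   # generalized variance of sample 2

print(f"|S1| = {gv1:.4g}, |S2| = {gv2:.4g}, ratio = {gv1 / gv2:.3f}")
```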


Reducing Selection Bias In Analyzing Longitudinal Health Data With High Mortality Rates, Xian Liu, Charles C. Engel, Han Kang, Kristie L. Gore 2010 Uniformed Services University of the Health Sciences, Bethesda MD and Walter Reed National Military Medical Center, Bethesda MD

Journal of Modern Applied Statistical Methods

Two longitudinal regression models, one parametric and one nonparametric, are developed to reduce selection bias when analyzing longitudinal health data with high mortality rates. The parametric mixed model is a two-step linear regression approach, whereas the nonparametric mixed-effects regression model uses a retransformation method to handle random errors across time.


Recommended Sample Size For Conducting Exploratory Factor Analysis On Dichotomous Data, Robert H. Pearson, Daniel J. Mundform 2010 University of Northern Colorado

Journal of Modern Applied Statistical Methods

Minimum sample sizes are recommended for conducting exploratory factor analysis on dichotomous data. A Monte Carlo simulation was conducted, varying the level of communalities, number of factors, variable-to-factor ratio and dichotomization threshold. Sample sizes were identified based on congruence between rotated population and sample factor loadings.
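
The congruence measure mentioned here is commonly Tucker's coefficient of congruence between a population loading vector and the corresponding sample loading vector; that the authors used exactly this index is an assumption. A minimal sketch of the index itself, on illustrative loadings:

```python
# Minimal sketch of a factor-congruence check (Tucker's coefficient of congruence).
# The loading vectors are illustrative, and this index is assumed, not taken
# from the article.
import numpy as np


def congruence(x, y):
    """Tucker's coefficient of congruence between two loading vectors."""
    return float(np.dot(x, y) / np.sqrt(np.dot(x, x) * np.dot(y, y)))


population_loadings = np.array([0.70, 0.65, 0.60, 0.55])
sample_loadings = np.array([0.72, 0.60, 0.63, 0.50])

print(f"congruence = {congruence(population_loadings, sample_loadings):.3f}")
```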


On Scientific Research: The Role Of Statistical Modeling And Hypothesis Testing, Lisa L. Harlow 2010 University of Rhode Island

Journal of Modern Applied Statistical Methods

Comments on Rodgers (2010a, 2010b) and Robinson and Levin (2010) are presented. Rodgers (2010a) initially reported on a growing trend towards more mathematical and statistical modeling and a move away from null hypothesis significance testing (NHST). He defended and clarified those views in his sequel. Robinson and Levin argued against the perspective espoused by Rodgers and called for more research using experimentally manipulated interventions and less emphasis on correlational research and ill-founded prescriptive statements. In this response, the goal of science and major scientific approaches are discussed, as well as their strengths and shortcomings. Consideration is given to how their ...


Effect Of Measurement Errors On The Separate And Combined Ratio And Product Estimators In Stratified Random Sampling, Housila P. Singh, Namrata Karpe 2010 Vikram University, Ujjain, India

Journal of Modern Applied Statistical Methods

Separate and combined ratio, product, and difference estimators are introduced for the population mean μY of a study variable Y using an auxiliary variable X in stratified sampling when the observations are contaminated with measurement errors. The bias and mean squared error of the proposed estimators are derived under a large-sample approximation and their properties are analyzed. Generalized versions of these estimators are given along with their properties.
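
For orientation, the classical (error-free) forms of the ratio and product estimators of a population mean are ȳ_R = ȳ(X̄/x̄) and ȳ_P = ȳ(x̄/X̄), where X̄ is the known population mean of the auxiliary variable. The sketch below computes these textbook versions for a single stratum with made-up data; the measurement-error adjustments and the separate/combined stratified forms studied in the article are not reproduced here.

```python
# Classical ratio and product estimators for a single stratum (textbook forms);
# the measurement-error corrections studied in the article are not included.
import numpy as np

rng = np.random.default_rng(1)

X_bar_pop = 10.0                        # known population mean of auxiliary variable X
x = rng.normal(X_bar_pop, 2.0, 30)      # observed auxiliary values (illustrative)
y = 2.0 * x + rng.normal(0.0, 1.0, 30)  # study variable correlated with X

y_bar, x_bar = y.mean(), x.mean()

ratio_estimate = y_bar * (X_bar_pop / x_bar)    # ratio estimator of the mean of Y
product_estimate = y_bar * (x_bar / X_bar_pop)  # product estimator of the mean of Y

print(f"sample mean = {y_bar:.2f}, ratio = {ratio_estimate:.2f}, product = {product_estimate:.2f}")
```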


Maximum Downside Semi Deviation Stochastic Programming For Portfolio Optimization Problem, Anton Abdulbasah Kamil, Khlipah Ibrahim 2010 Universiti Sains Malaysia, Penang, Malaysia

Journal of Modern Applied Statistical Methods

Portfolio optimization is an important research field in financial decision making. The chief characteristic of such optimization problems is the uncertainty of future returns, so probabilistic methods are used alongside optimization techniques. Markowitz (1952, 1959) introduced the concept of risk into the problem and used a mean-variance model to identify risk with the volatility (variance) of the random objective. The mean-risk optimization paradigm has since been expanded extensively, both theoretically and computationally. Single-stage and two-stage stochastic programming models with recourse are presented for risk-averse investors with the objective of minimizing the maximum downside semideviation. The models employ the ...
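
As a point of reference, the downside semideviation of a portfolio return in scenario t can be read as the shortfall max(0, expected return − return in t), and a "maximum downside semideviation" objective penalizes the worst such shortfall over scenarios. The sketch below only evaluates that risk measure for a fixed portfolio on made-up scenario returns; embedding it as the objective of a single-stage or two-stage stochastic program, as the article does, is not shown, and the exact objective used there may differ.

```python
# Evaluating the maximum downside semideviation of a fixed portfolio over a set
# of return scenarios (illustrative data; the article's stochastic programs are
# not reproduced here).
import numpy as np

# Scenario returns: rows are scenarios, columns are assets (made-up numbers).
scenario_returns = np.array([
    [0.04,  0.01],
    [-0.02, 0.03],
    [0.01, -0.01],
    [0.03,  0.02],
])
weights = np.array([0.6, 0.4])          # fixed portfolio weights summing to 1

portfolio = scenario_returns @ weights  # portfolio return in each scenario
expected = portfolio.mean()             # expected portfolio return over scenarios

shortfalls = np.maximum(0.0, expected - portfolio)  # downside deviation per scenario
max_downside_semideviation = shortfalls.max()       # worst-case shortfall below the mean

print(f"expected return = {expected:.4f}, "
      f"max downside semideviation = {max_downside_semideviation:.4f}")
```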


A GA-Based Sales Forecasting Model Incorporating Promotion Factors, Li-Chih Wang, Chin-Lien Wang 2010 Tunghai University, Taichung, Taiwan ROC

Journal of Modern Applied Statistical Methods

Because promotions are critical factors highly related to product sales at consumer packaged goods (CPG) companies, sales forecasts for CPG products must take promotions into consideration. Decomposition regression incorporating contextual factors offers a method for exploiting both the reliability of statistical forecasting and the flexibility of judgmental forecasting employing domain knowledge. However, it suffers from collinearity, which causes poor performance in variable identification and parameter estimation with traditional ordinary least squares (OLS). Empirical research evidence shows that, in the case of collinearity, in variable identification, parameter estimation, and out-of-sample forecasting, genetic algorithms (GA) as an estimator outperform OLS consistently ...
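
To make the "GA as estimator" idea concrete, the following is a minimal sketch, not the authors' promotion-aware forecasting model, that fits regression coefficients by minimizing the residual sum of squares with a simple genetic algorithm and compares them to OLS on collinear, made-up data.

```python
# Minimal sketch: estimating regression coefficients with a simple genetic
# algorithm versus OLS on collinear data. Illustrative only; this is not the
# authors' promotion-aware forecasting model.
import numpy as np

rng = np.random.default_rng(2)

# Collinear design matrix: x2 is nearly a copy of x1.
n = 60
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 3.0 * x2 + 0.5 * rng.standard_normal(n)


def sse(beta):
    """Residual sum of squares for a candidate coefficient vector."""
    resid = y - X @ beta
    return float(resid @ resid)


def ga_fit(pop_size=80, generations=300, sigma=0.2):
    """Very small GA: tournament selection, blend crossover, Gaussian mutation."""
    pop = rng.uniform(-5, 5, size=(pop_size, X.shape[1]))
    for _ in range(generations):
        fitness = np.array([sse(ind) for ind in pop])
        children = []
        for _ in range(pop_size):
            i, j = rng.integers(0, pop_size, 2)
            parent_a = pop[i] if fitness[i] < fitness[j] else pop[j]   # tournament
            k, l = rng.integers(0, pop_size, 2)
            parent_b = pop[k] if fitness[k] < fitness[l] else pop[l]
            w = rng.random()
            child = w * parent_a + (1 - w) * parent_b                  # blend crossover
            children.append(child + sigma * rng.standard_normal(child.shape))
        pop = np.vstack([pop[np.argsort(fitness)][: pop_size // 2],    # keep best half
                         np.array(children)[: pop_size // 2]])
    fitness = np.array([sse(ind) for ind in pop])
    return pop[np.argmin(fitness)]


beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_ga = ga_fit()
print("OLS:", np.round(beta_ols, 2), " GA:", np.round(beta_ga, 2))
```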


A General Class Of Chain-Type Estimators In The Presence Of Non-Response Under Double Sampling Scheme, Sunil Kumar, Housila P. Singh, Sandeep Bhougal 2010 University of Jammu, (J & K), India

Journal of Modern Applied Statistical Methods

A general class of chain ratio-type estimators for estimating the population mean of a study variable is examined in the presence of non-response under a double sampling scheme using a factor-type estimator (FTE). Properties of the suggested estimators are studied and compared to those of existing estimators. An empirical study is carried out to demonstrate the performance of the suggested estimators; empirical results support the theoretical study.


Robust Estimators In Logistic Regression: A Comparative Simulation Study, Sanizah Ahmad, Norazan Mohamed Ramli, Habshah Midi 2010 saniz924@salam.uitm.edu.my

Journal of Modern Applied Statistical Methods

The maximum likelihood estimator (MLE) is commonly used to estimate the parameters of logistic regression models due to its efficiency under a parametric model. However, evidence has shown that outliers can have an undue effect on the MLE's parameter estimates. Robust methods have been put forward to rectify this problem. This article examines the performance of the MLE and four existing robust estimators under different outlier patterns, which are investigated using real data sets and Monte Carlo simulation.
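
As background for the comparison, a quick way to see the problem the robust estimators address is to fit a logistic regression by maximum likelihood with and without a few mislabeled, high-leverage points and watch the coefficients move. The sketch below does only that, using plain IRLS for the MLE; the four robust estimators examined in the article are not implemented here, and all data are simulated for illustration.

```python
# Illustration of MLE (via IRLS) sensitivity to outliers in logistic regression.
# The robust estimators compared in the article are not implemented here.
import numpy as np

rng = np.random.default_rng(3)


def logistic_mle(X, y, iterations=50):
    """Fit logistic regression by iteratively reweighted least squares (the MLE)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iterations):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        # Newton / IRLS step: beta += (X' W X)^{-1} X'(y - p)
        beta = beta + np.linalg.solve((X.T * W) @ X, X.T @ (y - p))
    return beta


# Clean simulated data with true coefficients (intercept 0, slope 2).
n = 200
x = rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-2.0 * x))).astype(float)

beta_clean = logistic_mle(X, y)

# Add a few mislabeled, high-leverage points (outliers in the x-direction).
x_out = np.concatenate([x, np.full(5, 6.0)])
y_out = np.concatenate([y, np.zeros(5)])        # large x but labeled 0
X_out = np.column_stack([np.ones(x_out.size), x_out])

beta_contaminated = logistic_mle(X_out, y_out)

print("MLE on clean data:        ", np.round(beta_clean, 2))
print("MLE with 5 outliers added:", np.round(beta_contaminated, 2))
```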


Adjusted Confidence Interval For The Population Median Of The Exponential Distribution, Moustafa Omar Ahmed Abu-Shawiesh 2010 Hashemite University, Zarqa Jordan

Journal of Modern Applied Statistical Methods

The median confidence interval is useful for one-parameter families, such as the exponential distribution, and it may not need to be adjusted if censored observations are present. In this article, two estimators for the median of the exponential distribution, MD, are considered and compared, based on the sample median and the maximum likelihood method. The first estimator is the sample median, MD1, and the second estimator is the maximum likelihood estimator of the median, MDMLE. Both estimators are used to propose a modified confidence interval for the population median of the exponential distribution, MD. Monte Carlo simulations ...
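
For reference, if the exponential distribution is parameterized by its mean θ, its median is MD = θ ln 2, so the sample median and x̄ ln 2 (the MLE of the median, by invariance) are the two point estimators compared; an exact interval for θ can be built from the fact that 2nX̄/θ follows a chi-square distribution with 2n degrees of freedom. The sketch below computes both point estimates and that chi-square based interval on simulated data; the specific adjusted interval proposed in the article is not reproduced.

```python
# Point estimates and an exact chi-square based confidence interval for the
# median of an exponential distribution (simulated data). The article's
# adjusted interval is not reproduced here.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)

theta_true = 5.0                      # true mean of the exponential
n = 40
sample = rng.exponential(theta_true, n)

md1 = np.median(sample)               # estimator 1: the sample median
md_mle = sample.mean() * np.log(2.0)  # estimator 2: MLE of the median = x̄ ln 2

# Exact CI for theta from 2*n*x̄/theta ~ chi-square(2n), scaled by ln 2 for the median.
alpha = 0.05
lower = 2 * n * sample.mean() / chi2.ppf(1 - alpha / 2, 2 * n) * np.log(2.0)
upper = 2 * n * sample.mean() / chi2.ppf(alpha / 2, 2 * n) * np.log(2.0)

print(f"true median = {theta_true * np.log(2.0):.2f}")
print(f"MD1 = {md1:.2f}, MDMLE = {md_mle:.2f}, 95% CI = ({lower:.2f}, {upper:.2f})")
```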


Neighbor Balanced Block Designs For Two Factors, Seema Jaggi, Cini Varghese, N. R. Abeynayake 2010 Indian Agricultural Statistics Research Institute, New Delhi, India

Journal of Modern Applied Statistical Methods

The concept of Neighbor Balanced Block (NBB) designs is defined for the experimental situation where the treatments are combinations of levels of two factors and only one of the factors exhibits a neighbor effect. Methods of constructing complete NBB designs for two factors in a plot that is strongly neighbor balanced for one factor are obtained. These designs are variance balanced for estimating the direct effects of contrasts pertaining to combinations of levels of both the factors. An incomplete NBB design for two factors is also presented and is found to be partially variance balanced with three associate classes.


Bayesian Analysis For Component Manufacturing Processes, L. V. Nandakishore 2010 Dr. M. G. R. University, Chennai

Journal of Modern Applied Statistical Methods

In manufacturing processes, various machines are used to produce the same product. Based on the age, make, etc., of the machines, the output may not always follow the same distribution. An attempt is made to introduce Bayesian techniques for a two-machine problem. Two cases are presented in this article.
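
As a small illustration of the kind of Bayesian reasoning involved, the sketch below computes the posterior probability that an item came from each of two machines, given a prior on the machines (their share of production) and a normal likelihood for the measured dimension. The distributions, priors, and numbers are invented, and the article's specific model is not reproduced.

```python
# Posterior probability that an item was produced by machine A or machine B,
# given its measured dimension. Priors, likelihoods, and numbers are invented
# for illustration; this is not the article's specific model.
from scipy.stats import norm

# Prior: machine A produces 60% of output, machine B 40% (assumed).
prior = {"A": 0.6, "B": 0.4}

# Assumed output distributions for the measured dimension (mm) of each machine.
likelihood = {"A": norm(loc=10.0, scale=0.10), "B": norm(loc=10.1, scale=0.25)}

measurement = 10.18  # observed dimension of one item (mm)

# Bayes' rule: posterior ∝ prior × likelihood, then normalize.
unnormalized = {m: prior[m] * likelihood[m].pdf(measurement) for m in prior}
total = sum(unnormalized.values())
posterior = {m: unnormalized[m] / total for m in unnormalized}

print(posterior)
```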

