Open Access. Powered by Scholars. Published by Universities.®

Statistical Models Commons



Full-Text Articles in Statistical Models

Identification Of Informativeness In Text Using Natural Language Stylometry, Rushdi Shams Aug 2014

Electronic Thesis and Dissertation Repository

In this age of information overload, one experiences a rapidly growing over-abundance of written text. To assist with handling this bounty, this plethora of texts is now widely used to develop and optimize statistical natural language processing (NLP) systems. Surprisingly, the use of more fragments of text to train these statistical NLP systems may not necessarily lead to improved performance. We hypothesize that those fragments that help the most with training are those that contain the desired information. Therefore, determining informativeness in text has become a central issue in our view of NLP. Recent developments in this field have spawned …


Spatiotemporal Crime Analysis, James Q. Tay, Abish Malik, Sherry Towers, David Ebert Aug 2014

The Summer Undergraduate Research Fellowship (SURF) Symposium

There has been a rise in the use of visual analytic techniques to create interactive predictive environments in a range of different applications. These tools help the user sift through massive amounts of data, presenting the most useful results in a visual context and enabling the user to rapidly form proactive strategies. In this paper, we present one such visual analytic environment that uses historical crime data to predict future occurrences of crimes, both geographically and temporally. Due to the complexity of this analysis, it is necessary to find an appropriate statistical method for correlative analysis of spatiotemporal data, as well …
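
The abstract is cut off before naming the statistical method the authors chose, but a common baseline for this kind of spatiotemporal correlative analysis is a kernel-weighted hotspot score that discounts past incidents by distance and recency. A minimal Python sketch, with all data and parameters invented for illustration:

    import numpy as np

    # Toy space-time hotspot score: weight past incidents by spatial
    # proximity (Gaussian kernel) and recency (exponential decay).
    def hotspot_score(grid_xy, incidents, now, h=0.5, tau=14.0):
        scores = np.zeros(len(grid_xy))
        for (x, y, t) in incidents:
            d2 = np.sum((grid_xy - np.array([x, y])) ** 2, axis=1)
            scores += np.exp(-d2 / (2 * h ** 2)) * np.exp(-(now - t) / tau)
        return scores

    rng = np.random.default_rng(3)
    incidents = [(rng.uniform(0, 10), rng.uniform(0, 10), rng.uniform(0, 60))
                 for _ in range(200)]                      # (x, y, day) triples
    cells = np.array([(i + 0.5, j + 0.5) for i in range(10) for j in range(10)])
    risk = hotspot_score(cells, incidents, now=60.0)
    print("highest-risk cell:", cells[np.argmax(risk)])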


Measuring Security: A Challenge For The Generation, Janusz Zalewski, Steven Drager, William Mckeever, Andrew J. Kornecki Jan 2014

Department of Electrical Engineering and Computer Science - Daytona Beach

This paper presents an approach to measuring computer security understood as a system property, in the category of similar properties, such as safety, reliability, dependability, resilience, etc. First, a historical discussion of measurements is presented, beginning with views of Hermann von Helmholtz in his 19th century work “Zählen und Messen”. Then, contemporary approaches related to the principles of measuring software properties are discussed, with emphasis on statistical, physical and software models. A distinction between metrics and measures is made to clarify the concepts. A brief overview of inadequacies of methods and techniques to evaluate computer security is presented, followed by …


Changing Minds To Changing The World: Mapping The Spectrum Of Intent In Data Visualization And Data Arts, Scott Murray Jan 2014

Art + Architecture

No abstract provided.


Scalable Collaborative Filtering Recommendation Algorithms On Apache Spark, Walker Evan Casey Jan 2014

CMC Senior Theses

Collaborative filtering based recommender systems use information about a user's preferences to make personalized predictions about content, such as topics, people, or products, that they might find relevant. As the volume of accessible information and active users on the Internet continues to grow, it becomes increasingly difficult to compute recommendations quickly and accurately over a large dataset. In this study, we will introduce an algorithmic framework built on top of Apache Spark for parallel computation of the neighborhood-based collaborative filtering problem, which allows the algorithm to scale linearly with a growing number of users. We also investigate several different variants …
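
As a rough illustration of the neighborhood-based formulation (not the thesis's actual implementation, which is engineered to scale linearly rather than comparing all pairs), a toy PySpark sketch of user-user cosine similarity might look like this, with the ratings data invented:

    from pyspark import SparkContext

    sc = SparkContext("local", "neighborhood-cf")

    # (user, item, rating) triples; toy data standing in for a real dataset
    ratings = sc.parallelize([
        ("u1", "a", 4.0), ("u1", "b", 5.0),
        ("u2", "a", 5.0), ("u2", "c", 3.0),
        ("u3", "b", 4.0), ("u3", "c", 2.0),
    ])

    # Group each user's ratings into a sparse vector: user -> {item: rating}
    user_vecs = ratings.map(lambda r: (r[0], (r[1], r[2]))) \
                       .groupByKey() \
                       .mapValues(dict)

    def cosine(u, v):
        """Cosine similarity between two sparse rating vectors."""
        common = set(u) & set(v)
        if not common:
            return 0.0
        num = sum(u[i] * v[i] for i in common)
        den = (sum(x * x for x in u.values()) ** 0.5) * \
              (sum(x * x for x in v.values()) ** 0.5)
        return num / den

    # All distinct user pairs with their similarity
    pairs = user_vecs.cartesian(user_vecs) \
                     .filter(lambda p: p[0][0] < p[1][0]) \
                     .map(lambda p: ((p[0][0], p[1][0]), cosine(p[0][1], p[1][1])))
    print(pairs.collect())

Predictions then weight each neighbor's rating by this similarity; the engineering challenge such a thesis addresses is avoiding the quadratic cartesian step shown here.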


Caimans - Semantic Platform For Advance Content Mining (Sketch Wp), Salvo Reina Jul 2013

Salvo Reina

A middleware software platform was created for the automatic classification of textual content. The requirements worksheet and the original flow sketches are published.


Iterative Statistical Verification Of Probabilistic Plans, Colin M. Potts May 2013

Lawrence University Honors Projects

Artificial intelligence seeks to create intelligent agents. An agent can be anything: an autopilot, a self-driving car, a robot, a person, or even an anti-virus system. While the current state of the art may not achieve intelligence (a rather dubious thing to quantify), it certainly achieves a sense of autonomy. A key aspect of an autonomous system is its ability to maintain and guarantee safety, defined as avoiding some set of undesired outcomes. The piece of software responsible for this is called a planner, which is essentially an automated problem solver. An advantage computer planners have over humans is their ability to consider and …
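
The abstract is cut off before the method details, but statistical verification of probabilistic plans is typically done by Monte Carlo simulation paired with a sequential hypothesis test such as Wald's SPRT, which stops sampling as soon as the evidence is decisive. A minimal sketch, with the plan simulator and all thresholds hypothetical:

    import math
    import random

    def simulate_plan():
        """Stand-in for one execution of a probabilistic plan.
        Returns True when the run avoids all undesired outcomes.
        (Hypothetical: a real run would step through the plan's actions.)"""
        return random.random() < 0.97   # assume the plan is safe 97% of the time

    def sprt(theta0=0.95, theta1=0.90, alpha=0.01, beta=0.01, max_runs=100_000):
        """Wald's SPRT: decide between H0 (safety prob >= theta0) and
        H1 (safety prob <= theta1) with error rates alpha and beta."""
        lower = math.log(beta / (1 - alpha))
        upper = math.log((1 - beta) / alpha)
        llr = 0.0                                  # log-likelihood ratio, H1 vs H0
        for n in range(1, max_runs + 1):
            safe = simulate_plan()
            llr += math.log((theta1 if safe else 1 - theta1) /
                            (theta0 if safe else 1 - theta0))
            if llr <= lower:
                return "safety requirement verified", n
            if llr >= upper:
                return "safety requirement refuted", n
        return "inconclusive", max_runs

    print(sprt())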


Modeling A Sensor To Improve Its Efficacy, Nabin K. Malakar, Daniil Gladkov, Kevin H. Knuth May 2013

Physics Faculty Scholarship

Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be extremely expensive and cost-prohibitive. Thus many robotic systems must make do with lower-quality sensors. Here we demonstrate via a case study how modeling a sensor can improve its efficacy when employed within a Bayesian inferential framework. As a test bed we employ a robotic arm that is designed to autonomously take its own measurements using an inexpensive LEGO light sensor to estimate the position and radius of a white circle on a black field. The light sensor integrates the light arriving from a …
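
To make the Bayesian setup concrete: given a forward model predicting the light reading at a probe point for candidate circle parameters, the posterior over centre and radius can be evaluated directly. A toy grid-search sketch with a hypothetical sensor model and noise level (the paper itself additionally selects where to measure next, which this sketch does not attempt):

    import numpy as np

    # Hypothetical forward model: expected reading is bright inside the
    # white circle, dark outside, with Gaussian measurement noise.
    def expected(x, y, cx, cy, r):
        return 1.0 if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2 else 0.2

    rng = np.random.default_rng(0)
    true_cx, true_cy, true_r, sigma = 0.4, 0.6, 0.25, 0.1

    # Noisy measurements at scattered probe locations
    pts = rng.uniform(0, 1, size=(60, 2))
    obs = np.array([expected(x, y, true_cx, true_cy, true_r) for x, y in pts])
    obs += rng.normal(0, sigma, size=obs.shape)

    # Posterior over a coarse parameter grid (flat prior): keep the MAP
    grid = np.linspace(0.05, 0.95, 31)
    best, best_ll = None, -np.inf
    for cx in grid:
        for cy in grid:
            for r in np.linspace(0.05, 0.5, 16):
                pred = np.array([expected(x, y, cx, cy, r) for x, y in pts])
                ll = -np.sum((obs - pred) ** 2) / (2 * sigma ** 2)
                if ll > best_ll:
                    best, best_ll = (cx, cy, r), ll
    print("MAP estimate of (cx, cy, r):", best)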


Automating Large-Scale Simulation Calibration To Real-World Sensor Data, Richard Everett Edwards May 2013

Doctoral Dissertations

Many key decisions and design policies are made using sophisticated computer simulations. However, these sophisticated computer simulations have several major problems. The two main issues are 1) gaps between the simulation model and the actual structure, and 2) limitations of the modeling engine's capabilities. This dissertation's goal is to address these simulation deficiencies by presenting a general automated process for tuning simulation inputs such that simulation output matches real world measured data. The automated process involves the following key components -- 1) Identify a model that accurately estimates the real world simulation calibration target from measured sensor data; 2) Identify …
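
In outline, the tuning task is black-box optimization: adjust simulation inputs until the simulated output matches measured sensor data. A minimal sketch using a stand-in simulator and SciPy's Nelder-Mead optimizer (the dissertation's models and calibration machinery are far more elaborate):

    import numpy as np
    from scipy.optimize import minimize

    # Stand-in simulator: maps two tunable inputs to a time series.
    # In the dissertation's setting this would be a full simulation engine.
    def simulate(params, t):
        a, b = params
        return a * np.sin(t) + b * t

    t = np.linspace(0, 10, 200)
    measured = simulate([1.3, 0.07], t) + \
        np.random.default_rng(1).normal(0, 0.05, t.size)   # "sensor" data

    # Calibration: choose inputs so simulated output matches the sensor data
    loss = lambda p: np.mean((simulate(p, t) - measured) ** 2)
    result = minimize(loss, x0=[1.0, 0.0], method="Nelder-Mead")
    print("calibrated inputs:", result.x)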


Retrieval Of Sub-Pixel-Based Fire Intensity And Its Application For Characterizing Smoke Injection Heights And Fire Weather In North America, David Peterson Sep 2012

Department of Earth and Atmospheric Sciences: Dissertations, Theses, and Student Research

For over two decades, satellite sensors have provided the locations of global fire activity with ever-increasing accuracy. However, the ability to measure fire intensity, known as fire radiative power (FRP), and its potential relationships to meteorology and smoke plume injection heights are currently limited by the pixel resolution. This dissertation describes the development of a new, sub-pixel-based FRP calculation (FRPf) for fire pixels detected by the MODerate Resolution Imaging Spectroradiometer (MODIS) fire detection algorithm (Collection 5), which is subsequently applied to several large wildfire events in North America. The methodology builds on an earlier bi-spectral algorithm for retrieving sub-pixel …
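
For context, the classic bi-spectral (Dozier-style) retrieval that such sub-pixel methods build on treats each fire pixel's radiance in two infrared channels as a mixture of fire and background:

    L_{\mathrm{MIR}} = p\,B(\lambda_{\mathrm{MIR}}, T_f) + (1 - p)\,B(\lambda_{\mathrm{MIR}}, T_b)
    L_{\mathrm{TIR}} = p\,B(\lambda_{\mathrm{TIR}}, T_f) + (1 - p)\,B(\lambda_{\mathrm{TIR}}, T_b)

where B is the Planck function, p is the sub-pixel fire area fraction, T_f the fire temperature, and T_b the background temperature; solving the pair for p and T_f yields the sub-pixel quantities from which an FRP estimate follows.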


The Interacting Multiple Models Algorithm With State-Dependent Value Assignment, Rastin Rastgoufard May 2012

University of New Orleans Theses and Dissertations

The value of a state is a measure of its worth, so that, for example, waypoints have high value and regions inside obstacles have very small value. We propose two methods of incorporating world information as state-dependent modifications to the interacting multiple models (IMM) algorithm, and then we use a game's player-controlled trajectories as ground truths to compare the normal IMM algorithm to versions with our proposed modifications. The two methods involve modifying the model probabilities in the update step and modifying the transition probability matrix in the mixing step based on the assigned values of different target states. …
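
The mixing step mentioned here is where the Markov transition probability matrix enters the IMM algorithm, so it is a natural place to inject state values. A small NumPy sketch of the idea, with the value-assignment rule invented for illustration (the thesis's actual scheme may differ):

    import numpy as np

    # Two motion models (e.g., constant velocity vs. maneuvering)
    mu = np.array([0.7, 0.3])            # current model probabilities
    Pi = np.array([[0.95, 0.05],         # nominal Markov transition matrix
                   [0.10, 0.90]])

    # State-dependent modification (hypothetical rule): favor the
    # maneuvering model when the target is near a high-value waypoint.
    value = 0.8                          # assigned value of the current region
    Pi_mod = Pi.copy()
    Pi_mod[:, 1] *= (1 + value)
    Pi_mod /= Pi_mod.sum(axis=1, keepdims=True)    # re-normalize rows

    # IMM mixing step: predicted model probabilities and mixing weights
    c = mu @ Pi_mod                      # c[j] = sum_i Pi[i, j] * mu[i]
    mix = (Pi_mod * mu[:, None]) / c     # mix[i, j] = P(model i at k-1 | model j at k)
    print("predicted model probabilities:", c)
    print("mixing weights:\n", mix)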


A Normal Truncated Skewed-Laplace Model In Stochastic Frontier Analysis, Junyi Wang May 2012

Masters Theses & Specialist Projects

Stochastic frontier analysis is an exciting method of economic production modeling that is relevant to hospitals, stock markets, manufacturing factories, and services. In this paper, we create a new model using the normal distribution and the truncated skewed-Laplace distribution, namely the normal-truncated skewed-Laplace model. This is a generalization of the normal-exponential case. Furthermore, we compute the true technical efficiency and estimated technical efficiency of the normal-truncated skewed-Laplace model. Also, we compare the technical efficiencies of the normal-truncated skewed-Laplace model and the normal-exponential model.
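
For readers new to stochastic frontier analysis, the underlying composed-error model is standard; the thesis's contribution is the distribution assumed for the one-sided inefficiency term. In the usual notation:

    y_i = x_i^{\top}\beta + v_i - u_i, \qquad v_i \sim N(0, \sigma_v^2), \quad u_i \ge 0
    \mathrm{TE}_i = e^{-u_i}, \qquad \widehat{\mathrm{TE}}_i = E[\, e^{-u_i} \mid \varepsilon_i \,], \qquad \varepsilon_i = v_i - u_i

The normal-exponential model takes u_i exponential; here u_i instead follows a truncated skewed-Laplace distribution.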


Basic R Matrix Operations, Joseph Hilbe Aug 2011

Joseph M Hilbe

No abstract provided.


A Geospatial Based Decision Framework For Extending Marssim Regulatory Principles Into The Subsurface, Robert Nathan Stewart Aug 2011

Doctoral Dissertations

The Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM) is a regulatory guidance document regarding compliance evaluation of radiologically contaminated soils and buildings (USNRC, 2000). Compliance is determined by comparing radiological measurements to established limits using a combination of hypothesis testing and scanning measurements. Scanning allows investigators to identify localized pockets of contamination missed during sampling and to assess radiological exposure at different spatial scales. Scale is important in radiological dose assessment as regulatory limits can vary with the size of the contaminated area, and sites are often evaluated at more than one scale (USNRC, 2000). Unfortunately, scanning is …
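
The hypothesis test MARSSIM prescribes when a background reference area is used is the Wilcoxon Rank Sum test, applied after shifting the reference measurements up by the release limit. A sketch with hypothetical numbers:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    dcgl = 140.0                                  # hypothetical release limit
    survey = rng.normal(120, 15, size=20)         # survey-unit measurements
    reference = rng.normal(100, 15, size=20)      # background reference area

    # MARSSIM-style WRS test: shift the reference measurements by the limit
    # and test whether the adjusted reference exceeds the survey unit.
    stat, p = stats.ranksums(reference + dcgl, survey, alternative="greater")
    print("WRS p-value:", p)   # small p supports compliance in this setup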


Flipping The Winner Of A Poset Game, Adam O. Kalinich '12 Jan 2011

Student Publications & Research

Partially-ordered set games, also called poset games, are a class of two-player combinatorial games. The playing field consists of a set of elements, some of which are greater than other elements. Two players take turns removing an element and all elements greater than it, and whoever takes the last element wins. Examples of poset games include Nim and Chomp. We investigate the complexity of computing which player of a poset game has a winning strategy. We give an inductive procedure that modifies poset games to change the nim-value, which informally captures the winning strategies in the game. For a generic …
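
The nim-value mentioned here is the Sprague-Grundy value, which is 0 exactly when the position is a loss for the player to move. For small posets it can be computed by brute force; a sketch over an invented 2x2 grid poset:

    from functools import lru_cache

    # A poset as a dict: element -> set of strictly greater elements
    # (transitively closed). Here, a 2x2 grid ordered componentwise.
    above = {
        "00": {"01", "10", "11"},
        "01": {"11"},
        "10": {"11"},
        "11": set(),
    }

    @lru_cache(maxsize=None)
    def grundy(state):
        """Sprague-Grundy value of a poset game position (frozenset)."""
        reachable = set()
        for x in state:
            # Moving on x removes x and every element greater than x
            nxt = frozenset(e for e in state if e != x and e not in above[x])
            reachable.add(grundy(nxt))
        g = 0                      # mex: least value not reachable
        while g in reachable:
            g += 1
        return g

    print("nim-value:", grundy(frozenset(above)))   # nonzero => first player wins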


Why Is An Einstein Ring Blue?, Jonathan Blackledge Jan 2011

Articles

Albert Einstein predicted the existence of 'Einstein rings' as a consequence of his general theory of relativity. The phenomenon is a direct result of the idea that if a mass warps space-time then light (and other electromagnetic waves) will be 'lensed' by the strong gravitational field produced by a large cosmological body such as a galaxy. Since 1998, when the first complete Einstein ring was observed, many more complete or partially complete Einstein rings have been observed in the radio and infrared spectra, for example, and by the Hubble Space Telescope in the optical spectrum. However, in the latter case, …
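
For scale, the standard general-relativistic result for the angular radius of a ring formed by a point-like lens of mass M is

    \theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_{ls}}{D_l D_s}}

where D_l, D_s, and D_ls are the distances to the lens, to the source, and between lens and source. The article's question about the ring's colour is a spectral matter that this geometry alone does not determine.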


Software Internationalization: A Framework Validated Against Industry Requirements For Computer Science And Software Engineering Programs, John Huân Vũ Mar 2010

Master's Theses

View John Huân Vũ's thesis presentation at http://youtu.be/y3bzNmkTr-c.

In 2001, the ACM and IEEE Computing Curriculum stated that it was necessary to address "the need to develop implementation models that are international in scope and could be practiced in universities around the world." With increasing connectivity through the internet, the move towards a global economy and growing use of technology places software internationalization as a more important concern for developers. However, there has been a "clear shortage in terms of numbers of trained persons applying for entry-level positions" in this area. Eric Brechner, Director of Microsoft Development Training, suggested …


Encryption Using Deterministic Chaos, Jonathan Blackledge, Nikolai Ptitsyn Jan 2010

Articles

The concepts of randomness, unpredictability, complexity and entropy form the basis of modern cryptography, and a cryptosystem can be interpreted as the design of a key-dependent bijective transformation that is unpredictable to an observer for a given computational resource. For any cryptosystem, including, for example, a Pseudo-Random Number Generator (PRNG), an encryption algorithm, or a key exchange scheme, a cryptanalyst has access to the time series of a dynamic system and knows the PRNG function (the algorithm that is assumed to be based on some iterative process), which is taken to be in the public domain by virtue of the Kerckhoffs-Shannon …
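
The flavour of a chaos-based PRNG can be conveyed with the logistic map, the textbook example of deterministic chaos. This toy sketch is illustrative only and has known weaknesses as a cipher, but it shows the key-as-initial-condition idea, with the algorithm itself public in the Kerckhoffs spirit:

    def chaotic_keystream(key: float, n: int, r: float = 3.99) -> bytes:
        """Toy keystream from the logistic map x -> r*x*(1-x).
        The secret key is the initial condition; the map is deterministic,
        so anyone holding the key can reproduce the stream. NOT secure."""
        x = key
        out = []
        for _ in range(n):
            x = r * x * (1 - x)
            out.append(int(x * 256) & 0xFF)   # crude quantization to bytes
        return bytes(out)

    def xor_cipher(data: bytes, key: float) -> bytes:
        ks = chaotic_keystream(key, len(data))
        return bytes(d ^ k for d, k in zip(data, ks))

    msg = b"attack at dawn"
    ct = xor_cipher(msg, key=0.3141592653589793)
    print(xor_cipher(ct, key=0.3141592653589793))  # round-trips to plaintext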


A New Soft Tissue Analysis : To Establish Facial Esthetic Norms In Young Adult Females, Anne Béress Aug 1996

Loma Linda University Electronic Theses, Dissertations & Projects

Two hundred and fifty-five articles, books, and master's theses were reviewed for the most frequently applied soft tissue measurements in the literature in order to develop a new soft tissue analysis computer program that includes established soft tissue measurements and the newly developed globe analysis. A meta-analysis of 20 normal-occlusion studies was performed to obtain mean values and standard deviations to form a large sample size. Inclusion criteria for articles in the meta-analysis were normal occlusion, no orthodontic treatment, pleasing faces, and a statement on the age, race, and lip position of the population. For the lateral view, angular and …
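
Pooling per-study means and standard deviations into one large-sample estimate uses the standard combined-groups formulas; a small sketch with invented study summaries:

    import math

    # (n, mean, sd) per study -- hypothetical values standing in for the
    # 20 normal-occlusion studies pooled in the thesis.
    studies = [(30, 102.1, 5.2), (45, 100.4, 6.0), (25, 103.0, 4.8)]

    n_tot = sum(n for n, _, _ in studies)
    pooled_mean = sum(n * m for n, m, _ in studies) / n_tot

    # Total sum of squares combines within-study variance and the
    # between-study spread of the means around the pooled mean.
    ss = sum((n - 1) * s ** 2 + n * (m - pooled_mean) ** 2
             for n, m, s in studies)
    pooled_sd = math.sqrt(ss / (n_tot - 1))
    print(f"pooled mean = {pooled_mean:.2f}, pooled sd = {pooled_sd:.2f}")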


Comparing Traditional Statistical Models With Neural Network Models: The Case Of The Relation Of Human Performance Factors To The Outcomes Of Military Combat, William Oliver Hedgepeth Jan 1995

Engineering Management & Systems Engineering Theses & Dissertations

Statistics and neural networks are analytical methods used to learn about observed experience. Both the statistician and neural network researcher develop and analyze data sets, draw relevant conclusions, and validate the conclusions. They also share in the challenge of creating accurate predictions of future events with noisy data.

Both analytical methods are investigated. This is accomplished by examining the veridicality of both with real system data. The real system used in this project is a database of 400 years of historical military combat. The relationships among the variables represented in this database are recognized as being hypercomplex and nonlinear.

The …


Polygon Explorer For Massachusetts Data: Initial Report, Center For Economic Development Jan 1993

Center for Economic Development Technical Reports

Polygon Explorer is a program written for Macintosh computers that links data stored in standard spreadsheet formats with a geographic database. It differs from similar programs such as MapInfo and ArcView in that it provides a statistical visualization capability in the form of bar charts, histograms, scatterplots, and other views; further, these views are linked to one another so that any action in one view results in highlighting in the other views.

Polygon Explorer was initially developed with the support of funding from the Massachusetts Agricultural Experiment Station, and parts of a donation from the Environmental Systems Research Institute and a …
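
The linked-views behaviour described above amounts to broadcasting a selection to every registered view; a minimal sketch of the mechanism, with all names hypothetical:

    class View:
        """Linked-view sketch: selecting records in any view re-highlights
        the same records in every other registered view."""
        def __init__(self, name, registry):
            self.name = name
            registry.append(self)
            self.registry = registry

        def select(self, record_ids):
            for view in self.registry:
                view.highlight(record_ids)

        def highlight(self, record_ids):
            print(f"{self.name}: highlighting {sorted(record_ids)}")

    views = []
    bar_chart = View("bar chart", views)
    scatterplot = View("scatterplot", views)
    choropleth = View("map", views)

    scatterplot.select({3, 7, 12})   # brushing one view updates all views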


Opset Program For Computerized Selection Of Watershed Parameter Values For The Stanford Watershed Model, Earnest Yuan-Shang Liou, L. Douglas James Jan 1970

KWRRI Research Reports

The advent of the high-speed electronic computer made it possible to model complex hydrologic processes by mathematical expressions and thereby simulate streamflows from climatological data. The most widely used program is the Stanford Watershed Model, a digital parametric model of the land phase of the hydrologic cycle based on moisture-accounting processes. It can be used to simulate annual or longer flow sequences at hourly time intervals. Due to its capability of simulating historical streamflows from recorded climatological data, it has great potential in the planning and design of water resources systems. However, widespread use of the Stanford Watershed Model …
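
At its core, a moisture-accounting model of this kind steps a water balance forward in time, routing rainfall through storage to streamflow; OPSET's job is then to pick the storage and drainage parameters so that simulated flows match recorded ones. A deliberately tiny single-bucket sketch (not the Stanford Watershed Model itself, and all parameters hypothetical):

    # Minimal single-bucket moisture accounting: hourly rainfall fills soil
    # storage, saturation excess becomes runoff, and a fraction drains each
    # hour as baseflow.
    def simulate_flow(rain_mm_per_hr, capacity=50.0, drain_rate=0.05):
        storage, flows = 0.0, []
        for rain in rain_mm_per_hr:
            storage += rain
            excess = max(0.0, storage - capacity)   # saturation excess runoff
            storage -= excess
            baseflow = drain_rate * storage         # slow drainage component
            storage -= baseflow
            flows.append(excess + baseflow)
        return flows

    hourly_rain = [0, 0, 5, 12, 20, 8, 0, 0, 0, 0]
    print([round(q, 2) for q in simulate_flow(hourly_rain)])

Calibrating capacity and drain_rate against recorded flows, in miniature, is the kind of parameter-selection problem the OPSET program automates for the full model.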