Open Access. Powered by Scholars. Published by Universities.®
- Institution
- Southern Methodist University (13)
- Western University (6)
- City University of New York (CUNY) (5)
- Kennesaw State University (5)
- Selected Works (5)
- Florida International University (4)
- Louisiana State University (4)
- Old Dominion University (4)
- Purdue University (4)
- University of Massachusetts Amherst (4)
- University of New Orleans (3)
- University of Tennessee, Knoxville (3)
- Claremont Colleges (2)
- Embry-Riddle Aeronautical University (2)
- Georgia Southern University (2)
- SelectedWorks (2)
- Technological University Dublin (2)
- The University of San Francisco (2)
- University of Nebraska - Lincoln (2)
- University of New Mexico (2)
- West Virginia University (2)
- Bowling Green State University (1)
- Brigham Young University (1)
- COBRA (1)
- California Polytechnic State University, San Luis Obispo (1)
- California State University, San Bernardino (1)
- Chinese Academy of Sciences (1)
- East Tennessee State University (1)
- Illinois Math and Science Academy (1)
- Illinois State University (1)
- Keyword
- Machine learning (12)
- Machine Learning (11)
- Deep Learning (8)
- Statistics (7)
- Deep learning (6)
- Modeling (4)
- Neural Networks (4)
- Poisson (4)
- R (4)
- Regression (4)
- Simulation (4)
- Artificial Intelligence (3)
- Bayesian (3)
- Computer vision (3)
- Data Science (3)
- Optimization (3)
- Protein (3)
- Artificial intelligence (2)
- Big Data (2)
- Big data (2)
- Computer Science (2)
- Ensemble (2)
- Gene expression (2)
- Genome-wide association studies (2)
- Long short-term memory (2)
- Monte Carlo Simulation (2)
- Natural Language Processing (2)
- Natural language processing (NLP) (2)
- Network (2)
- Population genetics (2)
- Publication
- SMU Data Science Review (11)
- Electronic Thesis and Dissertation Repository (6)
- Doctoral Dissertations (5)
- Electronic Theses and Dissertations (4)
- FIU Electronic Theses and Dissertations (4)
- Doctor of Data Science and Analytics Dissertations (3)
- LSU Doctoral Dissertations (3)
- The Summer Undergraduate Research Fellowship (SURF) Symposium (3)
- University of New Orleans Theses and Dissertations (3)
- Articles (2)
- Dissertations, Theses, and Capstone Projects (2)
- Graduate Theses, Dissertations, and Problem Reports (2)
- Publications and Research (2)
- Statistical Science Theses and Dissertations (2)
- Theses and Dissertations (2)
- All NMU Master's Theses (1)
- Annual Symposium on Biomathematics and Ecology Education and Research (1)
- Art + Architecture (1)
- Basic Science Engineering (1)
- Beyond: Undergraduate Research Journal (1)
- Bioinformatics Faculty Publications (1)
- Bulletin of Chinese Academy of Sciences (Chinese Version) (1)
- CCE Theses and Dissertations (1)
- CGU Theses & Dissertations (1)
- CMC Senior Theses (1)
- COBRA Preprint Series (1)
- Center for Economic Development Technical Reports (1)
- Computational Modeling & Simulation Engineering Theses & Dissertations (1)
- Computer Science ETDs (1)
- Creative Activity and Research Day - CARD (1)
Articles 91 - 112 of 112
Full-Text Articles in Statistical Models
Identification Of Informativeness In Text Using Natural Language Stylometry, Rushdi Shams
Electronic Thesis and Dissertation Repository
In this age of information overload, one experiences a rapidly growing over-abundance of written text. To assist with handling this bounty, this plethora of texts is now widely used to develop and optimize statistical natural language processing (NLP) systems. Surprisingly, the use of more fragments of text to train these statistical NLP systems may not necessarily lead to improved performance. We hypothesize that those fragments that help the most with training are those that contain the desired information. Therefore, determining informativeness in text has become a central issue in our view of NLP. Recent developments in this field have spawned …
Spatiotemporal Crime Analysis, James Q. Tay, Abish Malik, Sherry Towers, David Ebert
The Summer Undergraduate Research Fellowship (SURF) Symposium
There has been a rise in the use of visual analytic techniques to create interactive predictive environments in a range of different applications. These tools help the user sift through massive amounts of data, presenting most useful results in a visual context and enabling the person to rapidly form proactive strategies. In this paper, we present one such visual analytic environment that uses historical crime data to predict future occurrences of crimes, both geographically and temporally. Due to the complexity of this analysis, it is necessary to find an appropriate statistical method for correlative analysis of spatiotemporal data, as well …
Measuring Security: A Challenge For The Generation, Janusz Zalewski, Steven Drager, William Mckeever, Andrew J. Kornecki
Department of Electrical Engineering and Computer Science - Daytona Beach
This paper presents an approach to measuring computer security understood as a system property, in the category of similar properties, such as safety, reliability, dependability, resilience, etc. First, a historical discussion of measurements is presented, beginning with views of Hermann von Helmholtz in his 19th century work “Zählen und Messen”. Then, contemporary approaches related to the principles of measuring software properties are discussed, with emphasis on statistical, physical and software models. A distinction between metrics and measures is made to clarify the concepts. A brief overview of inadequacies of methods and techniques to evaluate computer security is presented, followed by …
Changing Minds To Changing The World: Mapping The Spectrum Of Intent In Data Visualization And Data Arts, Scott Murray
Art + Architecture
No abstract provided.
Scalable Collaborative Filtering Recommendation Algorithms On Apache Spark, Walker Evan Casey
CMC Senior Theses
Collaborative filtering based recommender systems use information about a user's preferences to make personalized predictions about content, such as topics, people, or products, that they might find relevant. As the volume of accessible information and active users on the Internet continues to grow, it becomes increasingly difficult to compute recommendations quickly and accurately over a large dataset. In this study, we will introduce an algorithmic framework built on top of Apache Spark for parallel computation of the neighborhood-based collaborative filtering problem, which allows the algorithm to scale linearly with a growing number of users. We also investigate several different variants …
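The neighborhood-based collaborative filtering problem described in this abstract can be sketched in a few lines. This is a minimal single-machine illustration, not the thesis's Apache Spark implementation; the users, items, and ratings are invented:

```python
import math

# Hypothetical ratings: user -> {item: rating}. All names are invented.
ratings = {
    "alice": {"a": 5.0, "b": 3.0, "c": 4.0},
    "bob":   {"a": 4.0, "b": 3.0, "d": 5.0},
    "carol": {"b": 2.0, "c": 5.0, "d": 4.0},
}

def cosine_sim(u, v):
    """Cosine similarity computed over the items both users rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    num = sum(u[i] * v[i] for i in shared)
    den = (math.sqrt(sum(u[i] ** 2 for i in shared))
           * math.sqrt(sum(v[i] ** 2 for i in shared)))
    return num / den

def predict(user, item):
    """Similarity-weighted average of the neighbours' ratings for `item`."""
    num = den = 0.0
    for other, their in ratings.items():
        if other == user or item not in their:
            continue
        s = cosine_sim(ratings[user], their)
        num += s * their[item]
        den += abs(s)
    return num / den if den else None
```

On Spark, the same neighborhood computation is typically expressed as joins over (user, item, rating) tuples, so the similarity and prediction steps parallelize across partitions and scale with the number of users.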
Caimans - Semantic Platform For Advance Content Mining (Sketch Wp), Salvo Reina
Salvo Reina
A middleware software platform was created for the automatic classification of textual content. The requirements worksheet and the original flow sketches are published.
Iterative Statistical Verification Of Probabilistic Plans, Colin M. Potts
Lawrence University Honors Projects
Artificial intelligence seeks to create intelligent agents. An agent can be anything: an autopilot, a self-driving car, a robot, a person, or even an anti-virus system. While the current state-of-the-art may not achieve intelligence (a rather dubious thing to quantify) it certainly achieves a sense of autonomy. A key aspect of an autonomous system is its ability to maintain and guarantee safety—defined as avoiding some set of undesired outcomes. The piece of software responsible for this is called a planner, which is essentially an automated problem solver. An advantage computer planners have over humans is their ability to consider and …
Modeling A Sensor To Improve Its Efficacy, Nabin K. Malakar, Daniil Gladkov, Kevin H. Knuth
Physics Faculty Scholarship
Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be extremely expensive and cost-prohibitive. Thus many robotic systems must make do with lower-quality sensors. Here we demonstrate via a case study how modeling a sensor can improve its efficacy when employed within a Bayesian inferential framework. As a test bed we employ a robotic arm that is designed to autonomously take its own measurements using an inexpensive LEGO light sensor to estimate the position and radius of a white circle on a black field. The light sensor integrates the light arriving from a …
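The Bayesian inferential framework mentioned here can be shown with a toy grid-based update. This is a hedged sketch, not the authors' robotic-arm code; the sensor error rates, probe points, and readings are all assumed:

```python
# Toy sketch: grid posterior over the x-position of a black-to-white edge,
# probed by a noisy light sensor (assumed error rates, not the authors' model).
GRID = [i / 10 for i in range(11)]          # candidate edge positions 0.0 .. 1.0
posterior = [1.0 / len(GRID)] * len(GRID)   # uniform prior

def likelihood(edge, probe_x, reading):
    """Sensor model: reads 'white' (1) at or right of the edge with
    probability 0.9, and left of it with probability 0.1 (assumed rates)."""
    p_white = 0.9 if probe_x >= edge else 0.1
    return p_white if reading == 1 else 1.0 - p_white

def update(probe_x, reading):
    """Bayes' rule: multiply the prior by the likelihood, then renormalise."""
    global posterior
    posterior = [p * likelihood(e, probe_x, reading)
                 for e, p in zip(GRID, posterior)]
    z = sum(posterior)
    posterior = [p / z for p in posterior]

# Simulated readings consistent with a true edge near x = 0.5
for x, r in [(0.2, 0), (0.8, 1), (0.4, 0), (0.6, 1)]:
    update(x, r)

best = GRID[posterior.index(max(posterior))]   # MAP estimate of the edge
```

The same machinery generalizes to the paper's two-dimensional problem (circle center and radius) by placing the grid over those three parameters instead.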
Automating Large-Scale Simulation Calibration To Real-World Sensor Data, Richard Everett Edwards
Doctoral Dissertations
Many key decisions and design policies are made using sophisticated computer simulations. However, these sophisticated computer simulations have several major problems. The two main issues are 1) gaps between the simulation model and the actual structure, and 2) limitations of the modeling engine's capabilities. This dissertation's goal is to address these simulation deficiencies by presenting a general automated process for tuning simulation inputs such that simulation output matches real world measured data. The automated process involves the following key components -- 1) Identify a model that accurately estimates the real world simulation calibration target from measured sensor data; 2) Identify …
Retrieval Of Sub-Pixel-Based Fire Intensity And Its Application For Characterizing Smoke Injection Heights And Fire Weather In North America, David Peterson
Department of Earth and Atmospheric Sciences: Dissertations, Theses, and Student Research
For over two decades, satellite sensors have provided the locations of global fire activity with ever-increasing accuracy. However, the ability to measure fire intensity, known as fire radiative power (FRP), and its potential relationships to meteorology and smoke plume injection heights, are currently limited by the pixel resolution. This dissertation describes the development of a new, sub-pixel-based FRP calculation (FRPf) for fire pixels detected by the MODerate Resolution Imaging Spectroradiometer (MODIS) fire detection algorithm (Collection 5), which is subsequently applied to several large wildfire events in North America. The methodology inherits an earlier bi-spectral algorithm for retrieving sub-pixel …
The Interacting Multiple Models Algorithm With State-Dependent Value Assignment, Rastin Rastgoufard
University of New Orleans Theses and Dissertations
The value of a state is a measure of its worth, so that, for example, waypoints have high value and regions inside of obstacles have very small value. We propose two methods of incorporating world information as state-dependent modifications to the interacting multiple models (IMM) algorithm, and then we use a game's player-controlled trajectories as ground truths to compare the normal IMM algorithm to versions with our proposed modifications. The two methods involve modifying the model probabilities in the update step and modifying the transition probability matrix in the mixing step based on the assigned values of different target states. …
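One plausible reading of "modifying the transition probability matrix based on assigned values" can be sketched as follows. The numbers are invented and this is not the dissertation's exact scheme, only an illustration of the idea:

```python
# Scale each column of an IMM transition probability matrix by the value
# assigned to its target model, then renormalise every row so that it
# remains a valid probability distribution (illustrative numbers only).
def reweight_tpm(tpm, values):
    out = []
    for row in tpm:
        scaled = [p * v for p, v in zip(row, values)]
        z = sum(scaled)
        out.append([s / z for s in scaled])
    return out

tpm = [[0.9, 0.1],
       [0.2, 0.8]]       # baseline model-switching probabilities (assumed)
values = [1.0, 3.0]      # hypothetical: model 2's states are 3x as valuable

new_tpm = reweight_tpm(tpm, values)
# Each row still sums to 1, but transitions into model 2 are now favoured.
```

The abstract's other proposal, modifying the model probabilities in the update step, would apply the same value-weighting to the mode probability vector rather than to the matrix rows.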
A Normal Truncated Skewed-Laplace Model In Stochastic Frontier Analysis, Junyi Wang
Masters Theses & Specialist Projects
Stochastic frontier analysis is an exciting method of economic production modeling that is relevant to hospitals, stock markets, manufacturing factories, and services. In this paper, we create a new model using the normal distribution and the truncated skew-Laplace distribution, namely the normal-truncated skew-Laplace model. This is a generalization of the normal-exponential case. Furthermore, we compute the true technical efficiency and estimated technical efficiency of the normal-truncated skew-Laplace model. Also, we compare the technical efficiencies of the normal-truncated skew-Laplace model and the normal-exponential model.
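The comparison rests on the composite-error construction of stochastic frontier models, ε = v − u, where v is symmetric noise and u ≥ 0 is inefficiency. A Monte Carlo sketch of the simpler normal-exponential case (illustrative parameters, not estimates from the paper; the paper replaces the exponential term with a truncated skew-Laplace one) looks like this:

```python
import math
import random

# Composite error of the normal-exponential stochastic frontier model:
# eps = v - u, with v ~ N(0, sigma_v^2) and u ~ Exp(mean mean_u);
# technical efficiency is TE = exp(-u). Parameters are illustrative.
random.seed(0)
n = 10_000
sigma_v, mean_u = 0.2, 0.3                                # assumed values
v = [random.gauss(0.0, sigma_v) for _ in range(n)]        # symmetric noise
u = [random.expovariate(1.0 / mean_u) for _ in range((n))]  # inefficiency >= 0
eps = [vi - ui for vi, ui in zip(v, u)]                   # observed error
te = [math.exp(-ui) for ui in u]                          # technical efficiency
mean_te = sum(te) / n      # ~ lam / (lam + 1) with rate lam = 1 / mean_u
```

The negative skew of `eps` (its mean is pulled below zero by u) is exactly what frontier estimation exploits to separate inefficiency from noise.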
Basic R Matrix Operations, Joseph Hilbe
A Geospatial Based Decision Framework For Extending Marssim Regulatory Principles Into The Subsurface, Robert Nathan Stewart
Doctoral Dissertations
The Multi-Agency Radiological Site Survey Investigation Manual (MARSSIM) is a regulatory guidance document regarding compliance evaluation of radiologically contaminated soils and buildings (USNRC, 2000). Compliance is determined by comparing radiological measurements to established limits using a combination of hypothesis testing and scanning measurements. Scanning allows investigators to identify localized pockets of contamination missed during sampling and allows investigators to assess radiological exposure at different spatial scales. Scale is important in radiological dose assessment as regulatory limits can vary with the size of the contaminated area and sites are often evaluated at more than one scale (USNRC, 2000). Unfortunately, scanning is …
Flipping The Winner Of A Poset Game, Adam O. Kalinich '12
Student Publications & Research
Partially-ordered set games, also called poset games, are a class of two-player combinatorial games. The playing field consists of a set of elements, some of which are greater than other elements. Two players take turns removing an element and all elements greater than it, and whoever takes the last element wins. Examples of poset games include Nim and Chomp. We investigate the complexity of computing which player of a poset game has a winning strategy. We give an inductive procedure that modifies poset games to change the nim-value which informally captures the winning strategies in the game. For a generic …
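The nim-value referred to above is the Grundy number of combinatorial game theory. For ordinary Nim it can be computed directly; this is a small sketch of the standard Sprague-Grundy recursion, not the paper's construction:

```python
from functools import lru_cache

# Grundy number (nim-value) of a Nim position, given as a tuple of heap
# sizes. A move removes one or more counters from a single heap; the
# nim-value is the mex (minimum excludant) of the nim-values of all
# positions reachable in one move.
@lru_cache(maxsize=None)
def grundy(heaps):
    heaps = tuple(sorted(h for h in heaps if h > 0))
    reachable = set()
    for i, h in enumerate(heaps):
        for take in range(1, h + 1):
            reachable.add(grundy(heaps[:i] + (h - take,) + heaps[i + 1:]))
    return min(k for k in range(len(reachable) + 1) if k not in reachable)

# The Sprague-Grundy theorem says this equals the XOR of the heap sizes,
# and the first player wins exactly when the nim-value is nonzero.
```

Chomp admits no such closed form, which is part of what makes the complexity question studied here interesting.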
Why Is An Einstein Ring Blue?, Jonathan Blackledge
Articles
Albert Einstein predicted the existence of `Einstein rings' as a consequence of his general theory of relativity. The phenomenon is a direct result of the idea that if a mass warps space-time then light (and other electromagnetic waves) will be `lensed' by the strong gravitational field produced by a large cosmological body such as a galaxy. Since 1998, when the first complete Einstein ring was observed, many more complete or partially complete Einstein rings have been observed in the radio and infrared spectra, for example, and by the Hubble Space Telescope in the optical spectrum. However, in the latter case, …
Software Internationalization: A Framework Validated Against Industry Requirements For Computer Science And Software Engineering Programs, John Huân Vũ
Master's Theses
View John Huân Vũ's thesis presentation at http://youtu.be/y3bzNmkTr-c.
In 2001, the ACM and IEEE Computing Curriculum stated that it was necessary to address "the need to develop implementation models that are international in scope and could be practiced in universities around the world." With increasing connectivity through the internet, the move towards a global economy and growing use of technology places software internationalization as a more important concern for developers. However, there has been a "clear shortage in terms of numbers of trained persons applying for entry-level positions" in this area. Eric Brechner, Director of Microsoft Development Training, suggested …
Encryption Using Deterministic Chaos, Jonathan Blackledge, Nikolai Ptitsyn
Articles
The concepts of randomness, unpredictability, complexity and entropy form the basis of modern cryptography, and a cryptosystem can be interpreted as the design of a key-dependent bijective transformation that is unpredictable to an observer for a given computational resource. For any cryptosystem, including a Pseudo-Random Number Generator (PRNG), encryption algorithm or a key exchange scheme, for example, a cryptanalyst has access to the time series of a dynamic system and knows the PRNG function (the algorithm that is assumed to be based on some iterative process), which is taken to be in the public domain by virtue of the Kerckhoff-Shannon …
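The idea of deriving a keystream from an iterated map can be shown with a toy logistic-map cipher. This is purely illustrative and cryptographically insecure; the seed, the parameter r = 3.99, and the byte quantization are all assumed, and it is not the authors' scheme:

```python
# Toy chaos-based stream cipher (NOT secure): iterate the logistic map
# x -> r*x*(1-x) from a secret seed and use one byte of the orbit per
# plaintext byte as keystream.
def logistic_keystream(seed, n, r=3.99):
    x, out = seed, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)   # quantise the orbit to a byte
    return out

def xor_bytes(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key_seed = 0.123456789                    # the shared secret (assumed)
ct = xor_bytes(msg, logistic_keystream(key_seed, len(msg)))
pt = xor_bytes(ct, logistic_keystream(key_seed, len(msg)))  # re-XOR decrypts
```

Because XOR with the same keystream is its own inverse, both parties only need the seed; the cryptanalytic question raised in the abstract is whether the observed orbit lets an attacker recover it.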
A New Soft Tissue Analysis : To Establish Facial Esthetic Norms In Young Adult Females, Anne Béress
Loma Linda University Electronic Theses, Dissertations & Projects
Two hundred and fifty-five articles, books, and master's theses were reviewed for the most frequently applied soft tissue measurements in the literature in order to develop a new soft tissue analysis computer program that includes established soft tissue measurements and the newly developed globe analysis. A meta-analysis of 20 normal-occlusion studies was performed to obtain mean values and standard deviations to form a large sample size. Inclusion criteria for articles in the meta-analysis were normal occlusion, no orthodontic treatment, pleasing faces, and a statement on the age, race, and lip position of the population. For the lateral view, angular and …
Comparing Traditional Statistical Models With Neural Network Models: The Case Of The Relation Of Human Performance Factors To The Outcomes Of Military Combat, William Oliver Hedgepeth
Engineering Management & Systems Engineering Theses & Dissertations
Statistics and neural networks are analytical methods used to learn about observed experience. Both the statistician and neural network researcher develop and analyze data sets, draw relevant conclusions, and validate the conclusions. They also share in the challenge of creating accurate predictions of future events with noisy data.
Both analytical methods are investigated. This is accomplished by examining the veridicality of both with real system data. The real system used in this project is a database of 400 years of historical military combat. The relationships among the variables represented in this database are recognized as being hypercomplex and nonlinear.
The …
Polygon Explorer For Massachusetts Data: Initial Report, Center For Economic Development
Center for Economic Development Technical Reports
Polygon Explorer is a program written for Macintosh computers that links data stored in standard spreadsheet formats with a geographic database. It differs from similar programs such as MapInfo and ArcView in that it provides a statistical visualization capability in the form of bar charts, histograms, scatterplots, and other views; further, these views are linked to one another so that an action in one view results in highlighting in the other views.
Polygon Explorer was initially developed with the support of funding from the Massachusetts Agricultural Experiment Station, and parts of a donation from the Environmental Systems Research Institute and a …
Opset Program For Computerized Selection Of Watershed Parameter Values For The Stanford Watershed Model, Earnest Yuan-Shang Liou, L. Douglas James
KWRRI Research Reports
The advent of the high-speed electronic computer made it possible to model complex hydrologic processes with mathematical expressions and thereby simulate streamflows from climatological data. The most widely used program is the Stanford Watershed Model, a digital parametric model of the land phase of the hydrologic cycle based on moisture-accounting processes. It can be used to simulate annual or longer flow sequences at hourly time intervals. Because it can simulate historical streamflows from recorded climatological data, it has great potential in the planning and design of water resources systems. However, widespread use of the Stanford Watershed Model …