Open Access. Powered by Scholars. Published by Universities.®
- Discipline
- Physical Sciences and Mathematics (77)
- Computer Sciences (74)
- Electrical and Computer Engineering (21)
- Computational Engineering (10)
- Electrical and Electronics (8)
- Robotics (8)
- Artificial Intelligence and Robotics (7)
- Business (6)
- Computer and Systems Architecture (5)
- Digital Communications and Networking (5)
- Education (5)
- Applied Mathematics (4)
- Mechanical Engineering (4)
- Other Computer Engineering (4)
- Biomedical Engineering and Bioengineering (3)
- Business Administration, Management, and Operations (3)
- Chemical Engineering (3)
- Civil and Environmental Engineering (3)
- Life Sciences (3)
- Physics (3)
- Social and Behavioral Sciences (3)
- Biology (2)
- Data Storage Systems (2)
- Databases and Information Systems (2)
- Educational Assessment, Evaluation, and Research (2)
- Hardware Systems (2)
- Human Resources Management (2)
- Management Information Systems (2)
- Keyword
- Machine learning (19)
- Knowledge Management (15)
- Knowledge management (14)
- Deep learning (13)
- Machine Learning (12)
- Classification (9)
- Regression (8)
- Support Vector Machine (8)
- Deep Learning (7)
- Security (7)
- Twitter (7)
- Data mining (6)
- Knowledge (6)
- Mental Workload (6)
- Natural Language Processing (6)
- Sentiment analysis (6)
- Wiki (6)
- BERT (5)
- Cloud computing (5)
- Natural language processing (5)
- Supervised Machine Learning (5)
- CNN (4)
- Computer vision (4)
- E-learning (4)
- Education (4)
- LSTM (4)
- Neural Networks (4)
- Robotics (4)
- Support Vector Machines (4)
- Transfer Learning (4)
Articles 61 - 90 of 318
Full-Text Articles in Computer Engineering
Can Generative Adversarial Networks Help Us Fight Financial Fraud?, Sean Mciver
Dissertations
Transactional fraud datasets exhibit extreme class imbalance. Learners cannot make accurate generalizations without sufficient data. Researchers can account for imbalance at the data level, the algorithmic level, or both. This paper focuses on techniques at the data level. We evaluate the evidence for the optimal technique and potential enhancements. Global fraud losses totalled more than 80% of the UK’s GDP in 2019. The improvement of preprocessing is inherently valuable in fighting these losses. Synthetic minority oversampling technique (SMOTE) and extensions of SMOTE are currently the most common preprocessing strategies. SMOTE oversamples the minority classes by randomly generating a point between …
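The interpolation step SMOTE performs can be sketched in a few lines of numpy; the helper below is illustrative, not code from the dissertation:

```python
import numpy as np

def smote_sample(minority, k=3, n_new=10, seed=None):
    """Create synthetic minority samples by interpolating between a random
    minority point and one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        x = minority[rng.integers(len(minority))]
        d = np.linalg.norm(minority - x, axis=1)   # distances to all points
        neighbours = np.argsort(d)[1:k + 1]        # skip the point itself
        nb = minority[rng.choice(neighbours)]
        gap = rng.random()                         # position along the segment
        synthetic.append(x + gap * (nb - x))       # random point between x and nb
    return np.array(synthetic)

minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
new = smote_sample(minority, k=2, n_new=5, seed=0)
print(new.shape)  # (5, 2)
```

Each synthetic point lies on a line segment between two real minority samples, which is exactly the "point between" generation the abstract describes.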
A Comparison Of Instructional Efficiency Models In Third Level Education, Murali Rajendran
Dissertations
This study investigates the validity and sensitivity of a novel model of instructional efficiency: the parabolic model. The novel model is compared against state-of-the-art models present in instructional design today: the Likelihood, Deviational, and Multidimensional models. This model is based on the assumption that optimal mental workload and high performance lead to high efficiency, while the other models assume that low mental workload and high performance lead to high efficiency. The investigation makes use of two instructional design conditions: a direct-instruction approach to learning and its extension with a collaborative activity. A control group received the former instructional design …
Improving A Network Intrusion Detection System's Efficiency Using Model-Based Data Augmentation, Vinicius Waterkemper Lodetti
Dissertations
A network intrusion detection system (NIDS) is an important element in mitigating cybersecurity risks: a NIDS detects anomalies in a network which may indicate a cyberattack on a corporate network environment. A NIDS can be seen as a classification problem where the ultimate goal is to distinguish malicious traffic from a majority of benign traffic. Research on NIDSs is often performed using outdated datasets that do not represent the actual cyberspace. Datasets such as CICIDS2018 address this gap by being generated from attacks and an infrastructure that reflects an up-to-date scenario.
A problem may arise when machine …
Exploiting BERT And RoBERTa To Improve Performance For Aspect Based Sentiment Analysis, Gagan Reddy Narayanaswamy
Dissertations
Sentiment analysis, also known as opinion mining, is a type of text research that analyses people’s opinions expressed in written language. Sentiment analysis brings together various research areas such as Natural Language Processing (NLP), Data Mining, and Text Mining, and is fast becoming of major importance to companies and organizations as they start to incorporate online commerce data for analysis. Often the data on which sentiment analysis is performed will be reviews. The data can range from reviews of a small product to those of a big multinational corporation. The goal of performing sentiment analysis is to extract information from those …
An Evaluation On The Performance Of Code Generated With WebAssembly Compilers, Raymond Phelan
Dissertations
WebAssembly is a new technology that is revolutionizing the web. Essentially it is a low-level binary instruction set that can be run on browsers, servers or stand-alone environments. Many programming languages either currently have, or are working on, compilers that will compile the language into WebAssembly. This means that applications written in languages like C++ or Rust can now be run on the web, directly in a browser or other environment. However, as we will highlight in this research, the quality of code generated by the different WebAssembly compilers varies and causes performance issues. This research paper aims to evaluate …
Stellar Classification Of Folded Spectra Using The MK Classification Scheme And Convolutional Neural Networks, John Magee
Dissertations
The year 1943 saw the introduction of the Morgan-Keenan (MK) classification scheme, which replaced the existing Harvard classification scheme. Both stellar classification schemes are fundamentally grounded in the field of spectroscopy. The Harvard classification scheme classified stars based on stellar surface temperature. The MK classification scheme introduced the concept of a luminosity class that is intrinsically linked to the surface gravity of a star. Temperature and luminosity class values are estimated directly from the stellar spectrum.
Machine learning is a well-established technique in astronomy. Traditionally, a spectrum is treated as a one-dimensional sequence of data. Techniques such as artificial …
Event-Driven Servers Using Asynchronous, Non-Blocking Network I/O: Performance Evaluation Of Kqueue And Epoll, Lorcan Leonard
Dissertations
This research project evaluates the performance of kqueue and epoll in the context of event-driven servers. The evaluation is done through benchmarking and tracing which are used to measure throughput and execution time respectively. The experiment is repeated for both a virtualised and native server environment. The results from the experiment are statistically analysed and compared. These results show significant differences between kqueue and epoll, and a profound impact of virtualisation as a variable.
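The readiness-notification pattern under test can be illustrated with Python's `selectors` module, which wraps epoll on Linux and kqueue on BSD/macOS behind one interface. A toy echo over a socket pair, for illustration only:

```python
import selectors
import socket

sel = selectors.DefaultSelector()          # EpollSelector or KqueueSelector
server, client = socket.socketpair()
server.setblocking(False)
sel.register(server, selectors.EVENT_READ)

client.sendall(b"ping")
for key, mask in sel.select(timeout=1):    # wait until the socket is readable
    data = key.fileobj.recv(1024)          # guaranteed not to block now
    key.fileobj.sendall(data)              # echo it back

echoed = client.recv(1024)
print(echoed)  # b'ping'
sel.close(); server.close(); client.close()
```

An event-driven server repeats this select/handle loop over many registered sockets, which is what the benchmarked kqueue and epoll backends make cheap.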
Identifying Roles Of Software Developers From Their Answers On Stack Overflow, Dean Power
Dissertations
Stack Overflow is the world’s largest community of software developers. Users ask and answer questions on various tagged topics of software development. The set of questions a site user answers is representative of their knowledge base, or “wheelhouse”. It is proposed that clustering users by their wheelhouse yields communities of similar software developers by skill-set. These communities represent the different roles within software development and could be used as the basis to define roles at any point in time in an ever-evolving landscape of software development. A network graph of site users, linked if they answered questions on the same …
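One simple way to link users by overlapping "wheelhouses", as described above, is Jaccard similarity over their answered tag sets. The users, tags, and threshold below are hypothetical:

```python
# Link two users if the tag sets of questions they answered ("wheelhouses")
# overlap strongly enough; Jaccard similarity is one simple choice of measure.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

wheelhouses = {                      # hypothetical users and tags
    "alice": {"python", "pandas", "numpy"},
    "bob":   {"python", "numpy", "flask"},
    "carol": {"css", "html", "javascript"},
}

threshold = 0.3
edges = [(u, v) for u in wheelhouses for v in wheelhouses
         if u < v and jaccard(wheelhouses[u], wheelhouses[v]) >= threshold]
print(edges)  # [('alice', 'bob')]
```

Running a community-detection algorithm over such an edge list would yield the skill-set clusters the abstract proposes.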
Adequately Generating Captions For An Image Using Adaptive And Global Attention Mechanisms., Shravan Kumar Talanki Venkatarathanaiahsetty
Dissertations
Generating descriptions of images has seen a recent surge of interest and, with the latest developments in the field of Artificial Intelligence, can be one of the prominent applications bridging the gap between the Computer Vision and Natural Language Processing fields. In terms of the learning curve, Deep Learning has become the main backbone driving many new applications. Image captioning is one such application, where the use of deep learning methods has enhanced captioning accuracy. The introduction of the Encoder-Decoder framework was a breakthrough in image captioning. But as the sequences got longer, the performance of captions was affected. …
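The attention mechanisms named in the title share one core computation; a minimal numpy sketch of scaled dot-product attention, with shapes chosen purely for illustration:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: weight each value by how well its key
    matches the query, so the decoder can focus on relevant image regions."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

rng = np.random.default_rng(0)
Q = rng.standard_normal((1, 8))    # one decoder query
K = rng.standard_normal((5, 8))    # five image-region keys
V = rng.standard_normal((5, 8))    # corresponding values
out = attention(Q, K, V)
print(out.shape)  # (1, 8)
```

The softmax weights sum to one, so the output is a convex combination of the value vectors — the "focus" over image regions that adaptive and global variants redistribute differently.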
Feature Augmentation For Improved Topic Modeling Of YouTube Lecture Videos Using Latent Dirichlet Allocation, Nakul Srikumar
Dissertations
The application of topic models in text mining of educational data, and more specifically the text data obtained from lecture videos, is an area of research which is largely unexplored yet holds great potential. This work seeks to find empirical evidence for an improvement in topic modeling by pre-extracting bigram tokens and adding them as additional features in the Latent Dirichlet Allocation (LDA) algorithm, a widely-recognized topic modeling technique. The dataset considered for analysis is a collection of transcripts of video lectures on Machine Learning scraped from YouTube. Using the cosine similarity distance measure as a metric, the experiment showed …
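The two ingredients the abstract names — bigram feature augmentation and cosine similarity — can be sketched as follows (toy documents and illustrative helper names, not the study's code):

```python
import numpy as np
from collections import Counter

def tokens_with_bigrams(text):
    """Augment the unigram token stream with underscore-joined bigrams,
    mirroring the feature-augmentation idea before topics are fitted."""
    words = text.lower().split()
    return words + [f"{a}_{b}" for a, b in zip(words, words[1:])]

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

docs = ["gradient descent updates weights",
        "stochastic gradient descent converges"]
counts = [Counter(tokens_with_bigrams(d)) for d in docs]
vocab = sorted(set().union(*counts))
vecs = [np.array([c[w] for w in vocab], float) for c in counts]
print(round(cosine(vecs[0], vecs[1]), 3))  # 0.429
```

The shared bigram `gradient_descent` contributes to the similarity alongside the shared unigrams, which is the effect the experiment measures at the topic level.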
Performance Comparison Between A Distributed Particle Swarm Algorithm And A Centralised Algorithm, Ciarán O’Loughlin
Dissertations
Particle swarm optimisation (PSO) is a particular form of swarm intelligence, which is itself an innovative intelligent paradigm for solving optimisation problems. PSO is generally used to find a global optimum of a single optimisation function. This typically occurs on one node (machine), but there has been a significant body of research into creating distributed implementations of the PSO algorithm. Such research has often focused on the creation and performance of the distributed implementation in isolation, or compared it to different distributed algorithms.
This research piece aims to bridge a gap in the existing literature, by testing a distributed implementation …
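A minimal centralised PSO of the kind used as a baseline here can be sketched in numpy; the hyperparameter values are conventional defaults, not the dissertation's settings:

```python
import numpy as np

def pso(f, dim=2, n_particles=20, iters=200, seed=0):
    """Minimal centralised particle swarm: each particle is pulled toward
    its personal best and the swarm's global best."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))       # positions
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()             # global best
    w, c1, c2 = 0.7, 1.5, 1.5                        # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()
    return g, f(g)

best, val = pso(lambda p: float(np.sum(p ** 2)))   # sphere function, optimum 0
print(f"best value: {val:.3g}")
```

A distributed variant splits the swarm across nodes and periodically exchanges the global best, which is the design choice whose performance the study compares.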
A Hybrid Neural Network For Stock Price Direction Forecasting, Daniel Devine
Dissertations
The volatility of stock markets makes them notoriously difficult to predict and is the reason that many investors sell out at the wrong time. Contrary to the efficient market hypothesis (EMH) and the random walk theory, research on machine learning models for stock price forecasting has shown evidence of stock market predictability with varying degrees of success. Contemporary approaches have sought to use a hybrid of a convolutional neural network (CNN), for its feature extraction capabilities, and a long short-term memory (LSTM) neural network, for its time series prediction. This comparative study aims to determine the predictability of stock …
Human Age And Gender Classification Using Convolutional Neural Networks, Eamon Kelliher
Dissertations
In a world relying ever more on human classification, this paper aims to improve on age and gender image classification through the use of Convolutional Neural Networks (CNNs). Age and gender classification has become a popular area of study in the past number of years; however, there are still improvements to be made, particularly in the area of age classification. This research paper aims to test the currently accepted view that CNN models are the superior model type for image classification by comparing CNN performance against Support Vector Machine performance on the same dataset. Using the Adience image classification dataset, …
Identifying Significant Features For Player Evaluation In NFL Comparing ANNs And Traditional Models, Ronan Walsh
Dissertations
The evaluation of player performance is popular and important in modern sports, enabling teams to use real data in the construction of their rosters. This dissertation proposes to apply machine learning algorithms to predict the player evaluations of a leading NFL analytics company which uses a combination of statistics and expert evaluation. In addition, it will investigate which features are significant in the evaluation of a position. Data for the dissertation is obtained from multiple online sources: Pro Football Reference and Pro Football Focus (the NFL analytics company). These data sets are combined and analysed before …
Evaluating The Performance Of Transformer Architecture Over Attention Architecture On Image Captioning, Deepti Balasubramaniam
Dissertations
Over the last few decades, computer vision and natural language processing have shown tremendous improvement in different tasks, such as image captioning, video captioning, and machine translation, using deep learning models. However, there has been little research on image captioning based on transformers and on how it outperforms other models implemented for image captioning. This study designs a simple encoder-decoder model, an attention model, and a transformer model for image captioning using the Flickr8K dataset, and discusses the hyperparameters of the models, the type of pre-trained model used, and how long each model has been trained. …
Finetuning BERT And XLNet For Sentiment Analysis Of Stock Market Tweets Using Mixout And Dropout Regularization, Shubham Jangir
Dissertations
Sentiment analysis, also known as opinion mining or emotion mining, aims to identify the way in which sentiments are expressed in text and written data. Sentiment analysis combines different study areas such as Natural Language Processing (NLP), Data Mining, and Text Mining, and is quickly becoming a key concern for businesses and organizations, especially as online commerce data is being used for analysis. Twitter has also become a popular microblogging and social networking platform for sharing information, as people contribute their opinions, thoughts, and attitudes on social media platforms. Because of the large …
Human-Robot Interaction For Assistive Robotics, Jiawei Li
Dissertations
This dissertation presents an in-depth study of human-robot interaction (HRI) with application to assistive robotics. The studies cover dexterous in-hand manipulation, assistive robots for sit-to-stand (STS) assistance, and human intention estimation. In Chapter 1, the background and issues of HRI are explicitly discussed. In Chapter 2, the literature review introduces the recent state-of-the-art research on HRI, such as physical human-robot interaction (pHRI), robot STS assistance, dexterous in-hand manipulation, and human intention estimation. In Chapter 3, various models and control algorithms are described in detail. Chapter 4 introduces the research equipment. Chapter 5 presents innovative theories and …
Performance Optimization Of Big Data Computing Workflows For Batch And Stream Data Processing In Multi-Clouds, Huiyan Cao
Dissertations
Workflow techniques have been widely used as a major computing solution in many science domains. With the rapid deployment of cloud infrastructures around the globe and the economic benefits of cloud-based computing and storage services, an increasing number of scientific workflows have migrated or are in active transition to clouds. As the scale of scientific applications continues to grow, it is now common to deploy various data- and network-intensive computing workflows such as serial computing workflows, MapReduce/Spark-based workflows, and Storm-based stream data processing workflows in multi-cloud environments, where inter-cloud data transfer oftentimes plays a significant role in both workflow performance …
Semantic, Integrated Keyword Search Over Structured And Loosely Structured Databases, Xinge Lu
Dissertations
Keyword search has been seen in recent years as an attractive way for querying data with some form of structure. Indeed, it allows simple users to extract information from databases without mastering a complex structured query language and without having knowledge of the schema of the data. It also allows for integrated search of heterogeneous data sources. However, as keyword queries are ambiguous and not expressive enough, keyword search cannot scale satisfactorily on big datasets and the answers are, in general, of low accuracy. Therefore, flat keyword search alone cannot efficiently return high quality results on large data with structure. …
Discover Influential Mental Workload Attributes Impacting Learners Performance In Third-Level Education, Amisha Mehta
Dissertations
Human mental workload is an intervening variable and a fundamental concept in the discipline of ergonomics. It is deduced from variations in performance: excessively high or low mental workload hampers performance. Mental workload in an educational setting has been extensively researched and is applied in instructional design, but it remains unclear which factors chiefly drive mental workload in learners. This dissertation investigates the importance of the features used in the NASA-Task Load Index mental workload assessment instrument and their impact on the performance of learners as assessed by multiple-choice tests conducted in classrooms of …
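The NASA-Task Load Index referenced above rates six subscales from 0 to 100; the unweighted ("raw TLX") score is simply their mean. A sketch with hypothetical ratings for one learner:

```python
# Raw NASA-TLX: the unweighted mean of the six 0-100 subscale ratings.
# The ratings below are hypothetical values, not data from the study.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings):
    assert set(ratings) == set(SUBSCALES), "one rating per subscale required"
    return sum(ratings.values()) / len(SUBSCALES)

ratings = {"mental": 70, "physical": 10, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 30}
print(raw_tlx(ratings))  # 45.0
```

Feature-importance analysis of the kind the dissertation describes would treat the six subscale ratings, rather than the aggregate score, as candidate predictors of learner performance.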
Stacked Convolutional Recurrent Auto-Encoder For Noise Reduction In EEG, Eoghan Keegan
Dissertations
Electroencephalography (EEG) can be used to record electrical potentials in the brain by attaching electrodes to the scalp. However, these low-amplitude recordings are susceptible to noise originating from several sources, including ocular, pulse, and muscle artefacts. Their presence has a severe impact on the analysis and diagnosis of brain abnormalities. This research assessed the effectiveness of a stacked convolutional-recurrent auto-encoder (CR-AE) for noise reduction of EEG signals. Performance was evaluated using the signal-to-noise ratio (SNR) and peak signal-to-noise ratio (PSNR) in comparison to principal component analysis (PCA), independent component analysis (ICA) and a simple auto-encoder (AE). The Harrell-Davis quantile …
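The two evaluation metrics can be computed directly from the clean and denoised signals; the sketch below uses a synthetic sine as a stand-in for an EEG channel (illustrative only, not the study's pipeline):

```python
import numpy as np

def snr_db(clean, denoised):
    """Signal-to-noise ratio: signal power over residual-noise power, in dB."""
    noise = clean - denoised
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

def psnr_db(clean, denoised):
    """Peak SNR: peak signal amplitude squared over mean squared error, in dB."""
    mse = np.mean((clean - denoised) ** 2)
    return 10 * np.log10(np.max(np.abs(clean)) ** 2 / mse)

t = np.linspace(0, 1, 500)
clean = np.sin(2 * np.pi * 10 * t)                  # stand-in for an EEG channel
rng = np.random.default_rng(0)
denoised = clean + 0.05 * rng.standard_normal(500)  # small residual noise
print(round(snr_db(clean, denoised), 1), round(psnr_db(clean, denoised), 1))
```

Higher values mean less residual noise after reconstruction, which is how the CR-AE is compared against PCA, ICA, and a simple auto-encoder.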
Hybrid Deep Neural Networks For Mining Heterogeneous Data, Xiurui Hou
Dissertations
In the era of big data, the rapidly growing flood of data represents an immense opportunity. New computational methods are desired to fully leverage the potential that exists within massive structured and unstructured data. However, decision-makers are often confronted with multiple diverse heterogeneous data sources. The heterogeneity includes different data types, different granularities, and different dimensions, posing a fundamental challenge in many applications. This dissertation focuses on designing hybrid deep neural networks for modeling various kinds of data heterogeneity.
The first part of this dissertation concerns modeling diverse data types, the first kind of data heterogeneity. Specifically, image data and …
Live Media Production: Multicast Optimization And Visibility For Clos Fabric In Media Data Centers, Ammar Latif
Dissertations
Media production data centers are undergoing a major architectural shift to introduce digitization concepts to media creation and media processing workflows. Content companies such as NBC Universal, CBS/Viacom and Disney are modernizing their workflows to take advantage of the flexibility of IP and virtualization.
In these new environments, multicast is utilized to provide point-to-multipoint communications. In order to build point-to-multipoint trees, multicast has an established set of control protocols such as IGMP and PIM. The existing multicast protocols do not optimize multicast tree formation for maximizing network throughput, which leads to decreased fabric utilization and a decreased total number of admitted …
Energy And Performance-Optimized Scheduling Of Tasks In Distributed Cloud And Edge Computing Systems, Haitao Yuan
Dissertations
Infrastructure resources in distributed cloud data centers (CDCs) are shared by heterogeneous applications in a high-performance and cost-effective way. Edge computing has emerged as a new paradigm to provide access to computing capacities in end devices. Yet it suffers from such problems as load imbalance, long scheduling time, and limited power of its edge nodes. Therefore, intelligent task scheduling in CDCs and edge nodes is critically important to construct energy-efficient cloud and edge computing systems. Current approaches cannot smartly minimize the total cost of CDCs, maximize their profit and improve quality of service (QoS) of tasks because of aperiodic arrival …
Changing The Focus: Worker-Centric Optimization In Human-In-The-Loop Computations, Mohammadreza Esfandiari
Dissertations
A myriad of emerging applications from simple to complex ones involve human cognizance in the computation loop. Using the wisdom of human workers, researchers have solved a variety of problems, termed as “micro-tasks” such as, captcha recognition, sentiment analysis, image categorization, query processing, as well as “complex tasks” that are often collaborative, such as, classifying craters on planetary surfaces, discovering new galaxies (Galaxyzoo), performing text translation. The current view of “humans-in-the-loop” tends to see humans as machines, robots, or low-level agents used or exploited in the service of broader computation goals. This dissertation is developed to shift the focus back …
Towards Practical Homomorphic Encryption And Efficient Implementation, Gyana R. Sahu
Dissertations
Cloud computing has gained significant traction over the past few years and its application continues to soar as evident from its rapid adoption in various industries. One of the major challenges involved in cloud computing services is the security of sensitive information as cloud servers have been often found to be vulnerable to snooping by malicious adversaries. Such data privacy concerns can be addressed to a greater extent by enforcing cryptographic measures. Fully homomorphic encryption (FHE), a special form of public key encryption has emerged as a primary tool in deploying such cryptographic security assurances without sacrificing many of the …
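The defining property of homomorphic encryption — computing on data while it stays encrypted — can be demonstrated with textbook RSA, which is multiplicatively homomorphic. This toy example uses tiny parameters and no padding, purely to illustrate the property; the FHE schemes the dissertation targets are far more general:

```python
# Textbook RSA is multiplicatively homomorphic:
#   Enc(a) * Enc(b) mod n = (a^e)(b^e) mod n = (a*b)^e mod n = Enc(a*b)
p, q, e = 61, 53, 17
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 12
product_cipher = (enc(a) * enc(b)) % n   # multiply the ciphertexts only
print(dec(product_cipher))  # 84 == 7 * 12
```

The server never sees 7, 12, or 84 in the clear, which is the security assurance the abstract describes; fully homomorphic schemes extend this to both addition and multiplication.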
Software Quality Control Through Formal Method, Jialiang Chang
Dissertations
With the improvement of theories in the software industry, software quality is becoming the most significant part of the procedure of software development. Due to the implicit and explicit vulnerabilities inside software, software quality control has attracted more researchers' and engineers' attention and interest.
Current research on software quality control and verification involves various manual and automated testing methods, which can be categorized into static analysis and dynamic analysis. However, both have their own disadvantages. With static analysis methods, inputs will not be taken into consideration because the software system isn't executed, so we do not …
Communications With Spectrum Sharing In 5g Networks Via Drone-Mounted Base Stations, Liang Zhang
Dissertations
The fifth generation wireless network is designed to accommodate enormous traffic demands for the next decade and to satisfy varying quality of service for different users. Drone-mounted base stations (DBSs) characterized by high mobility and low cost intrinsic attributes can be deployed to enhance the network capacity. In-band full-duplex (IBFD) is a promising technology for future wireless communications that can potentially enhance the spectrum efficiency and the throughput capacity. Therefore, the following issues have been identified and investigated in this dissertation in order to achieve high spectrum efficiency and high user quality of service.
First, the problem of deploying DBSs …
Efficient Hardware/Software Partitioning Techniques For A Cloud-Scale CPU-FPGA Platform, Samah Ziyad Rahamneh
Dissertations
The diversity of workload characteristics has stimulated the deployment of heterogeneous architectures to accommodate workloads’ requirements disparity in cloud data centers. In heterogeneous computing, co-processors are utilized to support Central Processing Units (CPUs) in fulfilling workload demands. Field Programmable Gate Arrays (FPGAs) have advantages over other accelerators because of their power, performance and re-configurability benefits. In order to achieve the most benefit of a heterogeneous platform, efficient partitioning of workload between the CPU and the FPGA is a crucial demand.
This dissertation first presents a design and implementation of cooperative CPU-FPGA execution techniques, which include code and data partitioning, of …
Brain Disease Detection From EEGs: Comparing Spiking And Recurrent Neural Networks For Non-Stationary Time Series Classification, Hristo Stoev
Dissertations
Modeling non-stationary time series data is a difficult problem area in AI, because the statistical properties of the data change as the time series progresses. This complicates the classification of non-stationary time series, a method used in the detection of brain diseases from EEGs. Various techniques have been developed in the field of deep learning for tackling this problem, with recurrent neural network (RNN) approaches utilising long short-term memory (LSTM) architectures achieving a high degree of success. This study implements a new, spiking neural network-based approach to time series classification for the purpose of …
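The spiking neurons underlying such networks are commonly modelled as leaky integrate-and-fire units; a minimal sketch, with parameter values chosen for illustration rather than taken from the study:

```python
def lif_spikes(current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks toward
    rest, integrates input current, and emits a spike (then resets) whenever
    it crosses the threshold."""
    v, spikes = 0.0, []
    for i in current:
        v += dt * (-v / tau + i)     # leak term plus integrated input
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset              # reset after firing
        else:
            spikes.append(0)
    return spikes

drive = [0.15] * 50                  # constant input current
train = lif_spikes(drive)
print(sum(train))  # 4 spikes over 50 steps
```

Because information is carried in spike timing rather than continuous activations, such networks offer a different route to handling the changing statistics of non-stationary signals than LSTM-based RNNs.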