Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Articles 1 - 27 of 27

Full-Text Articles in Physical Sciences and Mathematics

The Role Of Student Motivation In Integrating Ai Into Web Design Education: A Longitudinal Study, Jason Lively, James Hutson May 2024

Faculty Scholarship

Amidst the current wave of studies on artificial intelligence (AI) in education, this longitudinal case study, spanning Spring 2023 to Spring 2024, delves into the integration of AI in the UI/UX web design classroom. By introducing both text-based and image-based AI tools to students with varying levels of skill in introductory web design and user experience (UX) courses, the study observed a significant enhancement in student creative capabilities and project outcomes. The utilization of text-based generators markedly improved writing efficiency and coding, while image-based tools facilitated better ideation and color selection. These findings underscore the potential to augment traditional educational methods, …


Machine-Learning-Based Vulnerability Detection And Classification In Internet Of Things Device Security, Sarah Bin Hulayyil, Shancang Li, Li Da Xu Jan 2023

Information Technology & Decision Sciences Faculty Publications

Detecting cyber security vulnerabilities in Internet of Things (IoT) devices before they are exploited is increasingly challenging and is one of the key capabilities needed to protect IoT devices from cyber attacks. This work conducts a comprehensive survey to investigate the methods and tools used for vulnerability detection in IoT environments utilizing machine learning techniques on various datasets, e.g., IoT23. During this study, the common potential vulnerabilities of IoT architectures are analyzed on each layer, and the machine learning workflow is described for detecting IoT vulnerabilities. A vulnerability detection and mitigation framework was proposed for machine learning-based vulnerability detection in …
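
As a rough illustration of the machine-learning workflow this survey describes, the sketch below trains a classifier on labelled flow-level features to flag traffic from potentially vulnerable IoT devices; the feature names and synthetic data are placeholders, not the IoT23 schema or the paper's pipeline.

```python
# Hedged sketch: supervised detection of vulnerable/malicious IoT traffic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 2000
# Hypothetical flow-level features: duration, bytes sent, bytes received, packet count.
X = rng.gamma(shape=2.0, scale=1.0, size=(n, 4))
# Hypothetical label: 1 = traffic associated with a vulnerable/compromised device.
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(0, 0.5, n) > 3.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```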


Designing A Patient-Centered Clinical Workflow To Assess Cyberbully Experiences Of Youths In The U.S. Healthcare System, Fayika Farhat Nova Oct 2022

Dissertations (1934 -)

Cyberbullying, or online harassment, is often defined as someone repeatedly and intentionally harassing, mistreating, or making fun of others using electronic devices, aiming to scare, anger, or shame them [296]. Youths experiencing cyberbullying report higher levels of anxiety and depression, mental distress, suicidal thoughts, and substance abuse than their non-bullied peers [360, 605, 261, 354]. Even though bullying is associated with significant health problems, to date very few youth anti-bullying efforts have been initiated and directed in clinical settings. There is presently no standardized procedure or workflow across health systems for systematically assessing cyberbullying or other equally dangerous online activities …


Investigating Collaboration In Software Reverse Engineering, Allison M. Wong Mar 2022

Theses and Dissertations

Reverse engineering (RE) is a rigorous process of exploration and analysis to support software design recovery and exploit development. The process is often conducted in teams to divide the workload and take full advantage of engineers' individual expertise and strengths. Collaboration in RE requires versatile and reliable tools that can match the environment's unpredictable and fluid nature. While studies on collaborative software development have indicated common best practices and implementations, similar standards have not been explored in reverse engineering. This research conducts semi-structured interviews with reverse engineering experts to understand their needs and solutions while working in a team. The …


Advancing Ubiquitous Collaboration For Telehealth - A Framework To Evaluate Technology-Mediated Collaborative Workflow For Telehealth, Hypertension Exam Workflow Study, Christopher Bondy Ph.D., Linlin Chen Ph.D, Pamela Grover Md, Pengcheng Shi Ph.D Feb 2022

Articles

Healthcare systems worldwide are under siege when it comes to technology adoption; the recent pandemic has only magnified the issues. Providers and patients alike look to new enabling technologies to establish real-time connectivity and capability for a growing range of remote telehealth solutions. The migration to new technology is not as seamless as clinicians and patients would like, since the new workflows pose new responsibilities and barriers to adoption across the telehealth ecosystem. Technology-mediated workflows (integrated software and personal medical devices) are increasingly important in patient-centered healthcare; software-intensive systems will become integral in prescribed treatment plans [1]. My research explored the path to …


Evaluating Technology-Mediated Collaborative Workflows For Telehealth, Christopher Bondy Ph.D., Pengcheng Shi, Pamela Grover Md, Vicki Hanson, Linlin Chen, Rui Li Dec 2021

Articles

Goals: This paper discusses the need for a predictable method to evaluate gains and gaps of collaborative technology-mediated workflows and introduces an evaluation framework to address this need. Methods: The Collaborative Space Analysis Framework (CS-AF), introduced in this research, is a cross-disciplinary evaluation method designed to evaluate technology-mediated collaborative workflows. The 5-step CS-AF approach includes: (1) current-state workflow definition, (2) current-state (baseline) workflow assessment, (3) technology-mediated workflow development and deployment, (4) technology-mediated workflow assessment, (5) analysis, and conclusions. For this research, a comprehensive, empirical study of hypertension exam workflow for telehealth was conducted using the CS-AF approach. Results: The CS-AF …


Experience Report On The Use Of Technology To Manage Capstone Course Projects, Benjamin Gan, Eng Lieh Ouh Oct 2020

Research Collection School Of Computing and Information Systems

This full paper presents an experience report describing lessons learnt from the innovative use of technologies in an undergraduate computing capstone course. At our school, around fifty-five teams comprising around 300 students take this course every year. With fifty-five teams, we needed a system to schedule presentations; improve communications; collaborate between stakeholders; share knowledge; monitor progress; team up students; match students to projects; improve the grading process; showcase posters; and track improvements using analytics. The Learning Management System (LMS) is well suited to managing course content and grade submission. On the other hand, students are required to conduct agile sprint reviews …


Qos-Aware Scheduling For Data Intensive Workflow, Wan Cong, Cuirong Wang, Wang Cong Jul 2020

Journal of System Simulation

Abstract: Advances in technology enable people to access resources from different data centers. Resource management and scheduling of applications, such as workflows, deployed in cloud computing environments have already become a hot research topic. A QoS-aware scheduling algorithm for data-intensive workflows in a multi-data-center environment was proposed. Scheduling data-intensive workflows across multiple data centers has two characteristics: first, a large amount of data is distributed across different geographical locations, and the process of data migration consumes a large amount of time and bandwidth; secondly, the data centers differ in price and resources. Data migration …
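
The trade-off the abstract describes (data-migration time versus data-center price) can be sketched as a simple greedy placement rule; the weights, data-center parameters, and greedy choice below are illustrative assumptions, not the paper's QoS-aware algorithm.

```python
# Hedged sketch: score each task's placement by migration time plus monetary cost.
DATA_CENTERS = {
    # name: (price per compute hour, network bandwidth in MB/s) -- made-up values
    "dc_east": (0.12, 200.0),
    "dc_west": (0.09, 80.0),
    "dc_eu":   (0.15, 150.0),
}

def placement_cost(task_mb, hours, dc, data_location, alpha=1.0, beta=1.0):
    """Weighted cost: migration time (if input data lives elsewhere) plus price."""
    price, bandwidth = DATA_CENTERS[dc]
    migration_s = 0.0 if dc == data_location else task_mb / bandwidth
    return alpha * migration_s + beta * price * hours

def schedule(tasks):
    """Greedily place each task, given (input size MB, runtime h, data location)."""
    plan = {}
    for name, (mb, hours, loc) in tasks.items():
        plan[name] = min(DATA_CENTERS, key=lambda dc: placement_cost(mb, hours, dc, loc))
    return plan

print(schedule({"align": (5000, 2.0, "dc_east"), "summarise": (50, 0.2, "dc_eu")}))
```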


On Privacy-Aware Escience Workflows, Khalid Belhajjame, Noura Faci, Zakaria Maamar, Vanilson Burégio, Edvan Soares, Mahmoud Barhamgi May 2020

All Works

© 2020, Springer-Verlag GmbH Austria, part of Springer Nature. Computing-intensive experiments in the modern sciences have become increasingly data-driven, perfectly illustrating the Big-Data era. These experiments are usually specified and enacted in the form of workflows that need to manage (i.e., read, write, store, and retrieve) highly sensitive data such as persons' medical records. We assume for this work that the operations that constitute a workflow are 1-to-1 operations, in the sense that for each input data record they produce a single data record. While there is an active research body on how to protect sensitive data by, for instance, anonymizing datasets, …
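
A minimal sketch of a 1-to-1 workflow operation in the sense defined above: each input record yields exactly one output record, with direct identifiers pseudonymised before downstream steps see them. The field names and salted-hash scheme are assumptions for illustration, not the paper's protection mechanism.

```python
# Hedged sketch: a 1-to-1 privacy-preserving operation inside a workflow.
import hashlib

def pseudonymise(record: dict, secret_salt: str = "demo-salt") -> dict:
    """Replace direct identifiers with salted hashes; keep every other field as-is."""
    out = dict(record)
    for field in ("name", "patient_id"):            # hypothetical identifier fields
        if field in out:
            out[field] = hashlib.sha256((secret_salt + str(out[field])).encode()).hexdigest()[:12]
    return out

print(pseudonymise({"patient_id": "P-1042", "name": "Alice", "systolic_bp": 138}))
```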


Migrating From Partial Least Squares Discriminant Analysis To Artificial Neural Networks: A Comparison Of Functionally Equivalent Visualisation And Feature Contribution Tools Using Jupyter Notebooks, Kevin M. Mendez, David I. Broadhurst, Stacey N. Reinke Jan 2020

Research outputs 2014 to 2021

Introduction:

Metabolomics data is commonly modelled multivariately using partial least squares discriminant analysis (PLS-DA). Its success is primarily due to ease of interpretation, through projection to latent structures, and transparent assessment of feature importance using regression coefficients and Variable Importance in Projection scores. In recent years several non-linear machine learning (ML) methods have grown in popularity, but uptake has remained limited, essentially due to convoluted optimisation and interpretation. Artificial neural networks (ANNs) are a non-linear projection-based ML method that shares a structural equivalence with PLS, and as such should be amenable to equivalent optimisation and interpretation methods.

Objectives:

We hypothesise that …
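
A hedged sketch of the comparison being proposed: PLS-DA (via PLS regression on a binary outcome) alongside a small artificial neural network, both fitted to synthetic metabolomics-like data. The data, sizes, and thresholding are illustrative and not taken from the paper's Jupyter notebooks.

```python
# Hedged sketch: PLS-DA versus a small ANN on synthetic two-class data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n, p = 300, 50                                  # samples x "metabolite" features (assumed sizes)
X = rng.normal(size=(n, p))
y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, n) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

pls = PLSRegression(n_components=2).fit(X_tr, y_tr)      # projection to latent structures
y_pls = (pls.predict(X_te).ravel() > 0.5).astype(int)    # threshold regression scores into classes

ann = MLPClassifier(hidden_layer_sizes=(2,), max_iter=2000, random_state=1).fit(X_tr, y_tr)
print("PLS-DA accuracy:", accuracy_score(y_te, y_pls))
print("ANN accuracy:   ", accuracy_score(y_te, ann.predict(X_te)))
```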


Vt-Revolution: Interactive Programming Tutorials Made Possible, Lingfeng Bao, Zhenchang Xing, Xin Xia, David Lo, Shanping Li Nov 2018

Research Collection School Of Computing and Information Systems

Programming video tutorials showcase programming tasks and associated workflows. Although video tutorials are easy to create, it is often difficult to explore the captured workflows and interact with the programs in the videos. In this work, we propose a tool named VTRevolution – an interactive programming video tutorial authoring system. VTRevolution has two components: 1) a tutorial authoring system that leverages operating system level instrumentation to log workflow history while tutorial authors are creating programming video tutorials; 2) a tutorial watching system that enhances the learning experience of video tutorials by providing operation history and timeline-based browsing interactions. Our tutorial authoring system does not …
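
A minimal sketch of the logging idea: every tool action during a recording session is appended to a timestamped event log that a watching system could later align with the video timeline. The event names and structure are assumptions, not VTRevolution's actual instrumentation.

```python
# Hedged sketch: timestamped workflow-history logging during a tutorial recording.
import json, time

class WorkflowLogger:
    def __init__(self):
        self.events = []

    def log(self, action: str, **details):
        # Each entry records when it happened so it can be indexed against the video.
        self.events.append({"t": time.time(), "action": action, **details})

    def dump(self) -> str:
        return json.dumps(self.events, indent=2)

logger = WorkflowLogger()
logger.log("open_file", path="Main.java")
logger.log("edit", line=42, text="int total = 0;")
logger.log("run_tests", passed=12, failed=1)
print(logger.dump())
```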


Vt-Revolution: Interactive Programming Video Tutorial Authoring And Watching System, Lingfeng Bao, Zhenchang Xing, Xin Xia, David Lo Feb 2018

Research Collection School Of Computing and Information Systems

Procedural knowledge describes the actions and manipulations that are carried out to complete programming tasks. An effective way to document procedural knowledge is programming video tutorials. Existing solutions for adding interactive workflows and elements to programming videos face a dilemma between the level of desired interaction and the effort required to author tutorials. In this work, we tackle this dilemma by designing and building a programming video tutorial authoring system that leverages operating system level instrumentation to log workflow history while tutorial authors are creating programming videos, and the corresponding tutorial watching system that enhances the learning experience of video tutorials …


Automated Cluster Provisioning And Workflow Management For Parallel Scientific Applications In The Cloud, Brandon Posey, Christopher Gropp, Alexander Herzog, Amy Apon Nov 2017

Publications

Many commercial cloud providers and tools are available that researchers could utilize to advance computational science research. However, adoption by the research community has been slow. In this paper we describe the automated Provisioning And Workflow (PAW) management tool for parallel scientific applications in the cloud. PAW is a comprehensive resource provisioning and workflow tool that automates the steps of dynamically provisioning a large scale cluster environment in the cloud, executing a set of jobs or a custom workflow and, after the jobs have completed, de-provisioning the cluster environment in a single operation. A key characteristic of PAW is that …
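
The single-operation pattern described above (provision, execute, de-provision) might look roughly like the following sketch, with stubbed provider calls standing in for real cloud APIs; it is not PAW's implementation.

```python
# Hedged sketch: provision a cluster, run jobs, always tear the cluster down.
def provision_cluster(nodes: int) -> str:
    print(f"provisioning {nodes}-node cluster ...")
    return "cluster-123"                      # stand-in for a real cluster handle

def run_job(cluster_id: str, job: str):
    print(f"[{cluster_id}] running {job}")

def deprovision_cluster(cluster_id: str):
    print(f"tearing down {cluster_id}")

def run_workflow(jobs, nodes=16):
    cluster = provision_cluster(nodes)
    try:
        for job in jobs:
            run_job(cluster, job)
    finally:
        deprovision_cluster(cluster)          # release cloud resources even if a job fails

run_workflow(["preprocess.sh", "simulate.sh", "aggregate.sh"])
```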


Towards A Better Understanding Of On And Off Target Effects Of The Lymphocyte-Specific Kinase Lck For The Development Of Novel And Safer Pharmaceuticals, Xiaofei Zhang, Amir Kucharski, Wibe A. De Jong, Sally R. Ellingson Jun 2017

Markey Cancer Center Faculty Publications

In this work we have developed a multi-tiered computational platform to study protein-drug interactions. At the beginning of the workflow, more efficient but less accurate methods are used to enable large libraries of proteins in many conformations and massive chemical libraries to be screened. At each subsequent step in the workflow a subset of input data is investigated with increased accuracy and more computationally expensive methods. We demonstrate the developed workflow with the investigation of the lymphocyte-specific kinase LCK, which is implicated as a drug target in many cancers and also known to have toxic effects when unintentionally targeted. Several …
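
The funnel-shaped workflow described here can be sketched as successive filtering stages of increasing cost and accuracy; the scoring functions and thresholds below are synthetic stand-ins, not the paper's docking or simulation methods.

```python
# Hedged sketch: cheap coarse screening first, expensive refinement only for survivors.
import random
random.seed(0)

compounds = [f"cmpd_{i}" for i in range(10_000)]

def cheap_screen(c):  return random.random()            # fast, approximate score (stub)
def docking_score(c): return random.random()            # slower, more accurate (stub)
def md_refinement(c): return random.random()            # most expensive stage (stub)

tier1 = [c for c in compounds if cheap_screen(c) > 0.90]     # keep roughly the top 10%
tier2 = [c for c in tier1 if docking_score(c) > 0.80]        # re-score the survivors
hits  = sorted(tier2, key=md_refinement, reverse=True)[:10]  # refine a small final set
print(len(compounds), "->", len(tier1), "->", len(tier2), "->", len(hits))
```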


Defining Usability Heuristics For Adoption And Efficiency Of An Electronic Workflow Document Management System, Steven Fuentes Jan 2017

CCE Theses and Dissertations

Usability heuristics have been established for different uses and applications as general guidelines for user interfaces. These can affect the implementation of industry solutions and play a significant role in cost reduction and process efficiency. The area of electronic workflow document management (EWDM) solutions, also known as workflow, lacks a formal definition of usability heuristics. With the advent of new technologies such as mobile devices, defining a set of usability heuristics contributes to the adoption and efficiency of an EWDM system. Workflow usability has been evaluated for various industries. Most significantly, research has been done for electronic healthcare records (EHR). …


Educating Nurses On Workflow Changes From Electronic Health Record Adoption, Rhoda Lynn Atienza San Jose Jan 2017

Walden Dissertations and Doctoral Studies

Workflow issues related to adoption of the electronic health record (EHR) have led to unsafe workarounds, decreased productivity, inefficient clinical documentation, and slow rates of EHR adoption. The problem addressed in this quality improvement project was nurses' lack of knowledge about workflow changes due to EHR adoption. The purpose of this project was to identify changes in workflow and to develop an educational module to communicate the changes. This project was guided by both the ADDIE model (analysis, design, development, implementation, and evaluation) and the diffusion of innovations theory. Five stages were involved: process mapping, cognitive walkthrough, eLearning module development, …


Partitioning Uncertain Workloads, Freddy Chua, Bernardo A. Huberman Nov 2016

Research Collection School Of Computing and Information Systems

We present a method for determining the ratio of the tasks when breaking up any complex workload, such that once the outputs from all tasks are joined, the full completion takes less time and exhibits smaller variance than running the undivided workload. To do that, we have to infer the capabilities of the processing unit executing the divided workloads or tasks. We propose a Bayesian inference algorithm to infer the amount of time each task takes in a way that does not require prior knowledge of the processing unit's capability. We demonstrate the effectiveness of this …
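
A rough sketch of the underlying idea: estimate each processing unit's rate from probe measurements, then split the workload in proportion to those rates so the parts finish at about the same time. The simple mean-based estimate below stands in for the paper's Bayesian inference algorithm, and all numbers are made up.

```python
# Hedged sketch: infer unit speeds from probes, then split the workload proportionally.
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical probe measurements: seconds per unit of work on two processing units.
probes_a = rng.normal(loc=0.50, scale=0.05, size=20)
probes_b = rng.normal(loc=0.80, scale=0.10, size=20)

rate_a = 1.0 / probes_a.mean()        # estimated units/second for unit A
rate_b = 1.0 / probes_b.mean()        # estimated units/second for unit B

total_units = 10_000
share_a = rate_a / (rate_a + rate_b)  # the faster unit gets a proportionally larger share
print(f"A gets {round(total_units * share_a)} units, B gets {round(total_units * (1 - share_a))}")
```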


Data-Intensive Computing For Bioinformatics Using Virtualization Technologies And Hpc Infrastructures, Pengfei Xuan Dec 2011

All Theses

Bioinformatics applications often involve many computational components and massive data sets, which are very difficult to deploy on a single computing machine. In this thesis, we designed a data-intensive computing platform for bioinformatics applications using virtualization technologies and high performance computing (HPC) infrastructures with the concept of a multi-tier architecture, which can seamlessly integrate the web user interface (presentation tier), scientific workflow (logic tier) and computing infrastructure (data/computing tier). We demonstrated our platform on two bioinformatics projects. First, we redesigned and deployed the cotton marker database (CMD) (http://www.cottonmarker.org), a centralized web portal in the cotton research community, using the …


A Machine-Learning Approach For Workflow Identification From Low-Level Monitoring Information, Thorsten Stein Jan 2011

Theses

Knowing which workflows are executed within Service Oriented Architectures (SOA) is essential for successful IT management. In many cases, SOAs grew out of previously existing IT architectures; existing components are used as single services and therefore as parts of newly created workflows. Since such workflows consist of newly developed and legacy services, traditional workflow management systems often cannot be applied. This thesis presents a method for gathering information about the executed workflows within such heterogeneous environments. An implementation of a framework is presented. This framework allows the training of machine-learning algorithms with workflow models and the mapping of low-level monitoring …
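
One plausible reading of this mapping, sketched below: low-level monitoring events (here, service-call names) become bag-of-events features, and a classifier learns which workflow produced them. The event names, workflow labels, and classifier choice are illustrative assumptions, not the thesis's framework.

```python
# Hedged sketch: classify a monitored event trace into the workflow that produced it.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Training traces: observed service calls labelled with the workflow that produced them.
traces = [
    "auth getOrder reserveStock charge ship",
    "auth getOrder charge reserveStock ship",
    "auth getCustomer updateAddress",
    "auth getCustomer updateAddress sendMail",
]
labels = ["order_fulfilment", "order_fulfilment", "profile_update", "profile_update"]

vec = CountVectorizer()
clf = MultinomialNB().fit(vec.fit_transform(traces), labels)

new_trace = "auth getOrder reserveStock charge"
print(clf.predict(vec.transform([new_trace]))[0])
```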


Cloud Based Scientific Workflow For Nmr Data Analysis, Ashwin Manjunatha, Paul E. Anderson, Satya S. Sahoo, Ajith Harshana Ranabahu, Michael L. Raymer, Amit P. Sheth Jul 2010

Kno.e.sis Publications

This work presents a service oriented scientific workflow approach to NMR-based metabolomics data analysis. We demonstrate the effectiveness of this approach by implementing several common spectral processing techniques in the cloud using a parallel map-reduce framework, Hadoop.
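
A pure-Python sketch of the map-reduce pattern mentioned above, applied to a toy spectral-processing step (binning peak intensities by chemical shift); on the real system such map and reduce phases would run as Hadoop tasks over many spectra, and the bin width and values here are invented.

```python
# Hedged sketch: map-reduce style binning of NMR peak intensities.
from collections import defaultdict

spectra = {  # spectrum id -> list of (chemical shift in ppm, intensity); toy values
    "s1": [(1.21, 5.0), (1.24, 3.0), (3.70, 8.0)],
    "s2": [(1.22, 4.0), (3.68, 7.5), (3.71, 1.0)],
}

def map_phase(spectrum):
    """Emit (bin, intensity) pairs; bins are 0.05 ppm wide."""
    for ppm, intensity in spectrum:
        yield round(ppm / 0.05) * 0.05, intensity

def reduce_phase(pairs):
    """Sum intensities per bin across all spectra."""
    totals = defaultdict(float)
    for bin_ppm, intensity in pairs:
        totals[bin_ppm] += intensity
    return dict(totals)

all_pairs = (pair for spec in spectra.values() for pair in map_phase(spec))
print(reduce_phase(all_pairs))
```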


Dataset Threshold For The Performance Estimators In Supervised Machine Learning Experiments, Zanifa Omary, Fredrick Mtenzi Nov 2009

Conference papers

Establishing a dataset threshold is one of the first steps when comparing the performance of machine learning algorithms. It involves the use of different datasets with different sample sizes in relation to the number of attributes and the number of instances available in the dataset. Currently, no limit has been set to help those unfamiliar with machine learning experiments categorise these datasets as either small or large based on the two factors. In this paper we perform experiments in order to establish a dataset threshold. The established dataset threshold will help unfamiliar supervised …
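
An illustrative sketch of this kind of experiment: the same classifier evaluated with two performance estimators (hold-out and 10-fold cross-validation) at increasing dataset sizes, to see where the estimates stabilise. The sizes, data generator, and estimators are assumptions, not the paper's protocol.

```python
# Hedged sketch: compare performance estimators across dataset sizes.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

for n in (100, 500, 2000):
    X, y = make_classification(n_samples=n, n_features=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    holdout = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
    cv = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=10).mean()
    print(f"n={n:5d}  hold-out={holdout:.3f}  10-fold CV={cv:.3f}")
```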


Cad Tools For Dna Micro-Array Design, Manufacture And Application, Nisar Hundewale Dec 2006

Computer Science Dissertations

Motivation: As the human genome project progresses and some microbial and eukaryotic genomes are recognized, numerous biotechnological processes have recently attracted an increasing number of biologists, bioengineers and computer scientists. Biotechnological processes profoundly involve the production and analysis of high-throughput experimental data. Numerous sequence libraries of DNA and protein structures of a large number of micro-organisms and a variety of other databases related to biology and chemistry are available. For example, microarray technology, a novel biotechnology, promises to monitor the whole genome at once, so that researchers can study the whole genome on the global level and have a better picture of …


Openws-Transaction: Enabling Reliable Web Service Transactions, Ivan Vasquez, John A. Miller, Kunal Verma, Amit P. Sheth Dec 2005

Kno.e.sis Publications

OpenWS-Transaction is an open source middleware that enables Web services to participate in a distributed transaction as prescribed by the WS-Coordination and WS-Transaction set of specifications. Central to the framework are the Coordinator and Participant entities, which can be integrated into existing services by introducing minimal changes to application code. OpenWS-Transaction allows transaction members to recover their original state in case of operational failure by leveraging techniques in logical logging and recovery at the application level. Depending on transaction style, system recovery may involve restoring key application variables and replaying uncommitted database activity. Transactions are assumed to be defined in …
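
A hedged sketch of the Coordinator/Participant pattern in miniature: a simplified two-phase commit in which the coordinator collects prepare votes, records a global decision, and then commits or rolls back everywhere. This is not OpenWS-Transaction's API, only the underlying protocol idea.

```python
# Hedged sketch: two-phase commit with a decision log a recovery step could replay.
class Participant:
    def __init__(self, name, can_commit=True):
        self.name, self.can_commit = name, can_commit
    def prepare(self):  return self.can_commit      # phase 1 vote
    def commit(self):   print(f"{self.name}: commit")
    def rollback(self): print(f"{self.name}: rollback")

class Coordinator:
    def __init__(self, participants):
        self.participants, self.log = participants, []
    def run(self):
        if all(p.prepare() for p in self.participants):   # phase 1: collect votes
            self.log.append("GLOBAL_COMMIT")              # durable decision record
            for p in self.participants: p.commit()        # phase 2: commit everywhere
        else:
            self.log.append("GLOBAL_ABORT")
            for p in self.participants: p.rollback()

Coordinator([Participant("orders"), Participant("billing", can_commit=False)]).run()
```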


The Carnot Heterogeneous Database Project: Implemented Applications, Munindar Singh, Phil Cannata, Michael N. Huhns, Nigel Jacobs, Tomasz Ksiezyk, Kayliang Ong, Amit P. Sheth, Christine Tomlinson, Darrell Woelk Apr 1997

Kno.e.sis Publications

The Carnot project was an ambitious research project in heterogeneous databases. It integrated a variety of techniques to address a wide range of problems in achieving interoperation in heterogeneous environments. Here we describe some of the major implemented applications of this project. These applications concern (a) accessing a legacy scientific database, (b) automating a workflow involving legacy systems, (c) cleaning data, and (d) retrieving semantically appropriate information from structured databases in response to text queries. These applications support scientific decision support, business process management, data integrity enhancement, and analytical decision support, respectively. They demonstrate Carnot's capabilities for (a) heterogeneous query processing, …


Specifying And Enforcing Intertask Dependencies, Paul Attie, Munindar Singh, Amit P. Sheth, Marek Rusinkiewicz Aug 1993

Kno.e.sis Publications

Extensions of the traditional atomic transaction model are needed to support the development of multi-system applications or workflows that access heterogeneous databases and legacy application systems. Most extended transaction models use conditions involving events or dependencies between transactions. Intertask dependencies can serve as a uniform framework for defining extended transaction models. In this paper we introduce event attributes needed to determine whether a dependency is enforceable and to properly schedule events in extended transaction models. Using these attributes and a formalization of a dependency into the temporal logic CTL, we can automatically synthesize an automaton that captures the computations that …
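
A minimal sketch of enforcing one intertask dependency of the kind formalised above, for example "commit(t2) may occur only if commit(t1) has occurred": events arriving out of order are deferred until their prerequisites have occurred. The encoding is illustrative; the paper synthesises such enforcers automatically from CTL specifications.

```python
# Hedged sketch: defer events whose intertask dependencies are not yet satisfied.
class DependencyEnforcer:
    def __init__(self, dependencies):
        self.dependencies = dependencies      # event -> set of prerequisite events
        self.occurred, self.pending = set(), []

    def submit(self, event):
        self.pending.append(event)
        self._flush()

    def _flush(self):
        progress = True
        while progress:                       # keep releasing events as prerequisites arrive
            progress = False
            for e in list(self.pending):
                if self.dependencies.get(e, set()) <= self.occurred:
                    self.pending.remove(e)
                    self.occurred.add(e)
                    print("executed:", e)
                    progress = True

enforcer = DependencyEnforcer({"commit(t2)": {"commit(t1)"}})
enforcer.submit("commit(t2)")   # deferred: prerequisite not yet satisfied
enforcer.submit("commit(t1)")   # releases the pending commit(t2)
```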


On Transactional Workflows, Amit P. Sheth, Marek Rusinkiewicz Jan 1993

Kno.e.sis Publications

The basic transaction model has evolved over time to incorporate more complex transaction structures and to take advantage of the semantics of higher-level operations that cannot be seen at the level of page reads and writes. Well known examples of such extended transaction models include nested and multi-level transactions. A number of relaxed transaction models have been defined in the last several years that permit a controlled relaxation of transaction isolation and atomicity to better match the requirements of various database applications. Correctness criteria other than global serializability have also been proposed. Several examples of extended/relaxed transaction models are …


Executing Multidatabase Transactions, Mansoor Ansari, Marek Rusinkiewicz, Linda Ness, Amit P. Sheth Jan 1992

Kno.e.sis Publications

In a multidatabase environment, the traditional transaction model has been found to be too restrictive. Therefore, several extended transaction models have been proposed in which some of the requirements of transactions, such as isolation or atomicity, are optional. The authors describe one such extension, the flexible transaction model, and discuss the scheduling of transactions involving multiple autonomous database systems managed by heterogeneous DBMSs.

The scheduling algorithm for flexible transactions is implemented using L.0, a logically parallel language which provides a framework for concisely specifying the multidatabase transactions and for scheduling them. The key aspects of a flexible transaction specification, …