Big Data Investment And Knowledge Integration In Academic Libraries,
2019
University of Jordan
Big Data Investment And Knowledge Integration In Academic Libraries, Saher Manaseer, Afnan R. Alawneh, Dua Asoudi
Copyright, Fair Use, Scholarly Communication, etc.
Recently, big data investment has become important for organizations, especially with the rapid growth of data following the huge expansion in the use of social media applications and websites. Many organizations depend on extracting the reports and statistics they need from such data. As investing in big data and storing it have become major challenges for organizations, many technologies and methods have been developed to tackle those challenges.
One such technology is Hadoop, a framework used to divide big data into packages and distribute those packages across nodes for processing, at lower cost than traditional storage …
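The split-and-process pattern the abstract describes can be illustrated with a minimal MapReduce-style word count in plain Python. This is a sketch of the programming model Hadoop popularized, not of Hadoop itself; the function names are illustrative.

```python
from collections import Counter

def split_into_packages(text, n):
    """Divide the input into roughly n equal chunks, a stand-in for HDFS blocks."""
    words = text.split()
    size = max(1, len(words) // n)
    return [words[i:i + size] for i in range(0, len(words), size)]

def map_phase(package):
    """Each node counts words in its own package independently."""
    return Counter(package)

def reduce_phase(partial_counts):
    """Merge the per-node partial counts into a single result."""
    total = Counter()
    for c in partial_counts:
        total += c
    return total

corpus = "big data big storage data data"
packages = split_into_packages(corpus, 3)
result = reduce_phase(map_phase(p) for p in packages)
print(result["data"], result["big"])  # 3 2
```

Each package can be counted on a different node with no coordination; only the small partial counts travel back for the reduce step, which is the cost advantage the abstract alludes to.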
Digital Libraries For Open Science: Using A Socio-Technical Interaction Network Approach,
2019
The Claremont Colleges Library
Digital Libraries For Open Science: Using A Socio-Technical Interaction Network Approach, Jennifer E. Beamer
Library Staff Publications and Research
This paper argues for using Socio-Technical Interaction Networks to build on extensively used Digital Library infrastructures in support of Open Science knowledge environments. A more socio-technical approach could lead to an evolutionary reconceptualization of Digital Libraries. Digital Libraries used as knowledge environments, built upon document repositories, will also emphasize the importance of user interaction and collaboration in carrying out those activities. That is to say, the primary goal of Digital Libraries is to help users convert information into knowledge; therefore, Digital Libraries examined in light of socio-technical interaction networks have the potential to shift Digital Libraries from …
A Dynamic Fault Tolerance Model For Microservices Architecture,
2019
South Dakota State University
A Dynamic Fault Tolerance Model For Microservices Architecture, Hajar Hameed Addeen
Electronic Theses and Dissertations
Microservices architecture is a popular distributed system style because each service in the architecture is independent. Each microservice is built to do a single thing, runs in its own process, and communicates with other services through a lightweight mechanism called an application programming interface (API). The services therefore need to interact over internal communication. A microservice is likely to become unreachable to its consumers at times because, in any distributed setup, communication will occasionally fail given the number of messages passing between services. Failures can occur when the networks are unreliable, and thus the connections can …
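Because inter-service calls fail intermittently, clients typically guard API calls with timeouts and retries. Below is a minimal sketch of a retry wrapper with exponential backoff; the failure model and names are illustrative, not taken from the thesis.

```python
import time

class ServiceUnreachable(Exception):
    """Raised when a simulated API call cannot reach its target service."""

def call_with_retry(call, retries=3, backoff=0.01):
    """Invoke an API call, retrying with exponential backoff on failure."""
    delay = backoff
    for attempt in range(retries):
        try:
            return call()
        except ServiceUnreachable:
            if attempt == retries - 1:
                raise  # budget exhausted: surface the failure to the caller
            time.sleep(delay)
            delay *= 2  # back off a little longer before each retry

# Simulated flaky endpoint: fails twice, then succeeds.
attempts = {"n": 0}
def flaky_endpoint():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ServiceUnreachable("connection reset")
    return {"status": 200}

print(call_with_retry(flaky_endpoint))  # {'status': 200}
```

Retries mask transient failures but not persistent ones; fault-tolerance models such as the one the thesis proposes typically combine retries with circuit breaking so a persistently failing service is not hammered with traffic.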
Cloud Resource Optimization For Processing Multiple Streams Of Visual Data,
2019
Purdue University
Cloud Resource Optimization For Processing Multiple Streams Of Visual Data, Zohar Kapach, Andrew Ulmer, Daniel Merrick, Arshad Alikhan, Yung-Hsiang Lu, Anup Mohan, Ahmed S. Kaseb, George K. Thiruvathukal
Computer Science: Faculty Publications and Other Works
Hundreds of millions of network cameras have been installed throughout the world. Each is capable of providing a vast amount of real-time data. Analyzing the massive data generated by these cameras requires significant computational resources and the demands may vary over time. Cloud computing shows the most promise to provide the needed resources on demand. In this article, we investigate how to allocate cloud resources when analyzing real-time data streams from network cameras. A resource manager considers many factors that affect its decisions, including the types of analysis, the number of data streams, and the locations of the cameras. The …
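The allocation decision the abstract describes can be sketched as a bin-packing heuristic: each stream has a CPU demand that depends on its analysis type, and streams are packed onto cloud instances of fixed capacity. This greedy first-fit-decreasing toy is illustrative only, not the paper's resource manager.

```python
def allocate_streams(streams, instance_capacity):
    """Pack per-stream CPU demands onto cloud instances (first-fit decreasing).

    streams: dict of stream name -> CPU demand.
    Returns a list of instances, each a list of assigned stream names.
    """
    instances, loads = [], []
    for name, demand in sorted(streams.items(), key=lambda kv: -kv[1]):
        for i, load in enumerate(loads):
            if load + demand <= instance_capacity:
                instances[i].append(name)  # fits on an existing instance
                loads[i] += demand
                break
        else:
            instances.append([name])       # otherwise rent a new instance
            loads.append(demand)
    return instances

# Hypothetical camera streams with differing analysis costs (CPU units):
cams = {"cam_ny": 3.0, "cam_la": 2.0, "cam_chi": 1.5, "cam_hou": 1.0}
print(allocate_streams(cams, instance_capacity=4.0))
# [['cam_ny', 'cam_hou'], ['cam_la', 'cam_chi']]
```

A real manager would also weigh camera location (data-transfer latency) and re-run the packing as demands change over time, as the article discusses.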
U.S. Census Explorer: A Gui And Visualization Tool For The U.S. Census Data Api,
2019
The University of Akron
U.S. Census Explorer: A Gui And Visualization Tool For The U.S. Census Data Api, Timothy Snyder
Williams Honors College, Honors Research Projects
U.S. Census Explorer is a software application designed to provide tools for intuitive exploration and analysis of United States census data by non-technical users. The application serves as an interface to the U.S. Census Bureau’s data API and enables a complete workflow from data acquisition to data visualization without the need for technical intervention from the user. The suite of tools provided includes a graphical user interface for dynamically querying U.S. census data, geographic visualizations, and the ability to download one’s work in common spreadsheet and image formats for inclusion in external works.
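The data-acquisition step such a tool wraps can be sketched as building a query URL for the Census Bureau's data API and reshaping the response (a list of rows whose first row is the header) into records. The endpoint shape follows the public API's documented pattern, but the dataset path and variable codes below are illustrative examples.

```python
from urllib.parse import urlencode

def build_census_query(dataset, variables, geography):
    """Construct a U.S. Census Bureau data API query URL (illustrative)."""
    base = f"https://api.census.gov/data/{dataset}"
    params = urlencode({"get": ",".join(variables), "for": geography})
    return f"{base}?{params}"

def rows_to_records(payload):
    """The API returns a list of lists with a header row first;
    convert it to a list of dicts for spreadsheet-style use."""
    header, *rows = payload
    return [dict(zip(header, row)) for row in rows]

url = build_census_query("2019/acs/acs5", ["NAME", "B01001_001E"], "state:*")

# A sample payload in the API's documented shape (not live data):
sample = [["NAME", "B01001_001E", "state"],
          ["Ohio", "11689100", "39"]]
print(rows_to_records(sample)[0]["NAME"])  # Ohio
```

The GUI layer's contribution is hiding exactly this kind of URL construction and reshaping behind point-and-click controls.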
A Method Of Evaluation Of High-Performance Computing Batch Schedulers,
2019
University of North Florida
A Method Of Evaluation Of High-Performance Computing Batch Schedulers, Jeremy Stephen Futral
UNF Graduate Theses and Dissertations
According to Sterling et al., a batch scheduler, also called workload management, is an application or set of services that provides a method to monitor and manage the flow of work through the system [Sterling01]. The purpose of this research was to develop a method to assess the execution speed of workloads that are submitted to a batch scheduler. While previous research exists, this research is different in that more complex jobs were devised that fully exercised the scheduler with established benchmarks. This research is important because the reduction of latency, even if it is minuscule, can lead to massive …
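Assessing execution speed through a scheduler amounts to measuring queue wait plus run time for each submitted job. A toy single-node FIFO simulation makes the metric concrete (all jobs submitted at t=0; this is an illustrative model, not the evaluation method of the thesis):

```python
def fifo_schedule(jobs):
    """Simulate a FIFO batch queue on one node and report per-job latency.

    jobs: list of (name, runtime) tuples in submission order.
    Latency = queue wait + runtime, measured from submission at t=0.
    """
    clock = 0.0
    latencies = {}
    for name, runtime in jobs:
        clock += runtime          # the job runs after everything queued before it
        latencies[name] = clock   # completion time doubles as latency here
    return latencies

jobs = [("benchmark_a", 2.0), ("benchmark_b", 1.0), ("benchmark_c", 3.0)]
print(fifo_schedule(jobs))
# {'benchmark_a': 2.0, 'benchmark_b': 3.0, 'benchmark_c': 6.0}
```

Real schedulers reorder jobs (backfilling, priorities, fair share), which is precisely why measured latency under realistic benchmark mixes, rather than queue position, is the meaningful comparison.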
Computational Modeling Of Trust Factors Using Reinforcement Learning,
2019
Old Dominion University
Computational Modeling Of Trust Factors Using Reinforcement Learning, C. M. Kuzio, A. Dinh, C. Stone, L. Vidyaratne, K. M. Iftekharuddin
Electrical & Computer Engineering Faculty Publications
As machine-learning algorithms continue to expand their scope and approach more ambiguous goals, they may be required to make decisions based on data that is often incomplete, imprecise, and uncertain. The capabilities of these models must, in turn, evolve to meet the increasingly complex challenges associated with the deployment and integration of intelligent systems into modern society. Historical variability in the performance of traditional machine-learning models in dynamic environments leads to ambiguity of trust in decisions made by such algorithms. Consequently, the objective of this work is to develop a novel computational model that effectively quantifies the reliability of autonomous …
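Quantifying reliability from observed outcomes can be sketched as an incremental estimate nudged toward each new observation, in the spirit of a temporal-difference value update from reinforcement learning. The update rule and learning rate below are an illustrative toy, not the paper's model.

```python
def update_trust(trust, outcome, alpha=0.2):
    """Move the trust estimate toward the observed outcome
    (1 = correct decision, 0 = incorrect); alpha is an illustrative rate."""
    return trust + alpha * (outcome - trust)

trust = 0.5  # neutral prior reliability
for outcome in [1, 1, 0, 1, 1, 1]:
    trust = update_trust(trust, outcome)
print(round(trust, 3))  # 0.767
```

The estimate rises with successes and drops after the failure, so recent behavior dominates, one simple way to keep trust responsive in the dynamic environments the abstract describes.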
Russia Today, Cyberterrorists Tomorrow: U.S. Failure To Prepare Democracy For Cyberspace,
2018
Norwich University
Russia Today, Cyberterrorists Tomorrow: U.S. Failure To Prepare Democracy For Cyberspace, Jonathan F. Lancelot
Journal of Digital Forensics, Security and Law
This paper is designed to expose vulnerabilities within the US electoral system, the use of cyberspace to exploit weaknesses within the information assurance strategies of the democratic and republican party organizations, and deficiencies within the social media communications and voting machine exploits. A brief history of discriminatory practices in voting rights and voting access will be set as the foundation for the argument that the system is vulnerable in the cyber age, and the need for reform at the local, state and national levels will be emphasized. The possibility of a foreign nation-state influencing the outcome of an election by …
Paul Baran, Network Theory, And The Past, Present, And Future Of Internet,
2018
University of Pennsylvania Carey Law School
Paul Baran, Network Theory, And The Past, Present, And Future Of Internet, Christopher S. Yoo
All Faculty Scholarship
Paul Baran’s seminal 1964 article “On Distributed Communications Networks,” which first proposed packet switching, also advanced an underappreciated vision of network architecture: a lattice-like, distributed network, in which each node of the Internet would be homogeneous and equal in status to all other nodes. Scholars who have subsequently embraced the concept of a lattice-like network approach have largely overlooked the extent to which it is inconsistent both with network theory (associated with the work of Duncan Watts and Albert-László Barabási), which emphasizes the importance of short cuts and hubs in enabling networks to scale, and with the actual way the Internet …
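The network-theory point, that a few shortcuts sharply reduce path lengths compared with a pure lattice, can be checked on a toy ring lattice with breadth-first search. The graph sizes and shortcut choices below are illustrative.

```python
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all node pairs, via BFS from each node."""
    n = len(adj)
    total, pairs = 0, 0
    for src in range(n):
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += n - 1
    return total / pairs

def ring(n):
    """Baran-style lattice ring: each node linked only to its two neighbors."""
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

n = 20
lattice = ring(n)
shortcut = ring(n)
for a, b in [(0, 10), (5, 15)]:  # add two chords across the ring
    shortcut[a].append(b)
    shortcut[b].append(a)
print(avg_path_length(lattice) > avg_path_length(shortcut))  # True
```

Two added edges already shorten average paths noticeably, the small-world effect Watts documented and one reason real networks grow hubs rather than staying lattice-like.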
Open Source Foundations For Spatial Decision Support Systems,
2018
CUNY Hunter College
Open Source Foundations For Spatial Decision Support Systems, Jochen Albrecht
Publications and Research
Spatial Decision Support Systems (SDSS) were a hot topic in the 1990s, when researchers tried to imbue GIS with additional decision support features. Successful practical developments such as HAZUS or CommunityViz have since been built, based on commercial desktop software and without much heed for theory other than what underlies their process models. Others, like UrbanSim, have been completely overhauled twice but without much external scrutiny. Both the practical and the theoretical foundations of decision support systems have developed considerably over the past 20 years. This article presents an overview of these developments and then looks at what corresponding tools …
Adaptive Parallelism For Coupled, Multithreaded Message-Passing Programs,
2018
University of New Mexico
Adaptive Parallelism For Coupled, Multithreaded Message-Passing Programs, Samuel K. Gutiérrez
Computer Science ETDs
Hybrid parallel programming models that combine message passing (MP) and shared-memory multithreading (MT) are becoming more popular, especially with applications requiring higher degrees of parallelism and scalability. Consequently, coupled parallel programs, those built via the integration of independently developed and optimized software libraries linked into a single application, increasingly comprise message-passing libraries with differing preferred degrees of threading, resulting in thread-level heterogeneity. Retroactively matching threading levels between independently developed and maintained libraries is difficult, and the challenge is exacerbated because contemporary middleware services provide only static scheduling policies over entire program executions, necessitating suboptimal, over-subscribed or under-subscribed, configurations. In …
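The over- or under-subscription problem is simple arithmetic: the threads the coupled libraries collectively prefer rarely match the cores available. A toy check (the library names and counts are hypothetical; this only illustrates the mismatch, not the dissertation's adaptive mechanism):

```python
def subscription(cores, libraries):
    """Compare total requested threads against available cores.

    libraries: dict of library name -> preferred thread count when active
    together under a static policy. Returns (status, total threads).
    """
    total = sum(libraries.values())
    if total > cores:
        return "over-subscribed", total   # threads contend for cores
    if total < cores:
        return "under-subscribed", total  # cores sit idle
    return "matched", total

# Two coupled libraries with differing preferred threading levels on a 16-core node:
print(subscription(16, {"mp_library": 4, "mt_solver": 16}))
# ('over-subscribed', 20)
```

A static policy must pick one configuration for the whole run, so whichever library is inactive at a given phase wastes its share, which is the motivation for adapting parallelism dynamically.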
Automatic Performance Optimization On Heterogeneous Computer Systems Using Manycore Coprocessors,
2018
University of Arkansas, Fayetteville
Automatic Performance Optimization On Heterogeneous Computer Systems Using Manycore Coprocessors, Chenggang Lai
Graduate Theses and Dissertations
Emerging computer architectures and advanced computing technologies, such as Intel’s Many Integrated Core (MIC) Architecture and graphics processing units (GPU), provide a promising solution to employ parallelism for achieving high performance, scalability and low power consumption. As a result, accelerators have become a crucial part in developing supercomputers. Accelerators are usually equipped with different types of cores and memory, which compels application developers to meet challenging performance goals. The added complexity has led to the development of task-based runtime systems, which allow complex computations to be expressed as task graphs, and rely on scheduling algorithms to perform load balancing between …
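The task-graph idea can be sketched with a greedy list scheduler: tasks whose prerequisites are done are placed, largest first, on the least-loaded worker. This toy ignores communication costs and precedence delays and is only an illustration of task-graph load balancing, not a runtime's actual algorithm.

```python
def list_schedule(tasks, deps, costs, workers):
    """Greedy list scheduling of a task graph onto identical workers.

    tasks: task names; deps: task -> set of prerequisite tasks;
    costs: task -> runtime. Returns (per-worker load, task -> worker map).
    """
    done = set()
    load = [0.0] * workers
    placed = {}
    while len(done) < len(tasks):
        ready = [t for t in tasks if t not in done and deps.get(t, set()) <= done]
        for t in sorted(ready, key=lambda t: -costs[t]):  # longest task first
            w = load.index(min(load))                     # least-loaded worker
            load[w] += costs[t]
            placed[t] = w
            done.add(t)
    return load, placed

tasks = ["a", "b", "c", "d"]
deps = {"c": {"a", "b"}, "d": {"a"}}
costs = {"a": 2, "b": 1, "c": 3, "d": 1}
load, placed = list_schedule(tasks, deps, costs, workers=2)
print(load)  # [3.0, 4.0]
```

On heterogeneous nodes the workers are not identical, so real runtimes also weigh which core type (CPU, GPU, MIC) suits each task, which is where scheduling becomes genuinely hard.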
Empathetic Computing For Inclusive Application Design,
2018
Singapore Management University
Empathetic Computing For Inclusive Application Design, Kenny Choo Tsu Wei
Dissertations and Theses Collection (Open Access)
The explosive growth of the ecosystem of personal and ambient computing devices coupled with the proliferation of high-speed connectivity has enabled extremely powerful and varied mobile computing applications that are used everywhere. While such applications have tremendous potential to improve the lives of impaired users, most mobile applications have designs too impoverished to be inclusive, lacking support for users with specific disabilities. Mobile app designers today have inadequate support to design existing classes of apps to support users with specific disabilities, and more so, lack the support to design apps that specifically target these users. One way to resolve …
Enhancing Value-Based Healthcare With Reconstructability Analysis: Predicting Cost Of Care In Total Hip Replacement,
2018
Portland State University
Enhancing Value-Based Healthcare With Reconstructability Analysis: Predicting Cost Of Care In Total Hip Replacement, Cecily Corrine Froemke, Martin Zwick
Systems Science Faculty Publications and Presentations
Legislative reforms aimed at slowing growth of US healthcare costs are focused on achieving greater value per dollar. To increase value, healthcare providers must not only provide high-quality care, but deliver this care at a sustainable cost. Predicting risks that may lead to poor outcomes and higher costs enables providers to augment decision making for optimizing patient care and informs the risk stratification necessary in emerging reimbursement models. Healthcare delivery systems are looking at their high-volume service lines and identifying variation in cost and outcomes in order to determine the patient factors that are driving this variation and …
Criticality Assessments For Improving Algorithmic Robustness,
2018
University of New Mexico
Criticality Assessments For Improving Algorithmic Robustness, Thomas B. Jones
Computer Science ETDs
Though computational models typically assume all program steps execute flawlessly, that does not imply all steps are equally important if a failure should occur. In the "Constrained Reliability Allocation" problem, sufficient resources are guaranteed for operations that prompt eventual program termination on failure, but those operations that only cause output errors are given a limited budget of some vital resource, insufficient to ensure correct operation for each of them.
In this dissertation, I present a novel representation of failures based on a combination of their timing and location combined with criticality assessments---a method used to predict the behavior of systems …
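The Constrained Reliability Allocation problem has a natural greedy sketch: spend the limited budget on operations in order of their criticality until it runs out. This toy model is illustrative only; the dissertation's criticality assessments are far richer than a single score.

```python
def allocate_budget(operations, budget):
    """Protect operations in order of criticality until the budget is spent.

    operations: list of (name, criticality, cost) tuples, where cost is the
    amount of the vital resource needed to ensure correct operation.
    Returns the set of operations that receive full protection.
    """
    protected = set()
    for name, criticality, cost in sorted(operations, key=lambda op: -op[1]):
        if cost <= budget:
            budget -= cost
            protected.add(name)  # this operation is guaranteed correct
    return protected             # the rest run with a risk of output error

# Hypothetical operations: (name, criticality score, protection cost)
ops = [("checksum", 0.9, 3), ("log_write", 0.2, 1), ("index_update", 0.7, 2)]
print(sorted(allocate_budget(ops, budget=4)))  # ['checksum', 'log_write']
```

Note the greedy choice skips `index_update` even though it outranks `log_write`, because its cost no longer fits, one reason criticality-guided allocation is an optimization problem rather than a simple sort.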
A Method And Tool For Finding Concurrency Bugs Involving Multiple Variables With Application To Modern Distributed Systems,
2018
Florida International University
A Method And Tool For Finding Concurrency Bugs Involving Multiple Variables With Application To Modern Distributed Systems, Zhuo Sun
FIU Electronic Theses and Dissertations
Concurrency bugs are extremely hard to detect due to the huge interleaving space. They occur increasingly often in the real world because of the prevalence of multi-threaded programs taking advantage of multi-core hardware, and of microservice-based distributed systems moving more and more applications to the cloud. As the most common non-deadlock concurrency bugs, atomicity violations have been studied in many recent works; however, those methods are applicable only to single-variable atomicity violations, and do not consider the specific challenge of distributed systems that have both pessimistic and optimistic concurrency control. This dissertation presents a tool using model checking to predict atomicity violation …
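A multi-variable atomicity violation arises when an invariant spans two variables and another thread observes them between the two non-atomic updates. The deterministic simulation below replays one bad interleaving using a generator as a stand-in for thread preemption points; the variable names and invariant are illustrative.

```python
# Invariant that readers rely on: lo <= hi must always hold.
state = {"lo": 0, "hi": 10}

def writer_steps(new_lo, new_hi):
    """Update the pair non-atomically; each yield marks a point where
    another thread may be scheduled (simulated preemption)."""
    state["lo"] = new_lo
    yield
    state["hi"] = new_hi
    yield

def reader_check():
    return state["lo"] <= state["hi"]

w = writer_steps(20, 30)        # writer intends to move the window to [20, 30]
next(w)                         # writer has set lo=20 but not yet hi
violated = not reader_check()   # reader observes lo=20, hi=10: invariant broken
next(w)                         # writer finishes; invariant holds again
print(violated, reader_check())  # True True
```

Single-variable detectors miss this bug because each individual write is fine; only the relationship between `lo` and `hi` is violated, which is the gap the dissertation targets.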
Hybrid Black-Box Solar Analytics And Their Privacy Implications,
2018
University of Massachusetts Amherst
Hybrid Black-Box Solar Analytics And Their Privacy Implications, Dong Chen
Doctoral Dissertations
The aggregate solar capacity in the U.S. is rising rapidly due to continuing decreases in the cost of solar modules. For example, the installed cost per Watt (W) for residential photovoltaics (PVs) decreased by 6X from 2009 to 2018 (from $8/W to $1.2/W), resulting in the installed aggregate solar capacity increasing 128X from 2009 to 2018 (from 435 megawatts to 55.9 gigawatts). This increasing solar capacity is imposing operational challenges on utilities in balancing electricity's real-time supply and demand, as solar generation is more stochastic and less predictable than aggregate demand.
To address this problem, both academia and utilities have …
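The quoted multiples follow directly from the stated figures; a quick arithmetic check:

```python
cost_2009, cost_2018 = 8.0, 1.2        # installed cost, $/W
cap_2009, cap_2018 = 435.0, 55_900.0   # aggregate capacity, MW (55.9 GW)

print(round(cost_2009 / cost_2018, 1))  # 6.7  (~ the quoted 6X decrease)
print(round(cap_2018 / cap_2009))       # 129  (~ the quoted 128X increase)
```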
Cross-Referencing Social Media And Public Surveillance Camera Data For Disaster Response,
2018
Stanford University
Cross-Referencing Social Media And Public Surveillance Camera Data For Disaster Response, Chittayong Surakitbanharn, Calvin Yau, Guizhen Wang, Aniesh Chawla, Yinuo Pan, Zhaoya Sun, Sam Yellin, David Ebert, Yung-Hsiang Lu, George K. Thiruvathukal
George K. Thiruvathukal
Physical media (like surveillance cameras) and social media (like Instagram and Twitter) may both be useful in attaining on-the-ground information during an emergency or disaster situation. However, the intersection and reliability of both surveillance cameras and social media during a natural disaster are not fully understood. To address this gap, we tested whether social media is of utility when physical surveillance cameras went off-line during Hurricane Irma in 2017. Specifically, we collected and compared geo-tagged Instagram and Twitter posts in the state of Florida during times and in areas where public surveillance cameras went off-line. We report social media content …
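The cross-referencing step amounts to filtering geo-tagged posts by an outage's time window and geographic area. A toy sketch with illustrative data (the matching logic and field names are hypothetical, not the paper's pipeline):

```python
def posts_during_outage(posts, outage):
    """Select geo-tagged posts inside an outage's time window and bounding box.

    posts: list of dicts with 't' (hour of day), 'lat', 'lon'.
    outage: dict with 't_start', 't_end', 'lat_range', 'lon_range'.
    """
    (lat0, lat1), (lon0, lon1) = outage["lat_range"], outage["lon_range"]
    return [p for p in posts
            if outage["t_start"] <= p["t"] <= outage["t_end"]
            and lat0 <= p["lat"] <= lat1
            and lon0 <= p["lon"] <= lon1]

# Hypothetical camera outage in South Florida during the storm:
outage = {"t_start": 14, "t_end": 20,
          "lat_range": (25.0, 26.5), "lon_range": (-81.5, -80.0)}
posts = [{"id": 1, "t": 15, "lat": 25.7, "lon": -80.2},  # in window and area
         {"id": 2, "t": 9,  "lat": 25.7, "lon": -80.2},  # too early
         {"id": 3, "t": 16, "lat": 28.0, "lon": -80.2}]  # outside area
print([p["id"] for p in posts_during_outage(posts, outage)])  # [1]
```

Posts that survive this filter are candidate substitutes for the offline camera's view; the paper's contribution is assessing how much ground-truth information such posts actually carry.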
A New Framework For Securing, Extracting And Analyzing Big Forensic Data,
2018
Georgia Southern University
A New Framework For Securing, Extracting And Analyzing Big Forensic Data, Hitesh Sachdev, Hayden Wimmer, Lei Chen, Carl Rebman
Journal of Digital Forensics, Security and Law
Finding new methods to investigate criminal activities, behaviors, and responsibilities has always been a challenge for forensic research. Advances in big data and technology, and the increased capabilities of smartphones, have contributed to the demand for modern techniques of examination. Smartphones are ubiquitous, transformative, and have become a goldmine for forensics research. Given the right tools and research methods, investigating agencies can help crack almost any illegal activity using smartphones. This paper focuses on conducting forensic analysis to expose a terrorist or criminal network and introduces a new Big Forensic Data Framework model where different technologies of Hadoop and EnCase software are …