Accurate Joint Detection from Depth Videos towards Pose Analysis (open access)

Joint detection is vital for characterizing human pose and serves as a foundation for a wide range of computer vision applications such as physical training, health care, and entertainment. This dissertation proposes two methods for detecting joints in the human body for pose analysis. The first method detects joints by combining a body model with automatic feature point detection. The body model maps the detected extreme points to the corresponding body parts and locates the positions of implicit joints. Once the implicit joints and extreme points are located, the dominant joints are detected by a shortest-path-based method. The main contribution of this work is a hybrid framework for detecting joints on the human body that is robust to different body shapes and proportions, pose variations, and occlusions. Another contribution is the idea of using geodesic features of the human body to build a model that guides human pose detection and estimation. The second method first segments the human body into parts and then detects joints by focusing the detection algorithm on each limb. The advantage of applying body part segmentation first is that the body segmentation method narrows …
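
As a rough illustration of the shortest-path idea in this abstract, the sketch below computes geodesic distances over a neighbor graph built from body surface points and treats the geodesically farthest point as an extreme-point candidate. The graph construction, the random point cloud, and the neighbor count are illustrative assumptions, not details taken from the dissertation.

```python
# Minimal sketch (assumed data, not the dissertation's implementation): geodesic
# extreme-point detection on a k-nearest-neighbor graph of 3D body surface points.
import heapq
import numpy as np

def build_knn_graph(points, k=8):
    """Connect each point to its k nearest neighbors; edge weight = Euclidean distance."""
    graph = {i: [] for i in range(len(points))}
    for i, p in enumerate(points):
        dists = np.linalg.norm(points - p, axis=1)
        for j in np.argsort(dists)[1:k + 1]:   # skip index 0, which is the point itself
            graph[i].append((int(j), float(dists[j])))
    return graph

def geodesic_distances(graph, source):
    """Dijkstra shortest paths from the source vertex (e.g., the body centroid)."""
    dist = {v: float("inf") for v in graph}
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

# Usage: the geodesically farthest points from the torso center are candidate
# extreme points (head, hands, feet) that a body model can map to limbs.
points = np.random.rand(200, 3)          # placeholder for a segmented body point cloud
graph = build_knn_graph(points)
dist = geodesic_distances(graph, source=0)
extreme = max(dist, key=dist.get)
```
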
Date: May 2018
Creator: Kong, Longbo
System: The UNT Digital Library
Application-Specific Things Architectures for IoT-Based Smart Healthcare Solutions (open access)

The human body is a complex system organized at different levels, such as cells, tissues, and organs, which together form 11 major organ systems. The functional efficiency of this complex system is evaluated as health. Traditional healthcare is unable to accommodate everyone's needs due to the ever-increasing population and medical costs. With advancements in technology and medical research, traditional healthcare applications are evolving into smart healthcare solutions. Smart healthcare helps in continuously monitoring our body parameters, which helps in keeping people health-aware. It provides the ability for remote assistance, which helps in utilizing the available resources to their maximum potential. The backbone of smart healthcare solutions is the Internet of Things (IoT), which increases the computing capacity of real-world components by using cloud-based solutions. The basic elements of these IoT-based smart healthcare solutions are called "things." Things are simple sensors or actuators that have the capacity to connect wirelessly with each other and to the internet. The research for this dissertation aims at developing architectures for these things, focusing on IoT-based smart healthcare solutions. The core of this dissertation is to contribute to research in smart healthcare by identifying applications that can be monitored remotely. For this, application-specific thing architectures …
Date: May 2018
Creator: Sundaravadivel, Prabha
System: The UNT Digital Library
Computational Approaches for Analyzing Social Support in Online Health Communities (open access)

Online health communities (OHCs) have become a medium for patients to share their personal experiences and interact with peers on topics related to a disease, medication, side effects, and therapeutic processes. Many studies show that using OHCs regularly decreases mortality and improves patients' mental health. As a result of these benefits, OHCs are a popular resource for patients, especially those with severe diseases, to consult and to receive emotional and informational support. The main reasons for developing OHCs are to present valid, high-quality information and to understand the mechanism by which social support changes patients' mental health. Despite the goals of OHC moderators in developing OHC applications and of patients in using OHCs, there is no facility, feature, or sub-application in OHCs that satisfies both patient and moderator goals. OHCs are equipped only with a basic keyword-based search engine. In other words, if a patient wants to obtain information about a side effect, he or she needs to browse many threads in the hope of finding several related comments. Likewise, OHC moderators cannot browse all of the information exchanged among patients to validate its accuracy. Thus, it is critical …
Date: May 2018
Creator: Khan Pour, Hamed
System: The UNT Digital Library
Computational Methods to Optimize High-Consequence Variants of the Vehicle Routing Problem for Relief Networks in Humanitarian Logistics (open access)

Optimization of relief networks in humanitarian logistics often exemplifies the need for solutions that are feasible given a hard constraint on time. For instance, the distribution of medical countermeasures immediately following a biological disaster event must be completed within a short time frame. When these supplies are not distributed within the maximum time allowed, the severity of the disaster is quickly exacerbated. Therefore, emergency response plans that fail to facilitate the transportation of these supplies in the time allowed are simply not acceptable. As a result, any optimization solution that fails to satisfy this criterion would be deemed infeasible. This creates a conflict with the priority optimization objective in most variants of the generic vehicle routing problem (VRP). Instead of efficiently maximizing usage of the vehicle resources available to construct a feasible solution, these variants ordinarily prioritize the construction of a minimum-cost set of vehicle routes. Research presented in this dissertation focuses on the design and analysis of efficient computational methods for optimizing high-consequence variants of the VRP for relief networks. The conflict between prioritizing the minimization of the number of vehicles required and the minimization of total travel time is demonstrated. The optimization of the time and capacity constraints in …
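
To make the hard time constraint concrete, the following sketch checks whether a candidate set of vehicle routes serves every site within a maximum completion time. The travel-time matrix, service times, and time limit are hypothetical placeholders, not data or methods from the dissertation.

```python
# Minimal sketch (assumed data, not the dissertation's model): feasibility of a
# relief-network routing solution under a hard completion-time constraint.

def route_time(route, travel, service, depot=0):
    """Total time for one vehicle: depot -> sites -> depot, plus service time per site."""
    t, prev = 0.0, depot
    for site in route:
        t += travel[prev][site] + service[site]
        prev = site
    return t + travel[prev][depot]

def plan_is_feasible(routes, travel, service, max_time):
    """A plan is feasible only if every vehicle finishes within the allowed time window."""
    return all(route_time(r, travel, service) <= max_time for r in routes)

# Example with a toy 4-node network (node 0 is the dispensing hub).
travel = [[0, 2, 4, 3],
          [2, 0, 3, 5],
          [4, 3, 0, 2],
          [3, 5, 2, 0]]
service = {1: 1.0, 2: 1.5, 3: 1.0}
routes = [[1, 2], [3]]            # two vehicles
print(plan_is_feasible(routes, travel, service, max_time=12.0))   # True
```
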
Date: August 2018
Creator: Urbanovsky, Joshua C.
System: The UNT Digital Library
A Control Theoretic Approach for Resilient Network Services (open access)

Resilient networks have the ability to provide the desired level of service despite challenges such as malicious attacks and misconfigurations. The primary goal of this dissertation is to provide uninterrupted network services in the face of an attack or other failures. This dissertation applies control system theory techniques, with a focus on system identification and closed-loop feedback control. It explores the benefits of system identification techniques in designing and validating models for complex and dynamic networks. Further, this dissertation focuses on designing robust feedback control mechanisms that are both scalable and effective in real time. It employs dynamic and predictive control approaches to reduce the impact of an attack on network services. The closed-loop feedback control mechanisms tackle this issue by degrading the network services gracefully to an acceptable level and then stabilizing the network in real time (less than 50 seconds). Employing these feedback mechanisms also provides the ability to automatically configure settings such that the QoS metrics of the network are consistent with those specified in the service level agreements.
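
As a generic illustration of the closed-loop feedback idea described above (not the dissertation's controller), the sketch below runs a discrete proportional-integral loop that throttles an admission rate so a measured QoS metric tracks an SLA target. The gains, the 20 ms setpoint, and the latency samples are assumptions.

```python
# Minimal sketch (illustrative, not the dissertation's design): a discrete PI loop
# that adjusts an admission rate so measured latency tracks the SLA target.

class PIController:
    def __init__(self, kp, ki, setpoint):
        self.kp, self.ki = kp, ki          # gains (in practice tuned via system identification)
        self.setpoint = setpoint           # target latency from the service level agreement
        self.integral = 0.0

    def update(self, measured, dt):
        error = self.setpoint - measured   # negative error => latency too high, reduce rate
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral

# Usage: each control interval, measure latency and nudge the admitted request rate.
controller = PIController(kp=0.5, ki=0.1, setpoint=20.0)   # 20 ms target (assumed)
rate = 100.0                                               # requests/s currently admitted
for latency_ms in [35.0, 30.0, 24.0, 21.0, 19.5]:          # hypothetical measurements
    rate = max(0.0, rate + controller.update(latency_ms, dt=1.0))
```
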
Date: December 2018
Creator: Vempati, Jagannadh Ambareesh
System: The UNT Digital Library
Dataflow Processing in Memory Achieves Significant Energy Efficiency (open access)

The large difference between processor CPU cycle time and memory access time, often referred to as the memory wall, severely limits the performance of streaming applications. Some data centers have reported servers sitting idle for three out of every four clock cycles. High-performance instruction-sequenced systems are not energy efficient. The execute stage of even a simple pipelined processor uses only 9% of the pipeline's total energy. A hybrid dataflow system within a memory module is shown to deliver 7.2 times the performance with 368 times better energy efficiency than an Intel Xeon server processor on the analyzed benchmarks. The dataflow implementation exploits the inherent parallelism and pipelining of the application to improve performance without the overhead functions of caching, instruction fetch, instruction decode, instruction scheduling, reorder buffers, and speculative execution used by high-performance out-of-order processors. Coarse-grain reconfigurable logic in an energy-efficient silicon process provides the flexibility to implement multiple algorithms in a low-energy solution. Integrating the logic within a 3D-stacked memory module provides lower-latency and higher-bandwidth access to memory while operating independently of the host system processor.
Date: August 2018
Creator: Shelor, Charles F.
System: The UNT Digital Library
Detecting Component Failures and Critical Components in Safety Critical Embedded Systems using Fault Tree Analysis (open access)

Component failures can result in catastrophic behaviors in safety-critical embedded systems, sometimes resulting in loss of life. Component failures can be treated as off-nominal behaviors (ONBs) with respect to the components and subsystems involved in an embedded system. Considerable research is being carried out to tackle the problem of ONBs. These approaches mainly focus on system states (i.e., the desired and undesired states of a system at a given point in time) to detect ONBs. In this work, an approach is discussed to detect component failures and critical components of an embedded system. The approach is based on fault tree analysis (FTA), applied to the requirements specification of embedded systems at design time to find the relationship between individual component failures and overall system failure. FTA helps in determining both qualitative and quantitative relationships between component failures and system failure. Analyzing the system at design time helps in detecting component failures and critical components, and helps in devising strategies to mitigate component failures at design time and improve the overall safety and reliability of a system.
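
For the quantitative side of FTA mentioned above, the sketch below evaluates a small fault tree of AND/OR gates under the usual assumption of independent basic events. The example tree and probabilities are invented for illustration and do not come from the thesis.

```python
# Minimal sketch (illustrative tree, not from the thesis): quantitative fault-tree
# evaluation assuming independent basic events.
from functools import reduce

def evaluate(node, basic_probs):
    """Return the failure probability of a fault-tree node.

    A node is either a basic-event name (str) or a tuple ('AND' | 'OR', [children]).
    """
    if isinstance(node, str):
        return basic_probs[node]
    gate, children = node
    probs = [evaluate(c, basic_probs) for c in children]
    if gate == "AND":                     # all children must fail
        return reduce(lambda a, b: a * b, probs, 1.0)
    if gate == "OR":                      # at least one child fails
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)
    raise ValueError(f"unknown gate {gate}")

# Example: the system fails if the sensor fails, or both power supplies fail.
tree = ("OR", ["sensor", ("AND", ["power_primary", "power_backup"])])
basic_probs = {"sensor": 0.01, "power_primary": 0.05, "power_backup": 0.02}
print(evaluate(tree, basic_probs))        # ~0.011; the sensor dominates, marking it critical
```
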
Date: May 2018
Creator: Bhandaram, Abhinav
System: The UNT Digital Library
Detection of Generalizable Clone Security Coding Bugs Using Graphs and Learning Algorithms (open access)

This research methodology isolates coding properties and identifies the probability of security vulnerabilities using machine learning and historical data. Several approaches characterize the effectiveness of detecting security-related bugs that manifest as vulnerabilities, but none utilize vulnerability patch information. The main contribution of this research is a framework that analyzes LLVM intermediate representation code and merges core source code representations using source code properties. This research is beneficial because it allows source programs to be transformed into a graphical form from which users can extract specific code properties related to vulnerable functions. The result is an improved approach to detect, identify, and track software system vulnerabilities based on a performance evaluation. The methodology uses historical function-level vulnerability information, unique feature extraction techniques, a novel code property graph, and learning algorithms to minimize the amount of end-user domain knowledge necessary to detect vulnerabilities in applications. The analysis shows approximately 99% precision and recall in detecting known vulnerabilities in the National Institute of Standards and Technology (NIST) Software Assurance Metrics and Tool Evaluation (SAMATE) project. Furthermore, 72% of the historical vulnerabilities in the OpenSSL testing environment were detected using a linear support vector classifier (SVC) model.
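
The classification step mentioned at the end of this abstract can be sketched generically as training a linear SVC on per-function feature vectors labeled by historical patch information. The code property graph features themselves are not reproduced here; the random arrays below are placeholders for them.

```python
# Minimal sketch (placeholder features, not the thesis pipeline): training a linear
# support vector classifier on per-function feature vectors.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

# X: one row per function, columns = extracted code/graph properties (placeholders).
# y: 1 if the function is associated with a historical vulnerability patch, else 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 32))
y = rng.integers(0, 2, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LinearSVC(C=1.0, max_iter=5000).fit(X_train, y_train)

pred = clf.predict(X_test)
print("precision:", precision_score(y_test, pred))
print("recall:   ", recall_score(y_test, pred))
```
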
Date: December 2018
Creator: Mayo, Quentin R
System: The UNT Digital Library
Hybrid Approaches in Test Suite Prioritization (open access)

The rapid advancement of web and mobile application technologies has recently posed numerous challenges to the software engineering community, including how to cost-effectively test applications that have complex event spaces. Many software testing techniques attempt to cost-effectively improve the quality of such software. This dissertation primarily focuses on hybrid test suite prioritization. These techniques utilize two or more criteria to perform test suite prioritization, as it is often insufficient to use only a single criterion. The dissertation consists of the following contributions: (1) a weighted test suite prioritization technique that employs the distance between criteria as a weighting factor, (2) a coarse-to-fine-grained test suite prioritization technique that uses a multilevel approach to increase the granularity of the criteria at each subsequent iteration, (3) the Caret-HM tool for Android user-session-based testing, which allows testers to record, replay, and create heat maps from user interactions with Android applications via a web browser, and (4) Android user-session-based test suite prioritization techniques that utilize heuristics developed from user sessions created by Caret-HM. Each chapter empirically evaluates the respective techniques. The proposed techniques generally show improved or equally good performance when compared to the baselines, depending on an …
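
To illustrate the general idea of combining multiple prioritization criteria, the sketch below ranks tests by a weighted linear combination of two normalized scores. This is a generic weighting, not necessarily the distance-based weighting described in the dissertation, and the per-test scores are made up.

```python
# Minimal sketch (generic weighting, hypothetical scores): ordering a test suite by
# a weighted combination of two prioritization criteria.

def normalize(scores):
    lo, hi = min(scores.values()), max(scores.values())
    span = (hi - lo) or 1.0
    return {t: (s - lo) / span for t, s in scores.items()}

def prioritize(coverage, fault_history, w=0.6):
    """Rank tests by w * coverage score + (1 - w) * historical fault-detection score."""
    cov, hist = normalize(coverage), normalize(fault_history)
    combined = {t: w * cov[t] + (1 - w) * hist[t] for t in coverage}
    return sorted(combined, key=combined.get, reverse=True)

# Hypothetical per-test scores for two criteria.
coverage = {"t1": 120, "t2": 300, "t3": 210}        # e.g., events covered
fault_history = {"t1": 3, "t2": 0, "t3": 2}         # e.g., faults previously exposed
print(prioritize(coverage, fault_history))          # ['t2', 't3', 't1']
```
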
Date: May 2018
Creator: Nurmuradov, Dmitriy
System: The UNT Digital Library
Improving Software Quality through Syntax and Semantics Verification of Requirements Models (open access)

Software defects can frequently be traced to poorly specified requirements. Many software teams manage their requirements using tools such as checklists and databases, which lack a formal semantic mapping to system behavior. Such a mapping can be especially helpful for safety-critical systems. Another limitation of many requirements analysis methods is that much of the analysis must still be done manually. We propose techniques that automate portions of the requirements analysis process and clarify the syntax and semantics of requirements models using a variety of methods, including machine learning tools and our own tool, VeriCCM. The machine learning tools help us identify potential model elements and verify their correctness. VeriCCM, a formalized extension of the causal component model (CCM), uses formal methods to ensure that requirements are well formed, and it provides the beginnings of a full formal semantics. We also explore the use of statecharts to identify potential abnormal behaviors from a given set of requirements. At each stage, we perform empirical studies to evaluate the effectiveness of our proposed approaches.
Date: December 2018
Creator: Gaither, Danielle
System: The UNT Digital Library
A Multi-Modal Insider Threat Detection and Prevention based on Users' Behaviors (open access)

Insider threat is one of the greatest concerns in information security and can cause more significant financial losses and damage than any other type of attack. However, implementing an efficient detection system is a very challenging task. It has long been recognized that solutions to insider threats are mainly user-centric, and several psychological and psychosocial models have been proposed. Measures of a user's psychophysiological behavior can provide an excellent source of information for detecting malicious behaviors and mitigating insider threats. In this dissertation, we propose a multi-modal framework based on a user's psychophysiological measures and computer-based behaviors to distinguish between the user's behavior during regular activities and during malicious activities. We utilize several psychophysiological measures, such as electroencephalogram (EEG), electrocardiogram (ECG), and eye movement and pupil behaviors, along with computer-based behaviors such as mouse movement dynamics and keystroke dynamics, to build our framework for detecting malicious insiders. We conduct human subject experiments to capture the psychophysiological measures and the computer-based behaviors for a group of participants while they perform several computer-based activities in different scenarios. We analyze the behavioral measures, extract useful features, and evaluate their capability in detecting insider threats. We investigate each measure separately, then we use data fusion techniques …
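
One common data fusion strategy hinted at here is feature-level fusion: concatenating per-modality feature vectors for each activity window and training a single classifier. The sketch below shows that pattern with synthetic features; the feature choices, window counts, and classifier are assumptions, not the dissertation's exact fusion technique.

```python
# Minimal sketch (synthetic features, generic feature-level fusion): combining
# multi-modal behavioral measures before classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200                                    # hypothetical labeled activity windows
eeg_feats = rng.normal(size=(n, 16))       # e.g., EEG band-power features per window
ecg_feats = rng.normal(size=(n, 8))        # e.g., heart-rate variability features
mouse_feats = rng.normal(size=(n, 6))      # e.g., mouse speed/curvature statistics
labels = rng.integers(0, 2, size=n)        # 1 = malicious activity, 0 = regular

# Feature-level fusion: concatenate the per-modality vectors for each window.
fused = np.hstack([eeg_feats, ecg_feats, mouse_feats])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, fused, labels, cv=5).mean())
```
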
Date: August 2018
Creator: Hashem, Yassir
System: The UNT Digital Library
On-Loom Fabric Defect Inspection Using Contact Image Sensors and Activation Layer Embedded Convolutional Neural Network (open access)

Malfunctions of loom machines are the main cause of faulty fabric production. An on-loom fabric inspection system is a real-time monitoring device that enables immediate defect detection for human intervention. This dissertation presents a solution for on-loom fabric defect inspection, including a new hardware design, the configurable contact image sensor (CIS) module, for on-loom fabric scanning, together with the defect detection algorithms. The main contributions of this work include (1) creating a configurable CIS module adaptable to the loom width, which brings the unique features of CIS, such as sub-millimeter resolution, compact size, short working distance, and low cost, to the fabric defect inspection system, (2) designing a two-level hardware architecture that can be efficiently deployed in a weaving factory with hundreds of looms, (3) developing a two-level inspection scheme in which the initial defect screening is performed on a Raspberry Pi and the intensive defect verification is processed on the cloud server, (4) introducing a novel pairwise-potential activation layer to a convolutional neural network, leading to high defect segmentation accuracy on fabrics with fine and imbalanced structures, (5) achieving real-time defect detection that allows a possible defect to be examined multiple times, and (6) implementing a new color segmentation technique …
Date: December 2018
Creator: Ouyang, Wenbin
System: The UNT Digital Library
Ontology Based Security Threat Assessment and Mitigation for Cloud Systems (open access)

A malicious actor often relies on security vulnerabilities of IT systems to launch a cyber attack. Most cloud services are supported by an orchestration of large and complex systems which are prone to vulnerabilities, making threat assessment very challenging. In this research, I developed formal and practical ontology-based techniques that enable automated evaluation of a cloud system's security threats. I use an architecture for threat assessment of cloud systems that leverages a dynamically generated ontology knowledge base. I created an ontology model to represent the components of a cloud system. These ontologies are designed for a set of domains that covers several aspects of clouds and cyber threat data for information technology products. The inputs to our architecture are the configurations of cloud assets and a component specification (which encompasses the desired assessment procedures), and the outputs are actionable threat assessment results. The focus of this work is on ways of enumerating, assessing, and mitigating emerging cyber security threats. A research toolkit system has been developed to evaluate our architecture. We expect our techniques to be leveraged by any cloud provider or consumer in closing the gap of identifying and remediating known or impending security threats facing their cloud's assets.
Date: December 2018
Creator: Kamongi, Patrick
System: The UNT Digital Library
Radio Resource Control Approaches for LTE-Advanced Femtocell Networks (open access)

The architecture of mobile networks has evolved dramatically in order to fulfill the growing demand for wireless services and data. The radio resources used by current mobile networks are limited, while user demand is increasing substantially. In the future, a tremendous number of Internet applications are expected to be served by mobile networks. Therefore, increasing the capacity of mobile networks has become a vital issue. Heterogeneous networks (HetNets) have been considered a promising paradigm for future mobile networks. Accordingly, the concept of the small cell has been introduced in order to increase the capacity of mobile networks. A femtocell network is a kind of small cell network. Femtocells are deployed within macrocell coverage. Femtocells cover small areas and operate with low transmission power while providing high capacity. Also, UEs can be offloaded from macrocells to femtocells, so the capacity can be increased. However, this introduces different technical challenges. Interference has become one of the key challenges in deploying femtocells within macrocell coverage. The undesirable impact of interference can degrade the performance of the mobile network. Therefore, radio resource management mechanisms are needed in order to address the key challenges of deploying femtocells. The objective of …
Date: August 2018
Creator: Alotaibi, Sultan Radhi
System: The UNT Digital Library
Reading with Robots: A Platform to Promote Cognitive Exercise through Identification and Discussion of Creative Metaphor in Books (open access)

Maintaining cognitive health is often a pressing concern for aging adults, and given the world's shifting age demographics, it is impractical to assume that older adults will be able to rely on individualized human support for doing so. Recently, interest has turned toward technology as an alternative. Companion robots offer an attractive vehicle for facilitating cognitive exercise, but the language technologies guiding their interactions are still nascent; in the elder-focused human-robot systems proposed to date, interactions have been limited to motion or to buttons and canned speech. The incapacity of these systems to participate autonomously in conversational discourse limits their ability to engage users at a cognitively meaningful level. I addressed this limitation by developing a platform for human-robot book discussions, designed to promote cognitive exercise by encouraging users to consider the authors' underlying intentions in employing creative metaphors. The choice of book discussions as the backdrop for these conversations has an empirical basis in neuroscience and social science research, which has found that reading often, even in late adulthood, is correlated with a decreased likelihood of exhibiting symptoms of cognitive decline. The more targeted focus on novel metaphors within those conversations stems from prior work showing that processing novel metaphors …
Date: August 2018
Creator: Parde, Natalie
System: The UNT Digital Library
Secure and Trusted Execution Framework for Virtualized Workloads (open access)

In this dissertation, we have analyzed various security and trustworthiness solutions for modern computing systems and proposed a framework that provides holistic security and trust for the entire lifecycle of a virtualized workload. The framework consists of three novel techniques and a set of guidelines. The three techniques provide the necessary elements for a secure and trusted execution environment, while the guidelines ensure that the virtualized workload remains in a secure and trusted state throughout its lifecycle. We have successfully implemented and demonstrated that the framework provides security and trust guarantees at the time of launch, at any time during execution, and during an update of the virtualized workload. Given the proliferation of virtualization from cloud servers to embedded systems, the techniques presented in this dissertation can be implemented on most computing systems.
Date: August 2018
Creator: Kotikela, Srujan D
System: The UNT Digital Library
Simulation of Dengue Outbreak in Thailand (open access)

The dengue virus has become widespread worldwide in recent decades. It has no specific treatment and affects more than 40% of the world's population. In Thailand, dengue has been a health concern for more than half a century. The highest number of cases in a single year was 174,285 in 1987, leading to 1,007 deaths. Today, dengue is distributed throughout the entire country. Therefore, dengue has become a major challenge for public health in terms of both prevention and control of outbreaks. Different methodologies and ways of dealing with dengue outbreaks have been put forward by researchers. Computational models and simulations play an important role, as they can help researchers and public health officers gain a greater understanding of the virus's epidemic activity. In this context, this dissertation presents a new framework, Modified Agent-Based Modeling (mABM), a hybrid platform combining a mathematical model and a computational model, to simulate a dengue outbreak in human and mosquito populations. This framework improves on the realism of former models by utilizing data reported by several Thai government organizations, such as the Thai Ministry of Public Health (MoPH), the National Statistical Office, and others. …
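
As background for the human-mosquito dynamics this abstract describes, the sketch below steps a generic host-vector compartmental model forward in time. It is not the mABM framework; the parameters, initial populations, and simplifications (constant mosquito population, no incubation periods) are assumptions made purely for illustration.

```python
# Minimal sketch (generic host-vector SIR model, not the dissertation's mABM):
# one discrete-time step of coupled human and mosquito infection dynamics.

def step(state, params, dt=1.0):
    """Advance human (S_h, I_h, R_h) and mosquito (S_m, I_m) compartments by dt days."""
    S_h, I_h, R_h, S_m, I_m = state
    N_h = S_h + I_h + R_h
    N_m = S_m + I_m
    b = params["bite_rate"]
    new_inf_h = b * params["beta_mh"] * (I_m / N_m) * S_h * dt   # mosquito-to-human
    new_inf_m = b * params["beta_hm"] * (I_h / N_h) * S_m * dt   # human-to-mosquito
    recovered = params["recovery"] * I_h * dt
    dead_m = params["mosq_death"] * I_m * dt
    return (S_h - new_inf_h,
            I_h + new_inf_h - recovered,
            R_h + recovered,
            S_m - new_inf_m + dead_m,        # dead mosquitoes replaced by susceptible births
            I_m + new_inf_m - dead_m)

params = {"bite_rate": 0.5, "beta_hm": 0.4, "beta_mh": 0.4,
          "recovery": 1 / 7, "mosq_death": 1 / 14}               # assumed rates per day
state = (9990.0, 10.0, 0.0, 20000.0, 50.0)                       # hypothetical populations
for _ in range(60):                                              # simulate 60 days
    state = step(state, params)
print(state)
```
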
Date: August 2018
Creator: Meesumrarn, Thiraphat
System: The UNT Digital Library
Toward Supporting Fine-Grained, Structured, Meaningful and Engaging Feedback in Educational Applications (open access)

Recent advancements in machine learning have started to leave their mark on educational technology. Technology is evolving fast and, as people adopt it, schools and universities must keep up (nearly 70% of primary and secondary schools in the UK now use tablets for various purposes). As these numbers are likely to follow the same increasing trend, it is imperative for schools to adapt and benefit from the advantages offered by technology: real-time processing of data, availability of different resources through connectivity, efficiency, and many others. To this end, this work contributes to the growth of educational technology by developing several algorithms and models that are meant to ease several tasks for instructors, engage students in deep discussions, and ultimately increase their learning gains. First, a novel, fine-grained knowledge representation is introduced that splits phrases into their constituent propositions, which are both meaningful and minimal. An automated algorithm for extracting the propositions is also introduced. Compared with other fine-grained representations, the extraction model does not require any human labor after it is trained, and the results show considerable improvement over two meaningful baselines. Second, a proposition alignment model is created that relies on even finer-grained units of …
Date: December 2018
Creator: Bulgarov, Florin Adrian
System: The UNT Digital Library
Towards a Unilateral Sensing System for Detecting Person-to-Person Contacts (open access)

The contact patterns among individuals can significantly affect the progress of an infectious outbreak within a population. Gathering data about these interaction and mixing patterns is essential for assessing computational models of infectious diseases. Various self-report approaches have been designed in different studies to collect data about contact rates and patterns. Recent advances in sensing technology provide researchers with bilateral automated data collection devices that facilitate contact gathering while overcoming the disadvantages of previous approaches. In this study, a novel unilateral wearable sensing architecture is proposed that overcomes the limitations of bilateral sensing. Our unilateral wearable sensing system gathers contact data using hybrid sensor arrays embedded in a wearable shirt. A smartphone application transfers the collected sensor data to the cloud and applies a deep learning model to estimate the number of human contacts, and the results are stored in the cloud database. The deep learning model was developed on hand-labeled data gathered over multiple experiments. This model was tested and evaluated, and the results are reported in the study. Sensitivity analysis was performed to choose the most suitable image resolution and format for the model to estimate contacts and to analyze …
Date: December 2018
Creator: Amara, Pavan Kumar
System: The UNT Digital Library
Validation and Evaluation of Emergency Response Plans through Agent-Based Modeling and Simulation (open access)

Biological emergency response planning plays a critical role in protecting the public from the possible devastating results of sudden disease outbreaks. These plans describe the distribution of medical countermeasures across a region using limited resources within a restricted time window. Thus, the ability to determine that such a plan will be feasible, i.e., that it will successfully provide service to affected populations within the time limit, is crucial. Many of the current efforts to validate plans take the form of live drills and training, but those may not test plan activation at the appropriate scale or with sufficient numbers of participants. This necessitates the use of computational resources to aid emergency managers and planners in developing and evaluating plans before they must be used. Current emergency response plan generation software packages, such as RE-PLAN or RealOpt, provide rate-based validation analyses. However, these types of analysis may neglect details of real-world traffic dynamics. Therefore, this dissertation presents Validating Emergency Response Plan Execution Through Simulation (VERPETS), a novel computational system for the agent-based simulation of biological emergency response plan activation. This system converts raw road network, population distribution, and emergency response plan data into a format suitable for simulation, and then performs these simulations …
Date: May 2018
Creator: Helsing, Joseph
System: The UNT Digital Library