15 Matching Results

A Data-Driven Computational Framework to Assess the Risk of Epidemics at Global Mass Gatherings (open access)

This dissertation presents a data-driven computational epidemic framework to simulate disease epidemics at global mass gatherings. The annual Muslim pilgrimage to Makkah, Saudi Arabia, is used to demonstrate the simulation and analysis of various disease transmission scenarios throughout the different stages of the event, from the arrival to the departure of international participants. The proposed agent-based epidemic model efficiently captures the demographic, spatial, and temporal heterogeneity at each stage of the global event of Hajj. Experimental results indicate the substantial impact of the demographic and mobility patterns of the heterogeneous population of pilgrims on the progression of the disease spread in the different stages of Hajj. In addition, these simulations suggest that the differences in the spatial and temporal settings of each stage can significantly affect the dynamics of the disease. Finally, the epidemic simulations conducted at the different stages in this dissertation illustrate the impact of the differences between the duration of each stage of the event and the length of the infectious and latent periods. This research contributes to a better understanding of epidemic modeling in the context of global mass gatherings to predict the risk of disease pandemics caused by associated international travel. The computational modeling and …
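
As a rough, self-contained illustration of the staged, agent-based idea described above (not the dissertation's framework), the sketch below walks a toy SEIR-style population through arrival, ritual, and departure phases; all stage names, durations, contact rates, and disease parameters are hypothetical placeholders.

```python
# A minimal, illustrative SEIR-style agent sketch; not the dissertation's model.
import random

STAGES = [("arrival", 3), ("rituals", 6), ("departure", 3)]        # (name, days) - assumed
CONTACTS_PER_DAY = {"arrival": 8, "rituals": 20, "departure": 8}   # assumed mixing levels
LATENT_DAYS, INFECTIOUS_DAYS, P_TRANSMIT = 2, 5, 0.03              # assumed disease parameters

def simulate(n_agents=5000, seed_infections=5):
    # state: 0=S, 1=E, 2=I, 3=R; timer counts days left in the E or I state
    state = [0] * n_agents
    timer = [0] * n_agents
    for i in random.sample(range(n_agents), seed_infections):
        state[i], timer[i] = 2, INFECTIOUS_DAYS
    for stage, days in STAGES:
        for _ in range(days):
            infectious = [i for i in range(n_agents) if state[i] == 2]
            for i in infectious:
                for _ in range(CONTACTS_PER_DAY[stage]):
                    j = random.randrange(n_agents)
                    if state[j] == 0 and random.random() < P_TRANSMIT:
                        state[j], timer[j] = 1, LATENT_DAYS        # susceptible -> exposed
            for i in range(n_agents):
                if state[i] in (1, 2):
                    timer[i] -= 1
                    if timer[i] <= 0:
                        state[i], timer[i] = (2, INFECTIOUS_DAYS) if state[i] == 1 else (3, 0)
        print(stage, {s: state.count(s) for s in (0, 1, 2, 3)})

simulate()
```
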
Date: May 2019
Creator: Alshammari, Sultanah
System: The UNT Digital Library
An Efficient Approach for Dengue Mitigation: A Computational Framework (open access)

Dengue mitigation is a major research area among scientists working toward effective management of the dengue epidemic. Effective dengue mitigation requires several important components, including accurate epidemic modeling, efficient epidemic prediction, and efficient resource allocation to control the spread of the disease. Past studies assumed a homogeneous response pattern of the dengue epidemic to climate conditions throughout the regions. The dengue epidemic is both climate dependent and geographically dependent, and a global model is not sufficient to capture the local variations of the epidemic. We propose a novel epidemic modeling method that accounts for local variation and uses a micro-ensemble of regressors for each region. Three regressors are considered in the construction of the ensemble: support vector regression, ordinary least squares regression, and k-nearest neighbor regression. The best-performing regressors are selected into the ensemble. The proposed ensemble determines the risk of dengue epidemic in each region in advance. The risk is then used in risk-based resource allocation. The proposed resource allocation is based on the genetic algorithm. The algorithm exploits the genetic algorithm with major modifications to its main components, …
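
A minimal sketch of the per-region micro-ensemble described above, assuming scikit-learn is available; it keeps whichever of SVR, ordinary least squares, and k-nearest neighbor regression cross-validate best per region. The region names and synthetic climate features are made up, and this is not the dissertation's implementation.

```python
# Illustrative per-region "micro-ensemble" sketch; synthetic data, assumed region names.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

def best_regressors_per_region(regions, keep=2):
    """Fit candidate regressors per region and keep the `keep` best by CV score."""
    ensembles = {}
    for name, (X, y) in regions.items():
        candidates = {
            "svr": SVR(kernel="rbf"),
            "ols": LinearRegression(),
            "knn": KNeighborsRegressor(n_neighbors=5),
        }
        scored = sorted(
            candidates.items(),
            key=lambda kv: cross_val_score(kv[1], X, y, cv=3).mean(),
            reverse=True,
        )
        ensembles[name] = [model.fit(X, y) for _, model in scored[:keep]]
    return ensembles

# Synthetic demo data: weekly climate features vs. case counts for two regions.
rng = np.random.default_rng(0)
regions = {r: (rng.random((60, 2)), rng.random(60) * 100) for r in ("region_a", "region_b")}
ensembles = best_regressors_per_region(regions)
risk = {r: np.mean([m.predict(regions[r][0][-4:]).mean() for m in ms]) for r, ms in ensembles.items()}
print(risk)   # higher predicted counts -> higher epidemic risk for allocation
```
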
Date: May 2019
Creator: Dinayadura, Nirosha
System: The UNT Digital Library

Enhancing Storage Dependability and Computing Energy Efficiency for Large-Scale High Performance Computing Systems

Access: Use of this item is restricted to the UNT Community
With the advent of the information explosion age, larger-capacity disk drives are used to store data and powerful devices are used to process big data. As the scale and complexity of computer systems increase, we expect these systems to provide dependable and energy-efficient services and computation. Although hard drives are reliable in general, they are the most commonly replaced hardware components. Disk failures cause data corruption and even data loss, which can significantly degrade system performance and result in financial losses. In this dissertation research, I analyze different manifestations of disk failures in production data centers and explore data mining techniques combined with statistical analysis methods to discover categories of disk failures and their distinctive properties. I use similarity measures to quantify the degradation process of each failure type and derive the degradation signature. The derived degradation signatures are further leveraged to forecast when future disk failures may happen. Meanwhile, this dissertation also studies the energy efficiency of high performance computers. Specifically, I characterize the power and energy consumption of Haswell processors which are used in multiple supercomputers, and analyze the power and energy consumption of Legion, a data-centric programming model and runtime system, and Legion applications. We find that power and energy …
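
A hedged sketch of the general idea of deriving degradation signatures by clustering failure data (not the dissertation's method): cluster synthetic SMART-style attribute vectors from failed drives and score live drives by distance to the nearest cluster centroid. The attribute names, threshold, and data are invented.

```python
# Toy clustering of failed-drive attribute vectors into "degradation signatures".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# columns: reallocated_sectors, pending_sectors, seek_error_rate (synthetic, scaled)
failed_snapshots = np.vstack([rng.normal(loc, 0.5, (50, 3)) for loc in ([5, 1, 2], [0.5, 6, 1])])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(failed_snapshots)
signatures = km.cluster_centers_                      # one signature per failure category

def failure_risk(snapshot, signatures, threshold=1.5):
    """Distance of a live drive's snapshot to the nearest degradation signature."""
    d = np.linalg.norm(signatures - snapshot, axis=1).min()
    return d, d < threshold                           # smaller distance -> more failure-like

print(failure_risk(np.array([4.6, 1.2, 2.1]), signatures))   # drifting toward a signature
print(failure_risk(np.array([0.1, 0.2, 0.3]), signatures))   # healthy-looking drive
```
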
Date: May 2019
Creator: Huang, Song
System: The UNT Digital Library
Exploring Physical Unclonable Functions for Efficient Hardware Assisted Security in the IoT (open access)

Modern cities are undergoing rapid expansion. The number of connected devices in the networks in and around these cities is increasing every day and will increase exponentially in the next few years. At home, the number of connected devices is also increasing with the introduction of home automation appliances and applications. Many of these appliances are becoming smart devices which can track our daily routines. It is imperative that all these devices be secure. When cryptographic keys used for encryption and decryption are stored in memory present on these devices, they can be retrieved by attackers or adversaries to gain control of the system. For this purpose, Physical Unclonable Functions (PUFs) were proposed to generate the keys required for encryption and decryption of the data or the communication channel, as required by the application. PUF modules take advantage of the manufacturing variations that are introduced in Integrated Circuits (ICs) during the fabrication process. These variations are used to generate the cryptographic keys, which reduces the need for a separate memory module to store the encryption and decryption keys. A PUF module can also be reconfigurable such that the number of input-output pairs or Challenge Response Pairs (CRPs) …
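
The sketch below illustrates only the challenge-response enrollment and verification flow around a PUF; the PUF itself is simulated in software with a hash plus random bit flips, which is purely an assumption for demonstration, since a real PUF derives its responses from silicon manufacturing variation rather than from a stored secret.

```python
# Toy CRP enrollment/authentication flow around a software-simulated PUF.
import hashlib, os, random

class SimulatedPUF:
    """Software stand-in for a PUF: real PUFs derive responses from per-chip
    manufacturing variation; here that variation is faked with a random secret."""
    def __init__(self):
        self._variation = os.urandom(16)            # stands in for process variation
    def response(self, challenge: bytes, noise_bits: int = 2) -> int:
        r = int.from_bytes(hashlib.sha256(self._variation + challenge).digest()[:8], "big")
        for _ in range(noise_bits):                 # PUF responses are slightly noisy
            r ^= 1 << random.randrange(64)
        return r

def enroll(puf, n=4):
    """Server-side enrollment: store challenge-response pairs (CRPs), not a raw key."""
    crps = []
    for _ in range(n):
        challenge = os.urandom(8)
        crps.append((challenge, puf.response(challenge)))
    return crps

def authenticate(puf, crps, tolerance=6):
    """Replay a stored challenge; accept if the response is within a Hamming-distance bound."""
    challenge, expected = random.choice(crps)
    observed = puf.response(challenge)
    return bin(expected ^ observed).count("1") <= tolerance

device = SimulatedPUF()
server_crps = enroll(device)
print("authenticated:", authenticate(device, server_crps))
```
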
Date: May 2019
Creator: Yanambaka, Venkata Prasanth
System: The UNT Digital Library
Extracting Temporally-Anchored Spatial Knowledge (open access)

In my dissertation, I elaborate on the work that I have done to extract temporally-anchored spatial knowledge from text, including both intra- and inter-sentential knowledge. I also detail multiple approaches to infer the spatial timeline of a person from biographies and social media. I present and analyze two strategies to annotate information regarding whether a given entity is or is not located at some location, and for how long with respect to an event. Specifically, I leverage semantic roles or syntactic dependencies to generate potential spatial knowledge and then crowdsource annotations to validate the potential knowledge. The resulting annotations indicate how long entities are or are not located somewhere, and temporally anchor this spatial information. I present an in-depth corpus analysis and experiments comparing the spatial knowledge generated by manipulating roles or dependencies. In my work, I also explore research methodologies that go beyond single sentences and extract spatio-temporal information from text. Spatial timelines refer to a chronological order of locations where a target person is or is not located. I present a corpus and experiments to extract spatial timelines from Wikipedia biographies. I present my work on determining locations and the order in which they are actually visited by a person …
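
One way to picture the extracted knowledge is as tuples pairing an entity and a location with a polarity, an anchoring event, and an annotation score; the sketch below is only an illustrative data structure with a fabricated example sentence, not the dissertation's representation or corpora.

```python
# Illustrative tuple format for temporally-anchored spatial knowledge (fields assumed).
from dataclasses import dataclass

@dataclass
class SpatialFact:
    entity: str          # whose location is asserted
    location: str        # where the entity is (or is not)
    is_located: bool     # polarity of the assertion
    anchor_event: str    # event the assertion is temporally anchored to
    certainty: float     # crowdsourced validation score, e.g. 0-1

# "John flew to Paris on Monday" -> potential knowledge generated from semantic roles
# or dependencies, then validated and scored by annotators.
facts = [
    SpatialFact("John", "Paris", True, "flew", 0.9),            # located there after the event
    SpatialFact("John", "departure city", False, "flew", 0.8),  # no longer there afterwards
]
for f in facts:
    print(f)
```
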
Date: May 2019
Creator: Vempala, Alakananda
System: The UNT Digital Library
Methodical Evaluation of Processing-in-Memory Alternatives (open access)

In this work, I characterized a series of potential application kernels using a set of architectural and non-architectural metrics, and performed a comparison of four different alternatives for processing-in-memory cores (PIMs): ARM cores, GPGPUs, coarse-grained reconfigurable dataflow (DF-PIM), and a domain-specific architecture using a SIMD PIM engine consisting of a series of multiply-accumulate circuits (MACs). For each PIM alternative, I investigated how performance and energy efficiency change with respect to a series of system parameters, such as memory bandwidth and latency, number of PIM cores, DVFS states, cache architecture, etc. In addition, I compared the PIM core choices for a subset of applications and discussed how the application characteristics correlate with the achieved performance and energy efficiency. Furthermore, I compared the PIM alternatives to a host-centric solution that uses a traditional server-class CPU core or PIM-like cores acting as host-side accelerators instead of being part of 3D-stacked memories. Such insights can expose the achievable performance limits and shortcomings of certain PIM designs and show sensitivity to a series of system parameters (available memory bandwidth, application latency and bandwidth sensitivity, etc.). In addition, identifying the common application characteristics for PIM kernels provides an opportunity to identify similar types of computation patterns in …
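
A back-of-the-envelope sketch of the kind of comparison such a study performs (not the dissertation's methodology or numbers): a roofline-style time estimate plus an energy figure for each PIM alternative, with all per-core parameters invented.

```python
# Crude analytical comparison of PIM alternatives; every number below is a placeholder.
cores = {                      # (peak GFLOP/s, usable memory GB/s, average watts) - assumed
    "ARM-PIM":   (50,  160, 5),
    "GPGPU-PIM": (400, 200, 40),
    "DF-PIM":    (250, 240, 20),
    "MAC-SIMD":  (300, 240, 10),
}

def estimate(flops, bytes_moved, peak, bw, watts):
    t = max(flops / (peak * 1e9), bytes_moved / (bw * 1e9))   # compute- vs bandwidth-bound
    return t, watts * t                                        # seconds, joules

kernel = (2e10, 8e10)          # a hypothetical bandwidth-heavy kernel: FLOPs, bytes moved
for name, (peak, bw, watts) in cores.items():
    t, e = estimate(*kernel, peak, bw, watts)
    print(f"{name:9s} time={t*1e3:7.1f} ms  energy={e:6.2f} J")
```
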
Date: May 2019
Creator: Scrbak, Marko
System: The UNT Digital Library

Revealing the Positive Meaning of a Negation

Access: Use of this item is restricted to the UNT Community
Negation is a complex phenomenon present in all human languages, allowing for the uniquely human capacities of denial, contradiction, misrepresentation, lying, and irony. It is in the first place a phenomenon of semantic opposition. Sentences containing negation are generally (a) less informative than affirmative ones, (b) morphosyntactically more marked (all languages have negative markers while only a few have affirmative markers), and (c) psychologically more complex and harder to process. Negation often conveys positive meaning. This meaning ranges from implicatures to entailments. In this dissertation, I develop a system to reveal the underlying positive interpretation of negation. I first identify which words are intended to be negated (i.e., the focus of negation) and second, I rewrite those tokens to generate an actual positive interpretation. I identify the focus of negation by scoring probable foci along a continuous scale. One of the obstacles to exploring foci scoring is that no public datasets exist for this task. Thus, to study this problem I create new corpora. The corpora contain verbal, nominal, and adjectival negations and their potential positive interpretations along with their scores ranging from 1 to 5. Then, I use supervised learning models for scoring the focus of negation. In order to …
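
A toy sketch of the scoring step, assuming scikit-learn: a supervised regressor maps simple per-token features to a focus score on the 1 to 5 scale. The features, training data, and example sentence are fabricated and far simpler than the dissertation's models.

```python
# Toy supervised scoring of candidate negation foci; features and labels are invented.
import numpy as np
from sklearn.linear_model import Ridge

# toy features per candidate token: [distance from the negation cue, is_verb, is_object]
X_train = np.array([[1, 1, 0], [2, 0, 1], [4, 0, 0], [1, 0, 1], [3, 1, 0]])
y_train = np.array([4.5, 3.8, 1.2, 4.9, 2.0])      # crowdsourced focus scores (1-5)

model = Ridge(alpha=1.0).fit(X_train, y_train)

# "The doctor did not prescribe antibiotics" -> candidate tokens and their toy features
candidates = {"prescribe": [1, 1, 0], "antibiotics": [2, 0, 1], "doctor": [4, 0, 0]}
scores = {tok: float(model.predict([feat])[0]) for tok, feat in candidates.items()}
focus = max(scores, key=scores.get)
print(scores, "-> most probable focus:", focus)
# A positive interpretation could then be generated by rewriting the focus token,
# e.g. "The doctor prescribed something other than antibiotics."
```
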
Date: May 2019
Creator: Sarabi, Zahra
System: The UNT Digital Library
A Study on Flat-Address-Space Heterogeneous Memory Architectures (open access)

In this dissertation, we present a number of studies that primarily focus on data movement challenges among different types of memories (viz., 3D-DRAM, DDRx DRAM and NVM) employed together as a flat-address heterogeneous memory system. We introduce two different hardware-based techniques for prefetching data from slow off-chip phase change memory (PCM) to fast on-chip memories. The prefetching techniques efficiently fetch data from PCM and place that data into processor-resident or 3D-DRAM-resident buffers without putting high demand on bandwidth and provide significant performance improvements. Next, we explore different page migration techniques for flat-address memory systems which differ in when to migrate pages (i.e., periodically or instantaneously) and how to manage the migrations (i.e., OS-based or hardware-based approach). In the first page migration study, we present several epoch-based page migration policies for different organizations of flat-address memories consisting of two (2-level) and three (3-level) types of memory modules. These policies have resulted in significant energy savings. In the next page migration study, we devise an efficient "on-the-fly" page migration technique which migrates a page from slow PCM to fast 3D-DRAM whenever it receives a certain number of memory accesses without waiting for any specific time interval. Furthermore, we present a light-weight hardware-assisted …
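
A minimal sketch of the threshold idea behind on-the-fly migration (not the hardware design itself): count accesses to slow-memory pages and migrate a page as soon as its counter crosses a threshold, with no epoch boundary. The threshold and capacity values are placeholders.

```python
# Toy counter-based "on-the-fly" page migration policy; parameters are assumed.
from collections import Counter

THRESHOLD = 4            # accesses before migration - hypothetical
FAST_CAPACITY = 2        # pages that fit in fast 3D-DRAM in this toy example

fast_pages, counters = set(), Counter()

def access(page):
    if page in fast_pages:
        return "fast hit"
    counters[page] += 1
    if counters[page] >= THRESHOLD and len(fast_pages) < FAST_CAPACITY:
        fast_pages.add(page)          # migrate PCM -> 3D-DRAM immediately
        del counters[page]
        return "migrated"
    return "slow hit"

trace = [7, 7, 3, 7, 7, 7, 3, 3, 3, 3, 9]
print([(p, access(p)) for p in trace])
```
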
Date: May 2019
Creator: Islam, Mahzabeen
System: The UNT Digital Library
Application of Adaptive Techniques in Regression Testing for Modern Software Development (open access)

In this dissertation we investigate the applicability of different adaptive techniques to improve the effectiveness and efficiency of regression testing. Initially, we introduce the concept of regression testing. We then perform a literature review of current practices and state-of-the-art regression testing techniques. Finally, we advance the regression testing techniques by performing four empirical studies in which we use different types of information (e.g. user session, source code, code commit, etc.) to investigate the effectiveness of each software metric on fault detection capability for different software environments. In our first empirical study, we show the effectiveness of applying user session information for test case prioritization. In our next study, we apply the lessons learned from the previous study and implement a collaborative filtering recommender system for test case prioritization, which uses user sessions and change history information as input parameters and returns the risk score associated with each component. Results of this study show that our recommender system improves the effectiveness of test prioritization; the performance of our approach was particularly noteworthy when we were under time constraints. We then investigate the merits of multi-objective testing over single-objective techniques with a graph-based testing framework. Results of this study indicate that the …
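
A toy sketch of risk-based test prioritization in the spirit of the second study, not the actual recommender system: each component gets a risk score from change history and user-session usage, and test cases are ordered by the risk of the components they exercise. All names, weights, and mappings are fabricated.

```python
# Toy risk-based test case prioritization; data, weights, and mappings are invented.
recent_changes = {"checkout": 5, "search": 1, "login": 3}        # commits touching component
session_usage  = {"checkout": 0.7, "search": 0.9, "login": 0.4}  # share of user sessions

def risk(component, w_change=0.6, w_usage=0.4):
    return w_change * recent_changes.get(component, 0) + w_usage * session_usage.get(component, 0)

tests = {   # test case -> components it exercises
    "test_cart_total": ["checkout"],
    "test_query_parser": ["search"],
    "test_password_reset": ["login"],
    "test_guest_checkout": ["checkout", "login"],
}
prioritized = sorted(tests, key=lambda t: sum(risk(c) for c in tests[t]), reverse=True)
print(prioritized)   # run the riskiest tests first when the time budget is tight
```
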
Date: August 2019
Creator: Azizi, Maral
System: The UNT Digital Library
Skin Detection in Image and Video Founded in Clustering and Region Growing (open access)

Researchers have searched for decades for an efficient skin detection method, yet current methods have not overcome the major limitations. To overcome these limitations, this dissertation proposes a skin detection method based on clustering and region growing. These methods, together with a significant insight, result in a more effective algorithm. The insight concerns the capability to determine dynamically the number of clusters in a collection of pixels organized as an image. In clustering for most problem domains, the number of clusters is fixed a priori, which does not perform effectively over a wide variety of data contents. Therefore, in this dissertation, a skin detection method has been proposed using the above findings and validated. This method assigns the number of clusters based on image properties and ultimately allows freedom from manual thresholding or other manual operations. The dynamic determination of clustering outcomes allows for greater automation of skin detection when dealing with uncertain real-world conditions.
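
A hedged sketch of the central insight, choosing the number of clusters from the image itself rather than fixing it a priori, assuming scikit-learn: candidate cluster counts are scored with the silhouette measure over synthetic pixel colors. This is not the dissertation's algorithm, and the region-growing stage is omitted.

```python
# Toy dynamic selection of the cluster count for pixel-color clustering.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(2)
# synthetic "image" pixels in a color space: three loose color groups
pixels = np.vstack([rng.normal(c, 6, (200, 3))
                    for c in ([200, 150, 130], [60, 90, 40], [20, 20, 120])])

best_k, best_labels, best_score = None, None, -1.0
for k in range(2, 7):                          # candidate cluster counts
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)
    score = silhouette_score(pixels, labels)
    if score > best_score:
        best_k, best_labels, best_score = k, labels, score

print("chosen number of clusters:", best_k)
# Skin-colored clusters could then seed region growing over connected pixels.
```
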
Date: August 2019
Creator: Islam, A B M Rezbaul
System: The UNT Digital Library

SurfKE: A Graph-Based Feature Learning Framework for Keyphrase Extraction

Access: Use of this item is restricted to the UNT Community
Current unsupervised approaches for keyphrase extraction compute a single importance score for each candidate word by considering the number and quality of its associated words in the graph, and they are not flexible enough to incorporate multiple types of information. For instance, nodes in a network may exhibit diverse connectivity patterns which are not captured by graph-based ranking methods. To address this, we present a new approach to keyphrase extraction that represents the document as a word graph and exploits its structure in order to reveal underlying explanatory factors hidden in the data that may distinguish keyphrases from non-keyphrases. Experimental results show that our model, which uses phrase graph representations in a supervised probabilistic framework, obtains remarkable improvements in performance over previous supervised and unsupervised keyphrase extraction systems.
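
A small sketch of the word-graph side of such a pipeline (not the SurfKE implementation): build a co-occurrence graph with a sliding window and compute simple structural features per node that a supervised model could consume; the window size and feature set are assumptions.

```python
# Toy word co-occurrence graph with per-node structural features.
from collections import defaultdict
from itertools import combinations

text = ("keyphrase extraction represents the document as a word graph and exploits "
        "graph structure to distinguish keyphrases from non keyphrases").split()

window = 3
edges = defaultdict(int)
for i in range(len(text)):
    for a, b in combinations(text[i:i + window], 2):
        if a != b:
            edges[frozenset((a, b))] += 1          # undirected weighted edge

degree, strength = defaultdict(int), defaultdict(int)
for pair, w in edges.items():
    for node in pair:
        degree[node] += 1
        strength[node] += w

features = {n: {"degree": degree[n], "strength": strength[n]} for n in degree}
top = sorted(features, key=lambda n: features[n]["strength"], reverse=True)[:5]
print({n: features[n] for n in top})
```
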
Date: August 2019
Creator: Florescu, Corina Andreea
System: The UNT Digital Library
Event Sequence Identification and Deep Learning Classification for Anomaly Detection and Predication on High-Performance Computing Systems (open access)

High-performance computing (HPC) systems continue growing in both scale and complexity. These large-scale, heterogeneous systems generate tens of millions of log messages every day. Effective log analysis for understanding system behaviors and identifying system anomalies and failures is highly challenging. Existing log analysis approaches use line-by-line message processing. They are not effective for discovering subtle behavior patterns and their transitions, and thus may overlook some critical anomalies. In this dissertation research, I propose a system log event block detection (SLEBD) method which can extract the log messages that belong to a component or system event into an event block (EB) accurately and automatically. At the event level, we can discover new event patterns, the evolution of system behavior, and the interaction among different system components. To find critical event sequences, existing sequence mining methods are mostly based on the Apriori algorithm, which is compute-intensive and runs for a long time. I develop a novel, topology-aware sequence mining (TSM) algorithm which efficiently generates sequence patterns from the extracted event block lists. I also train a long short-term memory (LSTM) model to cluster sequences before specific events. With the generated sequence pattern and trained LSTM model, we can predict …
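
A toy sketch of the event-block idea (not the SLEBD algorithm): merge consecutive log lines from the same component that fall within a small time gap into one block, producing the block sequences that sequence mining or an LSTM would then consume. The log format and gap threshold are fabricated.

```python
# Toy grouping of raw log lines into event blocks; log content and gap are invented.
logs = [  # (timestamp_seconds, component, message)
    (0.0, "scheduler", "job 17 queued"),
    (0.4, "scheduler", "job 17 dispatched"),
    (0.6, "node12", "ECC warning on DIMM 3"),
    (5.2, "node12", "ECC warning on DIMM 3"),
    (5.3, "scheduler", "job 17 completed"),
]

def event_blocks(lines, max_gap=1.0):
    blocks, current = [], None
    for ts, comp, msg in lines:
        if current and current["component"] == comp and ts - current["end"] <= max_gap:
            current["messages"].append(msg)
            current["end"] = ts
        else:
            if current:
                blocks.append(current)
            current = {"component": comp, "start": ts, "end": ts, "messages": [msg]}
    if current:
        blocks.append(current)
    return blocks

for block in event_blocks(logs):
    print(block["component"], block["messages"])
```
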
Date: December 2019
Creator: Li, Zongze
System: The UNT Digital Library

A Performance and Security Analysis of Elliptic Curve Cryptography Based Real-Time Media Encryption

Access: Use of this item is restricted to the UNT Community
This dissertation emphasizes the security aspects of real-time media. The problems of existing real-time media protections are identified in this research, and viable solutions are proposed. First, the security of real-time media depends on the Secure Real-time Transport Protocol (SRTP) mechanism. We identified drawbacks of existing SRTP systems, which use symmetric key encryption schemes that can be exploited by attackers. Elliptic Curve Cryptography (ECC), an asymmetric key cryptography scheme, is proposed to resolve these problems. Second, the ECC encryption scheme is based on elliptic curves. This dissertation explores the weaknesses of a widely used elliptic curve in terms of security and describes a more secure elliptic curve suitable for real-time media protection. Eighteen elliptic curves were tested in a real-time video transmission system, and fifteen elliptic curves were tested in a real-time audio transmission system. Based on the performance, the X9.62 standard 256-bit prime curve, NIST-recommended 256-bit prime curves, and Brainpool 256-bit prime curves were found to be suitable for real-time audio encryption. Similarly, the X9.62 standard 256-bit prime and 272-bit binary curves, and NIST-recommended 256-bit prime curves were found to be suitable for real-time video encryption. The weaknesses of NIST-recommended elliptic curves are discussed and a more secure new …
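
A hedged sketch of ECC-based keying for a media channel, assuming the Python `cryptography` package: an ephemeral ECDH exchange on the NIST P-256 curve feeds HKDF to derive an AES-GCM key for an RTP-like payload. This illustrates the asymmetric approach in general, not the dissertation's SRTP modification or its curve comparison.

```python
# Toy ECDH key agreement plus AES-GCM payload protection for a media stream.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each endpoint generates an ephemeral key pair and exchanges public keys.
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

def session_key(own_priv, peer_pub):
    shared = own_priv.exchange(ec.ECDH(), peer_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"rt-media-demo").derive(shared)

key = session_key(alice_priv, bob_priv.public_key())
assert key == session_key(bob_priv, alice_priv.public_key())   # both sides agree

nonce = os.urandom(12)                       # per-packet nonce
frame = b"\x80\x60 fake RTP payload"
ciphertext = AESGCM(key).encrypt(nonce, frame, None)
print(AESGCM(key).decrypt(nonce, ciphertext, None) == frame)
```
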
Date: December 2019
Creator: Sen, Nilanjan
System: The UNT Digital Library
Shepherding Network Security Protocols as They Transition to New Atmospheres: A New Paradigm in Network Protocol Analysis (open access)

The solutions presented in this dissertation describe a new paradigm in which we shepherd these network security protocols through atmosphere transitions, offering new ways to analyze and monitor the state of the protocol. The approach involves identifying a protocol's transitional weaknesses through adaptation of formal models, measuring the weakness as it exists in the wild by statically analyzing applications, and showing how to use network traffic analysis to monitor protocol implementations going into the future. Throughout the effort, we follow the popular Open Authorization (OAuth) protocol in its attempts to apply its web-based roots to a mobile atmosphere. To pinpoint protocol deficiencies, we first adapt a well-regarded formal analysis and show it to be insufficient for characterizing mobile applications, tying its transitional weaknesses to implementation issues and delivering a reanalysis of the proof. We then measure the prevalence of this weakness by statically analyzing over 11,000 Android applications. While looking through source code, we develop new methods to find sensitive protocol information, overcome hurdles like obfuscation, and provide interfaces for later modeling, all while achieving a false positive rate below 10 percent. We then use network analysis to detect and verify application implementations. By collecting network traffic from Android …
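
A deliberately simple sketch of the "find sensitive protocol material in app source" step, far cruder than the dissertation's static analysis: regex-scan decompiled source text for strings that look like hard-coded OAuth client credentials. The patterns and the sample snippet are illustrative only.

```python
# Toy scanner for hard-coded OAuth credentials in decompiled source; patterns are assumed.
import re

SUSPECT = [
    re.compile(r'client_secret"?\s*[:=]\s*"([A-Za-z0-9_\-]{16,})"', re.I),
    re.compile(r'client_id"?\s*[:=]\s*"([A-Za-z0-9_\-\.]{8,})"', re.I),
]

def scan(source: str):
    """Return (line number, matched text) for every suspicious credential-like string."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for pattern in SUSPECT:
            for match in pattern.finditer(line):
                findings.append((lineno, match.group(0)))
    return findings

decompiled = '''
String CLIENT_ID = "com.example.app.1234abcd";
String client_secret = "THIS_SHOULD_NOT_BE_SHIPPED_1234";
'''
print(scan(decompiled))
```
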
Date: December 2019
Creator: Talkington, Gregory Joshua
System: The UNT Digital Library

Spatial Partitioning Algorithms for Solving Location-Allocation Problems

Access: Use of this item is restricted to the UNT Community
This dissertation presents spatial partitioning algorithms to solve location-allocation problems. Location-allocation problems pertain to both the selection of facilities to serve demand at demand points and the assignment of demand points to the selected or known facilities. In the first part of this dissertation, we focus on the well-known and well-researched location-allocation problem, the "p-median problem", which is a distance-based location-allocation problem that involves the selection and allocation of p facilities for n demand points. We evaluate the performance of existing p-median heuristic algorithms and investigate the impact of the scale of the problem and the spatial distribution of demand points on the performance of these algorithms. Based on the results from this comparative study, we present guidelines for location analysts to aid them in selecting the best heuristic and corresponding parameters depending on the problem at hand. Additionally, we found that existing heuristic algorithms are not suitable for solving large-scale p-median problems in a reasonable amount of time. We present a density-based decomposition methodology to solve large-scale p-median problems efficiently. This algorithm identifies dense clusters in the region and uses a MapReduce procedure to select facilities in the clustered regions independently and combine the solutions from the subproblems. Lastly, …
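
A hedged sketch of one simple p-median heuristic (greedy facility addition), not the density-based decomposition from the dissertation: repeatedly add the candidate site that most reduces the total distance from demand points to their nearest facility. The demand points are synthetic, and every demand point doubles as a candidate site.

```python
# Toy greedy heuristic for the p-median problem on synthetic demand points.
import random, math

random.seed(3)
demand = [(random.random(), random.random()) for _ in range(200)]  # candidate sites = demand points

def total_cost(facilities):
    """Sum over demand points of the distance to the nearest selected facility."""
    return sum(min(math.dist(d, f) for f in facilities) for d in demand)

def greedy_p_median(p):
    chosen, remaining = [], list(demand)
    for _ in range(p):
        best = min(remaining, key=lambda c: total_cost(chosen + [c]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

facilities = greedy_p_median(p=4)
print("selected facilities:", facilities)
print("total assignment cost:", round(total_cost(facilities), 3))
# Each demand point is then allocated to its nearest selected facility.
```
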
Date: December 2019
Creator: Gwalani, Harsha
System: The UNT Digital Library