Location Estimation and Geo-Correlated Information Trends (open access)

A tremendous amount of information is shared every day on social media sites such as Facebook, Twitter, and Google+. However, only a small portion of users provide their location information, which can be helpful in targeted advertising and many other services. Current methods for location estimation using social relationships treat friendship as a simple binary relationship. However, social closeness between users and the structure of their friendships have strong implications for geographic distance. In the first task, we introduce new measures to evaluate the social closeness between users and the structure of their friendships. We then propose models that use these measures for location estimation. Compared with models that take the friend relation as a binary feature, social closeness can help identify which of a user's friends are more important, and friend structure can help determine the significance level of locations, thus improving the accuracy of the location estimation models. A confidence iteration method is further introduced to improve estimation accuracy and overcome the problem of scarce location information. We evaluate our methods on two different datasets, Twitter and Gowalla. The results show that our model can improve estimation accuracy by 5%-20% compared with state-of-the-art friend-based models. In the …
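
As a rough sketch of the weighting idea, the snippet below estimates a user's location by letting each located friend vote for their city, with the vote weighted by a simple closeness measure (Jaccard overlap of friend sets). The measure and the toy data are illustrative assumptions, not the dissertation's models.

```python
# Sketch: friend-based location estimation weighted by social closeness.
# The closeness measure (Jaccard overlap of friend sets) and the toy data
# are illustrative, not the dissertation's exact formulation.
from collections import Counter

friends = {
    "alice": {"bob", "carol", "dave"},
    "bob":   {"alice", "carol", "dave"},   # bob shares two friends with alice
    "carol": {"alice"},                    # carol and dave are weak ties
    "dave":  {"alice"},
}
known_location = {"bob": "Denton", "carol": "Austin", "dave": "Austin"}

def closeness(u, v):
    """More shared friends -> socially closer -> likely geographically closer."""
    shared = friends[u] & friends[v]
    union = friends[u] | friends[v]
    return len(shared) / len(union) if union else 0.0

def estimate_location(user):
    """Each located friend votes for their city with a closeness-weighted vote."""
    votes = Counter()
    for f in friends[user]:
        if f in known_location:
            votes[known_location[f]] += closeness(user, f)
    return votes.most_common(1)[0][0] if votes else None

# A plain binary vote would say "Austin" (2 friends vs. 1);
# closeness weighting picks the strongly tied friend's city instead.
print(estimate_location("alice"))   # -> Denton
```
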
Date: December 2017
Creator: Liu, Zhi
System: The UNT Digital Library
Evaluation of Call Mobility on Network Productivity in Long Term Evolution Advanced (LTE-A) Femtocells (open access)

The demand for higher data rates for indoor and cell-edge users led to the evolution of small cells. LTE femtocells, one category of small cells, are low-power, low-cost mobile base stations deployed within the coverage area of a traditional macro base station. Cross-tier and co-tier interference occur only when the macrocell and femtocell share the same frequency channels. Open access (OSG), closed access (CSG), and hybrid access are the three existing access-control methods that decide users' connectivity to the femtocell access point (FAP). We define a network performance function, network productivity, to measure the traffic that is carried successfully. In this dissertation, we evaluate call mobility in an integrated LTE network and determine optimized network productivity with a variable call arrival rate in a given LTE deployment with the femtocell access modes (OSG, CSG, HYBRID) for a given call-blocking vector. The solution to the optimization is the maximum network productivity and the call arrival rates for all cells. In the second scenario, we evaluate call mobility in an integrated LTE network with an increasing number of femtocells and maximize network productivity with a variable femtocell distribution per macrocell at a constant call arrival rate in a uniform LTE deployment with the femtocell access modes (OSG, CSG, HYBRID) for a given …
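
A minimal sketch of the productivity notion, assuming Erlang-B blocking per cell (an assumption for illustration; the dissertation defines its own optimization): carried traffic is offered load times the fraction of calls not blocked, summed over macro- and femtocells.

```python
# Sketch: "network productivity" as successfully carried traffic, using
# Erlang-B blocking per cell. The formula choice and the numbers are
# illustrative assumptions, not the dissertation's exact model.

def erlang_b(offered_erlangs, channels):
    """Blocking probability for an offered load (Erlang) on n channels."""
    b = 1.0
    for k in range(1, channels + 1):          # standard Erlang-B recurrence
        b = (offered_erlangs * b) / (k + offered_erlangs * b)
    return b

def network_productivity(cells):
    """Sum of successfully carried traffic over macro- and femtocells."""
    carried = 0.0
    for load, channels in cells:
        carried += load * (1.0 - erlang_b(load, channels))
    return carried

# (offered load in Erlang, channels): one macrocell plus three femtocells
cells = [(40.0, 50), (2.0, 4), (2.0, 4), (2.0, 4)]
print(f"carried traffic: {network_productivity(cells):.2f} Erlang")
```
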
Date: December 2017
Creator: Sawant, Uttara
System: The UNT Digital Library
The Influence of Social Network Graph Structure on Disease Dynamics in a Simulated Environment (open access)

The fight against epidemics and pandemics is one of man versus nature. Technological advances have not only improved existing methods for monitoring and controlling disease outbreaks, but have also provided new means of investigation, such as modeling and simulation. This dissertation explores the relationship between social structure and disease dynamics. Social structures are modeled as graphs, and outbreaks are simulated based on a well-recognized standard, the susceptible-infectious-removed (SIR) paradigm. Two independent but related studies are presented. The first involves measuring the severity of outbreaks as social network parameters are altered. The second investigates the efficacy of various vaccination policies based on social structure. Three disease-related centrality measures are introduced: contact, transmission, and spread centrality, which are related to the previously established centrality measures of degree, betweenness, and closeness, respectively. The results of the experiments presented in this dissertation indicate that reducing neighborhood size along with outside-of-neighborhood contacts diminishes the severity of disease outbreaks. Vaccination strategies can effectively reduce these parameters. Additionally, vaccination policies that target individuals with high centrality are generally shown to be slightly more effective than a random vaccination policy. These results, combined with past and future studies, will assist public health officials in their effort to minimize the effects …
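
A minimal sketch of the simulation setup described above: a discrete-time SIR outbreak on a small-world contact graph, whose neighborhood links plus a few long-range shortcuts echo the neighborhood-size and outside-contact parameters. The graph model and probabilities are illustrative, not the dissertation's parameters.

```python
# Sketch: discrete-time SIR outbreak on a small-world contact graph.
# Parameters (graph model, infection/removal probabilities) are
# illustrative, not those used in the dissertation's experiments.
import random
import networkx as nx

random.seed(1)
g = nx.watts_strogatz_graph(n=500, k=6, p=0.1)   # neighborhood + shortcuts

state = {v: "S" for v in g}           # susceptible / infectious / removed
for v in random.sample(list(g), 5):   # seed a few initial infections
    state[v] = "I"

p_infect, p_remove = 0.05, 0.1
while any(s == "I" for s in state.values()):
    nxt = dict(state)                 # synchronous update of all nodes
    for v, s in state.items():
        if s == "I":
            for u in g.neighbors(v):              # contact each neighbor
                if state[u] == "S" and random.random() < p_infect:
                    nxt[u] = "I"
            if random.random() < p_remove:        # recover or die
                nxt[v] = "R"
    state = nxt

severity = sum(s == "R" for s in state.values()) / len(state)
print(f"fraction ever infected: {severity:.2f}")
```
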
Date: December 2010
Creator: Johnson, Tina V.
System: The UNT Digital Library
Statistical Strategies for Efficient Signal Detection and Parameter Estimation in Wireless Sensor Networks (open access)

This dissertation investigates data reduction strategies from a signal processing perspective in centralized detection and estimation applications. First, it considers a deterministic source observed by a network of sensors and develops an analytical strategy for ranking sensor transmissions based on the magnitude of their test statistics. The benefit of the proposed strategy is that the decision to transmit or not to transmit observations to the fusion center can be made at the sensor level, resulting in significant savings in transmission costs. A sensor-network-based target tracking application is simulated to demonstrate the benefits of the proposed strategy over the unconstrained-energy approach. Second, it considers the detection of random signals in noisy measurements and evaluates the performance of eigenvalue-based signal detectors. Due to their computational simplicity, robustness, and performance, these detectors have recently received a lot of attention. When the observed random signal is correlated, several researchers claim that the performance of eigenvalue-based detectors exceeds that of the classical energy detector. However, such claims fail to consider that when the signal is correlated, the optimal detector is the estimator-correlator, not the energy detector. In this dissertation, through theoretical analyses and Monte Carlo simulations, eigenvalue-based detectors …
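
The following sketch contrasts the two test statistics under discussion, assuming a simple rank-one correlated signal model: the classical energy statistic and an eigenvalue ratio computed from the sample covariance matrix.

```python
# Sketch: comparing an energy detector with an eigenvalue-based detector
# (largest/smallest eigenvalue ratio of the sample covariance). The
# signal model and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
sensors, samples = 4, 200

def test_statistics(X):
    energy = np.sum(X ** 2) / X.size        # classical energy statistic
    R = (X @ X.T) / X.shape[1]              # sample covariance matrix
    eig = np.linalg.eigvalsh(R)             # sorted ascending
    return energy, eig[-1] / eig[0]         # max/min eigenvalue ratio

noise = rng.normal(size=(sensors, samples))
# correlated signal: one waveform seen (scaled) by every sensor
waveform = rng.normal(size=samples)
gains = rng.normal(size=(sensors, 1))
signal = gains * waveform + noise

print("noise only  :", test_statistics(noise))
print("signal+noise:", test_statistics(signal))
```
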
Date: December 2013
Creator: Ayeh, Eric
System: The UNT Digital Library
Real Time Assessment of a Video Game Player's State of Mind Using Off-the-Shelf Electroencephalography (open access)

The focus of this research is the development of a real-time application that uses a low-cost EEG headset to measure a player's state of mind while they play a video game. Using data collected with the Emotiv EPOC headset, various EEG processing techniques are tested to find ways of measuring a person's engagement and arousal levels. The ability to measure these levels provides an opportunity to develop a model that monitors a person's flow while playing video games. Identifying when certain events occur, such as when the player dies, makes it easier to identify when a player has left a state of flow. The real-time application, Brainwave, captures data from the wireless Emotiv EPOC headset and converts the raw EEG data into more meaningful brainwave band frequencies. Utilizing the brainwave frequencies, the program trains multiple machine learning algorithms with data designed to identify when the player dies. Brainwave runs while the player plays through a video game, monitoring their engagement and arousal levels for changes that cause the player to leave a state of flow. Brainwave reports to researchers and developers when the player dies, along with the identification of the players …
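
As an illustration of the processing step described above, the sketch below converts raw EEG samples into theta/alpha/beta band powers and a commonly used engagement index (beta over alpha plus theta). The 128 Hz sampling rate matches the EPOC; the synthetic signal and the index choice are assumptions, not Brainwave's exact pipeline.

```python
# Sketch: converting raw EEG samples into band powers, the kind of
# features engagement metrics build on. The beta/(alpha+theta) index is
# a common convention in the literature, assumed here for illustration.
import numpy as np

fs = 128                                   # Emotiv EPOC sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)                # four seconds of synthetic EEG
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

spectrum = np.abs(np.fft.rfft(eeg)) ** 2   # power spectrum
freqs = np.fft.rfftfreq(len(eeg), 1 / fs)

def band_power(lo, hi):
    """Total power in the [lo, hi) frequency band."""
    return spectrum[(freqs >= lo) & (freqs < hi)].sum()

theta, alpha, beta = band_power(4, 8), band_power(8, 13), band_power(13, 30)
engagement = beta / (alpha + theta)        # a widely used engagement index
print(f"theta={theta:.1f} alpha={alpha:.1f} beta={beta:.1f} "
      f"engagement={engagement:.2f}")
```
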
Date: December 2016
Creator: McMahan, Timothy
System: The UNT Digital Library
Infusing Automatic Question Generation with Natural Language Understanding (open access)

Automatically generating questions from text for educational purposes is an active research area in natural language processing. The automatic question generation system accompanying this dissertation is MARGE, a recursive acronym for: MARGE automatically reads, generates, and evaluates. MARGE generates questions from both individual sentences and the passage as a whole, and is the first question generation system to successfully generate meaningful questions from textual units larger than a sentence. Prior work in automatic question generation from text treats a sentence as a string of constituents to be rearranged into as many questions as English grammar rules allow. Consequently, such systems overgenerate and create mainly trivial questions. Further, none of these systems to date has been able to automatically determine which questions are meaningful and which are trivial, because the research focus has been placed on natural language generation (NLG) at the expense of natural language understanding (NLU). In contrast, the work presented here infuses the question generation process with natural language understanding. From the input text, MARGE creates a meaning analysis representation for each sentence in a passage via the DeconStructure algorithm presented in this work. Questions are generated from sentence meaning analysis representations using templates. The generated questions are automatically …
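
A toy sketch of template-based generation from a shallow sentence analysis, for flavor only; MARGE's DeconStructure representation and templates are far richer than this hypothetical subject/verb/object split.

```python
# Sketch: template-based question generation from a shallow
# subject/verb/object analysis. The parse and templates are toy
# illustrations of the idea, not MARGE's DeconStructure output.
sentence_analysis = {
    "subject": "the mitochondrion",
    "verb": "produces",
    "object": "most of the cell's ATP",
}

templates = [
    "What does {subject} {verb_base}?",
    "What {verb} {object}?",
]

def base_form(verb):
    # toy lemmatizer: strip third-person "s" (real systems use NLP tools)
    return verb[:-1] if verb.endswith("s") else verb

for t in templates:
    print(t.format(subject=sentence_analysis["subject"],
                   verb=sentence_analysis["verb"],
                   verb_base=base_form(sentence_analysis["verb"]),
                   object=sentence_analysis["object"]))
```
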
Date: December 2016
Creator: Mazidi, Karen
System: The UNT Digital Library
Ontology Based Security Threat Assessment and Mitigation for Cloud Systems (open access)

A malicious actor often relies on security vulnerabilities of IT systems to launch a cyber attack. Most cloud services are supported by an orchestration of large and complex systems which are prone to vulnerabilities, making threat assessment very challenging. In this research, I developed formal and practical ontology-based techniques that enable automated evaluation of a cloud system's security threats. I use an architecture for threat assessment of cloud systems that leverages a dynamically generated ontology knowledge base. I created an ontology model to represent the components of a cloud system. These ontologies are designed for a set of domains that covers various cloud aspects and cyber threat data for information technology products. The inputs to our architecture are the configurations of cloud assets and component specifications (which encompass the desired assessment procedures), and the outputs are actionable threat assessment results. The focus of this work is on ways of enumerating, assessing, and mitigating emerging cyber security threats. A research toolkit has been developed to evaluate our architecture. We expect our techniques to be leveraged by any cloud provider or consumer in closing the gap of identifying and remediating known or impending security threats facing their cloud's assets.
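
A minimal sketch of the ontology-and-query idea, using rdflib with a hypothetical namespace and classes (one real CVE is used for flavor); this is not the dissertation's ontology model.

```python
# Sketch: an ontology knowledge base linking cloud components to known
# vulnerabilities, queried with SPARQL via rdflib. The namespace and
# class names are illustrative, not the dissertation's ontology.
from rdflib import Graph, Namespace, RDF, Literal

EX = Namespace("http://example.org/cloud#")
g = Graph()

g.add((EX.WebVM, RDF.type, EX.CloudAsset))
g.add((EX.WebVM, EX.runs, EX.ApacheHttpd_2_4))
g.add((EX.ApacheHttpd_2_4, EX.hasVulnerability, EX.CVE_2021_41773))
g.add((EX.CVE_2021_41773, EX.severity, Literal("critical")))

# Which assets run software with a known critical vulnerability?
query = """
PREFIX ex: <http://example.org/cloud#>
SELECT ?asset ?cve WHERE {
    ?asset ex:runs ?software .
    ?software ex:hasVulnerability ?cve .
    ?cve ex:severity "critical" .
}"""
for asset, cve in g.query(query):
    print(f"threat: {asset} exposed via {cve}")
```
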
Date: December 2018
Creator: Kamongi, Patrick
System: The UNT Digital Library
Toward Supporting Fine-Grained, Structured, Meaningful and Engaging Feedback in Educational Applications (open access)

Recent advancements in machine learning have started to leave their mark on educational technology. Technology is evolving fast and, as people adopt it, schools and universities must keep up (nearly 70% of primary and secondary schools in the UK are now using tablets for various purposes). As these numbers are likely to follow the same increasing trend, it is imperative for schools to adapt and benefit from the advantages offered by technology: real-time processing of data, availability of different resources through connectivity, efficiency, and many others. To this end, this work contributes to the growth of educational technology by developing several algorithms and models meant to ease several tasks for instructors, engage students in deep discussions, and ultimately increase their learning gains. First, a novel, fine-grained knowledge representation is introduced that splits phrases into constituent propositions that are both meaningful and minimal, together with an automated algorithm for extracting the propositions. Compared with other fine-grained representations, the extraction model requires no human labor after it is trained, while the results show considerable improvement over two meaningful baselines. Second, a proposition alignment model is created that relies on even finer-grained units of …
Date: December 2018
Creator: Bulgarov, Florin Adrian
System: The UNT Digital Library
Improving Software Quality through Syntax and Semantics Verification of Requirements Models (open access)

Software defects can frequently be traced to poorly-specified requirements. Many software teams manage their requirements using tools such as checklists and databases, which lack a formal semantic mapping to system behavior. Such a mapping can be especially helpful for safety-critical systems. Another limitation of many requirements analysis methods is that much of the analysis must still be done manually. We propose techniques that automate portions of the requirements analysis process, as well as clarify the syntax and semantics of requirements models using a variety of methods, including machine learning tools and our own tool, VeriCCM. The machine learning tools used help us identify potential model elements and verify their correctness. VeriCCM, a formalized extension of the causal component model (CCM), uses formal methods to ensure that requirements are well-formed, as well as providing the beginnings of a full formal semantics. We also explore the use of statecharts to identify potential abnormal behaviors from a given set of requirements. At each stage, we perform empirical studies to evaluate the effectiveness of our proposed approaches.
Date: December 2018
Creator: Gaither, Danielle
System: The UNT Digital Library

A Performance and Security Analysis of Elliptic Curve Cryptography Based Real-Time Media Encryption

Access: Use of this item is restricted to the UNT Community
This dissertation emphasizes the security aspects of real-time media. The problems with existing real-time media protections are identified in this research, and viable solutions are proposed. First, the security of real-time media depends on the Secure Real-time Transport Protocol (SRTP) mechanism. We identified drawbacks of existing SRTP systems, whose symmetric-key encryption schemes can be exploited by attackers. Elliptic Curve Cryptography (ECC), an asymmetric-key cryptography scheme, is proposed to resolve these problems. Second, the ECC encryption scheme is based on elliptic curves. This dissertation explores the weaknesses of a widely used elliptic curve in terms of security and describes a more secure elliptic curve suitable for real-time media protection. Eighteen elliptic curves were tested in a real-time video transmission system, and fifteen elliptic curves were tested in a real-time audio transmission system. Based on the performance, the X9.62 standard 256-bit prime curve, the NIST-recommended 256-bit prime curves, and the Brainpool 256-bit prime curves were found to be suitable for real-time audio encryption. Likewise, the X9.62 standard 256-bit prime and 272-bit binary curves and the NIST-recommended 256-bit prime curves were found to be suitable for real-time video encryption. The weaknesses of NIST-recommended elliptic curves are discussed and a more secure new …
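
A minimal sketch of the asymmetric-plus-symmetric pattern argued for above: ECDH on a NIST 256-bit prime curve (P-256) derives a shared symmetric key that an SRTP-style media cipher could then use. It relies on the Python cryptography package; the key-derivation parameters are illustrative assumptions.

```python
# Sketch: ECDH key agreement on a NIST 256-bit prime curve to derive a
# symmetric media key. Uses the `cryptography` package; the HKDF
# parameters are illustrative assumptions, not a full SRTP key schedule.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

curve = ec.SECP256R1()                    # NIST P-256 prime curve
alice = ec.generate_private_key(curve)    # each endpoint keeps a private key
bob = ec.generate_private_key(curve)

# Each side combines its private key with the peer's public key.
secret_a = alice.exchange(ec.ECDH(), bob.public_key())
secret_b = bob.exchange(ec.ECDH(), alice.public_key())
assert secret_a == secret_b               # same shared secret on both ends

# Derive a 256-bit symmetric key suitable for an SRTP-style media cipher.
media_key = HKDF(algorithm=hashes.SHA256(), length=32,
                 salt=None, info=b"rtp media key").derive(secret_a)
print(media_key.hex())
```
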
Date: December 2019
Creator: Sen, Nilanjan
System: The UNT Digital Library
Online Construction of Android Application Test Suites (open access)

Mobile applications play an important role in the dissemination of computing and information resources. They are often used in domains such as mobile banking, e-commerce, and health monitoring. Cost-effective testing techniques in these domains are critical. This dissertation contributes novel techniques for the automatic construction of mobile application test suites. In particular, this work provides solutions that address the prohibitively large number of possible event sequences that must be sampled in GUI-based mobile applications. This work makes three major contributions: (1) an automated GUI testing tool, Autodroid, that implements a novel online approach to automatic construction of Android application test suites; (2) probabilistic and combinatorial algorithms that systematically sample the input space of Android applications to generate test suites with GUI/context events; and (3) empirical studies that evaluate the cost-effectiveness of our techniques on real-world Android applications. Our experiments show that our techniques achieve better code coverage and event coverage than random test generation. We demonstrate that our techniques are useful for the automatic construction of Android application test suites in the absence of source code and preexisting abstract models of an Application Under Test (AUT). The insights derived from our empirical studies provide guidance to researchers and practitioners involved …
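
A rough sketch of probabilistic event-sequence sampling: events executed least often so far are preferred, a simple stand-in for the frequency-aware strategies evaluated in the dissertation. The event names and weighting rule are toy assumptions.

```python
# Sketch: probabilistic sampling of GUI event sequences, preferring
# rarely executed events. Events and weights are illustrative, not
# Autodroid's actual strategies.
import random

random.seed(7)
events = ["tap_menu", "tap_back", "scroll", "type_text", "rotate"]
executed = {e: 0 for e in events}      # how often each event was fired

def next_event():
    """Prefer events executed least often so far (inverse-frequency weight)."""
    weights = [1.0 / (1 + executed[e]) for e in events]
    return random.choices(events, weights=weights, k=1)[0]

def generate_test(length=8):
    seq = []
    for _ in range(length):
        e = next_event()
        executed[e] += 1
        seq.append(e)
    return seq

suite = [generate_test() for _ in range(3)]   # small online-built suite
for i, test in enumerate(suite):
    print(f"test {i}: {test}")
```
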
Date: December 2017
Creator: Adamo, David T., Jr.
System: The UNT Digital Library
Detection and Classification of Heart Sounds Using a Heart-Mobile Interface (open access)

Early detection of heart disease can save lives, caution individuals, and help determine the type of treatment to be given to patients. The first test in diagnosing a heart disease is auscultation - listening to the heart sounds. The interpretation of heart sounds is subjective and requires professional skill to identify abnormalities in these sounds. A medical practitioner uses a stethoscope to perform an initial screening by listening for irregular sounds from the patient's chest. Later, echocardiography and electrocardiography tests are taken for further diagnosis. However, these tests are expensive and require specialized technicians to operate. A simple and economical alternative is vital for monitoring in homecare, rural hospitals, and urban clinics. This dissertation focuses on developing a patient-centered device for initial screening of heart sounds that is low cost and can be used by patients on themselves, who can later share the readings with their healthcare providers. An innovative mobile health service platform is created for analyzing and classifying heart sounds. Certain properties of heart sounds have to be evaluated to identify irregularities, such as the number of heart beats and gallops, intensity, frequency, and duration. Since …
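
As a sketch of the detection step, the snippet below locates beats in a synthetic phonocardiogram via a smoothed energy envelope and peak picking; real pipelines add filtering and S1/S2 separation, and all signals and thresholds here are illustrative.

```python
# Sketch: locating heart beats in a phonocardiogram via a smoothed
# energy envelope and peak picking. The synthetic signal and thresholds
# are illustrative stand-ins for recorded stethoscope audio.
import numpy as np
from scipy.signal import find_peaks

fs = 1000                                  # samples per second
t = np.arange(0, 5, 1 / fs)                # five seconds of audio
beats = (np.sin(2 * np.pi * 1.2 * t) > 0.995).astype(float)  # ~72 bpm "lub"
audio = beats * np.sin(2 * np.pi * 60 * t) + 0.05 * np.random.randn(len(t))

# short-time energy envelope (50-sample moving average of squared signal)
envelope = np.convolve(audio ** 2, np.ones(50) / 50, mode="same")
peaks, _ = find_peaks(envelope, height=0.1, distance=int(0.4 * fs))

rate = 60 * len(peaks) / 5                 # beats per minute over 5 s
print(f"detected {len(peaks)} beats, about {rate:.0f} bpm")
```
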
Date: December 2016
Creator: Thiyagaraja, Shanti
System: The UNT Digital Library
Indoor Localization Using Magnetic Fields (open access)

Indoor localization consists of locating oneself inside new buildings. GPS does not work indoors due to multipath reflection and signal blockage, WiFi-based systems assume ubiquitous availability, and infrastructure-based systems require expensive installations, making indoor localization an open problem. This dissertation solves the problem of indoor localization by thoroughly exploiting indoor ambient magnetic fields, which consist mainly of disturbances, termed anomalies, in the Earth's magnetic field caused by pillars, doors, and elevators in hallways, all of which are ferromagnetic in nature. By observing the uniqueness of magnetic signatures collected from different campus buildings, the work presents the identification of landmarks and guideposts from these signatures and further develops magnetic maps of buildings - all of which can be used to locate and navigate people indoors. To understand the reason behind these anomalies, a comparison between the measured and model-generated Earth's magnetic field is first made, verifying the presence of a constant field without any disturbances. Then, by modeling the magnetic field behavior of different pillars, such as steel-reinforced concrete and solid steel, and of other structures like doors and elevators, the interaction of the Earth's field with the ferromagnetic fields is described, thereby explaining the causes of the …
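
A minimal sketch of signature matching against surveyed magnetic maps: a recorded magnetic-magnitude walk is slid along each stored hallway signature and scored by mean-squared distance. The signatures are synthetic stand-ins for surveyed data.

```python
# Sketch: matching a measured magnetic-magnitude signature against
# stored hallway signatures by sliding mean-squared distance. The
# signatures here are synthetic stand-ins for surveyed magnetic maps.
import numpy as np

rng = np.random.default_rng(3)

# surveyed signatures: field magnitude (uT) sampled along two hallways
maps = {
    "hall_A": 50 + 4 * np.sin(np.linspace(0, 6, 200)),   # pillar ripples
    "hall_B": 50 + rng.normal(0, 0.2, 200),              # featureless hall
}

# a walk recorded in hall_A, offset into the hallway, plus sensor noise
query = maps["hall_A"][60:110] + rng.normal(0, 0.3, 50)

def best_match(query, signature):
    """Smallest mean-squared distance over all alignments, and its offset."""
    n = len(query)
    dists = [np.mean((signature[i:i + n] - query) ** 2)
             for i in range(len(signature) - n + 1)]
    return min(dists), int(np.argmin(dists))

for name, sig in maps.items():
    d, offset = best_match(query, sig)
    print(f"{name}: distance={d:.3f} at offset {offset}")
```
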
Date: December 2011
Creator: Pathapati Subbu, Kalyan Sasidhar
System: The UNT Digital Library
Modeling Synergistic Relationships Between Words and Images (open access)

Texts and images provide alternative, yet orthogonal views of the same underlying cognitive concept. By uncovering synergistic, semantic relationships that exist between words and images, I am working to develop novel techniques that can help improve tasks in natural language processing, as well as effective models for text-to-image synthesis, image retrieval, and automatic image annotation. Specifically, in my dissertation, I will explore the interoperability of features between language and vision tasks. In the first part, I will show how it is possible to apply features generated using evidence gathered from text corpora to solve the image annotation problem in computer vision, without the use of any visual information. In the second part, I will address research in the reverse direction, and show how visual cues can be used to improve tasks in natural language processing. Importantly, I propose a novel metric to estimate the similarity of words by comparing the visual similarity of concepts invoked by these words, and show that it can be used further to advance the state-of-the-art methods that employ corpus-based and knowledge-based semantic similarity measures. Finally, I attempt to construct a joint semantic space connecting words with images, and synthesize an evaluation framework to quantify cross-modal …
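
A toy sketch of the proposed idea of measuring word relatedness through vision: each word is represented by the average feature vector of images it invokes, and words are compared by cosine similarity. The features here are synthetic stand-ins for real image descriptors.

```python
# Sketch: estimating word relatedness from the visual similarity of the
# concepts the words invoke, as cosine similarity between averaged
# (synthetic) image feature vectors per word.
import numpy as np

rng = np.random.default_rng(6)

def image_features(seed, n_images=10, dim=64):
    """Stand-in for descriptors of images tagged with a word."""
    r = np.random.default_rng(seed)
    return r.normal(size=(n_images, dim))

# words that share imagery get overlapping feature distributions
cat = image_features(1)
tiger = image_features(1) + 0.1 * rng.normal(size=(10, 64))
car = image_features(2)

def visual_similarity(a, b):
    va, vb = a.mean(axis=0), b.mean(axis=0)      # prototype per word
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

print("cat~tiger:", round(visual_similarity(cat, tiger), 2))   # high
print("cat~car  :", round(visual_similarity(cat, car), 2))     # near zero
```
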
Date: December 2012
Creator: Leong, Chee Wee
System: The UNT Digital Library
Source and Channel Coding Strategies for Wireless Sensor Networks (open access)

In this dissertation, I focus on both source coding and channel coding techniques. I address the challenges in wireless sensor networks (WSNs) by developing: (1) a new source coding strategy for erasure channels that has better distortion performance than multiple description coding (MDC); (2) a new cooperative channel coding strategy for multiple access channels that has better channel outage performance than MIMO; and (3) a new source-channel cooperation strategy for source-to-fusion-center communication that reduces system distortion and improves outage performance. First, I draw a parallel between the 2x2 MDC scheme and Alamouti's space-time block coding (STBC) scheme and observe the commonality in their mathematical models. This commonality allows us to observe the duality between the two diversity techniques. Making use of this duality, I develop an MDC scheme with a pairwise complex correlating transform. Theoretically, I show that the MDC scheme results in: 1) complete elimination of the estimation error when only one descriptor is received; 2) greater efficiency in recovering the stronger descriptor (with larger variance) from the weaker descriptor; and 3) improved performance in terms of minimized distortion as the quantization error is reduced. Experiments are also performed on real images to demonstrate these benefits. Second, I present a …
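
A minimal sketch of a 2x2 pairwise correlating transform for MDC, under illustrative Gaussian statistics: rotating two unequal-variance samples makes the two descriptions correlated, so a lost description can be estimated from the received one.

```python
# Sketch: a 2x2 pairwise correlating transform for multiple description
# coding. The rotation angle and source statistics are illustrative,
# not the dissertation's exact design.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0, [3.0, 1.0], size=(10000, 2))    # variances 9 and 1

theta = np.pi / 4                                 # correlating angle
T = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
y = x @ T.T                                       # two correlated descriptions

# channel loses description y2; estimate it from y1 via linear regression
cov = np.cov(y.T)
y2_hat = (cov[0, 1] / cov[0, 0]) * y[:, 0]        # E[y2 | y1] for Gaussians

y_recovered = np.column_stack([y[:, 0], y2_hat])
x_hat = y_recovered @ np.linalg.inv(T).T          # invert the transform

mse = np.mean((x - x_hat) ** 2)
print(f"one-description reconstruction MSE: {mse:.3f}")
```
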
Date: December 2012
Creator: Li, Li
System: The UNT Digital Library
Process-Voltage-Temperature Aware Nanoscale Circuit Optimization (open access)

Embedded systems targeted at portable applications are required to have low power consumption because such portable devices are typically powered by batteries. During the memory accesses of such battery-operated portable systems, including laptops, cell phones, and other devices, a significant amount of power is consumed, which significantly affects battery life. Therefore, efficient, leakage-power-saving cache designs are needed for longer operation of battery-powered applications. Design engineers have limited control over many design parameters of the circuit and hence face many challenges due to inherent process technology variations, particularly in static random access memory (SRAM) circuit design. As CMOS process technologies scale down deeper into the nanometer regime, the push for high-performance and reliable systems becomes even more challenging. As a result, developing low-power designs while maintaining good circuit performance becomes a very difficult task. Furthermore, accurate analysis and optimization of the various forms of total power dissipation and of performance in nanoscale CMOS technologies, particularly in SRAMs, is another critical issue to be considered. This dissertation proposes power-leakage and static noise margin (SNM) analysis and methodologies to achieve optimized static random access memories (SRAMs). Alternate topologies …
Date: December 2010
Creator: Thakral, Garima
System: The UNT Digital Library
Modeling and Analysis of Intentional and Unintentional Security Vulnerabilities in a Mobile Platform (open access)

Mobile phones are one of the essential parts of modern life. Making a phone call is no longer the main purpose of a smartphone, but merely one of many features. Online social networking, chatting, short messaging, web browsing, navigation, and photography are some of the other features users enjoy in modern smartphones, most of which are provided by mobile apps. However, with this advancement, many security vulnerabilities have opened up in these devices. Malicious apps are a major threat to modern smartphones. According to Symantec Corp., by the middle of 2013, about 273,000 Android malware apps had been identified. It is a complex issue to protect everyday users of mobile devices from the attacks of technologically competent hackers, illegitimate users, trolls, and eavesdroppers. This dissertation emphasizes the concept of intention identification and looks into ways to utilize it to enforce security in a mobile phone platform. For instance, a battery-monitoring app requiring SMS permissions indicates suspicious intention, as battery monitoring usually does not need SMS permissions. The intention could be either the user's intention or the intention of an app, and it can be identified from their behavior or from their source code. Regardless …
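
A toy sketch of the intention-mismatch idea in the battery-monitor example above: flag permissions that an app's stated purpose does not explain. The expected-permission sets are illustrative assumptions, not a learned model.

```python
# Sketch: flagging a mismatch between an app's declared category and the
# permissions it requests. Expected-permission sets are illustrative;
# real systems infer intention from behavior or source code.
EXPECTED = {
    "battery_monitor": {"BATTERY_STATS"},
    "messenger": {"SEND_SMS", "READ_CONTACTS", "INTERNET"},
}

def suspicious_permissions(category, requested):
    """Permissions that the app's stated purpose does not explain."""
    return set(requested) - EXPECTED.get(category, set())

app = {"name": "SuperBattery",
       "category": "battery_monitor",
       "permissions": ["BATTERY_STATS", "SEND_SMS", "READ_CONTACTS"]}

extra = suspicious_permissions(app["category"], app["permissions"])
if extra:
    print(f"{app['name']}: suspicious intention, unexplained {sorted(extra)}")
```
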
Date: December 2014
Creator: Fazeen, Mohamed & Issadeen, Mohamed
System: The UNT Digital Library
Uncertainty Evaluation in Large-scale Dynamical Systems: Theory and Applications (open access)

Significant research efforts have been devoted to large-scale dynamical systems, with the aim of understanding their complicated behaviors and managing their responses in real time. One pivotal technological obstacle in this process is the existence of uncertainty. Although many of these large-scale dynamical systems function well in the design stage, they may easily fail when operating in realistic environments, where environmental uncertainties modulate system dynamics and complicate real-time prediction and management tasks. This dissertation aims to develop systematic methodologies to evaluate the performance of large-scale dynamical systems under uncertainty, as a step toward real-time decision support. Two uncertainty evaluation approaches are pursued: the analytical approach and the effective simulation approach. The analytical approach abstracts the dynamics of the original stochastic systems and develops tractable analysis (e.g., jump-linear analysis) for the approximated systems. Despite the potential bias introduced in the approximation process, the analytical approach provides rich insights valuable for evaluating and managing the performance of large-scale dynamical systems under uncertainty. When a system's complexity and scale are beyond tractable analysis, the effective simulation approach becomes very useful. The effective simulation approach aims to use a few smartly selected simulations to quickly evaluate a complex system's statistical performance. This approach was originally developed …
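
A rough sketch of the kind of system being evaluated: a scalar jump-linear system whose gain switches via a two-state Markov chain, with Monte Carlo simulation used to estimate terminal statistics. All parameters are illustrative stand-ins for the dissertation's models.

```python
# Sketch: Monte Carlo evaluation of a scalar jump-linear system, where a
# two-state Markov chain switches the dynamics gain. Chain, gains, and
# horizon are illustrative assumptions.
import random

random.seed(5)
gains = {0: 0.95, 1: 1.02}        # stable mode vs. slightly unstable mode
switch = {0: 0.1, 1: 0.3}         # probability of leaving each mode

def simulate(steps=100):
    x, mode = 1.0, 0
    for _ in range(steps):
        x = gains[mode] * x + random.gauss(0, 0.01)  # mode-dependent dynamics
        if random.random() < switch[mode]:           # Markov mode transition
            mode = 1 - mode
    # (jump-linear analysis approximates these statistics analytically)
    return x

runs = [simulate() for _ in range(2000)]
mean = sum(runs) / len(runs)
var = sum((r - mean) ** 2 for r in runs) / len(runs)
print(f"terminal state: mean={mean:.3f}, variance={var:.4f}")
```
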
Date: December 2014
Creator: Zhou, Yi (Software engineer)
System: The UNT Digital Library
A New Look at Retargetable Compilers (open access)

Consumers demand new and innovative personal computing devices every two years, when their cellular phone service contracts are renewed. Yet a two-year development cycle for the concurrent development of both hardware and software is nearly impossible. As more components and features are added to the devices, maintaining this two-year cycle with current tools will become commensurately harder. This dissertation delves into the feasibility of simplifying the development of such systems by employing heterogeneous systems on a chip in conjunction with a retargetable compiler such as the hybrid computer retargetable compiler (Hy-C). An example of a simple architecture description of sufficient detail for use with a retargetable compiler like Hy-C is provided. As a software engineer with 30 years of experience, I have witnessed numerous system failures. A plethora of software development paradigms and tools have been employed to prevent software errors, but none have been completely successful. Much of the discussion centers on software development in the military contracting market, as that is my background. The dissertation reviews those tools, as well as some existing retargetable compilers, in an attempt to determine how those errors occurred and how a system like Hy-C could assist in reducing future software errors. In …
Date: December 2014
Creator: Burke, Patrick William
System: The UNT Digital Library
Exploration of Visual, Acoustic, and Physiological Modalities to Complement Linguistic Representations for Sentiment Analysis (open access)

This research is concerned with the identification of sentiment in multimodal content. This is of particular interest given the increasing presence of subjective multimodal content on the web and other sources, which constitutes a rich and vast source of people's opinions, feelings, and experiences. Despite the need for tools that can identify opinions in the presence of diverse modalities, most current methods for sentiment analysis are designed for textual data only, and few attempts have been made to address this problem. This dissertation investigates techniques for augmenting linguistic representations with acoustic, visual, and physiological features. The potential benefits of using these modalities include linguistic disambiguation, visual grounding, and the integration of information about people's internal states. The main goal of this work is to build computational resources and tools that allow sentiment analysis to be applied to multimodal data. This thesis makes three important contributions. First, it shows that modalities such as audio, video, and physiological data can be successfully used to improve existing linguistic representations for sentiment analysis. We present a method that integrates linguistic features with features extracted from these modalities. Features are derived from verbal statements, audiovisual recordings, thermal recordings, and physiological sensor signals. The resulting …
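
A minimal sketch of early fusion, with synthetic features standing in for the linguistic, acoustic, and physiological features described above: the per-modality vectors are concatenated and fed to a single classifier.

```python
# Sketch: early fusion for multimodal sentiment, concatenating linguistic,
# acoustic, and physiological feature vectors before one classifier.
# Features and labels are synthetic stand-ins for real recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 400
text_feats = rng.normal(size=(n, 20))     # e.g., unigram/lexicon scores
audio_feats = rng.normal(size=(n, 10))    # e.g., pitch and energy stats
physio_feats = rng.normal(size=(n, 5))    # e.g., skin-conductance stats

# make the label depend weakly on all three modalities
logits = text_feats[:, 0] + 0.5 * audio_feats[:, 0] + 0.5 * physio_feats[:, 0]
y = (logits > 0).astype(int)

X = np.hstack([text_feats, audio_feats, physio_feats])   # early fusion
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"fused-modality accuracy: {clf.score(X_te, y_te):.2f}")
```
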
Date: December 2014
Creator: Pérez-Rosas, Verónica
System: The UNT Digital Library
Trajectories As a Unifying Cross Domain Feature for Surveillance Systems (open access)

Manual video analysis is a tedious task, and an efficient solution is of high importance for automating the process and assisting operators. A major goal of video analysis is understanding and recognizing human activities captured by surveillance cameras, a very challenging problem; the activities can be either individual or interactional among multiple objects, and recognizing them involves extracting relevant spatial and temporal information from visual images. Most video analytics systems are constrained by specific environmental situations, and different domains may require different specific knowledge to express the characteristics of interesting events. Spatial-temporal trajectories have been utilized to capture the motion characteristics of activities. The focus of this dissertation is on how trajectories can assist in developing video analytics systems in the context of surveillance. The research reported in this dissertation begins with real-time highway traffic monitoring and dynamic traffic pattern analysis, and in the end generalizes the knowledge to event and activity analysis in a broader context. The main contributions are: the use of the graph-theoretic dominant set approach for the classification of traffic trajectories; the ability to first partition the trajectory clusters using entry and exit point awareness to significantly improve clustering effectiveness and reduce the computational time …
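
A toy sketch of entry/exit-point awareness: trajectories are first partitioned by the grid cells where they enter and leave the scene, and only within a partition are shapes compared. The trajectories and grid size are illustrative.

```python
# Sketch: partitioning traffic trajectories by entry/exit regions before
# comparing them, a simplification of the entry/exit-aware clustering
# step. Trajectories and the grid size are illustrative.
import numpy as np

def region(point, cell=0.5):
    """Quantize a point to a coarse grid cell (entry/exit awareness)."""
    return (int(point[0] // cell), int(point[1] // cell))

trajectories = [
    np.linspace((0.1, 0.10), (0.9, 0.90), 20),   # diagonal, lane A
    np.linspace((0.1, 0.15), (0.9, 0.95), 20),   # diagonal, lane A
    np.linspace((0.1, 0.90), (0.9, 0.10), 20),   # crossing lane B
]

groups = {}
for i, t in enumerate(trajectories):
    key = (region(t[0]), region(t[-1]))          # (entry, exit) cells
    groups.setdefault(key, []).append(i)

def mean_dist(a, b):
    """Shape distance: mean pointwise Euclidean distance."""
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

for key, members in groups.items():
    print(f"entry/exit {key}: trajectories {members}")
    if len(members) > 1:
        a, b = members[0], members[1]
        print(f"  shape distance {a}-{b}: {mean_dist(trajectories[a], trajectories[b]):.3f}")
```
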
Date: December 2014
Creator: Wan, Yiwen
System: The UNT Digital Library
Shepherding Network Security Protocols as They Transition to New Atmospheres: A New Paradigm in Network Protocol Analysis (open access)

The solutions presented in this dissertation describe a new paradigm in which we shepherd network security protocols through atmosphere transitions, offering new ways to analyze and monitor the state of a protocol. The approach involves identifying a protocol's transitional weaknesses through adaptation of formal models, measuring the weakness as it exists in the wild by statically analyzing applications, and showing how to use network traffic analysis to monitor protocol implementations going into the future. Throughout the effort, we follow the popular Open Authorization (OAuth) protocol in its attempts to apply its web-based roots to a mobile atmosphere. To pinpoint protocol deficiencies, we first adapt a well-regarded formal analysis and show it to be insufficient for characterizing mobile applications, tying its transitional weaknesses to implementation issues and delivering a reanalysis of the proof. We then measure the prevalence of this weakness by statically analyzing over 11,000 Android applications. While looking through source code, we develop new methods to find sensitive protocol information, overcome hurdles like obfuscation, and provide interfaces for later modeling, all while achieving a false positive rate below 10 percent. We then use network analysis to detect and verify application implementations. By collecting network traffic from Android …
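
As a small illustration of the static-analysis step, the sketch below scans hypothetical decompiled source for an embedded OAuth client secret, one concrete transitional weakness of the kind measured; the regex and sample source are assumptions, not the dissertation's tooling.

```python
# Sketch: statically scanning decompiled Android source for embedded
# OAuth client secrets. The pattern and sample source are illustrative.
import re

source = '''
    OAuthClient c = new OAuthClient();
    c.setClientId("com.example.app");
    c.setClientSecret("9f8a7b6c5d4e");  // should never ship in the APK
'''

SECRET_PATTERN = re.compile(r'setClientSecret\("([^"]+)"\)')

for match in SECRET_PATTERN.finditer(source):
    print(f"embedded client secret found: {match.group(1)}")
```
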
Date: December 2019
Creator: Talkington, Gregory Joshua
System: The UNT Digital Library
Event Sequence Identification and Deep Learning Classification for Anomaly Detection and Prediction on High-Performance Computing Systems (open access)

High-performance computing (HPC) systems continue to grow in both scale and complexity. These large-scale, heterogeneous systems generate tens of millions of log messages every day. Effective log analysis for understanding system behaviors and identifying system anomalies and failures is highly challenging. Existing log analysis approaches use line-by-line message processing; they are not effective at discovering subtle behavior patterns and their transitions, and thus may overlook critical anomalies. In this dissertation research, I propose a system log event block detection (SLEBD) method which can accurately and automatically extract the log messages that belong to a component or system event into an event block (EB). At the event level, we can discover new event patterns, the evolution of system behavior, and the interaction among different system components. To find critical event sequences, existing sequence mining methods are mostly based on the Apriori algorithm, which is compute-intensive and runs for a long time. I develop a novel, topology-aware sequence mining (TSM) algorithm which efficiently generates sequence patterns from the extracted event block lists. I also train a long short-term memory (LSTM) model to cluster sequences before specific events. With the generated sequence patterns and trained LSTM model, we can predict …
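
A minimal sketch of event-block detection by time gaps, a drastic simplification of SLEBD: consecutive log messages are grouped into a block unless separated by more than a threshold. The log lines and the 2-second gap are illustrative assumptions.

```python
# Sketch: grouping raw log lines into event blocks by time gap, a
# simplified stand-in for the SLEBD idea. The log entries and the
# 2-second threshold are illustrative.
logs = [
    (0.0, "job 17 start"), (0.1, "alloc node n3"), (0.4, "alloc node n8"),
    (5.0, "disk warn n8"), (5.2, "disk error n8"), (9.0, "job 17 end"),
]

def event_blocks(entries, gap=2.0):
    """Split the time-ordered stream wherever messages are > gap apart."""
    blocks, current = [], [entries[0]]
    for prev, cur in zip(entries, entries[1:]):
        if cur[0] - prev[0] > gap:
            blocks.append(current)
            current = []
        current.append(cur)
    blocks.append(current)
    return blocks

for i, block in enumerate(event_blocks(logs)):
    print(f"event block {i}: {[msg for _, msg in block]}")
```
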
Date: December 2019
Creator: Li, Zongze
System: The UNT Digital Library

Spatial Partitioning Algorithms for Solving Location-Allocation Problems

Access: Use of this item is restricted to the UNT Community
This dissertation presents spatial partitioning algorithms for solving location-allocation problems. Location-allocation problems pertain to both the selection of facilities to serve demand at demand points and the assignment of demand points to the selected or known facilities. In the first part of this dissertation, we focus on the well-known and well-researched "p-median problem", a distance-based location-allocation problem that involves the selection and allocation of p facilities for n demand points. We evaluate the performance of existing p-median heuristic algorithms and investigate the impact of the scale of the problem and the spatial distribution of demand points on the performance of these algorithms. Based on the results of this comparative study, we present guidelines to aid location analysts in selecting the best heuristic and corresponding parameters for the problem at hand. Additionally, we found that existing heuristic algorithms are not suitable for solving large-scale p-median problems in a reasonable amount of time, so we present a density-based decomposition methodology to solve large-scale p-median problems efficiently. This algorithm identifies dense clusters in the region and uses a MapReduce procedure to select facilities in the clustered regions independently and combine the solutions from the subproblems. Lastly, …
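
A minimal sketch of one classical p-median heuristic, greedy addition: repeatedly add the facility that most reduces total demand-to-facility distance. Data are random points; the dissertation evaluates richer heuristics and a decomposition method on top of this kind of baseline.

```python
# Sketch: a greedy-addition heuristic for the p-median problem. Points
# are random; production codes add interchange (Teitz-Bart) steps and,
# for large instances, the decomposition described above.
import random

random.seed(2)
demand = [(random.random(), random.random()) for _ in range(200)]
candidates = list(range(len(demand)))   # any demand point may host a facility
p = 4

def total_cost(facilities):
    """Each demand point is served by its nearest selected facility
    (squared Euclidean distance)."""
    cost = 0.0
    for x, y in demand:
        cost += min((x - demand[f][0]) ** 2 + (y - demand[f][1]) ** 2
                    for f in facilities)
    return cost

chosen = []
while len(chosen) < p:
    best = min((f for f in candidates if f not in chosen),
               key=lambda f: total_cost(chosen + [f]))
    chosen.append(best)

print(f"facilities at indices {chosen}, cost {total_cost(chosen):.2f}")
```
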
Date: December 2019
Creator: Gwalani, Harsha
System: The UNT Digital Library