Predictive Modeling for Persuasive Ambient Technology (open access)

Computer scientists are increasingly aware of the power of ubiquitous computing systems that can display information in and about the user's environment. One subcategory of ubiquitous computing is persuasive ambient information systems, which involve an informative display transitioning between the periphery and center of attention. The goal of this ambient technology is to produce a behavior change, implying that a display must be informative, unobtrusive, and persuasive. While a significant body of research exists on ambient technology, previous research has not fully explored the different measures for identifying behavior change, evaluation techniques for linking design characteristics to visual effectiveness, or the use of short-term goals to affect long-term behavior change. This study uses the unique context of noise-induced hearing loss (NIHL) among collegiate musicians to explore these issues by developing the NIHL Reduction Feedback System, which collects real-time data, translates it into visuals for music classrooms, provides predictive outcomes for goal-setting persuasion, and provides statistical measures of behavior change.
Date: August 2015
Creator: Powell, Jason W.
System: The UNT Digital Library
Evaluation Techniques and Graph-Based Algorithms for Automatic Summarization and Keyphrase Extraction (open access)

Automatic text summarization and keyphrase extraction are two interesting areas of research that span natural language processing and information retrieval. They have recently become very popular because of their wide applicability. Devising generic techniques for these tasks is challenging due to several issues, yet a good number of intelligent systems perform these tasks. As different systems are designed from different perspectives, evaluating their performance with a generic strategy is crucial. It has also become immensely important to evaluate performance with minimal human effort. In our work, we focus on designing a relativized scale for evaluating different algorithms. This is our major contribution, which challenges the traditional approach of working with an absolute scale. We consider the impact of some of the environment variables (length of the document, references, and system-generated outputs) on performance. Instead of defining rigid lengths, we show how to adjust to their variations. We establish a mathematically sound baseline that should work for all kinds of documents. We emphasize automatically determining the syntactic well-formedness of the structures (sentences). We also propose defining an equivalence class for each unit (e.g., word) instead of relying on exact string matching. We show an evaluation …
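
As a minimal sketch of the equivalence-class idea above (assuming a crude lowercase-and-suffix-stripping normalizer as a stand-in for the dissertation's actual relation), matching units by class rather than by exact string looks roughly like this:

def equivalence_class(token: str) -> str:
    """Map a token to a crude class representative (stand-in for the real relation)."""
    t = token.lower()
    for suffix in ("ing", "ed", "es", "s"):
        if t.endswith(suffix) and len(t) > len(suffix) + 2:
            return t[: -len(suffix)]
    return t

def overlap_score(system_units: list[str], reference_units: list[str]) -> float:
    """Fraction of reference units matched under the equivalence relation."""
    ref_classes = {equivalence_class(u) for u in reference_units}
    matched = sum(1 for u in system_units if equivalence_class(u) in ref_classes)
    return matched / len(reference_units) if reference_units else 0.0

# Example: "models" and "modeling" fall into the same class, so they match.
print(overlap_score(["modeling", "data"], ["models", "evaluation"]))  # 0.5
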
Date: August 2016
Creator: Hamid, Fahmida
System: The UNT Digital Library
Resource Management in Wireless Networks (open access)

A local call admission control (CAC) algorithm for third-generation wireless networks was designed and implemented, allowing the simulation of network throughput for different spreading factors and various mobility scenarios. A global CAC algorithm is also implemented and used as a benchmark since it is inherently optimized; it yields the best possible performance but has intensive computational complexity. The optimized local CAC algorithm achieves performance similar to the global CAC algorithm at a fraction of the computational cost. The design of a dynamic channel assignment algorithm for IEEE 802.11 wireless systems is also presented. Channels are assigned dynamically depending on the minimal interference generated by the neighboring access points on a reference access point. Analysis of the dynamic channel assignment algorithm shows an improvement by a factor of 4 over the default setting of having all access points use the same channel, resulting in significantly higher network throughput.
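
A rough illustration of the dynamic channel assignment idea, not the dissertation's exact algorithm: for a reference access point, pick the channel on which co-channel interference from neighboring access points is smallest. The neighbor channels and interference weights below are hypothetical inputs.

def assign_channel(neighbor_channels, neighbor_interference, channels=(1, 6, 11)):
    """Pick the channel minimizing co-channel interference from neighboring APs.

    neighbor_channels[i]     -- channel currently used by neighbor i
    neighbor_interference[i] -- interference weight neighbor i contributes
                                (e.g., derived from received signal strength)
    """
    def total_interference(ch):
        return sum(w for c, w in zip(neighbor_channels, neighbor_interference) if c == ch)
    return min(channels, key=total_interference)

# Hypothetical neighborhood: two neighbors on channel 1, one on channel 6.
print(assign_channel([1, 1, 6], [0.8, 0.5, 0.3]))  # -> 11 (no co-channel neighbors)
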
Date: August 2006
Creator: Arepally, Anurag
System: The UNT Digital Library

Deep Learning Optimization and Acceleration

The novelty of this dissertation is the optimization and acceleration of deep neural networks aimed at real-time predictions with minimal energy consumption. It consists of cross-layer optimization, output-directed dynamic quantization, and opportunistic near-data computation for deep neural network acceleration. On two datasets (CIFAR-10 and CIFAR-100), the proposed deep neural network optimization and acceleration frameworks are tested using a variety of convolutional neural networks (e.g., LeNet-5, VGG-16, GoogLeNet, DenseNet, ResNet). Experimental results are promising when compared to other state-of-the-art deep neural network acceleration efforts in the literature.
Date: August 2022
Creator: Jiang, Beilei
System: The UNT Digital Library
Non-Uniform Grid-Based Coordinated Routing in Wireless Sensor Networks (open access)

Wireless sensor networks are ad hoc networks of tiny battery-powered sensor nodes that can organize themselves into self-organized networks and collect information regarding temperature, light, and pressure in an area. Though the applications of sensor networks are very promising, sensor nodes are limited in their capabilities due to many factors. The main limitation of these battery-powered nodes is energy. Sensor networks are expected to work for long periods of time once deployed, so it becomes important to conserve the battery life of the nodes to extend the network lifetime. This work examines a non-uniform grid-based routing protocol as an effort to minimize energy consumption in the network and extend network lifetime. The entire test area is divided into non-uniformly shaped grids. Fixed source and sink nodes with unlimited energy are placed in the network. Sensor nodes with full battery life are deployed uniformly and randomly in the field. The source node floods the network with only the coordinator node active in each grid and the other nodes sleeping. The sink node traces the same route back to the source node through the same coordinators. This process continues until a coordinator node runs out of energy, when new coordinator nodes …
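
A simplified sketch of the coordinator mechanism described above: one node per grid cell stays awake as coordinator while the rest sleep, and the election is re-run when coordinators are depleted. The choice of the highest-residual-energy node and the energy threshold are assumptions for illustration.

def elect_coordinators(nodes, min_energy=0.05):
    """Pick one coordinator per grid cell: the alive node with the most residual energy.

    nodes: list of dicts like {"id": 3, "cell": (2, 1), "energy": 0.7}
    Returns {cell: node_id}; cells with no viable node are omitted.
    """
    coordinators = {}
    for node in nodes:
        if node["energy"] <= min_energy:        # node is considered depleted
            continue
        cell = node["cell"]
        best = coordinators.get(cell)
        if best is None or node["energy"] > best["energy"]:
            coordinators[cell] = node
    return {cell: n["id"] for cell, n in coordinators.items()}

# Re-running the election after each round lets depleted coordinators hand off.
nodes = [{"id": 1, "cell": (0, 0), "energy": 0.9},
         {"id": 2, "cell": (0, 0), "energy": 0.4},
         {"id": 3, "cell": (0, 1), "energy": 0.02}]
print(elect_coordinators(nodes))  # {(0, 0): 1}
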
Date: August 2008
Creator: Kadiyala, Priyanka
System: The UNT Digital Library
Computational Methods to Optimize High-Consequence Variants of the Vehicle Routing Problem for Relief Networks in Humanitarian Logistics (open access)

Optimization of relief networks in humanitarian logistics often exemplifies the need for solutions that are feasible given a hard constraint on time. For instance, the distribution of medical countermeasures immediately following a biological disaster event must be completed within a short time frame. When these supplies are not distributed within the maximum time allowed, the severity of the disaster is quickly exacerbated. Therefore, emergency response plans that fail to facilitate the transportation of these supplies in the time allowed are simply not acceptable. As a result, all optimization solutions that fail to satisfy this criterion would be deemed infeasible. This creates a conflict with the priority optimization objective in most variants of the generic vehicle routing problem (VRP). Instead of efficiently maximizing usage of the vehicle resources available to construct a feasible solution, these variants ordinarily prioritize the construction of a minimum-cost set of vehicle routes. Research presented in this dissertation focuses on the design and analysis of efficient computational methods for optimizing high-consequence variants of the VRP for relief networks. The conflict between prioritizing the minimization of the number of vehicles required and the minimization of total travel time is demonstrated. The optimization of the time and capacity constraints in …
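
To make the hard time constraint concrete, here is a small feasibility check in the spirit of the discussion above (not the dissertation's solver): a candidate set of routes is acceptable only if every route completes within the maximum allowed response time, however cheap it may otherwise be. The travel times and deadline are hypothetical.

def route_duration(route, travel_time, service_time=0.0):
    """Total time for one vehicle route given pairwise travel times (hours)."""
    legs = zip(route, route[1:])
    return sum(travel_time[a][b] for a, b in legs) + service_time * (len(route) - 2)

def plan_is_feasible(routes, travel_time, max_hours):
    """A plan is feasible only if *every* route finishes within the hard deadline."""
    return all(route_duration(r, travel_time) <= max_hours for r in routes)

# Hypothetical 3-site instance: depot 0, points of dispensing 1 and 2.
travel_time = {0: {1: 2.0, 2: 3.5}, 1: {0: 2.0, 2: 1.0}, 2: {0: 3.5, 1: 1.0}}
routes = [[0, 1, 2, 0]]                                      # one vehicle visits both sites
print(plan_is_feasible(routes, travel_time, max_hours=8))    # True  (2 + 1 + 3.5 = 6.5)
print(plan_is_feasible(routes, travel_time, max_hours=6))    # False -- infeasible, regardless of cost
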
Date: August 2018
Creator: Urbanovsky, Joshua C.
System: The UNT Digital Library
Dataflow Processing in Memory Achieves Significant Energy Efficiency (open access)

The large difference between processor CPU cycle time and memory access time, often referred to as the memory wall, severely limits the performance of streaming applications. Some data centers have shown servers being idle three out of every four clock cycles. High-performance instruction-sequenced systems are not energy efficient. The execute stage of even simple pipelined processors uses only 9% of the pipeline's total energy. A hybrid dataflow system within a memory module is shown to have 7.2 times the performance with 368 times better energy efficiency than an Intel Xeon server processor on the analyzed benchmarks. The dataflow implementation exploits the inherent parallelism and pipelining of the application to improve performance without the overhead functions of caching, instruction fetch, instruction decode, instruction scheduling, reorder buffers, and speculative execution used by high-performance out-of-order processors. Coarse-grain reconfigurable logic in an energy-efficient silicon process provides flexibility to implement multiple algorithms in a low-energy solution. Integrating the logic within a 3D-stacked memory module provides lower-latency and higher-bandwidth access to memory while operating independently from the host system processor.
Date: August 2018
Creator: Shelor, Charles F.
System: The UNT Digital Library
Adaptive Power Management for Autonomic Resource Configuration in Large-scale Computer Systems (open access)

In order to run and manage resource-intensive high-performance applications, large-scale computing and storage platforms have been evolving rapidly in various domains in both academia and industry. The energy expenditure required to operate and maintain these cloud computing infrastructures is a major factor influencing the overall profit and efficiency of most cloud service providers. Moreover, considering the mitigation of environmental damage from excessive carbon dioxide emissions, the amount of power consumed by enterprise-scale data centers should be constrained to protect the environment. Generally speaking, there exists a trade-off between power consumption and application performance in large-scale computing systems, and how to balance these two factors has become an important topic for researchers and engineers in the cloud and HPC communities. Therefore, minimizing power usage while satisfying service level agreements has become one of the most desirable objectives in cloud computing research and implementation. Since the fundamental feature of the cloud computing platform is hosting workloads with a variety of characteristics in a consolidated and on-demand manner, it is necessary to explore the inherent relationship between power usage and machine configurations. Subsequently, with an understanding of these inherent relationships, researchers are able to develop effective power management policies to optimize …
Date: August 2015
Creator: Zhang, Ziming
System: The UNT Digital Library
Automated Real-time Objects Detection in Colonoscopy Videos for Quality Measurements (open access)

The effectiveness of colonoscopy depends on the quality of the inspection of the colon. There was no automated measurement method to evaluate the quality of the inspection. This thesis addresses this issue by investigating an automated post-procedure quality measurement technique and proposing a novel approach that automatically determines the percentage of stool areas in images of digitized colonoscopy video files. It involves the classification of image pixels based on their color features using a new method of planes in RGB (red, green, and blue) color space. The limitation of post-procedure quality measurement is that the measurements are available long after the procedure was done and the patient was released. A better approach is to report any sub-optimal inspection immediately so that the endoscopist can improve the quality in real time during the procedure. This thesis also proposes an extension to the post-procedure method to detect stool, bite-block, and blood regions in real time using color features in HSV color space. These three objects play a major role in quality measurements in colonoscopy. The proposed method partitions very large sets of positive examples of each of these objects into a number of groups. These groups are formed by taking the intersection of positive examples with a hyperplane. …
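
The "planes in RGB color space" idea can be sketched as follows: a plane with coefficients (a, b, c, d) splits RGB space, and a pixel counts as stool when it falls on the positive side of every plane in a small set. The plane coefficients below are invented for illustration; the thesis derives its planes from labeled examples.

def on_positive_side(pixel, plane):
    """True if an (R, G, B) pixel lies on the positive side of plane (a, b, c, d)."""
    r, g, b = pixel
    a, b_coef, c, d = plane
    return a * r + b_coef * g + c * b + d > 0

def stool_percentage(pixels, planes):
    """Percentage of pixels classified as stool: positive side of every plane."""
    flagged = sum(1 for p in pixels if all(on_positive_side(p, pl) for pl in planes))
    return 100.0 * flagged / len(pixels) if pixels else 0.0

# Hypothetical planes favoring brownish/yellowish pixels (R clearly above B, G above B).
planes = [(1.0, 0.0, -1.0, -30.0),   # R - B > 30
          (0.0, 1.0, -1.0, -10.0)]   # G - B > 10
frame = [(180, 140, 60), (90, 95, 100), (200, 160, 80)]
print(stool_percentage(frame, planes))  # ~66.7
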
Date: August 2013
Creator: Kumara, Muthukudage Jayantha
System: The UNT Digital Library
Detection of Temporal Events and Abnormal Images for Quality Analysis in Endoscopy Videos (open access)

Recent reports suggest that measuring objective quality is essential to the success of colonoscopy. Several quality indicators (i.e., metrics) proposed in recent studies are implemented in software systems that compute real-time quality scores for routine screening colonoscopy. Most quality metrics are derived from various temporal events that occur during the colonoscopy procedure. The location of the phase boundary between the insertion and withdrawal phases and the amount of circumferential inspection are two such important temporal events. These two temporal events can be determined by analyzing various camera motions of the colonoscope. This dissertation puts forward a novel method to estimate the X, Y, and Z directional motions of the colonoscope using motion vector templates. Since abnormalities in a wireless capsule endoscopy (WCE) or colonoscopy video may appear in only a small number of frames (around 5% of the total frames), it is very helpful if a computer system can decide whether a frame has any mucosal abnormalities. Also, the number of abnormal lesions detected during a procedure is used as a quality indicator. The majority of existing abnormality detection methods focus on detecting only one type of abnormality, or the overall accuracies are somewhat low if the method tries to …
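
A rough two-dimensional sketch of the motion-vector-template idea (the dissertation's templates also cover Z-directional motion and are more elaborate): the frame's motion vector field is correlated against a few direction templates, and the best-matching template gives the estimated camera motion. The vectors and templates here are hypothetical.

def dominant_motion(motion_vectors, templates):
    """Return the template label whose direction best matches the motion field.

    motion_vectors: list of (dx, dy) block motion vectors for one frame
    templates: dict label -> unit (dx, dy) direction, e.g. {"left": (-1, 0)}
    """
    def score(direction):
        tx, ty = direction
        return sum(dx * tx + dy * ty for dx, dy in motion_vectors)  # correlation
    return max(templates, key=lambda label: score(templates[label]))

templates = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}
frame_vectors = [(3, 0), (2, 1), (4, -1)]        # mostly rightward block motion
print(dominant_motion(frame_vectors, templates))  # "right"
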
Date: August 2013
Creator: Nawarathna, Ruwan D.
System: The UNT Digital Library
Simulation of Dengue Outbreak in Thailand (open access)

The dengue virus has become widespread worldwide in recent decades. It has no specific treatment and affects more than 40% of the world's population. In Thailand, dengue has been a health concern for more than half a century. The highest number of cases in one year was 174,285 in 1987, leading to 1,007 deaths. At present, dengue is distributed throughout the entire country. Therefore, dengue has become a major challenge for public health in terms of both prevention and control of outbreaks. Different methodologies and ways of dealing with dengue outbreaks have been put forward by researchers. Computational models and simulations play an important role, as they can help researchers and public health officials gain a greater understanding of the virus's epidemic activity. In this context, this dissertation presents a new framework, Modified Agent-Based Modeling (mABM), a hybrid platform between a mathematical model and a computational model, to simulate a dengue outbreak in human and mosquito populations. This framework improves on the realism of former models by utilizing reported data from several Thai government organizations, such as the Thai Ministry of Public Health (MoPH), the National Statistical Office, and others. …
Date: August 2018
Creator: Meesumrarn, Thiraphat
System: The UNT Digital Library
Secure and Trusted Execution Framework for Virtualized Workloads (open access)

In this dissertation, we have analyzed various security and trust solutions for modern computing systems and proposed a framework that provides holistic security and trust for the entire lifecycle of a virtualized workload. The framework consists of three novel techniques and a set of guidelines. These three techniques provide the necessary elements for a secure and trusted execution environment, while the guidelines ensure that the virtualized workload remains in a secure and trusted state throughout its lifecycle. We have successfully implemented the framework and demonstrated that it provides security and trust guarantees at the time of launch, at any time during execution, and during an update of the virtualized workload. Given the proliferation of virtualization from cloud servers to embedded systems, the techniques presented in this dissertation can be implemented on most computing systems.
Date: August 2018
Creator: Kotikela, Srujan D
System: The UNT Digital Library
New Frameworks for Secure Image Communication in the Internet of Things (IoT) (open access)

The continuous expansion of technology, broadband connectivity, and the wide range of new devices in the IoT cause serious concerns regarding privacy and security. In addition, a key challenge in the IoT is the storage and management of massive data streams. For example, there is constant demand for images of acceptable size with the highest possible quality to meet the rapidly increasing number of multimedia applications. Given the fast evolution of the Internet of Things (IoT), this dissertation contributes to resolving concerns related to the security and compression functions of image communication in the IoT. It proposes frameworks for a secure digital camera in the IoT. The objectives of this dissertation are twofold. On the one hand, the proposed framework architecture offers a double layer of protection, encryption and watermarking, which addresses issues related to security, privacy, and digital rights management (DRM) by applying a hardware architecture of the state-of-the-art image compression technique Better Portable Graphics (BPG), which achieves a high compression ratio with small size. On the other hand, the proposed secure BPG (SBPG) framework is integrated with the secure digital camera (SDC). Thus, the proposed SBPG framework integrated with the SDC is suitable …
Date: August 2016
Creator: Albalawi, Umar Abdalah S
System: The UNT Digital Library
Sensing and Decoding Brain States for Predicting and Enhancing Human Behavior, Health, and Security (open access)

The human brain acts as an intelligent sensor, helping in effective signal communication and the execution of logical functions and instructions, thus coordinating all functions of the human body. More importantly, it shows the potential to combine prior knowledge with adaptive learning, thus ensuring constant improvement. These qualities help the brain to interact efficiently with both the body (brain-body) and the environment (brain-environment). This dissertation attempts to apply brain-body-environment interactions (BBEI) to elevate human existence and enhance our day-to-day experiences. For instance, when one stepped out of the house in the past, one had to carry keys (for unlocking), money (for purchasing), and a phone (for communication). With the advent of smartphones, this scenario changed completely, and today it is often enough to carry just one's smartphone because all of the above activities can be performed with a single device. In the future, with advanced research and progress in BBEI interactions, one will be able to perform many activities by dictating them in one's mind without any physical involvement. This dissertation aims to shift the paradigm of existing brain-computer interfaces from just 'control' to 'monitor, control, enhance, and restore' in three main areas: healthcare, transportation safety, and cryptography. …
Date: August 2016
Creator: Bajwa, Garima
System: The UNT Digital Library

Frameworks for Attribute-Based Access Control (ABAC) Policy Engineering

In this dissertation we propose semi-automated top-down policy engineering approaches for attribute-based access control (ABAC) development. Further, we propose a hybrid ABAC policy engineering approach to combine the benefits and address the shortcomings of both top-down and bottom-up approaches. In particular, we propose three frameworks: (i) ABAC attribute extraction, (ii) ABAC constraint extraction, and (iii) hybrid ABAC policy engineering. The attribute extraction framework comprises five modules that operate together to extract attribute values from natural language access control policies (NLACPs), map the extracted values to attribute keys, and assign each key-value pair to an appropriate entity. For the ABAC constraint extraction framework, we design a two-phase process to extract ABAC constraints from NLACPs. The process begins with the identification phase, which focuses on identifying the right boundary of constraint expressions. Next is the normalization phase, which aims at extracting the actual elements that pose a constraint. Our hybrid ABAC policy engineering framework, in turn, consists of five modules. This framework combines top-down and bottom-up policy engineering techniques to overcome the shortcomings of both approaches and to generate policies that are more intuitive and relevant to actual organization policies. With this, we believe that our work takes essential steps towards …
Date: August 2020
Creator: Alohaly, Manar
System: The UNT Digital Library
Kriging Methods to Exploit Spatial Correlations of EEG Signals for Fast and Accurate Seizure Detection in the IoMT (open access)

Epileptic seizures present a formidable threat to the lives of sufferers, leaving them unconscious within seconds of onset. With a mortality rate at least twice that of the general population, epilepsy is a true cause for concern that has gained ample attention from various research communities. About 800 million people in the world will have at least one seizure experience in their lifetime. Injuries sustained during a seizure crisis are one of the leading causes of death in epilepsy. These can be prevented by early detection of a seizure accompanied by a timely intervention mechanism. The research presented in this dissertation explores Kriging methods to exploit spatial correlations of electroencephalogram (EEG) signals from the brain for fast and accurate seizure detection in the Internet of Medical Things (IoMT) using edge computing paradigms, by modeling the brain as a three-dimensional spatial object, similar to a geographical panorama. This dissertation proposes basic, hierarchical, and distributed Kriging models, with a deep neural network (DNN) wrapper in some instances. Experimental results from the models are highly promising for real-time seizure detection, with excellent performance in seizure detection latency and training time, as well as accuracy, sensitivity, and specificity, which compare …
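
A compact ordinary Kriging sketch in the spirit of the approach above: given EEG readings at known electrode positions, the value at an unsampled scalp location is predicted as a weighted sum whose weights come from solving the Kriging system. The exponential variogram and its parameters are assumptions for illustration, not the dissertation's fitted model.

import numpy as np

def variogram(h, sill=1.0, rng=5.0):
    """Assumed exponential variogram of separation distance h."""
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(coords, values, target):
    """Predict the field value at `target` from samples at `coords` (ordinary Kriging)."""
    coords, values = np.asarray(coords, float), np.asarray(values, float)
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system: [[gamma, 1], [1, 0]] @ [weights, mu] = [gamma_to_target, 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(coords - np.asarray(target, float), axis=1))
    weights = np.linalg.solve(A, b)[:n]
    return float(weights @ values)

# Hypothetical 2-D electrode layout (e.g., projected 10-20 positions) and readings.
electrodes = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
readings = [12.0, 15.0, 9.0, 14.0]
print(ordinary_kriging(electrodes, readings, target=(2.0, 2.0)))  # interpolated value
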
Date: August 2020
Creator: Olokodana, Ibrahim Latunde
System: The UNT Digital Library
Joint Schemes for Physical Layer Security and Error Correction (open access)

The major challenges facing resource-constrained wireless devices are error resilience, security, and speed. Three joint schemes are presented in this research, which can be broadly divided into error-correction-based and cipher-based. The error-correction-based ciphers take advantage of the properties of LDPC codes and the Nordstrom-Robinson code. A cipher-based cryptosystem is also presented in this research. The complexity of this scheme is reduced compared to conventional schemes. The security of the ciphers is analyzed against known-plaintext and chosen-plaintext attacks, and they are found to be secure. Randomization tests were also conducted on these schemes and the results are presented. As a proof of concept, the schemes were implemented in software and hardware, and these implementations show a reduction in hardware usage compared to conventional schemes. As a result, joint schemes for error correction and security provide security to the physical layer of wireless communication systems, a layer in the protocol stack where currently little or no security is implemented. In this physical layer security approach, the properties of powerful error-correcting codes are exploited to deliver reliability to the intended parties, high security against eavesdroppers, and efficiency in the communication system. The notion of a highly secure and reliable …
Date: August 2011
Creator: Adamo, Oluwayomi Bamidele
System: The UNT Digital Library
Scene Analysis Using Scale Invariant Feature Extraction and Probabilistic Modeling (open access)

Conventional pattern recognition systems have two components: feature analysis and pattern classification. For any object in an image, features can be considered the major characteristics of the object for either object recognition or object tracking purposes. Features extracted from a training image can be used to identify the object when attempting to locate it in a test image containing many other objects. To perform reliable scene analysis, it is important that the features extracted from the training image are detectable even under changes in image scale, noise, and illumination. Scale-invariant features have wide applications such as image classification, object recognition, and object tracking in the image processing area. In this thesis, color features and SIFT (scale-invariant feature transform) are considered scale-invariant features. The classification, recognition, and tracking results were evaluated with a novel evaluation criterion and compared with some existing methods. I also studied different types of scale-invariant features for the purpose of solving scene analysis problems. I propose probabilistic models as the foundation for analyzing scene scenarios in images. In order to differentiate the content of images, I develop novel algorithms for the adaptive combination of multiple features extracted from images. I …
Date: August 2011
Creator: Shen, Yao
System: The UNT Digital Library
Social Network Simulation and Mining Social Media to Advance Epidemiology (open access)

Traditional public health decision support can benefit from the Web and social media revolution. This dissertation presents approaches to mining social media that benefit public health epidemiology. Through discovery and analysis of trends in influenza-related blogs, a correlation to Centers for Disease Control and Prevention (CDC) influenza-like-illness patient reporting at sentinel health-care providers is verified. A second approach considers personal beliefs about vaccination in social media. A vaccine for human papillomavirus (HPV) was approved by the Food and Drug Administration in May 2006. The virus is present in nearly all cervical cancers and implicated in many throat and oral cancers. Results from automatic sentiment classification of HPV vaccination beliefs are presented, which will enable more accurate prediction of the vaccine's population-level impact. Two epidemic models are introduced that embody the intimate social networks related to HPV transmission. Ultimately, aggregating these methodologies with epidemic and social network modeling facilitates the effective development of strategies for targeted interventions.
Date: August 2009
Creator: Corley, Courtney David
System: The UNT Digital Library

SurfKE: A Graph-Based Feature Learning Framework for Keyphrase Extraction

Access: Use of this item is restricted to the UNT Community
Current unsupervised approaches for keyphrase extraction compute a single importance score for each candidate word by considering the number and quality of its associated words in the graph, and they are not flexible enough to incorporate multiple types of information. For instance, nodes in a network may exhibit diverse connectivity patterns that are not captured by graph-based ranking methods. To address this, we present a new approach to keyphrase extraction that represents the document as a word graph and exploits its structure in order to reveal underlying explanatory factors hidden in the data that may distinguish keyphrases from non-keyphrases. Experimental results show that our model, which uses phrase graph representations in a supervised probabilistic framework, obtains remarkable improvements in performance over previous supervised and unsupervised keyphrase extraction systems.
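
To make the word-graph representation concrete, a minimal sketch (not SurfKE's full feature-learning pipeline): candidate words become nodes, co-occurrence within a sliding window adds weighted edges, and a simple weighted-degree score ranks candidates. SurfKE itself feeds richer structural features into a supervised probabilistic model.

from collections import defaultdict

def build_word_graph(tokens, window=3):
    """Undirected co-occurrence graph: edge weight = number of co-occurrences in a window."""
    graph = defaultdict(lambda: defaultdict(int))
    for i, word in enumerate(tokens):
        for j in range(i + 1, min(i + window, len(tokens))):
            other = tokens[j]
            if word != other:
                graph[word][other] += 1
                graph[other][word] += 1
    return graph

def rank_candidates(graph):
    """Score each node by weighted degree -- one simple structural feature."""
    scores = {w: sum(neighbors.values()) for w, neighbors in graph.items()}
    return sorted(scores, key=scores.get, reverse=True)

tokens = "graph based feature learning for keyphrase extraction from a word graph".split()
print(rank_candidates(build_word_graph(tokens))[:3])  # most connected candidate words
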
Date: August 2019
Creator: Florescu, Corina Andreea
System: The UNT Digital Library
Skin Detection in Image and Video Founded in Clustering and Region Growing (open access)

Researchers have searched for decades for an efficient skin detection method, yet current methods have not overcome the major limitations. To overcome these limitations, this dissertation proposes a skin detection method based on clustering and region growing. These methods, together with a significant insight, result in a more effective algorithm. The insight concerns the capability to dynamically define the number of clusters in a collection of pixels organized as an image. In clustering for most problem domains, the number of clusters is fixed a priori, which does not perform effectively over a wide variety of data contents. Therefore, in this dissertation, a skin detection method using the above findings has been proposed and validated. This method assigns the number of clusters based on image properties and ultimately allows freedom from manual thresholding or other manual operations. The dynamic determination of clustering outcomes allows for greater automation of skin detection when dealing with uncertain real-world conditions.
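
One way to read the "dynamic number of clusters" insight is a threshold-driven scheme in which each pixel either joins the nearest existing cluster or starts a new one, so the cluster count emerges from the image rather than being fixed a priori. The leader-style sketch below and its color-distance threshold are assumptions; the dissertation's actual criterion may differ.

def dynamic_clusters(pixels, max_dist=40.0):
    """Group (R, G, B) pixels; the number of clusters is driven by the data."""
    centers, members = [], []
    for p in pixels:
        # Find the closest existing cluster center, if any.
        best, best_d = None, None
        for idx, c in enumerate(centers):
            d = sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5
            if best_d is None or d < best_d:
                best, best_d = idx, d
        if best is not None and best_d <= max_dist:
            members[best].append(p)                       # join existing cluster
            n = len(members[best])                        # update running mean center
            centers[best] = tuple((c * (n - 1) + v) / n
                                  for c, v in zip(centers[best], p))
        else:
            centers.append(tuple(float(v) for v in p))    # start a new cluster
            members.append([p])
    return centers, members

pixels = [(200, 150, 130), (205, 148, 128), (60, 70, 80), (62, 75, 78)]
centers, members = dynamic_clusters(pixels)
print(len(centers))  # 2 -- cluster count determined by the data, not fixed a priori
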
Date: August 2019
Creator: Islam, A B M Rezbaul
System: The UNT Digital Library
Sentence Similarity Analysis with Applications in Automatic Short Answer Grading (open access)

In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and introduce a novel technique to improve the performance of the system by integrating automatic feedback from student answers. I then combine graph alignment features with lexical semantic similarity measures and employ machine learning techniques to show that grade assignment error can be reduced compared to a system that considers only lexical semantic measures of similarity. I also detail a preliminary attempt to align the dependency graphs of student and instructor answers in order to utilize a structural component that is necessary to simulate human-level grading of student answers. I further explore the utility of these techniques for several related tasks in natural language processing, including the detection of text similarity, paraphrase, and textual entailment.
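
As a minimal lexical baseline in the spirit of the similarity measures compared above, student and instructor answers can be turned into bag-of-words vectors and scored by cosine similarity, then scaled onto the grading range. This is only the simplest measure; the knowledge-based and corpus-based measures, graph alignment, and learned combination in the dissertation go well beyond it.

import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two answers."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va.keys() & vb.keys())
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

def suggested_grade(student: str, instructor: str, max_points: float = 5.0) -> float:
    """Map similarity onto the grading scale (a deliberately crude baseline)."""
    return round(max_points * cosine_similarity(student, instructor), 1)

instructor = "a compiler translates source code into machine code"
student = "it translates the source code of a program into machine code"
print(suggested_grade(student, instructor))  # lexical-overlap-based score out of 5
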
Date: August 2012
Creator: Mohler, Michael A. G.
System: The UNT Digital Library
Application of Adaptive Techniques in Regression Testing for Modern Software Development (open access)

In this dissertation we investigate the applicability of different adaptive techniques to improve the effectiveness and efficiency of regression testing. Initially, we introduce the concept of regression testing. We then perform a literature review of current practices and state-of-the-art regression testing techniques. Finally, we advance regression testing techniques by performing four empirical studies in which we use different types of information (e.g., user sessions, source code, code commits) to investigate the effectiveness of each software metric on fault detection capability for different software environments. In our first empirical study, we show the effectiveness of applying user session information for test case prioritization. In our next study, applying lessons learned from the previous study, we implement a collaborative filtering recommender system for test case prioritization, which uses user sessions and change history information as input parameters and returns the risk score associated with each component. Results of this study show that our recommender system improves the effectiveness of test prioritization; the performance of our approach was particularly noteworthy when we were under time constraints. We then investigate the merits of multi-objective testing over single-objective techniques with a graph-based testing framework. Results of this study indicate that the …
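
A small sketch of the prioritization step (the collaborative-filtering recommender itself is out of scope here): each test case receives a risk score blending its historical failure rate with its overlap against the current change set, and tests run in descending score order. The weights and input data are hypothetical.

def risk_score(test, changed_files, w_history=0.5, w_change=0.5):
    """Blend historical failure rate with overlap against the current change set.

    test: {"name": str, "failure_rate": float in [0, 1], "covers": set of file paths}
    """
    overlap = len(test["covers"] & changed_files) / len(changed_files) if changed_files else 0.0
    return w_history * test["failure_rate"] + w_change * overlap

def prioritize(tests, changed_files):
    """Order test cases so the riskiest (most likely to expose a fault) run first."""
    return sorted(tests, key=lambda t: risk_score(t, changed_files), reverse=True)

tests = [
    {"name": "test_login",  "failure_rate": 0.10, "covers": {"auth.py"}},
    {"name": "test_cart",   "failure_rate": 0.30, "covers": {"cart.py", "auth.py"}},
    {"name": "test_search", "failure_rate": 0.05, "covers": {"search.py"}},
]
changed = {"auth.py", "cart.py"}
print([t["name"] for t in prioritize(tests, changed)])  # ['test_cart', 'test_login', 'test_search']
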
Date: August 2019
Creator: Azizi, Maral
System: The UNT Digital Library
Modeling Epidemics on Structured Populations: Effects of Socio-demographic Characteristics and Immune Response Quality (open access)

Epidemiologists study the distribution and determinants of health-related states or events in human populations, and eventually apply that study to prevent and control problems and contingencies associated with the health of the population. Due to the spread of new pathogens and the emergence of new bio-terrorism threats, it has become imperative to develop new, and expand existing, techniques to equip public health providers with robust tools to predict and control health-related crises. In this dissertation, I explore the effects that differences in individuals' physiology and social/behavioral characteristics have on disease dynamics. Multiple computational and mathematical models were developed to quantify the effect of those factors on spatial and temporal variations of disease epidemics. I developed statistical methods to measure the effects on outbreak dynamics of incorporating heterogeneous demographics and social interactions into the individuals of the population. Specifically, I studied the relationship between the demographics and the physiological characteristics of an individual when preparing for an infectious disease epidemic.
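
As a toy illustration of how individual heterogeneity enters such models (not the dissertation's calibrated models), the step function below spreads infection over a contact network while scaling each contact's transmission probability by the susceptible individual's immune-response quality.

import random

def sir_step(status, contacts, susceptibility, base_p=0.1, recovery_p=0.2, rng=random):
    """One time step of a network SIR model with per-individual susceptibility.

    status: dict person -> "S" | "I" | "R"
    contacts: dict person -> list of neighbors
    susceptibility: dict person -> multiplier reflecting immune-response quality
    """
    new_status = dict(status)
    for person, state in status.items():
        if state == "I":
            for neighbor in contacts[person]:
                if status[neighbor] == "S" and rng.random() < base_p * susceptibility[neighbor]:
                    new_status[neighbor] = "I"
            if rng.random() < recovery_p:
                new_status[person] = "R"
    return new_status

# Tiny hypothetical population: person 1 has a weaker immune response (higher multiplier).
status = {0: "I", 1: "S", 2: "S"}
contacts = {0: [1, 2], 1: [0], 2: [0]}
susceptibility = {0: 1.0, 1: 1.8, 2: 0.6}
print(sir_step(status, contacts, susceptibility))
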
Date: August 2014
Creator: Reyes Silveyra, Jorge A.
System: The UNT Digital Library