Bayesian Probabilistic Reasoning Applied to Mathematical Epidemiology for Predictive Spatiotemporal Analysis of Infectious Diseases (open access)

Bayesian Probabilistic Reasoning Applied to Mathematical Epidemiology for Predictive Spatiotemporal Analysis of Infectious Diseases

Probabilistic reasoning under uncertainty is well suited to the analysis of disease dynamics. The stochastic nature of disease progression is modeled by applying the principles of Bayesian learning, which predicts disease progression, including prevalence and incidence, for a given geographic region and demographic composition. Public health resources, prioritized by the risk levels of the population, can then minimize disease spread and curtail the epidemic as early as possible. A Bayesian network representing an outbreak of influenza and pneumonia in one geographic region is ported to a new region with a different demographic composition, and the corresponding prevalence of influenza and pneumonia among the demographic subgroups of the new region is inferred. Bayesian reasoning coupled with the disease timeline is used to reverse engineer an influenza outbreak for a given geographic and demographic setting, and the temporal flow of the epidemic among the different sections of the population is analyzed to identify their risk levels. Compared with uniformly spread vaccination, prioritizing limited vaccination resources toward higher-risk groups results in relatively lower influenza prevalence. HIV incidence in Texas from 1989-2002 is analyzed using demographic-based epidemic curves. Dynamic Bayesian networks are integrated with …
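As a rough illustration of the porting step described above, the sketch below applies Bayes' rule to hypothetical group-conditional prevalence figures and a new region's demographic mix; the numbers and group names are invented for illustration and are not drawn from the thesis.

    # Illustrative only: hypothetical prevalence figures, not the thesis data.
    # Port group-conditional prevalence to a region with a different demographic
    # mix, then rank groups by posterior risk using Bayes' rule.
    prevalence = {"child": 0.12, "adult": 0.05, "senior": 0.18}      # P(flu | group)
    new_region_mix = {"child": 0.30, "adult": 0.55, "senior": 0.15}  # P(group)

    # Expected overall prevalence in the new region: sum over groups of P(flu|g)P(g).
    overall = sum(prevalence[g] * new_region_mix[g] for g in prevalence)

    # Posterior P(group | flu) shows which groups carry most of the burden,
    # which is the ordering used to prioritize vaccination resources.
    posterior = {g: prevalence[g] * new_region_mix[g] / overall for g in prevalence}

    print("expected prevalence: %.3f" % overall)
    for group, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print("P(%s | flu) = %.2f" % (group, p))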
Date: May 2006
Creator: Abbas, Kaja Moinudeen
System: The UNT Digital Library
An Integrated Architecture for Ad Hoc Grids (open access)

An Integrated Architecture for Ad Hoc Grids

Extensive research has been conducted by the grid community to enable large-scale collaborations in pre-configured environments. Grid collaborations can vary in scale and motivation, resulting in a coarse classification of grids: national grid, project grid, enterprise grid, and volunteer grid. Despite the differences in scope and scale, all traditional grids in practice share some common assumptions: they support mutually collaborative communities, adopt centralized control for membership, and assume a well-defined, non-changing collaboration. To support grid applications that do not conform to these assumptions, we propose the concept of ad hoc grids. In the context of this research, we propose a novel architecture for ad hoc grids that integrates a suite of component frameworks. Specifically, our architecture combines a community management framework, security framework, abstraction framework, quality of service framework, and reputation framework. The overarching objective of our integrated architecture is to support a variety of grid applications in a self-controlled fashion with the help of a self-organizing ad hoc community. We introduce mechanisms in our architecture that successfully isolate malicious elements from the community, inherently improving the quality of grid services and extracting deterministic quality assurances from the underlying infrastructure. We also emphasize the technology independence of our …
Date: May 2006
Creator: Amin, Kaizar Abdul Husain
System: The UNT Digital Library
Resource Efficient and Scalable Routing using Intelligent Mobile Agents (open access)

Resource Efficient and Scalable Routing using Intelligent Mobile Agents

Many contemporary routing algorithms use simple mechanisms such as flooding or broadcasting to disseminate the routing information available to them. Such algorithms incur significant network resource overhead because of the large number of messages generated at each host/router throughout the route update process. Many of these messages are wasteful, since they do not contribute to the route discovery process. Reducing this resource overhead would allow such algorithms to be deployed in a wide range of networks (wireless and ad hoc) that require a simple routing protocol due to the limited availability of resources (memory and bandwidth). Motivated by the need to reduce the resource overhead associated with routing algorithms, a new implementation of the distance vector routing algorithm using an agent-based paradigm, known as Agent-based Distance Vector Routing (ADVR), has been proposed. In ADVR, the responsibility for route discovery and message passing shifts from the nodes to individual agents that traverse the network, coordinate with each other, and successively update the routing tables of the nodes they visit.
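A minimal sketch of this idea, under the assumption of a toy four-node topology and a randomly walking agent (not the ADVR implementation itself): the agent carries the routing table of the node it just left and relaxes each visited node's table with the standard distance-vector rule.

    # Toy agent-based distance-vector sketch; topology and walk length are made up.
    import random

    graph = {                      # node -> {neighbor: link cost}
        "A": {"B": 1, "C": 4},
        "B": {"A": 1, "C": 2, "D": 5},
        "C": {"A": 4, "B": 2, "D": 1},
        "D": {"B": 5, "C": 1},
    }
    tables = {n: {n: 0} for n in graph}   # node -> {destination: best known cost}

    def agent_walk(start, hops):
        node = start
        for _ in range(hops):
            nxt = random.choice(list(graph[node]))
            link = graph[node][nxt]
            # Relax the visited node's table with the routes the agent carries.
            for dest, cost in tables[node].items():
                if cost + link < tables[nxt].get(dest, float("inf")):
                    tables[nxt][dest] = cost + link
            node = nxt             # the agent now carries nxt's table onward

    agent_walk("A", hops=40)
    print(tables)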
Date: May 2003
Creator: Amin, Kaizar Abdul Husain
System: The UNT Digital Library
The Role of Intelligent Mobile Agents in Network Management and Routing (open access)

The Role of Intelligent Mobile Agents in Network Management and Routing

In this research, the application of intelligent mobile agents to the management of distributed network environments is investigated. Intelligent mobile agents are programs that can move about network systems in a deterministic manner while carrying their execution state. These agents can be considered an application of distributed artificial intelligence in which the (usually small) agent code is moved to the data and executed locally. The mobile agent paradigm offers potential advantages over many conventional mechanisms that move (often large) data to the code, thereby wasting available network bandwidth. The performance of agents in network routing and knowledge acquisition has been investigated and simulated. A working mobile agent system has also been designed and implemented in JDK 1.2.
Date: December 2000
Creator: Balamuru, Vinay Gopal
System: The UNT Digital Library

Using Reinforcement Learning in Partial Order Plan Space

Access: Use of this item is restricted to the UNT Community
Partial order planning is an important approach that solves planning problems without completely specifying the orderings between the actions in the plan. This property provides greater flexibility in executing plans, making partial order planners a preferred choice over other planning methodologies. However, to find partially ordered plans, partial order planners search in plan space rather than in the space of world states, and an uninformed search in plan space leads to poor efficiency. In this thesis, I discuss applying a reinforcement learning method, the first-visit Monte Carlo method, to partial order planning in order to design agents that need no training data or heuristics yet are still able to make informed decisions in plan space based on experience. Communicating effectively with the agent is crucial in reinforcement learning; I address how this task was accomplished in plan space and report results from an evaluation on a blocks world test bed.
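The first-visit Monte Carlo method named above can be illustrated on a toy episodic task; the random walk below is a stand-in for plan-space states and is not taken from the thesis.

    # First-visit Monte Carlo value estimation on a five-state random walk.
    import random
    from collections import defaultdict

    def episode():
        """Walk from state 2; reward 1.0 only if the walk exits on the right."""
        state, visited = 2, []
        while 0 < state < 4:
            visited.append(state)
            state += random.choice((-1, 1))
        return visited, (1.0 if state == 4 else 0.0)

    returns = defaultdict(list)
    for _ in range(5000):
        visited, g = episode()
        seen = set()
        for s in visited:          # first visit: count each state once per episode
            if s not in seen:
                seen.add(s)
                returns[s].append(g)

    values = {s: sum(gs) / len(gs) for s, gs in returns.items()}
    print({s: round(v, 2) for s, v in sorted(values.items())})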
Date: May 2006
Creator: Ceylan, Hakan
System: The UNT Digital Library
Natural Language Interfaces to Databases (open access)

Natural Language Interfaces to Databases

Natural language interfaces to databases (NLIDB) are systems that aim to bridge the gap between the languages used by humans and computers, automatically translating natural language sentences into database queries. This thesis proposes a novel approach to NLIDB using graph-based models. The system starts by collecting as much information as possible from existing databases and sentences, and transforms this information into a knowledge base for the system. Given a new question, the system uses this knowledge to analyze and translate the sentence into its corresponding database query statement. The graph-based NLIDB system uses English as the natural language, a relational database model, and SQL as the formal query language. In experiments performed with natural language questions run against a large database containing information about U.S. geography, the system showed good performance compared to the state of the art in the field.
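For a flavor of the task, though not of the thesis's graph-based model, the sketch below matches question tokens against a toy geography schema and emits a SQL query; the table, placeholder data, and keyword-matching rule are all invented.

    # Toy keyword-based NLIDB stand-in over an in-memory SQLite database.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE state (name TEXT, population INTEGER, area REAL)")
    conn.executemany("INSERT INTO state VALUES (?, ?, ?)",        # placeholder figures
                     [("Texas", 20000000, 695662), ("Ohio", 11000000, 116098)])

    def translate(question):
        q = question.lower()
        column = "population" if "population" in q else "area"
        known = [row[0] for row in conn.execute("SELECT name FROM state")]
        name = next(s for s in known if s.lower() in q)
        return "SELECT {} FROM state WHERE name = '{}'".format(column, name)

    sql = translate("What is the population of Texas?")
    print(sql, "->", conn.execute(sql).fetchone()[0])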
Date: December 2006
Creator: Chandra, Yohan
System: The UNT Digital Library
An Approach Towards Self-Supervised Classification Using Cyc (open access)

An Approach Towards Self-Supervised Classification Using Cyc

Due to the long duration required for manual knowledge entry by human knowledge engineers, it is desirable to find methods to automatically acquire knowledge about the world by accessing online information. In this work I examine using the Cyc ontology to guide the creation of Naïve Bayes classifiers to provide knowledge about items described in Wikipedia articles. Given an initial set of Wikipedia articles, the system uses the ontology to create positive and negative training sets for the classifiers in each category. The order in which classifiers are generated and used to test articles is also guided by the ontology. The research conducted shows that a system can be created that utilizes statistical text classification methods to extract information from an ad hoc generated information source such as Wikipedia for use in a formal semantic ontology such as Cyc. Benefits and limitations of the system are discussed along with future work.
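A minimal multinomial Naive Bayes classifier of the kind the system builds for each category might look like the sketch below; the tiny positive and negative training snippets are invented placeholders for the ontology-selected Wikipedia article sets.

    # Hand-rolled multinomial Naive Bayes with Laplace smoothing (illustrative data).
    import math
    from collections import Counter

    positive = ["the guitar is a string instrument",
                "the violin is a bowed string instrument"]
    negative = ["the cpu executes machine instructions",
                "a compiler translates source code"]

    def counts_of(docs):
        c = Counter(w for d in docs for w in d.split())
        return c, sum(c.values())

    pos_counts, pos_total = counts_of(positive)
    neg_counts, neg_total = counts_of(negative)
    vocab = set(pos_counts) | set(neg_counts)

    def log_score(doc, counts, total, prior=0.5):
        score = math.log(prior)
        for w in doc.split():
            score += math.log((counts[w] + 1) / (total + len(vocab)))  # smoothing
        return score

    doc = "the cello is a string instrument"
    in_category = log_score(doc, pos_counts, pos_total) > log_score(doc, neg_counts, neg_total)
    print("article belongs to the category:", in_category)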
Date: December 2006
Creator: Coursey, Kino High
System: The UNT Digital Library
Keywords in the mist: Automated keyword extraction for very large documents and back of the book indexing. (open access)

Keywords in the mist: Automated keyword extraction for very large documents and back of the book indexing.

This research addresses the problem of automatic keyphrase extraction from large documents and back of the book indexing. The potential benefits of automating this process are far reaching, from improving information retrieval in digital libraries to saving countless man-hours by helping professional indexers create back of the book indexes. The dissertation introduces a new methodology to evaluate automated systems, which allows for a detailed, comparative analysis of several techniques for keyphrase extraction. We introduce and evaluate both supervised and unsupervised techniques, designed to balance the resource requirements of an automated system against the best achievable performance. Additionally, a number of novel features are proposed, including a statistical informativeness measure based on chi statistics; an encyclopedic feature that taps into the vast knowledge base of Wikipedia to establish the likelihood of a phrase referring to an informative concept; and a linguistic feature based on sophisticated semantic analysis of the text using current theories of discourse comprehension. The resulting keyphrase extraction system is shown to outperform the current state of the art in supervised keyphrase extraction by a large margin. Moreover, a fully automated back of the book indexing system based on the keyphrase extraction system was shown to lead to back …
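The chi-based informativeness feature can be illustrated with a 2x2 contingency test that compares a phrase's frequency inside the document with its frequency in a background corpus; the counts below are hypothetical, and the exact statistic used in the dissertation may differ in detail.

    # Chi-squared score from a 2x2 table: (phrase, other tokens) x (document, corpus).
    def chi_squared(phrase_in_doc, doc_tokens, phrase_in_corpus, corpus_tokens):
        table = [[phrase_in_doc, doc_tokens - phrase_in_doc],
                 [phrase_in_corpus, corpus_tokens - phrase_in_corpus]]
        row = [sum(r) for r in table]
        col = [table[0][j] + table[1][j] for j in range(2)]
        n = float(sum(row))
        score = 0.0
        for i in range(2):
            for j in range(2):
                expected = row[i] * col[j] / n
                score += (table[i][j] - expected) ** 2 / expected
        return score

    # A phrase far denser in the document than in the corpus scores much higher.
    print(chi_squared(15, 10000, 40, 5000000))    # good keyphrase candidate
    print(chi_squared(15, 10000, 7000, 5000000))  # ordinary phrase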
Date: May 2008
Creator: Csomai, Andras
System: The UNT Digital Library
Performance Analysis of Wireless Networks with QoS Adaptations (open access)

Performance Analysis of Wireless Networks with QoS Adaptations

The explosive demand for multimedia and for fast transmission of continuous media over wireless networks means the simultaneous existence of traffic requiring different qualities of service (QoS). In this thesis, several efficient algorithms have been developed to offer different levels of QoS to the end user. We first look at a request TDMA/CDMA protocol for supporting wireless multimedia traffic, where CDMA is laid over TDMA. We then look at a hybrid push-pull algorithm for wireless networks and present a generalized performance analysis of the proposed protocol. Some of the QoS factors considered include customer retrial rates due to user impatience and system timeouts, and different levels of priority and weights for mobile hosts. We have also looked at how customer impatience and system timeouts affect the QoS provided by several queuing and scheduling schemes such as FIFO, priority, weighted fair queuing, and the application of the stretch-optimal algorithm to scheduling.
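As an illustration of one of the disciplines listed above, the sketch below implements a simplified weighted fair queuing scheduler based on virtual finish times; the flows, weights, and packet sizes are made up, and packet arrival times are ignored for brevity.

    # Simplified weighted fair queuing: serve the packet with the smallest
    # virtual finish time; higher-weight flows accumulate finish time more slowly.
    import heapq

    weights = {"voice": 4, "video": 2, "data": 1}
    finish = {flow: 0.0 for flow in weights}
    queue = []

    def enqueue(flow, size, seq):
        finish[flow] += size / weights[flow]
        heapq.heappush(queue, (finish[flow], seq, flow, size))

    for seq, (flow, size) in enumerate([("data", 1500), ("voice", 200),
                                        ("video", 1000), ("voice", 200)]):
        enqueue(flow, size, seq)

    while queue:
        _, _, flow, size = heapq.heappop(queue)
        print("send {} bytes of {}".format(size, flow))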
Date: August 2003
Creator: Dash, Trivikram
System: The UNT Digital Library
Survey of Approximation Algorithms for Set Cover Problem (open access)

Survey of Approximation Algorithms for Set Cover Problem

In this thesis, I survey 11 approximation algorithms for the unweighted set cover problem. I have also implemented three of the algorithms and created a software library that stores the code I have written. The algorithms I survey are: 1. Johnson's standard greedy; 2. f-frequency greedy; 3. Goldsmidt, Hochbaum and Yu's modified greedy; 4. Halldorsson's local optimization; 5. Dur and Furer's semi-local optimization; 6. Asaf Levin's improvement to Dur and Furer; 7. simple rounding; 8. randomized rounding; 9. LP duality; 10. primal-dual schema; and 11. the network flow technique. Most of the algorithms surveyed are refinements of the standard greedy algorithm.
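Johnson's standard greedy algorithm, the first entry in the survey, can be sketched in a few lines; the instance below is a made-up example, and the loop assumes the universe can actually be covered.

    # Standard greedy set cover: repeatedly take the set that covers the most
    # still-uncovered elements (gives the classical ln n approximation ratio).
    def greedy_set_cover(universe, sets):
        uncovered, cover = set(universe), []
        while uncovered:
            best = max(sets, key=lambda name: len(uncovered & sets[name]))
            cover.append(best)
            uncovered -= sets[best]
        return cover

    sets = {"S1": {1, 2, 3}, "S2": {2, 4}, "S3": {3, 4, 5, 6}, "S4": {6}}
    print(greedy_set_cover(range(1, 7), sets))   # ['S3', 'S1']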
Date: December 2009
Creator: Dutta, Himanshu Shekhar
System: The UNT Digital Library
A Minimally Supervised Word Sense Disambiguation Algorithm Using Syntactic Dependencies and Semantic Generalizations (open access)

A Minimally Supervised Word Sense Disambiguation Algorithm Using Syntactic Dependencies and Semantic Generalizations

Natural language is inherently ambiguous. For example, the word "bank" can mean a financial institution or a river shore. Finding the correct meaning of a word in a particular context is a task known as word sense disambiguation (WSD), which is essential for many natural language processing applications such as machine translation, information retrieval, and others. While most current WSD methods try to disambiguate a small number of words for which enough annotated examples are available, the method proposed in this thesis attempts to address all words in unrestricted text. The method is based on constraints imposed by syntactic dependencies and concept generalizations drawn from an external dictionary. The method was tested on standard benchmarks as used during the SENSEVAL-2 and SENSEVAL-3 WSD international evaluation exercises, and was found to be competitive.
Date: December 2005
Creator: Faruque, Md. Ehsanul
System: The UNT Digital Library
General Purpose Programming on Modern Graphics Hardware (open access)

General Purpose Programming on Modern Graphics Hardware

I start with a brief introduction to the graphics processing unit (GPU) as well as general-purpose computation on modern graphics hardware (GPGPU). Next, I explore the motivations for GPGPU programming, and the capabilities of modern GPUs (including advantages and disadvantages). Also, I give the background required for further exploring GPU programming, including the terminology used and the resources available. Finally, I include a comprehensive survey of previous and current GPGPU work, and end with a look at the future of GPU programming.
Date: May 2008
Creator: Fleming, Robert
System: The UNT Digital Library
Force-Directed Graph Drawing and Aesthetics Measurement in a Non-Strict Pure Functional Programming Language (open access)

Force-Directed Graph Drawing and Aesthetics Measurement in a Non-Strict Pure Functional Programming Language

Non-strict pure functional programming often requires redesigning algorithms and data structures to work more effectively under new constraints of non-strict evaluation and immutable state. Graph drawing algorithms, while numerous and broadly studied, have no presence in the non-strict pure functional programming model. Additionally, there is currently no freely licensed standalone toolkit used to quantitatively analyze aesthetics of graph drawings. This thesis addresses two previously unexplored questions. Can a force-directed graph drawing algorithm be implemented in a non-strict functional language, such as Haskell, and still be practically usable? Can an easily extensible aesthetic measuring tool be implemented in a language such as Haskell and still be practically usable? The focus of the thesis is on implementing one of the simplest force-directed algorithms, that of Fruchterman and Reingold, and comparing its resulting aesthetics to those of a well-known C++ implementation of the same algorithm.
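The Fruchterman-Reingold iteration itself is compact; a Python rendering is sketched below for brevity, although the thesis implementation is in Haskell. Frame clamping and the other refinements of the published algorithm are omitted.

    # Core of the Fruchterman-Reingold force-directed layout (simplified).
    import math, random

    def fruchterman_reingold(nodes, edges, width=1.0, height=1.0, iters=50):
        k = math.sqrt(width * height / len(nodes))   # ideal edge length
        pos = {v: (random.random() * width, random.random() * height) for v in nodes}
        t = width / 10.0                             # temperature
        for _ in range(iters):
            disp = {v: [0.0, 0.0] for v in nodes}
            for v in nodes:                          # repulsion between all pairs
                for u in nodes:
                    if u == v:
                        continue
                    dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
                    d = max(math.hypot(dx, dy), 1e-9)
                    f = k * k / d
                    disp[v][0] += dx / d * f
                    disp[v][1] += dy / d * f
            for v, u in edges:                       # attraction along edges
                dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
                d = max(math.hypot(dx, dy), 1e-9)
                f = d * d / k
                disp[v][0] -= dx / d * f; disp[v][1] -= dy / d * f
                disp[u][0] += dx / d * f; disp[u][1] += dy / d * f
            for v in nodes:                          # move, limited by temperature
                dx, dy = disp[v]
                d = max(math.hypot(dx, dy), 1e-9)
                pos[v] = (pos[v][0] + dx / d * min(d, t),
                          pos[v][1] + dy / d * min(d, t))
            t *= 0.95                                # cool
        return pos

    print(fruchterman_reingold(["a", "b", "c", "d"],
                               [("a", "b"), ("b", "c"), ("c", "d")]))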
Date: December 2009
Creator: Gaconnet, Christopher James
System: The UNT Digital Library
Flexible Digital Authentication Techniques (open access)

Flexible Digital Authentication Techniques

This dissertation investigates authentication techniques in some emerging areas. Specifically, authentication schemes have been proposed that are well suited to embedded systems and to privacy-respecting pay Web sites. With embedded systems, a person may own several devices that are capable of communication and interaction, but these devices use embedded processors whose computational capabilities are limited compared to desktop computers. Examples of this scenario include entertainment devices or appliances owned by a consumer, multiple control and sensor systems in an automobile or airplane, and environmental controls in a building. An efficient public key cryptosystem has been devised that provides a complete solution for an embedded system, including protocols for authentication, authenticated key exchange, encryption, and revocation. The new construction is especially suitable for devices with constrained computing capabilities and resources. Compared with other available authentication schemes, such as X.509 and identity-based encryption, the new construction provides unique features such as simplicity, efficiency, forward secrecy, and an efficient re-keying mechanism. In the application scenario of a pay Web site, users may be sensitive about their privacy and may not wish their behavior to be tracked by Web sites. Thus, an anonymous authentication scheme is desirable in this case. That is, a …
Date: May 2006
Creator: Ge, He
System: The UNT Digital Library
Exploring Trusted Platform Module Capabilities: A Theoretical and Experimental Study (open access)

Exploring Trusted Platform Module Capabilities: A Theoretical and Experimental Study

Trusted platform modules (TPMs) are hardware modules, bound to a computer's motherboard, that are being included in many desktops and laptops. Augmenting computers with these hardware modules adds powerful functionality in distributed settings, allowing us to reason about the security of these systems in new ways. In this dissertation, I study the functionality of TPMs from a theoretical as well as an experimental perspective. On the theoretical front, I leverage various features of TPMs to construct applications, such as random oracles, that are impossible to implement in a standard model of computation. Apart from random oracles, I construct a new cryptographic primitive, essentially a non-interactive form of the standard cryptographic primitive of oblivious transfer. I apply this new primitive to secure mobile agent computations, where interaction between various entities is typically required to ensure security. I prove these constructions secure using standard cryptographic techniques and assumptions. To test the practicability of these constructions and their applications, I performed an experimental study, both on an actual TPM and on a software TPM simulator that has been enhanced to reflect timings from a real TPM. This allowed me to benchmark the performance of the applications and test …
Date: May 2008
Creator: Gunupudi, Vandana
System: The UNT Digital Library

Logic Programming Tools for Dynamic Content Generation and Internet Data Mining

Access: Use of this item is restricted to the UNT Community
The phenomenal growth of information technology requires us to elicit, store, and maintain huge volumes of data, and analyzing this data for various purposes is becoming increasingly important. Data mining consists of applying data analysis and discovery algorithms that, within acceptable computational efficiency limits, produce a particular enumeration of patterns over the data. We present two techniques for data mining based on logic programming tools. Data mining analyzes data by extracting patterns that describe its structure and discovers correlations in the form of rules. We distinguish analysis methods as visual and non-visual and present one application of each. We show that our focus on logic programming makes some of the very complex tasks related to Web-based data mining and dynamic content generation simple and easy to implement in a uniform framework.
Date: December 2000
Creator: Gupta, Anima
System: The UNT Digital Library

A Netcentric Scientific Research Repository

Access: Use of this item is restricted to the UNT Community
The Internet and networks in general have become essential tools for disseminating information. Search engines have become the predominant means of finding information on the Web and all other data repositories, including local resources. Domain scientists regularly acquire and analyze images generated by equipment such as microscopes and cameras, resulting in complex image files that need to be managed in a convenient manner. This type of integrated environment has been recently termed a netcentric scientific research repository. I developed a number of data manipulation tools that allow researchers to manage their information more effectively in a netcentric environment. The specific contributions are: (1) A unique interface for management of data including files and relational databases. A wrapper for relational databases was developed so that the data can be indexed and searched using traditional search engines. This approach allows data in databases to be searched with the same interface as other data. Furthermore, this approach makes it easier for scientists to work with their data if they are not familiar with SQL. (2) A Web services based architecture for integrating analysis operations into a repository. This technique allows the system to leverage the large number of existing tools by wrapping them …
Date: December 2006
Creator: Harrington, Brian
System: The UNT Digital Library
Adaptive Planning and Prediction in Agent-Supported Distributed Collaboration. (open access)

Adaptive Planning and Prediction in Agent-Supported Distributed Collaboration.

Agents that act as user assistants will become invaluable as the number of information sources continues to proliferate. Such agents can support the work of users by learning to automate time-consuming tasks and by filtering information to manageable levels. Although considerable advances have been made in this area, it remains a fertile area for further development. One application of agents under careful scrutiny is the automated negotiation of conflicts between different users' needs and desires. Many techniques require explicit user models in order to function. This dissertation explores a technique for dynamically constructing user models and the impact of using them to anticipate the need for negotiation. Negotiation is reduced by including an advising aspect in the agent that uses this anticipation of conflict to adjust user behavior.
Date: December 2004
Creator: Hartness, Ken T. N.
System: The UNT Digital Library
An Empirical Evaluation of Communication and Coordination Effectiveness in Autonomous Reactive Multiagent Systems (open access)

An Empirical Evaluation of Communication and Coordination Effectiveness in Autonomous Reactive Multiagent Systems

This thesis describes experiments designed to measure the effect of collaborative communication on the task performance of a multiagent system. A discrete event simulation was developed to model a multi-agent system completing a task to find and collect food resources, with the ability to substitute various communication and coordination methods. Experiments were conducted to find the effects of the various communication methods on completion of the task to find and harvest the food resources. Results show that communication decreases the time required to complete the task. However, not all communication methods fare equally well. In particular, results indicate that the communication model of the bee is a particularly effective method of agent communication and collaboration. Furthermore, results indicate that direct communication with additional information content provides better completion results. Cost-benefit models show some conflicting information, indicating that the increased performance may not offset the additional cost of achieving that performance.
Date: May 2005
Creator: Hurt, David
System: The UNT Digital Library
Planning techniques for agent based 3D animations. (open access)

Planning techniques for agent based 3D animations.

The design of autonomous agents capable of performing a given goal in a 3D domain continues to be a challenge for computer animated story generation systems. We present a novel prototype which consists of a 3D engine and a planner for a simple virtual world. We incorporate the 2D planner into the 3D engine to provide 3D animations. Based on the plan, the 3D world is created and the objects are positioned. Then the plan is linearized into simpler actions for object animation and rendered via the 3D engine. We use JINNI3D as the engine and WARPLAN-C as the planner for the above-mentioned prototype. The user can interact with the system using a simple natural language interface. The interface consists of a shallow parser, which is capable of identifying a set of predefined basic commands. The command given by the user is considered as the goal for the planner. The resulting plan is created and rendered in 3D. The overall system is comparable to a character based interactive story generation system except that it is limited to the predefined 3D environment.
Date: December 2005
Creator: Kandaswamy, Balasubramanian
System: The UNT Digital Library
System and Methods for Detecting Unwanted Voice Calls (open access)

System and Methods for Detecting Unwanted Voice Calls

Voice over IP (VoIP) is a key enabling technology for the migration of circuit-switched PSTN architectures to packet-based IP networks. However, this migration is successful only if the present problems in IP networks are addressed before deploying VoIP infrastructure on a large scale. One of the important issues that present VoIP networks face is the problem of unwanted calls, commonly referred to as SPIT (spam over Internet telephony). Most of these SPIT calls come from unknown callers who broadcast unwanted calls. There may be unwanted calls from legitimate and known people too; in this case, whether a call is unwanted depends on the social proximity of the communicating parties. For detecting these unwanted calls, I propose a framework that analyzes incoming calls for unwanted behavior. The framework includes a VoIP spam detector (VSD) that analyzes incoming VoIP calls for spam behavior using trust and reputation techniques. The framework also includes a nuisance detector (ND) that proactively infers the nuisance (or reluctance of the end user) to receive incoming calls. This inference is based on past mutual behavior between the calling and the called party (i.e., caller and callee), the callee's presence (mood or state of mind), and tolerance in receiving voice calls from the …
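A toy reputation-based filter in the spirit of the VSD might look like the sketch below; the neutral prior, exponential update rule, and threshold are illustrative choices, not the trust and reputation model developed in the dissertation.

    # Callers start with a neutral trust score; callee feedback moves the score,
    # and calls from callers below a threshold are flagged as likely SPIT.
    class CallFilter:
        def __init__(self, threshold=0.4):
            self.trust = {}               # caller id -> score in [0, 1]
            self.threshold = threshold

        def screen(self, caller):
            score = self.trust.get(caller, 0.5)
            return "allow" if score >= self.threshold else "flag as SPIT"

        def feedback(self, caller, answered):
            score = self.trust.get(caller, 0.5)
            target = 1.0 if answered else 0.0
            self.trust[caller] = 0.8 * score + 0.2 * target

    vsd = CallFilter()
    for _ in range(5):
        vsd.feedback("sip:bulkcaller@example.com", answered=False)
    print(vsd.screen("sip:bulkcaller@example.com"))   # flag as SPIT
    print(vsd.screen("sip:colleague@example.com"))    # allow (no history yet)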
Date: December 2007
Creator: Kolan, Prakash
System: The UNT Digital Library
Towards Communicating Simple Sentence using Pictorial Representations (open access)

Towards Communicating Simple Sentence using Pictorial Representations

Language can sometimes be an impediment in communication. Whether we are talking about people who speak different languages, students who are learning a new language, or people with language disorders, the understanding of linguistic representations in a given language requires a certain amount of knowledge that not everybody has. In this thesis, we propose "translation through pictures" as a means for conveying simple pieces of information across language barriers, and describe a system that can automatically generate pictorial representations for simple sentences. Comparative experiments conducted on visual and linguistic representations of information show that a considerable amount of understanding can be achieved through pictorial descriptions, with results within a comparable range of those obtained with current machine translation techniques. Moreover, a user study conducted around the pictorial translation system reveals that users found the system to generally produce correct word/image associations, and rate the system as interactive and intelligent.
Date: May 2006
Creator: Leong, Chee Wee
System: The UNT Digital Library
Group-EDF: A New Approach and an Efficient Non-Preemptive Algorithm for Soft Real-Time Systems (open access)

Group-EDF: A New Approach and an Efficient Non-Preemptive Algorithm for Soft Real-Time Systems

Hard real-time systems in robotics, space and military missions, and control devices are specified with stringent and critical time constraints. On the other hand, soft real-time applications arising from multimedia, telecommunications, Internet web services, and games are specified with more lenient constraints. Real-time systems can also be distinguished, in terms of their implementation, as preemptive or non-preemptive. In preemptive systems, tasks are often preempted by higher priority tasks. Non-preemptive systems are gaining interest for implementing soft real-time applications on multithreaded platforms. In this dissertation, I propose a new algorithm that uses a two-level scheduling strategy for scheduling non-preemptive soft real-time tasks. Our goal is to improve the success ratio of the well-known earliest deadline first (EDF) approach when the load on the system is very high and to improve the overall performance in both underloaded and overloaded conditions. Our approach, known as group-EDF (gEDF), is based on dynamically grouping tasks with deadlines that are very close to each other, and using a shortest job first (SJF) technique to schedule tasks within each group. I believe that grouping tasks dynamically by deadline and utilizing secondary criteria, such as minimizing the total execution time, can lead to new and more …
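The gEDF selection rule itself can be sketched directly: group the ready tasks whose deadlines fall within a tolerance of the earliest deadline, then run the shortest job in that group. The tasks and tolerance below are illustrative; plain EDF would pick t1 here, whereas gEDF picks the much shorter t2.

    # group-EDF pick for a non-preemptive scheduler (illustrative tolerance of 10%).
    def gedf_pick(ready, tolerance=0.1):
        """ready: list of (name, absolute deadline, execution time)."""
        earliest = min(deadline for _, deadline, _ in ready)
        group = [t for t in ready if t[1] <= earliest * (1 + tolerance)]
        return min(group, key=lambda t: t[2])      # SJF within the deadline group

    ready = [("t1", 100, 40), ("t2", 105, 10), ("t3", 300, 5)]
    print(gedf_pick(ready))   # ('t2', 105, 10)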
Date: August 2006
Creator: Li, Wenming
System: The UNT Digital Library
High Performance Architecture using Speculative Threads and Dynamic Memory Management Hardware (open access)

High Performance Architecture using Speculative Threads and Dynamic Memory Management Hardware

With the advances in very large scale integration (VLSI) technology, hundreds of billions of transistors can be packed into a single chip. With this increased hardware budget, how to take advantage of the available hardware resources becomes an important research area. Some researchers have shifted from the control-flow von Neumann architecture back to dataflow architecture in order to explore scalable architectures leading to multi-core systems with several hundred processing elements. In this dissertation, I address how the performance of modern processing systems can be improved while reducing hardware complexity and energy consumption. My research tackles both central processing unit (CPU) performance and memory subsystem performance. More specifically, I describe my research related to the design of an innovative decoupled multithreaded architecture that can be used in multi-core processor implementations. I also address how memory management functions can be off-loaded from processing pipelines to further improve system performance and eliminate the cache pollution caused by runtime management functions.
Date: December 2007
Creator: Li, Wentong
System: The UNT Digital Library