94 Matching Results

An English and Arabic Character Printer (open access)

This paper is presented in satisfaction of the requirement for two problems in lieu of thesis for the degree of Master of Science. The two problems are: (1) to provide an electrical interface between the M6800 microprocessor and the printer; and (2) to design an Arabic character set and to provide the logic required for its implementation. As it would be artificial and impractical to document these problems separately, a single document is provided here.
Date: December 1976
Creator: Abdel-Razzack, Malek G.
System: The UNT Digital Library
Generating Machine Code for High-Level Programming Languages (open access)

The purpose of this research was to investigate the generation of machine code from a high-level programming language. The following steps were undertaken: 1) Choose a high-level programming language as the source language and a computer as the target computer. 2) Examine all stages during the compiling of a high-level programming language and all data sets involved in the compilation. 3) Discover the mechanism for generating machine code, and the mechanism for generating more efficient machine code, from the language. 4) Construct an algorithm for generating machine code for the target computer. The results suggest that a compiler is best implemented in a high-level programming language, and that the SCANNER and PARSER should be independent of target representations, if possible.
Date: December 1976
Creator: Chao, Chia-Huei
System: The UNT Digital Library
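As a rough illustration of the kind of translation this abstract describes (the thesis's own source language, target computer, and instruction set are not reproduced here), the following minimal Python sketch walks an expression tree in post-order and emits code for a hypothetical stack machine; the instruction names PUSH, ADD, and MUL are illustrative assumptions.

    # Minimal sketch: post-order walk over an expression tree emitting
    # stack-machine code. The instruction names are hypothetical.
    def gen(node, out):
        if isinstance(node, int):               # leaf: literal operand
            out.append(f"PUSH {node}")
        else:                                   # interior node: (op, left, right)
            op, left, right = node
            gen(left, out)
            gen(right, out)
            out.append({"+": "ADD", "*": "MUL"}[op])
        return out

    # (2 + 3) * 4  ->  PUSH 2, PUSH 3, ADD, PUSH 4, MUL
    print(gen(("*", ("+", 2, 3), 4), []))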
Execution Time Analysis through Software Monitors (open access)

The analysis of an executing program and the isolation of critical code has been a problem since the first program was written. This thesis examines the process of program analysis through the use of a software monitoring system. Since there is a trend toward structured languages, a subset of PL/I was developed to exhibit source statement monitoring and costing techniques. By filtering a PL/I program through a preprocessor which determines the cost of source statements and inserts monitoring code, a post-execution analysis of the program can be obtained. This analysis displays an estimated time cost for each source statement, the number of times the statement was executed, and the product of these values. Additionally, a bar graph is printed in order to quickly locate very active code.
Date: December 1977
Creator: Whistler, Wayne C.
System: The UNT Digital Library
VISOR (Variable Interval Schedule Of Reinforcement) System Documentation (open access)

This program will be used in operant behavior research to monitor and record responses, and to trigger and record reinforcements, on a variable interval (VI) schedule of reinforcement. The original application of this program will be the servicing of several rat cages simultaneously. The response will be the pressing of a metal bar in the cage; the reinforcement will be the triggering of a feeding mechanism which dispenses a food pellet into the cage. Subsequent applications of this program are not limited to this setting, since the actual response and reinforcement devices and the subject type are all treated generically by the program.
Date: December 1979
Creator: Long, Daniel Paul
System: The UNT Digital Library
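As an illustration of the variable interval contingency this abstract describes (a behavioral sketch only, not the original VISOR code), the Python fragment below arms reinforcement after a randomly drawn interval and reinforces the first response at or after that moment, as a VI schedule does; the exponential interval distribution and the function names are assumptions.

    import random

    # Sketch of a VI schedule: reinforcement becomes available after a random
    # interval; the first response at or after that time is reinforced.
    def vi_session(response_times, mean_interval):
        reinforcements = []
        armed_at = random.expovariate(1.0 / mean_interval)
        for t in sorted(response_times):
            if t >= armed_at:                   # schedule armed: reinforce
                reinforcements.append(t)
                armed_at = t + random.expovariate(1.0 / mean_interval)
        return reinforcements

    # Bar presses at these times (in seconds), on a VI-30 schedule:
    print(vi_session([5, 12, 31, 47, 80, 95], mean_interval=30))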
Design and Implementation of a Parser for the DBase II Query Language (open access)

In this paper the DBase II query language of an RDBMS for personal computers is discussed. Other query languages provided by large and sophisticated DBMSs will not be discussed here. The reasons for selecting the DBase II query language for discussion are as follows: 1. It is a simple language that can be learned easily [TOWN 84, DINE 84]. Within a short period, users can learn all of the facilities and manage the system very well. 2. It is a language suitable for interactive programming and execution, like BASIC. 3. It provides adequate facilities for a small data base system and serves as an introductory guide to more sophisticated systems.
Date: December 1985
Creator: Chan, Kin Pong
System: The UNT Digital Library
Independent Quadtrees (open access)

This dissertation deals with the problem of manipulating and storing an image using quadtrees. A quadtree is a tree in which each node has four ordered children or is a leaf. It can be used to represent an image via hierarchical decomposition. The image is broken into four regions. A region can be a solid color (homogeneous) or a mixture of colors (heterogeneous). If a region is heterogeneous it is broken into four subregions, and the process continues recursively until all subregions are homogeneous. The traditional quadtree suffers from dependence on the underlying grid. The grid coordinate system is implicit, and therefore fixed. The fixed coordinate system implies a rigid tree. A rigid tree cannot be translated, scaled, or rotated. Instead, a new tree must be built which is the result of one of these transformations. This dissertation introduces the independent quadtree. The independent quadtree is free of any underlying coordinate system. The tree is no longer rigid and can be easily translated, scaled, or rotated. Algorithms to perform these operations are presented. The translation and rotation algorithms take constant time. The scaling algorithm takes time linear in the number of nodes in the tree. The disadvantage of independent quadtrees is …
Date: December 1986
Creator: Atwood, Larry D. (Larry Dale)
System: The UNT Digital Library
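The recursive decomposition this abstract describes can be stated compactly; the Python sketch below illustrates the traditional, grid-dependent quadtree (not the independent quadtree itself), splitting a square array into four subregions until every region is a single solid color.

    # Minimal sketch, assuming a square 2-D array of pixel values whose side
    # is a power of two: homogeneous regions become leaves, heterogeneous
    # regions are split into four quadrants recursively.
    def build_quadtree(img, x, y, size):
        vals = {img[r][c] for r in range(y, y + size) for c in range(x, x + size)}
        if len(vals) == 1:                      # homogeneous: leaf node
            return vals.pop()
        h = size // 2                           # heterogeneous: four children
        return [build_quadtree(img, x,     y,     h),
                build_quadtree(img, x + h, y,     h),
                build_quadtree(img, x,     y + h, h),
                build_quadtree(img, x + h, y + h, h)]

    img = [[0, 0, 1, 1],
           [0, 0, 1, 1],
           [0, 0, 0, 1],
           [0, 0, 1, 0]]
    print(build_quadtree(img, 0, 0, 4))         # [0, 1, 0, [0, 1, 1, 0]]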
A Timescale Estimating Model for Rule-Based Systems (open access)

The purpose of this study was to explore the subject of timescale estimating for rule-based systems. A model for estimating the timescale necessary to build rule-based systems was built and then tested in a controlled environment.
Date: December 1987
Creator: Moseley, Charles Warren
System: The UNT Digital Library
Computer Graphics Primitives and the Scan-Line Algorithm (open access)

This paper presents the scan-line algorithm, which has been implemented on the Lisp Machine. The scan-line algorithm resides beneath a library of primitive software routines which draw more fundamental objects: lines, triangles, and rectangles. This routine, implemented in microcode, applies the A(BC)*D approach to word boundary alignments in order to create an extremely fast, efficient, and general-purpose drawing primitive. The scan-line algorithm improves on previous methodologies by limiting the number of CPU-intensive instructions and by minimizing the number of words referenced. This paper will describe how to draw scan-lines and the constraints imposed upon the scan-line algorithm by the Lisp Machine's hardware and software.
Date: December 1988
Creator: Myjak, Michael D. (Michael David)
System: The UNT Digital Library
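The Lisp Machine microcode is not reproduced here, but the core idea of scan-line rasterization, decomposing a filled shape into one horizontal span per scan-line, can be sketched in a few lines of Python; the polygon representation and the rounding rule below are assumptions of this sketch.

    # Sketch of scan-line polygon filling: for each scan-line, intersect the
    # polygon's edges with the line and fill between pairs of crossings.
    def fill_polygon(verts, set_pixel):
        ys = [y for _, y in verts]
        for y in range(min(ys), max(ys) + 1):
            xs = []
            for (x0, y0), (x1, y1) in zip(verts, verts[1:] + verts[:1]):
                if (y0 <= y < y1) or (y1 <= y < y0):   # edge crosses this line
                    xs.append(x0 + (y - y0) * (x1 - x0) / (y1 - y0))
            xs.sort()
            for left, right in zip(xs[::2], xs[1::2]): # fill between pairs
                for x in range(round(left), round(right) + 1):
                    set_pixel(x, y)

    fill_polygon([(0, 0), (8, 0), (4, 5)], lambda x, y: print((x, y)))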
Linear Unification (open access)

Efficient unification is considered within the context of logic programming. Unification is explained in terms of equivalence classes made up of terms, where there is a constraint that no equivalence class may contain more than one function term. It is demonstrated that several well-known "efficient" but nonlinear unification algorithms continually maintain the said constraint as a consequence of their choice of data structure for representing equivalence classes. The linearity of the Paterson-Wegman unification algorithm is shown largely to be a consequence of its use of unbounded lists of pointers for representing equivalences between terms, which allows it to avoid the nonlinearity of "union-find".
Date: December 1989
Creator: Wilbanks, John W. (John Winston)
System: The UNT Digital Library
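To make the equivalence-class view concrete, here is a minimal Python sketch of the union-find style of unification that this abstract contrasts with Paterson-Wegman (it is not the linear algorithm itself, and the occurs check is omitted); terms are assumed to be variables (strings) or tuples ('f', arg1, ...).

    # Each equivalence class is kept to at most one function term: variables
    # are bound by linking them to a representative; two function terms in the
    # same class must agree and have their arguments unified recursively.
    parent = {}                                 # bindings; clear between queries

    def find(t):
        while parent.get(t, t) != t:
            t = parent[t]
        return t

    def unify(s, t):
        s, t = find(s), find(t)
        if s == t:
            return True
        if isinstance(s, str):                  # s is a variable: bind it
            parent[s] = t
            return True
        if isinstance(t, str):                  # t is a variable: bind it
            parent[t] = s
            return True
        return (s[0] == t[0] and len(s) == len(t)
                and all(unify(a, b) for a, b in zip(s[1:], t[1:])))

    # f(X, g(Y)) = f(g(Z), X): succeeds, binding X to g(Z) and Y to Z
    print(unify(('f', 'X', ('g', 'Y')), ('f', ('g', 'Z'), 'X')))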
The Object-Oriented Database Editor (open access)

Because of an interest in object-oriented database systems, designers have created systems to store and manipulate specific sets of abstract data types that belong to the real world environment they represent. Unfortunately, the advantage of these systems is also a disadvantage since no single object-oriented database system can be used for all applications. This paper describes an object-oriented database management system called the Object-oriented Database Editor (ODE) which overcomes this disadvantage by allowing designers to create and execute an object-oriented database that represents any type of environment and then to store it and simulate that environment. As conditions within the environment change, the designer can use ODE to alter that environment without loss of data. ODE provides a flexible environment for the user; it is efficient; and it can run on a personal computer.
Date: December 1989
Creator: Coats, Sidney M. (Sidney Mark)
System: The UNT Digital Library
Efficient Parallel Algorithms and Data Structures Related to Trees (open access)

The main contribution of this dissertation is a new paradigm, called the parentheses matching paradigm, which we claim is well suited for designing efficient parallel algorithms for a broad class of nonnumeric problems. To demonstrate its applicability, we present three cost-optimal parallel algorithms for breadth-first traversal of general trees, sorting a special class of integers, and coloring an interval graph with the minimum number of colors.
Date: December 1991
Creator: Chen, Calvin Ching-Yuen
System: The UNT Digital Library
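The parallel algorithms themselves are not reproduced here, but the problem underlying the paradigm is easy to state; the Python sketch below gives the standard sequential formulation, pairing every '(' with its matching ')'.

    # Sequential parentheses matching with a stack; the paradigm's parallel
    # algorithms solve this same pairing problem cost-optimally.
    def match_parens(s):
        stack, pairs = [], {}
        for i, c in enumerate(s):
            if c == '(':
                stack.append(i)
            elif c == ')':
                pairs[stack.pop()] = i          # innermost open paren matches
        return pairs

    print(match_parens("(()(()))"))             # {1: 2, 4: 5, 3: 6, 0: 7}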
Efficient Linked List Ranking Algorithms and Parentheses Matching as a New Strategy for Parallel Algorithm Design (open access)

The goal of a parallel algorithm is to solve a single problem using multiple processors working together and to do so in an efficient manner. In this regard, there is a need to categorize strategies in order to solve broad classes of problems with similar structures and requirements. In this dissertation, two parallel algorithm design strategies are considered: linked list ranking and parentheses matching.
Date: December 1993
Creator: Halverson, Ranette Hudson
System: The UNT Digital Library
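For readers unfamiliar with the first strategy, the classic pointer-jumping formulation of list ranking can be sketched as follows (this illustrates the general technique, not the dissertation's own algorithms); each pass over the nodes below would run in parallel, so O(log n) passes suffice.

    # List ranking by pointer jumping: rank[i] always counts the edges from
    # node i to nxt[i]; doubling nxt halves the remaining distance each round.
    def list_rank(succ):
        """succ[i] is the successor of node i; the tail points to itself."""
        n = len(succ)
        rank = [0 if succ[i] == i else 1 for i in range(n)]
        nxt = list(succ)
        while any(nxt[i] != nxt[nxt[i]] for i in range(n)):
            rank = [rank[i] + rank[nxt[i]] for i in range(n)]
            nxt = [nxt[nxt[i]] for i in range(n)]
        return rank                             # distance to the tail

    # The list 3 -> 0 -> 4 -> 2 -> 1 encoded as a successor array:
    print(list_rank([4, 1, 1, 0, 2]))           # [3, 0, 1, 4, 2]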
Multiresolutional/Fractal Compression of Still and Moving Pictures (open access)

The scope of the present dissertation is deep lossy compression of still and moving grayscale pictures while maintaining their fidelity, with a specific goal of creating a working prototype of a software system for use in low-bandwidth transmission of still satellite imagery and weather briefings with the best preservation of features considered important by the end user.
Date: December 1993
Creator: Kiselyov, Oleg E.
System: The UNT Digital Library
Temporal Connectionist Expert Systems Using a Temporal Backpropagation Algorithm (open access)

Representing time has been considered a general problem for artificial intelligence research for many years. More recently, the question of representing time has become increasingly important in representing the human decision-making process through connectionist expert systems. Because most human behaviors unfold over time, any attempt to represent expert performance without considering its temporal nature can often lead to incorrect results. A temporal feedforward neural network model that can be applied to a number of neural network application areas, including connectionist expert systems, has been introduced. The neural network model has a multi-layer structure, i.e. the number of layers is not limited. Also, the model has the flexibility of defining output nodes in any layer. This is especially important for connectionist expert system applications. A temporal backpropagation algorithm which supports the model has been developed. The model, along with the temporal backpropagation algorithm, makes it extremely practical to define any artificial neural network application. Also, an approach that can be followed to decrease the memory space used by the weight matrices has been introduced. The algorithm was tested using a medical connectionist expert system to show how best we describe not only the disease but also the entire course of the disease. …
Date: December 1993
Creator: Civelek, Ferda N. (Ferda Nur)
System: The UNT Digital Library
A Highly Fault-Tolerant Distributed Database System with Replicated Data (open access)

Because of the high cost and impracticality of a high connectivity network, most recent research in transaction processing has focused on a distributed replicated database system. In such a system, multiple copies of a data item are created and stored at several sites in the network, so that the system is able to tolerate more crash and communication failures and attain higher data availability. However, the multiple copies also introduce a global inconsistency problem, especially in a partitioned network. In this dissertation a tree quorum algorithm is proposed to solve this problem, imposing a logical tree structure along with dynamic system reconfiguration on all the copies of each data item. The proposed algorithm can be viewed as a dynamic voting technique which, with the help of an appropriate concurrency control algorithm, exhibits the major advantages of quorum-based replica control algorithms and of the available copies algorithm, so that a single copy is read for a read operation and a quorum of copies is written for a write operation. In addition, read and write quorums are computed dynamically and independently. As a result, expensive read operations, like those that require several copies of a data item to be read in most …
Date: December 1994
Creator: Lin, Tsai S. (Tsai Shooumeei)
System: The UNT Digital Library
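A simplified sketch of the recursive idea behind tree quorums follows (the dissertation's algorithm, with its dynamic reconfiguration and separately computed read and write quorums, is considerably richer): use a copy if its site is up, and otherwise substitute quorums from a majority of its children, recursively down the logical tree.

    # Simplified tree-quorum construction; 'up' reports whether a site is
    # reachable, and the choice of which majority to take is arbitrary here.
    def tree_quorum(node, children, up):
        if up(node):
            return {node}
        kids = children.get(node, [])
        if not kids:
            return None                         # unreachable leaf: no quorum
        quorums = [q for q in (tree_quorum(k, children, up) for k in kids)
                   if q is not None]
        need = len(kids) // 2 + 1               # majority of the children
        return set().union(*quorums[:need]) if len(quorums) >= need else None

    children = {'A': ['B', 'C', 'D'], 'B': [], 'C': [], 'D': []}
    # Root A is down, so a majority of its children stands in for it:
    print(tree_quorum('A', children, up=lambda n: n != 'A'))   # e.g. {'B', 'C'}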
Recognition of Face Images (open access)

The focus of this dissertation is a methodology that enables computer systems to classify different up-front images of human faces as belonging to one of the individuals to which the system has been exposed previously. The images can present variance in size, location of the face, orientation, facial expressions, and overall illumination. The approach to the problem taken in this dissertation can be classified as analytic as the shapes of individual features of human faces are examined separately, as opposed to holistic approaches to face recognition. The outline of the features is used to construct signature functions. These functions are then magnitude-, period-, and phase-normalized to form a translation-, size-, and rotation-invariant representation of the features. Vectors of a limited number of the Fourier decomposition coefficients of these functions are taken to form the feature vectors representing the features in the corresponding vector space. With this approach no computation is necessary to enforce the translational, size, and rotational invariance at the stage of recognition thus reducing the problem of recognition to the k-dimensional clustering problem. A recognizer is specified that can reliably classify the vectors of the feature space into object classes. The recognizer made use of the following principle: …
Date: December 1994
Creator: Pershits, Edward
System: The UNT Digital Library
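The normalization step can be illustrated with Fourier descriptors of a closed outline; the Python sketch below shows the general technique rather than the dissertation's exact procedure (the starting-point normalization is omitted, and all names are assumptions): it drops the DC term to cancel translation, divides by |c1| to cancel size, and rotates the coefficients to cancel orientation.

    import cmath

    # Fourier descriptors of a contour given as complex points z = x + iy:
    # translation shifts only c[0], scaling multiplies every |c[m]|, and
    # rotation multiplies every c[m] by the same unit complex number.
    def fourier_descriptors(contour, k=8):
        n = len(contour)
        c = {m: sum(z * cmath.exp(-2j * cmath.pi * m * t / n)
                    for t, z in enumerate(contour)) / n
             for m in range(-k, k + 1)}
        c[0] = 0                                  # drop DC term: translation
        scale = abs(c[1]) or 1.0                  # divide by |c1|: size
        rot = cmath.exp(-1j * cmath.phase(c[1]))  # zero c1's phase: rotation
        return [c[m] * rot / scale for m in range(-k, k + 1)]

    square = [complex(x, y) for x, y in
              [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]]
    print([round(abs(d), 3) for d in fourier_descriptors(square, k=2)])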
An Algorithm for the PLA Equivalence Problem (open access)

The Programmable Logic Array (PLA) has been widely used in the design of VLSI circuits and systems because of its regularity, flexibility, and simplicity. The equivalence problem is typically to verify that the final description of a circuit is functionally equivalent to its initial description. Verifying the functional equivalence of two descriptions is equivalent to proving their logical equivalence. This problem of pure logic is essential to circuit design. The most widely used technique to solve the problem is based on the Binary Decision Diagram, or BDD, proposed by Bryant in 1986. Unfortunately, BDDs require too much time and space to represent moderately large circuits for equivalence testing. We design and implement a new algorithm called the Cover-Merge Algorithm for the equivalence problem based on a divide-and-conquer strategy using the concept of cover and a derivational method. We prove that the algorithm is sound and complete. Because of the NP-completeness of the problem, we emphasize simplifications to reduce the search space or to avoid redundant computations. Simplification techniques are incorporated into the algorithm as an essential part to speed up the derivation process. Two different sets of heuristics are developed for two opposite goals: one for the proof of equivalence …
Date: December 1995
Creator: Moon, Gyo Sik
System: The UNT Digital Library
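The Cover-Merge Algorithm itself is not reproduced here; as a naive baseline, the Python sketch below checks equivalence of two sum-of-products covers by exhausting all input assignments, which is exactly the exponential cost that motivates BDDs and the dissertation's divide-and-conquer derivation.

    from itertools import product

    # A cover is a list of cubes; a cube maps a variable name to the value it
    # requires. A cover evaluates to true when some cube is satisfied.
    def eval_cover(cover, assignment):
        return any(all(assignment[v] == val for v, val in cube.items())
                   for cube in cover)

    def equivalent(cover1, cover2, variables):
        return all(eval_cover(cover1, dict(zip(variables, bits))) ==
                   eval_cover(cover2, dict(zip(variables, bits)))
                   for bits in product([0, 1], repeat=len(variables)))

    # a.b + a.b' is equivalent to a:
    f = [{'a': 1, 'b': 1}, {'a': 1, 'b': 0}]
    g = [{'a': 1}]
    print(equivalent(f, g, ['a', 'b']))         # True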
Convexity-Preserving Scattered Data Interpolation (open access)

Surface fitting methods play an important role in many scientific fields as well as in computer aided geometric design. The problem treated here is that of constructing a smooth surface that interpolates data values associated with scattered nodes in the plane. The data is said to be convex if there exists a convex interpolant. The problem of convexity-preserving interpolation is to determine if the data is convex, and construct a convex interpolant if it exists.
Date: December 1995
Creator: Leung, Nim Keung
System: The UNT Digital Library
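A one-dimensional analogue makes the convexity question concrete (the scattered bivariate case treated in the dissertation is substantially harder; this sketch is only an illustration): data (x_i, y_i) with increasing x_i admit a convex interpolant exactly when the chord slopes are nondecreasing.

    # Convexity test for univariate data: compare consecutive chord slopes.
    def is_convex_data(xs, ys):
        pts = list(zip(xs, ys))
        slopes = [(y1 - y0) / (x1 - x0)
                  for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
        return all(s0 <= s1 for s0, s1 in zip(slopes, slopes[1:]))

    print(is_convex_data([0, 1, 2, 3], [3, 1, 1, 3]))   # True: slopes -2, 0, 2
    print(is_convex_data([0, 1, 2], [0, 2, 1]))         # False: slopes 2, -1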
Field Programmable Devices and Reconfigurable Computing (open access)

The motivation behind this research has been the idea of a computing device that can dynamically reconfigure itself. The goal of this work is to measure the computational power of reconfigurable machines in an abstract manner by proposing models of FPGAs as abstract computing machines. Modeling FPGAs in terms of automata theory would give a basis for answering more fundamental questions about their capabilities. If a Finite State Machine (FSM) or a Turing Machine (TM) has the capability of reconfiguring its finite control, does this ability give the abstract computing device new computational power? In other words, is a reconfigurable FSM, TM, or Cellular Automaton more powerful than its corresponding non-reconfigurable version?
Date: December 1995
Creator: Koyuncu, Osman
System: The UNT Digital Library
Quantifying Design Principles in Reusable Software Components (open access)

Software reuse can occur in various places during the software development cycle. Reuse of existing source code is the most commonly practiced form of software reuse. One of the key requirements for software reuse is readability, thus the interest in the use of data abstraction, inheritance, modularity, and aspects of the visible portion of module specifications. This research analyzed the contents of software reuse libraries to answer the basic question of what makes a good reusable software component. The approach taken was to measure and analyze various software metrics as mapped to design characteristics. A related research question investigated the change in the design principles over time. This was measured by comparing sets of Ada reuse libraries categorized into two time periods. It was discovered that recently developed Ada reuse components scored better on readability than earlier developed components. A benefit of this research has been the development of a set of "design for reuse" guidelines. These guidelines address coding practices as well as design principles for an Ada implementation. C++ software reuse libraries were also analyzed to determine if design principles can be applied in a language independent fashion. This research used cyclomatic complexity metrics, software science metrics, and …
Date: December 1995
Creator: Moore, Freeman Leroy
System: The UNT Digital Library
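One of the metrics named in this abstract is easy to state; as a minimal illustration (the study's full metric suite and its mapping to design characteristics are not reproduced here), McCabe's cyclomatic complexity of a control-flow graph is V(G) = E - N + 2P for E edges, N nodes, and P connected components.

    # Cyclomatic complexity computed from an explicit control-flow graph.
    def cyclomatic_complexity(edges, nodes, components=1):
        return len(edges) - len(nodes) + 2 * components

    # An if-then-else: entry -> (then | else) -> exit gives V(G) = 4 - 4 + 2 = 2
    nodes = ['entry', 'then', 'else', 'exit']
    edges = [('entry', 'then'), ('entry', 'else'),
             ('then', 'exit'), ('else', 'exit')]
    print(cyclomatic_complexity(edges, nodes))  # 2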
Efficient Algorithms and Framework for Bandwidth Allocation, Quality-of-Service Provisioning and Location Management in Mobile Wireless Computing (open access)

The fusion of computers and communications has promised to herald the age of the information superhighway over high-speed communication networks, where the ultimate goal is to enable a multitude of users, at any place, to access information from anywhere and at any time. This, in a nutshell, is the goal envisioned by Personal Communication Services (PCS) and Xerox's ubiquitous computing. In view of the remarkable growth of mobile communication users in the last few years, the radio frequency spectrum allocated by the FCC (Federal Communications Commission) to this service is still very limited, and the usable bandwidth is by far much less than the expected demand, particularly in view of the emergence of next-generation wireless multimedia applications like video-on-demand, WWW browsing, and traveler information systems. Proper management of the available spectrum is necessary not only to accommodate these high-bandwidth applications, but also to alleviate problems due to sudden explosions of traffic in so-called hot cells. In this dissertation, we have developed simple load balancing techniques to cope with the problem of tele-traffic overloads in one or more hot cells in the system. The objective is to ease the high channel demand in hot cells by …
Date: December 1997
Creator: Sen, Sanjoy Kumar
System: The UNT Digital Library
Exon/Intron Discrimination Using the Finite Induction Pattern Matching Technique (open access)

DNA sequence analysis involves precise discrimination of two of the sequence's most important components: exons and introns. Exons encode the proteins that are responsible for almost all the functions in a living organism. Introns interrupt the sequence coding for a protein and must be removed from primary RNA transcripts before translation to protein can occur. A pattern recognition technique called Finite Induction (FI) is utilized to study the language of exons and introns. FI is especially suited for analyzing and classifying large amounts of data representing sequences of interest. It requires no biological information and employs no statistical functions. Finite Induction is applied to the exon and intron components of DNA by building a collection of rules based upon what it finds in the sequences it examines. It then attempts to match the known rule patterns with new rules formed as a result of analyzing a new sequence. A high number of matches predicts a probable close relationship between the two sequences; a low number of matches signifies a large amount of difference between the two. This research demonstrates FI to be a viable tool for measurement when known patterns are available for the formation of rule sets.
Date: December 1997
Creator: Taylor, Pamela A., 1941-
System: The UNT Digital Library
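Finite Induction itself is not reproduced here; as a loose analogue of the rule-building and rule-matching this abstract describes, the Python sketch below induces fixed-length substring "rules" from one sequence and scores how many recur in another, so that a high score suggests a close relationship. The rule length and scoring are assumptions of this sketch.

    # Induce k-length substring rules from one sequence, then score a second
    # sequence by the fraction of its k-length windows matching a rule.
    def induce_rules(seq, k=4):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def match_score(rules, seq, k=4):
        windows = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        return sum(w in rules for w in windows) / max(1, len(windows))

    rules = induce_rules("ATGGCCATTGTAATGGGCCGCTG")
    print(match_score(rules, "ATGGCCATTGTAATGGGC"))     # 1.0: closely related
    print(match_score(rules, "TTTTTTAAAAACCCCC"))       # 0.0: unrelated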
Dynamic Grid-Based Data Distribution Management in Large Scale Distributed Simulations (open access)

Distributed simulation is an enabling concept to support the networked interaction of models and real world elements that are geographically distributed. This technology has brought a new set of challenging problems to solve, such as Data Distribution Management (DDM). The aim of DDM is to limit and control the volume of the data exchanged during a distributed simulation, and reduce the processing requirements of the simulation hosts by relaying events and state information only to those applications that require them. In this thesis, we propose a new DDM scheme, which we refer to as dynamic grid-based DDM. A lightweight UNT-RTI has been developed and implemented to investigate the performance of our DDM scheme. Our results clearly indicate that our scheme is scalable and it significantly reduces both the number of multicast groups used, and the message overhead, when compared to previous grid-based allocation schemes using large-scale and real-world scenarios.
Date: December 2000
Creator: Roy, Amber Joyce
System: The UNT Digital Library
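The grid-based idea underlying the scheme can be sketched briefly (this illustrates static grid-based DDM in general; the thesis's dynamic scheme refines it, and the cell size and names here are assumptions): space is partitioned into fixed cells, each region is mapped to the cells it overlaps, and updates are relayed only where an update region and a subscription region share a cell.

    # Map an axis-aligned region to the set of grid cells it overlaps; with
    # one multicast group per cell, shared cells determine who receives what.
    def cells_for_region(xmin, xmax, ymin, ymax, cell=10):
        return {(cx, cy)
                for cx in range(int(xmin // cell), int(xmax // cell) + 1)
                for cy in range(int(ymin // cell), int(ymax // cell) + 1)}

    update = cells_for_region(5, 14, 0, 9)          # publisher's update region
    interest = cells_for_region(12, 30, 5, 25)      # subscriber's interest region
    print(update & interest)                        # nonempty: relay the update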

Logic Programming Tools for Dynamic Content Generation and Internet Data Mining

Access: Use of this item is restricted to the UNT Community
The phenomenal growth of Information Technology requires us to elicit, store, and maintain huge volumes of data. Analyzing this data for various purposes is becoming increasingly important. Data mining consists of applying data analysis and discovery algorithms that, under acceptable computational efficiency limitations, produce a particular enumeration of patterns over the data. We present two techniques for data mining based on Logic Programming tools. Data mining analyzes data by extracting patterns which describe its structure and discovers correlations in the form of rules. We distinguish analysis methods as visual and non-visual and present one application of each. We explain that our focus on the field of Logic Programming makes some of the very complex tasks related to Web-based data mining and dynamic content generation simple and easy to implement in a uniform framework.
Date: December 2000
Creator: Gupta, Anima
System: The UNT Digital Library