An Algorithm for the PLA Equivalence Problem (open access)

The Programmable Logic Array (PLA) has been widely used in the design of VLSI circuits and systems because of its regularity, flexibility, and simplicity. The equivalence problem is typically to verify that the final description of a circuit is functionally equivalent to its initial description. Verifying the functional equivalence of two descriptions is equivalent to proving their logical equivalence. This problem of pure logic is essential to circuit design. The most widely used technique for solving the problem is based on the Binary Decision Diagram (BDD), proposed by Bryant in 1986. Unfortunately, BDDs require too much time and space to represent moderately large circuits for equivalence testing. We design and implement a new algorithm, called the Cover-Merge Algorithm, for the equivalence problem; it is based on a divide-and-conquer strategy using the concept of a cover and a derivational method. We prove that the algorithm is sound and complete. Because of the NP-completeness of the problem, we emphasize simplifications that reduce the search space or avoid redundant computations. Simplification techniques are incorporated into the algorithm as an essential part of speeding up the derivation process. Two different sets of heuristics are developed for two opposite goals: one for the proof of equivalence …
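As a point of reference for what the equivalence problem asks, the following minimal Python sketch checks two sum-of-products covers for equivalence by exhaustive evaluation. The cube encoding over {'0', '1', '-'} is an assumption made for illustration, and the brute force shown here is exactly what the Cover-Merge Algorithm and its simplifications are meant to avoid; it is not the dissertation's algorithm.

```python
# A minimal sketch (not the Cover-Merge Algorithm) of what PLA equivalence
# means: two sum-of-products covers are equivalent iff they realize the same
# Boolean function on every input. This exhaustive check is exponential.
from itertools import product

def cube_covers(cube, assignment):
    """True if an input assignment (tuple of 0/1 bits) satisfies a cube."""
    return all(c == '-' or int(c) == bit for c, bit in zip(cube, assignment))

def eval_cover(cover, assignment):
    """A cover (list of cubes) evaluates to 1 wherever any cube covers the input."""
    return any(cube_covers(cube, assignment) for cube in cover)

def equivalent(cover_a, cover_b, n_inputs):
    """Exhaustively compare the two covers over all 2^n input assignments."""
    return all(eval_cover(cover_a, a) == eval_cover(cover_b, a)
               for a in product((0, 1), repeat=n_inputs))

# Example: x1*x2 + x1*~x2 is equivalent to x1 alone (the third input is unused).
print(equivalent(['11-', '10-'], ['1--'], 3))   # True
```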
Date: December 1995
Creator: Moon, Gyo Sik
System: The UNT Digital Library
Multiresolution Signal Cross-correlation (open access)

Signal correlation is a digital signal processing technique with a wide variety of applications, ranging from geophysical exploration to acoustic signal enhancement and beamforming. This dissertation considers the technique from an underwater acoustics perspective, but the algorithms illustrated here can readily be applied to other areas. Although beamforming techniques have been studied for the past fifty years, modern beamforming systems still have difficulty operating in noisy environments, especially in shallow water.
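For readers unfamiliar with the underlying operation, the following minimal sketch (not taken from the dissertation) estimates the relative delay between two sensor signals by cross-correlation, the basic step behind delay-and-sum beamforming. The sampling rate, pulse shape, noise level, and 25-sample delay are invented for illustration.

```python
import numpy as np

fs = 1000.0                                    # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)

def gaussian_pulse(center):
    """A short pulse standing in for an acoustic arrival."""
    return np.exp(-((t - center) ** 2) / (2 * 0.01 ** 2))

true_delay = 25                                # samples between the two sensors
x = gaussian_pulse(0.30) + 0.05 * np.random.randn(t.size)                    # sensor 1
y = gaussian_pulse(0.30 + true_delay / fs) + 0.05 * np.random.randn(t.size)  # sensor 2

corr = np.correlate(y, x, mode='full')         # full cross-correlation
lags = np.arange(-x.size + 1, x.size)          # lag axis for the 'full' output
print(lags[np.argmax(corr)])                   # close to 25: y lags x by 25 samples
```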
Date: December 1994
Creator: Novaes, Marcos (Marcos Nogueira)
System: The UNT Digital Library
A Real-Time Merging-Buffering Technique for MIDI Messages (open access)

A powerful and efficient algorithm has been designed to deal with the critical timing problem of MIDI messages. The algorithm converts note events stored in a natural way into MIDI messages dynamically. Only limited memory space (the buffer) is required to finish the conversion, and the size of the buffer is independent of the size of the original sequence of notes. The algorithm's real-time, variable properties suggest not only flexible real-time control over musical aspects but also expandability to interactive multimedia applications. A compositional environment called MusicSculptor has been implemented based on this algorithm.
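A minimal sketch of the buffering idea described above, not the MusicSculptor implementation: note events are converted to note-on/note-off messages in time order while only the pending note-offs are buffered, so memory grows with the number of simultaneously sounding notes rather than with the length of the sequence. The event format is an assumption made for illustration.

```python
import heapq

def note_events_to_midi(notes):
    """notes: iterable of (onset, duration, pitch, velocity), sorted by onset.
    Yields (time, status, pitch, velocity) tuples; 0x90 = note on, 0x80 = note off."""
    pending_offs = []                          # min-heap of (off_time, pitch)
    for onset, duration, pitch, velocity in notes:
        # Flush every note-off that is due before this note starts.
        while pending_offs and pending_offs[0][0] <= onset:
            off_time, off_pitch = heapq.heappop(pending_offs)
            yield (off_time, 0x80, off_pitch, 0)
        yield (onset, 0x90, pitch, velocity)
        heapq.heappush(pending_offs, (onset + duration, pitch))
    while pending_offs:                        # flush whatever is still sounding
        off_time, off_pitch = heapq.heappop(pending_offs)
        yield (off_time, 0x80, off_pitch, 0)

# Two overlapping notes: C4 for one beat, E4 starting half a beat later.
for msg in note_events_to_midi([(0.0, 1.0, 60, 100), (0.5, 1.0, 64, 100)]):
    print(msg)
```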
Date: December 1991
Creator: Chang, Kuo-Lung
System: The UNT Digital Library
An English and Arabic Character Printer (open access)

This paper is presented in satisfaction of the requirement for two problems in lieu of thesis, which are required for the degree of Master of Science. The two problems are: (1) to provide an electrical interface between the M6800 microprocessor and the printer; and (2) to design an Arabic character set and to provide the logic required for its implementation. As it would be artificial and impractical to document these problems separately, a single document is provided here.
Date: December 1976
Creator: Abdel-Razzack, Malek G.
System: The UNT Digital Library
Design and Implementation of a Parser for the DBase II Query Language (open access)

In this paper, the DBase II query language of an RDBMS for personal computers is discussed. Other query languages provided by large and sophisticated DBMSs are not discussed here. The reasons for selecting the DBase II query language for discussion are as follows: 1. It is a simple language that can be learned easily [TOWN 84, DINE 84]; within a short period, users can learn all of its facilities and manage the system very well. 2. It is a language suitable for interactive programming and execution, like BASIC. 3. It provides adequate facilities for a small database system and serves as an introductory guide to more sophisticated systems.
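To illustrate the kind of parsing involved, the sketch below handles a single simplified command form loosely modeled on DBase II syntax (LIST FOR <field> = <value>). The grammar, token names, and error messages are assumptions made for illustration; this is not the parser described in the paper.

```python
import re

TOKEN_RE = re.compile(r"\s*(?:(?P<WORD>[A-Za-z_][A-Za-z0-9_]*)"
                      r"|(?P<NUMBER>\d+)|(?P<EQ>=))")

def tokenize(text):
    pos, tokens = 0, []
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if not m:
            raise SyntaxError(f"bad character at {pos}: {text[pos]!r}")
        tokens.append((m.lastgroup, m.group(m.lastgroup)))
        pos = m.end()
    return tokens

def parse_list(tokens):
    """list_cmd := 'LIST' [ 'FOR' WORD '=' (WORD | NUMBER) ]"""
    if not tokens or tokens[0] != ("WORD", "LIST"):
        raise SyntaxError("expected LIST")
    if len(tokens) == 1:
        return {"command": "LIST", "condition": None}
    if (len(tokens) == 5 and tokens[1] == ("WORD", "FOR")
            and tokens[2][0] == "WORD" and tokens[3][0] == "EQ"):
        return {"command": "LIST", "condition": (tokens[2][1], tokens[4][1])}
    raise SyntaxError("expected: LIST FOR <field> = <value>")

print(parse_list(tokenize("LIST FOR age = 30")))
# {'command': 'LIST', 'condition': ('age', '30')}
```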
Date: December 1985
Creator: Chan, Kin Pong
System: The UNT Digital Library
Recognition of Face Images (open access)

The focus of this dissertation is a methodology that enables computer systems to classify different frontal images of human faces as belonging to one of the individuals to which the system has been exposed previously. The images can vary in size, location of the face, orientation, facial expression, and overall illumination. The approach taken in this dissertation can be classified as analytic, since the shapes of individual features of human faces are examined separately, as opposed to holistic approaches to face recognition. The outline of each feature is used to construct signature functions. These functions are then magnitude-, period-, and phase-normalized to form a translation-, size-, and rotation-invariant representation of the features. Vectors of a limited number of the Fourier decomposition coefficients of these functions form the feature vectors representing the features in the corresponding vector space. With this approach, no computation is necessary to enforce translational, size, and rotational invariance at the recognition stage, which reduces the problem of recognition to a k-dimensional clustering problem. A recognizer is specified that can reliably classify the vectors of the feature space into object classes. The recognizer makes use of the following principle: …
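The following sketch illustrates the general Fourier-descriptor idea in simplified form, not the dissertation's exact magnitude, period, and phase normalization: the outline of a feature is treated as a complex signal, the DC term is dropped for translation invariance, the coefficients are scaled by the first harmonic for size invariance, and magnitudes are kept for rotation invariance. The outline data and number of coefficients are assumptions for illustration.

```python
import numpy as np

def fourier_descriptor(outline_xy, n_coeffs=8):
    """outline_xy: (N, 2) array of boundary points of one facial feature."""
    z = outline_xy[:, 0] + 1j * outline_xy[:, 1]   # boundary as a complex signal
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                    # drop DC term -> translation invariance
    coeffs /= np.abs(coeffs[1])        # scale by first harmonic -> size invariance
    return np.abs(coeffs[1:n_coeffs + 1])          # magnitudes -> rotation invariance

# Two circles of different size and position give (nearly) the same descriptor.
theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
small = np.column_stack([np.cos(theta), np.sin(theta)])
big = np.column_stack([5 + 3 * np.cos(theta), 2 + 3 * np.sin(theta)])
print(np.allclose(fourier_descriptor(small), fourier_descriptor(big), atol=1e-6))  # True
```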
Date: December 1994
Creator: Pershits, Edward
System: The UNT Digital Library
A Highly Fault-Tolerant Distributed Database System with Replicated Data (open access)

Because of the high cost and impracticality of a high-connectivity network, most recent research in transaction processing has focused on distributed replicated database systems. In such a system, multiple copies of a data item are created and stored at several sites in the network, so that the system is able to tolerate more crash and communication failures and attain higher data availability. However, the multiple copies also introduce a global inconsistency problem, especially in a partitioned network. In this dissertation, a tree quorum algorithm is proposed to solve this problem, imposing a logical tree structure, along with dynamic system reconfiguration, on all the copies of each data item. The proposed algorithm can be viewed as a dynamic voting technique which, with the help of an appropriate concurrency control algorithm, exhibits the major advantages of quorum-based replica control algorithms and of the available copies algorithm, so that a single copy is read for a read operation and a quorum of copies is written for a write operation. In addition, read and write quorums are computed dynamically and independently. As a result, expensive read operations, like those that require several copies of a data item to be read in most …
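The sketch below illustrates the basic tree-quorum rule the dissertation builds on, assuming a simple nested-tuple tree and a set of reachable copies: a subtree contributes its root copy if that copy is available, and otherwise a majority of its children must contribute quorums recursively. The dynamic reconfiguration and the separately, dynamically computed read and write quorums described in the abstract are not shown.

```python
def tree_quorum(node, alive):
    """Return a set of copies forming a quorum under `node`, or None.
    node = (copy_id, [children]); alive = set of copy ids currently reachable."""
    copy_id, children = node
    if copy_id in alive:
        return {copy_id}                       # the root of this subtree suffices
    if not children:
        return None                            # unreachable leaf: no quorum here
    needed = len(children) // 2 + 1            # majority of children
    collected, have = set(), 0
    for child in children:
        sub = tree_quorum(child, alive)
        if sub is not None:
            collected |= sub
            have += 1
            if have == needed:
                return collected
    return None

# Three-level tree of 13 copies; the root copy 'A' has crashed.
tree = ('A', [('B', [('E', []), ('F', []), ('G', [])]),
              ('C', [('H', []), ('I', []), ('J', [])]),
              ('D', [('K', []), ('L', []), ('M', [])])])
print(tree_quorum(tree, alive={'B', 'H', 'I', 'K', 'L'}))
# e.g. {'B', 'H', 'I'}: B plus a majority under C stand in for the failed root
```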
Date: December 1994
Creator: Lin, Tsai S. (Tsai Shooumeei)
System: The UNT Digital Library
Efficient Algorithms and Framework for Bandwidth Allocation, Quality-of-Service Provisioning and Location Management in Mobile Wireless Computing (open access)

The fusion of computers and communications has promised to herald the age of the information superhighway over high-speed communication networks, where the ultimate goal is to enable a multitude of users, at any place, to access information from anywhere and at any time. This, in a nutshell, is the goal envisioned by Personal Communication Services (PCS) and Xerox's ubiquitous computing. Despite the remarkable growth in mobile communication users over the last few years, the radio frequency spectrum allocated by the FCC (Federal Communications Commission) to this service is still very limited, and the usable bandwidth is far less than the expected demand, particularly in view of the emergence of next-generation wireless multimedia applications such as video-on-demand, WWW browsing, and traveler information systems. Proper management of the available spectrum is necessary not only to accommodate these high-bandwidth applications, but also to alleviate problems due to sudden explosions of traffic in so-called hot cells. In this dissertation, we have developed simple load balancing techniques to cope with the problem of tele-traffic overloads in one or more hot cells in the system. The objective is to ease the high channel demand in hot cells by …
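The toy sketch below conveys only the general flavor of easing a hot cell by drawing on lightly loaded neighbors. The occupancy thresholds, the data layout, and the treatment of channels as a fungible count are all assumptions made for illustration and do not reflect the dissertation's actual load balancing scheme.

```python
HOT_THRESHOLD = 0.9          # occupancy above which a cell is considered hot (assumed)
COLD_THRESHOLD = 0.6         # neighbors below this may lend channels (assumed)

def occupancy(cell):
    return cell["busy"] / cell["channels"]

def balance_hot_cell(hot, neighbors, demand):
    """Move up to `demand` spare channels from cold neighbors to the hot cell."""
    if occupancy(hot) < HOT_THRESHOLD:
        return 0
    borrowed = 0
    # Lend from the most lightly loaded neighbors first.
    for cell in sorted(neighbors, key=occupancy):
        if borrowed == demand or occupancy(cell) >= COLD_THRESHOLD:
            break
        spare = cell["channels"] - cell["busy"]
        take = min(spare, demand - borrowed)
        cell["channels"] -= take      # neighbor gives up capacity...
        hot["channels"] += take       # ...which the hot cell can now assign
        borrowed += take
    return borrowed

hot = {"id": "H", "channels": 50, "busy": 48}
neighbors = [{"id": "N1", "channels": 50, "busy": 20},
             {"id": "N2", "channels": 50, "busy": 35}]
print(balance_hot_cell(hot, neighbors, demand=5))   # 5 channels borrowed
print(hot["channels"])                              # 55
```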
Date: December 1997
Creator: Sen, Sanjoy Kumar
System: The UNT Digital Library
Quantifying Design Principles in Reusable Software Components (open access)

Software reuse can occur at various points during the software development cycle. Reuse of existing source code is the most commonly practiced form of software reuse. One of the key requirements for software reuse is readability, hence the interest in data abstraction, inheritance, modularity, and aspects of the visible portion of module specifications. This research analyzed the contents of software reuse libraries to answer the basic question of what makes a good reusable software component. The approach taken was to measure and analyze various software metrics as mapped to design characteristics. A related research question investigated the change in design principles over time, measured by comparing sets of Ada reuse libraries categorized into two time periods. It was discovered that recently developed Ada reuse components scored better on readability than earlier developed components. A benefit of this research has been the development of a set of "design for reuse" guidelines. These guidelines address coding practices as well as design principles for an Ada implementation. C++ software reuse libraries were also analyzed to determine whether design principles can be applied in a language-independent fashion. This research used cyclomatic complexity metrics, software science metrics, and …
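As an illustration of one of the metrics named above, the sketch below approximates McCabe's cyclomatic complexity by counting decision tokens in source text. Real metric tools work on parsed code; the token list shown is a simplification chosen for a C++ fragment rather than the Ada components analyzed in the thesis.

```python
# Cyclomatic complexity, V(G) = number of decision points + 1, approximated
# here by a keyword/token count over raw C++ source text.
import re

DECISION_PATTERNS = (r"\bif\b", r"\bwhile\b", r"\bfor\b", r"\bcase\b",
                     r"&&", r"\|\|", r"\?")

def approx_cyclomatic_complexity(source):
    decisions = sum(len(re.findall(p, source)) for p in DECISION_PATTERNS)
    return decisions + 1

cpp_snippet = """
    int clamp(int x, int lo, int hi) {
        if (x < lo) return lo;
        if (x > hi) return hi;
        return x;
    }
"""
print(approx_cyclomatic_complexity(cpp_snippet))   # 3: two ifs, plus one
```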
Date: December 1995
Creator: Moore, Freeman Leroy
System: The UNT Digital Library
Temporal Connectionist Expert Systems Using a Temporal Backpropagation Algorithm (open access)

Representing time has been a general problem for artificial intelligence research for many years. More recently, the question of representing time has become increasingly important in representing the human decision-making process through connectionist expert systems. Because most human behaviors unfold over time, any attempt to represent expert performance without considering its temporal nature can often lead to incorrect results. A temporal feedforward neural network model that can be applied to a number of neural network application areas, including connectionist expert systems, has been introduced. The neural network model has a multi-layer structure, i.e., the number of layers is not limited. Also, the model has the flexibility of defining output nodes in any layer, which is especially important for connectionist expert system applications. A temporal backpropagation algorithm that supports the model has been developed. The model, along with the temporal backpropagation algorithm, makes it practical to define a wide range of artificial neural network applications. In addition, an approach for decreasing the memory space used by the weight matrix has been introduced. The algorithm was tested using a medical connectionist expert system to show how best to describe not only the disease but also its entire course. …
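The sketch below is not the dissertation's temporal backpropagation algorithm; it only illustrates, with assumed layer sizes and a toy signal, the simplest way a feedforward network can be given temporal context: a sliding window of past samples is fed as input, and ordinary backpropagation (written out by hand here) adjusts the weights to predict the next sample.

```python
import numpy as np

rng = np.random.default_rng(0)
window, hidden = 5, 8                          # 5 past samples in, 8 hidden units
W1 = 0.1 * rng.standard_normal((hidden, window))
b1 = np.zeros(hidden)
W2 = 0.1 * rng.standard_normal(hidden)
b2 = 0.0
lr = 0.05

series = np.sin(0.3 * np.arange(200))          # toy temporal signal

for t in range(window, series.size - 1):
    x, target = series[t - window:t], series[t]        # past window -> next value
    h = np.tanh(W1 @ x + b1)                           # hidden layer
    y = W2 @ h + b2                                    # network output
    err = y - target                                   # gradient of squared error
    grad_W2, grad_b2 = err * h, err                    # output layer gradients
    grad_pre = (W2 * err) * (1.0 - h ** 2)             # backpropagate through tanh
    grad_W1, grad_b1 = np.outer(grad_pre, x), grad_pre
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1

x = series[-window - 1:-1]                             # last window before the final sample
print(float(W2 @ np.tanh(W1 @ x + b1) + b2), series[-1])   # prediction vs. actual value
```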
Date: December 1993
Creator: Civelek, Ferda N. (Ferda Nur)
System: The UNT Digital Library
Macro Control Structures for Structured Programming in ALC (open access)

This thesis describes a set of computer program control structures which permits the application of certain structured programming techniques to the IBM/360 assembly language (ALC). The control structures are implemented by programmer-defined instructions known as macros. A history of computer software is presented, providing a basis for the emergence of structured programming. A survey of the major concepts of structured programming follows, with special attention to control structures and their significance to structured programming. The macros developed in this study include DO, ENDDO, LEAVE, CASE, and ENDCASE. They provide a looping control structure, a loop-escape construct, and a selective control structure. Examples of usage are given.
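Since the macros themselves are written in the IBM/360 macro language, the Python sketch below only illustrates the kind of expansion a DO/ENDDO pair must generate: a labelled loop top, a condition test, a branch out of the loop when the test fails, and a branch back. The label naming and column layout are invented; the mnemonics follow System/360 conventions.

```python
import itertools

_label_seq = itertools.count(1)

def expand_do_enddo(test, exit_branch, body_lines):
    """Emit the ALC a DO ... ENDDO pair would hide from the programmer.
    `test` sets the condition code (e.g. a compare); `exit_branch` is the
    branch mnemonic that leaves the loop (e.g. 'BNL', branch if not low)."""
    n = next(_label_seq)
    top, out = f"@DO{n:03d}", f"@EX{n:03d}"
    lines = [f"{top}  DS    0H                 loop top (DO)",
             f"        {test}",
             f"        {exit_branch}   {out}              leave when test fails"]
    lines += [f"        {body}" for body in body_lines]
    lines += [f"        B     {top}             branch back (ENDDO)",
              f"{out}  DS    0H                 loop exit"]
    return lines

# A while-style loop: repeat the body while register 3 is below 100.
print("\n".join(expand_do_enddo("C     3,=F'100'", "BNL", ["AR    3,4"])))
```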
Date: December 1975
Creator: Connally, Kim G.
System: The UNT Digital Library