A Model for the Efficient Investment of Temporary Funds by Corporate Money Managers (open access)

In this study, seventeen relationships between the yields of three-month, six-month, and twelve-month negotiable CDs and U.S. Government T-bills were analyzed to find a leading indicator of short-term interest rates. Each of the seventeen relationships was tested for correlation with actual three-, six-, and twelve-month yields from zero to twenty-six weeks in the future. Only one relationship was found to be significant as a leading indicator: the twelve-month yield minus the six-month yield, adjusted for scale and accumulated where the result was positive. This indicator (variable nineteen in the study) was further tested for usefulness as a trend indicator by transforming it into a function taking the value +1 when its slope was positive, 0 when its slope was zero, and -1 when its slope was negative. Stage II of the study consisted of constructing a computer-aided model employing variable nineteen as a forecasting device. The model accepts as input a week-by-week minimum cash balance forecast and the past thirteen weeks' yields of three-, six-, and twelve-month CDs. The output of the model consists of a cash time availability schedule, a numerical listing of variable nineteen values, the thirteen-week history of three-, six-, and twelve-month CD yields, a …
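
A minimal sketch of one reading of variable nineteen, assuming hypothetical thirteen-week CD yields and a CUSUM-style accumulation (the abstract's phrase "accumulated where the result was positive" is open to interpretation):

```python
import numpy as np

def variable_19(y12, y6, scale=1.0):
    """One reading of the indicator: accumulate the scaled
    (12-month - 6-month) spread, keeping the running sum only
    where it stays positive (a one-sided CUSUM)."""
    spread = scale * (np.asarray(y12, float) - np.asarray(y6, float))
    out, s = np.empty(spread.size), 0.0
    for i, d in enumerate(spread):
        s = max(0.0, s + d)
        out[i] = s
    return out

def trend_signal(indicator):
    """Map the week-to-week slope to +1 / 0 / -1."""
    return np.sign(np.diff(indicator)).astype(int)

# Thirteen weeks of hypothetical CD yields (percent), mirroring the model input.
y12 = [7.2, 7.3, 7.1, 7.4, 7.5, 7.6, 7.4, 7.3, 7.5, 7.7, 7.8, 7.6, 7.9]
y6  = [7.0, 7.1, 7.3, 7.2, 7.2, 7.4, 7.5, 7.6, 7.3, 7.4, 7.6, 7.5, 7.6]
print(trend_signal(variable_19(y12, y6)))
```
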
Date: August 1974
Creator: McWilliams, Donald B., 1936-
System: The UNT Digital Library
Classification by Neural Network and Statistical Models in Tandem: Does Integration Enhance Performance? (open access)

The purpose of the current research is twofold. The first purpose is to present a composite approach to the general classification problem by using outputs from various parametric statistical procedures and neural networks. The second purpose is to compare several parametric and neural network models on a transportation-planning-related classification problem and five simulated classification problems.
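
A minimal sketch of one way such a tandem could be assembled, assuming simulated data, a logistic regression as the parametric stage, and a small neural network, with both models' outputs feeding a simple combiner; the data and model choices are illustrative, not the study's:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Simulated two-group classification problem (hypothetical stand-in).
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: a parametric model and a neural network, fit separately.
param = LogisticRegression().fit(X_tr, y_tr)
net = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

# Stage 2: integrate by feeding both models' outputs to a combiner.
stack_tr = np.column_stack([param.predict_proba(X_tr)[:, 1],
                            net.predict_proba(X_tr)[:, 1]])
stack_te = np.column_stack([param.predict_proba(X_te)[:, 1],
                            net.predict_proba(X_te)[:, 1]])
combiner = LogisticRegression().fit(stack_tr, y_tr)
print("tandem accuracy:", combiner.score(stack_te, y_te))
```
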
Date: December 1998
Creator: Mitchell, David
System: The UNT Digital Library
Robustness of Parametric and Nonparametric Tests When Distances between Points Change on an Ordinal Measurement Scale (open access)

The purpose of this research was to evaluate the effect on parametric and nonparametric tests using ordinal data when the distances between points on the measurement scale changed. The research examined the Type I and Type II error rates of selected parametric and nonparametric tests.
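
A minimal simulation sketch of the design described, assuming a 5-point scale and illustrative spacings; it estimates Type I error rates for a t-test and a Mann-Whitney U test when the distances between scale points change:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Two score assignments for the same 5-point ordinal scale: equal spacing
# versus stretched distances between the upper categories (hypothetical).
equal     = {1: 1, 2: 2, 3: 3, 4: 4, 5: 5}
stretched = {1: 1, 2: 2, 3: 3, 4: 6, 5: 10}

def type_i_rate(scores, reps=2000, n=30):
    """Both groups come from the same ordinal distribution, so every
    rejection at alpha = .05 is a Type I error."""
    t_rej = u_rej = 0
    for _ in range(reps):
        a = np.vectorize(scores.get)(rng.integers(1, 6, n))
        b = np.vectorize(scores.get)(rng.integers(1, 6, n))
        t_rej += stats.ttest_ind(a, b).pvalue < 0.05
        u_rej += stats.mannwhitneyu(a, b).pvalue < 0.05
    return t_rej / reps, u_rej / reps

print("equal spacing   (t, U):", type_i_rate(equal))
print("stretched scale (t, U):", type_i_rate(stretched))
```
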
Date: August 1994
Creator: Chen, Andrew H. (Andrew Hwa-Fen)
System: The UNT Digital Library
A Quantitative Approach to Medical Decision Making (open access)

The purpose of this study is to develop a technique by which a physician may use a predetermined data base to derive a preliminary diagnosis for a patient with a given set of symptoms. The technique will not yield an absolute diagnosis, but rather will point the way to a set of most likely diseases upon which the physician may concentrate his efforts. There will be no reliance upon a data base compiled from poorly kept medical records with non-standardized terminology. While this study produces a workable tool for the physician to use in the process of medical diagnosis, the ultimate responsibility for the patient's welfare must still rest with the physician.
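
The abstract does not name its technique; Bayes' rule is one standard formalization of ranking likely diseases from a predetermined database. A sketch with entirely hypothetical priors and symptom probabilities:

```python
# Hypothetical disease priors and P(symptom | disease) from a
# predetermined database; all numbers are illustrative.
priors = {"flu": 0.05, "cold": 0.20, "strep": 0.02}
p_symptom = {
    "flu":   {"fever": 0.90, "cough": 0.80, "sore_throat": 0.50},
    "cold":  {"fever": 0.10, "cough": 0.60, "sore_throat": 0.40},
    "strep": {"fever": 0.70, "cough": 0.20, "sore_throat": 0.95},
}

def rank_diseases(symptoms):
    """Posterior ranking via Bayes' rule under a conditional-independence
    assumption; returns likely diseases, not an absolute diagnosis."""
    scores = dict(priors)
    for d in scores:
        for s in symptoms:
            scores[d] *= p_symptom[d].get(s, 0.01)
    total = sum(scores.values())
    return sorted(((d, p / total) for d, p in scores.items()),
                  key=lambda x: -x[1])

print(rank_diseases(["fever", "sore_throat"]))
```
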
Date: May 1975
Creator: Meredith, John W.
System: The UNT Digital Library
Economic Statistical Design of Inverse Gaussian Distribution Control Charts (open access)

Statistical quality control (SQC) is one technique companies are using in the development of a Total Quality Management (TQM) culture. Shewhart control charts, a widely used SQC tool, rely on an underlying normal distribution of the data. Often data are skewed. The inverse Gaussian distribution is a probability distribution that is well-suited to handling skewed data. This analysis develops models and a set of tools usable by practitioners for the constrained economic statistical design of control charts for inverse Gaussian distribution process centrality and process dispersion. The use of this methodology is illustrated by the design of an x-bar chart and a V chart for an inverse Gaussian distributed process.
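
The full economic statistical design chooses subgroup size, sampling interval, and limit width against cost constraints; the sketch below shows only the distributional ingredient, assuming illustrative inverse Gaussian parameters and using the fact that the mean of n iid IG(m, λ) observations is IG(m, nλ):

```python
from scipy.stats import invgauss

# Hypothetical in-control process: inverse Gaussian with mean m, shape lam.
m, lam, n, alpha = 10.0, 40.0, 5, 0.0027   # alpha mimics 3-sigma coverage

# Subgroup means of n IG(m, lam) observations are IG(m, n*lam), so
# probability limits for an x-bar-type chart follow directly.
# (scipy parametrization: mean = mu * scale, shape = scale.)
dist = invgauss(mu=m / (n * lam), scale=n * lam)
lcl, ucl = dist.ppf(alpha / 2), dist.ppf(1 - alpha / 2)
print(f"centre line {m:.2f}, LCL {lcl:.3f}, UCL {ucl:.3f}")
```
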
Date: August 1990
Creator: Grayson, James M. (James Morris)
System: The UNT Digital Library
A Heuristic Procedure for Specifying Parameters in Neural Network Models for Shewhart X-bar Control Chart Applications (open access)

This study develops a heuristic procedure for specifying parameters for a neural network configuration (learning rate, momentum, and the number of neurons in a single hidden layer) in Shewhart X-bar control chart applications. Also, this study examines the replicability of the neural network solution when the neural network is retrained several times with different initial weights.
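
A minimal stand-in for such a heuristic (a plain grid search, not the study's procedure), assuming simulated in-control and mean-shift subgroup windows and the three parameters the study names: learning rate, momentum, and the number of hidden neurons:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)

# Windows of 10 standardized subgroup means: in control vs. mean shift.
def window(shift):
    return rng.normal(shift, 1.0, 10)

X = np.array([window(0.0) for _ in range(300)] +
             [window(1.5) for _ in range(300)])
y = np.array([0] * 300 + [1] * 300)

# Candidate values are illustrative, not the study's recommendations.
grid = {
    "learning_rate_init": [0.01, 0.1, 0.3],
    "momentum": [0.5, 0.9],
    "hidden_layer_sizes": [(3,), (5,), (10,)],
}
net = MLPClassifier(solver="sgd", max_iter=1000, random_state=0)
search = GridSearchCV(net, grid, cv=3).fit(X, y)
print(search.best_params_)
```
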
Date: December 1993
Creator: Nam, Kyungdoo T.
System: The UNT Digital Library
The Establishment of Helicopter Subsystem Design-to-Cost Estimates by Use of Parametric Cost Estimating Models (open access)

The purpose of this research was to develop parametric Design-to-Cost models for selected major subsystems of certain helicopters. This was accomplished by analyzing the relationships between historical production costs and certain design parameters which are available during the preliminary design phase of the life cycle. Several potential contributions are identified in the areas of academia, government, and industry. Application of the cost models will provide estimates beneficial to the government and DoD by allowing derivation of realistic Design-to-Cost estimates. In addition, companies in the helicopter industry will benefit by using the models for two key purposes: (1) optimizing helicopter design through cost-effective tradeoffs, and (2) justifying a proposal estimate.
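
A minimal sketch of a parametric cost estimating relationship of the kind described, assuming a hypothetical log-linear form (cost = a · weight^b1 · power^b2) and illustrative data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: subsystem production cost vs. design parameters
# available at preliminary design (empty weight in lb, installed power in shp).
weight = np.array([1200, 1500, 1800, 2100, 2600, 3000])
power  = np.array([650,  800,  900, 1100, 1400, 1600])
cost   = np.array([0.9,  1.1,  1.4,  1.6,  2.1,  2.5])   # $M, illustrative

# Fit the log-linear CER by ordinary least squares in log space.
X = np.log(np.column_stack([weight, power]))
model = LinearRegression().fit(X, np.log(cost))

# Design-to-Cost estimate for a new design point.
new = np.log([[2300, 1200]])
print("estimate ($M):", np.exp(model.predict(new))[0])
```
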
Date: August 1979
Creator: Gilliland, Johnny J.
System: The UNT Digital Library
The Effect of Certain Modifications to Mathematical Programming Models for the Two-Group Classification Problem (open access)

This research examines certain modifications of the mathematical programming models to improve their classificatory performance. These modifications involve the inclusion of second-order terms and secondary goals in mathematical programming models. A Monte Carlo simulation study is conducted to investigate the performance of two standard parametric models and various mathematical programming models, including the MSD (minimize sum of deviations) model, the MIP (mixed integer programming) model and the hybrid linear programming model.
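
A minimal sketch of the MSD model as a linear program, on simulated two-group data; the normalization constraint that rules out the trivial zero solution is one common choice, not necessarily the study's:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
A = rng.normal(0.0, 1.0, (40, 2))      # group A (simulated)
B = rng.normal(2.0, 1.0, (40, 2))      # group B (simulated)
n, p = len(A) + len(B), 2

# Variables: [w1, w2, c, d_1..d_n]; minimize the sum of deviations d_i.
cobj = np.concatenate([np.zeros(p + 1), np.ones(n)])

rows, rhs = [], []
for i, x in enumerate(A):              # want x.w <= c:  x.w - c - d_i <= 0
    r = np.zeros(p + 1 + n); r[:p] = x; r[p] = -1; r[p + 1 + i] = -1
    rows.append(r); rhs.append(0.0)
for j, x in enumerate(B):              # want x.w >= c: -x.w + c - d_j <= 0
    r = np.zeros(p + 1 + n); r[:p] = -x; r[p] = 1; r[p + 1 + len(A) + j] = -1
    rows.append(r); rhs.append(0.0)

# Normalization w.(mean_B - mean_A) = 1 excludes the trivial w = 0 solution.
norm = np.zeros(p + 1 + n); norm[:p] = B.mean(0) - A.mean(0)

res = linprog(cobj, A_ub=np.array(rows), b_ub=rhs,
              A_eq=norm.reshape(1, -1), b_eq=[1.0],
              bounds=[(None, None)] * (p + 1) + [(0, None)] * n)
print("weights", res.x[:p], "cutoff", res.x[p], "total deviation", res.fun)
```
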
Date: May 1994
Creator: Wanarat, Pradit
System: The UNT Digital Library
Validation and Investigation of the Four Aspects of Cycle Regression: A New Algorithm for Extracting Cycles (open access)

The cycle regression analysis algorithm is the most recent addition to a group of techniques developed to detect "hidden periodicities." This dissertation investigates four major aspects of the algorithm. The objectives of this research are: (1) to develop an objective method of obtaining an initial estimate of the cycle period, since the present procedure of obtaining this estimate involves considerable subjective judgment; (2) to validate the algorithm's success in extracting cycles from multi-cyclical data; (3) to determine whether a consistent relationship exists among the smallest amplitude, the error standard deviation, and the number of replications of a cycle contained in the data; and (4) to investigate the behavior of the algorithm in predicting major drops.
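
A minimal sketch of iterative sinusoid extraction in the spirit of cycle regression (not the dissertation's exact algorithm), assuming simulated multi-cyclical data and hand-supplied initial period estimates, which is precisely the step objective (1) seeks to automate:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
t = np.arange(120.0)
# Simulated multi-cyclical series: two hidden periodicities plus noise.
y = (3 * np.sin(2 * np.pi * t / 12)
     + 1.5 * np.sin(2 * np.pi * t / 30 + 1.0)
     + rng.normal(0, 0.5, t.size))

def cycle(t, amp, period, phase):
    return amp * np.sin(2 * np.pi * t / period + phase)

resid, cycles = y.copy(), []
for guess in (12.0, 30.0):             # subjective initial period estimates
    params, _ = curve_fit(cycle, t, resid, p0=[1.0, guess, 0.0])
    cycles.append(params)
    resid = resid - cycle(t, *params)  # extract the cycle, keep the residual
print(np.round(cycles, 2))             # [amplitude, period, phase] per cycle
```
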
Date: December 1982
Creator: Mehta, Mayur Ravishanker
System: The UNT Digital Library
Application of Spectral Analysis to the Cycle Regression Algorithm (open access)

Many techniques have been developed to analyze time series. Spectral analysis and cycle regression analysis represent two such techniques. This study combines these two powerful tools to produce two new algorithms: the spectral algorithm and the one-pass algorithm. This research encompasses four objectives. The first objective is to link spectral analysis with cycle regression analysis to determine an initial estimate of the sinusoidal period. The second objective is to determine the best spectral window and truncation point combination to use with cycle regression for the initial estimate of the sinusoidal period. The third is to determine whether the new spectral algorithm performs better than the old T-value algorithm in estimating sinusoidal parameters. The fourth objective is to determine whether the one-pass algorithm can be used to estimate all significant harmonics simultaneously.
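
A minimal sketch of the spectral step on a simulated series: the periodogram peak supplies the initial estimate of the sinusoidal period that cycle regression then refines (the spectral window and truncation-point choices studied here are omitted):

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
t = np.arange(200.0)
# Simulated series with one hidden 25-unit cycle plus noise.
y = 2 * np.sin(2 * np.pi * t / 25) + rng.normal(0, 1, t.size)

# The dominant periodogram frequency gives the initial period estimate.
freqs, power = periodogram(y)
peak = freqs[1:][np.argmax(power[1:])]   # skip the zero-frequency bin
print("initial period estimate:", 1 / peak)
```
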
Date: August 1984
Creator: Shah, Vivek
System: The UNT Digital Library
The Evaluation and Control of the Changes in Basic Statistics Encountered in Grouped Data (open access)

This dissertation describes the effect that the construction of frequency tables has on basic statistics computed from those frequency tables. It is directly applicable only to normally distributed data summarized by Sturges' Rule. The purpose of this research was to identify factors tending to bias sample statistics when data are summarized, and thus to allow researchers to avoid such bias. The methodology employed was a large scale simulation where 1000 replications of samples of size n = 2^(k-1) for k = 2 to 12 were drawn from a normally distributed population with a mean of zero and a standard deviation of one. A FORTRAN IV source listing is included. The report concludes that researchers should avoid the use of statistics computed from frequency tables in cases where raw data are available. Where the use of such statistics is unavoidable, the researchers can eliminate their bias by the use of empirical correction factors provided in the paper. Further research is suggested to determine the effect of summarization of data drawn from various non-normal distributions.
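
A minimal sketch of one replication of the simulation, assuming k = 8 (so n = 128): statistics computed from a Sturges'-Rule frequency table are compared with those from the raw data:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 8
x = rng.normal(0.0, 1.0, 2 ** (k - 1))    # n = 2^(k-1), as in the study

# Sturges' Rule: number of classes is about 1 + log2(n).
bins = int(np.ceil(1 + np.log2(x.size)))
counts, edges = np.histogram(x, bins=bins)
mids = (edges[:-1] + edges[1:]) / 2        # class midpoints

# Grouped statistics treat every observation as lying at its class midpoint.
grouped_mean = np.average(mids, weights=counts)
grouped_std = np.sqrt(np.average((mids - grouped_mean) ** 2, weights=counts))
print("raw    :", x.mean(), x.std())
print("grouped:", grouped_mean, grouped_std)
```
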
Date: May 1979
Creator: Scott, James P.
System: The UNT Digital Library
The Development and Evaluation of a Forecasting System that Incorporates ARIMA Modeling with Autoregression and Exponential Smoothing (open access)

This research was designed to develop and evaluate an automated alternative to the Box-Jenkins method of forecasting. The study involved two major phases. The first phase was the formulation of an automated ARIMA method; the second was the combination of forecasts from the automated ARIMA with forecasts from two other automated methods, the Holt-Winters method and the Stepwise Autoregressive method. The development of the automated ARIMA, based on a decision criterion suggested by Akaike, borrows heavily from the work of Ang, Chua and Fatema. Seasonality and small data set handling were some of the modifications made to the original method to make it suitable for use with a broad range of time series. Forecasts were combined by means of both the simple average and a weighted averaging scheme. Empirical and generated data were employed to perform the forecasting evaluation. The 111 sets of empirical data came from the M-Competition. The twenty-one sets of generated data arose from ARIMA models that Box, Tiao and Pack analyzed using the Box-Jenkins method. To compare the forecasting abilities of the Box-Jenkins and the automated ARIMA alone and in combination with the other two methods, two accuracy measures were used. These measures, which are free …
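
A minimal sketch of the combination step, with hypothetical forecasts from the three methods; the inverse-MSE weights shown are one common weighting scheme and not necessarily the study's:

```python
import numpy as np

# Hypothetical one-step-ahead forecasts from the three automated methods.
actual       = np.array([102.0, 98.0, 105.0, 110.0])
arima        = np.array([100.0, 99.0, 104.0, 108.0])
holt_winters = np.array([101.0, 97.0, 107.0, 111.0])
stepwise_ar  = np.array([103.0, 96.0, 103.0, 109.0])
forecasts = np.vstack([arima, holt_winters, stepwise_ar])

# Simple average: equal weight to each method.
simple = forecasts.mean(axis=0)

# Weighted average: weights inversely proportional to each method's
# past mean squared error (an illustrative scheme).
mse = ((forecasts - actual) ** 2).mean(axis=1)
w = (1 / mse) / (1 / mse).sum()
weighted = w @ forecasts

print("simple  :", simple)
print("weighted:", weighted)
```
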
Date: May 1985
Creator: Simmons, Laurette Poulos
System: The UNT Digital Library
A Goal Programming Safety and Health Standards Compliance Model (open access)

The purpose of this dissertation was to create a safety compliance model which would advance the state of the art of safety compliance models and provide management with a practical tool which can be used in making safety decisions in an environment where multiple objectives exist. A goal programming safety compliance model (OSHA Model) was developed to fulfill this purpose. The objective function of the OSHA Model was designed to minimize the total deviation from the established goals of the model. These model goals were expressed in terms of 1) level of compliance with OSHA safety and health regulations, 2) company accident frequency rate, 3) company accident cost per worker, and 4) a company budgetary restriction. This particular set of goals was selected to facilitate management's fulfillment of its responsibilities to OSHA, the employees, and to ownership. This study concludes that all the research objectives have been accomplished. The OSHA Model formulated not only advances the state of the art of safety compliance models, but also provides a practical tool which facilitates management's safety and health decisions. The insight into the relationships existing in a safety compliance decision system provided by the OSHA Model and its accompanying sensitivity analysis was …
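
A minimal goal programming sketch in the model's spirit, with entirely hypothetical coefficients for compliance, accident-rate, and budget goals; deviation variables capture under- and over-achievement, and only the undesirable deviations are penalized:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical decision variables: x1, x2 = $K spent on two safety programs.
# Goal 1: compliance score   2*x1 + x2        >= 80  (penalize shortfall d1m)
# Goal 2: hazard reduction   .05*x1 + .03*x2  >= 4   (penalize shortfall d2m)
# Goal 3: budget             x1 + x2          <= 50  (penalize overrun  d3p)
# Variables: [x1, x2, d1m, d1p, d2m, d2p, d3m, d3p], all >= 0.
c = np.array([0, 0, 1, 0, 1, 0, 0, 1])   # only undesirable deviations cost

A_eq = np.array([
    [2.00, 1.00, 1, -1, 0,  0, 0,  0],   # 2x1 + x2 + d1m - d1p = 80
    [0.05, 0.03, 0,  0, 1, -1, 0,  0],   # .05x1 + .03x2 + d2m - d2p = 4
    [1.00, 1.00, 0,  0, 0,  0, 1, -1],   # x1 + x2 + d3m - d3p = 50
])
b_eq = np.array([80.0, 4.0, 50.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq)   # default bounds: all variables >= 0
print("spend:", res.x[:2], "total undesirable deviation:", res.fun)
```
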
Date: August 1976
Creator: Ryan, Lanny J.
System: The UNT Digital Library
Reliable Prediction Intervals and Bayesian Estimation for Demand Rates of Slow-Moving Inventory (open access)

Date: August 2007
Creator: Lindsey, Matthew Douglas
System: The UNT Digital Library
Comparing Latent Dirichlet Allocation and Latent Semantic Analysis as Classifiers (open access)

In the Information Age, a proliferation of unstructured electronic text documents exists. Processing these documents by hand is a daunting task, as humans have limited cognitive ability to process large volumes of documents that can often be extremely lengthy. To address this problem, computer algorithms for processing text data are being developed. Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) are two such algorithms that have each received much attention in the text data literature for topic extraction studies, but not for document classification or for comparison studies. Since classification is considered an important human function and has been studied in the areas of cognitive science and information science, in this dissertation a research study was performed to compare LDA, LSA, and humans as document classifiers. The research questions posed in this study are: R1: How accurate are LDA and LSA in classifying documents in a corpus of textual data over a known set of topics? R2: How accurate are humans in performing the same classification task? R3: How does LDA classification performance compare to LSA classification performance? To address these questions, a classification study involving human subjects was designed where humans were asked to generate and classify documents …
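
A minimal sketch of using LDA and LSA outputs as classification features, on a toy corpus; the study's corpus, topic counts, and human-subject comparison are not reproduced here:

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation, TruncatedSVD
from sklearn.linear_model import LogisticRegression

# Toy corpus with two known topics (illustrative only).
docs = [
    "the team won the game with a late goal",
    "players trained hard before the match",
    "the stock market fell as investors sold shares",
    "the central bank raised interest rates again",
]
labels = [0, 0, 1, 1]                  # 0 = sports, 1 = finance

# LDA on raw counts: documents become topic-probability vectors.
counts = CountVectorizer().fit_transform(docs)
lda_feats = LatentDirichletAllocation(
    n_components=2, random_state=0).fit_transform(counts)

# LSA on tf-idf: documents become vectors in a latent semantic space.
tfidf = TfidfVectorizer().fit_transform(docs)
lsa_feats = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# A simple classifier on top of each representation.
for name, feats in [("LDA", lda_feats), ("LSA", lsa_feats)]:
    clf = LogisticRegression().fit(feats, labels)
    print(name, "training accuracy:", clf.score(feats, labels))
```
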
Date: December 2011
Creator: Anaya, Leticia H.
System: The UNT Digital Library
The Impact of Water Pollution Abatement Costs on Financing of Municipal Services in North Central Texas (open access)

The purpose of this study is to determine the effects of water pollution control on financing municipal water pollution control facilities in selected cities in North Central Texas. This objective is accomplished by addressing the following topics: (1) the cost to municipalities of meeting federally mandated water pollution control, (2) the sources of funds for financing sewage treatment, and (3) the financial implications of employing these financing tools to satisfy water quality regulations. The study makes the following conclusions regarding the impact of water pollution control costs on municipalities in the North Central Texas Region: 1) The financing of the wastewater treatment requirements of the Water Pollution Control Act Amendments of 1972 will cause many municipalities to report operating deficits for their Water and Sewer Fund. 2) A federal grant program funded at the rate of 75 per cent of waste treatment needs will prevent operating deficits in the majority of cities in which 1990 waste treatment needs constitute 20 per cent or more of the expected Water and Sewer Fund capital structure. 3) A federal grant program funded at the average rate of 35 per cent of needs will benefit only a small number of cities. 4) The federal …
Date: May 1976
Creator: Rucks, Andrew C.
System: The UNT Digital Library
Financial Leverage and the Cost of Capital (open access)

The objective of the research reported in this dissertation is to conduct an empirical test of the hypothesis that, excluding income tax effects, the cost of capital to a firm is independent of the degree of financial leverage employed by the firm. This hypothesis, set forth by Franco Modigliani and Merton Miller in 1958, represents a challenge to the traditional view on the subject, a challenge which carries implications of considerable importance in the field of finance. The challenge has led to a lengthy controversy which can ultimately be resolved only by subjecting the hypothesis to empirical test. The basis of the test was Modigliani and Miller's Proposition II, a corollary of their fundamental hypothesis. Proposition II, in effect, states that equity investors fully discount any increase in risk due to financial leverage so that there is no possibility for the firm to reduce its cost of capital by employing financial leverage. The results of the research reported in this dissertation do not support that contention. The study indicates that, if equity investors require any increase in premium for increasing financial leverage, the premium required is significantly less than that predicted by the Modigliani-Miller Proposition II, over the range of …
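
Proposition II can be stated in a few lines. The sketch below, with illustrative rates, shows the required return on equity rising linearly with leverage, r_E = r_0 + (r_0 - r_D)(D/E), while the overall cost of capital stays fixed:

```python
# Modigliani-Miller Proposition II (no taxes): equity investors demand a
# premium that exactly offsets leverage, so the weighted average cost of
# capital never falls below r_0. Rates are illustrative.
r0, rd = 0.10, 0.05            # unlevered cost of capital, cost of debt

for d_over_e in (0.0, 0.5, 1.0, 2.0):
    re = r0 + (r0 - rd) * d_over_e
    # With D/E = x: WACC = re * 1/(1+x) + rd * x/(1+x), which reduces to r0.
    wacc = (re + rd * d_over_e) / (1 + d_over_e)
    print(f"D/E {d_over_e:.1f}: r_E {re:.3f}, WACC {wacc:.3f}")
```
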
Date: December 1977
Creator: Brust, Melvin F.
System: The UNT Digital Library
Links among perceived service quality, patient satisfaction and behavioral intentions in the urgent care industry: Empirical evidence from college students. (open access)

Patient perceptions of health care quality are critical to a health care service provider's long-term success because of the significant influence perceptions have on customer satisfaction and consequently organization financial performance. Patient satisfaction affects not only the outcome of the health care process, such as patient compliance with physician advice and treatment, but also patient retention and favorable word-of-mouth. Accordingly, it is a critical strategy for health care organizations to provide quality service and address patient satisfaction. The urgent care (UC) industry is an integral part of the health care system in the United States and has been experiencing rapid growth. UC provides a wide range of medical services for a large group of patients and now serves an increasing population. UC is becoming popular because of convenient locations, extended hours, walk-in policies, short waiting times, and accessibility. A closer examination of the current health care research, however, indicates that there is a paucity of research on urgent care providers. Confronted with the emergence of the urgent care industry and the increasing demand for urgent care, it is necessary to understand how patients perceive urgent care providers and what influences patient satisfaction and retention. This dissertation addresses four …
Date: August 2009
Creator: Qin, Hong
System: The UNT Digital Library
Investigating the relationship between the business performance management framework and the Malcolm Baldrige National Quality Award framework. (open access)

The business performance management (BPM) framework helps an organization continuously adjust and successfully execute its strategies. BPM helps increase flexibility by providing managers with an early alert about changes and, as a result, allows faster response to such changes. The Malcolm Baldrige National Quality Award (MBNQA) framework provides a basis for self-assessment and a systems perspective for managing an organization's key processes for achieving business results. The MBNQA framework is a more comprehensive framework and encapsulates the underlying constructs in the BPM framework. The objectives of this dissertation are fourfold: (1) to validate the underlying relationships presented in the 2008 MBNQA framework, (2) to explore the MBNQA framework at the dimension level, and develop and test constructs measured at that level in a causal model, (3) to validate and create a common general framework for the business performance model by integrating the practitioner literature with basic theory including existing MBNQA theory, and (4) to integrate the BPM framework and the MBNQA framework into a new framework (BPM-MBNQA framework) that can guide organizations in their journey toward achieving and sustaining competitive and strategic advantages. The purpose of this study is to achieve these objectives by means of a combination of methodologies …
Date: August 2009
Creator: Hossain, Muhammad Muazzem
System: The UNT Digital Library