Mathematical Programming Approaches to the Three-Group Classification Problem (open access)

In the last twelve years there has been considerable research interest in mathematical programming approaches to the statistical classification problem, primarily because they do not rely on the distributional assumptions that the parametric methods (Fisher's linear discriminant function, Smith's quadratic discriminant function) require for optimality. This dissertation focuses on the development of mathematical programming models for the three-group classification problem and examines the computational efficiency and classificatory performance of proposed and existing models. The classificatory performance of these models is compared with that of Fisher's linear discriminant function and Smith's quadratic discriminant function. Additionally, this dissertation investigates theoretical characteristics of mathematical programming models for the classification problem with three or more groups. A computationally efficient model for the three-group classification problem is developed. This model directly minimizes the number of misclassifications in the training sample. Furthermore, the classificatory performance of the proposed model is enhanced by the introduction of a two-phase algorithm. The same algorithm can be used to improve the classificatory performance of any interval-based mathematical programming model for the classification problem with three or more groups. A modification to improve the computational efficiency of an existing model is also proposed. In addition, a multiple-group extension of a mathematical programming model …
Date: August 1993
Creator: Loucopoulos, Constantine
System: The UNT Digital Library
A Relationship-based Cross National Customer Decision-making Model in the Service Industry (open access)

In 2012, the CIA World Fact Book showed that the service sector contributed about 76.6% and 51.4% of the 2010 gross national product of the United States and Ghana, respectively. Research in the services area shows that a firm's success in today's competitive business environment is dependent upon its ability to deliver superior service quality. However, these studies have yet to address factors that influence customers to remain committed to a mass service in economically diverse countries. In addition, there is little research on established service quality measures pertaining to the mass service domain. This dissertation applies Rusbult's investment model of relationship commitment and examines its psychological impact on the commitment level of a customer towards a service in two economically diverse countries. In addition, service quality is conceptualized as a hierarchical construct in the mass service (banking) and specific dimensions are developed on which customers assess their quality evaluations. Using PLS path modeling, a structural equation modeling approach to data analysis, service quality as a hierarchical third-order construct was found to have three primary dimensions and six sub-dimensions. The results also established that a country's national economy has a moderating effect on the relationship between service quality and …
Date: August 2013
Creator: Boakye, Kwabena G.
System: The UNT Digital Library
A Simulation Study Comparing Various Confidence Intervals for the Mean of Voucher Populations in Accounting (open access)

This research examined the performance of three parametric methods for confidence intervals: the classical, the Bonferroni, and the bootstrap-t method, as applied to estimating the mean of voucher populations in accounting. Usually auditing populations do not follow standard models. The population for accounting audits generally is a nonstandard mixture distribution in which the audit data set contains a large number of zero values and a comparatively small number of nonzero errors. This study assumed a situation in which only overstatement errors exist. The nonzero errors were assumed to be normally, exponentially, and uniformly distributed. Five indicators of performance were used. The classical method was found to be unreliable. The Bonferroni method was conservative for all population conditions. The bootstrap-t method was excellent in terms of reliability, but the lower limit of the confidence intervals produced by this method was unstable for all population conditions. The classical method provided the shortest average width of the confidence intervals among the three methods. This study provided initial evidence as to how the parametric bootstrap-t method performs when applied to the nonstandard distribution of audit populations of line items. Further research should provide a reliable confidence interval for a wider variety of accounting populations.
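The bootstrap-t procedure examined in this abstract can be sketched in a few lines. The sample below is an illustrative assumption, not the dissertation's audit data: a zero-heavy set of line items with a few normally distributed overstatement errors, studentized over bootstrap resamples.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical voucher sample: 90 zero-error line items plus 10 normally
# distributed overstatement errors (illustrative values only)
sample = np.concatenate([np.zeros(90), np.abs(rng.normal(100.0, 25.0, 10))])

def bootstrap_t_ci(x, n_boot=2000, alpha=0.05):
    """Bootstrap-t (studentized) confidence interval for the mean."""
    n = len(x)
    mean, se = x.mean(), x.std(ddof=1) / np.sqrt(n)
    t_stats = []
    for _ in range(n_boot):
        b = rng.choice(x, size=n, replace=True)
        b_se = b.std(ddof=1) / np.sqrt(n)
        if b_se > 0:  # skip degenerate all-zero resamples
            t_stats.append((b.mean() - mean) / b_se)
    t_hi, t_lo = np.percentile(t_stats, [100 * (1 - alpha / 2), 100 * alpha / 2])
    # Note the reversal: the upper t quantile sets the LOWER limit
    return mean - t_hi * se, mean - t_lo * se

low, high = bootstrap_t_ci(sample)
print(f"bootstrap-t 95% CI for the mean: ({low:.2f}, {high:.2f})")
```

As the abstract notes, the lower limit of such intervals tends to be unstable for zero-heavy populations; re-running with different seeds shows the effect.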
Date: December 1992
Creator: Lee, Ihn Shik
System: The UNT Digital Library
Supply Chain Network Planning for Humanitarian Operations During Seasonal Disasters (open access)

To prevent loss of lives during seasonal disasters, relief agencies distribute critical supplies and provide lifesaving services to the affected populations. Despite agencies' efforts, frequently occurring disasters increase the cost of relief operations. The purpose of our study is to minimize the cost of relief operations, considering that such disasters cause random demand. To achieve this, we have formulated a series of models, which are distinct from the current studies in three ways. First, to the best of our knowledge, we are the first to model perishable and durable products together. Second, we have aggregated multiple products in a different way than current studies do. This unique aggregation requires less data than other types of aggregation. Finally, our models are compatible with the practical data generated by FEMA. Our models offer insights on the impacts of various parameters on optimum cost and order size. The analyses of correlation of demand and quality of information offer interesting insights; for instance, under certain cases, the quality of information does not influence cost. Our study has considered both risk-averse and risk-neutral approaches and provided insights. The insights obtained from our models are expected to help agencies reduce …
Date: May 2013
Creator: Ponnaiyan, Subramaniam
System: The UNT Digital Library
Call Option Premium Dynamics (open access)

This study has a twofold purpose: to demonstrate the use of the Marquardt compromise method in estimating the unknown parameters contained in the probability call-option pricing models and to test empirically the following models: the Boness, the Black-Scholes, the Merton proportional dividend, the Ingersoll differential tax, and the Ingersoll proportional dividend and differential tax.
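For reference, the Black-Scholes model named in the abstract prices a European call on a non-dividend stock as follows. The contract parameters below are hypothetical, and the Marquardt estimation step is not reproduced:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes premium for a European call (no dividends)."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Hypothetical at-the-money contract: spot 100, strike 100, six months
# to maturity, 5% riskless rate, 20% volatility
price = black_scholes_call(S=100, K=100, T=0.5, r=0.05, sigma=0.2)
print(f"call premium: {price:.4f}")
```

The Boness, Merton, and Ingersoll variants tested in the dissertation adjust this formula for dividends and taxes and are not shown here.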
Date: December 1982
Creator: Chen, Jim
System: The UNT Digital Library
Financial Leverage and the Cost of Capital (open access)

The objective of the research reported in this dissertation is to conduct an empirical test of the hypothesis that, excluding income tax effects, the cost of capital to a firm is independent of the degree of financial leverage employed by the firm. This hypothesis, set forth by Franco Modigliani and Merton Miller in 1958, represents a challenge to the traditional view on the subject, a challenge which carries implications of considerable importance in the field of finance. The challenge has led to a lengthy controversy which can ultimately be resolved only by subjecting the hypothesis to empirical test. The basis of the test was Modigliani and Miller's Proposition II, a corollary of their fundamental hypothesis. Proposition II, in effect, states that equity investors fully discount any increase in risk due to financial leverage so that there is no possibility for the firm to reduce its cost of capital by employing financial leverage. The results of the research reported in this dissertation do not support that contention. The study indicates that, if equity investors require any increase in premium for increasing financial leverage, the premium required is significantly less than that predicted by the Modigliani-Miller Proposition II, over the range of …
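Proposition II, as referenced here, is conventionally written as a linear relation between the required return on equity and the debt-equity ratio (standard textbook notation, not taken from the dissertation):

```latex
% MM Proposition II (no taxes): the required return on equity k_e rises
% linearly with leverage D/E, leaving the overall cost of capital k_0
% constant; k_d is the cost of debt.
k_e = k_0 + \left(k_0 - k_d\right)\frac{D}{E}
```

The dissertation's finding is that the observed equity premium rises by significantly less than this linear relation predicts.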
Date: December 1977
Creator: Brust, Melvin F.
System: The UNT Digital Library
Application of Spectral Analysis to the Cycle Regression Algorithm (open access)

Many techniques have been developed to analyze time series. Spectral analysis and cycle regression analysis represent two such techniques. This study combines these two powerful tools to produce two new algorithms: the spectral algorithm and the one-pass algorithm. This research encompasses four objectives. The first objective is to link spectral analysis with cycle regression analysis to determine an initial estimate of the sinusoidal period. The second objective is to determine the best spectral window and truncation point combination to use with cycle regression for the initial estimate of the sinusoidal period. The third objective is to determine whether the new spectral algorithm performs better than the old T-value algorithm in estimating sinusoidal parameters. The fourth objective is to determine whether the one-pass algorithm can be used to estimate all significant harmonics simultaneously.
Date: August 1984
Creator: Shah, Vivek
System: The UNT Digital Library
Validation and Investigation of the Four Aspects of Cycle Regression: A New Algorithm for Extracting Cycles (open access)

The cycle regression analysis algorithm is the most recent addition to a group of techniques developed to detect "hidden periodicities." This dissertation investigates four major aspects of the algorithm. The objectives of this research are: 1. To develop an objective method of obtaining an initial estimate of the cycle period; the present procedure of obtaining this estimate involves considerable subjective judgment. 2. To validate the algorithm's success in extracting cycles from multi-cyclical data. 3. To determine if a consistent relationship exists among the smallest amplitude, the error standard deviation, and the number of replications of a cycle contained in the data. 4. To investigate the behavior of the algorithm in the prediction of major drops.
Date: December 1982
Creator: Mehta, Mayur Ravishanker
System: The UNT Digital Library
The Evaluation and Control of the Changes in Basic Statistics Encountered in Grouped Data (open access)

This dissertation describes the effect that the construction of frequency tables has on basic statistics computed from those frequency tables. It is directly applicable only to normally distributed data summarized by Sturges' Rule. The purpose of this research was to identify factors tending to bias sample statistics when data are summarized, and thus to allow researchers to avoid such bias. The methodology employed was a large scale simulation where 1000 replications of samples of size n = 2ᵏ⁻¹ for k = 2 to 12 were drawn from a normally distributed population with a mean of zero and a standard deviation of one. A FORTRAN IV source listing is included. The report concludes that researchers should avoid the use of statistics computed from frequency tables in cases where raw data are available. Where the use of such statistics is unavoidable, the researchers can eliminate their bias by the use of empirical correction factors provided in the paper. Further research is suggested to determine the effect of summarization of data drawn from various non-normal distributions.
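The simulation design described here can be sketched as follows. The replication count and single sample size below are reduced, illustrative values, and the dissertation's empirical correction factors are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

def sturges_bins(n):
    """Sturges' Rule: about 1 + log2(n) classes."""
    return int(np.ceil(1 + np.log2(n)))

def grouped_stats(x):
    """Mean and standard deviation recomputed from a frequency table,
    treating every observation as if it sat at its class midpoint."""
    counts, edges = np.histogram(x, bins=sturges_bins(len(x)))
    mids = (edges[:-1] + edges[1:]) / 2
    n = counts.sum()
    mean = (counts * mids).sum() / n
    var = (counts * (mids - mean) ** 2).sum() / (n - 1)
    return mean, np.sqrt(var)

# Reduced version of the design: repeated N(0, 1) samples, one sample size
n, reps = 256, 200
sd_bias = []
for _ in range(reps):
    x = rng.normal(0.0, 1.0, n)
    _, sd_g = grouped_stats(x)
    sd_bias.append(sd_g - x.std(ddof=1))
print(f"average bias in the grouped standard deviation: {np.mean(sd_bias):+.4f}")
```

The small positive bias in the grouped standard deviation is the kind of summarization effect the correction factors address.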
Date: May 1979
Creator: Scott, James P.
System: The UNT Digital Library
Derivation of Probability Density Functions for the Relative Differences in the Standard and Poor's 100 Stock Index Over Various Intervals of Time (open access)

In this study a two-part mixed probability density function was derived which described the relative changes in the Standard and Poor's 100 Stock Index over various intervals of time. The density function is a mixture of two different halves of normal distributions. Optimal values for the standard deviations for the two halves and the mean are given. Also, a general form of the function is given which uses linear regression models to estimate the standard deviations and the means. The density functions allow stock market participants trading index options and futures contracts on the S & P 100 Stock Index to determine probabilities of success or failure of trades involving price movements of certain magnitudes in given lengths of time.
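A density of this general shape, two different halves of normal distributions joined at the mean, can be written down directly. The parameters below are hypothetical, not the optimal values reported in the dissertation:

```python
import numpy as np

def two_half_normal_pdf(x, mu, sd_lo, sd_hi):
    """Density made of two different halves of normal distributions:
    sd_lo governs the half below mu, sd_hi the half above it; the
    constant c makes the two halves together integrate to one."""
    c = 2.0 / (np.sqrt(2.0 * np.pi) * (sd_lo + sd_hi))
    sd = np.where(x < mu, sd_lo, sd_hi)
    return c * np.exp(-((x - mu) ** 2) / (2.0 * sd ** 2))

# Hypothetical parameters: tighter spread below the mean than above it
xs = np.linspace(-4.0, 10.0, 4001)
pdf = two_half_normal_pdf(xs, mu=0.0, sd_lo=0.5, sd_hi=1.5)
area = float(np.sum(pdf) * (xs[1] - xs[0]))  # Riemann-sum check
print(f"area under the density: {area:.4f}")
```

Integrating the upper or lower tail of such a density gives the probability of a price move of a given magnitude, which is the trading application the abstract describes.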
Date: August 1988
Creator: Bunger, R. C. (Robert Charles)
System: The UNT Digital Library
The Normal Curve Approximation to the Hypergeometric Probability Distribution (open access)

The classical normal curve approximation to cumulative hypergeometric probabilities requires that the standard deviation of the hypergeometric distribution be larger than three, which limits the usefulness of the approximation for small populations. The purposes of this study are to develop clearly defined rules which specify when the normal curve approximation to the cumulative hypergeometric probability distribution may be successfully utilized and to determine where maximum absolute differences between the cumulative hypergeometric and normal curve approximation of 0.01 and 0.05 occur in relation to the proportion of the population sampled.
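The quantity under study, the maximum absolute difference between the cumulative hypergeometric distribution and its normal approximation, can be computed directly for any one population. The population values below are illustrative, chosen so the hypergeometric standard deviation falls below three, the regime the study targets:

```python
import numpy as np
from scipy.stats import hypergeom, norm

# Illustrative small population: N items, K of one kind, sample of n
N, K, n = 50, 20, 10
p = K / N
mean = n * p
sd = np.sqrt(n * p * (1 - p) * (N - n) / (N - 1))  # finite-population sd

# Max absolute difference between the cumulative hypergeometric and its
# continuity-corrected normal approximation
xs = np.arange(0, n + 1)
exact = hypergeom.cdf(xs, N, K, n)
approx = norm.cdf((xs + 0.5 - mean) / sd)
print(f"sd = {sd:.3f}, max |difference| = {np.max(np.abs(exact - approx)):.4f}")
```

Sweeping such a computation over sampling fractions is one way to locate where the 0.01 and 0.05 difference levels occur.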
Date: December 1981
Creator: Willman, Edward N. (Edward Nicholas)
System: The UNT Digital Library
Economic Statistical Design of Inverse Gaussian Distribution Control Charts (open access)

Statistical quality control (SQC) is one technique companies are using in the development of a Total Quality Management (TQM) culture. Shewhart control charts, a widely used SQC tool, rely on an underlying normal distribution of the data. Often data are skewed. The inverse Gaussian distribution is a probability distribution that is well-suited to handling skewed data. This analysis develops models and a set of tools usable by practitioners for the constrained economic statistical design of control charts for inverse Gaussian distribution process centrality and process dispersion. The use of this methodology is illustrated by the design of an x-bar chart and a V chart for an inverse Gaussian distributed process.
Date: August 1990
Creator: Grayson, James M. (James Morris)
System: The UNT Digital Library
Developing Criteria for Extracting Principal Components and Assessing Multiple Significance Tests in Knowledge Discovery Applications (open access)

With advances in computer technology, organizations are able to store large amounts of data in data warehouses. There are two fundamental issues researchers must address: the dimensionality of data and the interpretation of multiple statistical tests. The first issue addressed by this research is the determination of the number of components to retain in principal components analysis. This research establishes regression, asymptotic theory, and neural network approaches for estimating mean and 95th percentile eigenvalues for implementing Horn's parallel analysis procedure for retaining components. Certain methods perform better for specific combinations of sample size and numbers of variables. The adjusted normal order statistic estimator (ANOSE), an asymptotic procedure, performs the best overall. Future research is warranted on combining methods to increase accuracy. The second issue involves interpreting multiple statistical tests. This study uses simulation to show that Parker and Rothenberg's technique using a density function with a mixture of betas to model p-values is viable for p-values from central and non-central t distributions. The simulation study shows that final estimates obtained in the proposed mixture approach reliably estimate the true proportion of the distributions associated with the null and nonnull hypotheses. Modeling the density of p-values allows for better control of …
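Horn's parallel analysis, the retention procedure named above, compares observed eigenvalues against a percentile of eigenvalues obtained from random data of the same shape. A minimal sketch with simulated two-factor data follows; the data and settings are illustrative, and the study's regression, asymptotic, and neural network estimators of the percentile eigenvalues are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

def parallel_analysis(data, n_sims=200, percentile=95):
    """Horn's parallel analysis: retain each component whose observed
    eigenvalue exceeds the chosen percentile of eigenvalues obtained
    from uncorrelated random data of the same dimensions."""
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sim = np.empty((n_sims, p))
    for i in range(n_sims):
        r = rng.normal(size=(n, p))
        sim[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresholds = np.percentile(sim, percentile, axis=0)
    return int(np.sum(obs > thresholds))

# Simulated data with two underlying factors driving six variables
f1, f2 = rng.normal(size=(300, 1)), rng.normal(size=(300, 1))
data = np.hstack([f1 + 0.3 * rng.normal(size=(300, 3)),
                  f2 + 0.3 * rng.normal(size=(300, 3))])
k = parallel_analysis(data)
print("components to retain:", k)
```

The simulation loop is what the study's estimators replace: they predict the mean and 95th percentile eigenvalues directly from the sample size and number of variables.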
Date: August 1999
Creator: Keeling, Kellie Bliss
System: The UNT Digital Library
The Impact of Water Pollution Abatement Costs on Financing of Municipal Services in North Central Texas (open access)

The purpose of this study is to determine the effects of water pollution control on financing municipal water pollution control facilities in selected cities in North Central Texas. This objective is accomplished by addressing the following topics: (1) the cost to municipalities of meeting federally mandated water pollution control, (2) the sources of funds for financing sewage treatment, and (3) the financial implications of employing these financing tools to satisfy water quality regulations. The study makes the following conclusions regarding the impact of water pollution control costs on municipalities in the North Central Texas Region: 1) The financing of the wastewater treatment requirements of the Water Pollution Control Act Amendments of 1972 will cause many municipalities to report operating deficits for their Water and Sewer Fund. 2) A federal grant program funded at the rate of 75 per cent of waste treatment needs will prevent operating deficits in the majority of cities in which 1990 waste treatment needs constitute 20 per cent or more of the expected Water and Sewer Fund capital structure. 3) A federal grant program funded at the average rate of 35 per cent of needs will benefit only a small number of cities. 4) The federal …
Date: May 1976
Creator: Rucks, Andrew C.
System: The UNT Digital Library
A Quantitative Approach to Medical Decision Making (open access)

The purpose of this study is to develop a technique by which a physician may use a predetermined data base to derive a preliminary diagnosis for a patient with a given set of symptoms. The technique will not yield an absolute diagnosis, but rather will point the way to a set of most likely diseases upon which the physician may concentrate his efforts. There will be no reliance upon a data base compiled from poorly kept medical records with non-standardized terminology. While this study produces a workable tool for the physician to use in the process of medical diagnosis, the ultimate responsibility for the patient's welfare must still rest with the physician.
Date: May 1975
Creator: Meredith, John W.
System: The UNT Digital Library
A Model for the Efficient Investment of Temporary Funds by Corporate Money Managers (open access)

In this study seventeen various relationships between yields of three-month, six-month, and twelve-month maturity negotiable CD's and U.S. Government T-Bills were analyzed to find a leading indicator of short-term interest rates. Each of the seventeen relationships was tested for correlation with actual three-, six-, and twelve-month yields from zero to twenty-six weeks in the future. Only one relationship was found to be significant as a leading indicator. This was the twelve-month yield minus the six-month yield adjusted for scale and accumulated where the result was positive. This indicator (variable nineteen in the study) was further tested for usefulness as a trend indicator by transforming it into a function consisting of +1 (when its slope was positive), 0 (when its slope was zero), and -1 (when its slope was negative). Stage II of the study consisted of constructing a computer-aided model employing variable nineteen as a forecasting device. The model accepts a week-by-week minimum cash balance forecast, and the past thirteen weeks' yields of three-, six-, and twelve-month CD's as input. The output of the model consists of a cash time availability schedule, a numerical listing of variable nineteen values, the thirteen-week history of three-, six-, and twelve-month CD yields, a …
Date: August 1974
Creator: McWilliams, Donald B., 1936-
System: The UNT Digital Library
The Impact of Culture on the Decision Making Process in Restaurants (open access)

Understanding how consumers behave at key purchasing decision points can be the margin between success and failure for any business. The cultural differences in the factors that affect consumers during the decision-making process are the motivation of this research. The purpose of this research is to extend the current body of knowledge about decision-making factors by developing and testing a new theoretical model to measure how culture may affect the attitudes and behaviors of consumers in restaurants. This study has its theoretical foundation in the theory of service quality, the theory of planned behavior, and rational choice theory. To understand how culture affects the decision-making process and perceived satisfaction, it is necessary to analyze the relationships among the decision factors and attitudes. The findings of this study contribute by building theory and having practical implications for restaurant owners and managers. This study employs a mixed methodology of qualitative and quantitative research. More specifically, the methodologies employed include the development of a framework and testing of that framework via collection of data using semi-structured interviews and a survey instrument. Considering this framework, we test culture as a moderating relationship by using respondents’ birth country, parents’ birth country and ethnic identity. The results …
Date: August 2015
Creator: Boonme, Kittipong
System: The UNT Digital Library
Robustness of Parametric and Nonparametric Tests When Distances between Points Change on an Ordinal Measurement Scale (open access)

The purpose of this research was to evaluate the effect on parametric and nonparametric tests using ordinal data when the distances between points changed on the measurement scale. The research examined the performance of Type I and Type II error rates using selected parametric and nonparametric tests.
Date: August 1994
Creator: Chen, Andrew H. (Andrew Hwa-Fen)
System: The UNT Digital Library
Robustness of the One-Sample Kolmogorov Test to Sampling from a Finite Discrete Population (open access)

One of the most useful and best-known goodness-of-fit tests is the Kolmogorov one-sample test. The assumptions for the Kolmogorov one-sample test are: 1. A random sample; 2. A continuous random variable; 3. F(x) is a completely specified hypothesized cumulative distribution function. The Kolmogorov one-sample test has a wide range of applications. Knowing the effect from using the test when an assumption is not met is of practical importance. The purpose of this research is to analyze the robustness of the Kolmogorov one-sample test to sampling from a finite discrete distribution. The standard tables for the Kolmogorov test are derived based on sampling from a theoretical continuous distribution. As such, the theoretical distribution is infinite. The standard tables do not include a method or adjustment factor to estimate the effect on table values for statistical experiments where the sample stems from a finite discrete distribution without replacement. This research provides an extension of the Kolmogorov test when the hypothesized distribution function is finite and discrete, and the sampling distribution is based on sampling without replacement. An investigative study has been conducted to explore possible tendencies and relationships in the distribution of Dn when sampling with and without replacement …
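The tendency under study, how Dn behaves when sampling a finite discrete population with versus without replacement, can be explored with a small simulation. The population, sample size, and replication count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Finite discrete population: the integers 1..N with a uniform
# hypothesized distribution, F(x) = x / N (illustrative sizes)
N, n, reps = 20, 10, 2000
population = np.arange(1, N + 1)

def kolmogorov_dn(sample, N):
    """Two-sided D_n statistic against the discrete uniform cdf on 1..N."""
    x = np.sort(sample)
    F = x / N
    ecdf_hi = np.arange(1, len(x) + 1) / len(x)
    ecdf_lo = np.arange(0, len(x)) / len(x)
    return max(np.max(ecdf_hi - F), np.max(F - ecdf_lo))

d_with = [kolmogorov_dn(rng.choice(population, n, replace=True), N)
          for _ in range(reps)]
d_wout = [kolmogorov_dn(rng.choice(population, n, replace=False), N)
          for _ in range(reps)]
print(f"mean D_n, with replacement:    {np.mean(d_with):.3f}")
print(f"mean D_n, without replacement: {np.mean(d_wout):.3f}")
```

Under these assumptions the without-replacement statistic runs systematically smaller, which is the kind of tendency the standard tables, built for the continuous and effectively infinite case, do not account for.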
Date: December 1996
Creator: Tucker, Joanne M. (Joanne Morris)
System: The UNT Digital Library
Comparing the Powers of Several Proposed Tests for Testing the Equality of the Means of Two Populations When Some Data Are Missing (open access)

In comparing the means of two normally distributed populations with unknown variance, two tests very often used are the two-independent-sample and the paired-sample t tests. There is a possible gain in the power of the significance test from using the paired-sample design instead of the two-independent-sample design.
Date: May 1994
Creator: Dunu, Emeka Samuel
System: The UNT Digital Library
Classification by Neural Network and Statistical Models in Tandem: Does Integration Enhance Performance? (open access)

The major purposes of the current research are twofold. The first purpose is to present a composite approach to the general classification problem by using outputs from various parametric statistical procedures and neural networks. The second purpose is to compare several parametric and neural network models on a transportation planning related classification problem and five simulated classification problems.
Date: December 1998
Creator: Mitchell, David
System: The UNT Digital Library
The Effect of Certain Modifications to Mathematical Programming Models for the Two-Group Classification Problem (open access)

This research examines certain modifications of the mathematical programming models to improve their classificatory performance. These modifications involve the inclusion of second-order terms and secondary goals in mathematical programming models. A Monte Carlo simulation study is conducted to investigate the performance of two standard parametric models and various mathematical programming models, including the MSD (minimize sum of deviations) model, the MIP (mixed integer programming) model and the hybrid linear programming model.
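The MSD model named above can be written as a linear program: minimize the total deviation of points lying on the wrong side of the discriminant boundary. A minimal two-group sketch follows, with illustrative data and a fixed-gap normalization (one of several normalizations used in this literature, not necessarily the dissertation's exact formulation):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)

# Two illustrative groups in two dimensions
A = rng.normal([0.0, 0.0], 1.0, size=(30, 2))
B = rng.normal([3.0, 3.0], 1.0, size=(30, 2))

def msd_discriminant(A, B, eps=1.0):
    """MSD model: find weights w and cutoff c minimizing the sum of
    deviations d_i of points from their group's side of the boundary.
    Group A should satisfy w.x <= c - eps, group B should satisfy
    w.x >= c + eps; eps fixes the gap to rule out the trivial w = 0."""
    nA, p = A.shape
    n = nA + B.shape[0]
    cost = np.concatenate([np.zeros(p + 1), np.ones(n)])  # minimize sum(d)
    rows = []
    for i, x in enumerate(A):            # w.x - c - d_i <= -eps
        r = np.zeros(p + 1 + n)
        r[:p], r[p], r[p + 1 + i] = x, -1.0, -1.0
        rows.append(r)
    for j, x in enumerate(B):            # -w.x + c - d_j <= -eps
        r = np.zeros(p + 1 + n)
        r[:p], r[p], r[p + 1 + nA + j] = -x, 1.0, -1.0
        rows.append(r)
    bounds = [(None, None)] * (p + 1) + [(0, None)] * n
    res = linprog(cost, A_ub=np.array(rows), b_ub=-eps * np.ones(n),
                  bounds=bounds)
    return res.x[:p], res.x[p]

w, c = msd_discriminant(A, B)
acc = (np.sum(A @ w < c) + np.sum(B @ w > c)) / (len(A) + len(B))
print(f"training accuracy: {acc:.2f}")
```

The MIP model replaces the continuous deviations with binary misclassification indicators, and the second-order modifications studied here add cross-product and squared terms to the attribute vector.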
Date: May 1994
Creator: Wanarat, Pradit
System: The UNT Digital Library
A Heuristic Procedure for Specifying Parameters in Neural Network Models for Shewhart X-bar Control Chart Applications (open access)

This study develops a heuristic procedure for specifying parameters for a neural network configuration (learning rate, momentum, and the number of neurons in a single hidden layer) in Shewhart X-bar control chart applications. Also, this study examines the replicability of the neural network solution when the neural network is retrained several times with different initial weights.
Date: December 1993
Creator: Nam, Kyungdoo T.
System: The UNT Digital Library
The Fixed v. Variable Sampling Interval Shewhart X-Bar Control Chart in the Presence of Positively Autocorrelated Data (open access)

This study uses simulation to examine differences between fixed sampling interval (FSI) and variable sampling interval (VSI) Shewhart X-bar control charts for processes that produce positively autocorrelated data. The influence of sample size (1 and 5), autocorrelation parameter, shift in process mean, and length of time between samples is investigated by comparing the average time to signal (ATS) and average number of samples to signal (ANSS) for FSI and VSI Shewhart X-bar charts. These comparisons are conducted in two ways: control chart limits pre-set at ±3σ_x / √n and limits computed from the sampling process. Proper interpretation of the Shewhart X-bar chart requires the assumption that observations are statistically independent; however, process data are often autocorrelated over time. Results of this study indicate that increasing the time between samples decreases the effect of positive autocorrelation between samples. Thus, with sufficient time between samples the assumption of independence is essentially not violated. Samples of size 5 produce a faster signal than samples of size 1 with both the FSI and VSI Shewhart X-bar chart when positive autocorrelation is present. However, samples of size 5 require the same time when the data are independent, indicating that this effect is a …
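The effect of positive autocorrelation on an X-bar chart with pre-set ±3-sigma limits can be illustrated with a small AR(1) simulation. The parameters below are illustrative, subgroups are taken as consecutive observations, and the ATS/ANSS machinery of the study is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(3)

def ar1(phi, size):
    """Positively autocorrelated AR(1) series with unit marginal variance."""
    x = np.empty(size)
    x[0] = rng.normal()
    for t in range(1, size):
        x[t] = phi * x[t - 1] + rng.normal(scale=np.sqrt(1.0 - phi ** 2))
    return x

def false_alarm_rate(phi, n=5, n_subgroups=50_000):
    """Fraction of in-control X-bar points beyond pre-set limits of
    +/- 3 sigma_x / sqrt(n), which assume independent observations."""
    data = ar1(phi, n_subgroups * n).reshape(n_subgroups, n)
    xbar = data.mean(axis=1)
    return float(np.mean(np.abs(xbar) > 3.0 / np.sqrt(n)))

rates = [false_alarm_rate(phi) for phi in (0.0, 0.5, 0.9)]
for phi, rate in zip((0.0, 0.5, 0.9), rates):
    print(f"phi = {phi:.1f}: false-alarm rate {rate:.4f}")
```

With consecutive observations in each subgroup, positive autocorrelation inflates the variance of X-bar well beyond σ²/n, so the nominal 0.0027 false-alarm rate is badly exceeded; spacing observations further apart in time, as the study finds, reduces the effect.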
Date: May 1993
Creator: Harvey, Martha M. (Martha Mattern)
System: The UNT Digital Library