37 Matching Results

Derivation of Probability Density Functions for the Relative Differences in the Standard and Poor's 100 Stock Index Over Various Intervals of Time (open access)

In this study a two-part mixed probability density function was derived which described the relative changes in the Standard and Poor's 100 Stock Index over various intervals of time. The density function is a mixture of two different halves of normal distributions. Optimal values for the standard deviations for the two halves and the mean are given. Also, a general form of the function is given which uses linear regression models to estimate the standard deviations and the means. The density functions allow stock market participants trading index options and futures contracts on the S & P 100 Stock Index to determine probabilities of success or failure of trades involving price movements of certain magnitudes in given lengths of time.
Date: August 1988
Creator: Bunger, R. C. (Robert Charles)
System: The UNT Digital Library
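
As an illustration of the kind of density described in the abstract above, the following is a minimal Python sketch of a two-piece density built from two half-normals joined at a common mode with different left and right standard deviations. The parameter values and the integration interval are placeholders, not the study's fitted estimates.

    import numpy as np
    from scipy.integrate import quad

    def split_normal_pdf(x, mu, sigma_left, sigma_right):
        """Density built from two half-normals joined at the mode mu.
        Normalized so the pieces meet continuously and integrate to 1."""
        scale = np.where(np.asarray(x, dtype=float) < mu, sigma_left, sigma_right)
        height = np.sqrt(2.0 / np.pi) / (sigma_left + sigma_right)
        return height * np.exp(-0.5 * ((x - mu) / scale) ** 2)

    # Probability that the relative change over the interval lies in [-2%, +3%],
    # using placeholder parameters (mu, sigma_left, sigma_right).
    p, _ = quad(split_normal_pdf, -0.02, 0.03, args=(0.0005, 0.011, 0.013))
    print(f"P(-2% <= relative change <= 3%) = {p:.3f}")

For other time intervals, the study's regression-based generalization would supply different means and standard deviations to the same functional form.
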
The Normal Curve Approximation to the Hypergeometric Probability Distribution (open access)

The classical normal curve approximation to cumulative hypergeometric probabilities requires that the standard deviation of the hypergeometric distribution be larger than three, which limits the usefulness of the approximation for small populations. The purposes of this study are to develop clearly defined rules which specify when the normal curve approximation to the cumulative hypergeometric probability distribution may be successfully utilized and to determine where maximum absolute differences between the cumulative hypergeometric and normal curve approximation of 0.01 and 0.05 occur in relation to the proportion of the population sampled.
Date: December 1981
Creator: Willman, Edward N. (Edward Nicholas)
System: The UNT Digital Library
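
As a quick numerical companion to the abstract above, the sketch below compares the exact cumulative hypergeometric probability with a continuity-corrected normal approximation for one arbitrary parameter set; the population, sample, and cutoff values are placeholders, not cases examined in the study.

    import numpy as np
    from scipy.stats import hypergeom, norm

    pop, successes, draws = 500, 150, 100   # arbitrary illustrative values
    k = 35                                  # evaluate P(X <= k)

    # scipy's hypergeom takes (M, n, N) = (population size, successes in population, sample size)
    exact = hypergeom(pop, successes, draws).cdf(k)

    mean = draws * successes / pop
    sigma = np.sqrt(draws * (successes / pop) * (1 - successes / pop) * (pop - draws) / (pop - 1))
    approx = norm.cdf((k + 0.5 - mean) / sigma)   # continuity-corrected normal approximation

    print(f"sigma = {sigma:.2f} (the classical rule asks for sigma > 3)")
    print(f"exact = {exact:.4f}, normal = {approx:.4f}, abs. difference = {abs(exact - approx):.4f}")
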
Economic Statistical Design of Inverse Gaussian Distribution Control Charts (open access)

Statistical quality control (SQC) is one technique companies are using in the development of a Total Quality Management (TQM) culture. Shewhart control charts, a widely used SQC tool, rely on an underlying normal distribution of the data. Often data are skewed. The inverse Gaussian distribution is a probability distribution that is well suited to handling skewed data. This analysis develops models and a set of tools usable by practitioners for the constrained economic statistical design of control charts for inverse Gaussian distribution process centrality and process dispersion. The use of this methodology is illustrated by the design of an x-bar chart and a V chart for an inverse Gaussian distributed process.
Date: August 1990
Creator: Grayson, James M. (James Morris)
System: The UNT Digital Library
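
The snippet below is a minimal sketch of one piece of such a design: probability-limit control limits for subgroup averages of an inverse Gaussian process, using the fact that the mean of n inverse Gaussian observations is again inverse Gaussian with the shape parameter multiplied by n. The process parameters, subgroup size, and false-alarm rate are placeholders; the economic-statistical optimization itself is not reproduced here.

    from scipy.stats import invgauss

    m, lam = 10.0, 40.0      # assumed in-control mean and shape (lambda); placeholders
    n = 5                    # subgroup size
    alpha = 0.0027           # false-alarm rate comparable to 3-sigma Shewhart limits

    # Mean of n iid IG(m, lam) observations is IG(m, n*lam).
    # scipy parameterization: IG(mean=m, shape=lam) == invgauss(mu=m/lam, scale=lam).
    xbar_dist = invgauss(mu=m / (n * lam), scale=n * lam)
    lcl = xbar_dist.ppf(alpha / 2)
    ucl = xbar_dist.ppf(1 - alpha / 2)

    print(f"centerline = {xbar_dist.mean():.3f}, LCL = {lcl:.3f}, UCL = {ucl:.3f}")
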
The Development and Evaluation of a Forecasting System that Incorporates ARIMA Modeling with Autoregression and Exponential Smoothing (open access)

This research was designed to develop and evaluate an automated alternative to the Box-Jenkins method of forecasting. The study involved two major phases. The first phase was the formulation of an automated ARIMA method; the second was the combination of forecasts from the automated ARIMA with forecasts from two other automated methods, the Holt-Winters method and the Stepwise Autoregressive method. The development of the automated ARIMA, based on a decision criterion suggested by Akaike, borrows heavily from the work of Ang, Chua and Fatema. Seasonality and small data set handling were some of the modifications made to the original method to make it suitable for use with a broad range of time series. Forecasts were combined by means of both the simple average and a weighted averaging scheme. Empirical and generated data were employed to perform the forecasting evaluation. The 111 sets of empirical data came from the M-Competition. The twenty-one sets of generated data arose from ARIMA models that Box, Tiao and Pack analyzed using the Box-Jenkins method. To compare the forecasting abilities of the Box-Jenkins and the automated ARIMA alone and in combination with the other two methods, two accuracy measures were used. These measures, which are free …
Date: May 1985
Creator: Simmons, Laurette Poulos
System: The UNT Digital Library
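
The combination step described in the abstract above can be illustrated with a short sketch: forecasts from several methods are merged by a simple average and by a weighted average. The forecasts and error figures are placeholders, and the inverse-error weighting shown is a common scheme rather than necessarily the one used in the study.

    import numpy as np

    # Placeholder one-step-ahead forecasts from three methods (e.g. automated ARIMA,
    # Holt-Winters, stepwise autoregression) and each method's recent mean absolute error.
    forecasts = np.array([102.3, 99.8, 101.1])
    recent_abs_error = np.array([4.1, 2.2, 3.0])

    simple_avg = forecasts.mean()

    weights = 1.0 / recent_abs_error        # weight each method by its inverse recent error
    weights /= weights.sum()
    weighted_avg = float(np.dot(weights, forecasts))

    print(f"simple average   = {simple_avg:.2f}")
    print(f"weighted average = {weighted_avg:.2f} (weights = {np.round(weights, 3)})")
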
A Goal Programming Safety and Health Standards Compliance Model (open access)

The purpose of this dissertation was to create a safety compliance model which would advance the state of the art of safety compliance models and provide management with a practical tool which can be used in making safety decisions in an environment where multiple objectives exist. A goal programming safety compliance model (OSHA Model) was developed to fulfill this purpose. The objective function of the OSHA Model was designed to minimize the total deviation from the established goals of the model. These model goals were expressed in terms of 1) level of compliance to OSHA safety and health regulations, 2) company accident frequency rate, 3) company accident cost per worker, and 4) a company budgetary restriction. This particular set of goals was selected to facilitate management's fulfillment of its responsibilities to OSHA, the employees, and to ownership. This study concludes that all the research objectives have been accomplished. The OSHA Model formulated not only advances the state of the art of safety compliance models, but also provides a practical tool which facilitates management's safety and health decisions. The insight into the relationships existing in a safety compliance decision system provided by the OSHA Model and its accompanying sensitivity analysis was …
Date: August 1976
Creator: Ryan, Lanny J.
System: The UNT Digital Library
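
The structure of such a goal program can be sketched as a small linear program: each goal gets under- and over-attainment deviation variables, the objective minimizes the unwanted deviations, and the budget enters as a hard constraint. All coefficients and targets below are illustrative placeholders, not the OSHA Model's data.

    from scipy.optimize import linprog

    # Decision variables x1, x2: effort put into two safety programs (in $1,000s).
    # Deviation variables n1, p1 and n2, p2: under-/over-attainment of each goal.
    # Variable order: [x1, x2, n1, p1, n2, p2].
    c = [0, 0, 1, 0, 1, 0]                      # minimize total underachievement n1 + n2

    A_eq = [[3, 1, 1, -1, 0, 0],                # 3*x1 +   x2 + n1 - p1 = 60  (compliance goal)
            [1, 2, 0, 0, 1, -1]]                #   x1 + 2*x2 + n2 - p2 = 40  (accident-rate goal)
    b_eq = [60, 40]

    A_ub = [[1, 1, 0, 0, 0, 0]]                 # x1 + x2 <= 25  (hard budget restriction)
    b_ub = [25]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
    x1, x2, n1, p1, n2, p2 = res.x
    print(f"x1 = {x1:.1f}, x2 = {x2:.1f}, unmet compliance = {n1:.1f}, unmet accident goal = {n2:.1f}")

Goal weights or preemptive priorities, and a sensitivity analysis over the targets, would be layered on top of this basic structure.
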
The Establishment of Helicopter Subsystem Design-to-Cost Estimates by Use of Parametric Cost Estimating Models (open access)

The purpose of this research was to develop parametric Design-to-Cost models for selected major subsystems of certain helicopters. This was accomplished by analyzing the relationships between historical production costs and certain design parameters which are available during the preliminary design phase of the life cycle. Several potential contributions are identified in the areas of academia, government, and industry. Application of the cost models will provide estimates beneficial to the government and DoD by allowing derivation of realistic Design-to-Cost estimates. In addition, companies in the helicopter industry will benefit by using the models for two key purposes: (1) optimizing helicopter design through cost-effective tradeoffs, and (2) justifying a proposal estimate.
Date: August 1979
Creator: Gilliland, Johnny J.
System: The UNT Digital Library
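
A minimal sketch of a parametric cost estimating relationship of this general kind follows: historical subsystem costs are regressed on preliminary-design parameters in log space, and the fitted relationship then yields a Design-to-Cost estimate for a candidate design. All numbers are hypothetical placeholders for illustration, not helicopter program data.

    import numpy as np

    # Hypothetical history: subsystem cost ($M) vs. two design parameters known at
    # preliminary design (e.g. weight in lb and installed power in shp). Not real data.
    weight = np.array([850, 1200, 1600, 2100, 2600, 3300], dtype=float)
    power = np.array([650, 900, 1300, 1700, 2000, 2600], dtype=float)
    cost = np.array([1.1, 1.6, 2.3, 3.1, 3.8, 5.0])

    # Log-linear cost estimating relationship: ln(cost) = b0 + b1*ln(weight) + b2*ln(power)
    X = np.column_stack([np.ones_like(weight), np.log(weight), np.log(power)])
    b, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)

    def estimate_cost(w, p):
        """Design-to-Cost estimate for a candidate design point (w lb, p shp)."""
        return float(np.exp(b[0] + b[1] * np.log(w) + b[2] * np.log(p)))

    print(f"estimate for a 1,900 lb / 1,500 shp design: ${estimate_cost(1900, 1500):.2f}M")
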
The Comparative Effects of Varying Cell Sizes on McNemar's Test with the Χ^2 Test of Independence and T Test for Related Samples (open access)

This study compared the results for McNemar's test, the t test for related measures, and the chi-square test of independence as cell sizes varied in a two-by-two frequency table. In this study, the probability results for McNemar's test, the t test for related measures, and the chi-square test of independence were compared for 13,310 different combinations of cell sizes in a two-by-two design. Several conclusions were reached: With very few exceptions, the t test for related measures and McNemar's test yielded probability results within .002 of each other. The chi-square test seemed to equal the other two tests consistently only when low probabilities less than or equal to .001 were attained. It is recommended that the researcher consider using the t test for related measures as a viable option for McNemar's test except when the researcher is certain he/she is only interested in 'changes'. The chi-square test of independence not only tests a different hypothesis than McNemar's test, but it often yields greatly differing results from McNemar's test.
Date: August 1980
Creator: Black, Kenneth U.
System: The UNT Digital Library
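
To make the distinction in the abstract above concrete, the sketch below computes McNemar's statistic (which uses only the discordant cells, i.e. the 'changes') and the chi-square test of independence on the same two-by-two table; the counts are placeholders.

    import numpy as np
    from scipy.stats import chi2, chi2_contingency

    # Paired before/after responses (placeholder counts):
    #                 after: yes   after: no
    # before: yes        30           12
    # before: no          5           40
    table = np.array([[30, 12],
                      [5, 40]])
    b, c = table[0, 1], table[1, 0]         # discordant cells: cases that changed

    mcnemar_stat = (b - c) ** 2 / (b + c)   # McNemar's test, no continuity correction
    mcnemar_p = chi2.sf(mcnemar_stat, df=1)

    indep_stat, indep_p, _, _ = chi2_contingency(table)   # tests association, not change

    print(f"McNemar:      chi2 = {mcnemar_stat:.2f}, p = {mcnemar_p:.4f}")
    print(f"independence: chi2 = {indep_stat:.2f}, p = {indep_p:.4f}")
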
The Impact of Water Pollution Abatement Costs on Financing of Municipal Services in North Central Texas (open access)

The purpose of this study is to determine the effects of water pollution control on financing municipal water pollution control facilities in selected cities in North Central Texas. This objective is accomplished by addressing the following topics: (1) the cost to municipalities of meeting federally mandated water pollution control, (2) the sources of funds for financing sewage treatment, and (3) the financial implications of employing these financing tools to satisfy water quality regulations. The study makes the following conclusions regarding the impact of water pollution control costs on municipalities in the North Central Texas Region: 1) The financing of the wastewater treatment requirements of the Water Pollution Control Act Amendments of 1972 will cause many municipalities to report operating deficits for their Water and Sewer Fund. 2) A federal grant program funded at the rate of 75 per cent of waste treatment needs will prevent operating deficits in the majority of cities in which 1990 waste treatment needs constitute 20 per cent or more of the expected Water and Sewer Fund capital structure. 3) A federal grant program funded at the average rate of 35 per cent of needs will benefit only a small number of cities. 4) The federal …
Date: May 1976
Creator: Rucks, Andrew C.
System: The UNT Digital Library
A Quantitative Approach to Medical Decision Making (open access)

The purpose of this study is to develop a technique by which a physician may use a predetermined data base to derive a preliminary diagnosis for a patient with a given set of symptoms. The technique will not yield an absolute diagnosis, but rather will point the way to a set of most likely diseases upon which the physician may concentrate his efforts. There will be no reliance upon a data base compiled from poorly kept medical records with non-standardized terminology. While this study produces a workable tool for the physician to use in the process of medical diagnosis, the ultimate responsibility for the patient's welfare must still rest with the physician.
Date: May 1975
Creator: Meredith, John W.
System: The UNT Digital Library
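
The abstract does not name the underlying technique, so the sketch below is only one plausible formulation of a symptom-driven ranking, not the study's method: candidate diseases are scored by a naive-Bayes style posterior computed from a predetermined symptom-by-disease database. All disease labels, prevalences, and conditional probabilities are hypothetical placeholders.

    import numpy as np

    diseases = ["disease_A", "disease_B", "disease_C"]          # hypothetical labels
    priors = np.array([0.05, 0.15, 0.80])                       # assumed prevalences
    # P(symptom present | disease); rows = symptoms, columns = diseases (placeholders)
    p_symptom = np.array([[0.90, 0.30, 0.05],                   # symptom 1
                          [0.70, 0.60, 0.10],                   # symptom 2
                          [0.20, 0.80, 0.02]])                  # symptom 3
    observed = np.array([1, 1, 0])                              # symptoms 1 and 2 present

    # Multiply P(symptom | disease) for present symptoms, 1 - P for absent ones,
    # weight by the prior, and normalize to get a ranked list of candidate diseases.
    likelihood = np.prod(np.where(observed[:, None] == 1, p_symptom, 1 - p_symptom), axis=0)
    posterior = priors * likelihood
    posterior /= posterior.sum()

    for disease, prob in sorted(zip(diseases, posterior), key=lambda t: -t[1]):
        print(f"{disease}: {prob:.3f}")
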
A Model for the Efficient Investment of Temporary Funds by Corporate Money Managers (open access)

In this study seventeen various relationships between yields of three-month, six-month, and twelve-month maturity negotiable CD's and U.S. Government T-Bills were analyzed to find a leading indicator of short-term interest rates. Each of the seventeen relationships was tested for correlation with actual three-, six-, and twelve-month yields from zero to twenty-six weeks in the future. Only one relationship was found to be significant as a leading indicator. This was the twelve-month yield minus the six-month yield adjusted for scale and accumulated where the result was positive. This indicator (variable nineteen in the study) was further tested for usefulness as a trend indicator by transforming it into a function consisting of +1 (when its slope was positive), 0 (when its slope was zero), and -1 (when its slope was negative). Stage II of the study consisted of constructing a computer-aided model employing variable nineteen as a forecasting device. The model accepts a week-by-week minimum cash balance forecast, and the past thirteen weeks' yields of three-, six-, and twelve-month CD's as input. The output of the model consists of a cash time availability schedule, a numerical listing of variable nineteen values, the thirteen-week history of three-, six-, and twelve-month CD yields, a …
Date: August 1974
Creator: McWilliams, Donald B., 1936-
System: The UNT Digital Library
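
The construction of the indicator described above (the twelve-month yield minus the six-month yield, accumulated where positive, then reduced to a +1/0/-1 trend signal) can be sketched as follows; the weekly yields and the scale factor are placeholders, not market data or the study's actual adjustment.

    import numpy as np

    # Placeholder weekly yields (percent) for 6- and 12-month CDs.
    y6 = np.array([5.10, 5.15, 5.18, 5.25, 5.30, 5.28, 5.26, 5.31, 5.40, 5.42])
    y12 = np.array([5.30, 5.38, 5.45, 5.55, 5.62, 5.58, 5.52, 5.60, 5.72, 5.76])

    scale = 1.0                                                # placeholder scale adjustment
    spread = (y12 - y6) * scale
    indicator = np.cumsum(np.where(spread > 0, spread, 0.0))   # accumulate positive spreads

    # Trend signal: +1, 0, or -1 according to the sign of the week-to-week slope.
    signal = np.sign(np.diff(indicator)).astype(int)

    print("indicator:", np.round(indicator, 2))
    print("trend signal:", signal)
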
The Impact of Culture on the Decision Making Process in Restaurants (open access)

Understanding the process of consumers during key purchasing decision points is the margin between success and failure for any business. The cultural differences in the factors that affect consumers in their decision-making process are the motivation for this research. The purpose of this research is to extend the current body of knowledge about decision-making factors by developing and testing a new theoretical model to measure how culture may affect the attitudes and behaviors of consumers in restaurants. This study has its theoretical foundation in the theory of service quality, theory of planned behavior, and rational choice theory. To understand how culture affects the decision-making process and perceived satisfaction, it is necessary to analyze the relationships among the decision factors and attitudes. The findings of this study contribute by building theory and having practical implications for restaurant owners and managers. This study employs a mixed methodology of qualitative and quantitative research. More specifically, the methodologies employed include the development of a framework and testing of that framework via collection of data using semi-structured interviews and a survey instrument. Considering this framework, we test culture as a moderating relationship by using respondents’ birth country, parents’ birth country and ethnic identity. The results …
Date: August 2015
Creator: Boonme, Kittipong
System: The UNT Digital Library
The Effect of Value Co-creation and Service Quality on Customer Satisfaction and Commitment in Healthcare Management (open access)

Despite much interest in service quality and various other service quality measures, scholars appear to have overlooked the overall concept of quality. More specifically, previous research has yet to integrate the effect of the customer network and customer knowledge into the measurement of quality. In this work, it is posited that the evaluation of quality is based on both the delivered value from the provider as well as the value developed from the relationships among customers and between customers and providers. This research examines quality as a broad and complex issue, and uses the “Big Quality” concept within the context of routine healthcare service. The last few decades have witnessed interest and activities surrounding the subject of quality and value co-creation. These are core features of Service-Dominant (S-D) logic theory. In this theory, the customer is a collaborative partner who co-creates value with the firm. Customers create value through the strength of their relations and network, and they take a central role in value actualization as value co-creator. I propose to examine the relationship between quality and the constructs of value co-creation. As well, due to the pivotal role of the decision-making process in customer satisfaction, I will also operationalize …
Date: August 2015
Creator: Kwon, Junhyuk
System: The UNT Digital Library
The Chi Square Approximation to the Hypergeometric Probability Distribution (open access)

This study compared the results of the chi square test of independence and the corrected chi square statistic against Fisher's exact probability test (the hypergeometric distribution) in connection with sampling from a finite population. Data were collected by advancing the minimum cell size from zero to a maximum which resulted in a tail area probability of 20 percent for sample sizes from 10 to 100 by varying increments. Analysis of the data supported the rejection of the null hypotheses regarding the general rule-of-thumb guidelines concerning sample size, minimum cell expected frequency and the continuity correction factor. It was discovered that the computation using Yates' correction factor resulted in values which were so overly conservative (i.e., tail area probabilities that were 20 to 50 percent higher than Fisher's exact test) that conclusions drawn from this calculation might prove to be inaccurate. Accordingly, a new correction factor was proposed which eliminated much of this discrepancy. Its performance was equally consistent with that of the uncorrected chi square statistic and at times, even better.
Date: August 1982
Creator: Anderson, Randy J. (Randy Jay)
System: The UNT Digital Library
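
As a small companion to the comparison described above, the sketch below contrasts Fisher's exact probability with the chi-square statistic computed with and without Yates' continuity correction on a placeholder two-by-two table; the new correction factor proposed in the study is not reproduced here.

    import numpy as np
    from scipy.stats import fisher_exact, chi2_contingency

    # Placeholder 2x2 table drawn from a small finite population.
    table = np.array([[8, 4],
                      [3, 10]])

    _, p_exact = fisher_exact(table, alternative="two-sided")               # hypergeometric reference

    stat_yates, p_yates, _, _ = chi2_contingency(table, correction=True)    # Yates-corrected
    stat_raw, p_raw, _, _ = chi2_contingency(table, correction=False)       # uncorrected

    print(f"Fisher exact p           = {p_exact:.4f}")
    print(f"chi-square (Yates) p     = {p_yates:.4f}")
    print(f"chi-square (uncorrected) = {p_raw:.4f}")
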