A Simulation Study Comparing Various Confidence Intervals for the Mean of Voucher Populations in Accounting (open access)

This research examined the performance of three parametric methods for constructing confidence intervals: the classical, the Bonferroni, and the bootstrap-t method, as applied to estimating the mean of voucher populations in accounting. Auditing populations usually do not follow standard models. The population for an accounting audit is generally a nonstandard mixture distribution in which the audit data set contains a large number of zero values and a comparatively small number of nonzero errors. This study assumed a situation in which only overstatement errors exist. The nonzero errors were assumed to be normally, exponentially, and uniformly distributed. Five indicators of performance were used. The classical method was found to be unreliable. The Bonferroni method was conservative for all population conditions. The bootstrap-t method was excellent in terms of reliability, but the lower limit of the confidence intervals it produced was unstable for all population conditions. The classical method provided the shortest average confidence-interval width of the three methods. This study provided initial evidence as to how the parametric bootstrap-t method performs when applied to the nonstandard distribution of audit populations of line items. Further research should seek a reliable confidence interval for a wider variety of accounting populations.
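A minimal sketch of the bootstrap-t idea described above, not the dissertation's code: each resample's mean is studentized, and the empirical quantiles of those t-statistics set the interval limits. The zero-heavy voucher population and all names here are hypothetical.

```python
import math
import random
import statistics

def bootstrap_t_ci(data, n_boot=2000, alpha=0.05, seed=1):
    """Bootstrap-t confidence interval for the population mean.
    Resamples the data, studentizes each resample's mean, and uses the
    empirical quantiles of the t-statistics to set the limits."""
    n = len(data)
    mean = statistics.fmean(data)
    se = statistics.stdev(data) / math.sqrt(n)
    rng = random.Random(seed)
    t_stats = []
    while len(t_stats) < n_boot:
        resample = [rng.choice(data) for _ in range(n)]
        s = statistics.stdev(resample)
        if s == 0:  # all-zero resample, possible in zero-heavy audit data
            continue
        t_stats.append((statistics.fmean(resample) - mean) / (s / math.sqrt(n)))
    t_stats.sort()
    t_hi = t_stats[int((1 - alpha / 2) * n_boot) - 1]
    t_lo = t_stats[int((alpha / 2) * n_boot)]
    return mean - t_hi * se, mean - t_lo * se

# Hypothetical voucher population: mostly zero errors, a few overstatements
rng = random.Random(7)
sample = [0.0] * 80 + [rng.expovariate(1 / 50.0) for _ in range(20)]
low, high = bootstrap_t_ci(sample)
```

The instability of the lower limit that the study reports shows up in such sketches as high sensitivity of the extreme t-quantiles to which nonzero errors happen to be resampled.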
Date: December 1992
Creator: Lee, Ihn Shik
System: The UNT Digital Library
Call Option Premium Dynamics (open access)

This study has a twofold purpose: to demonstrate the use of the Marquardt compromise method in estimating the unknown parameters contained in the probability call-option pricing models and to test empirically the following models: the Boness, the Black-Scholes, the Merton proportional dividend, the Ingersoll differential tax, and the Ingersoll proportional dividend and differential tax.
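For reference, the Black-Scholes model from the list above prices a European call on a non-dividend-paying stock as C = S N(d1) - K e^(-rT) N(d2). A minimal sketch follows; the parameter values are hypothetical, and estimating the volatility sigma is where a fitting routine such as the Marquardt compromise would enter.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call on a non-dividend-paying stock."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Hypothetical contract: at-the-money call, six months to expiry
price = black_scholes_call(S=100.0, K=100.0, T=0.5, r=0.05, sigma=0.2)
```

The dividend and tax variants listed in the abstract adjust S, r, or the discounting terms but keep this same N(d1)/N(d2) structure.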
Date: December 1982
Creator: Chen, Jim
System: The UNT Digital Library
Financial Leverage and the Cost of Capital (open access)

The objective of the research reported in this dissertation is to conduct an empirical test of the hypothesis that, excluding income tax effects, the cost of capital to a firm is independent of the degree of financial leverage employed by the firm. This hypothesis, set forth by Franco Modigliani and Merton Miller in 1958, represents a challenge to the traditional view on the subject, a challenge which carries implications of considerable importance in the field of finance. The challenge has led to a lengthy controversy which can ultimately be resolved only by subjecting the hypothesis to empirical test. The basis of the test was Modigliani and Miller's Proposition II, a corollary of their fundamental hypothesis. Proposition II, in effect, states that equity investors fully discount any increase in risk due to financial leverage so that there is no possibility for the firm to reduce its cost of capital by employing financial leverage. The results of the research reported in this dissertation do not support that contention. The study indicates that, if equity investors require any increase in premium for increasing financial leverage, the premium required is significantly less than that predicted by the Modigliani-Miller Proposition II, over the range of …
Date: December 1977
Creator: Brust, Melvin F.
System: The UNT Digital Library
Validation and Investigation of the Four Aspects of Cycle Regression: A New Algorithm for Extracting Cycles (open access)

The cycle regression analysis algorithm is the most recent addition to a group of techniques developed to detect "hidden periodicities." This dissertation investigates four major aspects of the algorithm. The objectives of this research are: 1. To develop an objective method of obtaining an initial estimate of the cycle period; the present procedure of obtaining this estimate involves considerable subjective judgment; 2. To validate the algorithm's success in extracting cycles from multi-cyclical data; 3. To determine whether a consistent relationship exists among the smallest amplitude, the error standard deviation, and the number of replications of a cycle contained in the data; 4. To investigate the behavior of the algorithm in the prediction of major drops.
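Objective 1 calls for an objective starting value for the cycle period. One common way to obtain such a value, sketched here with hypothetical data and not taken from the dissertation, is to pick the frequency with the largest periodogram (DFT power) ordinate:

```python
import cmath
import math

def estimate_period(series):
    """Initial cycle-period estimate from the largest periodogram ordinate."""
    n = len(series)
    mean = sum(series) / n
    y = [v - mean for v in series]           # remove the mean first
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2 + 1):
        s = sum(y[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power = abs(s) ** 2
        if power > best_power:
            best_k, best_power = k, power
    return n / best_k                        # period in sampling intervals

# Synthetic series: a sinusoid with period 12, sampled 96 times
series = [math.sin(2 * math.pi * t / 12) for t in range(96)]
period = estimate_period(series)
```

For multi-cyclical data, the same scan yields one candidate period per local peak of the periodogram.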
Date: December 1982
Creator: Mehta, Mayur Ravishanker
System: The UNT Digital Library
The Normal Curve Approximation to the Hypergeometric Probability Distribution (open access)

The classical normal curve approximation to cumulative hypergeometric probabilities requires that the standard deviation of the hypergeometric distribution be larger than three, which limits the usefulness of the approximation for small populations. The purposes of this study are to develop clearly defined rules specifying when the normal curve approximation to the cumulative hypergeometric probability distribution may be successfully utilized, and to determine where maximum absolute differences between the cumulative hypergeometric and the normal curve approximation of 0.01 and 0.05 occur in relation to the proportion of the population sampled.
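The comparison at the heart of the study can be sketched as follows: compute the exact cumulative hypergeometric probabilities, compute the continuity-corrected normal approximation, and record the maximum absolute difference. The population values below are hypothetical, chosen only for illustration.

```python
import math

def hyper_pmf(N, K, n, k):
    """Exact hypergeometric P(X = k): N population, K successes, n sampled."""
    return math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)

def hyper_cdf(N, K, n, k):
    """Exact cumulative hypergeometric P(X <= k)."""
    return sum(hyper_pmf(N, K, n, j) for j in range(max(0, n - (N - K)), k + 1))

def normal_approx_cdf(N, K, n, k):
    """Normal approximation with continuity correction and the
    finite-population correction (N - n)/(N - 1) in the variance."""
    p = K / N
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p) * (N - n) / (N - 1))
    z = (k + 0.5 - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

N, K, n = 200, 80, 30
max_err = max(abs(hyper_cdf(N, K, n, k) - normal_approx_cdf(N, K, n, k))
              for k in range(n + 1))
```

Sweeping N, K, and n in such a loop is how cutoff rules of the 0.01/0.05 kind described above can be mapped out.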
Date: December 1981
Creator: Willman, Edward N. (Edward Nicholas)
System: The UNT Digital Library
Robustness of the One-Sample Kolmogorov Test to Sampling from a Finite Discrete Population (open access)

One of the most useful and best known goodness-of-fit tests is the Kolmogorov one-sample test. The assumptions for the Kolmogorov one-sample test are: 1. A random sample; 2. A continuous random variable; 3. F(x) is a completely specified hypothesized cumulative distribution function. The Kolmogorov one-sample test has a wide range of applications. Knowing the effect of using the test when an assumption is not met is of practical importance. The purpose of this research is to analyze the robustness of the Kolmogorov one-sample test to sampling from a finite discrete distribution. The standard tables for the Kolmogorov test are derived based on sampling from a theoretical continuous distribution; as such, the theoretical distribution is infinite. The standard tables do not include a method or adjustment factor to estimate the effect on table values for statistical experiments in which the sample stems from a finite discrete distribution sampled without replacement. This research provides an extension of the Kolmogorov test to the case in which the hypothesized distribution function is finite and discrete and the sampling distribution is based on sampling without replacement. An investigative study was conducted to explore possible tendencies and relationships in the distribution of Dn when sampling with and without replacement …
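For a finite discrete hypothesized distribution, the statistic Dn = sup over x of |Fn(x) - F(x)| must be checked on both sides of each jump point, since both the empirical and hypothesized CDFs are step functions. A minimal sketch of computing Dn in that setting; the fair-die population is a hypothetical illustration, not the dissertation's data:

```python
import bisect

def kolmogorov_d(sample, support, cdf):
    """One-sample Kolmogorov statistic D_n = sup_x |F_n(x) - F(x)| for a
    hypothesized CDF that jumps only at the given (sorted) support points."""
    n = len(sample)
    s = sorted(sample)
    d = 0.0
    prev_f = 0.0                                  # F just below the next jump
    for x in support:
        fn = bisect.bisect_right(s, x) / n        # empirical CDF at x
        fn_left = bisect.bisect_left(s, x) / n    # empirical CDF just below x
        f = cdf(x)
        d = max(d, abs(fn - f), abs(fn_left - prev_f))
        prev_f = f
    return d

# Hypothetical finite discrete population: a fair six-sided die
support = [1, 2, 3, 4, 5, 6]
sample = [1, 2, 2, 3, 4, 5, 6, 6, 6, 6]
d_n = kolmogorov_d(sample, support, lambda x: x / 6)
```

Comparing the distribution of this statistic under with- and without-replacement sampling from the finite population is the kind of experiment the study describes.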
Date: December 1996
Creator: Tucker, Joanne M. (Joanne Morris)
System: The UNT Digital Library
Classification by Neural Network and Statistical Models in Tandem: Does Integration Enhance Performance? (open access)

The major purposes of the current research are twofold. The first purpose is to present a composite approach to the general classification problem by using outputs from various parametric statistical procedures and neural networks. The second purpose is to compare several parametric and neural network models on a transportation planning related classification problem and five simulated classification problems.
Date: December 1998
Creator: Mitchell, David
System: The UNT Digital Library
A Heuristic Procedure for Specifying Parameters in Neural Network Models for Shewhart X-bar Control Chart Applications (open access)

This study develops a heuristic procedure for specifying parameters for a neural network configuration (learning rate, momentum, and the number of neurons in a single hidden layer) in Shewhart X-bar control chart applications. Also, this study examines the replicability of the neural network solution when the neural network is retrained several times with different initial weights.
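For context, a Shewhart X-bar chart flags subgroup means that fall outside control limits around the grand mean; the neural network in this study learns to signal such out-of-control conditions. A simplified sketch of the chart itself, with hypothetical data: textbook charts set the limits with range- or s-based constants (A2, A3), while here the spread of the subgroup means is used directly.

```python
import statistics

def xbar_chart(subgroups, k=3.0):
    """Simplified Shewhart X-bar limits: grand mean +/- k standard
    deviations of the subgroup means (textbook charts instead use
    range- or s-based constants such as A2)."""
    means = [statistics.fmean(g) for g in subgroups]
    center = statistics.fmean(means)
    spread = statistics.stdev(means)
    return center - k * spread, center, center + k * spread

# Hypothetical in-control process: three subgroups of four measurements
data = [[5.1, 4.9, 5.0, 5.2], [4.8, 5.0, 5.1, 4.9], [5.0, 5.2, 4.9, 5.1]]
lcl, center, ucl = xbar_chart(data)
```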
Date: December 1993
Creator: Nam, Kyungdoo T.
System: The UNT Digital Library
Impact of Forecasting Method Selection and Information Sharing on Supply Chain Performance. (open access)

Effective supply chain management has gained much attention from industry and academia because it helps firms across a supply chain reduce cost and improve customer service levels efficiently. Focusing on one of the key challenges of supply chains, namely demand uncertainty, this dissertation extends the work of Zhao, Xie, and Leung to examine the effects of forecasting method selection, coupled with information sharing, on supply chain performance in a dynamic business environment. The results of this study showed that, under various scenarios, advanced forecasting methods such as neural network and GARCH models play a more significant role when capacity tightness increases and, in terms of supply chain costs, are more important to the retailers than to the supplier under certain circumstances. Thus, advanced forecasting models should be promoted in supply chain management. However, this study also demonstrated that forecasting methods not capable of modeling features of certain demand patterns significantly impact a supply chain's performance. That is, a forecasting method misspecified for the characteristics of the demand pattern usually results in higher supply chain costs. Thus, in practice, supply chain managers should be cognizant of the cost impact of selecting commonly used traditional forecasting methods, such as moving …
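As a point of reference for the traditional methods mentioned, a one-step-ahead moving-average forecast and its mean absolute error can be computed as follows; the demand series is hypothetical:

```python
import statistics

def moving_average_forecasts(demand, window=3):
    """One-step-ahead moving-average forecasts: the forecast for period t
    is the mean of the previous `window` observations."""
    return [statistics.fmean(demand[t - window:t])
            for t in range(window, len(demand))]

# Hypothetical retailer demand series
demand = [100, 104, 98, 110, 102, 99, 107, 103]
forecasts = moving_average_forecasts(demand)
mae = statistics.fmean(abs(f - a) for f, a in zip(forecasts, demand[3:]))
```

A method like this smooths away exactly the demand features (volatility clustering, nonlinearity) that GARCH and neural network models are built to capture, which is the misspecification cost the abstract describes.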
Date: December 2009
Creator: Pan, Youqin
System: The UNT Digital Library
Comparing Latent Dirichlet Allocation and Latent Semantic Analysis as Classifiers (open access)

In the Information Age, a proliferation of unstructured electronic text documents exists. Processing these documents by hand is a daunting task, as humans have limited cognitive capacity for processing large volumes of often lengthy documents. To address this problem, text-data computer algorithms are being developed. Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) are two such algorithms that have received much attention individually in the text-data literature for topic extraction studies, but not for document classification or for comparison studies. Since classification is considered an important human function and has been studied in the areas of cognitive science and information science, in this dissertation a research study was performed to compare LDA, LSA, and humans as document classifiers. The research questions posed in this study are: R1: How accurate are LDA and LSA in classifying documents in a corpus of textual data over a known set of topics? R2: How accurate are humans in performing the same classification task? R3: How does LDA classification performance compare to LSA classification performance? To address these questions, a classification study involving human subjects was designed in which humans were asked to generate and classify documents …
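A minimal sketch of LSA-based classification on toy data (not the dissertation's corpus or method details): build a term-document matrix, project documents into a truncated-SVD latent space, and classify a new document by cosine similarity to labeled documents. NumPy is assumed available for the SVD.

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents
# (docs 0-1 about finance, docs 2-3 about biology; all counts hypothetical)
X = np.array([
    [2.0, 3.0, 0.0, 0.0],   # "stock"
    [1.0, 2.0, 0.0, 0.0],   # "market"
    [0.0, 0.0, 3.0, 2.0],   # "cell"
    [0.0, 1.0, 2.0, 3.0],   # "gene"
])
labels = ["finance", "finance", "biology", "biology"]

# LSA: truncated SVD maps terms and documents into a k-dimensional space
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # one latent vector per document

def classify(term_counts):
    """Fold a new document into the latent space and return the label of
    the most similar training document (cosine similarity)."""
    q = U[:, :k].T @ term_counts
    sims = [float(q @ d) / (np.linalg.norm(q) * np.linalg.norm(d))
            for d in doc_vecs]
    return labels[int(np.argmax(sims))]

label = classify(np.array([3.0, 1.0, 0.0, 0.0]))  # heavy on "stock"/"market"
```

An LDA pipeline would replace the SVD step with inferred topic proportions per document, but the nearest-neighbor classification step can be shared.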
Date: December 2011
Creator: Anaya, Leticia H.
System: The UNT Digital Library