HVAC (heating, ventilation, air conditioning) literature in Japan: A critical review (open access)

HVAC (heating, ventilation, air conditioning) literature in Japan: A critical review

Japanese businessmen in the heating, ventilation, air conditioning, and refrigeration (HVAC and R) industry consider the monitoring of technical and market developments in the United States to be a normal part of their business. In contrast, efforts by US businessmen to monitor Japanese HVAC and R developments are poorly developed. To begin to redress this imbalance, this report establishes the groundwork for a more effective system for monitoring the Japanese HVAC and R literature. The report reviews the principal HVAC and R publications in Japan and describes the type of information contained in each. Since the Japanese HVAC and R literature is abundant, the report also offers practical suggestions on how a researcher or research manager can limit the monitoring effort to the publications and types of information most likely to be of greatest value.
Date: February 1, 1988
Creator: Hane, G.J.
Object Type: Report
System: The UNT Digital Library
RSIC (Radiation Shielding Information Center) after 25 years: Challenges and opportunities. [Radiation Shielding Information Center] (open access)

RSIC (Radiation Shielding Information Center) after 25 years: Challenges and opportunities. [Radiation Shielding Information Center]

The Radiation Shielding Information Center (RSIC) observed its 25th year in operation in 1987. During that time numerous changes have occurred in the government programs that sponsor RSIC and in the radiation transport community which it serves. The continued need for RSIC is evident from the steady volume of requests and interactions with the user community. It is a continual challenge to adjust and adapt our operation to respond to the demands placed on RSIC by sponsors and users. Cooperation between sponsors, users, and the RSIC staff is the key to keeping RSIC as the focus of activities in the international radiation transport community. 7 refs.
Date: January 1, 1988
Creator: Roussin, R. W.; Maskewitz, B. F. & Trubey, D. K.
Object Type: Article
System: The UNT Digital Library
Nuclear Test-Experimental Science: Annual report, fiscal year 1988 (open access)

Nuclear Test-Experimental Science: Annual report, fiscal year 1988

Fiscal year 1988 has been a significant, rewarding, and exciting period for Lawrence Livermore National Laboratory's nuclear testing program. It was significant in that the Laboratory's new director chose to focus strongly on the program's activities and to commit to a revitalized emphasis on testing and the experimental science that underlies it. It was rewarding in that revolutionary new measurement techniques were fielded on recent important and highly complicated underground nuclear tests with truly incredible results. And it was exciting in that the sophisticated and fundamental problems of weapons science that are now being addressed experimentally are yielding new challenges and understanding in ways that stimulate and reward the brightest and best of scientists. During FY88 the program was reorganized to emphasize our commitment to experimental science. The name of the program was changed to reflect this commitment, becoming the Nuclear Test-Experimental Science (NTES) Program.
Date: January 1, 1988
Creator: Struble, G.L.; Donohue, M.L.; Bucciarelli, G.; Hymer, J.D.; Kirvel, R.D.; Middleton, C. et al.
Object Type: Report
System: The UNT Digital Library
The ecology of software configuration management (open access)

The ecology of software configuration management

This paper discusses how to overcome failures and pitfalls in software configuration management. (LSP)
Date: January 1, 1988
Creator: Cort, Gary P.
Object Type: Article
System: The UNT Digital Library
A national HIV (Human Immunodeficiency Virus) database that facilitates data sharing (open access)

A national HIV (Human Immunodeficiency Virus) database that facilitates data sharing

The purpose of this communication is to stimulate discussion on a National Human Immunodeficiency Virus (HIV) Database that facilitates and coordinates data sharing. We argue for the creation of a new database because significant gaps exist in the types of information that are available on HIV. Databases that extensively survey the published literature on HIV are widely available; however, databases that contain either raw data or that describe ongoing HIV research efforts are not. For epidemiologists, sociologists, and mathematical modelers, who need to draw on raw epidemiologic and behavioral data from a broad range of fields, the existing databases are inadequate. In this paper we emphasize the particular requirements of epidemiologists, sociologists, and modelers, and suggest a plan to accommodate their database needs.
Date: January 1, 1988
Creator: Layne, S. P.; Marr, T. G.; Stanley, E. A.; Hyman, J. M. & Colgate, S. A.
Object Type: Article
System: The UNT Digital Library
Nuclear Science Division annual report, October 1, 1986--September 30, 1987 (open access)

Nuclear Science Division annual report, October 1, 1986--September 30, 1987

This report summarizes the activities of the Nuclear Science Division during the period October 1, 1986 to September 30, 1987. A highlight of the experimental program during this time was the completion of the first round of heavy-ion running at CERN with ultrarelativistic oxygen and sulfur beams. Very rapid progress is being made in the analysis of these important experiments, and preliminary results are presented in this report. During this period, the Bevalac also continued to produce significant new physics results, while demand for beam time remained high. An important new community of users has arrived on the scene, eager to exploit the unique low-energy heavy-beam capabilities of the Bevalac. Another major highlight of the program has been the performance of the Dilepton Spectrometer, which has entered into production running. Dileptons have been observed in the p + Be and Ca + Ca reactions at several bombarding energies. New data on pion production with heavy beams, measured in the streamer chamber, shed light on the question of nuclear compressibility while posing some new questions concerning the role of Coulomb forces in the observed pion spectra. In another quite different area, the pioneering research with radioactive beams is continuing and …
Date: September 1, 1988
Creator: Mahoney, J. (ed.)
Object Type: Report
System: The UNT Digital Library
Energy technologies and the environment: Environmental information handbook (open access)

Energy technologies and the environment: Environmental information handbook

This revision of Energy Technologies and the Environment reflects the changes in energy supply and demand, focus of environmental concern, and emphasis of energy research and development that have occurred since publication of the earlier edition in 1980. The increase in availability of oil and natural gas, at least for the near term, is responsible in part for a reduced emphasis on development of replacement fuels and technologies. Trends in energy development also have been influenced by an increased reliance on private industry initiatives, and a correspondingly reduced government involvement, in demonstrating more developed technologies. Environmental concerns related to acid rain and waste management continue to increase the demand for development of innovative energy systems. The basic criteria for including a technology in this report are that (1) the technology is a major current or potential future energy supply and (2) significant changes in employing or understanding the technology have occurred since publication of the 1980 edition. Coal is seen to be a continuing major source of energy supply, and thus chapters pertaining to the principal coal technologies have been revised from the 1980 edition (those on coal mining and preparation, conventional coal-fired power plants, fluidized-bed combustion, coal gasification, and …
Date: October 1, 1988
Creator: unknown
Object Type: Report
System: The UNT Digital Library
The graphics advisor: A prototypical advisory expert system (open access)

The graphics advisor: A prototypical advisory expert system

The Graphics Advisor is an expert system that helps users of the Los Alamos Integrated Computing Network select the graphics library that is best suited to the characteristics of a specific graphics application. The implementation of the system, using a commercial expert system development tool, is described. Delivery options are discussed. Although the domain knowledge of the Graphics Advisor is oriented toward libraries that are supported by the Computing and Communications Division at Los Alamos, it exemplifies a class of knowledge-based systems that are potentially useful in a variety of advisory situations. 11 refs., 4 figs.
Date: January 1, 1988
Creator: Berkbigler, K.P. & Max, P.A.
Object Type: Article
System: The UNT Digital Library
A training program for scientific supercomputing users (open access)

A training program for scientific supercomputing users

There is need for a mechanism to transfer supercomputing technology into the hands of scientists and engineers in such a way that they will acquire a foundation of knowledge that will permit integration of supercomputing as a tool in their research. Most computing center training emphasizes computer-specific information about how to use a particular computer system; most academic programs teach concepts to computer scientists. Only a few brief courses and new programs are designed for computational scientists. This paper describes an eleven-week training program aimed principally at graduate and postdoctoral students in computationally-intensive fields. The program is designed to balance the specificity of computing center courses, the abstractness of computer science courses, and the personal contact of traditional apprentice approaches. It is based on the experience of computer scientists and computational scientists, and consists of seminars and clinics given by many visiting and local faculty. It covers a variety of supercomputing concepts, issues, and practices related to architecture, operating systems, software design, numerical considerations, code optimization, graphics, communications, and networks. Its research component encourages understanding of scientific computing and supercomputer hardware issues. Flexibility in thinking about computing needs is emphasized by the use of several different supercomputer architectures, such as …
Date: January 1, 1988
Creator: Hanson, F.; Moher, T.; Sabelli, N. & Solem, A.
Object Type: Article
System: The UNT Digital Library
Conference on iterative methods for large linear systems (open access)

Conference on iterative methods for large linear systems

This conference is dedicated to providing an overview of the state of the art in the use of iterative methods for solving sparse linear systems, with an eye to contributions of the past, present, and future. The emphasis is on identifying current and future research directions in the mainstream of modern scientific computing. Recently, the use of iterative methods for solving linear systems has experienced a resurgence of activity as scientists attack extremely complicated three-dimensional problems using vector and parallel supercomputers. Many research advances in the development of iterative methods for high-speed computers over the past forty years are reviewed, and current research is highlighted.
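A minimal sketch of one of the classic iterative methods surveyed at such conferences, the conjugate gradient method for a symmetric positive-definite system (an illustrative example, not code from the proceedings; NumPy is assumed):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve Ax = b for symmetric positive-definite A by conjugate gradients."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along direction p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return x

# Small SPD test system (hypothetical values)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the method converges in at most n iterations for an n-by-n system, which is what makes it attractive for the large sparse systems discussed above.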
Date: December 1, 1988
Creator: Kincaid, D. R.
Object Type: Article
System: The UNT Digital Library
Selecting an authoring system 1988: How the vendors make it more difficult (open access)

Selecting an authoring system 1988: How the vendors make it more difficult

The information and demonstration packages provided by vendors of authoring systems are not adequate to make a thorough evaluation of the product. Usually, data critical to the evaluation process is absent in favor of detailed descriptions of features of the system. The difficulties we encountered in evaluating authoring systems prompted several recommendations for vendors to include in their packages. These recommendations include allowing the prospective customer to actually use the system, providing information on instructional strategies and sequences that can be achieved with the system, and expanding the information provided on computer-managed instruction (CMI) capabilities and applications. The paper does not include any reference to specific vendors or systems, for the purpose is to increase awareness of a specific problem in the field. 12 refs.
Date: January 1, 1988
Creator: Trainor, Mary S. & Schultejann, Patricia Alexandra
Object Type: Article
System: The UNT Digital Library
A Bayesian Approach to the Design and Analysis of Computer Experiments (open access)

A Bayesian Approach to the Design and Analysis of Computer Experiments

We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors, and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of ''sites'' (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of …
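As a hedged sketch of the approach described above — a stationary Gaussian-process prior whose posterior mean interpolates runs of an expensive code — the following uses a squared-exponential covariance and illustrative training sites (none of these numbers come from the report; NumPy is assumed):

```python
import numpy as np

def sq_exp_kernel(a, b, length=1.0):
    """Stationary squared-exponential covariance between 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

# "Sites" where the (expensive) code has been run, and its outputs;
# np.sin stands in for a real computer-code evaluation.
X_train = np.array([0.0, 0.5, 1.0, 1.5])
y_train = np.sin(X_train)
X_new = np.array([0.25, 0.75])  # inputs at which predictions are wanted

K = sq_exp_kernel(X_train, X_train) + 1e-10 * np.eye(len(X_train))  # jitter
k_star = sq_exp_kernel(X_new, X_train)

# Posterior mean interpolates the training runs; posterior variance
# quantifies remaining uncertainty at unobserved inputs.
alpha = np.linalg.solve(K, y_train)
post_mean = k_star @ alpha
post_var = 1.0 - np.einsum('ij,ji->i', k_star, np.linalg.solve(K, k_star.T))
```

Sites with the largest posterior variance are natural candidates under an entropy-reduction design criterion of the kind the abstract describes.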
Date: January 1, 1988
Creator: Currin, C.
Object Type: Report
System: The UNT Digital Library
Automation impact study of Army Training Management (open access)

Automation impact study of Army Training Management

The main objectives of this impact study were to identify the potential cost savings associated with automated Army Training Management (TM), and to perform a cost-benefit analysis for an Army-wide automated TM system. A subsidiary goal was to establish baseline data for an independent evaluation of a prototype Integrated Training Management System (ITMS), to be tested in the fall of 1988. A structured analysis of TM doctrine was performed for comparison with empirical data gathered in a job analysis survey of selected units of the 9ID (MTZ) at Ft. Lewis, Washington. These observations will be extended to other units in subsequent surveys. The survey data concerning staffing levels and amount of labor expended on eight distinct TM tasks were analyzed in a cost effectiveness model. The main results of the surveys and cost effectiveness modeling are summarized. 18 figs., 47 tabs.
Date: January 1, 1988
Creator: Sanquist, T. F.; Schuller, C. R.; McCallum, M. C.; Underwood, J. A.; Bettin, P. J.; King, J. L. et al.
Object Type: Report
System: The UNT Digital Library
Building an artificial intelligence capability at Los Alamos (open access)

Building an artificial intelligence capability at Los Alamos

In 1985, after three years of preliminary work, management of the Los Alamos National Laboratory started an ambitious program to develop a strong technical capability in the rapidly emerging field of Artificial Intelligence/Knowledge Based Systems (AI/KBS). When this AI development program began, except for a few staff members doing basic AI research, AI was essentially nonexistent at the Laboratory. The basics, including such things as AI computer hardware and software, literature, books, knowledgeable personnel, or even a general knowledge of what AI was, were most difficult if not impossible to find. For this reason, we had to approach the problem with a very broad perspective, which strongly addressed the basics while aiming toward more advanced AI program elements. Broad, intensive education was the ''bootstrapping'' tool used in this five-year, multi-million-dollar AI capability development program. Halfway through the program, our accomplishments indicate that the program is extremely successful. In terms of trained staff, active programs, and ''state-of-the-art equipment,'' we have developed one of the strongest AI technical capabilities within the Department of Energy (DOE) and the Department of Defense (DOD). However, a great deal more must be done before the full potential of the program can be realized. 1 fig.
Date: January 1, 1988
Creator: Marinuzzi, J.G.
Object Type: Article
System: The UNT Digital Library
NNDC (National Nuclear Data Center) support for fusion nuclear data needs (open access)

NNDC (National Nuclear Data Center) support for fusion nuclear data needs

The National Nuclear Data Center (NNDC), located at Brookhaven National Laboratory, is an outgrowth of the Sigma Center founded by D.J. Hughes to compile low energy neutron reaction data in the 1950s. The center has played a lead role in the production of evaluated nuclear data (ENDF/B) for the United States nuclear power program. This data file, now in its sixth version, is produced as a cooperative effort of many DOE-funded organizations via the Cross Section Evaluation Working Group (CSEWG). The NNDC's role, in addition to providing the structure and leadership for CSEWG, is to supply compiled bibliographic and experimental data and provide file processing, checking, distribution, and documentation services. In the past, the NNDC has also produced nuclear data evaluations.
Date: January 1, 1988
Creator: Dunford, C.L.
Object Type: Article
System: The UNT Digital Library
Parallel algorithms for stochastic linear programs: A summary of research performed (open access)

Parallel algorithms for stochastic linear programs: A summary of research performed

This report contains papers on parallel algorithms for solving stochastic linear programs. (LSP)
Date: January 1, 1988
Creator: Ariyawansa, K.A.
Object Type: Report
System: The UNT Digital Library
The EQ3/6 software package for geochemical modeling: Current status (open access)

The EQ3/6 software package for geochemical modeling: Current status

EQ3/6 is a software package for modeling chemical and mineralogic interactions in aqueous geochemical systems. The major components of the package are EQ3NR (a speciation-solubility code), EQ6 (a reaction path code), EQLIB (a supporting library), and a supporting thermodynamic data base. EQ3NR calculates aqueous speciation and saturation indices from analytical data. It can also be used to calculate compositions of buffer solutions for use in laboratory experiments. EQ6 computes reaction path models of both equilibrium step processes and kinetic reaction processes. These models can be computed for closed systems and relatively simple open systems. EQ3/6 is useful in making purely theoretical calculations, in designing, interpreting, and extrapolating laboratory experiments, and in testing and developing submodels and supporting data used in these codes. The thermodynamic data base supports calculations over the range 0-300°C. 60 refs., 2 figs.
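A saturation index of the kind EQ3NR reports is conventionally defined as SI = log10(IAP/Ksp), where IAP is the ion activity product of the dissolved species and Ksp the solubility product of the mineral. The following minimal sketch is not EQ3/6 code, and the calcite-like numbers are illustrative:

```python
import math

def saturation_index(ion_activity_product, k_sp):
    """SI = log10(IAP / Ksp): >0 supersaturated, 0 at equilibrium, <0 undersaturated."""
    return math.log10(ion_activity_product / k_sp)

# Hypothetical example: measured Ca2+ and CO3(2-) activities give IAP = 10^-8.0,
# compared against an illustrative calcite Ksp of 10^-8.48 at 25 °C.
si = saturation_index(ion_activity_product=10**-8.0, k_sp=10**-8.48)
```

A positive SI, as here, would indicate the water is supersaturated with respect to the mineral and could precipitate it along a reaction path.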
Date: July 1, 1988
Creator: Wolery, T. J.; Jackson, K. J.; Bourcier, W. L.; Bruton, C. J.; Viani, B. E.; Knauss, K. G. et al.
Object Type: Article
System: The UNT Digital Library
Nuclear data uncertainties: I, Basic concepts of probability (open access)

Nuclear data uncertainties: I, Basic concepts of probability

Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
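As a small worked example of one topic the report covers, Bayes' theorem P(H|E) = P(E|H)P(H)/P(E) applied to illustrative numbers (not drawn from the report):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Illustrative scenario: an analysis flags an anomaly in a data set;
# how likely is a real effect, given the flag?
p_h = 0.01              # prior probability of the hypothesis H
p_e_given_h = 0.95      # probability of the evidence if H is true
p_e_given_not_h = 0.05  # false-positive rate

# Total probability of the evidence over both hypotheses
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
posterior = p_e_given_h * p_h / p_e
```

Even with a sensitive test, the small prior keeps the posterior near 0.16, which illustrates why priors matter when interpreting uncertain nuclear data.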
Date: December 1, 1988
Creator: Smith, D. L.
Object Type: Report
System: The UNT Digital Library
MHTGR [modular high-temperature gas-cooled reactor] core physics validation plan (open access)

MHTGR [modular high-temperature gas-cooled reactor] core physics validation plan

This document contains the verification and validation (V&V) plan for analytical methods utilized in the nuclear design for normal and off-normal conditions within the Modular High-Temperature Gas-Cooled Reactor (MHTGR). Regulations, regulatory guides, and industry standards have been reviewed, and the approach for V&V has been developed. MHTGR core physics methods are described and the status of previous V&V is summarized within this document. Additional work required to verify and validate these methods is identified. The additional validation work includes comparison of calculations with available experimental data, benchmark comparisons with other validated codes, results from a cooperative program now underway at the Arbeitsgemeinschaft Versuchs-Reaktor GmbH (AVR) facility in Germany, results from a planned series of experiments on the Compact Nuclear Power Source (CNPS) facility at Los Alamos, and detailed documentation of all V&V studies. In addition, information will be obtained from planned international cooperative agreements to provide supplemental data for V&V. The regulatory technology development plan will be revised to include these additional experiments. A work schedule and cost estimate for completing this plan is also provided. This work schedule indicates the timeframe in which major milestones must be performed in order to …
Date: January 1, 1988
Creator: Baxter, A. & Hackney, R.
Object Type: Report
System: The UNT Digital Library
Statistical evaluation of PACSTAT random number generation capabilities (open access)

Statistical evaluation of PACSTAT random number generation capabilities

This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.
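One standard check of the kind used when verifying a random number generator is a Pearson chi-square test for uniformity of the generated samples; this is a generic sketch, not PACSTAT code:

```python
import random

def chi_square_uniformity(samples, bins=10):
    """Pearson chi-square statistic for uniformity of samples in [0, 1)."""
    counts = [0] * bins
    for u in samples:
        counts[min(int(u * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(42)
samples = [random.random() for _ in range(10_000)]
stat = chi_square_uniformity(samples)
# With 10 bins the statistic has 9 degrees of freedom under uniformity;
# values far above ~21.7 (the 99th percentile) would cast doubt on the generator.
```

A full verification effort would apply a battery of such tests (serial correlation, runs, spectral) rather than any single statistic.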
Date: May 1, 1988
Creator: Piepel, G. F.; Toland, M. R.; Harty, H.; Budden, M. J. & Bartley, C. L.
Object Type: Report
System: The UNT Digital Library
SCINFUL: A Monte Carlo based computer program to determine a scintillator full energy response to neutron detection for E_n between 0.1 and 80 MeV: Program development and comparisons of program predictions with experimental data (open access)

SCINFUL: A Monte Carlo based computer program to determine a scintillator full energy response to neutron detection for E_n between 0.1 and 80 MeV: Program development and comparisons of program predictions with experimental data

This document provides a discussion of the development of the FORTRAN Monte Carlo program SCINFUL (for scintillator full response), a program designed to provide a calculated full response anticipated for either an NE-213 (liquid) scintillator or an NE-110 (solid) scintillator. The program may also be used to compute angle-integrated spectra of charged particles (p, d, t, ³He, and α) following neutron interactions with ¹²C. Extensive comparisons with a variety of experimental data are given. There is generally good overall agreement (<10% differences) of results from SCINFUL calculations with measured detector responses, i.e., N(E_r) vs E_r, where E_r is the response pulse height; calculations reproduce measured detector responses with an accuracy which, at least partly, depends upon how well the experimental configuration is known. For E_n < 16 MeV and for E_r > 15% of the maximum pulse height response, calculated spectra are within ±5% of experiment on the average. For E_n up to 50 MeV similarly good agreement is obtained with experiment for E_r > 30% of maximum response. For E_n up to 75 MeV the calculated shape of the response agrees with measurements, but the calculations underpredict the measured …
Date: April 1, 1988
Creator: Dickens, J. K.
Object Type: Report
System: The UNT Digital Library
An assessment of the state of the art in predicting the failure of ceramics: Final report (open access)

An assessment of the state of the art in predicting the failure of ceramics: Final report

The greatest weakness in existing design strategies for brittle fracture is in the narrow range of conditions for which the strategies are adequate. The primary reason for this weakness is the use of simplistic mechanical models of fracture processes and unverified statistical models of materials. To improve the design methodology, the models must first be improved. Specifically recommended research goals are: to develop models of cracks with realistic geometry under arbitrary stress states; to identify and model the most important relationships between fracture processes and microstructural features; to assess the technology available for acquiring statistical data on microstructure and flaw populations, and to establish the amount of data required for verification of statistical models; and to establish a computer-based fracture simulation that can incorporate a wide variety of mechanical and statistical models and crack geometries, as well as arbitrary stress states. 204 refs., 2 tabs.
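One widely used statistical model of brittle failure, of the kind whose verification the report calls for, is the two-parameter Weibull distribution; the strength parameters below are hypothetical, not values from the report:

```python
import math

def weibull_failure_probability(stress, sigma_0, m):
    """Two-parameter Weibull model of brittle failure: P_f = 1 - exp(-(s/s0)^m).

    sigma_0 is the characteristic strength; m (the Weibull modulus) measures
    scatter, with larger m meaning more uniform flaw populations.
    """
    return 1.0 - math.exp(-((stress / sigma_0) ** m))

# Hypothetical ceramic: characteristic strength 400 MPa, Weibull modulus 10
p = weibull_failure_probability(stress=300.0, sigma_0=400.0, m=10.0)
```

The report's point is precisely that such models need microstructural and flaw-population data for verification before design predictions from them can be trusted.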
Date: March 1, 1988
Creator: Boulet, J.A.M.
Object Type: Report
System: The UNT Digital Library
Pacific Northwest Laboratory annual report for 1987 to the DOE Office of Energy Research: Part 1, Biomedical Sciences (open access)

Pacific Northwest Laboratory annual report for 1987 to the DOE Office of Energy Research: Part 1, Biomedical Sciences

This report summarizes progress on OHER biomedical and health-effects research conducted at Pacific Northwest Laboratory in FY 1987. The research develops the knowledge and scientific principles necessary to identify, understand, and anticipate the long-term health consequences of energy-related radiation and chemicals. Our continuing emphasis is to decrease the uncertainty of health-effects risk estimates from existing and/or developing energy-related technologies through an increased understanding of how radiation and chemicals cause health effects. The report is arranged to reflect PNL research relative to OHER programmatic structure. The first section, on human health effects, concerns statistical and epidemiological studies for assessing health risks. The next section, which contains reports of health-effects research in biological systems, includes research with radiation and chemicals. The last section is related to medical applications of nuclear technology.
Date: February 1, 1988
Creator: Park, J.F.
Object Type: Report
System: The UNT Digital Library