Managing electronic records: A guideline (open access)

Managing electronic records: A guideline

A committee at Martin Marietta Energy Systems (MMES) has drafted a guideline to assist offices in the management of electronic records. This paper will address the activities surrounding its creation. The guideline is for use by creators, users, and custodians of any type of electronic information. The guideline supports and supplements requirements from DOE and the National Archives and Records Administration (NARA), other internal processes such as system reviews, and the comprehensive records management program. While an in-house publication, it could prove useful to other organizations implementing an electronic records management program.
Date: July 1, 1995
Creator: Stewart, J.
System: The UNT Digital Library
Advanced combustion technologies for gas turbine power plants (open access)

Advanced combustion technologies for gas turbine power plants

Objectives are to develop actuators for enhancing the mixing between gas streams, to increase combustion stability, and to develop high-temperature materials for actuators and sensors in combustors. Turbulent kinetic energy maps of an excited jet with co-flow in a cavity with a partially closed exhaust end are given with and without a longitudinal or a transverse acoustic field. Dielectric constants and piezoelectric coefficients were determined for Sr{sub 2}(Nb{sub x}Ta{sub 1-x}){sub 2}O{sub 7} ceramics.
Date: December 1995
Creator: Vandsburger, U.; Roe, L. A. & Desu, S. B.
System: The UNT Digital Library
Object-oriented modeling and design for Sloan Digital Sky Survey retained data (open access)

Object-oriented modeling and design for Sloan Digital Sky Survey retained data

The SDSS project will produce tens of terabytes of data with many relationships among them and with uncertain complexity in their usage. The survey is being conducted by an international collaboration of eight institutions scattered throughout the US and Japan as well as numerous individuals at other sites. The data archive must provide adequate access to all collaborating partners during the five-year survey lifetime to support: development and testing of software algorithms; quality analysis on both the raw and processed data; selection of spectroscopic targets from the photometric catalogs; and scientific analysis. Additionally, the archive will serve as the basis for the public distribution of the final calibrated data on a timely basis. In this paper, we document how we applied Object-Oriented modeling and design to the development of the data archives. In the end, based on these experiences, we put Object-Orientation in a proper perspective.
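As a hedged illustration of the object-oriented modeling described above, the sketch below shows how a survey archive might relate imaging detections to follow-up spectra. The class names and attributes are invented for this example and are not the SDSS design.

    # Hypothetical object-oriented model for survey archive data.
    # Class and attribute names are illustrative only; they are not the SDSS schema.
    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class PhotometricObject:
        """A detected source from the imaging survey."""
        obj_id: int
        ra_deg: float      # right ascension, degrees
        dec_deg: float     # declination, degrees
        magnitudes: dict   # e.g. {"u": 21.3, "g": 20.1}


    @dataclass
    class Spectrum:
        """A spectrum obtained for a previously imaged target."""
        spec_id: int
        target: PhotometricObject
        wavelengths_nm: List[float]
        fluxes: List[float]


    @dataclass
    class Archive:
        """Relates imaging detections to follow-up spectra."""
        objects: List[PhotometricObject] = field(default_factory=list)
        spectra: List[Spectrum] = field(default_factory=list)

        def spectra_for(self, obj_id: int) -> List[Spectrum]:
            return [s for s in self.spectra if s.target.obj_id == obj_id]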
Date: December 1, 1995
Creator: Huang, C. H.; Munn, J.; Yanny, B.; Kent, S.; Petravick, D.; Pordes, R. et al.
System: The UNT Digital Library
Possible Uses of Animal Databases for Further Statistical Evaluation and Modeling (open access)

Possible Uses of Animal Databases for Further Statistical Evaluation and Modeling

Many studies have been performed in animals which mimic potential exposures of people in order to understand how factors modify radiation dose-response relationships. Cooperative analyses by investigators in different laboratories have a large potential for strengthening the conclusions that can be drawn from individual studies. When information on each animal is combined, then formal tests can be made to demonstrate that apparent consistencies or inconsistencies are statistically significant. Statistical methods must be carefully chosen so that differences between laboratories or studies can be controlled or described as part of the analysis in the interpretation of the conclusions. In this report, the example of bone cancer is used to illustrate the large number of studies of modifying factors available from US and European laboratories.
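A minimal sketch of the kind of formal test mentioned above, under the assumption that incidence counts from two laboratories at a comparable dose level have been pooled (the counts are invented for the example), is a chi-square test of homogeneity:

    # Illustrative test of whether tumor incidence differs between two
    # laboratories at a comparable dose level; the counts below are invented
    # for the example and do not come from the studies discussed in the paper.
    from scipy.stats import chi2_contingency

    # rows: laboratory A, laboratory B; columns: animals with tumor, without tumor
    table = [[12, 88],
             [20, 80]]

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
    # A small p-value would indicate that the apparent difference between the
    # laboratories is statistically significant and should be modeled explicitly.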
Date: October 1, 1995
Creator: Griffith, William C.; Boecker, B. B.; Watson, C. R. & Gerber, G. B.
System: The UNT Digital Library
1995 Department of Energy Records Management Conference (open access)

1995 Department of Energy Records Management Conference

The Department of Energy (DOE) Records Management Group (RMG) provides a forum for DOE and its contractor personnel to review and discuss subjects, issues, and concerns of common interest. This forum includes the exchange of information, the interpretation of requirements, and a dialog to aid in cost-effective management of the DOE Records Management program. Issues addressed by the RMG may result in recommendations for DOE-wide initiatives. Proposed DOE-wide initiatives shall be provided in writing by the RMG Steering Committee to the DOE Records Management Committee and to DOE's Office of ERM Policy, Records, and Reports Management for appropriate action. The membership of the RMG is composed of personnel engaged in Records Management from DOE Headquarters, Field sites, contractors, and other organizations, as appropriate. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
Date: July 1, 1995
Creator: unknown
System: The UNT Digital Library
Communication in support of software sharing and collaborative development (open access)

Communication in support of software sharing and collaborative development

To be successful, software which is shared among several users requires a means of reporting trouble and receiving help. This is even more critical in the case of a collaborative software development effort. The Experimental Physics and Industrial Control System (EPICS) collaboration uses the Internet as its major communication medium. In addition to conventional electronic mail and occasional use of MBONE teleconferencing, a distributed listserver system is used to announce releases, ask for aid, announce the discovery and disposal of bugs, and converse generally about the future development directions of EPICS tools and methods. The EPICS listservers are divided into several subject categories, and since all questions, answers, and announcements are archived for future reference, some statistics can be gleaned from these records. Such statistics and information from the collaborators show that they make use of this system and find it helpful. As a manager, I have found that the system gives reassuring evidence that the collaboration is alive, responsive to calls for aid, and helpful even to those not actively participating in the question and answer activity.
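A minimal sketch of how such statistics might be gleaned from an archived list, assuming the archive is available as a standard Unix mbox file (the file name is hypothetical):

    # Count archived listserver traffic by month to gauge collaboration activity.
    # The mbox path is hypothetical; real EPICS archives may be stored differently.
    import mailbox
    from collections import Counter
    from email.utils import parsedate_tz

    by_month = Counter()
    for msg in mailbox.mbox("tech-talk-archive.mbox"):
        parsed = parsedate_tz(msg.get("Date", ""))
        if parsed:
            year, month = parsed[0], parsed[1]
            by_month[(year, month)] += 1

    for (year, month), count in sorted(by_month.items()):
        print(f"{year}-{month:02d}: {count} messages")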
Date: December 31, 1995
Creator: Knott, M. J.
System: The UNT Digital Library
The role of metadata in managing large environmental science datasets. Proceedings (open access)

The role of metadata in managing large environmental science datasets. Proceedings

The purpose of this workshop was to bring together computer science researchers and environmental sciences data management practitioners to consider the role of metadata in managing large environmental sciences datasets. The objectives included: establishing a common definition of metadata; identifying categories of metadata; defining problems in managing metadata; and defining problems related to linking metadata with primary data.
Date: June 1995
Creator: Melton, R. B.; DeVaney, D. M. & French, J. C.
System: The UNT Digital Library
Electronic Document Management Meets Environmental Restoration Recordkeeping Requirements: A Case Study (open access)

Electronic Document Management Meets Environmental Restoration Recordkeeping Requirements: A Case Study

Efforts at migrating records management for Environmental Restoration (ER) business activities at five Department of Energy sites operated by Lockheed Martin Energy Systems, Inc. are described. The corporate environment, project definition, and recordkeeping requirements are described first. Then an evaluation of electronic document management technologies and of internal and commercially available systems is provided. Finally, the adopted incremental implementation strategy and lessons learned are discussed.
Date: December 31, 1995
Creator: Burnham, S. L.
System: The UNT Digital Library
A comparison of the shielding performances of the AT-400A, Model FL and Model AL-R8 containers (open access)

A comparison of the shielding performances of the AT-400A, Model FL and Model AL-R8 containers

A comparison of the neutron and photon dose rates at different locations on the outside surface of the Model AL-R8, Model FL and the AT-400A containers for a given pit load has been done in order to understand the shielding characteristics of these containers. The Model AL-R8 is not certified for transport and is only used for storage of pits, while the Model FL is a certified Type B pit transportation container. The AT-400A is being developed as a Type B pit storage and transportation container. The W48, W56 and B83 pits were chosen for this study because of their encompassing features with regard to other pits presently being stored. A detailed description of the geometry and materials of these containers and of the neutron and photon emission spectra from the actinide materials present in the pit have been used in the calculations of the total dose rates. The calculations have been done using the three-dimensional, neutron-photon Monte Carlo code MCNP. The results indicate the need for a containment vessel (CV), as is found in the Model FL and AT-400A containers, in order to assure compliance with 10 CFR 71 regulations. The absence of a CV in the AL-R8 container …
Date: April 28, 1995
Creator: Hansen, L. F.
System: The UNT Digital Library
A plug-and-play approach to automated data interpretation: the data interpretation module (DIM) (open access)

A plug-and-play approach to automated data interpretation: the data interpretation module (DIM)

The Contaminant Analysis Automation (CAA) Project's automated analysis laboratory provides a "plug-and-play" reusable infrastructure for many types of environmental assays. As a sample progresses through sample preparation to sample analysis and finally to data interpretation, increasing expertise and judgment are needed at each step. The Data Interpretation Module (DIM) echoes the automation's plug-and-play philosophy as a reusable engine and architecture for handling both the uncertainty and knowledge required for interpreting contaminant sample data. This presentation describes the implementation and performance of the DIM in interpreting polychlorinated biphenyl (PCB) gas chromatograms and shows the DIM architecture's reusability for other applications.
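A hedged sketch of the plug-and-play idea, using invented names rather than the actual DIM interfaces: interpreters register themselves against an assay type, and an engine dispatches sample data to whichever plug-in claims it.

    # Hypothetical plug-in registry illustrating a "plug-and-play" interpretation
    # engine; names and interfaces are invented for this sketch, not taken from DIM.
    from typing import Callable, Dict

    INTERPRETERS: Dict[str, Callable[[dict], str]] = {}


    def register(assay_type: str):
        """Decorator that adds an interpreter for one assay type."""
        def wrap(func: Callable[[dict], str]) -> Callable[[dict], str]:
            INTERPRETERS[assay_type] = func
            return func
        return wrap


    @register("pcb_gc")
    def interpret_pcb_chromatogram(sample: dict) -> str:
        # Toy rule: flag the sample if any peak area exceeds a threshold.
        return "flagged" if max(sample["peak_areas"]) > 100.0 else "clear"


    def interpret(assay_type: str, sample: dict) -> str:
        """Dispatch sample data to the registered interpreter for its assay."""
        return INTERPRETERS[assay_type](sample)


    print(interpret("pcb_gc", {"peak_areas": [12.0, 140.5, 33.1]}))  # -> flagged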
Date: December 1995
Creator: Hartog, B. K. D.; Elling, J. W. & Mniszewski, S. M.
System: The UNT Digital Library
The Fernald wet records recovery project: A case history (open access)

The Fernald wet records recovery project: A case history

This paper discusses a project performed to recover wet records discovered in January 1995 at the Fernald Environmental Management Project (FEMP). This paper discusses the emergency and record recovery phases of the project, the technical options considered for records recovery, and special measures which were required due to radiological contamination of the records. Also, the root causes and lessons learned from the incident, and path forward for future records management operations at Fernald, are discussed.
Date: June 22, 1995
Creator: Sterling, Harry J.; Devir, Brian R.; Hawley, Robert A. & Freesmeyer, Mary T.
System: The UNT Digital Library
The Global Historical Climatology Network: A preview of Version 2 (open access)

The Global Historical Climatology Network: A preview of Version 2

Instruments that could reliably measure temperature, precipitation, and pressure were developed by the late 17th and early 18th centuries. It has been estimated that weather records have been collected at one to two hundred thousand locations since those first instruments were placed in the field. Numerous applications, from global change studies to climate impact assessments to general circulation models, make use of such historical records. Given their importance, it is unfortunate that one cannot approach a single researcher or data center to acquire all of the records for all of the stations, or even a large portion of them. In 1990, the Carbon Dioxide Information Analysis Center (CDIAC), the National Climatic Data Center (NCDC), and the World Meteorological Organization (WMO) undertook a collaborative effort aimed at solving this problem. The initiative completed its first data product, known as the Global Historical Climatology Network (GHCN) version 1.0, in 1992. This data base contains quality-controlled monthly climatic time series from 6,039 temperature, 7,533 precipitation, 1,883 sea level pressure, and 1,873 station pressure stations located on global land areas. This paper describes the data and methods being used to compile GHCN version 2.0, an expanded and improved version of its predecessor. Planned for …
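As a hedged illustration of working with monthly station time series of the kind GHCN compiles, the sketch below reduces a station's monthly means to annual means. The file layout is invented for the example and is not the GHCN distribution format.

    # Compute annual mean temperatures from monthly station records.
    # The CSV layout (station_id, year, month, temp_c) is a hypothetical example,
    # not the actual GHCN format.
    import csv
    from collections import defaultdict

    monthly = defaultdict(list)          # (station_id, year) -> [monthly means]
    with open("station_monthly.csv", newline="") as f:
        for row in csv.DictReader(f):
            key = (row["station_id"], int(row["year"]))
            monthly[key].append(float(row["temp_c"]))

    for (station, year), temps in sorted(monthly.items()):
        if len(temps) == 12:             # only complete years
            print(f"{station} {year}: {sum(temps) / 12.0:.2f} C")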
Date: February 1, 1995
Creator: Vose, R. S.; Schmoyer, R. L.; Peterson, T. C. & Eischeid, J. K.
System: The UNT Digital Library
Public/private key certification authority and key distribution. Draft (open access)

Public/private key certification authority and key distribution. Draft

Traditional encryption, which protects messages from prying eyes, has been used for many decades. The present concepts of encryption are built from that heritage. Utilization of modern software-based encryption techniques implies much more than simply converting files to an unreadable form. Ubiquitous use of computers and advances in encryption technology coupled with the use of wide-area networking completely changed the reasons for utilizing encryption technology. The technology demands a new and extensive infrastructure to support these functions. Full understanding of these functions, their utility and value, and the need for an infrastructure, takes extensive exposure to the new paradigm. This paper addresses issues surrounding the establishment and operation of a key management system (i.e., certification authority) that is essential to the successful implementation and widespread use of encryption.
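A minimal sketch of the key generation and certificate issuance that a certification authority performs, written with the third-party Python "cryptography" package; it illustrates the general mechanism only, not the specific system discussed in the paper.

    # Generate RSA key pairs and have an illustrative CA issue a certificate
    # binding a subject name to a public key. This sketches the general
    # certification-authority mechanism, not the paper's specific system.
    import datetime
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa

    ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    ca_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example CA")])
    user_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "alice@example.org")])

    now = datetime.datetime.utcnow()
    cert = (
        x509.CertificateBuilder()
        .subject_name(user_name)
        .issuer_name(ca_name)
        .public_key(user_key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=365))
        .sign(ca_key, hashes.SHA256())      # the CA vouches for the binding
    )
    print(cert.subject, cert.issuer)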
Date: September 25, 1995
Creator: Long, J.P.; Christensen, M.J.; Sturtevant, A.P. & Johnston, W.E.
System: The UNT Digital Library
The founding of CEBAF, 1979 to 1987 (open access)

The founding of CEBAF, 1979 to 1987

In early 1979 a group of physicists assembled at the University of Virginia (UVa) for a conference entitled ''Future Possibilities for Electron Accelerators.'' In the audience sat an organizer of the conference, UVa professor James McCarthy. While listening to talks by Gregory Loew of the Stanford Linear Accelerator Center (SLAC) and Roger Servranckx of the University of Saskatchewan, McCarthy got very excited. Both discussed new approaches to producing an almost continuous stream of electrons with improved designs for pulse stretcher rings that could be built within a reasonable budget. McCarthy saw the possibility of realizing a dream. This dream had its origins in the 1950s, when Robert Hofstadter, McCarthy's thesis advisor, made groundbreaking discoveries at Stanford's High Energy Physics Laboratory (HEPL) about the internal structure of nuclei and nucleons. For these experiments Hofstadter used Mark III, the most advanced in a series of electron accelerators designed by William Hansen, who pioneered methods of high frequency acceleration of electrons. The work by Hofstadter and Hansen led to two productive lines of inquiry. One group of researchers studied particle production using electrons at higher energies, which led to the construction in the 1960s of SLAC at Stanford. Another group of researchers, which …
Date: February 1, 1995
Creator: Westfall, C.
System: The UNT Digital Library
Waste site characterization through digital analysis of historical aerial photographs at Los Alamos National Laboratory and Eglin Air Force Base (open access)

Waste site characterization through digital analysis of historical aerial photographs at Los Alamos National Laboratory and Eglin Air Force Base

Historical aerial photographs are used to provide a physical history and preliminary mapping information for characterizing hazardous waste sites at Los Alamos National Laboratory and Eglin Air Force Base. The examples cited show how imagery was used to accurately locate and identify previous activities at a site, monitor changes that occurred over time, and document the observable evidence of such activities today. The methodology demonstrates how historical imagery (along with any other pertinent data) can be used in the characterization of past environmental damage.
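A hedged sketch of the kind of change detection that two co-registered digital images allow, as a generic illustration rather than the methodology applied at these sites:

    # Simple change detection between two co-registered grayscale aerial images:
    # pixels whose brightness changed by more than a threshold are flagged.
    # This is a generic illustration, not the analysis workflow used in the paper.
    import numpy as np

    def change_mask(earlier: np.ndarray, later: np.ndarray, threshold: float = 30.0) -> np.ndarray:
        """Return a boolean mask of pixels that changed appreciably between dates."""
        diff = np.abs(later.astype(float) - earlier.astype(float))
        return diff > threshold

    # Toy 4x4 "images"; in practice these would be scanned, georegistered photos.
    img_1955 = np.full((4, 4), 120.0)
    img_1975 = img_1955.copy()
    img_1975[1:3, 1:3] = 40.0            # a disturbed area appears between dates

    print(change_mask(img_1955, img_1975).sum(), "changed pixels")   # -> 4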
Date: May 1, 1995
Creator: Van Eeckhout, E.; Pope, P.; Wells, B.; Rofer, C. & Martin, B.
System: The UNT Digital Library
A software perspective of environmental data quality (open access)

A software perspective of environmental data quality

Because of the large amount of complex data in environmental projects, particularly large decontamination and decommissioning projects, the quality of the data has a profound impact on the success and cost of the mission. In every phase of the life cycle of the project, including regulatory intervention and legal proceedings, maintaining the quality of data and presenting data in a timely and meaningful manner are critical. In this paper, a systemic view of data quality management from a software engineering perspective is presented. A method of evaluation evolves from this view. This method complements the principles of the data quality objective. When graded adequately, the method of evaluation establishes a paradigm for ensuring data quality for new and renewed projects. This paper also demonstrates that incorporating good practices of software engineering into the data management process leads to continuous improvement of data quality.
Date: July 1, 1995
Creator: Banerjee, B.
System: The UNT Digital Library
Maintaining SCALE as a reliable computational system for criticality safety analysis (open access)

Maintaining SCALE as a reliable computational system for criticality safety analysis

Accurate and reliable computational methods are essential for nuclear criticality safety analyses. The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer code system was originally developed at Oak Ridge National Laboratory (ORNL) to enable users to easily set up and perform criticality safety analyses, as well as shielding, depletion, and heat transfer analyses. Over the fifteen-year life of SCALE, the mainstay of the system has been the criticality safety analysis sequences that have featured the KENO-IV and KENO-V.A Monte Carlo codes and the XSDRNPM one-dimensional discrete-ordinates code. The criticality safety analysis sequences provide automated material and problem-dependent resonance processing for each criticality calculation. This report details configuration management which is essential because SCALE consists of more than 25 computer codes (referred to as modules) that share libraries of commonly used subroutines. Changes to a single subroutine in some cases affect almost every module in SCALE! Controlled access to program source and executables and accurate documentation of modifications are essential to maintaining SCALE as a reliable code system. The modules and subroutine libraries in SCALE are programmed by a staff of approximately ten Code Managers. The SCALE Software Coordinator maintains the SCALE system and is the only person who modifies the …
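A hedged sketch of the configuration-management concern described above: given a map of which modules link which shared subroutine libraries (the map below is invented for illustration), one can list every module affected when a single library routine changes.

    # Hypothetical dependency map from shared subroutine libraries to the modules
    # that link them; the library names and the map itself are invented for
    # illustration and are not the actual SCALE configuration.
    LIBRARY_USERS = {
        "xsec_lib": ["KENO-V.a", "XSDRNPM"],
        "geom_lib": ["KENO-V.a", "KENO-IV"],
        "io_lib":   ["KENO-V.a", "XSDRNPM", "KENO-IV"],
    }

    def affected_modules(changed_library: str) -> list:
        """Modules that must be rebuilt and revalidated after a library change."""
        return sorted(LIBRARY_USERS.get(changed_library, []))

    print(affected_modules("io_lib"))
    # -> ['KENO-IV', 'KENO-V.a', 'XSDRNPM']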
Date: April 1, 1995
Creator: Bowmann, S. M.; Parks, C. V. & Martin, S. K.
System: The UNT Digital Library
The DT-19 Container Design, Impact Testing and Analysis (open access)

The DT-19 Container Design, Impact Testing and Analysis

Containers used by the Department of Energy (DOE) for the transport of radioactive materials, including components and special assemblies, are required to meet certain impact and thermal requirements that are demonstrated by performance or compliance testing, analytical procedures, or a combination of both. Title 49 of the Code of Federal Regulations (CFR), Section 173.7(d), stipulates that "Packages (containers) made by or under direction of the US DOE may be used for the transportation of radioactive materials when evaluated, approved, and certified by the DOE against packaging standards equivalent to those specified in 10 CFR Part 71." This paper describes the details of the design, analysis, and testing efforts undertaken to improve the overall structural and thermal integrity of the DT-19 shipping container.
Date: December 1, 1995
Creator: Aramayo, G. A. & Goins, M. L.
System: The UNT Digital Library
Flood protection for the Kansas City Bannister Federal Complex (open access)

Flood protection for the Kansas City Bannister Federal Complex

The Bannister Federal Complex is bordered on the east by the Blue River and on the south by Indian Creek. After a flood in 1961 and several near-miss floods, flood protection has been installed. The protection consists of 2,916 feet of concrete flood walls, 8,769 feet of levee, five rolling gates, four stoplog gaps, one hinged pedestrian gate, and one sandbag gap. The flood walls are over 14 feet tall. Construction was started on August 3, 1992 and was completed in early 1995. Architectural treatment was incorporated in the flood walls as well as landscaping to enhance the appearance of the flood protection.
Date: August 1, 1995
Creator: Nolan, J.J.; Williams, R.H. & Betzen, G.A.
System: The UNT Digital Library
The collection and analysis of transient test data using the mobile instrumentation data acquisition system (MIDAS) (open access)

The collection and analysis of transient test data using the mobile instrumentation data acquisition system (MIDAS)

Packages designed to transport radioactive materials are required to survive exposure to environments defined in the Code of Federal Regulations. Cask designers can investigate package designs through structural and thermal testing of full-scale packages, components, or representative models. The acquisition of reliable response data from instrumentation measurement devices is an essential part of this testing activity. Sandia National Laboratories, under the sponsorship of the US Department of Energy (DOE), has developed the Mobile Instrumentation Data Acquisition System (MIDAS) dedicated to the collection and processing of structural and thermal data from regulatory tests.
Date: December 31, 1995
Creator: Uncapher, W. L. & Arviso, M.
System: The UNT Digital Library
Software engineering methods and standards used in the Sloan Digital Sky Survey (open access)

Software engineering methods and standards used in the Sloan Digital Sky Survey

We present an integrated science software development environment, code maintenance and support system for the Sloan Digital Sky Survey (SDSS) now being actively used throughout the collaboration. The SDSS is a collaboration between the Fermi National Accelerator Laboratory, the Institute for Advanced Study, the Japan Participation Group, Johns Hopkins University, Princeton University, the United States Naval Observatory, the University of Chicago, and the University of Washington. The SDSS will produce a five-color imaging survey of 1/4 of the sky about the north galactic cap and image 10{sup 8} stars, 10{sup 8} galaxies, and 10{sup 5} quasars. Spectra will be obtained for 10{sup 6} galaxies and 10{sup 5} quasars as well. The survey will utilize a dedicated 2.5 meter telescope at the Apache Point Observatory in New Mexico. Its imaging camera will hold 54 Charge-Coupled Devices (CCDs). The SDSS will take five years to complete, acquiring well over 12 TB of data.
Date: April 1, 1995
Creator: Petravick, D.; Berman, E.; Gurbani, V.; Nicinski, T.; Pordes, R.; Rechenmacher, R. et al.
System: The UNT Digital Library
A distributed atomic physics database and modeling system for plasma spectroscopy (open access)

A distributed atomic physics database and modeling system for plasma spectroscopy

We are undertaking to develop a set of computational capabilities which will facilitate the access, manipulation, and understanding of atomic data in calculations of x-ray spectral modeling. In this present limited description we will emphasize the objectives for this work, the design philosophy, and aspects of the atomic database, as a more complete description of this work is available. The project is referred to as the Plasma Spectroscopy Initiative; the computing environment is called PSI, or the "PSI shell," since the primary interface resembles a UNIX shell window. The working group consists of researchers in the fields of x-ray plasma spectroscopy, atomic physics, plasma diagnostics, line shape theory, astrophysics, and computer science. To date, our focus has been to develop the software foundations, including the atomic physics database, and to apply the existing capabilities to a range of working problems. These problems have been chosen in part to exercise the overall design and implementation of the shell. For successful implementation the final design must have great flexibility, since our goal is not simply to satisfy our interests but to provide a tool of general use to the community.
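As a hedged sketch of the sort of query a spectroscopic modeling environment must support, atomic level data might be held in a small relational store and queried by ion. The table layout and the single example row are hypothetical, not the PSI database design.

    # Hypothetical atomic-level store queried by element and charge state.
    # The schema and the example row are invented for illustration and are
    # not the PSI shell's actual database design.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE levels (element TEXT, charge INTEGER, "
        "config TEXT, energy_ev REAL)"
    )
    conn.execute(
        "INSERT INTO levels VALUES ('Fe', 16, '2p5 3s', 725.0)"  # illustrative value
    )

    for config, energy in conn.execute(
        "SELECT config, energy_ev FROM levels "
        "WHERE element = ? AND charge = ? ORDER BY energy_ev",
        ("Fe", 16),
    ):
        print(config, energy)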
Date: August 1, 1995
Creator: Nash, J. K.; Liedahl, D.; Chen, M. H.; Iglesias, C. A.; Lee, R. W. & Salter, J. M.
System: The UNT Digital Library
Fuel conditioning facility material accountancy (open access)

Fuel conditioning facility material accountancy

The operation of the Fuel Conditioning Facility (FCF) is based on the electrometallurgical processing of spent metallic reactor fuel. It differs significantly, therefore, from traditional PUREX process facilities in both processing technology and safeguards implications. For example, the fissile material is processed in FCF only in batches and is transferred within the facility only as solid, well-characterized items; there are no liquid streams containing fissile material within the facility, nor entering or leaving the facility. The analysis of a single batch lends itself also to an analytical relationship between the safeguards criteria, such as alarm limit, detection probability, and maximum significant amount of fissile material, and the accounting system's performance, as it is reflected in the variance associated with the estimate of the inventory difference. This relation, together with the sensitivity of the inventory difference to the uncertainties in the measurements, allows a thorough evaluation of the power of the accounting system. The system for the accountancy of the fissile material in the FCF has two main components: a system to gather and store information during the operation of the facility, and a system to interpret this information with regard to meeting safeguards criteria. These are described and the precision …
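A hedged sketch of the analytical relationship mentioned above, under standard material-balance assumptions (normally distributed inventory difference; the numerical values are illustrative, not FCF data): the alarm limit follows from the inventory-difference variance, and the probability of detecting a loss of a given size follows from both.

    # Standard material-balance statistics relating the inventory-difference (ID)
    # variance to an alarm limit and a detection probability. The sigma and loss
    # values are illustrative only and are not taken from FCF operating data.
    from math import erf, sqrt

    def normal_cdf(x: float) -> float:
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    sigma_id = 0.10          # standard deviation of the ID estimate, kg (assumed)
    loss = 0.50              # postulated loss to be detected, kg (assumed)

    # Alarm limit chosen so a zero-loss balance exceeds it only 5% of the time.
    z = 1.645                # one-sided 95th percentile of the standard normal
    alarm_limit = z * sigma_id

    # Probability that an ID at least as large as the alarm limit is observed
    # when the postulated loss has actually occurred.
    detection_prob = 1.0 - normal_cdf((alarm_limit - loss) / sigma_id)

    print(f"alarm limit  = {alarm_limit:.3f} kg")
    print(f"P(detection) = {detection_prob:.3f}")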
Date: August 1, 1995
Creator: Yacout, A. M.; Bucher, R. G. & Orechwa, Y.
System: The UNT Digital Library
Evaluation of ARAC's participation in a long-range transport experiment (open access)

Evaluation of ARAC's participation in a long-range transport experiment

The 1994 European Tracer Experiment (ETEX) involved two releases of inert tracer gas in western France, allowing subsequent detection at many locations across Europe. Twenty-four operational and research facilities from 20 countries made predictions of the motion of the released plume and the resulting concentrations detected at the sampler locations. This paper describes participation by the Lawrence Livermore National Laboratory's Atmospheric Release Advisory Capability (ARAC) in ETEX. In its role as a real-time emergency response center, ARAC operates a suite of numerical models which simulate the advection and diffusion of airborne releases, and which calculate the estimated downwind concentration of the released material. The models and procedures used by ARAC to participate in ETEX were essentially the same as those which would be used to respond to a release at any previously unspecified location.
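As a hedged illustration of the kind of downwind concentration estimate such models produce, the sketch below evaluates a textbook Gaussian plume; it is not the ARAC model suite, and all parameter values are invented for the example.

    # Textbook Gaussian plume estimate of ground-level concentration downwind of a
    # continuous point release. This is a generic illustration, not the numerical
    # models ARAC operates; all parameter values are invented for the example.
    from math import exp, pi

    def plume_concentration(q, u, sigma_y, sigma_z, y, z, h):
        """Steady-state concentration (g/m^3) at crosswind offset y, height z."""
        lateral = exp(-y**2 / (2.0 * sigma_y**2))
        vertical = exp(-(z - h)**2 / (2.0 * sigma_z**2)) + \
                   exp(-(z + h)**2 / (2.0 * sigma_z**2))   # ground reflection term
        return (q / (2.0 * pi * u * sigma_y * sigma_z)) * lateral * vertical

    # Illustrative values: 1 g/s release, 5 m/s wind, dispersion coefficients
    # corresponding to some downwind distance, receptor on the plume centerline.
    c = plume_concentration(q=1.0, u=5.0, sigma_y=80.0, sigma_z=40.0,
                            y=0.0, z=0.0, h=10.0)
    print(f"{c:.2e} g/m^3")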
Date: August 1, 1995
Creator: Pace, J. C.; Pobanz, B. M.; Foster, C. S.; Baskett, R. L.; Vogt, P. J. & Schalk, W. W., III
System: The UNT Digital Library