
Method and Apparatus for Remote Delivery and Manipulation of a Miniature Tool Adjacent a Work Piece in a Restricted Space (open access)

Method and Apparatus for Remote Delivery and Manipulation of a Miniature Tool Adjacent a Work Piece in a Restricted Space

An apparatus for remote delivery and manipulation of a miniature tool adjacent a work piece in a restricted space includes a tool carrier, a carriage for manipulating the tool carrier relative to the work piece, a first actuator for operating the carriage, and an optional remote secondary operating actuator for operating the first actuator.
Date: August 10, 2004
Creator: Sale, Christopher H. & Kaltenbaugh, Daniel R.
Object Type: Patent
System: The UNT Digital Library
Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration (open access)

Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: remote sensing for carbon analysis; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and …
Date: July 10, 2004
Creator: Stanley, Bill; Brown, Sandra; Gonzalez, Patrick; Kant, Zoe; Tiepolo, Gilberto; Sabido, Wilber et al.
Object Type: Report
System: The UNT Digital Library
Environmental Transport Input Parameters for the Biosphere Model (open access)

Environmental Transport Input Parameters for the Biosphere Model

This analysis report is one of the technical reports documenting the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment for the license application (TSPA-LA) for the geologic repository at Yucca Mountain. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows relationships among the reports developed for biosphere modeling and biosphere abstraction products for the TSPA-LA, as identified in the ''Technical Work Plan for Biosphere Modeling and Expert Support'' (BSC 2004 [DIRS 169573]) (TWP). This figure provides an understanding of how this report contributes to biosphere modeling in support of the license application (LA). This report is one of the five reports that develop input parameter values for the biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes the conceptual model and the mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters. The output of this report is used as direct input in the ''Nominal Performance Biosphere Dose Conversion Factor Analysis'' and in the ''Disruptive Event Biosphere Dose Conversion Factor Analysis'' that calculate the …
Date: September 10, 2004
Creator: Wasiolek, M.
Object Type: Report
System: The UNT Digital Library
Verification and Validation using DAKOTA via the DakTools scripts (open access)

Verification and Validation using DAKOTA via the DakTools scripts

Several of the intermediate capabilities being developed by the AX V&V program may be helpful in other ways. This paper describes a new Python interface to one such tool, DAKOTA (a parallel optimizing controller from Sandia National Laboratories), and the resulting simpler set of operations required to run and analyze sets of calculations using any LCC computational platform.
Date: December 10, 2004
Creator: Brandon, S & Tipton, P
Object Type: Article
System: The UNT Digital Library
THE COMMISSIONING PLAN FOR THE SPALLATION NEUTRON SOURCE RING AND TRANSPORT LINES. (open access)

THE COMMISSIONING PLAN FOR THE SPALLATION NEUTRON SOURCE RING AND TRANSPORT LINES.

The Spallation Neutron Source (SNS) accelerator systems will provide a 1 GeV, 1.44 MW proton beam to a liquid mercury target for neutron production. In order to satisfy the accelerator systems' portion of the Critical Decision 4 (CD-4) commissioning goal (which marks the completion of the construction phase of the project), a beam pulse with intensity greater than 1 x 10{sup 13} protons must be accumulated in the ring, extracted in a single turn and delivered to the target. A commissioning plan has been formulated for bringing into operation and establishing nominal operating conditions for the various ring and transport line subsystems as well as for establishing beam conditions and parameters which meet the commissioning goal.
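A rough consistency check, sketched below, relates the design parameters quoted above to the CD-4 goal; the 60 Hz pulse repetition rate used here is the SNS design value and is an assumption not stated in the abstract.

# Rough consistency check relating the quoted SNS design parameters.
# The 60 Hz repetition rate is the SNS design value (an assumption here;
# it is not stated in the abstract above).
E_PROTON_J = 1.0e9 * 1.602e-19   # 1 GeV kinetic energy per proton, in joules
BEAM_POWER_W = 1.44e6            # 1.44 MW design beam power on target
REP_RATE_HZ = 60.0               # assumed design pulse repetition rate

protons_per_pulse = BEAM_POWER_W / E_PROTON_J / REP_RATE_HZ
cd4_goal = 1.0e13                # CD-4 single-pulse intensity goal

print(f"design intensity per pulse : {protons_per_pulse:.2e} protons")
print(f"CD-4 goal / design pulse   : {cd4_goal / protons_per_pulse:.1%}")
# Roughly 1.5e14 protons per pulse at full design power, so the CD-4 goal of
# 1e13 protons corresponds to only a few percent of the design intensity.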
Date: March 10, 2004
Creator: Raparia, D.; Blaskiewicz, M.; Lee, Y. Y.; Wei, J. et al.
Object Type: Article
System: The UNT Digital Library
COMPUTATIONAL STUDIES OF COLLECTIVE BEAM DYNAMICS IN HIGH INTENSITY RINGS. (open access)

COMPUTATIONAL STUDIES OF COLLECTIVE BEAM DYNAMICS IN HIGH INTENSITY RINGS.

Collective interactions of the beam with itself and with its periodic lattice surroundings in high intensity accelerator rings, such as PSR and SNS, can lead to beam growth, halo generation, and losses. These interactions also provide a rich source of dynamic phenomena for analytical, computational, and experimental study. With continuing advances in model development and computer power, a number of sophisticated codes are now capable of detailed realistic studies of collective beam dynamics in rings. We concentrate here on a computational examination of high intensity beam dynamics in SNS. These studies include the effects of the accelerator lattice, space charge, impedances, losses and collimation, and magnet errors.
Date: March 10, 2004
Creator: Holmes, J. A.; Cousineau, S.; Danilov, V.; Henderson, S.; Shishlo, A. & Fedotov, A.
Object Type: Article
System: The UNT Digital Library
Performance Evaluation of Industrial Hygiene Air Monitoring Sensors (open access)

Performance Evaluation of Industrial Hygiene Air Monitoring Sensors

Tests were performed to evaluate the accuracy, precision and response time of certain commercially available handheld toxic gas monitors. The tests were conducted by PNNL in the Chemical Chamber Test Facility for CH2MHill Hanford Company. The instruments were tested with a set of dilute test gases including ammonia, nitrous oxide, and a mixture of organic vapors (acetone, benzene, ethanol, hexane, toluene and xylene). The certified gases were diluted to concentrations that may be encountered in the outdoor environment above the underground tank farms containing radioactive waste at the U.S. Department of Energy's Hanford site, near Richland, Washington. The challenge concentrations are near the lower limits of instrument sensitivity and response time. The performance test simulations were designed to look at how the instruments respond to changes in test gas concentrations that are similar to field conditions.
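As an illustration of the three figures of merit named above, the sketch below computes accuracy (relative bias), precision (relative scatter), and a time-to-90%-response from a step-challenge time series; the specific definitions and procedures used in the PNNL tests may differ, and all numbers are placeholders.

import numpy as np

def evaluate_monitor(readings_ppm: np.ndarray, challenge_ppm: float, dt_s: float = 1.0):
    """Illustrative figures of merit from a step-challenge test (placeholder definitions)."""
    steady = readings_ppm[-60:]                                  # last minute taken as steady state
    accuracy = (steady.mean() - challenge_ppm) / challenge_ppm   # relative bias vs. challenge gas
    precision = steady.std() / steady.mean()                     # relative scatter at steady state
    t90_s = float(np.argmax(readings_ppm >= 0.9 * steady.mean())) * dt_s  # time to 90% of final reading
    return accuracy, precision, t90_s

# Synthetic example: a monitor approaching a 25 ppm ammonia challenge.
t = np.arange(0, 300.0)
readings = 24.0 * (1.0 - np.exp(-t / 30.0)) + np.random.default_rng(1).normal(0.0, 0.3, t.size)
print(evaluate_monitor(readings, challenge_ppm=25.0))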
Date: December 10, 2004
Creator: Maughan, A D.; Glissmeyer, John A. & Birnbaum, Jerome C.
Object Type: Report
System: The UNT Digital Library
On improving linear solver performance: a block variant of GMRES (open access)

On improving linear solver performance: a block variant of GMRES

The increasing gap between processor performance and memory access time warrants the re-examination of data movement in iterative linear solver algorithms. For this reason, we explore and establish the feasibility of modifying a standard iterative linear solver algorithm in a manner that reduces the movement of data through memory. In particular, we present an alternative to the restarted GMRES algorithm for solving a single right-hand side linear system Ax = b based on solving the block linear system AX = B. Algorithm performance, i.e. time to solution, is improved by using the matrix A in operations on groups of vectors. Experimental results demonstrate the importance of implementation choices on data movement as well as the effectiveness of the new method on a variety of problems from different application areas.
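A minimal sketch of the data-movement idea behind the block approach (not the paper's algorithm): applying A to a group of vectors in one matrix-matrix product streams A through memory once per group rather than once per vector.

import numpy as np

rng = np.random.default_rng(0)
n, k = 2000, 8                       # illustrative problem size and block width
A = rng.standard_normal((n, n))
V = rng.standard_normal((n, k))      # a block of k vectors

# One vector at a time: A is read from memory k times.
Y_single = np.column_stack([A @ V[:, j] for j in range(k)])

# Whole block at once: A is read once and applied to all k vectors together.
Y_block = A @ V

assert np.allclose(Y_single, Y_block)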
Date: May 10, 2004
Creator: Baker, A H; Dennis, J M & Jessup, E R
Object Type: Article
System: The UNT Digital Library
Potlining Additives (open access)

Potlining Additives

In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.
Date: August 10, 2004
Creator: Keller, Rudolf
Object Type: Report
System: The UNT Digital Library
Longitudinal single-bunch instabilities in the NLC main damping rings (open access)

Longitudinal single-bunch instabilities in the NLC main damping rings

Because of tight requirements on beam quality, longitudinal single-bunch instabilities are a serious concern for the damping rings of the next generation of linear colliders. Unlike multi-bunch instabilities, they cannot be damped using feedback systems and need to be avoided altogether. We present an analysis of these instabilities for the current Feb. 03 NLC main damping ring design, with attention paid to coherent synchrotron radiation and vacuum chamber effects, the latter including the main components (RF cavities, BPMs, and resistive wall). The study is carried out by solving the Vlasov-Fokker-Planck equation for the longitudinal motion numerically. Comparison is made, whenever possible, with linear theory. We find that collective effects are dominated by coherent synchrotron radiation and estimate the instability threshold to be safely above 6 times the design current.
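For reference, a standard dimensionless form of the longitudinal Vlasov-Fokker-Planck equation solved in studies of this kind is sketched below; the notation is generic and need not match the report's.

\frac{\partial f}{\partial \theta} + p\,\frac{\partial f}{\partial q}
  - \left[\, q + F_{\mathrm{coll}}(q,\theta)\,\right]\frac{\partial f}{\partial p}
  = \beta\,\frac{\partial}{\partial p}\!\left( p\, f + \frac{\partial f}{\partial p} \right)

Here f(q, p, theta) is the longitudinal phase-space density, q and p are the scaled position and energy deviation, theta is the synchrotron phase advance, F_coll collects the collective forces (coherent synchrotron radiation and vacuum-chamber wake fields), and beta is a damping parameter set by the radiation damping time; the right-hand side models radiation damping and quantum-excitation diffusion.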
Date: May 10, 2004
Creator: Venturini, Marco
Object Type: Report
System: The UNT Digital Library
Inhalation Exposure Input Parameters for the Biosphere Model (open access)

Inhalation Exposure Input Parameters for the Biosphere Model

This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total …
Date: September 10, 2004
Creator: Rautenstrauch, K.
Object Type: Report
System: The UNT Digital Library
Comparison of Average Transport and Dispersion Among a Gaussian Model, a Two-Dimensional Model and a Three-Dimensional Model (open access)

Comparison of Average Transport and Dispersion Among a Gaussian Model, a Two-Dimensional Model and a Three-Dimensional Model

The Nuclear Regulatory Commission uses MACCS2 (MELCOR Accident Consequence Code System, Version 2) for regulatory purposes such as planning for emergencies and cost-benefit analyses. MACCS2 uses a straight-line Gaussian model for atmospheric transport and dispersion. This model has been criticized as being overly simplistic, although only expected values of metrics of interest are used in the regulatory arena. To test the assumption that averaging numerous weather results adequately compensates for the loss of structure in the meteorology that occurs away from the point of release, average MACCS2 results have been compared with average results from a state-of-the-art, three-dimensional LODI (Lagrangian Operational Dispersion Integrator)/ADAPT (Atmospheric Data Assimilation and Parameterization Technique) model and a Lagrangian-trajectory, Gaussian puff transport and dispersion model from RASCAL (Radiological Assessment System for Consequence Analysis). The weather sample included 610 weather trials representing conditions for a hypothetical release at the Central Facility of the Department of Energy's Atmospheric Radiation Measurement site. The values compared were average ground concentrations and average surface-level air concentrations at several distances out to 100 miles (160.9 km) from the assumed release site.
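For context, the textbook straight-line Gaussian plume formula underlying models of this class is shown below; MACCS2's specific parameterization and reflection treatment may differ.

\chi(x, y, z) = \frac{Q}{2\pi u\, \sigma_y(x)\, \sigma_z(x)}
  \exp\!\left( -\frac{y^2}{2\sigma_y^2} \right)
  \left[ \exp\!\left( -\frac{(z-H)^2}{2\sigma_z^2} \right)
       + \exp\!\left( -\frac{(z+H)^2}{2\sigma_z^2} \right) \right]

Here chi is the time-averaged air concentration, Q the release rate, u the wind speed, H the effective release height, and sigma_y(x) and sigma_z(x) the crosswind and vertical dispersion parameters, which grow with downwind distance x according to the atmospheric stability class.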
Date: May 10, 2004
Creator: Mitchell, J. A.; Molenkamp, C. R.; Bixler, N. E.; Morrow, C. W. & Ramsdell, J. V. Jr.
Object Type: Article
System: The UNT Digital Library
Proposed Holistic Strategy for the Closure of F-Area, A Large Nuclear Industrial Complex at the Savannah River Site, South Carolina (open access)

Proposed Holistic Strategy for the Closure of F-Area, A Large Nuclear Industrial Complex at the Savannah River Site, South Carolina

F-Area is a large nuclear complex located near the center of the Department of Energy's (DOE's) Savannah River Site in South Carolina. The present closure strategy for F-Area is based on established SRS protocol for a site-specific, graded approach to deactivation and decommissioning. Uncontaminated facilities will be closed under the National Environmental Policy Act (NEPA). Facilities requiring removal or in-situ disposition of residual chemical and/or radiological inventories will be decommissioned under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The F-Area Tank Farm, which is permitted under the Clean Water Act, will be closed in accordance with an industrial wastewater closure plan. F-Area closure will also involve the near- and long-term remediation of contaminated soil and groundwater resources. The proposed holistic F-Area closure strategy would enhance the existing project-specific SRS closure protocol by incorporating a comprehensive area-wide groundwater modeling tool, or Composite Analysis. The use of this methodology would allow for the assessment of the relative impacts of individual projects, as well as the cumulative effect of all F-Area closure actions, on area groundwater resources. Other critical elements of the proposed strategy include (i) the consistent use of site-specific Risk Assessments (RAs) and Performance Assessments (PAs), (ii) the closer …
Date: February 10, 2004
Creator: Shedrow, C. B.
Object Type: Article
System: The UNT Digital Library
Providing a Complete Menu: Using Competitive Usability in a Home Page Usability Study (open access)

Providing a Complete Menu: Using Competitive Usability in a Home Page Usability Study

This article presents results from the task-based testing and focus-group portions of a usability study of the University of North Texas Libraries' website and three other academic library websites. The study examined the effects of multiple design elements and styles on participants' results and provides additional insight into user preferences.
Date: December 10, 2004
Creator: Thomsett-Scott, Beth
Object Type: Article
System: The UNT Digital Library
Lithophysal Rock Mass Mechanical Properties of the Repository Host Horizon (open access)

Lithophysal Rock Mass Mechanical Properties of the Repository Host Horizon

The purpose of this calculation is to develop estimates of key mechanical properties for the lithophysal rock masses of the Topopah Spring Tuff (Tpt) within the repository host horizon, including their uncertainties and spatial variability. The mechanical properties to be characterized include an elastic parameter, Young's modulus, and a strength parameter, uniaxial compressive strength. Since lithophysal porosity is used as a surrogate property to develop the distributions of the mechanical properties, an estimate of the distribution of lithophysal porosity is also developed. The resulting characterizations of rock parameters are important for supporting the subsurface design, developing the preclosure safety analysis, and assessing the postclosure performance of the repository (e.g., drift degradation and modeling of rockfall impacts on engineered barrier system components).
Date: November 10, 2004
Creator: Rigby, D.
Object Type: Report
System: The UNT Digital Library
Temporary Losses of Highway Capacity and Impacts on Performance: Phase 2 (open access)

Temporary Losses of Highway Capacity and Impacts on Performance: Phase 2

Traffic congestion and its impacts significantly affect the nation's economic performance and the public's quality of life. In most urban areas, travel demand routinely exceeds highway capacity during peak periods. In addition, events such as crashes, vehicle breakdowns, work zones, adverse weather, railroad crossings, large trucks loading/unloading in urban areas, and other factors such as toll collection facilities and sub-optimal signal timing cause temporary capacity losses, often worsening the conditions on already congested highway networks. The impacts of these temporary capacity losses include delay, reduced mobility, and reduced reliability of the highway system. They can also cause drivers to re-route or reschedule trips. Such information is vital to formulating sound public policies for the highway infrastructure and its operation. In response to this need, Oak Ridge National Laboratory, sponsored by the Federal Highway Administration (FHWA), made an initial attempt to provide nationwide estimates of the capacity losses and delay caused by temporary capacity-reducing events (Chin et al. 2002). This study, called the Temporary Loss of Capacity (TLC) study, estimated capacity loss and delay on freeways and principal arterials resulting from fatal and non-fatal crashes, vehicle breakdowns, and adverse weather, including snow, ice, and fog. In addition, it estimated capacity loss …
Date: November 10, 2004
Creator: Chin, S.M.
Object Type: Report
System: The UNT Digital Library
An Evaluation of Savings and Measure Persistence Fromretrocommissioning of Large Commercial Buildings (open access)

An Evaluation of Savings and Measure Persistence Fromretrocommissioning of Large Commercial Buildings

Commercial building retrocommissioning activity has increased in recent years. LBNL recently conducted a study of 8 participants in Sacramento Municipal Utility District's (SMUD) retrocommissioning program. We evaluated the persistence of energy savings and measure implementation, in an effort to identify and understand factors that affect the longevity of retrocommissioning benefits. The LBNL analysis looked at whole-building energy use and the retrocommissioning measure implementation status, incorporating elements from previous work by Texas A&M University and Portland Energy Conservation Inc. When possible, adjustments due to newly discovered major end uses, occupancy patterns and 2001 energy crisis responses were included in the whole-building energy analysis. The measure implementation analysis categorized each recommended measure and tracked the measures to their current operational status. Results showed a 59% implementation rate of recommended measures. The whole-building energy analysis showed an aggregate electricity savings of approximately 10.5% in the second post-retrocommissioning year, diminishing to approximately 8% in the fourth year. Results also showed the 2001 energy crisis played a significant role in the post-retrocommissioning energy use at the candidate sites. When natural gas consumption was included in the analysis, savings were reduced slightly, showing the importance of considering interactive effects between cooling and heating systems. The cost effectiveness …
Date: March 10, 2004
Creator: Bourassa, Norman J.; Piette, Mary Ann & Motegi, Naoya
Object Type: Article
System: The UNT Digital Library
High Level Waste Lag Storage and Feed Blending (open access)

High Level Waste Lag Storage and Feed Blending

SRTC performed small-scale tests to determine the behavior associated with blending streams in the High-level Waste (HLW) Lag Storage and Feed Blending Process System for the Hanford Waste Treatment and Immobilization Plant (WTP). The work reported here was planned and designed in response to the test specification. The Office of River Protection Hanford Waste Treatment and Immobilization Plant consists of three primary facilities: a Pretreatment Facility and two facilities for low-activity and high-level waste vitrification. The Pretreatment Facility contains unit operations which receive waste feed from the Hanford Tank Farms and separate it into two treated waste streams: a low-activity, liquid waste stream stripped of most solids and radioisotopes (processed through the Low-Activity Waste Vitrification Facility) and a high-level waste slurry containing most of the solids and radioisotopes (processed through the High-Level Waste Vitrification Facility). Blending of the latter solids and radioisotope streams and their resulting properties is the subject of this report. These mixtures are shown to be unreactive and pumpable by using statistically designed combinations of nonradioactive simulants for the process streams. Properties of the mixtures are also predicted numerically (with the Environmental Simulation Program) and compared with the experimental results. The results did not reveal any …
Date: May 10, 2004
Creator: Barnes, M. J.
Object Type: Report
System: The UNT Digital Library
Evidence for neutrino oscillations in the Sudbury Neutrino Observatory (open access)

Evidence for neutrino oscillations in the Sudbury Neutrino Observatory

The Sudbury Neutrino Observatory (SNO) is a large-volume heavy water Cerenkov detector designed to resolve the solar neutrino problem. SNO observes charged-current interactions with electron neutrinos, neutral-current interactions with all active neutrinos, and elastic-scattering interactions primarily with electron neutrinos with some sensitivity to other flavors. This dissertation presents an analysis of the solar neutrino flux observed in SNO in the second phase of operation, while {approx}2 tonnes of salt (NaCl) were dissolved in the heavy water. The dataset here represents 391 live days of data. Only the events above a visible energy threshold of 5.5 MeV and inside a fiducial volume within 550 cm of the center of the detector are studied. The neutrino flux observed via the charged-current interaction is [1.71 {+-} 0.065(stat.) {sup +0.065}{sub -0.068}(sys.) {+-} 0.02(theor.)] x 10{sup 6} cm{sup -2} s{sup -1}, via the elastic-scattering interaction is [2.21 {+-} 0.22(stat.) {sup +0.11}{sub -0.12}(sys.) {+-} 0.01(theor.)] x 10{sup 6} cm{sup -2} s{sup -1}, and via the neutral-current interaction is [5.05 {+-} 0.23(stat.) {sup +0.31}{sub -0.37}(sys.) {+-} 0.06(theor.)] x 10{sup 6} cm{sup -2} s{sup -1}. The electron-only flux seen via the charged-current interaction is more than 7{sigma} below the total active flux seen via the neutral-current interaction, providing strong evidence that neutrinos are undergoing flavor transformation as they travel from the core of the Sun to the …
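A rough back-of-envelope check of the quoted significance, combining the stated uncertainties in quadrature (using the larger side of each asymmetric error and ignoring correlations, which the full analysis treats properly):

from math import sqrt, hypot

# Fluxes in units of 1e6 / cm^2 / s; uncertainties combined in quadrature.
phi_cc, err_cc = 1.71, sqrt(0.065**2 + 0.068**2 + 0.02**2)
phi_nc, err_nc = 5.05, sqrt(0.23**2 + 0.37**2 + 0.06**2)

significance = (phi_nc - phi_cc) / hypot(err_cc, err_nc)
print(f"(NC - CC) difference ~ {significance:.1f} sigma")   # ~7.4, consistent with "more than 7 sigma"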
Date: August 10, 2004
Creator: Marino, Alysia Diane
Object Type: Thesis or Dissertation
System: The UNT Digital Library
Recent Flash X-Ray Injector Modeling (open access)

Recent Flash X-Ray Injector Modeling

The injector of the Flash X-Ray (FXR) accelerator has a significantly larger than expected beam emittance. A computer modeling effort involving three different injector design codes was undertaken to characterize the FXR injector and determine the cause of the large emittance. There were some variations between the codes, but in general the simulations were consistent and pointed towards a much smaller normalized, rms emittance (36 cm-mr) than what was measured (193 cm-mr) at the exit of the injector using a pepperpot technique. The simulations also indicated that the present diode design was robust with respect to perturbations to the nominal design. Easily detected mechanical alignment/position errors and magnet errors did not lead to appreciable increase in the simulated emittance. The physics of electron emission was not modeled by any of the codes and could be the source of increased emittance. The nominal simulation assumed uniform Child-Langmuir Law emission from the velvet cathode and no shroud emission. Simulations that looked at extreme non-uniform cathode and shroud emission scenarios resulted in doubling of the emittance. An alternative approach was to question the pepperpot measurement. Simulations of the measurement showed that the pepperpot aperture foil could double the emittance with respect to the …
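For reference, the quantity compared above (36 cm-mr simulated versus 193 cm-mr measured) is the normalized rms emittance, conventionally defined as

\varepsilon_{n,\mathrm{rms}} = \beta\gamma \sqrt{ \langle x^2 \rangle \langle x'^2 \rangle - \langle x\, x' \rangle^2 }

where x and x' are the transverse position and divergence, the averages are taken over the beam distribution, and beta*gamma is the relativistic factor of the beam.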
Date: November 10, 2004
Creator: Houck, T.; Blackfield, D.; Burke, J.; Chen, Y.; Javedani, J. & Paul, A. C.
Object Type: Report
System: The UNT Digital Library
PHYSICAL, CHEMICAL AND STRUCTURAL EVOLUTION OF ZEOLITE-CONTAINING WASTE FORMS PRODUCED FROM METAKAOLINITE AND CALCINED SODIUM BEARING WASTE (HLW AND/OR LLW) (open access)

PHYSICAL, CHEMICAL AND STRUCTURAL EVOLUTION OF ZEOLITE-CONTAINING WASTE FORMS PRODUCED FROM METAKAOLINITE AND CALCINED SODIUM BEARING WASTE (HLW AND/OR LLW)

Zeolites are extremely versatile. They can adsorb liquids and gases and serve as cation exchange media. They occur in nature as well-cemented deposits. The Romans used blocks of zeolitized tuff as a building material. Using zeolites for the management of radioactive waste is not new, but a process by which the zeolites can be made to act as a cementing agent is. Zeolitic materials are relatively easy to synthesize from a wide range of both natural and man-made precursors. The process under study is derived from a well-known method in which metakaolin (thermally dehydroxylated kaolin, a mixture of kaolinite and smaller amounts of quartz and mica that has been heated to {approx}700 C) is mixed with sodium hydroxide (NaOH) and water and reacted in slurry form (for a day or two) at mildly elevated temperatures. The zeolites form as finely divided powders containing micrometer ({micro}m) sized crystals. However, if the process is changed slightly and just enough concentrated sodium hydroxide solution is added to the metakaolinite to make a thick paste and then the paste is cured under mild hydrothermal conditions (60-200 C), the mixture forms a concrete-like ceramic material made up of distinct crystalline tectosilicate minerals (zeolites …
Date: June 10, 2004
Creator: Grutzeck, Michael W.
Object Type: Report
System: The UNT Digital Library
Real-Time Water Quality Management in the Grassland Water District (open access)

Real-Time Water Quality Management in the Grassland Water District

The purpose of the research project was to advance the concept of real-time water quality management in the San Joaquin Basin by developing an application to drainage of seasonal wetlands in the Grassland Water District. Real-time water quality management is defined as the coordination of reservoir releases, return flows and river diversions to improve water quality conditions in the San Joaquin River and ensure compliance with State water quality objectives. Real-time water quality management is achieved through information exchange and cooperation among stakeholders who contribute or withdraw flow and salt load to or from the San Joaquin River. This project complements a larger scale project that was undertaken by members of the Water Quality Subcommittee of the San Joaquin River Management Program (SJRMP) and which produced forecasts of flow, salt load and San Joaquin River assimilative capacity between 1999 and 2003. These forecasts can help those entities exporting salt load to the River to develop salt load targets as a mechanism for improving compliance with salinity objectives. The mass balance model developed by this project is the decision support tool that helps to establish these salt load targets. A second important outcome of this project was the development and application …
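An illustrative salt mass balance of the kind the decision-support tool embodies (not the project's actual model; all numbers are placeholders): remaining assimilative capacity is the load the river can carry at the objective concentration minus the load already present.

def assimilative_capacity_tons_per_day(river_flow_cfs: float,
                                       objective_mg_per_l: float,
                                       current_load_tons_per_day: float) -> float:
    """Remaining salt load the river can accept while meeting the objective.

    Uses the standard conversion tons/day = flow (cfs) * concentration (mg/L) * 0.0027.
    """
    allowable_load = river_flow_cfs * objective_mg_per_l * 0.0027
    return allowable_load - current_load_tons_per_day

# Placeholder example: 2000 cfs river flow, a 500 mg/L salinity objective, and
# 1800 tons/day of salt already in the river leaves ~900 tons/day of capacity.
print(assimilative_capacity_tons_per_day(2000.0, 500.0, 1800.0))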
Date: December 10, 2004
Creator: Quinn, Nigel W.T.; Hanna, W. Mark; Hanlon, Jeremy S.; Burns, Josphine R.; Taylor, Christophe M.; Marciochi, Don et al.
Object Type: Report
System: The UNT Digital Library
Physics with Ultracold and Thermal Neutron Beams (open access)

Physics with Ultracold and Thermal Neutron Beams

This project has been focused on a measurement of the mean lifetime {tau}{sub n} of the free neutron with a precision better than 0.1%. The neutron {beta}-decay n {yields} p + e{sup -} + {bar {nu}}{sub e} + 783 keV into a proton, electron and electron antineutrino is the prototype semi-leptonic weak decay, involving both leptons and hadrons in the first generation of elementary particles. Within the standard V-A theory of weak interaction, it is governed by only two constants: the vector coupling constant g{sub V}, and axial vector constant g{sub A}. The neutron lifetime has been measured many times over decades, and the present (2004) world-average, {tau}{sub n} = 885.7 {+-} 0.8 s, has a weighted error of {approx}0.1% while individual uncertainties are typically 2-10 seconds for high precision data. The highest precision claimed by an individual measurement is {approx}0.15%. An improvement is required to resolve issues of the Standard Model of the electro-weak interaction as well as of astrophysics and of Big Bang theories. The focus in astrophysics is the solar neutrino deficit problem, which requires a precise value of g{sub A}. Big Bang theories require a precise {tau}{sub n}-value to understand the primordial He/H ratio. The strong …
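The dependence on the two coupling constants mentioned above takes the familiar form (phase-space and radiative-correction factors omitted here):

\frac{1}{\tau_n} \;\propto\; g_V^2 + 3 g_A^2 \;=\; g_V^2 \left( 1 + 3\lambda^2 \right), \qquad \lambda \equiv g_A / g_V

A precise lifetime, combined with a measurement of the beta-decay asymmetry that fixes lambda, therefore determines g_A and g_V separately.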
Date: August 10, 2004
Creator: Steyerl, Albert
Object Type: Report
System: The UNT Digital Library
Energy Employees Compensation: Many Claims Have Been Processed, but Action Is Needed to Expedite Processing of Claims Requiring Radiation Exposure Estimates (open access)

Energy Employees Compensation: Many Claims Have Been Processed, but Action Is Needed to Expedite Processing of Claims Requiring Radiation Exposure Estimates

A letter report issued by the Government Accountability Office with an abstract that begins "Subtitle B of the Energy Employees Occupational Illness Compensation Program Act, administered by the Department of Labor (Labor), provides eligible workers who developed illnesses from their work, or their survivors, with a onetime total payment of $150,000, and coverage for medical expenses related to the illnesses. For some claims, Labor uses radiation exposure estimates (dose reconstructions) performed by the National Institute for Occupational Safety and Health (NIOSH), part of the Department of Health and Human Services' (HHS) Centers for Disease Control and Prevention (CDC), to determine if the illness claimed was "at least as likely as not" related to employment at a covered facility. GAO was asked to determine (1) how well Labor's procedures and practices ensure the timely and consistent processing of claims that are not referred to NIOSH for dose reconstruction but are being processed by Labor and (2) how well Labor's and NIOSH's procedures and practices ensure the timely and consistent processing of claims that are referred for dose reconstruction. GAO did not assess the quality of Labor's claims decisions."
Date: September 10, 2004
Creator: United States. Government Accountability Office.
Object Type: Report
System: The UNT Digital Library