
Water-related Issues Affecting Conventional Oil and Gas Recovery and Potential Oil-Shale Development in the Uinta Basin, Utah (open access)

Saline water disposal is one of the most pressing issues with regard to increasing petroleum and natural gas production in the Uinta Basin of northeastern Utah. Conventional oil fields in the basin provide 69 percent of Utah's total crude oil production and 71 percent of Utah's total natural gas, the latter of which has increased 208 percent in the past 10 years. Along with hydrocarbons, wells in the Uinta Basin produce significant quantities of saline water: nearly 4 million barrels per month in Uintah County and nearly 2 million barrels per month in Duchesne County. As hydrocarbon production increases, so does saline water production, creating a growing need for economical and environmentally responsible disposal plans. Current water disposal wells are near capacity, and permitting for new wells is being delayed by a lack of technical data on potential disposal aquifers and by questions concerning contamination of freshwater sources. Many companies are reluctantly resorting to evaporation ponds as a short-term solution, but these ponds have limited capacity, are prone to leakage, and pose potential risks to birds and other wildlife. Many Uinta Basin operators claim that oil and natural gas production cannot reach its full potential until a …
Date: April 30, 2012
Creator: Berg, Michael Vanden; Anderson, Paul; Wallace, Janae; Morgan, Craig & Carney, Stephanie
Object Type: Report
System: The UNT Digital Library
Hadronic Contributions to R and G-2 from Initial-State-Radiation Data (open access)

I review recent efforts to improve the precision of the prediction of the anomalous magnetic moment of the muon, in particular of the hadronic contribution to the vacuum polarization, which is the contribution with the largest uncertainty. Focus is given to the recent result for e⁺e⁻ → π⁺π⁻ by the BaBar collaboration, obtained using events with radiation in the initial state.
Date: April 6, 2012
Creator: Bernard, Denis (Ecole Polytechnique)
Object Type: Article
System: The UNT Digital Library
COMPOSE-HPC: A Transformational Approach to Exascale (open access)

The goal of the COMPOSE-HPC project is to 'democratize' tools for automatic transformation of program source code so that it becomes tractable for the developers of scientific applications to create and use their own transformations reliably and safely. This paper describes our approach to this challenge: the creation of the KNOT tool chain, which includes tools for creating annotation languages to control the transformations (PAUL), for performing the transformations (ROTE), and for optimization and code generation (BRAID), which can be used individually and in combination. We also provide examples of current and future uses of the KNOT tools, including transforming code to use different programming models and environments, providing tests that can be used to detect errors in software or its execution, and composing software written in different programming languages or with different threading patterns.
Date: April 1, 2012
Creator: Bernholdt, David E; Allan, Benjamin A.; Armstrong, Robert C.; Chavarria-Miranda, Daniel; Dahlgren, Tamara L.; Elwasif, Wael R et al.
Object Type: Report
System: The UNT Digital Library
Subsidy Cost of Federal Credit: Cost to the Government or Fair Value Cost? (open access)

Since the mid-1980s, budget experts have debated whether the best method of measuring the subsidy cost of federal credit (direct loans and loan guarantees) is the cost to the government or the fair value cost. This report presents a chronology of this still-unresolved debate.
Date: April 25, 2012
Creator: Bickley, James M.
Object Type: Report
System: The UNT Digital Library
EC Transmission Line Risk Identification and Analysis (open access)

The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (electron cyclotron) transmission line system. In general, the major risks anticipated during the project delivery phase, associated with the implementation of the Procurement Arrangement for the EC transmission line system, are: (1) undefined or changing requirements (e.g., functional or regulatory requirements); (2) underperformance of prototype, first-unit, or production components during testing; and (3) unavailability of qualified vendors for critical components. Technical risks associated with the design and operation of the system are also identified.
Date: April 1, 2012
Creator: Bigelow, Tim S.
Object Type: Report
System: The UNT Digital Library
Analytical Approach Treating Three-Dimensional Geometrical Effects of Parabolic Trough Collectors: Preprint (open access)

An analytical approach, extending the newly developed First-principle OPTical Intercept Calculation (FirstOPTIC) method, is proposed to treat the impact of three-dimensional (3-D) geometrical effects on parabolic trough optical performance. The mathematical steps of this analytical approach are presented and implemented numerically as part of the FirstOPTIC code suite. In addition, the new code has been carefully validated against ray-tracing simulation results and available numerical solutions. This new analytical treatment of 3-D effects will facilitate further understanding and analysis of the optical performance of trough collectors as a function of incidence angle.
Date: April 1, 2012
Creator: Binotti, M.; Zhu, G.; Gray, A. & Manzollini, G.
Object Type: Article
System: The UNT Digital Library
Annual Performance Evaluation of a Pair of Energy Efficient Houses (WC3 and WC4) in Oak Ridge, TN (open access)

Beginning in 2008, two pairs of energy-saver houses were built at Wolf Creek in Oak Ridge, TN. These houses were designed to maximize energy efficiency using new ultra-high-efficiency components emerging from ORNL's Cooperative Research and Development Agreement (CRADA) partners and others. The first two houses, designated WC1 and WC2, contained 3713 square feet of conditioned area; the second pair, called WC3 and WC4, consisted of 2721 square feet of conditioned area on a crawlspace foundation. This report focuses on the annual energy performance of WC3 and WC4 and how they compare against a previously benchmarked maximum-energy-efficiency house of a similar footprint. WC3 and WC4 are both about 55-60% more efficient than traditional new construction. Each house showcases a different envelope system: WC3 is built with advanced framing and cellulose insulation partially mixed with phase change materials (PCM), while WC4 has cladding composed of an exterior insulation and finish system (EIFS). The previously benchmarked house was one of three built at the Campbell Creek subdivision in Knoxville, TN. This house (CC3) was designed as a transformation of a builder house (CC1) with the most advanced energy-efficiency features, including solar electricity and hot …
Date: April 1, 2012
Creator: Biswas, Kaushik; Christian, Jeffrey E; Gehl, Anthony C; Jackson, Roderick K & Boudreaux, Philip R
Object Type: Report
System: The UNT Digital Library
Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements (open access)

The nuclear industry is constantly trying to find ways to decrease the human error rate, especially for errors associated with procedure use. As a step toward improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room, mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do so. This paper describes the development of a Model of …
Date: April 1, 2012
Creator: Blanc, Katya Le & Oxstrand, Johanna
Object Type: Report
System: The UNT Digital Library
Risk Informed Safety Margin Characterization Case Study: Selection of Electrical Equipment To Be Subjected to Environmental Qualification (open access)

In general, the margins-based safety case helps the decision-maker manage plant margins most effectively. It tells the plant decision-maker such things as what margin is present (at the plant level, at the functional level, at the barrier level, at the component level), and where margin is thin or perhaps just degrading. If the plant is safe, it tells the decision-maker why the plant is safe and where margin needs to be maintained, and perhaps where the plant can afford to relax.
Date: April 1, 2012
Creator: Blanchard, D. & Youngblood, R.
Object Type: Report
System: The UNT Digital Library
Multiplexed gas spectroscopy using tunable VCSELs (open access)

Detection and identification of gas species using tunable diode laser absorption spectroscopy has been performed using vertical cavity surface emitting lasers (VCSELs). Two detection methods are compared: direct absorbance and wavelength modulation spectroscopy (WMS). In the first, the output of a DC-driven laser is directly monitored to detect attenuation at the targeted species' wavelength. In the second, the emission wavelength of the laser is modulated by applying a sinusoidal component of frequency ω to the drive current and measuring the second-harmonic (2ω) component of the photo-detected current. This method shows better sensitivity, measured as signal-to-noise ratio, and is less susceptible to interference effects such as scattering or fouling. Gas detection was initially performed at room temperature and atmospheric conditions using VCSELs with emission wavelengths of 763 nm for oxygen and 1392 nm for water, scanning over a range of approximately 10 nm, sufficient to cover 5-10 gas-specific absorption lines that enable identification and quantification of gas composition. The amplitude and frequency modulation parameters were optimized for each detected gas species by performing two-dimensional sweeps over both the tuning current and either the amplitude or the frequency, respectively. We found that the highest detected signal is observed …
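The WMS scheme the abstract describes (sinusoidal wavelength modulation at frequency ω, lock-in detection of the 2ω harmonic) can be sketched numerically. The Lorentzian line shape, modulation depth, and wavelengths below are illustrative assumptions, not parameters taken from the article; the sketch only shows why the demodulated 2f signal peaks when the laser is centered on an absorption line.

```python
# Minimal sketch of WMS 2f detection, assuming an illustrative Lorentzian
# absorption line near 763 nm (the oxygen region mentioned in the abstract).
import numpy as np

def lorentzian_absorbance(wl, center=763.0, hwhm=0.01, peak=0.05):
    """Illustrative absorption line; wavelengths in nm."""
    return peak * hwhm**2 / ((wl - center)**2 + hwhm**2)

def wms_2f_signal(center_wl, mod_amp=0.005, f_mod=10e3, fs=1e6, n=100_000):
    """Demodulated 2f amplitude for a laser centered at center_wl (nm)."""
    t = np.arange(n) / fs
    # Sinusoidal wavelength modulation at frequency f_mod (the abstract's omega)
    wl = center_wl + mod_amp * np.sin(2 * np.pi * f_mod * t)
    # Beer-Lambert transmission seen by the photodetector
    intensity = np.exp(-lorentzian_absorbance(wl))
    # Software lock-in at the second harmonic: mix with cos(2*omega*t), average
    return 2 * np.mean(intensity * np.cos(2 * np.pi * 2 * f_mod * t))

# On the line center the even part of the absorbance generates a strong 2f
# component; far off the line the 2f response is orders of magnitude smaller.
on_line = abs(wms_2f_signal(763.0))
off_line = abs(wms_2f_signal(763.2))
assert on_line > 10 * off_line
```

Averaging over an integer number of modulation periods acts as the lock-in's low-pass filter; a hardware implementation would use an analog or digital lock-in amplifier instead.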
Date: April 10, 2012
Creator: Bond, T; Bond, S; McCarrick, J; Zumstein, J; Chang, A; Moran, B et al.
Object Type: Article
System: The UNT Digital Library
Deep Residential Retrofits in East Tennessee (open access)

Executive Summary: Oak Ridge National Laboratory (ORNL) is furthering residential energy retrofit research in the mixed-humid climate of East Tennessee by selecting 10 homes and guiding the homeowners through the energy retrofit process. The homeowners pay for the retrofits, and ORNL advises which retrofits to complete and collects post-retrofit data. This effort is in accordance with the Department of Energy's Building America program research goal of demonstrating market-ready energy retrofit packages that reduce home energy use by 30-50%. Through this research, ORNL researchers hope to understand why homeowners decide to undertake energy retrofits, the payback of home energy retrofits, and which retrofit packages most economically reduce energy use. Homeowner interviews help the researchers understand the homeowners' experience. Information gathered during the interviews will aid in extending market penetration of home energy retrofits by helping researchers and the retrofit industry understand what drives homeowners to make positive decisions regarding these retrofits. This report summarizes the selection process, the pre-retrofit condition, the recommended retrofits, the actual cost of the retrofits (when available), and the estimated energy savings of the retrofit package using EnergyGauge. Of the 10 households selected to participate in the study, only five completed the recommended …
Date: April 1, 2012
Creator: Boudreaux, Philip R; Hendrick, Timothy P; Christian, Jeffrey E & Jackson, Roderick K
Object Type: Report
System: The UNT Digital Library
NEXT GENERATION COMMERCIAL HEAT PUMP WATER HEATER USING CARBON DIOXIDE USING DIFFERENT IMPROVEMENT APPROACHES (open access)

Although heat pump water heaters are widely accepted today in Japan, where energy costs are high and government incentives for their use exist, acceptance of such a product in the U.S. has been slow. This trend is slowly changing with the introduction of heat pump water heaters into the residential market, but acceptance remains slow in the commercial sector. Barriers to heat pump water heater acceptance in the commercial market have historically been performance, reliability, and first/operating costs. The use of carbon dioxide (R744) as the refrigerant in such a system can improve performance for a relatively small increase in initial cost and make this technology more appealing. What makes R744 an excellent candidate for use in heat pump water heaters is not only the wide range of ambient temperatures within which it can operate, but also its excellent ability to match water and refrigerant temperatures on the high side, resulting in very high exit water temperatures of up to 82°C in a single pass, as required by sanitary codes in the U.S. (Food Code, 2005), temperatures that are much more difficult to reach with other refrigerants. This can be especially attractive in applications where this water is used for the purpose of …
Date: April 1, 2012
Creator: Bowers, Chad; Petersen, Michael; Elbel, Stefan & Hrnjak, Pega
Object Type: Article
System: The UNT Digital Library
Measurements of the top quark mass at the Tevatron (open access)

The mass of the top quark (m_top) is a fundamental parameter of the standard model (SM). Currently, its most precise measurements are performed by the CDF and D0 collaborations at the Fermilab Tevatron pp̄ collider at a centre-of-mass energy of √s = 1.96 TeV. We review the most recent of those measurements, performed on data samples of up to 8.7 fb⁻¹ of integrated luminosity. The Tevatron combination using up to 5.8 fb⁻¹ of data results in a preliminary world-average top quark mass of m_top = 173.2 ± 0.9 GeV, corresponding to a relative precision of about 0.54%. We conclude with an outlook on the anticipated precision of the final measurement of m_top at the Tevatron.
Date: April 1, 2012
Creator: Brandt, Oleg (Gottingen U., II. Phys. Inst.)
Object Type: Article
System: The UNT Digital Library
Reexamination of Agency Reporting Requirements: Annual Process Under the GPRA Modernization Act of 2010 (GPRAMA) (open access)

On January 4, 2011, the GPRA Modernization Act of 2010 (GPRAMA) became law. The acronym "GPRA" in the act's short title refers to the Government Performance and Results Act of 1993 (GPRA 1993), a law that GPRAMA substantially modified. Some of GPRAMA's provisions require agencies to produce plans and reports for a variety of audiences that focus on goal-setting and performance measurement. Other provisions, by contrast, establish an annual process to reexamine the usefulness of certain reporting requirements. The report concludes by looking at potential issues for Congress in two categories.
Date: April 18, 2012
Creator: Brass, Clinton T.
Object Type: Report
System: The UNT Digital Library
Unknown Foundation Determination for Scour (open access)

This report describes a project to develop a global approach that would reduce the level of uncertainty associated with unknown foundations.
Date: April 2012
Creator: Briaud, Jean-Louis; Medina-Cetina, Zenon; Hurlebaus, Stefan; Everett, Mark; Tucker, Stacey; Yousefpour, Negin et al.
Object Type: Report
System: The Portal to Texas History
Eliminating the Renormalization Scale Ambiguity for Top-Pair Production Using the Principle of Maximum Conformality (open access)

The uncertainty in setting the renormalization scale in finite-order perturbative QCD predictions using standard methods substantially reduces the precision of tests of the Standard Model in collider experiments. It is conventional to choose a typical momentum transfer of the process as the renormalization scale and take an arbitrary range to estimate the uncertainty in the QCD prediction. However, predictions using this procedure depend on the choice of renormalization scheme, leave a non-convergent renormalon perturbative series, and, moreover, give incorrect results when applied to QED processes. In contrast, if one fixes the renormalization scale using the Principle of Maximum Conformality (PMC), all non-conformal {β_i}-terms in the perturbative expansion series are summed into the running coupling, and one obtains a unique, scale-fixed, scheme-independent prediction at any finite order. The PMC renormalization scale μ_R^PMC and the resulting finite-order PMC prediction are both to high accuracy independent of the choice of the initial renormalization scale μ_R^init, consistent with renormalization group invariance. Moreover, after PMC scale-setting, the n!-growth of the pQCD expansion is eliminated. Even the residual scale dependence at fixed order due to unknown higher-order {β_i}-terms is substantially suppressed. As an application, we apply the PMC procedure to obtain …
Date: April 2, 2012
Creator: Brodsky, Stanley J. & Wu, Xing-Gang
Object Type: Article
System: The UNT Digital Library
Reliable High Performance Peta- and Exa-Scale Computing (open access)

As supercomputers become larger and more powerful, they are growing increasingly complex. This is reflected both in the exponentially increasing number of components in HPC systems (LLNL is currently installing the 1.6 million core Sequoia system) and in the wide variety of software and hardware components that a typical system includes. At this scale it becomes infeasible to make each component sufficiently reliable to prevent regular faults somewhere in the system, or to account for all possible cross-component interactions. The resulting faults and instability cause HPC applications to crash, perform sub-optimally, or even produce erroneous results. As supercomputers continue to approach exascale performance and full-system reliability becomes prohibitively expensive, we will require novel techniques to bridge the gap between the lower reliability provided by hardware systems and users' unchanging need for consistent performance and reliable results. Previous research on HPC system reliability has developed techniques for tolerating and detecting various types of faults. However, these techniques have seen very limited real applicability because of our poor understanding of how real systems are affected by complex faults such as soft-fault-induced bit flips or performance degradations. Prior work on such techniques has had very limited practical utility because …
Date: April 2, 2012
Creator: Bronevetsky, G
Object Type: Report
System: The UNT Digital Library
Compiled MPI: Cost-Effective Exascale Applications Development (open access)

The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes, and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DOE has a significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require an investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model, to enable parallelism. MPI provides a portable and high-performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount …
Date: April 10, 2012
Creator: Bronevetsky, G.; Quinlan, D.; Lumsdaine, A. & Hoefler, T.
Object Type: Report
System: The UNT Digital Library
Religion and the Workplace: Legal Analysis of Title VII of the Civil Rights Act of 1964 as It Applies to Religion and Religious Organizations (open access)

This report reviews the scope of Title VII's prohibition on religious discrimination, its exemptions for religious organizations, and its requirements for accommodations. It also analyzes the exemptions available to religious organizations from the non-discrimination requirements. Finally, it addresses protections based on Title VII's religious exemption that have been included in the proposed Employment Non-Discrimination Act.
Date: April 3, 2012
Creator: Brougher, Cynthia
Object Type: Report
System: The UNT Digital Library
Assessment and Mitigation of Diagnostic-Generated Electromagnetic Interference at the National Ignition Facility (open access)

Electromagnetic interference (EMI) is an ever-present challenge at laser facilities such as the National Ignition Facility (NIF). The major source of EMI at such facilities is laser-target interaction, which can generate intense electromagnetic fields within, and outside of, the target chamber. In addition, the diagnostics themselves can be a source of EMI, even interfering with themselves. In this paper we describe EMI generated by ARIANE and DIXI, present measurements, and discuss the effects of diagnostic-generated EMI on ARIANE's CCD and on a PMT near DIXI. Finally, we present some of the efforts we have made to mitigate the effects of diagnostic-generated EMI on NIF diagnostics.
Date: April 20, 2012
Creator: Brown, C. G.; Ayers, M. J.; Felker, B.; Ferguson, W.; Holder, J P; Nagel, S. R. et al.
Object Type: Article
System: The UNT Digital Library
Measure Guideline: Transitioning from Three-Coat Stucco to One-Coat Stucco with EPS (open access)

This Measure Guideline has been developed to help builders transition from using a traditional three-coat stucco wall-cladding system to a one-coat stucco wall-cladding system with expanded polystyrene (EPS) insulated sheathing. The three-coat system uses a base layer, a fill layer, and a finish layer. The one-coat system maintains the look of a traditional stucco system but uses only a base layer and a finish coat over EPS insulation that achieves higher levels of energy efficiency. Potential risks associated with the installation of a one-coat stucco system are addressed in terms of design, installation, and warranty concerns such as cracking and delamination, along with mitigation strategies to reduce these risks.
Date: April 1, 2012
Creator: Brozyna, K.; Davis, G. & Rapport, A.
Object Type: Report
System: The UNT Digital Library
Ballast Water Management to Combat Invasive Species (open access)

This report is on Ballast Water Management to Combat Invasive Species.
Date: April 10, 2012
Creator: Buck, Eugene H.
Object Type: Report
System: The UNT Digital Library
Effects of Radiation from Fukushima Dai-ichi on the U.S. Marine Environment (open access)

The massive Tohoku earthquake and tsunami of March 11, 2011, caused extensive damage in northeastern Japan, including damage to the Fukushima Dai-ichi nuclear power installation, which resulted in the release of radiation. Concerns arose about the potential effects of this released radiation on the U.S. marine environment and resources.
Date: April 2, 2012
Creator: Buck, Eugene H. & Upton, Harold F.
Object Type: Report
System: The UNT Digital Library
The Endangered Species Act (ESA) in the 112th Congress: Conflicting Values and Difficult Choices (open access)

None
Date: April 4, 2012
Creator: Buck, Eugene H.; Corn, M. Lynne; Alexander, Kristina; Sheikh, Pervaze A. & Meltz, Robert
Object Type: Report
System: The UNT Digital Library