TOUGH Short Course for Scientists and Engineers (open access)

TOUGH Short Course for Scientists and Engineers

The TOUGH family of codes is a suite of computer programs for the simulation of multiphase fluid and heat flows in porous and fractured media with applications to geothermal reservoir engineering, nuclear waste disposal in geologic formations, geologic carbon sequestration, gas hydrate research, vadose zone hydrology, environmental remediation, oil and gas reservoir engineering, and other mass transport and energy transfer problems in complex geologic settings. TOUGH has been developed in the Earth Sciences Division of the Lawrence Berkeley National Laboratory (LBNL). Many modifications and enhancements have been made to TOUGH (at LBNL and elsewhere) from the time it was first released in 1987. TOUGH and its various descendants (such as iTOUGH2, T2VOC, TMVOC, EWASG, TOUGHREACT, TOUGH+ and many more) are currently in use in approximately 300 research laboratories, private companies, and universities in 33 countries. The LBNL group, headed by Karsten Pruess, serves as custodian of the code. The TOUGH simulators were developed for problems involving strongly heat-driven flow. To describe these phenomena a multi-phase approach to fluid and heat flow is used, which fully accounts for the movement of gaseous and liquid phases, their transport of latent and sensible heat, and phase transitions between liquid and vapor. TOUGH takes …
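For orientation, the governing balance that the TOUGH family of simulators solves can be written in the general form below (a schematic of the published TOUGH2 formulation, not a quotation from this course material), where M is the accumulation of a mass or energy component kappa in a grid block V_n, F its flux across the block surface Gamma_n, and q the sources and sinks:

    \frac{d}{dt}\int_{V_n} M^{\kappa}\,dV \;=\; \int_{\Gamma_n} \mathbf{F}^{\kappa}\cdot\mathbf{n}\,d\Gamma \;+\; \int_{V_n} q^{\kappa}\,dV

TOUGH discretizes this balance with the integral finite difference method and solves the resulting coupled nonlinear equations fully implicitly.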
Date: August 1, 2006
Creator: Kowalsky, Michael B. & Finsterle, Stefan
Object Type: Report
System: The UNT Digital Library
Post-Closure Inspection and Monitoring Report for Corrective Action Unit 110: Area 3 WMD U-3ax/bl Crater, Nevada Test Site, Nevada (open access)

Post-Closure Inspection and Monitoring Report for Corrective Action Unit 110: Area 3 WMD U-3ax/bl Crater, Nevada Test Site, Nevada

This Post-Closure Inspection and Monitoring Report provides the results of inspections and monitoring for Corrective Action Unit 110: Area 3 Waste Management Division U-3ax/bl Crater, Nevada Test Site, Nevada. This report includes an analysis and summary of the site inspections, repairs and maintenance, meteorological information, and soil moisture monitoring data obtained at Corrective Action Unit 110, for the annual period July 2005 through June 2006.
Date: August 1, 2006
Creator: United States. National Nuclear Security Administration. Nevada Site Office.
Object Type: Report
System: The UNT Digital Library
Evaluation of Cavitation-Erosion Resistance of 316LN Stainless Steel in Mercury Containing Metallic Solutes (open access)

Evaluation of Cavitation-Erosion Resistance of 316LN Stainless Steel in Mercury Containing Metallic Solutes

Room temperature cavitation tests of vacuum annealed type 316LN stainless steel were performed in pure Hg and in Hg with various amounts of metallic solute to evaluate potential mitigation of erosion/wastage. Tests were performed using an ultrasonic vibratory horn with specimens attached at the tip. All of the solutes examined, which included 5 wt% In, 10 wt% In, 4.4 wt% Cd, 2 wt% Ga, and a mixture that included 1 wt% each of Pb, Sn, and Zn, were found to increase cavitation-erosion as measured by increased weight loss and/or surface profile development compared to exposures for the same conditions in pure Hg. Qualitatively, each solute appeared to increase the post-test wetting tenacity of the Hg solutions and render the Hg mixture susceptible to manipulation of droplet shape.
Date: August 1, 2006
Creator: Pawel, Steven J & Mansur, Louis K
Object Type: Report
System: The UNT Digital Library
Low energy spread 100 MeV-1 GeV electron bunches from laser wakefield acceleration at LOASIS (open access)

Low energy spread 100 MeV-1 GeV electron bunches from laser wakefield acceleration at LOASIS

Experiments at the LOASIS laboratory of LBNL recently demonstrated production of 100 MeV electron beams with low energy spread and low divergence from laser wakefield acceleration. The radiation pressure of a 10 TW laser pulse guided over 10 diffraction ranges by a plasma density channel was used to drive an intense plasma wave (wakefield), producing acceleration gradients on the order of 100 GV/m in a mm-scale channel. Beam energy has now been increased from 100 to 1000 MeV by using a cm-scale guiding channel at lower density, driven by a 40 TW laser, demonstrating the anticipated scaling to higher beam energies. Particle simulations indicate that the low energy spread beams were produced from self-trapped electrons through the interplay of trapping, loading, and dephasing. Other experiments and simulations are also underway to control injection of particles into the wake, and hence improve beam quality and stability further.
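As a rough consistency check on the numbers quoted above (not a statement from the article itself), the energy gain is approximately the accelerating gradient times the acceleration length, and in the dephasing-limited regime it scales roughly inversely with plasma density:

    \Delta E \approx e\,E_z\,L_{\mathrm{acc}}:\qquad 100\ \mathrm{GV/m}\times 10^{-3}\ \mathrm{m}\approx 0.1\ \mathrm{GeV}

so reaching roughly 1 GeV with somewhat lower gradients at reduced density requires the cm-scale channel lengths described above.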
Date: August 1, 2006
Creator: Geddes, C. G. R.; Esarey, E.; Michel, P.; Nagler, B.; Nakamura, K.; Plateau, G. R. et al.
Object Type: Article
System: The UNT Digital Library
Clean Slate Environmental Remediation DSA for 10 CFR 830 Compliance (open access)

Clean Slate Environmental Remediation DSA for 10 CFR 830 Compliance

Clean Slate Sites II and III are scheduled for environmental remediation (ER) to remove elevated levels of radionuclides in soil. These sites are contaminated with legacy remains of non-nuclear-yield nuclear weapons experiments at the Nevada Test Site that involved high explosive, fissile, and related materials. The sites may also hold unexploded ordnance (UXO) from military training activities in the area over the intervening years. Regulation 10 CFR 830 (Ref. 1) identifies DOE-STD-1120-98 (Ref. 2) and 29 CFR 1910.120 (Ref. 3) as the safe harbor methodologies for performing these remediation operations. Of these methodologies, DOE-STD-1120-98 has been superseded by DOE-STD-1120-2005 (Ref. 4). The project adopted DOE-STD-1120-2005, which includes an approach for ER projects, in combination with 29 CFR 1910.120, as the basis documents for preparing the documented safety analysis (DSA). To securely implement the safe harbor methodologies, we applied DOE-STD-1027-92 (Ref. 5) and DOE-STD-3009-94 (Ref. 6), as needed, to develop a robust hazard classification and hazards analysis that addresses non-standard hazards such as radionuclides and UXO. The hazard analyses provided the basis for identifying Technical Safety Requirement (TSR)-level controls. The DOE-STD-1186-2004 (Ref. 7) methodology showed that some controls warranted elevation to Specific Administrative Control (SAC) status. In addition to …
Date: August 1, 2006
Creator: James L. Traynor, Stephen L. Nicolosi, Michael L. Space, Louis F. Restrepo
Object Type: Article
System: The UNT Digital Library
High-luminosity primary vertex selection in top-quark studies using the Collider Detector at Fermilab (open access)

High-luminosity primary vertex selection in top-quark studies using the Collider Detector at Fermilab

Improving our ability to identify the top quark pair (t{bar t}) primary vertex (PV) on an event-by-event basis is essential for many analyses in the lepton-plus-jets channel performed by the Collider Detector at Fermilab (CDF) Collaboration. We compare the algorithm currently used by CDF (A1) with another algorithm (A2) using Monte Carlo simulation at high instantaneous luminosities. We confirm that A1 is more efficient than A2 at selecting the t{bar t} PV at all PV multiplicities, both with efficiencies larger than 99%. Event selection rejects events with a distance larger than 5 cm along the proton beam between the t{bar t} PV and the charged lepton. We find flat distributions for the signal over background significance of this cut for all cut values larger than 1 cm, for all PV multiplicities and for both algorithms. We conclude that any cut value larger than 1 cm is acceptable for both algorithms under the Tevatron's expected instantaneous luminosity improvements.
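A minimal Python sketch of the cut-value scan described above; the significance definition s/sqrt(s+b) and the toy input distributions are illustrative assumptions, not the CDF analysis code:

    import numpy as np

    def significance_scan(dz_signal, dz_background, cuts_cm):
        """Scan |delta z| cut values and return s / sqrt(s + b) for each.

        dz_signal, dz_background: arrays of |z(ttbar PV) - z(lepton)| in cm
        from simulated signal and background events (hypothetical inputs)."""
        results = []
        for cut in cuts_cm:
            s = np.sum(dz_signal < cut)      # signal events passing the cut
            b = np.sum(dz_background < cut)  # background events passing the cut
            results.append(s / np.sqrt(s + b) if (s + b) > 0 else 0.0)
        return np.array(results)

    # Example with toy distributions (illustrative only):
    rng = np.random.default_rng(0)
    dz_sig = np.abs(rng.normal(0.0, 0.5, 10000))    # signal peaks near zero
    dz_bkg = np.abs(rng.uniform(0.0, 30.0, 10000))  # background roughly flat
    print(significance_scan(dz_sig, dz_bkg, cuts_cm=[1, 2, 5, 10]))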
Date: August 1, 2006
Creator: Buzatu, Adrian
Object Type: Report
System: The UNT Digital Library
Deployment Notes for Sodars at the Stevens Institute of Technology during the March 2005 Urban Dispersion Program Field Campaign (MSG05) (open access)

Deployment Notes for Sodars at the Stevens Institute of Technology during the March 2005 Urban Dispersion Program Field Campaign (MSG05)

This report documents the deployment of two sodars at the Stevens Institute of Technology (SIT) in Hoboken, New Jersey, during the March 2005 Madison Square Garden Urban Dispersion Field Campaign (MSG05) conducted in the vicinity of Madison Square Garden in Midtown Manhattan. One sodar was a Scintec MFAS sodar that was operated on a dock along the Hudson River. This sodar was only operated during Intensive Observation Periods (IOPs). The other sodar was an AeroVironment (AV) Model 3000 MiniSodar that was located on top of the Howe Center at SIT. This sodar was operated continuously, but there were data quality issues in the lowest three and upper seven range gates during non-IOP periods. The IOP data from the AeroVironment were reprocessed so that only data from the lowest three and highest seven range gates were removed. Measurements from both sodars were compared to measurements made using a propeller and vane anemometer that was also located on top of the Howe Center. This report also describes the quality control methods applied to data from each sodar and the structure of the data files available. The agreement between the sodars is generally good, and we recommend using either the AV data or …
Date: August 1, 2006
Creator: Berg, Larry K. & Allwine, K Jerry
Object Type: Report
System: The UNT Digital Library
STROBOSCOPIC IMAGE CAPTURE: REDUCING THE DOSE PER FRAME BY A FACTOR OF 30 DOES NOT PREVENT BEAM-INDUCED SPECIMEN MOVEMENT IN PARAFFIN (open access)

STROBOSCOPIC IMAGE CAPTURE: REDUCING THE DOSE PER FRAME BY A FACTOR OF 30 DOES NOT PREVENT BEAM-INDUCED SPECIMEN MOVEMENT IN PARAFFIN

Beam-induced specimen movement may be the major factor that limits the quality of high-resolution images of organic specimens. One of the possible measures to improve the situation that was proposed by Henderson and Glaeser (Henderson and Glaeser, 1985), which we refer to here as 'stroboscopic image capture', is to divide the normal exposure into many successive frames, thus reducing the amount of electron exposure--and possibly the amount of beam-induced movement--per frame. The frames would then be aligned and summed. We have performed preliminary experiments on stroboscopic imaging using a 200-kV electron microscope that was equipped with a high dynamic range CCD camera for image recording and a liquid N{sub 2}-cooled cryoholder. Single-layer paraffin crystals on carbon film were used as a test specimen. The ratio F(g)/F(0) of paraffin reflections, calculated from the images, serves as our criterion for the image quality. In the series that were evaluated, no significant improvement of the F{sub image}(g)/F{sub image}(0) ratio was found, even though the electron exposure per frame was reduced by a factor of 30. A frame-to-frame analysis of image distortions showed that considerable beam-induced movement had still occurred during each frame. In addition, the paraffin crystal lattice was observed to move relative …
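A minimal Python sketch of the "align and sum" step and the F(g)/F(0) quality criterion described above; the registration routine, frame format, and choice of reciprocal-lattice index are illustrative assumptions, not the authors' processing pipeline:

    import numpy as np
    from scipy.ndimage import shift as nd_shift
    from skimage.registration import phase_cross_correlation

    def align_and_sum(frames):
        """Align low-dose frames to the first frame and sum them.

        frames: sequence of 2-D images (hypothetical input stack)."""
        reference = frames[0]
        total = np.zeros_like(reference, dtype=float)
        for frame in frames:
            offset, _, _ = phase_cross_correlation(reference, frame)
            total += nd_shift(frame, offset)  # shift frame onto the reference
        return total

    def fourier_ratio(image, peak_index):
        """Return |F(g)| / |F(0)| for a chosen reciprocal-lattice index,
        e.g. peak_index = (0, 5)."""
        spectrum = np.abs(np.fft.fft2(image))
        return spectrum[peak_index] / spectrum[0, 0]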
Date: August 1, 2006
Creator: Typke, Dieter; Gilpin, Christopher J.; Downing, Kenneth H. & Glaeser, Robert M.
Object Type: Article
System: The UNT Digital Library
The structure of a-C: What NEXAFS and EXAFS see (open access)

The structure of a-C: What NEXAFS and EXAFS see

Mechanically hard (ha-C) and soft (sa-C) amorphous carbon films with approximate densities of 2.9 and 2.2 g cm{sup -3} were prepared by filtered cathodic arc deposition and analyzed by near-edge x-ray absorption spectroscopy (NEXAFS) and extended x-ray absorption spectroscopy (EXAFS) to determine their structure. The analysis observed an insignificant level of pi bond conjugation in both kinds of films. EXAFS distinguished two types of atomic environments in them: one semiordered with well-defined bond lengths, and the other with such strong bond disorder that its contribution to EXAFS was undetectable. The proportion of atoms in the semiordered atomic environments was less than 40% in both films. Their bond lengths were similar to those of diamond in the ha-C films and to graphite in the sa-C films. The NEXAFS spectra analysis was based on the linear relation between sigma* energy and bond length. It served to quantify the proportion of sp3-bonded atoms in a-C, to deduce the average bond length of the atoms undetected by EXAFS, and to determine the level of bond conjugation in the films. The sp3 concentration estimated with the proposed method was 44% in the ha-C films and 10% in the sa-C films. These values were consistent with …
Date: August 1, 2006
Creator: Hussain, Zahid; Diaz, J.; Monteiro, O.R. & Hussain, Z.
Object Type: Article
System: The UNT Digital Library
Summary of research output from DOE grant DE-FG02-92ER45471 during the period 1992-2006: publications, invited talks, conference organization, and PhD students graduated. (open access)

Summary of research output from DOE grant DE-FG02-92ER45471 during the period 1992-2006: publications, invited talks, conference organization, and PhD students graduated.

In this report I summarize some of the main results obtained during the present grant period. They are: (1) Orientation selection in dendritic evolution; (2) Globular-dendritic transition; and (3) Physics and prediction of grain boundary mobility.
Date: August 1, 2006
Creator: Karma, Alain, PhD.
Object Type: Article
System: The UNT Digital Library
QCD fits to neutrino-iron structure functions at NuTeV (open access)

QCD fits to neutrino-iron structure functions at NuTeV

This thesis presents a new determination of {Lambda}{sub QCD} from Next-to-Leading Order QCD fits to the Q{sup 2} dependence of neutrino-iron structure functions. This is the first measurement of {Lambda}{sub QCD} that uses a theoretical model fully accounting for heavy quark production. Compared with previous neutrino measurements, the result benefits from an improved understanding of the largest systematic uncertainties, those on the muon and hadron energy scales. These improvements lead to one of the most precise determinations of {alpha}{sub S} at moderate Q{sup 2}. NuTeV is a neutrino-iron deep inelastic scattering (DIS) experiment that collected data during 1996-97 at Fermilab. The key features of NuTeV include its sign-selected beam, which produced separate high-purity neutrino and antineutrino beams, and its continuous calibration beam, which enabled NuTeV to considerably improve the knowledge of the energy scales that have dominated uncertainties in previous measurements.
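For context (not part of the thesis abstract), at leading order the strong coupling and {Lambda}{sub QCD} are related by

    \alpha_s(Q^2) \;=\; \frac{4\pi}{\beta_0\,\ln(Q^2/\Lambda_{\mathrm{QCD}}^2)},\qquad \beta_0 = 11 - \tfrac{2}{3}n_f

which is why a fit to the Q{sup 2} dependence of structure functions determines {Lambda}{sub QCD} and hence {alpha}{sub S}; the analysis described above uses the corresponding next-to-leading-order expressions.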
Date: August 1, 2006
Creator: Radescu, Voica A. (University of Pittsburgh)
Object Type: Thesis or Dissertation
System: The UNT Digital Library
Post-Closure Inspection and Monitoring Report for Corrective Action Unit 110: Area 3 WMD U-3ax/bl Crater, Nevada Test Site, Nevada (open access)

Post-Closure Inspection and Monitoring Report for Corrective Action Unit 110: Area 3 WMD U-3ax/bl Crater, Nevada Test Site, Nevada

This Post-Closure Inspection and Monitoring Report (PCIMR) provides the results of inspections and monitoring for Corrective Action Unit (CAU) 110, Area 3 WMD [Waste Management Division] U-3ax/bl Crater. This PCIMR includes an analysis and summary of the site inspections, repairs and maintenance, meteorological information, and soil moisture monitoring data obtained at CAU 110, for the annual period July 2005 through June 2006. Site inspections of the cover were performed quarterly to identify any significant changes to the site requiring action. The overall condition of the cover, cover vegetation, perimeter fence, and UR warning signs was good. Settling was observed that exceeded the action level as specified in Section VI.B.7 of the Hazardous Waste Permit Number NEV HW009 (Nevada Division of Environmental Protection, 2000). This permit states that cracks or settling greater than 15 centimeters (6 inches) deep that extend 1.0 meter (m) (3 feet [ft]) or more on the cover will be evaluated and repaired within 60 days of detection. Along the east edge of the cover (repaired previously in August 2003, December 2003, May 2004, October 2004), an area of settling was observed during the December 2005 inspection to again be above the action level, and required repair. This …
Date: August 1, 2006
Creator: National Security Technologies, LLC
Object Type: Report
System: The UNT Digital Library
Development and test of LARP technological quadrupole (TQC) magnet (open access)

Development and test of LARP technological quadrupole (TQC) magnet

In support of the development of a large-aperture Nb{sub 3}Sn superconducting quadrupole for the Large Hadron Collider (LHC) luminosity upgrade, two-layer quadrupole models (TQC and TQS) with 90-mm aperture are being constructed at Fermilab and LBNL within the framework of the US LHC Accelerator Research Program (LARP). This paper describes the construction and test of model TQC01. ANSYS calculations of the structure are compared with measurements during construction. Fabrication experience is described and in-process measurements are reported. Test results at 4.5 K are presented, including magnet training, current ramp rate studies, and magnet quench current. Results of magnetic measurements at helium temperature are also presented.
Date: August 1, 2006
Creator: Feher, S.; Bossert, R. C.; Ambrosio, G.; Andreev, N.; Barzi, E.; Carcagno, R. et al.
Object Type: Article
System: The UNT Digital Library
Efficient System Design and Sustainable Finance for China's Village Electrification Program: Preprint (open access)

Efficient System Design and Sustainable Finance for China's Village Electrification Program: Preprint

This paper describes a joint effort of the Institute for Electrical Engineering of the Chinese Academy of Sciences (IEE), and the U.S. National Renewable Energy Laboratory (NREL) to support China's rural electrification program. This project developed a design tool that provides guidelines both for off-grid renewable energy system designs and for cost-based tariff and finance schemes to support them. This tool was developed to capitalize on lessons learned from the Township Electrification Program that preceded the Village Electrification Program. We describe the methods used to develop the analysis, some indicative results, and the planned use of the tool in the Village Electrification Program.
Date: August 1, 2006
Creator: Ma, S.; Yin, H. & Kline, D. M.
Object Type: Article
System: The UNT Digital Library
Design of a boron neutron capture enhanced fast neutron therapy assembly (open access)

Design of a boron neutron capture enhanced fast neutron therapy assembly

The use of boron neutron capture to boost tumor dose in fast neutron therapy has been investigated at several fast neutron therapy centers worldwide. This treatment is termed boron neutron capture enhanced fast neutron therapy (BNCEFNT). It is a combination of boron neutron capture therapy (BNCT) and fast neutron therapy (FNT). It is believed that BNCEFNT may be useful in the treatment of some radioresistant brain tumors, such as glioblastoma multiforme (GBM). A boron neutron capture enhanced fast neutron therapy assembly has been designed for the Fermilab Neutron Therapy Facility (NTF). This assembly uses a tungsten filter and collimator near the patient's head, with a graphite reflector surrounding the head to significantly increase the dose due to boron neutron capture reactions. The assembly was designed using the Monte Carlo radiation transport code MCNP version 5 for a standard 20x20 cm{sup 2} treatment beam. The calculated boron dose enhancement at 5.7-cm depth in a water-filled head phantom in the assembly with a 5x5 cm{sup 2} collimation was 21.9% per 100-ppm {sup 10}B for a 5.0-cm tungsten filter and 29.8% for an 8.5-cm tungsten filter. The corresponding dose rates for the 5.0-cm and 8.5-cm thick filters were 0.221 and 0.127 Gy/min, respectively; about …
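Two simple arithmetic consequences of the numbers quoted above (assuming, as the "per 100-ppm" phrasing suggests, that the boron dose enhancement scales linearly with {sup 10}B concentration):

    \mathrm{enhancement}(C_B) \approx 29.8\%\times\frac{C_B}{100\ \mathrm{ppm}}\quad(8.5\text{-cm filter})

so 50 ppm of {sup 10}B would give roughly 15% enhancement; and because the dose rate drops from 0.221 to 0.127 Gy/min when the filter is thickened from 5.0 to 8.5 cm, delivering the same physical dose takes about 0.221/0.127, or roughly 1.7 times, longer with the thicker filter.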
Date: August 1, 2006
Creator: Wang, Zhonglu (Georgia Institute of Technology)
Object Type: Thesis or Dissertation
System: The UNT Digital Library
A Measurement of the Top Quark Mass in the Dilepton Decay Channel at CDF II (open access)

A Measurement of the Top Quark Mass in the Dilepton Decay Channel at CDF II

The top quark, the most recently discovered quark, is the most massive known fundamental fermion. Precision measurements of its mass, a free parameter in the Standard Model of particle physics, can be used to constrain the mass of the Higgs Boson. In addition, deviations in the mass as measured in different channels can provide possible evidence for new physics. We describe a measurement of the top quark mass in the decay channel with two charged leptons, known as the dilepton channel, using data collected by the CDF II detector from p{bar p} collisions with {radical}s = 1.96 TeV at the Fermilab Tevatron. The likelihood in top mass is calculated for each event by convolving the leading order matrix element describing q{bar q} {yields} t{bar t} {yields} b{ell}{nu}{sub {ell}}{bar b}{ell}'{nu}{sub {ell}'} with detector resolution functions. The presence of background events in the data sample is modeled using similar calculations involving the matrix elements for major background processes. In a data sample with integrated luminosity of 1.0 fb{sup -1}, we observe 78 candidate events and measure M{sub t} = 164.5 {+-} 3.9(stat.) {+-} 3.9(syst.) GeV/c{sup 2}, the most precise measurement of the top quark mass in this channel to date.
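Schematically, the per-event signal probability in this matrix-element approach has the form below (a generic statement of the method, not the analysis' exact expression), where W(x|y) are the detector resolution (transfer) functions mapping parton-level variables y to the observed quantities x, and f are parton distribution functions:

    P_s(x\mid M_t) \;=\; \frac{1}{\sigma(M_t)}\int d\Phi\; |\mathcal{M}_{t\bar t}(y;M_t)|^2\, f(q_1)\,f(q_2)\, W(x\mid y)

The event likelihood combines this signal term with analogous terms built from the background matrix elements, and the measured mass is the value of M_t that maximizes the joint likelihood over all candidate events.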
Date: August 1, 2006
Creator: Jayatilaka, Bodhitha A.
Object Type: Thesis or Dissertation
System: The UNT Digital Library
Impact of polymer film thickness and cavity size on polymer flow during embossing: towards process design rules for nanoimprint lithography. (open access)

Impact of polymer film thickness and cavity size on polymer flow during embossing: towards process design rules for nanoimprint lithography.

This paper presents continuum simulations of polymer flow during nanoimprint lithography (NIL). The simulations capture the underlying physics of polymer flow from the nanometer to millimeter length scale and examine geometry and thermophysical process quantities affecting cavity filling. Variations in embossing tool geometry and polymer film thickness during viscous flow distinguish different flow driving mechanisms. Three parameters can predict polymer deformation mode: cavity width to polymer thickness ratio, polymer supply ratio, and Capillary number. The ratio of cavity width to initial polymer film thickness determines vertically or laterally dominant deformation. The ratio of indenter width to residual film thickness measures polymer supply beneath the indenter which determines Stokes or squeeze flow. The local geometry ratios can predict a fill time based on laminar flow between plates, Stokes flow, or squeeze flow. Characteristic NIL capillary number based on geometry-dependent fill time distinguishes between capillary or viscous driven flows. The three parameters predict filling modes observed in published studies of NIL deformation over nanometer to millimeter length scales. The work seeks to establish process design rules for NIL and to provide tools for the rational design of NIL master templates, resist polymers, and process parameters.
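A minimal Python sketch of how the three dimensionless groups described above could be combined into a regime classifier; the threshold values of 1.0 are illustrative placeholders, not values taken from the study:

    def nil_flow_regime(cavity_width, film_thickness, indenter_width,
                        residual_thickness, viscosity, velocity, surface_tension):
        """Classify the expected polymer deformation mode during embossing.

        The three dimensionless groups follow the paper's description; the
        numerical thresholds are illustrative, not values from the study."""
        geometry_ratio = cavity_width / film_thickness      # vertical vs lateral flow
        supply_ratio = indenter_width / residual_thickness  # Stokes vs squeeze flow
        capillary_number = viscosity * velocity / surface_tension

        mode = "laterally dominant" if geometry_ratio > 1.0 else "vertically dominant"
        flow = "squeeze flow" if supply_ratio > 1.0 else "Stokes flow"
        driver = "viscous-driven" if capillary_number > 1.0 else "capillary-driven"
        return mode, flow, driver

    # Example: a wide, shallow cavity over a thin film at slow fill speed
    # (all values are illustrative)
    print(nil_flow_regime(2e-6, 200e-9, 10e-6, 100e-9, 1e4, 1e-9, 0.03))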
Date: August 1, 2006
Creator: Schunk, Peter Randall; King, William P. (Georgia Institute of Technology, Atlanta, GA); Sun, Amy Cha-Tien & Rowland, Harry D. (Georgia Institute of Technology, Atlanta, GA)
Object Type: Report
System: The UNT Digital Library
A Hard Constraint Algorithm to Model Particle Interactions in DNA-laden Flows (open access)

A Hard Constraint Algorithm to Model Particle Interactions in DNA-laden Flows

We present a new method for particle interactions in polymer models of DNA. The DNA is represented by a bead-rod polymer model and is fully-coupled to the fluid. The main objective in this work is to implement short-range forces to properly model polymer-polymer and polymer-surface interactions, specifically, rod-rod and rod-surface uncrossing. Our new method is based on a rigid constraint algorithm whereby rods elastically bounce off one another to prevent crossing, similar to our previous algorithm used to model polymer-surface interactions. We compare this model to a classical (smooth) potential which acts as a repulsive force between rods, and rods and surfaces.
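The "elastic bounce" idea can be illustrated with a simple two-bead sketch in Python (a schematic of the concept only; the paper's actual algorithm enforces rod-rod and rod-surface constraints within a bead-rod polymer model fully coupled to the fluid):

    import numpy as np

    def elastic_bounce(x1, v1, x2, v2, min_sep):
        """Schematic elastic 'bounce' between two beads: if the beads are
        closer than min_sep and approaching, exchange the component of the
        relative velocity along the line of centers (equal masses) so the
        pair separates instead of crossing."""
        n = x2 - x1
        dist = np.linalg.norm(n)
        if dist >= min_sep:
            return v1, v2
        n = n / dist
        v_rel = np.dot(v2 - v1, n)
        if v_rel < 0.0:            # only act if the beads are approaching
            v1 = v1 + v_rel * n
            v2 = v2 - v_rel * n
        return v1, v2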
Date: August 1, 2006
Creator: Trebotich, D; Miller, G H & Bybee, M D
Object Type: Article
System: The UNT Digital Library
Temporal transcriptomic analysis of Desulfovibrio vulgaris Hildenborough transition into stationary phase growth during electron donor depletion (open access)

Temporal transcriptomic analysis of Desulfovibrio vulgaris Hildenborough transition into stationary phase growth during electron donor depletion

Desulfovibrio vulgaris was cultivated in a defined medium, and biomass was sampled for approximately 70 h to characterize the shifts in gene expression as cells transitioned from the exponential to the stationary phase during electron donor depletion. In addition to temporal transcriptomics, total protein, carbohydrate, lactate, acetate, and sulfate levels were measured. The microarray data were examined for statistically significant expression changes, hierarchical cluster analysis, and promoter element prediction and were validated by quantitative PCR. As the cells transitioned from the exponential phase to the stationary phase, a majority of the down-expressed genes were involved in translation and transcription, and this trend continued at the remaining times. There were general increases in relative expression for intracellular trafficking and secretion, ion transport, and coenzyme metabolism as the cells entered the stationary phase. As expected, the DNA replication machinery was down-expressed, and the expression of genes involved in DNA repair increased during the stationary phase. Genes involved in amino acid acquisition, carbohydrate metabolism, energy production, and cell envelope biogenesis did not exhibit uniform transcriptional responses. Interestingly, most phage-related genes were up-expressed at the onset of the stationary phase. This result suggested that nutrient depletion may affect community dynamics and DNA transfer mechanisms …
Date: August 1, 2006
Creator: Clark, M. E.; He, Q.; He, Z.; Huang, K. H.; Alm, E. J.; Wan, X. -F. et al.
Object Type: Article
System: The UNT Digital Library
Genomics:GTL Bioenergy Research Centers White Paper (open access)

Genomics:GTL Bioenergy Research Centers White Paper

In his Advanced Energy Initiative announced in January 2006, President George W. Bush committed the nation to new efforts to develop alternative sources of energy to replace imported oil and fossil fuels. Developing cost-effective and energy-efficient methods of producing renewable alternative fuels such as cellulosic ethanol from biomass and solar-derived biofuels will require transformational breakthroughs in science and technology. Incremental improvements in current bioenergy production methods will not suffice. The Genomics:GTL Bioenergy Research Centers will be dedicated to fundamental research on microbe and plant systems with the goal of developing knowledge that will advance biotechnology-based strategies for biofuels production. The aim is to spur substantial progress toward cost-effective production of biologically based renewable energy sources. This document describes the rationale for the establishment of the centers and their objectives in light of the U.S. Department of Energy's mission and goals. Developing energy-efficient and cost-effective methods of producing alternative fuels such as cellulosic ethanol from biomass will require transformational breakthroughs in science and technology. Incremental improvements in current bioenergy-production methods will not suffice. The focus on microbes (for cellular mechanisms) and plants (for source biomass) fundamentally exploits capabilities well known to exist in the microbial world. Thus 'proof of concept' is …
Date: August 1, 2006
Creator: Mansfield, Betty Kay; Alton, Anita Jean; Andrews, Shirley H; Bownas, Jennifer Lynn; Casey, Denise; Martin, Sheryl A et al.
Object Type: Report
System: The UNT Digital Library
Development of a Model Specification for Performance Monitoring Systems for Commercial Buildings (open access)

Development of a Model Specification for Performance Monitoring Systems for Commercial Buildings

The paper describes the development of a model specification for performance monitoring systems for commercial buildings. The specification focuses on four key aspects of performance monitoring: (1) performance metrics; (2) measurement system requirements; (3) data acquisition and archiving; and (4) data visualization and reporting. The aim is to assist building owners in specifying the extensions to their control systems that are required to provide building operators with the information needed to operate their buildings more efficiently and to provide automated diagnostic tools with the information required to detect and diagnose faults and problems that degrade energy performance. The paper reviews the potential benefits of performance monitoring, describes the specification guide and discusses briefly the ways in which it could be implemented. A prototype advanced visualization tool is also described, along with its application to performance monitoring. The paper concludes with a description of the ways in which the specification and the visualization tool are being disseminated and deployed.
Date: August 1, 2006
Creator: Haves, Philip; Hitchcock, Robert J.; Gillespie, Kenneth L.; Brook, Martha; Shockman, Christine; Deringer, Joseph J. et al.
Object Type: Article
System: The UNT Digital Library
The effect of truncation on very small cardiac SPECT camera systems (open access)

The effect of truncation on very small cardiac SPECT camera systems

Background: The limited transaxial field-of-view (FOV) of a very small cardiac SPECT camera system causes view-dependent truncation of the projection of structures exterior to, but near, the heart. Basic tomographic principles suggest that the reconstruction of non-attenuated truncated data gives a distortion-free image in the interior of the truncated region, but the DC term of the Fourier spectrum of the reconstructed image is incorrect, meaning that the intensity scale of the reconstruction is inaccurate. The purpose of this study was to characterize the reconstructed image artifacts from truncated data, and to quantify their effects on the measurement of tracer uptake in the myocardium. Particular attention was given to instances where the heart wall is close to hot structures (structures of high activity uptake). Methods: The MCAT phantom was used to simulate a 2D slice of the heart region. Truncated and non-truncated projections were formed both with and without attenuation. The reconstructions were analyzed for artifacts in the myocardium caused by truncation, and for the effect that attenuation has relative to increasing those artifacts. Results: The inaccuracy due to truncation is primarily caused by an incorrect DC component. For visualizing the left ventricular wall, this error is not worse than the effect of attenuation. The addition of a small hot bowel-like structure near the left ventricle causes few changes in counts on …
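A small Python sketch of the truncation effect described above, using a standard phantom and filtered back-projection from scikit-image rather than the MCAT phantom and reconstruction chain used in the report (truncation is mimicked by zeroing detector bins outside a narrow FOV):

    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon

    image = shepp_logan_phantom()
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(image, theta=theta, circle=True)

    truncated = sinogram.copy()
    n_det = truncated.shape[0]
    keep = n_det // 3                        # keep only the central third of the FOV
    truncated[: (n_det - keep) // 2, :] = 0.0
    truncated[(n_det + keep) // 2 :, :] = 0.0

    full_rec = iradon(sinogram, theta=theta, circle=True)
    trunc_rec = iradon(truncated, theta=theta, circle=True)

    # Interior structures keep their shape, but the overall intensity scale
    # (the DC term) is biased in the truncated reconstruction.
    center = slice(n_det // 2 - 20, n_det // 2 + 20)
    print(full_rec[center, center].mean(), trunc_rec[center, center].mean())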
Date: August 1, 2006
Creator: Rohmer, Damien; Eisner, Robert L. & Gullberg, Grant T.
Object Type: Report
System: The UNT Digital Library
A Risk-Based Sensor Placement Methodology (open access)

A Risk-Based Sensor Placement Methodology

A sensor placement methodology is proposed to solve the problem of optimal location of sensors or detectors to protect population against the exposure to and effects of known and/or postulated chemical, biological, and/or radiological threats. Historical meteorological data are used to characterize weather conditions as wind speed and direction pairs with the percentage of occurrence of the pairs over the historical period. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate population at risk against standard exposure levels. Sensor locations are determined via a dynamic programming algorithm where threats captured or detected by sensors placed in prior stages are removed from consideration in subsequent stages. Moreover, the proposed methodology provides a quantification of the marginal utility of each additional sensor or detector. Thus, the criterion for halting the iterative process can be the number of detectors available, a threshold marginal utility value, or the cumulative detection of a minimum factor of the total risk value represented by all threats.
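A simplified greedy sketch, in Python, of the staged placement idea described above (threats already covered are dropped from later stages, and each stage reports its marginal utility); the data structures and stopping rule are illustrative assumptions rather than the report's dynamic-programming implementation:

    def place_sensors(coverage, risk, max_sensors, min_marginal_utility=0.0):
        """Greedy staged placement sketch.

        coverage: dict mapping candidate location -> set of threat ids it detects
        risk:     dict mapping threat id -> population-at-risk value
        Returns the chosen locations with the marginal utility of each."""
        remaining = set(risk)          # threats not yet detected
        chosen = []
        while len(chosen) < max_sensors and remaining:
            best_loc, best_gain = None, min_marginal_utility
            for loc, threats in coverage.items():
                if loc in (c for c, _ in chosen):
                    continue
                gain = sum(risk[t] for t in threats & remaining)
                if gain > best_gain:
                    best_loc, best_gain = loc, gain
            if best_loc is None:       # marginal utility threshold reached
                break
            chosen.append((best_loc, best_gain))
            remaining -= coverage[best_loc]
        return chosen

    # Example with three candidate locations and four postulated threats:
    cov = {"A": {1, 2}, "B": {2, 3}, "C": {4}}
    pop = {1: 500.0, 2: 300.0, 3: 200.0, 4: 50.0}
    print(place_sensors(cov, pop, max_sensors=2))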
Date: August 1, 2006
Creator: Lee, Ronald W. & Kulesz, James J.
Object Type: Report
System: The UNT Digital Library
Effect of Process Variables During the Head-End Treatment of Spent Oxide Fuel (open access)

Effect of Process Variables During the Head-End Treatment of Spent Oxide Fuel

The development of a head-end processing step for spent oxide fuel that applies to both aqueous and pyrometallurgical technologies is being performed by the Idaho National Laboratory, the Oak Ridge National Laboratory, and the Korean Atomic Energy Research Institute through a joint International Nuclear Energy Research Initiative. The processing step employs high temperatures and oxidative gases to promote the oxidation of UO{sub 2} to U{sub 3}O{sub 8}. Potential benefits of the head-end step include the removal or reduction of fission products as well as separation of the fuel from cladding. The effects of temperature, pressure, oxidative gas, and cladding have been studied with irradiated spent oxide fuel to determine the optimum conditions for process control. Experiments with temperatures ranging from 500°C to 1250°C have been performed on spent fuel using either air or oxygen gas for the oxidative reaction. Various flowrates and applications have been tested with the oxidative gases to discern the effects on the process. Tests have also been performed under vacuum conditions, following the oxidation cycle, at high temperatures to improve the removal of fission products. The effects of cladding on fission product removal have also been investigated with released fuel under vacuum and high temperature conditions. Results from these …
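The oxidation referred to above converts uranium dioxide to triuranium octoxide; the overall reaction is

    3\,\mathrm{UO}_2 + \mathrm{O}_2 \;\longrightarrow\; \mathrm{U}_3\mathrm{O}_8

and the accompanying volume expansion and restructuring of the fuel is what helps release fission products and separate the fuel from its cladding.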
Date: August 1, 2006
Creator: Bateman, K. J.; Morgan, C. D.; Berg, J. F.; Brough, D. J.; Crane, P. J.; Cummings, D. G. et al.
Object Type: Article
System: The UNT Digital Library