Some Specific CASL Requirements for Advanced Multiphase Flow Simulation of Light Water Reactors (open access)

Because of the diversity of physical phenomena occurring in boiling, flashing, and bubble collapse, and of the length and time scales of LWR systems, it is imperative that the models have the following features:
• Both vapor and liquid phases (and noncondensable phases, if present) must be treated as compressible.
• Models must be mathematically and numerically well-posed.
• The modeling methodology must be multi-scale.
A fundamental derivation of the multiphase governing equation system, which should be used as a basis for advanced multiphase modeling in LWR coolant systems, is given in the Appendix using the ensemble averaging method. The remainder of this work focuses specifically on the compressible, well-posed, and multi-scale requirements of advanced simulation methods for these LWR coolant systems, because these are the most fundamental aspects, without which widespread advancement cannot be claimed. Because of the expense of developing multiple special-purpose codes and the inherent inability to couple information across the multiple, separate length and time scales, efforts within CASL should be focused toward development of multi-scale approaches to solve those multiphase flow problems relevant to LWR design and safety analysis. Efforts should be aimed at developing well-designed, unified physical/mathematical and high-resolution numerical models for compressible, …
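For context, a representative ensemble-averaged phasic mass balance of the kind such a derivation produces is shown below; this is a generic two-fluid form, not an equation quoted from the report.

```latex
% Generic ensemble-averaged (two-fluid) mass balance for phase k, with
% volume fraction alpha_k, density rho_k, velocity u_k, and interphase
% mass transfer Gamma_k -- shown only to indicate the type of system
% such an ensemble-averaging derivation yields.
\[
  \frac{\partial}{\partial t}\!\left(\alpha_k \rho_k\right)
  + \nabla \cdot \left(\alpha_k \rho_k \mathbf{u}_k\right)
  = \Gamma_k
\]
```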
Date: November 1, 2010
Creator: Berry, R. A.
Object Type: Report
System: The UNT Digital Library
Production of high brightness H- beam by charge exchange of hydrogen atom beam in sodium jet (open access)

Production of an H⁻ beam for accelerator applications by charge exchange of a high-brightness hydrogen neutral beam in a sodium jet cell is studied experimentally in a joint BNL-BINP experiment. In the experiment, a hydrogen neutral beam with 3-6 keV energy, equivalent current up to 5 A, and 200 microsecond pulse duration is used. The atomic beam is produced by charge exchange of a proton beam in a pulsed hydrogen target. Formation of the proton beam is performed in an ion source by a four-electrode multiaperture ion-optical system. To achieve small beam emittance, the apertures in the ion-optical system are made sufficiently small, and the ions are extracted from the surface of a plasma emitter with a low transverse ion temperature of ~0.2 eV, formed as a result of plasma jet expansion from the arc plasma generator. The sodium jet target with recirculation and a 2 cm aperture diameter, developed for the BNL optically pumped polarized ion source, is used in the experiment. In the first stage of the experiment, an H⁻ beam with 36 mA current, 5 keV energy, and ~0.15 cm·mrad normalized emittance was obtained. To increase the H⁻ beam current, a ballistically focused hydrogen neutral beam will …
Date: November 16, 2010
Creator: Davydenko, V.; Zelenski, A.; Ivanov, A. & Kolmogorov, A.
Object Type: Article
System: The UNT Digital Library
Using a two-step matrix solution to reduce the run time in KULL's magnetic diffusion package (open access)

Recently a Resistive Magnetohydrodynamics (MHD) package has been added to the KULL code. In order to be compatible with the underlying hydrodynamics algorithm, a new sub-zonal magnetics discretization was developed that supports arbitrary polygonal and polyhedral zones. This flexibility comes at the cost of many more unknowns per zone - approximately ten times more for a hexahedral mesh. We can eliminate some (or all, depending on the dimensionality) of the extra unknowns from the global matrix during assembly by using a Schur complement approach. This trades expensive global work for cache-friendly local work, while still allowing solution for the full system. Significant improvements in the solution time are observed for several test problems.
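As an aside, the Schur-complement elimination described above can be illustrated with a small generic linear-algebra sketch; this is not the KULL implementation, and the block partitioning below is purely illustrative.

```python
import numpy as np

def schur_eliminate(A_ll, A_lg, A_gl, A_gg, b_l, b_g):
    """Eliminate the zone-local unknowns x_l from the block system
        [A_ll A_lg][x_l]   [b_l]
        [A_gl A_gg][x_g] = [b_g]
    returning the reduced (Schur complement) system in x_g only.
    The eliminations are small, cache-friendly local solves; only the
    reduced system must be assembled and solved globally."""
    Ainv_Alg = np.linalg.solve(A_ll, A_lg)      # local solves
    Ainv_bl = np.linalg.solve(A_ll, b_l)
    S = A_gg - A_gl @ Ainv_Alg                  # Schur complement
    rhs = b_g - A_gl @ Ainv_bl
    return S, rhs

def recover_local(A_ll, A_lg, b_l, x_g):
    """Back-substitute for the eliminated local unknowns."""
    return np.linalg.solve(A_ll, b_l - A_lg @ x_g)

# toy system: 3 "local" and 2 "global" unknowns
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5)) + 5.0 * np.eye(5)   # well-conditioned toy matrix
b = rng.normal(size=5)
S, rhs = schur_eliminate(A[:3, :3], A[:3, 3:], A[3:, :3], A[3:, 3:], b[:3], b[3:])
x_g = np.linalg.solve(S, rhs)
x_l = recover_local(A[:3, :3], A[:3, 3:], b[:3], x_g)
assert np.allclose(np.concatenate([x_l, x_g]), np.linalg.solve(A, b))
```

Solving the reduced system reproduces the full solution; the gain comes from the reduced global system being much smaller than the fully assembled one.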
Date: December 17, 2010
Creator: Brunner, T A & Kolev, T V
Object Type: Article
System: The UNT Digital Library

High Throughput Pretreatment and Enzyme Hydrolysis of Biomass: Screening Recalcitrance in Large Sample Populations

Presentation on the execution of the first high-throughput thermochemical pretreatment/enzyme digestion pipeline for screening biomass for recalcitrance.
Date: October 1, 2010
Creator: Decker, S. R.
Object Type: Presentation
System: The UNT Digital Library
Fighting Fire with Fire: Modeling the Datacenter-Scale Effects of Targeted Superlattice Thermal Management (open access)

Local thermal hot-spots in microprocessors lead to worst-case provisioning of global cooling resources, especially in large-scale systems. However, the efficiency of cooling solutions degrades non-linearly with supply temperature, resulting in high cooling power consumption and cost: 50-100% of IT power. Recent advances in active cooling techniques have shown on-chip thermoelectric coolers (TECs) to be very efficient at selectively eliminating small hot-spots: applying current to a superlattice film deposited between the silicon and the heat spreader produces a Peltier effect that spreads the heat and significantly lowers the hot-spot temperature, improving chip reliability. In this paper, we propose that hot-spot mitigation using thermoelectric coolers can be used as a power management mechanism, allowing global coolers to be provisioned for a better worst-case temperature and leading to substantial savings in cooling power. In order to quantify the potential power savings from using TECs in data center servers, we present a detailed power model that integrates on-chip dynamic and leakage power sources, heat diffusion through the entire chip, TEC and global cooler efficiencies, and all their mutual interactions. Our multiscale analysis shows that, for a typical data center, TECs allow global coolers to operate …
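The trade-off quantified by the paper's model can be caricatured with a back-of-the-envelope sketch; the COP characteristic and all numbers below are assumed for illustration and are not taken from the article.

```python
def cooling_power_savings(it_power_kw, cop_at, t_base_c, t_tec_c, tec_power_kw):
    """Hypothetical illustration of the TEC trade-off: removing on-chip
    hot-spots lets the global cooler run at a warmer supply temperature,
    where its COP is higher, at the cost of the TEC input power."""
    baseline = it_power_kw / cop_at(t_base_c)            # cooling power without TECs
    with_tec = it_power_kw / cop_at(t_tec_c) + tec_power_kw
    return baseline - with_tec

# assumed, illustrative numbers only
cop = lambda t_c: 0.15 * t_c + 1.0   # COP assumed to improve with supply temperature
print(cooling_power_savings(1000.0, cop, t_base_c=10.0, t_tec_c=18.0,
                            tec_power_kw=40.0))          # ~90 kW saved in this toy case
```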
Date: November 11, 2010
Creator: Biswas, S; Tiwari, M; Theogarajan, L; Sherwood, T P & Chong, F T
Object Type: Article
System: The UNT Digital Library
A truncated Levenberg-Marquardt algorithm for the calibration of highly parameterized nonlinear models (open access)

We propose a modification to the Levenberg-Marquardt minimization algorithm for a more robust and more efficient calibration of highly parameterized, strongly nonlinear models of multiphase flow through porous media. The new method combines the advantages of truncated singular value decomposition with those of the classical Levenberg-Marquardt algorithm, thus enabling a more robust solution of underdetermined inverse problems with complex relations between the parameters to be estimated and the observable state variables used for calibration. The truncation limit separating the solution space from the calibration null space is re-evaluated during the iterative calibration process. In between these re-evaluations, fewer forward simulations are required, compared to the standard approach, to calculate the approximate sensitivity matrix. Truncated singular values are used to calculate the Levenberg-Marquardt parameter updates, ensuring that safe small steps along the steepest-descent direction are taken for highly correlated parameters of low sensitivity, whereas efficient quasi-Gauss-Newton steps are taken for independent parameters with high impact. The performance of the proposed scheme is demonstrated for a synthetic data set representing infiltration into a partially saturated, heterogeneous soil, where hydrogeological, petrophysical, and geostatistical parameters are estimated based on the joint inversion of hydrological and geophysical data.
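The combination described above can be sketched in a few lines; this is a minimal illustration of a truncated-SVD Levenberg-Marquardt update under assumed interfaces, not the authors' implementation (the function name and truncation rule are hypothetical).

```python
import numpy as np

def truncated_lm_step(J, r, lam, trunc_ratio=1e-3):
    """One illustrative truncated-SVD Levenberg-Marquardt update.

    J           -- Jacobian of the residuals w.r.t. the parameters (n_obs x n_par)
    r           -- residual vector (observed minus simulated)
    lam         -- Levenberg-Marquardt damping parameter
    trunc_ratio -- singular values below trunc_ratio * s_max are treated as the
                   calibration null space and discarded
    """
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    keep = s >= trunc_ratio * s[0]          # truncation limit
    U, s, Vt = U[:, keep], s[keep], Vt[keep, :]
    # Damped pseudo-inverse in the retained solution space: small singular
    # values give short, steepest-descent-like steps; large ones give
    # quasi-Gauss-Newton steps.
    factors = s / (s**2 + lam)
    return Vt.T @ (factors * (U.T @ r))

# usage sketch (hypothetical forward model):
#   dp = truncated_lm_step(J, observed - simulated, lam=0.1)
#   parameters += dp
```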
Date: October 15, 2010
Creator: Finsterle, S. & Kowalsky, M.B.
Object Type: Article
System: The UNT Digital Library
Tools for Predicting Optical Damage on Inertial Confinement Fusion-Class Laser Systems (open access)

Operating a fusion-class laser to its full potential requires a balance of operating constraints. On the one hand, the total laser energy delivered must be high enough to give an acceptable probability for ignition success. On the other hand, the laser-induced optical damage levels must be low enough to be acceptably handled with the available infrastructure and budget for optics recycle. Our research goal was to develop the models, database structures, and algorithmic tools (which we collectively refer to as "Loop Tools") needed to successfully maintain this balance. Predictive models are needed to plan for and manage the impact of shot campaigns from proposal to shot and beyond, covering a time span of years. The cost of a proposed shot campaign must be determined from these models, and governance boards must decide, based on these predictions and the available resources, whether to incorporate a given campaign into the facility shot plan. Predictive models are often built on damage "rules" derived from small beam damage tests on small optics. These off-line studies vary the energy, pulse shape, and wavelength in order to understand how these variables influence the initiation of damage sites and how initiated damage sites can grow upon further exposure to …
Date: December 20, 2010
Creator: Nostrand, M. C.; Carr, C. W.; Liao, Z. M.; Honig, J.; Spaeth, M. L.; Manes, K. R. et al.
Object Type: Report
System: The UNT Digital Library
Commercial Buildings Partnership Projects - Metered Data Format and Delivery (open access)

A number of the Commercial Building Partnership Projects (CBPs) will require metering, monitoring, data analysis, and verification of savings after the retrofits are complete. Although monitoring and verification (M&V) agents are free to use any metering and monitoring devices that they choose, the data they collect should be reported to Pacific Northwest National Laboratory (PNNL) in a standard format. PNNL will store the data collected in its CBP database for further use by PNNL and the U.S. Department of Energy. This document describes the data storage process and the delivery format of the data from the M&V agents.
Date: November 16, 2010
Creator: Katipamula, Srinivas
Object Type: Report
System: The UNT Digital Library
What Can China Do? China's Best Alternative Outcome for Energy Efficiency and CO2 Emissions (open access)

After rapid growth in economic development and energy demand over the last three decades, China has undertaken energy efficiency improvement efforts to reduce its energy intensity under the 11th Five Year Plan (FYP). Since becoming the world's largest annual CO₂ emitter in 2007, China has set reduction targets for energy and carbon intensities and committed to meeting 15% of its total 2020 energy demand with non-fossil fuel. Despite having achieved important savings in 11th FYP efficiency programs, rising per capita income and the continued economic importance of trade will drive demand for transport activity and fuel use. At the same time, an increasingly 'electrified' economy will drive rapid power demand growth. Greater analysis is therefore needed to understand the underlying drivers, possible trajectories and mitigation potential in the growing industrial, transport and power sectors. This study uses scenario analysis to understand the likely trajectory of China's energy and carbon emissions to 2030 in light of the current and planned portfolio of programs, policies and technology development and ongoing urbanization and demographic trends. It evaluates the potential impacts of alternative transportation and power sector development using two key scenarios, Continued Improvement Scenario (CIS) and Accelerated Improvement Scenario (AIS). CIS represents …
Date: July 1, 2010
Creator: Fridley, David G.; Zheng, Nina & Aden, Nathaniel T.
Object Type: Report
System: The UNT Digital Library
Cryogenic safety aspects of the low-$\beta$ magnet systems at the Large Hadron Collider (LHC) (open access)

The low-β magnet systems are located in the LHC insertion regions around the four interaction points. They are the key elements in the beam focusing/defocusing process and will allow proton collisions at a luminosity of up to 10³⁴ cm⁻²s⁻¹. The large radiation doses deposited in the proximity of the beam collisions dictate stringent requirements for the design and operation of these systems. The hardware commissioning phase of the LHC was completed in the winter of 2010 and permitted validation of the safe operation of this system. This paper presents the analysis used to qualify and quantify the safe operation of the low-β magnet systems in the Large Hadron Collider (LHC) for the first years of operation.
Date: July 1, 2010
Creator: Darve, C.
Object Type: Article
System: The UNT Digital Library
Midlatitude Continental Convective Clouds Experiment (MC3E) (open access)

The Midlatitude Continental Convective Clouds Experiment (MC3E) will take place in central Oklahoma during the April–May 2011 period. The experiment is a collaborative effort between the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility and the National Aeronautics and Space Administration’s (NASA) Global Precipitation Measurement (GPM) mission Ground Validation (GV) program. The field campaign leverages the unprecedented observing infrastructure currently available in the central United States, combined with an extensive sounding array, remote sensing and in situ aircraft observations, NASA GPM ground validation remote sensors, and new ARM instrumentation purchased with American Recovery and Reinvestment Act funding. The overarching goal is to provide the most complete characterization of convective cloud systems, precipitation, and the environment that has ever been obtained, providing constraints for model cumulus parameterizations and space-based rainfall retrieval algorithms over land that have never before been available.
Date: April 10, 2010
Creator: Jensen, M. P.; Petersen, W. A.; Del Genio, A. D.; Giangrande, S. E.; Heymsfield, A.; Heymsfield, G. et al.
Object Type: Report
System: The UNT Digital Library
Biospecimen Reporting for Improved Study Quality (BRISQ) (open access)

Human biospecimens are subjected to collection, processing, and storage that can significantly alter their molecular composition and consistency. These biospecimen preanalytical factors, in turn, influence experimental outcomes and the ability to reproduce scientific results. Currently, the extent and type of information specific to the biospecimen preanalytical conditions reported in scientific publications and regulatory submissions varies widely. To improve the quality of research that uses human tissues, it is crucial that information on the handling of biospecimens be reported in a thorough, accurate, and standardized manner. The Biospecimen Reporting for Improved Study Quality (BRISQ) recommendations outlined herein are intended to apply to any study in which human biospecimens are used. The purpose of reporting these details is to supply others, from researchers to regulators, with more consistent and standardized information to better evaluate, interpret, compare, and reproduce the experimental results. The BRISQ guidelines are proposed as an important and timely resource tool to strengthen communication and publications on biospecimen-related research and to help reassure patient contributors and the advocacy community that their contributions are valued and respected.
Date: September 2, 2010
Creator: National Cancer Institute; Jewell, Scott D., Ph.D.; Seijo, Edward, M.S.; Kelly, Andrea, Ph.D.; Somiari, Stella, Ph.D. et al.
Object Type: Article
System: The UNT Digital Library
StralSV: assessment of sequence variability within similar 3D structures and application to polio RNA-dependent RNA polymerase (open access)

Most of the currently used methods for protein function prediction rely on sequence-based comparisons between a query protein and those for which a functional annotation is provided. A serious limitation of sequence similarity-based approaches for identifying residue conservation among proteins is the low confidence in assigning residue-residue correspondences among proteins when the level of sequence identity between the compared proteins is poor. Multiple sequence alignment methods are more satisfactory - still, they cannot provide reliable results at low levels of sequence identity. Our goal in the current work was to develop an algorithm that could overcome these difficulties and facilitate the identification of structurally (and possibly functionally) relevant residue-residue correspondences between compared protein structures. Here we present StralSV, a new algorithm for detecting closely related structure fragments and quantifying residue frequency from tight local structure alignments. We apply StralSV in a study of the RNA-dependent RNA polymerase of poliovirus and demonstrate that the algorithm can be used to determine regions of the protein that are relatively unique or that share structural similarity with distantly related structures. By quantifying residue frequencies among many residue-residue pairs extracted from local alignments, one can infer potential structural or functional importance of specific …
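The residue-frequency bookkeeping described above can be sketched generically; the data layout below is hypothetical and is not the StralSV code.

```python
from collections import Counter, defaultdict

def residue_frequencies(alignments):
    """For each query position, tally the residues observed at the
    structurally equivalent position across many local structure
    alignments, and return per-position frequencies.
    `alignments` is an iterable of lists of (query_position, residue)
    pairs, one list per local fragment alignment (hypothetical layout)."""
    counts = defaultdict(Counter)
    for pairs in alignments:
        for pos, residue in pairs:
            counts[pos][residue] += 1
    return {pos: {aa: n / sum(ctr.values()) for aa, n in ctr.items()}
            for pos, ctr in counts.items()}

# two toy fragment alignments covering query positions 10-12
aln1 = [(10, "G"), (11, "D"), (12, "D")]
aln2 = [(10, "G"), (11, "E"), (12, "D")]
print(residue_frequencies([aln1, aln2])[11])   # {'D': 0.5, 'E': 0.5}
```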
Date: November 29, 2010
Creator: Zemla, A; Lang, D; Kostova, T; Andino, R & Zhou, C
Object Type: Article
System: The UNT Digital Library
Corrective Action Decision Document/Closure Report for Corrective Action Unit 383: Area E-Tunnel Sites, Nevada Test Site (open access)

This Corrective Action Decision Document/Closure Report (CADD/CR) was prepared by the Defense Threat Reduction Agency (DTRA) for Corrective Action Unit (CAU) 383, Area 12 E-Tunnel Sites, which is the joint responsibility of DTRA and the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO). This CADD/CR is consistent with the requirements of the Federal Facility Agreement and Consent Order (FFACO) agreed to by the State of Nevada, the DOE, and the U.S. Department of Defense. Corrective Action Unit 383 is comprised of three Corrective Action Sites (CASs) and two adjacent areas:
• CAS 12-06-06, Muckpile
• CAS 12-25-02, Oil Spill
• CAS 12-28-02, Radioactive Material
• Drainage below the Muckpile
• Ponds 1, 2, and 3
The purpose of this CADD/CR is to provide justification and documentation to support the recommendation for closure with no further corrective action, by placing use restrictions at the three CASs and two adjacent areas of CAU 383.
Date: March 15, 2010
Creator: National Security Technologies, LLC
Object Type: Report
System: The UNT Digital Library
LDRD 2010 Annual Report: Laboratory Directed Research and Development Program Activities (open access)

N/A
Date: December 31, 2010
Creator: Looney, J. P. & Fox, K.
Object Type: Report
System: The UNT Digital Library
Offshore Code Comparison Collaboration (OC3) for IEA Wind Task 23 Offshore Wind Technology and Deployment (open access)

This final report for IEA Wind Task 23, Offshore Wind Energy Technology and Deployment, is made up of two separate reports, Subtask 1: Experience with Critical Deployment Issues and Subtask 2: Offshore Code Comparison Collaborative (OC3). Subtask 1 discusses ecological issues and regulation, electrical system integration, external conditions, and key conclusions for Subtask 1. Subtask 2, included here, is the larger of the two volumes and contains five chapters that cover background information and objectives of Subtask 2 and results from each of the four phases of the project.
Date: December 1, 2010
Creator: Jonkman, J. & Musial, W.
Object Type: Report
System: The UNT Digital Library
Development of tools and techniques for momentum compression of fast rare isotopes (open access)

As part of our past research and development work, we have created and developed the LISE++ simulation code [Tar04, Tar08]. The LISE++ package was significantly extended with the addition of a Monte Carlo option, which includes the calculation of ion trajectories using a Taylor-series expansion up to fifth order, and with the implementation of the MOTER Monte Carlo code [Kow87] for ray tracing of the ions into the suite of LISE++ codes. The MOTER code was rewritten from FORTRAN into C++ and ported to the MS Windows operating system. Extensive work went into the creation of a user-friendly interface for the code. An example of the graphical user interface created for the MOTER code is shown in the left panel of Figure 1, and the results of a typical calculation for the trajectories of particles that pass through the A1900 fragment separator are shown in the right panel. The MOTER code is presently included as part of the LISE++ package for downloading without restriction by the worldwide community. LISE++ was extensively developed and generalized to apply to any projectile fragment separator during the early phase of this grant. In addition to the inclusion of the MOTER code, other important additions to …
Date: November 21, 2010
Creator: Morrissey, David J.; Sherrill, Bradley M. & Tarasov, Oleg
Object Type: Report
System: The UNT Digital Library
Conceptual Design Report for Remote-Handled Low-Level Waste Disposal Facility (open access)

This conceptual design report addresses development of replacement remote-handled low-level waste disposal capability for the Idaho National Laboratory. Current disposal capability at the Radioactive Waste Management Complex is planned until the facility is full or until it must be closed in preparation for final remediation (approximately at the end of Fiscal Year 2017). This conceptual design report includes key project assumptions; design options considered in development of the proposed onsite disposal facility (the highest ranked alternative for providing continued, uninterrupted remote-handled low-level waste disposal capability); process and facility descriptions; safety and environmental requirements that would apply to the proposed facility; and the proposed cost and schedule for funding, design, construction, and operation of the proposed onsite disposal facility.
Date: October 1, 2010
Creator: Harvego, Lisa; Duncan, David; Connolly, Joan; Hinman, Margaret; Marcinkiewicz, Charles & Mecham, Gary
Object Type: Report
System: The UNT Digital Library
A New Approach in Advance Network Reservation and Provisioning for High-Performance Scientific Data Transfers (open access)

Scientific applications already generate many terabytes and even petabytes of data from supercomputer runs and large-scale experiments. The need for transferring data chunks of ever-increasing sizes through the network shows no sign of abating. Hence, we need high-bandwidth, high-speed networks such as ESnet (Energy Sciences Network). Network reservation systems such as ESnet's OSCARS (On-demand Secure Circuits and Advance Reservation System) establish guaranteed-bandwidth secure virtual circuits for a specified start time, bandwidth, and duration. OSCARS checks network availability and capacity for the specified period of time, and allocates the requested bandwidth for that user if it is available. If the requested reservation cannot be granted, no further suggestion is returned to the user. Further, there is no possibility, from the user's viewpoint, to make an optimal choice. We report a new algorithm in which the user specifies the total volume that needs to be transferred, a maximum bandwidth that he/she can use, and a desired time period within which the transfer should be done. The algorithm can find alternate allocation possibilities, including the earliest time for completion or the shortest transfer duration, leaving the choice to the user. We present a novel approach for path finding in …
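The core idea can be illustrated with a minimal sketch; the data structures and numbers below are hypothetical and do not represent OSCARS or the authors' implementation. Given the requested volume, the user's bandwidth cap, and an assumed availability profile over the desired window, the sketch scans forward for the earliest possible completion time.

```python
from dataclasses import dataclass

@dataclass
class Slot:
    start: float       # hours from now
    end: float
    avail_gbps: float  # assumed available bandwidth on the path in this interval

def earliest_completion(slots, volume_gb, max_gbps):
    """Walk the availability timeline, using min(available, user cap) in each
    interval, and return the earliest time (hours) at which the requested
    volume is fully transferred, or None if it does not fit in the window."""
    remaining = volume_gb * 8.0                       # gigabits still to send
    for s in sorted(slots, key=lambda s: s.start):
        rate = min(s.avail_gbps, max_gbps)
        if rate <= 0:
            continue
        capacity = rate * (s.end - s.start) * 3600.0  # gigabits this slot can carry
        if capacity >= remaining:
            return s.start + remaining / (rate * 3600.0)
        remaining -= capacity
    return None

# e.g. 10 TB to move, user capped at 8 Gbps, a 12-hour window with varying availability
slots = [Slot(0, 4, 2.0), Slot(4, 8, 10.0), Slot(8, 12, 10.0)]
print(earliest_completion(slots, volume_gb=10_000, max_gbps=8.0))   # ~5.78 hours
```

A shortest-duration search would be analogous: slide a window over the same availability profile and keep the feasible window of minimum length.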
Date: January 28, 2010
Creator: Balman, Mehmet; Chaniotakis, Evangelos; Shoshani, Arie & Sim, Alex
Object Type: Report
System: The UNT Digital Library
D0 results on diphoton direct production and double parton interactions in photon + 3 jet events (open access)

We report the measurement of differential diphoton direct production cross sections and a study of photon + 3-jet events with double parton (DP) interactions, based on data taken with the D0 experiment at the Fermilab Tevatron proton-antiproton collider. We measure single differential cross sections as a function of the diphoton mass, the transverse momentum of the diphoton system, the azimuthal angle between the photons, and the polar scattering angle of the photons. In addition, we measure double differential cross sections considering the last three kinematic variables in three diphoton mass bins. The results are compared with different perturbative QCD predictions and event generators. We have used a sample of photon + 3-jet events collected by the D0 experiment with an integrated luminosity of about 1 fb⁻¹ to determine the fraction of events with double parton scattering (f_DP) in a single proton-antiproton collision at √s = 1.96 TeV. The DP fraction and the effective cross section (σ_eff), a process-independent scale parameter related to the parton density inside the nucleon, are measured in three intervals of the second (ordered in p_T) jet transverse momentum p_T^jet2 within the range 15 < p_T^jet < 30 GeV. …
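For context, the effective cross section is conventionally defined through the relation below; this is standard double-parton-scattering phenomenology, not a formula quoted from the article.

```latex
% Conventional definition of sigma_eff: for two hard subprocesses A and B
% occurring in the same hadron-hadron collision,
\[
  \sigma^{AB}_{\mathrm{DP}} \;=\; \frac{m}{2}\,
    \frac{\sigma_{A}\,\sigma_{B}}{\sigma_{\mathrm{eff}}},
  \qquad
  m = \begin{cases} 1, & A = B \\ 2, & A \neq B \end{cases}
\]
```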
Date: January 1, 2010
Creator: Sawyer, Lee (Louisiana Tech U.)
Object Type: Article
System: The UNT Digital Library
The use of a high-order MEMS deformable mirror in the Gemini Planet Imager (open access)

We briefly review the development history of the Gemini Planet Imager's 4K Boston Micromachines MEMS deformable mirror. We discuss essential calibration steps and algorithms to control the MEMS with nanometer precision, including voltage-phase calibration and influence function characterization. We discuss the integration of the MEMS into GPI's Adaptive Optics system at Lawrence Livermore and present experimental results of 1.5 kHz closed-loop control. We detail mitigation strategies in the coronagraph to reduce the impact of abnormal actuators on final image contrast.
Date: December 17, 2010
Creator: Poyneer, L. A.; Bauman, B.; Cornelissen, S.; Jones, S.; Macintosh, B.; Palmer, D. et al.
Object Type: Article
System: The UNT Digital Library
Iron-Air Rechargeable Battery: A Robust and Inexpensive Iron-Air Rechargeable Battery for Grid-Scale Energy Storage (open access)

GRIDS Project: USC is developing an iron-air rechargeable battery for large-scale energy storage that could help integrate renewable energy sources into the electric grid. Iron-air batteries have the potential to store large amounts of energy at low cost—iron is inexpensive and abundant, while oxygen is freely obtained from the air we breathe. However, current iron-air battery technologies have suffered from low efficiency and short life spans. USC is working to dramatically increase the efficiency of the battery by placing chemical additives on the battery’s iron-based electrode and restructuring the catalysts at the molecular level on the battery’s air-based electrode. This can help the battery resist degradation and increase its life span. The goal of the project is to develop a prototype iron-air battery at a cost significantly lower than today’s best commercial batteries.
Date: October 1, 2010
Creator: unknown
Object Type: Text
System: The UNT Digital Library
Solid oxide electrochemical reactor science. (open access)

Solid-oxide electrochemical cells are an exciting new technology. Development of solid-oxide cells (SOCs) has advanced considerably in recent years and continues to progress rapidly. This thesis studies several aspects of SOCs and contributes useful information to their continued development. This LDRD involved a collaboration between Sandia and the Colorado School of Mines (CSM) in solid-oxide electrochemical reactors targeted at solid-oxide electrolyzer cells (SOECs), which are the reverse of solid-oxide fuel cells (SOFCs). SOECs complement Sandia's efforts in thermochemical production of alternative fuels. An SOEC technology would co-electrolyze carbon dioxide (CO₂) with steam at temperatures around 800 °C to form synthesis gas (H₂ and CO), which forms the building blocks for petrochemical substitutes that can be used to power vehicles or in distributed energy platforms. The effort described here concentrates on research concerning catalytic chemistry, charge-transfer chemistry, and optimal cell architecture. The technical scope included computational modeling, materials development, and experimental evaluation. The project engaged the Colorado Fuel Cell Center at CSM through the support of a graduate student (Connor Moyer) at CSM and his advisors (Profs. Robert Kee and Neal Sullivan) in collaboration with Sandia.
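For reference, the co-electrolysis step described above corresponds to the overall reactions below; this is standard chemistry, not text taken from the report.

```latex
% Overall SOEC co-electrolysis reactions: steam and CO2 are reduced at the
% cathode while oxygen is evolved at the anode.
\[
  \mathrm{H_2O \;\longrightarrow\; H_2 + \tfrac{1}{2}\,O_2},
  \qquad
  \mathrm{CO_2 \;\longrightarrow\; CO + \tfrac{1}{2}\,O_2}
\]
% The H2 + CO product mixture (synthesis gas) is the feedstock for the
% petrochemical substitutes mentioned above.
```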
Date: September 1, 2010
Creator: Sullivan, Neal P. (Colorado School of Mines, Golden, CO); Stechel, Ellen Beth; Moyer, Connor J. (Colorado School of Mines, Golden, CO); Ambrosini, Andrea & Kee, Robert J. (Colorado School of Mines, Golden, CO)
Object Type: Report
System: The UNT Digital Library
The National Ignition Facility and the Promise of Inertial Fusion Energy (open access)

The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory (LLNL) in Livermore, CA, is now operational. The NIF is the world's most energetic laser system capable of producing 1.8 MJ and 500 TW of ultraviolet light. By concentrating the energy from its 192 extremely energetic laser beams into a mm³-sized target, NIF can produce temperatures above 100 million K, densities of 1,000 g/cm³, and pressures 100 billion times atmospheric pressure - conditions that have never been created in a laboratory and emulate those in planetary interiors and stellar environments. On September 29, 2010, the first integrated ignition experiment was conducted, demonstrating the successful coordination of the laser, cryogenic target system, array of diagnostics and infrastructure required for ignition demonstration. In light of this strong progress, the U.S. and international communities are examining the implication of NIF ignition for inertial fusion energy (IFE). A laser-based IFE power plant will require a repetition rate of 10-20 Hz and a laser with 10% electrical-optical efficiency, as well as further development and advances in large-scale target fabrication, target injection, and other supporting technologies. These capabilities could lead to a prototype IFE demonstration plant in the 10- to 15-year time frame. …
Date: December 13, 2010
Creator: Moses, E I
Object Type: Article
System: The UNT Digital Library