Environmentally Benign Stab Detonators (open access)

Environmentally Benign Stab Detonators

The coupling of energetic metallic multilayers (a.k.a. flash metal) with energetic sol-gel synthesis and processing is an entirely new approach to forming energetic devices for several DoD and DOE needs; both are practical, commercially viable manufacturing techniques. These methodologies and materials can improve occupational safety and health, performance, reliability, and reproducibility while allowing environmentally acceptable processing. Developing and fielding this technology will enhance mission readiness and reduce the costs, environmental risks, and burden of resolving environmental concerns associated with maintaining military readiness, while simultaneously enhancing safety and health. Without sacrificing current performance, we will formulate new impact-initiated device (IID) compositions to replace materials in the current composition that pose significant environmental, health, and safety problems at every stage: synthesis, material receipt, storage, handling, processing into the composition, reaction products from testing, and safe disposal. To do this, we will advance nanocomposite preparation via multilayer flash-metal and sol-gel technologies and apply it to new small IIDs. This work will also demonstrate that these technologies and the resulting materials are relevant and practical to a variety of DoD and DOE energetic needs. The goal …
Date: July 7, 2006
Creator: Gash, A. E.
System: The UNT Digital Library
HINS R&D Collaboration on Electron Cloud Effects: Midyear Report (open access)

HINS R&D Collaboration on Electron Cloud Effects: Midyear Report

We present a report on ongoing activities on electron-cloud R&D for the MI upgrade. These results update and extend those presented in Refs. 1, 2. In this report we have significantly expanded the parameter range explored in bunch intensity N_b, RMS bunch length σ_z, and peak secondary emission yield (SEY) δ_max, but we have constrained our simulations to a field-free region. We describe the threshold behavior in all three of these parameters. For δ_max ≥ 1.5 we find that, even for N_b = 1 × 10^11, the electron-cloud density, when averaged over the entire chamber, exceeds the beam neutralization level, but remains significantly below the local neutralization level (i.e., when the electron density is computed in the neighborhood of the beam). This 'excess' of electrons is accounted for by narrow regions of high electron concentration very close to the chamber surface, especially at the top and bottom of the chamber, akin to virtual cathodes. These virtual cathodes are kept in equilibrium, on average, by a competition between space-charge forces (including their images) and secondary emission, a mechanism that shares some features with the space-charge saturation of the current in a diode at …
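As a rough aid to the neutralization comparison above, the following Python sketch computes the chamber-averaged beam density that defines the beam neutralization level, assuming an elliptical chamber and uniform bunch spacing; the numerical values (bunch spacing, chamber half-axes) are illustrative placeholders, not parameters from the report.

```python
import math

def beam_neutralization_density(N_b, bunch_spacing_m, chamber_a_m, chamber_b_m):
    """Average beam density over an elliptical chamber cross-section.

    The 'beam neutralization level' is the electron density at which the
    chamber-averaged electron density equals this value.
    """
    line_density = N_b / bunch_spacing_m          # protons per metre of beam line
    chamber_area = math.pi * chamber_a_m * chamber_b_m
    return line_density / chamber_area            # particles per cubic metre

# Example with illustrative numbers (assumptions, not values from the report)
n_neut = beam_neutralization_density(N_b=1e11, bunch_spacing_m=5.6,
                                     chamber_a_m=0.061, chamber_b_m=0.024)
print(f"beam neutralization density ~ {n_neut:.2e} m^-3")
```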
Date: November 7, 2006
Creator: Furman, M. A.; Sonnad, K. & Vay, J. L.
System: The UNT Digital Library
Evaluation of TCP Congestion Control Algorithms on the Windows Vista Platform (open access)

Evaluation of TCP Congestion Control Algorithms on the Windows Vista Platform

CTCP, an innovative TCP congestion control algorithm developed by Microsoft, is evaluated and compared to HSTCP and StandardTCP. Tests were performed on the production Internet from the Stanford Linear Accelerator Center (SLAC) to hosts at various geographic locations to give a broad overview of performance. Certain issues apparent during testing (not directly related to the congestion control algorithms) may skew the results. With this in mind, we find that CTCP performed similarly to HSTCP across a multitude of different network environments. However, to improve fairness and to reduce the impact of CTCP upon existing StandardTCP traffic, two areas of further research were investigated. Algorithmic additions to CTCP for burst control, which reduce the aggressiveness of its cwnd increments, demonstrated improvements in both fairness and throughput over the original CTCP algorithm. Similarly, γ auto-tuning algorithms were investigated to dynamically adapt CTCP flows to their network conditions for optimal performance. While these auto-tuning algorithms, when used in addition to burst control, showed little to no benefit in fairness or throughput for the limited number of network paths tested, one of them performed such that there was negligible impact upon StandardTCP. With these …
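For readers unfamiliar with cwnd growth rules, the sketch below contrasts StandardTCP's per-ACK AIMD increment with a generic, more aggressive high-speed-style increment; the constants are placeholders, and the code does not reproduce CTCP's actual delay-based (dwnd) component or the burst-control and γ auto-tuning additions evaluated in the report.

```python
def standard_tcp_update(cwnd, acked=True):
    """Per-ACK congestion-avoidance update for StandardTCP (AIMD):
    grow by 1/cwnd per ACK, halve on loss."""
    if acked:
        return cwnd + 1.0 / cwnd
    return max(cwnd / 2.0, 1.0)

def highspeed_like_update(cwnd, acked=True, a=8.0, b=0.25):
    """Illustrative high-speed variant: a larger additive step a/cwnd per ACK
    and a gentler back-off factor b on loss. The constants are placeholders,
    not the HSTCP response-function table or CTCP's compound window."""
    if acked:
        return cwnd + a / cwnd
    return max(cwnd * (1.0 - b), 1.0)

# Usage: after 100 ACKs starting from cwnd = 10 segments
w_std, w_hs = 10.0, 10.0
for _ in range(100):
    w_std = standard_tcp_update(w_std)
    w_hs = highspeed_like_update(w_hs)
print(round(w_std, 1), round(w_hs, 1))
```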
Date: July 7, 2006
Creator: Li, Yee-Ting
System: The UNT Digital Library
Noncommutative Inspired Black Holes in Extra Dimensions (open access)

Noncommutative Inspired Black Holes in Extra Dimensions

In a recent string-theory-motivated paper, Nicolini, Smailagic and Spallucci (NSS) presented an interesting model for a noncommutative-inspired, Schwarzschild-like black hole solution in 4 dimensions. The essential effect of having noncommutative coordinates in this approach is to smear out matter distributions on a scale associated with the turn-on of noncommutativity, which was taken to be near the 4-d Planck mass. In particular, NSS assumed that this smearing was essentially Gaussian. This energy scale is sufficiently large that in 4-d such effects may remain invisible indefinitely. Extra-dimensional models that attempt to address the gauge hierarchy problem, however, allow for the possibility that the effective fundamental scale may not be far from ~1 TeV, an energy regime that will soon be probed by experiments at both the LHC and ILC. In this paper we generalize the NSS model to the case where flat, toroidally compactified extra dimensions are accessible at the TeV scale and examine the resulting modifications in black hole properties due to the existence of noncommutativity. We show that while many of the noncommutativity-induced black hole features found in 4-d by NSS persist, in some cases there can be significant modifications due to the presence of extra dimensions. We …
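To make the Gaussian smearing concrete, this sketch evaluates the 4-d NSS smeared mass profile m(r)/M = P(3/2, r²/4θ), with P the regularized lower incomplete gamma function; the units and the value of θ are arbitrary here, and the paper's extra-dimensional generalization modifies this form.

```python
import numpy as np
from scipy.special import gammainc  # regularized lower incomplete gamma P(a, x)

def smeared_mass_fraction(r, theta):
    """Fraction of the total mass contained within radius r for a Gaussian
    matter distribution of width sqrt(theta): m(r)/M = P(3/2, r^2 / (4*theta)).
    Units are arbitrary; theta sets the noncommutativity smearing scale."""
    return gammainc(1.5, r**2 / (4.0 * theta))

r = np.linspace(0.1, 10.0, 5)
print(smeared_mass_fraction(r, theta=1.0))  # approaches 1 for r >> sqrt(theta)
```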
Date: June 7, 2006
Creator: Rizzo, Thomas G.
System: The UNT Digital Library
Studies of Helium Based Gas Mixtures Using a Small Cell Drift Chamber (open access)

Studies of Helium Based Gas Mixtures Using a Small Cell Drift Chamber

An international collaboration is currently working on the construction and design of an asymmetric B Factory at the Stanford Linear Accelerator Center that will be ready to collect data in 1999. The main physics motivation for such a facility is to test the description and mechanism of CP violation in the Standard Model of particle physics and provide insight into the question of why more matter than antimatter is observed in the universe today. In particular, this experiment will measure CP violation in the decay of B mesons. In the early stages of this effort, the Canadian contingent proposed to build the central tracking chamber for the BaBar detector. Presently, a prototype drift chamber is in operation and studies are being performed to test some of the unique features of drift chamber design dictated by the conditions of the experiment. Using cosmic muons, it is possible to study tracking and pattern recognition in the prototype chamber, and therefore calculate the efficiency and spatial resolution of the prototype chamber cells. These performance features will be used to test whether or not the helium-based gas mixtures proposed for the BaBar drift chamber are a viable alternative to the more traditional argon-based gases.
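A minimal sketch of the kind of performance extraction described, assuming track-fit residuals are already available: resolution is taken as the RMS of the residuals and efficiency as the fraction of expected hits actually found. The residual width and hit counts below are invented for illustration, not prototype measurements.

```python
import numpy as np

def cell_performance(residuals_um, n_hits_found, n_hits_expected):
    """Toy estimate of single-cell performance from cosmic-ray tracks:
    spatial resolution as the RMS of fit residuals (in microns),
    efficiency as the fraction of expected hits recorded."""
    resolution = float(np.std(residuals_um))
    efficiency = n_hits_found / n_hits_expected
    return resolution, efficiency

rng = np.random.default_rng(0)
fake_residuals = rng.normal(0.0, 150.0, size=1000)   # hypothetical 150 um resolution
print(cell_performance(fake_residuals, n_hits_found=970, n_hits_expected=1000))
```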
Date: July 7, 2006
Creator: Heise, Jaret (British Columbia U.)
System: The UNT Digital Library
RH Packaging Program Guidance (open access)

RH Packaging Program Guidance

The purpose of this program guidance document is to provide the technical requirements for use, operation, inspection, and maintenance of the RH-TRU 72-B Waste Shipping Package and directly related components. This document complies with the requirements as specified in the RH-TRU 72-B Safety Analysis Report for Packaging (SARP), and Nuclear Regulatory Commission (NRC) Certificate of Compliance (C of C) 9212. If there is a conflict between this document and the SARP and/or C of C, the C of C shall govern. The C of C states: "...each package must be prepared for shipment and operated in accordance with the procedures described in Chapter 7.0, Operating Procedures, of the application." It further states: "...each package must be tested and maintained in accordance with the procedures described in Chapter 8.0, Acceptance Tests and Maintenance Program of the Application." Chapter 9.0 of the SARP tasks the Waste Isolation Pilot Plant (WIPP) Management and Operating (M&O) Contractor with assuring the packaging is used in accordance with the requirements of the C of C. Because the packaging is NRC-approved, users need to be familiar with 10 Code of Federal Regulations (CFR) §71.8, "Deliberate Misconduct." Any time a user suspects or has indications that the conditions …
Date: November 7, 2006
Creator: Westinghouse TRU Solutions LLC
System: The UNT Digital Library
Design and Application of the Reconstruction Software for the BaBar Calorimeter (open access)

Design and Application of the Reconstruction Software for the BaBar Calorimeter

The BaBar high energy physics experiment will be in operation at the PEP-II asymmetric e+e− collider in Spring 1999. The primary purpose of the experiment is the investigation of CP violation in the neutral B meson system. The electromagnetic calorimeter forms a central part of the experiment, and new techniques are employed in the data acquisition and reconstruction software to maximize the capability of this device. The use of a matched digital filter for feature extraction in the front-end electronics is presented. The performance of the filter in the presence of the expected high levels of soft photon background from the machine is evaluated. The high luminosity of the PEP-II machine and the demands on the precision of the calorimeter require reliable software that allows for increased physics capability. BaBar has selected C++ as its primary programming language and object-oriented analysis and design as its coding paradigm. The application of this technology to the reconstruction software for the calorimeter is presented. The design of the systems for clustering, cluster division, track matching, particle identification and global calibration is discussed with emphasis on the provisions in the design for increased physics capability as levels of understanding of …
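The following sketch shows the basic matched-filter idea used for feature extraction, correlating a digitized waveform against a unit-energy pulse template and taking the correlation peak as amplitude and time estimates; the template shape, noise level, and pulse location are assumptions, not BaBar front-end parameters.

```python
import numpy as np

def matched_filter_amplitude(waveform, template):
    """Correlate a digitized waveform with a normalized pulse template and
    return the correlation peak (a proxy for pulse amplitude) and its
    sample index (a proxy for pulse time)."""
    t = template / np.sqrt(np.dot(template, template))  # unit-energy template
    corr = np.correlate(waveform, t, mode="valid")
    k = int(np.argmax(corr))
    return corr[k], k

# Toy example: a noisy shaped pulse buried in white noise (all values assumed)
template = np.exp(-np.arange(32) / 8.0) * (np.arange(32) / 8.0)
rng = np.random.default_rng(1)
waveform = rng.normal(0.0, 0.2, 256)
waveform[100:132] += 5.0 * template
print(matched_filter_amplitude(waveform, template))
```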
Date: July 7, 2006
Creator: Strother, Philip David
System: The UNT Digital Library
Analysis of Percent On-Cell Reformation of Methane in SOFC Stacks: Thermal, Electrical and Stress Analysis (open access)

Analysis of Percent On-Cell Reformation of Methane in SOFC Stacks: Thermal, Electrical and Stress Analysis

This report summarizes a parametric analysis performed to determine the effect of varying the percent on-cell reformation (OCR) of methane on the thermal and electrical performance of a generic, planar solid oxide fuel cell (SOFC) stack design. OCR of methane can be beneficial to an SOFC stack because the reaction (steam-methane reformation) is endothermic and can remove excess heat generated by the electrochemical reactions directly from the cell. The heat removed is proportional to the amount of methane reformed on the cell. Methane can be partially pre-reformed externally and then supplied to the stack, where rapid reaction kinetics on the anode ensure complete conversion. Thus, the thermal load varies with the methane concentration entering the stack, as do the coupled scalar distributions, including temperature and electrical current density. The endotherm due to the reformation reaction can cause a temperature depression on the anode near the fuel inlet, resulting in large thermal gradients. This effect depends on factors that include methane concentration, local temperature, and stack geometry.
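A first-order sketch of the coupling described above, assuming the standard steam-methane reforming enthalpy of roughly +206 kJ/mol; the feed rate and on-cell fraction are placeholders, and the report's stack model resolves this heat sink locally rather than as a single lumped number.

```python
DELTA_H_SMR_KJ_PER_MOL = 206.0  # approximate enthalpy of CH4 + H2O -> CO + 3H2

def on_cell_reforming_heat_kw(ch4_feed_mol_s, fraction_reformed_on_cell):
    """Endothermic heat absorbed on the cell, proportional to the methane
    reformed there (mol/s * kJ/mol = kW). A lumped first-order estimate."""
    return ch4_feed_mol_s * fraction_reformed_on_cell * DELTA_H_SMR_KJ_PER_MOL

# Hypothetical feed of 0.01 mol/s CH4 with 50% reformed on-cell
print(on_cell_reforming_heat_kw(0.01, 0.50), "kW absorbed")
```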
Date: April 7, 2006
Creator: Recknagle, Kurtis P.; Yokuda, Satoru T.; Jarboe, Daniel T. & Khaleel, Mohammad A.
System: The UNT Digital Library
Remaining Sites Verification Package for the 126-B-3, 184-B Coal Pit Dumping Area, Waste Site Reclassification Form 2005-028 (open access)

Remaining Sites Verification Package for the 126-B-3, 184-B Coal Pit Dumping Area, Waste Site Reclassification Form 2005-028

The 126-B-3 waste site is the former coal storage pit for the 184-B Powerhouse. During demolition operations in the 1970s, the site was used for disposal of demolition debris from 100-B/C Area facilities. The site has been remediated by removing debris and contaminated soils. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
Date: August 7, 2006
Creator: Dittmer, L. M.
System: The UNT Digital Library
National Certification Methodology for the Nuclear Weapons Stockpile (open access)

National Certification Methodology for the Nuclear Weapons Stockpile

Lawrence Livermore and Los Alamos National Laboratories have developed a common framework and the key elements of a national certification methodology called Quantification of Margins and Uncertainties (QMU). A spectrum of staff, from senior managers to weapons designers, has been engaged in this activity at the two laboratories for roughly a year to codify the methodology in an overarching, integrated paper; the certification paper that evolved from that effort follows. In the process of writing this paper, an important outcome has been the realization that a joint Livermore/Los Alamos workshop on QMU, focused on clearly identifying and quantifying differences in approach between the two laboratories and on developing an even stronger technical foundation for the methodology, will be valuable. Later in FY03, such a joint laboratory workshop will be held, and one of its outcomes will be a new version of this certification paper. A comprehensive approach to certification must include specification of problem scope, development of system baseline models, formulation of standards of performance assessment, and effective procedures for peer review and documentation. This document concentrates on the assessment and peer review aspects of the problem. In addressing these points, a central role is played by a 'watch list' …
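As a generic illustration of the margin-to-uncertainty idea behind QMU (not the laboratories' specific certification metric), the sketch below forms the ratio of a performance margin to its associated uncertainty; the numbers are arbitrary placeholders.

```python
def qmu_confidence_ratio(performance_threshold, best_estimate, uncertainty):
    """Generic margin-to-uncertainty ratio in the spirit of QMU: the margin is
    the distance between the best-estimate performance and the threshold it
    must clear; a ratio comfortably above 1 indicates confidence."""
    margin = best_estimate - performance_threshold
    return margin / uncertainty

# Arbitrary illustrative values
print(qmu_confidence_ratio(performance_threshold=1.0, best_estimate=1.6, uncertainty=0.3))
```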
Date: August 7, 2006
Creator: Goodwin, B T & Juzaitis, R J
System: The UNT Digital Library
The Computation Directorate at Lawrence Livermore National Laboratory (open access)

The Computation Directorate at Lawrence Livermore National Laboratory

The Computation Directorate at Lawrence Livermore National Laboratory has four major areas of work: (1) Programmatic Support -- Programs are areas which receive funding to develop solutions to problems or advance basic science in their areas (Stockpile Stewardship, Homeland Security, the Human Genome project). Computer scientists are 'matrixed' to these programs to provide computer science support. (2) Livermore Computer Center (LCC) -- Development, support and advanced planning for the large, massively parallel computers, networks and storage facilities used throughout the laboratory. (3) Research -- Computer scientists research advanced solutions for programmatic work and for external contracts and research new HPC hardware solutions. (4) Infrastructure -- Support for thousands of desktop computers and numerous LANs, labwide unclassified networks, computer security, computer-use policy.
Date: September 7, 2006
Creator: Cook, L
System: The UNT Digital Library
CryoFree Final Report (open access)

CryoFree Final Report

CryoFree, a gamma-ray spectrometer, has been built and successfully tested. The instrument is based on a planar germanium semiconductor detector and is optimized for high-resolution spectroscopy in the range of 30 keV to a few hundred keV, suitable for detecting U and Pu. The spectrometer is cooled with a mechanical cryocooler, which obviates the need for liquid cryogen, and is battery powered. The combination of mechanical cooling and battery operation allows high-resolution spectroscopy in a highly portable field instrument. A description of the instrument and its performance is given.
Date: November 7, 2006
Creator: Burks, M
System: The UNT Digital Library
On the performance of SPAI and ADI-like preconditioners for core collapse supernova simulations in one spatial dimension (open access)

On the performance of SPAI and ADI-like preconditioners for core collapse supernova simulations in one spatial dimension

The simulation of core collapse supernovae calls for the time-accurate solution of the (Euler) equations for inviscid hydrodynamics coupled with the equations for neutrino transport. The time evolution is carried out by evolving the Euler equations explicitly and the neutrino transport equations implicitly. Neutrino transport is modeled by the multi-group Boltzmann transport (MGBT) and the multi-group flux-limited diffusion (MGFLD) equations. An implicit time-stepping scheme for the MGBT and MGFLD equations yields Jacobian systems that necessitate scaling and preconditioning. Two types of preconditioners, namely a sparse approximate inverse (SPAI) preconditioner and a preconditioner based on the alternating-direction implicit iteration (ADI-like), have been found to be effective for the MGFLD and MGBT formulations. This paper compares these two preconditioners. The ADI-like preconditioner performs well with both MGBT and MGFLD systems. For the MGBT system tested, the SPAI preconditioner did not give competitive results. However, since the MGBT system in our experiments had a high condition number before scaling and since we used a sequential platform, care must be taken in evaluating these results.
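To illustrate where a preconditioner enters such a solve, the sketch below runs preconditioned GMRES on a small stand-in system; it uses an incomplete-LU preconditioner purely as a placeholder, since the SPAI and ADI-like preconditioners studied in the paper are different constructions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Small, poorly scaled tridiagonal test system standing in for a transport Jacobian.
n = 200
A = sp.diags([[-1.0] * (n - 1), np.linspace(1.0, 1e4, n), [-1.0] * (n - 1)],
             offsets=[-1, 0, 1], format="csc")
b = np.ones(n)

# Incomplete-LU factorization used here as a generic preconditioner stand-in.
ilu = spla.spilu(A, drop_tol=1e-5)
M = spla.LinearOperator((n, n), matvec=ilu.solve)

x, info = spla.gmres(A, b, M=M)
print("converged" if info == 0 else f"gmres info={info}",
      "residual:", np.linalg.norm(A @ x - b))
```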
Date: July 7, 2006
Creator: Smolarski, Dennis C.; Balakrishnan, Ramesh; D'Azevedo, Eduardo F.; Fettig, John W.; Messer, Bronson; Mezzacappa, Anthony et al.
System: The UNT Digital Library
2006 Status Report Savings Estimates for the ENERGY STAR(R) Voluntary Labeling Program (open access)

2006 Status Report Savings Estimates for the ENERGY STAR(R) Voluntary Labeling Program

ENERGY STAR(R) is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than thirty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2005 and what we expect in 2006, and provide savings forecasts for two market penetration scenarios for the periods 2006 to 2015 and 2006 to 2025. The target market penetration forecast represents our best estimate of future ENERGY STAR savings; it is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard-efficiency products throughout the analysis period.
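The per-product accounting that the report aggregates reduces to simple arithmetic of the form sketched below; every input value shown is a placeholder, not a figure from the report.

```python
def annual_savings(units_in_use, unit_energy_savings_kwh,
                   electricity_price_usd_per_kwh, carbon_kg_per_kwh):
    """Back-of-the-envelope per-product savings: energy, dollar, and carbon
    savings scale with the number of ENERGY STAR units in use."""
    kwh = units_in_use * unit_energy_savings_kwh
    return {"kWh": kwh,
            "dollars": kwh * electricity_price_usd_per_kwh,
            "kg_carbon": kwh * carbon_kg_per_kwh}

# All inputs below are hypothetical placeholders
print(annual_savings(1_000_000, 150.0, 0.09, 0.17))
```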
Date: March 7, 2006
Creator: Webber, Carrie A.; Brown, Richard E.; Sanchez, Marla & Homan, Gregory K.
System: The UNT Digital Library
Application of the t-model of optimal prediction to the estimation of the rate of decay of solutions of the Euler equations in two and three dimensions (open access)

Application of the t-model of optimal prediction to the estimation of the rate of decay of solutions of the Euler equations in two and three dimensions

The "t-model" for dimensional reduction is applied to theestimation of the rate of decay of solutions of the Burgers equation andof the Euler equations in two and three space dimensions. The model wasfirst derived in a statistical mechanics context, but here we analyze itpurely as a numerical tool and prove its convergence. In the Burgers casethe model captures the rate of decay exactly, as was already previouslyshown. For the Euler equations in two space dimensions, the modelpreserves energy as it should. In three dimensions, we find a power lawdecay in time and observe a temporal intermittency.
Date: October 7, 2006
Creator: Hald, Ole H.; Shvets, Yelena & Stinis, Panagiotis
System: The UNT Digital Library
Sources, Speciation and Mobility of Plutonium and Other Transuranics in the Groundwater at the Savannah River Site (Sept. 2003-Sept. 2006) (open access)

Sources, Speciation and Mobility of Plutonium and Other Transuranics in the Groundwater at the Savannah River Site (Sept. 2003-Sept. 2006)

The intent of this research effort is to: (1) provide the basis for accurate modeling and prediction of actinide transport; (2) allow for remediation strategies to be planned that might use in-situ manipulations of geochemical variables to enhance (for extraction) or retard (for immobilization) Pu mobility in the groundwater zone; (3) identify specific Pu sources and the extent of far field, or long-term migration of transuranics in groundwater; (4) reduce costly uncertainty in performance and risk assessment calculations. This new knowledge is essential to ensure continued public and worker safety at the DOE sites and the efficient management of cleanup and containment strategies.
Date: November 7, 2006
Creator: Buesseler, K. O.; Kaplan, D.; Peterson, S. & Dai, M.
System: The UNT Digital Library
Fully Atomistic Simulations of Hydrodynamic Instabilities and Mixing (open access)

Fully Atomistic Simulations of Hydrodynamic Instabilities and Mixing

The large-scale computational capabilities at LLNL make it possible to develop seamless connections from processes at the atomic scale to complex macroscopic phenomena such as hydrodynamic instabilities and turbulent mixing. Traditionally, these connections have been made by combining results from different scientific fields. For gases and fluids, atomic and molecular scattering cross sections must first be obtained and incorporated into Boltzmann transport equations. Their solution then yields transport coefficients, which are input parameters for the Navier-Stokes equations of fluid dynamics. The latter are solved numerically with hydro-codes. For visco-elastic solids, on the other hand, atomistic simulations must first provide constitutive laws for the mobility and multiplication of dislocations and other crystalline defects. In turn, these laws are utilized to construct meso-scale models for plastic deformation. These models are then incorporated into hydro- and finite element codes to predict the macroscopic behavior of solid materials. Many of these intermediate steps can be bypassed with large-scale molecular dynamics simulations. For this purpose, codes have been developed in which trajectories of atoms or molecules are mapped onto continuum field descriptions for mass density, mass flow, stresses, and temperature. It is now possible to compare directly and quantitatively atomistic simulations with predictions from …
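A minimal sketch of the trajectory-to-field mapping mentioned above: particle positions, velocities, and masses are binned into 1-D density and flow-velocity fields. Production codes also accumulate stress and temperature fields and work in 3-D; the particle data here are synthetic placeholders.

```python
import numpy as np

def bin_to_continuum(positions, velocities, masses, box_length, n_bins):
    """Map particle data onto 1-D continuum fields: mass density and
    mass-averaged flow velocity per bin."""
    edges = np.linspace(0.0, box_length, n_bins + 1)
    idx = np.clip(np.digitize(positions, edges) - 1, 0, n_bins - 1)
    bin_vol = box_length / n_bins        # per unit cross-sectional area
    mass = np.bincount(idx, weights=masses, minlength=n_bins)
    mom = np.bincount(idx, weights=masses * velocities, minlength=n_bins)
    density = mass / bin_vol
    velocity = np.divide(mom, mass, out=np.zeros(n_bins), where=mass > 0)
    return density, velocity

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 10.0, 10_000)
v = rng.normal(0.0, 1.0, 10_000)
m = np.ones(10_000)
rho, u = bin_to_continuum(x, v, m, box_length=10.0, n_bins=20)
print(rho[:5], u[:5])
```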
Date: September 7, 2006
Creator: Kubota, A & Wolfer, W G
System: The UNT Digital Library
Site Environmental Report for 2005 Volume I and Volume II (open access)

Site Environmental Report for 2005 Volume I and Volume II

Each year, Ernest Orlando Lawrence Berkeley National Laboratory prepares an integrated report on its environmental programs to satisfy the requirements of United States Department of Energy Order 231.1A, ''Environment, Safety, and Health Reporting''. The ''Site Environmental Report for 2005'' summarizes Berkeley Lab's environmental management performance, presents environmental monitoring results, and describes significant programs for calendar year 2005. (Throughout this report, Ernest Orlando Lawrence Berkeley National Laboratory is referred to as ''Berkeley Lab'', ''the Laboratory'', ''Lawrence Berkeley National Laboratory'', and ''LBNL''.) The report is separated into two volumes. Volume I contains an overview of the Laboratory, the status of environmental programs, and summarized results from surveillance and monitoring activities. This year's Volume I text body is organized into an executive summary followed by six chapters. The report's structure has been reorganized this year, and it now includes a chapter devoted to environmental management system topics. Volume II contains individual data results from surveillance and monitoring activities. The ''Site Environmental Report'' is distributed by releasing it on the Web from the Berkeley Lab Environmental Services Group (ESG) home page, which is located at http://www.lbl.gov/ehs/esg/. Many of the documents cited in this report also are accessible from the ESG Web page. CD and …
Date: July 7, 2006
Creator: Ruggieri, Michael
System: The UNT Digital Library
Screening of Potential Remediation Methods for the 200-ZP-1 Operable Unit at the Hanford Site (open access)

Screening of Potential Remediation Methods for the 200-ZP-1 Operable Unit at the Hanford Site

A screening-level evaluation of potential remediation methods for application to the contaminants of concern (COC) in the 200-ZP-1 Operable Unit at the Hanford Site was conducted based on the methods outlined in the Guidance for Conducting Remedial Investigations and Feasibility Studies under CERCLA Interim Final. The scope of this screening was to identify the most promising remediation methods for use in the more detailed analysis of remediation alternatives that will be conducted as part of the full feasibility study. The screening evaluation was conducted for the primary COC (potential major risk drivers). COC with similar properties were grouped for the screening evaluation. The screening evaluation was conducted in two primary steps. The initial screening step evaluated potential remediation methods based on whether they can be effectively applied within the environmental setting of the 200-ZP-1 Operable Unit for the specified contaminants. In the second step, potential remediation methods were screened using scoping calculations to estimate the scale of infrastructure, overall quantities of reagents, and conceptual approach for applying the method for each defined grouping of COC. Based on these estimates, each method was screened with respect to effectiveness, implementability, and relative cost categories of the CERCLA feasibility study screening process defined …
Date: August 7, 2006
Creator: Truex, Michael J.; Nimmons, Michael J.; Johnson, Christian D.; Dresel, P. Evan & Murray, Christopher J.
System: The UNT Digital Library
Inherent Negative Biases in the Generalized Geometry Holdup (GGH) Model (open access)

Inherent Negative Biases in the Generalized Geometry Holdup (GGH) Model

None
Date: September 7, 2006
Creator: Oberer, R. B.; Gunn, C. A. & Chiang, L. G.
System: The UNT Digital Library
Atomic and Electronic Structure and Chemistry of Ceramic/Metal Interfaces. Final Report (open access)

Atomic and Electronic Structure and Chemistry of Ceramic/Metal Interfaces. Final Report

Materials containing ceramic and metal phases play a significant role in modern materials technology.
Date: July 7, 2006
Creator: Seidman, D. N.
System: The UNT Digital Library
LLNL Capabilities in Underground Coal Gasification (open access)

LLNL Capabilities in Underground Coal Gasification

Underground coal gasification (UCG) has received renewed interest as a potential technology for producing hydrogen at a competitive price, particularly in Europe and China. Lawrence Livermore National Laboratory (LLNL) played a leading role in this field and continues to do so. It conducted UCG field tests in the nineteen-seventies and -eighties, resulting in a number of publications and culminating in a UCG model published in 1989. LLNL successfully employed the ''Controlled Retraction Injection Point'' (CRIP) method in some of the Rocky Mountain field tests near Hanna, Wyoming. This method, shown schematically in Fig. 1, uses a horizontally drilled, lined injection well whose lining can be penetrated at different locations for injection of the O2/steam mixture. The cavity in the coal seam therefore grows longer as the injection point is retracted, as well as wider due to reaction of the coal wall with the hot gases. Rubble generated from the collapsing wall is an important mechanism studied by Britten and Thorsness.
Date: June 7, 2006
Creator: Friedmann, S J; Burton, E & Upadhye, R
System: The UNT Digital Library
AN ACCELERATED RATE CALORIMETRY STUDY OF CAUSTIC-SIDE SOLVENT EXTRACTION SOLVENT WITHOUT EXTRACTANT (open access)

AN ACCELERATED RATE CALORIMETRY STUDY OF CAUSTIC-SIDE SOLVENT EXTRACTION SOLVENT WITHOUT EXTRACTANT

This study found that 4 to 48 parts per thousand (ppth) of Caustic-Side Solvent Extraction (CSSX) solvent without extractant in caustic salt solution at evaporator-relevant temperatures results in no process-significant energetic events. However, the data suggest a chemical reaction (possibly decomposition) in the CSSX solvent near 140 °C. This concentration of entrained solvent is believed to markedly exceed the amount of solvent that will pass from the Modular Caustic-Side Solvent Extraction Unit (MCU) through the downstream Defense Waste Processing Facility and enter the evaporator through routine tank farm operations. The rate of pressure rise at 140 °C for salt solution containing the organic differs appreciably (i.e., is reduced) from that of the same solution without solvent. This behavior is due to a reaction between the CSSX components and the salt solution simulant.
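A common way to screen calorimetry traces for the kind of event noted near 140 °C is to flag where the self-heating rate exceeds a sensitivity threshold, as sketched below; the 0.02 °C/min default is a typical accelerating-rate-calorimeter sensitivity, and the temperature trace is synthetic, not data from this study (whose 140 °C observation was inferred from the pressure behavior).

```python
import numpy as np

def exotherm_onset_temperature(time_min, temp_c, threshold_c_per_min=0.02):
    """Return the first temperature at which the self-heating rate dT/dt
    exceeds the threshold, or None if it never does."""
    rate = np.gradient(temp_c, time_min)
    above = np.nonzero(rate > threshold_c_per_min)[0]
    return temp_c[above[0]] if above.size else None

# Synthetic heat-wait-search-like trace (illustrative only)
t = np.linspace(0.0, 600.0, 601)                     # minutes
T = 100.0 + 0.005 * t + np.where(t > 400.0, 0.002 * (t - 400.0) ** 2, 0.0)
print(exotherm_onset_temperature(t, T))
```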
Date: March 7, 2006
Creator: Fondeur, F. & Fink, S.
System: The UNT Digital Library