Comparative Assessment of Status and Opportunities for Carbon Dioxide Capture and Storage and Radioactive Waste Disposal in North America (open access)

Comparative Assessment of Status and Opportunities for Carbon Dioxide Capture and Storage and Radioactive Waste Disposal in North America

Aside from the target storage regions being underground, geologic carbon sequestration (GCS) and radioactive waste disposal (RWD) share little in common in North America. The large volume of carbon dioxide (CO₂) that must be sequestered, along with its relatively benign health effects, presents a sharp contrast to the limited volumes and hazardous nature of high-level radioactive waste (RW). There is well-documented capacity in North America for 100 years or more of sequestration of CO₂ from coal-fired power plants. Aside from economics, the challenges of GCS include the lack of a fully established legal and regulatory framework for ownership of injected CO₂, the need for an expanded pipeline infrastructure, and public acceptance of the technology. As for RW, the USA had proposed the unsaturated tuffs of Yucca Mountain, Nevada, as the region's first high-level RWD site before removing it from consideration in early 2009. The Canadian RW program is currently evolving, with options that range from geologic disposal to both decentralized and centralized permanent storage in surface facilities. Both the USA and Canada have established legal and regulatory frameworks for RWD. The most challenging technical issue for RWD is the need to predict repository performance on extremely long time scales …
Date: July 22, 2011
Creator: Oldenburg, C. & Birkholzer, J.T.
Object Type: Book
System: The UNT Digital Library
Parallelizing Heavyweight Debugging Tools with MPIecho (open access)

Parallelizing Heavyweight Debugging Tools with MPIecho

None
Date: April 22, 2011
Creator: Rountree, B L; Cobb, G X; Gamblin, G T; Schulz, M W; de Supinski, B R & Tufo, H M
Object Type: Article
System: The UNT Digital Library
Investigation of novel decay B⁺ → ψ(2S)ωK⁺ at BaBar (open access)

Investigation of novel decay B⁺ → ψ(2S)ωK⁺ at BaBar

We investigate the undocumented B meson decay B⁺ → ψ(2S)ωK⁺. The data were collected with the BaBar detector at the SLAC PEP-II asymmetric-energy e⁺e⁻ collider operating at the Υ(4S) resonance, a center-of-mass energy of 10.58 GeV. The Υ(4S) resonance primarily decays to pairs of B mesons. The BaBar collaboration at the PEP-II ring was located at the SLAC National Accelerator Laboratory and was designed to study the collisions of positrons and electrons. The e⁻e⁺ pairs collide at asymmetric energies, resulting in a center of mass that travels at relativistic speeds. The resulting time dilation allows the decaying particles to travel large distances through the detector before undergoing their rapid decays, a process that occurs over extremely small distances in the center-of-mass frame. As they travel through the silicon vertex tracker, a drift chamber, a Cherenkov radiation detector, and finally an electromagnetic calorimeter, we measure charge, energy, momentum, and particle identification in order to reconstruct the decays that have occurred. While all well-understood mesons currently fall into the qq̄ model, the quark model has no a priori exclusion of higher-configuration states such as qq̄qq̄, which has led experimentalists …
Date: June 22, 2011
Creator: Schalch, Jacob & /SLAC, /Oberlin Coll.
Object Type: Report
System: The UNT Digital Library
New Manufacturing Method for Paper Filler and Fiber Material (open access)

New Manufacturing Method for Paper Filler and Fiber Material

The study compares commercially available filler products with a newly developed “Hybrid Fiber Filler Composite Material” and examines how the main structural, optical, and strength properties are affected by increasing the filler content at least 5% above commercial values. The study consists of: (i) an overview of paper filler materials used in the paper production process, (ii) a discussion of the manufacturing technology of lime-based filler materials for paper applications, (iii) an overview of new emerging paper filler technologies, (iv) a filler evaluation of commercially available digital printing paper products, (v) a report on a detailed handsheet study and 12” pilot plant paper machine trial runs with the new Hybrid Fiber Filler Composite Material, and (vi) an evaluation comparing commercial filler products and the new Hybrid Fiber Filler Composite Material through a life cycle analysis that explains manufacturing, economic, and environmental benefits as applied to uncoated digital printing papers.
Date: November 22, 2011
Creator: Doelle, Klaus
Object Type: Article
System: The UNT Digital Library
Critical Infrastructure for Ocean Research and Societal Needs in 2030 (open access)

Critical Infrastructure for Ocean Research and Societal Needs in 2030

The United States has jurisdiction over 3.4 million square miles of ocean—an expanse greater than the land area of all fifty states combined. This vast marine area offers researchers opportunities to investigate the ocean’s role in an integrated Earth system, but also presents challenges to society, including damaging tsunamis and hurricanes, industrial accidents, and outbreaks of waterborne diseases. The 2010 Gulf of Mexico Deepwater Horizon oil spill and 2011 Japanese earthquake and tsunami are vivid reminders that a broad range of infrastructure is needed to advance our still-incomplete understanding of the ocean. The National Research Council (NRC)’s Ocean Studies Board was asked by the National Science and Technology Council’s Subcommittee on Ocean Science and Technology, composed of 25 U.S. government agencies, to examine infrastructure needs for ocean research in the year 2030. This request reflects concern, among a myriad of marine issues, over the present state of aging and obsolete infrastructure, insufficient capacity, growing technological gaps, and declining national leadership in marine technological development; these issues were brought to the nation’s attention in 2004 by the U.S. Commission on Ocean Policy. A 15-member committee of experts identified four themes that encompass 32 future ocean research questions: enabling stewardship of the environment, protecting life …
Date: April 22, 2011
Creator: National Research Council
Object Type: Report
System: The UNT Digital Library
Financial Analysis of Experimental Releases Conducted at Glen Canyon Dam During Water Years 2006 Through 2010. (open access)

Financial Analysis of Experimental Releases Conducted at Glen Canyon Dam During Water Years 2006 Through 2010.

Because of concerns about the impact that Glen Canyon Dam (GCD) operations were having on downstream ecosystems and endangered species, the Bureau of Reclamation (Reclamation) conducted an Environmental Impact Statement (EIS) on dam operations (DOE 1996). New operating rules and management goals for GCD that had been specified in the Record of Decision (ROD) (Reclamation 1996) were adopted in February 1997. In addition to issuing new operating criteria, the ROD mandated experimental releases for the purpose of conducting scientific studies. A report released in January 2011 examined the financial implications of the experimental flows that were conducted at the GCD from 1997 to 2005. This report continues the analysis and examines the financial implications of the experimental flows conducted at the GCD from 2006 to 2010. An experimental release may have either a positive or negative impact on the financial value of energy production. This study estimates the financial costs of experimental releases, identifies the main factors that contribute to these costs, and compares the interdependencies among these factors. An integrated set of tools was used to compute the financial impacts of the experimental releases by simulating the operation of the GCD under two scenarios, namely, (1) a baseline scenario …
Date: August 22, 2011
Creator: Poch, L. A.; Veselka, T. D.; Palmer, C. S.; Loftin, S.; Osiek, B. & Western Area Power Administration, Colorado River Storage Project Management Center
Object Type: Report
System: The UNT Digital Library
Simulation of Dimensional Changes and Hot Tears During Solidification of Steel Castings (open access)

Simulation of Dimensional Changes and Hot Tears During Solidification of Steel Castings

During solidification, contractions or distortions of the steel, known as “dimensional changes,” can cause the final product to vary significantly from the original pattern. Cracks in the casting that form during the late stages of solidification, called “hot tears,” occur when contractions can no longer be accommodated by residual liquid metal flow or solid metal displacement. Dimensional changes and hot tears are major problems in the steel casting industry. These occurrences are difficult to anticipate and correct using traditional foundry engineering methods. While dimensional changes are accommodated using pattern allowances, the desired dimensions are often inaccurate. Castings that form hot tears must then be scrapped or weld repaired, expending unnecessary energy. Correcting either of these problems requires a tedious trial-and-error process that may not necessarily yield accurate results. A model that predicts hot tears and dimensional changes during steel casting solidification has been successfully developed and implemented in commercial casting and stress analysis software. This model is based on a visco-plastic constitutive model with damage, where the damage begins to form when liquid feed metal is cut off to a solidifying region. The hot tear prediction is a locator for hot tear initiation sites, and not a full tear prediction: …
Date: July 22, 2011
Creator: Beckermann, Christoph & Carlson, Kent
Object Type: Report
System: The UNT Digital Library
ANALYSIS OF HARRELL MONOSODIUM TITANATE LOT #052511 (open access)

ANALYSIS OF HARRELL MONOSODIUM TITANATE LOT #052511

Monosodium titanate (MST) for use in the Actinide Removal Process (ARP) must be qualified and verified in advance. A single qualification sample for each batch of material is sent to SRNL for analysis, as well as a statistical sampling of verification samples. The Harrell Industries Lot No. 052511 qualification sample and 14 verification samples met all the requirements in the specification, indicating the material is acceptable for use in the process.
Date: August 22, 2011
Creator: Taylor-Pashow, K.
Object Type: Report
System: The UNT Digital Library
3013/9975 Surveillance Program Interim Summary Report (open access)

3013/9975 Surveillance Program Interim Summary Report

The K-Area Materials Storage (KAMS) Documented Safety Analysis (DSA) requires a surveillance program to monitor the safety performance of 3013 containers and 9975 shipping packages stored in KAMS. The SRS surveillance program [Reference 1] outlines activities for field surveillance and laboratory tests that demonstrate the packages meet the functional performance requirements described in the DSA. The SRS program also supports the complex-wide Integrated Surveillance Program (ISP) [Reference 2] for 3013 containers. The purpose of this report is to provide a summary of the SRS portion of the surveillance program activities through fiscal year 2010 (FY10) and formally communicate the interpretation of these results by the Surveillance Program Authority (SPA). Surveillance for the initial 3013 container random sampling of the Innocuous bin and the Pressure bin has been completed, and there has been no indication of corrosion or significant pressurization. The maximum pressure observed was less than 50 psig, which is well below the design pressure of 699 psig for the 3013 container [Reference 3]. The data collected during surveillance of these bins have been evaluated by the Materials Identification and Surveillance (MIS) Working Group, and no additional surveillance is necessary for these bins at least through FY13. A decision will …
Date: June 22, 2011
Creator: Dunn, K.; Hackney, B. & McClard, J.
Object Type: Report
System: The UNT Digital Library
Widespread spin polarization effects in photoemission from topological insulators (open access)

Widespread spin polarization effects in photoemission from topological insulators

High-resolution spin- and angle-resolved photoemission spectroscopy (spin-ARPES) was performed on the three-dimensional topological insulator Bi₂Se₃ using a recently developed high-efficiency spectrometer. The topological surface state's helical spin structure is observed, in agreement with theoretical prediction. Spin textures of both chiralities, at energies above and below the Dirac point, are observed, and the spin structure is found to persist at room temperature. The measurements reveal additional unexpected spin polarization effects, which also originate from the spin-orbit interaction but are well differentiated from topological physics by their contrasting dependence on momentum and on photon energy and polarization. These observations demonstrate significant deviations of photoelectron spin polarization from quasiparticle spin polarization. Our findings illustrate the inherent complexity of spin-resolved ARPES and demonstrate key considerations for interpreting experimental results.
Date: June 22, 2011
Creator: Jozwiak, C.; Chen, Y. L.; Fedorov, A. V.; Analytis, J. G.; Rotundu, C. R.; Schmid, A. K. et al.
Object Type: Article
System: The UNT Digital Library
PyDecay/GraphPhys: A Unified Language and Storage System for Particle Decay Process Descriptions (open access)

PyDecay/GraphPhys: A Unified Language and Storage System for Particle Decay Process Descriptions

To ease the tasks of Monte Carlo (MC) simulation and event reconstruction (i.e. inferring particle-decay events from experimental data) for long-term BaBar data preservation and analysis, the following software components have been designed: a language ('GraphPhys') for specifying decay processes, common to both simulation and data analysis, allowing arbitrary parameters on particles, decays, and entire processes; an automated visualization tool to show graphically what decays have been specified; and a searchable database storage mechanism for decay specifications. Unlike HepML, a proposed XML standard for HEP metadata, the specification language is designed not for data interchange between computer systems, but rather for direct manipulation by human beings as well as computers. The components are interoperable: the information parsed from files in the specification language can easily be rendered as an image by the visualization package, and conversion between decay representations was implemented. Several proof-of-concept command-line tools were built based on this framework. Applications include building easier and more efficient interfaces to existing analysis tools for current projects (e.g. BaBar/BESII), providing a framework for analyses in future experimental settings (e.g. LHC/SuperB), and outreach programs that involve giving students access to BaBar data and analysis tools to give them a hands-on feel for …
Date: June 22, 2011
Creator: Dunietz, Jesse N. & /SLAC, /MIT
Object Type: Report
System: The UNT Digital Library
BioPig: Developing Cloud Computing Applications for Next-Generation Sequence Analysis (open access)

BioPig: Developing Cloud Computing Applications for Next-Generation Sequence Analysis

Next-generation sequencing is producing ever larger data sets, with a growth rate outpacing Moore's Law. The data deluge has made many current sequence analysis tools obsolete because they do not scale with the data. Here we present BioPig, a collection of cloud computing tools to scale data analysis and management. Pig is a flexible data scripting language that uses Apache's Hadoop data structure and MapReduce framework to process very large data files in parallel and combine the results. BioPig extends Pig with sequence analysis capability. We show the performance of BioPig on a variety of bioinformatics tasks, including screening sequence contaminants, Illumina QA/QC, and gene discovery from metagenome data sets, using the Rumen metagenome as an example.
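The abstract describes Pig-style MapReduce processing of sequence data. As a minimal sketch of that pattern, here is a k-mer counting example in plain Python rather than Pig Latin; the function names, reads, and k-mer size are illustrative assumptions, not BioPig's actual API.

```python
# Hypothetical sketch of the map/reduce pattern described above: count
# k-mers across reads, as one might when screening contaminants or
# mining a metagenome. All names are illustrative, not BioPig's API.
from collections import Counter
from itertools import chain

def map_kmers(read, k=4):
    """Map step: emit every k-mer in a single read."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def reduce_counts(kmer_lists):
    """Reduce step: combine per-read emissions into global counts."""
    return Counter(chain.from_iterable(kmer_lists))

reads = ["ACGTACGT", "TTACGTAA"]
counts = reduce_counts(map_kmers(r) for r in reads)
print(counts["ACGT"])  # → 3
```

In a real Hadoop deployment the map step runs in parallel across data blocks and the reduce step merges partial counters; the single-process version above only illustrates the dataflow.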
Date: March 22, 2011
Creator: Bhatia, Karan & Wang, Zhong
Object Type: Article
System: The UNT Digital Library
The Magnetoviscous-thermal Instability (open access)

The Magnetoviscous-thermal Instability

None
Date: June 22, 2011
Creator: Islam, T
Object Type: Article
System: The UNT Digital Library
A Lightweight, High-performance I/O Management Package for Data-intensive Computing (open access)

A Lightweight, High-performance I/O Management Package for Data-intensive Computing

Our group has been working with ANL collaborators on the topic of bridging the gap between parallel file systems and local file systems during this project period. We visited Dr. Robert Ross's group at Argonne National Laboratory for one week in summer 2007, reviewed our project progress, and planned activities for 2008-09. The PI met Dr. Robert Ross several times, including at the HEC FSIO workshop '08, SC'08, and SC'10. We explored opportunities to develop a production system by leveraging our current prototype (SOGP+PVFS) into a new PVFS version, and delivered the SOGP+PVFS code to the ANL PVFS2 group in 2008. We also discussed exploring a potential project on developing new parallel programming models and runtime systems for data-intensive scalable computing (DISC); the methodology is to evolve MPI toward DISC by incorporating some functions of the Google MapReduce parallel programming model. More recently, we have together been exploring how to leverage existing work to perform (1) coordination/aggregation of local I/O operations prior to movement over the WAN, (2) efficient bulk data movement over the WAN, and (3) latency-hiding techniques for latency-intensive operations. Since 2009, we have been applying Hadoop/MapReduce to some HEC applications with …
Date: June 22, 2011
Creator: Wang, Jun
Object Type: Report
System: The UNT Digital Library
ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE (open access)

ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE

This report outlines the results of a study of the impact of climate change on the energy infrastructure of California and the San Francisco Bay region, including impacts on power plant generation; transmission line and substation capacity during heat spells; wildfires near transmission lines; sea level encroachment upon power plants, substations, and natural gas facilities; and peak electrical demand. Some end-of-century impacts were projected: Expected warming will decrease gas-fired generator efficiency. The maximum statewide coincident loss is projected at 10.3 gigawatts (with current power plant infrastructure and population), an increase of 6.2 percent over current temperature-induced losses. By the end of the century, electricity demand for almost all summer days is expected to exceed the current ninetieth percentile per-capita peak load. As much as 21 percent growth is expected in ninetieth percentile peak demand (per capita, exclusive of population growth). When generator losses are included in the demand, the ninetieth percentile peaks may increase up to 25 percent. As the climate warms, California's peak supply capacity will need to grow faster than the population. Substation capacity is projected to decrease by an average of 2.7 percent. A 5°C (9°F) air temperature increase (the average increase predicted for hot days in August) will diminish the …
Date: June 22, 2011
Creator: Sathaye, Jayant; Dale, Larry; Larsen, Peter; Fitts, Gary; Koy, Kevin; Lewis, Sarah et al.
Object Type: Report
System: The UNT Digital Library
Mesoscale and Large-Eddy Simulations for Wind Energy (open access)

Mesoscale and Large-Eddy Simulations for Wind Energy

Operational wind power forecasting, turbine micrositing, and turbine design require high-resolution simulations of atmospheric flow over complex terrain. The use of both Reynolds-Averaged Navier-Stokes (RANS) and large-eddy simulation (LES) is explored for wind energy applications using the Weather Research and Forecasting (WRF) model. To adequately resolve terrain and turbulence in the atmospheric boundary layer, grid nesting is used to refine the grid from mesoscale to finer scales. This paper examines the performance of the grid nesting configuration, turbulence closures, and resolution (as fine as 100 m horizontal spacing) for simulations of synoptically and locally driven wind ramping events at a West Coast North American wind farm. Interestingly, little improvement is found when using higher resolution simulations or better resolved turbulence closures in comparison to observation data available for this particular site. This is true for week-long simulations as well, where finer resolution runs show only small changes in the distribution of wind speeds or turbulence intensities. It appears that the relatively simple topography of this site is adequately resolved by all model grids (even as coarse as 2.7 km), so that all resolutions are able to model the physics at similar accuracy. The accuracy of the results …
Date: February 22, 2011
Creator: Marjanovic, N
Object Type: Report
System: The UNT Digital Library
Evaluation of evolving residential electricity tariffs (open access)

Evaluation of evolving residential electricity tariffs

Residential customers in California's Pacific Gas and Electric (PG&E) territory have seen several electricity rate structure changes in the past decade. A relatively simple two-tiered pricing system (charges by usage under/over baseline for the home's climate zone) was replaced in the summer of 2001 by a more complicated five-tiered system (usage below baseline and up to 30 percent, 100 percent, 200 percent, and 300+ percent over baseline). In 2009, PG&E began the process of upgrading its residential customers to Smart Meters and laying the groundwork for time-of-use pricing, due to start in 2011. This paper examines the history of the tiered pricing system, discusses the problems the utility encountered with its Smart Meter roll out, and evaluates the proposed dynamic pricing incentive structures. Scenario analyses of example PG&E customer bills will also be presented. What would these residential customers pay if they were still operating under a tiered structure, and/or if they participated in peak hour reductions?
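The tiered structure described above can be sketched as a simple billing calculation. The tier breakpoints follow the abstract (baseline, then 30, 100, 200, and 300+ percent over baseline), but the prices and usage figures below are hypothetical placeholders, not PG&E's actual tariff.

```python
def tiered_bill(usage_kwh, baseline_kwh, prices):
    """Illustrative five-tier bill: tiers end at 100%, 130%, 200%, and
    300% of baseline, with remaining usage billed at the top-tier price.
    `prices` is a list of five $/kWh rates (hypothetical values)."""
    breakpoints = [1.0, 1.3, 2.0, 3.0]  # tier edges as multiples of baseline
    bill, last_edge = 0.0, 0.0
    for frac, price in zip(breakpoints, prices[:4]):
        edge = frac * baseline_kwh
        in_tier = max(0.0, min(usage_kwh, edge) - last_edge)
        bill += in_tier * price
        last_edge = edge
    bill += max(0.0, usage_kwh - last_edge) * prices[4]  # 300%+ tier
    return bill

# Example: 500 kWh used against a 300 kWh baseline, with made-up rates.
print(round(tiered_bill(500, 300, [0.12, 0.14, 0.20, 0.30, 0.35]), 2))  # → 70.6
```

A scenario analysis like the one the paper describes would run this calculation over historical usage data under each candidate rate structure and compare the resulting bills.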
Date: March 22, 2011
Creator: Lai, Judy; DeForest, Nicholas; Kiliccote, Sila; Stadler, Michael; Marnay, Chris & Donadee, Jon
Object Type: Article
System: The UNT Digital Library
Observations on the Optimality Tolerance in the CAISO 33% RPS Model (open access)

Observations on the Optimality Tolerance in the CAISO 33% RPS Model

In 2008 Governor Schwarzenegger of California issued an executive order requiring that 33 percent of all electricity in the state in the year 2020 should come from renewable resources such as wind, solar, geothermal, biomass, and small hydroelectric facilities. This 33% renewable portfolio standard (RPS) was further codified and signed into law by Governor Brown in 2011. To assess the market impacts of such a requirement, the California Public Utilities Commission (CPUC) initiated a study to quantify the cost, risk, and timing of achieving a 33% RPS by 2020. The California Independent System Operator (CAISO) was contracted to manage this study. The production simulation model used in this study was developed using the PLEXOS software package, which allows energy planners to optimize long-term system planning decisions under a wide variety of system constraints. In this note we describe our observations on varying the optimality tolerance in the CAISO 33% RPS model. In particular, we observe that changing the optimality tolerance from 0.05% to 0.5% yields solutions over 5 times faster on average, producing very similar solutions with a negligible difference in overall distance from optimality.
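The optimality tolerance discussed here is the solver's relative gap: optimization stops once the incumbent solution is provably within that fraction of the best bound, so a looser tolerance trades a small amount of solution quality for speed. A minimal illustration of the stopping criterion (generic, not PLEXOS's actual interface):

```python
def within_gap(incumbent, best_bound, tol):
    """Return True if the incumbent objective is provably within `tol`
    (relative) of the best bound, i.e. the solver may stop.
    Assumes a minimization problem with a nonzero best bound."""
    return (incumbent - best_bound) / abs(best_bound) <= tol

# A solution proven within 0.4% of optimal satisfies a 0.5% tolerance
# but not a 0.05% tolerance:
print(within_gap(100.4, 100.0, 0.005))   # → True
print(within_gap(100.4, 100.0, 0.0005))  # → False
```

This is why loosening the tolerance from 0.05% to 0.5% speeds the runs: the branch-and-bound search can terminate as soon as the weaker proof of near-optimality is available.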
Date: September 22, 2011
Creator: Yao, Y; Meyers, C; Schmidt, A; Smith, S & Streitz, F
Object Type: Report
System: The UNT Digital Library
Solvation Sphere of I⁻ and Br⁻ in Water (open access)

Solvation Sphere of I⁻ and Br⁻ in Water

The solvation sphere of halides in water has been investigated using a combination of extended x-ray absorption fine structure (EXAFS) and x-ray absorption near-edge structure (XANES) analysis techniques. The results indicate that I⁻ and Br⁻ both have an asymmetric primary solvation sphere of eight water molecules. These spheres are nearly identical, with the Br⁻ sphere about 0.3 Å smaller than the I⁻ sphere. This study utilized near-edge analysis to supplement EXAFS analysis, which suffers from signal dampening/broadening due to thermal noise. This paper reports on the first solvation sphere of I⁻ and Br⁻ in water. Using EXAFS and XANES analysis, strong models describing the geometric configuration of water molecules coordinated to a central anion have been developed. The combination of these techniques provides a more substantiated argument than relying solely on one or the other. An important finding of this study is that the size of the anion plays a smaller role than previously assumed in determining the number of coordinating water molecules. Further experimental and theoretical investigation is required to understand why the size of the anion plays a minor role in determining the number of bound water molecules.
Date: June 22, 2011
Creator: unknown
Object Type: Report
System: The UNT Digital Library
Overview of the LBNE Neutrino Beam (open access)

Overview of the LBNE Neutrino Beam

The Long Baseline Neutrino Experiment (LBNE) will utilize a neutrino beamline facility located at Fermilab. The facility is designed to aim a beam of neutrinos toward a detector placed at the Deep Underground Science and Engineering Laboratory (DUSEL) in South Dakota. The neutrinos are produced in a three-step process. First, protons from the Main Injector hit a solid target and produce mesons. Then, the charged mesons are focused by a set of focusing horns into the decay pipe, toward the far detector. Finally, the mesons that enter the decay pipe decay into neutrinos. The parameters of the facility were determined by an amalgam of the physics goals, the Monte Carlo modeling of the facility, and the experience gained by operating the NuMI facility at Fermilab. The initial beam power is expected to be ~700 kW; however, some of the parameters were chosen to accommodate a beam power of 2.3 MW.
Date: March 22, 2011
Creator: Moore, C. D.; He, Yun; Hurh, Patrick; Hylen, James; Lundberg, Byron; McGee, Mike et al.
Object Type: Article
System: The UNT Digital Library
Assessing Potential Acidification of Marine Archaeological Wood Based on Concentration of Sulfur Species (open access)

Assessing Potential Acidification of Marine Archaeological Wood Based on Concentration of Sulfur Species

The presence of sulfur in marine archaeological wood presents a challenge to conservation. Upon exposure to oxygen, sulfur compounds in waterlogged wooden artifacts are oxidized, producing sulfuric acid. This speeds the degradation of the wood, potentially damaging specimens beyond repair. Sulfur K-edge x-ray absorption spectroscopy was used to identify the species of sulfur present in samples from the timbers of the Mary Rose, a preserved 16th century warship known to undergo degradation through acidification. The results presented here show that sulfur content varied significantly on a local scale. Only certain species of sulfur, such as iron sulfides and elemental sulfur, have the potential to produce sulfuric acid by contact with oxygen and seawater in situ. Organic sulfurs, such as the amino acids cysteine and methionine, may produce acid but are integral parts of the wood's structure and may not be released from the organic matrix. The sulfur species contained in the sample reflect the exposure to oxygen while submerged, and this exposure can differ greatly over time and position. A better understanding of each species' pathway to acidification, along with its location, is required in order to suggest a more customized and effective preservation strategy. Waterlogged archaeological wood, frequently in …
Date: June 22, 2011
Creator: unknown
Object Type: Report
System: The UNT Digital Library
MicroCT: Semi-Automated Analysis of CT Reconstructed Data of Home Made Explosive Materials Using the Matlab MicroCT Analysis GUI (open access)

MicroCT: Semi-Automated Analysis of CT Reconstructed Data of Home Made Explosive Materials Using the Matlab MicroCT Analysis GUI

This Standard Operating Procedure (SOP) provides the specific procedural steps for analyzing reconstructed CT images obtained under the IDD Standard Operating Procedures for data acquisition [1] and MicroCT image reconstruction [2], per the IDD Quality Assurance Plan for MicroCT Scanning [3]. Although intended to apply primarily to MicroCT data acquired in the HEAFCAT Facility at LLNL, these procedures may also be applied to data acquired at Tyndall from the YXLON cabinet and at TSL from the HEXCAT system. This SOP also provides the procedural steps for preparing the tables and graphs to be used in the reporting of analytical results. This SOP applies to R&D work; for production applications, use [4].
Date: September 22, 2011
Creator: Seetho, I M; Brown, W D; Kallman, J S; Martz, H E & White, W T
Object Type: Report
System: The UNT Digital Library
Creating an EPICS Based Test Stand Development System for a BPM Digitizer of the Linac Coherent Light Source (open access)

Creating an EPICS Based Test Stand Development System for a BPM Digitizer of the Linac Coherent Light Source

The Linac Coherent Light Source (LCLS) is required to deliver a high quality electron beam for producing coherent X-rays. As a result, high resolution beam position monitoring is required. The Beam Position Monitor (BPM) digitizer acquires analog signals from the beam line and digitizes them to obtain beam position data. Although Matlab is currently being used to test the BPM digitizer's functions and capability, the Controls Department at SLAC prefers to use the Experimental Physics and Industrial Control System (EPICS). This paper discusses the transition to EPICS, providing functionality similar to, and in places enhanced over, that offered by Matlab for testing the digitizer. Altogether, the improved test stand development system can perform mathematical and statistical calculations with the waveform signals acquired from the digitizer and compute the fast Fourier transform (FFT) of the signals. Finally, logging of meaningful data into files has been added.
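The waveform processing the abstract describes (statistics plus an FFT of digitized signals) can be sketched in a few lines. The sample rate and test tone below are synthetic stand-ins for real BPM digitizer data, not values from the LCLS system.

```python
# Minimal sketch of the waveform processing described above: basic
# statistics plus an FFT peak search on a digitized signal. The sample
# rate and waveform are illustrative, not actual BPM digitizer output.
import numpy as np

fs = 1000.0                  # sample rate in Hz (illustrative)
n = 1000                     # number of samples
t = np.arange(n) / fs
waveform = np.sin(2 * np.pi * 50.0 * t)   # 50 Hz test tone

mean, std = waveform.mean(), waveform.std()
spectrum = np.abs(np.fft.rfft(waveform))  # magnitude spectrum
freqs = np.fft.rfftfreq(n, d=1 / fs)      # frequency axis in Hz
peak_freq = freqs[np.argmax(spectrum)]    # dominant component
print(peak_freq)  # → 50.0
```

In an EPICS-based test stand, the same computation would typically run on waveform records fetched over Channel Access rather than on a synthetic array.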
Date: June 22, 2011
Creator: unknown
Object Type: Report
System: The UNT Digital Library
A Modified Treatment of Sources in Implicit Monte Carlo Radiation Transport (open access)

A Modified Treatment of Sources in Implicit Monte Carlo Radiation Transport

We describe a modification of the treatment of photon sources in the IMC algorithm. We describe this modified algorithm in the context of thermal emission in an infinite medium test problem at equilibrium and show that it completely eliminates statistical noise.
Date: March 22, 2011
Creator: Gentile, N A & Trahan, T J
Object Type: Article
System: The UNT Digital Library