The SEMATECH Berkeley MET: extending EUV learning to 16-nm half pitch (open access)

The SEMATECH Berkeley MET: extending EUV learning to 16-nm half pitch

Several high-performing resists identified in the past two years have been exposed at the 0.3-numerical-aperture (NA) SEMATECH Berkeley Microfield Exposure Tool (BMET) with an engineered dipole illumination optimized for 18-nm half pitch. Five chemically amplified platforms were found to support 20-nm dense patterning at a film thickness of approximately 45 nm. At 19-nm half pitch, however, scattered bridging kept all of these resists from cleanly resolving larger areas of dense features. At 18-nm half pitch, none of the resists were able to cleanly resolve a single line within a bulk pattern. With this same illumination, a directly imageable metal oxide hardmask showed excellent performance from 22-nm to 17-nm half pitch, and good performance at 16-nm half pitch, closely following the predicted aerial image contrast. This indicates that the observed limitations of the chemically amplified resists are indeed coming from the resist and not from a shortcoming of the exposure tool. The imageable hardmask was also exposed using a pseudo phase-shift-mask technique and achieved clean printing of 15-nm half pitch lines and modulation all the way down to the theoretical 12.5-nm resolution limit of the 0.3-NA SEMATECH BMET.
Date: March 18, 2011
Creator: Anderson, Christopher N.; Baclea-an, Lorie Mae; Denham, Paul E.; George, Simi; Goldberg, Kenneth A.; Jones, Michael et al.
System: The UNT Digital Library
The Impact of Camera Optical Alignments on Weak Lensing Measures for the Dark Energy Survey (open access)

The Impact of Camera Optical Alignments on Weak Lensing Measures for the Dark Energy Survey

None
Date: March 18, 2013
Creator: Antonik, Michelle L.; Bacon, David J.; Bridle, Sarah; Doel, Peter; Brooks, David; Worswick, Sue et al.
System: The UNT Digital Library
Management Of Experiments And Data At The National Ignition Facility (open access)

Management Of Experiments And Data At The National Ignition Facility

Experiments, or 'shots', conducted at the National Ignition Facility (NIF) are discrete events that occur over a very short time frame (tens of nanoseconds) separated by many hours. Each shot is part of a larger campaign of shots to advance scientific understanding in high-energy-density physics. In one campaign, scientists use energy from the 192-beam, 1.8-Megajoule pulsed laser in the NIF system to symmetrically implode a hydrogen-filled target, thereby creating conditions similar to the interior of stars in a demonstration of controlled fusion. Each NIF shot generates gigabytes of data from over 30 diagnostics that measure optical, x-ray, and nuclear phenomena from the imploding target. We have developed systems to manage all aspects of the shot cycle. Other papers will discuss the control of the lasers and targets, while this paper focuses on the setup and management of campaigns and diagnostics. Because of the low duty cycle of shots, and the thousands of adjustments for each shot (target type, composition, shape; laser beams used, their power profiles, pointing; diagnostic systems used, their configuration, calibration, settings) it is imperative that we accurately define all equipment prior to the shot. Following the shot, and capture of the data by the automatic control system, …
Date: March 18, 2011
Creator: Azevedo, S; Casey, A; Beeler, R; Bettenhausen, R.; Bond, E.; Chandrasekaran, H. et al.
System: The UNT Digital Library
RELAP5 Application to Accident Analysis of the NIST Research Reactor (open access)

RELAP5 Application to Accident Analysis of the NIST Research Reactor

Detailed safety analyses have been performed for the 20 MW D₂O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is performed with a RELAP5 transient analysis model that includes the reactor vessel, the pump, the heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate the minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
Date: March 18, 2012
Creator: Baek, J.; Cuadra Gascon, A.; Cheng, L. Y. & Diamond, D.
System: The UNT Digital Library
TNR and conservation on a university campus: a political ecological perspective (open access)

TNR and conservation on a university campus: a political ecological perspective

Article discussing TNR and conservation on a university campus and a political ecological perspective.
Date: September 25, 2013
Creator: Dombrosky, Jonathan & Wolverton, Steven J.
System: The UNT Digital Library
Control System For Cryogenic THD Layering At The National Ignition Facility (open access)

Control System For Cryogenic THD Layering At The National Ignition Facility

The National Ignition Facility (NIF) is the world's largest and most energetic laser system for Inertial Confinement Fusion (ICF). In 2010, NIF began ignition experiments using cryogenically cooled targets containing layers of tritium-hydrogen-deuterium (THD) fuel. The 75 µm thick layer is formed inside of the 2 mm target capsule at temperatures of approximately 18 K. The ICF target designs require sub-micron smoothness of the THD ice layers. Formation of such layers is still an active research area, requiring a flexible control system capable of executing the evolving layering protocols. This task is performed by the Cryogenic Target Subsystem (CTS) of the NIF Integrated Computer Control System (ICCS). The CTS provides cryogenic temperature control with the 1 mK resolution required for beta-layering and for the thermal gradient fill of the capsule. The CTS also includes a 3-axis x-ray radiography engine for phase contrast imaging of the ice layers inside of the plastic and beryllium capsules. In addition to automatic control engines, CTS is integrated with the Matlab interactive programming environment to allow flexibility in experimental layering protocols. The CTS Layering Matlab Toolbox provides the tools for layer image analysis, system characterization and cryogenic control. The CTS Layering Report tool generates …
Date: March 18, 2011
Creator: Fedorov, M; Blubaugh, J; Edwards, O; Mauvais, M; Sanchez, R & Wilson, B
System: The UNT Digital Library
Status of head-on beam-beam compensation in RHIC (open access)

Status of head-on beam-beam compensation in RHIC

N/A
Date: March 18, 2013
Creator: Fischer, W.; Altinbas, Z.; Anerella, M.; Blaskiewicz, M.; Bruno, D.; Costanzo, M. et al.
System: The UNT Digital Library
A Virtualized Computing Platform For Fusion Control Systems (open access)

A Virtualized Computing Platform For Fusion Control Systems

The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. 2,500 servers, 400 network devices and 700 terabytes of network-attached storage provide the foundation for NIF's Integrated Computer Control System (ICCS) and Experimental Data Archive. This talk discusses the rationale & benefits for server virtualization in the context of an operational experimental facility, the requirements discovery process used by the NIF teams to establish evaluation criteria for virtualization alternatives, the processes and procedures defined to enable virtualization of servers in a timeframe that did not delay the execution of experimental campaigns and the lessons the NIF teams learned along the way. The virtualization architecture ultimately selected for ICCS is based on the Open Source Xen computing platform and …
Date: March 18, 2011
Creator: Frazier, T.; Adams, P.; Fisher, J. & Talbot, A.
System: The UNT Digital Library
The MINERvA Experiment (open access)

The MINERvA Experiment

The MINERvA experiment is a dedicated cross-section experiment whose aim is to measure neutrino cross sections for inclusive and exclusive final states on several nuclei. The detector is fully commissioned and began running in March 2010. As a dedicated cross-section experiment, MINERvA has a particular need to know the incident neutrino flux: both the absolute level and the energy dependence. In these proceedings we describe the MINERvA detector, give an update on the experimental status, and discuss the means to determine the neutrino flux. The MINERvA experiment is now running and has completed 25% of its full Low Energy run. There are various techniques planned for understanding the flux, including taking neutrino data at several different beam configurations. The experiment has gotten a first glimpse of two of the six configurations, and completed four horn current scans. Because of its exclusive final-state reconstruction capabilities, MINERvA can provide the much-needed input for current and future oscillation experiments. The inclusive final-state measurements and comparisons of nuclear effects across as many states as possible will provide new insights into neutrino-nucleus scattering.
Date: March 18, 2011
Creator: Harris, Deborah A. & Kopp, Sacha
System: The UNT Digital Library
2012 Community Earth System Model (CESM) Tutorial - Proposal to DOE (open access)

2012 Community Earth System Model (CESM) Tutorial - Proposal to DOE

The Community Earth System Model (CESM) is a fully-coupled, global climate model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. This document provides the agenda and list of participants for the conference. Web materials for all lectures and practical sessions available from: http://www.cesm.ucar.edu/events/tutorials/073012/ .
Date: March 18, 2013
Creator: Holland, Marika & Bailey, David A.
System: The UNT Digital Library
Status Of The National Ignition Campaign And National Ignition Facility Integrated Computer Control System (open access)

Status Of The National Ignition Campaign And National Ignition Facility Integrated Computer Control System

The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn. NIF is operated by the Integrated Computer Control System (ICCS) in an object-oriented, CORBA-based system distributed among over 1800 front-end processors, embedded controllers and supervisory servers. In the fall of 2010, a set of experiments began with deuterium and tritium filled targets as part of the National Ignition Campaign (NIC). At present, all 192 laser beams routinely fire to target chamber center to conduct fusion and high energy density experiments. During the past year, the control system was expanded to automate the cryogenic target system, and over 20 diagnostic systems were deployed and used in fusion experiments. This talk discusses the current status of the NIC and the plan for …
Date: March 18, 2011
Creator: Lagin, L.; Brunton, G.; Carey, R.; Demaret, R.; Fisher, J.; Fishler, B. et al.
System: The UNT Digital Library
UV Decontamination of MDA Reagents for Single Cell Genomics (open access)

UV Decontamination of MDA Reagents for Single Cell Genomics

Single cell genomics, the amplification and sequencing of genomes from single cells, can provide a glimpse into the genetic make-up and thus life style of the vast majority of uncultured microbial cells, making it an immensely powerful and increasingly popular tool. This is accomplished by use of multiple displacement amplification (MDA), which can generate billions of copies of a single bacterial genome, producing the microgram-range DNA required for shotgun sequencing. Here, we address a key challenge inherent to this approach and propose a solution for the improved recovery of single cell genomes. While DNA-free reagents for the amplification of a single cell genome are a prerequisite for successful single cell sequencing and analysis, DNA contamination has been detected in various reagents, which poses a considerable challenge. Our study demonstrates that UV irradiation efficiently eliminates exogenous contaminant DNA found in MDA reagents while maintaining Phi29 activity. We also find that increased UV exposure of Phi29 does not adversely affect genome coverage of MDA-amplified single cells. While additional challenges in single cell genomics remain to be resolved, the proposed methodology is relatively quick and simple and we believe that its application will be of high value for …
Date: March 18, 2011
Creator: Lee, Janey; Tighe, Damon; Sczyrba, Alexander; Malmatrom, Rex; Clingenpeel, Scott; Malfatti, Stephanie et al.
System: The UNT Digital Library
Operational forecasting based on a modified Weather Research and Forecasting model (open access)

Operational forecasting based on a modified Weather Research and Forecasting model

Accurate short-term forecasts of wind resources are required for efficient wind farm operation and ultimately for the integration of large amounts of wind-generated power into electrical grids. Siemens Energy Inc. and Lawrence Livermore National Laboratory, with the University of Colorado at Boulder, are collaborating on the design of an operational forecasting system for large wind farms. The basis of the system is the numerical weather prediction tool, the Weather Research and Forecasting (WRF) model; large-eddy simulations and data assimilation approaches are used to refine and tailor the forecasting system. Representation of the atmospheric boundary layer is modified, based on high-resolution large-eddy simulations of the atmospheric boundary layer. These large-eddy simulations incorporate wake effects from upwind turbines on downwind turbines and represent complex atmospheric variability due to terrain, surface features, and atmospheric stability. Real-time hub-height wind speed and other meteorological data streams from existing wind farms are incorporated into the modeling system to enable uncertainty quantification through probabilistic forecasts. A companion investigation has identified optimal boundary-layer physics options for low-level forecasts in complex terrain, toward employing decadal WRF simulations to anticipate large-scale changes in wind resource availability due to global climate change.
Date: March 18, 2010
Creator: Lundquist, J; Glascoe, L & Obrecht, J
System: The UNT Digital Library
In vitro High-Resolution Architecture and Structural Dynamics of Bacterial Systems (open access)

In vitro High-Resolution Architecture and Structural Dynamics of Bacterial Systems

None
Date: March 18, 2010
Creator: Malkin, A. J.; Plomp, M.; Leighton, T. J.; Vogelstein, B. & Holman, H.
System: The UNT Digital Library
A Green Prison: Santa Rita Jail Creeps Toward Zero Net Energy (ZNE) (open access)

A Green Prison: Santa Rita Jail Creeps Toward Zero Net Energy (ZNE)

In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these …
Date: March 18, 2011
Creator: Marnay, Chris; DeForest, Nicholas; Stadler, Michael; Donadee, John; Dierckxsens, Carlos; Mendes, Gonçalo et al.
System: The UNT Digital Library
A Green Prison: Santa Rita Jail Creeps Towards Zero Net Energy (ZNE) (open access)

A Green Prison: Santa Rita Jail Creeps Towards Zero Net Energy (ZNE)

A large project is underway at Alameda County's twenty-year old 45 ha 4,000-inmate Santa Rita Jail, about 70 km east of San Francisco. Often described as a green prison, it has a considerable installed base of distributed energy resources including a seven-year old 1.2 MW PV array, a four-year old 1 MW fuel cell with heat recovery, and efficiency investments. A current US$14 M expansion will add approximately 2 MW of NaS batteries, an as-yet-undetermined wind capacity, and a concentrating solar thermal system. This ongoing effort by a progressive local government with considerable Federal and State support provides some excellent lessons for the struggle to lower building carbon footprint. The Distributed Energy Resources Customer Adoption Model (DER-CAM) finds true optimal combinations of equipment and operating schedules for microgrids that minimize energy bills and/or carbon emissions without significant searching or rules-of-thumb prioritization, such as "efficiency first, then on-site generation." The results often recommend complex systems, and sensitivities show how policy changes will affect choices. This paper reports an analysis of the historic performance of the PV system and fuel cell, describes the complex optimization applied to the battery scheduling, and shows how results will affect the jail's operational costs, …
Date: March 18, 2011
Creator: Marnay, Chris; DeForest, Nicholas; Stadler, Michael; Donadee, Jon; Dierckxsens, Carlos; Mendes, Goncalo et al.
System: The UNT Digital Library
Laser Inertial Fusion Energy Control Systems (open access)

Laser Inertial Fusion Energy Control Systems

A Laser Inertial Fusion Energy (LIFE) facility point design is being developed at LLNL to support an Inertial Confinement Fusion (ICF) based energy concept. This will build upon the technical foundation of the National Ignition Facility (NIF), the world's largest and most energetic laser system. NIF is designed to compress fusion targets to conditions required for thermonuclear burn. The LIFE control systems will have an architecture partitioned by sub-systems and distributed among thousands of front-end processors, embedded controllers and supervisory servers. LIFE's automated control subsystems will require interoperation between different languages and target architectures. Much of the control system will be embedded into the subsystem with well-defined interface and performance requirements to the supervisory control layer. An automation framework will be used to orchestrate and automate start-up and shut-down as well as steady state operation. The LIFE control system will have a highly parallel, segmented architecture. For example, the laser system consists of 384 identical laser beamlines in a 'box'. The control system will mirror this architectural replication for each beamline with a straightforward high-level interface for control and status monitoring. Key technical challenges, such as injected-target tracking and laser-pointing feedback, will be discussed. This talk …
Date: March 18, 2011
Creator: Marshall, C.; Carey, R.; Demaret, R.; Edwards, O.; Lagin, L. & Van Arsdall, P.
System: The UNT Digital Library
Beam-beam effects in space charge dominated ion beams (open access)

Beam-beam effects in space charge dominated ion beams

N/A
Date: March 18, 2013
Creator: Montag, C. & Fedotov, A.
System: The UNT Digital Library
The SEMATECH Berkeley MET pushing EUV development beyond 22-nm half pitch (open access)

The SEMATECH Berkeley MET pushing EUV development beyond 22-nm half pitch

Microfield exposure tools (METs) play a crucial role in the development of extreme ultraviolet (EUV) resists and masks. One of these tools is the SEMATECH Berkeley 0.3 numerical aperture (NA) MET. Using conventional illumination, this tool is limited to approximately 22-nm half pitch resolution. However, resolution enhancement techniques have been used to push the patterning capabilities of this tool to half pitches of 18 nm and below. This resolution was achieved in a new imageable hard mask, which also supports contact printing down to 22 nm with conventional illumination. Along with resolution, line-edge roughness (LER) is another crucial hurdle facing EUV resists. Much of the resist LER, however, can be attributed to the mask. We have shown that intentionally aggressive mask cleaning on an older-generation mask causes correlated LER in photoresist to increase from 3.4 nm to 4.0 nm. We have also shown that new-generation EUV masks (100 pm of substrate roughness) can achieve correlated LER values of 1.1 nm, a 3x improvement over the correlated LER of older-generation EUV masks (230 pm of substrate roughness). Finally, a 0.5-NA MET has been proposed that will address the needs of EUV development at the 16-nm node and beyond. The …
Date: March 18, 2010
Creator: Naulleau, P.; Anderson, C. N.; Backlea-an, L.-M.; Chan, D.; Denham, P.; George, S. et al.
System: The UNT Digital Library
Leveraging Structural Information for the Discovery of New Drugs - Computational Methods (open access)

Leveraging Structural Information for the Discovery of New Drugs - Computational Methods

None
Date: March 18, 2011
Creator: Nguyen, T. B.; Wong, S. E. & Lightstone, F. C.
System: The UNT Digital Library
Caregiver Participation in Hospice Interdisciplinary Team Meetings via Videophone Technology: A Pilot Study to Improve Pain Management (open access)

Caregiver Participation in Hospice Interdisciplinary Team Meetings via Videophone Technology: A Pilot Study to Improve Pain Management

Article on a pilot study to improve pain management and caregiver participation in hospice interdisciplinary team meetings via videophone technology.
Date: March 18, 2010
Creator: Oliver, Debra Parker; Demiris, George; Wittenberg-Lyles, Elaine; Porock, Davina, Ph. D.; Collier, Jacqueline & Arthur, Antony
System: The UNT Digital Library
LARP LHC 4.8 GHZ Schottky System Initial Commissioning with Beam (open access)

LARP LHC 4.8 GHZ Schottky System Initial Commissioning with Beam

The LHC Schottky system consists of four independent 4.8 GHz triple down-conversion receivers with associated data acquisition systems. Each system is capable of measuring tune, chromaticity, and momentum spread in either the horizontal or vertical plane; two systems per beam. The hardware commissioning has taken place from spring through fall of 2010. With nominal bunch beam currents of 10¹¹ protons, the first incoherent Schottky signals were detected and analyzed. This paper will report on these initial commissioning results. A companion paper will report on the data analysis curve fitting and remote control user interface of the system. The Schottky system for the LHC was proposed in 2004 under the auspices of the LARP collaboration. Similar systems were commissioned in 2003 in the Fermilab Tevatron and Recycler accelerators as a means of measuring tunes noninvasively. The Schottky detector is based on the stochastic cooling pickups that were developed for the Fermilab Antiproton Source Debuncher cooling upgrade completed in 2002. These slotted line waveguide pickups have the advantage of large aperture coupled with high beam coupling characteristics. For stochastic cooling, wide bandwidths are integral to cooling performance. The bandwidth of slotted waveguide pickups can be tailored by choosing the length of the …
Date: March 18, 2011
Creator: Pasquinelli, Ralph J.; Jansson, Andreas; Jones, O. Rhodri & Caspers, Fritz
System: The UNT Digital Library
Mahi‐mahi (Coryphaena hippurus) life development: morphological, physiological, behavioral and molecular phenotypes (open access)

Mahi‐mahi (Coryphaena hippurus) life development: morphological, physiological, behavioral and molecular phenotypes

Article describing a study of the development of cultured mahi‐mahi from the zygote stage to adulthood.
Date: March 18, 2019
Creator: Perrichon, Prescilla; Stieglitz, John D.; Xu, Elvis Genbo; Magnuson, Jason T.; Pasparakis, Christina; Mager, Edward M. et al.
System: The UNT Digital Library
HPC CLOUD APPLIED TO LATTICE OPTIMIZATION (open access)

HPC CLOUD APPLIED TO LATTICE OPTIMIZATION

As Cloud services gain in popularity for enterprise use, vendors are now turning their focus towards providing cloud services suitable for scientific computing. Recently, Amazon Elastic Compute Cloud (EC2) introduced Cluster Compute Instances (CCI), a new instance type specifically designed for High Performance Computing (HPC) applications. At Berkeley Lab, the physicists at the Advanced Light Source (ALS) have been running Lattice Optimization on a local cluster, but the queue wait time and the limited flexibility to request compute resources when needed are not ideal for rapid development work. To explore alternatives, for the first time we investigate running the Lattice Optimization application on Amazon's new CCI to demonstrate the feasibility and trade-offs of using public cloud services for science.
Date: March 18, 2011
Creator: Sun, Changchun; Nishimura, Hiroshi; James, Susan; Song, Kai; Muriki, Krishna & Qin, Yong
System: The UNT Digital Library