70 Matching Results


Low Dose Radiation Response Curves, Networks and Pathways in Human Lymphoblastoid Cells Exposed from 1 to 10 cGy of Acute Gamma Radiation (open access)

We investigated the low dose dependency of the transcriptional response of human cells to characterize the shape and biological functions associated with the dose response curve and to identify common and conserved functions of low dose expressed genes across cells and tissues. Human lymphoblastoid (HL) cells from two unrelated individuals were exposed to graded doses of radiation spanning 1-10 cGy and analyzed by transcriptome profiling, qPCR and bioinformatics, in comparison to sham irradiated samples. A set of ~80 genes showed consistent responses in both cell lines; these genes were associated with homeostasis mechanisms (e.g., membrane signaling, molecule transport), subcellular locations (e.g., Golgi, and endoplasmic reticulum), and involved diverse signal transduction pathways. The majority of radiation-modulated genes had plateau-like responses across 1-10 cGy, some with suggestive evidence that transcription was modulated at doses below 1 cGy. MYC, FOS and TP53 were the major network nodes of the low-dose response in HL cells. Comparison of our low-dose expression findings in HL cells with those of prior studies in mouse brain after whole body exposure, in human keratinocyte cultures, and in endothelial cell cultures indicates that certain components of the low dose radiation response are broadly conserved across cell types …
Date: April 18, 2011
Creator: Wyrobek, A. J.; Manohar, C. F.; Nelson, D. O.; Furtado, M. R.; Bhattacharya, M. S.; Marchetti, F. et al.
System: The UNT Digital Library
Determination of effective axion masses in the helium-3 buffer of CAST (open access)

The CERN Axion Solar Telescope (CAST) is a ground-based experiment located in Geneva (Switzerland) searching for axions coming from the Sun. Axions, hypothetical particles that not only could solve the strong CP problem but also be one of the favored candidates for dark matter, can be produced in the core of the Sun via the Primakoff effect. They can be reconverted into X-ray photons on Earth in the presence of strong electromagnetic fields. In order to look for axions, CAST points a decommissioned LHC prototype dipole magnet, with different X-ray detectors installed at both ends of the magnet, towards the Sun. The analysis of the data acquired during the first phase of the experiment yielded the most restrictive experimental upper limit on the axion-to-photon coupling constant for axion masses up to about 0.02 eV/c². During the second phase, CAST extends its mass sensitivity by tuning the electron density present in the magnetic field region. Injecting precise amounts of helium gas has enabled CAST to look for axion masses up to 1.2 eV/c². This paper studies the determination of the effective axion masses scanned at CAST during its second phase. The use of a helium gas buffer at …
Date: November 18, 2011
Creator: Ruz, J
System: The UNT Digital Library
Role of Standard Demand Response Signals for Advanced Automated Aggregation (open access)

Emerging standards such as OpenADR enable Demand Response (DR) Resources to interact directly with Utilities and Independent System Operators, allowing their facility automation equipment to respond to a variety of DR signals ranging from day-ahead to real-time ancillary services. In addition, there are Aggregators in today's markets who are capable of bringing together collections of aggregated DR assets and selling them to the grid as a single resource. However, in most cases these aggregated resources are not automated, and when they are, they typically use proprietary technologies. There is a need for a framework for dealing with aggregated resources that supports the following requirements:
• Allow demand-side resources to participate in multiple DR markets ranging from wholesale ancillary services to retail tariffs without being completely committed to a single entity such as an Aggregator;
• Allow aggregated groups of demand-side resources to be formed in an ad hoc fashion to address specific grid-side issues, and support the optimization of the collective response of an aggregated group along a number of different dimensions. This is important in order to tailor the aggregated performance envelope to the needs of the grid;
• Allow aggregated groups to be formed in …
Date: November 18, 2011
Creator: Lawrence Berkeley National Laboratory & Kiliccote, Sila
System: The UNT Digital Library
Extracting Scattering Phase-Shifts in Higher Partial-Waves from Lattice QCD Calculations (open access)

None
Date: January 18, 2011
Creator: Luu, T & Savage, M J
System: The UNT Digital Library
LARP LHC 4.8 GHZ Schottky System Initial Commissioning with Beam (open access)

The LHC Schottky system consists of four independent 4.8 GHz triple down conversion receivers with associated data acquisition systems. Each system is capable of measuring tune, chromaticity, and momentum spread in either the horizontal or vertical plane; there are two systems per beam. The hardware commissioning took place from spring through fall of 2010. With nominal bunch beam currents of 10¹¹ protons, the first incoherent Schottky signals were detected and analyzed. This paper reports on these initial commissioning results. A companion paper reports on the data analysis, curve fitting, and remote control user interface of the system. The Schottky system for the LHC was proposed in 2004 under the auspices of the LARP collaboration. Similar systems were commissioned in 2003 in the Fermilab Tevatron and Recycler accelerators as a means of measuring tunes noninvasively. The Schottky detector is based on the stochastic cooling pickups that were developed for the Fermilab Antiproton Source Debuncher cooling upgrade completed in 2002. These slotted line waveguide pickups have the advantage of large aperture coupled with high beam coupling characteristics. For stochastic cooling, wide bandwidths are integral to cooling performance. The bandwidth of slotted waveguide pickups can be tailored by choosing the length of the …
Date: March 18, 2011
Creator: Pasquinelli, Ralph J.; Jansson, Andreas; Jones, O. Rhodri & Caspers, Fritz
System: The UNT Digital Library
A Green Prison: Santa Rita Jail Creeps Towards Zero Net Energy (ZNE) (open access)

A large project is underway at Alameda County's twenty-year-old, 45 ha, 4,000-inmate Santa Rita Jail, about 70 km east of San Francisco. Often described as a green prison, it has a considerable installed base of distributed energy resources including a seven-year-old 1.2 MW PV array, a four-year-old 1 MW fuel cell with heat recovery, and efficiency investments. A current US$14 M expansion will add approximately 2 MW of NaS batteries, undetermined wind capacity, and a concentrating solar thermal system. This ongoing effort by a progressive local government with considerable Federal and State support provides some excellent lessons for the struggle to lower building carbon footprint. The Distributed Energy Resources Customer Adoption Model (DER-CAM) finds true optimal combinations of equipment and operating schedules for microgrids that minimize energy bills and/or carbon emissions without significant searching or rules-of-thumb prioritization, such as "efficiency first, then on-site generation." The results often recommend complex systems, and sensitivities show how policy changes will affect choices. This paper reports an analysis of the historic performance of the PV system and fuel cell, describes the complex optimization applied to the battery scheduling, and shows how results will affect the jail's operational costs, …
Date: March 18, 2011
Creator: Marnay, Chris; DeForest, Nicholas; Stadler, Michael; Donadee, Jon; Dierckxsens, Carlos; Mendes, Goncalo et al.
System: The UNT Digital Library
Multi-jet Merging with NLO Matrix Elements (open access)

In the algorithm presented here, the ME+PS approach to merge samples of tree-level matrix elements into inclusive event samples is combined with the POWHEG method, which includes exact next-to-leading order matrix elements in the parton shower. The advantages of the method are discussed and the quality of its implementation in SHERPA is exemplified by results for e⁺e⁻ annihilation into hadrons at LEP, for deep-inelastic lepton-nucleon scattering at HERA, for Drell-Yan lepton-pair production at the Tevatron, and for W⁺W⁻ production at LHC energies. The simulation of hard QCD radiation in parton-shower Monte Carlos has seen tremendous progress in recent years. It was largely stimulated by the need for more precise predictions at LHC energies, where the large available phase space allows additional hard QCD radiation alongside known Standard Model processes or even signals from new physics. Two types of algorithms have been developed, which make it possible to improve upon the soft-collinear approximations made in the parton shower, such that hard radiation is simulated according to exact matrix elements. In the ME+PS approach [1] higher-order tree-level matrix elements for different final-state jet multiplicities are merged with each other and with subsequent parton shower emissions to generate an inclusive sample. …
Date: August 18, 2011
Creator: Siegert, Frank; Hoche, Stefan; Krauss, Frank & Schonherr, Marek
System: The UNT Digital Library
Nonequilibrium Thermoelectrics: Low-Cost, High-Performance Materials for Cooling and Power Generation (open access)

Thermoelectric materials can be made into coolers (TECs), which use electricity to develop a temperature difference for cooling, or generators (TEGs), which convert heat directly to electricity. One application of TEGs is to place them in a waste heat stream to recuperate some of the power being lost and put it to use more profitably. To be effective thermoelectrics, however, materials must have both high electrical conductivity and low thermal conductivity, a combination rarely found in nature. Materials selection and processing have led to the development of several systems with a figure of merit, ZT, of nearly unity. By using non-equilibrium techniques, we have fabricated higher efficiency thermoelectric materials. The process involves creating an amorphous material through melt spinning and then sintering it with either spark plasma or a hot press for as little as two minutes. This results in a 100% dense material with an extremely fine grain structure. The grain boundaries appear to retard phonons, resulting in a reduced thermal conductivity, while the electrons move through the material relatively unchecked. The techniques used are low-cost and scalable to support industrial manufacturing.
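For reference, the dimensionless figure of merit ZT mentioned in the abstract above is conventionally defined in terms of the Seebeck coefficient S, electrical conductivity σ, absolute temperature T, and thermal conductivity κ, which makes explicit why the combination of high electrical and low thermal conductivity is required:

```latex
ZT = \frac{S^{2}\,\sigma\,T}{\kappa}
```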
Date: May 18, 2011
Creator: Li, Q.
System: The UNT Digital Library
Status Of The National Ignition Campaign And National Ignition Facility Integrated Computer Control System (open access)

The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn. NIF is operated by the Integrated Computer Control System (ICCS) in an object-oriented, CORBA-based system distributed among over 1800 front-end processors, embedded controllers and supervisory servers. In the fall of 2010, a set of experiments began with deuterium and tritium filled targets as part of the National Ignition Campaign (NIC). At present, all 192 laser beams routinely fire to target chamber center to conduct fusion and high energy density experiments. During the past year, the control system was expanded to include automation of the cryogenic target system, and over 20 diagnostic systems were deployed and utilized in experiments to support fusion experiments. This talk discusses the current status of the NIC and the plan for …
Date: March 18, 2011
Creator: Lagin, L.; Brunton, G.; Carey, R.; Demaret, R.; Fisher, J.; Fishler, B. et al.
System: The UNT Digital Library
Partition Coefficients of Organic Compounds in Four New Tetraalkylammonium Bis(trifluoromethylsulfonyl)imide Ionic Liquids Using Inverse Gas Chromatography (open access)

This article discusses partition coefficients of organic compounds in four new tetraalkylammonium bis(trifluoromethylsulfonyl)imide ionic liquids using inverse gas chromatography.
Date: August 18, 2011
Creator: Acree, William E. (William Eugene); Baker, Gary A.; Mutelet, Fabrice & Moïse, Jean-Charles
System: The UNT Digital Library

The Synergy Between Qualitative Theory, Quantitative Calculations, and Direct Experiments in Understanding, Calculating, and Measuring the Energy Differences Between the Lowest Singlet and Triplet States of Organic Diradicals

Article discussing the synergy between qualitative theory, quantitative calculations, and direct experiments in understanding, calculating, and measuring the energy differences between the lowest singlet and triplet states of organic diradicals.
Date: April 18, 2012
Creator: Lineberger, W. Carl & Borden, Weston T.
System: The UNT Digital Library
Overview of the Physics and Engineering Design of NSTX Upgrade (open access)

None
Date: August 18, 2011
Creator: J. M. et al.
System: The UNT Digital Library
Charmonium Production and Corona Effect (open access)

None
Date: November 18, 2011
Creator: Digal, S; Satz, H & Vogt, R
System: The UNT Digital Library
Advanced Gated X-Ray Imagers for Experiments at the National Ignition Facility (open access)

None
Date: August 18, 2011
Creator: Glenn, S.; Bell, P.; Benedetti, L.; Bradley, D.; Celeste, J.; Heeter, R. et al.
System: The UNT Digital Library
Leveraging Structural Information for the Discovery of New Drugs - Computational Methods (open access)

None
Date: March 18, 2011
Creator: Nguyen, T. B.; Wong, S. E. & Lightstone, F. C.
System: The UNT Digital Library
Quantification of emotional bias by an emotional-gain model (open access)

Article accompanying a poster presentation for the 2011 Computational Neuroscience Annual Meeting. This article discusses the quantification of emotional bias by an emotional-gain model.
Date: July 18, 2011
Creator: Tam, Nicoladie D.
System: The UNT Digital Library
Quantification of fairness bias by a fairness-equity model (open access)

Article accompanying a poster presentation for the 2011 Computational Neuroscience Meeting. This article discusses the quantification of fairness bias by a fairness-equity model.
Date: July 18, 2011
Creator: Tam, Nicoladie D.
System: The UNT Digital Library
Gender difference of emotional bias in sharing love (open access)

Article accompanying a poster presentation for the 2011 Computational Neuroscience Annual Meeting. This article discusses gender difference of emotional bias in sharing love.
Date: July 18, 2011
Creator: Tam, Nicoladie D.
System: The UNT Digital Library
Contributing factors in judgement of fairness by monetary value (open access)

Article accompanying a poster presentation for the 2011 Computational Neuroscience Annual Meeting. This article discusses research on contributing factors in the judgement of fairness by monetary value.
Date: July 18, 2011
Creator: Tam, Nicoladie D.
System: The UNT Digital Library
A Virtualized Computing Platform For Fusion Control Systems (open access)

The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. 2,500 servers, 400 network devices and 700 terabytes of network-attached storage provide the foundation for NIF's Integrated Computer Control System (ICCS) and Experimental Data Archive. This talk discusses the rationale and benefits of server virtualization in the context of an operational experimental facility; the requirements discovery process used by the NIF teams to establish evaluation criteria for virtualization alternatives; the processes and procedures defined to enable virtualization of servers in a timeframe that did not delay the execution of experimental campaigns; and the lessons the NIF teams learned along the way. The virtualization architecture ultimately selected for ICCS is based on the Open Source Xen computing platform and …
Date: March 18, 2011
Creator: Frazier, T.; Adams, P.; Fisher, J. & Talbot, A.
System: The UNT Digital Library
The Backstroke Framework for Source Level Reverse Computation Applied to Parallel Discrete Event Simulation (open access)

This report introduces Backstroke, a new open source framework for the automatic generation of reverse code for functions written in C++. Backstroke enables reverse computation for optimistic parallel discrete event simulations. It is built over the ROSE open-source compiler infrastructure, and handles complex C++ features including pointers and pointer types, arrays, function and method calls, class types, inheritance, polymorphism, virtual functions, abstract classes, and templated classes and containers. Backstroke also introduces new program inversion techniques based on advanced compiler analysis tools built into ROSE. We explore and illustrate some of the complex language and semantic issues that arise in generating correct reverse code for C++ functions.
Date: July 18, 2011
Creator: Vulov, G.; Hou, C.; Quinlan, D.; Vuduc, R.; Fujimoto, R. & Jefferson, D.
System: The UNT Digital Library
Modeling pulsed-laser melting of embedded semiconductor nanoparticles (open access)

Pulsed-laser melting (PLM) is commonly used to achieve a fast quench rate in both thin films and nanoparticles. A model for the size evolution during PLM of nanoparticles confined in a transparent matrix, such as those created by ion-beam synthesis, is presented. A self-consistent mean-field rate-equation approach that has been used successfully to model ion-beam synthesis of germanium nanoparticles in silica is extended to include the PLM process. The PLM model includes classical optical absorption, multiscale heat transport by both analytical and finite difference methods, and melting kinetics for confined nanoparticles. The treatment of nucleation and coarsening behavior developed for the ion-beam synthesis model is modified to allow for a non-uniform temperature gradient and for interacting liquid and solid particles with different properties. The model allows prediction of the particle size distribution after PLM under various laser fluences, starting from any particle size distribution, including as-implanted or annealed simulated samples. A route for narrowing the size distribution of embedded nanoparticles is suggested, with simulated distribution widths as low as 15% of the average size.
Date: May 18, 2011
Creator: Sawyer, C. A.; Guzman, J.; Boswell-Koller, C. N.; Sherburne, M. P.; Mastandrea, J. P.; Bustillo, K. C. et al.
System: The UNT Digital Library