Detection of volatile organic compounds using surface enhanced Raman scattering (open access)

The authors present the detection of volatile organic compounds directly in the vapor phase using surface-enhanced Raman scattering (SERS) substrates based on lithographically defined two-dimensional rectangular arrays of nanopillars. The nanopillars are of the tapered type, for which SERS enhancement arises from the nanofocusing effect of the sharp tip on top. SERS experiments were carried out on these substrates using various concentrations of toluene vapor. The results show that a SERS signal from toluene vapor at ppm-level concentrations can be achieved, and that the toluene vapor can be detected within minutes of exposing the SERS substrate to it. A simple adsorption model is developed whose results match the experimental data. The results also show promising potential for the use of these substrates in environmental monitoring of gases and vapors.
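The "simple adsorption model" is not specified in the abstract; as an illustration only, first-order Langmuir adsorption kinetics (an assumed model, with hypothetical rate constants) reproduces the qualitative behavior of a signal that saturates within minutes of exposure:

```python
import math

def langmuir_coverage(t, k_ads, k_des, concentration):
    """Fractional surface coverage theta(t) under first-order Langmuir kinetics.

    d(theta)/dt = k_ads*C*(1 - theta) - k_des*theta has the closed form
    theta(t) = theta_eq * (1 - exp(-t/tau)), with 1/tau = k_ads*C + k_des.
    """
    rate = k_ads * concentration + k_des
    theta_eq = k_ads * concentration / rate  # equilibrium coverage
    return theta_eq * (1.0 - math.exp(-rate * t))

# Coverage (and hence SERS signal) rises toward equilibrium over minutes;
# all rate constants and the concentration below are hypothetical.
for t in (0, 60, 300, 1200):  # seconds
    print(t, round(langmuir_coverage(t, k_ads=1e-3, k_des=1e-3, concentration=1.0), 3))
```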
Date: March 22, 2012
Creator: Chang, A. S.; Maiti, A.; Ileri, N.; Bora, M.; Larson, C. C.; Britten, J. A. et al.
System: The UNT Digital Library
Estimating the Bias of Local Polynomial Approximations Using the Peano Kernel (open access)

These presentation visuals define local polynomial approximations, give formulas for the bias and random components of the error, and express the bias error in terms of the Peano kernel. They further derive constants that give figures of merit, and show these figures of merit for three common weighting functions. The Peano kernel theorem yields estimates for the bias error of local-polynomial-approximation smoothing that are superior in several ways to the error estimates in the current literature.
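The Peano kernel representation referred to above has the following standard form (a generic statement of the theorem; the symbols need not match the slides):

```latex
% If the error functional E annihilates all polynomials of degree <= n,
% then for f in C^{n+1}[a,b]:
E[f] = \int_a^b K(t)\, f^{(n+1)}(t)\, \mathrm{d}t,
\qquad
K(t) = \frac{1}{n!}\, E_x\!\left[(x - t)_+^{\,n}\right],
% so the bias of a degree-n local polynomial smoother is bounded by
|E[f]| \le \bigl\| f^{(n+1)} \bigr\|_\infty \int_a^b |K(t)|\, \mathrm{d}t .
```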
Date: March 22, 2012
Creator: Blair, J., and Machorro, E.
System: The UNT Digital Library
Micro-Analysis of Actinide Minerals for Nuclear Forensics and Treaty Verification (open access)

Micro-Raman spectroscopy has been demonstrated to be a viable tool for nondestructive determination of the crystal phase of relevant minerals. Spectra were collected on particles down to 5 microns in size. Some of the minerals studied were weak scatterers and were better studied with other techniques. A capable graphical software package should easily be able to compare collected spectra to a spectral library as well as subtract out matrix vibration peaks. Given the success and unequivocal identification of the most common mineral false positive (zircon), it is clear that Raman has a future in the complementary, rapid determination of unknown particulate samples containing actinides.
Date: March 22, 2012
Creator: Morey, M.; Manard, M.; Russo, R. & Havrilla, G.
System: The UNT Digital Library
An Adaptive Particle Filtering Approach to Tracking Modes in a Varying Shallow Ocean Environment (open access)

The shallow ocean environment is ever-changing, mostly due to temperature variations in its upper layers (< 100 m) that directly affect sound propagation throughout. The need to develop processors capable of tracking these changes implies a stochastic as well as an 'adaptive' design. The stochastic requirement follows directly from the multitude of variations created by uncertain parameters and noise. Some work has been accomplished in this area, but the stochastic nature was constrained to Gaussian uncertainties. It has long been clear that this constraint is not particularly realistic, leading to a Bayesian approach that enables the representation of any uncertainty distribution. Sequential Bayesian techniques enable a class of processors capable of performing in an uncertain, nonstationary (varying statistics), non-Gaussian, variable shallow ocean. In this paper, adaptive processors providing enhanced signals for acoustic hydrophone measurements on a vertical array, as well as enhanced modal function estimates, are developed. Synthetic data are used to demonstrate that this approach is viable.
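The sequential Bayesian processing described above can be sketched as a generic bootstrap particle filter (a minimal stand-in, not the paper's adaptive modal processor; the toy state and measurement models below are hypothetical):

```python
import math
import random

def particle_filter_step(particles, weights, measurement, propagate, likelihood):
    """One sequential-Bayes update: propagate, reweight, resample.

    `propagate` and `likelihood` encode the (possibly non-Gaussian)
    state and measurement models; nothing here assumes Gaussianity.
    """
    # Prediction: move each particle through the state model.
    particles = [propagate(p) for p in particles]
    # Update: reweight by the measurement likelihood.
    weights = [w * likelihood(measurement, p) for w, p in zip(weights, particles)]
    total = sum(weights) or 1e-300
    weights = [w / total for w in weights]
    # Resample when the effective sample size collapses.
    n = len(particles)
    ess = 1.0 / sum(w * w for w in weights)
    if ess < n / 2:
        particles = random.choices(particles, weights=weights, k=n)
        weights = [1.0 / n] * n
    return particles, weights

# Toy scalar example: random-walk state, noisy measurement (values hypothetical).
random.seed(0)
parts = [random.gauss(0, 1) for _ in range(500)]
wts = [1 / 500] * 500
prop = lambda x: x + random.gauss(0, 0.1)
lik = lambda z, x: math.exp(-0.5 * (z - x) ** 2)
for z in (0.2, 0.25, 0.3):
    parts, wts = particle_filter_step(parts, wts, z, prop, lik)
estimate = sum(w * p for w, p in zip(wts, parts))  # posterior mean estimate
```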
Date: March 22, 2011
Creator: Candy, J V
System: The UNT Digital Library
BioPig: Developing Cloud Computing Applications for Next-Generation Sequence Analysis (open access)

Next-generation sequencing is producing ever larger data sizes, with a growth rate outpacing Moore's Law. The data deluge has made many current sequence-analysis tools obsolete because they do not scale with the data. Here we present BioPig, a collection of cloud computing tools for scaling data analysis and management. Pig is a flexible data scripting language that uses Apache's Hadoop data structure and map-reduce framework to process very large data files in parallel and combine the results. BioPig extends Pig with sequence analysis capability. We will show the performance of BioPig on a variety of bioinformatics tasks, including screening sequence contaminants, Illumina QA/QC, and gene discovery from metagenome data sets, using the Rumen metagenome as an example.
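The map-reduce pattern that Pig compiles to, applied to a sequence task, can be sketched in plain Python (a stand-in illustration; BioPig's actual API and Pig Latin syntax are not shown):

```python
from collections import Counter
from functools import reduce

def map_kmers(read, k=4):
    """Map step: count k-mers in one sequencing read."""
    return Counter(read[i:i + k] for i in range(len(read) - k + 1))

def reduce_counts(a, b):
    """Reduce step: merge partial k-mer counts."""
    a.update(b)
    return a

reads = ["ACGTACGT", "CGTACGTA"]          # toy reads; real inputs are FASTA/FASTQ
partials = [map_kmers(r) for r in reads]   # these map tasks run in parallel under Hadoop
totals = reduce(reduce_counts, partials, Counter())
```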
Date: March 22, 2011
Creator: Bhatia, Karan & Wang, Zhong
System: The UNT Digital Library
Carrots and Sticks: A Comprehensive Business Model for the Successful Achievement of Energy Efficiency Resource Standards (open access)

Energy efficiency resource standards (EERS) are a prominent strategy for achieving rapid and aggressive energy savings goals in the U.S. As of December 2010, twenty-six U.S. states had some form of EERS with savings goals applicable to energy efficiency (EE) programs paid for by utility customers. The European Union has initiated a similar type of savings goal, the Energy End-use Efficiency and Energy Services Directive, which in some countries is being implemented through direct partnership with regulated electric utilities. U.S. utilities face significant financial disincentives under traditional regulation, which dampen the interest of shareholders and managers in aggressively pursuing cost-effective energy efficiency. Regulators are considering some combination of mandated goals ('sticks') and alternative utility business model components ('carrots', such as performance incentives) to align the utility's business and financial interests with state and federal energy efficiency public policy goals. European countries that have directed their utilities to administer EE programs have generally relied on non-binding mandates and targets; in the U.S., most state regulators have increasingly viewed 'carrots' as a necessary condition for successful achievement of energy efficiency goals and targets. In this paper, we analyze the financial impacts of an EERS on a large electric utility …
Date: March 22, 2011
Creator: Satchwell, Andrew; Cappers, Peter & Goldman, Charles
System: The UNT Digital Library
Efficient Graph Based Assembly of Short-Read Sequences on Hybrid Core Architecture (open access)

Advanced architectures can deliver dramatically increased throughput for genomics and proteomics applications, reducing time-to-completion in some cases from days to minutes. One such architecture, hybrid-core computing, marries a traditional x86 environment with a reconfigurable coprocessor based on field programmable gate array (FPGA) technology. In addition to higher throughput, increased performance can fundamentally improve research quality by allowing more accurate, previously impractical approaches. We will discuss the approach used by Convey's de Bruijn graph constructor for short-read, de novo assembly. Bioinformatics applications that have random access patterns to large memory spaces, such as graph-based algorithms, experience memory performance limitations on cache-based x86 servers. Convey's highly parallel memory subsystem allows application-specific logic to simultaneously access 8192 individual words in memory, significantly increasing effective memory bandwidth over cache-based memory systems. Many algorithms, such as Velvet and other de Bruijn graph-based, short-read, de novo assemblers, can greatly benefit from this type of memory architecture. Furthermore, small data type operations (four nucleotides can be represented in two bits) make more efficient use of logic gates than the data types dictated by conventional programming models. JGI is comparing the performance of Convey's graph constructor and Velvet on both synthetic and real data. We will present preliminary results on …
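The two ideas in the abstract, de Bruijn graph construction from reads and 2-bit nucleotide packing, can be sketched as follows (a toy software illustration, not Convey's hardware implementation):

```python
from collections import defaultdict

ENCODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(seq):
    """Pack nucleotides into 2 bits each (the abstract's small-data-type point)."""
    word = 0
    for base in seq:
        word = (word << 2) | ENCODE[base]
    return word

def de_bruijn(reads, k):
    """Build a de Bruijn graph: edges link overlapping (k-1)-mers of each read."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])
    return graph

graph = de_bruijn(["ACGTG", "CGTGA"], k=3)  # toy reads
```

Random accesses into `graph` are exactly the memory pattern the abstract says cache-based servers handle poorly.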
Date: March 22, 2011
Creator: Sczyrba, Alex; Pratap, Abhishek; Canon, Shane; Han, James; Copeland, Alex; Wang, Zhong et al.
System: The UNT Digital Library
Evaluation of evolving residential electricity tariffs (open access)

Residential customers in California's Pacific Gas and Electric (PG&E) territory have seen several electricity rate structure changes in the past decade. A relatively simple two-tiered pricing system (charges by usage under/over baseline for the home's climate zone) was replaced in the summer of 2001 by a more complicated five-tiered system (usage below baseline, and up to 30%, 100%, 200%, and 300%+ over baseline). In 2009, PG&E began the process of upgrading its residential customers to Smart Meters and laying the groundwork for time-of-use pricing, due to start in 2011. This paper examines the history of the tiered pricing system, discusses the problems the utility encountered with its Smart Meter rollout, and evaluates the proposed dynamic pricing incentive structures. Scenario analyses of example PG&E customer bills are also presented: what would these residential customers pay if they were still operating under a tiered structure, and/or if they participated in peak hour reductions?
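The five-tier structure can be sketched as a simple bill calculator; the tier breakpoints follow the abstract, while the rates and usage figures are hypothetical:

```python
def tiered_bill(usage_kwh, baseline_kwh, rates):
    """Charge usage against tiers defined as multiples of baseline.

    Tiers follow the abstract: up to baseline, then up to 130%, 200%,
    and 300% of baseline, and everything above 300%.
    """
    breakpoints = [1.0, 1.3, 2.0, 3.0, float("inf")]  # multiples of baseline
    bill, prev = 0.0, 0.0
    for frac, rate in zip(breakpoints, rates):
        upper = frac * baseline_kwh
        in_tier = max(0.0, min(usage_kwh, upper) - prev)
        bill += in_tier * rate
        prev = upper
    return bill

# Hypothetical rates ($/kWh) rising with tier; 600 kWh used, 400 kWh baseline.
bill = tiered_bill(600, 400, rates=[0.12, 0.14, 0.25, 0.35, 0.40])
```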
Date: March 22, 2011
Creator: Lai, Judy; DeForest, Nicholas; Kiliccote, Sila; Stadler, Michael; Marnay, Chris & Donadee, Jon
System: The UNT Digital Library
Laser Wakefield Acceleration Beyond 1 GeV Using Ionization Induced Injection* (open access)

A series of laser wakefield accelerator experiments leading to electron energies exceeding 1 GeV are described. Theoretical concepts and experimental methods developed while conducting experiments using the 10 TW Ti:Sapphire laser at UCLA were implemented and transferred successfully to the 100 TW Callisto Laser System at the Jupiter Laser Facility at LLNL. To reach electron energies greater than 1 GeV with current laser systems, it is necessary to inject and trap electrons in the wake and to guide the laser over more than 1 cm of plasma. Using the 10 TW laser, the physics of self-guiding and the limitations imposed by pump depletion over cm-scale plasmas were demonstrated. Furthermore, a novel injection mechanism was explored that allows injection by ionization at the conditions necessary for generating electron energies greater than a GeV. The 10 TW results were followed by self-guiding at the 100 TW scale over cm plasma lengths. The energy of the self-injected electrons, at a plasma density of 3 × 10^18 cm^-3, was limited by dephasing to 720 MeV. Implementation of ionization injection allowed extending the acceleration well beyond a centimeter, and 1.4 GeV electrons were measured.
Date: March 22, 2011
Creator: Marsh, K. A.; Clayton, C. E.; Joshi, C.; Lu, W.; Mori, W. B.; Pak, A. et al.
System: The UNT Digital Library
A Modified Treatment of Sources in Implicit Monte Carlo Radiation Transport (open access)

We describe a modification of the treatment of photon sources in the IMC algorithm. We present this modified algorithm in the context of thermal emission in an infinite-medium test problem at equilibrium and show that it completely eliminates statistical noise.
Date: March 22, 2011
Creator: Gentile, N A & Trahan, T J
System: The UNT Digital Library
Multi-language Struct Support in Babel (open access)

Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture (CCA), it is used in a wide range of research projects. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in scientific APIs; they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific code can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++ and Fortran 2003, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. The other supported languages are Fortran 77, Fortran 90, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages, and present benchmark data comparing the performance of all language bindings, highlighting the differences between languages that offer native struct support and those that offer an object-oriented interface with getter/setter methods.
Date: March 22, 2011
Creator: Ebner, D; Prantl, A & Epperly, T W
System: The UNT Digital Library
Overview of the LBNE Neutrino Beam (open access)

The Long Baseline Neutrino Experiment (LBNE) will utilize a neutrino beamline facility located at Fermilab. The facility is designed to aim a beam of neutrinos toward a detector placed at the Deep Underground Science and Engineering Laboratory (DUSEL) in South Dakota. The neutrinos are produced in a three-step process. First, protons from the Main Injector hit a solid target and produce mesons. Then, the charged mesons are focused by a set of focusing horns into the decay pipe, towards the far detector. Finally, the mesons that enter the decay pipe decay into neutrinos. The parameters of the facility were determined by an amalgam of the physics goals, the Monte Carlo modeling of the facility, and the experience gained by operating the NuMI facility at Fermilab. The initial beam power is expected to be approximately 700 kW; however, some of the parameters were chosen to accommodate a beam power of 2.3 MW.
Date: March 22, 2011
Creator: Moore, C. D.; He, Yun; Hurh, Patrick; Hylen, James; Lundberg, Byron; McGee, Mike et al.
System: The UNT Digital Library
Bacillus anthracis genome organization in light of whole transcriptome sequencing (open access)

Emerging knowledge of whole prokaryotic transcriptomes could validate a number of theoretical concepts introduced in the early days of genomics. What are the rules connecting gene expression levels with sequence determinants such as quantitative scores of promoters and terminators? Are translation efficiency measures, e.g. codon adaptation index and RBS score, related to gene expression? We used whole transcriptome shotgun sequencing of the bacterial pathogen Bacillus anthracis to assess the correlation of gene expression level with promoter, terminator and RBS scores, codon adaptation index, and a new measure of gene translational efficiency, average translation speed. We compared computational predictions of operon topologies with the transcript borders inferred from RNA-Seq reads. Transcriptome mapping may also improve existing gene annotation. Upon assessing the accuracy of the current annotation of protein-coding genes in the B. anthracis genome, we have shown that the transcriptome data indicate the existence of more than a hundred genes that are missing from the annotation though predicted by an ab initio gene finder. Interestingly, we observed that many pseudogenes possess not only a sequence with detectable coding potential but also promoters that maintain transcriptional activity.
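A correlation assessment of the kind described, e.g. RBS score versus expression level, can be sketched with Spearman rank correlation (a generic illustration with hypothetical per-gene values; the paper's actual statistics may differ):

```python
def rank(values):
    """1-based ranks by sorted order (assumes no ties, for brevity)."""
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0] * len(values)
    for r, idx in enumerate(order, 1):
        ranks[idx] = r
    return ranks

def spearman(x, y):
    """Spearman rho via the no-ties formula 1 - 6*sum(d^2)/(n(n^2-1))."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical per-gene values: RBS score vs. RNA-Seq expression level.
rbs_scores = [2.1, 3.5, 1.2, 4.0, 2.8]
expression = [10.0, 40.0, 5.0, 55.0, 22.0]
rho = spearman(rbs_scores, expression)  # perfectly monotone toy data
```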
Date: March 22, 2010
Creator: Martin, Jeffrey; Zhu, Wenhan; Passalacqua, Karla D.; Bergman, Nicholas & Borodovsky, Mark
System: The UNT Digital Library
Building Energy Information Systems: User Case Studies (open access)

Measured energy performance data are essential to national efforts to improve building efficiency, as evidenced in recent benchmarking mandates, and in a growing body of work that indicates the value of permanent monitoring and energy information feedback. This paper presents case studies of energy information systems (EIS) at four enterprises and university campuses, focusing on the attained energy savings, and successes and challenges in technology use and integration. EIS are broadly defined as performance monitoring software, data acquisition hardware, and communication systems to store, analyze and display building energy information. Case investigations showed that the most common energy savings and instances of waste concerned scheduling errors, measurement and verification, and inefficient operations. Data quality is critical to effective EIS use, and is most challenging at the subsystem or component level, and with non-electric energy sources. Sophisticated prediction algorithms may not be well understood but can be applied quite effectively, and sites with custom benchmark models or metrics are more likely to perform analyses external to the EIS. Finally, resources and staffing were identified as a universal challenge, indicating a need to identify additional models of EIS use that extend beyond exclusive in-house use, to analysis services.
Date: March 22, 2010
Creator: Granderson, Jessica; Piette, Mary Ann & Ghatikar, Girish
System: The UNT Digital Library
Nebular mixing constrained by the Stardust samples (open access)

Using X-ray microprobe analysis of samples from comet Wild 2 returned by the Stardust mission, we determine that the crystalline Fe-bearing silicate fraction in this Jupiter-family comet is greater than 0.5. Assuming this mixture is a composite of crystalline inner solar system material and amorphous cold molecular cloud material, we deduce that more than half of Wild 2 has been processed in the inner solar system. Several models exist that explain the presence of crystalline materials in comets. We explore some of these models in light of our results.
Date: March 22, 2010
Creator: OGLIORE, R. C.; WESTPHAL, A. J.; GAINSFORTH, Z.; BUTTERWORTH, A. L.; FAKRA, S. C. & Marcus, Matthew A.
System: The UNT Digital Library
A Proposal for User-defined Reductions in OpenMP (open access)

Reductions are commonly used in parallel programs to produce a global result from partial results computed in parallel. Currently, OpenMP only supports reductions for primitive data types and a limited set of base language operators. This is a significant limitation for those applications that employ user-defined data types (e.g., objects). Implementing manual reduction algorithms makes software development more complex and error-prone. Additionally, an OpenMP runtime system cannot optimize a manual reduction algorithm in ways typically applied to reductions on primitive types. In this paper, we propose new mechanisms to allow the use of most pre-existing binary functions on user-defined data types as User-Defined Reduction (UDR) operators. Our measurements show that our UDR prototype implementation provides consistently good performance across a range of thread counts without increasing general runtime overheads.
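The proposal targets OpenMP in C, C++ and Fortran; the underlying idea, chunk-wise reduction of a user-defined type with a pre-existing binary function, can be sketched as follows (a Python analogy, not the proposed OpenMP syntax; all names are hypothetical):

```python
from functools import reduce
from multiprocessing.dummy import Pool  # thread pool, to mimic parallel partial reductions

class Vec:
    """User-defined data type that primitive reductions cannot handle."""
    def __init__(self, x, y):
        self.x, self.y = x, y

def vec_add(a, b):
    """Pre-existing binary function used as the reduction operator."""
    return Vec(a.x + b.x, a.y + b.y)

data = [Vec(i, 2 * i) for i in range(8)]
# Each "thread" reduces its own chunk, then the partial results are combined;
# a UDR clause would have the OpenMP runtime generate this pattern.
chunks = [data[:4], data[4:]]
with Pool(2) as pool:
    partials = pool.map(lambda c: reduce(vec_add, c), chunks)
total = reduce(vec_add, partials)
```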
Date: March 22, 2010
Creator: Duran, A; Ferrer, R; Klemm, M; de Supinski, B R & Ayguade, E
System: The UNT Digital Library
Thin Silicon MEMS Contact-Stress Sensor (open access)

This thin, MEMS contact-stress (CS) sensor continuously and accurately measures time-varying, solid interface loads in embedded systems over tens of thousands of load cycles. Unlike all other interface load sensors, the CS sensor is extremely thin (< 150 µm), provides accurate, high-speed measurements, and exhibits good stability over time with no loss of calibration with load cycling. The silicon CS sensor, 5 mm² in area and 65 µm thick, has piezoresistive traces doped within a load-sensitive diaphragm. The novel package utilizes several layers of flexible polyimide to mechanically and electrically isolate the sensor from the environment, transmit normal applied loads to the diaphragm, and maintain uniform thickness. The CS sensors have a highly linear output in the load range tested (0-2.4 MPa) with an average accuracy of ±1.5%.
Date: March 22, 2010
Creator: Kotovsky, J; Tooker, A & Horsley, D
System: The UNT Digital Library
Towards an Error Model for OpenMP (open access)

OpenMP lacks essential features for developing mission-critical software. In particular, it has no support for detecting and handling errors or even a concept of them. In this paper, the OpenMP Error Model Subcommittee reports on solutions under consideration for this major omission. We identify issues with the current OpenMP specification and propose a path to extend OpenMP with error-handling capabilities. We add a construct that cleanly shuts down parallel regions as a first step. We then discuss two orthogonal proposals that extend OpenMP with features to handle system-level and user-defined errors.
Date: March 22, 2010
Creator: Wong, M.; Klemm, M.; Duran, A.; Mattson, T.; Haab, G.; de Supinski, B. R. et al.
System: The UNT Digital Library
Development of a Cell-Based Fluorescence Resonance Energy Transfer Reporter for Bacillus anthracis Lethal Factor Protease (open access)

We report the construction of a cell-based fluorescent reporter for anthrax lethal factor (LF) protease activity using the principle of fluorescence resonance energy transfer (FRET). This was accomplished by engineering an Escherichia coli cell line to express a genetically encoded FRET reporter and LF protease. The two proteins were encoded on two different expression plasmids under the control of different tightly controlled inducible promoters. The FRET-based reporter was designed to contain an LF recognition sequence flanked by the FRET pair formed by the CyPet and YPet fluorescent proteins. The length of the linker between the two fluorescent proteins was optimized using a flexible peptide linker containing several Gly-Gly-Ser repeats. Our results indicate that this FRET-based LF reporter was readily expressed in E. coli cells, showing high levels of FRET in vivo in the absence of LF. The FRET signal, however, decreased 5-fold after inducing LF expression in the same cells. These results suggest that this cell-based LF FRET reporter may be used to screen genetically encoded libraries in vivo against LF.
Date: March 22, 2007
Creator: Kimura, R H; Steenblock, E R & Camarero, J A
System: The UNT Digital Library
Dose, exposure time, and resolution in Serial X-ray Crystallography (open access)

Using detailed simulation and analytical models, the exposure time is estimated for serial crystallography, in which hydrated laser-aligned proteins are sprayed across a continuous synchrotron beam. The resolution of X-ray diffraction microscopy is limited by the maximum dose that can be delivered prior to sample damage. In the proposed serial crystallography method, the damage problem is addressed by distributing the total dose over many identical hydrated macromolecules running continuously in a single-file train across a continuous X-ray beam, so that resolution is limited only by the available fluxes of molecules and X-rays. Orientation of the diffracting molecules is achieved by laser alignment. We evaluate the incident X-ray fluence (energy/area) required to obtain a given resolution from (1) an analytical model, giving the count rate at the maximum scattering angle for a model protein, (2) explicit simulation of diffraction patterns for a GroEL-GroES protein complex, and (3) the frequency cutoff of the transfer function following iterative solution of the phase problem and reconstruction of a density map in the projection approximation. These calculations include counting shot noise and multiple starts of the phasing algorithm. The results indicate the number of proteins needed within the beam at any instant for a given …
Date: March 22, 2007
Creator: Starodub, D.; Rez, P.; Hembree, G.; Howells, M.; Shapiro, D.; Chapman, H. N. et al.
System: The UNT Digital Library
FULL-SCALE TREATMENT WETLANDS FOR METAL REMOVAL FROM INDUSTRIAL WASTEWATER (open access)

The A-01 NPDES outfall at the Savannah River Site receives process wastewater discharges and stormwater runoff from the Savannah River National Laboratory. Routine monitoring indicated that copper concentrations were regularly higher than the discharge permit limit, and the water routinely failed toxicity tests. These conditions necessitated treatment of nearly one million gallons of water per day, plus storm runoff. Washington Savannah River Company personnel explored options to bring process and runoff waters into compliance with the permit conditions, including source reduction, engineering solutions, and biological solutions. A conceptual design for a constructed wetland treatment system (WTS) was developed, and the full-scale system was constructed and began operation in 2000. The overall objective of our research is to better understand the mechanisms of operation of the A-01 WTS in order to provide better input to the design of future systems. The system is a vegetated surface-flow wetland with a hydraulic retention time of approximately 48 hours. Copper, mercury, and lead removal efficiencies are very high, all in excess of 80% removal from water passing through the wetland system. Zinc removal is 60%, and nickel is generally unaffected. Dissolved organic carbon in the water column is increased by the system and reduces toxicity of …
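Removal efficiencies like those quoted (>80% for copper, mercury, and lead; 60% for zinc) are computed from inlet and outlet concentrations; a minimal sketch with hypothetical concentrations:

```python
def removal_efficiency(c_in, c_out):
    """Percent of a constituent removed between wetland inlet and outlet."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical concentrations (ug/L) chosen to match the quoted efficiencies.
print(removal_efficiency(50.0, 8.0))   # copper-like: prints 84.0
print(removal_efficiency(10.0, 4.0))   # zinc-like: prints 60.0
```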
Date: March 22, 2007
Creator: Nelson, E. & Gladden, J.
System: The UNT Digital Library
THE USE OF DIGITAL RADIOGRAPHY IN THE EVALUATION OF RADIOACTIVE MATERIALS PACKAGING PERFORMANCE TESTING (open access)

New designs of radioactive material shipping packages are required to be evaluated in accordance with 10 CFR Part 71, "Packaging and Transportation of Radioactive Material". This paper will discuss the use of digital radiography to evaluate the effects of the tests required by 10 CFR 71.71, Normal Conditions of Transport (NCT), and 10 CFR 71.73, Hypothetical Accident Conditions (HAC). One acceptable means of evaluating packaging performance is to subject packagings to the series of NCT and HAC tests. The evaluation includes a determination of the effect of the conditions and tests on the packaging. That determination has required that packagings be cut and sectioned to learn the actual effects on internal components. Digital radiography permits the examination of internal packaging components without sectioning a package. This allows a single package to be subjected to a series of tests. After each test, the package is digitally radiographed and the effects of particular tests evaluated. Radiography reduces the number of packages required for testing and also reduces the labor and materials required to section and evaluate numerous packages. This paper will include a description of the digital radiography equipment used in the testing and evaluation of the 9977 and 9978 packages at SRNL. …
Date: March 22, 2007
Creator: May, C.; Gelder, L. & Howard, B.
System: The UNT Digital Library
A comparison of drive mechanisms for precision motion controlled stages (open access)

This abstract presents a comparison of two drive mechanisms, a Rohlix® drive and a polymer nut drive, for precision motion-controlled stages. A single-axis long-range stage with a 50 mm traverse, combined with a short-range stage with a 16 µm traverse at an operational bandwidth of 2.2 kHz, was developed to evaluate the performance of the drives. The polymer nut and Rohlix® drives showed 4 nm RMS and 7 nm RMS positioning capabilities, respectively, over traverses of 5 mm at a maximum velocity of 0.15 mm·s^-1, with the short-range stage operating at a 2.2 kHz bandwidth. Further results will be presented in the subsequent sections.
Date: March 22, 2006
Creator: Buice, E. S.; Yang, H.; Otten, D.; Smith, S. T.; Hocken, R. J.; Trumper, D. L. et al.
System: The UNT Digital Library
Considerations of the Role of the Cathodic Region in Localized Corrosion (open access)

None
Date: March 22, 2006
Creator: Argarwal, A.; Landau, U.; Payer, J.H.; Kelly, R.G.; Cui, F. & Presuel-Moreno, F.J.
System: The UNT Digital Library