A Chemical Kinetic Modeling Study of the Effects of Oxygenated Hydrocarbons on Soot Emissions from Diesel Engines (open access)

A Chemical Kinetic Modeling Study of the Effects of Oxygenated Hydrocarbons on Soot Emissions from Diesel Engines

A detailed chemical kinetic modeling approach is used to examine the phenomenon of suppression of sooting in diesel engines by addition of oxygenated hydrocarbon species to the fuel. This suppression, which has been observed experimentally for a few years, is explained kinetically as a reduction in concentrations of soot precursors present in the hot products of a fuel-rich diesel ignition zone when oxygenates are included. Oxygenates decrease the overall equivalence ratio of the igniting mixture, producing higher ignition temperatures and more radical species to consume more soot precursor species, leading to lower soot production. The kinetic model is also used to show how different oxygenates, ester structures in particular, can have different soot-suppression efficiencies due to differences in molecular structure of the oxygenated species.
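
For context, the equivalence ratio invoked above is the standard combustion quantity defined below (a textbook definition, not taken from the paper itself); because an oxygenated fuel carries oxygen with it into the fuel-rich ignition zone, blending it in lowers the effective equivalence ratio of the igniting mixture.

$$\phi = \frac{(m_{\mathrm{fuel}}/m_{\mathrm{oxidizer}})}{(m_{\mathrm{fuel}}/m_{\mathrm{oxidizer}})_{\mathrm{stoich}}}, \qquad \phi > 1\ \text{(fuel rich)}, \quad \phi = 1\ \text{(stoichiometric)}, \quad \phi < 1\ \text{(fuel lean)}$$
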
Date: November 14, 2005
Creator: Westbrook, C K; Pitz, W J & Curran, H J
Object Type: Article
System: The UNT Digital Library
Is Climate Change Predictable? Really? (open access)

Is Climate Change Predictable? Really?

This project is the first application of a completely different approach to climate modeling, in which new prognostic equations are used to directly compute the evolution of two-point correlations. This project addresses three questions that are critical for the credibility of the science base for climate prediction: (1) What is the variability spectrum at equilibrium? (2) What is the rate of relaxation when subjected to external perturbations? (3) Can variations due to natural processes be distinguished from those due to transient external forces? The technical approach starts with the evolution equation for the probability distribution function and arrives at a prognostic equation for ensemble-mean two-point correlations, bypassing the detailed weather calculation. This work will expand our basic understanding of the theoretical limits of climate prediction and stimulate new experiments to perform with conventional climate models. It will furnish statistical estimates that are inaccessible with conventional climate simulations and likely will raise important new questions about the very nature of climate change and about how (and whether) climate change can be predicted. Solid progress on such issues is vital to the credibility of the science base for climate change research and will provide policymakers evaluating tradeoffs among energy technology options and …
Date: November 14, 2005
Creator: Dannevik, W P & Rotman, D A
Object Type: Report
System: The UNT Digital Library
Macroscopic Subdivision of Silica Aerogel Collectors for Sample Return Missions (open access)

Macroscopic Subdivision of Silica Aerogel Collectors for Sample Return Missions

Silica aerogel collector tiles have been employed for the collection of particles in low Earth orbit and, more recently, for the capture of cometary particles by NASA's Stardust mission. Reliable, reproducible methods for cutting these and future collector tiles from sample return missions are necessary to maximize the science output from the extremely valuable embedded particles. We present a means of macroscopic subdivision of collector tiles by generating large-scale cuts over several centimeters in silica aerogel with almost no material loss. The cut surfaces are smooth and optically clear allowing visual location of particles for analysis and extraction. This capability is complementary to the smaller-scale cutting capabilities previously described [Westphal (2004), Ishii (2005a, 2005b)] for removing individual impacts and particulate debris in tiny aerogel extractions. Macroscopic cuts enable division and storage or distribution of portions of aerogel tiles for immediate analysis of samples by certain techniques in situ or further extraction of samples suited for other methods of analysis.
Date: September 14, 2005
Creator: Ishii, H A & Bradley, J P
Object Type: Article
System: The UNT Digital Library
The Potential of the Cell Processor for Scientific Computing (open access)

The Potential of the Cell Processor for Scientific Computing

The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of the forthcoming STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. We are the first to present quantitative Cell performance data on scientific kernels and show direct comparisons against leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1) architectures. Since neither Cell hardware nor cycle-accurate simulators are currently publicly available, we develop both analytical models and simulators to predict kernel performance. Our work also explores the complexity of mapping several important scientific algorithms onto the Cell's unique architecture. Additionally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.
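
As a rough illustration of the kind of analytical performance modeling mentioned above (a generic compute/bandwidth bound, not necessarily the authors' specific model), the execution time of a kernel on any of these architectures is bounded below by whichever of its compute or memory-traffic demands dominates:

$$t_{\mathrm{kernel}} \;\gtrsim\; \max\!\left(\frac{N_{\mathrm{flop}}}{\text{peak floating-point rate}},\; \frac{N_{\mathrm{bytes}}}{\text{sustained memory bandwidth}}\right)$$
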
Date: October 14, 2005
Creator: Williams, Samuel; Shalf, John; Oliker, Leonid; Husbands, Parry; Kamil, Shoaib & Yelick, Katherine
Object Type: Article
System: The UNT Digital Library
Gene Transfer & Hybridization Studies in Hyperthermophilic Species (open access)

Gene Transfer & Hybridization Studies in Hyperthermophilic Species

The importance of lateral gene transfer (LGT) in the evolution of microbial species has become increasingly evident with each completed microbial genome sequence. Most significantly, the genome of Thermotoga maritima MSB8, a hyperthermophilic bacterium isolated by Karl Stetter and co-workers from Vulcano, Italy, in 1986 and sequenced at The Institute for Genomic Research (TIGR) in Rockville, Maryland, in 1999, revealed extensive LGT between this bacterium and members of the archaeal domain (in particular Archaeoglobus fulgidus and Pyrococcus furiosus). Based on whole genome comparisons, it was estimated that 24% of the genetic information in this organism was acquired by genetic exchange with archaeal species. Independent analyses, including periodicity analysis of the T. maritima genomic DNA sequence, phylogenetic reconstruction based on genes that appear archaeal-like, and codon and amino acid usage, have provided additional evidence for LGT between T. maritima and the archaea. More recently, DiRuggiero and co-workers have identified a very recent LGT event between two genera of hyperthermophilic archaea, where a nearly identical DNA fragment of 16 kb in length, flanked by insertion sequence (IS) elements, exists. Undoubtedly, additional examples of LGT will be identified as more microbial genomes are completed. For the present moment …
Date: October 14, 2005
Creator: Nelson, Karen E.
Object Type: Report
System: The UNT Digital Library
The Crystal Structures of EAP Domains from Staphylococcus aureus Reveal an Unexpected Homology to Bacterial Superantigens (open access)

The Crystal Structures of EAP Domains from Staphylococcus aureus Reveal an Unexpected Homology to Bacterial Superantigens

The Eap (extracellular adherence protein) of Staphylococcus aureus functions as a secreted virulence factor by mediating interactions between the bacterial cell surface and several extracellular host proteins. Eap proteins from different Staphylococcal strains consist of four to six tandem repeats of a structurally uncharacterized domain (EAP domain). We have determined the three-dimensional structures of three different EAP domains to 1.8, 2.2, and 1.35 {angstrom} resolution, respectively. These structures reveal a core fold that is comprised of an {alpha}-helix lying diagonally across a five-stranded, mixed {beta}-sheet. Comparison of EAP domains with known structures reveals an unexpected homology with the C-terminal domain of bacterial superantigens. Examination of the structure of the superantigen SEC2 bound to the {beta}-chain of a T-cell receptor suggests a possible ligand-binding site within the EAP domain (Fields, B. A., Malchiodi, E. L., Li, H., Ysern, X., Stauffacher, C. V., Schlievert, P. M., Karjalainen, K., and Mariuzza, R. (1996) Nature 384, 188-192). These results provide the first structural characterization of EAP domains, relate EAP domains to a large class of bacterial toxins, and will guide the design of future experiments to analyze EAP domain structure/function relationships.
Date: October 14, 2005
Creator: Geisbrecht, B V; Hamaoka, B Y; Perman, B; Zemla, A & Leahy, D J
Object Type: Article
System: The UNT Digital Library
Comparative genome analysis of Bacillus cereus group genomes with Bacillus subtilis (open access)

Comparative genome analysis of Bacillus cereus group genomes with Bacillus subtilis

Genome features of the Bacillus cereus group genomes (representative strains of Bacillus cereus, Bacillus anthracis and Bacillus thuringiensis subsp. israelensis) were analyzed and compared with the Bacillus subtilis genome. A core set of 1,381 protein families among the four Bacillus genomes, with an additional set of 933 families common to the B. cereus group, was identified. Differences in signal transduction pathways, membrane transporters, cell surface structures, cell wall, and S-layer proteins, suggesting differences in phenotype, were identified. The B. cereus group has signal transduction systems including a tyrosine kinase related to two-component system histidine kinases from B. subtilis. A model for regulation of the stress-responsive sigma factor sigmaB in the B. cereus group, different from the well-studied regulation in B. subtilis, has been proposed. Despite a high degree of chromosomal synteny among these genomes, significant differences in cell wall and spore coat proteins that contribute to survival and adaptation in specific hosts have been identified.
Date: September 14, 2005
Creator: Anderson, Iain; Sorokin, Alexei; Kapatral, Vinayak; Reznik, Gary; Bhattacharya, Anamitra; Mikhailova, Natalia et al.
Object Type: Article
System: The UNT Digital Library
SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U) (open access)

SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) …
Date: December 14, 2005
Creator: Lee, R. C. & McHood, M. D.
Object Type: Report
System: The UNT Digital Library
Development of a large aperture Nb3Sn racetrack quadrupole magnet (open access)

Development of a large aperture Nb3Sn racetrack quadrupole magnet

The U.S. LHC Accelerator Research Program (LARP), a collaboration between BNL, FNAL, LBNL, and SLAC, has among its major objectives the development of advanced magnet technology for an LHC luminosity upgrade. The LBNL Superconducting Magnet Group supports this program with a broad effort involving design studies, Nb{sub 3}Sn conductor development, mechanical models, and basic prototypes. This paper describes the development of a large aperture Nb{sub 3}Sn racetrack quadrupole magnet using four racetrack coils from the LBNL Subscale Magnet (SM) Program. The magnet provides a gradient of 95 T/m in a 110 mm bore, with a peak field in the conductor of 11.2 T. The coils are prestressed by a mechanical structure based on a pre-tensioned aluminum shell, and axially supported with aluminum rods. The mechanical behavior has been monitored with strain gauges and the magnetic field has been measured. Results of the test are reported and analyzed.
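
As a consistency check on the quoted numbers (using the standard ideal-quadrupole relation, not a figure from the paper), the field magnitude inside a quadrupole aperture grows linearly with radius, so the quoted gradient corresponds to roughly 5.2 T at the edge of the 110 mm bore; the quoted 11.2 T peak is reached in the conductor itself, outside the aperture.

$$|B|(r) = G\,r \quad\Rightarrow\quad |B|(55\ \mathrm{mm}) \approx 95\ \mathrm{T/m} \times 0.055\ \mathrm{m} \approx 5.2\ \mathrm{T}$$
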
Date: April 14, 2005
Creator: Ferracin, Paolo; Bartlett, Scott E.; Caspi, Shlomo; Dietderich,Daniel R.; Gourlay, Steven A.; Hannaford, Charles R. et al.
Object Type: Article
System: The UNT Digital Library

Experimental Approaches to Predict the Behavior of Liquid Films

None
Date: October 14, 2005
Creator: Palmer, D. A.
Object Type: Presentation
System: The UNT Digital Library
Optimization of Carbon Coatings on LiFePO4 (open access)

Optimization of Carbon Coatings on LiFePO4

The electrochemical performance of LiFePO{sub 4} in lithium cells is strongly dependent on the structure (disordered/graphene or D/G ratio) of the in situ carbon produced during synthesis from carbon-containing precursors. Addition of pyromellitic acid (PA) prior to final calcination results in lower D/G ratios, yielding a higher-rate material. Further, improvements in electrochemical performance are realized when graphitization catalysts such as ferrocene are also added during LiFePO{sub 4} preparation, although overall carbon content is still less than 2 wt.%.
Date: July 14, 2005
Creator: Doeff, Marca M.; Wilcox, James D.; Kostecki, Robert & Lau, Grace
Object Type: Article
System: The UNT Digital Library
Dynamic Data-Driven Event Reconstruction for Atmospheric Releases (open access)

Dynamic Data-Driven Event Reconstruction for Atmospheric Releases

This is a collaborative LDRD Exploratory Research project involving four directorates: Energy & Environment, Engineering, NAI, and Computation. The project seeks to answer the following critical questions regarding atmospheric releases: How much material was released? When? Where? And what are the potential consequences? Inaccurate estimation of the source term can lead to gross errors, time delays during a crisis, and even fatalities. We are developing a capability that seamlessly integrates observational data streams with predictive models in order to provide the best possible estimates of unknown source term parameters, as well as optimal and timely situation analyses consistent with both models and data. Our approach utilizes Bayesian inference and stochastic sampling methods (Markov Chain and Sequential Monte Carlo) to reformulate the inverse problem into a solution based on efficient sampling of an ensemble of predictive simulations, guided by statistical comparisons with data. We are developing a flexible and adaptable data-driven event-reconstruction capability for atmospheric releases that provides (1) quantitative probabilistic estimates of the principal source-term parameters (e.g., the time-varying release rate and location); (2) predictions of increasing fidelity as an event progresses and additional data become available; and (3) analysis tools for sensor network design and uncertainty studies. Our computational framework incorporates …
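
To make the stochastic-sampling idea concrete, the sketch below estimates a release rate and source location from synthetic sensor readings with a Metropolis (Markov Chain Monte Carlo) sampler. The exponential-decay forward model, sensor layout, and noise level are purely illustrative stand-ins for the project's atmospheric dispersion simulations, not its actual code.

# Minimal sketch: Bayesian source-term estimation by Metropolis MCMC,
# using a toy 1-D exponential-decay "dispersion" surrogate in place of
# a real atmospheric transport model.
import numpy as np

rng = np.random.default_rng(0)

def forward(q, x0, sensors, L=2.0):
    """Toy forward model: concentration decays exponentially with distance."""
    return q * np.exp(-np.abs(sensors - x0) / L)

# Synthetic "observations" from a hypothetical true source (q=5.0 at x0=3.0).
sensors = np.linspace(0.0, 10.0, 8)
obs = forward(5.0, 3.0, sensors) + rng.normal(0.0, 0.1, sensors.size)

def log_posterior(q, x0, sigma=0.1):
    if q <= 0.0 or not (0.0 <= x0 <= 10.0):  # flat priors with simple bounds
        return -np.inf
    resid = obs - forward(q, x0, sensors)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampler over (release rate q, location x0).
q, x0 = 1.0, 5.0
logp = log_posterior(q, x0)
samples = []
for _ in range(20000):
    q_new, x0_new = q + rng.normal(0, 0.2), x0 + rng.normal(0, 0.2)
    logp_new = log_posterior(q_new, x0_new)
    if np.log(rng.uniform()) < logp_new - logp:  # accept/reject step
        q, x0, logp = q_new, x0_new, logp_new
    samples.append((q, x0))

burn = np.array(samples[5000:])  # discard burn-in
print("posterior mean release rate:", burn[:, 0].mean())
print("posterior mean source location:", burn[:, 1].mean())
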
Date: March 14, 2005
Creator: Mirin, A; Serban, R & Kosovic, B
Object Type: Report
System: The UNT Digital Library
Evaluation of Nevada Test Site Ground Motion and Rock Property Data to Bound Ground Motions at the Yucca Mountain Repository (open access)

Evaluation of Nevada Test Site Ground Motion and Rock Property Data to Bound Ground Motions at the Yucca Mountain Repository

Yucca Mountain licensing will require estimation of ground motions from probabilistic seismic hazard analyses (PSHA) with annual probabilities of exceedance on the order of 10{sup -6} to 10{sup -7} per year or smaller, which correspond to much longer earthquake return periods than most previous PSHA studies. These long return periods for the Yucca Mountain PSHA result in estimates of ground motion that are extremely high ({approx} 10 g) and that are believed to be physically unrealizable. However, there is at present no generally accepted method to bound ground motions, either by showing that the physical properties of materials cannot maintain such extreme motions, or by showing that the energy release by the source for such large motions is physically impossible. The purpose of this feasibility study is to examine recorded ground motion and rock property data from nuclear explosions to determine their usefulness for studying the ground motion from extreme earthquakes. The premise is that nuclear explosions are an extreme energy density source, and that the recorded ground motion will provide useful information about the limits of ground motion from extreme earthquakes. The data were categorized by the source and rock properties, and evaluated as to what extent non-linearity in the material has …
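
For orientation (a standard hazard-analysis identity, not specific to this report), an annual probability of exceedance corresponds to a mean return period equal to its reciprocal:

$$T_R = \frac{1}{P_{\mathrm{annual}}} \quad\Rightarrow\quad P_{\mathrm{annual}} = 10^{-6}\ \mathrm{yr^{-1}} \Rightarrow T_R = 10^{6}\ \mathrm{yr}, \qquad 10^{-7}\ \mathrm{yr^{-1}} \Rightarrow 10^{7}\ \mathrm{yr}$$
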
Date: February 14, 2005
Creator: Hutchings, L. J.; Foxall, W.; Rambo, J. & Wagoner, J. L.
Object Type: Report
System: The UNT Digital Library
TEM verification of the <111>-type 4-arm multi-junction in [001]-Mo single crystals (open access)

TEM verification of the <111>-type 4-arm multi-junction in [001]-Mo single crystals

This report investigates and verifies, using transmission electron microscopy (TEM), the formation of the <111>-type 4-arm multi-junction by the dislocation reaction 1/2[111] [b1] + 1/2[{bar 1}1{bar 1}] [b2] + 1/2[{bar 1}{bar 1}1] [b3] = 1/2[{bar 1}11] [b4], which was recently discovered through computer simulations conducted by Vasily Bulatov and his colleagues.
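
As a quick check (elementary Burgers-vector bookkeeping, not taken from the report), the reaction conserves the total Burgers vector component by component:

$$\tfrac{1}{2}[111] + \tfrac{1}{2}[\bar{1}1\bar{1}] + \tfrac{1}{2}[\bar{1}\bar{1}1] = \tfrac{1}{2}\,[(1-1-1)\;(1+1-1)\;(1-1+1)] = \tfrac{1}{2}[\bar{1}11]$$
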
Date: March 14, 2005
Creator: Hsiung, L
Object Type: Report
System: The UNT Digital Library
Key interactions in antibody recognition of synthetic sweeteners: Crystal structures of NC6.8 Fab co-crystallized with high potency sweetener compound SC45647 and with TES. (open access)

Key interactions in antibody recognition of synthetic sweeteners: Crystal structures of NC6.8 Fab co-crystallized with high potency sweetener compound SC45647 and with TES.

None
Date: March 14, 2005
Creator: Gokulan, K.; Khare, S.; Ronning, D.; Linthicum, S. D.; Rupp, B. & Sacchettini, J. C.
Object Type: Article
System: The UNT Digital Library
Automatic Generation of Data Types for Classification of Deep Web Sources (open access)

Automatic Generation of Data Types for Classification of Deep Web Sources

A Service Class Description (SCD) is an effective meta-data based approach for discovering Deep Web sources whose data exhibit some regular patterns. However, it is tedious and error prone to create an SCD description manually. Moreover, a manually created SCD is not adaptive to the frequent changes of Web sources. It requires its creator to identify all the possible input and output types of a service a priori. In many domains, it is impossible to exhaustively list all the possible input and output data types of a source in advance. In this paper, we describe machine learning approaches for automatic generation of the data types of an SCD. We propose two different approaches for learning data types of a class of Web sources. The Brute-Force Learner is able to generate data types that can achieve high recall, but with low precision. The Clustering-based Learner generates data types that have a high precision rate, but with a lower recall rate. We demonstrate the feasibility of these two learning-based solutions for automatic generation of data types for citation Web sources and present a quantitative evaluation of these two solutions.
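
For reference, the recall/precision trade-off described above is measured in the usual way (standard definitions, not specific to this paper): with TP, FP, and FN denoting true positives, false positives, and false negatives among the generated data types,

$$\mathrm{precision} = \frac{TP}{TP+FP}, \qquad \mathrm{recall} = \frac{TP}{TP+FN}$$

so the Brute-Force Learner keeps false negatives low at the cost of many false positives, while the Clustering-based Learner does the opposite.
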
Date: February 14, 2005
Creator: Ngu, A. H.; Buttler, D. J. & Critchlow, T. J.
Object Type: Article
System: The UNT Digital Library
The Importance of Geometric Nonlinearity in Finite Element Studies of Yielding in Trabecular Bone (open access)

The Importance of Geometric Nonlinearity in Finite Element Studies of Yielding in Trabecular Bone

None
Date: February 14, 2005
Creator: Kinney, J H & Stolken, J S
Object Type: Article
System: The UNT Digital Library
ViSUS: Visualization Streams for Ultimate Scalability (open access)

ViSUS: Visualization Streams for Ultimate Scalability

In this project we developed a suite of progressive visualization algorithms and a data-streaming infrastructure that enable interactive exploration of scientific datasets of unprecedented size. The methodology aims to globally optimize the data flow in a pipeline of processing modules. Each module reads a multi-resolution representation of the input while producing a multi-resolution representation of the output. The use of multi-resolution representations provides the necessary flexibility to trade speed for accuracy in the visualization process. Maximum coherency and minimum delay in the data flow are achieved by extensive use of progressive algorithms that continuously map local geometric updates of the input stream into immediate updates of the output stream. We implemented a prototype software infrastructure that demonstrated the flexibility and scalability of this approach by allowing large data visualization on single desktop computers, on PC clusters, and on heterogeneous computing resources distributed over a wide area network. When processing terabytes of scientific data, we have achieved an effective increase in visualization performance of several orders of magnitude in two major settings: (i) interactive visualization on desktop workstations of large datasets that cannot be stored locally; (ii) real-time monitoring of a large scientific simulation with negligible impact on the computing resources available. …
Date: February 14, 2005
Creator: Pascucci, V
Object Type: Report
System: The UNT Digital Library
Nonlinear decline-rate dependence and intrinsic variation of type Ia supernova luminosities (open access)

Nonlinear decline-rate dependence and intrinsic variation of type Ia supernova luminosities

Published B and V fluxes from nearby Type Ia supernovae are fitted to light-curve templates with 4-6 adjustable parameters. Separately, B magnitudes from the same sample are fitted to a linear dependence on B-V color within a post-maximum time window prescribed by the CMAGIC method. These fits yield two independent SN magnitude estimates B{sub max} and B{sub BV}. Their difference varies systematically with decline rate {Delta}m{sub 15} in a form that is compatible with a bilinear but not a linear dependence; a nonlinear form likely describes the decline-rate dependence of B{sub max} itself. A Hubble fit to the average of B{sub max} and B{sub BV} requires a systematic correction for observed B-V color that can be described by a linear coefficient R = 2.59 {+-} 0.24, well below the coefficient R{sub B} {approx} 4.1 commonly used to characterize the effects of Milky Way dust. At 99.9% confidence the data reject a simple model in which no color correction is required for SNe that are clustered at the blue end of their observed color distribution. After systematic corrections are performed, B{sub max} and B{sub BV} exhibit mutual rms intrinsic variation equal to 0.074 {+-} 0.019 mag, of which at least an …
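
One schematic way to read the color correction described above (a plausible linear form for illustration, not necessarily the authors' exact parameterization) is as a term subtracted from the averaged magnitude estimate before the Hubble fit:

$$B_{\mathrm{corr}} = \tfrac{1}{2}\left(B_{\max} + B_{BV}\right) - R\,(B-V), \qquad R = 2.59 \pm 0.24$$
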
Date: December 14, 2005
Creator: Wang, Lifan; Strovink, Mark; Conley, Alexander; Goldhaber,Gerson; Kowalski, Marek; Perlmutter, Saul et al.
Object Type: Article
System: The UNT Digital Library
Optimizing Bandwidth Limited Problems Using One-Sided Communication and Overlap (open access)

Optimizing Bandwidth Limited Problems Using One-Sided Communication and Overlap

Partitioned Global Address Space languages like Unified Parallel C (UPC) are typically valued for their expressiveness, especially for computations with fine-grained random accesses. In this paper we show that the one-sided communication model used in these languages also has a significant performance advantage for bandwidth-limited applications. We demonstrate this benefit through communication microbenchmarks and a case study that compares UPC and MPI implementations of the NAS Fourier Transform (FT) benchmark. Our optimizations rely on aggressively overlapping communication with computation by spreading communication events throughout the course of the local computation. This alleviates the potential communication bottleneck that occurs when the communication is packed into a single phase (e.g., the large all-to-all in a multidimensional FFT). Even though the new algorithms require more messages for the same total volume of data, the resulting overlap leads to speedups of over 1.75x and 1.9x for the two-sided and one-sided implementations, respectively, when compared to the default NAS Fortran/MPI release. Our best one-sided implementations show an average improvement of 15 percent over our best two-sided implementations. We attribute this difference to the lower software overhead of one-sided communication, which is partly fundamental to the semantic difference between one-sided and two-sided communication. Our UPC results use …
Date: October 14, 2005
Creator: Bell, Christian; Bonachea, Dan; Nishtala, Rajesh & Yelick, Katherine
Object Type: Article
System: The UNT Digital Library
Strontium and Actinide Separations From High Level Nuclear Waste Solutions Using Monosodium Titanate 1. Simulant Testing (open access)

Strontium and Actinide Separations From High Level Nuclear Waste Solutions Using Monosodium Titanate 1. Simulant Testing

High-level nuclear waste produced from fuel reprocessing operations at the Savannah River Site (SRS) requires pretreatment to remove {sup 137}Cs, {sup 90}Sr and alpha-emitting radionuclides (i.e., actinides) prior to disposal. Separation processes planned at SRS include caustic side solvent extraction, for {sup 137}Cs removal, and ion exchange/sorption of {sup 90}Sr and alpha-emitting radionuclides with an inorganic material, monosodium titanate (MST). The predominant alpha-emitting radionuclides in the highly alkaline waste solutions include plutonium isotopes {sup 238}Pu, {sup 239}Pu and {sup 240}Pu. This paper provides a summary of data acquired to measure the performance of MST to remove strontium and actinides from simulated waste solutions. These tests evaluated the influence of ionic strength, temperature, solution composition and the oxidation state of plutonium.
Date: April 14, 2005
Creator: Hobbs, D. T.; Barnes, M. J.; Pulmano, R. L.; Marshall, K. M.; Edwards, T. B.; Bronikowski, M. G. et al.
Object Type: Report
System: The UNT Digital Library
Detonation Reaction Zones in Condensed Explosives (open access)

Detonation Reaction Zones in Condensed Explosives

Experimental measurements using nanosecond time resolved embedded gauges and laser interferometric techniques, combined with Non-Equilibrium Zeldovich-von Neumann-Doring (NEZND) theory and Ignition and Growth reactive flow hydrodynamic modeling, have revealed the average pressure/particle velocity states attained in reaction zones of self-sustaining detonation waves in several solid and liquid explosives. The time durations of these reaction zone processes are discussed for explosives based on pentaerythritol tetranitrate (PETN), nitromethane, octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), triaminotrinitrobenzene (TATB), and trinitrotoluene (TNT).
Date: July 14, 2005
Creator: Tarver, Craig M.
Object Type: Article
System: The UNT Digital Library
A Heavy Flavor Tracker for STAR (open access)

A Heavy Flavor Tracker for STAR

We propose to construct a Heavy Flavor Tracker (HFT) for the STAR experiment at RHIC. The HFT will bring new physics capabilities to STAR and it will significantly enhance the physics capabilities of the STAR detector at central rapidities. The HFT will ensure that STAR will be able to take heavy flavor data at all luminosities attainable throughout the proposed RHIC II era.
Date: March 14, 2005
Creator: Xu, Z.; Chen, Y.; Kleinfelder, S.; Koohi, A.; Li, S.; Huang, H. et al.
Object Type: Report
System: The UNT Digital Library
Towards a Unified Approach to Information Integration - A review paper on data/information fusion (open access)

Towards a Unified Approach to Information Integration - A review paper on data/information fusion

Fusion of information or data from different sources is ubiquitous in many applications, from epidemiology, medicine, biology, politics, and intelligence to military applications. Data fusion involves integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-ray, CT), clinical examination, and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical and biological sensors, acoustic sensors, optical warning systems, and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian statistics to evidence-based expert systems. The implementation of the data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and …
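
As a minimal illustration of the Bayesian fusion methodology surveyed here (the textbook form, not a result of this review), two data sources D1 and D2 that are conditionally independent given a hypothesis H are combined by multiplying their likelihoods into a single posterior:

$$P(H \mid D_1, D_2) \;\propto\; P(D_1 \mid H)\,P(D_2 \mid H)\,P(H)$$
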
Date: October 14, 2005
Creator: Whitney, Paul D.; Posse, Christian & Lei, Xingye C.
Object Type: Report
System: The UNT Digital Library