Igniting the Light Elements: The Los Alamos Thermonuclear Weapon Project, 1942-1952 (open access)

The American system of nuclear weapons research and development was conceived and developed not as a result of technological determinism, but by a number of individual architects who promoted the growth of this large technologically based complex. While some of the technological artifacts of this system, such as the fission weapons used in World War II, have been the subject of many historical studies, their technical successors, fusion (or hydrogen) devices, remain representative of the largely unstudied, highly secret realms of nuclear weapons science and engineering. In the postwar period a small number of Los Alamos Scientific Laboratory's staff and affiliates were responsible for theoretical work on fusion weapons, yet the program was subject to both the provisions and constraints of the US Atomic Energy Commission, of which Los Alamos was a part. The Commission leadership's struggle to establish a mission for its network of laboratories, not least to keep them operating, affected Los Alamos leaders' decisions on the course of weapons design and development projects. Adapting Thomas P. Hughes's "large technological systems" thesis, I focus on the technical, social, political, and human problems that nuclear weapons scientists faced while pursuing the thermonuclear project, demonstrating why the early American thermonuclear bomb …
Date: July 1, 1999
Creator: Fitzpatrick, Anne C.
System: The UNT Digital Library
Studies of the terrestrial O₂ and carbon cycles in sand dune gases and in Biosphere 2 (open access)

Molecular oxygen in the atmosphere is coupled tightly to the terrestrial carbon cycle by the processes of photosynthesis, respiration, and burning. This dissertation examines different aspects of this coupling in four chapters. Chapter 1 explores the feasibility of using air from sand dunes to reconstruct atmospheric O₂ composition centuries ago. Such a record would reveal changes in the mass of the terrestrial biosphere, after correction for known fossil fuel combustion, and constrain the fate of anthropogenic CO₂.
Date: December 31, 1995
Creator: Severinghaus, J.P.
System: The UNT Digital Library
Development of new VOC exposure metrics and their relationship to "Sick Building Syndrome" symptoms (open access)

Volatile organic compounds (VOCs) are suspected to contribute significantly to "Sick Building Syndrome" (SBS), a complex of subchronic symptoms that occurs during occupancy of the building in question and generally decreases away from it. A new approach takes into account individual VOC potencies, as well as the highly correlated nature of the complex VOC mixtures found indoors. The new VOC metrics are statistically significant predictors of symptom outcomes from the California Healthy Buildings Study data. Multivariate logistic regression analyses were used to test the hypothesis that a summary measure of the VOC mixture, other risk factors, and covariates for each worker would lead to better prediction of symptom outcome. VOC metrics based on animal irritancy measures and principal component analysis had the most influence in the prediction of eye, dermal, and nasal symptoms. After adjustment, a water-based paints and solvents source was found to be associated with dermal and eye irritation. The more typical VOC exposure metrics used in prior analyses, total VOC (TVOC) and the sum of individually identified VOCs (ΣVOCᵢ), were not useful for symptom prediction in the adjusted model. Also not useful were three other VOC metrics that took into account potency, but did not adjust for …
Date: August 1, 1995
Creator: Ten Brinke, J.
System: The UNT Digital Library
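As a hedged illustration of how summary metrics like those above can be constructed, the sketch below computes a TVOC sum, a potency-weighted mixture metric, and a logistic regression of a binary symptom outcome. All concentrations, irritancy weights, and outcomes are invented placeholders, not values from the California Healthy Buildings Study.

```python
# Minimal sketch of summary VOC exposure metrics and symptom regression.
# Concentrations, weights, and outcomes are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_workers, n_vocs = 200, 5
conc = rng.lognormal(mean=1.0, sigma=0.5, size=(n_workers, n_vocs))  # ppb

# Total VOC: unweighted sum of individually identified compounds.
tvoc = conc.sum(axis=1)

# Potency-weighted metric: each compound scaled by a hypothetical
# irritancy weight standing in for animal-irritancy measures.
irritancy_weight = np.array([1.0, 0.3, 2.5, 0.8, 1.7])
potency_metric = conc @ irritancy_weight

# Logistic regression of a binary symptom outcome on the two metrics.
symptom = rng.integers(0, 2, size=n_workers)  # placeholder outcomes
X = np.column_stack([tvoc, potency_metric])
model = LogisticRegression().fit(X, symptom)
print("coefficients:", model.coef_)
```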
Real time imaging of live cell ATP leaking or release events by chemiluminescence microscopy (open access)

The purpose of this research was to expand the applications of chemiluminescence microscopy in live bacterial/mammalian cell imaging and to improve the detection sensitivity for ATP leakage or release events. We first demonstrated that chemiluminescence (CL) imaging can be used to interrogate single bacterial cells. While a luminometer allows ATP detection from the lysate of as few as 10 bacterial cells, no previous cell CL detection had reached single-bacterium sensitivity. We approached this goal with a different strategy than before: instead of breaking the bacterial cell membrane and trying to capture the transiently diluted ATP with the firefly luciferase CL assay, we introduced the firefly luciferase enzyme into bacteria using modern genetic techniques and placed the CL reaction substrate D-luciferin outside the cells. When the cell membrane was damaged by various antibacterial agents, including antibiotics such as penicillins as well as bacteriophages, the D-luciferin molecules diffused into the cell and initiated the light-producing CL reaction. Because firefly luciferases are large protein molecules that are retained within the cells until total rupture, and the intracellular ATP concentration is high, at the millimolar level, the CL reaction of firefly luciferase, ATP, and D-luciferin can be kept for a relatively …
Date: December 18, 2008
Creator: Zhang, Yun
System: The UNT Digital Library
Water resources development in Santa Clara Valley, California: insights into the human-hydrologic relationship (open access)

Groundwater irrigation is critical to food production and, in turn, to humankind's relationship with its environment. The development of groundwater in Santa Clara Valley, California during the early twentieth century is instructive because (1) responses to unsustainable resource use were largely successful; (2) the proposals for the physical management of the water, although not entirely novel, incorporated new approaches which reveal an evolving relationship between humans and the hydrologic cycle; and (3) the valley serves as a natural laboratory where natural (groundwater basin, surface watershed) and human (county, water district) boundaries generally coincide. Here, I investigate how water resources development and management in Santa Clara Valley were influenced by, and reflective of, a broad understanding of water as a natural resource, including scientific and technological innovations, new management approaches, and changing perceptions of the hydrologic cycle. Market demands and technological advances engendered reliance on groundwater. This, coupled with a series of dry years and laissez-faire government policies, led to overdraft. Faith in centralized management and objective engineering offered a solution to concerns over resource depletion, and a group dominated by orchardists soon organized, fought for a water conservation district, and funded an investigation to halt the decline of well …
Date: June 1, 2000
Creator: Reynolds, Jesse L. & Narasimhan, T.N.
System: The UNT Digital Library
Lymph Transport of ²³⁹PuO₂ in Dogs (open access)

None
Date: May 1, 1973
Creator: Gomez, L. S.
System: The UNT Digital Library
Microfabrication of an Implantable Silicone Microelectrode Array for an Epiretinal Prosthesis (open access)

Millions of people suffering from diseases such as retinitis pigmentosa and macular degeneration are legally blind due to the loss of photoreceptor function. Fortunately, a large percentage of the neural cells connected to the photoreceptors remain viable, and electrical stimulation of these cells has been shown to result in visual perception. These findings have generated worldwide efforts to develop a retinal prosthesis device, with the hope of restoring vision. Advances in microfabrication, integrated circuits, and wireless technologies provide the means to reach this challenging goal. This dissertation describes the development of innovative silicone-based microfabrication techniques for producing an implantable microelectrode array. The microelectrode array is a component of an epiretinal prosthesis being developed by a multi-laboratory consortium. This array will serve as the interface between an electronic imaging system and the human eye, directly stimulating retinal neurons via thin film conducting traces. Because the array is intended as a long-term implant, vital biological and physical design requirements must be met. A retinal implant poses difficult engineering challenges due to the size of the intraocular cavity and the delicate retina. Not only does it have to be biocompatible in terms of cytotoxicity and degradation, but it also has to be structurally …
Date: June 10, 2003
Creator: Maghribi, M
System: The UNT Digital Library
Properties of Group Five and Group Seven transactinium elements (open access)

The detection and positive identification of the short-lived, low cross-section isotopes used in the chemical studies of the heaviest elements are usually accomplished by measuring their alpha decay; thus the nuclear properties of the heaviest elements must be examined simultaneously with their chemical properties. The isotopes ²²⁴Pa and ²⁶⁶,²⁶⁷Bh have been studied extensively as an integral part of the investigation of the heaviest members of groups five and seven of the periodic table. The half-life of ²²⁴Pa was determined to be 855 ± 19 ms by measuring its alpha decay using our rotating-wheel, solid-state detector system at the Lawrence Berkeley National Laboratory 88-Inch Cyclotron. Protactinium was produced by bombardment of a bismuth target. The new neutron-rich isotopes ²⁶⁷Bh and ²⁶⁶Bh were produced in bombardments of a ²⁴⁹Bk target and their decay was observed using the rotating-wheel system. The ²⁶⁶Bh that was produced decays with a half-life of approximately 1 s by emission of alpha particles with an average energy of 9.25 ± 0.03 MeV. ²⁶⁷Bh was observed to decay with a 17 s half-life by emission of alpha particles with an average energy of 8.83 ± 0.03 MeV. The chemical behavior …
Date: May 1, 2001
Creator: Wilk, Philip A.
System: The UNT Digital Library
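For illustration, here is a minimal sketch of how a half-life such as the 855 ± 19 ms value above can be extracted from timed alpha-decay counts by fitting an exponential. The count data are simulated, not the experiment's measurements.

```python
# Hedged sketch: half-life estimation from alpha-decay counting data.
import numpy as np
from scipy.optimize import curve_fit

true_half_life = 0.855  # seconds, the reported 224Pa value

def decays(t, n0, half_life):
    # Exponential decay law N(t) = N0 * exp(-ln2 * t / t_half).
    return n0 * np.exp(-np.log(2) * t / half_life)

t = np.linspace(0.0, 5.0, 50)                       # seconds after collection
counts = decays(t, 1000.0, true_half_life)
counts = np.random.default_rng(1).poisson(counts)   # counting statistics

(n0_fit, t_half_fit), cov = curve_fit(decays, t, counts, p0=(900.0, 1.0))
err = np.sqrt(np.diag(cov))[1]
print(f"half-life = {t_half_fit * 1e3:.0f} +/- {err * 1e3:.0f} ms")
```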
Energy Demands and Efficiency Strategies in Data Center Buildings (open access)

Information technology (IT) is becoming increasingly pervasive throughout society as more data is digitally processed, stored, and transferred. The infrastructure that supports IT activity is growing accordingly, and data center energy demands have increased by nearly a factor of four over the past decade. Data centers house IT equipment and require significantly more energy to operate per unit floor area than conventional buildings. The economic and environmental ramifications of continued data center growth motivate the need to explore energy-efficient methods to operate these buildings. A substantial portion of data center energy use is dedicated to removing the heat that is generated by the IT equipment. Using economizers to introduce large airflow rates of outside air during favorable weather could substantially reduce the energy consumption of data center cooling. Cooling buildings with economizers is an established energy saving measure, but in data centers this strategy is not widely used, partly owing to concerns that the large airflow rates would lead to increased indoor levels of airborne particles, which could damage IT equipment. The environmental conditions typical of data centers and the associated potential for equipment failure, however, are not well characterized. This barrier to economizer implementation illustrates the general relationship between energy use …
Date: September 1, 2009
Creator: Shehabi, Arman
System: The UNT Digital Library
A study of a tissue-equivalent gelatine-based tissue substitute (open access)

A study of several tissue substitutes for use in volumetric dosimeters was performed. The tissue substitutes studied included those from previous studies and from ICRU 44. The substitutes were evaluated for an overall match to Reference Man, which was used as the basis for this study. The evaluation was based on the electron stopping power, the mass attenuation coefficient, the electron density, and the specific gravity. The tissue substitute chosen also had to be capable of changing from a liquid into a solid form, to maintain an even distribution of thermoluminescent dosimetry (TLD) powder, and then back to a liquid for recovery of the TLD powder, without adversely affecting the TLD powder. The gelatine mixture provided the closest match to the data for Reference Man tissue. The gelatine mixture was put through a series of tests to determine its usefulness as a reliable tissue substitute. The TLD powder was cast in the gelatine mixture and recovered to determine whether the TLD powder was adversely affected. The distribution of the TLD powder after being cast into the gelatine mixture was tested to ensure that an even distribution was maintained.
Date: November 1, 1992
Creator: Spence, J. L.
System: The UNT Digital Library
A population-based exposure assessment methodology for carbon monoxide: Development of a carbon monoxide passive sampler and occupational dosimeter (open access)

Two devices, an occupational carbon monoxide (CO) dosimeter (LOCD) and an indoor air quality (IAQ) passive sampler, were developed for use in population-based CO exposure assessment studies. CO exposure is a serious public health problem in the U.S., causing both morbidity and mortality (lifetime mortality risk approximately 10⁻⁴). Sparse data from population-based CO exposure assessments indicate that approximately 10% of the U.S. population is exposed to CO above the national ambient air quality standard. No CO exposure measurement technology is presently available for affordable population-based CO exposure assessment studies. The LOCD and IAQ passive sampler were tested in the laboratory and in the field. The palladium-molybdenum-based CO sensor was designed into a compact diffusion-tube sampler that can be worn. The time-weighted-average (TWA) CO exposure of the device is quantified by a simple spectrophotometric measurement. The LOCD and IAQ passive sampler were tested over exposure ranges of 40 to 700 ppm-hours and 200 to 4200 ppm-hours, respectively. Both devices were capable of measuring precisely (relative standard deviation <20%) and with low bias (<10%). The LOCD was screened for interference by temperature, humidity, and organic and inorganic gases. Temperature effects were small in the range of 10°C to 30°C. Humidity effects were …
Date: September 1, 1997
Creator: Apte, M.G.
System: The UNT Digital Library
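A minimal sketch of the ppm-hours bookkeeping behind a TWA exposure measurement like the one the LOCD integrates chemically: the concentration series below is invented for illustration, standing in for what a reference datalogger would record.

```python
# Hedged sketch: cumulative CO exposure (ppm-hours) and TWA concentration.
import numpy as np

# CO concentration (ppm) logged every half hour across an 8-hour shift
# (illustrative readings, not study data).
times_h = np.arange(0.0, 8.5, 0.5)
co_ppm = np.array([5, 6, 9, 35, 60, 40, 12, 8, 7,
                   6, 30, 55, 45, 20, 10, 8, 6], dtype=float)

# Trapezoidal integration gives cumulative exposure in ppm-hours;
# dividing by the shift length gives the time-weighted average.
exposure = np.sum(0.5 * (co_ppm[:-1] + co_ppm[1:]) * np.diff(times_h))
twa = exposure / (times_h[-1] - times_h[0])
print(f"exposure = {exposure:.0f} ppm-hours, TWA = {twa:.1f} ppm")
```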
The ends of uncertainty: Air quality science and planning in Central California (open access)

Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions among organizations are diagrammed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were not used purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review …
Date: September 2003
Creator: Fine, James
System: The UNT Digital Library
A technique using a stellar spectrographic plate to measure terrestrial ozone column depth (open access)

This thesis examines the feasibility of a technique to extract ozone column depths from photographic stellar spectra in the 5000–7000 Å spectral region. A stellar spectrographic plate is measured to yield the relative intensity distribution of a star's radiation after transmission through the earth's atmosphere. The amount of stellar radiation absorbed by the ozone Chappuis band is proportional to the ozone column depth. The measured column depth is within 10% of the mean monthly value for latitude 36°N; however, the uncertainty is too large to make the measurement useful. This thesis shows that a 10% improvement in the photographic sensitivity uncertainty can decrease the column-depth uncertainty to a level acceptable for climatic studies. This technique offers the possibility of measuring past ozone column depths.
Date: August 1, 1995
Creator: Wong, A.Y.
System: The UNT Digital Library
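The retrieval described above is essentially a Beer-Lambert inversion across the Chappuis band. The sketch below shows the arithmetic with an illustrative cross-section, airmass, and transmission ratio rather than measured plate values.

```python
# Hedged sketch: Beer-Lambert retrieval of an ozone column from
# stellar flux attenuated in the Chappuis band. Numbers are illustrative.
import numpy as np

sigma_chappuis = 4.5e-21   # cm^2 per molecule near 600 nm (approximate)
airmass = 1.2              # relative path length through the atmosphere

# Ratio of observed in-band intensity to the intensity expected with no
# ozone absorption (e.g., interpolated from outside the band).
transmission = 0.95

# Beer-Lambert: I = I0 * exp(-sigma * N * airmass)  =>  solve for N.
column_molec_cm2 = -np.log(transmission) / (sigma_chappuis * airmass)

# Express in Dobson units (1 DU = 2.687e16 molecules/cm^2).
print(f"ozone column ~ {column_molec_cm2 / 2.687e16:.0f} DU")
```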
Microearthquake Study of the Salton Sea Geothermal Field, California: Evidence of Stress Triggering (open access)

A digital network of 24 seismograph stations was operated from September 15, 1987 to September 30, 1988, by Lawrence Livermore National Laboratory and Unocal as part of the Salton Sea Scientific Drilling Project to study seismicity related to tectonics and geothermal activity near the drilling site. More than 2,000 microearthquakes were relocated in this study in order to image any pervasive structures that may exist within the Salton Sea geothermal field. First, detailed velocity models were obtained through standard 1-D inversion techniques. These velocity models were then used to relocate events using both single-event methods and double-differencing, a joint hypocenter location method. An anisotropic velocity model was built from anisotropy estimates obtained from well logs within the study area. During the study period, the Superstition Hills sequence occurred, with two moderate earthquakes of Ms 6.2 and Ms 6.6. These moderate earthquakes caused a rotation of the stress field, as observed from the inversion of first-motion data from microearthquakes at the Salton Sea geothermal field. Coulomb failure analysis also indicates that microearthquakes occurring after the Superstition Hills sequence are located within a region of stress increase, suggesting stress triggering caused by the moderate earthquakes.
Date: February 1, 2002
Creator: Holland, Austin Adams
System: The UNT Digital Library
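As a hedged sketch, the Coulomb failure criterion underlying triggering analyses like the one above is commonly written ΔCFS = Δτ + μ′Δσn, with positive values pushing a fault toward failure. The stress values below are illustrative, not results from the study.

```python
# Hedged sketch: Coulomb failure stress change on a receiver fault.
def coulomb_stress_change(d_shear_mpa: float, d_normal_mpa: float,
                          mu_eff: float = 0.4) -> float:
    """Delta CFS = d_tau + mu' * d_sigma_n, with the normal-stress change
    taken positive for unclamping (fault pushed toward failure)."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# A fault patch receiving 0.05 MPa of shear load and 0.02 MPa of
# unclamping from a nearby moderate earthquake:
dcfs = coulomb_stress_change(0.05, 0.02)
print(f"delta CFS = {dcfs:.3f} MPa -> "
      f"{'promotes' if dcfs > 0 else 'inhibits'} failure")
```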
Grid Development and a Study of B-flavour tagging at DØ (open access)

Run IIa of the D0 experiment at the Tevatron took place between Spring 2002 and Spring 2006, collecting approximately 1.2 fb⁻¹ of data. A fundamental principle of the D0 computing model is the utilization of globally distributed computing resources as part of a grid. In particular, use is made of the 'SAMGrid'. The first part of this thesis describes the work undertaken at Imperial College on several D0 distributed computing projects. These included the deployment and development of parts of the SAMGrid software suite, and participation in the Winter 2003/2004 data reprocessing effort. One of the major goals of the D0 experiment is the observation of mixing in the B_s^0-meson system. The measurement of the mixing frequency is important because it can be used to constrain the CKM matrix, which describes CP violation in the Standard Model. The second part of this thesis describes the development of an opposite-side flavour-tagging algorithm and its calibration using B^+ and B_d^0 meson decays. The application of this algorithm to an analysis of the B_s^0 meson system is then described, which led to the world's first two-sided limit on the B_s^0 meson …
Date: September 1, 2006
Creator: Lewis, Philip William
System: The UNT Digital Library
Integrating Total Quality Management (TQM) and hazardous waste management (open access)

The Resource Conservation and Recovery Act (RCRA) of 1976 and its subsequent amendments have had a dramatic impact on hazardous waste management for business and industry. The complexity of this law and the penalties for noncompliance have made it one of the most challenging regulatory programs undertaken by the Environmental Protection Agency (EPA). The fundamentals of RCRA include "cradle to grave" management of hazardous waste, covering generators, transporters, and treatment, storage, and disposal facilities. The regulations also address extensive definitions and listing/identification mechanisms for hazardous waste, along with a tracking system. Treatment is favored over disposal, and emphasis is on "front-end" treatment such as waste minimization and pollution prevention. A study of large corporations such as Xerox, 3M, and Dow Chemical, as well as the public sector, has shown that well-known and successful hazardous waste management programs emphasize pollution prevention and employment of techniques such as proactive environmental management, environmentally conscious manufacturing, and source reduction. Nearly all successful hazardous waste programs include some aspects of Total Quality Management, which begins with a strong commitment from top management. Hazardous waste management at the Rocky Flats Plant is further complicated by the dominance of "mixed waste" at the facility. The mixed …
Date: November 1, 1993
Creator: Kirk, N.
System: The UNT Digital Library
A Proof-of-Principle Echo-enabled Harmonic Generation Free Electron Laser Experiment at SLAC (open access)

With the advent of X-ray Free Electron Lasers (FELs), new methods have been developed to extend capabilities at short wavelengths beyond Self-Amplified Spontaneous Emission (SASE). In particular, seeding of a FEL allows for temporal control of the radiation pulse and increases the peak brightness by orders of magnitude. Most recently, Gennady Stupakov and colleagues at SLAC proposed a new technique: Echo-Enabled Harmonic Generation (EEHG). Here a laser microbunches the beam in an undulator and the beam is sheared in a chicane. This process is repeated with a second laser, undulator and chicane. The interplay between these allows a seeding of the X-ray laser up to the 100th harmonic of the first laser. After introducing the physics of FELs and the EEHG seeding technique, we describe contributions to the experimental effort. We will present detailed studies of the experiment including the choice of parameters and their optimization, the emittance effect, spontaneous emission in the undulators, the second laser phase effect, and measurements of the jitter between RF stations. Finally, the status and preliminary results of the Echo-7 experiment will be outlined.
Date: January 6, 2012
Creator: Pernet, Pierre-Louis
System: The UNT Digital Library
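For orientation, the sketch below evaluates the standard EEHG wavenumber relation, k = n·k1 + m·k2, which sets which harmonic of the first laser is seeded; the wavelengths and modulation integers are illustrative, not the Echo-7 parameters.

```python
# Hedged sketch of EEHG harmonic bookkeeping: density modulation appears
# at wavenumbers k = n*k1 + m*k2 for integers n, m, and with equal seed
# wavelengths the output is harmonic n + m of the first laser.
# Values are illustrative assumptions, not Echo-7 settings.
lambda1 = lambda2 = 800e-9               # seed laser wavelengths (m)
n, m = -1, 101                           # modulation integers; |n| = 1 is typical
k1, k2 = 1.0 / lambda1, 1.0 / lambda2    # wavenumbers, up to a common 2*pi
k_out = n * k1 + m * k2
print(f"output wavelength = {1.0 / k_out * 1e9:.1f} nm "
      f"(harmonic {k_out / k1:.0f} of laser 1)")
```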
Structural studies of the activation of the two component receiver domain NTRC by multidimensional heteronuclear NMR (open access)

Multidimensional heteronuclear NMR spectroscopy was used to investigate the N-terminal domain of the transcriptional enhancer NTRC (NiTrogen Regulatory protein C). This domain belongs to the family of receiver domains of two-component regulatory systems involved in signal transduction. Phosphorylation of NTRC at D54 leads to an activated form of the molecule which stimulates transcription of genes involved in nitrogen regulation. Three- and four-dimensional NMR techniques were used to determine an intermediate-resolution structure of the unphosphorylated, inactive form of the N-terminal domain of NTRC. The structure comprises five α-helices and a five-stranded β-sheet in a (β/α)₅ topology. Analysis of the backbone dynamics of NTRC indicates that helix 4 and strand 5 are significantly more flexible than the rest of the secondary structure of the protein and that the loops making up the active site are flexible. The short lifetime of phospho-NTRC hampers the study of this form. However, conditions for determining the resonance assignments and, possibly, the three-dimensional structure of phosphorylated NTRC have been obtained. Tentative assignments of the phosphorylated form indicate that the majority of the changes that NTRC experiences upon phosphorylation occur in helix 3, strand 4, helix 4, strand 5, and the loop …
Date: May 1, 1996
Creator: Nohaile, M.J.
System: The UNT Digital Library
Characterization and refinement of carbide coating formation rates and dissolution kinetics in the Ta-C system (open access)

The interaction between carbide coating formation rates and dissolution kinetics in the tantalum-carbon system was investigated. The research was driven by the need to characterize carbide coating formation rates. The characterization of the carbide coating formation rates was required to engineer an optimum processing scheme for the fabrication of the ultracorrosion-resistant composite, carbon-saturated tantalum. A packed-bed carburization process was successfully engineered and employed. The packed-bed carburization process produced consistent, predictable, and repeatable carbide coatings. A digital imaging analysis measurement process for accurate and consistent measurement of carbide coating thicknesses was developed. A process for removing the chemically stable and extremely hard tantalum-carbide coatings was also developed in this work.
Date: October 1, 1996
Creator: Rodriguez, P.J.
System: The UNT Digital Library
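As a hedged aside, diffusion-controlled coating growth of the kind characterized above is commonly modeled with a parabolic rate law, x² = k·t. The sketch below uses this generic model with an invented rate constant; it is not the fit from this work.

```python
# Hedged sketch: parabolic (diffusion-limited) layer growth, a standard
# model for carbide coating kinetics. Rate constant is illustrative.
import math

def coating_thickness_um(t_hours: float, k_um2_per_h: float = 4.0) -> float:
    """Parabolic rate law x^2 = k*t for a diffusion-controlled layer."""
    return math.sqrt(k_um2_per_h * t_hours)

for t in (1, 4, 16):
    # Doubling the time increases thickness by sqrt(2), not 2.
    print(f"{t:>2} h -> {coating_thickness_um(t):.1f} um")
```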
Error Detection, Factorization and Correction for Multi-View Scene Reconstruction from Aerial Imagery (open access)

Scene reconstruction from video sequences has become a prominent computer vision research area in recent years, due to its large number of applications in fields such as security, robotics and virtual reality. Despite recent progress in this field, there are still a number of issues that manifest as incomplete, incorrect or computationally expensive reconstructions. The engine behind achieving reconstruction is the matching of features between images, where common conditions such as occlusions, lighting changes and texture-less regions can all affect matching accuracy. Subsequent processes that rely on matching accuracy, such as camera parameter estimation, structure computation and non-linear parameter optimization, are also vulnerable to additional sources of error, such as degeneracies and mathematical instability. Detection and correction of errors, along with robustness in parameter solvers, are a must in order to achieve a very accurate final scene reconstruction. However, error detection is in general difficult due to the lack of ground-truth information about the given scene, such as the absolute position of scene points or GPS/IMU coordinates for the camera(s) viewing the scene. In this dissertation, methods are presented for the detection, factorization and correction of error sources present in all stages of a scene reconstruction pipeline from video, in the …
Date: November 10, 2011
Creator: Hess-Flores, M.
System: The UNT Digital Library
A Globally Distributed System for Job, Data, and Information Handling for High Energy Physics (open access)

The computing infrastructures of modern high energy physics experiments need to address an unprecedented set of requirements. The collaborations consist of hundreds of members from dozens of institutions around the world, and the computing power necessary to analyze the data produced already surpasses the capabilities of any single computing center. A software infrastructure capable of seamlessly integrating dozens of computing centers around the world, enabling computing for a large and dynamic group of users, is of fundamental importance for the production of scientific results. Such a computing infrastructure is called a computational grid. The SAM-Grid offers a solution to these problems for CDF and DZero, two of the largest high energy physics experiments in the world, running at Fermilab. The SAM-Grid integrates standard grid middleware, such as Condor-G and the Globus Toolkit, with software developed at Fermilab, organizing the system in three major components: data handling, job handling, and information management. This dissertation presents the challenges and the solutions provided in such a computing infrastructure.
Date: December 1, 2005
Creator: Garzoglio, Gabriele (DePaul U.)
System: The UNT Digital Library
Excited State Structural Dynamics of Carotenoids and Charge Transfer Systems (open access)

This dissertation describes the development and implementation of a visible/near-infrared pump/mid-infrared probe apparatus. Chapter 1 describes the background and motivation of investigating optically induced structural dynamics, paying specific attention to solvation and the excitation selection rules of highly symmetric molecules such as carotenoids. Chapter 2 describes the development and construction of the experimental apparatus used throughout the remainder of this dissertation. Chapter 3 discusses the investigation of DCM, a laser dye with a fluorescence signal resulting from a charge transfer state. By studying the dynamics of DCM and of its methyl-deuterated isotopomer (an otherwise identical molecule), we are able to investigate the origins of the charge transfer state and provide evidence that it is of the controversial twisted intramolecular charge transfer (TICT) type. Chapter 4 introduces the use of two-photon excitation to the S1 state, combined with one-photon excitation to the S2 state, of the carotenoid beta-apo-8'-carotenal. These two investigations show evidence for the formation of solitons, previously unobserved in molecular systems and found only in conducting polymers. Chapter 5 presents an investigation of the excited state dynamics of peridinin, the carotenoid responsible for the light harvesting of dinoflagellates. This investigation allows for a more detailed understanding of the importance of structural dynamics of carotenoids in light harvesting.
Date: September 1, 2006
Creator: Van Tassle, Aaron Justin
System: The UNT Digital Library
An object-oriented extension for debugging the virtual machine (open access)

A computer is nothing more than a virtual machine programmed by source code to perform a task. The program's source code expresses abstract constructs which are compiled into some lower-level target language. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low-level target implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design and data into the source code. We introduce OODIE, an object-oriented extension to programming languages that allows programmers to specify a virtual environment by describing the meaning of the design and data of a virtual machine. This specification is translated into symbolic information such that an augmented debugger can present engineers with a programmable debugging environment specifically tailored for the virtual machine that is to be debugged.
Date: December 1, 1994
Creator: Pizzi, R.G. Jr.
System: The UNT Digital Library