Evaluating Appropriateness of EMG and Flex Sensors for Classifying Hand Gestures (open access)

Evaluating Appropriateness of EMG and Flex Sensors for Classifying Hand Gestures

Hand and arm gestures are an effective means of communication when you don't want to be heard, quieter and often more reliable than whispering into a radio mike. In recent years, hand gesture identification has become a major active area of research due to its use in various applications. The objective of my work is to develop an integrated sensor system that will enable tactical squads and SWAT teams to communicate in the absence of a line of sight or in the presence of obstacles. The gesture set involved in this work is the standardized set of hand signals for close range engagement operations used by military and SWAT teams, broadly divided into finger movements and arm movements. The core components of the integrated sensor system are surface EMG sensors, flex sensors, and accelerometers. Surface EMG is the electrical activity produced by muscle contractions, measured by sensors attached directly to the skin. Flex (bend) sensors use a piezoresistive material to detect bending; the sensor output is determined by both the angle between the ends of the sensor and the flex radius. Accelerometers sense dynamic acceleration and inclination in 3 …
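As a concrete illustration of the flex-sensor readout described above, the sketch below converts an ADC reading from a voltage-divider circuit into a sensor resistance and an approximate bend angle. The divider wiring, resistor values, and linear calibration constants are illustrative assumptions, not specifics from the dissertation.

```python
# Sketch: flex-sensor readout through a voltage divider.
# Assumed wiring: Vcc -- flex sensor -- [ADC node] -- R_fixed -- GND.
# All component values and calibration constants are hypothetical.

def flex_resistance(adc_value, adc_max=1023, vcc=5.0, r_fixed=10_000.0):
    """Infer the flex sensor's resistance from the ADC reading at the
    node between the sensor and the fixed resistor."""
    v_out = vcc * adc_value / adc_max
    if v_out <= 0:
        raise ValueError("ADC reading of 0 implies an open or shorted sensor")
    # Divider: v_out = vcc * r_fixed / (r_fixed + r_sensor)
    return r_fixed * (vcc - v_out) / v_out

def bend_angle(resistance, r_flat=25_000.0, ohms_per_degree=500.0):
    """Linear calibration from resistance to degrees of bend; the flat
    resistance and slope would come from per-sensor calibration."""
    return max(0.0, (resistance - r_flat) / ohms_per_degree)
```

In practice the linear angle calibration would be replaced with a measured per-sensor curve, since flex-sensor output also depends on the bend radius, as the abstract notes.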
Date: May 2013
Creator: Akumalla, Sarath Chandra
System: The UNT Digital Library
Hybrid Optimization Models for Depot Location-Allocation and Real-Time Routing of Emergency Deliveries (open access)

Hybrid Optimization Models for Depot Location-Allocation and Real-Time Routing of Emergency Deliveries

Prompt and efficient intervention is vital in reducing casualty figures during epidemic outbreaks, disasters, sudden civil strife, or terrorist attacks. This can only be achieved if there is a fit-for-purpose, location-specific emergency response plan in place, incorporating geographical, time, and vehicular capacity constraints. In this research, a comprehensive emergency response model for situations of uncertainty (in locations' demand and available resources), typically obtainable in low-resource countries, is designed. It involves the development of algorithms for optimizing pre- and post-disaster activities. The studies result in the development of four models: (1) an adaptation of a machine learning clustering algorithm for pre-positioning depots and emergency operation centers, which optimizes the placement of these depots such that the largest geographical area is covered and the maximum number of individuals reached, with minimal facility cost; (2) an optimization algorithm for routing relief distribution, using heterogeneous fleets of vehicles, with considerations for uncertainties in humanitarian supplies; (3) a genetic algorithm-based route improvement model; and (4) a model for integrating possible new locations into the routing network, in real time, using emergency severity ranking, with high priority on the most vulnerable population. The clustering approach to solving the depot location-allocation problem produces a better time complexity, and the …
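The clustering step for pre-positioning depots can be pictured with plain Lloyd's k-means, placing each depot at the centroid of the demand points it serves. This is an illustrative stand-in: the dissertation adapts its own clustering algorithm with coverage and facility-cost considerations that are not modeled here.

```python
# Sketch: k-means as a simplified depot location-allocation step.
# Demand points are (x, y) coordinates; each cluster center is a
# candidate depot location. Purely illustrative.
import math
import random

def kmeans(points, k, iters=100, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # initial depots: random demand points
    for _ in range(iters):
        # Assign each demand point to its nearest depot.
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[i].append(p)
        # Move each depot to the centroid of its assigned points.
        new_centers = [
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            if g else centers[i]
            for i, g in enumerate(groups)
        ]
        if new_centers == centers:           # converged
            break
        centers = new_centers
    return centers
```

A real depot model would weight points by population and cap each depot's vehicular capacity; k-means only minimizes squared travel distance.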
Date: May 2021
Creator: Akwafuo, Sampson E
System: The UNT Digital Library
Traffic Forecasting Applications Using Crowdsourced Traffic Reports and Deep Learning (open access)

Traffic Forecasting Applications Using Crowdsourced Traffic Reports and Deep Learning

Intelligent transportation systems (ITS) are essential tools for traffic planning, analysis, and forecasting that can utilize the huge amount of traffic data available nowadays. In this work, we aggregated detailed traffic flow sensor data, Waze reports, OpenStreetMap (OSM) features, and weather data from the California Bay Area for 6 months. Using that data, we studied three novel ITS applications using convolutional neural networks (CNNs) and recurrent neural networks (RNNs). The first experiment is an analysis of the relation between roadway shapes and accident occurrence, where results show that the speed limit and number of lanes are significant predictors of major accidents on highways. The second experiment presents a novel method for forecasting congestion severity using crowdsourced data only (Waze, OSM, and weather), without the need for traffic sensor data. The third experiment studies the improvement of traffic flow forecasting using accidents, number of lanes, weather, and time-related features, where results show significant performance improvements when the additional features were used.
Date: May 2020
Creator: Alammari, Ali
System: The UNT Digital Library

Deep Learning Methods to Investigate Online Hate Speech and Counterhate Replies to Mitigate Hateful Content

Hateful content and offensive language are commonplace on social media platforms. Many surveys show that a high percentage of social media users experience online harassment. Previous efforts have been made to detect and remove online hate content automatically. However, removing users' content restricts free speech. A complementary strategy for addressing hateful content that does not interfere with free speech is to counter the hate with new content that diverts the discourse away from the hate. In this dissertation, we address the lack of previous work on counterhate arguments by analyzing and detecting them. First, we study the relationships between hateful tweets and replies. Specifically, we analyze their fine-grained relationships by indicating whether the reply counters the hate, provides a justification, attacks the author of the tweet, or adds additional hate. The most striking finding is that most replies generally agree with the hateful tweets; only 20% of them counter the hate. Second, we focus on hate directed toward individuals and detect authentic counterhate arguments from online articles. We propose a methodology that ensures the authenticity of the argument and its specificity to the individual of interest. We show that finding arguments in online articles is an efficient alternative compared to …
Date: May 2023
Creator: Albanyan, Abdullah Abdulaziz
System: The UNT Digital Library

Toward Leveraging Artificial Intelligence to Support the Identification of Accessibility Challenges

The goal of this thesis is to support the automated identification of accessibility issues in user reviews or bug reports, to help technology professionals prioritize their handling and, thus, create more inclusive apps. In particular, we propose a model that takes accessibility user reviews or bug reports as input and learns their keyword-based features to decide, for a given review, whether it is about accessibility or not. Our empirically driven study follows a mixture of qualitative and quantitative methods. We introduce models that can accurately identify accessibility reviews and bug reports and automate their detection. Our models can automatically classify app reviews and bug reports as accessibility-related or not, so developers can easily detect accessibility issues in their products and improve them into more accessible and inclusive apps using the users' input. Our goal is to create sustainable change by including a model in the developer's software maintenance pipeline and raising awareness of existing errors that hinder the accessibility of mobile apps, which is a pressing need. In light of our findings from the Blackboard case study, Blackboard and its course material are not easily accessible to deaf and hard-of-hearing students. Thus, deaf students …
Date: May 2023
Creator: Aljedaani, Wajdi Mohammed R M., Sr.
System: The UNT Digital Library
A Data-Driven Computational Framework to Assess the Risk of Epidemics at Global Mass Gatherings (open access)

A Data-Driven Computational Framework to Assess the Risk of Epidemics at Global Mass Gatherings

This dissertation presents a data-driven computational epidemic framework to simulate disease epidemics at global mass gatherings. The annual Muslim pilgrimage to Makkah, Saudi Arabia is used to demonstrate the simulation and analysis of various disease transmission scenarios throughout the different stages of the event, from the arrival to the departure of international participants. The proposed agent-based epidemic model efficiently captures the demographic, spatial, and temporal heterogeneity at each stage of the global event of Hajj. Experimental results indicate the substantial impact of the demographic and mobility patterns of the heterogeneous population of pilgrims on the progression of the disease spread in the different stages of Hajj. In addition, these simulations suggest that the differences in the spatial and temporal settings in each stage can significantly affect the dynamics of the disease. Finally, the epidemic simulations conducted at the different stages in this dissertation illustrate the impact of the differences between the duration of each stage in the event and the length of the infectious and latent periods. This research contributes to a better understanding of epidemic modeling in the context of global mass gatherings to predict the risk of disease pandemics caused by associated international travel. The computational modeling and …
Date: May 2019
Creator: Alshammari, Sultanah
System: The UNT Digital Library
Space and Spectrum Engineered High Frequency Components and Circuits (open access)

Space and Spectrum Engineered High Frequency Components and Circuits

With the increasing demand for wireless and portable devices, radio frequency front-end blocks are required to feature properties such as wide bandwidth, high frequency, multiple operating frequencies, low cost, and compact size. However, current radio frequency system blocks are designed by combining several individual frequency band blocks into one functional block, which increases the cost and size of devices. To address these issues, it is important to develop novel approaches that further advance current design methodologies in both the space and spectrum domains. In recent years, the concept of artificial materials has been proposed and studied intensively in the RF/microwave, terahertz, and optical frequency ranges. An artificial material is a combination of conventional materials such as air, wood, metal, and plastic, and it can achieve material properties that have not been found in nature. Therefore, artificial materials (i.e., metamaterials) provide design freedom to control both the spectral performance and geometrical structures of radio frequency front-end blocks and other high frequency systems. In this dissertation, several artificial materials are proposed and designed by different methods, and their applications to different high frequency components and circuits are studied. First, the quasi-conformal mapping (QCM) method is applied to design plasmonic wave-adapters and couplers …
Date: May 2015
Creator: Arigong, Bayaner
System: The UNT Digital Library
Extrapolating Subjectivity Research to Other Languages (open access)

Extrapolating Subjectivity Research to Other Languages

Socrates articulated it best: "Speak, so I may see you." Indeed, language represents an invisible probe into the mind. It is the medium through which we express our deepest thoughts, our aspirations, our views, our feelings, our inner reality. From the beginning of artificial intelligence, researchers have sought to impart human-like understanding to machines. As much of our language represents a form of self-expression, capturing thoughts, beliefs, evaluations, opinions, and emotions which are not available for scrutiny by an outside observer, research involving these aspects in the field of natural language has crystallized under the name of subjectivity and sentiment analysis. While subjectivity classification labels text as either subjective or objective, sentiment classification further divides subjective text into positive, negative, or neutral. In this thesis, I investigate techniques for generating tools and resources for subjectivity analysis that do not rely on an existing natural language processing infrastructure in a given language. This constraint is motivated by the fact that the vast majority of human languages are resource-scarce from an electronic point of view: they lack basic tools such as part-of-speech taggers and parsers, and basic resources such as electronic text, annotated corpora, and lexica. This severely limits the …
Date: May 2013
Creator: Banea, Carmen
System: The UNT Digital Library
Content and Temporal Analysis of Communications to Predict Task Cohesion in Software Development Global Teams (open access)

Content and Temporal Analysis of Communications to Predict Task Cohesion in Software Development Global Teams

Virtual teams in industry are increasingly being used to develop software, create products, and accomplish tasks. However, analyzing those collaborations under same-time/different-place conditions is well known to be difficult. In order to overcome some of these challenges, this research studied collaboration-based, content-based, and temporal measures and their ability to predict cohesion within global software development projects. Messages were collected from three software development projects that involved students from two different countries. The similarities and quantities of these interactions were computed and analyzed at the individual and group levels. Results of interaction-based metrics showed that the collaboration variables most related to Task Cohesion were Linguistic Style Matching and Information Exchange. The study also found that the Information Exchange rate and Reply rate have a significant and positive correlation to Task Cohesion, a factor used to describe participants' engagement in the global software development process. This relation was also found at the group level. All these results suggest that rate-based metrics can be very useful for predicting cohesion in virtual groups. Similarly, content features based on communication categories were used to improve the identification of Task Cohesion levels. This model showed mixed results, since only Work similarity and …
Date: May 2017
Creator: Castro Hernandez, Alberto
System: The UNT Digital Library
Monitoring Dengue Outbreaks Using Online Data (open access)

Monitoring Dengue Outbreaks Using Online Data

Internet technology has affected human lives in many disciplines. The search engine is one of the most important Internet tools in that it allows people to search for what they want. Search queries entered in a web search engine can be used to predict dengue incidence. This vector-borne disease causes severe illness and kills a large number of people every year. This dissertation utilizes the capabilities of search queries related to dengue and climate to forecast the number of dengue cases. Several machine learning techniques are applied for data analysis, including Multiple Linear Regression, Artificial Neural Networks, and the Seasonal Autoregressive Integrated Moving Average. Predictive models produced by these machine learning methods are measured for their performance to find which technique generates the best model for dengue prediction. The results of experiments presented in this dissertation indicate that search query data related to dengue and climate can be used to forecast the number of dengue cases. The performance measurement of the predictive models shows that Artificial Neural Networks outperform the others. These results will help public health officials plan for dealing with outbreaks.
Date: May 2014
Creator: Chartree, Jedsada
System: The UNT Digital Library
Extracting Possessions and Their Attributes (open access)

Extracting Possessions and Their Attributes

Possession is an asymmetric semantic relation between two entities, where one entity (the possessee) belongs to the other entity (the possessor). Automatically extracting possessions is useful in identifying skills, in recommender systems, and in natural language understanding. Possessions can be found in different communication modalities, including text, images, videos, and audio. In this dissertation, I elaborate on the techniques I used to extract possessions. I begin with extracting possessions at the sentence level, including their type and temporal anchors. Then, I extract the duration of possession and co-possessions (when multiple possessors possess the same entity). Next, I extract possessions from an entire Wikipedia article, capturing the change of possessors over time. I also extract possessions from social media, including both text and images. Finally, I present dense annotations generating possession timelines. I present separate datasets, detailed corpus analysis, and machine learning models for each task described above.
Date: May 2020
Creator: Chinnappa, Dhivya Infant
System: The UNT Digital Library
An Efficient Approach for Dengue Mitigation: A Computational Framework (open access)

An Efficient Approach for Dengue Mitigation: A Computational Framework

Dengue mitigation is a major research area among scientists who are working toward effective management of the dengue epidemic. Effective dengue mitigation requires several important components: accurate epidemic modeling, efficient epidemic prediction, and efficient resource allocation for controlling the spread of the disease. Past studies assumed a homogeneous response pattern of the dengue epidemic to climate conditions throughout the regions. The dengue epidemic is both climate dependent and geographically dependent, and a global model is not sufficient to capture the local variations of the epidemic. We propose a novel method of epidemic modeling that considers local variation and uses a micro ensemble of regressors for each region. Three regressors are used in the construction of the ensemble: support vector regression, ordinary least squares regression, and k-nearest neighbor regression. The best-performing regressors are selected into the ensemble. The proposed ensemble determines the risk of dengue epidemic in each region in advance. The risk is then used in risk-based resource allocation. The proposed resource allocation is based on the genetic algorithm, with major modifications to its main components, …
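The per-region "micro ensemble" idea can be sketched as fitting several simple regressors per region and keeping whichever performs best on held-out data. Ordinary least squares and k-nearest-neighbor regressors stand in below; the dissertation's ensemble also includes support vector regression, omitted here for brevity, and the selection criterion is an assumption (validation MAE).

```python
# Sketch: per-region model selection among simple regressors.
# Each regressor is fit on the region's training data and the one
# with the lowest validation MAE is kept for that region.

def fit_ols(xs, ys):
    """1-D ordinary least squares: y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx if sxx else 0.0
    a = my - b * mx
    return lambda x: a + b * x

def fit_knn(xs, ys, k=3):
    """k-nearest-neighbor regression: mean of the k closest targets."""
    pairs = list(zip(xs, ys))
    def predict(x):
        nearest = sorted(pairs, key=lambda p: abs(p[0] - x))[:k]
        return sum(y for _, y in nearest) / len(nearest)
    return predict

def mae(model, xs, ys):
    return sum(abs(model(x) - y) for x, y in zip(xs, ys)) / len(xs)

def best_regressor_for_region(train, valid):
    (tx, ty), (vx, vy) = train, valid
    candidates = {"ols": fit_ols(tx, ty), "knn": fit_knn(tx, ty)}
    name = min(candidates, key=lambda n: mae(candidates[n], vx, vy))
    return name, candidates[name]
```

Running this selection once per region yields a locally tuned predictor, matching the abstract's point that a single global model misses regional variation.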
Date: May 2019
Creator: Dinayadura, Nirosha
System: The UNT Digital Library
New Computational Methods for Literature-Based Discovery (open access)

New Computational Methods for Literature-Based Discovery

In this work, we leverage recent developments in computer science to address several of the challenges in current literature-based discovery (LBD) solutions. First, existing LBD solutions either cannot use semantics or are too computationally complex. To solve these problems, we propose OverlapLDA, a generative model based on topic modeling, which has been shown to be both effective and efficient in extracting semantics from a corpus. We also introduce an inference method for OverlapLDA. We conduct extensive experiments to show the effectiveness and efficiency of OverlapLDA in LBD. Second, we expand LBD to a more complex and realistic setting, in which there can be more than one concept connecting the input concepts, and the connectivity pattern between concepts can be more complex than a chain. Current LBD solutions can hardly complete the LBD task in this new setting. We simplify the hypotheses as concept sets and propose LBDSetNet, based on graph neural networks, to solve this problem. We also introduce different training schemes based on self-supervised learning to train LBDSetNet without relying on comprehensively labeled hypotheses that are extremely costly to obtain. Our comprehensive experiments show that LBDSetNet outperforms strong baselines on simple hypotheses and addresses complex hypotheses.
Date: May 2022
Creator: Ding, Juncheng
System: The UNT Digital Library
Metamodeling-based Fast Optimization of Nanoscale AMS-SoCs (open access)

Metamodeling-based Fast Optimization of Nanoscale AMS-SoCs

Modern consumer electronic systems are mostly based on analog and digital circuits and are designed as analog/mixed-signal systems on chip (AMS-SoCs). The integration of analog and digital circuits on the same die makes the system cost effective. In AMS-SoCs, analog and mixed-signal portions have not traditionally received much attention due to their complexity. As fabrication technology advances, simulations of AMS-SoC circuits become more complex and take significant amounts of time, creating a need to reduce simulation time during circuit design and optimization. The time constraints placed on designers are imposed by the ever-shortening time to market and the non-recurring cost of the chip. This dissertation proposes the use of a novel method, called metamodeling, together with intelligent optimization algorithms, to reduce the design time. Metamodel-based ultra-fast design flows are proposed and investigated. Metamodel creation is a one-time process and relies on fast sampling through accurate parasitic-aware simulations. One of the targets of this dissertation is to minimize the sample size while retaining the accuracy of the model. In order to achieve this goal, different statistical sampling techniques are explored and applied to various AMS-SoC circuits. Also, different metamodel functions are explored for their …
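One common statistical sampling scheme for building metamodels from few expensive simulations is Latin hypercube sampling, sketched below. It is offered purely as an illustration of the sampling step; the parameter bounds are hypothetical and the dissertation compares several sampling techniques, not necessarily this one.

```python
# Sketch: Latin hypercube sampling (LHS) over circuit design parameters.
# Each parameter's range is split into n_samples equal strata; one point
# is drawn per stratum, and strata are randomly paired across parameters,
# so every 1-D projection of the sample set is evenly spread.
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """bounds: list of (lo, hi) tuples, one per design parameter."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        width = (hi - lo) / n_samples
        # One point per stratum, then shuffle to decorrelate dimensions.
        col = [lo + (i + rng.random()) * width for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))
```

Each returned tuple is one parameter combination to simulate; the even per-parameter coverage is what lets a small sample support an accurate metamodel.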
Date: May 2012
Creator: Garitselov, Oleg
System: The UNT Digital Library
Variability-aware low-power techniques for nanoscale mixed-signal circuits. (open access)

Variability-aware low-power techniques for nanoscale mixed-signal circuits.

New circuit design techniques that accommodate the lower supply voltages necessary for portable systems need to be integrated into semiconductor intellectual property (IP) cores. Systems that once worked at 3.3 V or 2.5 V now need to work at 1.8 V or lower without causing any performance degradation. Also, the fluctuation of device characteristics caused by process variation in nanometer technologies manifests as design yield loss. The numerous parasitic effects induced by layouts, especially for high-performance and high-speed circuits, pose a problem for IC design. The lack of exact layout information during circuit sizing leads to long design iterations involving time-consuming runs of complex tools. There is a strong need for low-power, high-performance, parasitic-aware, and process-variation-tolerant circuit design. This dissertation proposes methodologies and techniques to achieve variability-, power-, performance-, and parasitic-aware circuit designs. Three approaches are proposed: the single-iteration automatic approach, the hybrid Monte Carlo and design of experiments (DOE) approach, and the corner-based approach. Widely used mixed-signal circuits such as the analog-to-digital converter (ADC), voltage-controlled oscillator (VCO), voltage level converter, and active pixel sensor (APS) have been designed at nanoscale complementary metal oxide semiconductor (CMOS) nodes and subjected to the proposed methodologies. The effectiveness of the proposed methodologies has …
Date: May 2009
Creator: Ghai, Dhruva V.
System: The UNT Digital Library
Incremental Learning with Large Datasets (open access)

Incremental Learning with Large Datasets

This dissertation focuses on a novel learning strategy based on geometric support vector machines to address the difficulties of processing immense data sets. Support vector machines find the hyperplane that maximizes the margin between two classes, and since the decision boundary is represented with a few training samples, they become a favorable choice for incremental learning. The dissertation presents a novel method, Geometric Incremental Support Vector Machines (GISVM), to address both efficiency and accuracy issues in handling massive data sets. In GISVM, the skin of the convex hulls is defined and an efficient method is designed to find the best skin approximation given the available examples. The set of extreme points is found by recursively searching along the direction defined by a pair of known extreme points. By identifying the skin of the convex hulls, incremental learning employs a much smaller number of samples with comparable or even better accuracy. When additional samples are provided, they are used together with the skin of the convex hull constructed from the previous dataset. This results in a small number of instances used in the incremental steps of the training process. Based on the experimental results with synthetic data sets, public benchmark data sets from …
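The "skin" idea reduces, in the simplest case, to keeping only the extreme points of each class's convex hull, so an incremental retraining step touches far fewer samples. Andrew's monotone-chain hull in 2-D, sketched below, illustrates that reduction; the dissertation's skin approximation works in higher dimensions and with a search-based procedure not reproduced here.

```python
# Sketch: 2-D convex hull as a stand-in for the GISVM "skin".
# Interior points are discarded; an incremental update re-hulls only
# the previous hull plus the new batch of samples.

def convex_hull(points):
    """Hull vertices of 2-D points, counter-clockwise (Andrew's chain)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        # Positive if o->a->b turns counter-clockwise.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def incremental_update(hull, new_points):
    # Retrain using only the previous skin plus the new batch.
    return convex_hull(hull + new_points)
```

Because the hull of a union equals the hull of (previous hull vertices plus new points), the incremental step never needs the discarded interior samples.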
Date: May 2012
Creator: Giritharan, Balathasan
System: The UNT Digital Library
Probabilistic Analysis of Contracting Ebola Virus Using Contextual Intelligence (open access)

Probabilistic Analysis of Contracting Ebola Virus Using Contextual Intelligence

The outbreak of the Ebola virus was declared a Public Health Emergency of International Concern by the World Health Organisation (WHO). Due to the complex nature of the outbreak, the Centers for Disease Control and Prevention (CDC) created interim guidance for monitoring people potentially exposed to Ebola, for evaluating their intended travel, and for restricting the movements of carriers when needed. Tools to evaluate the risk of individuals and groups of individuals contracting the disease could mitigate the growing anxiety and fear. The goal is to understand and analyze the nature of the risk an individual would face when he/she comes in contact with a carrier. This thesis presents a tool that makes use of contextual data intelligence to predict the risk factor of individuals who come in contact with the carrier.
Date: May 2017
Creator: Gopalakrishnan, Arjun
System: The UNT Digital Library
Validation and Evaluation of Emergency Response Plans through Agent-Based Modeling and Simulation (open access)

Validation and Evaluation of Emergency Response Plans through Agent-Based Modeling and Simulation

Biological emergency response planning plays a critical role in protecting the public from the possibly devastating results of sudden disease outbreaks. These plans describe the distribution of medical countermeasures across a region using limited resources within a restricted time window. Thus, the ability to determine that such a plan will be feasible, i.e., successfully provide service to affected populations within the time limit, is crucial. Many of the current efforts to validate plans take the form of live drills and training, but those may not test plan activation at the appropriate scale or with sufficient numbers of participants. This necessitates the use of computational resources to aid emergency managers and planners in developing and evaluating plans before they must be used. Current emergency response plan generation software packages, such as RE-PLAN or RealOpt, provide rate-based validation analyses. However, these types of analysis may neglect details of real-world traffic dynamics. Therefore, this dissertation presents Validating Emergency Response Plan Execution Through Simulation (VERPETS), a novel computational system for the agent-based simulation of biological emergency response plan activation. This system converts raw road network, population distribution, and emergency response plan data into a format suitable for simulation, and then performs these simulations …
Date: May 2018
Creator: Helsing, Joseph
System: The UNT Digital Library
Privacy Preserving Machine Learning as a Service (open access)

Privacy Preserving Machine Learning as a Service

Machine learning algorithms based on neural networks have achieved remarkable results and are being extensively used in different domains. However, these algorithms require access to raw data, which is often privacy sensitive. To address this issue, we develop new techniques that provide solutions for running deep neural networks over encrypted data. In this work, we develop new techniques to adapt deep neural networks to the practical limitations of current homomorphic encryption schemes. We focus on training and classification of well-known neural networks and convolutional neural networks (CNNs). First, we design methods for approximating the activation functions commonly used in CNNs (i.e., ReLU, sigmoid, and tanh) with low-degree polynomials, which is essential for efficient homomorphic encryption schemes. Then, we train neural networks with the approximation polynomials instead of the original activation functions and analyze the performance of the models. Finally, we implement neural networks and convolutional neural networks over encrypted data and measure the performance of the models.
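The approximation step can be sketched as a least-squares fit of ReLU by a low-degree polynomial, since homomorphic encryption schemes evaluate only additions and multiplications. The degree, fitting interval, and grid size below are illustrative assumptions; the dissertation evaluates several activations and approximation choices.

```python
# Sketch: degree-2 least-squares polynomial approximation of ReLU on
# [-1, 1], via normal equations solved with a tiny Gaussian-elimination
# routine. The resulting polynomial can replace ReLU under homomorphic
# encryption, which supports only + and *.

def fit_poly(f, degree, lo=-1.0, hi=1.0, n=201):
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    m = degree + 1
    # Normal equations (A^T A) c = A^T y for the monomial basis.
    ata = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    aty = [sum(f(x) * x ** i for x in xs) for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, m):
            k = ata[r][col] / ata[col][col]
            for c in range(col, m):
                ata[r][c] -= k * ata[col][c]
            aty[r] -= k * aty[col]
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = sum(ata[r][c] * coeffs[c] for c in range(r + 1, m))
        coeffs[r] = (aty[r] - s) / ata[r][r]
    return coeffs  # coeffs[i] multiplies x**i

relu = lambda x: max(0.0, x)
poly = fit_poly(relu, degree=2)
approx = lambda x: sum(c * x ** i for i, c in enumerate(poly))
```

The fit is close over the interval it was trained on but diverges outside it, which is why inputs are typically normalized before the polynomial activation is applied.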
Date: May 2020
Creator: Hesamifard, Ehsan
System: The UNT Digital Library
An Extensible Computing Architecture Design for Connected Autonomous Vehicle System (open access)

An Extensible Computing Architecture Design for Connected Autonomous Vehicle System

Autonomous vehicles have made milestone strides within the past decade. Advances up the autonomy ladder have come in lock-step with advances in machine learning, namely deep-learning algorithms and huge, open training sets. And while advances in CPUs have slowed, GPUs have edged into the previous decade's TOP500 supercomputer territory. This new class of GPUs includes novel deep-learning hardware that has essentially side-stepped Moore's law, outpacing the doubling observation by a factor of ten. While GPUs have made record progress, networks do not follow Moore's law and are restricted by several bottlenecks, from protocol-based latency lower bounds to the very laws of physics. In a way, the bottlenecks that plague modern networks gave rise to Edge computing, a key component of the Connected Autonomous Vehicle system, as the need for low latency in some domains eclipsed the need for massive processing farms. The Connected Autonomous Vehicle ecosystem is one of the most complicated environments in all of computing. Not only is the hardware scaled all the way from 16- and 32-bit microcontrollers to multi-CPU Edge nodes and multi-GPU Cloud servers, but the networking also encompasses the gamut of modern communication transports. I propose a framework for negotiating, encapsulating and transferring data …
Date: May 2021
Creator: Hochstetler, Jacob Daniel
System: The UNT Digital Library

Enhancing Storage Dependability and Computing Energy Efficiency for Large-Scale High Performance Computing Systems

Access: Use of this item is restricted to the UNT Community
With the advent of the information explosion age, larger-capacity disk drives are used to store data and powerful devices are used to process big data. As the scale and complexity of computer systems increase, we expect these systems to provide dependable and energy-efficient services and computation. Although hard drives are reliable in general, they are the most commonly replaced hardware components. Disk failures cause data corruption and even data loss, which can significantly affect system performance and cause financial losses. In this dissertation research, I analyze different manifestations of disk failures in production data centers and explore data mining techniques combined with statistical analysis methods to discover categories of disk failures and their distinctive properties. I use similarity measures to quantify the degradation process of each failure type and derive its degradation signature. The derived degradation signatures are further leveraged to forecast when future disk failures may happen. Meanwhile, this dissertation also studies the energy efficiency of high performance computers. Specifically, I characterize the power and energy consumption of Haswell processors, which are used in multiple supercomputers, and analyze the power and energy consumption of Legion, a data-centric programming model and runtime system, and of Legion applications. We find that power and energy …
Date: May 2019
Creator: Huang, Song
System: The UNT Digital Library
Models to Combat Email Spam Botnets and Unwanted Phone Calls (open access)

Models to Combat Email Spam Botnets and Unwanted Phone Calls

With the amount of email spam received these days, it is hard to imagine that spammers act individually. Nowadays, most spam emails are sent from collections of compromised machines controlled by spammers. These compromised computers are often called bots, using which spammers can send massive volumes of spam within a short period of time. The motivation of this work is to understand and analyze the behavior of spammers through a large collection of spam emails. My research examined a data set collected over a 2.5-year period and developed an algorithm that extracts botnet features and then classifies the botnets into various groups. Principal component analysis was used to study the association patterns among groups of spammers and the individual behavior of a spammer in a given domain; the clustering is based on the features that capture the maximum variance in the data. Presence information is a growing tool toward more efficient communication, providing new services and features within a business setting and much more. The main contribution of my thesis is a proposed willingness estimator that can estimate a callee's willingness without his/her involvement; the model estimates willingness level based …
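The PCA step above can be sketched in a few lines. The feature matrix here is hypothetical (volume, domains touched, active hours per spammer group); the real features come from the 2.5-year spam corpus. This is a generic SVD-based PCA, not the dissertation's exact pipeline.

```python
import numpy as np

# Hypothetical feature matrix: one row per spammer group, columns are
# behavioral features (e.g. daily volume, domains touched, active hours).
X = np.array([[120.,  4., 16.],
              [115.,  5., 15.],
              [ 10., 40.,  2.],
              [ 12., 38.,  3.]])

# Center the data, then extract principal components via SVD.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = S**2 / (S**2).sum()  # fraction of variance per component
scores = Xc @ Vt.T               # spammers projected onto the PCs
```

With two well-separated behavioral clusters like these, the first component captures nearly all the variance, so clustering in the projected `scores` space separates the groups cleanly.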
Date: May 2008
Creator: Husna, Husain
System: The UNT Digital Library
A Study on Flat-Address-Space Heterogeneous Memory Architectures (open access)

A Study on Flat-Address-Space Heterogeneous Memory Architectures

In this dissertation, we present a number of studies that primarily focus on data movement challenges among different types of memories (viz., 3D-DRAM, DDRx DRAM and NVM) employed together as a flat-address heterogeneous memory system. We introduce two different hardware-based techniques for prefetching data from slow off-chip phase change memory (PCM) to fast on-chip memories. The prefetching techniques efficiently fetch data from PCM and place that data into processor-resident or 3D-DRAM-resident buffers without putting high demand on bandwidth and provide significant performance improvements. Next, we explore different page migration techniques for flat-address memory systems which differ in when to migrate pages (i.e., periodically or instantaneously) and how to manage the migrations (i.e., OS-based or hardware-based approach). In the first page migration study, we present several epoch-based page migration policies for different organizations of flat-address memories consisting of two (2-level) and three (3-level) types of memory modules. These policies have resulted in significant energy savings. In the next page migration study, we devise an efficient "on-the-fly" page migration technique which migrates a page from slow PCM to fast 3D-DRAM whenever it receives a certain number of memory accesses without waiting for any specific time interval. Furthermore, we present a lightweight hardware-assisted …
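The "on-the-fly" policy described above can be sketched as a per-page access counter that triggers an immediate promotion. The threshold value and the single-counter bookkeeping are illustrative assumptions; the dissertation's hardware-assisted mechanism is necessarily more involved.

```python
MIGRATION_THRESHOLD = 4  # accesses before promotion (hypothetical value)

class FlatMemory:
    """Toy flat-address memory with threshold-triggered page migration."""

    def __init__(self):
        self.location = {}    # page -> "PCM" or "3D-DRAM"
        self.counter = {}     # access counts for PCM-resident pages
        self.migrations = 0

    def access(self, page):
        loc = self.location.setdefault(page, "PCM")
        if loc == "PCM":
            self.counter[page] = self.counter.get(page, 0) + 1
            if self.counter[page] >= MIGRATION_THRESHOLD:
                # Promote the hot page immediately, without waiting
                # for a periodic epoch boundary.
                self.location[page] = "3D-DRAM"
                del self.counter[page]
                self.migrations += 1
        return self.location[page]

mem = FlatMemory()
for _ in range(4):
    mem.access(0xA0)  # the fourth access triggers promotion
```

The contrast with an epoch-based policy is that migration latency here is bounded by the threshold, not by the epoch length, so a suddenly hot page reaches fast 3D-DRAM sooner.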
Date: May 2019
Creator: Islam, Mahzabeen
System: The UNT Digital Library
Analysis and Optimization of Graphene FET based Nanoelectronic Integrated Circuits (open access)

Analysis and Optimization of Graphene FET based Nanoelectronic Integrated Circuits

Like cells to the human body, transistors are the basic building blocks of any electronic circuit. Silicon has been the industry's obvious choice for making transistors. Large transistors occupy large chip area, consume a lot of power, and limit the number of functionalities due to area constraints. Thus, to make devices smaller, smarter and faster, transistors are aggressively scaled down in each generation. Moore's law observes that the transistor count in an electronic circuit doubles roughly every 18 months. Following Moore's law, transistors have already been scaled down to 14 nm. However, there are limits to how much further these transistors can be scaled down. Particularly below 10 nm, silicon-based transistors hit fundamental limits such as loss of gate control, high leakage and various other short-channel effects. Thus silicon transistors can no longer be favored for future electronics applications. As a result, research has shifted to new device concepts and device materials as alternatives to silicon. Carbon is among the most abundant elements on Earth, and one such carbon-based nanomaterial is graphene. Graphene, when extracted from graphite, the same material used as the lead in pencils, …
Date: May 2016
Creator: Joshi, Shital
System: The UNT Digital Library