Analysis and Optimization of Graphene FET based Nanoelectronic Integrated Circuits (open access)

Like cells to the human body, transistors are the basic building blocks of any electronic circuit. Silicon has been the industry's obvious choice for making transistors. Large transistors occupy large chip area, consume a great deal of power, and limit the number of functionalities due to area constraints. Thus, to make devices smaller, smarter, and faster, transistors are aggressively scaled down in each generation. Moore's law states that the transistor count in an electronic circuit doubles roughly every 18 months. Following Moore's law, transistors have already been scaled down to 14 nm. However, there are limits to how much further these transistors can be scaled. Particularly below 10 nm, silicon-based transistors hit fundamental limits such as loss of gate control, high leakage, and various other short-channel effects. Thus it is no longer possible to favor silicon transistors for future electronics applications. As a result, research has shifted to new device concepts and device materials as alternatives to silicon. Carbon is among the most abundant elements on Earth, and one such carbon-based nanomaterial is graphene. Graphene, when extracted from graphite, the same material used as the lead in pencils, …
Date: May 2016
Creator: Joshi, Shital
System: The UNT Digital Library
Models to Combat Email Spam Botnets and Unwanted Phone Calls (open access)

With the amount of email spam received these days, it is hard to imagine that spammers act individually. Nowadays, most spam emails are sent from collections of compromised machines controlled by spammers. These compromised computers are often called bots, and using them spammers can send massive volumes of spam within a short period of time. The motivation of this work is to understand and analyze the behavior of spammers through a large collection of spam mails. My research examined a data set collected over a 2.5-year period and developed an algorithm that extracts botnet features and then classifies the botnets into various groups. Principal component analysis was used to study the association patterns of groups of spammers and the individual behavior of a spammer in a given domain; clustering is based on the features that capture the maximum variance of the information. Presence information is a growing tool for more efficient communication, providing new services and features within a business setting and much more. The main contribution of my thesis is a willingness estimator that can estimate a callee's willingness without his/her involvement; the model estimates willingness level based …
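
As a minimal sketch of the clustering step described above (over hypothetical spam-feature vectors; the dissertation's actual features and grouping algorithm are its own), principal component analysis can reduce the feature space to its highest-variance components before clustering:

```python
# A minimal sketch, assuming hypothetical spam features (e.g., message size,
# URL count, sending rate). Not the dissertation's actual pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.random((500, 8))          # 500 spam sources, 8 features each

pca = PCA(n_components=2)                # keep components with max variance
reduced = pca.fit_transform(features)

groups = KMeans(n_clusters=4, n_init=10).fit_predict(reduced)
print(pca.explained_variance_ratio_, np.bincount(groups))
```
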
Date: May 2008
Creator: Husna, Husain
System: The UNT Digital Library
Epileptic Seizure Detection and Control in the Internet of Medical Things (IoMT) Framework (open access)

Epilepsy affects up to 1% of the world's population and approximately 2.5 million people in the United States. A considerable portion (30%) of epilepsy patients are refractory to antiepileptic drugs (AEDs), and surgery can not be an effective candidate if the focus of the seizure is on the eloquent cortex. To overcome the problems with existing solutions, a notable portion of biomedical research is focused on developing an implantable or wearable system for automated seizure detection and control. Seizure detection algorithms based on signal rejection algorithms (SRA), deep neural networks (DNN), and neighborhood component analysis (NCA) have been proposed in the IoMT framework. The algorithms proposed in this work have been validated with both scalp and intracranial electroencephalography (EEG, icEEG), and demonstrate high classification accuracy, sensitivity, and specificity. The occurrence of seizure can be controlled by direct drug injection into the epileptogenic zone, which enhances the efficacy of the AEDs. Piezoelectric and electromagnetic micropumps have been explored for the use of a drug delivery unit, as they provide accurate drug flow and reduce power consumption. The reduction in power consumption as a result of minimal circuitry employed by the drug delivery system is making it suitable for practical biomedical applications. …
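
For reference, the detection metrics the abstract reports are standard confusion-matrix quantities; a small sketch (with made-up counts, not the dissertation's results) shows how they are computed:

```python
# Sketch with hypothetical counts, not results from the dissertation.
tp, fn = 95, 5       # seizure windows detected / missed
tn, fp = 880, 20     # non-seizure windows correctly / wrongly flagged

accuracy    = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
print(f"acc={accuracy:.3f} sens={sensitivity:.3f} spec={specificity:.3f}")
```
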
Date: May 2020
Creator: Sayeed, Md Abu
System: The UNT Digital Library
Layout-accurate Ultra-fast System-level Design Exploration Through Verilog-AMS (open access)

This research addresses problems in designing analog and mixed-signal (AMS) systems by bridging the gap between system-level and circuit-level simulation, making simulations as fast as system-level and as accurate as circuit-level simulation. The proposed tools include metamodel-integrated Verilog-AMS based design exploration flows. The research involves design centering, metamodel generation flows for creating efficient behavioral models, and Verilog-AMS integration techniques for model realization. The core of the proposed solution is transistor-level and layout-level metamodeling and its incorporation into Verilog-AMS. Metamodeling is used to construct efficient and layout-accurate surrogate models for AMS system building blocks. Verilog-AMS, an AMS hardware description language, is employed to build surrogate model implementations that can be simulated with industry-standard simulators. The case-study circuits and systems include an operational amplifier (OP-AMP), a voltage-controlled oscillator (VCO), a charge-pump phase-locked loop (PLL), and a continuous-time delta-sigma modulator (DSM). The minimum and maximum error rates of the proposed OP-AMP model are 0.11% and 2.86%, respectively. The error rates for the PLL lock time and power estimation are 0.7% and 3.0%, respectively. The OP-AMP optimization using the proposed approach is ~17,000× faster than the transistor-level model based approach. The optimization achieves a ~4× power reduction for the OP-AMP …
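
As an illustration of the surrogate-modeling idea (a generic polynomial metamodel over a hypothetical design parameter and a stand-in "expensive" simulator, not the dissertation's actual models), one can fit a cheap function to a handful of expensive simulation samples and then evaluate it almost for free during design exploration:

```python
# Generic surrogate-model sketch; the design variable, response, and
# "expensive_sim" stand-in are hypothetical.
import numpy as np

def expensive_sim(w):                 # stand-in for a SPICE-level run
    return 1.2 / w + 0.05 * w**2      # a made-up power/performance trade-off

samples = np.linspace(0.5, 5.0, 12)          # few expensive simulations
responses = np.array([expensive_sim(w) for w in samples])

coeffs = np.polyfit(samples, responses, deg=4)   # polynomial metamodel
surrogate = np.poly1d(coeffs)

# Now thousands of evaluations cost almost nothing:
sweep = np.linspace(0.5, 5.0, 10000)
best = sweep[np.argmin(surrogate(sweep))]
print(f"predicted optimum near w={best:.3f}")
```
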
Date: May 2013
Creator: Zheng, Geng
System: The UNT Digital Library
Variability-aware low-power techniques for nanoscale mixed-signal circuits. (open access)

New circuit design techniques that accommodate the lower supply voltages necessary for portable systems need to be integrated into semiconductor intellectual property (IP) cores. Systems that once worked at 3.3 V or 2.5 V now need to work at 1.8 V or lower without any performance degradation. Also, the fluctuation of device characteristics caused by process variation in nanometer technologies manifests as design yield loss. The numerous parasitic effects induced by layouts, especially for high-performance and high-speed circuits, pose a problem for IC design. Lack of exact layout information during circuit sizing leads to long design iterations involving time-consuming runs of complex tools. There is a strong need for low-power, high-performance, parasitic-aware, and process-variation-tolerant circuit design. This dissertation proposes methodologies and techniques to achieve variability-, power-, performance-, and parasitic-aware circuit designs. Three approaches are proposed: the single-iteration automatic approach, the hybrid Monte Carlo and design of experiments (DOE) approach, and the corner-based approach. Widely used mixed-signal circuits such as an analog-to-digital converter (ADC), a voltage-controlled oscillator (VCO), a voltage level converter, and an active pixel sensor (APS) have been designed in nanoscale complementary metal oxide semiconductor (CMOS) technology and subjected to the proposed methodologies. The effectiveness of the proposed methodologies has …
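
A minimal sketch of the Monte Carlo side of the hybrid approach (with a made-up delay model and variation percentages, not the dissertation's devices) illustrates how process variation turns device-parameter spread into a yield estimate:

```python
# Hypothetical Monte Carlo yield estimate; model and numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Assume threshold voltage and oxide thickness vary ~3% (1-sigma) around nominal.
vth = rng.normal(0.35, 0.35 * 0.03, n)     # volts
tox = rng.normal(1.2, 1.2 * 0.03, n)       # nanometers

# Toy delay model: delay grows with Vth and Tox (illustrative scaling only).
delay = 10.0 * vth * tox                   # picoseconds

spec = np.percentile(delay, 50) * 1.10     # pass if within 10% of median
yield_est = np.mean(delay <= spec)
print(f"estimated parametric yield: {yield_est:.1%}")
```
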
Date: May 2009
Creator: Ghai, Dhruva V.
System: The UNT Digital Library
Metamodeling-based Fast Optimization of Nanoscale AMS-SoCs (open access)

Modern consumer electronic systems are mostly based on analog and digital circuits and are designed as analog/mixed-signal systems on chip (AMS-SoCs). The integration of analog and digital circuits on the same die makes the system cost effective. In AMS-SoCs, analog and mixed-signal portions have not traditionally received much attention due to their complexity. As fabrication technology advances, AMS-SoC circuits become more complex and their simulations take significant amounts of time. The limited time allocated for circuit design and optimization creates a need to reduce simulation time. The time constraints placed on designers are imposed by the ever-shortening time to market and the non-recurrent cost of the chip. This dissertation proposes the use of a novel method, called metamodeling, and intelligent optimization algorithms to reduce the design time. Metamodel-based ultra-fast design flows are proposed and investigated. Metamodel creation is a one-time process and relies on fast sampling through accurate parasitic-aware simulations. One of the targets of this dissertation is to minimize the sample size while retaining the accuracy of the model. In order to achieve this goal, different statistical sampling techniques are explored and applied to various AMS-SoC circuits. Also, different metamodel functions are explored for their …
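
One widely used statistical sampling technique of the kind the abstract mentions is Latin hypercube sampling, sketched below over two hypothetical design parameters (the dissertation's actual sampling plans and circuits may differ):

```python
# Latin hypercube sampling sketch; parameter names and ranges are hypothetical.
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """One stratified sample per interval in each dimension, randomly paired."""
    d = len(bounds)
    u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
         + rng.random((n_samples, d))) / n_samples
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(2)
# e.g., transistor width (um) and bias current (uA) ranges -- illustrative.
points = latin_hypercube(20, [(0.5, 5.0), (10.0, 100.0)], rng)
print(points[:3])  # 20 well-spread sample points instead of a dense grid
```
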
Date: May 2012
Creator: Garitselov, Oleg
System: The UNT Digital Library
Improving Software Quality through Syntax and Semantics Verification of Requirements Models (open access)

Software defects can frequently be traced to poorly specified requirements. Many software teams manage their requirements using tools such as checklists and databases, which lack a formal semantic mapping to system behavior. Such a mapping can be especially helpful for safety-critical systems. Another limitation of many requirements analysis methods is that much of the analysis must still be done manually. We propose techniques that automate portions of the requirements analysis process and clarify the syntax and semantics of requirements models using a variety of methods, including machine learning tools and our own tool, VeriCCM. The machine learning tools help us identify potential model elements and verify their correctness. VeriCCM, a formalized extension of the causal component model (CCM), uses formal methods to ensure that requirements are well-formed and provides the beginnings of a full formal semantics. We also explore the use of statecharts to identify potential abnormal behaviors from a given set of requirements. At each stage, we perform empirical studies to evaluate the effectiveness of our proposed approaches.
Date: December 2018
Creator: Gaither, Danielle
System: The UNT Digital Library
Content and Temporal Analysis of Communications to Predict Task Cohesion in Software Development Global Teams (open access)

Virtual teams in industry are increasingly used to develop software, create products, and accomplish tasks. However, analyzing those collaborations under same-time/different-place conditions is well known to be difficult. In order to overcome some of these challenges, this research studied collaboration-based, content-based, and temporal measures and their ability to predict cohesion within global software development projects. Messages were collected from three software development projects that involved students from two different countries. The similarities and quantities of these interactions were computed and analyzed at the individual and group levels. Results of interaction-based metrics showed that the collaboration variables most related to Task Cohesion were Linguistic Style Matching and Information Exchange. The study also found that Information Exchange rate and Reply rate have a significant and positive correlation to Task Cohesion, a factor used to describe participants' engagement in the global software development process. This relation was also found at the group level. All these results suggest that rate-based metrics can be very useful for predicting cohesion in virtual groups. Similarly, content features based on communication categories were used to improve the identification of Task Cohesion levels. This model showed mixed results, since only Work similarity and …
Date: May 2017
Creator: Castro Hernandez, Alberto
System: The UNT Digital Library
Procedural Generation of Content for Online Role Playing Games (open access)

Video game players demand a volume of content far in excess of the ability of game designers to create it. For example, a single quest might take a week to develop and test, which means that companies such as Blizzard are spending millions of dollars each month on new content for their games. As a result, both players and developers are frustrated with the inability to meet the demand for new content. By generating content on demand, it is possible to create custom content for each player based on player preferences. It is also possible to make use of the current world state during generation, something which cannot be done with current techniques. Using developers to create rules and assets for a content generator instead of creating content directly will lower development costs as well as reduce the development time for new game content to seconds rather than days. This work is part of the field of computational creativity, and involves the use of computers to create aesthetically pleasing game content, such as terrain, characters, and quests. I demonstrate agent-based terrain generation and economic modeling of game spaces. I also demonstrate the autonomous generation of quests for online role playing games, …
Date: August 2014
Creator: Doran, Jonathon
System: The UNT Digital Library
Analysis and Performance of a Cyber-Human System and Protocols for Geographically Separated Collaborators (open access)

This dissertation provides an innovative mechanism for two geographically separated people to collaborate on a physical task, and a novel method to measure the Complexity Index (CI) and calculate the Minimal Complexity Index (MCI) of a collaboration protocol. The protocol is represented as a structure, and its information content is measured in bits to understand the complex nature of the protocol. Using the complexity metrics, one can analyze the performance of a collaborative system and a collaboration protocol. Security and privacy of consumers are vital while seeking remote help; this dissertation also provides a novel authorization framework for dynamic access control of resources on an input-constrained appliance used for completing the physical task. Using the innovative Collaborative Appliance for REmote-help (CARE) and with the support of a remotely located expert, fifty-nine subjects with minimal or no prior mechanical knowledge were able to elevate a car for replacing a tire in an average time of six minutes and 53 seconds and with an average protocol complexity of 171.6 bits. Moreover, thirty subjects with minimal or no prior plumbing knowledge were able to change the cartridge of a faucet in an average time of ten minutes and with an average protocol complexity …
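
The abstract measures a protocol's information content in bits; a small sketch (using Shannon entropy over a hypothetical distribution of protocol actions, which may differ from the dissertation's exact CI definition) shows the kind of computation involved:

```python
# Shannon entropy sketch; the action distribution is hypothetical, and the
# dissertation's Complexity Index may be defined differently.
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Suppose a protocol step chooses among 4 actions with these frequencies:
action_probs = [0.5, 0.25, 0.125, 0.125]
h = entropy_bits(action_probs)         # 1.75 bits per step
print(f"{h:.2f} bits/step; 100 steps carry ~{100 * h:.0f} bits")
```
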
Date: December 2017
Creator: Jonnada, Srikanth
System: The UNT Digital Library
A Control Theoretic Approach for Resilient Network Services (open access)

Resilient networks have the ability to provide the desired level of service despite challenges such as malicious attacks and misconfigurations. The primary goal of this dissertation is to provide uninterrupted network services in the face of an attack or any failures. This dissertation applies control-system-theoretic techniques, with a focus on system identification and closed-loop feedback control. It explores the benefits of system identification techniques in designing and validating models for complex and dynamic networks. Further, this dissertation focuses on designing robust feedback control mechanisms that are both scalable and effective in real time. It employs dynamic and predictive control approaches to reduce the impact of an attack on network services. The closed-loop feedback control mechanisms tackle this issue by degrading the network services gracefully to an acceptable level and then stabilizing the network in real time (less than 50 seconds). Employing these feedback mechanisms also provides the ability to automatically configure settings such that the QoS metrics of the network are consistent with those specified in the service level agreements.
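
As a toy illustration of the closed-loop idea (a generic discrete-time proportional-integral controller acting on a made-up first-order "service level" model under a transient attack; not the dissertation's network controller):

```python
# Generic PI feedback-control sketch; plant model and gains are illustrative.
target = 1.0            # desired normalized service level
level = 0.2
kp, ki, integ = 0.8, 0.3, 0.0

for step in range(50):
    error = target - level
    integ += error
    control = kp * error + ki * integ
    # Toy first-order plant: control pushes the level up, an attack drags it down.
    attack = 0.3 if 10 <= step < 30 else 0.0
    level += 0.2 * (control - attack - level)

print(f"level after 50 steps: {level:.3f}")   # settles near the target
```
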
Date: December 2018
Creator: Vempati, Jagannadh Ambareesh
System: The UNT Digital Library
Exploring Physical Unclonable Functions for Efficient Hardware Assisted Security in the IoT (open access)

Modern cities are undergoing rapid expansion. The number of connected devices in the networks in and around these cities is increasing every day and will grow exponentially in the next few years. At home, the number of connected devices is also increasing with the introduction of home automation appliances and applications. Many of these appliances are becoming smart devices that can track our daily routines. It is imperative that all these devices be secure. When cryptographic keys used for encryption and decryption are stored in memory present on these devices, they can be retrieved by attackers or adversaries to gain control of the system. For this purpose, Physical Unclonable Functions (PUFs) were proposed to generate the keys required for encryption and decryption of the data or the communication channel, as required by the application. PUF modules take advantage of the manufacturing variations that are introduced in Integrated Circuits (ICs) during the fabrication process. These are used to generate cryptographic keys, which reduces the need for a separate memory module to store encryption and decryption keys. A PUF module can also be reconfigurable such that the number of input-output pairs, or Challenge Response Pairs (CRPs), …
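
A toy model of the challenge-response behavior the abstract describes (a hypothetical delay-based PUF simulated with random per-chip offsets standing in for fabrication variation; not any specific PUF design from the dissertation):

```python
# Toy PUF simulation: per-chip random delays make responses device-unique.
import numpy as np

class ToyArbiterPuf:
    def __init__(self, n_stages, seed):
        rng = np.random.default_rng(seed)        # stands in for fab variation
        self.deltas = rng.normal(0.0, 1.0, n_stages)

    def response(self, challenge):               # challenge: array of 0/1
        signs = 1 - 2 * np.asarray(challenge)    # 0 -> +1, 1 -> -1
        return int(np.dot(self.deltas, signs) > 0)

chip_a, chip_b = ToyArbiterPuf(64, seed=7), ToyArbiterPuf(64, seed=8)
challenge = np.random.default_rng(3).integers(0, 2, 64)
# Same challenge, different chips: responses frequently differ.
print(chip_a.response(challenge), chip_b.response(challenge))
```
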
Date: May 2019
Creator: Yanambaka, Venkata Prasanth
System: The UNT Digital Library
Validation and Evaluation of Emergency Response Plans through Agent-Based Modeling and Simulation (open access)

Biological emergency response planning plays a critical role in protecting the public from the possibly devastating results of sudden disease outbreaks. These plans describe the distribution of medical countermeasures across a region using limited resources within a restricted time window. Thus, the ability to determine that such a plan will be feasible, i.e., successfully provide service to affected populations within the time limit, is crucial. Many of the current efforts to validate plans take the form of live drills and training, but those may not test plan activation at the appropriate scale or with sufficient numbers of participants. This necessitates the use of computational resources to aid emergency managers and planners in developing and evaluating plans before they must be used. Current emergency response plan generation software packages, such as RE-PLAN or RealOpt, provide rate-based validation analyses. However, these types of analysis may neglect details of real-world traffic dynamics. Therefore, this dissertation presents Validating Emergency Response Plan Execution Through Simulation (VERPETS), a novel computational system for the agent-based simulation of biological emergency response plan activation. This system converts raw road network, population distribution, and emergency response plan data into a format suitable for simulation, and then performs these simulations …
Date: May 2018
Creator: Helsing, Joseph
System: The UNT Digital Library
SIMON: A Domain-Agnostic Framework for Secure Design and Validation of Cyber Physical Systems (open access)

Cyber physical systems (CPS) are an integration of computational and physical processes, where the cyber components monitor and control physical processes. Cyber-attacks largely target the cyber components with the intention of disrupting the functionality of the components in the physical domain. This dissertation explores the role of semantic inference in understanding such attacks and building resilient CPS. To that end, we present SIMON, an ontological design and verification framework that captures the intricate relationships between cyber and physical components in CPS by leveraging several standard ontologies and extending the NIST CPS framework for the purpose of eliciting trustworthy requirements, assigning responsibilities and roles to CPS functionalities, and validating that the trustworthy requirements are met by the designed system. We demonstrate the capabilities of SIMON using two case studies: a vehicle to infrastructure (V2I) safety application and an additive manufacturing (AM) printer. In addition, we also present a taxonomy to capture threat feeds specific to the AM domain.
Date: December 2021
Creator: Yanambaka Venkata, Rohith
System: The UNT Digital Library
Using Blockchain to Ensure Reputation Credibility in Decentralized Review Management (open access)

In recent years, there have been incidents that decreased people's trust in some organizations and authorities responsible for ratings and accreditation. For a few prominent examples, there was a security breach at Equifax (2017), misconduct was found in the Standard & Poor's Ratings Services (2015), and the Accrediting Council for Independent Colleges and Schools (2022) validated some low-performing schools as delivering higher standards than they actually did. A natural solution to these types of issues is to decentralize the relevant trust management processes using blockchain technologies. The research problems tackled in this thesis consider the issue of trust in reputation for assessment and review credibility from different angles, in the context of blockchain applications. We first explored the following question: how can we trust courses in one college to provide students with the type and level of knowledge needed in a specific workplace? Micro-accreditation on a blockchain was our solution, including using a peer-review system to determine the rigor of a course (through consensus). Rigor is the level of difficulty with regard to a student's expected level of knowledge. Currently, we make assumptions about the quality and rigor of what is learned, but …
Date: December 2023
Creator: Zaccagni, Zachary James
System: The UNT Digital Library

Optimization of Massive MIMO Systems for 5G Networks

In the first part of the dissertation, we provide an extensive overview of the sub-6 GHz wireless access technology known as massive multiple-input multiple-output (MIMO) systems, highlighting its benefits, deployment challenges, and the key enabling technologies envisaged for 5G networks. We investigate the fundamental issues that degrade the performance of massive MIMO systems, such as pilot contamination, precoding, user scheduling, and signal detection. In the second part, we optimize the performance of the massive MIMO system by proposing several algorithms, system designs, and hardware architectures. To mitigate the effect of pilot contamination, we propose a pilot reuse factor scheme based on the user environment and the number of active users. Simulation results show that the proposed scheme ensures the system always operates at maximal spectral efficiency and achieves higher throughput. To address the user scheduling problem, we propose two user scheduling algorithms based upon the measured channel gain. The simulation results show that our proposed user scheduling algorithms achieve better error performance, improve sum capacity and throughput, and guarantee fairness among the users. To address the uplink signal detection challenge in massive MIMO systems, we propose four algorithms and their system designs. We show through simulations that the …
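
To illustrate the channel-gain-based scheduling idea (a generic greedy selection over hypothetical Rayleigh-like channel gains; the dissertation's two algorithms are more elaborate and also address fairness):

```python
# Greedy channel-gain scheduler sketch; channel model and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n_users, n_slots = 40, 8

# Hypothetical per-user channel gains (magnitudes of complex Gaussian fading).
gains = np.abs(rng.normal(size=n_users) + 1j * rng.normal(size=n_users))

# Schedule the n_slots users with the strongest measured gains.
scheduled = np.argsort(gains)[-n_slots:][::-1]
print("scheduled users:", scheduled)
print("their gains:", np.round(gains[scheduled], 2))
```
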
Date: August 2020
Creator: Chataut, Robin
System: The UNT Digital Library
Probabilistic Analysis of Contracting Ebola Virus Using Contextual Intelligence (open access)

The outbreak of the Ebola virus was declared a Public Health Emergency of International Concern by the World Health Organisation (WHO). Due to the complex nature of the outbreak, the Centers for Disease Control and Prevention (CDC) created interim guidance for monitoring people potentially exposed to Ebola, for evaluating their intended travel, and for restricting the movements of carriers when needed. Tools to evaluate the risk of individuals and groups of individuals contracting the disease could mitigate the growing anxiety and fear. The goal is to understand and analyze the nature of the risk an individual would face when he/she comes in contact with a carrier. This thesis presents a tool that makes use of contextual data intelligence to predict the risk factor of individuals who come in contact with the carrier.
Date: May 2017
Creator: Gopalakrishnan, Arjun
System: The UNT Digital Library

A Top-Down Policy Engineering Framework for Attribute-Based Access Control

The purpose of this study is to propose a top-down policy engineering framework for attribute-based access control (ABAC) that aims to automatically extract access control policies (ACPs) from requirements specification documents and then, using the extracted policies, build or update an ABAC model. We specify a procedure that consists of three main components: 1) ACP sentence identification, 2) policy element extraction, and 3) ABAC model creation and update. ACP sentence identification processes unrestricted natural-language documents and identifies the sentences that carry ACP content. We propose and compare three different methodologies from different disciplines, namely deep recurrent neural networks (RNN-based), biological immune systems (BIS-based), and a combination of multiple natural language processing techniques (PMI-based), in order to identify the proper methodology for extracting ACP sentences from irrelevant text. Our evaluation results improve the state of the art by a margin of 5% F1-measure. To aid future research, we also introduce a new dataset that includes 5000 sentences from real-world policy documents. ABAC policy extraction extracts ACP elements such as subject, object, and action from the identified ACPs. We use semantic roles and correctly identify ACP elements with an average F1 score of 75%, which bests the previous work by 15%. Furthermore, as SRL tools are often …
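
Since the evaluation above is stated in F1-measure, a quick reference computation (with hypothetical counts, not the study's actual numbers) shows how the metric combines precision and recall:

```python
# F1-measure reference sketch; the counts are hypothetical.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

# e.g., an extractor classifying sentences as ACP / non-ACP:
tp, fp, fn = 150, 40, 60
p, r = tp / (tp + fp), tp / (tp + fn)
print(f"precision={p:.2f} recall={r:.2f} F1={f1(p, r):.2f}")
```
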
Date: May 2020
Creator: Narouei, Masoud
System: The UNT Digital Library
Capacity and Throughput Optimization in Multi-cell 3G WCDMA Networks (open access)

User modeling enables the computation of the traffic density in a cellular network, which can be used to optimize the placement of base stations and radio network controllers as well as to analyze the performance of resource management algorithms toward meeting the final goal: the calculation and maximization of network capacity and throughput for different data rate services. An analytical model is presented for approximating the user distributions in multi-cell third generation wideband code division multiple access (WCDMA) networks using 2-dimensional Gaussian distributions by determining the means and the standard deviations of the distributions for every cell. This model allows for the calculation of the inter-cell interference and the reverse-link capacity of the network. An analytical model for optimizing capacity in multi-cell WCDMA networks is presented. Capacity is optimized for different spreading factors and for perfect and imperfect power control. Numerical results show that the SIR threshold for the received signals is decreased by 0.5 to 1.5 dB due to the imperfect power control. The results also show that the determined parameters of the 2-dimensional Gaussian model match well with traditional methods for modeling user distribution. A call admission control algorithm is designed that maximizes the throughput in multi-cell …
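
A compact sketch of the 2-dimensional Gaussian user model (with hypothetical cell centers and spreads; the dissertation determines the means and standard deviations per cell analytically):

```python
# 2-D Gaussian user-distribution sketch; cell parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(5)

# Each cell: (mean_x, mean_y, sigma) in km -- illustrative values.
cells = [(0.0, 0.0, 0.4), (1.5, 0.0, 0.5), (0.75, 1.3, 0.3)]

users = np.vstack([
    rng.normal([mx, my], sigma, size=(200, 2))
    for mx, my, sigma in cells
])

# The fraction of cell 0's users that lie closer to a neighboring base
# station hints at how much inter-cell interference they contribute.
d0 = np.linalg.norm(users[:200] - [0.0, 0.0], axis=1)
d1 = np.linalg.norm(users[:200] - [1.5, 0.0], axis=1)
print(f"cell-0 users nearer cell 1: {np.mean(d1 < d0):.1%}")
```
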
Date: December 2005
Creator: Nguyen, Son
System: The UNT Digital Library
Space and Spectrum Engineered High Frequency Components and Circuits (open access)

With the increasing demand for wireless and portable devices, radio frequency front-end blocks are required to feature properties such as wide bandwidth, high frequency, multiple operating frequencies, low cost, and compact size. However, current radio frequency system blocks are designed by combining several individual frequency-band blocks into one functional block, which increases the cost and size of devices. To address these issues, it is important to develop novel approaches to further advance current design methodologies in both the space and spectrum domains. In recent years, the concept of artificial materials has been proposed and studied intensively in the RF/microwave, terahertz, and optical frequency ranges. An artificial material is a combination of conventional materials such as air, wood, metal, and plastic, and can achieve material properties not found in nature. Therefore, artificial materials (i.e., metamaterials) provide design freedom to control both the spectral performance and the geometrical structures of radio frequency front-end blocks and other high-frequency systems. In this dissertation, several artificial materials are proposed and designed by different methods, and their applications to different high-frequency components and circuits are studied. First, the quasi-conformal mapping (QCM) method is applied to design plasmonic wave-adapters and couplers …
Date: May 2015
Creator: Arigong, Bayaner
System: The UNT Digital Library
Evaluation of Call Mobility on Network Productivity in Long Term Evolution Advanced (LTE-A) Femtocells (open access)

The demand for higher data rates for indoor and cell-edge users led to the evolution of small cells. LTE femtocells, one of the small-cell categories, are low-power, low-cost mobile base stations deployed within the coverage area of the traditional macro base station. Cross-tier and co-tier interference occurs only when the macrocell and femtocell share the same frequency channels. Open access (OSG), closed access (CSG), and hybrid access are the three existing access-control methods that decide users' connectivity to the femtocell access point (FAP). We define a network performance function, network productivity, to measure the traffic that is carried successfully. In this dissertation, we evaluate call mobility in an integrated LTE network and determine optimized network productivity with variable call arrival rates in a given LTE deployment with femtocell access modes (OSG, CSG, HYBRID) for a given call blocking vector. The solution to the optimization is the maximum network productivity and the call arrival rates for all cells. In the second scenario, we evaluate call mobility in an integrated LTE network with increasing femtocells and maximize network productivity with a variable femtocell distribution per macrocell under a constant call arrival rate in a uniform LTE deployment with femtocell access modes (OSG, CSG, HYBRID) for a given …
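
To make the "traffic carried successfully" notion concrete, the sketch below uses the standard Erlang-B blocking formula over hypothetical arrival rates and channel counts (the dissertation's productivity function and blocking vector are its own):

```python
# Erlang-B based carried-traffic sketch; rates and channel counts hypothetical.
def erlang_b(traffic, channels):
    """Blocking probability for offered traffic (Erlangs) on n channels."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b

arrival_rate, hold_time, channels = 120.0, 0.025, 8   # calls/hr, hr, channels
offered = arrival_rate * hold_time                    # offered traffic, Erlangs
blocking = erlang_b(offered, channels)
productivity = arrival_rate * (1 - blocking)          # calls carried per hour
print(f"blocking={blocking:.3f}, carried={productivity:.1f} calls/hr")
```
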
Date: December 2017
Creator: Sawant, Uttara
System: The UNT Digital Library
The Procedural Generation of Interesting Sokoban Levels (open access)

As video games continue to become larger, more complex, and more costly to produce, research into methods to make game creation easier and faster becomes more valuable. One such research topic is procedural generation, which allows the computer to assist in the creation of content. This dissertation presents a new algorithm for the generation of Sokoban levels. Sokoban is a grid-based transport puzzle that is computationally interesting due to being PSPACE-complete. Beyond just generating levels, the question of whether or not the levels created by this algorithm are interesting to human players is explored. A study was carried out comparing player attention while playing hand-made levels versus procedurally generated levels. An auditory Stroop test was used to measure attention without disrupting play.
Date: May 2015
Creator: Taylor, Joshua
System: The UNT Digital Library
New Frameworks for Secure Image Communication in the Internet of Things (IoT) (open access)

The continuous expansion of technology, broadband connectivity, and the wide range of new devices in the IoT cause serious concerns regarding privacy and security. In addition, a key challenge in the IoT is the storage and management of massive data streams. For example, there is always demand for the smallest acceptable size with the highest possible quality for images, to meet the rapidly increasing number of multimedia applications. The effort in this dissertation contributes to resolving concerns related to the security and compression functions of image communication in the Internet of Things (IoT), given its fast evolution. This dissertation proposes frameworks for a secure digital camera in the IoT. The objectives of this dissertation are twofold. On the one hand, the proposed framework architecture offers a double layer of protection, encryption and watermarking, which addresses issues related to security, privacy, and digital rights management (DRM) by applying a hardware architecture of the state-of-the-art image compression technique Better Portable Graphics (BPG), which achieves a high compression ratio with small size. On the other hand, the proposed SBPG framework is integrated with the digital camera. Thus, the proposed framework of SBPG integrated with SDC is suitable …
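
As a toy illustration of the double-layer protection idea (least-significant-bit watermarking followed by XOR stream encryption on a stand-in image array; the dissertation's SBPG framework instead combines BPG compression with hardware-level watermarking and encryption):

```python
# Toy double-layer sketch: LSB watermark, then XOR "encryption". Illustrative
# only; not the dissertation's SBPG architecture or a secure cipher.
import numpy as np

rng = np.random.default_rng(6)
image = rng.integers(0, 256, (4, 4), dtype=np.uint8)      # stand-in image
watermark = rng.integers(0, 2, (4, 4), dtype=np.uint8)    # 1-bit ownership mark

marked = (image & 0xFE) | watermark            # layer 1: embed mark in LSBs
keystream = rng.integers(0, 256, (4, 4), dtype=np.uint8)
cipher = marked ^ keystream                    # layer 2: encrypt

recovered = cipher ^ keystream                 # decrypt ...
print(np.array_equal(recovered & 1, watermark))  # ... and extract mark: True
```
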
Date: August 2016
Creator: Albalawi, Umar Abdalah S
System: The UNT Digital Library
An Extensible Computing Architecture Design for Connected Autonomous Vehicle System (open access)

Autonomous vehicles have made milestone strides within the past decade. Advances up the autonomy ladder have come in lock-step with advances in machine learning, namely deep-learning algorithms and huge, open training sets. And while advances in CPUs have slowed, GPUs have edged into the previous decade's TOP 500 supercomputer territory. This new class of GPUs includes novel deep-learning hardware that has essentially side-stepped Moore's law, outpacing the doubling observation by a factor of ten. While GPUs have made record progress, networks do not follow Moore's law and are restricted by several bottlenecks, from protocol-based latency lower bounds to the very laws of physics. In a way, the bottlenecks that plague modern networks gave rise to Edge computing, a key component of the Connected Autonomous Vehicle system, as the need for low latency in some domains eclipsed the need for massive processing farms. The Connected Autonomous Vehicle ecosystem is one of the most complicated environments in all of computing. Not only is the hardware scaled all the way from 16- and 32-bit microcontrollers to multi-CPU Edge nodes and multi-GPU Cloud servers, but the networking also encompasses the gamut of modern communication transports. I propose a framework for negotiating, encapsulating, and transferring data …
Date: May 2021
Creator: Hochstetler, Jacob Daniel
System: The UNT Digital Library