3GPP Long Term Evolution LTE Scheduling (open access)

Future generation cellular networks are expected to deliver an omnipresent broadband access network for an ever-increasing number of subscribers. Long Term Evolution (LTE) represents a significant milestone on the path toward 4G cellular networks. A key feature of LTE is the implementation of enhanced Radio Resource Management (RRM) mechanisms to improve system performance. The structure of LTE networks was simplified by reducing the number of core network nodes, and the radio protocol architecture was redesigned. To achieve high data rates in LTE, the 3rd Generation Partnership Project (3GPP) selected Orthogonal Frequency Division Multiplexing (OFDM) as the appropriate scheme for the downlink, while Single-Carrier Frequency Division Multiple Access (SC-FDMA) was chosen for the uplink because of the peak-to-average power ratio (PAPR) constraint. LTE packet scheduling plays a primary role within RRM in improving the system's data rate and supporting the various QoS requirements of mobile services. The major function of the LTE packet scheduler is to assign Physical Resource Blocks (PRBs) to mobile User Equipment (UE). In our work, we propose a packet scheduling algorithm. The proposed scheduling algorithm acts based on the number …
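
For illustration, here is a minimal sketch of a standard proportional-fair PRB allocator, a common LTE scheduling baseline; it is not the scheduler proposed in the thesis, and the carrier size, UE count, and channel rates are invented assumptions.

```python
# Generic proportional-fair PRB allocation sketch (baseline, not the thesis's algorithm).
import random

NUM_PRBS = 25          # e.g., a 5 MHz LTE carrier (assumption)
NUM_UES = 8
EMA_ALPHA = 0.1        # smoothing factor for the average throughput

avg_rate = [1.0] * NUM_UES   # running average throughput per UE (avoid divide by zero)

def schedule_tti(inst_rate):
    """Assign each PRB to the UE with the highest PF metric for this TTI.

    inst_rate[ue][prb] is the achievable rate of `ue` on `prb` (e.g., from CQI).
    Returns a list mapping PRB index -> UE index.
    """
    allocation = []
    served = [0.0] * NUM_UES
    for prb in range(NUM_PRBS):
        # proportional-fair metric: instantaneous rate / long-term average rate
        ue = max(range(NUM_UES), key=lambda u: inst_rate[u][prb] / avg_rate[u])
        allocation.append(ue)
        served[ue] += inst_rate[ue][prb]
    # update long-term averages with an exponential moving average
    for u in range(NUM_UES):
        avg_rate[u] = (1 - EMA_ALPHA) * avg_rate[u] + EMA_ALPHA * served[u]
    return allocation

# toy channel: random achievable rates per UE/PRB
rates = [[random.uniform(0.1, 1.0) for _ in range(NUM_PRBS)] for _ in range(NUM_UES)]
print(schedule_tti(rates))
```
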
Date: December 2013
Creator: Alotaibi, Sultan
System: The UNT Digital Library
An Accelerometer-based Gesture Recognition System for a Tactical Communications Application (open access)

In modern society, computers are interacted with primarily via keyboards, touch screens, voice recognition, and video analysis, among other methods. For certain applications, these methods may be the most efficient interface. However, there are conceivable applications where a more natural interface would be convenient and would connect humans and computers in a more intuitive way. Gesture recognition systems are one such class of applications, ranging from the interpretation of sign language by a computer to virtual reality control. This thesis proposes a gesture recognition system that primarily uses accelerometers to capture gestures for a tactical communications application. A segmentation algorithm is developed based on the accelerometer energy to segment these gestures from an input sequence. Using signal processing and machine learning techniques, the segments are reduced to mathematical features and classified with support vector machines. Experimental results show that the system achieves an overall gesture recognition accuracy of 98.9%. Additional methods, such as non-gesture recognition/suppression, are also proposed and tested.
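
A minimal sketch of energy-based segmentation of an accelerometer stream, in the spirit of the segmentation step described above; the window length, threshold, and synthetic data are illustrative assumptions, not the thesis's actual parameters.

```python
import numpy as np

def segment_gestures(accel, window=32, threshold=1.5):
    """Return (start, end) sample indices of high-energy segments.

    accel: (N, 3) array of x/y/z accelerometer samples.
    """
    mag = np.linalg.norm(accel, axis=1)          # magnitude of each sample
    mag = mag - mag.mean()                        # remove gravity/DC offset
    energy = np.convolve(mag ** 2, np.ones(window) / window, mode="same")
    active = energy > threshold                   # samples above the energy threshold
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(active)))
    return segments

# toy input: quiet signal with one "gesture" burst
rng = np.random.default_rng(0)
data = rng.normal(0, 0.1, size=(500, 3))
data[200:260] += rng.normal(0, 2.0, size=(60, 3))
print(segment_gestures(data))
```
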
Date: December 2015
Creator: Tidwell, Robert S., Jr.
System: The UNT Digital Library
Adaptive Planning and Prediction in Agent-Supported Distributed Collaboration. (open access)

Agents that act as user assistants will become invaluable as the number of information sources continues to proliferate. Such agents can support the work of users by learning to automate time-consuming tasks and filter information to manageable levels. Although considerable advances have been made in this area, it remains a fertile area for further development. One application of agents under careful scrutiny is the automated negotiation of conflicts between different users' needs and desires. Many techniques require explicit user models in order to function. This dissertation explores a technique for dynamically constructing user models and the impact of using them to anticipate the need for negotiation. Negotiation is reduced by including an advising aspect in the agent that uses this anticipation of conflict to adjust user behavior.
Date: December 2004
Creator: Hartness, Ken T. N.
System: The UNT Digital Library
Analysis and Performance of a Cyber-Human System and Protocols for Geographically Separated Collaborators (open access)

This dissertation provides an innovative mechanism that allows two geographically separated people to collaborate on a physical task, along with a novel method to measure the Complexity Index (CI) and calculate the Minimal Complexity Index (MCI) of a collaboration protocol. The protocol is represented as a structure, and its information content is measured in bits to understand the complex nature of the protocol. Using the complexity metrics, one can analyze the performance of a collaborative system and a collaboration protocol. Security and privacy of the consumers are vital while seeking remote help; this dissertation therefore also provides a novel authorization framework for dynamic access control of resources on an input-constrained appliance used for completing the physical task. Using the innovative Collaborative Appliance for REmote-help (CARE) and with the support of a remotely located expert, fifty-nine subjects with minimal or no prior mechanical knowledge were able to elevate a car for replacing a tire in an average time of 6 minutes and 53 seconds and with an average protocol complexity of 171.6 bits. Moreover, thirty subjects with minimal or no prior plumbing knowledge were able to change the cartridge of a faucet in an average time of ten minutes and with an average protocol complexity …
Date: December 2017
Creator: Jonnada, Srikanth
System: The UNT Digital Library
An Approach Towards Self-Supervised Classification Using Cyc (open access)

Due to the long duration required for manual knowledge entry by human knowledge engineers, it is desirable to find methods to automatically acquire knowledge about the world by accessing online information. In this work I examine using the Cyc ontology to guide the creation of Naïve Bayes classifiers to provide knowledge about items described in Wikipedia articles. Given an initial set of Wikipedia articles, the system uses the ontology to create positive and negative training sets for the classifiers in each category. The order in which classifiers are generated and used to test articles is also guided by the ontology. The research conducted shows that a system can be created that utilizes statistical text classification methods to extract information from an ad-hoc generated information source like Wikipedia for use in a formal semantic ontology like Cyc. Benefits and limitations of the system are discussed along with future work.
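
A minimal sketch of the per-category Naïve Bayes classification step using scikit-learn; the tiny positive/negative examples stand in for the ontology-derived training sets and are invented placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# positive and negative examples for one hypothetical category (e.g., "City"),
# as an ontology-guided step might assemble them
positive = ["city with a large population and a mayor",
            "capital city known for its museums and metro"]
negative = ["small bird species found in wetlands",
            "programming language used for web development"]

# bag-of-words features feeding a multinomial Naive Bayes classifier
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(positive + negative, [1, 1, 0, 0])

print(clf.predict(["a coastal city famous for its harbor"]))   # expected: [1]
```
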
Date: December 2006
Creator: Coursey, Kino High
System: The UNT Digital Library
Automated Classification of Emotions Using Song Lyrics (open access)

This thesis explores the classification of emotions in song lyrics, using automatic approaches applied to a novel corpus of 100 popular songs. I use crowdsourcing via Amazon Mechanical Turk to collect line-level emotion annotations for this collection of song lyrics. I then build classifiers that rely on textual features to automatically identify the presence of one or more of the following six Ekman emotions: anger, disgust, fear, joy, sadness, and surprise. I compare different classification systems and evaluate the performance of the automatic systems against the manual annotations. I also introduce a system that uses data collected from the social network Twitter. I use the Twitter API to collect a large corpus of tweets manually labeled by their authors for one of the six emotions of interest. I then compare the classification of emotions obtained when training on data automatically collected from Twitter versus data obtained through crowdsourced annotations.
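
A minimal sketch of the comparison described above: the same text classifier is trained once on crowdsourced-style lyric annotations and once on author-labeled tweets, then evaluated on a common test set. All example lines and labels are invented placeholders, and the classifier choice (TF-IDF plus a linear SVM) is an assumption.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

# placeholder training data from two different label sources
crowd_lines  = ["tears fall down tonight", "dancing in the sunshine",
                "why did you leave me", "we will celebrate forever"]
crowd_labels = ["sadness", "joy", "sadness", "joy"]

tweet_lines  = ["stuck in traffic again ugh", "best day ever at the beach",
                "missing my old friends", "got the job, so happy"]
tweet_labels = ["anger", "joy", "sadness", "joy"]

# common test set with gold labels
test_lines  = ["crying alone in my room", "singing and laughing all night"]
test_labels = ["sadness", "joy"]

for name, X, y in [("crowd-trained", crowd_lines, crowd_labels),
                   ("twitter-trained", tweet_lines, tweet_labels)]:
    clf = make_pipeline(TfidfVectorizer(), LinearSVC())
    clf.fit(X, y)
    print(name, accuracy_score(test_labels, clf.predict(test_lines)))
```
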
Date: December 2012
Creator: Schellenberg, Rajitha
System: The UNT Digital Library

Automated Defense Against Worm Propagation.

Access: Use of this item is restricted to the UNT Community
Worms have caused significant destruction over the last few years. Network security elements such as firewalls, IDS, etc., have been ineffective against worms. Some worms are so fast that manual intervention is not possible. This brings in the need for a stronger security architecture that can automatically react to stop worm propagation. The method has to be signature-independent so that it can stop new worms. In this thesis, an automated defense system (ADS) is developed to automate defense against worms and contain the worm to a level where manual intervention is possible. This is accomplished with a two-level architecture with feedback at each level. The inner loop is based on control system theory and uses the properties of a PID (proportional-integral-derivative) controller. The outer loop works at the network level and stops the worm from reaching its spread saturation point. In our lab setup, we verified that with only the inner loop active the worm was delayed, and with both loops active we were able to restrict the propagation to 10% of the targeted hosts. One concern for deployment of a worm containment mechanism was degradation of throughput for legitimate traffic. We found that with proper …
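
A minimal sketch of the PID idea behind the inner control loop; the gains, setpoint, and crude plant model are illustrative assumptions rather than the thesis's actual system.

```python
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured, dt=1.0):
        """Return a correction from the proportional, integral, and derivative terms."""
        error = self.setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# plant: scan connections observed per interval; the controller output
# throttles the allowed connection rate (all numbers are made up)
controller = PID(kp=0.6, ki=0.1, kd=0.05, setpoint=20.0)   # target scans/interval
allowed_rate = 100.0
observed = 100.0
for step in range(10):
    correction = controller.update(observed)
    allowed_rate = max(1.0, allowed_rate + correction)      # actuate the throttle
    observed = 0.7 * observed + 0.3 * allowed_rate          # crude plant response
    print(f"step {step}: allowed={allowed_rate:.1f} observed={observed:.1f}")
```
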
Date: December 2005
Creator: Patwardhan, Sudeep
System: The UNT Digital Library
Automated Syndromic Surveillance using Intelligent Mobile Agents (open access)

Current syndromic surveillance systems utilize centralized databases that are scalable neither in storage space nor in computing power. Such systems are limited in the amount of syndromic data that may be collected and analyzed for the early detection of infectious disease outbreaks. However, with the increased prevalence of international travel, public health monitoring must extend beyond the borders of municipalities or states, which requires the ability to store vast amounts of data and significant computing power for analyzing them. Intelligent mobile agents may be used to create a distributed surveillance system that utilizes the hard drives and central processing unit (CPU) power of the hosts on the agent network where the syndromic information is located. This thesis proposes the design of a mobile agent-based syndromic surveillance system and an agent decision model for outbreak detection. Simulation results indicate that mobile agents are capable of detecting an outbreak that occurs at all hosts the agent is monitoring. Further study of agent decision models is required to account for localized epidemics and variable agent movement rates.
Date: December 2007
Creator: Miller, Paul
System: The UNT Digital Library
Biomedical Semantic Embeddings: Using Hybrid Sentences to Construct Biomedical Word Embeddings and its Applications (open access)

Word embeddings are a useful method that has shown enormous success in various NLP tasks, not only in the open domain but also in the biomedical domain. The biomedical domain provides various domain-specific resources and tools that can be exploited to improve the performance of these word embeddings. However, most of the research on word embeddings in the biomedical domain focuses on analysis of model architecture, hyper-parameters, and input text. In this paper, we use SemMedDB to design new sentences called "Semantic Sentences". We then use these sentences, in addition to biomedical text, as inputs to the word embedding model. This approach aims at introducing biomedical semantic types, defined by UMLS, into the vector space of word embeddings. The semantically rich word embeddings presented here rival state-of-the-art biomedical word embeddings on both semantic similarity and relatedness metrics by up to 11%. We also demonstrate how these semantic types in word embeddings can be utilized.
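
A minimal sketch of training word embeddings on ordinary biomedical-style sentences plus extra "semantic sentences" that pair terms with semantic-type tokens; the sentences are invented placeholders, and the code assumes the gensim 4.x Word2Vec API.

```python
from gensim.models import Word2Vec

# ordinary (placeholder) biomedical text, already tokenized
text_sentences = [
    ["aspirin", "reduces", "inflammation", "and", "pain"],
    ["ibuprofen", "treats", "fever", "and", "inflammation"],
]
# hybrid "semantic sentences": a term followed by a semantic-type token
semantic_sentences = [
    ["aspirin", "pharmacologic_substance"],
    ["ibuprofen", "pharmacologic_substance"],
    ["fever", "sign_or_symptom"],
]

# skip-gram model trained on the combined corpus
model = Word2Vec(sentences=text_sentences + semantic_sentences,
                 vector_size=50, window=3, min_count=1, sg=1,
                 epochs=50, workers=1, seed=1)

print(model.wv.most_similar("aspirin", topn=3))
```
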
Date: December 2019
Creator: Shaik, Arshad
System: The UNT Digital Library
Boosting for Learning From Imbalanced, Multiclass Data Sets (open access)

In many real-world applications, it is common to have an uneven number of examples among multiple classes. This data imbalance, however, usually complicates the learning process, especially for the minority classes, and results in deteriorated performance. Boosting methods have been proposed to handle the imbalance problem. These methods need elongated training time and require diversity among the classifiers of the ensemble to achieve improved performance. Additionally, extending a boosting method to handle multi-class data sets is not straightforward. Examples of applications that suffer from imbalanced multi-class data can be found in face recognition, where tens of classes exist, and in capsule endoscopy, which suffers from massive imbalance between the classes. This dissertation introduces RegBoost, a new boosting framework to address imbalanced, multi-class problems. The method applies a weighted stratified sampling technique and incorporates a regularization term that accommodates multi-class data sets and automatically determines the error bound of each base classifier. The regularization parameter penalizes the classifier when it misclassifies instances that were correctly classified in the previous iteration. The parameter additionally reduces the bias towards the majority classes. Experiments are conducted using 12 diverse data sets with moderate to high imbalance ratios. The results demonstrate superior performance of the proposed method compared …
Date: December 2013
Creator: Abouelenien, Mohamed
System: The UNT Digital Library
Capacity and Throughput Optimization in Multi-cell 3G WCDMA Networks (open access)

User modeling enables the computation of the traffic density in a cellular network, which can be used to optimize the placement of base stations and radio network controllers as well as to analyze the performance of resource management algorithms towards meeting the final goal: the calculation and maximization of network capacity and throughput for different data rate services. An analytical model is presented for approximating the user distributions in multi-cell third generation wideband code division multiple access (WCDMA) networks using 2-dimensional Gaussian distributions by determining the means and the standard deviations of the distributions for every cell. This model allows for the calculation of the inter-cell interference and the reverse-link capacity of the network. An analytical model for optimizing capacity in multi-cell WCDMA networks is presented. Capacity is optimized for different spreading factors and for perfect and imperfect power control. Numerical results show that the SIR threshold for the received signals is decreased by 0.5 to 1.5 dB due to the imperfect power control. The results also show that the determined parameters of the 2-dimensional Gaussian model match well with traditional methods for modeling user distribution. A call admission control algorithm is designed that maximizes the throughput in multi-cell …
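
A minimal sketch of approximating a cell's user distribution with a 2-dimensional Gaussian by estimating the mean and per-axis standard deviation from user coordinates; the synthetic positions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
# synthetic user positions (km) clustered around a base station at (1.0, 2.0)
users = rng.normal(loc=[1.0, 2.0], scale=[0.4, 0.6], size=(500, 2))

mu = users.mean(axis=0)                 # estimated mean of the distribution
sigma = users.std(axis=0, ddof=1)       # estimated per-axis standard deviation

def gaussian_density(x, y):
    """Separable 2-D Gaussian density with the estimated parameters."""
    z = ((x - mu[0]) / sigma[0]) ** 2 + ((y - mu[1]) / sigma[1]) ** 2
    return np.exp(-0.5 * z) / (2 * np.pi * sigma[0] * sigma[1])

print("mean:", mu, "std:", sigma)
print("density at cell center:", gaussian_density(1.0, 2.0))
```
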
Date: December 2005
Creator: Nguyen, Son
System: The UNT Digital Library
CLUE: A Cluster Evaluation Tool (open access)

Modern high performance computing is dependent on parallel processing systems. Most current benchmarks reveal only high-level computational throughput metrics, which may be sufficient for single-processor systems but can lead to a misrepresentation of true system capability for parallel systems. A new benchmark is therefore proposed. CLUE (Cluster Evaluator) uses a cellular automata algorithm to evaluate the scalability of parallel processing machines. The benchmark also uses algorithmic variations to evaluate individual system components' impact on the overall serial fraction and efficiency. CLUE is not a replacement for other performance-centric benchmarks, but rather shows the scalability of a system and provides metrics to reveal where one can improve overall performance. CLUE is a new benchmark that demonstrates a better comparison among different parallel systems than existing benchmarks and can diagnose where a particular parallel system can be optimized.
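
A minimal sketch of reporting an experimentally determined serial fraction (the Karp-Flatt metric) from serial and parallel run times, the kind of scalability metric a benchmark like CLUE exposes; the timing numbers are made up for illustration.

```python
def serial_fraction(t1, tp, p):
    """Karp-Flatt serial fraction e from serial time t1, parallel time tp,
    and processor count p."""
    speedup = t1 / tp
    return (1.0 / speedup - 1.0 / p) / (1.0 - 1.0 / p)

t_serial = 120.0                       # seconds on one processor (illustrative)
for p, t_parallel in [(2, 63.0), (4, 34.0), (8, 20.0)]:
    e = serial_fraction(t_serial, t_parallel, p)
    print(f"p={p}: speedup={t_serial / t_parallel:.2f}, serial fraction={e:.3f}")
```
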
Date: December 2006
Creator: Parker, Brandon S.
System: The UNT Digital Library

Combinatorial-Based Testing Strategies for Mobile Application Testing

This work introduces three new coverage criteria based on combinatorial-based event and element sequences that occur in the mobile environment. The novel combinatorial-based criteria are used to reduce, prioritize, and generate test suites for mobile applications. The criteria capture unique coverage of events and elements with differing treatment of ordering. For instance, consider the coverage of a pair of events, e1 and e2. The least strict criterion, Combinatorial Coverage (CCov), counts the combination of these two events in a test case without respect to the order in which the events occur; that is, the combination (e1, e2) is the same as (e2, e1). The second criterion, Sequence-Based Combinatorial Coverage (SCov), considers the order of occurrence within a test case: the sequences (e1, ..., e2) and (e2, ..., e1) are different. The third and strictest criterion is Consecutive-Sequence Combinatorial Coverage (CSCov), which counts adjacent sequences of consecutive pairs: the sequence (e1, e2) is counted only if e1 occurs immediately before e2. The first contribution uses the novel combinatorial-based criteria for the purpose of test suite reduction. Empirical studies reveal that the criteria, when used with event sequences and sequences of size t=2, reduce the test suites by 22.8%-61.3% while the reduced …
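
A minimal sketch of counting the three pair-coverage notions for t=2 over a single test case: unordered pairs (CCov), ordered but not necessarily adjacent pairs (SCov), and consecutive pairs (CSCov). The event sequence is illustrative.

```python
from itertools import combinations

def ccov_pairs(test_case):
    """Unordered pairs of distinct events covered, regardless of order."""
    return {frozenset(p) for p in combinations(set(test_case), 2)}

def scov_pairs(test_case):
    """Ordered pairs (e1, e2) such that e1 occurs somewhere before e2."""
    pairs = set()
    for i in range(len(test_case)):
        for j in range(i + 1, len(test_case)):
            if test_case[i] != test_case[j]:
                pairs.add((test_case[i], test_case[j]))
    return pairs

def cscov_pairs(test_case):
    """Ordered pairs covered only when e1 immediately precedes e2."""
    return {(a, b) for a, b in zip(test_case, test_case[1:]) if a != b}

events = ["e1", "e3", "e2", "e1", "e2"]
print("CCov: ", sorted(tuple(sorted(p)) for p in ccov_pairs(events)))
print("SCov: ", sorted(scov_pairs(events)))
print("CSCov:", sorted(cscov_pairs(events)))
```
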
Date: December 2020
Creator: Michaels, Ryan P.
System: The UNT Digital Library
Comparative Study of RSS-Based Collaborative Localization Methods in Wireless Sensor Networks (open access)

In this thesis, two collaborative localization techniques are studied: multidimensional scaling (MDS) and maximum likelihood estimation (MLE). This research synthesizes a new location estimation method by serially integrating the two techniques, such that an estimate is first obtained using MDS and MLE is then employed to fine-tune the MDS solution; the approach is evaluated through various simulation and experimental studies. In the simulations, important issues are addressed, including the effects of sensor node density, reference node density, and different deployment strategies for reference nodes. In the experimental study, a path loss model for indoor environments is developed by determining the environment-specific parameters from the experimental measurement data. The empirical path loss model is then employed in the analysis and simulation of the performance of the collaborative localization techniques.
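
A minimal sketch of the path-loss-model step: fitting the log-distance model RSS(d) = RSS(d0) - 10*n*log10(d/d0) to measurements to recover environment-specific parameters; the measurement values are synthetic placeholders.

```python
import numpy as np

d0 = 1.0                                             # reference distance (m)
distances = np.array([1.0, 2.0, 4.0, 8.0, 16.0])     # measurement distances (m)
rss = np.array([-40.1, -46.3, -52.8, -58.9, -65.4])  # measured RSS (dBm), placeholders

# linear least squares: rss = a + b * log10(d/d0), with path loss exponent n = -b / 10
X = np.column_stack([np.ones_like(distances), np.log10(distances / d0)])
(a, b), *_ = np.linalg.lstsq(X, rss, rcond=None)
n = -b / 10.0

print(f"RSS at d0: {a:.1f} dBm, path loss exponent n: {n:.2f}")
```
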
Date: December 2006
Creator: Koneru, Avanthi
System: The UNT Digital Library

Comparison and Evaluation of Existing Analog Circuit Simulator using Sigma-Delta Modulator

Access: Use of this item is restricted to the UNT Community
In the world of VLSI (very large scale integration) technology, there are many different types of circuit simulators that are used to design and predict circuit behavior before actual fabrication. In this thesis, I compared and evaluated existing circuit simulators by considering standard benchmark circuits. The circuit simulators that I evaluated and explored are Ngspice, Tclspice, Winspice (open source), and Spectre® (commercial). I also tested standard benchmarks using these circuit simulators and compared their outputs. The simulators are evaluated using design metrics in order to quantify their performance and identify efficient circuit simulators. In addition, I designed a sigma-delta modulator and its individual components using the analog behavioral language Verilog-A. Initially, I performed simulations of the individual components of the sigma-delta modulator and later of the whole system. Finally, CMOS (complementary metal-oxide semiconductor) transistor-level circuits were designed for the differential amplifier, operational amplifier, and comparator of the modulator.
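
A minimal behavioral sketch of a first-order sigma-delta modulator (integrator, 1-bit quantizer, feedback), offered only as an illustration of the block the thesis models in Verilog-A; it is written in Python rather than Verilog-A, and the parameters are illustrative.

```python
import math

def sigma_delta(samples):
    """Return the 1-bit output stream of a first-order sigma-delta modulator."""
    integrator = 0.0
    feedback = 0.0
    bits = []
    for x in samples:                       # x assumed in [-1, 1]
        integrator += x - feedback          # accumulate the quantization error
        bit = 1 if integrator >= 0 else 0   # 1-bit quantizer
        feedback = 1.0 if bit else -1.0     # 1-bit DAC feedback: +1 or -1
        bits.append(bit)
    return bits

# oversampled sine input; the local average of the bit stream tracks the input
N, tone_cycles = 2048, 8
signal = [0.5 * math.sin(2 * math.pi * tone_cycles * n / N) for n in range(N)]
bits = sigma_delta(signal)
print("mean of bitstream:", sum(bits) / len(bits))   # ~0.5 for a zero-mean input
```
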
Date: December 2006
Creator: Ale, Anil Kumar
System: The UNT Digital Library
A Control Theoretic Approach for Resilient Network Services (open access)

Resilient networks have the ability to provide the desired level of service despite challenges such as malicious attacks and misconfigurations. The primary goal of this dissertation is to provide uninterrupted network services in the face of an attack or failures. This dissertation applies control system theory techniques, with a focus on system identification and closed-loop feedback control. It explores the benefits of system identification techniques in designing and validating models for complex and dynamic networks. Further, this dissertation focuses on designing robust feedback control mechanisms that are both scalable and effective in real time, employing dynamic and predictive control approaches to reduce the impact of an attack on network services. The closed-loop feedback control mechanisms tackle this issue by degrading the network services gracefully to an acceptable level and then stabilizing the network in real time (less than 50 seconds). Employing these feedback mechanisms also provides the ability to automatically configure settings such that the QoS metrics of the network are consistent with those specified in the service level agreements.
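
A minimal sketch of the system-identification step: fitting a first-order ARX model y[k] = a*y[k-1] + b*u[k-1] to observed input/output data by least squares. The simulated "network" below is an invented stand-in, not the dissertation's model.

```python
import numpy as np

rng = np.random.default_rng(7)
true_a, true_b = 0.8, 0.5
u = rng.uniform(0, 1, 200)              # input (e.g., admitted request rate), placeholder
y = np.zeros(200)                       # output (e.g., response time), placeholder
for k in range(1, 200):
    # simulated plant with a little measurement noise
    y[k] = true_a * y[k - 1] + true_b * u[k - 1] + rng.normal(0, 0.01)

# least-squares estimate of (a, b) from the observed data
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated a, b:", theta)         # should be close to (0.8, 0.5)
```
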
Date: December 2018
Creator: Vempati, Jagannadh Ambareesh
System: The UNT Digital Library
Design and Analysis of Novel Verifiable Voting Schemes (open access)

Free and fair elections are the basis for democracy, but conducting elections is not an easy task. Different groups of people try to influence the outcome of an election in their favor using a range of methods, from campaigning for a particular candidate to well-financed lobbying. Often the stakes are too high, and the methods are illegal. Two main properties of any voting scheme are the privacy of a voter's choice and the integrity of the tally. Unfortunately, they are in tension: integrity requires making elections transparent and auditable, but at the same time, we must preserve a voter's privacy. It is always a trade-off between these two requirements. Current voting schemes favor privacy over auditability and thus are vulnerable to voting fraud. I propose two novel voting systems that can achieve both privacy and verifiability. The first protocol is based on cryptographic primitives to ensure the integrity of the final tally and the privacy of the voter. The second protocol is a simple paper-based voting scheme that achieves almost the same level of security without the use of cryptography.
Date: December 2013
Creator: Yestekov, Yernat
System: The UNT Digital Library

Design and Optimization of Components in a 45nm CMOS Phase Locked Loop

Access: Use of this item is restricted to the UNT Community
A novel scheme for optimizing the individual components of a phase-locked loop (PLL), which is used for stable clock generation and synchronization of signals, is considered in this work. Verilog-A is used for the high-level system design of the main components of the PLL, followed by component-wise optimization. The design of experiments (DOE) approach to optimizing the analog, 45nm voltage controlled oscillator (VCO) is presented. A mixed-signal analysis using the analog and digital Verilog behavior of the components is also studied. Overall, a high-level system design of a PLL, a systematic optimization of each of its components, and an analog and mixed-signal behavioral design approach have been implemented using Cadence custom IC design tools.
Date: December 2006
Creator: Sarivisetti, Gayathri
System: The UNT Digital Library
The Design Of A Benchmark For Geo-stream Management Systems (open access)

The recent growth in sensor technology allows easier information gathering in real time, as sensors have grown smaller, more accurate, and less expensive. The resulting data is often in a geo-stream format: continuously changing input with a spatial extent. Researchers developing geo-stream management systems (GSMSs) require a benchmark system for evaluation, which is currently lacking. This thesis presents GSMark, a benchmark for evaluating GSMSs. GSMark provides a data generator that creates a combination of synthetic and real geo-streaming data, a workload simulator to present the data to the GSMS as a data stream, and a set of benchmark queries that evaluate typical GSMS functionality and query performance. In particular, GSMark generates both moving points and evolving spatial regions, two fundamental data types for a broad range of geo-stream applications, and the geo-streaming queries on this data.
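
A minimal sketch of a moving-point geo-stream generator of the kind a benchmark data generator might produce: each tick emits (id, timestamp, longitude, latitude) tuples for objects performing a bounded random walk; all parameters are illustrative.

```python
import random

def moving_point_stream(num_objects=3, ticks=5, step=0.01, seed=1):
    """Yield (object_id, tick, lon, lat) records simulating moving points."""
    rng = random.Random(seed)
    pos = {i: (rng.uniform(-97.5, -97.0), rng.uniform(33.0, 33.5))   # lon, lat
           for i in range(num_objects)}
    for t in range(ticks):
        for oid, (x, y) in pos.items():
            x += rng.uniform(-step, step)        # small random displacement
            y += rng.uniform(-step, step)
            pos[oid] = (x, y)
            yield (oid, t, round(x, 5), round(y, 5))

for record in moving_point_stream():
    print(record)
```
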
Date: December 2011
Creator: Shen, Chao
System: The UNT Digital Library
Detection and Classification of Heart Sounds Using a Heart-Mobile Interface (open access)

An early detection of heart disease can save lives, caution individuals, and help determine the type of treatment to be given to patients. The first test in diagnosing a heart disease is auscultation, listening to the heart sounds. The interpretation of heart sounds is subjective and requires professional skill to identify the abnormalities in these sounds. A medical practitioner uses a stethoscope to perform an initial screening by listening for irregular sounds from the patient's chest. Later, echocardiography and electrocardiography tests are taken for further diagnosis. However, these tests are expensive and require specialized technicians to operate. A simple and economical method is vital for monitoring in home care, rural hospitals, and urban clinics. This dissertation is focused on developing a patient-centered device for initial screening of heart sounds that is low cost, can be used by the users on themselves, and allows the readings to be shared later with healthcare providers. An innovative mobile health service platform is created for analyzing and classifying heart sounds. Certain properties of heart sounds have to be evaluated to identify irregularities, such as the number of heart beats and gallops, intensity, frequency, and duration. Since …
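
A minimal sketch of one of the properties mentioned above, counting heart beats from a phonocardiogram-style signal with an energy envelope and peak detection; the synthetic signal and thresholds are illustrative, not the dissertation's pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 1000                                     # sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)                   # 5 seconds of signal
# synthetic "lub" bursts at ~72 beats per minute plus background noise
signal = 0.05 * np.random.default_rng(3).normal(size=t.size)
for beat_time in np.arange(0.3, 5.0, 60 / 72):
    idx = int(beat_time * fs)
    signal[idx:idx + 40] += np.hanning(40)

envelope = np.convolve(signal ** 2, np.ones(50) / 50, mode="same")  # energy envelope
peaks, _ = find_peaks(envelope, height=0.1, distance=int(0.4 * fs))  # refractory gap
bpm = len(peaks) / (t[-1] - t[0]) * 60
print(f"detected {len(peaks)} beats, approx {bpm:.0f} BPM")
```
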
Date: December 2016
Creator: Thiyagaraja, Shanti
System: The UNT Digital Library
Detection of Generalizable Clone Security Coding Bugs Using Graphs and Learning Algorithms (open access)

This research methodology isolates coding properties and identifies the probability of security vulnerabilities using machine learning and historical data. Several approaches characterize the effectiveness of detecting security-related bugs that manifest as vulnerabilities, but none utilize vulnerability patch information. The main contribution of this research is a framework that analyzes LLVM intermediate representation code and merges core source code representations using source code properties. This research is beneficial because it allows source programs to be transformed into a graphical form and users can extract specific code properties related to vulnerable functions. The result is an improved approach to detect, identify, and track software system vulnerabilities based on a performance evaluation. The methodology uses historical function-level vulnerability information, unique feature extraction techniques, a novel code property graph, and learning algorithms to minimize the amount of end-user domain knowledge necessary to detect vulnerabilities in applications. The analysis shows approximately 99% precision and recall in detecting known vulnerabilities in the National Institute of Standards and Technology (NIST) Software Assurance Metrics and Tool Evaluation (SAMATE) project. Furthermore, 72% of the historical vulnerabilities in the OpenSSL testing environment were detected using a linear support vector classifier (SVC) model.
Date: December 2018
Creator: Mayo, Quentin R
System: The UNT Digital Library
Detection of Ulcerative Colitis Severity and Enhancement of Informative Frame Filtering Using Texture Analysis in Colonoscopy Videos (open access)

There are several types of disorders that affect our colon's ability to function properly, such as colorectal cancer, ulcerative colitis, diverticulitis, irritable bowel syndrome, and colonic polyps. Automatic detection of these diseases would inform the endoscopist of possible sub-optimal inspection during the colonoscopy procedure as well as save time during post-procedure evaluation. However, existing systems detect only a few of these disorders, such as colonic polyps. In this dissertation, we address the automatic detection of another important disorder, ulcerative colitis. We propose a novel texture feature extraction technique to detect the severity of ulcerative colitis at the block, image, and video levels. We also enhance current informative frame filtering methods by detecting water and bubble frames using our proposed technique. Our feature extraction algorithm, based on the accumulation of pixel value differences, provides better accuracy at a faster speed than existing methods, making it highly suitable for real-time systems. We also propose a hybrid approach in which our feature method is combined with existing feature method(s) to provide even better accuracy. We extend the block and image level detection method to video level severity score calculation and shot segmentation. Also, the proposed novel feature extraction method can detect water and bubble frames …
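
A minimal sketch of a block-level texture feature that accumulates pixel value differences between neighboring pixels; this is a generic reading of the idea, not the dissertation's exact feature definition.

```python
import numpy as np

def accumulated_pixel_difference(block):
    """Sum of absolute differences between horizontally and vertically adjacent
    pixels, normalized by the number of pixel pairs in the block."""
    block = block.astype(np.float64)
    horiz = np.abs(np.diff(block, axis=1)).sum()
    vert = np.abs(np.diff(block, axis=0)).sum()
    num_pairs = (block.shape[0] * (block.shape[1] - 1)
                 + (block.shape[0] - 1) * block.shape[1])
    return (horiz + vert) / num_pairs

rng = np.random.default_rng(0)
smooth_block = np.full((32, 32), 128) + rng.integers(-2, 3, (32, 32))   # low texture
noisy_block = rng.integers(0, 256, (32, 32))                            # high texture
print("smooth:  ", accumulated_pixel_difference(smooth_block))
print("textured:", accumulated_pixel_difference(noisy_block))
```
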
Date: December 2015
Creator: Dahal, Ashok
System: The UNT Digital Library
Direct Online/Offline Digital Signature Schemes. (open access)

Online/offline signature schemes are useful in many situations, and two such scenarios are considered in this dissertation: bursty server authentication and embedded device authentication. In this dissertation, new techniques for online/offline signing are introduced and applied in a variety of ways to create online/offline signature schemes, and five different online/offline signature schemes are proposed and proved secure under a variety of models and assumptions. Two of the proposed five schemes have the best offline or best online performance of any currently known technique, and they are particularly well suited to the scenarios considered in this dissertation. To determine whether the proposed schemes provide the expected practical improvements, a series of experiments were conducted comparing the proposed schemes with each other and with other state-of-the-art schemes in this area, both on a desktop-class computer and under AVR Studio, a simulation platform for an 8-bit processor that is popular for embedded systems. Under AVR Studio, the proposed SGE scheme, using a typical key size for the embedded device authentication scenario, can complete the offline phase in about 24 seconds and then produce a signature (the online phase) in 15 milliseconds, which is the best offline performance of any known …
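
A minimal sketch of the generic online/offline split in a Schnorr-style signature, where the commitment is precomputed offline and the online phase is a hash plus one modular multiply-add. This illustrates the general technique only, not the proposed SGE scheme, and the toy group parameters are far too small for real security.

```python
import hashlib
import secrets

p, q, g = 23, 11, 2                 # tiny subgroup: g has order q modulo p (toy only)
x = secrets.randbelow(q - 1) + 1    # private key
y = pow(g, x, p)                    # public key

def offline_phase():
    """Precompute (k, r) before the message is known."""
    k = secrets.randbelow(q - 1) + 1
    return k, pow(g, k, p)

def online_phase(message, k, r):
    """Fast online step once the message arrives: one hash, one multiply-add mod q."""
    e = int.from_bytes(hashlib.sha256(message + str(r).encode()).digest(), "big") % q
    s = (k + x * e) % q
    return r, s

def verify(message, r, s):
    e = int.from_bytes(hashlib.sha256(message + str(r).encode()).digest(), "big") % q
    return pow(g, s, p) == (r * pow(y, e, p)) % p

k, r = offline_phase()                      # done ahead of time
signature = online_phase(b"hello", k, r)    # cheap when the message arrives
print(verify(b"hello", *signature))         # True
```
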
Date: December 2008
Creator: Yu, Ping
System: The UNT Digital Library
Energy-Aware Time Synchronization in Wireless Sensor Networks (open access)

I present a time synchronization algorithm for wireless sensor networks that aims to conserve sensor battery power. The proposed method creates a hierarchical tree by flooding the sensor network from a designated source point. It then uses a hybrid algorithm derived from the Timing-sync Protocol for Sensor Networks (TPSN) and the Reference Broadcast Synchronization (RBS) method to periodically synchronize sensor clocks while minimizing energy consumption. In multi-hop ad-hoc networks, a depleted sensor drops information from all other sensors that route data through it, decreasing the physical area being monitored by the network. The proposed method uses several techniques and thresholds to maintain network connectivity. A new root sensor is chosen when the current one's battery power decreases to a designated value. I implement this new synchronization technique using MATLAB and show that it can provide significant power savings over both TPSN and RBS.
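
A minimal sketch of the pairwise two-way message exchange used by TPSN-style synchronization, where a child node computes its clock offset and the propagation delay from four timestamps; the timestamps are invented for illustration.

```python
def tpsn_offset_delay(t1, t2, t3, t4):
    """t1: child sends (child clock), t2: parent receives (parent clock),
    t3: parent replies (parent clock), t4: child receives (child clock).
    Returns (offset of parent clock relative to child, one-way propagation delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# example: parent clock is 10 ms ahead of the child, one-way delay is 2 ms
t1, t2, t3, t4 = 100.0, 112.0, 115.0, 107.0
offset, delay = tpsn_offset_delay(t1, t2, t3, t4)
print(f"offset = {offset} ms, delay = {delay} ms")   # offset=10.0, delay=2.0
child_clock_corrected = t4 + offset                  # child adjusts its clock to the parent
```
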
Date: December 2006
Creator: Saravanos, Yanos
System: The UNT Digital Library