Dissertations / Theses on the topic 'Global Data Synchronization Network'

Consult the top 50 dissertations / theses for your research on the topic 'Global Data Synchronization Network.'

1

Barnes, Richard Neil. "Global synchronization of asynchronous computing systems." Master's thesis, Mississippi State : Mississippi State University, 2001. http://library.msstate.edu/etd/show.asp?etd=etd-10262001-094922.

2

Bort, Tomáš. "Product Information Management - bohatství ukryté v datech o produktu." Master's thesis, Vysoká škola ekonomická v Praze, 2008. http://www.nusl.cz/ntk/nusl-12434.

Abstract:
The excess of supply over demand and fierce competition are nowadays the main features of most sectors. A successful company is one that can satisfy specific customer needs, cooperate efficiently with its suppliers throughout the whole supply chain, and speed up in-house information exchange. The company must therefore constantly seek new and innovative solutions, which is not possible without standardization and automation of business processes. This master's thesis is dedicated to one such solution: Product Information Management (PIM). Since it is intended for business managers without deep IT knowledge, it begins by answering why it is so important to know and manage master data. It focuses on managing product data, gives a comprehensive overview of the field, and identifies the advantages and drawbacks of implementation as well as the financial and organizational impacts. The following chapter presents a simplified yet applicable approach to data management analysis (with emphasis on PIM) and, based on research, notes the main implementation mistakes. In addition to an overview of the main vendors of PIM solutions, it presents the latest trends in PIM. Besides internal data synchronization, the thesis analyses several product standards, the fundamental step towards external data synchronization, which is the key topic of the practical part. The whole thesis is conceived to provide an organization with a simple yet compact, and therefore very effective, tool for insight into master product data and thus help it gain a competitive advantage.
3

Dehmelt, Chris. "A Hybrid Data Acquisition Architecture on the CH-53K Program." International Foundation for Telemetering, 2010. http://hdl.handle.net/10150/604252.

Abstract:
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California
As today's flight test programs' need for sensor and bus data continues to increase, there have been associated requirements to provide modern system output products and support higher encoder data rates. The CH-53K Heavy Lift Replacement (HLR) Program is an example in which the instrumentation data requirements have increased significantly over previous helicopter programs and necessitated the introduction of new technologies and capabilities. The CH-53K Program utilizes a hybrid system architecture that combines the benefits of legacy PCM and modern networked system architectures. The system maintains the required system-wide synchronized sampling capabilities while providing real-time data access and system control over a vehicle network. Serial Streaming Telemetry (SST)-to-vNET Adapters are employed to enable many of these capabilities. This paper describes the instrumentation requirements for the CH-53K program and the features, tools and performance of its data acquisition system, which addressed all requirements while minimizing the overall impact to the existing instrumentation infrastructure.
4

Sarkar, Souradip. "Multiple clock domain synchronization for network on chips." Online access for everyone, 2007. http://www.dissertations.wsu.edu/Thesis/Fall2007/S_Sarkar_112907.pdf.

5

Sheriff, Nathirulla. "Time Synchronization In ANT Wireless Low Power Sensor Network." Thesis, Tekniska Högskolan, Högskolan i Jönköping, JTH, Data- och elektroteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-15068.

Abstract:
Short-range wireless data communication networks used for sport and health care are sometimes called Wireless Body Area Networks (WBANs); they are located more or less on a person. Sole Integrated Gait Sensor (SIGS) is a research project in WBAN in which wireless pressure sensors are placed like soles in the shoes of persons with different kinds of diseases. The sensors measure the pressure of the foot relative to the shoe, i.e. the load on the two legs. This information can be useful, for example, to avoid over- or under-loading a leg after joint replacement, or in a biofeedback system that helps post-stroke patients avoid falling. SIGS uses the ANT protocol and radio specification. ANT uses the 2.4 GHz ISM band, and TDMA is used to share a single frequency. The scheduling of time slots is adaptive isochronous co-existence: the schedule is not static, and each transmitter sends periodically but checks for interference with other traffic on the radio channel. In this unidirectional system the sole sensors are masters (transmitters) and the WBAN server is the slave, in the ANT sense. The message rate is chosen as 8 Hz, which is suitable for low power consumption. Because of this low message rate, the left and right foot sensors in the SIGS system must be synchronized. In this thesis we found a method, and developed a prototype, for receiving time-synchronized data in the WBAN server from the ANT wireless sensor nodes in the SIGS system. A hardware prototype was designed, and the USB and USART communication protocols were implemented on it. The chosen time synchronization method was then implemented on the prototype: it receives the sensor data, checks for the correct stream of data, adds a timestamp to the sensor data, and transmits the data to the Linux WBAN server. The time-slot allocation in the ANT protocol was determined, and an alternative solution for time synchronization in the ANT protocol was also provided. The whole SIGS system was tested for full functionality. The experiments and analysis were successful, and the results provided a good time synchronization protocol for the ANT low-power wireless sensor network and for the wireless biofeedback system.
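As a rough illustration of the receiver-side approach the abstract describes (timestamping incoming ANT messages on the server and aligning the two 8 Hz foot-sensor streams), the sketch below pairs each left-foot sample with the nearest-in-time right-foot sample. The function name, data layout, and tolerance are illustrative assumptions, not taken from the thesis.

```python
import bisect

def pair_samples(left, right, tol):
    """Pair each left-foot sample (time, value) with the nearest-in-time
    right-foot sample, discarding pairs farther apart than `tol` seconds.
    Assumes both lists are sorted by timestamp and `right` is non-empty."""
    times = [t for t, _ in right]
    pairs = []
    for t, value in left:
        i = bisect.bisect_left(times, t)
        # the nearest neighbour is one of the two samples around the
        # insertion point
        gap, j = min((abs(times[j] - t), j)
                     for j in (i - 1, i) if 0 <= j < len(times))
        if gap <= tol:
            pairs.append((t, value, right[j][1]))
    return pairs
```

At an 8 Hz message rate a tolerance of about half the 125 ms sample period is a natural choice.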
6

Grubinger, Michael, and Felix Strohmeier. "AUTONOMOUS ACQUISITION OF ENVIRONMENTAL DATA IN A GLOBAL NETWORK ENVIRONMENT." International Foundation for Telemetering, 2001. http://hdl.handle.net/10150/607597.

Abstract:
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada
This paper presents the results of a feasibility study undertaken by the University of Salzburg (Austria), investigating the autonomous acquisition of environmental data in a global network. A suggested application, used as the basis of this paper, is a volcano monitoring system that would track the activity of a volcano and act as a disaster warning system. The volcano observation data required for such a system is covered before discussing the concepts for sensor data acquisition, storage and processing. A final analysis is then presented of the opportunities for transmission by packet radio (both terrestrial and satellite).
7

Coffey, Thomas. "A distributed global-wide security system." Thesis, University of Ulster, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.260989.

8

Gorang, Brandon Paul. "Scheduling a global engine maintenance network." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/104398.

Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Thesis: S.M. in Engineering Systems, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 83-84).
This thesis addresses the allocation of gas turbine aircraft engines to maintenance facilities. Scheduling a global engine maintenance network can be very complex and challenging. This project pertains particularly to the V2500 IAE engine maintenance network managed by Pratt & Whitney. Using a mathematical program to automate engine allocation was expected to reduce the workload on the organization and the cost of maintaining the 3,100-engine fleet. An introduction to the engine maintenance network is given, along with an explanation of Fleet Hour Agreements (FHAs). A literature review of mathematical programming provides the pertinent background. The current state of the business is analyzed, and an integer linear program is developed to represent it closely. Historical data was used to feed the model, and the model's outputs were compared to actuals. A sensitivity analysis is performed to better understand the constraints of the current business and the feasibility of the model. The conclusion is that an optimization model should not be used to plan engine maintenance given the current state of the business: the business is too dynamic, and the network is highly constrained by capacity. The results also show much smaller savings than originally expected, mostly because the cost of maintaining the engines at the different shops varies much less than anticipated. The current state is operating close to optimal, with great flexibility, and should continue as is.
by Brandon Paul Gorang.
M.B.A.
S.M. in Engineering Systems
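The integer linear program itself is not reproduced in the abstract. As a toy stand-in for the allocation problem it solves, the following sketch exhaustively searches for the cheapest one-to-one assignment of engines to shops; the function name and cost matrix are hypothetical, and a real network of this size would need an ILP solver rather than brute force.

```python
from itertools import permutations

def cheapest_assignment(cost):
    """Exhaustively find the cheapest one-to-one assignment of engines
    (rows) to shops (columns).  Fine for toy sizes only; the thesis uses
    an integer linear program for the real network."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][s] for i, s in enumerate(best))
```

A small cost matrix makes the idea concrete: with `[[4, 1], [2, 3]]`, sending engine 0 to shop 1 and engine 1 to shop 0 is cheapest.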
9

Wang, Xiaobo. "Collaboration Instance Manager of UbiCollab 2008 : Collaboration Instance Synchronization and Management in P2P network." Thesis, Norwegian University of Science and Technology, Department of Computer and Information Science, 2008. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-9714.

Abstract:

This report covers my research on the Collaboration Instance Manager (CIM) of the UbiCollab project. UbiCollab aims to be a platform for ubiquitous collaborative activity: a distributed collaborative platform that lets people in distributed spaces collaborate with friends and colleagues. The Collaboration Instance Manager is a core component of the UbiCollab platform and manages such collaborative activities. My research topics within CIM include developing the P2P network using JXME, synchronizing data over this P2P network, and managing the synchronized data using a local file system. The result of my research is a CIM system deployed as an OSGi bundle, which users can employ for collaborative activities. The CIM system provides data synchronization at the service level, so other modules and applications can synchronize data with each other without knowing the implementation details. To that end I first reviewed the related theory of distributed systems, ubiquitous systems, mobile systems and CSCW. I then investigated alternatives for developing such a system and chose the candidate technologies for my prototype. Next I analyzed the requirements of UbiCollab and designed the prototype. Based on that design, I implemented and tested the CIM system against agreed common scenarios and developed a simple GUI to demonstrate its utility. Finally, I evaluated the system against the system requirements and scenario criteria.

10

Durbeck, Lisa J. "Global Energy Conservation in Large Data Networks." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/78291.

Abstract:
Seven to ten percent of the energy used globally goes towards powering information and communications technology (ICT): the global data and telecommunications network, the private and commercial datacenters it supports, and the 19 billion electronic devices around the globe that it interconnects, through which we communicate and access and produce information. As bandwidth and data rates increase, so does the volume of traffic, as well as the absolute amount of new information digitized and uploaded onto the Net and into the cloud each second. Words like gigabit and terabyte were absent from the public arena fifteen years ago; now they are common phrases. As people use their networked devices to do more, to access more, to send more, and to connect more, they use more energy, not only in their own devices but throughout the ICT. While there are many endeavors focused on individual low-power devices, few examine broad strategies that cross the many boundaries of separate concerns within the ICT, and few assess the impact of specific strategies on the global energy supply. This work examines the energy savings of several such strategies and assesses their efficacy in reducing energy consumption, both within specific networks and within the larger ICT. All of these strategies save energy by reducing the work the system as a whole does on behalf of a single user, often by exploiting commonalities among what many users around the globe are doing in order to amortize the costs.
Ph. D.
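The amortization idea in the abstract, many users sharing the result of work done once, can be illustrated with a toy shared cache. The class and workload below are hypothetical, not from the dissertation; they only show how a thousand identical requests collapse into one expensive fetch.

```python
class SharedCache:
    """Serve many requests from one cached copy: the expensive `fetch`
    runs once per distinct key, however many users ask for it."""
    def __init__(self, fetch):
        self.fetch = fetch
        self.store = {}
        self.misses = 0

    def get(self, key):
        if key not in self.store:
            self.misses += 1
            self.store[key] = self.fetch(key)
        return self.store[key]
```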
11

Martin, Fredric W. "DISTRIBUTED ARCHITECTURE FOR A GLOBAL TT&C NETWORK." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/608562.

Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
Use of top-down design principles and standard interface techniques provides the basis for a global telemetry data collection, analysis, and satellite control network with a high degree of survivability via a distributed architecture. Use of Commercial Off-The-Shelf (COTS) hardware and software minimizes costs and provides for easy expansion and adaptation to new satellite constellations. Adaptive techniques and low-cost multiplexers provide for graceful system-wide degradation and flexible data distribution.
12

Newton, Todd, Evan Grim, and Myron Moodie. "Considerations for Deploying IEEE 1588 V2 in Network-Centric Data Acquisition and Telemetry Systems." International Foundation for Telemetering, 2008. http://hdl.handle.net/10150/606167.

Abstract:
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California
Network-centric architectures continue to gain acceptance in data acquisition and telemetry systems. Though networks by nature impose a non-deterministic transit time on data through a given link, the IEEE 1588 standard provides a means to remove this jitter by distributing time messages to the data acquisition units themselves. But standards evolve over time, and IEEE 1588 is no exception: its second version is being released later this year. This paper discusses the challenges of the first version of the IEEE 1588 standard that Version 2 set out to address, potential challenges with Version 2, and interoperability issues that may arise when incorporating a mixture of Version 1 and Version 2 devices.
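For context, the jitter removal the paper refers to rests on IEEE 1588's timestamped two-way exchange: under the standard's symmetric-path assumption, the slave's clock offset and the one-way delay follow directly from four timestamps. A minimal sketch of that arithmetic (not code from the paper):

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """IEEE 1588 two-step exchange: the master sends Sync at t1, the
    slave receives it at t2; the slave sends Delay_Req at t3, the master
    receives it at t4.  Assuming a symmetric path:
        offset = ((t2 - t1) - (t4 - t3)) / 2
        delay  = ((t2 - t1) + (t4 - t3)) / 2"""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay
```

For example, a slave running 5 time units ahead over a 2-unit path yields exactly those values from the exchange.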
13

Thakor, Mitali Nitish. "Algorithmic detectives against child trafficking : data, entrapment, and the new global policing network." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/107039.

Abstract:
Thesis: Ph. D. in History, Anthropology, and Science, Technology and Society (HASTS), Massachusetts Institute of Technology, Program in Science, Technology and Society, 2016.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 244-268).
My dissertation explores how "anti-trafficking" has emerged as a global network of humanitarian professionals, law enforcement, and software companies collaborating to address the issue of child exploitation and trafficking online. I argue that the anti-trafficking network consolidates expertise through a shared moralizing politics of bureaucracy and carceral sensibility of securitization. This network mobilizes the issue of child protection to expand the reach of technologies of search and prediction, and to afford legitimation to a newly normalized level of digital surveillance. My findings are based on three years of ethnographic fieldwork and interviews with the United Nations and anti-trafficking organizations in Thailand, with a child protection NGO and police in the Netherlands, and with software companies and law enforcement in the United States. I use two case studies to support my argument that the child protection movement has motivated the expansion of digital policing and surveillance: 1) image detection software developed in collaboration between social media and software companies and international law enforcement organizations; and 2) the design and deployment of a 3D moving avatar of a photorealistic girl used in a child sex exploitation sting operation by an NGO working with an advertising firm. I draw from queer feminist phenomenology to introduce 'proximity' as a governing concept for understanding expert sociality and digital surveillance. Child protection operates in a global affective economy of fear, in which the risk of violence is always anticipated and close. The new global policing network keeps exploitation proximate through the humanitarian ideology of emancipation that motivates child protection, and through publicity of technological campaigns, in order to produce public acquiescence to the spectacles of digital surveillance, shaming, and punishment.
by Mitali Nitish Thakor.
Ph. D. in History, Anthropology, and Science, Technology and Society (HASTS)
14

Wåhslén, Jonas, Ibrahim Orhan, Dennis Sturm, and Thomas Lindh. "Performance Evaluation of Time Synchronization and Clock Drift Compensation in Wireless Personal Area Network." KTH, Data- och elektroteknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-123218.

Abstract:
Efficient algorithms for time synchronization, including compensation for clock drift, are essential in order to obtain reliable fusion of data samples from multiple wireless sensor nodes. This paper evaluates the performance of algorithms based on three different approaches: one that synchronizes the local clocks on the sensor nodes, a second that uses a single clock on the receiving node (e.g. a mobile phone), and a third that uses broadcast messages. The performance of the synchronization algorithms is evaluated in wireless personal area networks, especially Bluetooth piconets and ZigBee/IEEE 802.15.4 networks. A new approach to compensating for clock drift and a real-time implementation of single-node synchronization from the mobile phone are presented and tested. Finally, applications of data fusion and time synchronization are shown in two different use cases: a kayaking sports case, and monitoring of the heart and respiration of prematurely born infants.
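The abstract does not detail the paper's drift-compensation scheme. A common baseline against which such schemes are judged is a least-squares linear fit mapping a sensor node's local clock to the receiver's clock, where the slope captures drift and the intercept the initial offset. A sketch under that assumption (names are illustrative):

```python
def fit_drift(local, reference):
    """Least-squares line reference ~= a*local + b: the slope `a`
    captures relative clock drift, the intercept `b` the offset."""
    n = len(local)
    mx = sum(local) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in local)
    sxy = sum((x - mx) * (y - my) for x, y in zip(local, reference))
    a = sxy / sxx
    return a, my - a * mx
```

Once fitted, a node timestamp `t` is corrected to `a * t + b` before fusing samples from different nodes.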


15

Ahmed, Faizan. "Global IoT Coverage Through Aerial And Satellite Network." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-281245.

Abstract:
The Internet of Things (IoT) and Machine Type Communication (MTC) have gained momentum in the last few years but have yet to penetrate our daily life fully. A general framework providing global network coverage would make this possible. Non-terrestrial networks comprised of satellites and aerial platforms are expected to provide next-generation communication services in underserved and unserved areas, ensuring a quality of service that existing terrestrial networks cannot deliver owing to economic and geographical limitations. The aim of this thesis is to formulate a set of massive and critical MTC use cases, such as global environment monitoring, tracking of shipping containers and smart agriculture; to assess their requirements, such as data size, sensor node density and uplink capacity; and to discuss possible network architectures and deployments focusing on satellite or aerial networks. Three network architectures addressing the requirements are discussed: (1) a Low Earth Orbit (LEO) satellite based network, (2) a High Altitude Platform (HAP) based network, and (3) a combined HAP and UAV based network. The proposed network architectures were simulated and analyzed using MATLAB tools for the respective use cases in terms of the required number of satellites or aerial platforms, and architectures were selected for each use case on the basis of the minimum number of satellites or aerial platforms. The results show that a LEO constellation of 260 satellites is feasible, in terms of deployment and management, for a global environment monitoring network; similarly, 1440 LEO satellites provide global coverage for tracking shipping containers. The smart agriculture use case requires high throughput, so an integrated HAP and UAV network architecture is more realistic for a fully autonomous system than the other architectures. Cooperative control and management of a set of agricultural machines can be performed at the UAV. Simulation results show that a single UAV can command and control the smart agricultural machines in a one square kilometer crop field and can send a summary of events to the central station via a HAP.
16

Dehmelt, Chris. "Integration of Smart Sensor Buses into Distributed Data Acquisition Systems." International Foundation for Telemetering, 2005. http://hdl.handle.net/10150/604924.

Abstract:
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada
As requirements for the amount of test data continue to increase, instrumentation engineers are under pressure to deploy data acquisition systems that reduce the amount of associated wiring and overall system complexity. Smart sensor buses have long been considered one approach to this issue: the appropriate signal conditioners are placed close to their respective sensors and provide data back over a common bus. However, the inability to adequately synchronize the operation of the sensor bus to the system master, which is required to correlate analog data measurements, has precluded their use. The ongoing development and deployment of smart sensor buses has reached the phase in which integration into a larger data acquisition system environment must be considered. Smart sensor buses such as IntelliBus™ have their own unique mode of operation based on a pre-determined sampling schedule, which, however, is typically asynchronous to the operation of the (master or controller) data acquisition system and must be accounted for when attempting to synchronize the two systems. IRIG Chapter 4 methods for inserting data into a format, as exemplified by the handling of MIL-STD-1553 data, could be employed, with the disadvantage of losing any knowledge of when a particular measurement was sampled unless it is time-stamped (similar to the time stamping used to mark receipt of 1553 command words). This can result in excessive time data, as each sensor bus can manage a large number of analog sensor inputs and multiple sensor buses must be accommodated by the data acquisition system. The paper provides an example, using the Boeing-developed IntelliBus system and the L3 Communications - Telemetry East NetDAS system, of how correlated data can be acquired from a smart sensor bus as a major subsystem component of a larger integrated data acquisition system. The focus is specifically on how the IntelliBus schedule can be synchronized to that of the NetDAS formatter. Sample formats are provided, along with a description of how a standalone NetDAS stack and an integrated NetDAS-IntelliBus system would be programmed to create the required output, taking into account the unique sampling characteristics of the sensor bus.
17

Igugu, Onajite Johnson. "LAPSync : a Location-Aware Protocol for Remote File Synchronization." Thesis, Högskolan Väst, Institutionen för ingenjörsvetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:hv:diva-4460.

Abstract:
Commercial file synchronization services (FSS) rely entirely on protocols that use a remote central server, often located in the cloud, to host important files. Updates at user computers are propagated to the central server and from there to other sources in need of such updates. A synchronization operation between two computers located on the same network therefore often results in file data being transmitted to and from that local network at least twice. This introduces unnecessary bandwidth usage on a user's local network, which has become an issue as demands on internet resources grow. This thesis presents a new file synchronization protocol for FSS known as LAPSync (location-aware protocol for remote file synchronization), and proposes a hierarchical synchronization mechanism built on it. Our solution relies on the ability of LAPSync clients to acquire knowledge about the location of the clients participating in a synchronization operation and to construct a hierarchical synchronization path from it. Finally, we implement the protocol in a prototype and conduct experiments comparing it with Dropbox (a popular file synchronization service). The results show that LAPSync reduces bandwidth usage when the files to be synchronized exist on the same local network.
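The core idea, preferring a peer on the same local network over a round trip through the cloud, can be sketched as a simple subnet test. The function, the fallback hostname, and the /24 prefix are illustrative assumptions; LAPSync's actual peer discovery and path construction are specified in the thesis itself.

```python
import ipaddress

def sync_peer(client_ip, peer_ips, cloud="cloud.example.com", prefix=24):
    """Return a sync partner on the client's own subnet if one exists,
    otherwise fall back to the central cloud server."""
    net = ipaddress.ip_network(f"{client_ip}/{prefix}", strict=False)
    for ip in peer_ips:
        if ipaddress.ip_address(ip) in net:
            return ip
    return cloud
```

Syncing with a same-subnet peer keeps the file data inside the local network, which is exactly where the bandwidth saving comes from.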
18

Sharaf, Taysseer. "Statistical Learning with Artificial Neural Network Applied to Health and Environmental Data." Scholar Commons, 2015. http://scholarcommons.usf.edu/etd/5866.

Abstract:
The current study illustrates the use of artificial neural networks in statistical methodology, specifically in survival analysis and time series analysis, both of which have important and wide use in many real-life applications. We begin with survival analysis. The literature contains two important methodologies for applying artificial neural networks to survival analysis based on the discrete survival time method. We illustrate the idea of the discrete survival time method and show how one can estimate the discrete model using an artificial neural network. We compare the two methodologies and extend one of them to estimate survival time under competing risks. Fitting a model with an artificial neural network involves two parts: the network architecture and the learning algorithm. Neural networks are usually trained with a non-linear optimization algorithm, such as a quasi-Newton method; other learning algorithms are based on Bayesian inference. In this study we present a new learning technique that mixes the two available methodologies for using Bayesian inference in the training of neural networks. Our analysis uses real-world data: patients diagnosed with skin cancer in the United States, from the SEER database maintained under the supervision of the National Cancer Institute. The second part of this dissertation applies artificial neural networks to time series analysis. We present a new method of training a recurrent artificial neural network with Hybrid Monte Carlo sampling and compare our findings with the popular auto-regressive integrated moving average (ARIMA) model, using monthly average carbon dioxide emission data collected from NOAA.
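In the discrete survival time method mentioned above, each time interval gets a hazard, typically a logistic transform of a network output, and the survival probability is the running product of one minus the hazard. A minimal sketch of that bookkeeping (illustrative only, not the author's implementation):

```python
import math

def survival_curve(scores):
    """Discrete-time survival from per-interval network scores: the
    hazard for interval t is sigmoid(scores[t]); survival through the
    end of interval t is the running product of (1 - hazard)."""
    out, s = [], 1.0
    for z in scores:
        hazard = 1.0 / (1.0 + math.exp(-z))
        s *= 1.0 - hazard
        out.append(s)
    return out
```

A score of 0 gives a hazard of 0.5 in every interval, so survival halves each step; a trained network would instead emit one score per interval per patient.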
APA, Harvard, Vancouver, ISO, and other styles
19

Jalilov, Orkhan, Thomas Köhler, Manjula Vithanapathirana, Shironica P. Karunanayaka, Sandra Hummel, and Bridget Sheehan. "EdTec Implementation in a global higher education network: Empirical data from a field study in South Asia." TUDpress, 2020. https://tud.qucosa.de/id/qucosa%3A73594.

Full text
Abstract:
This paper examines the appropriateness of using educational technologies to increase the flexibility of learning in a global higher education network in South Asia. The integration of information and communication technology (ICT) into education is widely perceived as an essential aspect of teaching and learning in contemporary society and is therefore embodied in education policies across many countries, Cambodia and Sri Lanka included. The authors consider the argument that while interactive educational technologies may be appropriate in countries in which self-directed study and student autonomy are emphasised, their appropriateness is less obvious where education has traditionally been more tightly structured and teacher-directed, as in South Asian countries; this is why the paper examines government policies toward the use of educational technologies in higher education in Cambodia and Sri Lanka. Qualitative analyses of both the needs and the challenges of introducing and implementing ICT in these particular cultural contexts are considered as preconditions for an effective implementation of Higher Education (HE) skill development. Subsequently, a plan is presented for how to implement EdTec in that HE network to trigger awareness about further steps of the recent measure.
APA, Harvard, Vancouver, ISO, and other styles
20

Bommakanti, Hemanth Ram Kartik. "Impact of Time Synchronization Accuracy in Integrated Navigation Systems." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-260239.

Full text
Abstract:
Global Navigation Satellite System/Inertial Measurement Unit (GNSS/IMU) Integrated Navigation Systems (INS) integrate the positive features of GNSS and IMU for optimal navigation guidance in high accuracy outdoor navigation systems, for example using Extended Kalman Filter (EKF) techniques. Time synchronization of IMU data with precise GNSS-based time is necessary to accurately synchronize the two systems. This must be done in real-time for time sensitive navigation applications such as autonomous vehicles. The research is done in two parts. The first part is the simulation of inaccurate time-stamping in a single axis of nonlinear input data in a gyroscope and an accelerometer, to obtain the timing error value that is tolerable by a high accuracy GNSS/INS system. The second part is the creation of a real-time algorithm using an STM32 embedded system enabled with the FreeRTOS real-time kernel for a GNSS receiver and antenna, along with an IMU sensor. A comparative analysis of the time-synchronized system and an unsynchronized system is done based on the errors produced using gyroscope and accelerometer readings along a single axis from the IMU sensor, by conducting static and rotational tests on a revolving chair. The simulation concludes that a high accuracy GNSS/INS system can tolerate a timing error of up to 1 millisecond. The real-time solution provides IMU data paired with updated GNSS-based time-stamps every 5 milliseconds. The timing jitter is reduced to a range of ±1 millisecond. Analysis of the final angular rotation error and final position error from gyroscope and accelerometer readings, respectively, indicates that the real-time algorithm produces a reduction in errors when the system is static, but there is no statistical evidence showing the reduction of errors from the results of the rotational tests.
GNSS / IMU integrerade navigationssystem kombinerar de positiva egenskaperna hos GNSS och IMU för optimal prestanda i noggranna navigationssystem. Detta görs med hjälp av sensorfusion, till exempel EKF. Tidssynkronisering av IMU-data med exakt GNSS-baserad tid är nödvändigt för att noggrant synkronisera de två systemen. Detta måste göras i realtid för tidskänsliga navigationsapplikationer såsom autonoma fordon. Forskningen görs i två delar. Den första delen är simulering av icke-linjär rörelse i en axel med felaktig tidsstämpling hos ett gyroskop och en accelerometer. Detta görs för att erhålla det högsta tidsfel som är acceptabelt hos ett GNSS / INS-system med hög noggrannhet. Den andra delen är skapandet av en realtidsalgoritm med ett inbyggt STM32-system med FreeRTOS som realtidskärna för en GNSS-mottagare och antenn, tillsammans med en IMU-sensor. En jämförande analys av det tidssynkroniserade systemet mot ett osynkroniserat system görs baserat på de positionsfel längs en axel som produceras av gyroskop- och accelerometermätningar. Detta görs genom att utföra statiska och roterande tester med hjälp av en roterande stol. Simuleringen visar att ett noggrant GNSS / INS-system tolererar ett tidsfel på upp till 1 millisekund. Realtidslösningen ger IMU-data med tidsstämplar synkroniserade med GNSS-tid var femte millisekund. Tidsjittret reduceras till ett intervall mellan ±1 millisekund. Analysen av det slutliga vinkelrotationsfelet och positionsfelet från gyroskop- och accelerometermätningar indikerar att realtidsalgoritmen ger ett lägre fel när systemet är statiskt. Det finns dock inga statistiska bevis för förbättringen från resultaten av rotationstesterna.
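The timing-error simulation described in this abstract can be sketched as integrating a gyroscope's angular rate with true versus jittered timestamps and comparing the resulting rotation angles. The 100 Hz rate, the 1 Hz sinusoidal motion and the ±1 ms jitter below are assumed values for illustration, not the thesis's actual test parameters:

```python
import math
import random

def integrate_angle(rates, timestamps):
    """Trapezoidal integration of angular rate (deg/s) over time stamps (s)."""
    angle = 0.0
    for i in range(1, len(rates)):
        dt = timestamps[i] - timestamps[i - 1]
        angle += 0.5 * (rates[i] + rates[i - 1]) * dt
    return angle

random.seed(0)
dt = 0.01                                   # 100 Hz nominal sampling
true_t = [i * dt for i in range(1000)]      # 10 s of samples
rates = [90.0 * math.sin(2 * math.pi * t) for t in true_t]   # deg/s

# Time-stamping jitter of up to +/-1 ms on every sample.
jitter_t = [t + random.uniform(-0.001, 0.001) for t in true_t]

angle_err = abs(integrate_angle(rates, true_t)
                - integrate_angle(rates, jitter_t))
```

Sweeping the jitter magnitude and watching where `angle_err` exceeds an accuracy budget is the essence of the tolerance study, here reduced to a single axis and a single run.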
APA, Harvard, Vancouver, ISO, and other styles
21

Bhattacharya, Sumit. "A Real-Time Bi-Directional Global Positioning System Data Link Over Internet Protocol." Ohio University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1121355433.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Korte, Hubert. "The integration of Wide Area Network Differential Global Positioning Systems (WAN-DGPS) into yield mapping on the combine harvester." Thesis, University of Newcastle Upon Tyne, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285322.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Braga, Carlos Roberto. "Study of solar-interplanetary-geomagnetic disturbances using data from the Global Muon Detector Network and the LASCO coronagraph." Instituto Nacional de Pesquisas Espaciais, 2011. http://urlib.net/sid.inpe.br/mtc-m19/2011/02.07.20.31.

Full text
Abstract:
O objetivo do trabalho da Dissertação é estudar distúrbios solar-interplanetário-geomagnéticos, como ejeções de massa coronais solares (Coronal Mass Ejections - CMEs) usando observações de coronógrafos de luz branca e de raios cósmicos de alta energia (muons). A partir de imagens do coronógrafo LASCO-C3 (Large Angle and Spectroscopic Coronagraph), ejeções coronais de massa (CMEs) foram segmentadas de forma supervisionada por textura. O contorno identificado foi utilizado para estimar velocidades radiais e de expansão de um conjunto de 57 CMEs associadas a eventos solares próximos ao limbo. Optou-se por segmentação por textura, buscando-se parametrizar estimativas de velocidades de CME que não são consenso. De forma geral o contorno identificado pela técnica mostrou-se coerente com a definição de CME e a posição angular, velocidade radial e de expansão estimadas são similares aos resultados anteriores obtidos por catálogos produzidos manualmente. Por outro lado, usando dados de raios cósmicos de alta energia (muons), assinaturas precedentes a chegada da massa de plasma solar foram estudadas usando dados da Rede Mundial de Detectores de Muons (GMDN). Foi elaborada e estudada a distribuição da intensidade de raios cósmicos como função do ângulo de pitch para períodos associados às 16 tempestades geomagnéticas fracas ou moderadas observadas em 2008. Em 14 dos eventos foram observados possíveis precursores, tanto acréscimos como decréscimos sistemáticos. Não há razão identificada para a ausência de precursores nos dois eventos restantes.
The objective of this work is to study solar-interplanetary-geomagnetic disturbances such as coronal mass ejections (CMEs) using observations from the white-light coronagraph and high-energy cosmic rays (muons). Images from the Large Angle and Spectroscopic Coronagraph (LASCO-C3) were segmented by texture in a supervised way, and the identified contour was used to estimate the radial and expansion speeds of a set of 57 limb CMEs for the period between 1997 and 2001. Texture analysis was chosen as a way to parameterize the estimation of CME contours, which are not always a matter of consensus. In general, the identified contour is in agreement with the CME definition, and the estimated position angle, radial speed and expansion speed agree with previous manually produced catalogs. On the other hand, using high-energy cosmic ray (muon) observations, signatures preceding the arrival of plasma structures were studied using data from the Global Muon Detector Network (GMDN). Pitch angle distributions were produced for periods associated with the 16 small and moderate geomagnetic storms observed in 2008. Fourteen of them show possible precursors, both precursory increases and precursory decreases. No clear reason has yet been found for the absence of precursors in the remaining two events.
APA, Harvard, Vancouver, ISO, and other styles
24

Zinas, N. "Development and assessment of a new rover-enhanced network based data processing strategy for Global Navigation Satellite Systems." Thesis, University College London (University of London), 2010. http://discovery.ucl.ac.uk/19913/.

Full text
Abstract:
Real-time GNSS networks established across countries over the last fifteen years provide centimetre-level accuracy for a wide range of applications, from precise ship docking to land surveying. Public sector organizations deploy GNSS networks to support infrastructure projects; private companies establish their own networks for commercial or private use. The number of users of these networks has steadily increased over the years, and a potential new market has been created. This thesis focuses on exploiting the advantages of multiple users operating within a GNSS network as part of the system, as a virtual network of stations that can operate autonomously and in combination with the reference station network. A new network-RTK methodology that encompasses the multiple users of network RTK services for instantaneous positioning is presented. Users are equipped with a means of two-way communication that enables the data to be transmitted to a central processing facility on an epoch-by-epoch basis. A centralised approach reduces the need for complex algorithms at the user side. The methodology is tested using data from the Southern California Integrated GPS Network (SCIGN) for two different network formations with different numbers of users operating simultaneously. It is shown that regional estimation of relative ionospheric corrections from the proposed methodology has a positive effect on the overall ambiguity resolution success rates when compared to the use of a standard ionospheric model. It is demonstrated that the use of multiple rovers operating in the network increases the ambiguity success rates for those outside the network area by almost 20%. A smaller improvement for rovers near the boundaries of the network is also achieved. Finally, new prospects for regional atmospheric modeling become available, since the algorithm estimates all the double difference ambiguities for the reference station-rover system.
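The double-difference observable underpinning the ambiguity discussion above combines carrier-phase measurements from two receivers and two satellites so that receiver and satellite clock terms cancel. A generic sketch follows; the receiver/satellite labels and phase values are illustrative, not from the thesis:

```python
def double_difference(phase, ref_rx, rov_rx, ref_sv, sv):
    """Form the double-difference of carrier-phase observations:
    DD = (phi_rov^sv - phi_rov^ref_sv) - (phi_ref^sv - phi_ref^ref_sv).
    `phase` maps (receiver, satellite) -> carrier phase (cycles)."""
    sd_rov = phase[(rov_rx, sv)] - phase[(rov_rx, ref_sv)]
    sd_ref = phase[(ref_rx, sv)] - phase[(ref_rx, ref_sv)]
    return sd_rov - sd_ref

# Illustrative phases (cycles): a receiver clock bias of +0.7 cycles on
# the rover and +0.3 on the reference cancels in the double difference.
obs = {
    ("REF", "G01"): 100.0 + 0.3, ("REF", "G02"): 150.0 + 0.3,
    ("ROV", "G01"): 102.0 + 0.7, ("ROV", "G02"): 153.0 + 0.7,
}
dd = double_difference(obs, "REF", "ROV", "G01", "G02")   # ~1.0 cycle
```

Resolving the integer part of each such `dd` across the satellite set is the ambiguity-resolution step whose success rates the thesis measures.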
APA, Harvard, Vancouver, ISO, and other styles
25

Loy, James Michael. "RELATING NATURALISTIC GLOBAL POSITIONING SYSTEM (GPS) DRIVING DATA WITH LONG-TERM SAFETY PERFORMANCE OF ROADWAYS." DigitalCommons@CalPoly, 2013. https://digitalcommons.calpoly.edu/theses/1078.

Full text
Abstract:
This thesis describes a research study relating naturalistic Global Positioning System (GPS) driving data with long-term traffic safety performance for two classes of roadways. These two classes are multilane arterial streets and limited access highways. GPS driving data used for this study was collected from 33 volunteer drivers from July 2012 to March 2013. The GPS devices used were custom GPS data loggers capable of recording speed, position, and other attributes at an average rate of 2.5 hertz. Linear Referencing in ESRI ArcMAP was performed to assign spatial and other roadway attributes to each GPS data point collected. GPS data was filtered to exclude data with high horizontal dilution of precision (HDOP), incorrect heading attributes or other GPS communication errors. For analysis of arterial roadways, the Two-Fluid model parameters were chosen as the measure for long-term traffic safety analysis. The Two-Fluid model was selected based on previous research which showed correlation between the Two-Fluid model parameters n and Tm and total crash rate along arterial roadways. Linearly referenced GPS data was utilized to obtain the total travel time and stop time for several half-mile long trips along two arterial roadways, Grand Avenue and California Boulevard, in San Luis Obispo. Regression between log transformed values of these variables (total travel time and stop time) were used to derive the parameters n and Tm. To estimate stop time for each trip, a vehicle “stop” was defined when the device was traveling at less than 2 miles per hour. Results showed that Grand Avenue had a higher value for n and a lower value for Tm, which suggests that Grand Avenue may have worse long-term safety performance as characterized by long-term crash rates. However, this was not verified with crash data due to incomplete crash data in the TIMS database. 
Analysis of arterial roadways concluded by verifying the GPS data collected in the California Boulevard study against sample data collected using a traditional “car chase” methodology, which showed no significant difference between the two data sources when trips included noticeable stop times. For the analysis of highways, the derived measurement of vehicle jerk, or rate of change of acceleration, was calculated to explore its relationship with the long-term traffic safety performance of highway segments. The decision to use jerk comes from previous research which utilized high-magnitude jerk events as a crash surrogate, or near-crash events. Instead of using jerk for near-crash analysis, the measurement was used to determine the percentage of GPS data observed below a certain negative jerk threshold for several highway segments. These segments were ¼-mile and ½-mile long. The preliminary exploration was conducted with 39 ¼-mile long segments of US Highway 101 within the city limits of San Luis Obispo. First, Pearson's correlation coefficients were estimated for the rate of 'high' jerk occurrences on these highway segments (with definitions of 'high' depending on varying jerk thresholds) and an estimate of crash rates based on long-term historical crash data. The trends in the correlation coefficients as the thresholds were varied led to further analysis based on a jerk threshold of -2 ft/sec³ for the ¼-mile segment analysis and -1 ft/sec³ for the ½-mile segment analysis. Through a negative binomial regression model, it was shown that the derived jerk percentage measure had a significant correlation with the total number of historical crashes observed along US Highway 101. The analysis also showed that other characteristics of the roadway, including the presence of a curve, the presence of weaving (indicated by the presence of auxiliary lanes), and average daily traffic (ADT), did not have a significant correlation with observed crashes.
Similar analysis was repeated for 19 ½-mile long segments in the same study area, and it was found that the percentage of high negative jerk was again significantly related to historical crashes. In the ½-mile negative binomial regression, the presence of a curve was also a significant variable; however, the standard error for this determination was very high due to the low number of analysis segments that did not contain curves. The results of this research show the potential benefit that naturalistic GPS driving data can provide for long-term traffic safety analysis, even when the data are unaccompanied by any additional data (such as a live video feed) collected with expensive vehicle instrumentation. The methodologies of this study are repeatable with many GPS devices found in consumer electronics, including many newer smartphones.
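The jerk-based exposure measure described above can be sketched by twice finite-differencing sampled speeds and counting the share of samples below a threshold. The speed trace below is invented for illustration, with the 0.4 s spacing matching the roughly 2.5 Hz logger rate the abstract mentions:

```python
def jerk_fraction(speeds, dt, threshold):
    """Fraction of jerk samples (ft/s^3) below `threshold`, where jerk is
    the second finite difference of speed (ft/s) sampled every `dt` s."""
    accel = [(speeds[i + 1] - speeds[i]) / dt for i in range(len(speeds) - 1)]
    jerk = [(accel[i + 1] - accel[i]) / dt for i in range(len(accel) - 1)]
    hits = sum(1 for j in jerk if j < threshold)
    return hits / len(jerk)

# A hard braking event embedded in otherwise smooth driving.
speeds = [44.0, 44.0, 44.0, 42.0, 36.0, 28.0, 24.0, 23.0, 23.0]
frac = jerk_fraction(speeds, dt=0.4, threshold=-2.0)   # share of high-jerk samples
```

Computing `frac` per segment and regressing historical crash counts on it is the shape of the negative binomial analysis the thesis reports.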
APA, Harvard, Vancouver, ISO, and other styles
26

Bleyle, Derek. "A Secure Web Based Data Collection and Distribution System for Global Positioning System Research." Ohio University / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1097605631.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Bakousseva, Renata. "Integrated supply and production network design." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/105632.

Full text
Abstract:
Thesis: S.M. in Engineering Systems, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 59).
As Company X looks to improve customer service and deliver new growth opportunities, it is driving toward a more efficient, aligned and effective organization that eliminates waste through integration of its supply and production networks. The current manufacturing system is optimized for high-volume products with low demand variation, yet it is used for all products regardless of demand characteristics. The effects of such a system on the supply network are higher holding cost and stale inventory, while the effects on the business are lost sales and higher total delivered cost. A more responsive production system is an opportunity to reduce strain on the supply network, reduce total delivered cost and improve product fulfillment. Analysis of a portfolio of products demonstrates two main findings: (1) the considerable impact of inventory cost on the total delivered cost and (2) a definitive case for a differentiated manufacturing strategy for high- and low-volume products. Previously, only manufacturing cost had been used to decide which system might better fit the goals of providing products in a timely and cost-efficient manner; the uncovered impact of inventory cost on the total delivered cost has challenged that perception. An analysis was also performed on various algorithms which optimize (1) the product lot size and (2) job scheduling on machines. EOQ and a mixed integer program were both analyzed for lot size determination, with the latter demonstrating more cost-efficient and production-efficient results due to greater flexibility with the time scale and the consideration of manufacturing capacity. Finally, a couple of bin-packing heuristics were tested for job scheduling. The results demonstrated significant time savings in job scheduling and highlighted the need to automate the scheduling process.
by Renata Bakousseva.
S.M. in Engineering Systems
M.B.A.
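The EOQ baseline compared against the mixed integer program in the abstract above follows the classic square-root trade-off between ordering and holding cost. A minimal sketch with invented cost parameters:

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic Order Quantity: Q* = sqrt(2 * D * S / H), where D is annual
    demand (units), S the fixed cost per order/setup, and H the holding
    cost per unit per year."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

# Illustrative parameters: 10,000 units/year demand, $50 setup per lot,
# $4 per unit-year holding cost.
q = eoq(10_000, 50.0, 4.0)   # -> 500.0 units per lot
```

The MIP alternative wins in the thesis precisely because it can also respect time-varying demand and machine capacity, which this closed form ignores.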
APA, Harvard, Vancouver, ISO, and other styles
28

Opperman, B. D. L. "Reconstructing ionospheric TEC over South Africa using signals from a regional GPS network." Thesis, Rhodes University, 2008. http://hdl.handle.net/10962/d1005273.

Full text
Abstract:
Radio signals transmitted by GPS satellites orbiting the Earth are modulated as they propagate through the electrically charged plasmasphere and ionosphere in the near-Earth space environment. Through a linear combination of GPS range and phase measurements observed on two carrier frequencies by terrestrial-based GPS receivers, the ionospheric total electron content (TEC) along oblique GPS signal paths may be quantified. Simultaneous observations of signals transmitted by multiple GPS satellites and observed from a network of South African dual frequency GPS receivers constitute a spatially dense ionospheric measurement source over the region. A new methodology, based on an adjusted spherical harmonic (ASHA) expansion, was developed to estimate diurnal vertical TEC over the region from these GPS observations. The performance of the ASHA methodology in estimating diurnal TEC and satellite and receiver differential clock biases (DCBs) for a single GPS receiver was first tested with simulation data and subsequently applied to observed GPS data. The resulting diurnal TEC profiles estimated from GPS observations compared favourably to measurements from three South African ionosondes and two other GPS-based methodologies for 2006 solstice and equinox dates. The ASHA methodology was applied to calculating diurnal two-dimensional TEC maps from multiple receivers in the South African GPS network. The space physics application of the newly developed methodology was demonstrated by investigating the ionosphere's behaviour during a severe geomagnetic storm and investigating the long-term ionospheric stability in support of the proposed Square Kilometre Array (SKA) radio astronomy project. The feasibility of employing the newly developed technique in an operational near real-time system for estimating and disseminating TEC values over Southern Africa, using observations from a regional GPS receiver network, was investigated.
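The dual-frequency combination behind the TEC quantification described above can be sketched with the standard geometry-free pseudorange combination. The pseudorange values below are illustrative, and satellite/receiver biases are ignored in this sketch:

```python
def slant_tec(p1, p2, f1=1575.42e6, f2=1227.60e6):
    """Slant TEC (TECU) from dual-frequency pseudoranges (metres) via the
    geometry-free combination:
        TEC = f1^2 * f2^2 / (40.3 * (f1^2 - f2^2)) * (P2 - P1)
    f1/f2 are the GPS L1/L2 frequencies (Hz); 1 TECU = 1e16 electrons/m^2.
    Differential biases (DCBs) are ignored here."""
    tec_el_per_m2 = (f1**2 * f2**2) / (40.3 * (f1**2 - f2**2)) * (p2 - p1)
    return tec_el_per_m2 / 1e16

# An illustrative 5 m ionospheric divergence between L1 and L2 pseudoranges
# corresponds to roughly 48 TECU of slant TEC.
tec = slant_tec(p1=22_000_000.0, p2=22_000_005.0)
```

Mapping many such slant values to the vertical and fitting them with a spherical harmonic expansion is, in outline, what the ASHA methodology does regionally.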
APA, Harvard, Vancouver, ISO, and other styles
29

Billian, Bruce. "Next Generation Design of a Frequency Data Recorder Using Field Programmable Gate Arrays." Thesis, Virginia Tech, 2005. http://hdl.handle.net/10919/34560.

Full text
Abstract:

The Frequency Disturbance Recorder (FDR) is a specialized data acquisition device designed to monitor fluctuations in the overall power system. The device is designed such that it can be attached by way of a standard wall power outlet to the power system. These devices then transmit their calculated frequency data through the public internet to a centralized data management and storage server.

By distributing a number of these identical systems throughout the three major North American power systems, Virginia Tech has created a Frequency Monitoring Network (FNET). The FNET is composed of these distributed FDRs as well as an Information Management Server (IMS). Since frequency information can be used in many areas of power system analysis, operation and control, there are a great number of end uses for the information provided by the FNET system. The data provides researchers and other users with the information to make frequency analyses and comparisons for the overall power system. Prior to the end of 2004, the FNET system was made a reality, and a number of FDRs were placed strategically throughout the United States.

The purpose of this thesis is to present the elements of a new generation of FDR hardware design. These elements will enable the design to be more flexible and to lower reliance on some vendor specific components. Additionally, these enhancements will offload most of the computational processing required of the system to a commodity PC rather than an embedded system solution that is costly in both development time and financial cost. These goals will be accomplished by using a Field Programmable Gate Array (FPGA), a commodity off-the-shelf personal computer, and a new overall system design.


Master of Science
APA, Harvard, Vancouver, ISO, and other styles
30

Natarajan, Sriram. "SURGNET an integrated surgical data transmission system over collaborative networks /." Amherst, Mass. : University of Massachusetts Amherst, 2009. http://scholarworks.umass.edu/theses/257/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Kilcrease, Patrick N. "Employing a secure Virtual Private Network (VPN) infrastructure as a global command and control gateway to dynamically connect and disconnect diverse forces on a task-force-by-task-force basis." Thesis, Monterey, California : Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Sep/09Sep%5FKilcrease.pdf.

Full text
Abstract:
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, September 2009.
Thesis Advisor(s): Barreto, Albert. "September 2009." Description based on title screen as viewed on 6 November 2009. Author(s) subject terms: Virtual Private Network, GHOSTNet, maritime interdiction operations, internet protocol security, encapsulating security protocol, data encryption standard. Includes bibliographical references (p. 83-84). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
32

Macmillan, Michael Reed. "Network methods for inventory management in capacity constrained retail stores." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/104395.

Full text
Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Thesis: S.M. in Engineering Systems, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 81-82).
Zara leads the fast-fashion industry, introducing over 12,000 unique items per year [5], selling in over 2,000 stores and generating £11.6 Bn in yearly sales [8]. Critical to this success, Zara's Distribution Department continually focuses on improving the algorithms and programs which the company uses to move clothing through the supply chain. Demand variability and short product lifecycles make this task extremely challenging, especially when coupled with limited storage space in many Zara stores. This thesis helps stores which are challenged by low storage capacity and high consumer demand by testing three inventory management methods. The first method creates a virtual cost in the inventory redistribution algorithm, which decreases the likelihood that an over-capacity store will hold an item. This method decreased the amount of post-transfer inventory by 15% in capacity-constrained stores while experiencing only a 0.1% loss of profits compared to the current process. The second method opens new transfer routes for capacity-constrained stores to move inventory into stores which benefit from the additional items, while reducing the non-performing stock at the capacity-constrained store. These store-to-store routes quickly transfer items while reducing the stock of the origin store. The final method improves existing capacity returns, which automatically move inventory from capacity-constrained stores back to the Distribution Center. The new method optimizes the selection of items to improve redistribution to other stores, resulting in additional full-priced sales while removing the same amount of items from the origin store. The implementation of these processes will reduce stock management problems experienced at Zara stores, while ensuring that other stores have the opportunity to sell at full price.
by Michael Reed Macmillan.
M.B.A.
S.M. in Engineering Systems
APA, Harvard, Vancouver, ISO, and other styles
33

Lu, Rong. "Statistical Methods for Functional Genomics Studies Using Observational Data." The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1467830759.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Larocca, Ana Paula Camargo. "Análise de estratégias para processamento de redes geodésicas com o sistema de posicionamento global - GPS." Universidade de São Paulo, 2000. http://www.teses.usp.br/teses/disponiveis/18/18137/tde-24052006-101143/.

Full text
Abstract:
O presente trabalho consiste de apresentação de metodologia para estudo, elaboração e análise de estratégias para processamento de observáveis GPS, para a constituição de redes geodésicas. No desenvolvimento deste trabalho foram utilizados os dados observados da rede geodésica do Estado de São Paulo, concluída em 1994. Esta rede é constituída por vinte e quatro pontos distribuídos pelo estado, mais o vértice CHUA, que é o vértice fundamental da triangulação do Sistema Geodésico Brasileiro. Através das estratégias elaboradas são analisados diversos fatores de importância relevante nos processamentos dos dados GPS, como: influência de dados meteorológicos no processamento de linhas bases longas; resultados de processamentos com efemérides transmitidas e precisas; resultados de processamentos com linhas bases de comprimentos homogêneos e menores ou igual a 150 km; resultados de processamentos considerando apenas duas horas e trinta minutos do tempo total de duração das sessões de observação. Os resultados dos ajustamentos destas estratégias são comparados entre si e se apresenta, então, análises e conclusões sobre a influência dos fatores analisados
The present work presents a methodology for the study, elaboration and analysis of strategies for processing GPS observables for geodetic networks. In the development of this work, GPS data from the geodetic network of the State of São Paulo, concluded in 1994, were used. This network is composed of twenty-four points scattered across the State, plus the vertex CHUA, which is the fundamental point of the triangulation of the Brazilian Geodetic System. Through the strategies elaborated, several factors of major importance for GPS data processing are analyzed, such as: the influence of meteorological data on the processing of long baselines; the results of data processing with broadcast and precise ephemerides; the results of data processing with baselines of homogeneous lengths, smaller than or equal to 150 km; and the results of data processing considering only two hours and thirty minutes of the total duration of the observation sessions. The results of the adjustments of these strategies are compared to each other, followed by analyses and conclusions about the influence of these factors on data processing.
APA, Harvard, Vancouver, ISO, and other styles
35

Zhuang, Yuwen. "Metric Based Automatic Event Segmentation and Network Properties Of Experience Graphs." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1337372416.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Donde, Shrinish. "Support for Emulated 5G-System Bridge in a Time-Sensitive Bridged Network." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-284514.

Full text
Abstract:
Time Sensitive Networking (TSN), defined in the IEEE 802.1 working group, is an important enabler for the industrial Internet of Things, specifically Industry 4.0. The 3GPP Release 16 specifications include the 5G system as a logical TSN bridge, thus promoting the integration of 5G technology with TSN. This combination provides wireless deterministic communication, ensuring low, bounded delay and near-zero packet loss. In this thesis, we implement a 5G system integration with TSN using a discrete event network simulator (NS-3). Further, we propose a simplified per-egress-port scheduling algorithm based on IEEE 802.1Q (scheduled traffic standard) running in the Centralized Network Controller (CNC). Average packet delay, average jitter, average throughput and packet loss are measured to compare the performance difference when our TSN scheduler is used versus when it is not. The designed system is tested by measuring its network impact in terms of average delay and packet loss. The 5GS logical bridge behavior is simulated by varying the 5G bridge delay dynamically. For every frame transmission in the queue, the processing delay of a particular bridge is varied with a pre-defined set of values. Two sets of 5GS bridge delay variations are considered, i.e. between 1-10 ms and 5-10 ms respectively. On calculating the network impact, we conclude that the overall impact on the network decreases as the variation range of the delay gets smaller. This proves that higher delay variations have a significant impact whereas smaller delay variations have a negligible impact on the network. For the latter case, the system delay is considerably stable and can thus be used for industrial applications in real-life TSN scenarios.
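The delay-variation experiment described in this abstract can be sketched outside NS-3. Below is a minimal, illustrative Python model; the base delay, packet count and uniform-delay assumption are ours, not the thesis's:

```python
import random
import statistics

def simulate_bridge(delay_range_ms, n_packets=5000, base_delay_ms=2.0, seed=1):
    """Mean and spread of end-to-end delay when the 5GS bridge's
    per-packet processing delay is drawn uniformly from delay_range_ms."""
    rng = random.Random(seed)
    lo, hi = delay_range_ms
    delays = [base_delay_ms + rng.uniform(lo, hi) for _ in range(n_packets)]
    return statistics.mean(delays), statistics.pstdev(delays)

mean_wide, std_wide = simulate_bridge((1.0, 10.0))      # high-variation case
mean_narrow, std_narrow = simulate_bridge((5.0, 10.0))  # low-variation case
```

Under these assumptions the 5-10 ms range yields a visibly smaller spread than the 1-10 ms range, mirroring the conclusion that smaller delay variations have a negligible impact on the network.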
APA, Harvard, Vancouver, ISO, and other styles
37

Piesker, Björn. "Constraint-basierte Generierung realitätsnaher Eisenbahnnetze." Master's thesis, Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2007/1532/.

Full text
Abstract:
This work deals with the development of an application which generates infrastructure data for railway networks. The focus of this work is on the generation of topological information. As input for the application, a characterization of the intended railway network is given as attributes, which are handled as constraints in the generation process. To satisfy these restrictions, constraint programming, a programming paradigm that can efficiently search for consistent solutions, is applied. In particular, the use of so-called global constraints improves the computation. For that reason, the role of constraint programming in modelling and implementing this application is discussed in more detail.
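As a toy illustration of the constraint-satisfaction idea (a simplification, not the thesis's railway model or its global constraints), a minimal backtracking solver can enforce, for example, that neighbouring track segments receive different identifiers:

```python
def solve_csp(variables, domains, constraints):
    """Tiny backtracking solver: constraints is a list of
    (scope, predicate) pairs, checked once their scope is fully assigned."""
    assignment = {}

    def consistent(var, value):
        assignment[var] = value
        ok = all(pred(*[assignment[v] for v in scope])
                 for scope, pred in constraints
                 if all(v in assignment for v in scope))
        if not ok:
            del assignment[var]
        return ok

    def backtrack():
        if len(assignment) == len(variables):
            return dict(assignment)
        var = next(v for v in variables if v not in assignment)
        for value in domains[var]:
            if consistent(var, value):
                result = backtrack()
                if result:
                    return result
                del assignment[var]
        return None

    return backtrack()

# Toy railway-style model: adjacent track segments must use different IDs.
variables = ["s1", "s2", "s3"]
domains = {v: [1, 2] for v in variables}
edges = [("s1", "s2"), ("s2", "s3")]
constraints = [((a, b), lambda x, y: x != y) for a, b in edges]
solution = solve_csp(variables, domains, constraints)
```

Real constraint systems add propagation and global constraints on top of this search skeleton; the sketch only shows the paradigm of stating properties and letting the solver find a consistent assignment.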
APA, Harvard, Vancouver, ISO, and other styles
38

Tiesler, Russell Colin, and n/a. "A Decade of GPS geodesy in the Australian region: a review of the GDA94 and its performance within a time series analysis of a 10 year data set in ITRF 2000." University of Canberra. Information Sciences & Engineering, 2005. http://erl.canberra.edu.au./public/adt-AUC20051202.114435.

Full text
Abstract:
The University of Canberra (UC) has been involved in GPS processing since the late 1980s. This processing commenced with the GOTEX 1988 campaign and progressed through a series of project-specific regional campaigns to the current daily processing of a distributed set of continuously operating sites for the determination of precise GPS station positions for user applications. Most of these earlier campaigns covered only short periods of time, ranging from a few weeks to multiple occupations of a few days each over one to two years. With software developments, these multiple occupations could be combined to produce results from which crustal motion velocities could be extracted. This first became feasible with the processing of the Australian National Network (ANN), which yielded realistic tectonic velocities from two occupations (1992 and 1993) of sites 12 months apart. Subsequently, this was successfully extended by a further 12 months, with re-occupation of certain sites for a third time in 1994. Analysis of the results indicated that the accuracy of determining the earth signals improved as the time span from first to last observation increased. The same was also true for the determination of the positions of global reference sites. However, by current standards the results achieved were poor. Consequently, the process was extended to combine the results of subsequent campaigns with the original ANN data set. From 1995 to 1999, campaigns were conducted across Australia, covering many State and tide gauge sites included in the original ANN solution. These provided additional multiple occupations to improve the determinations of both position and velocity. UC has maintained a data set of the global IGS sites, commencing with the IGS pilot campaign of 1992. Daily data sets for those global sites, which contained days common to the regional campaigns, were processed to produce our own independent global orbit and reference frame connection.
The motivation for doing so was fourfold. Firstly, to see if historic data could be reprocessed using current modern software and thus be incorporated into this and other analysts' research programs. Secondly, to compare the results of reprocessing the original data set using modern software with the original ANN solution, and then validate both solutions. Thirdly, to extend the timespan of observations processed to include more recent campaigns on as many original sites as possible, so as to achieve a stronger solution upon which to base the determination of an Australian tectonic plate velocity model and provide quality assurance on the solution comparisons with re-observed sites. Fourthly, to develop a set of transformation parameters between current coordinate systems and the GDA94 system, so as to be able to incorporate new results into the previous system. The final selection of regional and global sessions, spanning from mid 1992 to late 2002, contained almost 1000 individual daily solutions. From this 10-year data span a well determined rigid plate tectonic motion model was produced for Australia. This site velocity model was needed to develop a transformation between the thesis solution in ITRF00 and the GDA94 solution in ITRF92. The significant advantage of the plate velocity model is that all Australian sites can now be assigned a realistic velocity, rather than being given a value interpolated between sites whose velocities had been determined over a one- or two-year span. This plate velocity model is compared with the current NNR-NUVEL-1A tectonic motion model and other recently published models. To perform the comparison between the thesis solution in ITRF00 and the GDA solution in ITRF92, a transformation was developed between the two reference systems.
This set of transformation parameters, in conjunction with the plate velocity model developed, enables site solutions at any epoch in the current ITRF00 to be converted onto the GDA94, and vice versa, with a simple, non-varying seven-parameter transformation. The comparisons between the solutions are analysed for both horizontal position and height consistency. There were 77 sites whose differences were compared. The horizontal consistency was within estimated precisions for 75 of the 77 sites. However, the vertical comparisons revealed that many of the single-epoch sites, especially in 1992, have inconsistent results between the two solutions. The heights from this thesis for some West Australian sites were compared with analysis done by DOLA, and the height recoveries are very similar, indicating a weakness in the GDA94 solution for some of the single-epoch sites. Some of these differences have been resolved, but others are still under investigation. This thesis describes the reprocessing of the original ANN data set, the addition of later data sets, the results obtained, and the validation comparisons of the old and new solutions. As well as the plate velocity model, a transformation is provided which enables the user to compute between the GDA94 system and any epoch result in ITRF00. Recommendations are made as to which sites need additional work. This includes sites which only need further analysis or investigation, and those which require further observations to achieve a result with acceptable accuracy and reliability.
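The "simple, non-varying seven parameter transformation" can be sketched as a small-angle Helmert similarity transform combined with velocity propagation to a common epoch. The sign convention and the sample numbers below are illustrative and are not the thesis's actual ITRF00-GDA94 parameters:

```python
import math

def helmert_7param(x, y, z, tx, ty, tz, scale_ppm, rx_as, ry_as, rz_as):
    """Small-angle 7-parameter (Helmert) similarity transformation.
    Translations in metres, rotations in arcseconds, scale in ppm.
    Sign conventions vary between realizations; this is one common form."""
    as_to_rad = math.pi / (180 * 3600)
    rx, ry, rz = (r * as_to_rad for r in (rx_as, ry_as, rz_as))
    s = 1 + scale_ppm * 1e-6
    xn = tx + s * (x       + rz * y - ry * z)
    yn = ty + s * (-rz * x + y     + rx * z)
    zn = tz + s * (ry * x  - rx * y + z)
    return xn, yn, zn

def propagate(position, velocity, epoch, ref_epoch):
    """Move a station coordinate to a reference epoch using its
    plate-model velocity (metres per year)."""
    dt = ref_epoch - epoch
    return tuple(p + v * dt for p, v in zip(position, velocity))
```

A station observed at one epoch is first propagated to the transformation's reference epoch with the plate velocity model, then passed through the seven-parameter transform; this is exactly why a realistic per-site velocity matters.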
APA, Harvard, Vancouver, ISO, and other styles
39

Aldabbas, Hamza. "Securing data dissemination in vehicular ad hoc networks." Thesis, De Montfort University, 2012. http://hdl.handle.net/2086/7987.

Full text
Abstract:
Vehicular ad hoc networks (VANETs) are a subclass of mobile ad hoc networks (MANETs) in which the mobile nodes are vehicles; these vehicles are autonomous systems connected by wireless communication on a peer-to-peer basis. They are self-organized, self-configured and self-controlled infrastructure-less networks. This kind of network has the advantage of being able to be set up and deployed anywhere and anytime, because it requires no infrastructure set-up and no central administration. Distributing information between these vehicles over long ranges in such networks, however, is a very challenging task, since sharing information always has a risk attached to it, especially when the information is confidential. The disclosure of such information to anyone other than the intended parties could be extremely damaging, particularly in military applications where controlling the dissemination of messages is essential. This thesis therefore provides a review of the issue of security in VANETs and MANETs; it also surveys existing solutions for dissemination control. It highlights a particular area not adequately addressed until now: controlling information flow in VANETs. This thesis contributes a policy-based framework to control the dissemination of messages communicated between nodes, in order to ensure that a message remains confidential not only during transmission but also after it has been communicated to another peer, and to keep the message contents private to an originator-defined subset of nodes in the VANET. This thesis presents a novel framework to control data dissemination in vehicular ad hoc networks in which policies are attached to messages as they are sent between peers. This is done by automatically attaching policies along with messages to specify how the information can be used by the receiver, so as to prevent disclosure of the messages in any way inconsistent with the requirements of the originator.
These requirements are represented as a set of policy rules that explicitly instruct recipients how the information contained in messages can be disseminated to other nodes, in order to avoid unintended disclosure. This thesis describes the data dissemination policy language used in this work, and further describes the policy rules, so that the language is suitable and understandable for the framework and ensures the confidentiality requirements of the originator. This thesis also contributes a policy conflict resolution mechanism that allows the originator to be asked for up-to-date policies and preferences. The framework was evaluated using the Network Simulator (NS-2) to check whether the privacy and confidentiality of the originators' messages were met. A policy-based agent protocol and a new packet structure were implemented in this work to manage and enforce the policies attached to packets at every node in the VANET. Some case studies are presented in this thesis to show how data dissemination can be controlled based on the policy of the originator. The results of these case studies show the feasibility of our approach to controlling data dissemination between nodes in VANETs. NS-2 is also used to test the performance of the proposed policy-based agent protocol and demonstrate its effectiveness using various network performance metrics (average delay and overhead).
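The idea of attaching originator-defined policies to messages and enforcing them at every receiving node can be sketched as follows; the policy fields (an allowed-node set and a hop limit) are invented for illustration and are not the thesis's actual policy language:

```python
from dataclasses import dataclass

@dataclass
class Policy:
    allowed_nodes: set   # originator-defined subset that may read/forward
    max_hops: int        # how far the message may be disseminated

@dataclass
class Message:
    payload: str
    policy: Policy
    hops: int = 0

def can_deliver(msg, node_id):
    """A receiving node enforces the policy attached by the originator."""
    return node_id in msg.policy.allowed_nodes and msg.hops <= msg.policy.max_hops

def forward(msg, next_node):
    """Forward a copy only if the attached policy permits the next node."""
    candidate = Message(msg.payload, msg.policy, msg.hops + 1)
    return candidate if can_deliver(candidate, next_node) else None

msg = Message("convoy route", Policy(allowed_nodes={"v1", "v2"}, max_hops=2))
hop1 = forward(msg, "v2")     # permitted: v2 is in the originator's subset
blocked = forward(msg, "v9")  # dropped: v9 is outside the subset
```

The point of the pattern is that the policy travels with the data, so control persists after the message leaves the originator, not just during transmission.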
APA, Harvard, Vancouver, ISO, and other styles
40

Ivanecký, Ján. "Odhad hloubky pomocí konvolučních neuronových sítí." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2016. http://www.nusl.cz/ntk/nusl-255462.

Full text
Abstract:
This thesis deals with depth estimation using convolutional neural networks. I propose a three-part model as a solution to this problem. The model contains a global context network which estimates the coarse depth structure of the scene, a gradient network which estimates depth gradients, and a refining network which utilizes the outputs of the previous two networks to produce the final depth map. Additionally, I present a normalized loss function for training neural networks. Applying the normalized loss function results in better estimates of the scene's relative depth structure; however, it results in a loss of information about the absolute scale of the scene.
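One way to realize a "normalized loss" with the described trade-off (better relative structure, lost absolute scale) is to divide each depth map by its own mean before comparing; this is an illustrative guess at the idea, not the thesis's exact formulation:

```python
def normalized_l2_loss(pred, target, eps=1e-8):
    """L2 loss after normalizing each depth map by its own mean, so only
    the relative depth structure is penalized; absolute scale cancels out."""
    mp = sum(pred) / len(pred)
    mt = sum(target) / len(target)
    return sum((p / (mp + eps) - t / (mt + eps)) ** 2
               for p, t in zip(pred, target)) / len(pred)

gt = [1.0, 2.0, 4.0]
scaled = [2.0, 4.0, 8.0]   # same scene, twice the absolute scale
loss_same_structure = normalized_l2_loss(scaled, gt)   # near zero
loss_wrong_structure = normalized_l2_loss([4.0, 2.0, 1.0], gt)
```

Because any global scale factor divides out, a prediction that is a scaled copy of the ground truth incurs (almost) no loss, which is exactly why absolute-scale information is not learned.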
APA, Harvard, Vancouver, ISO, and other styles
41

Cridland, Doug, and Chris Dehmelt. "LONG TERM VEHICLE HEALTH MONITORING." International Foundation for Telemetering, 2007. http://hdl.handle.net/10150/604406.

Full text
Abstract:
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada
While any vehicle that is part of a flight test campaign is typically heavily instrumented to validate its performance, long term vehicle health monitoring is performed by a significantly reduced number of sensors due to a number of issues including cost, weight and maintainability. The development and deployment of smart sensor buses has reached a point at which they can be integrated into a larger data acquisition system environment. The benefits of these types of buses include a significant reduction in the amount of wiring and overall system complexity, achieved by placing the appropriate signal conditioners close to their respective sensors and providing data back over a common bus that also provides a single power source. The use of a smart-sensor data collection bus, such as IntelliBus™ or IEEE-1451, along with the continued miniaturization of signal conditioning devices, leads to the interesting possibility of permanently embedding data collection capabilities within a vehicle after the initial flight test effort has completed, providing long-term health-monitoring and diagnostic functionality that is not available today. This paper will discuss the system considerations and benefits of a smart-sensor-based system, and how pieces can be transitioned from flight qualification to long-term vehicle health monitoring in production vehicles.
APA, Harvard, Vancouver, ISO, and other styles
42

Hassani, Mujtaba. "CONSTRUCTION EQUIPMENT FUEL CONSUMPTION DURING IDLING : Characterization using multivariate data analysis at Volvo CE." Thesis, Mälardalens högskola, Akademin för ekonomi, samhälle och teknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-49007.

Full text
Abstract:
Human activities have increased the concentration of CO2 in the atmosphere, causing global warming. Construction equipment are semi-stationary machines that spend at least 30% of their lifetime idling. The majority of construction equipment is diesel powered and emits toxic emissions into the environment. In this work, idling is investigated by adopting several statistical regression models to quantify the fuel consumption of construction equipment during idling. The regression models studied in this work are: Multivariate Linear Regression (ML-R), Support Vector Machine Regression (SVM-R), Gaussian Process Regression (GP-R), Artificial Neural Network (ANN), Partial Least Squares Regression (PLS-R) and Principal Components Regression (PC-R). Findings show that pre-processing has a significant impact on the quality of the predictions in this field. Moreover, through mean centering and application of max-min scaling, the accuracy of the models increased remarkably. ANN and GP-R had the highest accuracy (99%), PLS-R was the third most accurate model (98% accuracy), ML-R was the fourth-best model (97% accuracy), SVM-R was the fifth-best (73% accuracy) and the lowest accuracy was recorded for PC-R (83% accuracy). The second part of this project estimated the CO2 emission based on the fuel used, by adopting the NONROAD2008 model.
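The preprocessing steps credited with the accuracy gains (mean centering and max-min scaling) combined with a least-squares fit can be sketched as follows; the idle-fuel numbers are invented for illustration:

```python
def min_max_scale(column):
    """Max-min scaling to [0, 1], applied before fitting."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]

def mean_center(column):
    """Subtract the column mean, the other preprocessing step mentioned."""
    m = sum(column) / len(column)
    return [v - m for v in column]

def ols_fit(x, y):
    """Closed-form ordinary least squares for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Invented idle data: engine speed (rpm) vs fuel rate (l/h).
rpm = [650, 700, 750, 800, 850]
fuel = [1.9, 2.1, 2.3, 2.5, 2.7]
x = min_max_scale(rpm)
y = mean_center(fuel)
slope, intercept = ols_fit(x, y)
```

The thesis's multivariate models (ML-R, PLS-R, PC-R, etc.) generalize this single-predictor fit; the sketch only shows why consistent scaling of the inputs matters before any of them are trained.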
APA, Harvard, Vancouver, ISO, and other styles
43

Mnie, Filali Imane. "Distribution multi-contenus sur Internet." Thesis, Université Côte d'Azur (ComUE), 2016. http://www.theses.fr/2016AZUR4068/document.

Full text
Abstract:
In this study, we focused on peer-to-peer protocols (P2P), which represent a promising solution for data dissemination and content delivery at low-cost in the Internet. We performed, initially, a behavioral study of various P2P protocols for file sharing (content distribution without time constraint) and live streaming. Concerning file sharing, we have shown the impact of Hadopi on users’ behavior and discussed the effectiveness of protocols according to content type, based on users’ choice. BitTorrent appeared as the most efficient approach during our study, especially when it comes to large content. As for streaming, we studied the quality of service of Sopcast, a live distribution network that accounts for more than 60% of P2P broadcast live events. Our in-depth analysis of these two distributionmodes led us to focus on the BitTorrent protocol because of its proven efficiency in file sharing and the fact that it is open source. In the second part of the thesis, we proposed and implemented a new protocol based on BitTorrent, in a controlled environment. The modifications that we proposed allow to increase the efficiency of the protocol through improved dissemination of metadata (the rarest piece), both for live and file sharing. An enhanced version is introduced with a push method, where nodes that lag behind receive an extra service so as to improve the overall performance
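The "rarest piece" metadata mentioned above refers to BitTorrent's classic rarest-first piece selection, which in sketch form looks like this (peer bitfields simplified to plain lists):

```python
from collections import Counter

def rarest_first(have, peer_bitfields):
    """Pick the piece we still need that is rarest across the swarm
    (BitTorrent's classic selection policy; ties broken by lowest index)."""
    counts = Counter()
    for bitfield in peer_bitfields:
        counts.update(i for i, owned in enumerate(bitfield) if owned)
    candidates = [i for i in counts if i not in have]
    if not candidates:
        return None
    return min(candidates, key=lambda i: (counts[i], i))

peers = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 0, 1],
]
# Piece 2 is held by nobody (so it cannot be requested); piece 1 is held
# by two peers, pieces 0 and 3 by three. We already have piece 0.
choice = rarest_first(have={0}, peer_bitfields=peers)
```

Prioritizing the rarest piece spreads scarce data quickly through the swarm, which is the property the proposed protocol extends with its push mechanism.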
APA, Harvard, Vancouver, ISO, and other styles
44

Berrios-Ayala, Mark. "Brave New World Reloaded: Advocating for Basic Constitutional Search Protections to Apply to Cell Phones from Eavesdropping and Tracking by Government and Corporate Entities." Honors in the Major Thesis, University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/1547.

Full text
Abstract:
Imagine a world where someone's personal information is constantly compromised, where federal government entities, AKA Big Brother, always know what anyone is Googling, who an individual is texting, and their emoticons on Twitter. Government entities have been doing this for years; they never cared if they were breaking the law or their moral compass of human dignity. Every day the Federal government blatantly siphons data with programs from the original ECHELON to the new series like PRISM and XKeyscore so they can keep tabs on issues that are none of their business; namely, the personal lives of millions. Our allies are taking note; some are learning our bad habits, from Government Communications Headquarters' (GCHQ) mass shadowing sharing plan to America's Russian inspiration, SORM. Some countries are following the United States' poster-child pose of a Brave New World-like order of global events. Others like Germany are showing their resolve in their disdain for the rise of tyranny. Soon, these newfound surveillance troubles will test the resolve of the American Constitution and its nation's strong love and tradition of liberty. Courts are currently at work to resolve how current concepts of liberty and privacy apply to the conditions facing the privacy of society. It remains to be determined how liberty will be affected as well; liberty for the United States of America, for the European Union, the Russian Federation and for the people of the World in regards to the extent of privacy in today's blurred privacy expectations.
B.S.
Bachelors
Health and Public Affairs
Legal Studies
APA, Harvard, Vancouver, ISO, and other styles
45

"Data Synchronization in a Network-Volatile Mobile Ecosystem." Thesis, 2014. http://hdl.handle.net/10388/ETD-2014-09-1765.

Full text
Abstract:
Today, it is a major issue for mobile applications to maintain a replica of the server's state on mobile devices. This creates the need to keep data on both the server and the mobile. In such cases, when the data changes on the server, the new state of the data has to be updated on the mobile in order to maintain a consistent view of the data flow. However, mobile devices communicate over wireless media (e.g., Bluetooth, Wi-Fi, 3.5G/4G, etc.) which can experience intermittent connectivity. The volatility of the network is also influenced by low bandwidth. The direct effects of these issues are high latency and inconsistency between the data on the mobile clients and the remote servers. In this work, I present a detailed review of the topic of data synchronization in mobile networks. Then, a generic architecture called MobiQ is proposed which can keep working in an offline mode to record local modifications and can synchronize with the remote servers when connectivity is restored. This is achieved through an efficient synchronization protocol which combines different synchronization and replication strategies. Moreover, the MobiQ framework provides a secured environment to work with data. The implemented architecture is designed and tested in a mobile questionnaire system, and the results are encouraging.
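The offline-recording and resynchronize-on-reconnect behavior attributed to MobiQ can be sketched with a journaled store and last-writer-wins conflict resolution; this is our simplified reading of the pattern, not MobiQ's actual protocol:

```python
import time

class OfflineStore:
    """Record local modifications while disconnected; replay them when
    the link returns, resolving conflicts by last-writer-wins timestamps."""

    def __init__(self):
        self.pending = []  # journal of (timestamp, key, value) writes

    def write(self, key, value, timestamp=None):
        ts = timestamp if timestamp is not None else time.time()
        self.pending.append((ts, key, value))

    def sync(self, server_state):
        """Apply queued writes in time order; a queued write loses only if
        the server already holds a newer timestamp for the same key."""
        for ts, key, value in sorted(self.pending):
            server_ts = server_state.get(key, (None, float("-inf")))[1]
            if ts >= server_ts:
                server_state[key] = (value, ts)
        self.pending.clear()
        return server_state

store = OfflineStore()
store.write("q1.answer", "yes", timestamp=10)
store.write("q1.answer", "no", timestamp=20)   # later local edit wins
server = {"q1.answer": ("maybe", 5), "q2.answer": ("n/a", 7)}
server = store.sync(server)
```

Real synchronization protocols layer authentication, partial replication and richer conflict strategies on top, but the journal-and-replay core is what makes offline operation safe.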
APA, Harvard, Vancouver, ISO, and other styles
46

Lin, Yu-Jyue, and 林鈺玨. "Synchronization and System Integration for Low-Data-Rate Wireless Network." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/ne4788.

Full text
Abstract:
Master's
National Taiwan University of Science and Technology
Department of Electronic Engineering
107
Many industries are developing new business models because the Internet of Things (IoT) is expanding at a rapid rate. IoT devices are small, light, low-power and low-cost, so we use the WARP V3 Reference Design, which is based on the IEEE 802.11a/g standard and an OFDM PHY transceiver, to achieve narrow bandwidth and low carrier frequency. In order to implement the transceiver for IoT, previous work integrated the AD9361 RF chip into the system, but the AD9361 has a larger frequency offset, which causes a higher packet error rate (PER); furthermore, the WARP V3 Reference Design cannot demodulate successfully at frequency offsets up to 40 ppm. The purpose of this thesis is to design a synchronization algorithm based on the WARP V3 Reference Design and implement the low-data-rate WLAN. To meet this goal, the subcarrier spacing should be reduced and the time to transmit and receive a packet increased, which makes OFDM modulation more sensitive to carrier frequency offset and sampling frequency offset. Therefore, the main goal of this thesis is to propose a new algorithm for frequency offset estimation. Finally, we use an FPGA board to verify our algorithm.
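A standard repeated-preamble estimator illustrates the kind of carrier frequency offset estimation the thesis targets; this is the textbook autocorrelation method, not necessarily the algorithm proposed in the thesis:

```python
import cmath
import math

def estimate_cfo(samples, rep_len, sample_rate):
    """Estimate carrier frequency offset from a preamble consisting of two
    identical halves of rep_len samples: the CFO rotates the second copy
    by a constant phase, recovered from the autocorrelation angle.
    Unambiguous range: |f| < sample_rate / (2 * rep_len)."""
    acc = sum(samples[n + rep_len] * samples[n].conjugate()
              for n in range(rep_len))
    phase = cmath.phase(acc)
    return phase * sample_rate / (2 * math.pi * rep_len)

fs = 1e6            # sample rate in Hz (illustrative)
true_cfo = 1200.0   # Hz
rep = 64
# A repeated all-ones preamble as seen through a channel with a pure CFO.
rx = [cmath.exp(2j * math.pi * true_cfo * n / fs) for n in range(2 * rep)]
cfo_hat = estimate_cfo(rx, rep, fs)
```

Note the trade-off the abstract describes: a longer repetition interval (smaller subcarrier spacing, longer packets) gives finer phase resolution but shrinks the unambiguous offset range, which is why a larger RF offset breaks the stock demodulator.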
APA, Harvard, Vancouver, ISO, and other styles
47

Mahmud, Muhammad Umar. "Data quality assessment of nignet network." Master's thesis, 2012. http://hdl.handle.net/10400.6/3423.

Full text
Abstract:
NIGNET is a network of Continuously Operating Reference Stations (CORS) for GNSS (Global Navigation Satellite Systems), established in 2009 by the Federal Government of Nigeria through the Office of the Surveyor General of the Federation (OSGoF), the office charged with surveying and mapping in Nigeria. Presently, the network consists of twelve (12) stations with a good geographic distribution. The principal objective of this research project is to assess the data quality of the NIGNET network by processing the data and analysing the derived time series. This was done by evaluating the errors on the time series and by computing the relative motions, using different plate tectonic models, with respect to (w.r.t.) the Nubia plate.
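Assessing a CORS coordinate time series typically reduces to fitting a linear trend (the station velocity) and inspecting the residual scatter; a minimal sketch with synthetic data (the numbers are invented, not NIGNET results):

```python
def fit_velocity(epochs, positions):
    """Least-squares slope of a coordinate time series: the station
    velocity in coordinate units per year."""
    n = len(epochs)
    mt = sum(epochs) / n
    mp = sum(positions) / n
    num = sum((t - mt) * (p - mp) for t, p in zip(epochs, positions))
    den = sum((t - mt) ** 2 for t in epochs)
    return num / den

def residual_rms(epochs, positions, vel):
    """RMS of residuals about the fitted trend: a simple quality metric."""
    n = len(epochs)
    mt = sum(epochs) / n
    mp = sum(positions) / n
    res = [p - (mp + vel * (t - mt)) for t, p in zip(epochs, positions)]
    return (sum(r * r for r in res) / n) ** 0.5

# Synthetic east-component series: 0.03 m/yr drift plus a constant offset.
epochs = [2009.0, 2009.5, 2010.0, 2010.5, 2011.0]
east = [5.000 + 0.03 * (t - 2009.0) for t in epochs]
vel = fit_velocity(epochs, east)
rms = residual_rms(epochs, east, vel)
```

The fitted velocities per station, minus the motion predicted by a plate model for Nubia, give the relative motions the project evaluates; large residual RMS flags data-quality problems at a station.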
APA, Harvard, Vancouver, ISO, and other styles
48

Liu, Chris, and 劉嘉言. "Implementation of Control/Data Plane Routing Tables Synchronization on A Network Processor-based Router." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/22955305109085439395.

Full text
Abstract:
Master's
National Chung Cheng University
Graduate Institute of Electrical Engineering
94
In recent years, router architecture has evolved from the traditional general-purpose-CPU-based architecture to the network-processor-based architecture. On network processors, for the purpose of load balancing between the Control Plane and the Forwarding Plane, the concept of separating the Control Plane and Forwarding Plane onto different operating systems and different CPUs has been proposed and implemented. This lets the network processor concentrate on packet receiving and forwarding, so that it does not have to consume additional resources executing signaling and routing processes. As a result, the network can perform well even at high transmission speeds. Separating the Control Plane and Forwarding Plane, on the other hand, can also create new problems. In this thesis, we use the IXDP2400 network processor produced by Intel Corporation as the development platform. Intel also provides the IXA SDK tools for developing its network processors. We use the CP-PDK, which is part of the IXA SDK, to allow the Control Plane to communicate with the Forwarding Plane. In particular, we design and implement a routing table synchronization mechanism so that the routing tables on the Control Plane and Forwarding Plane have the same route entries.
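Keeping the Forwarding Plane's route entries identical to the Control Plane's can be sketched as a diff-and-apply step; the table layout (prefix mapped to next hop) is a simplification of what the CP-PDK actually carries:

```python
def diff_routes(control_plane, forwarding_plane):
    """Compute the operations needed to make the Forwarding Plane's table
    match the Control Plane's (prefix -> next-hop entries)."""
    to_add = {p: nh for p, nh in control_plane.items()
              if forwarding_plane.get(p) != nh}   # new or changed entries
    to_delete = [p for p in forwarding_plane if p not in control_plane]
    return to_add, to_delete

def apply_sync(forwarding_plane, to_add, to_delete):
    """Apply deletions first, then additions/updates."""
    for p in to_delete:
        del forwarding_plane[p]
    forwarding_plane.update(to_add)
    return forwarding_plane

cp = {"10.0.0.0/8": "eth1", "192.168.1.0/24": "eth2"}
fp = {"10.0.0.0/8": "eth0", "172.16.0.0/12": "eth3"}
adds, dels = diff_routes(cp, fp)
fp = apply_sync(fp, adds, dels)
```

Shipping only the diff, rather than the full table, keeps the synchronization traffic between the two planes small, which matters when routing updates are frequent.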
APA, Harvard, Vancouver, ISO, and other styles
49

Allotey, Asuquo Kofi Essien. "Data protection and transborder data flows : implications for Nigeria's integration into the global network economy." Thesis, 2014. http://hdl.handle.net/10500/13903.

Full text
Abstract:
One of the realities that developing countries like Nigeria have to face today is that national and international markets have become more and more interconnected through the global platform of telecommunications and the Internet. This global networked economy is creating a paradigm shift in the focus of development goals and strategies particularly for developing countries. Globalisation is driving the nations of the world more into political and economic integration. These integrations are enhanced by a globally interconnected network of economic and communication systems at the apex of which is the Internet. This network of networks thrives on and encourages the expansion of cross-border flows of ideas and information, goods and services, technology and capital. Being an active member of the global network economy is essential to Nigeria’s economic development. It must plug into the network or risk being shut out. The global market network operates by means of rules and standards that are largely set by the dominant players in the network. Data protection is a critical component of the regime of rules and standards that govern the global network economy; it is evolving into an international legal order that transcends geographical boundaries. The EU Directive on data protection is the de facto global standard for data protection; it threatens to exclude non-EU countries without an adequate level of privacy protection from the EU market. More than 50 countries have enacted data protection laws modelled on the EU standard. Access to the huge EU market is a major motivation for the current trend in global harmonisation of domestic data protection laws. This trend provides a compelling reason for examining the issues relating to data protection and trans-border data flows and their implications for Nigeria’s desire to integrate into the global network economy. 
There are two primary motivations for legislating restrictions on the flow of data across national boundaries. The first is concern for the privacy of citizens, and the second is securing the economic well-being of a nation. It is important that Nigeria's privacy protection keeps pace with international norms in providing adequate protection for information privacy, in order to prevent potential impediments to international trading opportunities.
Public, Constitutional, & International
LLD
APA, Harvard, Vancouver, ISO, and other styles
50

HONG, JHENG-JYUN, and 洪正峻. "A Study of Resolving Data Synchronization Problems between Mobile Device and Server under Poor Network Environment." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/00577830246825597872.

Full text
Abstract:
Master's
National Taipei University of Nursing and Health Sciences
Graduate Institute of Information Management
104
Currently, mobile application development is increasing, and one common approach relies on a mobile database. However, environmental and network problems can hinder data synchronization between mobile devices and server databases, and there is still no module or product that fully achieves data synchronization between the server and mobile devices. Therefore, this study uses the web apps of a long-term care system as an example to resolve database synchronization problems between the server and mobile devices. The system can preload data onto the user's mobile device. Users can then add or modify form data offline, and the system automatically refreshes the data when a connection is available. The research framework of this study is divided into a Data Preloaded Module and a Data Uploaded Module. The Data Preloaded Module mainly handles downloading and updating, to the mobile device, the data that matches the required tasks and user permissions. In order to improve efficiency and support offline use, the Data Preloaded Module checks data integrity when the device is connected. The Data Uploaded Module mainly handles data recording; data modified in the mobile database is uploaded to the server database. During this work, two related tables were analysed and used in the Data Preloaded Module and the Data Uploaded Module. This study constructed the Data Preloaded Module and the Data Uploaded Module for data synchronization. We hope to share these experiences for future mobile systems research, to promote and improve this module, and to increase system performance and usability.
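The Data Preloaded Module's behavior (download only task- and permission-matched rows, then verify integrity when connected) can be sketched as follows; the row schema and permission levels are invented for illustration:

```python
import hashlib
import json

def preload(server_rows, user_tasks, user_permission):
    """Select only the rows matching the user's tasks and permission level,
    plus a checksum used later to verify the offline copy's integrity."""
    rows = [r for r in server_rows
            if r["task"] in user_tasks
            and r["min_permission"] <= user_permission]
    digest = hashlib.sha256(
        json.dumps(rows, sort_keys=True).encode()).hexdigest()
    return rows, digest

def verify_integrity(rows, digest):
    """Re-hash the local copy and compare with the preloaded checksum."""
    return hashlib.sha256(
        json.dumps(rows, sort_keys=True).encode()).hexdigest() == digest

server_rows = [
    {"task": "visit", "min_permission": 1, "form": "vitals"},
    {"task": "visit", "min_permission": 3, "form": "medication"},
    {"task": "audit", "min_permission": 1, "form": "log"},
]
rows, digest = preload(server_rows, user_tasks={"visit"}, user_permission=2)
```

Filtering on the server keeps the preload small for low-bandwidth links, and the checksum gives the module a cheap integrity test to run whenever connectivity returns.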
APA, Harvard, Vancouver, ISO, and other styles
