Academic literature on the topic 'Computers / Data Transmission Systems / General'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Computers / Data Transmission Systems / General.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Computers / Data Transmission Systems / General"

1

Venable, Richard M. "Data Transmission Through the Telephone Network: Protocols, Pitfalls, and Some Examples." Journal of AOAC INTERNATIONAL 69, no. 5 (September 1, 1986): 749–54. http://dx.doi.org/10.1093/jaoac/69.5.749.

Full text
Abstract:
Invariably, the situation arises where it is desirable to transfer data from one computer to another, especially from small laboratory systems, word processors, or home computers to large mainframe computers. In many of these cases, there are no common storage media; home computers do not have 9-track tape drives and large mainframes do not have 5¼ in. floppy disk drives. Transmission of data through the telephone network is a viable method for data transfer, which is paradoxically both easier than many believe and more difficult than some may claim. One of the keys to successful data transmission is an understanding of telecommunications protocols, i.e., the rules governing intersystem communication through the telephone network. Some of the most common protocols allow exchanging ASCII-coded data at either 300 or 1200 baud. A variety of computer systems can be used, including IBM and DEC mainframes, a Wang word processor, an IBM PC-compatible microcomputer, and the Atari 800 microcomputer. A specific example is the use of the Atari 800 as an APL terminal, complete with the custom character set, standard ASCII text, and data transfer.
APA, Harvard, Vancouver, ISO, and other styles
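The abstract above mentions exchanging ASCII-coded data at 300 or 1200 baud over the telephone network. As a quick illustration (not taken from the article), standard asynchronous serial framing sends each character with a start bit and a stop bit, so the effective character throughput is lower than the raw line rate. A minimal sketch, assuming the common 8N1 framing (8 data bits, no parity, 1 stop bit):

```python
# Hypothetical throughput estimate for asynchronous ASCII transfer over a modem
# link; the 8N1 framing parameters below are standard assumptions, not values
# taken from Venable's article.

def chars_per_second(baud, data_bits=8, parity_bits=0, start_bits=1, stop_bits=1):
    """Characters per second for simple asynchronous framing."""
    bits_per_char = start_bits + data_bits + parity_bits + stop_bits
    return baud / bits_per_char

for rate in (300, 1200):
    cps = chars_per_second(rate)
    seconds = 100_000 / cps          # time to move a 100 KB file at this rate
    print(f"{rate} baud: {cps:.0f} chars/s, ~{seconds / 60:.1f} min per 100 KB")
```

At 1200 baud this gives 120 characters per second, which is why transferring even modest laboratory data sets over dial-up lines required patience.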
2

Akhmetshina, Eleonora G. "MODELING DATA TRANSMISSION SYSTEMS USING MODERN INFORMATION TECHNOLOGIES." T-Comm 15, no. 8 (2021): 52–57. http://dx.doi.org/10.36724/2072-8735-2021-15-8-52-57.

Full text
Abstract:
When modeling data transmission systems for various purposes, including computer and telecommunication networks, both components of mathematical modeling are widely used: simulation modeling and analytical modeling based on queuing theory. At the same time, researchers can always compare the results obtained by means of simulation and analytical modeling. Among modern simulation technologies, one can single out the IT GURU Academic Edition technologies, represented by the Opnet Modeler and Riverbed Modeler software products with powerful graphical editors. The graphical editors allow you to create simulation models of data transmission systems of any complexity and to run these models to obtain statistics on the main performance indicators of these systems. Comparison of the simulation results with the results of queuing systems (QS) of the G/G/1 type makes it possible to assess the adequacy of both types of mathematical models. This article summarizes the results of the author's publications on G/G/1 systems based on time-shifted distribution laws such as the exponential, hyperexponential, and Erlang distributions. These distribution laws provide coefficients of variation less than, equal to, or greater than one for the random variables used. This fact is important from the point of view of queuing theory, because the average delay of claims in the system directly depends on the coefficients of variation of the time intervals for the arrival and servicing of claims.
APA, Harvard, Vancouver, ISO, and other styles
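The abstract above stresses that the mean delay in a G/G/1 queue depends on the coefficients of variation of the inter-arrival and service times. A minimal sketch of that dependence, using the classical Kingman approximation for the mean waiting time (a standard textbook formula and illustrative numbers, not the author's specific time-shifted models):

```python
# Kingman's approximation for the mean waiting time in a G/G/1 queue:
#   Wq ~= (rho / (1 - rho)) * ((ca**2 + cs**2) / 2) * E[S]
# where rho is the utilization, ca and cs are the coefficients of variation of
# the inter-arrival and service times, and E[S] is the mean service time.

def kingman_wq(lam, mu, ca, cs):
    rho = lam / mu            # utilization
    es = 1.0 / mu             # mean service time
    return (rho / (1.0 - rho)) * ((ca**2 + cs**2) / 2.0) * es

lam, mu = 0.8, 1.0            # arrival and service rates (rho = 0.8)
for ca, cs in [(0.5, 0.5), (1.0, 1.0), (2.0, 2.0)]:
    print(f"ca={ca}, cs={cs}: Wq ~= {kingman_wq(lam, mu, ca, cs):.2f}")
```

With ca = cs = 1 (the exponential case) the formula reduces to the exact M/M/1 result; coefficients of variation below one shorten the average delay and above one lengthen it, which is exactly the effect the article examines.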
3

Voropaieva, A., G. Stupak, and O. Zhabko. "INFORMATION TRANSMISSION ALGORITHMS FOR INFRASTRUCTURE COMPUTER-INTEGRATED DATA PROCESSING SYSTEMS." Naukovyi visnyk Donetskoho natsionalnoho tekhnichnoho universytetu 1(6), no. 2(7) (2021): 14–23. http://dx.doi.org/10.31474/2415-7902-2021-1(6)-2(7)-14-23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Younus, Safwan Hafeedh, Aubida A. Al-Hameed, Ahmed Taha Hussein, Mohammed Thamer Alresheedi, and Jaafar M. H. Elmirghani. "Parallel Data Transmission in Indoor Visible Light Communication Systems." IEEE Access 7 (2019): 1126–38. http://dx.doi.org/10.1109/access.2018.2886398.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Birvinskas, Darius, Vacius Jusas, Ignas Martisius, and Robertas Damasevicius. "Fast DCT algorithms for EEG data compression in embedded systems." Computer Science and Information Systems 12, no. 1 (2015): 49–62. http://dx.doi.org/10.2298/csis140101083b.

Full text
Abstract:
Electroencephalography (EEG) is widely used in clinical diagnosis, monitoring and Brain-Computer Interface systems. Usually EEG signals are recorded with several electrodes and transmitted through a communication channel for further processing. In order to decrease communication bandwidth and transmission time in portable or low cost devices, data compression is required. In this paper we consider the use of fast Discrete Cosine Transform (DCT) algorithms for lossy EEG data compression. Using this approach, the signal is partitioned into sets of 8 samples and each set is DCT-transformed. The least-significant transform coefficients are removed before transmission and are filled with zeros before an inverse transform. We conclude that this method can be used in real-time embedded systems, where low computational complexity and high speed are required.
APA, Harvard, Vancouver, ISO, and other styles
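The abstract above describes the compression pipeline: the signal is split into blocks of 8 samples, each block is DCT-transformed, the least-significant coefficients are dropped before transmission, and zeros are filled in before the inverse transform. The sketch below illustrates that block pipeline with plain NumPy, dropping the highest-index coefficients as the usual reading of "least significant"; it is an illustration of the general idea, not the authors' optimized embedded implementation.

```python
import numpy as np

N = 8  # block length used in the paper

# Orthonormal DCT-II matrix for blocks of N samples.
n = np.arange(N)
C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0, :] *= 1 / np.sqrt(2)
C *= np.sqrt(2 / N)          # C @ x gives the DCT-II of x; C.T @ y inverts it

def compress_block(x, keep):
    """Keep only the first `keep` DCT coefficients of an 8-sample block."""
    coeffs = C @ x
    return coeffs[:keep]     # only these values would be transmitted

def decompress_block(tx, keep):
    """Zero-fill the discarded coefficients and apply the inverse DCT."""
    coeffs = np.zeros(N)
    coeffs[:keep] = tx
    return C.T @ coeffs

# Toy signal standing in for one EEG block (not real EEG data).
block = np.sin(np.linspace(0, np.pi, N)) + 0.05 * np.random.randn(N)
tx = compress_block(block, keep=4)          # 2:1 compression
rec = decompress_block(tx, keep=4)
print("reconstruction error:", np.linalg.norm(block - rec))
```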
6

Elhoseny, Mohamed, Gustavo Ramirez-Gonzalez, Osama M. Abu-Elnasr, Shihab A. Shawkat, N. Arunkumar, and Ahmed Farouk. "Secure Medical Data Transmission Model for IoT-Based Healthcare Systems." IEEE Access 6 (2018): 20596–608. http://dx.doi.org/10.1109/access.2018.2817615.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Inyutin, S. A. "A Method for Reducing the Register Effect for Modular Data Formats." Informacionnye Tehnologii 28, no. 8 (August 15, 2022): 405–10. http://dx.doi.org/10.17587/it.28.405-410.

Full text
Abstract:
The paper substantiates the author's method of reducing the redundancy caused by the register effect when the components of a tuple representing a numerical value are placed in a modular format intended for storage, transmission, and processing in modular arithmetic in a specialized SIMD processor of parallel structure. Modular coding makes it possible to execute ring operations in parallel in independent computing paths. This, according to Amdahl's law, accelerates the execution of the computational process on multiprocessor computing systems or on multiple cores. Modular data formats are not consistent with the binary bit grid of a multiprocessor computer. In homogeneous binary registers designed to hold residues modulo the bases, redundancy occurs because not all possible binary combinations in a digital register are used to represent data. The method is based on redistributing the redundancy of the digital registers used to hold the components of the modular tuple, which makes it possible to reduce the register effect and the redundancy of the tuple-component representation to zero. This makes it possible to obtain a dense packing of the components of vector modular formats in homogeneous digital registers, which makes the development of SIMD-architecture computers processing data in modular formats promising. The simulation results yield mutually coprime bases of the modular number system that meet the conditions of a new patented method for the complete elimination of redundancy.
APA, Harvard, Vancouver, ISO, and other styles
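The abstract above concerns packing the residues of a modular (residue number system) representation into binary registers. As background for readers unfamiliar with modular formats, the sketch below shows the basic encode/decode step: representing an integer by its residues modulo pairwise coprime bases and recovering it with the Chinese remainder theorem. The bases are arbitrary illustrative choices, not the patented base sets discussed in the article.

```python
from math import prod

# Pairwise coprime moduli (illustrative only). Residues mod 31 or mod 32 fit in
# 5-bit registers; residues mod 33 need a 6-bit register but use only 33 of its
# 64 codes -- the kind of register redundancy the article addresses.
MODULI = (31, 32, 33)            # dynamic range M = 31 * 32 * 33 = 32736

def to_rns(x, moduli=MODULI):
    """Encode x as a tuple of residues (the 'modular format')."""
    return tuple(x % m for m in moduli)

def from_rns(residues, moduli=MODULI):
    """Decode via the Chinese remainder theorem."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m) is the modular inverse
    return x % M

a, b = 12345, 6789
ra, rb = to_rns(a), to_rns(b)
# Ring operations act independently on each residue channel (carry-free):
rsum = tuple((x + y) % m for x, y, m in zip(ra, rb, MODULI))
assert from_rns(rsum) == (a + b) % prod(MODULI)
print(ra, rb, "->", rsum, "=", from_rns(rsum))
```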
8

Riznyk, Volodymyr. "Big Data Process Engineering under Manifold Coordinate Systems." WSEAS TRANSACTIONS ON INFORMATION SCIENCE AND APPLICATIONS 18 (April 2, 2021): 7–11. http://dx.doi.org/10.37394/23209.2021.18.2.

Full text
Abstract:
This paper involves techniques for improving the quality indices of big data process engineering with respect to high-performance coded design, transmission speed, and reliability under manifold coordinate systems. The system is formed from a limited number of basis vectors. The set of modular sums of the vectors, including the vectors themselves, forms a t-dimensional toroidal coordinate grid over the toroid, and the basis is a subset of the full set of grid coordinates. These design techniques make it possible to configure high-performance information technology for big data coding design and vector signal processing. The underlying mathematical principles relate to the optimal placement of structural elements in spatially or temporally distributed systems using appropriate algebraic constructions based on cyclic groups in extensions of Galois fields, and to the development of the scientific basis for optimal solutions to wide classes of technological problems in big data process engineering and computer science.
APA, Harvard, Vancouver, ISO, and other styles
9

Wang, Ziheng, Heng Chen, and Weiguo Wu. "Client-Aware Negotiation for Secure and Efficient Data Transmission." Energies 13, no. 21 (November 4, 2020): 5777. http://dx.doi.org/10.3390/en13215777.

Full text
Abstract:
In Wireless Sensor Networks (WSNs), server clusters, and other systems requiring secure transmission, the overhead of data encryption and transmission is often not negligible. Unfortunately, a conflict exists between security and efficiency in processing data. Therefore, this paper proposes a strategy to overcome this conflict, called Client-Aware Negotiation for Secure and Efficient Data Transmission (CAN-SEAT). This strategy allows a client with different security transmission requirements to use the appropriate data security transmission without modifying the client. Two methods are designed for different clients. The first method is based on two-way authentication and renegotiation. After handshakes, the appropriate data security transmission scheme is selected according to the client requirements. Another method is based on redirection, which can be applied when the client does not support two-way authentication or renegotiation. For the characteristics of different architecture, this paper classifies and discusses symmetric key algorithms, asymmetric key algorithms, and hardware encryption instructions. In four application scenarios, the CAN-SEAT strategy is tested. Compared with the general transmission strategy, when only software encryption is used, the data processing and transmission cost can be reduced by 89.41% in the best case and by 15.40% in the worst case. When supporting hardware encryption, the cost can be reduced by 85.30% and 24.63%, respectively. A good effect was produced on the experimental platforms XiLinx, FT-2000+, and Intel processors. To the best of our knowledge, for Client-Aware Negotiation (CAN), this is the first method to be successfully deployed on a general system. CAN-SEAT can be easily combined with other energy-efficient strategies.
APA, Harvard, Vancouver, ISO, and other styles
10

Melián, José M., Adán Jiménez, María Díaz, Alejandro Morales, Pablo Horstrand, Raúl Guerra, Sebastián López, and José F. López. "Real-Time Hyperspectral Data Transmission for UAV-Based Acquisition Platforms." Remote Sensing 13, no. 5 (February 25, 2021): 850. http://dx.doi.org/10.3390/rs13050850.

Full text
Abstract:
Hyperspectral sensors that are mounted in unmanned aerial vehicles (UAVs) offer many benefits for different remote sensing applications by combining the capacity of acquiring a high amount of information that allows for distinguishing or identifying different materials, and the flexibility of the UAVs for planning different kind of flying missions. However, further developments are still needed to take advantage of the combination of these technologies for applications that require a supervised or semi-supervised process, such as defense, surveillance, or search and rescue missions. The main reason is that, in these scenarios, the acquired data typically need to be rapidly transferred to a ground station where it can be processed and/or visualized in real-time by an operator for taking decisions on the fly. This is a very challenging task due to the high acquisition data rate of the hyperspectral sensors and the limited transmission bandwidth. This research focuses on providing a working solution to the described problem by rapidly compressing the acquired hyperspectral data prior to its transmission to the ground station. It has been tested using two different NVIDIA boards as on-board computers, the Jetson Xavier NX and the Jetson Nano. The Lossy Compression Algorithm for Hyperspectral Image Systems (HyperLCA) has been used for compressing the acquired data. The entire process, including the data compression and transmission, has been optimized and parallelized at different levels, while also using the Low Power Graphics Processing Units (LPGPUs) embedded in the Jetson boards. Finally, several tests have been carried out to evaluate the overall performance of the proposed design. The obtained results demonstrate the achievement of real-time performance when using the Jetson Xavier NX for all the configurations that could potentially be used during a real mission. However, when using the Jetson Nano, real-time performance has only been achieved when using the less restrictive configurations, which leaves room for further improvements and optimizations in order to reduce the computational burden of the overall design and increase its efficiency.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Computers / Data Transmission Systems / General"

1

Tennant, Robert Satchwell. "An alternative peripheral executive for the data general AOS/VS operating system." Thesis, Rhodes University, 1990. http://hdl.handle.net/10962/d1002031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Blake, Carl David. "A REAL-TIME MULTI-TASKING OPERATING SYSTEM FOR GENERAL PURPOSE APPLICATIONS." Thesis, The University of Arizona, 1985. http://hdl.handle.net/10150/275400.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Knee, Simon. "Opal : modular programming using the BSP model." Thesis, University of Oxford, 1997. http://ora.ox.ac.uk/objects/uuid:97d95f01-a098-499c-8c07-303b853c2460.

Full text
Abstract:
Parallel processing can provide the huge computational resources that are required to solve today's grand challenges, at a fraction of the cost of developing sequential machines of equal power. However, even with such attractive benefits the parallel software industry is still very small compared to its sequential counterpart. This has been attributed to the lack of an accepted parallel model of computation, therefore leading to software which is architecture dependent with unpredictable performance. The Bulk Synchronous Parallel (BSP) model provides a solution to these problems and can be compared to the Von Neumann model of sequential computation. In this thesis we investigate the issues involved in providing a modular programming environment based on the BSP model. Using our results we present Opal, a BSP programming language that has been designed for parallel programming-in-the-large. While other BSP languages and libraries have been developed, none of them provide support for libraries of parallel algorithms. A library mechanism must be introduced into BSP without destroying the existing cost model. We examine such issues and show that the active library mechanism of Opal leads to algorithms which still have predictable performance. If algorithms are to retain acceptable levels of performance across a range of machines then they must be able to adapt to the architecture that they are executing on. Such adaptive algorithms require support from the programming language, an issue that has been addressed in Opal. To demonstrate the Opal language and its modular features we present a number of example algorithms. Using an Opal compiler that has been developed we show that we can accurately predict the performance of these algorithms. The thesis concludes that by using Opal it is possible to program the BSP model in a modular fashion that follows good software engineering principles. This enables large scale parallel software to be developed that is architecture independent, has predictable performance and is adaptive to the target architecture.
APA, Harvard, Vancouver, ISO, and other styles
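The abstract above relies on the BSP cost model, under which an algorithm's running time stays predictable across machines. For readers unfamiliar with it, a superstep is conventionally charged w + g·h + l, where w is the maximum local work on any processor, h the maximum number of words any processor sends or receives, g the per-word communication cost, and l the barrier synchronization cost. The sketch below shows this standard accounting with made-up machine parameters; it is the general BSP formula, not Opal's specific cost annotations.

```python
# Standard BSP cost accounting: the cost of one superstep is
#   max_i(w_i) + g * max_i(h_i) + l
# summed over supersteps. The machine parameters below are made-up examples.

def superstep_cost(work_per_proc, words_per_proc, g, l):
    return max(work_per_proc) + g * max(words_per_proc) + l

def bsp_cost(supersteps, g, l):
    """supersteps: list of (work_per_proc, words_per_proc) tuples."""
    return sum(superstep_cost(w, h, g, l) for w, h in supersteps)

# Two supersteps on 4 processors of a hypothetical machine with g = 4, l = 100.
program = [
    ([1000, 950, 1020, 990], [64, 64, 64, 64]),   # local compute + data exchange
    ([400, 420, 380, 410],   [16, 0, 16, 0]),     # reduction-style step
]
print("predicted cost (flop units):", bsp_cost(program, g=4, l=100))
```

Because the same (g, l) parameters can be measured for any BSP machine, a cost expression written once transfers between architectures, which is the property Opal's active libraries are designed to preserve.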
4

Braginton, Pauline. "Taxonomy of synchronization and barrier as a basic mechanism for building other synchronization from it." CSUSB ScholarWorks, 2003. https://scholarworks.lib.csusb.edu/etd-project/2288.

Full text
Abstract:
A Distributed Shared Memory (DSM) system consists of several computers that share a memory area and has no global clock. Therefore, an ordering of events in the system is necessary. Synchronization is a mechanism for coordinating activities between processes, which are program instantiations in a system.
APA, Harvard, Vancouver, ISO, and other styles
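The abstract above treats the barrier as a basic synchronization mechanism from which others can be built. A minimal sketch of barrier synchronization using Python's standard library, with threads standing in for the DSM processes the thesis discusses; this illustrates the concept only and is not the thesis's taxonomy.

```python
import threading

N = 4
barrier = threading.Barrier(N)   # all N workers must arrive before any proceeds

def worker(i, shared):
    shared[i] = i * i            # phase 1: each worker writes its own slot
    barrier.wait()               # nobody continues until every write is done
    # phase 2: it is now safe to read every other worker's result
    print(f"worker {i} sees total {sum(shared)}")

shared = [0] * N
threads = [threading.Thread(target=worker, args=(i, shared)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The barrier imposes exactly the ordering that the abstract notes is missing in a system without a global clock: every event before the barrier happens before every event after it.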
5

Jones-Diette, Julie Susan. "Establishment of methods for extracting and analysing patient data from electronic practice management software systems used in first opinion veterinary practice in the UK." Thesis, University of Nottingham, 2014. http://eprints.nottingham.ac.uk/14345/.

Full text
Abstract:
Examining patient records is a useful way to identify common conditions and treatment outcomes in veterinary practice and data gathered can be fed back to the profession to assist with clinical decision making. This research aimed to develop a method to extract clinical data from veterinary electronic patient records (EPRs) and to assess the value of the data extracted for use in practice-based research. The transfer of new research from continuing professional development (CPD) into practice was also considered. An extensible mark-up language (XML) schema was designed to extract information from a veterinary EPR. The analysis of free text was performed using a content analysis program and a clinical terms dictionary was created to mine the extracted data. Data collected by direct observation was compared to the extracted data. A review of research published in the proceedings of a popular veterinary CPD event, British Small Animal Veterinary Association (BSAVA) Congress, was appraised for evidence quality. All animal records were extracted and validation confirmed 100% accuracy. The content analysis produced results with a high specificity (100%) and the mined data analysis was successful in assessing the prevalence of a specific disease. On comparison, the data extracted from the EPR contained only 65% of all data recorded by direct observation. The review of BSAVA Congress abstracts found the majority of the clinical research abstracts (CRAs) presented to be case reports and case series, with differences in focus between CRAs and veterinary lecture stream abstracts. This study has demonstrated that data extraction using an XML schema is a viable method for the capture of patient data from veterinary EPRs. The next step will be to understand the differences found between data collected by observation and extraction, and to investigate how research presented as CPD is received, appraised and applied by the veterinary profession.
APA, Harvard, Vancouver, ISO, and other styles
6

McGookin, David Kerr. "Understanding and improving the identification of concurrently presented earcons." Thesis, Connect to e-thesis. Move to record for print version, 2004. http://theses.gla.ac.uk/14/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Scarlato, Michele. "Sicurezza di rete, analisi del traffico e monitoraggio." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/3223/.

Full text
Abstract:
The work is divided into three macro-areas. The first is a theoretical analysis of how intrusions work, of which software is used to carry them out, and of how to protect oneself (using the devices generically known as firewalls). The second macro-area analyses an intrusion carried out from the outside against sensitive servers on a LAN. This analysis is performed on the files captured by the two network interfaces configured in promiscuous mode on a probe placed in the LAN; there are two interfaces so that the probe can attach to two LAN segments with different subnet masks. The attack is analysed with several software tools, which effectively defines a third part of the work: the part where the captured files are examined, first with software that handles full-content data, such as Wireshark, then with software that handles session data, processed with Argus, and finally the statistical data, processed with Ntop. The penultimate chapter, the one before the conclusions, covers the installation of Nagios and its configuration for monitoring, through plugins, the remaining disk space on a remote agent machine and the MySql and DNS services. Naturally, Nagios can be configured to monitor any type of service offered on the network.
APA, Harvard, Vancouver, ISO, and other styles
8

Liu, Yi-Sheng. "A token based MAC protocol for wireless ad hoc networks." Thesis, 2003. http://hdl.handle.net/10413/4172.

Full text
Abstract:
The emergence of portable terminals in work and living environments is accelerating the progression of wireless networks. A wireless ad hoc network is a new network concept where users establish peer-to-peer communication among themselves independently, in their small area. Since the wireless medium is a shared resource, it becomes an important design issue to efficiently allocate bandwidth among users. MAC (Medium Access Control) layer arbitrates the channel access to the wireless medium and is also responsible for bandwidth allocation to different users, therefore a large amount of research has been conducted on various MAC protocols for ad hoc wireless networks. This dissertation begins with a survey of existing wireless MAC protocols. The survey includes protocols designed for different network generations and topologies, classifying them based on architecture and mode of operation. Next, we concentrate on the MAC protocols proposed for distributed wireless networks. We propose a new MAC protocol based on a token-passing strategy; which not only incorporates the advantages of the guaranteed access scheme into the distributed type of wireless networks, but also the data rate and delay level QoS guarantees. Data rate QoS provides fairness into sharing of the channel, while delay level QoS introduces a flexible prioritized access to channels by adjusting transmission permission to the current network traffic activities. A simulation model for the protocol is developed and delay and throughput performance results are presented. To examine the efficiency and performance of the proposed MAC scheme in an ad hoc wireless environment, it is incorporated into the Bluetooth structured network. The model is then simulated in the Bluetooth environment and performance results are presented. Furthermore, an analytical model is proposed and an approximate delay analysis conducted for the proposed MAC scheme. Analytical results are derived and compared with results obtained from computer simulations. The dissertation concludes with suggestions for improvements and future work.
Thesis (M.Sc.-Engineering)-University of Natal, 2003.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Computers / Data Transmission Systems / General"

1

Vixie, Paul A., ed. Sendmail: Theory and practice. Boston: Digital Press, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gralla, Preston. How Wireless Works. Upper Saddle River: Pearson Education, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Gralla, Preston. How wireless works. 2nd ed. Indianapolis, Ind: Que, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Gralla, Preston. Cómo funcionan las redes inalámbricas. Madrid: Anaya Multimedia, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

National Research Council (U.S.). National Research Network Review Committee. Toward a national research network. Washington, D.C: National Academy Press, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Peikari, Cyrus. Maximum Wireless Security. Upper Saddle River: Pearson Education, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Fogie, Seth, ed. Maximum wireless security. Indianapolis, Ind: Sams, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

McGraw-Hill, Inc., ed. McGraw-Hill data communications dictionary: Definitions and descriptions of general and SNA terms, recommendations, standards, interchange codes, IBM communications products, and units of measure. New York: McGraw-Hill, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Yuan, Michael Juntao. Developing scalable series 40 applications: A guide for Java developers. Upper Saddle River, NJ: Addison-Wesley, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Ledford, Jerri. Cut the cord!: The consumer's guide to VoIP. Boston, MA: Thomson Course Technology, 2006.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Computers / Data Transmission Systems / General"

1

Houankpo, H. G. K., and Dmitry Kozyrev. "Reliability Model of a Homogeneous Warm-Standby Data Transmission System with General Repair Time Distribution." In Distributed Computer and Communication Networks, 443–54. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36614-8_34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Marwedel, Peter. "System Software." In Embedded Systems, 203–37. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-60910-8_4.

Full text
Abstract:
In order to cope with the complexity of applications of embedded systems, reuse of components is a key technique. As pointed out by Sangiovanni-Vincentelli (The context for platform-based design. IEEE Design and Test of Computers, 2002), software and hardware components must be reused in the platform-based design methodology (see p. 296). These components comprise knowledge from earlier design efforts and constitute intellectual property (IP). Standard software components that can be reused include system software components such as embedded operating systems (OSs) and middleware. The last term denotes software that provides an intermediate layer between the OS and application software. This chapter starts with a description of general requirements for embedded operating systems. This includes real-time capabilities as well as adaptation techniques to provide just the required functionality. Mutually exclusive access to resources can result in priority inversion, which is a serious problem for real-time systems. Priority inversion can be circumvented with resource access protocols. We will present three such protocols: the priority inheritance, priority ceiling, and stack resource protocols. A separate section covers the ERIKA real-time system kernel. Furthermore, we will explain how Linux can be adapted to systems with tight resource constraints. Finally, we will provide pointers for additional reusable software components, like hardware abstraction layers (HALs), communication software, and real-time data bases. Our description of embedded operating systems and of middleware in this chapter is consistent with the overall design flow.
APA, Harvard, Vancouver, ISO, and other styles
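The abstract above notes that mutually exclusive resource access can cause priority inversion and that resource access protocols such as priority inheritance bound it. The toy scheduler below is a hypothetical discrete-time sketch (not the ERIKA kernel or any real RTOS): without inheritance, a medium-priority task preempts the low-priority lock holder and delays the high-priority task; with inheritance, the holder temporarily runs at the blocked task's priority, so the blocking time stays bounded.

```python
# Made-up task set: L (low) holds a lock that H (high) also needs; M (medium)
# does not touch the lock. Each simulated tick runs one task for one unit.

def simulate(inheritance):
    tasks = {
        "L": {"prio": 1, "work": 4, "arrive": 0, "uses_lock": True},
        "M": {"prio": 2, "work": 3, "arrive": 2, "uses_lock": False},
        "H": {"prio": 3, "work": 2, "arrive": 1, "uses_lock": True},
    }
    lock_owner = None
    timeline = []
    for t in range(12):
        ready = [n for n, k in tasks.items() if k["arrive"] <= t and k["work"] > 0]
        if not ready:
            timeline.append("-")
            continue

        def eff_prio(n):
            p = tasks[n]["prio"]
            # Priority inheritance: the lock owner runs at the priority of the
            # highest-priority task currently blocked on the lock.
            if inheritance and n == lock_owner:
                blocked = [tasks[m]["prio"] for m in ready
                           if m != n and tasks[m]["uses_lock"]]
                p = max([p] + blocked)
            return p

        # A task that needs the lock but does not hold it is blocked.
        runnable = [n for n in ready
                    if not (tasks[n]["uses_lock"] and lock_owner not in (None, n))]
        run = max(runnable, key=eff_prio)
        if tasks[run]["uses_lock"] and lock_owner is None:
            lock_owner = run
        tasks[run]["work"] -= 1
        if tasks[run]["work"] == 0 and lock_owner == run:
            lock_owner = None
        timeline.append(run)
    return "".join(timeline)

print("no inheritance:  ", simulate(False))   # H is stalled while M runs
print("with inheritance:", simulate(True))    # L finishes its critical section first
```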
3

Vocaturo, Eugenio. "Image Classification Techniques." In Handbook of Research on Disease Prediction Through Data Analytics and Machine Learning, 22–49. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-2742-9.ch003.

Full text
Abstract:
The image processing task, aimed at interpreting and classifying the contents of the images, has attracted the attention of researchers since the early days of computers. With the advancement of computing system technology, image categorization has found increasingly broader applications, covering new generation disciplines such as image analysis, object recognition, and computer vision, with quite general applications in both scientific and humanistic fields. The automatic recognition, description, and classification of the structures contained in the images are of fundamental importance in a vast set of scientific and engineering fields that require the acquisition, processing, and transmission of information in visual form. Classification tasks also include those related to the categorization of images, such as the construction of a recognition system, the representation of patterns, the selection and extraction of features, and the definition of automatic recognition methods. Image analysis is of collective interest and is a hot topic of current research.
APA, Harvard, Vancouver, ISO, and other styles
4

Rani, Madhu, Shagun, and Manisha Gupta. "AI and Over-the-Top (OTT)." In Revolutionizing Business Practices Through Artificial Intelligence and Data-Rich Environments, 188–99. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-6684-4950-9.ch010.

Full text
Abstract:
Artificial intelligence (AI) is a field of study that focuses on the development and theory of computer systems that are capable of doing tasks that would normally need the intelligence of humans. Language translation, decision-making, and speech recognition are only a few of the tasks that, in general, need the use of a human brain to be performed properly. In the context of content delivery, an OTT platform (also known as over the top) is a platform that does not provide video via traditional cable or receivers. Video and audio distribution via the internet without the involvement of a multiple system operator (MSO) is referred to as online video and audio distribution. When it comes to the administration and transmission of information, there are several options. Viewers are able to access it from any place at any time and save it for later viewing convenience.
APA, Harvard, Vancouver, ISO, and other styles
5

Caeiro, José Jasnau, and João Carlos Martins. "Water Management for Rural Environments and IoT." In Harnessing the Internet of Everything (IoE) for Accelerated Innovation Opportunities, 83–99. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-7332-6.ch004.

Full text
Abstract:
Internet of Things (IoT) systems are starting to be developed for applications in the management of water quality monitoring systems. The chapter presents some of the work done in this area and also shows some systems being developed by the authors for the Alentejo region. A general architecture for water quality monitoring systems is discussed. The important issue of computer security is mentioned and connected to recent publications related to the blockchain technology. Web services, data transmission technology, micro web frameworks, and cloud IoT services are also discussed.
APA, Harvard, Vancouver, ISO, and other styles
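The chapter summarized above mentions web services and micro web frameworks as the ingestion path for water-quality sensor data. Purely as an illustration of that architectural element, the sketch below shows a minimal Flask endpoint; the route name, JSON fields, and framework choice are assumptions for the example, not the authors' API.

```python
# Minimal, hypothetical ingestion endpoint for water-quality readings.
from flask import Flask, jsonify, request

app = Flask(__name__)
readings = []  # in-memory store standing in for a database or cloud IoT service

@app.route("/readings", methods=["POST"])
def add_reading():
    data = request.get_json(force=True)
    # e.g. {"sensor_id": "well-03", "ph": 7.1, "turbidity_ntu": 2.4}
    readings.append(data)
    return jsonify({"stored": len(readings)}), 201

@app.route("/readings", methods=["GET"])
def list_readings():
    return jsonify(readings)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

In a deployment of the kind the chapter describes, such an endpoint would sit behind TLS and authentication, since the chapter explicitly raises computer security as an issue.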
6

Caeiro, José Jasnau, and João Carlos Martins. "Water Management for Rural Environments and IoT." In Research Anthology on Blockchain Technology in Business, Healthcare, Education, and Government, 246–62. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-5351-0.ch015.

Full text
Abstract:
Internet of Things (IoT) systems are starting to be developed for applications in the management of water quality monitoring systems. The chapter presents some of the work done in this area and also shows some systems being developed by the authors for the Alentejo region. A general architecture for water quality monitoring systems is discussed. The important issue of computer security is mentioned and connected to recent publications related to the blockchain technology. Web services, data transmission technology, micro web frameworks, and cloud IoT services are also discussed.
APA, Harvard, Vancouver, ISO, and other styles
7

Bourguet, Marie-Luce. "An Overview of Multimodal Interaction Techniques and Applications." In Human Computer Interaction, 95–101. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-87828-991-9.ch008.

Full text
Abstract:
Desktop multimedia (multimedia personal computers) dates from the early 1970s. At that time, the enabling force behind multimedia was the emergence of the new digital technologies in the form of digital text, sound, animation, photography, and, more recently, video. Nowadays, multimedia systems mostly are concerned with the compression and transmission of data over networks, large capacity and miniaturized storage devices, and quality of services; however, what fundamentally characterizes a multimedia application is that it does not understand the data (sound, graphics, video, etc.) that it manipulates. In contrast, intelligent multimedia systems at the crossing of the artificial intelligence and multimedia disciplines gradually have gained the ability to understand, interpret, and generate data with respect to content. Multimodal interfaces are a class of intelligent multimedia systems that make use of multiple and natural means of communication (modalities), such as speech, handwriting, gestures, and gaze, to support human-machine interaction. More specifically, the term modality describes human perception on one of the three following perception channels: visual, auditive, and tactile. Multimodality qualifies interactions that comprise more than one modality on either the input (from the human to the machine) or the output (from the machine to the human) and the use of more than one device on either side (e.g., microphone, camera, display, keyboard, mouse, pen, track ball, data glove). Some of the technologies used for implementing multimodal interaction come from speech processing and computer vision; for example, speech recognition, gaze tracking, recognition of facial expressions and gestures, perception of sounds for localization purposes, lip movement analysis (to improve speech recognition), and integration of speech and gesture information. In 1980, the put-that-there system (Bolt, 1980) was developed at the Massachusetts Institute of Technology and was one of the first multimodal systems. In this system, users simultaneously could speak and point at a large-screen graphics display surface in order to manipulate simple shapes. In the 1990s, multimodal interfaces started to depart from the rather simple speech-and-point paradigm to integrate more powerful modalities such as pen gestures and handwriting input (Vo, 1996) or haptic output. Currently, multimodal interfaces have started to understand 3D hand gestures, body postures, and facial expressions (Ko, 2003), thanks to recent progress in computer vision techniques.
APA, Harvard, Vancouver, ISO, and other styles
8

Bourguet, Marie-Luce. "An Overview of Multimodal Interaction Techniques and Applications." In Encyclopedia of Human Computer Interaction, 451–56. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-562-7.ch068.

Full text
Abstract:
Desktop multimedia (multimedia personal computers) dates from the early 1970s. At that time, the enabling force behind multimedia was the emergence of the new digital technologies in the form of digital text, sound, animation, photography, and, more recently, video. Nowadays, multimedia systems mostly are concerned with the compression and transmission of data over networks, large capacity and miniaturized storage devices, and quality of services; however, what fundamentally characterizes a multimedia application is that it does not understand the data (sound, graphics, video, etc.) that it manipulates. In contrast, intelligent multimedia systems at the crossing of the artificial intelligence and multimedia disciplines gradually have gained the ability to understand, interpret, and generate data with respect to content. Multimodal interfaces are a class of intelligent multimedia systems that make use of multiple and natural means of communication (modalities), such as speech, handwriting, gestures, and gaze, to support human-machine interaction. More specifically, the term modality describes human perception on one of the three following perception channels: visual, auditive, and tactile. Multimodality qualifies interactions that comprise more than one modality on either the input (from the human to the machine) or the output (from the machine to the human) and the use of more than one device on either side (e.g., microphone, camera, display, keyboard, mouse, pen, track ball, data glove). Some of the technologies used for implementing multimodal interaction come from speech processing and computer vision; for example, speech recognition, gaze tracking, recognition of facial expressions and gestures, perception of sounds for localization purposes, lip movement analysis (to improve speech recognition), and integration of speech and gesture information. In 1980, the put-that-there system (Bolt, 1980) was developed at the Massachusetts Institute of Technology and was one of the first multimodal systems. In this system, users simultaneously could speak and point at a large-screen graphics display surface in order to manipulate simple shapes. In the 1990s, multimodal interfaces started to depart from the rather simple speech-and-point paradigm to integrate more powerful modalities such as pen gestures and handwriting input (Vo, 1996) or haptic output. Currently, multimodal interfaces have started to understand 3D hand gestures, body postures, and facial expressions (Ko, 2003), thanks to recent progress in computer vision techniques.
APA, Harvard, Vancouver, ISO, and other styles
9

Chander, Bhanu. "The State-of-the-Art Cryptography Techniques for Secure Data Transmission." In Handbook of Research on Intrusion Detection Systems, 284–305. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2242-4.ch014.

Full text
Abstract:
Cryptography is the process by which messages are sent intelligibly from one user to another while providing the wireless transmission system with essential security services such as confidentiality, data integrity, and authentication. Encryption methods therefore make a crucial contribution to communication security. Here we describe the characteristics of various symmetric and asymmetric encryption techniques, along with the use of optimization techniques in cryptography to reduce computational complexity. Moreover, advanced techniques such as zero-knowledge proofs, multi-party computation, homomorphic encryption, cognitive cryptography, and blockchain, together with their associated protocols, are described. Current extensive research on quantum computers targets mathematical problems that are hard or intractable for classical computers; quantum cryptography, its challenges, the goal of quantum-resistant cryptography, and the associated literature are also described.
APA, Harvard, Vancouver, ISO, and other styles
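The chapter summarized above surveys symmetric and asymmetric encryption for secure data transmission. As a concrete, minimal illustration of the symmetric case, the sketch below uses the widely available `cryptography` package; it is generic example code, not a scheme taken from the chapter.

```python
# Minimal symmetric encryption/decryption sketch using Fernet
# (AES-based authenticated encryption from the `cryptography` package).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, distribute or derive this securely
cipher = Fernet(key)

plaintext = b"sensor reading: 42.7 C"
token = cipher.encrypt(plaintext)  # ciphertext plus timestamp and integrity tag
print("transmitted token:", token[:32], b"...")

recovered = cipher.decrypt(token)  # raises InvalidToken if the data was tampered with
assert recovered == plaintext
```

An asymmetric scheme would instead encrypt a per-session symmetric key with the recipient's public key, the hybrid pattern most transmission systems use in practice.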
10

Guo, Junxia, Gang Lu, Zili Xie, Jiawei Wen, and Nanshan Xu. "An Intelligent Marshalling Model for Enterprise Station Freight Railway." In Fuzzy Systems and Data Mining VI. IOS Press, 2020. http://dx.doi.org/10.3233/faia200740.

Full text
Abstract:
Railway marshalling and transportation is an important component of the production supply chain for large and medium-sized enterprises in China. Traditional manually made marshalling plans are usually inefficient and not optimal in time or energy consumption. An efficient method needs to be developed to find the optimal marshalling plans automatically. This paper mainly studies automatic railway train marshalling in large and medium-sized enterprises in China. Based on an investigation at the train station of a certain enterprise, and according to the railway track information, carriage information, and production task information, this paper designs abstracted railway state definitions for the station. Then, based on the state definitions, the scheduling rules, and an objective function of time cost and economic cost, this paper converts abstract scheduling instructions into a general railway automatic marshalling model which can be executed by computers. By introducing greedy strategies for different situations to optimize the algorithms for track occupation, carriage selection, and train path selection in the model, the planning efficiency can be improved while ensuring the economic benefits of the enterprises and the quality of the formation plan. The experimental results show that the proposed model can generate fewer marshalling plans and find the optimal one faster in most cases, which demonstrates the feasibility and applicability of the model.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Computers / Data Transmission Systems / General"

1

Budai, Csaba, László L. Kovács, and József Kövecses. "Analysis of the Effect of Coulomb Friction on Haptic Systems Dynamics." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59961.

Full text
Abstract:
Dissipation mechanisms and dissipative forces play a pivotal role in the operations and performance of human-machine interfaces and particularly in haptic systems. Dissipation is a very difficult phenomenon to model. Coulomb friction in general can be the most influential element in systems involving multiple direct contact connections such as joints with transmissions or mechanically guided components. Coulomb friction includes non-smooth discontinuity and can induce complex dynamic behaviours. Very little attention has been paid to the analysis of the effects of Coulomb friction in haptic devices. In this paper we illustrate, by experiment, analysis and simulation, the nature of the dynamic behaviour caused by Coulomb friction in haptic sampled-data systems. We demonstrate that a simple model can represent this behaviour, and show the effects of the haptic system parameters on this dynamics.
APA, Harvard, Vancouver, ISO, and other styles
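The abstract above points out that Coulomb friction introduces a non-smooth discontinuity into sampled-data haptic systems. The sketch below simulates a single-degree-of-freedom device rendering a virtual spring with a zero-order-hold (sampled) force plus a simple Coulomb friction term; all parameter values are made-up illustrations, not the authors' experimental identification.

```python
import numpy as np

# 1-DOF haptic device rendering a virtual spring, with Coulomb friction:
#   m * a = F_spring(sampled, zero-order hold) - Fc * sign(v)
m, Fc, k = 0.1, 0.3, 500.0        # mass [kg], friction level [N], stiffness [N/m]
T, dt = 1e-3, 1e-5                # 1 kHz controller sampling, finer physics step

x, v = 0.01, 0.0                  # released 10 mm inside the virtual spring
F_hold = -k * x                   # controller output, held between samples
next_sample = T
for step in range(int(0.5 / dt)):
    t = step * dt
    if t >= next_sample:          # controller recomputes the force at 1 kHz only
        F_hold = -k * x
        next_sample += T
    a = (F_hold - Fc * np.sign(v)) / m
    v += a * dt
    x += v * dt

# The oscillation decays and effectively freezes at a nonzero offset: once the
# amplitude is small, the sampled spring force can no longer exceed the
# Coulomb friction level, a simple instance of the non-smooth behaviour studied.
print("final position (mm):", 1000 * x)
```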
2

Play, Daniel, Nicolas Fritsch, Ste´phane Huot, and Eric Ayax. "Numerical Simulations of Timing Belt Camshaft Layout: Local and Global Behavior." In ASME 2003 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/detc2003/ptg-48008.

Full text
Abstract:
Mechanical behaviors of power transmission systems have to be defined during preliminary design. Today, numerical simulations replace classical tests. General experimental validation of numerical results obtained with computer softwares was performed before. The design of timing belt camshaft layouts follows the same way. Meshing of timing belt on pulleys is related to local behavior of load transmission while dynamic response of a whole camshaft drive layout is related to global behavior. Because of the complex nature of phenomena that take place in such mechanical systems and due to practical requirements in Design Office concerning limitation of computer times for example, local and global analyses are made separately but results of the local analysis serve as inputs for the second analysis. The purpose of the local analysis is to optimize pulley groove profiles of pulleys in order to insure both smaller dynamic excitations and a larger timing belt fatigue life. Simulations of tooth meshing are made under non-linear FEM study. Tooth meshing is described step by step and both bending of inner timing belt cords and transmission error effects are defined in relation to specific shapes of pulley groove profiles. The quasi-static transmission error constitutes one of input data for global dynamic simulations through specific in-house software DSTD (Dynamic Simulation of Timing Drive). Dynamic loads and dynamic transmission error are obtained in relation with inertia, stiffness and damping of mechanical elements and with timing belt characteristics. Results are discussed in relation with design parameters of camshaft drive layout.
APA, Harvard, Vancouver, ISO, and other styles
3

Manhartsgruber, Bernhard. "Repetitive Excitation Control for Precise Measurement of Wave Propagation in Fluid Power Systems." In ASME/BATH 2015 Symposium on Fluid Power and Motion Control. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/fpmc2015-9599.

Full text
Abstract:
Wave propagation effects in fluid power systems in general and the more specialized field of transmission line modelling have gained substantial research interest in the fluid power community. While the mathematical tools applied in computer models become more and more sophisticated, the availability of highly precise experimental data is still very limited in the fluid power literature. This paper focuses on a rather simple wave propagation experiment with a hydraulic transmission line featuring a servo-valve for excitation at one end and a blocked end boundary condition at the other end. The goal is to achieve precise control of a periodic excitation pressure waveform at the servo-valve boundary by repetitive or iterative learning control techniques. High resolution (24 bit) analog to digital conversion of the measured pressure signals together with the application of periodic averaging techniques allow for a highly precise measurement of the wave propagation dynamics including a margin of error analysis of the results. In future research, the measurement data will be used for refinement of fluid material laws for transmission line models as well as for studying the influence of geometric features like sharp edged diameter changes or elbow joints.
APA, Harvard, Vancouver, ISO, and other styles
4

Merkulova, A. G., S. A. Kalinina, and M. V. Skavronskaya. "FATIGUE ASSESSMENT OF THE VISUAL ANALYZER OF MULTIMONITOR SYSTEMS OPERATORS." In The 16th «OCCUPATION and HEALTH» Russian National Congress with International Participation (OHRNC-2021). FSBSI “IRIOH”, 2021. http://dx.doi.org/10.31089/978-5-6042929-2-1-2021-1-350-354.

Full text
Abstract:
Introduction. Working at a computer is associated with an intense cognitive load and an increased load on the visual analyzer due to the peculiarities of the screen image transmission. From 60 to 90% of users suffer from computer visual syndrome, more than 40% experience visual discomfort. If it is necessary to use several software windows in the workplace, multimonitor systems are increasingly used, however, there is still no data on their effect on the visual analyzer. Research objective. Assessment of the state of the visual analyzer of multimonitor systems operators in the dynamics of the work shift. Materials and methods. The study involved 26 operators of multimonitor systems (age 36.7 ± 8.3 years, experience 5.8 ± 3.0 years). The assessment of labor intensity in accordance with the Guidelines R 2.2.2006-05 and ergonomic analysis of workplaces were carried out. The functional state of the visual analyzer was assessed using eye tracking, accommodometry, sequential contrast perception time, subjective assessment of asthenopia symptoms. Research results. The labor intensity of operators of multimonitor systems corresponds to class 3.2. Ergonomic assessment of workplaces indicates irregularities in the arrangement of equipment and office furniture. There were no statistically significant differences in the indicators of oculomotor activity in the dynamics of the shift, while low values of the frequency of blinking were noted only in workers with an irrationally organized workplace, as well as when observing one monitor. By the end of the shift, the volume of accommodation decreased by 19.0%, the time of perception of sequential contrast by 15.3%, the most pronounced symptoms of asthenopia were general and visual fatigue. Conclusions. The use of multimonitor systems leads to the development of asthenopia by the end of the shift, however, the decrease in the volume of accommodation and the time of perception of consistent contrast are more pronounced in workers with one monitor. Due to the impossibility of changing the work process and reducing the class of NT, workers should pay special attention to the ergonomic characteristics of the workplace, compliance with work and rest regimes, prevention of the development of asthenopia and general fatigue.
APA, Harvard, Vancouver, ISO, and other styles
5

Garcilazo, Rafael, Xin Xue, and V. Sundararajan. "Feasibility of Wireless Sensors for Health Monitoring in Large Induction Motors." In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-87834.

Full text
Abstract:
Wired sensor systems are currently used to monitor the performance and health of electric motors. Since the sensors need to be wired, these systems can only use few sensor modalities which are generally insufficient to detect the wide range of faults in motors. Wireless sensors, on the other hand, allow access to sensors mounted in accessible locations and on rotating parts. They are easy to install and maintain. However, the reliability of the transmission due to electromagnetic interference and the fidelity of the data due to high winding temperatures inside the motor need to be examined. This paper studies the feasibility of wireless sensors inside a 200hp AC induction motor. Two wireless sensors are attached inside the motor — one on the stator frame and one on the rotating shaft. A wired sensor is attached on the outside of the stator frame to study the fidelity of data from the wireless sensor. The packet delivery performance as a function of spatial location in terms of direction and distance with respect to the base station and the fidelity of data received by the base station are studied. The results show that an average of 97% and 87.9% of the data from the wireless sensor attached on the stator frame and shaft respectively is received at the base station, thus showing that wireless sensors can be reliably used inside the motor.
APA, Harvard, Vancouver, ISO, and other styles
6

Sittakul, Vitawat, Sarinya Pasakawee, and Piya Kovintavewat. "Data Transmission of Zigbee over Fiber." In 2019 34th International Technical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC). IEEE, 2019. http://dx.doi.org/10.1109/itc-cscc.2019.8793406.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ming Hu and Junshan Zhang. "Rate adaptation for bursty data transmission in CDMA networks." In Conference Record. Thirty-Fifth Asilomar Conference on Signals, Systems and Computers. IEEE, 2001. http://dx.doi.org/10.1109/acssc.2001.987778.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Lin, Di, and Fabrice Labeau. "A scheme of bandwidth allocation for the transmission of medical data." In 2010 44th Asilomar Conference on Signals, Systems and Computers. IEEE, 2010. http://dx.doi.org/10.1109/acssc.2010.5757531.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hakansson, Victor Wattin, Naveen K. D. Venkategowda, and Stefan Werner. "Optimal Transmission Threshold and Channel Allocation Strategies for Heterogeneous Sensor Data." In 2021 55th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2021. http://dx.doi.org/10.1109/ieeeconf53345.2021.9723275.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Sang, Zhiqian, and Xun Xu. "Development of a Smart Computer Numerical Control System." In ASME 2013 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/imece2013-65984.

Full text
Abstract:
Traditional Computer Numerical Control (CNC) machines use ISO6983 (G/M code) for part programming. G/M code has a number of drawbacks and one of them is lack of interoperability. The Standard for the Exchange of Product for NC (STEP-NC) as a potential replacement for G/M code aims to provide a unified and interoperable data model for CNC. In a modern CNC machine tool, more and more motors, actuators and sensors are implemented and connected to the NC system, which leads to large quantity of data being transmitted. The real-time Ethernet field-bus is faster and more deterministic and can fulfill the requirement of data transmission in the high-speed and high-precision machining scenarios. It can provide more determinism on communication, openness, interoperability and reliability than a traditional field-bus. With a traditional CNC system using G/M code, when the machining is interrupted by incidents, restarting the machining process is time-consuming and highly experience-dependent. The proposed CNC controller can generate just-in-time tool paths for feature-based machining from a STEP-NC file. When machining stoppage occurs, the system can recover from stoppage incidents with minimum human intervention. This is done by generating new tool paths for the remaining machining process with or without the availability of the original cutting tool. The system uses a real-time Ethernet field-bus as the connection between the controller and the motors.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Computers / Data Transmission Systems / General"

1

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova, and Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [б. в.], February 2020. http://dx.doi.org/10.31812/123456789/3677.

Full text
Abstract:
An analysis of the experience of professional training bachelors of electromechanics in Ukraine and abroad made it possible to determine that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, electronic engineering and automation) in mechatronics for the purpose of design, manufacture, operation and maintenance electromechanical equipment. Teaching mechatronics provides for the meaningful integration of various disciplines of professional and practical training bachelors of electromechanics based on the concept of modeling and technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools of bachelors of electromechanics are mobile Internet devices (MID) – a multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting, presenting all kinds of messages and data. The authors reveals the main possibilities of using MID in learning to ensure equal access to education, personalized learning, instant feedback and evaluating learning outcomes, mobile learning, productive use of time spent in classrooms, creating mobile learning communities, support situated learning, development of continuous seamless learning, ensuring the gap between formal and informal learning, minimize educational disruption in conflict and disaster areas, assist learners with disabilities, improve the quality of the communication and the management of institution, and maximize the cost-efficiency. Bachelor of electromechanics competency in modeling of technical objects is a personal and vocational ability, which includes a system of knowledge, skills, experience in learning and research activities on modeling mechatronic systems and a positive value attitude towards it; bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for processes analyzes, systems synthesis, evaluating their reliability and effectiveness for solving practical problems in professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional and specialized professional. The implementation of the technique of using MID in learning bachelors of electromechanics in modeling of technical objects is the appropriate methodic of using, the component of which is partial methods for using MID in the formation of the general scientific component of the bachelor of electromechanics competency in modeling of technical objects, are disclosed by example academic disciplines “Higher mathematics”, “Computers and programming”, “Engineering mechanics”, “Electrical machines”. The leading tools of formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects are augmented reality mobile tools (to visualize the objects’ structure and modeling results), mobile computer mathematical systems (universal tools used at all stages of modeling learning), cloud based spreadsheets (as modeling tools) and text editors (to make the program description of model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects) and mobile communication tools (to organize a joint activity in modeling).
APA, Harvard, Vancouver, ISO, and other styles
2

Daudelin, Francois, Lina Taing, Lucy Chen, Claudia Abreu Lopes, Adeniyi Francis Fagbamigbe, and Hamid Mehmood. Mapping WASH-related disease risk: A review of risk concepts and methods. United Nations University Institute for Water, Environment and Health, December 2021. http://dx.doi.org/10.53328/uxuo4751.

Full text
Abstract:
The report provides a review of how risk is conceived of, modelled, and mapped in studies of infectious water, sanitation, and hygiene (WASH) related diseases. It focuses on spatial epidemiology of cholera, malaria and dengue to offer recommendations for the field of WASH-related disease risk mapping. The report notes a lack of consensus on the definition of disease risk in the literature, which limits the interpretability of the resulting analyses and could affect the quality of the design and direction of public health interventions. In addition, existing risk frameworks that consider disease incidence separately from community vulnerability have conceptual overlap in their components and conflate the probability and severity of disease risk into a single component. The report identifies four methods used to develop risk maps, i) observational, ii) index-based, iii) associative modelling and iv) mechanistic modelling. Observational methods are limited by a lack of historical data sets and their assumption that historical outcomes are representative of current and future risks. The more general index-based methods offer a highly flexible approach based on observed and modelled risks and can be used for partially qualitative or difficult-to-measure indicators, such as socioeconomic vulnerability. For multidimensional risk measures, indices representing different dimensions can be aggregated to form a composite index or be considered jointly without aggregation. The latter approach can distinguish between different types of disease risk such as outbreaks of high frequency/low intensity and low frequency/high intensity. Associative models, including machine learning and artificial intelligence (AI), are commonly used to measure current risk, future risk (short-term for early warning systems) or risk in areas with low data availability, but concerns about bias, privacy, trust, and accountability in algorithms can limit their application. In addition, they typically do not account for gender and demographic variables that allow risk analyses for different vulnerable groups. As an alternative, mechanistic models can be used for similar purposes as well as to create spatial measures of disease transmission efficiency or to model risk outcomes from hypothetical scenarios. Mechanistic models, however, are limited by their inability to capture locally specific transmission dynamics. The report recommends that future WASH-related disease risk mapping research:
- Conceptualise risk as a function of the probability and severity of a disease risk event. Probability and severity can be disaggregated into sub-components. For outbreak-prone diseases, probability can be represented by a likelihood component while severity can be disaggregated into transmission and sensitivity sub-components, where sensitivity represents factors affecting health and socioeconomic outcomes of infection.
- Employ jointly considered unaggregated indices to map multidimensional risk. Individual indices representing multiple dimensions of risk should be developed using a range of methods to take advantage of their relative strengths.
- Develop and apply collaborative approaches with public health officials, development organizations and relevant stakeholders to identify appropriate interventions and priority levels for different types of risk, while ensuring the needs and values of users are met in an ethical and socially responsible manner.
- Enhance identification of vulnerable populations by further disaggregating risk estimates and accounting for demographic and behavioural variables and using novel data sources such as big data and citizen science.
This review is the first to focus solely on WASH-related disease risk mapping and modelling. The recommendations can be used as a guide for developing spatial epidemiology models in tandem with public health officials and to help detect and develop tailored responses to WASH-related disease outbreaks that meet the needs of vulnerable populations. The report's main target audience is modellers, public health authorities and partners responsible for co-designing and implementing multi-sectoral health interventions, with a particular emphasis on facilitating the integration of health and WASH services delivery contributing to Sustainable Development Goals (SDG) 3 (good health and well-being) and 6 (clean water and sanitation).
APA, Harvard, Vancouver, ISO, and other styles