Academic literature on the topic 'Transmission basse latence'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Transmission basse latence.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Transmission basse latence"

1

Kraemer, Rolf, Marcin Brzozowski, and Stefan Nowak. "Reliable architecture for heterogeneous home-networks: The OMEGA I-MAC approach." Facta universitatis - series: Electronics and Energetics 25, no. 1 (2012): 43–58. http://dx.doi.org/10.2298/fuee1201043k.

Full text
Abstract:
Home networks are becoming more and more popular. Today, several different technologies, such as Wireless LAN (WLAN), Power Line Communication (PLC), and Ethernet, are used concurrently to connect home devices. If a connection fault occurs, communication stops and new connectivity has to be established. Since fault detection can take several seconds, it reduces the "Quality of Experience" dramatically, and ongoing transmissions can be disturbed. Reliable transmission is therefore a necessary precondition for fulfilling customers' expectations. In our approach, we suggest an additional protocol layer, dubbed layer 2.5, which manages all available connectivity and automatically chooses a new connection with the correct properties, for instance for an HDTV stream. It balances load and can also be used to distribute traffic intelligently between nodes. The system has been proven to work in a standard Linux kernel implementation at speeds up to 1 Gb/s and with extremely low latency. The topology control and signalling components have been implemented in Linux user space and work on a best-effort basis. In this paper, we outline the architectural considerations and show initial results. The work has been used as the starting point for a new IEEE standardization effort, IEEE 1905.1, which started from the OMEGA I-MAC architecture introduced here. The work described here has received funding from the European Commission's Seventh Framework Programme FP7/2007-2013 under grant agreement no. 213311, also referred to as OMEGA.
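The layer-2.5 selection logic described in this abstract can be sketched in a few lines. The following is a hypothetical illustration only (the class and function names are ours, not from the OMEGA I-MAC implementation): among the live links, pick the lowest-latency one whose properties satisfy the stream's requirements.

```python
# Hypothetical sketch of a layer-2.5 link manager: choose the best live
# link whose properties satisfy a stream's requirements (e.g. HDTV).
from dataclasses import dataclass

@dataclass
class Link:
    name: str              # e.g. "WLAN", "PLC", "Ethernet"
    up: bool               # is the link currently alive?
    bandwidth_mbps: float  # available throughput
    latency_ms: float      # one-way latency estimate

def select_link(links, min_bw_mbps, max_latency_ms):
    """Lowest-latency live link meeting the requirements, or None
    (in which case the manager would trigger re-signalling)."""
    candidates = [l for l in links
                  if l.up
                  and l.bandwidth_mbps >= min_bw_mbps
                  and l.latency_ms <= max_latency_ms]
    return min(candidates, key=lambda l: l.latency_ms, default=None)
```

A fault on the chosen link would simply mark it `up=False` and re-run `select_link`, which mirrors the automatic fail-over behaviour the abstract describes.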
APA, Harvard, Vancouver, ISO, and other styles
2

Ballotta, L., G. Peserico, F. Zanini, and P. Dini. "To Compute or Not to Compute? Adaptive Smart Sensing in Resource-Constrained Edge Computing." IEEE Transactions on Network Science and Engineering 11, no. 1 (2024): 736–49. https://doi.org/10.1109/TNSE.2023.3306202.

Abstract:
We consider a network of smart sensors for an edge computing application that sample a time-varying signal and send updates to a base station for remote global monitoring. Sensors are equipped with sensing and compute capabilities, and can either send raw data or process them on board before transmission. Limited hardware resources at the edge generate a fundamental latency-accuracy trade-off: raw measurements are inaccurate but timely, whereas accurate processed updates are available only after a processing delay. Hence, one needs to decide when sensors should transmit raw measurements and when they should rely on local processing to maximize network monitoring performance. To tackle this sensing design problem, we develop an estimation-theoretic optimization framework that embeds both computation and communication latency, and propose a Reinforcement Learning-based approach that dynamically allocates computational resources at each sensor. The effectiveness of our proposed approach is validated through numerical experiments motivated by smart sensing for the Internet of Drones and self-driving vehicles. In particular, we show that, under constrained computation at the base station, monitoring performance can be further improved by online sensor selection.
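The latency-accuracy trade-off at the heart of this paper can be illustrated with a toy decision rule (this is our simplification, not the paper's RL policy): charge each update its measurement noise plus the variance the signal accumulates while the update is delayed, and send whichever variant is cheaper.

```python
# Toy illustration of the raw-vs-processed trade-off (hypothetical model):
# an update's expected error is its measurement noise plus the variance the
# time-varying signal accumulates during the update's delay.

def expected_error(noise_var, delay_s, drift_var_per_s):
    return noise_var + drift_var_per_s * delay_s

def choose_mode(raw_var, raw_delay, proc_var, proc_delay, drift):
    """Pick the update type with the smaller expected error."""
    raw_err = expected_error(raw_var, raw_delay, drift)
    proc_err = expected_error(proc_var, proc_delay, drift)
    return "raw" if raw_err <= proc_err else "processed"
```

The intuition matches the abstract: for slowly varying signals the accurate (but delayed) processed update wins; for fast-drifting signals timeliness dominates and raw data is preferable.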
3

Loua, L. R., M. A. Budihardjo, and S. Sudarno. "A review on a machine learning approach of an intelligent irrigation monitoring system with edge computing and the internet of things." IOP Conference Series: Earth and Environmental Science 896, no. 1 (2021): 012029. http://dx.doi.org/10.1088/1755-1315/896/1/012029.

Abstract:
Water consumption during irrigation has been a much-researched area in agricultural activities, and owing to the nature of commonly practiced irrigation systems, a considerable amount of water is wasted. As a result, intelligent systems have been designed to integrate water-saving techniques and climatic data collection to improve irrigation. An innovative decision-making system was developed in which an ontology contributes 50% of the decision while sensor values contribute the remaining 50%. Collectively, the system bases its decision on a KNN machine learning algorithm for irrigation scheduling. It also uses two database servers, an edge server and an IoT server, along with a GSM module to reduce the burden of data transmission while also reducing latency. With this method, the sensors could trace and analyze the data within the network using the edge server before transferring it to the IoT server for future watering requirements. The water-saving technique ensured that the crops obtained the amount of water required for growth and that the soil did not reach its wilting point. Furthermore, the reduced irrigation water also limits potential runoff events. The results were displayed in an Android application.
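The 50/50 decision fusion mentioned in the abstract can be sketched as follows (the function and parameter names are hypothetical, not from the reviewed system):

```python
# Hypothetical sketch of the equal-weight decision fusion: the irrigation
# decision averages an ontology-derived score and a sensor/KNN-derived
# score, each contributing 50%.

def irrigation_decision(ontology_score, sensor_score, threshold=0.5):
    """Scores in [0, 1]; irrigate when the equally weighted mix
    reaches the threshold."""
    combined = 0.5 * ontology_score + 0.5 * sensor_score
    return combined >= threshold
```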
4

Tehseen, Rabia, Uzma Omer, Maham Mehr Awan, Rubab Javaid, Ayesha Zaheer, and Madiha Yousaf. "Impact of climatic anomalies and reservoir induced seismicity on earthquake generation using Federated Learning." VFAST Transactions on Software Engineering 12, no. 1 (2024): 133–51. http://dx.doi.org/10.21015/vtse.v12i1.1729.

Abstract:
In this article, the impact of climatic anomalies and artificial hydraulic loading on earthquake generation has been studied using the federated learning (FL) technique, and a model for earthquake prediction has been proposed. Federated learning, one of the most recent machine learning (ML) techniques, guarantees that the proposed model possesses the intrinsic ability to handle all data-related concerns, including data privacy, data availability, data security, and the network latency glitches involved in earthquake prediction, by restricting data transmission to the network during different stages of model training. The main objective of this study is to determine the impact of artificial stresses and climatic anomalies on increases and decreases in regional seismicity. Experimental verification of the proposed model has been carried out within a 100 km radial area around 34.708° N, 72.5478° E in the Western Himalayan region. Regional data on atmospheric temperature, air pressure, rainfall, reservoir water level, and seismicity have been collected on an hourly basis from 1985 to 2022. In this research, four client stations at different points within the selected area have been established to train local models by calculating time-lag correlations between multiple data parameters. These local models are transmitted to a central server, where a global model is trained to generate an earthquake alert with a ten-day lead time, alarming a specific client that reported high correlation among all selected parameters about an expected earthquake.
5

Stone, S., P. Witkovsky, and M. Schutte. "A chromatic horizontal cell in the Xenopus retina: intracellular staining and synaptic pharmacology." Journal of Neurophysiology 64, no. 6 (1990): 1683–94. http://dx.doi.org/10.1152/jn.1990.64.6.1683.

Abstract:
1. We identified a chromatic-type horizontal cell (C-cell) in the Xenopus retina by intracellular dye injection with Lucifer yellow or horseradish peroxidase (HRP). C-cells hyperpolarized in response to blue light and depolarized in response to red light. 2. In either photopic or mesopic states, moderate-intensity blue and red stimuli evoked responses that were inverted with respect to each other but of similar waveform and latency. In the presence of a bright green adapting field, the maximal voltage (Vmax) of the hyperpolarizing and depolarizing response component approached 30 mV; the kinetics of both waveforms were fast, and the hyperpolarizing response was followed by a small depolarizing overshoot at light OFF. Thus the blue-sensitive photoreceptor is capable of initiating large visual signals under photopic conditions when transmission from green-sensitive rods is suppressed. Under mesopic conditions (no adapting field) the kinetics of both waveforms were slower. The Vmax of the hyperpolarizing response reached 30-40 mV, whereas the cone-mediated depolarization saturated at 15 mV. 3. Both response components of the C-cell showed large receptive fields with no center-surround antagonism. 4. The C-cell perikaryon was located in the distal inner nuclear layer. It emitted four to seven long, tapering processes that ran horizontally for 90-100 microns. Two kinds of terminal dendrites, short and long, extended from the tapering processes toward the layer of photoreceptor bases. 5. Glycine (5-10 mM) completely eliminated the depolarizing response of the C-cell, whereas the hyperpolarizing component was unaffected. In contrast, gamma-aminobutyric acid (GABA; 5-10 mM) had no obvious effect on either component. 6. 
The C-cell light response was modified in two stages by cis-2,3-piperidine dicarboxylic acid (cis-PDA; 0.5-5 mM): first the depolarizing response disappeared; then the membrane potential hyperpolarized concomitant with a large reduction or elimination of the hyperpolarizing light response. In contrast, DL-2-amino-4-phosphonobutyric acid (APB) had no obvious effect on either response component or the membrane potential of the cell. 7. Our pharmacological findings are consistent with the view that the hyperpolarizing response in the C-cell is mediated by direct synaptic input from a blue-sensitive photoreceptor. The depolarizing response mediated by the red-sensitive cone could be explained by a direct synapse from the red cone or an indirect pathway involving luminosity (L-type) horizontal cells.

Dissertations / Theses on the topic "Transmission basse latence"

1

Aklouf, Mourad. "Video for events : Compression and transport of the next generation video codec." Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPASG029.

Abstract:
The acquisition and delivery of video content with minimal latency has become essential in several business areas such as sports broadcasting, video conferencing, telepresence, remote vehicle operation, and remote system control. The live streaming industry grew in 2020 and will expand further in the next few years with the emergence of new high-efficiency video codecs based on the Versatile Video Coding (VVC) standard and the fifth generation of mobile networks (5G). HTTP Adaptive Streaming (HAS) methods such as MPEG-DASH, using algorithms to adapt the transmission rate of compressed video, have proven very effective in improving the quality of experience (QoE) in a video-on-demand (VOD) context. Nevertheless, minimizing the delay between image acquisition and display at the receiver is essential in applications where latency is critical. Most rate adaptation algorithms are developed to optimize video transmission from a server situated in the core network to mobile clients. In applications requiring low-latency streaming, such as remote control of drones or broadcasting of sports events, the role of the server is played by a mobile terminal, which acquires and compresses the video and transmits the compressed stream via a radio access channel to one or more clients. Client-driven rate adaptation approaches are therefore unsuitable in this context because of the variability of the channel characteristics. In addition, HAS methods, for which decisions are made with a periodicity on the order of a second, are not sufficiently reactive when the server is moving, which may generate significant delays. It is therefore important to use a very fine adaptation granularity in order to reduce the end-to-end delay. The reduced size of the transmission and reception buffers (to minimize latency) makes it more difficult to adapt the throughput in our use case.
When the bandwidth varies with a time constant smaller than the period at which the regulation is performed, bad transmission rate decisions can induce a significant latency overhead. The aim of this thesis is to provide some answers to the problem of low-latency delivery of video acquired, compressed, and transmitted by mobile terminals. We first present a frame-by-frame rate adaptation algorithm for low-latency broadcasting. A Model Predictive Control (MPC) approach is proposed to determine the coding rate of each frame to be transmitted. This approach uses information about the buffer level of the transmitter and the characteristics of the transmission channel. Since the frames are coded live, a model relating the quantization parameter (QP) to the output rate of the video encoder is required. Hence, we have proposed a new model linking the rate to the QP of the current frame and to the distortion of the previous frame. This model provides much better results, in the context of a frame-by-frame decision on the coding rate, than the reference models in the literature. In addition to the above techniques, we have also proposed tools to reduce the complexity of video encoders such as VVC. The current version of the VVC encoder (VTM10) has an execution time nine times higher than that of the HEVC encoder. Therefore, the VVC encoder is not suitable for real-time encoding and streaming applications on currently available platforms. In this context, we present a systematic branch-and-prune method to identify a set of coding tools that can be disabled while satisfying a constraint on coding efficiency. This work contributes to the realization of a real-time VVC encoder.
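The frame-by-frame decision loop described above can be caricatured in a few lines. This is a deliberately simplified sketch under assumed models (a linear buffer correction and an exponential QP-rate model R(QP) = a·2^(−QP/6)); the thesis's actual MPC controller and its rate model, which also uses the previous frame's distortion, are more elaborate.

```python
# Simplified per-frame rate control for low-latency streaming (hypothetical
# models, see lead-in): budget each frame from the estimated channel
# capacity minus a buffer correction, then invert a QP-rate model.
import math

def frame_budget(buffer_bits, ref_bits, est_capacity_bps, fps):
    """Bits for the next frame: the channel's per-frame share minus the
    surplus currently sitting in the transmitter buffer."""
    per_frame = est_capacity_bps / fps
    correction = buffer_bits - ref_bits   # drain any surplus immediately
    return max(per_frame - correction, 0.1 * per_frame)

def qp_for_budget(budget_bits, a):
    """Invert the assumed model R(QP) = a * 2**(-QP / 6); clamp to [0, 51]."""
    qp = -6.0 * math.log2(budget_bits / a)
    return min(max(round(qp), 0), 51)
```

Keeping the buffer near a small reference level is what bounds the queueing delay; the quality of the QP-rate model then determines how accurately each frame hits its budget.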

Book chapters on the topic "Transmission basse latence"

1

Luo, Zicheng, Xiaohan Li, Demu Zou, and Hao Bai. "Federated Reinforcement Learning Algorithm with Fair Aggregation for Edge Caching." In Advances in Transdisciplinary Engineering. IOS Press, 2024. https://doi.org/10.3233/atde241221.

Abstract:
Edge caching is employed to handle massive data requests while ensuring the quality of the user experience. However, existing edge caching algorithms often overlook issues related to user mobility, privacy protection, and the non-identically and independently distributed (non-i.i.d.) characteristics of content requests among base stations. To tackle these challenges, this paper proposes the Federated Reinforcement Learning Algorithm with Fair Aggregation for Edge Caching (FFA-PPO). The paper focuses primarily on the scenario of non-i.i.d. content requests in a multi-base-station, multi-mobile-user network. We model this problem as a Markov Decision Process (MDP) and propose a federated reinforcement learning method to solve it, with the goal of minimizing the content transmission latency of the base stations. The FFA-PPO algorithm resolves gradient conflicts by seeking the optimal gradient vector within a local ball centered at the averaged gradient, which ensures the model's fairness. Simulation results show that the proposed FFA-PPO algorithm outperforms baseline algorithms in terms of both content transmission latency and fairness.
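FFA-PPO's ball-constrained search for a fair aggregate gradient is beyond a short sketch, but the gradient-conflict phenomenon it addresses is easy to show. Below is the simpler, well-known projection idea (PCGrad-style), given purely for illustration; it is not the FFA-PPO aggregation rule:

```python
# Illustration of gradient conflict between two clients: when the inner
# product is negative, one client's update degrades the other's objective.
# PCGrad-style fix (NOT FFA-PPO): project the conflicting component out.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_out(g, h):
    """If g conflicts with h (negative inner product), remove from g its
    component along h; otherwise return g unchanged."""
    d = dot(g, h)
    if d < 0:
        scale = d / dot(h, h)
        g = [a - scale * b for a, b in zip(g, h)]
    return g
```

After projection the adjusted gradient is orthogonal to the conflicting one, so following it no longer hurts that client, which is the same fairness intuition FFA-PPO formalizes with its constrained search around the averaged gradient.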
2

Nayak, Roopashree, Pavanalaxmi S., and Praveen Kumar M. "5G-6G." In Advances in Environmental Engineering and Green Technologies. IGI Global, 2023. http://dx.doi.org/10.4018/979-8-3693-0819-6.ch001.

Abstract:
Mobile networks play a crucial role in facilitating communication through the transmission and reception of radio wave signals. These networks are composed of interconnected cells provided by base stations, enabling wide geographic coverage. The evolution of mobile networks has progressed through several stages. It began with the analog-based first-generation systems that provided basic voice communication. The current phase is the fifth generation, which aims to deliver exceptional performance with faster speeds, low latency, and connectivity density. Ongoing research and development continue to shape the evolution of mobile networks, with technologies like 6G on the horizon, promising even faster speeds and transformative use cases. The network infrastructure for 5G and 6G plays a crucial role in enabling the capabilities and delivering the promised benefits of these advanced wireless communication technologies. The industry applications of 5G and the anticipated applications of 6G are diverse and have the potential to revolutionize various sectors.
3

Rath, Prabhakar, Smita Rani Parija, and Kishan Gupta. "The Effective Cost-Reduction Plan for Particle Swarm Optimization-Based Mobile Location Monitoring in 5G Communications." In The Role of Network Security and 5G Communication in Smart Cities and Industrial Transformation. BENTHAM SCIENCE PUBLISHERS, 2025. https://doi.org/10.2174/9789815305876125010005.

Abstract:
Cost reduction in mobile communication networks has become a key subject of attention because these networks account for a significant share of the overall cost structure of information and communication technology (ICT). This research examines 5G networks, which comprise a heterogeneous mix of macro cells and small cells with a clear demarcation between the data and control planes. The paper considers two categories of traffic: high-rate data traffic and low-rate traffic. In the conventional separation architecture, large-scale cellular base stations (MBSs) are responsible for controlling and regulating signals, whereas a small cell base station (SBS) handles data transmission at both low and high rates. In the modified separation architecture under consideration, an MBS manages control signals and low-rate data flow, whereas an SBS handles high-speed data flow. An efficient energy-saving method is presented to improve the cost-effectiveness of base stations (BSs). The operational state of a BS is determined by the number of user equipments (UEs) requesting high-rate data traffic and the number of UEs within the overlapping areas covered by both the considered BS and neighboring BSs. To implement this cost-cutting method, particle swarm optimization (PSO) is applied to formulate and solve the underlying optimization problem. The findings demonstrate that the suggested energy-saving approach, implemented within the redesigned split network design, surpasses the energy efficiency of traditional energy-efficient techniques under both the basic and the customized network structures. Additionally, the suggested plan significantly reduces cumulative latency, offering a highly promising strategy for enhancing overall network efficiency.

Conference papers on the topic "Transmission basse latence"

1

Mishra, Prabodh Kumar, Snigdhaswin Kar, Chun-Chih Lin, Kuang-Ching Wang, and Linke Guo. "Enabling Robust Communication Among Military Ground Vehicles Using Multi-Connectivity." In WCX SAE World Congress Experience. SAE International, 2023. http://dx.doi.org/10.4271/2023-01-0110.

Abstract:
Vehicle-to-Everything (V2X) communications provide attractive advantages in achieving reliable and high-performance connectivity among ground and aerial military vehicles. The 5G New Radio (NR) based cellular-V2X (C-V2X) technology can support wide coverage areas with the higher data rates and lower latencies needed for demanding military applications, ranging from real-time sensing to navigation of autonomous military ground vehicles. Millimeter-wave (mmWave) technology is critical to meet such throughput and latency requirements. However, mmWave links have a low transmission range and are often subject to blockages due to factors like weather and terrain that make them unreliable. Multi-connectivity with packet duplication can be used to enhance reliability and latency by transmitting concurrently over independent links between a mobile device and multiple base stations. We propose and evaluate a novel method based on new radio dual connectivity (NR-DC) and packet duplication techniques to achieve reliable communication between military ground vehicles, especially in mobility scenarios. We further propose and analyze a Channel State Information Reference Signal Received Quality (CSI-RSRQ) based duplication strategy to improve the system's radio resource utilization. Channel State Information Reference Signal symbols in downlink transmissions are used to accurately compute the CSI-RSRQ values of the radio channel in real time. This is critical on the battlefield for real-time awareness and adaptive control in fast-changing environments. Prototyped in the Simu5G network simulator and MATLAB, our results show that packet duplication achieved less than 5 milliseconds of latency with zero packet loss under mobility.
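The duplication strategy sketched in the abstract, sending a packet over the second NR-DC leg only when channel quality makes it worthwhile, might look like the following. The threshold value and function names are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical CSI-RSRQ-gated packet duplication over two NR-DC legs:
# duplicate only when the primary leg looks weak and the secondary leg
# is usable, saving radio resources otherwise. Threshold is illustrative.

def legs_to_use(rsrq_primary_db, rsrq_secondary_db, dup_threshold_db=-15.0):
    """Return the set of legs a packet is sent on."""
    legs = {"primary"}
    if rsrq_primary_db < dup_threshold_db and rsrq_secondary_db >= dup_threshold_db:
        legs.add("secondary")
    return legs
```

Gating duplication on measured CSI-RSRQ is what lets the scheme keep reliability high without blindly doubling the radio resource usage on every packet.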
2

Shiono, Show, Yukitoshi Sanada, Ryota Kimura, Hiroki Matsuda, and Ryo Sawai. "Latency of Uplink Non-Orthogonal Multiple Access Grant-Free Transmission with Multiuser Detection in Base Station." In 2019 IEEE 90th Vehicular Technology Conference (VTC2019-Fall). IEEE, 2019. http://dx.doi.org/10.1109/vtcfall.2019.8891232.

3

Jaworski, Maciej, Ryszard Wnuk, Małgorzata Cieślak, and Bogna Goetzendorf-Grabowska. "Experimental Investigation and Mathematical Modelling of Thermal Performance Characteristics of Textiles Incorporating Phase Change Materials (PCMs)." In Environmental Engineering. VGTU Technika, 2017. http://dx.doi.org/10.3846/enviro.2017.260.

Abstract:
Impregnation of textiles (fabrics) with phase change materials (PCMs) changes their thermal properties. The high thermal capacity of PCMs, due to the large enthalpy of phase change (latent heat), increases the potential of these materials for heat accumulation, but it also modifies heat transfer in transient states, which improves their insulating characteristics. The paper presents selected results of both experimental and theoretical investigation of the thermal behavior of textiles impregnated with PCMs under variable thermal loads. The thermal capacities of textiles containing different amounts of microencapsulated PCM were measured with DSC. Their thermal behavior was then investigated under irradiation from a solar simulator (heating phase) and during cooling under natural convection. A mathematical model of heat transfer in the textiles, including radiative and convective boundary conditions, was formulated. Computer simulations of the processes under study, validated against experimental results, allowed important properties of the textiles to be determined, such as the absorption and transmission coefficients for solar radiation. Overall thermal characteristics of the textiles, i.e., temperature variations under different thermal loads, are also presented in the paper.
4

Reis, Gabriele Pereira dos, Diego Bezerra Soares, Isabela Reis Manzoli, and Lohraine Talia Domingues. "MECANISMOS BIOMOLECULARES DA SUCCINILCOLINA NA HIPERTERMIA MALIGNA." In II Congresso Brasileiro de Biologia Molecular On-line. Revista Multidisciplinar em Saúde, 2021. http://dx.doi.org/10.51161/rems/2326.

Abstract:
Introduction: Malignant Hyperthermia (MH) is a potentially severe, latent pharmacogenetic disease of autosomal dominant inheritance, characterized by an excessive hypermetabolic response during exposure to inhalational anesthetics or to certain muscle relaxants such as succinylcholine. This pathophysiology triggers metabolic acidosis, cardiovascular alterations, muscle rigidity, renal failure, and complete destruction of skeletal striated muscle. Objectives: Given the high lethality of this disease, as well as its occurrence across all ethnic groups and in both sexes, further studies are needed to elucidate the clinical and biomolecular mechanisms involved in the onset of MH. In this light, the following question was raised: "What are the pathophysiological and biomolecular aspects of succinylcholine in Malignant Hyperthermia?" Material and Methods: The research consists of a literature review aimed at clarifying the role of succinylcholine in the mechanisms that trigger malignant hyperthermia. To this end, the SciELO, PubMed, and MedLine databases were used, along with the textbook Biologia Molecular Básica (Zaha, 5th edition). Results: From the studies, it was observed that the molecular mechanisms potentiate calcium efflux from the sarcoplasmic reticulum of skeletal muscle in susceptible individuals after induction by anesthetics or muscle relaxants. As a result, calcium accumulates in the myoplasm, which in turn causes continuous muscle contraction, contributing to ATP depletion, muscle rigidity, and all the signs and symptoms associated with the disease. Since succinylcholine is a depolarizing muscle relaxant, it becomes a factor that stimulates the blockade of nerve impulse transmission at the myoneural plate, promoting the onset of Malignant Hyperthermia.
Conclusion: In summary, this study highlights the role of succinylcholine as a biomolecular trigger that activates pharmacogenetic mechanisms in patients susceptible to MH; its use is therefore contraindicated in such situations.
5

Apolinário, Joelma Maria dos Santos da Silva. "VIROLOGIA DO HERPES ZOSTER ABORDAGEM CLÍNICA E FARMACOTERAPÊUTICA." In II Congresso Brasileiro de Saúde On-line. Revista Multidisciplinar em Saúde, 2021. http://dx.doi.org/10.51161/rems/1435.

Abstract:
Introduction: Human herpesvirus infections lead to a wide range of symptoms. They are characterized mainly by the establishment of a latent infection in nerve cells, with the possibility of reactivation by biological, psychological, or environmental stimuli. These infections are particularly significant in immunosuppressed individuals, whether affected by HIV, certain cancers, chemotherapy treatment, or other compromising conditions. Depending on how the disease progresses, the symptoms can even lead to death. Pregnant women, in turn, tend to develop the infection more easily, since it can be transmitted to the fetus (vertical transmission). Objective: To investigate and substantiate the virology of Herpes zoster in relation to its clinical presentation and pharmacotherapeutic management. Methodology: This study was based on qualitative, descriptive research; as for procedures, it followed a bibliographic review, or integrative literature review, method. Data were collected from the SciELO, BDENF, LILACS, and MEDLINE databases. Articles were selected according to criteria addressing the topic in question, covering the period from 2015 to 2021. Results: Studies show that the pharmacological treatment of Herpes zoster involves two main aspects: the use of antivirals and the relief of pain symptoms. Aciclovir is the antiviral drug of first choice, followed by fanciclovir and valaciclovir; the latter two have dosing schedules with better patient adherence, but their high cost is a drawback that hinders the course of treatment. Conclusion: The regimen generally lasts seven days; the recommended initial doses are: Aciclovir 800 mg, five times a day; Valaciclovir 1000 mg, three times a day; Fanciclovir 500 mg, three times a day. For the management of acute neuritis, paracetamol, dipyrone, or non-steroidal anti-inflammatory drugs may be used.
For moderate to severe pain, opioid analgesics such as codeine or tramadol may be added; ointments and creams intended for this purpose can also be used, though it is worth noting that these products should always be applied with spatulas or cotton swabs to avoid contagion and spread of the virus.
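The dosing figures quoted in the abstract (a seven-day course of Aciclovir 800 mg five times a day, Valaciclovir 1000 mg three times a day, or Fanciclovir 500 mg three times a day) can be tabulated to compare total daily intake across the three drugs. This is an illustrative sketch only; the dictionary layout and helper names are assumptions, not part of the cited study.

```python
# Regimens as reported in the abstract: drug -> (single dose in mg, doses per day).
REGIMENS = {
    "aciclovir":    (800, 5),
    "valaciclovir": (1000, 3),
    "fanciclovir":  (500, 3),
}

COURSE_DAYS = 7  # the abstract states a seven-day course


def daily_dose_mg(drug: str) -> int:
    """Total milligrams taken per day for a given drug."""
    dose, times_per_day = REGIMENS[drug]
    return dose * times_per_day


def course_total_mg(drug: str) -> int:
    """Total milligrams over the full seven-day course."""
    return daily_dose_mg(drug) * COURSE_DAYS


for drug in REGIMENS:
    print(f"{drug}: {daily_dose_mg(drug)} mg/day, {course_total_mg(drug)} mg per course")
```

Aciclovir's lower per-dose strength is offset by its five-times-daily schedule, which is exactly the adherence disadvantage the abstract attributes to it relative to valaciclovir and fanciclovir.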
APA, Harvard, Vancouver, ISO, and other styles
6

Cheng, Minmin, Xianying Liu, Yinggang Jing, and Kui Xu. "DHP: Cloud-Edge Collaborative Internet Framework for Nuclear Power Industry." In 2024 31st International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2024. http://dx.doi.org/10.1115/icone31-134444.

Abstract:
Due to the numerous production stages of nuclear power plants, their complex organization and processes, and the distribution of multiple plants across different regions, the safe and efficient operation of a nuclear power group as a whole faces great challenges. To establish an industrial Internet at the group level, provide means to eliminate data islands, integrate massive data resources, improve data-use efficiency, raise the safety-management level and operating performance of nuclear power plants, and reduce costs, this paper designs DHP, a cloud-edge collaborative architecture for the nuclear power industrial Internet that meets the coordination, interaction, and safety requirements of both the plant side and the center side. It lays the foundation for low-latency, high-concurrency, reliable production command in the safety production management system and the overhaul command center. According to the data characteristics of the nuclear power industry, a time-series data acquisition and configuration tool and a soft gateway were adapted to meet the time-series data access requirements of typical business scenarios in the nuclear industry, with the ability to access high-frequency vibration data and low-frequency time-series data simultaneously. A complete technical framework was independently designed and implemented for the acquisition, transmission, storage, and processing of high-dimensional time-series data in the nuclear industry, supporting real-time access and computation over millions of measurement points. Based on metadata management, an access channel for multi-source heterogeneous data is established to coordinate data extraction, processing, storage, scheduling, quality auditing, data services, and other links. An end-to-end, closed-loop data governance and control mechanism is studied, and with it nuclear power data is governed and its assets are opened.
This realizes whole-life-cycle management of nuclear power data, meets the data-resource demand in the field of equipment-reliability management, and improves the manageability and query efficiency of complex data. Taking the China Nuclear Power Data Center and six nuclear power bases as representatives, the platform construction and engineering practice have formed a data-sharing and collaboration channel between the data center and the edge side. The results show that DHP gathers and processes China's nuclear power production and management data and achieves effective data integration and processing, so that multiple production, operation, maintenance, and other applications can rely on consistent data for timely perception, fast processing, and efficient decision-making. The cloud-edge collaboration framework established in this paper can therefore support digital, intelligent nuclear power, has broad application prospects, and can provide technical references for building a nuclear power data ecosystem.
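The edge-to-center data channel the abstract describes can be illustrated with a minimal batching buffer: samples from measurement points accumulate on the edge side and are flushed to a center-side sink in batches. This is a generic sketch of the pattern, not the paper's DHP implementation; all class, field, and parameter names here are assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

# One sample from a measurement point: (timestamp, point id, value).
Point = Tuple[float, str, float]


@dataclass
class EdgeBuffer:
    """Edge-side buffer that forwards full batches to a center-side sink."""
    batch_size: int
    sink: Callable[[List[Point]], None]  # stand-in for the center ingest channel
    _pending: List[Point] = field(default_factory=list)

    def ingest(self, ts: float, point_id: str, value: float) -> None:
        """Accept one sample; flush to the center when the batch fills."""
        self._pending.append((ts, point_id, value))
        if len(self._pending) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        """Send any pending samples downstream and start a fresh batch."""
        if self._pending:
            self.sink(self._pending)
            self._pending = []


# Usage: collect batches in memory instead of sending them over a network.
batches: List[List[Point]] = []
buf = EdgeBuffer(batch_size=3, sink=batches.append)
for i in range(7):
    buf.ingest(ts=float(i), point_id=f"sensor-{i % 2}", value=i * 0.5)
buf.flush()  # push the final partial batch
```

Batching like this trades a little latency for far fewer center-side writes, which is the usual lever when a platform must absorb millions of measurement points in real time.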
7

Matias, Camila de Melo Cesarino, Sandy de Oliveira Fievet, Larissa Veras Menezes, Louise Dias Lima, Ana Beatriz Rodrigues Barros da Silva, and Aline Rezende de Souza Mendes. "A subnotificação de sífilis congênita durante a pandemia de COVID-19." In 46º Congresso da SGORJ e Trocando Ideias XXV. Zeppelini Editorial e Comunicação, 2022. http://dx.doi.org/10.5327/jbg-0368-1416-2022132s1052.

Abstract:
Introduction: Syphilis is a chronic, systemic infection and an important indicator of the quality of maternal and child care. When not adequately treated, the disease can progress and compromise internal organs such as the heart, liver, and central nervous system. Transmission of syphilis occurs in the primary and secondary stages of the disease and decreases through the latent and tertiary forms. During pregnancy, untreated syphilis can lead to adverse outcomes such as miscarriage, stillbirth, neonatal death, prematurity, or early clinical manifestations of the disease. Notification of congenital syphilis cases is mandatory under a Ministry of Health ordinance, as it provides the epidemiological data on which decisions about the management of subsequent cases are based. Although treatments have been available since 1930, the disease remains a public health problem, especially in developing countries such as Brazil. From 2010 to 2018, a steady increase was recorded in the detection rates of gestational syphilis and in the infection rates of congenital syphilis. During the COVID-19 pandemic, however, a reduction was observed in the number of notified cases of congenital syphilis, especially in more remote regions, where social isolation hindered access to health services. Objective: To analyze the incidence of congenital syphilis in the five Brazilian regions during the pandemic and thereby provide data so that, in the long term, the damage caused by the SARS-CoV-2 outbreak can be minimized. Materials and methods: A cross-sectional study was conducted on the incidence of congenital syphilis during the peak periods of the COVID-19 pandemic.
Searches were carried out on the Scientific Electronic Library Online (SciELO) and United States National Library of Medicine (PubMed) platforms, and the repository of the Department of Informatics of the Unified Health System (DATASUS) was used as the data source. Results: During the pandemic peak, from March to November 2020, notifications of congenital syphilis fell in the North Region from 510 to 433 cases; in the Northeast Region there was a slight increase, to 1,037 cases; in the South Region the reduction was from 2,349 to 2,347 cases; in the Southeast Region, from 652 to 651; and in the Central-West Region, from 456 to 405 new cases. Conclusion: Analysis of the data shows a reduction in the notification of congenital syphilis cases, mainly in the North and Central-West Regions. This suggests that fewer of these cases may have been detected during the COVID-19 pandemic, for various reasons such as reduced demand for health services, with consequences for the health of the maternal and child population. It can therefore be inferred that the underreporting of congenital syphilis cases may call for a strategy to improve public actions for health control and promotion, aiming at the quality of life of future generations.
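The regional counts reported in the abstract can be turned into relative changes with a small back-of-the-envelope calculation, which makes the North and Central-West drops stand out against the near-flat South and Southeast. This is an illustrative helper, not part of the cited study; the Northeast is omitted because the abstract gives only its "during" count.

```python
# Region -> (cases before the peak, cases during the March-November 2020 peak),
# as reported in the abstract. Only regions with both counts are included.
cases = {
    "Norte":        (510, 433),
    "Sul":          (2349, 2347),
    "Sudeste":      (652, 651),
    "Centro-Oeste": (456, 405),
}


def pct_change(before: int, during: int) -> float:
    """Signed percent change relative to the pre-peak count."""
    return 100.0 * (during - before) / before


for region, (before, during) in cases.items():
    print(f"{region}: {pct_change(before, during):+.1f}%")
```

The North falls by roughly 15% and the Central-West by roughly 11%, while the South and Southeast change by a fraction of a percent, matching the abstract's conclusion about where underreporting was concentrated.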