To see the other types of publications on this topic, follow the link: AQRM.

Dissertations / Theses on the topic 'AQRM'



Consult the top 50 dissertations / theses for your research on the topic 'AQRM.'


You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Pasin, Débora Brunheroto [UNESP]. "Avaliação quantitativa de riscos microbiológicos (AQRM) associados à E. coli em águas cinza." Universidade Estadual Paulista (UNESP), 2013. http://hdl.handle.net/11449/98301.

Full text
Abstract:
Greywater reuse is an alternative for expanding the water supply that can contribute to the conservation of water resources in the face of scarcity, in terms of both quality and quantity. The risks associated with routine or accidental exposure to this alternative source must nevertheless be considered so that safe reuse practices can be established, since reuse water may contain pathogens such as viruses, bacteria, protozoa and helminths. This study aimed to quantitatively assess the microbiological risks of the various routes of user exposure to E. coli in untreated greywater, in order to define a range of Maximum Values Allowed (MVA) based on acceptable risks of 10⁻³ and 10⁻⁶ pppy (per person per year) for the various reuse purposes. To this end, exposure, dose-response and probability of infection were evaluated for the different reuse purposes. The beta-Poisson model was used to assess the probability of infection. The infective dose (N50), the concentration of microorganisms, the route of exposure, the ingested volumes (accidental and routine), the agent-host interaction parameters (α and β) and the frequency of exposure were obtained from a systematic compilation of literature data. Through Quantitative Microbial Risk Assessment (QMRA), the greatest risk of infection was found for the reuse of mixed greywater for bathing and for the irrigation of food crops with exposure through food ingestion, corresponding to a risk of approximately 9.9 infections per ten exposed individuals, without considering water dilution effects. This scenario resulted in MVAs of 5.25 to 105 MPN/100 mL and 3.95 to 39.5 MPN/100 mL, respectively, for an acceptable risk of 10⁻³ pppy, and of 0.00 to 0.10 MPN/100 mL and 0.00 to 0.04 MPN/100 mL for a risk of 10⁻⁶ pppy. The lowest risk of infection was due... (Complete abstract: click electronic access below.)
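The approximate beta-Poisson dose-response relation named in the abstract can be illustrated with a minimal sketch. The α, N50 and exposure values below are assumptions for illustration only, not the parameters compiled in the thesis.

```python
# Minimal sketch of the approximate beta-Poisson dose-response model used in QMRA:
# P_inf(d) = 1 - (1 + d * (2**(1/alpha) - 1) / N50) ** (-alpha)
# Parameter values and the exposure scenario are illustrative assumptions.
def beta_poisson(dose: float, alpha: float, n50: float) -> float:
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

concentration_per_100ml = 1.0e5   # assumed E. coli concentration in greywater (MPN/100 mL)
ingested_ml = 1.0                 # assumed accidental ingestion volume per event (mL)
dose = concentration_per_100ml / 100.0 * ingested_ml
print(f"single-exposure infection probability: {beta_poisson(dose, alpha=0.18, n50=8.6e7):.2e}")
```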
2

Pasin, Débora Brunheroto. "Avaliação quantitativa de riscos microbiológicos (AQRM) associados à E. coli em águas cinza /." Bauru, 2013. http://hdl.handle.net/11449/98301.

Full text
Abstract:
Advisor: Rodrigo Braga Moruzzi. Examining committee: Marcelo de Julio and Gustavo Henrique Ribeiro da Silva. Master's thesis; the abstract is the same as in entry 1 above (same work).
3

Duqué, Benjamin. "Quantification du niveau de contamination de Campylobacter jejuni dans la filière volaille - Influence de la variabilité des souches et de l'histoire cellulaire." Thesis, Nantes, Ecole nationale vétérinaire, 2020. http://www.theses.fr/2020ONIR137F.

Full text
Abstract:
Campylobacter jejuni is the leading cause of bacterial human enteritis worldwide, and poultry is the main vector of contamination. The main objective of this study was to assess the effect of the poultry slaughter process on the behaviour of C. jejuni under subsequent stress, taking into account strain variability and the influence of cell history. Key steps of the slaughter process that generate heat and cold stress were reproduced in the laboratory. The hypothesis was that this cell history could influence the inactivation of C. jejuni during the subsequent cold-storage step. The Campylobacter contamination level of chicken at the consumer's place was estimated, and the results complied with the performance objective defined by the ICMSF. It was also confirmed that cell history has an impact on the behaviour of C. jejuni. The molecular mechanisms underlying cell history were investigated by RT-qPCR applied to three strains, with the aim of identifying "biomarker" genes whose expression could predict the subsequent behaviour of the pathogen. Using PLS regression, a predictive model was built for only one of the three C. jejuni strains, resulting in the identification of nine biomarker genes. Building a model that generalises to all strains, based on the quantification of biomarker gene expression, will require exploring new methodological approaches. This could pave the way for a new generation of risk assessment models.
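The biomarker-selection step described above rests on PLS regression; a minimal sketch of that kind of analysis is shown below. The array shapes, the synthetic response and the choice of nine genes are illustrative assumptions, not the study's data or model.

```python
# Minimal sketch of predicting a behavioural response from gene-expression data
# with PLS regression (illustrative only; shapes, gene count and the number of
# components are assumptions, not the study's actual values).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 50))     # 30 samples x 50 candidate gene-expression levels
y = X[:, :9] @ rng.normal(size=9) + rng.normal(scale=0.1, size=30)  # synthetic response

pls = PLSRegression(n_components=3)
pls.fit(X, y)

# Genes with the largest absolute regression coefficients are candidate "biomarkers".
top_genes = np.argsort(np.abs(pls.coef_.ravel()))[::-1][:9]
print("candidate biomarker gene indices:", top_genes)
print("cross-validated R^2:", cross_val_score(pls, X, y, cv=5).mean())
```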
4

Miconnet, Nicolas. "Contributions méthodologiques à l'appréciation quantitative de l'exposition aux dangers microbiens alimentaires." Paris 6, 2006. http://www.theses.fr/2006PA066066.

Full text
5

Santos, Jeferson Gaspar dos. "Aplicação da metodologia de análise de perigo e pontos críticos de controle na disposição de efluentes tratados em solos tropicais e seu potencial uso agrícola." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/6/6134/tde-05092016-143024/.

Full text
Abstract:
Introduction - The application of treated sewage in agriculture is currently considered by many nations an integral part of water resources management. It nevertheless poses risks to public health because of the presence of bacteria, protozoan cysts, helminth eggs and viruses, which can survive for long periods in harsh environments and have a short interval between infection and the development of disease. Objective - To establish preventive measures for the discharge of effluents into aquatic ecosystems, sanitary control for the reduction of pathogens transmitted by contaminated water, and the disposal of treated effluent on tropical soils as tools for the integrated management of water resources in river basins. For this purpose, the Hazard Analysis and Critical Control Points (HACCP) system was applied as the risk management methodology to a set of technologies for producing treated effluent from domestic sewage, in order to assess its potential use on agricultural soils. Methods - The study was conducted at the sewage treatment plant and at an adjoining experimental field for research on effluent use in agriculture in the municipality of Lins. The biological indicators and the sampling points were determined using the HACCP quality control methodology. Samples of irrigated soil and of groundwater monitoring wells were collected and subjected to physico-chemical, microbiological and parasitological analyses. Quantitative Microbial Risk Assessment (QMRA) was applied to the results to determine the annual risk of infection. Results - The final effluent presented average concentrations of 8.13×10⁵ MPN/100 mL of total coliforms and 4.69×10⁵ MPN/100 mL of Escherichia coli; no helminth eggs were observed. The soil showed the highest concentrations of total coliforms and E. coli in the surface layer (1.01×10⁶ and 8.70×10³, respectively), decreasing with depth. Helminth eggs were found between the surface and 15 cm depth at concentrations of 0.07 to 0.87 eggs/g dry weight. Adenovirus was detected at a concentration of 3.56×10⁵ genomic copies/g between 5 and 10 cm depth in 2015. In the monitoring wells, total coliform concentrations ranged from <1.00 to 1.01×10³ MPN/100 mL and E. coli from <1.00 to 1.00×10⁰ MPN/100 mL. No enteric viruses were observed. The estimated annual risk of infection for E. coli was 1.02×10⁻² pppy in the final effluent, 2.68×10⁻³ to 7.59×10⁻³ pppy in the soil, and 1.42×10⁻⁸ to 4.38×10⁻⁹ pppy in the monitoring wells. For the helminth eggs observed in the primary treatment effluent, the calculated annual risk of infection was 4.04×10⁻² pppy, and in the soil it ranged from 9.57×10⁻¹ to 9.76×10⁻¹ pppy. Conclusions - During the study period, samples from the sewage treatment and irrigation systems showed pathogen removal consistent with the design parameters, with restrictions on effluent discharge into water bodies and on its application in agriculture. The soil showed the capacity to retain pathogenic organisms within the first 60 cm of depth, reducing their concentrations to levels below 1.00 MPN/100 mL in the aquifer.
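The annual risks quoted above are typically obtained by aggregating a per-exposure infection probability over the number of independent exposure events in a year. A minimal sketch of that step, with illustrative numbers rather than the study's values:

```python
# Annual infection risk from a per-event probability, assuming n independent
# exposure events per year (illustrative values, not the study's data).
def annual_risk(p_event: float, events_per_year: int) -> float:
    return 1.0 - (1.0 - p_event) ** events_per_year

print(f"annual risk: {annual_risk(p_event=1e-4, events_per_year=104):.2e}")  # e.g. two exposures per week
```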
6

Pinto, Karla Cristiane. "Estimativa de risco de infecção por Giardia sp e Cryptosporidium sp pela ingestão de água durante atividades de recreação de contato primário." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/6/6134/tde-16112016-155737/.

Full text
Abstract:
The use of coastal waters for recreational purposes is associated with benefits to health and well-being, but negative impacts can diminish these benefits. Usage varies with the type of activity; primary-contact recreation involves direct and prolonged contact with the water, during which accidental ingestion can occur. The Brazilian regulation for recreational waters, CONAMA Resolution 274/2000, establishes bathing-water criteria based on microbiological indicators of faecal contamination and further recommends the investigation of pathogenic organisms at beaches that are systematically classified as unsuitable. Given the scarcity of data on the occurrence of pathogens in coastal waters, between 2010 and 2012 CETESB carried out a study of pathogenic microorganisms at beaches of the São Paulo coast, quantifying enterovirus, adenovirus, hepatitis A virus, Cryptosporidium sp. and Giardia sp., in order to fill this gap and generate primary data. The objective of this work was to estimate the probability of infection and of disease from Cryptosporidium sp. and Giardia sp. after exposure to coastal recreational waters, using Quantitative Microbial Risk Assessment (QMRA). (Oo)cyst concentrations in beach waters were taken from CETESB's reports on the quality of coastal beaches in the State of São Paulo for 2011 and 2012. In that period, 203 samples were analysed for (oo)cysts, collected monthly at the one-metre isobath from 12 beaches in the first phase and five beaches in the second phase. Giardia sp. was the most frequent organism, present in 43 per cent of the samples, and Cryptosporidium sp. in 13 per cent. The exposure scenario considered the type of activity, the type of user (children, adults and athletes), (oo)cyst concentration, ingested volume, and duration and frequency of exposure. The probability of infection was higher at beaches with more samples positive for oocysts and cysts, in the athlete group, and for Giardia sp. In some cases the disease risk exceeded the tolerable risk of 3.6 per cent of gastroenteritis cases adopted by the U.S. EPA (2012), as well as the cumulative incidence reported by LAMPARELLI et al. (2015). The results point to the need to improve wastewater treatment systems on the São Paulo coast. QMRA is a tool capable of estimating the probability of infection in the recreational-water setting and can support risk management.
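For protozoa such as Giardia and Cryptosporidium, QMRA commonly uses an exponential dose-response model, P_inf = 1 − exp(−r·d). The sketch below illustrates a single-exposure calculation; the r values, concentrations and ingested volume are assumptions for illustration, not the parameters used in this thesis.

```python
# Minimal sketch of an exponential dose-response QMRA calculation for protozoa.
# The r values and exposure inputs are illustrative assumptions.
import math

def infection_probability(concentration_per_litre: float, ingested_ml: float, r: float) -> float:
    dose = concentration_per_litre * (ingested_ml / 1000.0)   # (oo)cysts ingested in one event
    return 1.0 - math.exp(-r * dose)

p_giardia = infection_probability(concentration_per_litre=0.5, ingested_ml=30.0, r=0.02)
p_crypto = infection_probability(concentration_per_litre=0.1, ingested_ml=30.0, r=0.004)
print(f"per-event infection probability: Giardia={p_giardia:.2e}, Cryptosporidium={p_crypto:.2e}")
```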
7

Haghighizadeh, Navin. "TCP/AQM Congestion Control Based on the H2/H∞ Theory." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/35222.

Full text
Abstract:
This thesis uses a modern control approach to address Internet traffic control issues in the Transport Layer. Building on the literature, we use the H2/H∞ formulation to obtain the good transient performance of an H2 controller and the robustness of an H∞ controller while avoiding their deficiencies. The H2/H∞ controller is designed by formulating an optimization problem using the H2-norm and the H∞-norm of the system, which can be solved by an LMI approach using MATLAB. Our design starts with the modeling of a router and the control system by augmenting the network plant function with the Sensitivity function S, the Complementary Sensitivity function T and the Input Sensitivity function U. These sensitivity functions, along with their weight functions, are used to shape the closed-loop dynamics of the traffic control. By choosing different combinations of the sensitivity functions, we obtain the SU, ST and STU controllers. Both window-based and rate-based versions of these H2/H∞ controllers have been designed and investigated, and we prove that they are stable using Lyapunov's First Method. Next, we verify the performance of the controllers by OPNET simulation using the performance measures of queue length, throughput, queueing delay, packet loss rate and goodput. Our performance evaluation via simulation demonstrates robustness and better transient response, such as the rise/fall time and the peak queue value. We also investigate the controllers' performance subject to network dynamics and in comparison with other controllers. Finally, we improve these controllers for real-time application: they are capable of updating the controller in a short time whenever new network parameter values are detected, so that optimum performance can be maintained.
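One common way to pose such a design is the weighted mixed-sensitivity problem sketched below; this is a generic formulation using the S, T and U functions and weights named in the abstract, not necessarily the exact LMI program solved in the thesis.

```latex
% Generic mixed H2/H-infinity (mixed-sensitivity) formulation, with plant P,
% controller K, S = (I + PK)^{-1}, T = PK(I + PK)^{-1}, U = K(I + PK)^{-1}:
\min_{K\ \text{stabilizing}}\;
\left\lVert \begin{bmatrix} W_S S \\ W_U U \\ W_T T \end{bmatrix} \right\rVert_2
\quad \text{subject to} \quad
\left\lVert \begin{bmatrix} W_S S \\ W_U U \\ W_T T \end{bmatrix} \right\rVert_\infty < \gamma .
```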
8

Ryan, James D. "An alliance built upon necessity: AQIM, Boko Haram, and the African "arch of instability"." Monterey, California: Naval Postgraduate School, 2013. http://hdl.handle.net/10945/34732.

Full text
Abstract:
Approved for public release; distribution is unlimited. This paper examines the numerous linkages between two influential terrorist organizations operating in Sub-Saharan Africa, Boko Haram and Al-Qaeda in the Islamic Maghreb (AQIM), and the political and security ramifications their enhanced partnership would have for United States foreign policy toward Sub-Saharan Africa. I argue that containment of these groups and their current operations through overwhelming military supremacy does not offer a sustainable way forward, not only for the United States but, more importantly, for the international community. The ongoing instability in the Sahel could have enormous second- and third-order negative effects on the entire region. The threat both groups represent with their freedom of movement should not be underestimated. Both receive some form of active and passive support from their respective indigenous populations, and as they evolve they are becoming more sophisticated in their training, funding, and methods of employment. Regional Islamic safe havens could be created through the union of Boko Haram and AQIM as their shared ideology, financing, and tactics move forward. Therefore, a strategy of moderate containment through enhanced engagement, leveraging all lines of operation and coupling soft and hard power, will increase the likelihood of long-term stability.
9

Hajji, Khalifa. "The origins and strategic objectives of the Al Qaeda organization in the Islamic Maghreb (AQIM)." Thesis, Monterey, Calif. : Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Dec/09Dec%5FHajji.pdf.

Full text
Abstract:
Thesis (M.S. in Defense Analysis)--Naval Postgraduate School, December 2009. Thesis Advisor(s): Hafez, Mohammed M. Second Reader: Lee, Doowan. "December 2009." Author's subject terms: Algerian radical group evolution from the FIS to the GIA, the GSPC and then to AQIM; AQIM origin and strategy; links between AQIM and North African radical groups; GICM: the Moroccan Islamic Combat Group; LIFG: the Libyan Islamic Fighting Group; MTI: the Islamic Tendency Movement; terrorism in North Africa. Description based on title screen as viewed on Jan. 26, 2010. Includes bibliographical references (p. 81-84). Also available in print.
10

Franked, Lennart, and David Håsäther. "Implementation and evaluation of a queuing discipline in Linux." Thesis, Mid Sweden University, Department of Information Technology and Media, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-11401.

Full text
Abstract:
Streaming video and VoIP are two popular services used over the Internet, and as the number of users increases, so does the demand on the network routers. Since both streaming video and VoIP have variable traffic flows, the routers must always have some free space in their receive buffers to handle traffic bursts; if not, packet loss may occur and degrade the quality of the services. In this project, a fuzzy-logic-based Active Queue Management (AQM) algorithm is implemented, which might help reduce this problem. Until now, this algorithm had only been tested in a simulated environment. The algorithm is evaluated and compared with some of the existing AQMs, and the results are also compared with a stream that uses only a First-In, First-Out (FIFO) queue, which serves as a baseline. Since an AQM is not the only means of reducing delay and jitter, the different AQMs are also combined with two transport protocols, the User Datagram Protocol (UDP) and the newer Datagram Congestion Control Protocol (DCCP). The resulting implementation performed on a par with RED (Random Early Detection), one of the most common AQMs.
11

Al-Hammouri, Ahmad Tawfiq. "INTERNET CONGESTION CONTROL: COMPLETE STABILITY REGION FOR PI AQM AND BANDWIDTH ALLOCATION IN NETWORKED CONTROL." online version, 2008. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=case1189088621.

Full text
12

Peterson, Todd Alan. "AQM Shell Development - Creating a Framework for Airspace and Airfield Operations and Air Quality Visualization Software." Thesis, Virginia Tech, 1997. http://hdl.handle.net/10919/37026.

Full text
Abstract:
It is believed that the analysis of air traffic impacts on air quality will benefit from attention to the three-dimensional nature of the air traffic network as well as to the actions of individual aircraft during the study period. With existing air traffic simulation models, the actions of individual aircraft may already be defined in a simulated environment. SIMMOD, the Federal Aviation Administration's airport and airspace modeling software, produces such models of scheduled air traffic, and their results may be used to determine the impacts of scheduled air traffic on air quality as well as on other parameters. This report addresses the interpretation of output from SIMMOD models for use in air quality analysis and visualization of the air traffic network, and the application of these techniques in a stand-alone computer program. This program, named AQM for its purpose of assisting the development of Air Quality Models, provides a working framework for future development of software for detailed air quality analysis and visualization. Master of Science.
13

Toresson, Ludwig. "Making a Packet-value Based AQM on a Programmable Switch for Resource-sharing and Low Latency." Thesis, Karlstads universitet, Institutionen för matematik och datavetenskap (from 2013), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-82568.

Full text
Abstract:
There is a rapidly growing number of advanced applications running over the Internet that require ultra-low latency and high throughput. Bufferbloat is one of the best-known problems: it adds delay in the form of packets being enqueued into large buffers before being transmitted. This has been addressed with the development of various Active Queue Management (AQM) schemes that control how large the queue buffers are allowed to grow. Another aspect that is important today is how the available bandwidth can be shared between applications with different priorities. The Per-Packet Value (PPV) concept has been proposed as a solution for resource sharing, marking packets according to predefined marking policies. The packet value is taken into consideration when making drop/mark decisions, so that higher packet values are prioritized at bottleneck links. This thesis presents the design of a packet-value-based AQM on a programmable Barefoot Tofino switch. It uses a combination of the Proportional Integral controller Enhanced (PIE) AQM scheme and the PPV concept to make drop decisions when queuing delay is detected. Packet-value statistics are collected through the P4-programmable data plane to maintain knowledge of the distribution of packet values. With the drop probability calculated by the PIE AQM scheme, a decision can be made about which packets should be dropped. An evaluation shows that, with the implemented PV AQM, a low queuing delay can be achieved by dropping an appropriate number of packets. It also shows that the PV AQM controls resource sharing between different traffic flows according to a predefined marking policy.
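PIE's core idea is a periodic, PI-style update of the drop probability from the measured queuing delay. A minimal sketch of that control law follows; the constants are illustrative defaults in the spirit of RFC 8033, not necessarily the values used in this thesis or on the Tofino.

```python
# Minimal sketch of the PIE drop-probability update: the probability is nudged
# by the deviation of the current queuing delay from the target and by its
# recent trend. Parameter values are illustrative assumptions.
ALPHA = 0.125          # weight on deviation from the target delay
BETA = 1.25            # weight on the delay trend
TARGET_DELAY = 0.015   # reference queuing delay (15 ms)

def update_drop_probability(p: float, qdelay: float, qdelay_old: float) -> float:
    p += ALPHA * (qdelay - TARGET_DELAY) + BETA * (qdelay - qdelay_old)
    return min(max(p, 0.0), 1.0)

# Called periodically (e.g. every 15 ms) with measured queuing delays.
p = 0.0
for qdelay, qdelay_old in [(0.030, 0.020), (0.040, 0.030), (0.025, 0.040)]:
    p = update_drop_probability(p, qdelay, qdelay_old)
    print(f"drop probability: {p:.4f}")
```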
14

Wickramarathna, Thamali Dilusha N. "Modeling and Performance Evaluation of a Delay and Marking Based Congestion Controller." Scholarly Repository, 2008. http://scholarlyrepository.miami.edu/oa_theses/101.

Full text
Abstract:
Achieving high performance in high-capacity data transfers over the Internet has long been a daunting challenge. The current standard of the Transmission Control Protocol (TCP), TCP Reno, does not scale efficiently to higher bandwidths. Various congestion controllers have been proposed to alleviate this problem. Most of these controllers primarily use marking/loss or delay (or both) as distinct feedback signals from the network, and employ separate data-transfer control strategies that react to either marking/loss or delay. While these controllers have achieved better performance than the existing TCP standard, they suffer from various shortcomings. Thus, in our previous work, we designed a congestion control scheme that jointly exploits both delay and marking: D+M (Delay Marking) TCP. We demonstrated with ns-2 simulations that D+M TCP can adapt to highly dynamic network conditions and infrastructure. Yet an analytical explanation of D+M TCP was needed to explain why it works as observed, and extensive simulations were needed to assess its performance, especially in relation to other high-speed protocols. Therefore, we propose a model for D+M TCP based on distributed resource optimization theory. Based on this model, we argue that D+M TCP solves the network resource allocation problem in an optimal manner. Moreover, we analyze the fairness properties of D+M TCP and its coexistence with different queue management algorithms. The resource-optimization interpretation of D+M TCP allows us to derive equilibrium values of the controller's steady state, and we use ns-2 simulations to verify that the protocol indeed attains the analytical equilibria. Furthermore, the dynamics of D+M TCP are explained in a mathematical framework, and we show that D+M TCP achieves the analytical predictions. Modeling the dynamics gives insight into the stability and convergence properties of D+M TCP, as outlined in the thesis. Moreover, we demonstrate that D+M TCP is able to achieve excellent performance in a variety of network conditions and infrastructures: it achieved performance superior to most of the existing high-speed TCP versions in terms of link utilization, RTT fairness, goodput, and oscillatory behavior, as confirmed by comparative ns-2 simulations.
15

Lee, Choong-Soo. "A Credit-based Home Access Point (CHAP) to Improve Application Quality on IEEE 802.11 Networks." Digital WPI, 2010. https://digitalcommons.wpi.edu/etd-dissertations/315.

Full text
Abstract:
"Increasing availability of high-speed Internet and wireless access points has allowed home users to connect not only their computers but various other devices to the Internet. Every device running different applications requires unique Quality of Service (QoS). It has been shown that delay- sensitive applications, such as VoIP, remote login and online game sessions, suffer increased latency in the presence of throughput-sensitive applications such as FTP and P2P. Currently, there is no mechanism at the wireless AP to mitigate these effects except explicitly classifying the traffic based on port numbers or host IP addresses. We propose CHAP, a credit-based queue management technique, to eliminate the explicit configuration process and dynamically adjust the priority of all the flows from different devices to match their QoS requirements and wireless conditions to improve application quality in home networks. An analytical model is used to analyze the interaction between flows and credits and resulting queueing delays for packets. CHAP is evaluated using Network Simulator (NS2) under a wide range of conditions against First-In-First- Out (FIFO) and Strict Priority Queue (SPQ) scheduling algorithms. CHAP improves the quality of an online game, a VoIP session, a video streaming session, and a Web browsing activity by 20%, 3%, 93%, and 51%, respectively, compared to FIFO in the presence of an FTP download. CHAP provides these improvements similar to SPQ without an explicit classification of flows and a pre- configured scheduling policy. A Linux implementation of CHAP is used to evaluate its performance in a real residential network against FIFO. CHAP reduces the web response time by up to 85% compared to FIFO in the presence of a bulk file download. Our contributions include an analytic model for the credit-based queue management, simulation, and implementation of CHAP, which provides QoS with minimal configuration at the AP."
16

Sup, Luciano Mauro Arley. "Novas metodologias AQM e TCP visando a eficiência de fluxos de controle UDP e dados TCP/IP compartilhados." Repositório Institucional da UnB, 2017. http://repositorio.unb.br/handle/10482/31307.

Full text
Abstract:
Doctoral thesis, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2017. The Internet has become a network capable of interconnecting many types of users, objects, houses, buildings, industrial plants, automobiles, hospitals and schools, a paradigm known as the Internet of Things (IoT). It has thus become a medium that carries both TCP/IP data flows and the UDP flows of Networked Control Systems (NCS) used in applications such as industrial automation, building automation, remote surgery, smart grids and smart cities. To study both types of flows together, this work first developed a methodology for modelling and simulating communication network topologies, together with their protocols and algorithms, in UPPAAL, a software tool that models physical and logical systems as timed automata and supports simulation as well as deterministic and statistical model checking. Control systems operating over the Internet were then modelled alongside other generic TCP/IP flows, and the effect of the communication network's characteristics on control performance was studied in different simulation scenarios and with different Active Queue Management (AQM) techniques implemented in the routers. In this context, to correct these effects and to improve the efficiency of both types of flows (higher throughput for TCP/IP flows and lower ITAE, Integral Time-weighted Absolute Error, for the NCS), a new methodology combining AQM actions with TCP/IP actions was developed, called ENCN (Explicit Non-Congestion Notification). Its performance was compared with the AQM schemes Drop Tail, RED, CoDel and PIE working with TCP-Reno, and with the TCP-Jersey, DCTCP and E-DCTCP protocols (all modelled and simulated in UPPAAL). Simulations and statistical verifications showed that, compared with these methodologies, ENCN improved both the throughput and fairness of the TCP/IP flows and the ITAE of the NCS (UDP flow). However, ENCN uses technological resources (bits to signal non-congestion) that are not always available in practice. To overcome this limitation, a methodology called ANCE (Acknowledge-based Non-Congestion Estimation) was developed; it works analogously to ENCN but uses an estimate of the queue length instead of an explicit non-congestion notification from the routers, which makes it more feasible to implement. Finally, to overcome some limitations of ANCE, a transport protocol called TCP-Puerto-Londero was developed which, unlike ANCE, adaptively adjusts the transmission window as a function of a relative forward-path delay rather than the RTT. Simulation results showed that, despite not relying on explicit non-congestion notifications, TCP-Puerto-Londero performs comparably to ENCN, outperforming TCP-Jersey by 12.03% and E-DCTCP by 4.21% in throughput for the TCP/IP flows, and reducing the ITAE of the NCS by 36.84%, 36.68% and 4.16% relative to TCP-Jersey, E-DCTCP and ENCN, respectively. This good performance of TCP-Puerto-Londero was also observed in the statistical model checking.
17

Phirke, Vishal Vasudeo. "Traffic Sensitive Active Queue Management for Improved Quality of Service." Digital WPI, 2002. https://digitalcommons.wpi.edu/etd-theses/780.

Full text
Abstract:
The Internet, which traditionally carried FTP, e-mail and Web traffic, increasingly supports emerging applications such as IP telephony, video conferencing and online games. These new classes of applications have different throughput and delay requirements than traditional applications; interactive multimedia applications, for example, have more stringent delay constraints and less stringent loss constraints. Unfortunately, the current Internet offers a monolithic best-effort service to all applications without considering their specific requirements. Adaptive RED (ARED) is an Active Queue Management (AQM) technique that optimizes the router for throughput. Throughput optimization provides acceptable QoS for traditional throughput-sensitive applications, but is unfair to these new delay-sensitive applications. While previous work has used different classes of QoS at the router to accommodate applications with varying requirements, thus far all approaches have provided just two or three classes of service for applications to choose from. We propose two AQM mechanisms to optimize the router for better overall QoS. Our first mechanism, RED-Worcester, is a simple extension to ARED that tunes ARED for better average QoS support. Our second mechanism, RED-Boston, further extends RED-Worcester to improve the QoS for all flows. Unlike earlier approaches, we do not predefine classes of service, but instead provide a continuum from which applications can choose. We evaluate our approach using NS-2 and present results showing the improvement in QoS achieved by our mechanisms over ARED.
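As background for the ARED extensions described above, the commonly cited ARED adaptation rule periodically adjusts RED's maximum drop probability so that the average queue stays within a target band. The sketch below uses the usual published constants as assumptions; they are not necessarily the settings used in this thesis.

```python
# Minimal sketch of the Adaptive RED (ARED) idea: periodically adapt max_p so
# the EWMA average queue stays near a target range between min_th and max_th.
# Constants are illustrative, not the thesis's configuration.
def adapt_max_p(avg_queue: float, max_p: float, min_th: float, max_th: float) -> float:
    target_low = min_th + 0.4 * (max_th - min_th)
    target_high = min_th + 0.6 * (max_th - min_th)
    if avg_queue > target_high and max_p <= 0.5:
        max_p += min(0.01, max_p / 4.0)   # additive increase
    elif avg_queue < target_low and max_p >= 0.01:
        max_p *= 0.9                      # multiplicative decrease
    return max_p

# Called once per adaptation interval (e.g. every 0.5 s) with the average queue estimate.
print(adapt_max_p(avg_queue=80.0, max_p=0.02, min_th=20.0, max_th=100.0))
```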
18

Dahlberg, Love. "A Data Plane native PPV PIE Active Queue Management Scheme using P4 on a Programmable Switching ASIC." Thesis, Karlstads universitet, Institutionen för matematik och datavetenskap (from 2013), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-84552.

Full text
Abstract:
New Internet services require low and stable latency, which is difficult to provide with traditional routers and queuing mechanisms. Current routers aim to provide high throughput using large buffers, causing considerable network latency under load. Recently, Active Queue Management (AQM) algorithms have been proposed to reduce this problem by actively controlling queue lengths to maintain target latencies. However, AQMs are difficult to implement in switching Application-Specific Integrated Circuits (ASICs) due to inherent architectural constraints. Resource sharing is another important goal, aiming to differentiate traffic and allocate more resources to different traffic types. The objective of this thesis is to implement the AQM algorithm Proportional Integral controller Enhanced (PIE) with the packet-marking-based resource-sharing concept Per-Packet Value (PPV) on a programmable switching ASIC, using the novel network programmability concept P4. Our solution is designed to maintain low and controllable latency and to utilize the bottleneck link efficiently, while observing the bandwidth-sharing properties of the marking scheme. Our goal is to show that data-plane-native implementations of PPV PIE on the Tofino are possible without severely limiting performance or accuracy. The solution places the computation of PIE's drop-probability estimate on a timer in the data plane, using a state machine, packet mirroring, packet recirculation and approximate arithmetic implemented with lookup tables. Additionally, a small control loop is required in order to update the lookup tables based on packet statistics from the control plane. In our evaluation on a Tofino-based testbed, we assess the impact of different parameters on control-plane latency and on data-plane throughput and delay, for both static and dynamic traffic scenarios. Our results demonstrate commendable performance in terms of controlling queuing delay, effective throughput and bandwidth share when operator policy is taken into account.
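The PPV side of such a scheme maps the drop probability produced by the AQM onto a packet-value threshold, so that the lowest-valued packets are dropped first. A minimal sketch of that mapping is shown below; the packet-value histogram and the probability are illustrative assumptions, not data from the thesis.

```python
# Minimal sketch of a Per-Packet Value (PPV) drop decision: given a target drop
# probability (e.g. from PIE), pick a congestion threshold value from recently
# observed packet values and drop packets marked below it (illustrative only).
def congestion_threshold(packet_values: list[int], drop_probability: float) -> int:
    """Return the value below which roughly drop_probability of packets fall."""
    ordered = sorted(packet_values)
    index = int(drop_probability * len(ordered))
    return ordered[min(index, len(ordered) - 1)]

recent_values = [10, 10, 20, 40, 40, 40, 80, 80, 160, 160]   # assumed observed packet values
ctv = congestion_threshold(recent_values, drop_probability=0.3)

def should_drop(packet_value: int) -> bool:
    return packet_value < ctv

print("threshold:", ctv, "drops:", [should_drop(v) for v in recent_values])
```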
19

Bhandarkar, Sumitha. "Congestion control algorithms of TCP in emerging networks." College Station, Tex.: Texas A&M University, 2006. http://hdl.handle.net/1969.1/ETD-TAMU-1757.

Full text
20

Corpeno, Rebeca. "Development of method for myosin- and actin-measurements in musclefibers." Thesis, Uppsala University, Department of Medical Biochemistry and Microbiology, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-9172.

Full text
Abstract:
The purpose of this study was to gain more knowledge about the deleterious effects of decreased muscle protein concentration on skeletal muscle function, by measuring the concentrations of myosin and actin in single pig muscle fibres. The pigs had earlier been used in an experimental animal model to study the early stages of acute quadriplegic myopathy (AQM), a disease found in mechanically ventilated intensive care unit patients. Percutaneous biopsies taken from these pigs were used in this study. Even though the method used had been carefully tested and was theoretically sound, certain problems arose. These problems were unexpected and hindered the study. The ELISA method used to measure the concentrations of myosin and actin gave no interpretable results. The reason could not be found, and because of the time limit of this project no results from the AQM pigs were obtained. The efforts to make the method work are described and discussed.
21

Reynier, Julien. "Modélisation mathématique et simulation de TCP par des méthodes de champ moyen." Phd thesis, Ecole Polytechnique X, 2006. http://pastel.archives-ouvertes.fr/pastel-00002032.

Full text
Abstract:
With the advent of the Internet, the world has entered a new era of almost unlimited information sharing. It is not only a new communication medium that has appeared, but also a change in the way people live. One aspect of this revolution is the transport performed by the network, often using the TCP protocol, and even today usage is sometimes limited by the available bandwidth. The next generation of the Internet, perhaps arising from research on Internet 2, will no doubt offer even more striking possibilities. This thesis studies bandwidth sharing by TCP. It begins with two introductory chapters on TCP (Chapter 1) and its mathematical models (Chapter 2), which lead to three parts. The first part is highly mathematical: starting from a very simple example in Chapter 3 on draws from an urn with and without replacement, it then shows how to generalise the mean-field "lookdown process" method to trajectories with complex dynamics across three chapters. Chapter 4 poses the problem of users interacting through a shared resource by means of Poissonian jumps; Chapter 5 develops a method inspired by the lookdown process to solve this first problem; finally, Chapter 6 proposes a generalisation in which the interaction takes place through intensity-driven jumps but also through synchronous jumps. The second part presents joint work with David McDonald and François Baccelli on modelling TCP in the case of long-lived downloads, together with a generalisation that accounts for non-persistent users. Chapter 8 presents the model: a large number of users share a common resource, the bandwidth, using the TCP protocol under a number of simplifying assumptions (fluid model, number of users constant in time). Chapter 9 continues with the proof of mean-field convergence: when the number of users sharing a scarce resource becomes large, the queue/throughput equations at the router become deterministic partial differential equations. The mathematical study of these transport equations is the subject of Chapter 10, where they are derived, simplified and analysed; in particular, it includes a study of the stability of a queue under certain centralised controls such as RED. This part ends with Chapter 11, which proposes a generalisation of the model to account for users of the HTTP protocol over TCP. The third part is devoted to simulations. Chapter 12 covers aspects of the design of the simulator for the partial differential equations of persistent TCP users and their generalisation to HTTP; the simulator is then validated experimentally, and the main improvements that should be made to the simulator and the model are discussed. Finally, Chapter 13 gives two examples of the simulator's use: the first illustrates the congestion problem for HTTP users, stressing that the problem is more serious than one might expect; the second presents a mathematical turbulence effect for HTTP/TCP, that is, under certain conditions two limiting states are found, one in which the throughput is constant and an oscillatory state in which the mean throughput is lower.
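For reference, the classical deterministic fluid limit of TCP with RED-style marking (the Misra-Gong-Towsley model) is often used as a comparison point for such mean-field results; it is quoted below as an illustrative example, not as the exact transport equations derived in the thesis.

```latex
% Classical fluid model of TCP/RED dynamics (Misra-Gong-Towsley), quoted as an
% illustrative reference; W = congestion window, q = queue length,
% R(t) = q(t)/C + T_p the round-trip time, p = marking probability,
% N = number of flows, C = link capacity.
\frac{dW(t)}{dt} = \frac{1}{R(t)} - \frac{W(t)\,W(t-R(t))}{2\,R(t-R(t))}\;p\bigl(t-R(t)\bigr),
\qquad
\frac{dq(t)}{dt} = \frac{N(t)\,W(t)}{R(t)} - C .
```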
APA, Harvard, Vancouver, ISO, and other styles
22

Galarraga, Castillo Omar Antonio. "Simulation of surgery effect on cerebral palsy gait by supervised machine learning." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLE006/document.

Full text
Abstract:
Cerebral palsy frequently leads to gait troubles. After a physical examination and a Clinical Gait Analysis (CGA), these walking troubles are usually treated by orthopedic surgery, called single-event multi-level surgery (SEMLS), in which several surgical corrections are done simultaneously at different levels of the lower limbs. The kinematic improvements obtained by this treatment are sometimes very effective, but at present they remain difficult to predict. The objective of this thesis is to simulate the effect of surgery on gait parameters, in particular kinematic signals, using supervised statistical machine learning. The purpose of the simulator is to show the most likely gait outcome in order to improve decision-making in SEMLS. The database was composed of 134 children with cerebral palsy who had undergone surgery and had at least one CGA before and after the treatment. Gait signals were preprocessed and missing physical-examination data were imputed. Features of the preprocessed data were extracted using different techniques such as curve fitting, variable selection and dimensionality reduction by principal component analysis. Then regressions were performed using different methods such as multiple linear regression, feedforward neural networks and ensemble learning. The tested methods and their performances were compared with each other and with other methods in the literature. This work represents the first time that the effect of surgery on cerebral palsy gait has been quantitatively simulated for a large number of surgical combinations and numerous different gait patterns.
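As a purely illustrative companion to the approach described above, the following Python sketch shows the general shape of such a pipeline with scikit-learn: dimensionality reduction of preoperative gait curves followed by a multiple linear regression and a feedforward neural network, compared by cross-validation. The data here are synthetic random arrays and every shape and hyper-parameter is an assumption; this is not the thesis code or its dataset.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patients, n_samples = 134, 101                   # e.g. 101 points per normalised gait cycle
X = rng.normal(size=(n_patients, n_samples + 5))   # pre-op curve + a few clinical features
Y = rng.normal(size=(n_patients, n_samples))       # post-op curve to predict

for name, model in [
    ("PCA + multiple linear regression",
     make_pipeline(PCA(n_components=10), LinearRegression())),
    ("PCA + feedforward neural network",
     make_pipeline(PCA(n_components=10),
                   MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0))),
]:
    # with random synthetic data the scores are meaningless; only the pipeline shape matters
    score = cross_val_score(model, X, Y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {score:.2f}")
```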
APA, Harvard, Vancouver, ISO, and other styles
23

Chen, Wu. "Parameter self-tuning in internet congestion control." Thesis, Loughborough University, 2010. https://dspace.lboro.ac.uk/2134/6361.

Full text
Abstract:
Active Queue Management (AQM) aims to achieve high link utilization, low queuing delay and low loss rate in routers. However, it is difficult to adapt AQM parameters to constantly provide desirable transient and steady-state performance under highly dynamic network scenarios. A trade-off needs to be made between queuing delay and utilization. The queue size would become unstable when the round-trip time or link capacity increases, or would be unnecessarily large when the round-trip time or link capacity decreases. Finding effective ways of adapting AQM parameters to obtain good performance has remained a critical unsolved problem for the last fifteen years. This thesis firstly investigates existing AQM algorithms and their performance. Based on a previously developed dynamic model of TCP behaviour and a linear feedback model of TCP/RED, Auto-Parameterization RED (AP-RED) is proposed, which unveils the mechanism of adapting RED parameters according to measurable network conditions. Another algorithm, Statistical Tuning RED (ST-RED), is developed for systematically tuning four key RED parameters to control the local stability in response to the detected change in the variance of the queue size. Under variable network scenarios, such as changing round-trip time, link capacity and traffic load, no manual parameter configuration is needed. The proposed ST-RED can adjust the corresponding parameters rapidly to maintain stable performance and keep queuing delay as low as possible. Thus the sensitivity of RED's performance to different network scenarios is removed. This Statistical Tuning algorithm can also be applied to a PI controller for AQM, and a Statistical Tuning PI (ST-PI) controller is developed as well. The implementation of ST-RED and ST-PI is relatively straightforward. Simulation results demonstrate the feasibility of ST-RED and ST-PI and their capabilities to provide desirable transient and steady-state performance under extensively varying network conditions.
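The idea of re-tuning RED from measured queue statistics can be sketched as follows. The Python fragment below implements a plain RED drop probability plus a deliberately crude adaptation of max_p driven by the variance of recent queue lengths; the actual ST-RED tuning rules are derived in the thesis and are not reproduced here, so the numbers and the retune() heuristic are assumptions for illustration only.

```python
import random
import statistics

class AdaptiveRed:
    """Simplified RED queue whose max_p is re-tuned from the measured variance
    of the queue length.  The tuning rule here only illustrates the idea in
    the abstract; it is not the ST-RED rule from the thesis."""
    def __init__(self, min_th=20, max_th=60, max_p=0.02, wq=0.002):
        self.min_th, self.max_th, self.max_p, self.wq = min_th, max_th, max_p, wq
        self.avg = 0.0
        self.history = []

    def drop_probability(self, queue_len):
        # exponentially weighted moving average of the instantaneous queue length
        self.avg = (1 - self.wq) * self.avg + self.wq * queue_len
        self.history.append(queue_len)
        if self.avg < self.min_th:
            return 0.0
        if self.avg >= self.max_th:
            return 1.0
        return self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)

    def retune(self):
        # illustrative adaptation: raise max_p when the queue is very variable,
        # lower it when the queue is steady (bounded to a sane range)
        if len(self.history) < 50:
            return
        var = statistics.pvariance(self.history[-200:])
        self.max_p = min(0.5, max(0.005, self.max_p * (1.2 if var > 100 else 0.8)))

red = AdaptiveRed()
q = 0
for t in range(1000):
    q = max(0, q + random.randint(-2, 3))           # toy arrivals/departures
    if random.random() < red.drop_probability(q):
        q -= 1                                      # drop one packet
    if t % 100 == 0:
        red.retune()
print("final max_p:", round(red.max_p, 4))
```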
APA, Harvard, Vancouver, ISO, and other styles
24

Høiland-Jørgensen, Toke. "On the Bleeding Edge : Debloating Internet Access Networks." Licentiate thesis, Karlstads universitet, Institutionen för matematik och datavetenskap (from 2013), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-47001.

Full text
Abstract:
As ever more devices are connected to the internet, and applications turn ever more interactive, it becomes more important that the network can be counted on to respond reliably and without unnecessary delay. However, this is far from always the case today, as there can be many potential sources of unnecessary delay. In this thesis we focus on one of them: Excess queueing delay in network routers along the path, also known as bufferbloat. We focus on the home network, and treat the issue in three stages. We examine latency variation and queueing delay on the public internet and show that significant excess delay is often present. Then, we evaluate several modern AQM algorithms and packet schedulers in a residential setting, and show that modern AQMs can almost entirely eliminate bufferbloat and extra queueing latency for wired connections, but that they are not as effective for WiFi links. Finally, we go on to design and implement a solution for bufferbloat at the WiFi link, and also design a workable scheduler-based solution for realising airtime fairness in WiFi. Also included in this thesis is a description of Flent, a measurement tool used to perform most of the experiments in the other papers, and also used widely in the bufferbloat community.
HITS, 4707
APA, Harvard, Vancouver, ISO, and other styles
25

Daneryd, Oscar. "Congestion Management at the Network Edge." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-26082.

Full text
Abstract:
In the Internet of today there is a demand for both high bandwidth and low delays. Bandwidth-heavy applications such as large downloads or video streaming compete with more delay-sensitive applications: web browsing, VoIP and video games. These applications represent a growing share of Internet traffic. Buffers are an essential part of network equipment. They prevent packet loss and help maintain high throughput. As bandwidths have increased, so have buffer sizes, in some cases far too much. This, and the fact that Active Queue Management (AQM) is seldom implemented, has given rise to a phenomenon called Bufferbloat. Bufferbloat is manifested at the bottleneck of the network path by large flows creating standing queues that choke out smaller, and usually delay-sensitive, flows. Since the bottleneck is often located at the consumer edge, this is where the focus of this thesis lies. This work evaluates three different AQM solutions that lower delays without requiring complicated configuration: CoDel, FQ_CoDel and PIE. FQ_CoDel had the best performance in the tests, with consistently the lowest delays and high throughput. This thesis recommends that AQM be implemented at the network edge, preferably FQ_CoDel.
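For readers unfamiliar with the algorithms compared above, the following heavily compressed Python sketch conveys the principle of the published CoDel control law: drop when the packet sojourn time has exceeded a small target for a whole interval, then space subsequent drops as interval/sqrt(count). It omits many details of the real algorithm and is not taken from this thesis.

```python
from math import sqrt

class CoDelSketch:
    """Very compressed sketch of the CoDel control law from the literature.
    Only the principle is kept; the real algorithm has more state and cases."""
    TARGET = 0.005      # 5 ms acceptable standing delay
    INTERVAL = 0.100    # 100 ms observation window

    def __init__(self):
        self.first_above_time = 0.0
        self.dropping = False
        self.count = 0
        self.drop_next = 0.0

    def on_dequeue(self, now, sojourn_time):
        """Return True if the packet being dequeued should be dropped."""
        if sojourn_time < self.TARGET:
            self.first_above_time = 0.0
            self.dropping = False
            return False
        if self.first_above_time == 0.0:
            self.first_above_time = now + self.INTERVAL
            return False
        if not self.dropping and now >= self.first_above_time:
            self.dropping = True
            self.count = 1
            self.drop_next = now + self.INTERVAL
            return True
        if self.dropping and now >= self.drop_next:
            self.count += 1
            self.drop_next = now + self.INTERVAL / sqrt(self.count)
            return True
        return False

codel = CoDelSketch()
print(codel.on_dequeue(now=0.250, sojourn_time=0.030))   # first sign of a standing queue
```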
APA, Harvard, Vancouver, ISO, and other styles
26

Pierantozzi, Stefano. "Modellazione e comparazione del protocollo TCP in scenari di mobilità." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/7440/.

Full text
Abstract:
Over the last two decades, the impact of wireless technologies has revolutionized the way we communicate. However, alongside many benefits, several integration and optimization problems have emerged. TCP, one of the best-known and most widely used network communication protocols, is increasingly used within wireless systems because of its reliability and control features, even though it provides no specific support for them. This is the subject of intense debate and research, which aims to refine the different versions of TCP in order to make them wireless-oriented. In this work, two system variants that use TCP in mobility scenarios are analysed, one with classic TCP and the other with TCP modified by the addition of an early-retransmission mechanism; their various aspects and behaviours are studied, and their performance is evaluated by means of mathematical methods well established in the literature.
APA, Harvard, Vancouver, ISO, and other styles
27

VIANA, Vanina Cardoso. "Desenvolvimento de metodologia de análise quantitativa de risco para dutovias de petróleo e derivados." Universidade Federal de Pernambuco, 2011. https://repositorio.ufpe.br/handle/123456789/5387.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior. Pipeline transport is considered the safest way of transporting substances over long distances, and LPG can be transported in liquid form by pipeline; however, pipeline construction poses a risk to the communities surrounding the pipeline right-of-way, since LPG released into the atmosphere at ambient temperature becomes a gaseous product that is flammable, odourless and asphyxiating at high concentrations. Quantitative Risk Analysis (AQR) makes it possible to quantify these risks and thus support decision-making on actions to reduce them. In this context, the present work developed and applied an AQR methodology to an LPG pipeline, in which six accidental scenarios were identified, three of which showed a high degree of severity. A study was carried out showing the lack of standardization of risk-tolerability levels in the risk-analysis manuals of the Brazilian states analysed. The ranges of the physical effects and of the vulnerability of the selected scenarios were obtained with the EFFECTS 8.1 software; the largest range was 219.1 metres, for the cloud fire (flash fire) accident typology, in scenario 05, which corresponds to a large release of LPG during the night period, with 100% fatality. The frequencies of occurrence were estimated, and the risks were calculated with the RiskCurves 7.6 software and evaluated by comparison with the tolerability limits for pipelines; all the risks were within the acceptability zone.
APA, Harvard, Vancouver, ISO, and other styles
28

Koenemann, Kai. "Rebel Group Funding and Engagement in Rebel Governance: A Comparative Case Study." Thesis, Uppsala universitet, Institutionen för freds- och konfliktforskning, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385134.

Full text
Abstract:
This thesis addresses an identified gap in the field of rebel governance and rebel funding, by theorizing and investigating how differences in rebel group funding sources affect a group’s engagement in rebel governance, distinguishing funding through natural resources from funding through non-natural resources. It is highlighted that these sources differ in three fundamental ways: their necessity for civilian labor and cooperation, the extent to which equipment, technology and infrastructure are required, and the expected time of pay-off. It is hypothesized that the degree to which a rebel group depends on natural resources determines the likelihood to which it engages in rebel governance - i.e. intervenes in all security, political, social, health and educational spheres of civilian life. This hypothesis is investigated through a comparative case study of two rebel groups from 2003 to 2018: the Taliban in Afghanistan, which generated its funding primarily through Afghanistan’s opium economy, and the Salafist Group for Preaching and Combat, later known as Al-Qaeda in Maghreb, which generated its funding through ‘criminal activities’ such as kidnappings for ransom. The findings suggest some level of support for the hypothesis. Inconsistencies in the findings limiting generalizability and the need for further investigations are discussed.
APA, Harvard, Vancouver, ISO, and other styles
29

Sordi, Valentina. "A STATE IN THE STATE : THE ROLE OF TRANSNATIONAL AGENTS IN THE DESTATALIZATION PROCESS: THE CASE OF MALI." Thesis, Högskolan Dalarna, Afrikanska studier, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:du-30721.

Full text
Abstract:
The thesis focuses on the current Malian situation, which has seen a severe penetration by jihadist groups (transnational agents), and attempts to understand this phenomenon in accordance with a theoretical framework that accounts for the crisis of the concept of the Westphalian state in the contemporary international panorama. The analysis is structured on two levels, investigating the structural issues and the political and social mutations, on both the regional and the state dimension.
APA, Harvard, Vancouver, ISO, and other styles
30

Li, Zhi. "Fuzzy logic based robust control of queue management and optimal treatment of traffic over TCP/IP networks." University of Southern Queensland, Faculty of Sciences, 2005. http://eprints.usq.edu.au/archive/00001461/.

Full text
Abstract:
Improving network performance in terms of efficiency, fairness in the bandwidth, and system stability has been a research issue for decades. Current Internet traffic control maintains sophistication in end TCPs but simplicity in routers. In each router, incoming packets queue up in a buffer for transmission until the buffer is full, and then the packets are dropped. This router queue management strategy is referred to as Drop Tail. End TCPs eventually detect packet losses and slow down their sending rates to ease congestion in the network. This way, the aggregate sending rate converges to the network capacity. In the past, Drop Tail has been adopted in most routers in the Internet due to its simplicity of implementation and practicability with light traffic loads. However, under heavy traffic loads, Drop Tail causes not only a high loss rate and low network throughput, but also long packet delay and lengthy congestion conditions. To address these problems, active queue management (AQM) has been proposed with the idea of proactively and selectively dropping packets before an output buffer is full. The essence of AQM is to drop packets in such a way that the congestion avoidance strategy of TCP works most effectively. Significant efforts in developing AQM have been made since random early detection (RED), the first prominent AQM other than Drop Tail, was introduced in 1993. Although various AQMs also tend to improve fairness in bandwidth among flows, the vulnerability of short-lived flows persists due to the conservative nature of TCP. It has been revealed that short-lived flows account for a relatively small percentage of bytes but a large number of flows. From the user's point of view, there is an expectation of timely delivery of short-lived flows. Our approach is to apply artificial intelligence technologies, particularly fuzzy logic (FL), to address these two issues: an effective AQM scheme, and preferential treatment for short-lived flows. Inspired by the success of FL in the robust control of nonlinear complex systems, our hypothesis is that the Internet is one of the most complex systems and FL can be applied to it. First of all, state-of-the-art AQM schemes outperform Drop Tail, but their performance is not consistent under different network scenarios. Research reveals that this inconsistency is due to the selection of congestion indicators. Most existing AQM schemes rely on queue length, input rate, and extreme events occurring in the routers, such as a full queue and an empty queue. This drawback might be overcome by introducing an indicator which takes account of not only input traffic but also queue occupancy for early congestion notification. The congestion indicator chosen in this research is the traffic load factor. The traffic load factor is dimensionless and thus independent of link capacity, and it is also easy to use in more complex networks where different traffic classes coexist. The traffic load indicator is a descriptive measure of the complex communication network, and is well suited for use in FL control theory. Based on the traffic load indicator, AQM using FL – or FLAQM – is explored and two FLAQM algorithms are proposed. Secondly, a mice and elephants (ME) strategy is proposed for addressing the problem of the vulnerability of short-lived flows. The idea behind ME is to treat short-lived flows preferentially over bulk flows.
ME’s operational location is chosen at user premise gateways, where surplus processing resources are available compared to other places. By giving absolute priority to short-lived flows, both short and long-lived flows can benefit. One problem with ME is starvation of elephants or long-lived flows. This issue is addressed by dynamically adjusting the threshold distinguishing between mice and elephants with the guarantee that minimum capacity is maintained for elephants. The method used to dynamically adjust the threshold is to apply FL. FLAQM is deployed to control the elephant queue with consideration of capacity usage of mice packets. In addition, flow states in a ME router are periodically updated to maintain the data storage. The application of the traffic load factor for early congestion notification and the ME strategy have been evaluated via extensive experimental simulations with a range of traffic load conditions. The results show that the proposed two FLAQM algorithms outperform some well-known AQM schemes in all the investigated network circumstances in terms of both user-centric measures and network-centric measures. The ME strategy, with the use of FLAQM to control long-lived flow queues, improves not only the performance of short-lived flows but also the overall performance of the network without disadvantaging long-lived flows.
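A toy rendering of the mice-and-elephants idea can make the description above concrete: short flows are served first, a minimum share of capacity is reserved so that long flows cannot starve, and the byte threshold separating the two classes is adjusted over time. The Python sketch below uses a simple hand-written update rule purely for illustration; the thesis adjusts the threshold with fuzzy logic, which is not reproduced here, and all constants are assumptions.

```python
def schedule_round(flows, mice_threshold_bytes, elephant_min_share=0.2, capacity=100):
    """Toy illustration of the mice-and-elephants idea: flows whose cumulative
    size is below the threshold ("mice") are served first, while a minimum share
    of the capacity is reserved for "elephants" so they cannot starve.  The
    threshold update below is an illustrative assumption, not the fuzzy-logic
    rule of the thesis.  `flows` is a list of dicts: {"sent": bytes, "backlog": packets}."""
    mice = [f for f in flows if f["sent"] < mice_threshold_bytes]
    elephants = [f for f in flows if f["sent"] >= mice_threshold_bytes]

    reserved = int(capacity * elephant_min_share)          # capacity floor for elephants
    mice_budget = capacity - reserved
    served_mice = min(mice_budget, sum(f["backlog"] for f in mice))
    served_elephants = min(capacity - served_mice,
                           sum(f["backlog"] for f in elephants))

    # crude threshold update: shrink it when mice use little of their budget,
    # grow it when they exhaust the budget
    if served_mice < 0.5 * mice_budget:
        mice_threshold_bytes = max(1024, int(mice_threshold_bytes * 0.9))
    elif served_mice == mice_budget:
        mice_threshold_bytes = int(mice_threshold_bytes * 1.1)
    return served_mice, served_elephants, mice_threshold_bytes

flows = [{"sent": 2_000, "backlog": 10}, {"sent": 5_000_000, "backlog": 80}]
print(schedule_round(flows, mice_threshold_bytes=100_000))
```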
APA, Harvard, Vancouver, ISO, and other styles
31

Olander, Terese. "Molecular gas around the binary star R Aquarii." Thesis, Uppsala universitet, Institutionen för fysik och astronomi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-325828.

Full text
Abstract:
At the end of their lives, low- to intermediate-mass stars can be found on the asymptotic giant branch (AGB). The AGB phase ends when the entire circumstellar envelope (CSE) is blown away in a superwind phase, in the end creating a planetary nebula. It is unknown what shapes the CSE and the planetary nebula; binarity is a favored theory. In order to test this theory the CSE around the star R Aquarii has been studied using the emission from different molecules observed with ALMA. R Aquarii is a nearby binary system and therefore easy to study. The system consists of a Mira variable on the AGB and a hot white dwarf. It was found that only in the emission from the 12CO J=3–2 transition was the CSE resolved well enough for any structure to be seen. The morphology was irregular and no clear symmetry was seen. A spot in the same molecular line was detected at high velocities (v = -23 km/s) relative to the star, at a projected distance of 7 arcsec south of R Aqr. Line profiles for 12CO and 13CO follow the same shape but differ in magnitude, indicating that they can be found in the same structure. A mass-loss rate of 6.5·10^-7 solar masses per year was calculated for R Aquarii using line intensities obtained from the line profile of 12CO. The morphology and kinematics of the CO CSE of R Aquarii are discussed within the limitations of the current data set. More observations with better resolution are needed to better understand the morphology of the CSE of R Aquarii and draw firm conclusions.
APA, Harvard, Vancouver, ISO, and other styles
32

Mohamed, Mahmud H. Etbega. "Some Active Queue Management Methods for Controlling Packet Queueing Delay. Design and Performance Evaluation of Some New Versions of Active Queue Management Schemes for Controlling Packet Queueing Delay in a Buffer to Satisfy Quality of Service Requirements for Real-time Multimedia Applications." Thesis, University of Bradford, 2009. http://hdl.handle.net/10454/4258.

Full text
Abstract:
Traditionally the Internet is used for the following applications: FTP, e-mail and Web traffic. However, in recent years the Internet has increasingly been supporting emerging applications such as IP telephony, video conferencing and online games. These new applications have different requirements in terms of throughput and delay than traditional applications. For example, interactive multimedia applications, unlike traditional applications, have stricter delay constraints and less strict loss constraints. Unfortunately, the current Internet offers only a best-effort service to all applications, without any consideration of the applications' specific requirements. In this thesis three existing Active Queue Management (AQM) mechanisms are modified by incorporating into these a control function to condition routers for better Quality of Service (QoS). Specifically, delay is considered as the key QoS metric as it is the most important metric for real-time multimedia applications. The first modified mechanism is Drop Tail (DT), which is a simple mechanism in comparison with most AQM schemes. A dynamic threshold has been added to DT in order to maintain packet queueing delay at a specified value. The modified mechanism is referred to as Adaptive Drop Tail (ADT). The second mechanism considered is Early Random Drop (ERD) and, in a similar way to ADT, a dynamic threshold has been used to keep the delay at a required value, the main difference being that packets are now dropped probabilistically before the queue reaches full capacity. This mechanism is referred to as Adaptive Early Random Drop (AERD). The final mechanism considered is motivated by the well known Random Early Detection AQM mechanism and is effectively a multi-threshold version of AERD in which packets are dropped with a linear function between the two thresholds and the second threshold is moveable in order to change the slope of the dropping function. This mechanism is called Multi Threshold Adaptive Early Random Drop (MTAERD) and is used in a similar way to the other mechanisms to maintain delay around a specified level. The main focus with all the mechanisms is on queueing delay, which is a significant component of end-to-end delay, and also on reducing the jitter (delay variation). A control algorithm is developed using an analytical model that specifies the delay as a function of the queue threshold position, and this function has been used in a simulation to adjust the threshold to an effective value to maintain the delay around a specified value as the packet arrival rate changes over time. A two-state Markov Modulated Poisson Process is used as the arrival process to each of the three systems to introduce burstiness and correlation of the packet inter-arrival times and to represent sudden changes in the arrival process, as might be encountered when TCP is used as the transport protocol and step-changes the size of its congestion window. In the investigations it is assumed that the traffic source is a mixture of TCP and UDP traffic and that the mechanisms considered apply to the TCP-based data. It is also assumed that this constitutes the majority of the total traffic, so that the control mechanisms have a significant effect on controlling the overall delay. The three mechanisms are evaluated using a Java framework and results are presented showing the amount of improvement in QoS that can be achieved by the mechanisms over their non-adaptive counterparts.
The mechanisms are also compared with each other and conclusions drawn.
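The multi-threshold dropping and the moveable second threshold described above can be illustrated with a few lines of Python. This is only a sketch of the general idea (linear dropping between two thresholds, with the upper threshold nudged so that queueing delay tracks a target); the actual control algorithm in the thesis is derived from an analytical model and is not reproduced here, and all numbers below are assumptions.

```python
def drop_probability(queue_len, th1, th2):
    """Linear dropping between two thresholds, as in the multi-threshold
    scheme described in the abstract (a sketch, not the thesis code)."""
    if queue_len <= th1:
        return 0.0
    if queue_len >= th2:
        return 1.0
    return (queue_len - th1) / float(th2 - th1)

def adjust_upper_threshold(th2, measured_delay, target_delay, buffer_size, th1, step=2):
    """Illustrative threshold controller: move the second threshold so that the
    measured queueing delay tracks a target.  Lowering th2 steepens the dropping
    function and reduces delay; raising it does the opposite."""
    if measured_delay > target_delay:
        th2 = max(th1 + 1, th2 - step)
    elif measured_delay < 0.9 * target_delay:
        th2 = min(buffer_size, th2 + step)
    return th2

# toy usage: queue of 70 packets, service rate 1000 packets/s, 20 ms delay target
th1, th2, buffer_size = 20, 80, 120
queue_len, service_rate, target = 70, 1000.0, 0.020
delay = queue_len / service_rate                      # about 70 ms of queueing delay
print("p_drop =", round(drop_probability(queue_len, th1, th2), 2))
th2 = adjust_upper_threshold(th2, delay, target, buffer_size, th1)
print("new upper threshold =", th2)
```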
APA, Harvard, Vancouver, ISO, and other styles
33

Huard, Yannick. "Objectivation de l'équilibre en stabilité debout et lors du cycle de marche chez le sujet âgé autonome chuteur : apport de l'Ostéopathie." Thesis, Reims, 2015. http://www.theses.fr/2015REIMS024/document.

Full text
Abstract:
Balance disorders, such as falls, remain frequent in the elderly, with adverse consequences for autonomy. Three randomized trials were conducted to identify the parameters that distinguish autonomous elderly fallers and to analyse the impact of an osteopathic treatment. The first study concerns 33 elderly subjects: 15 fallers (68.3 ± 2.7 years) and 18 non-fallers (67.7 ± 2.5 years). Three stabilometric parameters and three clinical tests distinguish the two populations (p < 0.05). Moreover, the osteopathic treatment improves the evaluated characteristics of the elderly fallers (no remaining significant difference). The second study concerns 40 elderly fallers with low back pain: 20 receiving an osteopathic treatment (69.5 ± 3.9 years) and 20 not receiving it (69.9 ± 3.4 years). This study shows that lumbar mobility is restricted in elderly fallers and that the osteopathic treatment improves lumbar range of motion immediately after the treatment as well as seven days later (p ≤ 0.01). The third study concerns 34 elderly subjects: 17 fallers (71.3 ± 3.5 years) and 17 non-fallers (71.5 ± 4.2 years). Four kinematic parameters distinguish the two populations (p ≤ 0.04). The coefficient of determination R2 and the Gait Variability Index confirm that distinction. The osteopathic treatment improves the evaluated characteristics of the elderly fallers (no remaining significant difference).
APA, Harvard, Vancouver, ISO, and other styles
34

Ortega, Jiménez Sara. "Simulació numèrica mesoscalar de l'ozó troposfèric a Catalunya." Doctoral thesis, Universitat de Barcelona, 2009. http://hdl.handle.net/10803/756.

Full text
Abstract:
Ozone is a secondary pollutant that can be found in the troposphere in high concentrations during the periods of greatest solar radiation. It has harmful effects on living beings, which is why its concentrations must be monitored with measuring stations and numerical models. In order to model tropospheric ozone concentrations over Catalonia, an air quality modelling system (AQM) was designed, consisting of an emissions model, a meteorological model and a photochemical model. The system was applied to four case studies in 2003. The AQM is composed of the MM5/MNEQA/CMAQ models. The emissions model is a numerical model developed in this thesis that uses both top-down and bottom-up methodologies. The Numerical Emissions Model for Air Quality (MNEQA) considers biogenic and anthropogenic emissions of nitrogen oxides (NOx), hydrocarbons (VOC), carbon monoxide (CO) and particulate matter (PM10, PM2.5). It has a modular structure and, although it has been applied to Catalonia, it can be adapted to other areas. The analysis of the results for the case studies shows an emissions pattern consistent with the distribution of population density: emissions are highest in and around Barcelona and along the main road axes. A comparison of the results with another model (EMEP) shows similar daily amounts of NOx; MNEQA attributes about 5% more NOx to the study area (Catalonia) than EMEP, and about 18% fewer VOCs. The meteorological simulations used the MM5 model in non-hydrostatic mode. Their validation shows a tendency of MM5 to underestimate temperature, more markedly at the hottest hours, and to overestimate wind speed, with an RMSE of 1.7 m/s; the errors are nevertheless of the same order as those reported in other published studies. The photochemical simulations for the four case studies were carried out with the CMAQ model. They show different ozone concentration levels, higher in the two August periods, for which high temperatures were observed and ozone levels that in some cases exceeded the population information threshold. The model is not able to reproduce the highest concentrations observed in some areas, but it does reproduce the tendency; it is possible that the model is limited by a deficit of VOC emissions. The basis for an operational structure for ozone modelling has been established, showing moderate agreement with the observed values. The emissions model is a new tool for the study of emissions and for air quality modelling.
APA, Harvard, Vancouver, ISO, and other styles
35

Fares, Rasha H. A. "Performance modelling and analysis of congestion control mechanisms for communication networks with quality of service constraints. An investigation into new methods of controlling congestion and mean delay in communication networks with both short range dependent and long range dependent traffic." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/5435.

Full text
Abstract:
Active Queue Management (AQM) schemes are used for ensuring the Quality of Service (QoS) in telecommunication networks. However, they are sensitive to parameter settings and have weaknesses in detecting and controlling congestion under dynamically changing network situations. Another drawback of the AQM algorithms is that they have been applied only to Markovian models, which are considered Short Range Dependent (SRD) traffic models. However, traffic measurements from communication networks have shown that network traffic can exhibit self-similar as well as Long Range Dependent (LRD) properties. Therefore, it is important to design new algorithms not only to control congestion but also to have the ability to predict the onset of congestion within a network. An aim of this research is to devise some new congestion control methods for communication networks that make use of various traffic characteristics, such as LRD, which have not previously been employed in the congestion control methods currently used in the Internet. A queueing model with a number of ON/OFF sources has been used and this incorporates a novel congestion prediction algorithm for AQM. The simulation results have shown that applying the algorithm can provide better performance than an equivalent system without the prediction. Modifying the algorithm by the inclusion of a sliding window mechanism has been shown to further improve the performance in terms of controlling the total number of packets within the system and improving the throughput. Also considered is the important problem of maintaining QoS constraints, such as mean delay, which is crucially important in providing satisfactory transmission of real-time services over multi-service networks like the Internet, which was not originally designed for this purpose. An algorithm has been developed to provide a control strategy that operates on a buffer which incorporates a moveable threshold. The algorithm has been developed to control the mean delay by dynamically adjusting the threshold, which, in turn, controls the effective arrival rate by randomly dropping packets. This work has been carried out using a mixture of computer simulation and analytical modelling. The performance of the new methods that have
Ministry of Higher Education in Egypt and the Egyptian Cultural Centre and Educational Bureau in London
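A minimal sketch of the kind of sliding-window congestion prediction described above is given below, assuming a discrete-time queue: recent arrival counts are averaged over a window and the backlog is extrapolated a few intervals ahead to flag incipient congestion before the buffer fills. The window size, horizon and thresholds are illustrative assumptions, not the algorithm developed in the thesis.

```python
from collections import deque

class SlidingWindowPredictor:
    """Illustrative sliding-window congestion predictor: keep recent per-interval
    arrival counts, extrapolate the queue a short horizon ahead, and signal
    incipient congestion before the buffer fills."""
    def __init__(self, window=10, service_rate=100, buffer_size=500, horizon=5):
        self.counts = deque(maxlen=window)
        self.service_rate = service_rate
        self.buffer_size = buffer_size
        self.horizon = horizon

    def update(self, arrivals_this_interval, current_queue):
        self.counts.append(arrivals_this_interval)
        avg_arrival = sum(self.counts) / len(self.counts)
        # naive linear extrapolation of the backlog `horizon` intervals ahead
        predicted = current_queue + self.horizon * (avg_arrival - self.service_rate)
        return predicted > 0.8 * self.buffer_size   # True = congestion expected soon

pred = SlidingWindowPredictor()
print(pred.update(arrivals_this_interval=180, current_queue=120))   # likely True
```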
APA, Harvard, Vancouver, ISO, and other styles
36

Fares, Rasha Hamed Abdel Moaty. "Performance modelling and analysis of congestion control mechanisms for communication networks with quality of service constraints : an investigation into new methods of controlling congestion and mean delay in communication networks with both short range dependent and long range dependent traffic." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/5435.

Full text
Abstract:
Active Queue Management (AQM) schemes are used for ensuring the Quality of Service (QoS) in telecommunication networks. However, they are sensitive to parameter settings and have weaknesses in detecting and controlling congestion under dynamically changing network situations. Another drawback for the AQM algorithms is that they have been applied only on the Markovian models which are considered as Short Range Dependent (SRD) traffic models. However, traffic measurements from communication networks have shown that network traffic can exhibit self-similar as well as Long Range Dependent (LRD) properties. Therefore, it is important to design new algorithms not only to control congestion but also to have the ability to predict the onset of congestion within a network. An aim of this research is to devise some new congestion control methods for communication networks that make use of various traffic characteristics, such as LRD, which has not previously been employed in congestion control methods currently used in the Internet. A queueing model with a number of ON/OFF sources has been used and this incorporates a novel congestion prediction algorithm for AQM. The simulation results have shown that applying the algorithm can provide better performance than an equivalent system without the prediction. Modifying the algorithm by the inclusion of a sliding window mechanism has been shown to further improve the performance in terms of controlling the total number of packets within the system and improving the throughput. Also considered is the important problem of maintaining QoS constraints, such as mean delay, which is crucially important in providing satisfactory transmission of real-time services over multi-service networks like the Internet and which were not originally designed for this purpose. An algorithm has been developed to provide a control strategy that operates on a buffer which incorporates a moveable threshold. The algorithm has been developed to control the mean delay by dynamically adjusting the threshold, which, in turn, controls the effective arrival rate by randomly dropping packets. This work has been carried out using a mixture of computer simulation and analytical modelling. The performance of the new methods that have.
APA, Harvard, Vancouver, ISO, and other styles
37

Kumar, Abhishek Anand. "Traffic sensitive quality of service controller." Link to electronic thesis, 2004. http://www.wpi.edu/Pubs/ETD/Available/etd-0114104-230026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Santi, Juliana de 1982. "Gerenciamento ativo de filas para o protocolo "High Speed Transmission Control Protocol" em redes com produto banda-atraso elevado." [s.n.], 2008. http://repositorio.unicamp.br/jspui/handle/REPOSIP/276151.

Full text
Abstract:
Advisor: Nelson Luis Saldanha da Fonseca. Master's dissertation - Universidade Estadual de Campinas, Instituto de Computação. The efficient utilization of bandwidth in high-speed, large-delay networks, called high bandwidth-delay product (BDP) networks, has become a major challenge. This is due to the adjustments of the Transmission Control Protocol (TCP). High Speed TCP (HSTCP), a TCP variant for high-BDP networks, employs more aggressive adjustments, allowing scalable bandwidth utilization. Active Queue Management (AQM) policies monitor the queue length in routers and notify incipient congestion to the TCP sources by marking or dropping packets. The congestion control system has an intrinsic feedback nature, in which the transmission rates of the sources are adjusted according to the level of congestion inferred from the queue occupancy. AQM controllers determine the dropping/marking probability values so as to maximize throughput and minimize losses, providing guarantees that the queue length is stabilized independently of network conditions. In this work, HSTCP-H2, an active queue management policy for high-BDP networks that adopt HSTCP as their transport protocol, is defined. Optimal control theory is used to conceive HSTCP-H2. The novelty of the proposed approach lies in considering the delay of the system, which allows better use of the available resources. Furthermore, in the proposed approach, stability and performance objectives are completely expressed as Linear Matrix Inequalities (LMIs), thus requiring the solution of a single convex problem for the computation of the controller parameters. Different controllers are derived considering different design goals, which take into account the characteristics of high bandwidth-delay product networks. The performance produced by the different optimal controllers was investigated, and the efficacy of the controller with the best performance was compared to the performance of the RED policy. The simulation experiments were carried out using topologies with single and multiple bottlenecks. Master's degree in Computer Science (area: Computer Networks), Universidade Estadual de Campinas.
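To give a flavour of what "expressing stability as an LMI and solving a single convex problem" looks like in practice, the sketch below checks continuous-time stability of a small placeholder system by searching for a Lyapunov matrix P with a convex solver (cvxpy). The matrix A is arbitrary and the formulation is generic; it is not the linearised HSTCP/AQM model or the H2 design derived in the dissertation.

```python
import numpy as np
import cvxpy as cp

A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])            # placeholder system matrix (dx/dt = A x), not the thesis model
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                      # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]       # Lyapunov inequality
problem = cp.Problem(cp.Minimize(0), constraints)         # pure feasibility problem
problem.solve()

print("LMI feasible (system stable):", problem.status == cp.OPTIMAL)
print("Lyapunov matrix P =\n", P.value)
```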
APA, Harvard, Vancouver, ISO, and other styles
39

Rezgui, Taysir. "Musculoskeletal modeling of cerebral palsy children." Compiègne, 2012. http://www.theses.fr/2012COMP1991.

Full text
Abstract:
The analysis of pathological gait using musculoskeletal modelling is a promising approach to qualify and quantify the pathology as well as to monitor potential recovery after therapy. When dealing with cerebral palsy, with its specific neurological disorders and the resulting bone deformities, subject-specific musculoskeletal models have been developed; however, since imaging techniques are still marginal in routine clinical practice, generic models remain indispensable. Using the LifeMod software, this retrospective study aimed to develop a generic musculoskeletal model adapted to children with cerebral palsy. Two main studies were performed. First, relying on a rescaled generic adult skeleton, the limitations of the musculoskeletal model were determined for normal gait and for pathological crouch and jump postures imitated by healthy adults and children. Second, a calibration technique was developed to refine the model's parameters based on data collected from the subject during clinical gait analysis. Results from the musculoskeletal model were compared to gait analysis data. Even though the model outputs gave correct results for healthy adults, the standard rescaled musculoskeletal model showed limits in the predicted kinematics and muscle forces for healthy children and children with cerebral palsy. Refining the subject-specific joint parameters and driving the model with the experimental ground reaction force (GRF) data have a strong influence on the model outputs and quantitatively improve the predicted muscle activations and forces. This work points out that the parameters of a rescaled generic musculoskeletal model can be refined and personalized to improve the model's outcomes, which may represent a new perspective for clinical applications of musculoskeletal modelling.
APA, Harvard, Vancouver, ISO, and other styles
40

Wang, Lan. "Performance modeling of congestion control and resource allocation under heterogeneous network traffic : modeling and analysis of active queue management mechanism in the presence of poisson and bursty traffic arrival processes." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/4455.

Full text
Abstract:
Along with playing an ever-increasing role in the integration of other communication networks and expanding in application diversity, the current Internet suffers from serious overuse and congestion bottlenecks. Efficient congestion control is fundamental to ensure Internet reliability, satisfy the specified Quality-of-Service (QoS) constraints and achieve desirable performance in response to varying application scenarios. Active Queue Management (AQM) is a promising scheme to support end-to-end Transmission Control Protocol (TCP) congestion control because it enables the sender to react appropriately to the real network situation. Analytical performance models are powerful tools which can be adopted to investigate the optimal setting of AQM parameters. Among the existing research efforts in this field, however, there is a current lack of analytical models that can be viewed as a cost-effective performance evaluation tool for AQM in the presence of heterogeneous traffic generated by various network applications. This thesis aims to provide a generic and extensible analytical framework for analyzing AQM congestion control for various traffic types, such as non-bursty Poisson and bursty Markov-Modulated Poisson Process (MMPP) traffic. Specifically, Markov analytical models are developed for an AQM congestion control scheme coupled with queue thresholds and are then adopted to derive expressions for important QoS metrics. The main contributions of this thesis are as follows:
• Study the queueing systems for modeling the AQM scheme subject to single-class and multiple-class Poisson traffic, respectively. Analyze the effects of the varying threshold, mean traffic arrival rate, service rate and buffer capacity on the key performance metrics.
• Propose an analytical model for the AQM scheme with single-class bursty traffic and investigate how burstiness and correlations affect the performance metrics. The analytical results reveal that high burstiness and correlation can result in significant degradation of AQM performance, such as increased queueing delay and packet loss probability, and reduced throughput and utilization.
• Develop an analytical model for a single-server queueing system with AQM in the presence of heterogeneous traffic and evaluate the aggregate and marginal performance subject to different threshold values, burstiness degree and correlation.
• Conduct stochastic analysis of a single-server system with a single queue and with multiple queues, respectively, for the AQM scheme in the presence of multiple priority traffic classes scheduled by the Priority Resume (PR) policy.
• Carry out a performance comparison of AQM with PR and First-In First-Out (FIFO) scheduling, and compare the performance of AQM with a single PR priority queue and with multiple priority queues, respectively.
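The kind of model the thesis analyses can be mimicked, very roughly, in simulation. The Python sketch below generates arrivals from a two-state Markov-Modulated Poisson Process and feeds them to a finite queue that drops arrivals above a threshold, reporting loss probability and mean queue length; it is a toy discrete-time illustration with assumed parameters, not the analytical Markov models developed in the thesis.

```python
import math
import random

def poisson(rng, lam):
    """Knuth's algorithm for sampling a Poisson random variate."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def mmpp_threshold_queue(steps=100_000, rates=(0.3, 1.5), switch_p=(0.01, 0.05),
                         buffer_size=50, threshold=30, seed=42):
    """Toy discrete-time model: a two-state MMPP (low/high arrival rate with
    random state switching) feeds a finite queue that drops every arrival once
    the queue exceeds a threshold.  One packet is served per slot.  All numbers
    are illustrative assumptions."""
    rng = random.Random(seed)
    state, queue = 0, 0
    total_arrivals = dropped = queue_sum = 0
    for _ in range(steps):
        if rng.random() < switch_p[state]:
            state = 1 - state                        # modulating Markov chain
        for _ in range(poisson(rng, rates[state])):
            total_arrivals += 1
            if queue >= threshold or queue >= buffer_size:
                dropped += 1                         # threshold-based drop
            else:
                queue += 1
        if queue > 0:
            queue -= 1                               # at most one departure per slot
        queue_sum += queue
    return dropped / max(1, total_arrivals), queue_sum / steps

loss_prob, mean_queue = mmpp_threshold_queue()
print("loss probability:", round(loss_prob, 4), "mean queue length:", round(mean_queue, 2))
```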
APA, Harvard, Vancouver, ISO, and other styles
41

Chung, Jae Won. "Congestion control for streaming media." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-081805-084831/.

Full text
Abstract:
Dissertation (Ph.D.) -- Worcester Polytechnic Institute.
Keywords: streaming media; streaming transport protocol; active queue management (AQM); Internet congestion control. Includes bibliographical references (p. 236-248).
APA, Harvard, Vancouver, ISO, and other styles
42

Lindsey, Summer Elyse. "Women's Security After War: Protection and Punishment in eastern Democratic Republic of Congo." Thesis, 2019. https://doi.org/10.7916/d8-aqdm-7511.

Full text
Abstract:
Does violence against women increase in the aftermath of war? If so, why? Scholars and policy-makers have begun to ask questions about violence against women in the post-conflict space, yet complexities in measurement and a focus on outcomes (rather than mechanisms) leave essential questions unanswered. This dissertation refines and scopes these questions to learn about whether, how, and why the social context that supports violence against women changes as a result of war. The central argument of this dissertation is that armed conflict fosters protective masculine norms that, in turn, affect how communities socially sanction or punish local crimes, including violence against women. Drawing insights from feminist theory, economics, social psychology and political science, the theory of protective masculine norms describes a process by which the gendered nature of protection and exigencies of community security lead communities to choose more severe punishment for public crimes deemed to threaten their communities. Protection tradeoffs, however, also lead people to choose less severe punishment for other "private" crimes. I derive and examine the observable implications of this theory in the context of eastern DR Congo, a place where there are high levels of violence against women that has also been exposed to high levels of insecurity associated with armed violence in the distant and recent past. Chapter 1 lays the framework for the dissertation; describing the social nature of violence against women, processes of norm change, the research approach, and the derivation of protective masculine norms theory. Then, because protective masculine norms are broadly shared across societies, Chapter 2 investigates the nature of war, law, and punishment processes in eastern DR Congo to understand how the theory and findings travel to other contexts. Chapter 3 motivates the theory of protective masculine norms by providing the empirical foundation for differentiating between forms of violence against women and placing them in a framework with other crimes. Contrary to prominent theories about empowerment, backlash and violent masculinities; armed conflict fails to affect preferences for punishing rape and domestic violence in a unidirectional way. Armed conflict increases how severely people prefer to punish rape and stealing, but decreases how severely people prefer to punish domestic violence. The qualitative evidence underscores the relevance of disaggregating crimes against women in terms of public community threats and private crimes. Chapter 4 explicates the theory of protective masculine norms, grounding it in the literature and in the case. I examine the quantitative and descriptive evidence related to alternative hypotheses that may account for armed conflict's effects: exposure to wartime crimes, security structures and demographic change. Finding little support for alternative theories, I describe the design of and results from qualitative work probing central propositions within protective masculine norms theory: Protection is gendered, people have shared memories of conflict incidents, this affects their subsequent behaviors, and internal crimes are related to perceived provision of protection. Since sanctioning is a public act subject to group dynamics and norms, Chapter 5 examines the implications of protective masculine norms and the findings about preference change for how groups choose to punish crimes. 
Armed conflict may affect how groups choose to punish crimes by changing individual-level preferences, by changing group dynamics, neither, or both. I find that armed conflict affects group preferences primarily through individual-level preference change, underscoring the relevance of preference change for social sanctioning in the aftermath of war. The data also show that group dynamics make people's preferences more extreme, suggesting the importance of norms to shaping preferences - a central tenet of the theory. Chapter 6 discusses the emerging research agenda of protective masculine norms and its contributions. Questions remain about levels of violence against women after war. But the theory of protective masculine norms has already begun to unify a formerly disparate set of findings emerging about armed conflict, domestic violence, and social and legal change.
APA, Harvard, Vancouver, ISO, and other styles
43

Lucas, Pedro Henrique Saraiva. "Development of the server for the NanoSen-AQM Project." Master's thesis, 2019. http://hdl.handle.net/10316/87926.

Full text
Abstract:
Master's dissertation in Informatics Engineering presented to the Faculdade de Ciências e Tecnologia. With the advancements in technology and medicine, there has been an increase in interest in air pollution, which is associated with many health problems. Air quality monitoring platforms have been created, although they only use data from governmental stations, gathered by expensive sensors. Platforms using low-cost sensors would increase the coverage of air quality monitoring; yet, since measurements collected by these sensors are not always reliable, such systems could not guarantee the quality of their data. A possible solution is to develop a platform that performs quality control on the data gathered by the sensors. The NanoSen-AQM project proposes the development of an air quality monitoring platform that uses measurements originating from low-cost nanosensors (also developed by the project). In order to guarantee the quality of the measurements, the platform calibrates the data received from the sensors. This curricular internship has the objective of developing the server (the back-end) for the NanoSen-AQM project. The development encompasses research on the state of the art (concepts, technologies and systems related to the project), leading to the definition of the requirements and the architecture. The implementation of the system is also performed, along with its testing and, finally, its deployment to production. The NanoSen-AQM project will demonstrate that it is possible to monitor air quality economically, allowing the public to have access to air pollution data in its daily routine. Universidade de Coimbra - Research grant for a graduate (Bolsa de Investigação para Licenciado), for a period of 6 months (February to July 2019).
APA, Harvard, Vancouver, ISO, and other styles
44

Huang, Mieng Jang, and 黃銘璋. "Improving AQM Performance and Fairness in Networks with Multiple Bottlenecks." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/73119800277526681901.

Full text
Abstract:
碩士<br>國立中興大學<br>資訊科學研究所<br>93<br>In the thesis, we investigate the net work stability and fairness problem in a network with multiple bottleneck. We also propose a mechanism, which is based on the well known AQM algorithm – RED, to improve the performance, in terms of queue stability, link utilization, and flow fairness, of the network with multiple bottleneck. As we know, little research has been done in this area so far. In a multiple-bottleneck network, where more than two links get into congestion at the same time, when a flow passes through more than one RED queues, it will be controlled by all these AQMs at the same time. Packets belonging to this kind of flows will apt to be discarded more easily than those passing single bottleneck. Since all routers perform AQM function independently, coincidently control will also make network unstable, both in queue length and link utilization. We study above problems with in-depth experiments using a well known simulator NS2. We propose a novel AQM mechanism which uses multiple virtual queues in a single FIFO RED queue. Each virtual queue is assigned a different priority, and service is provided according to the priority assigned. We also use NS2 to prove the effectiveness of this approach.
APA, Harvard, Vancouver, ISO, and other styles
45

Silva, Jorge André Dias. "Development of a client application for the NanoSen-AQM project." Master's thesis, 2019. http://hdl.handle.net/10316/87313.

Full text
Abstract:
Master's dissertation in Informatics Engineering presented to the Faculty of Sciences and Technology.

With the evolution of technology and the increase of pollution worldwide, there has been growing demand for means to follow and monitor air quality, since it directly affects the population. In response to this demand, ideas for new platforms and systems have emerged in order to provide air pollution data to users in a variety of ways. Thus, in order to meet the needs of the population in a real-time, simple, sustainable, and accessible way, the NanoSen-AQM project was created. To achieve this, the project developed, along with a front-end client, a system based on various electronic components that support the use of low-cost, low-consumption sensors scattered across different European locations to measure the amount of pollutants in the air. Since the sensors are low-cost, small, light, and simple to use, the system is easily integrated into stations, mobile units, and personal air pollution measurement equipment, and is therefore suitable for use in sensor networks. The focus of this document is the implementation of the project's front-end client, integrated with the back-end system; all the steps of its development are presented, covering the state of the art, requirements analysis, architecture, prototyping, development, testing, and documentation. The result is a hybrid and dynamic front-end platform that stands out from the others on the market, serving web and mobile (Android and iOS) users efficiently and allowing rapid analysis and recording of air quality in any part of Europe.

Universidade de Coimbra - Research grant for a graduate (Licenciado), for a period of 6 months (February to July 2019).
APA, Harvard, Vancouver, ISO, and other styles
46

Chu, Hsiu-Yuan, and 朱秀媛. "Fuzzy Modeling and Control of TCP Network with AQM Routers." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/91100696483590420586.

Full text
Abstract:
碩士<br>國立臺灣海洋大學<br>資訊工程學系<br>93<br>Transmission Control Protocol (TCP) is the protocol of choice for the widely used World Wide Web (HTTP), file transfer (FTP), TELNET, and email (SMTP) applications, because it provides a reliable transmission service. The TCP flow and congestion control mechanisms are designed to determine the capacity available in the network and vary its transmission rate accordingly. The sender maintains a congestion window state variable, which represents the amount of data that can be transmitted at a given time according to the congestion in the network. Recently, Active Queue Management (AQM) has been proposed to support the end-to-end congestion control for TCP traffic regulation on the Internet. For the purpose of alleviating congestion for IP networks and providing some notion of Quality of Service (QoS), the AQM schemes are designed to improve the Internet applications. Earliest efforts on AQM, e.g., Random Early Detection (RED), are essentially heuristic without systematic analysis. The fluid-flow based dynamic models of TCP network make it possible to design AQM routers using feedback control theory. This TCP/AQM dynamic model was derived using delay differential equations. The model based AQM control methodologies were almost developed based on the linearized TCP network. In this thesis, we attempt to use the Takagi-Sugeno (T-S) fuzzy modeling technique to construct an approximate mathematical model for the nonlinear TCP network with AQM routers. It will be shown that the T-S fuzzy model can provide better approximation than the linearized dynamic model. Based on the well built T-S fuzzy model, a T-S fuzzy controller design approach is sequentially developed for the AQM routers. The proposed T-S fuzzy control scheme is accomplished by using the Parallel Distributed Compensation (PDC) concept and Iterative Linear Matrix Inequality (ILMI) algorithm. At last, we will show the effectiveness of our approach compared to widely used AQM schemes through numerical simulations.
APA, Harvard, Vancouver, ISO, and other styles
47

Wu, Hsien-Ming, and 吳賢明. "Design of Customer-Based AQM Routers for Improving Inter-Server Fairness." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/19137128110345381837.

Full text
Abstract:
博士<br>國立中興大學<br>資訊科學與工程學系<br>96<br>Active queue management (AQM) has been extensively investigated in the past decade. Many algorithms for AQM have been proposed. They focus on issues including the design of congestion control mechanisms and fair sharing of bandwidth. These algorithms attempt to mitigate network congestion under the cooperation of responsive or TCP-friendly source protocols. Some of these algorithms can also ensure congestion control performance with unresponsive flows. However, these algorithms are primarily intended to characterize the congestion control performance and bandwidth allocation fairness on a per-flow basis. They do not adequately address and treat the issue on fair sharing of bandwidth among servers; which is called inter-server fairness in this dissertation. In this dissertation, we address the issue on fair bandwidth sharing among servers, and present two novel AQM algorithms for solving the inter-server fairness problem. The first algorithm is named Server Fairness RED or SF-RED in short, and the second algorithm is called Grouping Virtual Queue or GVQ. Based on the well-known AQM algorithm Adaptive RED (A-RED), we design and develop SF-RED algorithm for providing inter-server fairness services. In SF-RED, packets from a server are scheduled into a dedicated virtual queue, in which the packet dropping probability of a virtual queue is determined only by the occupancy of that virtual queue. To demonstrate the effectiveness of SF-RED, we conduct a series of simulations using an ns-2 simulator. The simulation results show that SF-RED can achieve a high quality of inter-server fairness, which is very close to a totally fair system, in allocating bandwidth among servers. SF-RED adopts an “one virtual queue per-server” policy; making the queue complexity a major disadvantage. However, since virtual queues are assigned only for servers with packets currently queued in the buffer; our simulation results also show that the proposed approach can be deployed easily in a router with limited buffer space. To reduce the queue complexity of SF-RED and ensure the inter-server fairness at the same time, we also propose GVQ as an enhancement of SF-RED. GVQ is a group-based AQM algorithm, in which servers with the equivalent number of active flows are grouped into the same class. Unlike SF-RED, GVQ schedules packets from the servers in the same class into a dedicated virtual queue, and allocates bandwidth to a virtual queue based on its queue weight, which is calculated from the normalized number of servers in that class. A closed-form equation is derived to represent the inter-server fairness as a function of queue complexity. We show that this equation is capable of predicting accurately the inter-server fairness of the network for a given buffer configuration, thus providing a good guideline for the trade-off between desired inter-server fairness and queue complexity of the queuing mechanism design. To demonstrate the effectiveness of GVQ, a set of simulations are also run using an ns-2 simulator. The results of both analysis and simulations well match, and they indicate that high-quality inter-server fairness can be achieved by using only a small number of virtual queues. A series of simulations are also undertaken to compare the congestion control performance of RED, SF-RED and GVQ. These results indicate that the proposed algorithms outperform RED in stabilizing the queue size, while it is still capable of maintaining a high level of inter-server fairness. 
SF-RED and GVQ can stabilize a congested queue at a target length, enabling them to achieve low packet delay and packet loss which are considered two major performance metrics of an AQM router.
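As a minimal sketch of the one-virtual-queue-per-server idea behind SF-RED (the drop thresholds below are placeholders, and the Adaptive RED parameter adaptation described in the dissertation is omitted):

import random
from collections import defaultdict

class ServerVirtualQueues:
    """Per-server virtual queues: a packet's drop decision depends only on the
    backlog of its own server's virtual queue (illustrative parameters only)."""
    def __init__(self, min_th=15, max_th=45, max_p=0.1):
        self.min_th, self.max_th, self.max_p = min_th, max_th, max_p
        self.backlog = defaultdict(int)  # packets per server currently buffered

    def on_arrival(self, server_id):
        """Decide whether to drop an arriving packet originating from server_id."""
        q = self.backlog[server_id]
        if q < self.min_th:
            drop = False
        elif q >= self.max_th:
            drop = True
        else:
            p = self.max_p * (q - self.min_th) / (self.max_th - self.min_th)
            drop = random.random() < p
        if not drop:
            self.backlog[server_id] += 1
        return drop

    def on_departure(self, server_id):
        self.backlog[server_id] = max(0, self.backlog[server_id] - 1)

Because packets from the same origin server share one virtual backlog, a server that opens many flows cannot reduce the drop probability seen by other servers, which is the intuition behind inter-server fairness.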
APA, Harvard, Vancouver, ISO, and other styles
48

Lin, Hong-Wu, and 林宏武. "Fuzzy AQM Router Design for Varying Numbers of Session TCP Networks." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/67855150749937720951.

Full text
Abstract:
碩士<br>國立臺灣海洋大學<br>資訊工程學系<br>94<br>Networking researchers have proposed AQM (Active Queue Management) to be used in routers to meet the increasing demand of Internet applications. The dynamic model of Transmission Control Protocol (TCP) has been proposed such that three key parameters, number of TCP sessions, link capacity and round-trip time are related to a feedback control problem. However the number of TCP sessions was assumed to be constant in previous work. In fact, the number of sessions TCP network could vary from time to time. In this paper, we will develop a Takagi-Sugeno(T-S) fuzzy model to simulate the behavior of varying numbers of session nonlinear TCP network. Then we will use the T-S fuzzy model to design the fuzzy controller. The result of simulations shows that our controller provides robust performance when the number of TCP sessions varies from time to time.
APA, Harvard, Vancouver, ISO, and other styles
49

楊正婷. "Fuzzy Based Congestion Control of TCP/AQM Router with Varying TCP Sessions." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/76730685393922189303.

Full text
Abstract:
碩士<br>國立臺灣海洋大學<br>輪機工程系<br>101<br>Due to the number of transmission control protocol sessions usually varies from time to time, a perturbed Takagi-Sugeno fuzzy model is used in this paper to represent the dynamic transmission control protocol network systems. According to the proposed perturbed Takagi-Sugeno fuzzy model, a robust fuzzy controller design approach is investigated to achieve the congestion avoidance for the transmission control protocol network systems. In order to accomplish the above mission, some sufficient conditions are derived based on the Lyapunov stability theory. By solving these sufficient stability conditions, an active queue management router can be obtained via the proposed robust fuzzy congestion control technique.
APA, Harvard, Vancouver, ISO, and other styles
50

Chen, Chang-Kuo, and 陳昌國. "Design and Implementation of Robust Congestion Controllers for TCP/AQM Communication Networks." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/64960015034683439833.

Full text
Abstract:
博士<br>國立成功大學<br>工程科學系專班<br>97<br>Communication network is an essential part for many applications in science and engineering. In particular, based on different network architectures such as various nodes, transmission links and traffic sources, the Internet becomes a global network connecting millions of users and hosts. Unfortunately, the heterogeneity and the large scale in current networks give rise to complexity and difficulty for network management and control. Moreover, the quality of service (QoS) cannot be guaranteed since the number of users is increasing rapidly and there is some unpredictable interference in the networking environment. Therefore, traffic congestion turns out to be one of the major communication problems in Internet. Consequently, how to avoid congestions at bottleneck nodes by regulating the traffic flow and achieve the better performance has been an important issue over the last decade. In this dissertation, the robust congestion controller design and implementation problems are investigated for TCP communication networks. Firstly, a rate-based non-linear TCP network system with a saturated input is addressed. Based on variable structure control (VSC) and nonlinear output feedback approaches, different kinds of active queue management (AQM) controllers are presented to achieve the desired queue size and to guarantee the asymptotic stability of the closed-loop system. Secondly, the linear feedback congestion controller design for nonlinear window-based time-delay TCP network system with input saturation is discussed. An improved genetic algorithm (GA) based Proportional-Integral-Derivative (PID) controller is proposed to guarantee the performances for TCP/AQM networks. State feedback control strategies are developed for the linearized time-delay TCP/AQM model with input saturation. Thirdly, due to the stochastic properties of the communication network, a robust stochastic AQM controller is proposed to guarantee the robustly asymptotically stable in the mean square applying the Lyapunov–Krasovskii functional approach and the linear matrix inequality (LMI) technique, while achieving the prescribed H-infinite disturbance rejection attenuation level of the closed-loop stochastic TCP/AQM system. Finally, based on the Linux platform, the proposed congestion controllers has been implemented to verify the theoretical results. Some illustrative examples performed in the NS2 are given to demonstrate the effectiveness of our main results.
APA, Harvard, Vancouver, ISO, and other styles