Academic literature on the topic 'Threshold dose distributions'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Threshold dose distributions.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Threshold dose distributions"

1

Blom, W. Marty, Berber J. Vlieg-Boerstra, Astrid G. Kruizinga, Sicco van der Heide, Geert F. Houben, and Anthony E. J. Dubois. "Threshold dose distributions for 5 major allergenic foods in children." Journal of Allergy and Clinical Immunology 131, no. 1 (2013): 172–79. http://dx.doi.org/10.1016/j.jaci.2012.10.034.

2

Ballmer-Weber, Barbara K., Montserrat Fernandez-Rivas, Kirsten Beyer, et al. "How much is too much? Threshold dose distributions for 5 food allergens." Journal of Allergy and Clinical Immunology 135, no. 4 (2015): 964–71. http://dx.doi.org/10.1016/j.jaci.2014.10.047.

3

Deale, O. C., R. C. Wesley, D. Morgan, and B. B. Lerman. "Nature of defibrillation: determinism versus probabilism." American Journal of Physiology-Heart and Circulatory Physiology 259, no. 5 (1990): H1544–H1550. http://dx.doi.org/10.1152/ajpheart.1990.259.5.h1544.

Abstract:
The gradual transitions that are found between unsuccessful and successful shock strengths in percent success or dose-response curves suggest that defibrillation is a probabilistic phenomenon. This concept appears to be reinforced by the fact that a frequency distribution is observed in defibrillation threshold data and that a dose-response relationship is also obtained by integration of the frequency distribution. The purpose of this study was to investigate whether a deterministic threshold model (based on experimental results) could produce 1) gradual transitions in dose-response curves, and 2) a threshold frequency distribution for individual subjects. In the experimental phase of the study, a linear deterministic relationship was found between transthoracic threshold current and defibrillation episode number (other variables held constant) in pentobarbital-anesthetized dogs. The correlation coefficient for each dog was between 0.77 and 0.98 (P less than 0.01), and both positive and negative slopes were found. Based on these results, threshold current was modeled for computer simulation as a linear function of episode number. The model was thus purely deterministic with no random variability. For each simulated experiment, several parameters were varied: order of shocks (increment, decrement, random order), slope of threshold function, and percent error of the initial threshold. Several hundred computer simulations were performed to determine the effect of varying these parameters. In all cases, threshold-frequency distributions and sigmoidal dose-response curves with gradual transitions were produced. The results of this investigation demonstrate that the apparent probabilistic behavior of defibrillation can be produced by a deterministic relationship.
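The simulation the abstract describes is easy to reproduce in outline. Below is a minimal sketch in Python (subject counts, slopes, and error levels are our illustrative assumptions, not the authors' values) showing how a purely deterministic, episode-dependent threshold still yields a gradual, sigmoidal percent-success curve:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters, not the authors' experimental values.
n_subjects, n_episodes = 100, 40
strengths = np.linspace(0.5, 1.5, 21)          # candidate shock strengths
attempts = np.zeros(strengths.size)
successes = np.zeros(strengths.size)

for _ in range(n_subjects):
    slope = rng.uniform(-0.01, 0.01)           # threshold drift per episode
    t0 = 1.0 * (1 + rng.uniform(-0.1, 0.1))    # initial threshold, +/-10% error
    for episode in range(n_episodes):
        i = rng.integers(strengths.size)       # shocks delivered in random order
        threshold = t0 + slope * episode       # purely deterministic within a run
        attempts[i] += 1
        successes[i] += strengths[i] >= threshold

percent_success = 100 * successes / np.maximum(attempts, 1)
# percent_success rises gradually (sigmoidally) with shock strength even
# though no random variability was put into the threshold itself
```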
4

Deasy, Joseph O., M. Victor Wickerhauser, and Mathieu Picard. "Accelerating Monte Carlo simulations of radiation therapy dose distributions using wavelet threshold de-noising." Medical Physics 29, no. 10 (2002): 2366–73. http://dx.doi.org/10.1118/1.1508112.

5

Sarycheva, Svetlana S. "Patients skin dose measurements during interventional radiological examinations using Gafchromic XR-RV3 film." Radiatsionnaya Gygiena = Radiation Hygiene 12, no. 4 (2020): 89–95. http://dx.doi.org/10.21514/1998-426x-2019-12-4-89-95.

Abstract:
This work is devoted to the assessment of the absorbed dose in the skin of patients undergoing interventional radiological examinations. There is a probability of deterministic effects in the skin of patients due to the high radiation doses delivered during this type of medical examination. The aim of this work was to conduct direct measurements of the absorbed dose in the skin of patients undergoing interventional radiological procedures using special dosimetric radiochromic Gafchromic XR-RV3 films, to visualize the distribution of radiation over the patient's skin surface and to study the possibility of exceeding the threshold values for deterministic effects in the skin. The paper discusses the features of measurements with Gafchromic XR-RV3 films. A method of film digitization on a conventional flatbed scanner with image processing in the ImageJ program was tested, and the calibration curve obtained for this type of film is presented. The skin dose distributions obtained with radiochromic films for several interventional studies are also presented. The measured maximum absorbed dose in the skin exceeded the 2 Gy threshold for the occurrence of skin erythema in four of the fourteen analyzed procedures. The highest values of the maximum absorbed skin dose were obtained for coronary angioplasty (3.2 Gy) and for embolization of the uterine arteries (2.9 Gy).
6

Doi, Hiroshi, Hiroya Shiomi, Norihisa Masai, et al. "Threshold doses and prediction of visually apparent liver dysfunction after stereotactic body radiation therapy in cirrhotic and normal livers using magnetic resonance imaging." Journal of Radiation Research 57, no. 3 (2016): 294–300. http://dx.doi.org/10.1093/jrr/rrw008.

Abstract:
The purpose of the present study was to investigate the threshold dose for focal liver damage after stereotactic body radiation therapy (SBRT) in cirrhotic and normal livers using magnetic resonance imaging (MRI). A total of 64 patients who underwent SBRT for liver tumors, including 54 cirrhotic patients with hepatocellular carcinoma (HCC) and 10 non-cirrhotic patients with liver metastases, were analyzed. MRI was performed 3–6 months after SBRT, using gadolinium-ethoxybenzyl-diethylenetriamine pentaacetic acid-enhanced T1-weighted sequences. All MRI datasets were merged with 3D dosimetry data. All dose distributions were corrected to the biologically effective dose using the linear–quadratic model with an assumed α/β ratio of 2 Gy. The development of liver dysfunction was validly correlated with isodose distribution. The median biologically effective dose (BED2) that provoked liver dysfunction was 57.3 (30.0–227.9) Gy in cirrhotic livers and 114.0 (70.4–244.9) Gy in normal livers (P = 0.0002). The BED2 associated with a >5% risk of liver dysfunction was 38.5 Gy in cirrhotic livers and 70.4 Gy in normal livers. The threshold BED2 for liver dysfunction was not significantly different between Child–Pugh A and B patients (P = 0.0719). Moreover, the fractionation schedule was not significantly correlated with the threshold BED2 for liver dysfunction in the cirrhotic liver (P = 0.1019). In the cirrhotic liver, fractionation regimen and Child–Pugh classification did not significantly influence the threshold BED2 for focal liver damage after SBRT. We suggest that the threshold BED2 for liver dysfunction after SBRT is 40 Gy in the cirrhotic liver and 70 Gy in the normal liver.
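For context, the linear–quadratic conversion applied here is the standard biologically effective dose formula for n fractions of dose d each; with the assumed α/β of 2 Gy it reads:

```latex
\mathrm{BED}_2 = n\,d\left(1 + \frac{d}{\alpha/\beta}\right), \qquad \alpha/\beta = 2~\mathrm{Gy}.
```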
7

Akazawa, K., H. Doi, S. Ohta, et al. "Relationship between Eustachian tube dysfunction and otitis media with effusion in radiotherapy patients." Journal of Laryngology & Otology 132, no. 2 (2018): 111–16. http://dx.doi.org/10.1017/s0022215118000014.

Abstract:
Objective: This study evaluated the relationship between radiation and Eustachian tube dysfunction, and examined the radiation dose required to induce otitis media with effusion. Methods: The function of 36 Eustachian tubes in 18 patients with head and neck cancer was examined sonotubometrically before, during, and 1, 2 and 3 months after intensity-modulated radiotherapy. Patients with an increase of 5 dB or less in sound pressure level (dB) during swallowing were categorised as being in the dysfunction group. Additionally, radiation dose distributions were assessed in all Eustachian tubes using three dose–volume histogram parameters. Results: Twenty-two of the 25 Eustachian tubes functioning normally before radiotherapy (88.0 per cent) shifted to the dysfunction group after therapy. All ears that developed otitis media with effusion belonged to the dysfunction group. The radiation dose threshold evaluation revealed that ears with otitis media with effusion received significantly higher doses to the Eustachian tubes. Conclusion: The results indicate a relationship between radiation dose and Eustachian tube dysfunction and otitis media with effusion.
8

Remington, Benjamin C., Joost Westerhout, Anthony E. J. Dubois, et al. "Suitability of low‐dose, open food challenge data to supplement double‐blind, placebo‐controlled data in generation of food allergen threshold dose distributions." Clinical & Experimental Allergy 51, no. 1 (2020): 151–54. http://dx.doi.org/10.1111/cea.13753.

9

Hiçsönmez, Ayşe, Yıldız Güney, Ayşen Dizman, et al. "Challenges and Differences in External Radiation Therapy for Retinoblastoma: From Standard Techniques to New Developments." Tumori Journal 103, no. 5 (2015): 438–42. http://dx.doi.org/10.5301/tj.5000406.

Abstract:
Aims: The purpose of this study is to calculate treatment plans and to compare the dose distributions and dose-volume histograms (DVH) of six external radiotherapy techniques for the treatment of retinoblastoma, including intensity-modulated radiotherapy (IMRT) and fractionated stereotactic radiotherapy (Cyberknife). Methods: Treatment plans were developed using six techniques: an en face electron technique (ET), an anterior and lateral wedge photon technique (LFT), a 3D conformal (6 fields) technique (CRT), an inverse plan IMRT, tomotherapy, and conventional focal stereotactic external beam radiotherapy with Cyberknife (SBRT). Dose volume analyses were carried out for each technique. Results: All techniques except the electron technique provided similar target coverage. When comparing the conformal plan with IMRT and SBRT, there was no significant difference in planning target volume dose distribution. The mean volume of the ipsilateral bony orbit receiving more than 20 Gy (V20 Gy), a suggested threshold for bone growth inhibition, was compared across techniques: 73% for the ET, 57% for the LFT, 87% for the CRT, 65% for the IMRT, 66% for the tomotherapy, and 2.7% for the SBRT. Conclusions: This work supports the potential use of IMRT and SBRT to spare normal tissues in these patients.
10

Fritsch, P., N. Dudoignon, K. Guillet, G. Rateau, and J. Delforge. "Influence de la distribution de dose sur le risque d'apparition de cancers pulmonaires après inhalation d'oxydes d'actinides." Canadian Journal of Physiology and Pharmacology 80, no. 7 (2002): 722–26. http://dx.doi.org/10.1139/y02-098.

Abstract:
The aim of this work was to estimate the risk of lung tumour occurrence after inhalation of actinide oxides from published studies and rat studies in progress. For the same delivered dose, the risk increases when the homogeneity of irradiation increases, i.e., when the number of particles deposited after inhalation increases (small particles and (or) low specific alpha activity). The dose–effect relationships reported appear linear up to a few gray, depending on the aerosol considered, and then the slope decreases. This slope, which corresponds to the risk, can vary over one order of magnitude depending on the aerosol used. An effective threshold at about 1 Gy was not observed for the most homogeneous dose distributions. A dosimetric and biological approach is proposed to provide a more realistic risk estimate. Key words: actinide oxides, inhalation, lung tumour, alpha irradiation, dosimetry.

Dissertations / Theses on the topic "Threshold dose distributions"

1

Faria, Clara Maria Gonçalves de. "Distribuições de limiar de dose e suas causas e consequências em Terapia Fotodinâmica." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/76/76132/tde-18052017-083829/.

Abstract:
The principle of photodynamic therapy (PDT) was introduced around 1900 but was later investigated as a candidate for cancer treatment in the 1970s. Since then, numerous in vitro, in vivo, and clinical studies have been published on the subject, and great advances have been achieved. However, some challenges have not yet been overcome, such as the variability of results. This work investigates its causes, with the main goal of advancing the state of the art of PDT. A threshold dose distribution model was used to evaluate cell resistance to PDT in vitro. The threshold distributions are obtained by differentiating the experimental dose-response curve. They are characterized by their width and by the dose corresponding to the peak, which relate to the homogeneity and the intrinsic resistance of the population, respectively. Section 1 presents the evaluation and comparison of data obtained from results published in the literature, and Section 2 presents experiments performed by the author on different cell lines. From the first analysis, it was observed that the width of the distribution is proportional to the dose of the peak, and it was possible to investigate how the PDT outcome depends on the cell line for a fixed photosensitizer (PS). It was also interesting to note that the threshold distributions corresponded to the activity curves of apoptotic cell markers, as a function of light dose, for most of the conditions analyzed. From the experiments performed by the author, the normal cell line was the most resistant to damage, followed by the resistant cancer cells and their parental line, and its response was the most homogeneous. These findings were supported by fluorescence microscopy images obtained to evaluate PS uptake, which showed that the tumor cells accumulated more PS than the other lines. Therefore, the potential of applying threshold dose distributions to the analysis of in vitro PDT results was demonstrated; it is a powerful tool that provides more information than the standard dose-response curves.
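Since the thesis defines the threshold distribution as the derivative of the dose-response curve, the construction is easy to sketch numerically. The data below are synthetic and purely illustrative, not the thesis' measurements:

```python
import numpy as np

# Hypothetical dose-response data: fraction of cells killed vs. light dose.
dose = np.linspace(0, 60, 121)                  # J/cm^2 (illustrative grid)
response = 1 / (1 + np.exp(-(dose - 25) / 5))   # synthetic sigmoid for the demo

# The threshold dose distribution g(D) is the derivative of the dose-response
# curve: the fraction of the population whose threshold lies near D.
g = np.gradient(response, dose)

peak_dose = dose[np.argmax(g)]                  # intrinsic resistance of the population
half = g >= g.max() / 2                         # width (FWHM) reflects homogeneity
fwhm = dose[half][-1] - dose[half][0]
```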
2

Sabino, Luis Gustavo. "Estudo da distribuição de doses limiares em TFD para um modelo de cultura tridimensional de células obtido pelo método de levitação magnética." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/76/76132/tde-03022015-164253/.

Abstract:
Photodynamic therapy (PDT) dosimetry includes a central concept: the threshold dose (Dth), the minimum amount of light that must be absorbed by the photosensitizer (PS) molecules to induce irreversible oxidative damage, and hence cell death by necrosis or apoptosis. It is reasonable to assume that cell subpopulations within a single tumor cell mass may present different Dth values, which implies the existence of a distribution of Dth values, defined here by a Gaussian distribution function g(Dth). Methods for more realistic tissue emulation with in vitro cultures, such as three-dimensional (3D) cultures, have been encouraged to avoid the shortcomings of monolayers. A 3D cell culture is preferable to a monolayer culture because it provides cell-cell and cell-substrate interaction and makes it possible to evaluate a culture in its volumetric dimension, which resembles tumor morphology. This study includes the development of a 3D model, using the magnetic levitation method (MLM) and the magnetic printing method (MIM, from the Portuguese 'método de impressão magnética'), for PDT dosimetry. The aims are to parameterize g(Dth), to investigate cell resistance to PDT, and to establish a fast and consistent method for in vitro tests of new PSs. Comparing cultures of the different cell types studied, those grown by MLM for more than 185 hours presented a denser cellular structure, which provided greater resistance to PDT-induced damage. Hep G2 cells showed a remarkable ability to recover culture integrity by rebuilding the extracellular matrix, whereas MDA-MB-231 cells did not, forming a fragmented culture after PDT. For both culture types, the most evident damage was observed at the tumor margins, showing that the main elements of the photodynamic reaction (PS, light and oxygen) were present in larger quantities at the culture surface than in internal regions. The results for Photogem (PG) show that a light fluence of about 40 J.cm-2 is required to induce cell death by the photodynamic effect in 3D cultures obtained by MLM. The combination of two cell types, a malignant line (MDA-MB-231) and a healthy one (HPF), showed that the photodynamic effect is effective regardless of cell type when the delivery of the therapeutic agents is adequately controlled. Some cells survived treatment, and there is a strong indication that the presence of HPF fibroblasts is related to this small fraction of cells, which received a light dose below their Dth. The results also showed that the higher the light fluence, the lower the IC50 of PG: fluences of 60 and 30 J.cm-2 resulted in IC50 values of 3.1 and 18.0 μg.mL-1, respectively.
3

Sabino, Luis Gustavo. "Modelo matemático de distribuição larga de dose limiar em tumores submetidos a múltiplas sessões de terapia fotodinâmica." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/76/76132/tde-03032010-162858/.

Abstract:
Photodynamic therapy (PDT) is a well-known treatment option for many types of malignant and non-malignant lesions. The technique causes cell damage through a photochemical reaction, generating oxidative agents responsible for tumor cell killing. In several cases, multiple PDT sessions are needed to achieve complete eradication of the lesion. However, several clinical studies have reported tumor regrowth and an increase in tumor resistance after PDT sessions. We present a theoretical model to describe the effects caused by successive PDT sessions, based on the consequences of a partial response caused by the threshold dose distribution within a hypothetical tumor. In this model, we assume that the threshold dose distribution is represented by a modified Gaussian distribution. In terms of threshold dose, higher values imply higher resistance to PDT. If the distribution is broad, the treatment cannot kill all tumor cells. The surviving cell fraction promotes tumor regrowth with characteristics different from those of the original cell population. To investigate the occurrence of a selection of cells with higher threshold doses, an experiment evaluating the response of tumor cells after multiple PDT sessions was carried out: successive PDT sessions were performed on hepatocellular carcinoma (HepG2) cell cultures, with a time interval between sessions long enough to allow surviving cells to divide and form a new culture. The photosensitizer used in the experiments was Photogem®, with irradiation at 630 ± 10 nm. The in vitro experiments provided evidence of increasing resistance of HepG2 cells after successive PDT sessions. This increase is predicted by the theoretical model and may be related to variations in the tumor cell population, which the model expresses through the variation of the threshold dose distribution. However, the increase in PDT resistance predicted by the model is more pronounced than the one observed experimentally, so further studies are needed to fit the model to real conditions. Based on tumor cell variability, the simulations demonstrated that an insufficient light dose can induce an increase in tumor resistance to subsequent PDT sessions. The model may be used to evaluate which type of threshold dose distribution can be found in real tumors, and the consequences of light attenuation between the illuminated surface and deeper tumor regions. The idea presented in this study should motivate further work to identify the importance of threshold dose distributions in tumors submitted to PDT.
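The selection mechanism the model describes can be sketched in a few lines. The regrowth rule below (offspring scatter around the survivors' mean with the original spread) is our illustrative assumption, not the thesis' exact modified Gaussian update:

```python
import numpy as np

rng = np.random.default_rng(1)

# Initial threshold doses: Gaussian, truncated at zero (parameters illustrative)
thresholds = rng.normal(loc=30.0, scale=10.0, size=100_000)
thresholds = thresholds[thresholds > 0]

dose = 35.0                                    # light dose per session (arb. units)
for session in range(1, 5):
    survivors = thresholds[thresholds > dose]  # cells whose threshold exceeds the dose
    frac = survivors.size / thresholds.size
    print(f"session {session}: surviving fraction {frac:.3f}, "
          f"mean threshold {survivors.mean():.1f}")
    # Regrowth assumption (ours): offspring scatter around the survivors' mean,
    # shifting the whole distribution toward higher resistance each round.
    thresholds = rng.normal(survivors.mean(), 10.0, size=100_000)
    thresholds = thresholds[thresholds > 0]
```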
4

Garção, Tatiana Cristina Soares. "Avaliação empírica do risco de mercado: estimação do Value-at-risk pela Teoria dos Valores Extremos." Master's thesis, 2017. http://hdl.handle.net/10451/31902.

Abstract:
In recent years, financial markets have shown behaviors that have resulted in huge losses, especially for financial institutions. In that context, regulators have encouraged the implementation of preventive and risk management methodologies. Among the most popular metrics to measure risk is Value-at-Risk (VaR); however, the traditional VaR calculation methodologies assume normality and hardly accommodate the extreme occurrences of the distribution of returns. It is well known that distributions of returns of financial series tend to have heavier tails than a normal distribution. The main purpose of this dissertation is to highlight the importance of Extreme Value Theory (EVT) in the calculation of VaR, to examine the plausibility of the hypotheses required by traditional methodologies, and to assess the influence that the characteristics typically found in financial series exert on the most widely used VaR models. The possibility of achieving a significant improvement in risk control through the use of EVT is also studied. Backtesting techniques such as the Kupiec (1995) and Christoffersen (1998) tests were applied to evaluate the performance of the different VaR forecasting models. A short introduction to the main results of EVT is given, together with a set of statistics that simplify the recognition of heavy-tailed data. Tail modelling is a subject of particular interest in this dissertation, and two methods of tail modelling receive particular attention.
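As a concrete illustration of the EVT approach to VaR discussed here, the sketch below fits a generalized Pareto distribution to threshold exceedances and plugs the MLE estimates into the standard peaks-over-threshold quantile formula (the data are synthetic; this is not the dissertation's code):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
losses = rng.standard_t(df=4, size=5000)          # synthetic heavy-tailed losses

u = np.quantile(losses, 0.95)                     # POT threshold
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)  # MLE of shape (xi) and scale

# Standard POT quantile estimator for the VaR at confidence level q
q = 0.99
n, n_u = losses.size, exceedances.size
var_q = u + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)
```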

Books on the topic "Threshold dose distributions"

1

Laumbach, Robert, and Michael Gochfeld. Toxicology. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190662677.003.0007.

Abstract:
This chapter describes the basic principles of toxicology and their application to occupational and environmental health. Topics covered include pathways that toxic substances may take from sources in the environment to molecular targets in the cells of the body where toxic effects occur. These pathways include routes of exposure, absorption into the body, distribution to organs and tissues, metabolism, storage, and excretion. The various types of toxicological endpoints are discussed, along with the concepts of dose-response relationships, threshold doses, and the basis of interindividual differences and interspecies differences in response to exposure to toxic substances. The diversity of cellular and molecular mechanisms of toxicity, including enzyme induction and inhibition, oxidative stress, mutagenesis, carcinogenesis, and teratogenesis, are discussed and the chapter concludes with examples of practical applications in clinical evaluation and in toxicity testing.

Book chapters on the topic "Threshold dose distributions"

1

Fowler, Timothy. "Priority, Not Equality, of Welfare." In Liberalism, Childhood and Justice. Policy Press, 2020. http://dx.doi.org/10.1332/policypress/9781529201635.003.0005.

Abstract:
In this chapter I consider various possible distributive principles that assess what a fair distribution of welfare would look like. I reject the principle of distributive equality, because equality favours levelling down: making the lives of some people go worse while making no one's life go better. In its place I adopt the priority view, which suggests that the concern of justice should be promoting the welfare of the least advantaged children. I then consider the sufficiency principle, which holds that justice is about securing each person 'enough' and is unconcerned with advantages above this threshold. I argue that this sufficiency view should be rejected, even in its more plausible moderate forms, but that it does provide a useful intermediary role in working out the implications of prioritarianism.
2

Prasada Rao Borra, Surya, Kongara Ramanjaneyulu, and K. Raja Rajeswari. "A Robust and Oblivious Watermarking Method Using Maximum Wavelet Coefficient Modulation and Genetic Algorithm." In Modeling and Simulation in Engineering - Selected Problems. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.93832.

Abstract:
An image watermarking method using the Discrete Wavelet Transform (DWT) and a Genetic Algorithm (GA) is presented for applications such as content authentication and copyright protection. The method is robust to various image attacks, and the cover image is not needed for watermark detection/extraction. Gray scale images of size 512 × 512 as cover image and binary images of size 64 × 64 as watermark are used in the simulation of the proposed method. Watermark embedding is done in the DWT domain: 3rd and 2nd level detail sub-band coefficients are selected and arranged into blocks, where the size and number of blocks depend on the size of the watermark. One watermark bit is embedded in each block. An inverse DWT operation is then performed to obtain the watermarked image, which is used for transmission and distribution. In case of any dispute over ownership, the hidden watermark is decoded to settle the question. A threshold-based method is used for watermark extraction. Control parameters are identified and optimized with the GA for targeted performance in terms of PSNR and NCC. Performance comparison with existing works shows substantial improvement.
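To make the embedding idea concrete, here is a minimal sketch of block-wise maximum-coefficient modulation in a DWT detail sub-band. The even/odd quantization rule, the delta step, the sub-band choice, and the helper name are our illustrative assumptions, standing in for the paper's exact modulation rule and GA-optimized parameters:

```python
import numpy as np
import pywt  # PyWavelets

def embed_watermark(cover, bits, delta=8.0, wavelet='haar'):
    """Sketch: embed one bit per block by quantizing the block's strongest
    wavelet coefficient to an even or odd multiple of delta."""
    coeffs = pywt.wavedec2(cover.astype(float), wavelet, level=3)
    band = coeffs[1][2]                    # level-3 diagonal detail sub-band
    flat = band.reshape(-1)                # view into the sub-band array
    block = flat.size // bits.size         # one watermark bit per block
    for k, bit in enumerate(bits):
        seg = flat[k * block:(k + 1) * block]
        i = np.argmax(np.abs(seg))         # strongest coefficient in the block
        q = np.round(seg[i] / delta)       # quantization index
        if int(q) % 2 != int(bit):         # parity encodes the bit
            q += 1
        seg[i] = q * delta
    return pywt.waverec2(coeffs, wavelet)
```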
3

"1.2 Method of constant stimuli (Method of frequency) By the Method of Frequency the stimulus range is selected in discrete intervals so that the frequency of positive answers is distributed over the range between 1% and 99%. In general, the frequency of positive res­ ponses either for an individual or for a group, is cumulatively normally distributed over a geometric intensity continuum. The absolute odor thre­ shold can then be defined as the effective dose corresponding to an arbi­ trarily selected frequency of positive responses, ordinarily 50% : ED^: Effective dose at the 50% level. 3.1.3 Signal detection The Signal Detection principle is a determination of the relation­ ship between hits and false alarms. In determining signal detectability, a stimulus or a few stimuli are presented in random order, alternating with noise. Since sensory impressions resulting frcm the presentation of stimulus versus noise are assumed to be normally distributed over the same intensity continuum and to have the same dispersion, the index of detectability d' for p (hits) minus p (false) indicates the extent to which the two distributions overlap. 3.2 Indication of response 3.2.1 "Yes" or "no" response In the classical evaluation yes-no answers are dependent on the sub­ jects1 honesty and motivation among other factors. However, yes-no ans­ wers may be evaluated if they are presented a sufficiently large number of times alternating with blanks. 3.2.2 forced choice technique One method of controlling response perseveration and otter antici­ pation factors is to use a forced choice response indication based on two or more response categories. In the measurement of odors the panelist has to report the temporal position of positive stimuli in a series of randan blanks. If the concentration is below the threshold, the test sub­ jects will guess. As the odorant concentration will increase, the rela­ tive cumulative frequency for identification of the correct sample will be greater. In order to determine the relative odor recognition a cor­ rection must be made. 3.3 Size of stimulus intervals 3.3.1 Concentration intervals In selecting the stimulus continuum in threshold determination, the relation between just noticeable difference in relation to the intensity of stimuli is of interest. In accordance with Weber's law this quotient is assumed to be a constant. Therefore it would appear best to determine absolute thresholds on an intensity continuum in the form of a gecxnetric progression. 3.2.2 Time intervals Because of adaptation processes the exposure time until reaching a decision should be limited. Also the interval between two stimuli must be observed." In Odour Prevention and Control of Organic Sludge and Livestock Farming. CRC Press, 1986. http://dx.doi.org/10.1201/9781482286311-25.


Conference papers on the topic "Threshold dose distributions"

1

Faria, Clara Maria, Natalia M. Inada, Cristina Kurachi, and Vanderlei S. Bagnato. "Threshold dose distribution and its causes and consequences in photodynamic therapy (Conference Presentation)." In Optical Methods for Tumor Treatment and Detection: Mechanisms and Techniques in Photodynamic Therapy XXVI, edited by David H. Kessel and Tayyaba Hasan. SPIE, 2017. http://dx.doi.org/10.1117/12.2250640.

2

Raileanu, Valentin. "Utilizarea teoriei valorilor extreme în climatologie." In Starea actuală a componentelor de mediu. Institute of Ecology and Geography, Republic of Moldova, 2019. http://dx.doi.org/10.53380/9789975315593.17.

Abstract:
The article briefly describes the history and fields of application of the theory of extreme values, including climatology. The data format, the Generalized Extreme Value (GEV) probability distributions with Block Maxima, the Generalized Pareto (GP) distributions with Peaks Over Threshold (POT), and the analysis methods are presented. Estimation of the distribution parameters is done using the Maximum Likelihood Estimation (MLE) method. Installation of the free R software, the minimum set of required commands, and the GUI of the in2extRemes graphical package are described. As an example, the results of a GEV analysis of a simulated data set in in2extRemes are presented.
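The Block Maxima/GEV workflow summarized above (performed in the article with R and in2extRemes) can be mirrored with any MLE-capable statistics library; a compact sketch with synthetic data:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
daily = rng.gumbel(loc=25.0, scale=4.0, size=(50, 365))  # 50 "years" of daily values
annual_max = daily.max(axis=1)                           # Block Maxima, one per year

# MLE fit; note that scipy's shape parameter c equals -xi in the usual GEV notation
c, loc, scale = genextreme.fit(annual_max)
return_level_100 = genextreme.ppf(1 - 1/100, c, loc, scale)  # 100-year return level
```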
3

Selker, Ruud, Ping Liu, Erich Jurdik, and Jay Chaudhuri. "Out-of-Roundness of the TurkStream Project Line Pipe." In ASME 2019 38th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/omae2019-96154.

Abstract:
When using stochastic data in a probabilistic engineering assessment it is common practice to fit the central portion of the data, using for example the Normal distribution. This generally works for predicting expected behaviour; however, it does not necessarily describe extreme behaviour very well. Extreme value theory, and more specifically the peaks-over-threshold method [1], is adopted to assess the extreme behaviour represented by the distribution's tail. A generalized Pareto distribution is used to fit all samples exceeding a certain threshold value. Estimation of the distribution's shape parameter provides valuable information on whether the upper tail of the fitted distribution has "finite" (having an endpoint) or "infinite" support. Line pipe out-of-roundness affects weldability, fatigue performance and collapse resistance. Especially for deep water pipelines it is important to meet tight tolerances. Out-of-roundness data were evaluated and show finite support of the upper tail; however, the evidence is not as strong as the evidence found for the line pipe material strength [7]. Hence it is important to measure and confirm the out-of-roundness of every pipe joint during manufacture. The degree of tail support can have a major effect on the calculated return period of failure events, particularly when related to low occurrence probabilities.
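The support diagnostic described here reduces to the sign of the fitted generalized Pareto shape parameter; a minimal sketch with synthetic, bounded stand-in data (not the project's measurements):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
# Synthetic stand-in for out-of-roundness measurements (bounded above)
ovality = 10.0 * rng.beta(2, 8, size=2000)

u = np.quantile(ovality, 0.95)                 # peaks-over-threshold level
exceedances = ovality[ovality > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)

# xi < 0 implies a finite upper endpoint at u - beta/xi ("finite support");
# xi >= 0 implies an unbounded ("infinite support") upper tail
if xi < 0:
    endpoint = u - beta / xi
```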
4

Tsuboi, Kazuhiro. "Density Distribution in Stagnation Region of Saffman Equation for Dusty Gas." In ASME 2002 Pressure Vessels and Piping Conference. ASMEDC, 2002. http://dx.doi.org/10.1115/pvp2002-1545.

Abstract:
We investigate the behaviour of the flow field around an obstacle placed in a uniform particle flow based on the two-fluid Saffman equation. Particle density in the vicinity of the front stagnation point is the primary interest of the present study. In the case of small Stokes number, in which particle impingement does not occur, an exact solution for the flow field of the particle phase is obtained. A perturbed solution is also obtained in the reciprocal of the Stokes number when the Stokes number is large enough. Comparison between numerical results and these solutions shows good agreement, and the peak of particle density appears near the threshold of particle impingement on the body surface.
5

Maeda, Noriyoshi, and Tetsuo Shoji. "Failure Probability Analysis by Probabilistic Fracture Mechanics Based on FRI SCC Model." In ASME 2010 Pressure Vessels and Piping Division/K-PVP Conference. ASMEDC, 2010. http://dx.doi.org/10.1115/pvp2010-25917.

Abstract:
The failure probability of a weld due to stress corrosion cracking (SCC) in austenitic stainless steel piping was analyzed by a probabilistic fracture mechanics (PFM) approach based on an electro-chemical crack growth model (the FRI model). In this model, the crack growth rate da/dt, where a is the crack depth, is treated as the rate of a chemical corrosion process defined by electro-chemical Coulomb's law. The process is also related to the strain rate at the crack tip, taking the small scale yielding condition into consideration. The derived transcendental equation is solved numerically by an iterative method. Compared to mechanical crack growth equations such as Paris' law for SCC, the FRI model can incorporate many electro-chemical parameters, such as the electric current associated with corrosion of a newly formed SCC crack surface and the frequency of protective film rupture, as well as mechanical parameters such as the rate of change of the stress intensity factor with time, dK/dt. A stratified Monte Carlo method was introduced, which defines the cells of the sampling space by ranges of a/c (where c is the crack length at the surface) and by the width Kw of the K sampling space, which has to be defined with reference to KSCC, below which no SCC occurs. Log-normal distributions were assumed for the a/c and K distributions. The parameter survey performed shows that the failure probability, defined as the ratio of the number of cracks whose depth reached 80% of the wall thickness to the total crack number, depends on many of the parameters introduced, especially on the yield stress, the electric current decay parameter m, the strain hardening index n in the Ramberg-Osgood equation, and dK/dt. From the requirements of the FRI model, two types of threshold value of initial crack depth are proposed; cracks shallower than this value cannot grow. The calculated failure probability does not reach 1 when cracks with initial depths smaller than the threshold value are included in the distribution of analyzed cracks.
6

Soni, Sunilkumar, Santanu Das, and Aditi Chattopadhyay. "Optimal Sensor Placement for Damage Detection in Complex Structures." In ASME 2009 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. ASMEDC, 2009. http://dx.doi.org/10.1115/smasis2009-1419.

Abstract:
An optimal sensor placement methodology is proposed, based on a detection theory framework, to maximize the detection rate and minimize the false alarm rate. Minimizing the false alarm rate for a given detection rate plays an important role in improving the efficiency of a Structural Health Monitoring (SHM) system, as it reduces the number of false alarms. The placement technique is such that the sensor features are as directly correlated with, and as sensitive to, damage as possible. The technique accounts for a number of factors, such as actuation frequency and strength, minimum damage size, damage detection scheme, material damping, signal to noise ratio (SNR) and sensing radius. These factors are not independent and affect each other. Optimal sensor placement is done in two steps. First, a sensing radius is calculated within which any detectable change caused by a perturbation above a certain threshold can be captured. This threshold value is based on a Neyman-Pearson detector that maximizes the detection rate for a fixed false alarm rate. To avoid sensor redundancy, a criterion to minimize the sensing region overlaps of neighboring sensors is defined. Based on the sensing region and the minimum overlap concept, the number of sensors needed on a structural component is calculated. In the second step, a damage distribution pattern, known as the probability of failure distribution, is calculated for the structural component using finite element analysis. This failure distribution helps in selecting the most sensitive sensor locations, thereby removing those making remote contributions to the overall detection scheme.
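For Gaussian-distributed sensor features, the Neyman-Pearson step described above amounts to fixing the decision threshold at the appropriate quantile under the no-damage hypothesis; a minimal sketch with assumed feature statistics:

```python
from scipy.stats import norm

# Sensor feature modeled as N(mu0, sigma) without damage, N(mu1, sigma) with it
mu0, mu1, sigma = 0.0, 1.5, 1.0     # illustrative values
p_fa = 0.01                          # fixed false-alarm budget

# Neyman-Pearson: the threshold that yields exactly p_fa under H0 maximizes
# the detection rate for that false-alarm rate
tau = norm.ppf(1 - p_fa, loc=mu0, scale=sigma)
p_detect = norm.sf(tau, loc=mu1, scale=sigma)
```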
7

Machado-Damhaug, Ulla, Finn-Christian W. Hanssen, Maren Brunborg, Håkon Storheim, and Sverre Haver. "Air Gap Assessment of Semi-Submersibles: Efficient Utilization of Model Test Data." In ASME 2020 39th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/omae2020-18294.

Abstract:
In linear air gap analysis of semi-submersibles, the surface elevation is modified by an asymmetry factor to account for the crest-to-trough asymmetry related to wave nonlinearity and the effect of nonlinear diffraction. The asymmetry factor varies with numerous conditions including the position relative to the semi-submersible, the sea state, the relative wave direction and the semi-submersible's loading condition. Although simplified values for the asymmetry factor are suggested in rules and guidelines such as DNVGL-OTG-13, model tests should ideally be performed in order to get accurate values for each specific design. When considering nonlinear responses, model tests are generally done according to the contour line approach, where several realizations (seeds) of a few critical sea states are performed, using the 3-hour maximum from each of these to fit a Gumbel distribution. We here seek to explore whether the number of seeds can be optimized when considering the relative wave elevation (upwell) by using the less data-wasteful Peaks-Over-Threshold method.
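In outline, the contour-line bookkeeping described here (one 3-hour maximum per seed, fitted to a Gumbel distribution) looks as follows, with synthetic maxima standing in for model-test data:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(5)
# One 3-hour maximum upwell per seed of the same sea state (synthetic values)
seed_maxima = rng.gumbel(loc=10.0, scale=1.2, size=20)

loc, scale = gumbel_r.fit(seed_maxima)           # MLE Gumbel fit
p90 = gumbel_r.ppf(0.90, loc, scale)             # e.g., 90th-percentile extreme upwell
```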
8

Magdoom, K. N., Thomas H. Mareci, and Malisa Sarntinoranont. "Segmentation of Rat Brain MR Images Using Artificial Neural Network Classifier." In ASME 2013 Summer Bioengineering Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/sbc2013-14399.

Abstract:
Recently, MR image based computational models have been developed to assist targeted drug delivery in the brain by helping determine appropriate catheter position and drug dose, among other parameters, to achieve the desired drug distribution [1–3]. Such planning may be important to prevent damage to healthy tissue, because many of the drugs (e.g. chemotherapeutic agents) are toxic and need to be concentrated in specific regions of interest (e.g. a tumor). However, for an image based model to make accurate predictions, it is important to segment the image and assign appropriate tissue properties, such as hydraulic conductivity, which are known to vary significantly within the brain. For example, it has been found experimentally that drugs injected into brain parenchyma are preferentially transported along white matter tracts compared with gray matter regions [4]. Segmenting MR images is a challenging task, since pixel intensities of different regions often overlap; hence traditional approaches based on thresholds might not provide reliable results. In this study, we used a multi-layered perceptron (MLP) neural network to segment rat brain MR images into three regions: white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF).
9

Jackson, M. J. "Non-Traditional Micromachining Using Pulsed Liquids." In ASME 2006 International Manufacturing Science and Engineering Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/msec2006-21012.

Abstract:
Non-traditional machining using the energy afforded by pulsed liquid impacts is currently being applied to the machining of materials at the microscale. This paper discusses the theoretical modelling of liquid impact machining thresholds as a result of temporal and spatial distributions of transient stresses in elastically deformable materials. The model predicts changes in the response characteristics of materials due to an idealised representation of a liquid droplet impacting a plane surface. The analytical approach used does not include the secondary effects of liquid impact and is therefore only applicable to the first stages of impact, where the compressibility of the liquid droplet is most significant. The predicted response characteristics are compared with experimental data generated using a specially constructed micromachining center. The predicted response of a model material compares well with the experimental results. The results presented in this paper illustrate the importance of the energy provided by pulsed liquid impacts in removing material at the microscale. The secondary effects of liquid droplet dispersion are also illustrated, and the mechanism of material removal during liquid droplet dispersion is described in detail.
10

Laurinat, James E., Matthew R. Kesterson, and Steve J. Hensel. "Pressurization Analysis for Flame Heating of a Screw Top Utility Can Loaded With Plutonium Oxide Powder." In ASME 2016 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/pvp2016-63120.

Abstract:
The documented safety analysis for the Savannah River Site (SRS) evaluates the consequences of a postulated 1000 °C (1273 K) fire in a glovebox. The radiological dose consequences for a pressurized release of plutonium oxide powder during such a fire depend on the maximum pressure that is attained inside the oxide storage containers. To enable evaluation of the dose consequences, temperature and pressure transients have been calculated for exposure of a typical set of storage containers to the fire. The oxide storage configuration selected for analysis is can/bag/can, comprised of oxide powder inside an 8.38E−6 m3 stainless steel B vial inside 0.006 kg of polyethylene bagging inside a one-quart screw top utility can of the type commonly used to package solvents or rubber cements. The analysis accounts for pressurization from gases generated by pyrolysis of the polyethylene bagging and evaporation of moisture adsorbed onto the oxide powder. Results were obtained for different can orientations and different surface fire exposures, with and without initial pressurization of the B vial by hydrogen from the radiolysis of moisture. Based on the results of hydrogen back pressure tests for plutonium oxide powders loaded with moisture, the initial gauge pressure from radiolytic hydrogen was set at a bounding value of 82 psig (5.65E5 Pa). The pressurization analysis credits venting to and from the B vial but does not credit venting or leakage from the can. Calculated maximum gauge pressures inside the utility can range from 1.98E5 Pa for an upright can exposed to fire on only one side, to 7.78E5 Pa for an upright can engulfed by fire. Maximum gauge pressures inside the B vial vary from 1.36E5 to 1.43E6 Pa. Due to the low rate of venting from the B vial into the can gas space, the can pressure is nearly independent of the B vial pressure. Calculated maximum pressures are compared to the utility can burst pressure. In lieu of an analytic structural analysis of the utility cans, burst pressures and leakage rates were measured using compressed nitrogen gas. Leakage of gas through the can lid thread and seams prevented the test apparatus from reaching the burst pressure. To achieve the burst pressure, it was necessary to seal the can lid threads and seams by brazing. The measured gauge burst pressure was 2.50E5 +/− 0.43E5 Pa. The measured burst pressures are lower than the calculated maximum pressure due to fire exposure, indicating that the utility cans could burst during exposure to a 1000 °C (1273 K) fire. Leakage rates were measured for cans initially pressurized to a gauge pressure of 1.24E5 Pa. The measured leakage rates were found to be proportional to the gauge pressure inside the can, with a time constant for leakage of 0.424 +/− 0.010 reciprocal seconds. The leakage time constants follow a threshold Weibull distribution.

Reports on the topic "Threshold dose distributions"

1

Nolan, Brian, Brenda Gannon, Richard Layte, Dorothy Watson, Christopher T. Whelan, and James Williams. Monitoring Poverty Trends in Ireland: Results from the 2000 Living in Ireland survey. ESRI, 2002. http://dx.doi.org/10.26504/prs45.

Abstract:
This study is the latest in a series monitoring the evolution of poverty, based on data gathered by The ESRI in the Living in Ireland Surveys since 1994. These have allowed progress towards achieving the targets set out in the National Anti Poverty Strategy since 1997 to be assessed. The present study provides an updated picture using results from the 2000 round of the Living in Ireland survey. The numbers interviewed in the 2000 Living in Ireland survey were enhanced substantially, to compensate for attrition in the panel survey since it commenced in 1994. Individual interviews were conducted with 8,056 respondents. Relative income poverty lines do not on their own provide a satisfactory measure of exclusion due to lack of resources, but do nonetheless produce important key indicators of medium to long-term background trends. The numbers falling below relative income poverty lines were most often higher in 2000 than in 1997 or 1994. The income gap for those falling below these thresholds also increased. By contrast, the percentage of persons falling below income lines indexed only to prices (rather than average income) since 1994 or 1997 fell sharply, reflecting the pronounced real income growth throughout the distribution between then and 2000. This contrast points to the fundamental factors at work over this highly unusual period: unemployment fell very sharply and substantial real income growth was seen throughout the distribution, including social welfare payments, but these lagged behind income from work and property so social welfare recipients were more likely to fall below thresholds linked to average income. The study shows an increasing probability of falling below key relative income thresholds for single person households, those affected by illness or disability, and for those who are aged 65 or over - many of whom rely on social welfare support. Those in households where the reference person is unemployed still face a relatively high risk of falling below the income thresholds but continue to decline as a proportion of all those below the lines. Women face a higher risk of falling below those lines than men, but this gap was marked among the elderly. The study shows a marked decline in deprivation levels across different household types. As a result consistent poverty, that is the numbers both below relative income poverty lines and experiencing basic deprivation, also declined sharply. Those living in households comprising one adult with children continue to face a particularly high risk of consistent poverty, followed by those in families with two adults and four or more children. The percentage of adults in households below 70 per cent of median income and experiencing basic deprivation was seen to have fallen from 9 per cent in 1997 to about 4 per cent, while the percentage of children in such households fell from 15 per cent to 8 per cent. Women aged 65 or over faced a significantly higher risk of consistent poverty than men of that age. Up to 2000, the set of eight basic deprivation items included in the measure of consistent poverty were unchanged, so it was important to assess whether they were still capturing what would be widely seen as generalised deprivation. Factor analysis suggested that the structuring of deprivation items into the different dimensions has remained remarkably stable over time. 
Combining low income with the original set of basic deprivation indicators did still appear to identify a set of households experiencing generalised deprivation as a result of prolonged constraints in terms of command over resources, and distinguished from those experiencing other types of deprivation. However, on its own this does not tell the whole story - like purely relative income measures - nor does it necessarily remain the most appropriate set of indicators looking forward. Finally, it is argued that it would now be appropriate to expand the range of monitoring tools to include alternative poverty measures incorporating income and deprivation. Levels of deprivation for some of the items included in the original basic set were so low by 2000 that further progress will be difficult to capture empirically. This represents a remarkable achievement in a short space of time, but poverty is invariably reconstituted in terms of new and emerging social needs in a context of higher societal living standards and expectations. An alternative set of basic deprivation indicators and measure of consistent poverty is presented, which would be more likely to capture key trends over the next number of years. This has implications for the approach adopted in monitoring the National Anti-Poverty Strategy. Monitoring over the period to 2007 should take a broader focus than the consistent poverty measure as constructed to date, with attention also paid to both relative income and to consistent poverty with the amended set of indicators identified here.