
Dissertations / Theses on the topic 'Wavelet Image denoising'



Consult the top 35 dissertations / theses for your research on the topic 'Wavelet Image denoising.'




1

Ghazel, Mohsen. "Adaptive Fractal and Wavelet Image Denoising." Thesis, University of Waterloo, 2004. http://hdl.handle.net/10012/882.

Full text
Abstract:
The need for image enhancement and restoration is encountered in many practical applications. For instance, distortion due to additive white Gaussian noise (AWGN) can be caused by poor quality image acquisition, images observed in a noisy environment, or noise inherent in communication channels. In this thesis, image denoising is investigated. After reviewing standard image denoising methods as applied in the spatial, frequency, and wavelet domains of the noisy image, the thesis develops and experiments with new image denoising methods based on fractal and wavelet transforms. In particular, three new image denoising methods are proposed: context-based wavelet thresholding, predictive fractal image denoising, and fractal-wavelet image denoising. The proposed context-based thresholding strategy adopts localized hard and soft thresholding operators which take into consideration the content of an immediate neighborhood of a wavelet coefficient before thresholding it. The two fractal-based predictive schemes are based on a simple yet effective algorithm for estimating the fractal code of the original noise-free image from the noisy one. From this predicted code, one can then reconstruct a fractally denoised estimate of the original image. This fractal-based denoising algorithm can be applied in the pixel and the wavelet domains of the noisy image using standard fractal and fractal-wavelet schemes, respectively. Furthermore, the cycle spinning idea was implemented in order to enhance the quality of the fractally denoised estimates. Experimental results show that the proposed image denoising methods are competitive with, and sometimes compare favorably to, the existing image denoising techniques reviewed in the thesis. This work broadens the application scope of fractal transforms, which have been used mainly for image coding and compression purposes.
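For readers unfamiliar with the hard and soft thresholding operators that the context-based strategy above localizes, here is a minimal sketch of the two standard rules (textbook definitions in Python/NumPy, not the thesis's own code; the function names and the example threshold are illustrative):

```python
import numpy as np

def hard_threshold(coeffs, t):
    # Keep coefficients whose magnitude exceeds the threshold t; zero out the rest.
    return coeffs * (np.abs(coeffs) > t)

def soft_threshold(coeffs, t):
    # Shrink coefficient magnitudes toward zero by t ("kill or shrink").
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

# Example on a toy coefficient vector with t = 1.0
w = np.array([-2.5, -0.4, 0.1, 0.9, 3.0])
print(hard_threshold(w, 1.0))  # [-2.5  0.   0.   0.   3. ]
print(soft_threshold(w, 1.0))  # [-1.5  0.   0.   0.   2. ]
```

The thesis's context-based strategy chooses between, and localizes, such operators per coefficient; the sketch only shows the underlying operators.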
2

Tuncer, Guney. "A Java Toolbox For Wavelet Based Image Denoising." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12608037/index.pdf.

Full text
Abstract:
Wavelet methods for image denoising have become widespread over the last decade. The effectiveness of this denoising scheme is influenced by many factors, the main ones being the choice of wavelet, the threshold determination, and the selection of the transform level for thresholding. For threshold calculation, one of the classical solutions is the Wiener filter as a linear estimator; another is VisuShrink, which applies a global threshold in the nonlinear setting. The purpose of this work is to develop a Java toolbox that is used to find the best denoising schemes for distinct image types, particularly Synthetic Aperture Radar (SAR) images. This is accomplished by comparing these basic methods with well-known data-adaptive thresholding methods such as SureShrink, BayesShrink, Generalized Cross Validation and Hypothesis Testing. Some non-wavelet denoising processes are also introduced. Along with simple mean and median filters, the more statistically adaptive median, Lee, Kuan and Frost filtering techniques are also tested to complement the wavelet-based denoising scheme. All of these wavelet-based methods and some traditional methods are implemented in pure Java code using the plug-in concept of ImageJ, a popular image processing tool written in Java.
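As a rough illustration of the VisuShrink (universal) threshold named above, here is a minimal sketch assuming the PyWavelets package and the usual median-based noise estimate; it is not the toolbox's Java implementation, and the function name and default parameters are illustrative:

```python
import numpy as np
import pywt

def visushrink_denoise(image, wavelet="db4", level=3):
    """image: 2D NumPy array corrupted by additive Gaussian noise."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Robust noise estimate from the finest diagonal subband.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    # Universal (VisuShrink) threshold: sigma * sqrt(2 ln N), N = number of pixels.
    t = sigma * np.sqrt(2.0 * np.log(image.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, t, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, wavelet)
```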
3

Aparnnaa. "Image Denoising and Noise Estimation by Wavelet Transformation." Kent State University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=kent1555929391906805.

Full text
4

Liao, Zhiwu. "Image denoising using wavelet domain hidden Markov models." HKBU Institutional Repository, 2005. http://repository.hkbu.edu.hk/etd_ra/616.

Full text
5

Quan, Jin. "Image Denoising of Gaussian and Poisson Noise Based on Wavelet Thresholding." University of Cincinnati / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1380556846.

Full text
6

Balster, Eric J. "Video compression and rate control methods based on the wavelet transform." Columbus, Ohio : Ohio State University, 2004. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1086098540.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2003. Title from first page of PDF file. Document formatted into pages; contains xxv, 142 p.; also includes graphics. Includes abstract and vita. Advisor: Yuan F. Zheng, Dept. of Electrical and Computer Engineering. Includes bibliographical references (p. 135-142).
7

Kim, Il-Ryeol. "Wavelet domain partition-based signal processing with applications to image denoising and compression." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file 2.98 Mb., 119 p, 2006. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&res_dat=xri:pqdiss&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft_dat=xri:pqdiss:3221054.

Full text
8

Cheng, Wei. "Studies on NDT Image Denoising by Wavelet Transform and Self-Orgnizing Maps." 京都大学 (Kyoto University), 2004. http://hdl.handle.net/2433/147636.

Full text
9

Silwal, Sharad Deep. "Bayesian inference and wavelet methods in image processing." Manhattan, Kan. : Kansas State University, 2009. http://hdl.handle.net/2097/2355.

Full text
10

Akyay, Tolga. "Wavelet-based Outlier Detection And Denoising Of Airborne Laser Scanning Data." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12610164/index.pdf.

Full text
Abstract:
The method of airborne laser scanning, also known as LIDAR, has recently turned out to be an efficient way of generating high-quality digital surface and elevation models. In this work, wavelet-based outlier detection and different wavelet thresholding (wavelet shrinkage) methods for denoising of airborne laser scanning data are discussed. The task is to investigate the effect of wavelet-based outlier detection and to find out which wavelet thresholding methods provide the best denoising results for post-processing. Data and results are analyzed and visualized using a MATLAB program which was developed during this work.
11

Matoušek, Luděk. "Waveletová analýza a zvýrazňování MR tomografických a ultrazvukových obrazů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2008. http://www.nusl.cz/ntk/nusl-217525.

Full text
Abstract:
Tomographic MR (Magnetic Resonance) and sonographic biosignal processing are important non-invasive diagnostic methods used in medicine. Noise added to the processed data by the amplifier of the tomograph's receiving section and by the circuits of the sonograph degrades the diagnosis of body organs. The image data are stored in the standardized DICOM medical file format. In this work, methods using wavelet analysis for noise suppression in image data have been designed and compared with classical methods. MATLAB was used for data processing and for writing the data back to the DICOM format.
12

Mucha, Martin. "Moderní směrové způsoby reprezentace obrazů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-220200.

Full text
Abstract:
Transformation methods describe an image in terms of defined shapes, which are called bases or frames. Thanks to these shapes, the image can be transformed with the help of calculated transformation coefficients and processed further: it is possible to denoise the image, reconstruct it, transform it, and so on. There are several types of image processing methods, and this field has seen significant development. This study focuses on analyzing the characteristics of individual well-known transformation methods, such as the Fourier and wavelet transforms. For comparison, newly chosen transformation methods are also described: Ripplet, Curvelet, Surelet, Tetrolet, Contourlet and Shearlet. Functional toolboxes were used to compare the individual methods and their characteristics; these toolboxes were modified to allow limiting the transformation coefficients for their potential use in subsequent reconstruction.
13

Zátyik, Ján. "Směrové reprezentace obrazů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2011. http://www.nusl.cz/ntk/nusl-218921.

Full text
Abstract:
Various methods describe an image by specific shapes, which are called bases or frames. Using these bases, the image can be transformed into a representation given by transformation coefficients. The aim is to describe the image by a small number of coefficients, obtaining a so-called sparse representation. This feature can be used, for example, for image compression. However, bases are not able to describe all the shapes that may appear in the image, and this shortcoming increases the number of transformation coefficients needed to describe it. The aim of this thesis is to study the general principle of calculating the transformation coefficients and to compare classical methods of image analysis with some of the new methods. It compares the effectiveness of methods for image reconstruction from a limited number of coefficients and from a noisy image, and also compares an image interpolation method that combines the characteristics of two different transformations with bicubic interpolation. The theoretical part describes the transformation methods, covering aspects of multi-resolution, localization in the time and frequency domains, redundancy and directionality, and gives examples of the transformations on a particular image. The practical part compares the efficiency of the Fourier, Wavelet, Contourlet, Ridgelet, Radon, Wavelet Packet and WaveAtom transforms in image reconstruction from a limited number of the most significant transformation coefficients, as well as their denoising ability with thresholding techniques applied to the transformation coefficients. The last section deals with image interpolation by combining two methods and compares the results with classical bicubic interpolation.
14

Malek, Mohamed. "Extension de l'analyse multi-résolution aux images couleurs par transformées sur graphes." Thesis, Poitiers, 2015. http://www.theses.fr/2015POIT2304/document.

Full text
Abstract:
In this work, we studied the extension of multi-resolution analysis to color images by using transforms on graphs. In this context, we deployed three different analysis strategies. Our first approach consists of computing the graph of an image using psychovisual information and analyzing it with the spectral graph wavelet transform (SGWT); we thus define a wavelet transform based on a graph with perceptual information by using the CIELab color distance. Results in image restoration highlight the interest of an appropriate use of color information. In the second strategy, we propose a novel recovery algorithm for image inpainting represented in the graph domain. Motivated by the efficiency of wavelet regularization schemes and the success of non-local means methods, we construct an algorithm based on the recovery of information in the graph wavelet domain: at each step, the damaged structure is estimated by computing the non-local graph of color patches, and the graph wavelet regularization model is then applied using the SGWT coefficients. The results are very encouraging and highlight the importance of taking the human visual system into account. In the last strategy, we propose a new approach for decomposing a signal defined on a complete graph, based on exploiting the properties of the Laplacian matrix associated with the complete graph. In the context of color images, taking the color dimension into account is essential to identify the singularities of the image. This last approach opens new perspectives for an in-depth study of its behavior.
15

Oliveira, Helder Cesar Rodrigues de. "Proposta de redução da dose de radiação na mamografia digital utilizando novos algoritmos de filtragem de ruído Poisson." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/18/18152/tde-29032016-160603/.

Full text
Abstract:
The aim of this work is to present a novel method for removing Poisson noise from digital mammography images acquired with a reduced radiation dose. X-ray mammography is known to be the most effective exam for early detection of breast cancer, greatly increasing the chances of curing the disease. However, the radiation absorbed by the patient during the exam is still a problem to be addressed: some studies have shown that mammography can induce breast cancer in a small number of women. Although this number is significantly low compared to the number of women who are saved by the exam, it is important to develop methods that enable a reduction of the radiation dose used in the exam. However, dose reduction decreases image quality by lowering the signal-to-noise ratio, impairing medical diagnosis and the early detection of the disease. In this sense, the purpose of this study is to propose a new method to reduce the Poisson noise in mammographic images acquired with a low radiation dose, so that they achieve a quality equivalent to images acquired with the standard dose. The proposed algorithm is based on adaptations of well-established algorithms from the literature, namely filtering in the wavelet domain, here using shrink-thresholding (WTST), and Block-Matching and 3D Filtering (BM3D). Results using phantom images and clinical images showed that the proposed method is capable of filtering the additional noise incorporated in the images without apparent loss of information.
16

Zhao, Fangwei. "Multiresolution analysis of ultrasound images of the prostate." University of Western Australia. School of Electrical, Electronic and Computer Engineering, 2004. http://theses.library.uwa.edu.au/adt-WU2004.0028.

Full text
Abstract:
[Truncated abstract] Transrectal ultrasound (TRUS) has become the urologist’s primary tool for diagnosing and staging prostate cancer due to its real-time and non-invasive nature, low cost, and minimal discomfort. However, the interpretation of a prostate ultrasound image depends critically on the experience and expertise of a urologist and is still difficult and subjective. To overcome the subjective interpretation and facilitate objective diagnosis, computer aided analysis of ultrasound images of the prostate would be very helpful. Computer aided analysis of images may improve diagnostic accuracy by providing a more reproducible interpretation of the images. This thesis is an attempt to address several key elements of computer aided analysis of ultrasound images of the prostate. Specifically, it addresses the following tasks: 1. modelling B-mode ultrasound image formation and statistical properties; 2. reducing ultrasound speckle; and 3. extracting prostate contour. Speckle refers to the granular appearance that compromises the image quality and resolution in optics, synthetic aperture radar (SAR), and ultrasound. Due to the existence of speckle the appearance of a B-mode ultrasound image does not necessarily relate to the internal structure of the object being scanned. A computer simulation of B-mode ultrasound imaging is presented, which not only provides an insight into the nature of speckle, but also a viable test-bed for any ultrasound speckle reduction methods. Motivated by analysis of the statistical properties of the simulated images, the generalised Fisher-Tippett distribution is empirically proposed to analyse statistical properties of ultrasound images of the prostate. A speckle reduction scheme is then presented, which is based on Mallat and Zhong’s dyadic wavelet transform (MZDWT) and modelling statistical properties of the wavelet coefficients and exploiting their inter-scale correlation. Specifically, the squared modulus of the component wavelet coefficients are modelled as a two-state Gamma mixture. Interscale correlation is exploited by taking the harmonic mean of the posterior probability functions, which are derived from the Gamma mixture. This noise reduction scheme is applied to both simulated and real ultrasound images, and its performance is quite satisfactory in that the important features of the original noise corrupted image are preserved while most of the speckle noise is removed successfully. It is also evaluated both qualitatively and quantitatively by comparing it with median, Wiener, and Lee filters, and the results revealed that it surpasses all these filters. A novel contour extraction scheme (CES), which fuses MZDWT and snakes, is proposed on the basis of multiresolution analysis (MRA). Extraction of the prostate contour is placed in a multi-scale framework provided by MZDWT. Specifically, the external potential functions of the snake are designated as the modulus of the wavelet coefficients at different scales, and thus are “switchable”. Such a multi-scale snake, which deforms and migrates from coarse to fine scales, eventually extracts the contour of the prostate
17

Didas, Stephan [Verfasser], and Joachim [Akademischer Betreuer] Weickert. "Denoising and enhancement of digital images : variational methods, integrodifferential equations, and wavelets / Stephan Didas. Betreuer: Joachim Weickert." Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2011. http://d-nb.info/105105673X/34.

Full text
18

Bouyrie, Mathieu. "Restauration d'images de noyaux cellulaires en microscopie 3D par l'introduction de connaissance a priori." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLA032/document.

Full text
Abstract:
In this document, we present a method for restoring 3D images of fluorescent cell nuclei acquired by two-photon laser scanning microscopy of animals observed in vivo and in toto during their embryonic development. Image deterioration can be explained by the limitations of the optical system, the intrinsic noise of the detection system, and the absorption and scattering of light through the depth of the tissue. Unlike state-of-the-art denoising proposals, the proposed method takes the particularities of the biological data into account. It is a 3D adaptation of an algorithm previously used in astronomical image analysis and exploits prior knowledge about the images under study: the signal is assumed to be corrupted by Mixed Poisson-Gaussian (MPG) noise, and the observed objects, here embryonic cell nuclei, are assumed to be quasi-spherical. The 3D implementation must take into account the dimensions of the image sampling grid, which are not identical in the three spatial directions, so a spherical object sampled on this grid loses that characteristic. To adapt the method to such a grid, we reinterpreted the filtering process at the core of the original theory as a physical diffusion process.
19

Schmitt, Jeremy. "Déconvolution Multicanale et Détection de Sources en utilisant des représentations parcimonieuses : application au projet Fermi." Phd thesis, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00670302.

Full text
Abstract:
This thesis presents new methodologies for the analysis of Poisson data on the sphere, in the context of the Fermi mission. The main objectives of the Fermi mission, the study of the diffuse galactic background and the construction of the source catalogue, are complicated by the weakness of the photon flux and by the effects of the measuring instrument. The thesis introduces a new multiscale representation of Poisson data on the sphere, the Multi-Scale Variance Stabilizing Transform on the Sphere (MS-VSTS), which combines a multiscale transform on the sphere (wavelets, curvelets) with a variance stabilizing transform (VST). This method is applied to single-channel and multichannel Poisson noise removal, to the interpolation of missing data, to the extraction of a background model, and to multichannel deconvolution. Finally, the thesis addresses the problem of component separation using sparse representations (template fitting).
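For context, the best-known variance stabilizing transform (VST) for Poisson data is the Anscombe transform, $A(x) = 2\sqrt{x + 3/8}$, which maps Poisson counts $x$ to values of approximately unit variance so that Gaussian-noise tools can be applied; the MS-VSTS described above generalizes this idea by coupling a VST with multiscale (wavelet, curvelet) coefficients on the sphere. The formula is quoted only as background and is not the thesis's exact operator.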
20

Cho, Dongwook. "Image denoising using wavelet transforms." Thesis, 2004. http://spectrum.library.concordia.ca/8141/1/MQ94737.pdf.

Full text
Abstract:
Image denoising is a fundamental process in the image processing, pattern recognition, and computer vision fields. The main goal of image denoising is to enhance or restore a noisy image and help other systems (or humans) to understand it better. In this thesis, we discuss some efficient approaches to image denoising using wavelet transforms. Since Donoho proposed a simple thresholding method, many different approaches have been suggested over the past decade, and they have shown that denoising using wavelet transforms produces superb results. This is because the wavelet transform has a compaction property: the signal is concentrated in a small number of large coefficients, leaving a large number of small coefficients. In the first part of the thesis, some important wavelet transforms for image denoising and a literature review of the existing methods are described. In the latter part, we propose two different approaches to image denoising. The first approach takes advantage of the higher-order statistical coupling between neighbouring wavelet coefficients and their corresponding coefficients at the parent level, with effective translation-invariant wavelet transforms. The other is based on multivariate statistical modeling, and the clean coefficients are estimated by a general rule using a Bayesian approach. Various estimation expressions can be obtained depending on the a priori probability distribution, here the multivariate generalized Gaussian distribution (MGGD), and the method can take various related information into account. The experimental results show that both of our methods give comparatively higher PSNR and fewer visual artifacts than other methods.
21

Mu-Yen, Chen, and 陳木炎. "Radar Image Denoising with Wavelet Packets." Thesis, 1995. http://ndltd.ncl.edu.tw/handle/02378868527279744430.

Full text
Abstract:
碩士<br>國立海洋大學<br>航海技術學系<br>83<br>In this thesis, the frame of radar PPI image is treated as a whole for denoising via wavelet packets. To obtain a clean scan for radar observation, the object boundary in PPI image is considered more important than its texture content. Therefore, the complexity of denoising is reduced. Iterated noise estimation (INE) and recursively coefficients thresholding and reconstructing (RCTR) are two major steps is our denoising approach. It is shown that RCTR has better performance than the traditional inverse wavelet transform when the same coefficients thresholding is applied. Moreover, the performance is insensitive to the estimation error of noise variance by INE.
22

LIN, SIN-HONG, and 林信宏. "A wavelet-based image denoising method." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/9qf5dk.

Full text
Abstract:
碩士<br>國立臺北科技大學<br>自動化科技研究所<br>107<br>The efficient representation of edges is key to improving the image denoising performance. This motivates us to capture the edges and calculate them with a wavelet transform. A novel image denoising method is proposed by exploiting the image edges information and the multidirectional shrinkage. The image edges preservation effect is achieved by applying the main direction in the wavelet transform. Since the image is perform wavelet transform in different directions, for each pixel we obtain many different estimates, one of which is optimal. A noisy image is decomposed into subbands of LL, LH, HL, and HH in wavelet domain. LL subband contains the low frequency coefficients along with less noise, which can be easily eliminated using TV-based method. More edges and other detailed information like textures are contained in the other three subbands, and we propose a shrinkage method based on the local variance to extract them from high frequency noise. And apply adaptive threshold shrinkage to denoising. The final denoised output is obtained by a weighted averaging of all individual estimates. Experimental results show that our method, compared with other wavelet-based denoising algorithms, can effectively remove noise and preserve detail information such as edges and textures.
23

Lee, Yu-lun, and 李育倫. "Diffusion Weighted Image Denoising by Wavelet Transform." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/45251434655663101996.

Full text
Abstract:
碩士<br>國立雲林科技大學<br>工業工程與管理研究所碩士班<br>99<br>Diffusion tensor imaging (DTI) is one of very common medical imaging, which composed by a series Diffusion weighted imaging. It is a non-invasive technique and it can provide the direction of nerve fiber in our brain. However, some noises, which generated by DTI rapid imaging, patients state and other factors, reduce the image quality of DWI. Noises make tractography produce incorrect result. In this study, we add noises in the simulation images, and use wavelet transform and anisotropic filter to denoising. We compare the performance of two filters, and choice a better filter used in practical images. The result of simulation data show that, wavelet transform significantly lower than anisotropic filter in average error. After denoising, the amplitude of measure index FA value average error shrink from 0.00832 to 0.001964, and MD value average error form 0.008096 to 0.002108, the average of measure index can be close to simulation image. There have four patients with metastatic tumor and four normal subjects in practical data. The regions of interest (ROI) include region of tumor, region of edema, region of edema symmetrical and region of white matter in normal subjects. We compute the measure index FA and MD, draw scatter plots, and then we compare the results. It show that, in different areas test, FA and MD value significantly different after denoising, in different subjects test, FA value significantly different after denoising, and MD value do not significantly different after denoising. The average deviation of FA value is 0.079775 and MD value is 0.038185. The result can be one of reference for medical imaging quantitative basis.
24

Bai, Rong. "Wavelet Shrinkage Based Image Denoising using Soft Computing." Thesis, 2008. http://hdl.handle.net/10012/3876.

Full text
Abstract:
Noise reduction is an open problem and has received considerable attention in the literature for several decades. Over the last two decades, wavelet based methods have been applied to the problem of noise reduction and have been shown to outperform the traditional Wiener filter, median filter, and modified Lee filter in terms of mean squared error (MSE), peak signal-to-noise ratio (PSNR) and other evaluation methods. In this research, two approaches for the development of high performance denoising algorithms are proposed, both based on soft computing tools such as fuzzy logic, neural networks, and genetic algorithms. First, an improved additive noise reduction method for digital grey scale natural images, which uses an interval type-2 fuzzy logic system to shrink wavelet coefficients, is proposed. This method is an extension of a recently published approach for additive noise reduction using a type-1 fuzzy logic system based wavelet shrinkage. Unlike the type-1 fuzzy logic system based wavelet shrinkage method, the proposed approach employs a thresholding filter that adjusts the wavelet coefficients according to the linguistic uncertainty in neighborhood values, inter-scale dependencies and intra-scale correlations of wavelet coefficients at different resolutions, by exploiting interval type-2 fuzzy set theory. Experimental results show that the proposed approach can efficiently and rapidly remove additive noise from digital grey scale images. Objective analysis and visual observations show that the proposed approach outperforms current fuzzy non-wavelet methods and fuzzy wavelet based methods, and is comparable with some recent but more complex wavelet methods, such as the Hidden Markov Model based additive noise reduction method. The main differences between the proposed approach and other wavelet shrinkage based approaches, and the main improvements of the proposed approach, are also illustrated in this thesis. Second, another improved method of additive noise reduction is proposed, based on fusing the results of different filters using a Fuzzy Neural Network (FNN). The proposed method combines the advantages of these filters and has an outstanding ability to smooth out additive noise while effectively preserving the details of an image (e.g. edges and lines). A Genetic Algorithm (GA) is applied to choose the optimal parameters of the FNN. The experimental results show that the proposed method is powerful for removing noise from natural images, and the MSE of this approach is lower, and its PSNR higher, than those of any of the individual filters used for fusion. Finally, the two proposed approaches are compared with each other from different points of view, such as objective analysis in terms of mean squared error (MSE), peak signal-to-noise ratio (PSNR), an image quality index (IQI) based on quality assessment of distorted images, and an Information Theoretic Criterion (ITC) based on a human vision model, as well as computational cost, universality, and human observation. The results show that the proposed FNN based algorithm optimized by GA has the best performance among all tested approaches. Important considerations for these proposed approaches and future work are discussed.
25

Lin, You-Yu, and 林祐宇. "Image Denoising by Multi-wavelet with Finite Element." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/dxps65.

Full text
Abstract:
碩士<br>國立清華大學<br>通訊工程研究所<br>107<br>Denoising is an important part of image process for a long time. When we are capturing or transmitting the images, they are often effected by different kinds of noise. For examples, the dirty camera lens will increase the noise of images, the blurring images caused by moving quickly, and so on. There are many cases that will make the images not clear (also called noise). If we do some further analysis on the noising images, it still hard to get the accurate conclusion. Consequently, image denoising will be an essential and important method on image process. There are some famous methods on image denoising now, such as linear or non-linear noise filter, Fourier transform (from frequency-domain to frequency-domain), wavelet transform, etc. Multi-wavelet transform has been emphasized by researchers these years because multi-wavelet transfers the data from time-domain to time-domain. After multi-wavelet transform, we can obtain lots of information from time-domain which can not obtain by Fourier transform. Furthermore, multi-wavelet transform contains the property of compression, which is helpful for the era of big data. Compared to wavelet transform, multi-wavelet transform can contain lots properties at once, such as orthogonality, symmetric, anti-symmetric, short support, large vanishing moment, and so on. Properties varies from different multi-wavelet. Using those properties, we remove the noise on images successfully. Geronimo-Hardin-Masopust(GHM) with multiplicity 2 has been used to relax the effect of noise for most researches recently. However, our main contribution is that introducing other multi-wavelet which is more general to denoise, called multi-wavelet with finite element. Using multi-wavelet with finite element can decrease the computation time effectively. In addition, we can remove noise more precisely when noise is large. More importantly, multi-wavelet with finite element will not be controlled by multiplicity. We can choose proper multiplicity by our requirement to obtain well performance of denoising. Finally, we present some experiments using four kinds of different data, which can be divided into three parts. The first part is repeated data with different multiplicity. Second, passing through the repeat data by filter which is designed in advance. Third, by re-sorting and re-arranging, we can get the denoising image which is also after compressing.
26

Chen, Wei-Chang, and 陳維昌. "Fuzzy-Neighshrink Filter for Wavelet-based Image Denoising." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/75414076006412954326.

Full text
Abstract:
碩士<br>國立宜蘭大學<br>電子工程學系碩士班<br>97<br>In image processing, it is a classical problem for denoising natural image corrupted by additive white Gaussian noise. Recently, wavelet shrinkages are shown to be useful methods in signal or image denoising. It is a relatively methodology. First, wavelet coefficients are obtained from noisy image by wavelet transform. Then, wavelet coefficients are sorted by grade and also shrunk by a suitable threshold. After shrinkage, denoised signal are got by processing the inverse wavelet transform. In this paper, a new Fuzzy-Neighshrink filter is proposed. By using the fuzzy rule and the information of the neighborhood of an image, noise is sorted and reduced from these wavelet coefficients. The proposed algorithm keeps many important wavelet coefficients which are usually killed on some other threshold approach. Based on the neighshrink-average concept, the difference of wavelet coefficients between the true image and the noise is taken into count to build suitable memberships and to replace the threshold of classical neighshrink. Experimental results show that this method get better performance than some denoising methods.
27

Zhao, Jie, and 趙杰. "The Study of Wavelet Image Denoising for Small-Animal PET." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/gub6kr.

Full text
Abstract:
博士<br>國立陽明大學<br>生物醫學影像暨放射科學系<br>105<br>ABSTRACT The aim of this dissertation was to improve the image quality of small-animal Positron emission tomography (PET). PET has recently become more significant, and it is an important part of clinical diagnosis and preclinical research. Owing to the genetic similarity of small animals to humans, the value of using molecular imaging on small animals by means of PET is widely recognized as an important aspect of preclinical research and drug development. In small-animal PET, the raw data is often corrupted by noise severely. Reducing the noise in small-animal positron emission tomography (PET) images is an important and challenging task. Recently, several hybrid denoising techniques based on wavelet transform (WT) have been developed. However, these hybrid methods have complicated mathematical structures and require complex parameter estimations, and therefore demand a high level of manual intervention. Under such circumstances, good performance with respect to image quality using these new methods would only seem to be achievable with an increased computational burden. In this dissertation, we propose a novel wavelet denoising (WD) method. This method is based on scanner-dependent threshold estimation and the Visushrink method. The method provides a compromise between computational burden and image quality. The experimental results indicate that the proposed method is better than the Visushrink method. Compared with the Visushrink method, the proposed method provides good image quality at higher decomposition levels. In terms of usability and efficiency, the proposed method is better than the hybrid method. The proposed WD method also has several useful properties; therefore, it is possible that it might become an alternative solution to reduce the noise in small-animal PET images.
28

Chiou, Shin-Yi, and 邱馨儀. "Convection-Diffusion Model for Image Denoising in Comparison with Wavelet Method and Singular Value Decomposition." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/77592389652936191952.

Full text
Abstract:
碩士<br>國立中興大學<br>應用數學系所<br>100<br>In this thesis, we propose a modified convection diffusion equation (MCD) for image denoising. There are many tools for image denoising such as singular value decomposition (SVD), wavelet shrinkage and the filters by partial differential equations (PDEs). The traditional PDE filters use isotropic and anisotropic diffusion equations. Later, Shih et al. proposed a convection diffusion equation (CD) for denoising quickly. For improving the image quality, we propose a novel algorithm by a mixture of CD and isotropic diffusion equation to preserve the orientation of image edge. The numerical results show that MCD can remove the noise effectively, especially for salt-and-pepper noise in comparing with current popular filters.
29

Achim, Alin. "Novel Bayesian multiscale methods for image denoising using alpha-stable distributions." 2003. http://nemertes.lis.upatras.gr/jspui/handle/10889/1265.

Full text
Abstract:
Before launching into ultrasound research, it is important to recall that the ultimate goal is to provide the clinician with the best possible information needed to make an accurate diagnosis. Ultrasound images are inherently affected by speckle noise, which is due to image formation under coherent waves. Thus, it appears sensible to reduce speckle artifacts before performing image analysis, provided that image texture that might distinguish one tissue from another is preserved. The main goal of this thesis was the development of novel speckle suppression methods for medical ultrasound images in the multiscale wavelet domain. We started by showing, through extensive modeling, that the subband decompositions of ultrasound images have significantly non-Gaussian statistics that are best described by families of heavy-tailed distributions such as the alpha-stable. Then, we developed Bayesian estimators that exploit these statistics. We used the alpha-stable model to design both the minimum absolute error (MAE) and the maximum a posteriori (MAP) estimators for alpha-stable signal mixed in Gaussian noise. The resulting noise-removal processors perform non-linear operations on the data, and we relate this non-linearity to the degree of non-Gaussianity of the data. We compared our techniques to classical speckle filters and current state-of-the-art soft and hard thresholding methods applied to actual medical ultrasound images, and we quantified the achieved performance improvement. Finally, we have shown that our proposed processors can find application in other areas of interest as well, choosing as an illustrative example the case of synthetic aperture radar (SAR) images.
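For background, a symmetric alpha-stable distribution generally has no closed-form density and is specified through its characteristic function, $\varphi(t) = \exp\!\left(j\delta t - \gamma\,|t|^{\alpha}\right)$ with $0 < \alpha \le 2$, where $\alpha$ controls the heaviness of the tails ($\alpha = 2$ recovers the Gaussian), $\gamma$ is a dispersion parameter, and $\delta$ a location parameter; this is the standard form, quoted only to make the abstract self-contained.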
30

Gupta, Pradeep Kumar. "Denoising And Inpainting Of Images : A Transform Domain Based Approach." Thesis, 2007. http://hdl.handle.net/2005/515.

Full text
Abstract:
Many scientific data sets are contaminated by noise, either because of the data acquisition process or because of naturally occurring phenomena. A first step in analyzing such data sets is denoising, i.e., removing additive noise from a noisy image. For images, noise suppression is a delicate and difficult task: a trade-off between noise reduction and the preservation of actual image features has to be made in a way that enhances the relevant image content. The opening chapter of this thesis is introductory in nature and discusses the popular denoising techniques in the spatial and frequency domains. The wavelet transform has wide applications in image processing, especially in the denoising of images. Wavelet systems are a set of building blocks that represent a signal in an expansion set involving indices for time and scale; these systems allow the multi-resolution representation of signals. Several well-known denoising algorithms exist in the wavelet domain which penalize the noisy coefficients by thresholding them. We discuss the wavelet transform based denoising of images using bit planes. This approach preserves the edges in an image. The proposed approach relies on the fact that the wavelet transform allows the denoising strategy to adapt itself according to the directional features of coefficients in the respective sub-bands. Further, issues related to a low-complexity implementation of this algorithm are discussed. The proposed approach has been tested on different sets of images under different noise intensities. Studies have shown that it provides a significant reduction in normalized mean square error (NMSE), and the denoised images are visually pleasing. Many image compression techniques still use the redundancy reduction property of the discrete cosine transform (DCT), so the development of a denoising algorithm in the DCT domain has practical significance. In chapter 3, a DCT based denoising algorithm is presented. In general, the design of filters largely depends on the a priori knowledge about the type of noise corrupting the image and the image features; this makes the standard filters application and image specific. The most popular filters, such as the average, Gaussian and Wiener filters, reduce noisy artifacts by smoothing, but this operation normally results in the smoothing of the edges as well. On the other hand, sharpening filters enhance the high frequency details, making the image non-smooth. An integrated approach to designing filters based on the DCT is proposed in chapter 3. This algorithm reorganizes the DCT coefficients in a wavelet transform manner to get better energy clustering at the desired spatial locations. An adaptive threshold is chosen because such adaptivity can improve the wavelet threshold performance, as it allows additional local information of the image to be incorporated in the algorithm. Evaluation results show that the proposed filter is robust under various noise distributions and does not require any a priori knowledge about the image. Inpainting is another application that comes under the category of image processing. Inpainting provides a way of reconstructing small damaged portions of an image. Filling in missing data in digital images has a number of applications such as image coding and wireless image transmission for recovering lost blocks, special effects (e.g., removal of objects), and image restoration (e.g., removal of solid lines, scratches and noise).
In chapter 4, a wavelet based inpainting algorithm is presented for the reconstruction of small missing and damaged portions of an image while preserving the overall image quality. This approach exploits the directional features that exist in the wavelet coefficients in the respective sub-bands. The concluding chapter presents a brief review of the three new approaches: the wavelet and DCT based denoising schemes and the wavelet based inpainting method.
31

Καπρινιώτης, Αχιλλέας. "Αφαίρεση θορύβου από ψηφιακές εικόνες μικροσυστοιχιών DNA". Thesis, 2009. http://nemertes.lis.upatras.gr/jspui/handle/10889/1621.

Full text
Abstract:
In the microarray experiment, image acquisition is always accompanied by noise, which is inherent in processes of this kind. It is therefore imperative to use techniques to suppress it. In this work, such methods are analyzed and their results are presented on 5 selected examples. Particular emphasis is given to wavelet denoising, and specifically to the soft thresholding, hard thresholding, and stationary wavelet transform algorithms.
32

Gupta, Nikhil. "Denoising and compression of digital images using wavelets." Thesis, 2004. http://spectrum.library.concordia.ca/7942/1/MQ91038.pdf.

Full text
Abstract:
This thesis concentrates primarily on two problems that concern noise corrupted images and looks to the wavelet domain for the solutions. Firstly, the issue of noise reduction in digital images is addressed. Most of the popular thresholding techniques are either subband adaptive, i.e., do not adapt spatially according to individual subband coefficients, or rely heavily on subband statistics for adaptation, which makes them computationally expensive. A low-complexity adaptation of the subband-optimal thresholds according to the individual coefficients is presented. The correlation that exists in consecutive subbands in wavelet domain is exploited to adapt the subband optimal thresholds using the magnitude of the corresponding parent coefficients. Using simulated experiments, the ability of the proposed algorithm to preserve the edges and fine details in an image, while successfully reducing the noise from the smooth regions, is demonstrated. Secondly, the relatively uncharted area of simultaneous denoising and compression of images that are corrupted with noise is explored. A data adaptive subband (wavelet) coder that performs joint denoising and compression of the input image based on both the additive white Gaussian noise level in the image and the compression rate desired is developed. A simple uniform threshold quantizer (UTQ), with centroid reconstruction, is adapted to have a joint noise level (data) and output bitrate adaptive zero-zone and reconstruction. To improve the performance of this variable-rate-coder, a context-based classification scheme that improves the quantization of the fine detail in the image is also proposed. The joint denoising and compression scheme is further extended for the removal of multiplicative speckle from medical Ultrasound images using homomorphic filtering.
33

Zhao, Hanqing. "Numerical Algorithms for Discrete Models of Image Denoising." Phd thesis, 2010. http://hdl.handle.net/10048/1165.

Full text
Abstract:
In this thesis, we develop some new models and efficient algorithms for image denoising. The total variation model of Rudin, Osher, and Fatemi (ROF) for image denoising is considered to be one of the most successful deterministic denoising models. It exploits the non-smooth total variation (TV) semi-norm to preserve discontinuities and to keep the edges of smooth regions sharp. Despite its simple form, the TV semi-norm results in a strongly nonlinear Euler-Lagrange equation and poses a computational challenge in solving the model efficiently. Moreover, this model produces the so-called staircase effect. In this thesis, we propose several new algorithms and models to address these problems. We study the discretized ROF model and propose a new algorithm which does not involve partial differential equations; convergence of the algorithm is analyzed, and numerical results show that it is efficient and stable. We then introduce a denoising model which utilizes high-order differences to approximate piecewise smooth functions. This model eliminates undesirable staircases and improves both visual quality and signal-to-noise ratio. Our algorithm is generalized to solve the high-order models, and a relaxation technique is proposed for the iteration scheme, aiming to accelerate the solution process. Finally, we propose a method combining total variation and wavelet packets to improve performance on texture-rich images: the ROF model is utilized to eliminate noise, and a wavelet packet transform is used to enhance textures. The numerical results show that the combined method exploits the advantages of both total variation and wavelet packets.
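For reference, the ROF model mentioned above recovers the denoised image $u$ from the noisy observation $f$ by minimizing a total-variation regularized functional (standard continuous form; the thesis works with a discretized version):

$\min_{u} \int_{\Omega} |\nabla u|\,dx \;+\; \frac{\lambda}{2}\int_{\Omega} (u - f)^2\,dx,$

where $\lambda > 0$ balances fidelity to the data against the smoothness imposed by the TV semi-norm.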
34

Glew, Devin. "Self-Similarity of Images and Non-local Image Processing." Thesis, 2011. http://hdl.handle.net/10012/6019.

Full text
Abstract:
This thesis has two related goals: the first involves the concept of self-similarity of images. Image self-similarity is important because it forms the basis for many imaging techniques such as non-local means denoising and fractal image coding. Research so far has been focused largely on self-similarity in the pixel domain. That is, examining how well different regions in an image mimic each other. Also, most works so far concerning self-similarity have utilized only the mean squared error (MSE). In this thesis, self-similarity is examined in terms of the pixel and wavelet representations of images. In each of these domains, two ways of measuring similarity are considered: the MSE and a relatively new measurement of image fidelity called the Structural Similarity (SSIM) Index. We show that the MSE and SSIM Index give very different answers to the question of how self-similar images really are. The second goal of this thesis involves non-local image processing. First, a generalization of the well known non-local means denoising algorithm is proposed and examined. The groundwork for this generalization is set by the aforementioned results on image self-similarity with respect to the MSE. This new method is then extended to the wavelet representation of images. Experimental results are given to illustrate the applications of these new ideas.
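For reference, the Structural Similarity index mentioned above is usually evaluated locally between image patches $x$ and $y$ as

$\mathrm{SSIM}(x,y) = \frac{(2\mu_x\mu_y + C_1)(2\sigma_{xy} + C_2)}{(\mu_x^2 + \mu_y^2 + C_1)(\sigma_x^2 + \sigma_y^2 + C_2)},$

with local means $\mu$, (co)variances $\sigma$, and small stabilizing constants $C_1, C_2$; this is the standard definition, and the window and constant choices made in the thesis may differ.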
APA, Harvard, Vancouver, ISO, and other styles
35

Μαστρογιάννη, Αικατερίνη. "Detection of subgrid and spot locations in digital microarray images." Thesis, 2010. http://nemertes.lis.upatras.gr/jspui/handle/10889/4004.

Full text
Abstract:
DNA microarray technology is a high-throughput technique that determines how a cell controls the expression of large numbers of genes simultaneously. Microarrays are used to monitor changes in gene expression levels in response to changing environmental conditions, or in diseased versus healthy cells, using advanced information-processing methods. Owing to the nature of the acquisition process, microarray experiments involve a large number of error-prone procedures that lead to a high level of noise and to structural problems in the resulting data. Over the last fifteen years many robust methods have been proposed for enhancing microarray images. Although microarray image analysis has been studied extensively, enhancement remains an open issue, as the need for better results has not diminished. The goal of this PhD thesis is to contribute to this effort by proposing enhancement methods (denoising and segmentation) for microarray image analysis. More specifically, a novel automated subgrid-detection method is presented, addressing a pre-processing step that is rarely taken into account: most microarray enhancement methods arbitrarily assume that the subgrids have already been located, whereas in practical automated analysis systems an initial estimate of the subgrid positions is usually followed by manual correction by the system operator. Automating subgrid detection leads to faster and more accurate extraction of the information contained in microarray images. The thesis then presents a comparative denoising framework for microarray images that includes wavelet transforms and spatial filters, and additionally uses mathematical morphology to drastically reduce salt-and-pepper noise. Finally, a method for microarray spot segmentation based on the Random Walker algorithm is proposed. In the experiments, accurate spot segmentation is obtained even for heavily distorted microarray images (noise, fabrication defects, handling errors during array construction, etc.), requiring only an initial annotation of a small number of pixels to achieve high-quality segmentation. The experimental results are compared qualitatively with those of the Chan-Vese segmentation model, which relies on an initial assumption about the boundaries between the classes, showing that the proposed method classifies spot regions into the correct class more accurately.
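The Random Walker step can be illustrated with scikit-image's implementation on a synthetic spot patch. The seeding and parameters below are illustrative only, since the thesis derives its seeds and pre-processing (including the morphological salt-and-pepper reduction, stood in for here by a simple median filter) from the microarray grid itself.

import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import random_walker

# Synthetic 40x40 patch: a bright circular spot on a dark background plus noise.
yy, xx = np.mgrid[:40, :40]
spot = ((yy - 20) ** 2 + (xx - 20) ** 2 < 10 ** 2).astype(float)
rng = np.random.default_rng(1)
patch = spot + rng.normal(0.0, 0.25, spot.shape)

# Simple impulse-noise clean-up before segmentation.
patch = ndi.median_filter(patch, size=3)

# Seed labels: 1 = foreground (spot centre), 2 = background (patch corners).
labels = np.zeros(patch.shape, dtype=int)
labels[18:23, 18:23] = 1
labels[:4, :4] = 2
labels[-4:, -4:] = 2

segmentation = random_walker(patch, labels, beta=130, mode="bf")
print("spot pixels found:", int((segmentation == 1).sum()))

Only a handful of seed pixels are marked by hand, mirroring the abstract's claim that a small initial annotation suffices for high-quality spot segmentation.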
APA, Harvard, Vancouver, ISO, and other styles
