Dissertations / Theses on the topic 'JPEG 2000 Compression'


Consult the top 24 dissertations / theses for your research on the topic 'JPEG 2000 Compression.'


1

Park, Min Jee, Jae Taeg Yu, Myung Han Hyun, and Sung Woong Ra. "A Development of Real Time Video Compression Module Based on Embedded Motion JPEG 2000." International Foundation for Telemetering, 2015. http://hdl.handle.net/10150/596452.

Full text
Abstract:
ITC/USA 2015 Conference Proceedings / The Fifty-First Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2015 / Bally's Hotel & Convention Center, Las Vegas, NV
In this paper, we develop a miniaturized real-time video compression module (VCM) based on embedded Motion JPEG 2000, using the ADV212 and an FPGA. For the optimal hardware design, we consider the layout of components, the values of damping resistors, and the lengths of the pattern lines. For the software design, we consider compression steps that monitor the status of the system and make it robust. The developed VCM weighs approximately one quarter as much as the previous development. Furthermore, experimental results show that the PSNR is increased by about 3 dB and the compression processing time is approximately twice as fast as in the previous development.
APA, Harvard, Vancouver, ISO, and other styles
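The gains above are reported in PSNR. For reference, a minimal numpy sketch of the metric; the 8-bit peak value and the toy images are illustrative only, and a +3 dB gain corresponds to halving the mean squared error:

```python
import numpy as np

def psnr(reference: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two equal-shape images."""
    err = reference.astype(np.float64) - reconstructed.astype(np.float64)
    mse = np.mean(err ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

# a +3 dB gain corresponds to halving the mean squared error:
ref = np.zeros((8, 8))
a = ref + 4.0            # MSE = 16
b = ref + np.sqrt(8.0)   # MSE = 8, so psnr(ref, b) - psnr(ref, a) is about 3.01 dB
```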
2

Zeybek, Emre. "Compression multimodale du signal et de l’image en utilisant un seul codeur." Thesis, Paris Est, 2011. http://www.theses.fr/2011PEST1060/document.

Full text
Abstract:
The objective of this thesis is to study and analyze a new compression strategy, called "Multimodal Compression", whose principle is to compress data from multiple modalities jointly using a single encoder. In this context, an image and an audio signal can be compressed together by a single image encoder (e.g. a standard codec), without the need to integrate an audio codec. The basic idea developed in this thesis is to insert the samples of a signal in place of certain pixels of the "carrier" image while preserving the quality of the information after the encoding and decoding process. This technique should not be confused with watermarking or steganography, since no information is concealed inside another. The two main objectives of Multimodal Compression are to improve compression performance in terms of rate-distortion and to optimize the use of the hardware resources of a given embedded system (e.g. by accelerating encoding/decoding time). Throughout this report, we study and analyze variants of Multimodal Compression whose core consists of designing mixing functions applied before encoding and separation functions applied after decoding. The approach is validated on common images and signals as well as on specific data such as biomedical images and signals. The work concludes with an extension of the Multimodal Compression strategy to video.
APA, Harvard, Vancouver, ISO, and other styles
3

Silva, Sandreane Poliana. "Comparação entre os métodos de compressão fractal e JPEG 2000 em um sistema de reconhecimento de íris." Universidade Federal de Uberlândia, 2008. https://repositorio.ufu.br/handle/123456789/14385.

Full text
Abstract:
We live in the digital age, where data and images are manipulated constantly. Because of the storage space images require and their transmission time, many compression techniques have been developed; a great challenge is to make these techniques deliver good results in terms of compression rate, picture quality, and processing time. The fractal compression technique developed by Fisher is described, implemented, and tested in this work. It produced very good results and a considerable improvement in execution time, which was substantially reduced. Another growing area is the use of biometric techniques for person recognition. A widely used technique is iris recognition, which has proven quite reliable. Combining the two technologies therefore brings great benefits. In this work, iris images were compressed by the method implemented here, and simulations were run with the iris recognition technique developed by Libor Maseck. The results show that it is possible to compress the images fractally without harming the recognition system. Comparisons show that, even with changes in the image pixels, the system remains very reliable, bringing savings in storage space. (Master's dissertation, Mestre em Ciências.)
APA, Harvard, Vancouver, ISO, and other styles
4

Lucero, Aldo. "Compressing scientific data with control and minimization of the L-infinity metric under the JPEG 2000 framework." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2007. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Zeybek, Emre. "Compression multimodale du signal et de l'image en utilisant un seul codeur." Phd thesis, Université Paris-Est, 2011. http://tel.archives-ouvertes.fr/tel-00665757.

Full text
Abstract:
The objective of this thesis is to study and analyze a new compression strategy, called "Multimodal Compression", whose principle is to compress data from multiple modalities jointly using a single encoder. In this context, an image and an audio signal can be compressed together by a single image encoder (e.g. a standard codec), without the need to integrate an audio codec. The basic idea developed in this thesis is to insert the samples of a signal in place of certain pixels of the "carrier" image while preserving the quality of the information after the encoding and decoding process. This technique should not be confused with watermarking or steganography, since no information is concealed inside another. The two main objectives of Multimodal Compression are to improve compression performance in terms of rate-distortion and to optimize the use of the hardware resources of a given embedded system (e.g. by accelerating encoding/decoding time). Throughout this report, we study and analyze variants of Multimodal Compression whose core consists of designing mixing functions applied before encoding and separation functions applied after decoding. The approach is validated on common images and signals as well as on specific data such as biomedical images and signals. The work concludes with an extension of the Multimodal Compression strategy to video.
APA, Harvard, Vancouver, ISO, and other styles
6

Yang, Hsueh-szu, and Benjamin Kupferschmidt. "Time Stamp Synchronization in Video Systems." International Foundation for Telemetering, 2010. http://hdl.handle.net/10150/605988.

Full text
Abstract:
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California
Synchronized video is crucial for data acquisition and telecommunication applications. For real-time applications, out-of-sync video may cause jitter, choppiness and latency. For data analysis, it is important to synchronize multiple video channels with data acquired from PCM, MIL-STD-1553 and other sources. Nowadays, video codecs that play most types of video are easy to obtain; however, a great deal of effort is still required to develop the synchronization methods used in a data acquisition system. This paper describes several methods that TTC has adopted in our systems to improve the synchronization of multiple data sources.
APA, Harvard, Vancouver, ISO, and other styles
7

Kivci, Erdem Turker. "Development Of A Methodology For Geospatial Image Streaming." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612570/index.pdf.

Full text
Abstract:
Serving geospatial data collected by remote sensing methods (satellite images, aerial photos, etc.) has become crucial in many geographic information system (GIS) applications such as disaster management, municipal applications, climatology, environmental observation, and military applications. Even in today's highly developed information systems, geospatial image data requires a huge amount of physical storage space, and this characteristic limits its use in the above-mentioned applications. For this reason, web-based GIS applications can benefit from geospatial image streaming through web-based architectures. Progressive transmission of geospatial image and map data on web-based architectures is implemented with the developed image-streaming methodology. The software allows user interaction in such a way that users visualize the images according to their level of detail; in this way geospatial data is served efficiently. The main methods used to transmit geospatial images are serving tiled image pyramids and serving wavelet-based compressed bitstreams. Generally, GIS applications use tiled image pyramids, which contain copies of raster datasets at different resolutions, rather than the differences between resolutions; thus, redundant data is transmitted from the GIS server when a region is requested at different resolutions. Wavelet-based methods decrease this redundancy; on the other hand, methods that use wavelet-compressed bitstreams require transforming the whole dataset before transmission. A hybrid streaming methodology is developed that integrates tiled image pyramids with wavelets to decrease redundancy without transforming and encoding the whole dataset. The tile-part coefficients produced with the methodology are encoded with JPEG 2000, an efficient technology for compressing images in the wavelet domain.
APA, Harvard, Vancouver, ISO, and other styles
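The abstract above contrasts image pyramids, which store full copies at every resolution, with wavelet representations, which store only detail bands. A one-level Haar transform illustrates the point; this is a generic sketch (numpy assumed, even image dimensions), not the thesis's hybrid method:

```python
import numpy as np

def haar_level(img: np.ndarray):
    """One 2-D Haar analysis step: approximation LL plus detail bands LH, HL, HH."""
    a = img.astype(np.float64)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0   # row averages
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0   # row differences
    LL = (lo[0::2, :] + lo[1::2, :]) / 2.0
    LH = (lo[0::2, :] - lo[1::2, :]) / 2.0
    HL = (hi[0::2, :] + hi[1::2, :]) / 2.0
    HH = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return LL, LH, HL, HH

def haar_inverse(LL, LH, HL, HH):
    """Exact inverse of haar_level."""
    lo = np.empty((LL.shape[0] * 2, LL.shape[1]))
    hi = np.empty_like(lo)
    lo[0::2, :], lo[1::2, :] = LL + LH, LL - LH
    hi[0::2, :], hi[1::2, :] = HL + HH, HL - HH
    img = np.empty((lo.shape[0], lo.shape[1] * 2))
    img[:, 0::2], img[:, 1::2] = lo + hi, lo - hi
    return img

# A pyramid stores the H/2 x W/2 level *in addition to* the full image
# (~33% extra over all levels); LL/LH/HL/HH together hold exactly H*W samples.
```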
8

Kaše, David. "Komprese obrazu pomocí vlnkové transformace." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2015. http://www.nusl.cz/ntk/nusl-234996.

Full text
Abstract:
This thesis deals with image compression using the wavelet, contourlet and shearlet transforms. It starts with a quick look at the image compression problem and at quality measurement, then presents the basic concepts of wavelets, multiresolution analysis and scaling functions, followed by a detailed look at each transform. The coefficient-coding algorithms covered are EZW, SPIHT and, marginally, EBCOT. The second part describes the design and implementation of the constructed library. The last part compares the results of the transforms with the JPEG 2000 format. The comparison identified types of images for which the implemented contourlet and shearlet transforms were more effective than wavelets; the JPEG 2000 format was not surpassed.
APA, Harvard, Vancouver, ISO, and other styles
9

Mhamdi, Maroua. "Méthodes de transmission d'images optimisées utilisant des techniques de communication numériques avancées pour les systèmes multi-antennes." Thesis, Poitiers, 2017. http://www.theses.fr/2017POIT2281/document.

Full text
Abstract:
This work is devoted to improving the coding/decoding performance of still-image transmission schemes over noisy, realistic channels. For this purpose, we develop optimized image transmission methods focusing on both the application and physical layers of wireless networks. To ensure a good quality of service, efficient compression algorithms (JPEG 2000 and JPWL) are used at the application layer, enabling the receiver to reconstruct the images with maximum fidelity. Furthermore, to ensure transmission over wireless channels with a minimum BER at reception, advanced transmission, coding, and modulation techniques are used at the physical layer (MIMO-OFDM system, adaptive modulation, FEC, etc.). First, we propose a robust transmission system for JPWL-encoded images integrating a joint source-channel decoding scheme based on soft-input decoding techniques. Next, we consider the optimization of an image transmission scheme over a realistic MIMO-OFDM wireless channel. The optimized image transmission strategy relies on soft-input decoding techniques and a link-adaptation approach; the proposed transmission scheme thus makes it possible to jointly implement UEP, UPA, adaptive modulation, adaptive source coding, and joint decoding in order to improve the visual quality of the image at the receiver. In the second part, we propose a robust transmission system for progressive (embedded) bitstreams based on iterative turbo decoding of concatenated codes, offering an unequal error protection strategy. The originality of this study thus lies in proposing efficient solutions for the global optimization of a digital communication chain to improve transmission quality.
APA, Harvard, Vancouver, ISO, and other styles
10

Abot, Julien. "Stratégie de codage conjoint pour la transmission d'images dans un système MIMO." Thesis, Poitiers, 2012. http://www.theses.fr/2012POIT2296/document.

Full text
Abstract:
This thesis presents a transmission strategy that exploits spatial diversity for image transmission over wireless channels. We propose an original approach that matches the hierarchy of the source to that of the SISO subchannels resulting from the decomposition of a MIMO channel. We evaluate the performance of common precoders within this strategy via a realistic physical layer compliant with the IEEE 802.11n standard, associated with a transmission channel based on a 3-D ray-tracing propagation model. We show that common precoders are poorly suited to the transmission of hierarchical content. We then propose a precoding algorithm that successively allocates power over the SISO subchannels in order to maximize the quality of the received images. The proposed precoder achieves a target BER given the channel coding, the modulation, and the SNR of the SISO subchannels. Based on this precoding algorithm, we propose a link-adaptation scheme that dynamically adjusts the system parameters according to variations of the transmission channel. This solution determines the coding/transmission configuration that maximizes the image quality at reception. Finally, we present a study on taking psychovisual constraints into account when assessing the quality of received images. We propose integrating a reduced-reference metric based on psychovisual constraints to steer the decoder toward the decoding configuration offering the best quality of experience. Subjective tests confirm the interest of the proposed approach.
APA, Harvard, Vancouver, ISO, and other styles
11

Urbánek, Pavel. "Komprese obrazu pomocí vlnkové transformace." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2013. http://www.nusl.cz/ntk/nusl-236385.

Full text
Abstract:
This thesis is focused on the subject of image compression using the wavelet transform. The first part of this document provides the reader with information about image compression, presents well-known contemporary algorithms, and looks into the details of wavelet compression and the subsequent encoding schemes. Both the JPEG and JPEG 2000 standards are introduced. The second part analyzes and describes the implementation of an image compression tool, including innovations and optimizations. The third part is dedicated to the comparison and evaluation of the achievements.
APA, Harvard, Vancouver, ISO, and other styles
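The baseline JPEG pipeline that the theses above describe (transform, quantize, encode) can be sketched for one 8x8 block. This is a simplification: real JPEG uses a per-frequency quantization table and entropy coding, while here a single hypothetical step size `q` stands in for the table:

```python
import numpy as np

def dct_matrix(n: int = 8) -> np.ndarray:
    """Orthonormal DCT-II basis matrix (rows = frequencies)."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n)) * np.sqrt(2.0 / n)
    C[0, :] /= np.sqrt(2.0)
    return C

C = dct_matrix()

def jpeg_block(block: np.ndarray, q: float = 16.0) -> np.ndarray:
    """Level-shift, 2-D DCT, uniform quantization, then invert (one 8x8 block)."""
    coeffs = C @ (block - 128.0) @ C.T     # forward transform
    quant = np.round(coeffs / q)           # quantize: the only lossy step
    return C.T @ (quant * q) @ C + 128.0   # dequantize and inverse transform
```

Because the transform is orthonormal, the per-pixel RMS error is bounded by half the quantization step.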
12

Brunet, Dominique. "Métriques perceptuelles pour la compression d'images : étude et comparaison des algorithmes JPEG et JPEG2000." Master's thesis, Université Laval, 2007. http://hdl.handle.net/20.500.11794/19752.

Full text
Abstract:
In this work we describe the JPEG and JPEG2000 image compression algorithms and then compare them using a perceptual metric. The JPEG algorithm decomposes an image with the discrete cosine transform; the transformed coefficients are then approximated by uniform quantization and encoded with the Huffman code. The JPEG2000 algorithm instead uses a wavelet transform to decompose an image into several resolutions. We describe and justify the construction of orthogonal and biorthogonal wavelets having as many as possible of the following properties: real values, compact support, several vanishing moments, regularity, and symmetry, and we show their utility in image compression. We then briefly explain how JPEG2000 works and show that the RMSE metric is a poor measure of perceptual error. We therefore present some ideas for constructing a perceptual metric based on a model of the human visual system, describing in particular the SSIM index, which we suggest as a tool for evaluating image quality. Finally, using the SSIM metric, we conclude that JPEG2000 outperforms JPEG.
APA, Harvard, Vancouver, ISO, and other styles
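The claim that RMSE is a poor perceptual measure is easy to demonstrate: a pure luminance shift and additive noise can have the same RMSE yet very different visual impact. Below is a single-window simplification of the SSIM index (the published index averages the same formula over small local windows), with the usual constants C1 and C2:

```python
import numpy as np

def rmse(x: np.ndarray, y: np.ndarray) -> float:
    return float(np.sqrt(np.mean((x - y) ** 2)))

def ssim_global(x: np.ndarray, y: np.ndarray, L: float = 255.0) -> float:
    """SSIM computed over the whole image at once; a simplification of the
    published index, which averages this formula over local windows."""
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return float(((2 * mx * my + C1) * (2 * cxy + C2))
                 / ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2)))

# same RMSE, very different structural similarity:
x = np.tile(np.arange(64, dtype=float), (64, 1))      # horizontal ramp image
shifted = x + 10.0                                    # pure luminance shift
noisy = x + np.random.default_rng(0).normal(0.0, 10.0, x.shape)
```

Both distortions score an RMSE near 10, but the luminance shift keeps contrast and structure intact and scores a noticeably higher SSIM than the noisy image.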
13

Brunet, Dominique. "Métriques perceptuelles pour la compression d'images. Étude et comparaison des algorithmes JPEG et JPEG2000." Thesis, Université Laval, 2007. http://www.theses.ulaval.ca/2007/25159/25159.pdf.

Full text
Abstract:
In this work we describe the JPEG and JPEG2000 image compression algorithms and then compare them using a perceptual metric. The JPEG algorithm decomposes an image with the discrete cosine transform; the transformed coefficients are then approximated by uniform quantization and encoded with the Huffman code. The JPEG2000 algorithm instead uses a wavelet transform to decompose an image into several resolutions. We describe and justify the construction of orthogonal and biorthogonal wavelets having as many as possible of the following properties: real values, compact support, several vanishing moments, regularity, and symmetry, and we show their utility in image compression. We then briefly explain how JPEG2000 works and show that the RMSE metric is a poor measure of perceptual error. We therefore present some ideas for constructing a perceptual metric based on a model of the human visual system, describing in particular the SSIM index, which we suggest as a tool for evaluating image quality. Finally, using the SSIM metric, we conclude that JPEG2000 outperforms JPEG.
APA, Harvard, Vancouver, ISO, and other styles
14

Chen, Yung-Chen, and 陳詠哲. "Wavelet-Based JPEG 2000 Image Compression." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/66542893741935624689.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Chuang, Yan-Tse, and 莊彥澤. "Embedded Edge Image within JPEG-2000 Compression System." Thesis, 2001. http://ndltd.ncl.edu.tw/handle/71510167702242108119.

Full text
Abstract:
Master's thesis / National Taiwan University / Graduate Institute of Electrical Engineering / ROC year 89
The JPEG-2000 system is the newest standard for still-image compression. In this thesis, we discuss the basic architecture of the JPEG-2000 system, which can be viewed as an evolution of the image compression techniques of recent years. Over the past decades, a different class of image coding schemes, generally referred to as second-generation image coding techniques, has been proposed, showing that edges carry features rich in the information used to recognize or perceive an image. Building on this concept, we propose two methods that combine the JPEG-2000 system with second-generation image coding techniques, displaying the edge image first and then the parts of the image not belonging to the edges. These coding schemes allow an image to be perceived from its contours first and could be applied to several conventional application-specific image systems in which edges carry critical information, such as pattern recognition and motion estimation.
APA, Harvard, Vancouver, ISO, and other styles
16

Tsai, Ting-Chieh, and 蔡定杰. "JPEG 2000 Image Compression Method: Biorthogonal Wavelets and Lifting Transform." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/xtfr5b.

Full text
Abstract:
Master's thesis / National Sun Yat-sen University / Department of Applied Mathematics / ROC year 105
Image compression is an important aspect of image processing. In 1992 the Joint Photographic Experts Group announced the JPEG Standard algorithm; the related file format, jpg, has become one of the most popular for still-image compression in smart phones and digital cameras. In 2000 the group announced JPEG 2000, an important improvement on the JPEG Standard in terms of compression rate and flexibility. Both algorithms consist of four steps: pre-processing, transformation, quantization, and encoding. In this thesis we study the general algorithm of JPEG 2000, with special emphasis on the discrete wavelet transform in the transformation step. In particular, we study with mathematical rigor orthogonal wavelets and the CDF 9/7 biorthogonal wavelets, plus the related lifting transformation developed by Daubechies and Sweldens; together they form an important part of the lossy compression algorithm of JPEG 2000. Our material comes from several monographs, notably the book 'Discrete Wavelet Transformations' by Van Fleet and the paper [4], with the help of the classical monographs by Daubechies [3] and Mallat [6]. The material is organized in a self-contained and systematic manner. We also write some Mathematica programs to exemplify the biorthogonal wavelet transform and the lifting transform.
APA, Harvard, Vancouver, ISO, and other styles
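The CDF 9/7 lifting transform studied in this thesis can be sketched in a few lines. The constants below are the commonly published Daubechies-Sweldens values; normalization conventions for the scaling constant K vary between references, so treat this as an illustrative sketch rather than the exact JPEG 2000 Annex F procedure. Perfect reconstruction holds for the forward/inverse pair as written (even-length 1-D signals, symmetric boundary handling):

```python
import numpy as np

# Daubechies-Sweldens lifting factorization constants for the CDF 9/7 filter
ALPHA, BETA = -1.586134342, -0.05298011854
GAMMA, DELTA = 0.8829110762, 0.4435068522
K = 1.149604398  # scaling constant (normalization conventions vary by reference)

def cdf97_forward(x: np.ndarray):
    """One level of the 1-D CDF 9/7 lifting transform (even-length input)."""
    x = x.astype(np.float64).copy()
    # each step updates odd (predict) or even (update) samples in place,
    # with symmetric extension at the signal boundaries
    for a, odd in ((ALPHA, True), (BETA, False), (GAMMA, True), (DELTA, False)):
        if odd:
            x[1:-1:2] += a * (x[0:-2:2] + x[2::2])
            x[-1] += 2 * a * x[-2]                # mirror at right edge
        else:
            x[2::2] += a * (x[1:-1:2] + x[3::2])
            x[0] += 2 * a * x[1]                  # mirror at left edge
    return K * x[0::2], x[1::2] / K               # lowpass s, highpass d

def cdf97_inverse(s: np.ndarray, d: np.ndarray) -> np.ndarray:
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = s / K, d * K
    for a, odd in ((DELTA, False), (GAMMA, True), (BETA, False), (ALPHA, True)):
        if odd:
            x[-1] -= 2 * a * x[-2]
            x[1:-1:2] -= a * (x[0:-2:2] + x[2::2])
        else:
            x[0] -= 2 * a * x[1]
            x[2::2] -= a * (x[1:-1:2] + x[3::2])
    return x
```

As expected of the 9/7 analysis highpass, a constant signal yields detail coefficients that vanish (up to the precision of the truncated constants).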
17

Chen, Chun-Jen, and 陳俊仁. "A Study on JPEG 2000 Compression Algorithm Applied to DICOM Standard." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/79663965292742180073.

Full text
Abstract:
Master's thesis / Feng Chia University / Institute of Electronic Engineering / ROC year 93
Owing to the fast development of computer software, hardware, and networking, information technology has been widely used in industry to enhance the efficiency and quality of service. In the medical domain, the Digital Imaging and Communications in Medicine (DICOM) standard protocol integrates the Hospital Information System (HIS), the Picture Archiving and Communication System (PACS), and the Radiology Information System (RIS), providing doctors and medical research departments with images from image servers via the network for diagnosing patients' symptoms. The main purpose of this thesis is to develop a digital image compression scheme that uses JPEG 2000 compression technology to compress digital images produced by various medical image acquisition devices and to transform them to the DICOM standard. After compression, the image quality is still maintained while the storage space and access time are reduced, increasing the efficiency of the PACS system.
APA, Harvard, Vancouver, ISO, and other styles
18

Wang, Sent-Po, and 王聖博. "A Study of Using Inter-Block Redundancy to Improve JPEG 2000 Compression Performance." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/99291500073848266753.

Full text
Abstract:
Master's thesis<br>National Chiao Tung University<br>Department of Computer and Information Science<br>92<br>In recent years, as mobile phones with integrated digital cameras have become increasingly popular, smaller image files are needed for better storage and transmission efficiency. To fit such application environments, compressing an image file to a smaller size has become an increasingly important topic. The method proposed in this thesis exploits inter-block redundancy to further compress an image: a fast algorithm drops redundant blocks on the encoder side, and our block-filling method fills them back in the decoder. In this way, the image file can be compressed further. Experimental results indicate that at high compression ratios the proposed method indeed improves the compression ratio while maintaining proper visual quality, and at low compression ratios it maintains finer image quality compared to the pure JPEG 2000 standard.
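The encoder/decoder idea described in the abstract can be sketched as follows. This is a hypothetical illustration, not the thesis's actual fast algorithm: blocks whose mean-squared difference from their left neighbour falls below an assumed threshold are flagged as redundant, and the decoder fills them by copying that neighbour. The block size and threshold values are illustrative.

```python
import numpy as np

BLOCK = 8     # block size in pixels (assumed)
THRESH = 4.0  # mean-squared-error threshold (illustrative value)

def drop_redundant_blocks(img):
    """Encoder side: flag blocks that are nearly identical to their
    left neighbour, so they need not be coded."""
    h, w = img.shape
    mask = np.zeros((h // BLOCK, w // BLOCK), dtype=bool)
    for by in range(h // BLOCK):
        for bx in range(1, w // BLOCK):
            cur = img[by*BLOCK:(by+1)*BLOCK, bx*BLOCK:(bx+1)*BLOCK]
            left = img[by*BLOCK:(by+1)*BLOCK, (bx-1)*BLOCK:bx*BLOCK]
            if np.mean((cur.astype(float) - left) ** 2) < THRESH:
                mask[by, bx] = True
    return mask

def fill_blocks(img, mask):
    """Decoder side: fill each dropped block by copying its left
    neighbour (a stand-in for the thesis's block-filling method)."""
    out = img.copy()
    for by, bx in zip(*np.nonzero(mask)):
        out[by*BLOCK:(by+1)*BLOCK, bx*BLOCK:(bx+1)*BLOCK] = \
            out[by*BLOCK:(by+1)*BLOCK, (bx-1)*BLOCK:bx*BLOCK]
    return out
```

Only the unflagged blocks would then be passed to the JPEG 2000 codec, which is where the extra compression over the plain standard comes from.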
APA, Harvard, Vancouver, ISO, and other styles
19

Selvaraj, V. "Rate Control Of MPEG-2 Video And JPEG images." Thesis, 1999. https://etd.iisc.ac.in/handle/2005/1639.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Selvaraj, V. "Rate Control Of MPEG-2 Video And JPEG images." Thesis, 1999. http://etd.iisc.ernet.in/handle/2005/1639.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Peliteiro, Rui Daniel Ribeiro. "Estudo de compressão de dados médicos volumétricos usando codificadores normalizados de imagem." Master's thesis, 2016. http://hdl.handle.net/10316/40553.

Full text
Abstract:
Integrated Master's dissertation in Electrical and Computer Engineering presented to the Faculty of Sciences and Technology of the University of Coimbra<br>The increasing use of technology in aid of medicine has allowed the development of techniques and procedures to create visual representations of the interior of the human body for medical purposes, using techniques such as computed tomography (CT), magnetic resonance imaging (MRI), and positron emission tomography (PET), among others. These imaging techniques create very large amounts of data, forming what is known as medical images. Since these images must be stored, their compression becomes imperative; otherwise, storing this data locally or in the cloud quickly becomes prohibitive. For this, an image coding system is needed, specifically the JPEG 2000 encoder. The choice fell on this encoder because the international standard for medical images and related information (ISO 12052), also known as DICOM, defines JPEG 2000 Part 1, i.e. the core of the JPEG 2000 coding system, as its default encoder. However, since the medical images used in this dissertation are volumetric, Part 10 of this encoder is also of great interest, as it adds the capability to compress volumetric images. An overview of the encoder, already with the changes needed to handle volumetric images, is presented first. Throughout the dissertation, most attention is given to step two of the encoder, which defines the use of transforms based on wavelet technology. A detailed description of the wavelet transform implemented in JPEG 2000, the discrete wavelet transform (DWT), is presented, followed by a detailed account of an alternative to it, the directionally adaptive discrete wavelet transform (DADWT).
Finally, an analysis is performed of the different encoder configurations for the compression of medical images, ranging from the choice between lossy and lossless compression to the number of decompositions to perform on each slice in three orthogonal directions. The results allow conclusions to be drawn on whether axial decomposition pays off for medical images and for which type, the gain obtained with axial decomposition, the optimal number of decompositions in the three directions, and the loss of information when using lossy compression.
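The slice-wise plus optional axial decomposition discussed above can be sketched as follows, using a one-level Haar step as a simple stand-in for the 9/7 and 5/3 filters that JPEG 2000 Part 10 actually applies; the function names and the single-level structure are illustrative assumptions.

```python
import numpy as np

def haar_step(a, axis):
    """One Haar analysis step along one axis: averages of adjacent
    pairs in the first half, differences in the second half."""
    a = np.moveaxis(a.astype(float), axis, 0)
    lo = (a[0::2] + a[1::2]) / 2.0
    hi = (a[0::2] - a[1::2]) / 2.0
    return np.moveaxis(np.concatenate([lo, hi]), 0, axis)

def decompose(vol, axial=True):
    """One decomposition level of a z-stack of slices: transform each
    slice in x and y, then optionally along the axial (z) direction."""
    out = haar_step(haar_step(vol, axis=2), axis=1)
    if axial:
        out = haar_step(out, axis=0)
    return out
```

For a volume whose slices are strongly correlated, the axial step concentrates energy into fewer coefficients, which is exactly the gain the dissertation measures when deciding whether axial decomposition pays off.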
APA, Harvard, Vancouver, ISO, and other styles
22

Augustine, Jacob. "Switching Theoretic Approach To Image Compression." Thesis, 1996. https://etd.iisc.ac.in/handle/2005/1898.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Augustine, Jacob. "Switching Theoretic Approach To Image Compression." Thesis, 1996. http://etd.iisc.ernet.in/handle/2005/1898.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Vychodil, Bedřich. "Produkce digitálních obrazových dat a jejich kontrola." Doctoral thesis, 2013. http://www.nusl.cz/ntk/nusl-322214.

Full text
Abstract:
This dissertation provides a broad understanding of fundamental terminology and an inside view of the digitization workflow and quality control processes. The main foci are on quality assurance during and outside the digitization processes and on identification of the most suitable format for access, long-term preservation, rendering, and reformatting. An analysis of selected digitization centers is also included. An application called DIFFER (Determinator of Image File Format propERties), subsequently The Image Data Validator - DIFFER, has been developed using results from previously conducted research. The application utilizes new methods and technologies to help accelerate processing and quality control. The goal was to develop a quality control application for a select group of relevant still-image file formats, capable of performing identification, characterization, validation, and visual/mathematical comparison, integrated into an operational digital preservation framework. This application comes with a well-structured graphical user interface, which helps the end user understand the relationships between various file format properties, detect visual and non-visual errors, and simplify decision-making. Additional comprehensive annexes, describing the most crucial still image...
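The identification step such a tool performs can be illustrated with a magic-byte check. This is a generic sketch, not DIFFER's implementation; the signatures shown are the published file-signature prefixes of the PNG, JPEG 2000 (JP2), and JPEG formats.

```python
# Signature prefixes of common still-image formats.  The JP2 entry is
# the 12-byte JPEG 2000 signature box defined in ISO/IEC 15444-1.
MAGIC = [
    (b"\x89PNG\r\n\x1a\n", "PNG"),
    (b"\x00\x00\x00\x0cjP  \r\n\x87\n", "JP2"),
    (b"\xff\xd8\xff", "JPEG"),
]

def identify(data: bytes) -> str:
    """Return the name of the format whose signature the data starts
    with, or 'unknown' if none matches."""
    for sig, name in MAGIC:
        if data.startswith(sig):
            return name
    return "unknown"
```

Real validators go well beyond this, parsing the full box or marker structure, but a signature check is the usual first gate before characterization and validation.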
APA, Harvard, Vancouver, ISO, and other styles
