To view other types of publications on this topic, follow the link: Lempel-Ziv decompression.

Journal articles on the topic "Lempel-Ziv decompression"

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Consult the top 23 journal articles for your research on the topic "Lempel-Ziv decompression".

Next to each work in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, and others.

You can also download the full text of the publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse journal articles across many disciplines and compile your bibliography correctly.

1

Maharjan, Shyam, Sujan Poudel, and Dipesh Tandukar. "A Comparative Study of Text-Based Lossless Compression." American Journal of Smart Technology and Solutions 3, no. 2 (2024): 34–39. http://dx.doi.org/10.54536/ajsts.v3i2.3566.

Full text of the source
Abstract:
Lossless data compression is a critical technique used to reduce file sizes without any loss of information during the encoding and decoding processes. This study presents a comparative analysis of two widely-used lossless compression algorithms: Huffman Encoding and Lempel-Ziv-Welch (LZW). The primary objective is to evaluate the performance of these algorithms in terms of compression ratio, compression time, decompression time, and space savings. The analysis was conducted on 100 files of varying sizes. The results demonstrate that the LZW algorithm outperforms Huffman Encoding, offering sup
APA, Harvard, Vancouver, ISO, and other styles
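Entry 1 evaluates Huffman coding and LZW on compression ratio, compression time, decompression time, and space savings. As a rough illustration of how such measurements can be taken, here is a minimal sketch in Python: a textbook LZW encoder timed on an in-memory byte string, with an assumed fixed 16-bit code width for the size estimate. The paper's test files, code widths, and Huffman baseline are not reproduced.

```python
import time

def lzw_compress(data: bytes) -> list[int]:
    """Textbook LZW: grow a dictionary of byte strings, emit integer codes."""
    table = {bytes([i]): i for i in range(256)}
    w, codes = b"", []
    for b in data:
        wb = w + bytes([b])
        if wb in table:
            w = wb
        else:
            codes.append(table[w])
            table[wb] = len(table)   # new dictionary entry
            w = bytes([b])
    if w:
        codes.append(table[w])
    return codes

if __name__ == "__main__":
    sample = b"abracadabra " * 2000            # stand-in for one test file
    t0 = time.perf_counter()
    codes = lzw_compress(sample)
    t1 = time.perf_counter()
    # assume 16-bit codes for the size estimate; real coders use variable widths
    compressed_bytes = 2 * len(codes)
    ratio = len(sample) / compressed_bytes
    savings = 1 - compressed_bytes / len(sample)
    print(f"compression ratio: {ratio:.2f}, space savings: {savings:.1%}, "
          f"compression time: {t1 - t0:.4f}s")
```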
2

Shirakol, Shrikanth, Akshata Koparde, Sandhya ., Shravan Kulkarni, and Yogesh Kini. "Performance optimization of dual stage algorithm for lossless data compression and decompression." International Journal of Engineering & Technology 7, no. 2.21 (2018): 127. http://dx.doi.org/10.14419/ijet.v7i2.21.11849.

Full text of the source
Abstract:
In this paper, an optimized dual stage architecture is proposed which is the combination of Lempel-Ziv-Welch (LZW) Algorithm at the first phase and Arithmetic Coding being the later part of Architecture. LZW Algorithm is a lossless compression algorithm and code here for each character is available in the dictionary which reduces 5-bits per cycle as compared to ASCII. In arithmetic coding the numbers are represented by an interval of real numbers from zero to one according to their probabilities. It is an entropy coding and is lossless in nature. The text information is allowed to pass through
APA, Harvard, Vancouver, ISO, and other styles
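Entry 2 pairs LZW with arithmetic coding, which, as the abstract notes, represents a message as an interval of real numbers between zero and one according to the symbol probabilities. The toy round trip below shows that interval narrowing with a hypothetical fixed model and plain Python floats; real coders use integer renormalization and adaptive models, and this is not the paper's dual-stage architecture.

```python
# Fixed, hypothetical symbol model; probabilities must sum to 1.
MODEL = {"a": 0.5, "b": 0.3, "c": 0.2}

def arith_encode(msg: str) -> float:
    low, high = 0.0, 1.0
    for sym in msg:
        width = high - low
        cum = 0.0
        for s, p in MODEL.items():
            if s == sym:
                high = low + width * (cum + p)   # shrink interval to this symbol's slice
                low = low + width * cum
                break
            cum += p
    return (low + high) / 2                      # any number inside the final interval

def arith_decode(code: float, length: int) -> str:
    out = []
    for _ in range(length):
        cum = 0.0
        for s, p in MODEL.items():
            if cum <= code < cum + p:
                out.append(s)
                code = (code - cum) / p          # rescale and continue
                break
            cum += p
    return "".join(out)

msg = "abacabb"
code = arith_encode(msg)
assert arith_decode(code, len(msg)) == msg
print(f"{msg!r} -> {code:.10f}")
```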
3

Keerthy, A. S., and Manju Priya S. "Genomic Sequence Data Compression using Lempel-Ziv-Welch Algorithm with Indexed Multiple Dictionary." International Journal of Engineering and Advanced Technology (IJEAT) 9, no. 2 (2019): 541–47. https://doi.org/10.35940/ijeat.B3278.129219.

Full text of the source
Abstract:
With the advancement in technology and development of High Throughput System (HTS), the amount of genomic data generated per day per laboratory across the globe is surpassing the Moore’s law. The huge amount of data generated is of concern to the biologists with respect to their storage as well as transmission across different locations for further analysis. Compression of the genomic data is the wise option to overcome the problems arising from the data deluge. This paper discusses various algorithms that exists for compression of genomic data as well as a few general purpose algorithms
APA, Harvard, Vancouver, ISO, and other styles
4

King, G. R. Gnana, C. Christopher Seldev, and N. Albert Singh. "A Novel Compression Technique for Compound Images Using Parallel Lempel-Ziv-Welch Algorithm." Applied Mechanics and Materials 626 (August 2014): 44–51. http://dx.doi.org/10.4028/www.scientific.net/amm.626.44.

Full text of the source
Abstract:
Compound image is a combination of natural images, text, and graphics.This paper presents a compression technique for improving coding efficiency. The algorithm first decomposes the compound images by using 3 level biorthogonal wavelet transform and then the transformed image was further compressed by Parallel dictionary based LZW algorithm called PDLZW.In PDLZW algorithm instead of using a unique fixed word width dictionary a hierarchical variable word width dictionary set containing several dictionaries of small address space and increases the word widths used for compression and decompressi
APA, Harvard, Vancouver, ISO, and other styles
5

Fitra Wijayanto, Erick, Muhammad Zarlis, and Zakarias Situmorang. "Increase the PSNR Of Image Using LZW and AES Algorithm With MLSB on Steganography." International Journal of Engineering & Technology 7, no. 2.5 (2018): 119. http://dx.doi.org/10.14419/ijet.v7i2.5.13965.

Full text of the source
Abstract:
There are many research has done a hybridization approach of text message insertion that has been compressed with Lempel-Ziv-Welch (LZW) algorithm and has also been encrypted. The text messages in ciphertext form are inserted into the image file using LSB (Least Significant Bit) method. The results of this study indicate that the value of Peak Signal to Noise Ratio (PSNR) lower than the LSB method of 0.94 times with a ratio of 20.33%, with Kekre's method of 10.04%. To improve the value of PSNR stego image of insertion, in this research is inserted audio samples using 5 bits to reduce the amoun
APA, Harvard, Vancouver, ISO, and other styles
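Entry 5 reports stego-image quality as PSNR (Peak Signal-to-Noise Ratio), defined as 10·log10(MAX²/MSE). For reference, a small NumPy sketch of that metric; the random LSB flips below only stand in for an embedded payload, and the paper's LZW/AES/MLSB pipeline is not shown.

```python
import numpy as np

def psnr(original: np.ndarray, modified: np.ndarray, max_value: float = 255.0) -> float:
    """PSNR in dB between two equally sized 8-bit images."""
    mse = np.mean((original.astype(np.float64) - modified.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")                      # identical images
    return 10.0 * np.log10((max_value ** 2) / mse)

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
stego = cover.copy()
stego ^= rng.integers(0, 2, size=cover.shape, dtype=np.uint8)   # flip some LSBs
print(f"PSNR: {psnr(cover, stego):.2f} dB")
```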
6

Anandita, Ida Bagus Gede, I. Gede Aris Gunadi, and Gede Indrawan. "Analisis Kinerja Dan Kualitas Hasil Kompresi Pada Citra Medis Sinar-X Menggunakan Algoritma Huffman, Lempel Ziv Welch Dan Run Length Encoding." SINTECH (Science and Information Technology) Journal 1, no. 1 (2018): 7–15. http://dx.doi.org/10.31598/sintechjournal.v1i1.179.

Full text of the source
Abstract:
Technological progress in the medical area made medical images like X-rays stored in digital files. The medical image file is relatively large so that the image needs to be compressed. The lossless compression technique is an image compression where the decompression results are the same as the original or no information lost in the compression process. The existing algorithms on lossless compression techniques are Run Length Encoding (RLE), Huffman, and Lempel Ziv Welch (LZW). This study compared the performance of the three algorithms in compressing medical images. The result of image decomp
APA, Harvard, Vancouver, ISO, and other styles
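Entry 6 compares RLE, Huffman, and LZW on X-ray images. RLE, the simplest of the three, replaces runs of identical symbols with (count, value) pairs, which pays off on images with large uniform regions. A minimal byte-oriented sketch, independent of the paper's implementation:

```python
def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Collapse runs of identical bytes into (count, value) pairs."""
    runs = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        runs.append((j - i, data[i]))
        i = j
    return runs

def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    return b"".join(bytes([value]) * count for count, value in runs)

row = bytes([0] * 120 + [255] * 8 + [0] * 120)   # one synthetic X-ray scanline
encoded = rle_encode(row)
assert rle_decode(encoded) == row
print(encoded)                                    # [(120, 0), (8, 255), (120, 0)]
```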
7

Huang, Xiaobo, and Limin Yan. "P‐5.24: A Lossless Image Compression LZW Algorithm Based on Double Hash Dictionary." SID Symposium Digest of Technical Papers 56, S1 (2025): 1119–24. https://doi.org/10.1002/sdtp.19013.

Full text of the source
Abstract:
Among numerous lossless image compression algorithms, the LZW (Lempel‐Ziv‐Welch) algorithm is widely used due to its high adaptability and coding efficiency. However, the LZW algorithm has certain limitations as it is susceptible to factors such as dictionary storage structure and dictionary update strategies, often failing to balance compression ratio and speed. Therefore, this paper proposes an improved LZW algorithm based on a double‐hash dictionary. The update strategy involves using both an improved dictionary and the original dictionary, where frequently occurring strings in the original
APA, Harvard, Vancouver, ISO, and other styles
8

Nunes, Daniel S. N., Felipe A. Louza, Simon Gog, Mauricio Ayala-Rincón, and Gonzalo Navarro. "Grammar Compression by Induced Suffix Sorting." ACM Journal of Experimental Algorithmics 27 (December 31, 2022): 1–33. http://dx.doi.org/10.1145/3549992.

Full text of the source
Abstract:
A grammar compression algorithm, called GCIS, is introduced in this work. GCIS is based on the induced suffix sorting algorithm SAIS, presented by Nong et al. in 2009. The proposed solution builds on the factorization performed by SAIS during suffix sorting. A context-free grammar is used to replace factors by non-terminals. The algorithm is then recursively applied on the shorter sequence of non-terminals. The resulting grammar is encoded by exploiting some redundancies, such as common prefixes between right-hands of rules, sorted according to SAIS. GCIS excels for its low space and time requ
APA, Harvard, Vancouver, ISO, and other styles
9

Belu, Sabin, and Daniela Coltuc. "A Hybrid Data-Differencing and Compression Algorithm for the Automotive Industry." Entropy 24, no. 5 (2022): 574. http://dx.doi.org/10.3390/e24050574.

Full text of the source
Abstract:
We propose an innovative delta-differencing algorithm that combines software-updating methods with LZ77 data compression. This software-updating method relates to server-side software that creates binary delta files and to client-side software that performs software-update installations. The proposed algorithm creates binary-differencing streams already compressed from an initial phase. We present a software-updating method suitable for OTA software updates and the method’s basic strategies to achieve a better performance in terms of speed, compression ratio or a combination of both. A compari
APA, Harvard, Vancouver, ISO, and other styles
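Entry 9 builds its delta streams on LZ77, which encodes each position either as a literal byte or as a (distance, length) back-reference into a sliding window of previously seen data. The naive round trip below makes that token structure concrete; the window size, match limits, and test data are arbitrary choices for illustration, not the paper's delta-differencing method.

```python
def lz77_compress(data: bytes, window: int = 1024, max_len: int = 32):
    """Emit (distance, length, literal) triples; distance 0 means 'literal only'."""
    i, tokens = 0, []
    while i < len(data):
        best_len, best_dist = 0, 0
        for j in range(max(0, i - window), i):          # brute-force match search
            length = 0
            while (length < max_len and i + length < len(data)
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_len, best_dist = length, i - j
        if best_len >= 3:                               # back-reference pays off
            nxt = data[i + best_len] if i + best_len < len(data) else None
            tokens.append((best_dist, best_len, nxt))
            i += best_len + 1
        else:
            tokens.append((0, 0, data[i]))
            i += 1
    return tokens

def lz77_decompress(tokens) -> bytes:
    out = bytearray()
    for dist, length, literal in tokens:
        start = len(out) - dist
        for k in range(length):                         # byte-by-byte copy allows overlap
            out.append(out[start + k])
        if literal is not None:
            out.append(literal)
    return bytes(out)

blob = b"firmware v1.0 " * 60 + b"patched section" + b"firmware v1.0 " * 60
tokens = lz77_compress(blob)
assert lz77_decompress(tokens) == blob
```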
10

Kadhim, Doaa J., Mahmood F. Mosleh, and Faeza A. Abed. "Exploring Text Data Compression: A Comparative Study of Adaptive Huffman and LZW Approaches." BIO Web of Conferences 97 (2024): 00035. http://dx.doi.org/10.1051/bioconf/20249700035.

Full text of the source
Abstract:
Data compression is a critical procedure in computer science that aims to minimize the size of data files while maintaining their vital information. It is extensively utilized in Numerous applications, including communication, data storage, and multimedia transmission. In this work, we investigated the results of compressing four different text files with Lempel-Ziv-Welch compression techniques and Adaptive Huffman coding. The experiment used four text files: Arabic and English paragraphs and repeated Arabic and English characters. We measured Bit-rate, Compression Time, and Decompression Time
APA, Harvard, Vancouver, ISO, and other styles
11

Romankevych, Vitalii O., Ivan V. Mozghovyi, Pavlo A. Serhiienko, and Lefteris Zacharioudakis. "Decompressor for hardware applications." Applied Aspects of Information Technology 6, no. 1 (2023): 74–83. http://dx.doi.org/10.15276/aait.06.2023.6.

Full text of the source
Abstract:
The use of lossless compression in the application specific computers provides such advantages as minimized amount of memory, increased bandwidth of interfaces, reduced energy consumption, and improved self-testing systems. The article discusses known algorithms of lossless compression with the aim of choosing the most suitable one for implementation in a hardware-software decompressor. Among them, the Lempel-Ziv-Welch (LZW) algorithm makes it possible to perform the associative memory of the decompressor dictionary in the simplest way by using the sequential reading the symbols of the decompr
APA, Harvard, Vancouver, ISO, and other styles
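Entry 11 concerns the decoder side of LZW: the decompressor rebuilds the encoder's dictionary one entry per received code and must handle the single code that can arrive before its dictionary entry exists. A software sketch of that logic, written as a counterpart to the encoder shown after entry 1 rather than as the article's hardware design:

```python
def lzw_decompress(codes: list[int]) -> bytes:
    """Inverse of the byte-oriented LZW encoder sketched after entry 1."""
    table = [bytes([i]) for i in range(256)]     # codes 0..255 are single bytes
    prev = table[codes[0]]
    out = bytearray(prev)
    for code in codes[1:]:
        if code < len(table):
            entry = table[code]
        elif code == len(table):                 # code not in the table yet (cScSc case)
            entry = prev + prev[:1]
        else:
            raise ValueError("corrupt LZW stream")
        out += entry
        table.append(prev + entry[:1])           # mirror the encoder's new entry
        prev = entry
    return bytes(out)

# Round trip against the encoder from the entry-1 sketch:
# assert lzw_decompress(lzw_compress(b"abracadabra" * 100)) == b"abracadabra" * 100
```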
12

Khan, Sangeen. "A Comprehensive Methodology for Data Compression and Decompression Utilizing Huffman Coding, LZW Compression, and RunLength Encoding, Integrated with Data Encryption Standard (DES) and Advanced Encryption Standard (AES) for Enhanced Security." Indian Journal of Cryptography and Network Security 4, no. 2 (2024): 7–13. http://dx.doi.org/10.54105/ijcns.a1427.04021124.

Full text of the source
Abstract:
With advancements in communication technologies, transitioning from 5G to 6G systems has led to exponential data growth, requiring secure and efficient data transmission solutions. This study integrates data compression techniques Huffman Coding, Lempel Ziv Welch (LZW), and Run-Length Encoding (RLE) with symmetric encryption algorithms, AES (Advanced Encryption Standard) and DES (Data Encryption Standard). The primary goal is to enhance computational performance while ensuring data security. Using a 32 byte dataset and implementing the algorithms in Go language via Visual Studio IDE, results d
APA, Harvard, Vancouver, ISO, and other styles
13

Sangeen, Khan. "A Comprehensive Methodology for Data Compression and Decompression Utilizing Huffman Coding, LZW Compression, and Run-Length Encoding, Integrated with Data Encryption Standard (DES) and Advanced Encryption Standard (AES) for Enhanced Security." Indian Journal of Cryptography and Network Security (IJCNS) 4, no. 2 (2024): 7–13. https://doi.org/10.54105/ijcns.A1427.04021124.

Full text of the source
Abstract:
With advancements in communication technologies, transitioning from 5G to 6G systems has led to exponential data growth, requiring secure and efficient data transmission solutions. This study integrates the data compression techniques Huffman Coding, Lempel-Ziv-Welch (LZW), and Run-Length Encoding (RLE) with the symmetric encryption algorithms AES (Advanced Encryption Standard) and DES (Data Encryption Standard). The primary goal is to enhance computational performance while ensuring data security. Using a 32-byte dataset and implementing the algorithms in Go language
APA, Harvard, Vancouver, ISO, and other styles
14

Di, Jinhong, Pengkun Yang, Chunyan Wang, and Lichao Yan. "Layered Lossless Compression Method of Massive Fault Recording Data." International Journal of Circuits, Systems and Signal Processing 16 (January 3, 2022): 17–25. http://dx.doi.org/10.46300/9106.2022.16.3.

Full text of the source
Abstract:
In order to overcome the problems of large error and low precision in traditional power fault record data compression, a new layered lossless compression method for massive fault record data is proposed in this paper. The algorithm applies LZW (Lempel Ziv Welch) algorithm, analyzes the LZW algorithm and existing problems, and improves the LZW algorithm. Use the index value of the dictionary to replace the input string sequence, and dynamically add unknown strings to the dictionary. The parallel search method is to divide the dictionary into several small dictionaries with different bit widths
APA, Harvard, Vancouver, ISO, and other styles
15

T, Sujatha, and Selvam K. "LOSSLESS IMAGE COMPRESSION USING DIFFERENT ENCODING ALGORITHM FOR VARIOUS MEDICAL IMAGES." ICTACT Journal on Image and Video Processing 12, no. 4 (2022): 2704–9. https://doi.org/10.21917/ijivp.2022.0384.

Full text of the source
Abstract:
In the medical industry, the amount of data that can be collected and kept is currently increasing. As a result, in order to handle these large amounts of data efficiently, compression methods must be re-examined while taking the algorithm complexity into account. An image processing strategy should be explored to eliminate the duplication image contents, so boosting the capability to retain or transport data in the best possible manner. Image Compression (IC) is a method of compressing images as they are being stored and processed. The information is preserved in a lossless image compression
APA, Harvard, Vancouver, ISO, and other styles
16

Ratov, Denis. "DEVELOPMENT OF METHOD AND SOFTWARE FOR COMPRESSION AND ENCRYPTION OF INFORMATION." Journal of Automation and Information sciences 1 (January 1, 2022): 66–73. http://dx.doi.org/10.34229/1028-0979-2022-1-7.

Full text of the source
Abstract:
Researches of the subject area of lossless information compression and with data loss are carried out and data compression algorithms with minimal redundancy are considered: Shannon-Fano coding, Huffman coding and compression using a dictionary: Lempel-Ziv coding. In the course of the work, the theoretical foundations of data compression were used, studies of various methods of data compression were carried out, the best methods of archiving with encryption and storage of various kinds of data were identified. The method of archiving data in the work is used for the purpose of safe and rationa
APA, Harvard, Vancouver, ISO, and other styles
17

Ananth, Kamath, Kant Ankit, Srivatsa Aravind, and J.A Harisha. "Dynamic Decompression for Text Files." November 27, 2009. https://doi.org/10.5281/zenodo.1082702.

Full text of the source
Abstract:
Compression algorithms reduce the redundancy in data representation to decrease the storage required for that data. Lossless compression researchers have developed highly sophisticated approaches, such as Huffman encoding, arithmetic encoding, the Lempel-Ziv (LZ) family, Dynamic Markov Compression (DMC), Prediction by Partial Matching (PPM), and Burrows-Wheeler Transform (BWT) based algorithms. Decompression is also required to retrieve the original data by lossless means. A compression scheme for text files coupled with the principle of dynamic decompression, which decompresses only the secti
APA, Harvard, Vancouver, ISO, and other styles
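Entry 17's dynamic decompression decompresses only the section of a file that is actually needed. One common way to obtain that property, sketched below under the assumption of fixed-size independently compressed blocks (not necessarily the authors' scheme), is to compress each block separately and index them so any block can be inflated on demand; zlib, an LZ77-family codec, stands in for the compressor, and the block size is an arbitrary tuning knob.

```python
import zlib

BLOCK = 64 * 1024   # assumed block size, not a value from the paper

def pack_blocks(text: bytes, block_size: int = BLOCK):
    """Compress each block independently so any block can be inflated alone."""
    return [zlib.compress(text[i:i + block_size])
            for i in range(0, len(text), block_size)]

def read_block(blocks, index: int) -> bytes:
    """Decompress only the requested block."""
    return zlib.decompress(blocks[index])

document = b"lorem ipsum dolor sit amet " * 20000
blocks = pack_blocks(document)
# fetch the third block without touching the rest of the archive
section = read_block(blocks, 2)
assert section == document[2 * BLOCK:3 * BLOCK]
```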
18

Chandra, M. Edo. "IMPEMENTASI ALGORITMA LEMPEL ZIV STORER SZYMANSKI (LZSS) PADA APLIKASI BACAAN SHALAT BERBASIS ANDROID." KOMIK (Konferensi Nasional Teknologi Informasi dan Komputer) 3, no. 1 (2019). http://dx.doi.org/10.30865/komik.v3i1.1624.

Full text of the source
Abstract:
Compression is a way to compress or modify data so that the required storage space is smaller and more efficient. In this study, the large file size of the prayer reading application means that storing its documents requires a lot of storage space, and the larger a file is, the more it slows the smartphone down. The purpose of this study is to design an Android-based prayer reading application by implementing the Lempel-Ziv-Storer-Szymanski (LZSS) algorithm. A
APA, Harvard, Vancouver, ISO, and other styles
19

Anto, Rincy Thayyalakkal, and Rajesh Ramachandran. "A Compression System for Unicode Files Using an Enhanced Lzw Method." Pertanika Journal of Science and Technology 28, no. 4 (2020). http://dx.doi.org/10.47836/pjst.28.4.16.

Full text of the source
Abstract:
Data compression plays a vital and pivotal role in the process of computing as it helps in space reduction occupied by a file as well as to reduce the time taken to access the file. This work relates to a method for compressing and decompressing a UTF-8 encoded stream of data pertaining to Lempel-Ziv-welch (LZW) method. It is worth to use an exclusive-purpose LZW compression scheme as many applications are utilizing Unicode text. The system of the present work comprises a compression module, configured to compress the Unicode data by creating the dictionary entries in Unicode format. This is a
APA, Harvard, Vancouver, ISO, and other styles
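Entry 19 adapts LZW so that dictionary entries are Unicode strings rather than byte sequences. A minimal illustration of that idea, with the initial alphabet taken from the input itself for brevity (an assumption made here; a real codec would fix the alphabet in advance or transmit it):

```python
def lzw_compress_text(text: str):
    """LZW over Unicode code points; the dictionary maps strings to integer codes."""
    alphabet = sorted(set(text))
    table = {ch: i for i, ch in enumerate(alphabet)}
    w, codes = "", []
    for ch in text:
        wc = w + ch
        if wc in table:
            w = wc
        else:
            codes.append(table[w])
            table[wc] = len(table)
            w = ch
    if w:
        codes.append(table[w])
    return alphabet, codes

alphabet, codes = lzw_compress_text("банана банана банан")   # works on any script
print(len(codes), "codes from", len("банана банана банан"), "characters")
```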
20

Kulkarni, Sampada, Tejas Ghadge, Ashutosh Mishra, Sameer Dhane, and Pranav Jadhawar. "File Packer-Un packer System." International Journal Of Recent Trends In Multidisciplinary Research, February 22, 2024, 54–57. http://dx.doi.org/10.59256/ijrtmr.20240401009.

Full text of the source
Abstract:
The File Packer/ Un packer Application is a software designed to streamline the process of compressing and decompressing files and folders. In the digital age, the need for efficient data storage and transfer is paramount. This Application aims to address this need by providing a versatile and user-friendly tool for packing multiple files and directories into a single compressed archive, as well as extracting them when required. The application features a robust user interface that allows users to select files and folders for packing and specify compression settings. It supports various compre
APA, Harvard, Vancouver, ISO, and other styles
21

Rosalina, Rosalina, and Wahyu Hidayat. "SMS Compression System using Arithmetic Coding Algorithm." IT for Society 3, no. 02 (2018). http://dx.doi.org/10.33021/itfs.v3i02.586.

Full text of the source
Abstract:
Short Message Service (SMS) has a limitation on the length of its text messages: it only provides 160 characters per SMS. This means that if we send more than 160 characters, it is counted as sending more than one SMS, so we have to spend more to send the message. On the other side, the Arithmetic Coding algorithm provides an effective mechanism for text compression. It has produced strong compression results and in many cases has been considered a better compression algorithm than others, such as Huffman and LZW (Lempel-Ziv-Welch). This research will implement th
APA, Harvard, Vancouver, ISO, and other styles
22

Mallafi, Hadary. "Performance Analysis in Web Based Data Uploading using LZ77 Compression and Chunking Method." Indonesian Journal on Computing (Indo-JC) 1, no. 1 (2016). http://dx.doi.org/10.21108/indojc.2016.1.1.8.

Full text of the source
Abstract:
One of the limitations in the data uploading process is the maximum request length; besides that, the size of the data that is transferred is also an issue because it influences the data sending cost. One way to cope with the maximum request length problem is to downsize the file (chunking); another way is to enlarge the maximum request length. Downsizing can be done by chunking the files into smaller pieces or by compressing them. In this paper, the author conducted research on the file compression process that is done in client-server fashion using the technology of A
APA, Harvard, Vancouver, ISO, and other styles
23

Bille, Philip, Inge Li Gørtz, and Teresa Anna Steiner. "String Indexing with Compressed Patterns." ACM Transactions on Algorithms, July 21, 2023. http://dx.doi.org/10.1145/3607141.

Full text of the source
Abstract:
Given a string S of length n , the classic string indexing problem is to preprocess S into a compact data structure that supports efficient subsequent pattern queries. In this paper we consider the basic variant where the pattern is given in compressed form and the goal is to achieve query time that is fast in terms of the compressed size of the pattern. This captures the common client-server scenario, where a client submits a query and communicates it in compressed form to a server. Instead of the server decompressing the query before processing it, we consider how to efficiently process the
APA, Harvard, Vancouver, ISO, and other styles