Journal articles on the topic 'Huffman’s algorithm'


Consult the top 50 journal articles for your research on the topic 'Huffman’s algorithm.'


1

Blanchette, Jasmin Christian. "Proof Pearl: Mechanizing the Textbook Proof of Huffman’s Algorithm." Journal of Automated Reasoning 43, no. 1 (February 13, 2009): 1–18. http://dx.doi.org/10.1007/s10817-009-9116-y.

2

García, Jesús, Verónica González-López, Gustavo Tasca, and Karina Yaginuma. "An Efficient Coding Technique for Stochastic Processes." Entropy 24, no. 1 (December 30, 2021): 65. http://dx.doi.org/10.3390/e24010065.

Abstract:
In the framework of coding theory, under the assumption of a Markov process (Xt) on a finite alphabet A, the compressed representation of the data consists of a description of the model used to code the data together with the encoded data. Given the model, Huffman's algorithm is optimal with respect to the number of bits needed to encode the data. On the other hand, modeling (Xt) through a Partition Markov Model (PMM) reduces the number of transition probabilities needed to define the model. This paper shows how using a Huffman code with a PMM reduces the number of bits needed in this process. We prove that estimating a PMM allows the entropy of (Xt) to be estimated, providing an estimator of the minimum expected codeword length per symbol. We show the efficiency of the new methodology in a simulation study and in a real problem of compressing DNA sequences of SARS-CoV-2, where it achieves a reduction of at least 10.4% on the real data.
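Editor's note: the quantities this abstract refers to, the entropy of the source and the expected codeword length per symbol of a code, can be computed directly once symbol probabilities and code lengths are known. The following Python sketch is illustrative only and is not taken from the cited paper:

    from math import log2

    def entropy(probs):
        # Shannon entropy in bits per symbol: H = -sum p * log2(p)
        return -sum(p * log2(p) for p in probs.values() if p > 0)

    def expected_codeword_length(probs, code_lengths):
        # Average number of code bits per source symbol: L = sum p * l(s)
        return sum(probs[s] * code_lengths[s] for s in probs)

    # Example: a dyadic source where Huffman coding meets the entropy bound.
    probs = {"a": 0.5, "b": 0.25, "c": 0.25}
    lengths = {"a": 1, "b": 2, "c": 2}
    print(entropy(probs), expected_codeword_length(probs, lengths))  # 1.5 1.5

For a dyadic distribution such as this one the two values coincide; in general the expected length of an optimal symbol code lies between H and H+1.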
3

Okazaki, Hiroyuki, Yuichi Futa, and Yasunari Shidama. "Constructing Binary Huffman Tree." Formalized Mathematics 21, no. 2 (June 1, 2013): 133–43. http://dx.doi.org/10.2478/forma-2013-0015.

Abstract:
Summary: Huffman coding is one of the most famous entropy encoding methods for lossless data compression [16]. The JPEG and ZIP formats employ variants of Huffman encoding as lossless compression algorithms. Huffman coding is a bijective map from source letters to leaves of the Huffman tree constructed by the algorithm. In this article we formalize an algorithm that constructs a binary code tree, the Huffman tree.
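Editor's note: the greedy construction this abstract formalizes repeatedly merges the two lowest-frequency nodes until a single tree remains, then reads codewords off the root-to-leaf paths. A minimal Python sketch of this standard construction (illustrative only, not the formalization from the cited article):

    import heapq
    from itertools import count

    def huffman_code(freq):
        # Build a prefix code from a symbol -> frequency map.
        tie = count()  # tiebreaker keeps the heap from comparing node payloads
        heap = [(f, next(tie), sym) for sym, f in freq.items()]
        heapq.heapify(heap)
        if len(heap) == 1:                      # degenerate single-symbol source
            return {heap[0][2]: "0"}
        while len(heap) > 1:                    # greedily merge the two lightest nodes
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, next(tie), (left, right)))
        codes = {}
        def walk(node, prefix):
            if isinstance(node, tuple):         # internal node: recurse into children
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:                               # leaf: a source symbol
                codes[node] = prefix
        walk(heap[0][2], "")
        return codes

    print(huffman_code({"a": 5, "b": 2, "c": 1}))  # e.g. {'c': '00', 'b': '01', 'a': '1'}

Because frequent symbols end up closer to the root, they receive shorter codewords.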
4

Nasif, Ammar, Zulaiha Ali Othman, and Nor Samsiah Sani. "The Deep Learning Solutions on Lossless Compression Methods for Alleviating Data Load on IoT Nodes in Smart Cities." Sensors 21, no. 12 (June 20, 2021): 4223. http://dx.doi.org/10.3390/s21124223.

Abstract:
Networking is crucial for smart city projects nowadays, as it offers an environment where people and things are connected. This paper presents a chronology of factors in the development of smart cities, including IoT technologies as network infrastructure. Increasing numbers of IoT nodes lead to increasing data flow, which is a potential source of failure for IoT networks. The biggest challenge for IoT networks is that the IoT may have insufficient memory to handle all transaction data within the network. In this paper, we aim to propose a potential compression method for reducing IoT network data traffic. Therefore, we investigate various lossless compression algorithms, such as entropy-based and dictionary-based algorithms, and general compression methods to determine which algorithm or method adheres to the IoT specifications. Furthermore, this study conducts compression experiments using entropy coders (Huffman, adaptive Huffman) and dictionary coders (LZ77, LZ78) on five different types of IoT data traffic datasets. Though the above algorithms can alleviate IoT data traffic, adaptive Huffman gave the best compression. Therefore, in this paper we propose a conceptual compression method for IoT data traffic that improves adaptive Huffman using deep learning concepts such as weights, pruning, and pooling in the neural network. The proposed algorithm is expected to obtain a better compression ratio. Additionally, we discuss the challenges of applying the proposed algorithm to IoT data compression due to the limitations of IoT memory and processors, so that it can later be implemented in IoT networks.
5

Lamorahan, Christine, Benny Pinontoan, and Nelson Nainggolan. "Data Compression Using Shannon-Fano Algorithm." d'CARTESIAN 2, no. 2 (October 1, 2013): 10. http://dx.doi.org/10.35799/dc.2.2.2013.3207.

Abstract:
Communication systems in the world of information and communication technology are known as data transfer systems. Sometimes the information received loses its authenticity because the size of the data to be transferred exceeds the capacity of the medium used. This problem can be reduced by applying compression to shrink the data to a smaller size. This study considers compression of text data using the Shannon-Fano algorithm and shows how effective that algorithm is compared with the Huffman algorithm. The research shows that text compression using the Shannon-Fano algorithm is as effective as the Huffman algorithm when all characters in the string are repeated, or when the text is short and only one character in it is repeated, while the Shannon-Fano algorithm is more effective than the Huffman algorithm when the text is longer and contains a more varied combination of characters in the statement, string or word. Keywords: data compression, Huffman algorithm, Shannon-Fano algorithm.
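Editor's note: the Shannon-Fano method compared against Huffman coding here sorts symbols by frequency and recursively splits the list into two parts of roughly equal total frequency, assigning 0 to one part and 1 to the other. A minimal Python sketch under those assumptions (illustrative only, not the authors' implementation):

    def shannon_fano(freqs):
        # Assign codes by recursively splitting symbols (sorted by frequency)
        # into two groups of nearly equal total frequency.
        symbols = sorted(freqs, key=freqs.get, reverse=True)
        codes = {s: "" for s in symbols}

        def split(group):
            if len(group) <= 1:
                return
            total, running, cut = sum(freqs[s] for s in group), 0, 1
            best = float("inf")
            for i in range(1, len(group)):       # find the most balanced cut point
                running += freqs[group[i - 1]]
                if abs(total - 2 * running) < best:
                    best, cut = abs(total - 2 * running), i
            for s in group[:cut]:
                codes[s] += "0"
            for s in group[cut:]:
                codes[s] += "1"
            split(group[:cut])
            split(group[cut:])

        split(symbols)
        return codes

    print(shannon_fano({"a": 5, "b": 2, "c": 1, "d": 1}))
    # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

Unlike Huffman's bottom-up merging, this top-down split is not guaranteed optimal, which is why the two codes can differ on some inputs.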
6

Singh, Satpreet, and Harmandeep Singh. "Improved Adaptive Huffman Compression Algorithm." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 1, no. 1 (December 30, 2011): 16–22. http://dx.doi.org/10.24297/ijct.v1i1.2602.

Abstract:
In the information age, sending data from one end to another requires a lot of space as well as time. Data compression is a technique to compress an information source (e.g. a data file, a speech signal, an image, or a video signal) into as few bits as possible. One of the major factors that influence a data compression technique is the procedure used to encode the source data and the space required for the encoded data. There are many data compression methods, and among them Huffman coding is the most widely used. Huffman algorithms come in two variants, static and adaptive. The static Huffman algorithm encodes the data in two passes: the first pass calculates the frequency of each symbol, and the second pass constructs the Huffman tree. The adaptive Huffman algorithm builds on the Huffman algorithm by constructing the Huffman tree on the fly, but it takes more space than the static Huffman algorithm. This paper introduces a new data compression algorithm based on Huffman coding. This algorithm not only reduces the number of passes but also reduces the storage space compared to the adaptive Huffman algorithm, and is comparable to the static one.
7

Gupta, Apratim. "Huffman Algorithm Improvement." International Journal of Advanced Research in Computer Science and Software Engineering 7, no. 6 (June 30, 2017): 903–4. http://dx.doi.org/10.23956/ijarcsse/v7i6/0341.

8

Rahman, Md Atiqur, and Mohamed Hamada. "Burrows–Wheeler Transform Based Lossless Text Compression Using Keys and Huffman Coding." Symmetry 12, no. 10 (October 10, 2020): 1654. http://dx.doi.org/10.3390/sym12101654.

Abstract:
Text compression is one of the most significant research fields, and various algorithms for text compression have already been developed. This is a significant issue, as the use of internet bandwidth is considerably increasing. This article proposes a Burrows–Wheeler transform and pattern matching-based lossless text compression algorithm that uses Huffman coding in order to achieve an excellent compression ratio. In this article, we introduce an algorithm with two keys that are used in order to reduce more frequently repeated characters after the Burrows–Wheeler transform. We then find patterns of a certain length from the reduced text and apply Huffman encoding. We compare our proposed technique with state-of-the-art text compression algorithms. Finally, we conclude that the proposed technique demonstrates a gain in compression ratio when compared to other compression techniques. A small problem with our proposed method is that it does not work very well for symmetric communications like Brotli.
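Editor's note: the Burrows-Wheeler transform mentioned in this abstract permutes the input so that equal characters tend to cluster, which is what makes the subsequent key-based reduction and Huffman coding effective. A naive Python sketch of the forward transform (illustrative only; the paper's pipeline and production implementations differ, e.g. by using suffix arrays instead of explicit rotations):

    def bwt(text, sentinel="\0"):
        # Sort all rotations of text+sentinel and keep the last column.
        s = text + sentinel
        rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
        return "".join(row[-1] for row in rotations)

    print(bwt("banana"))  # 'annb\x00aa' -- equal characters cluster into runs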
9

Erdal, Erdal, and Atilla Ergüzen. "An Efficient Encoding Algorithm Using Local Path on Huffman Encoding Algorithm for Compression." Applied Sciences 9, no. 4 (February 22, 2019): 782. http://dx.doi.org/10.3390/app9040782.

10

Liu, Xing Ke, Ke Chen, and Bin Li. "Huffman Coding and Applications in Compression for Vector Maps." Applied Mechanics and Materials 333-335 (July 2013): 718–22. http://dx.doi.org/10.4028/www.scientific.net/amm.333-335.718.

Abstract:
Huffman coding is a statistical lossless coding method with high efficiency. The principle and implementation of Huffman coding are discussed, and Huffman coding is applied to the compression of vector maps. The properties of the algorithm are analyzed. Experiments demonstrate that the proposed algorithm can compress vector maps with high efficiency and no loss.
11

Laurentinus, Laurentinus, Harrizki Arie Pradana, Dwi Yuny Sylfania, and Fransiskus Panca Juniawan. "Performance comparison of RSA and AES to SMS messages compression using Huffman algorithm." Jurnal Teknologi dan Sistem Komputer 8, no. 3 (April 19, 2020): 171–77. http://dx.doi.org/10.14710/jtsiskom.2020.13468.

Abstract:
Improved security of short message services (SMS) can be obtained using cryptographic methods, both symmetric and asymmetric, but they must remain efficient. This paper studies the performance and efficiency of the symmetric AES-128 cipher and the asymmetric RSA cipher, combined with message compression, for securing SMS messages. The ciphertexts of RSA and AES were compressed using the Huffman algorithm. The average encryption time per character is faster for AES than for RSA: 5.8 and 24.7 ms/character for AES and AES+Huffman versus 8.7 and 45.8 ms/character for RSA and RSA+Huffman, over messages of 15, 30, 60 and 90 characters. AES decryption is also faster, at 27.2 ms/character compared to 47.6 ms/character for RSA. Huffman compression produces an average efficiency of 24.8% for the RSA algorithm, better than the 17.35% efficiency for AES, for plaintexts of 1, 16, 45, and 88 characters.
12

Abu-Taieh, Evon, and Issam AlHadid. "CRUSH: A New Lossless Compression Algorithm." Modern Applied Science 12, no. 11 (October 29, 2018): 387. http://dx.doi.org/10.5539/mas.v12n11p387.

Abstract:
Multimedia is a highly competitive world, and one of the properties in which this is reflected is the speed of download and upload of multimedia elements: text, sound, pictures, animation. This paper presents the CRUSH algorithm, a lossless compression algorithm that can be used to compress files. The CRUSH method is fast and simple, with time complexity O(n) where n is the number of elements being compressed. Furthermore, the compressed file is independent from the algorithm and unnecessary data structures. The paper compares CRUSH with other compression algorithms such as Shannon-Fano coding, Huffman coding, Run Length Encoding (RLE), Arithmetic Coding, Lempel-Ziv-Welch (LZW), the Burrows-Wheeler Transform, the Move-to-Front (MTF) Transform, Haar, wavelet tree, Delta Encoding, Rice & Golomb Coding, Tunstall coding, the DEFLATE algorithm, and Run-Length Golomb-Rice (RLGR).
13

Abu-Taieh, Evon, and Issam AlHadid. "CRUSH: A New Lossless Compression Algorithm." Modern Applied Science 12, no. 11 (October 29, 2018): 406. http://dx.doi.org/10.5539/mas.v12n11p406.

Abstract:
Multimedia is a highly competitive world, and one of the properties in which this is reflected is the speed of download and upload of multimedia elements: text, sound, pictures, animation. This paper presents the CRUSH algorithm, a lossless compression algorithm that can be used to compress files. The CRUSH method is fast and simple, with time complexity O(n) where n is the number of elements being compressed. Furthermore, the compressed file is independent from the algorithm and unnecessary data structures. The paper compares CRUSH with other compression algorithms such as Shannon-Fano coding, Huffman coding, Run Length Encoding (RLE), Arithmetic Coding, Lempel-Ziv-Welch (LZW), the Burrows-Wheeler Transform, the Move-to-Front (MTF) Transform, Haar, wavelet tree, Delta Encoding, Rice & Golomb Coding, Tunstall coding, the DEFLATE algorithm, and Run-Length Golomb-Rice (RLGR).
14

E, Kanniga. "Huffman Algorithm for Secured Data Transmission." International Journal of Psychosocial Rehabilitation 23, no. 3 (July 30, 2019): 456–63. http://dx.doi.org/10.37200/ijpr/v23i3/pr190143.

15

Lu, W. W., and M. P. Gough. "A fast-adaptive Huffman coding algorithm." IEEE Transactions on Communications 41, no. 4 (April 1993): 535–38. http://dx.doi.org/10.1109/26.223776.

16

Padma, U. R., and Jayachitra. "SELF-EMBEDDING VIDEO WATERMARKING USING DUAL ORTHOGONAL COMPLEX CONTOURLET TRANSFORM WITH AUTOCORRELATION SYSTEM." International Journal of Research -GRANTHAALAYAH 3, no. 4 (April 30, 2015): 89–98. http://dx.doi.org/10.29121/granthaalayah.v3.i4.2015.3025.

Abstract:
This paper presents a novel non-blind watermarking algorithm using the dual orthogonal complex contourlet transform. The dual orthogonal complex contourlet transform is preferred for watermarking because of its ability to capture directional edges and contours better than other transforms such as the cosine transform, wavelet transform, etc. Digital images and video in their raw form require an enormous amount of storage capacity, and such huge data also contain a lot of redundant information. Compression also increases the capacity of the communication channel. Image compression uses the SPIHT (Set Partitioning in Hierarchical Trees) algorithm based on the Huffman coding technique. SPIHT is a lossless compression algorithm that reduces file size with no loss in image quality; the final results are compared in terms of bit error rate, PSNR and MSE.
17

Sinaga, Helbert, Poltak Sihombing, and Handrizal Handrizal. "Perbandingan Algoritma Huffman Dan Run Length Encoding Untuk Kompresi File Audio." Talenta Conference Series: Science and Technology (ST) 1, no. 1 (October 17, 2018): 010–15. http://dx.doi.org/10.32734/st.v1i1.183.

Abstract:
This research was conducted to analyze and compare the results of compression and decompression of *.mp3 and *.wav audio files. Compression is performed by reducing the number of bits needed to store or send the file. In this study, the authors used the Huffman algorithm and Run Length Encoding, which are lossless compression techniques. The Huffman algorithm has three stages to compress data, namely tree formation, encoding and decoding, and it works character by character. The run-length technique, on the other hand, works on sequences of consecutive characters, simply replacing continuous repetitions of the same byte. The Huffman algorithm and Run Length Encoding were implemented to compress *.mp3 and *.wav audio files so that the compressed file is smaller than the original file; the parameters used to measure the performance of the algorithms are the compression ratio and the resulting complexity. The compression ratio for *.mp3 audio files using the Huffman algorithm averaged 1.204% while RLE averaged -94.44%, and the compression ratio for *.wav audio files averaged 28.954% for Huffman while RLE averaged -45.91%.
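Editor's note: the run-length technique described above replaces a run of identical bytes with a (count, value) pair, which also explains the negative ratios reported for audio data: runs are rare there, so the encoding can expand the file. A minimal Python sketch (illustrative only, not the authors' implementation):

    def rle_encode(data: bytes) -> bytes:
        # Byte-level run-length encoding: emit (count, value) pairs,
        # with counts capped at 255 so each count fits in one byte.
        out = bytearray()
        i = 0
        while i < len(data):
            run = 1
            while i + run < len(data) and data[i + run] == data[i] and run < 255:
                run += 1
            out += bytes([run, data[i]])
            i += run
        return bytes(out)

    print(rle_encode(b"aaabcc").hex())  # '036101620263' -> 3x'a', 1x'b', 2x'c'

Note that a byte that never repeats still costs two output bytes, which is why RLE can double the size of run-free data.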
18

LEVCOPOULOS, CHRISTOS, and TERESA M. PRZYTYCKA. "A WORK-TIME TRADE-OFF IN PARALLEL COMPUTATION OF HUFFMAN TREES AND CONCAVE LEAST WEIGHT SUBSEQUENCE PROBLEM." Parallel Processing Letters 04, no. 01n02 (June 1994): 37–43. http://dx.doi.org/10.1142/s0129626494000053.

Abstract:
We present a parallel algorithm for the Concave Least Weight Subsequence (CLWS) problem that exhibits the following work-time trade-off: given a parameter p, the algorithm runs in [Formula: see text] time using p processors. By a known reduction of the Huffman Tree problem to the CLWS problem, we obtain the same complexity bounds for the Huffman Tree problem. However, as we show, for the latter problem there exists a simpler (and, in fact, slightly better) algorithm that exhibits a similar trade-off: namely, for a given parameter p, p≥1, the algorithm runs in [Formula: see text] time using p processors.
19

Al-Smadi, Ahmad Mohamad, Ahmad Al-Smadi, Roba Mahmoud Ali Aloglah, Nisrein Abu-darwish, and Ahed Abugabah. "Files cryptography based on one-time pad algorithm." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 3 (June 1, 2021): 2335. http://dx.doi.org/10.11591/ijece.v11i3.pp2335-2342.

Abstract:
The Vernam cipher is known as a one-time pad algorithm; it is unbreakable because it uses a truly random key equal in length to the data to be coded, and each element of the text is encrypted with an element of the encryption key. In this paper, we propose a novel technique to overcome the obstacles that hinder the use of the Vernam algorithm. First, the Vernam and Advanced Encryption Standard (AES) algorithms are used to encrypt the data as well as to hide the encryption key; second, a password is placed on the file through the use of the AES algorithm, so the level of protection becomes very high. The Huffman algorithm is then used for data compression to reduce the size of the output file. A set of files is encrypted and decrypted using our methodology. The experiments demonstrate the flexibility of our method and that it succeeds without losing any information.
20

Várkonyi, D., and P. Hudoba. "On Generalizations and Improvements to the Shannon-Fano Code." Acta Technica Jaurinensis 10, no. 1 (March 6, 2017): 1. http://dx.doi.org/10.14513/actatechjaur.v10.n1.405.

Abstract:
This paper examines the possibility of generalizing the Shannon-Fano code to cases where the output alphabet has more than 2 (n) symbols. This generalization is well known for the famous Huffman code. Furthermore, we look at possible improvements to the algorithm, as well as other entropy-based lossless data compression techniques based on the same ideas as the Shannon-Fano code. All algorithms discussed in the paper were implemented by us in C++, and we illustrate our hypotheses with test results obtained on an average-performance PC.
21

Lin, Yih-Kai, Shu-Chien Huang, and Cheng-Hsing Yang. "A fast algorithm for Huffman decoding based on a recursion Huffman tree." Journal of Systems and Software 85, no. 4 (April 2012): 974–80. http://dx.doi.org/10.1016/j.jss.2011.11.1019.

22

KARPINSKI, MAREK, LAWRENCE LARMORE, and YAKOV NEKRICH. "WORK-EFFICIENT ALGORITHMS FOR THE CONSTRUCTION OF LENGTH-LIMITED HUFFMAN CODES." Parallel Processing Letters 14, no. 01 (March 2004): 99–105. http://dx.doi.org/10.1142/s012962640400174x.

Abstract:
We present an algorithm for parallel construction of Huffman codes in [Formula: see text] time with p processors, where p>1, improving the previous result of Levcopoulos and Przytycka. We also show that a greedy Huffman tree can be constructed in [Formula: see text] time with n processors.
23

Pattiasina, Timothy John. "ANALISA KODE HUFFMAN UNTUK KOMPRESI DATA TEKS." Teknika 1, no. 1 (July 1, 2012): 1–12. http://dx.doi.org/10.34148/teknika.v1i1.1.

Abstract:
The Huffman algorithm is one of the oldest compression algorithms, devised by David Huffman in 1952. The algorithm is used to perform lossless compression, that is, data compression in which not a single byte is lost, so that the data remain intact and are stored exactly as in the original. The working principle of the Huffman algorithm is to encode each character as a bit representation. The bit representation of each character differs from the others according to the frequency with which the character appears: the more often a character appears, the shorter its bit representation; conversely, the more rarely a character appears, the longer its bit representation. The Huffman compression technique can provide memory savings of up to 30%. The Huffman algorithm has complexity O(n log n) for a set of n characters.
24

Ode, Oludotun, Lara Orlandic, and Omer T. Inan. "Towards Continuous and Ambulatory Blood Pressure Monitoring: Methods for Efficient Data Acquisition for Pulse Transit Time Estimation." Sensors 20, no. 24 (December 11, 2020): 7106. http://dx.doi.org/10.3390/s20247106.

Abstract:
We developed a prototype for measuring physiological data for pulse transit time (PTT) estimation that will be used for ambulatory blood pressure (BP) monitoring. The device is comprised of an embedded system with multimodal sensors that streams high-throughput data to a custom Android application. The primary focus of this paper is on the hardware–software codesign that we developed to address the challenges associated with reliably recording data over Bluetooth on a resource-constrained platform. In particular, we developed a lossless compression algorithm that is based on optimally selective Huffman coding and Huffman prefixed coding, which yields virtually identical compression ratios to the standard algorithm, but with a 67–99% reduction in the size of the compression tables. In addition, we developed a hybrid software–hardware flow control method to eliminate microcontroller (MCU) interrupt-latency related data loss when multi-byte packets are sent from the phone to the embedded system via a Bluetooth module at baud rates exceeding 115,200 bit/s. The empirical error rate obtained with the proposed method with the baud rate set to 460,800 bit/s was identically equal to 0%. Our robust and computationally efficient physiological data acquisition system will enable field experiments that will drive the development of novel algorithms for PTT-based continuous BP monitoring.
25

Folorunso, O., H. O. D. Longe, A. C. Ikekwere, and S. K. Sharma. "A Framework for Encrypting the Huffman Algorithm." Journal of Applied Sciences 6, no. 5 (February 15, 2006): 1138–41. http://dx.doi.org/10.3923/jas.2006.1138.1141.

26

Tariq, Hamza, and Heebah Saleem. "Improved Image Steganography Algorithm using Huffman Codes." International Journal of Computer Applications 147, no. 12 (August 16, 2016): 1–4. http://dx.doi.org/10.5120/ijca2016911242.

27

Huang, H. C., and J. L. Wu. "Windowed Huffman coding algorithm with size adaptation." IEE Proceedings I Communications, Speech and Vision 140, no. 2 (1993): 109. http://dx.doi.org/10.1049/ip-i-2.1993.0015.

28

Ghule, Swati D. "Evaluation of Huffman Algorithm for Compression Standards." Bioscience Biotechnology Research Communications 14, no. 5 (March 25, 2021): 44–46. http://dx.doi.org/10.21786/bbrc/14.5/9.

29

Sardaraz, Muhammad, and Muhammad Tahir. "FCompress: An Algorithm for FASTQ Sequence Data Compression." Current Bioinformatics 14, no. 2 (January 7, 2019): 123–29. http://dx.doi.org/10.2174/1574893613666180322125337.

Abstract:
Background: Biological sequence data have increased at a rapid rate due to advancements in sequencing technologies and the reduction in the cost of sequencing. The huge increase in these data presents significant research challenges. In addition to meaningful analysis, data storage is also a challenge: the increase in data production is outpacing storage capacity. Data compression is used to reduce the size of data and thus reduces storage requirements as well as transmission cost over the internet. Objective: This article presents a novel compression algorithm (FCompress) for Next Generation Sequencing (NGS) data in FASTQ format. Method: The proposed algorithm uses bit manipulation and dictionary-based compression for the bases. Headers are compressed with reference-based compression, whereas quality scores are compressed with Huffman coding. Results: The proposed algorithm is validated with experimental results on real datasets. The results are compared with both general-purpose and specialized compression programs. Conclusion: The proposed algorithm produces a better compression ratio in a time comparable to other algorithms.
30

Shan, Yongxin, Xin Chen, Cao Qiu, and Ying Zhang. "Implementation of Fast Huffman Encoding Based on FPGA." Journal of Physics: Conference Series 2189, no. 1 (February 1, 2022): 012021. http://dx.doi.org/10.1088/1742-6596/2189/1/012021.

Abstract:
In order to improve the efficiency of the JPEG encoding architecture, some image compression systems use pipeline technology. Different modules spend different numbers of cycles during image compression, so the pipeline is prone to blockage. This paper proposes a double-byte splicing output architecture for the Huffman code together with double Huffman encoding units on FPGA, so that the Huffman encoding process spends fewer clock cycles in JPEG encoding and the pipelined JPEG encoding algorithm can encode in real time without blocking. According to the FPGA verification, although the number of logic units used increases, Huffman encoding takes fewer clock cycles.
31

Liu, Yu, Jie Hui Zeng, and Xiao Yang Liu. "Study on a Hybrid Algorithm of Navigation Radar Echo Data Compression." Applied Mechanics and Materials 543-547 (March 2014): 2796–99. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.2796.

Abstract:
To address the conflict between the storage space of the ship navigation radar system and the large volume of raw echo data and files, a mixed compression algorithm combining bitmap compression, half-byte compression and double Huffman compression is presented, based on the characteristics of text-type data. On the basis of the original half-byte compression algorithm, the method is combined with bitmap compression and an improved Huffman compression algorithm. Using the half-byte compression algorithm alone, the compression ratio can only reach 50%, while using the hybrid compression the total compression ratio can be up to 75%, which saves a large amount of storage resources for the ship navigation system and improves the rate of data transmission at the same time.
32

Baibekova, F. N., V. V. Podoltsev, N. M. Bespalova, and L. A. Sologubova. "Overview of the ways to reduce telemetric information redundancy." Radio industry (Russia) 29, no. 2 (May 30, 2019): 8–16. http://dx.doi.org/10.21778/2413-9599-2019-29-2-8-16.

Abstract:
The redundancy of telemetric information significantly complicates the real-time processing of the information flows. To speed up the telemetry processing process, the methods for telemetric information redundancy reduction should be applied in order to reduce its flows entering the monitoring systems while maintaining a high speed of processing and reliability of the information. The article provides an overview of the methods for reduction of telemetric information redundancy, such as the increase in PHY-rate of telemetry channel, the Huffman algorithm, the batch mechanism for generation and transmission of telemetry information, the adaptive difference algorithm, the algorithm for transmission of the information based on its representation by residual images, Golomb-Rice codes, reversible compression method. The advantages and disadvantages of each of them are considered. Recommendations on the use of multi-level telemetry information compression system are given, which makes it possible to effectively combine target data compression algorithms that give the highest compression ratio depending on the type of telemetry information transmitted.
33

Soltani, Mohammad, and Amid Khatibi Bardsiri. "A New Secure Hybrid Algorithm for QR-Code Images Encryption and Steganography." APTIKOM Journal on Computer Science and Information Technologies 2, no. 2 (July 1, 2017): 86–96. http://dx.doi.org/10.11591/aptikom.j.csit.109.

Abstract:
Encryption is a method for protecting useful information that is used for security purposes, and steganography is the art of hiding the fact that communication is taking place by hiding information in other information. In this article, a plain-text message carrying security information is first converted to a (Quick Response Code) QR-code image, and then we propose a new secure hybrid algorithm for the encryption and steganography of the generated QR-code. In this article, image encryption is based on a two-dimensional logistic chaotic map and the AES algorithm, and the steganography technique is based on the LSB algorithm. In addition, the Huffman algorithm has come out as the most efficient compression technique, and we can use it to compress the encrypted QR-code. Experimental results show that the scheme proposed in this article has high security and better QR-code image encryption and steganography quality.
34

Anandita, Ida Bagus Gede, I. Gede Aris Gunadi, and Gede Indrawan. "Analisis Kinerja Dan Kualitas Hasil Kompresi Pada Citra Medis Sinar-X Menggunakan Algoritma Huffman, Lempel Ziv Welch Dan Run Length Encoding." SINTECH (Science and Information Technology) Journal 1, no. 1 (February 9, 2018): 7–15. http://dx.doi.org/10.31598/sintechjournal.v1i1.179.

Abstract:
Technological progress in the medical field means that medical images such as X-rays are stored in digital files. Medical image files are relatively large, so the images need to be compressed. Lossless compression is image compression in which the decompression result is the same as the original, i.e. no information is lost in the compression process. Existing lossless compression algorithms include Run Length Encoding (RLE), Huffman, and Lempel Ziv Welch (LZW). This study compared the performance of the three algorithms in compressing medical images. The decompressed images are compared objectively in terms of ratio, compression time, MSE (Mean Square Error) and PSNR (Peak Signal to Noise Ratio). MSE and PSNR are used for quantitative image quality measurement; for the subjective assessment, three experts compared the original image with the decompressed image. Based on the objective assessment of compression performance, the RLE algorithm showed the best performance, yielding a ratio, time, MSE and PSNR of 86.92%, 3.11 ms, 0 and 0 dB respectively. For Huffman, the results were 12.26%, 96.94 ms, 0, and 0 dB respectively, while the LZW results were -63.79%, 160 ms, 0.3 and 58.955 dB. For the subjective assessment, the experts agreed that all images could be analyzed well.
35

Liu, Yong, Bing Li, Yan Zhang, and Xia Zhao. "A Huffman-Based Joint Compression and Encryption Scheme for Secure Data Storage Using Physical Unclonable Functions." Electronics 10, no. 11 (May 25, 2021): 1267. http://dx.doi.org/10.3390/electronics10111267.

Abstract:
With the developments of Internet of Things (IoT) and cloud-computing technologies, cloud servers need storage of a huge volume of IoT data with high throughput and robust security. Joint Compression and Encryption (JCAE) scheme based on Huffman algorithm has been regarded as a promising technology to enhance the data storage method. Existing JCAE schemes still have the following limitations: (1) The keys in the JCAE would be cracked by physical and cloning attacks; (2) The rebuilding of Huffman tree reduces the operational efficiency; (3) The compression ratio should be further improved. In this paper, a Huffman-based JCAE scheme using Physical Unclonable Functions (PUFs) is proposed. It provides physically secure keys with PUFs, efficient Huffman tree mutation without rebuilding, and practical compression ratio by combining the Lempel-Ziv and Welch (LZW) algorithm. The performance of the instanced PUFs and the derived keys was evaluated. Moreover, our scheme was demonstrated in a file protection system with the average throughput of 473Mbps and the average compression ratio of 0.5586. Finally, the security analysis shows that our scheme resists physical and cloning attacks as well as several classic attacks, thus improving the security level of existing data protection methods.
36

Adedeji, Kazeem B. "Performance Evaluation of Data Compression Algorithms for IoT-Based Smart Water Network Management Applications." Journal of Applied Science & Process Engineering 7, no. 2 (October 30, 2020): 554–63. http://dx.doi.org/10.33736/jaspe.2272.2020.

Abstract:
IoT-based smart water supply network management applications generate a huge volume of data from the installed sensing devices, which must be processed (sometimes in-network), stored and transmitted to a remote centre for decision making. As the volume of data produced by diverse IoT smart sensing devices intensifies, processing and storing these data begins to be a serious issue. The large data sizes acquired in these applications increase computational complexity, occupy the scarce bandwidth available for data transmission and increase the required storage space. Thus, data size reduction through the use of data compression algorithms is essential in IoT-based smart water network management applications. In this paper, the performance of four data compression algorithms used for this purpose is evaluated. These algorithms, which include RLE, Huffman, LZW and Shannon-Fano encoding, were realised using MATLAB software and tested on six water supply system datasets. The performance of each algorithm was evaluated based on its compression ratio, compression factor, percentage space savings, and compression gain. The results obtained showed that the LZW algorithm performs best in terms of compression ratio, compression factor, space savings and compression gain. However, its execution time is relatively slow compared to RLE and the two other algorithms investigated. Most importantly, the LZW algorithm achieves a significantly greater reduction in the data sizes of the tested files than all the other algorithms.
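Editor's note: the evaluation metrics named in this abstract are simple functions of the original and compressed sizes. The Python sketch below uses one common set of definitions (conventions vary between papers, e.g. some define the ratio the other way around, and "compression gain" is sometimes defined logarithmically), so it is illustrative rather than the exact definitions used in the cited study:

    def compression_metrics(original_size, compressed_size):
        # Sizes in bytes; one common convention for each metric.
        ratio = compressed_size / original_size      # smaller is better
        factor = original_size / compressed_size     # larger is better
        space_savings = (1.0 - ratio) * 100.0        # percentage of space saved
        return {"ratio": ratio, "factor": factor, "space_savings_%": space_savings}

    print(compression_metrics(1_000_000, 250_000))
    # {'ratio': 0.25, 'factor': 4.0, 'space_savings_%': 75.0}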
37

Hong-Chung, Chen, Wang Yue-Li, and Lan Yu-Feng. "A memory-efficient and fast Huffman decoding algorithm." Information Processing Letters 69, no. 3 (February 1999): 119–22. http://dx.doi.org/10.1016/s0020-0190(99)00002-2.

38

Parker, D. Stott. "Erratum: Conditions for Optimality of the Huffman Algorithm." SIAM Journal on Computing 27, no. 1 (February 1998): 317. http://dx.doi.org/10.1137/s0097539797328550.

39

Ponalagusa, R., E. Kannan, and Michael Arock. "A Huffman Decoding Algorithm in Mobile Robot Platform." Information Technology Journal 6, no. 5 (June 15, 2007): 776–79. http://dx.doi.org/10.3923/itj.2007.776.779.

40

Gao, Rong Hua, and Hua Rui Wu. "Research on Compression Storage of Massive Agricultural Data Based on Cloud Environment." Applied Mechanics and Materials 441 (December 2013): 1025–29. http://dx.doi.org/10.4028/www.scientific.net/amm.441.1025.

Abstract:
With the development of information technology, agricultural data exhibit large-volume, distributed and heterogeneous characteristics. The massive data, which are continuously increasing, are difficult to access and manage, and this limits the large-scale use of agricultural information data. In this paper, a compression method is proposed based on real-time and spatio-temporal correlation characteristics. All data are divided into several categories, and the Huffman compression algorithm is combined with a parallel-processing cloud platform. The massive agricultural data are then compressed, reducing the data storage. Experimental results show that the cloud storage platform has dynamic scalability. On the same experimental data, the method of this paper achieves a higher compression ratio and consumes less compression time for larger amounts of data, compared with Huffman compression and dictionary-based data compression algorithms.
41

Sarinova, Assiya, and Alexander Zamyatin. "Hyperspectral regression lossless compression algorithm of aerospace images." E3S Web of Conferences 149 (2020): 02003. http://dx.doi.org/10.1051/e3sconf/202014902003.

Abstract:
In this work, we propose a lossless compression algorithm for hyperspectral aerospace images, characterized by the use of a channel-difference linear regression transformation, which significantly reduces the range of data values and increases the degree of compression. The main idea of the proposed conversion is to form a set of pairs of correlated channels and then create the transformed blocks without loss using regression analysis. This analysis makes it possible to reduce the size of the channels of the aerospace image and convert them before compression. The transformation of the regressed channel is performed on the values of the constructed regression equation model. An important step is coding with an adapted Huffman algorithm. The comparison results obtained for the converted hyperspectral aerospace images suggest the effectiveness of the regression conversion and multi-threaded processing stages, showing good results for the compression algorithms.
42

Tanjung, Abu Sani, and Surya Darma Nasution. "Comparison Analysis with Huffman Algorithm and Goldbach Codes Algorithm in File Compression Text Using the Method Exponential Comparison." IJICS (International Journal of Informatics and Computer Science) 4, no. 1 (March 29, 2020): 29. http://dx.doi.org/10.30865/ijics.v4i1.1387.

Abstract:
With the development of technology, many people now know about compression. Put simply, compression is a process that shrinks a file from its original size. Commonly used compression applications include WinZip, WinRar, and 7-Zip, whose aim is to compress documents and save space in memory or during data transmission. Compressed data can be in the form of images, audio, video and text. Using the Huffman algorithm and the Goldbach Codes algorithm to compress text files is intended to provide substantial benefits in transmission and storage, requiring less memory space than uncompressed text. The algorithm takes a string as input and produces as output a binary string, or code, that translates the input string so that it uses a smaller number of bits than the uncompressed string. Thus, the problem is how to obtain a code that takes the sorted characters and frequency tables as input and produces shorter binary codes as output. The application of the Huffman algorithm and the Goldbach Codes algorithm to compressing text files works very well: the decompressed results are not diminished relative to the original file, i.e. nothing is lost.
43

Suhermanto, M.T. "Time Optimization for Lossy Decompression of the LISA Sensor Data on LAPAN A3 Satellite Using a Grouping Method of HUFFMAN Code Bit Number." Jurnal Teknologi Dirgantara 16, no. 1 (September 17, 2018): 23. http://dx.doi.org/10.30536/j.jtd.2018.v16.a2960.

Abstract:
The LAPAN-A3 satellite provides compressed multispectral data from the LISA sensor using real-time lossy compression. The compression of the multispectral data, with a radiometric resolution of 12 bits/pixel, is built from the Fourier transform and the use of a Huffman decoder with 514 binary-length codes. A problem arose in the data extraction process: decompression performance was very slow because the search for the code value in the Huffman table was done sequentially, bit by bit, over a data block 4000 pixels long. The data extraction time for one scene acquired over a 12-minute duration (one full path) takes up to 20 hours. This paper proposes a method to improve the LISA real-time lossy data decompression algorithm by grouping the bit codes in the Huffman decoding algorithm and using a pointer for reading data in the buffer memory. Using this method, the search for bit codes for all characters in the Huffman decoder is done in a regular fashion, so the search processing time is significantly reduced. The performance test used 6 data samples. The results showed that extraction time is on average 14 times faster. The lossy compression ratio is still in accordance with the design specification of the LISA sensor, namely less than 4 times, and the occurrence of the special character is very small, i.e. less than 0.5%.
44

Cha, Hyungtai, and Kwanghee Woo. "Efficient Multi-way Tree Search Algorithm for Huffman Decoder." International Journal of Fuzzy Logic and Intelligent Systems 4, no. 1 (June 1, 2004): 34–39. http://dx.doi.org/10.5391/ijfis.2004.4.1.034.

45

Lin, Yih-Kai, and Kuo-Liang Chung. "A space-efficient Huffman decoding algorithm and its parallelism." Theoretical Computer Science 246, no. 1-2 (September 2000): 227–38. http://dx.doi.org/10.1016/s0304-3975(99)00080-8.

46

Larmore, Lawrence L., and Daniel S. Hirschberg. "A fast algorithm for optimal length-limited Huffman codes." Journal of the ACM 37, no. 3 (July 1990): 464–73. http://dx.doi.org/10.1145/79147.79150.

47

Triana, Yaya Sudarya, and Astari Retnowardhani. "Blowfish algorithm and Huffman compression for data security application." IOP Conference Series: Materials Science and Engineering 453 (November 29, 2018): 012074. http://dx.doi.org/10.1088/1757-899x/453/1/012074.

48

Silitonga, Parasian D. P., Imka Ardianta Singarimbun, and Irene Sri Morina. "Wave File Encryption using Huffman Compression and Serpent Algorithm." International Journal of Computer Trends and Technology 64, no. 1 (October 25, 2018): 24–28. http://dx.doi.org/10.14445/22312803/ijctt-v64p106.

49

De Prisco, R. "On the Data Expansion of the Huffman Compression Algorithm." Computer Journal 41, no. 3 (March 1, 1998): 137–44. http://dx.doi.org/10.1093/comjnl/41.3.137.

50

Mesra, Hendra, Handayani Tjandrasa, and Chastine Fatichah. "New Lossless Compression Method using Cyclic Reversible Low Contrast Mapping (CRLCM)." International Journal of Electrical and Computer Engineering (IJECE) 6, no. 6 (December 1, 2016): 2836. http://dx.doi.org/10.11591/ijece.v6i6.12848.

Abstract:
In general, compression methods are developed to reduce the redundancy of data. This study uses a different approach: some bits of a datum in image data are embedded into another datum using a Reversible Low Contrast Mapping (RLCM) transformation. Besides using RLCM for embedding, the method also applies the properties of RLCM to compress the datum before it is embedded. In its algorithm, the proposed method employs a queue and recursive indexing. The algorithm encodes the data in a cyclic manner. In contrast to RLCM, the proposed method is a coding method like Huffman coding. This research uses publicly available image data to examine the proposed method. For all testing images, the proposed method has a higher compression ratio than Huffman coding.