
Journal articles on the topic 'Huffman algorithm'


Consult the top 50 journal articles for your research on the topic 'Huffman algorithm.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and the bibliographic reference to the chosen work will be generated automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Okazaki, Hiroyuki, Yuichi Futa, and Yasunari Shidama. "Constructing Binary Huffman Tree." Formalized Mathematics 21, no. 2 (2013): 133–43. http://dx.doi.org/10.2478/forma-2013-0015.

Abstract:
Huffman coding is one of the most famous entropy encoding methods for lossless data compression [16]. JPEG and ZIP formats employ variants of Huffman encoding as lossless compression algorithms. Huffman coding is a bijective map from source letters to leaves of the Huffman tree constructed by the algorithm. In this article we formalize an algorithm that constructs a binary code tree, the Huffman tree.
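For readers new to the topic, the construction this abstract formalizes, repeatedly merging the two lowest-frequency subtrees, can be sketched in Python. This is an illustrative sketch, not the paper's formalization; all names here are our own.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman tree over character frequencies and return a code table."""
    freq = Counter(text)
    # Each heap entry is (frequency, tie-breaker, tree), where a tree is
    # either a leaf character or a (left, right) pair of subtrees.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two subtrees with the smallest total frequency.
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            # Each source letter maps bijectively to one leaf, hence one code.
            codes[tree] = prefix or "0"  # single-symbol edge case
    walk(heap[0][2], "")
    return codes
```

Because merges always pick the rarest subtrees, more frequent characters end up closer to the root and therefore receive shorter codewords.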
2

Hasan, Syahril, Saiful Do Abdullah, and Arisandy Ambarita. "Penerapan Algoritma Huffman Coding Dalam Menghemat Ruang Penyimpanan Data Multimedia File (Teks dan Gambar) Berbasis Python." Jurnal Ilmiah ILKOMINFO - Ilmu Komputer & Informatika 7, no. 2 (2024): 118–27. http://dx.doi.org/10.47324/ilkominfo.v7i2.268.

Abstract:
Compression is a technique used to compress data so that it fits the size of the medium used, for purposes such as data backup, data transmission, and data security. The Huffman algorithm in text compression can produce a significant reduction in file size without loss of information; the Huffman tree structure and character encoding provide deep insight into how the algorithm works. The development method used is the Huffman algorithm, which begins with frequency collection, Huffman tree construction, Huffman code generation, compression table construction, and compr…
3

Priyono, Eko, and Hindayati Mustafidah. "Text Compression Using the Shannon-Fano, Huffman, and Half–Byte Algorithms." International Journal of Scientific Research and Management (IJSRM) 12, no. 09 (2024): 1422–27. http://dx.doi.org/10.18535/ijsrm/v12i09.ec01.

Abstract:
Background and Objectives: File sizes increase as technology advances. Large files require more storage memory and longer transfer times. Data compression changes input or original data into another data stream, as output or compressed data, which is smaller in size. Existing compression techniques include the Huffman, Shannon-Fano, and Half-Byte algorithms. Like other algorithms in computer science, these three algorithms each offer advantages and disadvantages. Therefore, testing is needed to determine which algorithm is most effective for data compression, especially of text data. Methods: Applying…
4

Gupta, Apratim. "Huffman Algorithm Improvement." International Journal of Advanced Research in Computer Science and Software Engineering 7, no. 6 (2017): 903–4. http://dx.doi.org/10.23956/ijarcsse/v7i6/0341.

5

Kairišs, Edgars, and Mihails Kijaško. "HUFFMAN COMPRESSION ALGORITHM." HUMAN. ENVIRONMENT. TECHNOLOGIES. Proceedings of the Students International Scientific and Practical Conference, no. 21 (April 19, 2017): 131–34. http://dx.doi.org/10.17770/het2017.21.3594.

Abstract:
In the modern IT world, we are returning to the problem of low storage space. Typical users may not see this problem, yet there is ever more data that companies collect and must store. In this work the authors analyze the Huffman compression algorithm: its effectiveness, working principles, and examples of usage.
6

Singh, Satpreet, and Harmandeep Singh. "Improved Adaptive Huffman Compression Algorithm." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 1, no. 1 (2011): 16–22. http://dx.doi.org/10.24297/ijct.v1i1.2602.

Abstract:
In the information age, sending data from one end to another requires a lot of space as well as time. Data compression is a technique to compress an information source (e.g., a data file, a speech signal, an image, or a video signal) into as few bits as possible. One of the major factors that influence a data compression technique is the procedure used to encode the source data and the space required for the encoded data. There are many data compression methods, of which Huffman is the most widely used. Huffman algorithms come in two variants, static and adaptive. Sta…
7

Zhan, Honghui. "Image compression and reconstruction based on GUI and Huffman coding." Journal of Physics: Conference Series 2580, no. 1 (2023): 012025. http://dx.doi.org/10.1088/1742-6596/2580/1/012025.

Abstract:
Huffman coding is an important part of image compression technology; the image compression platform here is based on a GUI, and Huffman coding is widely used. This paper introduces the basic principle of the Huffman algorithm, compares it with arithmetic coding and run-length encoding, and expounds the application of these three algorithms in JPEG compression. The AC algorithm combined block-based fine-texture models and adaptive arithmetic coding in the given example. The RLE algorithm used an automatic threshold, direction judgment, and selective value counts to improve its compression e…
8

Lamorahan, Christine, Benny Pinontoan, and Nelson Nainggolan. "Data Compression Using Shannon-Fano Algorithm." d'CARTESIAN 2, no. 2 (2013): 10. http://dx.doi.org/10.35799/dc.2.2.2013.3207.

Abstract:
In the world of information and communication technology, communication systems are known as data transfer systems. Sometimes the information received loses its authenticity, because the size of the data to be transferred exceeds the capacity of the media used. This problem can be reduced by applying a compression process to shrink the data to a smaller size. This study considers compression of text data using the Shannon-Fano algorithm and shows how effective this algorithm is at compression compared with the Huffman algorithm. This research shows that text data compress…
9

Liu, Xing Ke, Ke Chen, and Bin Li. "Huffman Coding and Applications in Compression for Vector Maps." Applied Mechanics and Materials 333-335 (July 2013): 718–22. http://dx.doi.org/10.4028/www.scientific.net/amm.333-335.718.

Abstract:
Huffman coding is a statistical lossless coding method with high efficiency. The principle and implementation of Huffman coding are discussed, and Huffman coding is applied to the compression of vector maps. The properties of the algorithm are discussed. Experiments demonstrate that the proposed algorithm can compress vector maps with high efficiency and no loss.
10

Fauzan, Mohamad Nurkamal, Muhammad Alif, and Cahyo Prianto. "Comparison of Huffman Algorithm and Lempel Ziv Welch Algorithm in Text File Compression." IT Journal Research and Development 7, no. 2 (2022): 155–69. http://dx.doi.org/10.25299/itjrd.2023.10437.

Abstract:
Data storage hardware has developed very rapidly over time. In line with this development, the amount of digital data shared on the internet increases every day. No matter how big our storage devices are, it is only a matter of time until that storage space is exhausted. Therefore, to maximize storage space, a technique called compression emerged. This study focuses on a comparative analysis of two lossless compression algorithms, namely the Huffman algorithm and Lempel-Ziv-Welch (LZW). A number of test f…
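For readers unfamiliar with the second algorithm in this comparison, a minimal LZW compressor can be sketched as follows. This is an illustrative sketch under our own assumptions, not the authors' implementation.

```python
def lzw_compress(data):
    """Classic LZW: grow a dictionary of seen substrings, emit dictionary indices."""
    # Start with all single bytes/characters as dictionary entries 0..255.
    table = {chr(i): i for i in range(256)}
    w, out = "", []
    for ch in data:
        wc = w + ch
        if wc in table:
            w = wc              # extend the current match
        else:
            out.append(table[w])        # emit the longest known prefix
            table[wc] = len(table)      # learn the new substring
            w = ch
    if w:
        out.append(table[w])
    return out
```

Unlike Huffman coding, LZW needs no frequency statistics up front: the dictionary is built adaptively as repeated substrings appear, which is why the two algorithms behave differently on different kinds of text.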
11

Nasif, Ammar, Zulaiha Ali Othman, and Nor Samsiah Sani. "The Deep Learning Solutions on Lossless Compression Methods for Alleviating Data Load on IoT Nodes in Smart Cities." Sensors 21, no. 12 (2021): 4223. http://dx.doi.org/10.3390/s21124223.

Abstract:
Networking is crucial for smart city projects nowadays, as it offers an environment where people and things are connected. This paper presents a chronology of factors in the development of smart cities, including IoT technologies as network infrastructure. Increasing numbers of IoT nodes lead to increasing data flow, which is a potential source of failure for IoT networks. The biggest challenge of IoT networks is that the IoT may have insufficient memory to handle all transaction data within the IoT network. We aim in this paper to propose a potential compression method for reducing IoT network data tra…
12

Kadhim, Doaa J., Mahmood F. Mosleh, and Faeza A. Abed. "Exploring Text Data Compression: A Comparative Study of Adaptive Huffman and LZW Approaches." BIO Web of Conferences 97 (2024): 00035. http://dx.doi.org/10.1051/bioconf/20249700035.

Abstract:
Data compression is a critical procedure in computer science that aims to minimize the size of data files while maintaining their vital information. It is extensively utilized in numerous applications, including communication, data storage, and multimedia transmission. In this work, we investigated the results of compressing four different text files with the Lempel-Ziv-Welch compression technique and Adaptive Huffman coding. The experiment used four text files: Arabic and English paragraphs and repeated Arabic and English characters. We measured bit rate, compression time, and decompression time…
13

Erdal, Erdal, and Atilla Ergüzen. "An Efficient Encoding Algorithm Using Local Path on Huffman Encoding Algorithm for Compression." Applied Sciences 9, no. 4 (2019): 782. http://dx.doi.org/10.3390/app9040782.

14

Wijaya, Bayu Angga, Sarwando Siboro, Mahendra Brutu, and Yelita Kristiani Lase. "Application of Huffman Algorithm and Unary Codes for Text File Compression." SinkrOn 7, no. 3 (2022): 1000–1007. http://dx.doi.org/10.33395/sinkron.v7i3.11567.

Abstract:
Data compression techniques are an important point in technological development. Compression of data in the form of text has many uses, including data transfer, copying, and backing up data. Given these uses, this aspect is important for data security. There are many compression techniques for data, including the Huffman algorithm and unary codes. One application will be implemented on text data that is widely used by digital actors to store important data. The data must not be accessible to unauthorized parties. Therefore, Huff…
15

Ma, Shaowen. "Comparison of image compression techniques using Huffman and Lempel-Ziv-Welch algorithms." Applied and Computational Engineering 5, no. 1 (2023): 793–801. http://dx.doi.org/10.54254/2755-2721/5/20230705.

Abstract:
Image compression technology is very popular in the field of image analysis because compressed images are convenient for storage and transmission. In this paper, the Huffman algorithm and the Lempel-Ziv-Welch (LZW) algorithm are introduced. They are widely used in the field of image compression, and the compressed image results of the two algorithms are calculated and compared. Based on the four dimensions of Compression Ratio (CR), Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR), and Bits Per Pixel (BPP), the applicable conditions of the two algorithms for compressing small image files…
16

Laurentinus, Laurentinus, Harrizki Arie Pradana, Dwi Yuny Sylfania, and Fransiskus Panca Juniawan. "Performance comparison of RSA and AES to SMS messages compression using Huffman algorithm." Jurnal Teknologi dan Sistem Komputer 8, no. 3 (2020): 171–77. http://dx.doi.org/10.14710/jtsiskom.2020.13468.

Abstract:
Improved security of short message services (SMS) can be obtained using cryptographic methods, both symmetric and asymmetric, but they must remain efficient. This paper aims to study the performance and efficiency of the symmetric AES-128 cipher and the asymmetric RSA cipher with message compression in securing SMS messages. The ciphertexts of RSA and AES were compressed using the Huffman algorithm. The average AES encryption time per character is faster than RSA's: 5.8 and 24.7 ms/character for AES and AES+Huffman encryption, versus 8.7 and 45.8 ms/character for RSA and RSA+Huffman, from…
17

Sinaga, Helbert, Poltak Sihombing, and Handrizal Handrizal. "Perbandingan Algoritma Huffman Dan Run Length Encoding Untuk Kompresi File Audio." Talenta Conference Series: Science and Technology (ST) 1, no. 1 (2018): 010–15. http://dx.doi.org/10.32734/st.v1i1.183.

Abstract:
This research was conducted to analyze and compare the compression and decompression results for *.mp3 and *.wav audio files. Compression is performed by reducing the number of bits required to store or send the file. In this research the authors use the Huffman and Run Length Encoding algorithms, which are lossless compression techniques. The Huffman algorithm has three stages for compressing data, namely tree construction, encoding, and decoding, and works character by character. The run-length technique, meanwhile, works on sequences of characters th…
18

E, Kanniga. "Huffman Algorithm for Secured Data Transmission." International Journal of Psychosocial Rehabilitation 23, no. 3 (2019): 456–63. http://dx.doi.org/10.37200/ijpr/v23i3/pr190143.

19

Lu, W. W., and M. P. Gough. "A fast-adaptive Huffman coding algorithm." IEEE Transactions on Communications 41, no. 4 (1993): 535–38. http://dx.doi.org/10.1109/26.223776.

20

Fredriksson, Kimmo, and Jorma Tarhio. "Efficient String Matching in Huffman Compressed Texts." Fundamenta Informaticae 63, no. 1 (2004): 1–16. https://doi.org/10.3233/fun-2004-63101.

Abstract:
We present an efficient algorithm for scanning Huffman compressed texts. The algorithm parses the compressed text in O(n⌈log₂σ⌉/b) time, where n is the size of the compressed text in bytes, σ is the size of the alphabet, and b is a user-specified parameter. The method uses a variable-size super-alphabet, with an average size of O(b/(H log₂σ)) characters, where H is the entropy of the text. Each super-character is processed in O(1) time. The algorithm uses O(2^b) space and O(b·2^b) preprocessing time. The method can be easily augmented by auxiliary functions, which can e.g. decompress the text…
21

Lin, Yih-Kai, Shu-Chien Huang, and Cheng-Hsing Yang. "A fast algorithm for Huffman decoding based on a recursion Huffman tree." Journal of Systems and Software 85, no. 4 (2012): 974–80. http://dx.doi.org/10.1016/j.jss.2011.11.1019.

22

Pattiasina, Timothy John. "ANALISA KODE HUFFMAN UNTUK KOMPRESI DATA TEKS." Teknika 1, no. 1 (2012): 1–12. http://dx.doi.org/10.34148/teknika.v1i1.1.

Abstract:
The Huffman algorithm is one of the oldest compression algorithms, devised by David Huffman in 1952. The algorithm is used for lossless compression, i.e., data compression in which not a single byte is lost, so the data remains intact and is stored exactly as the original. The working principle of the Huffman algorithm is to encode each character into a bit representation. The bit representation differs from character to character based on the frequency with which the character occurs: the more often the character appears, the shorter its bit representation. …
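The lossless property and the frequency principle described here can be illustrated with a toy prefix-free code table. The codewords below are hypothetical, chosen by hand for illustration, not taken from the cited paper.

```python
# A hand-picked prefix-free code table: more frequent characters get
# shorter codewords, as the Huffman principle requires.
CODES = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode(text, codes):
    return "".join(codes[ch] for ch in text)

def decode(bits, codes):
    # Invert the table; prefix-freeness guarantees unambiguous parsing.
    inv = {v: k for k, v in codes.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inv:          # a complete codeword has been read
            out.append(inv[buf])
            buf = ""
    return "".join(out)

msg = "abacada"
bits = encode(msg, CODES)
assert decode(bits, CODES) == msg   # lossless round trip
```

Because no codeword is a prefix of another, the decoder can split the bit stream unambiguously, which is exactly what makes the compression lossless.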
23

Shan, Yongxin, Xin Chen, Cao Qiu, and Ying Zhang. "Implementation of Fast Huffman Encoding Based on FPGA." Journal of Physics: Conference Series 2189, no. 1 (2022): 012021. http://dx.doi.org/10.1088/1742-6596/2189/1/012021.

Abstract:
In order to improve the efficiency of the JPEG encoding architecture, some image compression systems use pipeline technology. Different modules spend different numbers of cycles in image compression, so the pipeline is prone to blockage. This paper proposes a double-byte splicing output Huffman code architecture and double Huffman encoding units on an FPGA, so that the Huffman encoding process spends fewer clock cycles in JPEG encoding and the pipelined JPEG encoding algorithm can encode in real time without blocking. According to the FPGA verification, although the number of logic units used w…
24

Qerom, Mahmoud Al. "Improved intensity rounding and division near lossless image compression algorithm using delta encoding." International Journal of Data and Network Science 9, no. 1 (2025): 173–86. https://doi.org/10.5267/j.ijdns.2024.9.002.

Abstract:
This paper presents RIFD-DLT, an advanced near-lossless image compression algorithm that combines delta encoding with the original rounding-the-intensity-followed-by-division (RIFD) method. The RIFD method first minimizes the image intensities, which makes the subsequent compression stages more efficient. Delta encoding then subtracts neighboring rows in each of the image's three color matrices, using the proximity of pixel values in adjacent rows to further reduce the image intensity. Extensive investigations show that RIFD-DLT outperforms state-of-the-art algorithms and benchmarks wit…
25

Nwe, Ni Kyaw, Kyaw Naing Kyaw, and Myo Nwe Wai Myat. "Identification of Persons by Fingerprint using Huffman Coding Algorithm." International Journal of Trend in Scientific Research and Development 3, no. 5 (2019): 1208–11. https://doi.org/10.5281/zenodo.3590611.

Abstract:
The fingerprint system is one of the most common biometric features used for personal identification and verification. There are many techniques for extracting the features of a fingerprint when matching it against other fingerprint images. This system takes a fingerprint as input for the corresponding profile and converts it to a binary image. This image is then processed with the Huffman code approach to generate codes for the input fingerprint. The system holds profile information along with the associated fingerprints. When the system identifies a fingerprint, it displays the associated profil…
26

LEVCOPOULOS, CHRISTOS, and TERESA M. PRZYTYCKA. "A WORK-TIME TRADE-OFF IN PARALLEL COMPUTATION OF HUFFMAN TREES AND CONCAVE LEAST WEIGHT SUBSEQUENCE PROBLEM." Parallel Processing Letters 04, no. 01n02 (1994): 37–43. http://dx.doi.org/10.1142/s0129626494000053.

Abstract:
We present a parallel algorithm for the Concave Least Weight Subsequence (CLWS) problem that exhibits the following work-time trade-off: given a parameter p, the algorithm runs in [Formula: see text] time using p processors. By a known reduction of the Huffman Tree problem to the CLWS problem, we obtain the same complexity bounds for the Huffman Tree problem. However, as we show, for the latter problem there exists a simpler (and, in fact, slightly better) algorithm that exhibits a similar trade-off: namely, for a given parameter p, p≥1, the algorithm runs in [Formula: see text] time using p processors…
27

KARPINSKI, MAREK, LAWRENCE LARMORE, and YAKOV NEKRICH. "WORK-EFFICIENT ALGORITHMS FOR THE CONSTRUCTION OF LENGTH-LIMITED HUFFMAN CODES." Parallel Processing Letters 14, no. 01 (2004): 99–105. http://dx.doi.org/10.1142/s012962640400174x.

Abstract:
We present an algorithm for parallel construction of Huffman codes in [Formula: see text] time with p processors, where p>1, improving the previous result of Levcopoulos and Przytycka. We also show that a greedy Huffman tree can be constructed in [Formula: see text] time with n processors.
28

Lavanya, R., Kartik V. I., Madhusudhan G. K., Chinnu H., and Makasud A. T. "Design and Implementation of a High-Performance VLSI Architecture for Canonical Huffman Encoding." International Journal of Scientific Research in Engineering and Management 08, no. 11 (2024): 1–5. https://doi.org/10.55041/ijsrem39310.

Abstract:
A key component of contemporary computer systems, data compression makes it possible for information to be stored and transmitted efficiently. A popular technique for lossless data compression, Huffman coding can drastically cut down on data size without sacrificing integrity. But conventional Huffman encoding techniques can have scalability and speed issues, especially when used in hardware. A high-throughput Very Large-Scale Integration (VLSI) design for a Canonical Huffman Encoder is shown in this study. To accomplish quick and effective encoding, the suggested approach makes use of paralle…
29

Maharjan, Shyam, Sujan Poudel, and Dipesh Tandukar. "A Comparative Study of Text-Based Lossless Compression." American Journal of Smart Technology and Solutions 3, no. 2 (2024): 34–39. http://dx.doi.org/10.54536/ajsts.v3i2.3566.

Abstract:
Lossless data compression is a critical technique used to reduce file sizes without any loss of information during the encoding and decoding processes. This study presents a comparative analysis of two widely used lossless compression algorithms: Huffman encoding and Lempel-Ziv-Welch (LZW). The primary objective is to evaluate the performance of these algorithms in terms of compression ratio, compression time, decompression time, and space savings. The analysis was conducted on 100 files of varying sizes. The results demonstrate that the LZW algorithm outperforms Huffman encoding, offering sup…
30

Ghule, Swati D. "Evaluation of Huffman Algorithm for Compression Standards." Bioscience Biotechnology Research Communications 14, no. 5 (2021): 44–46. http://dx.doi.org/10.21786/bbrc/14.5/9.

31

Folorunso, O., H. O. D. Longe, A. C. Ikekwere, and S. K. Sharma. "A Framework for Encrypting the Huffman Algorithm." Journal of Applied Sciences 6, no. 5 (2006): 1138–41. http://dx.doi.org/10.3923/jas.2006.1138.1141.

32

Tariq, Hamza, and Heebah Saleem. "Improved Image Steganography Algorithm using Huffman Codes." International Journal of Computer Applications 147, no. 12 (2016): 1–4. http://dx.doi.org/10.5120/ijca2016911242.

33

Huang, H. C., and J. L. Wu. "Windowed Huffman coding algorithm with size adaptation." IEE Proceedings I Communications, Speech and Vision 140, no. 2 (1993): 109. http://dx.doi.org/10.1049/ip-i-2.1993.0015.

34

Rahman, Md Atiqur, and Mohamed Hamada. "Burrows–Wheeler Transform Based Lossless Text Compression Using Keys and Huffman Coding." Symmetry 12, no. 10 (2020): 1654. http://dx.doi.org/10.3390/sym12101654.

Abstract:
Text compression is one of the most significant research fields, and various algorithms for text compression have already been developed. This is a significant issue, as the use of internet bandwidth is increasing considerably. This article proposes a Burrows–Wheeler transform and pattern-matching-based lossless text compression algorithm that uses Huffman coding in order to achieve an excellent compression ratio. In this article, we introduce an algorithm with two keys that are used to reduce the more frequently repeated characters after the Burrows–Wheeler transform. We then find patter…
35

Abdulmonim, Dhafer Abdulameer, and Zainab Hassan Muhamad. "Improvement of Lossless Text Compression Methods using a Hybrid Method by the Integration of RLE, LZW and Huffman Coding Algorithms." International Journal of Software Engineering & Applications 15, no. 5 (2024): 17–27. http://dx.doi.org/10.5121/ijsea.2024.15502.

Abstract:
In the field of computer science, data compression is essential in the process of data transfer because it reduces file size without losing information. Issues that depend on the characteristics of the text can greatly affect the effectiveness of any compression algorithm. Consequently, each traditional compression algorithm has its own strengths and limitations. Therefore, a highly efficient compression method that achieves a higher compression ratio is needed. In this paper, based on the sequential implementation of the Huffman, RLE, and LZW algorithms, a hybrid data compression method has been pro…
36

Bhanu, Nageswaran Usha, Prathaban Banu Priya, Tiruveedhula Sajana, et al. "Dingo algorithm-based forwarder selection and huffman coding to improve authentication." Indonesian Journal of Electrical Engineering and Computer Science 32, no. 1 (2023): 432. http://dx.doi.org/10.11591/ijeecs.v32.i1.pp432-440.

Abstract:
In a wireless sensor network (WSN), the high volume of observed and transmitted data among sensor nodes makes it necessary to maintain security. Even though numerous secure data transmission approaches have been designed, inadequate resources and the complex environment mean many cannot be used in WSNs. Moreover, secure data communication is a big challenge in WSNs, especially for military applications. This paper proposes a dingo algorithm-based forwarder selection and Huffman coding (DAHC) scheme to improve authentication in internet of things (IoT) WSNs. Initially, it de…
38

Liu, Yong, Bing Li, Yan Zhang, and Xia Zhao. "A Huffman-Based Joint Compression and Encryption Scheme for Secure Data Storage Using Physical Unclonable Functions." Electronics 10, no. 11 (2021): 1267. http://dx.doi.org/10.3390/electronics10111267.

Abstract:
With the development of Internet of Things (IoT) and cloud-computing technologies, cloud servers need to store a huge volume of IoT data with high throughput and robust security. The Joint Compression and Encryption (JCAE) scheme based on the Huffman algorithm has been regarded as a promising technology for enhancing data storage. Existing JCAE schemes still have the following limitations: (1) the keys in the JCAE can be cracked by physical and cloning attacks; (2) rebuilding the Huffman tree reduces operational efficiency; (3) the compression ratio should be further improved. In t…
39

Xin, Dai, and Hao Xue. "EFFICIENT FREQUENT ITEMSET DISCOVERY THROUGH HIERARCHICAL HUFFMAN ENCODING." ICTACT Journal on Soft Computing 15, no. 4 (2025): 3682–87. https://doi.org/10.21917/ijsc.2025.0510.

Abstract:
Frequent itemset mining holds a crucial position in the field of data mining; however, traditional algorithms like Apriori and FP-Growth often encounter efficiency and memory-consumption issues when handling large-scale datasets, which not only makes them difficult to adapt to dynamic dataset changes in some situations but also limits their widespread use in practical applications. Therefore, a novel DTFIMA (Dynamic Tiered Frequent Itemset Mining Algorithm) is proposed in this paper to address optimization problems related to the storage and searching of frequent itemsets…
40

Liu, Yu, Jie Hui Zeng, and Xiao Yang Liu. "Study on a Hybrid Algorithm of Navigation Radar Echo Data Compression." Applied Mechanics and Materials 543-547 (March 2014): 2796–99. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.2796.

Abstract:
Given the conflict between the storage space of the ship navigation radar system and the large volume of raw echo data and files, a mixed compression algorithm of bitmap compression + half-byte compression + double Huffman compression is presented, based on the characteristics of text-type data. On the basis of the original half-byte compression algorithm, the method combines bitmap compression with an improved Huffman compression algorithm. Using only the half-byte compression algorithm, the compression ratio can only reach 50%; while using the hybrid compression, the total compression…
41

Rysak, Paweł. "Comparative analysis of code execution time by C and Python based on selected algorithms." Journal of Computer Sciences Institute 26 (March 30, 2023): 93–99. http://dx.doi.org/10.35784/jcsi.3109.

Abstract:
The article deals with a comparative analysis of the speed of execution of code written in C and in Python. In order to determine whether a scripting language can match the performance of a compiled language, the languages were compared using the following algorithms: an algorithm solving the Tower of Hanoi problem, the Huffman encoding algorithm, and an algorithm converting numbers into text. Each of the listed algorithms was implemented in both languages. The execution time of the programs was then measured, and the results obtained show that the C languag…
42

ANDROSHCHUK, O., O. NAHREBETSKYY, V. ORLENKO, V. CHESHUN, and A. KATAIEVA. "BASIC OPERATIONS OF SHIFT CODE FORMATION ALGORITHM USING HUFFMAN ENTROPY CODING." Herald of Khmelnytskyi National University. Technical sciences 291, no. 6 (2020): 7–12. https://doi.org/10.31891/2307-5732-2020-291-6-7-12.

Full text
Abstract:
The complexity and relevance of the tasks of cryptographic protection of information in the context of the increased value of information resources in the cyberspace causes interest in improving existing encryption algorithms and developing new ones. The paper presents the results of a study of the characteristic features of the method for increasing the cryptographic strength of encryption algorithms by modifying the input data using the methods of optimal entropy uneven coding using the example of Caesar replacement ciphers and optimal Huffman coding. Based on the results of the analysis, a
APA, Harvard, Vancouver, ISO, and other styles
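One property underlying the pairing of a Caesar cipher with entropy coding can be demonstrated quickly: a Caesar shift only permutes symbols, so the frequency distribution, and hence the entropy and the optimal Huffman code lengths, are unchanged. A minimal sketch (my own illustration, not the paper's algorithm):

```python
from collections import Counter
from math import log2

def caesar(text: str, shift: int) -> str:
    """Shift lowercase letters cyclically through the alphabet."""
    a = ord("a")
    return "".join(chr((ord(ch) - a + shift) % 26 + a) for ch in text)

def entropy(text: str) -> float:
    """Shannon entropy in bits per symbol of the empirical distribution."""
    n = len(text)
    return -sum(f / n * log2(f / n) for f in Counter(text).values())

plain = "attackatdawn"
cipher = caesar(plain, 3)
print(cipher)  # "dwwdfndwgdzq"
assert abs(entropy(plain) - entropy(cipher)) < 1e-12  # shift permutes symbols only
```

This invariance is exactly what makes entropy coding a useful pre-processing step in the scheme described: it reshapes the symbol statistics that a simple substitution cipher leaves exposed.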
43

Zhu, Yaohua, Mingsheng Huang, Yanghang Zhu, and Yong Zhang. "A Low-Complexity Lossless Compression Method Based on a Code Table for Infrared Images." Applied Sciences 15, no. 5 (2025): 2826. https://doi.org/10.3390/app15052826.

Full text
Abstract:
Traditional JPEG-series image compression algorithms have limitations in speed. To improve the storage and transmission of the 14-bit/pixel images acquired by infrared line-scan detectors, a novel method is introduced for high-speed, highly efficient compression of line-scan infrared images. The proposed method exploits the features of infrared images to reduce image redundancy and employs improved Huffman coding for entropy coding. The improved Huffman coding addresses the long codewords assigned to low-probability symbols in 14-bit images by truncating long codes, which results in low complexity and minim
APA, Harvard, Vancouver, ISO, and other styles
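The truncation idea, replacing rare symbols' long codewords with a shared escape code followed by a fixed-width raw index, can be sketched as follows. This is an illustrative escape-based scheme under my own assumptions, not the paper's coder:

```python
import heapq

def truncated_codes(freqs: dict[str, int], threshold: int) -> dict[str, str]:
    """Huffman-code frequent symbols; rare symbols share one ESC codeword
    followed by a fixed-width index, bounding the longest emitted code."""
    common = {s: f for s, f in freqs.items() if f >= threshold}
    rare = sorted(s for s in freqs if freqs[s] < threshold)
    raw_width = max(1, (len(rare) - 1).bit_length()) if rare else 0
    alphabet = dict(common)
    if rare:  # ESC competes in the tree with the rare symbols' total frequency
        alphabet["ESC"] = sum(freqs[s] for s in rare)
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(alphabet.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    codes = heap[0][2]
    for i, s in enumerate(rare):  # rare symbol = ESC prefix + raw index bits
        codes[s] = codes["ESC"] + format(i, f"0{raw_width}b")
    codes.pop("ESC", None)
    return codes

freqs = {"a": 50, "b": 30, "c": 2, "d": 1, "e": 1}
print(truncated_codes(freqs, threshold=5))
```

However skewed the tail of the distribution, no emitted code can exceed the ESC codeword length plus the fixed raw width, which is the complexity bound the abstract is after.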
44

Soltani, Mohammad, and Amid Khatibi Bardsiri. "A New Secure Hybrid Algorithm for QR-Code Images Encryption and Steganography." APTIKOM Journal on Computer Science and Information Technologies 2, no. 2 (2017): 86–96. http://dx.doi.org/10.11591/aptikom.j.csit.109.

Full text
Abstract:
Encryption is a method for protecting useful information, used for security purposes, and steganography is the art of hiding the fact that communication is taking place by hiding information within other information. In this article, a plain-text message carrying security information is first converted to a Quick Response (QR) code image, and then a new secure hybrid algorithm is proposed for the encryption and steganography of the generated QR code. In this article, image encryption is based on a two-dimensional logistic chaotic map and the AES algorithm, and the steganography techn
APA, Harvard, Vancouver, ISO, and other styles
45

Dang, Fangbin. "Image lossless compression algorithm optimization and FPGA implementation." Frontiers in Computing and Intelligent Systems 3, no. 2 (2023): 51–57. http://dx.doi.org/10.54097/fcis.v3i2.7194.

Full text
Abstract:
In this thesis, a minimum-redundancy prefix coding with a higher compression ratio and lower time complexity is proposed for the lossless compression of HD images. The compression algorithm is based on canonical Huffman coding; it preprocesses the data source according to the image data features and then compresses the data in batches, exploiting the locally uneven features in the data, which improves the compression ratio by a factor of 1.678 compared with traditional canonical Huffman coding. During the implementation of the algorithm, the counting-sort method with lower time complexity a
APA, Harvard, Vancouver, ISO, and other styles
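Canonical Huffman coding, which the abstract above builds on, derives every codeword from the sorted list of code lengths alone, so a decoder needs only the lengths rather than the whole tree. A minimal sketch of the standard assignment rule (not this thesis's FPGA implementation):

```python
def canonical_codes(lengths: dict[str, int]) -> dict[str, str]:
    """Assign canonical codewords from per-symbol code lengths alone.
    Symbols are ordered by (length, symbol); each code is the previous
    code plus one, left-shifted whenever the length increases."""
    code = 0
    prev_len = 0
    out = {}
    for length, sym in sorted((l, s) for s, l in lengths.items()):
        code <<= length - prev_len      # grow to the new length
        out[sym] = format(code, f"0{length}b")
        code += 1
        prev_len = length
    return out

codes = canonical_codes({"a": 1, "b": 2, "c": 3, "d": 3})
print(codes)  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```

Because the sort on (length, symbol) is the only data-dependent step, swapping in counting sort, as the thesis does, drops that step to linear time.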
46

Shaikh, A. A., and P. P. Gadekar. "Huffman Coding Technique for Image Compression." COMPUSOFT: An International Journal of Advanced Computer Technology 04, no. 04 (2015): 1585–87. https://doi.org/10.5281/zenodo.14771964.

Full text
Abstract:
Image compression is one of the most important steps in image transmission and storage. "A picture is worth more than a thousand words" is a common saying. Images play an indispensable role in representing vital information and need to be saved for further use or transmitted over a medium. In order to make efficient use of disk space and transmission rate, images need to be compressed. Image compression is the technique of reducing the file size of an image without degrading image quality below an acceptable level. Image compression has been used for a long time
APA, Harvard, Vancouver, ISO, and other styles
47

Parker, D. Stott. "Erratum: Conditions for Optimality of the Huffman Algorithm." SIAM Journal on Computing 27, no. 1 (1998): 317. http://dx.doi.org/10.1137/s0097539797328550.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Hong-Chung, Chen, Wang Yue-Li, and Lan Yu-Feng. "A memory-efficient and fast Huffman decoding algorithm." Information Processing Letters 69, no. 3 (1999): 119–22. http://dx.doi.org/10.1016/s0020-0190(99)00002-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
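A common memory/speed trade-off in fast Huffman decoding, and a generic sketch rather than the specific algorithm of the paper above, is a lookup table indexed by the next k input bits, so each symbol is decoded with a single table access instead of a bit-by-bit tree walk:

```python
def build_decode_table(codes: dict[str, str]):
    """Single-lookup decode table indexed by the next `width` bits, where
    width = longest codeword; each entry is (symbol, bits consumed)."""
    width = max(len(c) for c in codes.values())
    table = [None] * (1 << width)
    for sym, code in codes.items():
        pad = width - len(code)
        base = int(code, 2) << pad
        for tail in range(1 << pad):     # every completion of this codeword
            table[base | tail] = (sym, len(code))
    return table, width

def decode(bits: str, table, width):
    out, pos = [], 0
    while pos < len(bits):
        chunk = bits[pos:pos + width].ljust(width, "0")  # pad final chunk
        sym, used = table[int(chunk, 2)]
        out.append(sym)
        pos += used                      # advance only by the bits consumed
    return "".join(out)

codes = {"a": "0", "b": "10", "c": "11"}
table, width = build_decode_table(codes)
print(decode("010110", table, width))  # "abca"
```

The table has 2^width entries, which is exactly the memory cost that papers like the one cited above try to reduce while keeping the constant-time lookup.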
49

Ponalagusa, R., E. Kannan, and Michael Arock. "A Huffman Decoding Algorithm in Mobile Robot Platform." Information Technology Journal 6, no. 5 (2007): 776–79. http://dx.doi.org/10.3923/itj.2007.776.779.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Amusa, Kamoli, Adeoluwawale Adewusi, Tolulope Erinosho, Sule Salawu, and David Odufejo. "On the application of wavelet transform and Huffman algorithm to Yorùbá language syntax text files compression." Serbian Journal of Electrical Engineering 19, no. 3 (2022): 351–68. http://dx.doi.org/10.2298/sjee2203351a.

Full text
Abstract:
Most data compression algorithms were developed with English as the target text syntax. This paper, however, approaches the problem of Yorùbá text file compression via the Discrete Wavelet Transform (DWT) and the Huffman algorithm. Text files in Yorùbá language syntax are first converted into a signal format that is then decomposed using the DWT. The decomposed ASCII code representation of the text files is subsequently encoded using the Huffman algorithm. Twenty different variants of DWTs taken from four families of wavelet filters (Haar, Daubechies, Symlets and bi-orthogonal) are consider
APA, Harvard, Vancouver, ISO, and other styles
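The decomposition step described above can be illustrated with one level of the Haar DWT, the simplest member of the wavelet families listed. This is my own sketch of the generic transform, not the paper's code: character codes serve as the signal, and the resulting low-variance detail coefficients are what make the output friendly to Huffman coding.

```python
# One level of the Haar DWT: pairwise averages (approximation) and
# pairwise half-differences (detail). Assumes an even-length signal.
def haar_level(signal: list[float]) -> tuple[list[float], list[float]]:
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

# Character codes of a short text become the input signal.
sig = [float(ord(c)) for c in "abab"]   # [97.0, 98.0, 97.0, 98.0]
approx, detail = haar_level(sig)
print(approx, detail)   # [97.5, 97.5] [-0.5, -0.5]
```

Repetitive text collapses to highly repetitive detail coefficients, which is precisely the skewed distribution Huffman coding exploits.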