Academic literature on the topic 'Huffman’s algorithm'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Huffman’s algorithm.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Huffman’s algorithm"

1

Blanchette, Jasmin Christian. "Proof Pearl: Mechanizing the Textbook Proof of Huffman’s Algorithm." Journal of Automated Reasoning 43, no. 1 (February 13, 2009): 1–18. http://dx.doi.org/10.1007/s10817-009-9116-y.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

García, Jesús, Verónica González-López, Gustavo Tasca, and Karina Yaginuma. "An Efficient Coding Technique for Stochastic Processes." Entropy 24, no. 1 (December 30, 2021): 65. http://dx.doi.org/10.3390/e24010065.

Full text
Abstract:
In the framework of coding theory, under the assumption of a Markov process (Xt) on a finite alphabet A, the compressed representation of the data consists of a description of the model used to code the data and the encoded data. Given the model, Huffman's algorithm is optimal in the number of bits needed to encode the data. On the other hand, modeling (Xt) through a Partition Markov Model (PMM) reduces the number of transition probabilities needed to define the model. This paper shows how using a Huffman code with a PMM reduces the number of bits needed in this process. We prove that the estimation of a PMM allows for estimating the entropy of (Xt), providing an estimator of the minimum expected codeword length per symbol. We show the efficiency of the new methodology in a simulation study and in a real problem of compression of DNA sequences of SARS-CoV-2, obtaining on the real data a reduction of at least 10.4%.
APA, Harvard, Vancouver, ISO, and other styles
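As background for the entry above: given a fixed probability model for the source symbols, Huffman's algorithm builds an optimal prefix code by repeatedly merging the two least probable subtrees. The sketch below (plain Python, with an invented toy distribution rather than the paper's Partition Markov Model) builds such a code and compares its expected codeword length with the Shannon entropy, the lower bound the abstract refers to.

import heapq
import itertools
import math

def huffman_code(probs):
    """Build a binary prefix code for a {symbol: probability} model."""
    if len(probs) == 1:
        return {next(iter(probs)): "0"}          # degenerate one-symbol source
    tiebreak = itertools.count()
    heap = [(p, next(tiebreak), [s]) for s, p in probs.items()]
    heapq.heapify(heap)
    code = {s: "" for s in probs}
    while len(heap) > 1:
        p1, _, group1 = heapq.heappop(heap)      # two least probable subtrees
        p2, _, group2 = heapq.heappop(heap)
        for s in group1:                         # one subtree gets a leading 0 ...
            code[s] = "0" + code[s]
        for s in group2:                         # ... the other a leading 1
            code[s] = "1" + code[s]
        heapq.heappush(heap, (p1 + p2, next(tiebreak), group1 + group2))
    return code

if __name__ == "__main__":
    model = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # invented toy model
    code = huffman_code(model)
    expected_len = sum(p * len(code[s]) for s, p in model.items())
    entropy = -sum(p * math.log2(p) for p in model.values())
    print(code)   # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    print(f"E[length] = {expected_len:.3f} bits/symbol, H(X) = {entropy:.3f} bits/symbol")

For this dyadic toy model the expected length equals the entropy exactly; for general distributions the Huffman average sits between H(X) and H(X) + 1 bits per symbol.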
3

Okazaki, Hiroyuki, Yuichi Futa, and Yasunari Shidama. "Constructing Binary Huffman Tree." Formalized Mathematics 21, no. 2 (June 1, 2013): 133–43. http://dx.doi.org/10.2478/forma-2013-0015.

Full text
Abstract:
Summary: Huffman coding is one of the most famous entropy encoding methods for lossless data compression [16]. JPEG and ZIP formats employ variants of Huffman encoding as lossless compression algorithms. Huffman coding is a bijective map from source letters to leaves of the Huffman tree constructed by the algorithm. In this article we formalize an algorithm for constructing a binary code tree, the Huffman tree.
APA, Harvard, Vancouver, ISO, and other styles
4

Nasif, Ammar, Zulaiha Ali Othman, and Nor Samsiah Sani. "The Deep Learning Solutions on Lossless Compression Methods for Alleviating Data Load on IoT Nodes in Smart Cities." Sensors 21, no. 12 (June 20, 2021): 4223. http://dx.doi.org/10.3390/s21124223.

Full text
Abstract:
Networking is crucial for smart city projects nowadays, as it offers an environment where people and things are connected. This paper presents a chronology of factors in the development of smart cities, including IoT technologies as network infrastructure. Increasing the number of IoT nodes increases data flow, which is a potential source of failure for IoT networks: the biggest challenge is that IoT devices may have insufficient memory to handle all transaction data within the IoT network. In this paper we aim to propose a potential compression method for reducing IoT network data traffic. We therefore investigate various lossless compression algorithms, such as entropy-based and dictionary-based algorithms, as well as general compression methods, to determine which algorithm or method adheres to IoT specifications. Furthermore, this study conducts compression experiments using entropy coders (Huffman, adaptive Huffman) and dictionary coders (LZ77, LZ78) on five different types of IoT data traffic datasets. Although all of these algorithms can alleviate IoT data traffic, adaptive Huffman gave the best compression. We therefore propose a conceptual compression method for IoT data traffic that improves adaptive Huffman using deep learning concepts, namely weights, pruning, and pooling in the neural network. The proposed algorithm is expected to obtain a better compression ratio. We also discuss the challenges of applying the proposed algorithm to IoT data compression, given the limitations of IoT memory and processors, so that it can later be implemented in IoT networks.
APA, Harvard, Vancouver, ISO, and other styles
5

Lamorahan, Christine, Benny Pinontoan, and Nelson Nainggolan. "Data Compression Using Shannon-Fano Algorithm." d'CARTESIAN 2, no. 2 (October 1, 2013): 10. http://dx.doi.org/10.35799/dc.2.2.2013.3207.

Full text
Abstract:
Abstract: Communication systems in the world of information and communication technology are known as data transfer systems. Sometimes the information received loses its authenticity because the size of the data to be transferred exceeds the capacity of the medium used. This problem can be reduced by applying compression to shrink the data to a smaller size. This study considers compression of text data using the Shannon-Fano algorithm and shows how effective the algorithm is compared with the Huffman algorithm. The results show that Shannon-Fano compression is as effective as Huffman compression when every character in the string is repeated, or when the statement is short and only one character in it is repeated, while the Shannon-Fano algorithm is more effective than the Huffman algorithm when the statement is long and the text contains a larger and more varied set of characters. Keywords: Data compression, Huffman algorithm, Shannon-Fano algorithm
APA, Harvard, Vancouver, ISO, and other styles
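For readers comparing the two methods in the entry above: Shannon-Fano coding works top-down, sorting symbols by frequency and recursively splitting the list into two parts of roughly equal total weight, assigning 0 to one part and 1 to the other, whereas Huffman builds its tree bottom-up. A minimal sketch of the top-down split, on an invented example string rather than the paper's test data, is given below.

from collections import Counter

def shannon_fano(weighted):
    """weighted: list of (symbol, count) pairs sorted by descending count.
    Returns {symbol: bitstring} built by recursive near-equal-weight splits."""
    codes = {}

    def split(group, prefix):
        if len(group) == 1:
            codes[group[0][0]] = prefix or "0"
            return
        total = sum(c for _, c in group)
        running, cut = 0, 1
        for i, (_, c) in enumerate(group[:-1], start=1):
            running += c
            cut = i
            if running >= total / 2:             # stop once the first part holds about half the weight
                break
        split(group[:cut], prefix + "0")
        split(group[cut:], prefix + "1")

    split(weighted, "")
    return codes

if __name__ == "__main__":
    text = "shannon fano versus huffman"         # invented example
    freq = Counter(text).most_common()            # sorted by descending count
    codes = shannon_fano(freq)
    total_bits = sum(count * len(codes[sym]) for sym, count in freq)
    print(codes)
    print(f"{total_bits} bits vs {8 * len(text)} bits uncompressed")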
6

Singh, Satpreet, and Harmandeep Singh. "Improved Adaptive Huffman Compression Algorithm." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 1, no. 1 (December 30, 2011): 16–22. http://dx.doi.org/10.24297/ijct.v1i1.2602.

Full text
Abstract:
In the information age, sending data from one end to another needs a lot of space as well as time. Data compression is a technique to compress an information source (e.g., a data file, a speech signal, an image, or a video signal) into as few bits as possible. Major factors that influence a data compression technique are the procedure used to encode the source data and the space required for the encoded data. Among the many data compression methods, Huffman coding is the most widely used. Huffman algorithms come in two variants, static and adaptive. The static Huffman algorithm encodes the data in two passes: the first pass calculates the frequency of each symbol, and the second pass constructs the Huffman tree. The adaptive Huffman algorithm builds on the Huffman algorithm and constructs the Huffman tree on the fly, but takes more space than the static Huffman algorithm. This paper introduces a new data compression algorithm based on Huffman coding. This algorithm not only reduces the number of passes but also reduces the storage space compared to the adaptive Huffman algorithm, and is comparable to the static one.
APA, Harvard, Vancouver, ISO, and other styles
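To make the two-pass idea in the abstract above concrete: a static Huffman coder first scans the data to count symbol frequencies, then builds the tree and encodes in a second pass, while an adaptive coder updates its model as it encodes. The sketch below shows only the plain static, two-pass variant on an invented string; it is not the improved algorithm proposed in the paper.

import heapq
import itertools
from collections import Counter

class Node:
    __slots__ = ("sym", "left", "right")
    def __init__(self, sym=None, left=None, right=None):
        self.sym, self.left, self.right = sym, left, right

def build_tree(freq):
    """Turn pass-1 symbol counts into a Huffman tree."""
    tiebreak = itertools.count()
    heap = [(count, next(tiebreak), Node(sym=s)) for s, count in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        c1, _, n1 = heapq.heappop(heap)
        c2, _, n2 = heapq.heappop(heap)
        heapq.heappush(heap, (c1 + c2, next(tiebreak), Node(left=n1, right=n2)))
    return heap[0][2]

def make_codes(node, prefix="", table=None):
    table = {} if table is None else table
    if node.sym is not None:
        table[node.sym] = prefix or "0"          # guard for a one-symbol source
    else:
        make_codes(node.left, prefix + "0", table)
        make_codes(node.right, prefix + "1", table)
    return table

def decode(bits, root):
    out, node = [], root
    for b in bits:
        node = node.left if b == "0" else node.right
        if node.sym is not None:                 # reached a leaf: emit and restart
            out.append(node.sym)
            node = root
    return "".join(out)

if __name__ == "__main__":
    text = "static huffman needs two passes"     # invented example
    root = build_tree(Counter(text))             # pass 1: count, then build the tree
    codes = make_codes(root)
    bits = "".join(codes[ch] for ch in text)     # pass 2: encode
    assert decode(bits, root) == text
    print(f"{len(bits)} bits vs {8 * len(text)} bits uncompressed")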
7

Gupta, Apratim. "Huffman Algorithm Improvement." International Journal of Advanced Research in Computer Science and Software Engineering 7, no. 6 (June 30, 2017): 903–4. http://dx.doi.org/10.23956/ijarcsse/v7i6/0341.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Rahman, Md Atiqur, and Mohamed Hamada. "Burrows–Wheeler Transform Based Lossless Text Compression Using Keys and Huffman Coding." Symmetry 12, no. 10 (October 10, 2020): 1654. http://dx.doi.org/10.3390/sym12101654.

Full text
Abstract:
Text compression is one of the most significant research fields, and various algorithms for text compression have already been developed. This is a significant issue, as the use of internet bandwidth is considerably increasing. This article proposes a Burrows–Wheeler transform and pattern matching-based lossless text compression algorithm that uses Huffman coding in order to achieve an excellent compression ratio. In this article, we introduce an algorithm with two keys that are used in order to reduce more frequently repeated characters after the Burrows–Wheeler transform. We then find patterns of a certain length from the reduced text and apply Huffman encoding. We compare our proposed technique with state-of-the-art text compression algorithms. Finally, we conclude that the proposed technique demonstrates a gain in compression ratio when compared to other compression techniques. A small problem with our proposed method is that it does not work very well for symmetric communications like Brotli.
APA, Harvard, Vancouver, ISO, and other styles
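As a rough illustration of the first stage mentioned in the abstract above: the Burrows-Wheeler transform sorts all rotations of the input and keeps the last column, which tends to cluster equal characters so that a later Huffman stage compresses better. The deliberately naive sketch below only shows the transform itself; it is not the paper's method, which additionally uses two keys and pattern matching.

def bwt(text, end_marker="\0"):
    """Naive Burrows-Wheeler transform: sort all rotations, keep the last column.
    Real implementations use suffix arrays instead of materializing rotations."""
    assert end_marker not in text
    s = text + end_marker                        # unique terminator makes the transform invertible
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rotation[-1] for rotation in rotations)

def inverse_bwt(last_column, end_marker="\0"):
    """Invert by repeatedly prepending the last column and re-sorting."""
    table = [""] * len(last_column)
    for _ in range(len(last_column)):
        table = sorted(ch + row for ch, row in zip(last_column, table))
    full = next(row for row in table if row.endswith(end_marker))
    return full[:-1]                             # drop the terminator

if __name__ == "__main__":
    sample = "banana_bandana_banana"             # invented example
    transformed = bwt(sample)
    print(repr(transformed))                     # runs of equal letters are now clustered
    assert inverse_bwt(transformed) == sample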
9

Erdal, Erdal, and Atilla Ergüzen. "An Efficient Encoding Algorithm Using Local Path on Huffman Encoding Algorithm for Compression." Applied Sciences 9, no. 4 (February 22, 2019): 782. http://dx.doi.org/10.3390/app9040782.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Liu, Xing Ke, Ke Chen, and Bin Li. "Huffman Coding and Applications in Compression for Vector Maps." Applied Mechanics and Materials 333-335 (July 2013): 718–22. http://dx.doi.org/10.4028/www.scientific.net/amm.333-335.718.

Full text
Abstract:
Huffman coding is a statistical lossless coding method with high efficiency. The principle and implementation of Huffman coding are discussed, and Huffman coding is applied to the compression of vector maps. The properties of the algorithm are analysed. Experiments demonstrate that the proposed algorithm can compress vector maps with high efficiency and no loss.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Huffman’s algorithm"

1

Хованська, Т. А. "Проблеми створення і стиснення великих інформаційних сховищ і складів даних" [Problems of creating and compressing large information repositories and data warehouses]. Master's thesis, Сумський державний університет (Sumy State University), 2019. http://essuir.sumdu.edu.ua/handle/123456789/76429.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Devulapalli, Venkata Lakshmi Narasimha. "Application of Huffman Data Compression Algorithm in Hashing Computation." TopSCHOLAR®, 2018. https://digitalcommons.wku.edu/theses/2614.

Full text
Abstract:
Cryptography is the art of protecting information by encrypting the original message into an unreadable format. A cryptographic hash function takes a text message of arbitrary length as input and converts it into a fixed-length output that is infeasible to invert. The values returned by the hash function are called the message digest or simply hash values. Because of their versatility, hash functions are used in many applications such as message authentication, digital signatures, and password hashing [Thomsen and Knudsen, 2005]. The purpose of this study is to apply the Huffman data compression algorithm to the SHA-1 hash function in cryptography. The Huffman algorithm is an optimal prefix-code compression algorithm in which the frequencies of the letters are used to compress the data [Huffman, 1952]. An integrated approach is applied to obtain a new compressed hash function by integrating Huffman compressed codes into the core hashing computation of the original hash function.
APA, Harvard, Vancouver, ISO, and other styles
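The thesis above integrates Huffman compression into the core of the SHA-1 computation; that integration is specific to the thesis and is not reproduced here. The sketch below shows only the simpler, outer idea of hashing a Huffman-encoded bitstream, using a hypothetical hard-coded prefix code and Python's standard hashlib.

import hashlib

# Hypothetical prefix code for a four-letter alphabet (for illustration only).
CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode_to_bytes(text):
    """Huffman-encode text with CODE, then pack the bitstring into bytes."""
    bits = "".join(CODE[ch] for ch in text)
    padding = (8 - len(bits) % 8) % 8
    bits += "0" * padding
    packed = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return bytes([padding]) + packed             # record the padding so decoding stays possible

if __name__ == "__main__":
    message = "abacabadabacaba"                  # invented message over the toy alphabet
    compressed = encode_to_bytes(message)
    digest = hashlib.sha1(compressed).hexdigest()
    print(len(message), "chars ->", len(compressed), "bytes, SHA-1 =", digest)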
3

Maciel, Marcos Costa. "Compressão de dados ambientais em redes de sensores sem fio usando código de Huffman" [Compression of environmental data in wireless sensor networks using Huffman coding]. Universidade Tecnológica Federal do Paraná, 2013. http://repositorio.utfpr.edu.br/jspui/handle/1/506.

Full text
Abstract:
Funding: Fundação do Amparo à Pesquisa do Estado do Amazonas (FAPEAM).
In this master's thesis we present a lightweight lossless data compression method for wireless sensor networks (WSN). The method is based on conventional Huffman coding applied to a set of samples of monitored parameters that have a strong temporal correlation: a Huffman dictionary is generated from these probabilities and can then be used on other sets of parameters with the same characteristics. Simulation results using temperature and relative humidity measurements show that the proposed method outperforms popular compression mechanisms designed specifically for wireless sensor networks.
APA, Harvard, Vancouver, ISO, and other styles
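To illustrate the reuse idea in the abstract above: successive sensor readings are strongly correlated, so their differences concentrate on a few small values, and a Huffman dictionary trained once on such differences can be shipped to the nodes and reused for new data. The readings and the codebook below are invented for illustration; they are not the probabilities measured in the thesis.

# Hypothetical Huffman dictionary for temperature deltas, e.g. trained offline
# on historical data (a valid prefix code; values invented for illustration).
DELTA_CODE = {0: "0", 1: "10", -1: "110", 2: "1110", -2: "1111"}

def encode_deltas(samples):
    """Delta-encode a reading sequence, then apply the fixed Huffman dictionary."""
    bits = []
    for prev, cur in zip(samples, samples[1:]):
        delta = cur - prev
        if delta not in DELTA_CODE:
            raise ValueError(f"delta {delta} is outside the trained dictionary")
        bits.append(DELTA_CODE[delta])
    return "".join(bits)

if __name__ == "__main__":
    readings = [251, 251, 252, 252, 251, 253, 253, 252]   # invented, tenths of a degree Celsius
    bits = encode_deltas(readings)
    print(f"{len(bits)} bits vs {16 * (len(readings) - 1)} bits for raw 16-bit deltas")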
4

Friedrich, Tomáš. "Komprese DNA sekvencí" [Compression of DNA sequences]. Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2010. http://www.nusl.cz/ntk/nusl-237222.

Full text
Abstract:
The increasing volume of biological data requires finding new ways to store these data in genetic banks. The goal of this work is the design and implementation of a novel algorithm for compression of DNA sequences. The algorithm is based on aligning DNA sequences against a reference sequence and storing only the differences between the sequence and the reference model. The work contains the basic prerequisites from molecular biology needed to understand the details of the algorithm. Next, alignment algorithms and common compression schemes suitable for storing differences against a reference sequence are described. The work continues with a description of the implementation, followed by a derivation of time and space complexity and a comparison with common compression algorithms. Possible continuations of this thesis are discussed in the conclusion.
APA, Harvard, Vancouver, ISO, and other styles
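As a toy illustration of the general reference-based idea described above (substitutions only, equal-length sequences), not the thesis's alignment-based algorithm: instead of storing a whole DNA sequence, store only the positions where it differs from a reference.

def diff_against_reference(reference, sequence):
    """List (position, base) pairs where sequence differs from reference.
    Assumes equal lengths and substitutions only (no insertions or deletions)."""
    assert len(reference) == len(sequence)
    return [(i, b) for i, (a, b) in enumerate(zip(reference, sequence)) if a != b]

def reconstruct(reference, diffs):
    bases = list(reference)
    for i, b in diffs:
        bases[i] = b
    return "".join(bases)

if __name__ == "__main__":
    ref = "ACGTACGTACGT"                         # invented reference sequence
    seq = "ACGTACCTACGA"                         # same sequence with two substitutions
    diffs = diff_against_reference(ref, seq)
    print(diffs)                                 # [(6, 'C'), (11, 'A')]
    assert reconstruct(ref, diffs) == seq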
5

Le, Thu Anh. "An Exploration of the Word2vec Algorithm: Creating a Vector Representation of a Language Vocabulary that Encodes Meaning and Usage Patterns in the Vector Space Structure." Thesis, University of North Texas, 2016. https://digital.library.unt.edu/ark:/67531/metadc849728/.

Full text
Abstract:
This thesis is an exploration and exposition of a highly efficient shallow neural network algorithm called word2vec, which was developed by T. Mikolov et al. in order to create vector representations of a language vocabulary such that information about the meaning and usage of the vocabulary words is encoded in the vector space structure. Chapter 1 introduces natural language processing, vector representations of language vocabularies, and the word2vec algorithm. Chapter 2 reviews the basic mathematical theory of deterministic convex optimization. Chapter 3 provides background on some concepts from computer science that are used in the word2vec algorithm: Huffman trees, neural networks, and binary cross-entropy. Chapter 4 provides a detailed discussion of the word2vec algorithm itself and includes a discussion of continuous bag of words, skip-gram, hierarchical softmax, and negative sampling. Finally, Chapter 5 explores some applications of vector representations: word categorization, analogy completion, and language translation assistance.
APA, Harvard, Vancouver, ISO, and other styles
6

Dvořák, Martin. "Výukový video kodek." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2012. http://www.nusl.cz/ntk/nusl-219882.

Full text
Abstract:
The first goal of this diploma thesis is to study the basic principles of video signal compression and to introduce the techniques used to reduce irrelevancy and redundancy in the video signal. The second goal is, on the basis of this information about compression tools, to implement the individual compression tools in the Matlab programming environment and to assemble a simple model of a video codec. The thesis contains a description of the three basic blocks, namely interframe coding, intraframe coding, and variable-length coding, according to the MPEG-2 standard.
APA, Harvard, Vancouver, ISO, and other styles
7

陳宏綜. "A Memory-Efficient and Fast Huffman Decoding Algorithm." Thesis, 1999. http://ndltd.ncl.edu.tw/handle/64740387702559843977.

Full text
Abstract:
Master's thesis (碩士), 國立臺灣科技大學 (National Taiwan University of Science and Technology), 管理研究所資訊管理學程 (Information Management Program, Graduate Institute of Management), academic year 87 (1998/99).
To reduce the memory size and speed up the process of searching for a symbol in a Huffman tree, we propose a memory-efficient array data structure to represent the Huffman tree. We then present a fast Huffman decoding algorithm, which takes O(m log n) time and uses 3n/2 + (n/2) log n + 1 memory space, where n is the number of symbols in the Huffman tree.
APA, Harvard, Vancouver, ISO, and other styles
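The thesis above stores the Huffman tree in a compact array instead of a pointer structure; its exact layout is not reproduced here. The sketch below only conveys the general flavour of array-based decoding: nodes live in a flat list of (left, right, symbol) triples indexed by node number, and decoding walks that list one bit at a time.

# Hypothetical array-encoded tree for the code a -> 0, b -> 10, c -> 11
# (not the thesis's data structure). -1 marks "no child".
NODES = [
    (1, 2, None),    # 0: root
    (-1, -1, "a"),   # 1: leaf reached by bit 0
    (3, 4, None),    # 2: internal node reached by bit 1
    (-1, -1, "b"),   # 3: leaf for 10
    (-1, -1, "c"),   # 4: leaf for 11
]

def decode(bits):
    """Walk the array-encoded tree bit by bit; emit a symbol at each leaf."""
    out, node = [], 0
    for b in bits:
        node = NODES[node][0] if b == "0" else NODES[node][1]
        if NODES[node][2] is not None:
            out.append(NODES[node][2])
            node = 0                             # restart at the root for the next codeword
    return "".join(out)

if __name__ == "__main__":
    print(decode("01011010"))                    # -> "abcab"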
8

Lin, Yi-Kai, and 林義凱. "A Space-Efficient Huffman Decoding Algorithm and its Parallelism." Thesis, 1997. http://ndltd.ncl.edu.tw/handle/51208485736616000258.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lin, Yih-Kai, and 林義凱. "A Space-Efficient Huffman Decoding Algorithm and its Parallelism." Thesis, 1997. http://ndltd.ncl.edu.tw/handle/72819908164921559975.

Full text
Abstract:
Master's thesis (碩士), 國立台灣工業技術學院 (National Taiwan Institute of Technology), 管理技術研究所 (Graduate Institute of Management Technology), academic year 85 (1996/97).
This thesis first transforms the Huffman tree into a single-side growing Huffman tree, then presents a memory-efficient data structure to represent the single-side growing Huffman tree, which requires O((n + d)⌈log₂ n⌉) bits of memory, where n is the number of source symbols and d is the depth of the Huffman tree. Based on the proposed data structure, we present an O(d)-time Huffman decoding algorithm. Using the same example, the memory required by our decoding algorithm is much less than that of Hashemian in 1995. We finally modify the proposed data structure slightly to design an O(1)-time parallel Huffman decoding algorithm on a concurrent-read exclusive-write parallel random-access machine (CREW PRAM) using d processors.
APA, Harvard, Vancouver, ISO, and other styles
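Several of the decoding papers and theses listed here, including this one, replace the explicit tree with small per-length tables so that each codeword is decoded in time proportional to its length. The sketch below shows a generic canonical-code version of that idea; it is not the single-side growing tree representation used in this particular thesis.

from collections import Counter

def canonical_tables(lengths):
    """lengths: {symbol: code length}. Returns (first_code, first_index, symbols)
    for the canonical code in which codes of the same length are consecutive integers."""
    symbols = sorted(lengths, key=lambda s: (lengths[s], s))
    counts = Counter(lengths.values())
    first_code, first_index = {}, {}
    code = index = 0
    for length in range(1, max(lengths.values()) + 1):
        first_code[length] = code
        first_index[length] = index
        code = (code + counts[length]) << 1
        index += counts[length]
    return first_code, first_index, symbols

def decode(bits, first_code, first_index, symbols, counts):
    out, value, length = [], 0, 0
    for b in bits:
        value = (value << 1) | (b == "1")
        length += 1
        # A codeword is complete once the accumulated value falls into the
        # consecutive range reserved for codes of the current length.
        if length in first_code and 0 <= value - first_code[length] < counts[length]:
            out.append(symbols[first_index[length] + value - first_code[length]])
            value, length = 0, 0
    return "".join(out)

if __name__ == "__main__":
    lengths = {"a": 1, "b": 2, "c": 3, "d": 3}   # invented code lengths
    counts = Counter(lengths.values())
    first_code, first_index, symbols = canonical_tables(lengths)
    # canonical codes here: a=0, b=10, c=110, d=111
    print(decode("010110111", first_code, first_index, symbols, counts))   # -> "abcd"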
10

林垂慶. "Huffman Codec Design Based H.263+ Video Encryption Algorithms." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/56514332526578049996.

Full text
Abstract:
Master's thesis (碩士), 國立暨南國際大學 (National Chi Nan University), 資訊工程學系 (Department of Computer Science and Information Engineering), academic year 92 (2003/04).
With the advancement of science and technology, the transmission of digital video data is becoming more and more common, for example in video conferencing systems and online pay-TV, whose content may be grabbed by a hacker. Hence, to prevent data piracy and plagiarism, the encryption of multimedia data has become an extremely important issue. In this thesis we propose an efficient encryption framework that does not seriously affect the coding speed or compression efficiency of the original codec system. Applying bit-scrambling techniques to the data in either the time domain or the frequency domain seriously degrades the performance of the codec, and the first compression-domain encryption algorithm proposed in [2] is also inefficient when the coded bitstream contains a large number of motion-vector codewords. We therefore propose a lightweight encryption framework based on modifying the Huffman tables; it was implemented and verified to be efficient when embedded in an H.263+ codec. To build an encryption system that retains fast coding speed and good compression efficiency, we first scramble the fixed-length code (FLC) tables and variable-length code (VLC) tables using the splay tree algorithm, and then use a chaotic algorithm to produce the secret key more promptly and confidentially. Furthermore, different levels of security can be achieved by easily adapting our system.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Huffman’s algorithm"

1

McSkane, Brian. The implementation of a Huffman compression algorithm for medical radiologic images in Khoros. [s.l: The Author], 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Huffman’s algorithm"

1

Moffat, Alistair. "Huffman Coding." In Encyclopedia of Algorithms, 938–42. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4939-2864-4_633.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Moffat, Alistair. "Huffman Coding." In Encyclopedia of Algorithms, 1–6. Boston, MA: Springer US, 2014. http://dx.doi.org/10.1007/978-3-642-27848-8_633-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hoogerwoord, Rob R. "A derivation of Huffman's algorithm." In Lecture Notes in Computer Science, 375–78. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/3-540-56625-2_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Nag, Amitava, Jyoti Prakash Singh, Sushanta Biswas, Debasree Sarkar, and Partha Pratim Sarkar. "A Huffman Code Based Image Steganography Technique." In Applied Algorithms, 257–65. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-04126-1_22.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Adaş, Boran, Ersin Bayraktar, and M. Oğuzhan Külekci. "Huffman Codes versus Augmented Non-Prefix-Free Codes." In Experimental Algorithms, 315–26. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20086-6_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Pinto, Paulo E. D., Fábio Protti, and Jayme L. Szwarcfiter. "A Huffman-Based Error Detecting Code." In Experimental and Efficient Algorithms, 446–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-24838-5_33.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Fujiwara, Hiroshi, and Tobias Jacobs. "On the Huffman and Alphabetic Tree Problem with General Cost Functions." In Algorithms – ESA 2010, 439–50. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15775-2_38.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Liu, Yaqiong, Yuzhuo Wen, Dingrong Yuan, and Yuwei Cuan. "A Huffman Tree-Based Algorithm for Clustering Documents." In Advanced Data Mining and Applications, 630–40. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-14717-8_49.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Yanhui, He Zhang, Jun Shi, and Xiaozheng Yin. "Microscan Imager Logging Data Compression Using improved Huffman Algorithm." In Communications in Computer and Information Science, 616–20. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-662-45049-9_101.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Liu, Jianhua, Wei Yang, Liusheng Huang, and Wuji Chen. "A Detection-Resistant Covert Timing Channel Based on Geometric Huffman Coding." In Wireless Algorithms, Systems, and Applications, 308–20. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-94268-1_26.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Huffman’s algorithm"

1

Ya-Jun He, Duo-Li Zhang, Bin Shen, and Luo-Feng Geng. "Implementation of fast Huffman decoding algorithm." In 2007 7th International Conference on ASIC. IEEE, 2007. http://dx.doi.org/10.1109/icasic.2007.4415744.

Full text
APA, Harvard, Vancouver, ISO, and other styles
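Fast Huffman decoders like the one the paper above targets typically avoid bit-by-bit tree walks by reading a fixed number of bits at once and indexing a precomputed table. The small sketch below shows that generic table-lookup idea with an invented code; it is not the paper's design.

# Invented prefix code; the table maps every possible max-length bit window
# to the symbol it starts with and the number of bits that codeword consumes.
CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}
MAX_LEN = max(len(c) for c in CODE.values())

TABLE = {}
for sym, code in CODE.items():
    pad = MAX_LEN - len(code)
    for tail in range(1 << pad):                 # enumerate all completions of the codeword
        TABLE[(int(code, 2) << pad) | tail] = (sym, len(code))

def decode(bits):
    out, pos = [], 0
    padded = bits + "0" * MAX_LEN                # pad so the final window is full
    while pos < len(bits):
        window = int(padded[pos:pos + MAX_LEN], 2)
        sym, used = TABLE[window]                # one table lookup per decoded symbol
        out.append(sym)
        pos += used
    return "".join(out)

if __name__ == "__main__":
    print(decode("010110111"))                   # -> "abcd"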
2

Li, Wei, Zhen Peng Pang, and Zhi Jie Liu. "SPIHT Algorithm Combined with Huffman Encoding." In 2010 Third International Symposium on Intelligent Information Technology and Security Informatics (IITSI). IEEE, 2010. http://dx.doi.org/10.1109/iitsi.2010.63.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Choi, K. H., and H. K. Dai. "A marking scheme using Huffman codes for IP traceback." In 7th International Symposium on Parallel Architectures, Algorithms and Networks, 2004. Proceedings. IEEE, 2004. http://dx.doi.org/10.1109/ispan.2004.1300516.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ergude, Bao, Li Weisheng, Fan Dongrui, and Ma Xiaoyu. "A Study and Implementation of the Huffman Algorithm Based on Condensed Huffman Table." In 2008 International Conference on Computer Science and Software Engineering. IEEE, 2008. http://dx.doi.org/10.1109/csse.2008.1432.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Vidhyaa, V. G., S. Aarthi Rajalakshmi, Ragavi Raghavan, G. S. V. Venu Gopal, and R. Gandhiraj. "Huffman encoding and decoding algorithm using IJulia." In 2016 International Conference on Communication and Signal Processing (ICCSP). IEEE, 2016. http://dx.doi.org/10.1109/iccsp.2016.7754207.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Asare, Sampson D., and Phyela Mbewe. "CMedia Compressor: An Application to Graphically Compare General Compression Algorithms and Adaptive Huffman Compression Algorithm." In 2018 International Conference on Intelligent and Innovative Computing Applications (ICONIC). IEEE, 2018. http://dx.doi.org/10.1109/iconic.2018.8601237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Khare, Monik, Claire Mathieu, and Neal E. Young. "First Come First Served for Online Slot Allocation and Huffman Coding." In Proceedings of the Twenty-Fifth Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2013. http://dx.doi.org/10.1137/1.9781611973402.33.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Dhawale, Nidhi. "Implementation of Huffman algorithm and study for optimization." In 2014 International Conference on Advances in Communication and Computing Technologies (ICACACT). IEEE, 2014. http://dx.doi.org/10.1109/eic.2015.7230711.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hernandez, Marco Antonio Soto, Oscar Alvarado-Nava, Eduardo Rodriguez-Martinez, and Francisco J. Zaragoza Martinez. "Tree-less Huffman coding algorithm for embedded systems." In 2013 International Conference on ReConFigurable Computing and FPGAs (ReConFig). IEEE, 2013. http://dx.doi.org/10.1109/reconfig.2013.6732335.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Pham, Hoang-Anh, Van-Hieu Bui, and Anh-Vu Dinh-Duc. "An Adaptive Huffman Decoding Algorithm for MP3 Decoder." In 2010 Fifth IEEE International Symposium on Electronic Design, Test & Applications. IEEE, 2010. http://dx.doi.org/10.1109/delta.2010.22.

Full text
APA, Harvard, Vancouver, ISO, and other styles
