Journal articles on the topic 'Differential and Huffman coding'

Consult the top 50 journal articles for your research on the topic 'Differential and Huffman coding.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Sripal Reddy, K., and B. Leelaram Prakash. "HSV, Edge Preserved and Huffman Coding based Intra Frame High Efficient video Compression for Multimedia Communication." International Journal of Engineering & Technology 7, no. 3.12 (July 20, 2018): 1090. http://dx.doi.org/10.14419/ijet.v7i3.12.17767.

Abstract:
High Efficiency Video Coding (HEVC) is a newer compression standard for high-resolution video content, which needs only 50% of the bit rate of H.264/Advanced Video Coding (AVC) at the same perceptual quality. However, its computational complexity increases dramatically because of the quad-tree structured Coding Unit (CU). In this paper a new Hue Saturation Value (HSV), Edge Preserving and Huffman Coding (HC) based intra-frame high-efficiency video compression algorithm, named HSV-EPHC-IFHEVC, is introduced. To increase the compression ratio of the video frames, Huffman and Differential Pulse Code Modulation (DPCM) encodings are used. To improve the quality of the decompressed frames, a sharpening-filter-based edge-preserving technique is used. The HSV-EPHC-IFHEVC algorithm provides much better performance than existing systems. Performance is measured in terms of MSE, PSNR, RMSE and execution time.
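
Since several entries below pair differential (DPCM) coding with Huffman coding, here is a minimal, self-contained Python sketch of that two-stage idea. It only illustrates the general technique, not the authors' HSV-EPHC-IFHEVC pipeline; the helper names and the sample pixel row are invented for the example.

    import heapq
    from collections import Counter
    from itertools import count

    def dpcm_encode(samples):
        # DPCM: store each sample as its difference from the previous one,
        # so smoothly varying data becomes small residuals near zero.
        prev, residuals = 0, []
        for s in samples:
            residuals.append(s - prev)
            prev = s
        return residuals

    def huffman_code(symbols):
        # Build a prefix-code table {symbol: bitstring} from symbol frequencies.
        freq = Counter(symbols)
        if len(freq) == 1:
            return {next(iter(freq)): "0"}      # degenerate single-symbol case
        tick = count()                          # unique tie-breaker: payloads are never compared
        heap = [(f, next(tick), sym) for sym, f in freq.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            f1, _, a = heapq.heappop(heap)
            f2, _, b = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, next(tick), (a, b)))
        codes = {}
        def walk(node, prefix):
            if isinstance(node, tuple):         # internal node: (left, right)
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:
                codes[node] = prefix
        walk(heap[0][2], "")
        return codes

    row = [52, 55, 61, 66, 70, 61, 64, 73]      # hypothetical pixel row
    residuals = dpcm_encode(row)
    table = huffman_code(residuals)
    bits = "".join(table[r] for r in residuals)
    print(residuals, "->", len(bits), "bits")

In a real codec the code table (or the symbol statistics it was built from) must also be stored or transmitted, which this sketch ignores.
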
2

Hakim, P. R., and R. Permala. "Analysis of LAPAN-IPB image lossless compression using differential pulse code modulation and huffman coding." IOP Conference Series: Earth and Environmental Science 54 (January 2017): 012096. http://dx.doi.org/10.1088/1755-1315/54/1/012096.

3

Pang, Liaojun, Deyu Miao, Huixian Li, and Qiong Wang. "Improved Secret Image Sharing Scheme in Embedding Capacity without Underflow and Overflow." Scientific World Journal 2015 (2015): 1–16. http://dx.doi.org/10.1155/2015/861546.

Abstract:
Computational secret image sharing (CSIS) is an effective way to protect a secret image during its transmission and storage, and thus it has attracted a lot of attention since its appearance. Improving the embedding capacity and eliminating the underflow and overflow situations, which are awkward and difficult to deal with, have become hot topics for researchers. The scheme with the highest embedding capacity among the existing schemes suffers from the underflow and overflow problems. Although the underflow and overflow situations have been handled well by different methods, the embedding capacities of these methods are reduced to some extent. Motivated by these concerns, we propose a novel scheme in which differential coding, Huffman coding, and data conversion are used to compress the secret image before embedding it, to further improve the embedding capacity, and a pixel mapping matrix embedding method with a newly designed matrix is used to embed the secret image data into the cover image to avoid the underflow and overflow situations. Experimental results show that our scheme improves the embedding capacity further and eliminates the underflow and overflow situations at the same time.
4

Sang, Jun, Muhammad Azeem Akbar, Bin Cai, Hong Xiang, and Haibo Hu. "Joint Image Compression and Encryption Using IWT with SPIHT, Kd-Tree and Chaotic Maps." Applied Sciences 8, no. 10 (October 17, 2018): 1963. http://dx.doi.org/10.3390/app8101963.

Abstract:
Confidentiality and efficient bandwidth utilization require a combination of compression and encryption of digital images. In this paper, a new method for joint image compression and encryption based on set partitioning in hierarchical trees (SPIHT) with an optimized Kd-tree and multiple chaotic maps is proposed. First, lossless compression and encryption of the original images are performed based on the integer wavelet transform (IWT) with SPIHT. The wavelet coefficients undergo diffusions and permutations before being encoded with SPIHT. Second, maximum confusion, diffusion and compression of the SPIHT output are performed via the modified Kd-tree, wavelet tree and Huffman coding. Finally, the compressed output is further encrypted with varying-parameter logistic maps and modified quadratic chaotic maps. The performance of the proposed technique is evaluated through compression ratio (CR), peak signal-to-noise ratio (PSNR), key space and histogram analyses. Moreover, the scheme passes several security tests, such as sensitivity, entropy and differential analysis tests. According to the theoretical analysis and experimental results, the proposed method is more secure and removes more of the image's redundant information than existing techniques for hybrid compression and encryption.
5

Moffat, Alistair. "Huffman Coding." ACM Computing Surveys 52, no. 4 (September 18, 2019): 1–35. http://dx.doi.org/10.1145/3342555.

6

Abdul Mannan, M., and M. Kaykobad. "Block huffman coding." Computers & Mathematics with Applications 46, no. 10-11 (November 2003): 1581–87. http://dx.doi.org/10.1016/s0898-1221(03)90193-3.

7

Fraenkel, A. S. "Bidirectional Huffman Coding." Computer Journal 33, no. 4 (April 1, 1990): 296–307. http://dx.doi.org/10.1093/comjnl/33.4.296.

8

Knuth, Donald E. "Dynamic huffman coding." Journal of Algorithms 6, no. 2 (June 1985): 163–80. http://dx.doi.org/10.1016/0196-6774(85)90036-7.

9

Dr. Atul Suryavanshi. "Optimized Adaptive Huffmann Coding For Paper Reduction in OFDM Systems." International Journal of New Practices in Management and Engineering 2, no. 04 (December 31, 2013): 13–18. http://dx.doi.org/10.17762/ijnpme.v2i04.23.

Abstract:
The main defect of OFDM systems is their high peak-to-average power ratio (PAPR). To decrease the PAPR, adaptive Huffman coding is employed. At the transmitter side, the data are encoded with two techniques, Huffman coding and adaptive Huffman coding, and mapping is done with QAM 16 and PSK 16. The PAPR results of Huffman and adaptive Huffman coding with QAM 16 and PSK 16 are compared. Simulation results show that adaptive Huffman coding together with QAM 16 produces better results than Huffman coding and than adaptive Huffman coding with PSK 16.
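
For readers unfamiliar with the metric compared in this and the later OFDM entries: PAPR is the ratio of an OFDM symbol's peak instantaneous power to its average power, usually reported in dB. The NumPy snippet below is only a small illustration with randomly generated 16-QAM symbols, not data or code from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    # 64 random 16-QAM symbols: real and imaginary parts drawn from {-3, -1, 1, 3}.
    sym = (rng.integers(0, 4, 64) * 2 - 3) + 1j * (rng.integers(0, 4, 64) * 2 - 3)
    x = np.fft.ifft(sym)                    # time-domain OFDM symbol
    power = np.abs(x) ** 2
    papr_db = 10 * np.log10(power.max() / power.mean())
    print(f"PAPR = {papr_db:.2f} dB")
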
10

Amarunnishad, T. M., V. K. Govindan, and Abraham T. Mathew. "Block Truncation Coding with Huffman Coding." Journal of Medical Imaging and Health Informatics 1, no. 2 (June 1, 2011): 170–76. http://dx.doi.org/10.1166/jmihi.2011.1021.

11

Sridhara, Deepak. "Efficient coding of information: Huffman coding." Resonance 11, no. 2 (February 2006): 51–73. http://dx.doi.org/10.1007/bf02837275.

12

Lakhani, G. "Modified JPEG huffman coding." IEEE Transactions on Image Processing 12, no. 2 (February 2003): 159–69. http://dx.doi.org/10.1109/tip.2003.809001.

13

Mohr, August. "Huffman coding as metaphor." Kybernetes 42, no. 9/10 (November 11, 2013): 1431–38. http://dx.doi.org/10.1108/k-10-2012-0068.

14

Bookstein, A., and S. T. Klein. "Is Huffman coding dead?" Computing 50, no. 4 (December 1993): 279–96. http://dx.doi.org/10.1007/bf02243872.

15

Yang, Xiao Bo, and Bang Ze Chen. "Huffman Tree Visualization." Advanced Materials Research 468-471 (February 2012): 1883–86. http://dx.doi.org/10.4028/www.scientific.net/amr.468-471.1883.

Abstract:
The Huffman tree, also called the optimal binary tree, is the binary tree with the shortest weighted path length; Huffman coding is an optimal entropy coding method used for lossless data compression. Visualizing the Huffman tree is of great significance. This paper uses an object-oriented method and a complete binary tree to visualize the Huffman tree and to display the Huffman coding process as a visual image.
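
The tree structure this entry visualizes is easy to render even as indented text. The sketch below is my own toy illustration, not the paper's object-oriented implementation, and the example weights are invented; it prints the tree sideways rather than building a graphical view.

    import heapq
    from itertools import count

    def build_huffman_tree(weights):
        # Leaves are (weight, symbol); internal nodes are (weight, left, right).
        tick = count()                      # unique tie-breaker for the heap
        heap = [(w, next(tick), (w, sym)) for sym, w in weights.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            w1, _, a = heapq.heappop(heap)
            w2, _, b = heapq.heappop(heap)
            heapq.heappush(heap, (w1 + w2, next(tick), (w1 + w2, a, b)))
        return heap[0][2]

    def show(node, indent=""):
        # Indented text rendering: '*' marks internal nodes, leaves show their symbol.
        if len(node) == 3:
            print(f"{indent}* weight={node[0]}")
            show(node[1], indent + "    ")
            show(node[2], indent + "    ")
        else:
            print(f"{indent}'{node[1]}' weight={node[0]}")

    show(build_huffman_tree({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
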
16

Liu, Xing Ke, Ke Chen, and Bin Li. "Huffman Coding and Applications in Compression for Vector Maps." Applied Mechanics and Materials 333-335 (July 2013): 718–22. http://dx.doi.org/10.4028/www.scientific.net/amm.333-335.718.

Abstract:
Huffman coding is a statistical lossless coding method with high efficiency. The principle and implementation of Huffman coding are discussed, and Huffman coding is applied to the compression of vector maps. The properties of the algorithm are discussed. Experiments demonstrate that the proposed algorithm can compress vector maps with high efficiency and no loss.
17

Kumar, Vikas. "Compression Techniques Vs Huffman Coding." International Journal of Informatics and Communication Technology (IJ-ICT) 4, no. 1 (April 1, 2015): 29. http://dx.doi.org/10.11591/ijict.v4i1.pp29-37.

Abstract:
Interest in image compression techniques has been growing because raw images require large amounts of disk space, which is a major disadvantage during the transmission and storage of images. Although many compression techniques already exist, there is still a need for a technique that is faster, more memory-efficient, simpler, and better matched to users' requirements. In this paper we propose a method for image compression and decompression using a simple coding technique called Huffman coding and show why it is more efficient than other techniques. The technique is simple to implement and uses less memory than the others. A software algorithm has been developed and implemented in MATLAB to compress and decompress a given image using Huffman coding.
18

ICHIHARA, H. "Huffman-Based Test Response Coding." IEICE Transactions on Information and Systems E88-D, no. 1 (January 1, 2005): 158–61. http://dx.doi.org/10.1093/ietisy/e88-d.1.158.

19

Okazaki, Hiroyuki, Yuichi Futa, and Yasunari Shidama. "Constructing Binary Huffman Tree." Formalized Mathematics 21, no. 2 (June 1, 2013): 133–43. http://dx.doi.org/10.2478/forma-2013-0015.

Abstract:
Summary: Huffman coding is one of the most famous entropy encoding methods for lossless data compression [16]. The JPEG and ZIP formats employ variants of Huffman encoding as lossless compression algorithms. Huffman coding is a bijective map from source letters to the leaves of the Huffman tree constructed by the algorithm. In this article we formalize an algorithm that constructs a binary code tree, the Huffman tree.
20

Luo, Hong Jun, Hai Cheng Xu, and Jian Hong Sun. "Analysis of Encoding Methods Contributing to the Payload of Steganography." Advanced Materials Research 756-759 (September 2013): 1912–15. http://dx.doi.org/10.4028/www.scientific.net/amr.756-759.1912.

Abstract:
In this paper, we compare Huffman coding and Arithmetic coding with ASCII coding to observe their contribution to steganography. The idea is to compress the embedded information first in order to enhance the payload of a cover image. There are many effective coding methods for compressing this kind of string, such as Huffman coding and Arithmetic coding. The experimental results indicate that Huffman coding and Arithmetic coding, as an important preprocessing step for steganography, can effectively reduce the amount of secret information while offering efficient encoding performance. A theoretical treatment of payload calculation is also proposed, which can be used to describe the relationship among three parameters of steganography: imperceptibility, information embedding efficiency and payload.
21

Garg, Astha, Aditya Kumar Singh Pundir, and Sudhir Kumar Sharma. "Image Compression Using Modified Huffman Coding." INROADS- An International Journal of Jaipur National University 4, no. 1 (2015): 53. http://dx.doi.org/10.5958/2277-4912.2015.00010.7.

22

Lu, W. W., and M. P. Gough. "A fast-adaptive Huffman coding algorithm." IEEE Transactions on Communications 41, no. 4 (April 1993): 535–38. http://dx.doi.org/10.1109/26.223776.

23

Braunstein, S. L., C. A. Fuchs, D. Gottesman, and Hoi-Kwong Lo. "A quantum analog of Huffman coding." IEEE Transactions on Information Theory 46, no. 4 (July 2000): 1644–49. http://dx.doi.org/10.1109/18.850709.

24

Kato, A., Te Sun Han, and H. Nagaoka. "Huffman coding with an infinite alphabet." IEEE Transactions on Information Theory 42, no. 3 (May 1996): 977–84. http://dx.doi.org/10.1109/18.490559.

25

Wang, Wei, and Wei Zhang. "Huffman Coding-Based Adaptive Spatial Modulation." IEEE Transactions on Wireless Communications 16, no. 8 (August 2017): 5090–101. http://dx.doi.org/10.1109/twc.2017.2705679.

26

Papamichalis, P. "Markov-Huffman coding of LPC parameters." IEEE Transactions on Acoustics, Speech, and Signal Processing 33, no. 2 (April 1985): 451–53. http://dx.doi.org/10.1109/tassp.1985.1164545.

27

Lakhani, G. "Optimal Huffman Coding of DCT Blocks." IEEE Transactions on Circuits and Systems for Video Technology 14, no. 4 (April 2004): 522–27. http://dx.doi.org/10.1109/tcsvt.2004.825565.

28

KITAKAMI, M. "Burst Error Recovery for Huffman Coding." IEICE Transactions on Information and Systems E88-D, no. 9 (September 1, 2005): 2197–200. http://dx.doi.org/10.1093/ietisy/e88-d.9.2197.

29

Klein, Shmuel T., and Dana Shapira. "Huffman Coding with Non-Sorted Frequencies." Mathematics in Computer Science 5, no. 2 (June 2011): 171–78. http://dx.doi.org/10.1007/s11786-011-0067-4.

30

Khaitu, Shree Ram, and Sanjeeb Prasad Panday. "Fractal Image Compression Using Canonical Huffman Coding." Journal of the Institute of Engineering 15, no. 1 (February 16, 2020): 91–105. http://dx.doi.org/10.3126/jie.v15i1.27718.

Abstract:
Image compression techniques have become a very important subject with the rapid growth of multimedia applications. The main motivations behind image compression are efficient and lossless transmission as well as storage of digital data. Image compression techniques are of two types: lossless and lossy. Lossy compression techniques are applied to natural images, where a minor loss of data is acceptable. Entropy encoding is a lossless compression scheme that is independent of the particular features of the media, as it has its own unique codes and symbols. Huffman coding is an entropy coding approach for efficient transmission of data. This paper highlights a fractal image compression method based on fractal features and on searching for the best replacement blocks for the original image. Canonical Huffman coding, which provides better fractal compression than arithmetic coding, is used in this paper. The results obtained show that the canonical-Huffman-based fractal compression technique increases the speed of compression and has better PSNR as well as a better compression ratio than standard Huffman coding.
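
As background for the 'canonical' variant used in this entry: once the code length of each symbol is fixed, canonical Huffman coding assigns codewords in a standard order, so the whole table can be reconstructed from the lengths alone. The sketch below is my own illustration under that assumption, not the paper's implementation, and the example lengths are invented.

    def canonical_codes(lengths):
        # lengths: {symbol: code length in bits}, e.g. produced by an ordinary Huffman pass.
        # Symbols are sorted by (length, symbol) and given consecutive codewords.
        ordered = sorted(lengths.items(), key=lambda kv: (kv[1], kv[0]))
        codes, code, prev_len = {}, 0, ordered[0][1]
        for sym, length in ordered:
            code <<= (length - prev_len)    # widen the codeword when the length grows
            codes[sym] = format(code, f"0{length}b")
            code += 1
            prev_len = length
        return codes

    # Example lengths satisfying the Kraft inequality (1/2 + 3/8 + 2/16 = 1).
    print(canonical_codes({"a": 1, "b": 3, "c": 3, "d": 3, "e": 4, "f": 4}))

Because a decoder that knows only the lengths can rebuild exactly the same table, canonical codes are widely used, for example in DEFLATE and in JPEG's Huffman tables.
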
31

Klein, Shmuel T., and Dana Shapira. "On the Randomness of Compressed Data." Information 11, no. 4 (April 7, 2020): 196. http://dx.doi.org/10.3390/info11040196.

Abstract:
It seems reasonable to expect from a good compression method that its output should not be further compressible, because it should behave essentially like random data. We investigate this premise for a variety of known lossless compression techniques, and find that, surprisingly, there is much variability in the randomness, depending on the chosen method. Arithmetic coding seems to produce perfectly random output, whereas that of Huffman or Ziv-Lempel coding still contains many dependencies. In particular, the output of Huffman coding has already been proven to be random under certain conditions, and we present evidence here that arithmetic coding may produce an output that is identical to that of Huffman.
32

Wu, Chin-Hsien, Hao-Wei Zhang, Chia-Wei Liu, Ta-Ching Yu, and Chi-Yen Yang. "A Dynamic Huffman Coding Method for Reliable TLC NAND Flash Memory." ACM Transactions on Design Automation of Electronic Systems 26, no. 5 (June 5, 2021): 1–25. http://dx.doi.org/10.1145/3446771.

Abstract:
With the progress of the manufacturing process, NAND flash memory has evolved from the single-level cell and multi-level cell to the triple-level cell (TLC). NAND flash memory has physical limitations such as the erase-before-write characteristic and the limited number of program/erase cycles. Moreover, TLC NAND flash memory has low reliability and a short lifetime. Thus, we propose a dynamic Huffman coding method that can be applied to the write operations of NAND flash memory. The proposed method exploits observations from the Huffman tree and machine learning on data patterns to dynamically select a suitable Huffman code. According to the experimental results, the proposed method can improve the reliability of TLC NAND flash memory while also taking the compression performance into account for those applications that require Huffman coding.
33

Idris, Azlina, Nur Atiqah Md Deros, Idris Taib, Murizah Kassim, Mohd Danial Rozaini, and Darmawaty Mohd Ali. "PAPR Reduction Using Huffman and Arithmetic Coding Techniques in F-OFDM System." Bulletin of Electrical Engineering and Informatics 7, no. 2 (June 1, 2018): 257–63. http://dx.doi.org/10.11591/eei.v7i2.1169.

Abstract:
Filtered orthogonal frequency division multiplexing (F-OFDM) was introduced to overcome the high side lobes of the OFDM system. Filtering is applied in the system to reduce the out-of-band emission (OOBE) for better spectrum utilization and to meet the diversified expectations of the upcoming 5G networks. The main drawback of the system is its high peak-to-average power ratio (PAPR). This paper investigates methods for reducing the PAPR in the F-OFDM system. The proposed block coding techniques for overcoming the problem of high PAPR are Arithmetic coding and Huffman coding. This research evaluates the performance of the F-OFDM system based on the PAPR values. From the simulation results, the PAPR with Arithmetic coding is 8.9% lower, while with Huffman coding it is 6.7% lower, in the F-OFDM system. The results show that Arithmetic coding outperforms Huffman coding in the F-OFDM system.
34

Nijjar, Harmandeep Singh. "EMBEDDED ZEROTREE WAVELET CODING WITH JOINT HUFFMAN AND ARITHMETIC CODING." International Journal of Research in Engineering and Technology 05, no. 06 (June 25, 2016): 411–17. http://dx.doi.org/10.15623/ijret.2016.0506073.

35

Nandi, Utpal, and Jyotsna Kumar Mandal. "Windowed Huffman Coding with Limited Distinct Symbols." Procedia Technology 4 (2012): 589–94. http://dx.doi.org/10.1016/j.protcy.2012.05.094.

36

Foldes, Stephan. "Huffman coding and chains in partition lattices." Mathematics for Application 9, no. 2 (December 22, 2020): 91–94. http://dx.doi.org/10.13164/ma.2020.08.

37

Park, H., and V. K. Prasanna. "Area efficient VLSI architectures for Huffman coding." IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 40, no. 9 (1993): 568–75. http://dx.doi.org/10.1109/82.257334.

38

Kui Liu, Yong, and Borut Žalik. "An efficient chain code with Huffman coding." Pattern Recognition 38, no. 4 (April 2005): 553–57. http://dx.doi.org/10.1016/j.patcog.2004.08.017.

39

Shamilov, Aladdin, and Senay Asma. "A Fano-Huffman Based Statistical Coding Method." Journal of Modern Applied Statistical Methods 6, no. 1 (May 1, 2007): 265–78. http://dx.doi.org/10.22237/jmasm/1177993440.

40

Huang, H. C., and J. L. Wu. "Windowed Huffman coding algorithm with size adaptation." IEE Proceedings I Communications, Speech and Vision 140, no. 2 (1993): 109. http://dx.doi.org/10.1049/ip-i-2.1993.0015.

41

NAGARAJ, NITHIN. "HUFFMAN CODING AS A NONLINEAR DYNAMICAL SYSTEM." International Journal of Bifurcation and Chaos 21, no. 06 (June 2011): 1727–36. http://dx.doi.org/10.1142/s0218127411029392.

Abstract:
In this paper, source coding or data compression is viewed as a measurement problem. Given a measurement device with fewer states than the observable of a stochastic source, how can one capture their essential information? We propose modeling stochastic sources as piecewise-linear discrete chaotic dynamical systems known as Generalized Luröth Series (GLS) which has its roots in Georg Cantor's work in 1869. These GLS are special maps with the property that their Lyapunov exponent is equal to the Shannon's entropy of the source (up to a constant of proportionality). By successively approximating the source with GLS having fewer states (with the nearest Lyapunov exponent), we derive a binary coding algorithm which turns out to be a rediscovery of Huffman coding, the popular lossless compression algorithm used in the JPEG international standard for still image compression.
42

Nuha, Hilal H. "Lossless Text Image Compression using Two Dimensional Run Length Encoding." Jurnal Online Informatika 4, no. 2 (February 14, 2020): 75. http://dx.doi.org/10.15575/join.v4i2.330.

Abstract:
Text images are used in many types of conventional data communication where text is not represented directly by digital characters such as ASCII but by an image, for instance a facsimile file or a scanned document. We propose a combination of Run Length Encoding (RLE) and Huffman coding for two-dimensional binary image compression, namely 2DRLE. First, each row in the image is read sequentially; each consecutive recurring row is kept once and the number of occurrences is stored. Second, the same procedure is performed column-wise on the image produced by the first stage to obtain an image without consecutive recurring rows or columns. The image from the last stage is then compressed using Huffman coding. The experiments show that 2DRLE achieves a higher compression ratio than conventional Huffman coding of the image, reaching a compression ratio of more than 8:1 without any distortion.
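
The row-then-column pass described above is simple to sketch. The snippet below is only an illustration of that idea with invented helper names and a toy bitmap; the final Huffman stage of 2DRLE is left out.

    def rle_rows(image):
        # image: list of equal-length rows. Collapse consecutive identical rows,
        # returning the kept rows and the run counts needed to undo the pass.
        rows, counts = [], []
        for row in image:
            if rows and row == rows[-1]:
                counts[-1] += 1
            else:
                rows.append(row)
                counts.append(1)
        return rows, counts

    def transpose(image):
        return [list(col) for col in zip(*image)]

    text_image = [
        [0, 0, 1, 1],
        [0, 0, 1, 1],                       # duplicate row, collapsed in pass 1
        [1, 1, 0, 0],
    ]
    rows, row_runs = rle_rows(text_image)
    cols, col_runs = rle_rows(transpose(rows))   # second pass, column-wise
    print(rows, row_runs)                   # [[0, 0, 1, 1], [1, 1, 0, 0]] [2, 1]
    print(col_runs)                         # [2, 2]: each remaining column repeats twice
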
43

Hashemian, R. "Memory efficient and high-speed search Huffman coding." IEEE Transactions on Communications 43, no. 10 (1995): 2576–81. http://dx.doi.org/10.1109/26.469442.

44

Kavousianos, Xrysovalantis, Emmanouil Kalligeros, and Dimitris Nikolos. "Optimal Selective Huffman Coding for Test-Data Compression." IEEE Transactions on Computers 56, no. 8 (August 2007): 1146–52. http://dx.doi.org/10.1109/tc.2007.1057.

45

Polaczyk, Bartosz, Piotr Chołda, and Andrzej Jajszczyk. "Peer-to-Peer Multicasting Inspired by Huffman Coding." Journal of Computer Networks and Communications 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/312376.

Abstract:
Stringent QoS requirements of video streaming are not addressed by the delay characteristics of highly dynamic peer-to-peer (P2P) networks. To solve this problem, a novel locality-aware method for choosing optimal neighbors in live streaming multicast P2P overlays is presented in this paper. To create the appropriate multicast tree topology, a round-trip-time (RTT) value is used as a parameter distinguishing peers capabilities. The multicast tree construction is based on the Huffman source coding algorithm. First, a centrally managed version is presented, and then an effective use of a distributed paradigm is shown. Performance evaluation results prove that the proposed approach considerably improves the overlay efficiency from the viewpoint of end-users and content providers. Moreover, the proposed technique ensures a high level of resilience against gateway-link failures and adaptively reorganizes the overlay topology in case of dynamic, transient network fluctuations.
46

Liu, L. Y., J. F. Wang, R. J. Wang, and J. Y. Lee. "Design and hardware architectures for dynamic Huffman coding." IEE Proceedings - Computers and Digital Techniques 142, no. 6 (1995): 411. http://dx.doi.org/10.1049/ip-cdt:19952157.

47

Mustafa, Rashed. "An Improved Decoding Technique for Efficient Huffman Coding." Journal of Computer Science Applications and Information Technology 2, no. 1 (February 15, 2017): 1–5. http://dx.doi.org/10.15226/2474-9257/2/1/00110.

48

Shadiya M.K, Shabas, and Abdul Rahim V.C. "Huffman Coding Based Spatially Modulated Optical MIMO-OFDM." International Journal of Electronics and Communication Engineering 5, no. 10 (October 25, 2018): 1–5. http://dx.doi.org/10.14445/23488549/ijece-v5i10p101.

49

Joshi, Reva, Githika G, Prudvi Ch, Fairooz SK, and Shaik Mohammed Rafi. "Efficient Data compression using variable length Huffman coding." International Journal of VLSI and Signal Processing 7, no. 2 (June 25, 2020): 6–10. http://dx.doi.org/10.14445/23942584/ijvsp-v7i2p102.

50

Lu, Cheng-Chang, and Yong Ho Shin. "Parallel Implementations Of Huffman Coding Using Associative Memory." International Journal of Modelling and Simulation 16, no. 2 (January 1996): 67–72. http://dx.doi.org/10.1080/02286203.1996.11760281.
