Title: A performance analysis of Delta and Huffman compression algorithms
Authors: Oke, A. O.; Fakolujo, O. A.; Emuoyibofarhe, O. J.
Abstract: With recent trends in information and communication technology, the storage and transfer of data are two vital issues, carrying cost and speed implications respectively. Large volumes of data (text or image) are constantly being processed on the Internet or on a personal computer, which has driven the upgrading of current systems. Hence the need for compression, which reduces the required storage capacity and affects the speed of transfer. Data compression is the act of reducing the size of a file by minimizing redundant data; in a text file, redundant data can be frequently occurring characters or common vowels. This research presents a comparative performance analysis of the Huffman and Delta compression schemes. A compression program is used to convert data from an easy-to-use format (ASCII) to one optimized for compactness. The Huffman and Delta algorithms were implemented in C#. Results on their efficiency are presented in terms of three parameters: the number of bits, the compression ratio, and the percentage of compression. It was found that the Huffman algorithm performs better for data compression, since it can store and transmit the least number of bits. The average compression percentage for the Huffman and Delta algorithms was found to be 39% and 45% respectively, which implies that for a large text file the Huffman algorithm will achieve a 39% reduction in file size and thereby increase the effective capacity of the storage medium.
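The two schemes and the compression-percentage metric described in the abstract can be illustrated with a short sketch. This is written in Python rather than the paper's C#, and the function names and the 8-bit-per-character ASCII baseline are assumptions for illustration, not the authors' implementation:

```python
import heapq
from collections import Counter

def huffman_code_lengths(text):
    """Standard Huffman construction over character frequencies;
    returns {char: code length in bits}. A generic sketch, not the
    paper's C# implementation."""
    freq = Counter(text)
    if len(freq) == 1:                        # degenerate single-symbol input
        return {next(iter(freq)): 1}
    # Heap entries: (frequency, unique tiebreaker, {char: depth so far}).
    heap = [(f, i, {ch: 0}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)       # merge the two least-frequent
        f2, _, d2 = heapq.heappop(heap)       # subtrees; their symbols each
        merged = {ch: d + 1 for ch, d in {**d1, **d2}.items()}  # gain 1 bit
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def delta_encode(values):
    """Delta encoding: keep the first value, then successive differences,
    which are typically small and cheaper to store."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def compression_percentage(text):
    """Percentage reduction of Huffman-coded size versus an
    8-bit-per-character ASCII baseline (baseline is an assumption)."""
    lengths = huffman_code_lengths(text)
    freq = Counter(text)
    compressed_bits = sum(f * lengths[ch] for ch, f in freq.items())
    return 100.0 * (1 - compressed_bits / (8 * len(text)))
```

For example, in `"aaaabbc"` the frequencies 4, 2, 1 yield code lengths of 1, 2, and 2 bits, so 10 bits replace the 56 bits of plain ASCII.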
Appears in Collections: scholarly works
Files in This Item: (28)ui_art_oke_performance_2009.pdf (1.85 MB, Adobe PDF)
Items in UISpace are protected by copyright, with all rights reserved, unless otherwise indicated.