Data compression is the compacting of information by reducing the number of bits that are stored or transmitted. Compressed content takes up considerably less disk space than the original, so more of it can be stored in the same amount of space. Various compression algorithms exist and they work in different ways. Some of them remove only redundant bits, so once the information is uncompressed there is no loss of quality; others remove excessive bits as well, so uncompressing the data afterwards results in lower quality than the original. Compressing and uncompressing content consumes significant system resources, in particular CPU time, so any Internet hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of compression is substituting a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the actual sequence.
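The 111111-to-6x1 substitution described above is a form of run-length encoding. A minimal sketch in Python might look like this (the function names and the "6x1"-style output format are illustrative choices, not part of any particular compression standard):

```python
def rle_encode(bits: str) -> str:
    """Collapse runs of identical characters into count-x-character pairs."""
    if not bits:
        return ""
    out = []
    run_char, run_len = bits[0], 1
    for ch in bits[1:]:
        if ch == run_char:
            run_len += 1
        else:
            out.append(f"{run_len}x{run_char}")
            run_char, run_len = ch, 1
    out.append(f"{run_len}x{run_char}")  # flush the final run
    return " ".join(out)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: expand each count-x-character pair back into a run."""
    if not encoded:
        return ""
    return "".join(
        ch * int(n) for n, ch in (pair.split("x") for pair in encoded.split())
    )
```

For example, `rle_encode("111111")` returns `"6x1"`, and decoding the result restores the original string exactly, which is why this kind of compression is lossless.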

Data Compression in Shared Web Hosting

The compression algorithm employed by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can boost the performance of any website hosted in a shared web hosting account with us because it not only compresses data more efficiently than the algorithms used by other file systems, but also uncompresses data faster than a hard drive can read it. It achieves this by using a lot of CPU time, which is not a problem for our platform, as it runs on clusters of powerful servers working together. A further advantage of LZ4 is that it allows us to create backups faster and with less disk space, so we keep several daily backups of your files and databases without affecting the performance of the servers. That way, we can always restore any content that you may have deleted by accident.