Data compression is the compacting of information by reducing the number of bits that are stored or transmitted. The compressed data takes up less disk space than the original, so much more content can be stored in the same amount of space. There are many different compression algorithms that work in different ways. With some of them, only redundant bits are removed, which means that once the data is uncompressed, there is no loss of quality. Others discard bits deemed unneeded, so decompressing the data at a later time yields lower quality than the original. Compressing and decompressing content requires considerable system resources, particularly CPU processing time, so any hosting platform that employs compression in real time must have sufficient power to support this feature. One example of how data can be compressed is to substitute a binary code such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there should be instead of storing the actual sequence.
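The substitution example above (111111 becoming 6x1) is run-length encoding. A minimal sketch in Python might look as follows; the function names and the comma-separated output format are illustrative choices, not part of any standard:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(f"{count}x{prev}")
            count = 1
    runs.append(f"{count}x{bits[-1]}")  # flush the final run
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding, restoring the original bit string exactly."""
    if not encoded:
        return ""
    return "".join(bit * int(n)
                   for n, bit in (run.split("x") for run in encoded.split(",")))
```

Because decoding restores the input exactly, this is a lossless scheme: for instance, `rle_encode("0001111011")` yields `"3x0,4x1,1x0,2x1"`, and decoding that string returns the original bits unchanged.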
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than most alternatives, particularly at compressing and decompressing non-binary data, i.e. web content. LZ4 can even decompress data faster than it can be read from a hard disk drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backup copies of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation will not affect the performance of the web servers where your content is kept.
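LZ4 itself is not part of the Python standard library, so as a stand-in sketch, the snippet below uses stdlib zlib at its fastest setting to illustrate the same idea: a fast, lossless round trip on repetitive web content, where the decompressed bytes match the original exactly. The sample HTML fragment is made up for the demonstration:

```python
import zlib

# Hypothetical repetitive "web content" -- markup compresses very well.
html = b"<li class='item'>entry</li>\n" * 500

# level=1 favours speed over ratio, in the spirit of LZ4's design goals.
packed = zlib.compress(html, level=1)

# Lossless: decompression restores the original byte-for-byte.
assert zlib.decompress(packed) == html

print(f"{len(html)} bytes compressed to {len(packed)} bytes")
```

The compressed copy is a small fraction of the original size, which is why both live content and its daily backups occupy far less disk space on a compressing file system.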