Data compression is the encoding of information using fewer bits than the original representation. Compressed data takes up considerably less disk space, so more content can be stored in the same amount of space. Compression algorithms work in different ways: with lossless algorithms, only redundant bits are removed, so uncompressing the data restores it exactly, with no loss of quality; lossy algorithms discard bits permanently, so the uncompressed data is lower in quality than the original. Compressing and uncompressing content consumes system resources, particularly CPU processing time, so any hosting platform that compresses data in real time must have ample power to support that feature. A simple example of compression is substituting a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the sequence itself.
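The 6x1 idea in the last sentence is known as run-length encoding. A minimal sketch in Python (a toy illustration of the principle, not how production compressors work):

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Run-length encode a bit string: store (count, bit) pairs
    instead of the literal bits."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((j - i, bits[i]))
        i = j
    return runs


def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Reverse the encoding: expand each (count, bit) pair."""
    return "".join(bit * count for count, bit in runs)


# "111111" is stored as the single pair (6, "1") -- the "6x1" from the text.
print(rle_encode("111111"))   # [(6, '1')]
print(rle_decode([(6, "1")])) # 111111
```

Because no information is thrown away, decoding always reproduces the original exactly, which is what makes this a lossless scheme.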

Data Compression in Cloud Hosting

The compression algorithm used on the cloud hosting platform where your new cloud hosting account will be created is called LZ4, and it is applied by the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms used by other file systems: its compression ratio is higher and it processes data much faster. The speed is most noticeable when content is uncompressed, as this happens more quickly than data can be read from a hard drive, so LZ4 improves the performance of any site hosted on a server that uses it. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate several daily backups of the full content of all accounts and keep them for 30 days. Not only do these backups take up less space, but generating them does not slow the servers down the way it often does with other file systems.
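For readers who administer ZFS themselves, LZ4 is enabled as a per-dataset property. A sketch using the standard `zfs` command-line tool; the pool and dataset names (`tank/www`) are placeholders, not part of the hosting platform described above:

```shell
# Enable LZ4 compression on a ZFS dataset; newly written blocks
# are compressed transparently from this point on.
zfs set compression=lz4 tank/www

# Confirm the property took effect.
zfs get compression tank/www

# Inspect the compression ratio actually achieved on stored data.
zfs get compressratio tank/www
```

Because LZ4 decompresses faster than most disks can read, enabling it is often a net performance gain rather than a trade-off, which is why it is a common default choice for ZFS datasets.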