Data compression is the process of encoding data using fewer bits than the original representation, so that it requires less space in storage or less bandwidth in transit. Compressed content therefore takes up considerably less disk space than the original, which means more content can be kept in the same amount of space. Compression algorithms work in different ways: with lossless algorithms, only redundant bits are removed, so once the data is uncompressed there is no loss of quality; lossy algorithms discard bits deemed unnecessary, so uncompressing the data afterwards yields lower quality than the original. Compressing and uncompressing content requires considerable system resources, particularly CPU time, so any web hosting platform that employs real-time compression needs sufficient processing power to support that feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the sequence itself.
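
As a rough illustration of the "6x1" idea, here is a minimal run-length encoding sketch in Python. The function names are hypothetical and the code only mirrors the principle of storing run counts instead of the raw symbols; it is not the algorithm used on any particular hosting platform.

```python
# Minimal run-length encoding sketch: runs of identical symbols are
# stored as (count, symbol) pairs instead of the raw sequence.

def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Collapse runs of identical characters into (count, char) pairs."""
    runs = []
    for ch in bits:
        if runs and runs[-1][1] == ch:
            runs[-1] = (runs[-1][0] + 1, ch)
        else:
            runs.append((1, ch))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand (count, char) pairs back into the original string."""
    return "".join(ch * count for count, ch in runs)

if __name__ == "__main__":
    original = "111111"
    encoded = rle_encode(original)        # [(6, '1')], i.e. "6 x 1"
    assert rle_decode(encoded) == original  # lossless round trip
    print(encoded)
```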

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. LZ4 is considerably faster than comparable algorithms, particularly when compressing and uncompressing non-binary data such as web content. It can even uncompress data faster than that data can be read from a hard disk, which improves the overall performance of sites hosted on ZFS-based platforms. Because LZ4 compresses data well and does so very quickly, we are able to generate several backups of all the content stored in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, the backup generation does not affect the performance of the servers where your content is kept.
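
For a sense of how LZ4 behaves on repetitive text such as HTML, here is a small, hedged Python sketch using the third-party python-lz4 package (installed with pip install lz4). It only demonstrates the lossless round trip and compression ratio of the algorithm itself; on a ZFS-based platform the compression happens transparently at the file-system level, so no application code like this is required.

```python
# Lossless LZ4 compression of repetitive web content, using the
# third-party python-lz4 package (this package is an assumption,
# not part of the hosting platform itself).
import lz4.frame

html = b"<html><body>" + b"<p>Hello, world!</p>" * 1000 + b"</body></html>"

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

assert restored == html  # lossless: the original content comes back intact
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
print(f"ratio: {len(html) / len(compressed):.1f}x")
```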

Data Compression in Semi-dedicated Hosting

The semi-dedicated hosting plans that we offer run on a powerful cloud hosting platform built on the ZFS file system. ZFS uses a compression algorithm called LZ4 that outperforms comparable algorithms in both speed and compression ratio when processing website content. This is especially true for decompression, which LZ4 performs faster than uncompressed data can be read from a hard disk, so websites running on a platform that uses LZ4 load faster. We can take advantage of this feature even though it requires a considerable amount of CPU time, because our platform uses many powerful servers working together rather than placing accounts on a single machine, as most companies do. There is another benefit of using LZ4: since it compresses data very well and does so extremely fast, we can also make several daily backups of all accounts without affecting server performance, and keep them for an entire month. That way, you can always restore any content you delete by mistake.
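
The claim that decompression can outpace a disk read can be spot-checked with a rough benchmark sketch like the one below, again using the third-party python-lz4 package. The file path is hypothetical, and the numbers are indicative only, since operating-system caching and the speed of the underlying drive strongly affect the result.

```python
# Rough comparison: reading an uncompressed file from disk vs.
# decompressing the same content from an in-memory LZ4 buffer.
# Results vary with hardware and OS caching; treat as a sketch.
import os
import time
import lz4.frame

PATH = "site_content.html"                # hypothetical test file
data = b"<p>sample web content</p>" * 500_000

with open(PATH, "wb") as f:               # create the test file to read back
    f.write(data)

compressed = lz4.frame.compress(data)

start = time.perf_counter()
with open(PATH, "rb") as f:
    from_disk = f.read()
disk_time = time.perf_counter() - start

start = time.perf_counter()
from_lz4 = lz4.frame.decompress(compressed)
lz4_time = time.perf_counter() - start

assert from_disk == from_lz4
print(f"disk read: {disk_time:.4f}s, lz4 decompress: {lz4_time:.4f}s")
os.remove(PATH)                           # clean up the test file
```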