Data compression is the encoding of information in fewer bits than the original representation, so the compressed data takes up considerably less disk space and more content can be stored in the same amount of space. There are various compression algorithms that work in different ways. With lossless algorithms, only redundant bits are removed, so when the data is uncompressed it is identical to the original. Lossy algorithms discard bits that are considered unneeded, so uncompressing the data later results in lower quality compared with the original. Compressing and uncompressing content requires a considerable amount of system resources, in particular CPU processing time, so any Internet hosting platform which employs compression in real time must have sufficient power to support that feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of keeping the actual bits.
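To make the 111111-to-6x1 idea concrete, here is a minimal run-length encoding sketch in Python. The function names and sample string are illustrative only; this shows the principle of collapsing repeated bits, not any specific algorithm used by a hosting platform.

```python
def rle_encode(bits: str) -> str:
    """Collapse runs of identical characters into "<count>x<char>" tokens."""
    encoded = []
    i = 0
    while i < len(bits):
        run_start = i
        while i < len(bits) and bits[i] == bits[run_start]:
            i += 1
        encoded.append(f"{i - run_start}x{bits[run_start]}")
    return " ".join(encoded)


def rle_decode(encoded: str) -> str:
    """Expand "<count>x<char>" tokens back into the original bit string."""
    parts = []
    for token in encoded.split():
        count, char = token.split("x")
        parts.append(char * int(count))
    return "".join(parts)


original = "111111000011"
packed = rle_encode(original)          # "6x1 4x0 2x1"
assert rle_decode(packed) == original  # lossless: nothing is lost on expansion
print(packed)
```

Because every run can be expanded back exactly, this kind of scheme is lossless: the decoded data is bit-for-bit identical to the input.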

Data Compression in Cloud Web Hosting

The ZFS file system that runs on our cloud hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than most other widely used algorithms, particularly when compressing and uncompressing non-binary data such as website content. It can even uncompress data faster than the data can be read from a hard disk drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backups of all the content stored in the cloud web hosting accounts on our servers every day. Both your content and its backups will take up less space, and since both ZFS and LZ4 work very quickly, generating the backups will not affect the performance of the servers where your content is kept.
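For readers who want to try LZ4 outside of ZFS, the sketch below compresses and uncompresses some repetitive text with the third-party lz4 package for Python (installable with pip install lz4). The package and the sample markup are assumptions for illustration, not part of our platform, but they show why web content tends to compress so well.

```python
import lz4.frame

# Highly repetitive text, similar to typical HTML/CSS, compresses very well.
page = b"<div class='row'><p>sample content</p></div>\n" * 2000

compressed = lz4.frame.compress(page)
restored = lz4.frame.decompress(compressed)

assert restored == page  # lossless: the restored data is byte-for-byte identical
print(f"original:   {len(page)} bytes")
print(f"compressed: {len(compressed)} bytes "
      f"({len(compressed) / len(page):.1%} of the original size)")
```

Repetitive markup like HTML and CSS is exactly the kind of non-binary content where LZ4 achieves both a good ratio and very high speed.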

Data Compression in Semi-dedicated Servers

The ZFS file system which runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms available for compressing and uncompressing website content, as it achieves a high compression ratio and uncompresses data faster than the same data could be read from a hard disk drive if it were stored uncompressed. As a result, LZ4 speeds up every website that runs on a platform where the algorithm is used. This level of performance requires a lot of CPU processing time, which is provided by the numerous clusters working together as part of our platform. In addition, LZ4 allows us to generate several backups of your content every day and keep them for a month, since they take up much less space than standard backups and are created much faster without putting load on the servers.
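As a rough way to check the claim that decompression can outpace disk reads, the sketch below times lz4.frame.decompress on a block of repetitive data and reports the throughput. The lz4 package and the sample data are assumptions for illustration, and the resulting figure depends entirely on the machine it runs on; compare it with your drive's sequential read speed.

```python
import time
import lz4.frame

# Repetitive sample data standing in for website content (illustrative only).
data = b"<html><body><p>hello world</p></body></html>\n" * 200_000
compressed = lz4.frame.compress(data)

start = time.perf_counter()
for _ in range(10):
    lz4.frame.decompress(compressed)
elapsed = time.perf_counter() - start

# Throughput measured as uncompressed output produced per second.
throughput_mb_s = (len(data) * 10) / elapsed / 1_000_000
print(f"decompressed at roughly {throughput_mb_s:.0f} MB/s")
```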