The term data compression means reducing the number of bits needed to store or transmit information. Compression can be lossless or lossy: in the first case only redundant data is removed, so when the data is decompressed later the content and its quality are identical to the original, whereas in the second case less important data is discarded as well, so the restored copy is of lower quality. There are various compression algorithms, each better suited to different types of content. Compressing and decompressing data normally takes a lot of processing time, so the server performing the operation must have enough resources to handle the data fast enough. One simple example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, instead of storing the actual 1s and 0s.
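The counting idea described above is known as run-length encoding. A minimal sketch in Python, with hypothetical helper names chosen for illustration:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a string of '0'/'1' characters into (bit, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous run: extend the run's count.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A different bit starts a new run of length 1.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string by repeating each bit count times."""
    return "".join(bit * count for bit, count in runs)

data = "1111100011"
encoded = rle_encode(data)
print(encoded)                       # [('1', 5), ('0', 3), ('1', 2)]
print(rle_decode(encoded) == data)   # True: lossless round trip
```

Storing three (bit, count) pairs instead of ten individual bits is what makes this a compression scheme; the saving grows with longer runs of identical bits.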

Data Compression in Shared Hosting

The compression algorithm employed by the ZFS file system, which runs on our cloud hosting platform, is called LZ4. It can improve the performance of any website hosted in a shared hosting account with us: not only does it compress data better than the algorithms used by other file systems, it also decompresses data faster than a hard disk can read it. This is achieved at the cost of a great deal of CPU processing time, which is not a problem for our platform, as it uses clusters of powerful servers working together. Another advantage of LZ4 is that it allows us to generate backups faster and store them in less disk space, so we keep several daily backups of your files and databases, and creating them does not affect the performance of the servers. This way, we can always restore any content that you may have deleted by mistake.
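The trade-off described above, spending CPU time to shrink data while keeping it bit-for-bit recoverable, can be demonstrated with Python's standard library. LZ4 itself is not in the standard library, so this sketch uses zlib as a stand-in to illustrate the same principle of transparent, lossless compression:

```python
import zlib

# Repetitive data, like much website content, compresses well.
original = b"<div class='post'>Hello, world!</div>\n" * 200

# Compressing costs CPU time but reduces the bytes to store or back up.
compressed = zlib.compress(original, level=6)

# Decompressing restores the data exactly, with no loss of quality.
restored = zlib.decompress(compressed)

assert restored == original
print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")
```

A file system such as ZFS performs this kind of compress-on-write and decompress-on-read automatically, so applications never see anything but the original data.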