The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. This can be done with or without loss of information, which means that what gets removed during compression is either redundant data or unnecessary data. When the data is decompressed later on, in the first case the information and the quality will be identical to the original, whereas in the second case the quality will be lower. Different compression algorithms are more effective for different types of data. Compressing and decompressing data usually takes considerable processing time, so the server carrying out the operation must have sufficient resources to process the data quickly enough. A simple example of how information can be compressed is run-length encoding: instead of storing every individual 1 and 0 in the binary code, you store how many consecutive positions contain a 1 and how many contain a 0.
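To make the idea concrete, here is a minimal run-length encoding sketch in Python. The function names are illustrative only and do not belong to any particular library:

```python
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[str, int]]:
    # Store each bit together with how many times it repeats
    # consecutively, instead of storing every single 1 and 0.
    return [(bit, len(list(run))) for bit, run in groupby(bits)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    # Expand the (bit, count) pairs back into the original string.
    return "".join(bit * count for bit, count in pairs)

data = "1111100000000111"
encoded = rle_encode(data)
print(encoded)                       # [('1', 5), ('0', 8), ('1', 3)]
assert rle_decode(encoded) == data   # lossless: identical after decompression
```

Because nothing is discarded, this is a lossless scheme: decoding always reproduces the exact original bit string.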
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than most alternative algorithms, especially when compressing and decompressing non-binary data such as web content. LZ4 can even decompress data faster than it can be read from a hard disk, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backup copies of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, the backup generation will not affect the performance of the hosting servers where your content is kept.
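As a rough illustration of how LZ4 behaves on text-like web content, here is a short sketch using the third-party Python lz4 package (installed with pip install lz4). This is a general demonstration of the algorithm, not the hosting platform's internal code:

```python
import lz4.frame  # third-party package: pip install lz4

# Repetitive, text-like data similar to typical web content
# compresses well with LZ4.
original = b"<p>Hello, world!</p>\n" * 1000

compressed = lz4.frame.compress(original)
restored = lz4.frame.decompress(compressed)

print(f"original:   {len(original)} bytes")
print(f"compressed: {len(compressed)} bytes")
assert restored == original  # LZ4 is lossless
```

The same trade-off applies at the file-system level: ZFS compresses each block with LZ4 transparently, so stored data and its backups occupy less space while reads and writes remain fast.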