The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be either lossless or lossy, depending on whether the data removed is redundant or merely less important. When the data is uncompressed afterwards, in the first case the content and its quality will be identical to the original, while in the second case the quality will be lower. Different compression algorithms are more effective for different kinds of information. Compressing and uncompressing data usually takes considerable processing time, which means that the server carrying out the operation must have enough resources to process your data quickly. One example of how information can be compressed is to store how many sequential positions contain 1 and how many contain 0 in the binary code, rather than storing the individual 1s and 0s.
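The counting technique described above is known as run-length encoding. A minimal sketch in Python (the function names here are illustrative, not from any particular library) might look like this:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical symbols into (symbol, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, count) pairs back into the original string."""
    return "".join(bit * count for bit, count in runs)

encoded = rle_encode("0000001111100")
print(encoded)                                  # [('0', 6), ('1', 5), ('0', 2)]
print(rle_decode(encoded) == "0000001111100")   # True - nothing was lost
```

Because decoding restores the exact original sequence, this is a lossless method: it pays off on data with long uniform runs and can even expand data that alternates rapidly, which is why different algorithms suit different content.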
Data Compression in Shared Web Hosting
The compression algorithm that we use on the cloud hosting platform where your new shared web hosting account will be created is called LZ4, and it is provided by the state-of-the-art ZFS file system which powers the platform. LZ4 outperforms the algorithms other file systems use, as its compression ratio is considerably higher and it processes data significantly faster. The speed advantage is most noticeable when content is being uncompressed, since this happens more quickly than data can be read from a hard disk. For that reason, LZ4 improves the performance of every website stored on a server which uses this algorithm. We take full advantage of LZ4 in one more way - its speed and compression ratio allow us to generate multiple daily backups of the entire content of all accounts and keep them for thirty days. Not only do the backup copies take up less space, but their generation also won't slow the servers down, as can often happen with alternative file systems.
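With LZ4 on ZFS the compression happens transparently at the file-system level, so no application code is involved. To illustrate the same lossless round-trip idea in a self-contained way, the sketch below uses Python's standard-library zlib module as a stand-in (LZ4 itself is not in the standard library); the sample data is invented for the demonstration:

```python
import zlib

# Repetitive content, like the HTML markup a website serves, compresses well.
original = b"<div class='row'></div>" * 500

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(original), "->", len(compressed))  # far fewer bytes to store or back up
assert restored == original                  # lossless: the exact bytes come back
```

The same principle explains why compressed backups occupy less disk space yet restore the data byte-for-byte; LZ4 simply makes the trade-off at much higher speed than general-purpose algorithms such as zlib's DEFLATE.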