waigl@lemmy.world to Linux@lemmy.ml • Which of the 3 standard compression algorithms on Unix (gz, xz, or bz2) is best for long term data archival at their highest compression? • 4 days ago

> Error correction and compression are usually at odds.

Not really. If your data compresses well, you can easily compress it by 60 to 70%, then add Reed-Solomon forward error correction blocks at around 20% redundancy, and still come out ahead overall.
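A quick back-of-the-envelope check of that claim, using hypothetical numbers (a 100 MB archive, 65% compression, parity sized at 20% of the compressed payload):

```python
# Hypothetical size math for compress-then-add-FEC (numbers are
# illustrative, not from any particular tool).
original_mb = 100.0
compression_ratio = 0.65   # data shrinks by 65% when compressed
rs_redundancy = 0.20       # Reed-Solomon parity at 20% of the payload

compressed_mb = original_mb * (1 - compression_ratio)   # 35 MB
protected_mb = compressed_mb * (1 + rs_redundancy)      # 42 MB

print(f"compressed:      {compressed_mb:.0f} MB")
print(f"with RS parity:  {protected_mb:.0f} MB")
# 42 MB total, well under the uncompressed 100 MB, so the
# compression + FEC combination still comes out ahead.
```

The break-even point is where `(1 - compression_ratio) * (1 + rs_redundancy) = 1`; at 20% redundancy, any compression better than about 17% already leaves you ahead.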