aicodix/cfe
Cauchy Fermat Prime Field Erasure Coding

Quick start:

Create a file input.dat containing 128 KiB of random data:

dd if=/dev/urandom of=input.dat bs=512 count=256

Encode input.dat into a thousand chunk files, each 1024 bytes in size:

./encode input.dat 1024 chunk{000..999}.cfe

The output should be CFE(1000, 131), which means any 131 of the 1000 encoded chunks are enough to reconstruct the original file.
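The "any k of n chunks suffice" property comes from the structure named in the title: the generator is a Cauchy matrix over a prime field, and every k×k submatrix of a Cauchy matrix is invertible. Below is a minimal sketch of that idea in Python, not this repository's implementation: it uses the small Fermat prime 257 (this project's actual field and chunk format may differ), encodes k = 3 symbols into n = 6, discards all but 3, and recovers the data by solving against the surviving rows.

```python
p = 257  # Fermat prime 2^8 + 1; all arithmetic is in GF(p)

def inv(a):
    # Modular inverse via Fermat's little theorem
    return pow(a, p - 2, p)

def cauchy_matrix(xs, ys):
    # A[i][j] = 1 / (x_i - y_j); invertible submatrices whenever
    # the x's and y's are pairwise distinct and disjoint
    return [[inv((x - y) % p) for y in ys] for x in xs]

def matvec(A, v):
    return [sum(a * b for a, b in zip(row, v)) % p for row in A]

def solve(A, b):
    # Gauss-Jordan elimination mod p on the augmented matrix [A | b]
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] % p != 0)
        M[col], M[piv] = M[piv], M[col]
        s = inv(M[col][col])
        M[col] = [v * s % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(v - f * w) % p for v, w in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

k, n = 3, 6
data = [42, 7, 199]
xs = list(range(k, k + n))   # row points, disjoint from ys
ys = list(range(k))          # column points
G = cauchy_matrix(xs, ys)    # n-by-k generator
code = matvec(G, data)       # n encoded symbols ("chunks")

keep = [1, 4, 5]             # erase the rest, keep any k symbols
sub = [G[i] for i in keep]   # the k-by-k Cauchy submatrix is invertible
recovered = solve(sub, [code[i] for i in keep])
print(recovered == data)
```

Scaling the same construction up to symbols in GF(65537) (the Fermat prime 2^16 + 1) gives 16-bit symbols, which is one natural fit for the "Fermat prime field" in this project's name; the choice of field and matrix parameters here is illustrative only.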

Randomly delete (erase) 869 of the chunk files, keeping only 131:

ls chunk*.cfe | sort -R | head -n 869 | xargs rm

Decode output.dat from the remaining chunk files:

./decode output.dat chunk*.cfe

Compare the original input.dat with the decoded output.dat:

diff -q -s input.dat output.dat
