What we already explored:
- Full replication
- KZG univariate with chunk sampling
- Multivariate polynomial commitment
- Greyhound polynomial commitment
- Blaze polynomial commitment
- ZODA with Hadamard and tensor
- FRIDA
- WHIR
- Semi-AVID
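Despite their differences, the schemes above all build on the same erasure-coded dispersal pattern: the disperser encodes the data into n shards such that any k of them suffice to reconstruct, and each DA node holds one shard. A minimal sketch of that pattern, assuming Reed-Solomon coding via polynomial evaluation over a toy prime field (this is illustrative, not any one scheme's actual implementation, and it omits commitments and proofs entirely):

```python
P = 2**31 - 1  # toy prime field modulus (real schemes use much larger fields)

def encode(chunks, n):
    """Treat the k data chunks as coefficients of a polynomial f and
    return n shards f(1), ..., f(n). Any k shards determine f."""
    return [sum(c * pow(x, i, P) for i, c in enumerate(chunks)) % P
            for x in range(1, n + 1)]

def _poly_mul_linear(poly, r):
    """Multiply a polynomial (coeff list, low to high) by (X - r) mod P."""
    out = [0] * (len(poly) + 1)
    for d, c in enumerate(poly):
        out[d] = (out[d] - r * c) % P
        out[d + 1] = (out[d + 1] + c) % P
    return out

def decode(points, k):
    """Recover the k coefficients from any k (x, y) shard pairs
    by Lagrange interpolation."""
    pts = points[:k]
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(pts):
        num = [1]   # numerator polynomial: prod_{j != i} (X - x_j)
        denom = 1   # denominator: prod_{j != i} (x_i - x_j)
        for j, (xj, _) in enumerate(pts):
            if i == j:
                continue
            num = _poly_mul_linear(num, xj)
            denom = denom * (xi - xj) % P
        scale = yi * pow(denom, P - 2, P) % P  # Fermat inverse mod P
        for d in range(k):
            coeffs[d] = (coeffs[d] + scale * num[d]) % P
    return coeffs
```

The comparison below is about what each scheme layers on top of this: how the disperser commits to the encoded data, and what proof lets a DA node check its shard against that commitment.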
What we want to compare:
- Does it require a trusted setup?
- What’s the size of the setup?
- Is it post-quantum?
- What’s the prover complexity for the disperser?
- What’s the size of a shard (without proof), as a percentage of the original data?
- What’s the size of the commitment(s) required to verify the shard?
- What’s the proof size of one shard?
- What’s the verifier complexity of one shard (DA node)?
- Does it support chunk sampling?
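The questions above can be captured as one record per scheme, which makes the eventual comparison table easy to build and keeps the criteria consistent across schemes. A sketch, with an illustrative entry using the standard KZG parameters (powers-of-tau trusted setup, one group element per commitment and per opening proof, pairing-based verification, not post-quantum); the complexity strings are assumptions about how the table will be filled in, not fixed values:

```python
from dataclasses import dataclass

@dataclass
class SchemeProfile:
    """One comparison row; fields mirror the questions above."""
    name: str
    trusted_setup: bool
    setup_size: str           # e.g. "O(d) group elements"
    post_quantum: bool
    prover_complexity: str    # disperser cost
    shard_size: str           # shard size (no proof), % of original data
    commitment_size: str      # commitments needed to verify a shard
    proof_size: str           # per-shard opening proof
    verifier_complexity: str  # per-shard (DA node) cost
    chunk_sampling: bool

# Illustrative entry with the well-known KZG properties; shard size
# depends on the code rate chosen, so it is left parameterized here.
kzg = SchemeProfile(
    name="KZG univariate",
    trusted_setup=True,
    setup_size="O(d) group elements (powers of tau)",
    post_quantum=False,
    prover_complexity="O(d) group ops to commit, plus opening proofs",
    shard_size="1/rate of a row, e.g. ~33% at rate 1/3",
    commitment_size="1 group element (~48 B on BLS12-381)",
    proof_size="1 group element (~48 B)",
    verifier_complexity="2 pairings per opening",
    chunk_sampling=True,
)
```

Filling one such record per scheme in the first list then reduces the comparison to iterating over a list of `SchemeProfile`s.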