Suppose we want DA to support at least 1 GiB of blob data per block while limiting each blob to around 4 MiB. Each block would then carry roughly 250 blobs (1 GiB / 4 MiB = 256), and DA nodes would receive 250 shares along with their proofs.
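A quick sketch of the blob-count arithmetic:

```python
# How many ~4 MiB blobs fit in 1 GiB of blob data per block.
GIB = 1 << 30        # 1 GiB in bytes
BLOB_SIZE = 4 << 20  # 4 MiB in bytes

print(GIB // BLOB_SIZE)  # 256, i.e. around 250 blobs per block
```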

|  | KZG | Greyhound | Blaze | ZODA (512x512) | ZODA (1024x1024) |
| --- | --- | --- | --- | --- | --- |
| Proof per share | 48 B | 43 kiB | 30 kiB | 0.93 kiB | 1.03 kiB |
| Size of a share | 4 kiB | 4 kiB | 4 kiB | 32 kiB | 16 kiB |
| Commitment size | 48 B | 4464 B | 32 B | 64 B | 64 B |
| Share + verification material (one sample) | 4 kiB | 52 kiB | 34 kiB | 33 kiB | 17 kiB |
| DA node required bandwidth for 250 blobs of 4 MiB per block | **33.3 kiB/s (267 kbit/s)** | **428 kiB/s (3424 kbit/s)** | 283.3 kiB/s (2267 kbit/s) | 274.4 kiB/s (2195 kbit/s) | 141.9 kiB/s (1135 kbit/s) |
| Chunk sampling | Possible | Not Possible | Not Possible | Not Possible | Not Possible |
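A minimal sketch of how the bandwidth row is derived, assuming each DA node downloads one sample (share, proof, and commitment) per blob each block. The block time is not stated in the table; 30 seconds is an assumption here, chosen because it reproduces the table's figures up to rounding:

```python
# Bandwidth per DA node: one sample (share + proof + commitment) per blob,
# 250 blobs per block. All sizes in kiB, taken from the table above.
SCHEMES = {
    # name: (share, proof per share, commitment), all in kiB
    "KZG":              (4.0,  48 / 1024,  48 / 1024),
    "Greyhound":        (4.0,  43.0,       4464 / 1024),
    "Blaze":            (4.0,  30.0,       32 / 1024),
    "ZODA (512x512)":   (32.0, 0.93,       64 / 1024),
    "ZODA (1024x1024)": (16.0, 1.03,       64 / 1024),
}

BLOBS_PER_BLOCK = 250  # ~1 GiB of blob data in ~4 MiB blobs
BLOCK_TIME_S = 30      # assumed block time; reproduces the table's numbers

for name, (share, proof, commitment) in SCHEMES.items():
    sample = share + proof + commitment  # download for one sample
    kib_per_s = sample * BLOBS_PER_BLOCK / BLOCK_TIME_S
    print(f"{name:18} {kib_per_s:6.1f} kiB/s  {kib_per_s * 8:5.0f} kbit/s")
```

Small deviations from the table (e.g. 34.1 vs 33.3 kiB/s for KZG) come from how the per-sample totals were rounded before multiplying.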

Shares for Candidates

Commitments for Candidates

Proofs for Candidates

Calculation Details

Blaze