I’ve been measuring the timings of a 1D commitment scheme, with the following results:
trusted-setup-generation/trusted-setup-elements-size: 699050
time: [1.4975 s 1.5317 s 1.5753 s]
trusted-setup-generation/trusted-setup-elements-size: 1048576
time: [2.2183 s 2.2315 s 2.2494 s]
trusted-setup-generation/trusted-setup-elements-size: 2097152
time: [4.3336 s 4.3599 s 4.3877 s]
polycommit/commit-for-data-size-64MB
time: [2.1694 s 2.1936 s 2.2218 s]
polycommit/generate-proof
time: [2.2696 s 2.2817 s 2.2989 s]
The trusted setup is for 2097152 elements of 32 bytes each (64 MB total).
The trusted-setup and commit times are acceptable, but generating a single proof takes too long. Keep in mind that we need one proof per element: at roughly 2.28 s per proof and 2097152 elements, that works out to around 1300 hours of compute. Parallelizing across proofs doesn't help either, since the library already parallelizes internally.
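The total-time estimate can be sanity-checked with a quick calculation. This assumes the median Criterion proof time above (2.2817 s) and one proof per 32-byte element of the 64 MB input:

```python
# Back-of-envelope estimate of total proof-generation time,
# based on the benchmark numbers reported above.
ELEMENTS = 2_097_152      # 64 MB / 32 bytes per element
PROOF_TIME_S = 2.2817     # median single-proof time from Criterion

total_seconds = ELEMENTS * PROOF_TIME_S
total_hours = total_seconds / 3600
print(f"{total_hours:.0f} hours")
```

This prints about 1329 hours of sequential proving, i.e. roughly 55 days on a single machine, which is what makes the per-element proof cost the bottleneck.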