💥 Gate Square Event: #PostToWinFLK 💥
Post original content on Gate Square related to FLK, the HODLer Airdrop, or Launchpool for a chance to share a 200 FLK reward pool!
📅 Event Period: Oct 15, 2025, 10:00 – Oct 24, 2025, 16:00 UTC
📌 Related Campaigns:
HODLer Airdrop 👉 https://www.gate.com/announcements/article/47573
Launchpool 👉 https://www.gate.com/announcements/article/47592
FLK Campaign Collection 👉 https://www.gate.com/announcements/article/47586
📌 How to Participate:
1️⃣ Post original content related to FLK or one of the above campaigns (HODLer Airdrop / Launchpool).
2️⃣ Content mu
Vitalik Buterin: I hope developers will express performance as ratios, rather than "N operations per second."
Vitalik Buterin posted on X that he hopes more ZK and FHE developers will report performance as a ratio (encrypted computation time / raw computation time), rather than simply saying "we can do N operations per second." The ratio is largely hardware-independent and answers a genuinely valuable question: how much efficiency am I sacrificing to give the application encryption/privacy properties instead of relying on trust? It is also more useful for estimation: as a developer, I already know roughly how long the raw computation takes, so I only need to multiply by the ratio to estimate the encryption overhead. (Yes, I know this is hard to achieve exactly, because the operations involved in execution and in proving are heterogeneous, especially in SIMD parallelization and memory-access patterns, which means even the ratio is still somewhat hardware-dependent. Nevertheless, I believe the overhead factor is a valuable metric, even if imperfect.)
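The estimation workflow described above can be sketched in a few lines. This is a minimal illustration, not any real benchmark: all timings and function names below are hypothetical, chosen only to show how the overhead ratio is computed and then applied to a new workload.

```python
def overhead_ratio(encrypted_seconds: float, raw_seconds: float) -> float:
    """The suggested metric: encrypted computation time / raw computation time."""
    return encrypted_seconds / raw_seconds


def estimate_encrypted_time(raw_seconds: float, ratio: float) -> float:
    """Given a known raw runtime, estimate the encrypted runtime by multiplying by the ratio."""
    return raw_seconds * ratio


# Hypothetical measurement: proving one workload takes 500 s where the raw
# computation takes 0.1 s, giving an overhead factor of 5000x.
ratio = overhead_ratio(500.0, 0.1)

# A developer who knows another workload runs in ~2 s natively can now
# estimate its encrypted cost without re-benchmarking the prover.
estimated = estimate_encrypted_time(2.0, ratio)
print(f"overhead: {ratio:.0f}x, estimated encrypted runtime: {estimated:.0f} s")
```

The point of expressing performance this way is exactly what the post argues: "N operations per second" changes with the machine, while the ratio (approximately) cancels hardware out, subject to the heterogeneity caveats noted above.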