At least one of the most powerful neural nets at end of 2030 will be trained using 10^27 FLOPs
90% chance
Resolves YES if at least one of the most powerful neural nets publicly known to exist by the end of 2030 was trained using at least 10^27 FLOPs. This is roughly 30 exaFLOP/s sustained for a year. It does not matter whether the compute is distributed, as long as one of the top models used it. A neural net trained with 10^27 FLOPs that is nonetheless inferior to other models does not count. Low-precision floating-point formats such as fp32, fp16, or fp8 are permitted.
Resolves NO if no such model exists by end of 2030.
If no good estimates of the training compute of top models are available, resolves N/A.
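As a sanity check on the threshold, the conversion from total training FLOPs to sustained compute is simple arithmetic. The Python sketch below (assuming a Julian year of 365.25 days; the exact calendar choice barely matters) reproduces the ~30 exaFLOP/s-years figure quoted above.

```python
# Sanity check: express 1e27 FLOPs as sustained exaFLOP/s over one year.
total_flops = 1e27
seconds_per_year = 365.25 * 24 * 3600       # ~3.156e7 seconds (Julian year)
exaflop_per_s = 1e18                        # 1 exaFLOP/s = 1e18 FLOP/s
print(total_flops / (exaflop_per_s * seconds_per_year))  # -> ~31.7 exaFLOP/s-years
```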
This question is managed and resolved by Manifold.
Related questions
At least one of the most powerful neural nets at end of 2026 will be trained using 10^27 FLOPs
81% chance
At least one of the most powerful neural nets at end of 2026 will be trained using 10^26 FLOPs
86% chance
At least one of the most powerful neural nets at end of 2030 will be trained using 10^26 FLOPs
89% chance
Will a machine learning training run exceed 10^25 FLOP in China before 2025?
77% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2025?
15% chance
By March 14, 2025, will there be an AI model with over 10 trillion parameters?
62% chance
Will a machine learning training run exceed 10^27 FLOP in China before 2028?
44% chance
Will a lab train a >=1e26 FLOP state space model before the end of 2025?
22% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2027?
53% chance
Neural Nets will generate at least 1 scientific breakthrough or novel theorem by the end of 2025
39% chance