What will the inference cost of the best publicly available LM be in 2030?
7% (answer not shown)
.01-.1 docs: 12%
.1-1 docs: 23%
1-10 docs: 26%
10-100 docs: 12%
100-1K docs: 13%
1K-10K docs: 2%
10K-100K docs: 0.2%
.001-.01 docs: 4%
Consider the best publicly available language model in 2030. For a single 2023 US dollar, how many 2K-word documents can I generate five words of completion for (i.e., each document is a ~2K-word prompt with a five-word completion)?
I will pick a somewhat conservative estimate of the best inference cost I can achieve after working for three weeks with whatever funds I have at the time and without using publicly inaccessible resources.
Multimodal models that can operate on text count as LMs for the purpose of this question.
I will only accept answers that span one order of magnitude (OOM), like so:
.1-1 docs, 1-10 docs, 10-100 docs
I may trade in this market.
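For concreteness, here is a minimal sketch of how a docs-per-dollar estimate would map onto the one-OOM answer ranges above. The helper function is hypothetical, not part of the market mechanics; it just formats buckets the way the answers are written (e.g. ".1-1 docs", "100-1K docs").

```python
import math

# Hypothetical helper: map a docs-per-dollar estimate to the one-OOM answer
# range it falls in, using the market's label style (".1-1 docs", "1K-10K docs", ...).
def oom_bucket(docs_per_dollar: float) -> str:
    low = 10.0 ** math.floor(math.log10(docs_per_dollar))
    high = low * 10

    def fmt(x: float) -> str:
        if x >= 1000:
            return f"{x / 1000:g}K"
        s = f"{x:g}"
        return s[1:] if s.startswith("0.") else s  # the market writes ".1", not "0.1"

    return f"{fmt(low)}-{fmt(high)} docs"

print(oom_bucket(187))   # "100-1K docs"
print(oom_bucket(0.5))   # ".1-1 docs"
```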
This question is managed and resolved by Manifold.
The new ChatGPT model (https://openai.com/blog/introducing-chatgpt-and-whisper-apis) is $0.002 per 1K tokens, so:
187 docs/$ = 1 / [(2.005 × 1K words/doc) × (4/3 tok/word) × (0.002 $/1K tok)]
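As a sanity check on that arithmetic, a small sketch, assuming the 4/3 tokens-per-word rule of thumb and that prompt and completion tokens are billed at the same flat rate:

```python
# Reproduce the ~187 docs/$ figure from the quoted ChatGPT API price.
# Assumptions: 4/3 tokens per word; prompt and completion billed at the same rate.
price_per_1k_tokens = 0.002          # USD per 1K tokens (ChatGPT API price cited above)
words_per_doc = 2000 + 5             # 2K-word prompt plus a 5-word completion
tokens_per_doc = words_per_doc * (4 / 3)
cost_per_doc = tokens_per_doc / 1000 * price_per_1k_tokens
print(f"{1 / cost_per_doc:.0f} docs per dollar")  # -> 187
```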
Related questions
How much will gpt-4.5 inference cost?
Will OpenAI inference costs fall by 100x over the next 18 months? (32% chance)
Will there be a gpt-4 quality LLM with distributed inference by the end of 2024? (27% chance)
Will the best LLM in 2024 have <1 trillion parameters? (30% chance)
Will second-order optimizers displace first-order optimizers for training LLMs by 2030? (42% chance)
Before 2032, how much will be spent lobbying against regulation of General AI systems? (2024$, best guess)
Will the best LLM in 2024 have <500 billion parameters? (15% chance)
Will it cost $30 to train a GPT-3 level model in 2030? (19% chance)
Will Inflection AI claim to have the best LLM in the world before 2025? (18% chance)
Will the best LLM in 2025 have <500 billion parameters? (25% chance)