Conditional on not having died from unaligned AGI, I consider myself a full-time alignment researcher by the end of 2030
34% chance
I suspect that the primary mechanism by which this market resolves to NO would be either burnout or running out of funding. However, do not be limited to these mechanisms when trading.
Relevant market: https://manifold.markets/AlanaXiang/will-i-consider-myself-a-fulltime-a
I do not intend to buy shares in this market (either YES or NO).
This question is managed and resolved by Manifold.
Related questions
Will we reach "weak AGI" by the end of 2025? (30% chance)
In 2025, will I believe that aligning automated AI research AI should be the focus of the alignment community? (59% chance)
Will we get AGI before 2030? (51% chance)
Will we get AGI before 2036? (69% chance)
Will I have a career as an alignment researcher by the end of 2024? (38% chance)
Will we get AGI before 2031? (57% chance)
Conditional on there being no AI takeoff before 2030, will the majority of AI researchers believe that AI alignment is solved? (34% chance)
Will tailcalled think that the Brain-Like AGI alignment research program has achieved something important by October 20th, 2026? (36% chance)
Will we get AGI before 2032? (61% chance)
I make a contribution to AI safety that is endorsed by at least one high-profile AI alignment researcher by the end of 2026 (59% chance)