Longtermism
Caring about the long-term future.
A common critique of longtermism is that most of its cause areas concern problems that people alive today will live to face:
- AI safety
- preparing for (artificial) pandemics
- resolving the climate crisis
The only truly long-term cause area I can come up with is the idea that we should stop mining coal, in case our civilisation collapses and the next one needs coal to advance its tech tree.
I think AGI is the last problem humanity has to solve. Spending resources on problems that will only arise after AGI is wasteful, since an aligned AGI will find better solutions using fewer resources.
If AGI is not aligned, I expect human extinction, and then there is no point in leaving presents for future civilisations.