1. Pandemics
A pandemic is an epidemic on a much larger scale, one that spreads across countries and continents rather than a single region. Nearly everyone knows this today: not so long ago, our planet was practically paralyzed by Covid.
Compared with past pandemics such as the plague or the Spanish flu, Covid was relatively mild, even though it caused humanity significant trouble.
In the future, however, much more dangerous viruses may emerge. And one of them might be able to wipe out humanity completely.
Although we now have technologies and medicines that previous generations could only dream of, infectious diseases also spread faster than ever before because the world is so interconnected.
At the outset, a handful of infected travelers is enough to carry a virus around the world.
Many laypeople believe that the main danger comes from diseases that kill quickly. That is a mistake. If a disease manifests itself within a few hours of infection, it is, from an epidemiological perspective, relatively easy to contain.
A far more dangerous virus is one that begins wreaking havoc in the body only several days after infection, because carriers keep moving and infecting others before anyone knows they are sick. This delayed onset is exactly how the aforementioned Covid managed to spread.
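To see why the delay matters, here is a toy illustration in Python (a minimal sketch with made-up numbers, not a real epidemiological model): two hypothetical diseases share the same transmission rate, but carriers of the first are detected and isolated within half a day, while carriers of the second go unnoticed for five days.

```python
# Toy illustration: identical transmission rate, different delay before
# carriers are detected and isolated. All numbers are hypothetical.

def infections_before_isolation(transmission_per_day, days_until_detected, generations):
    """Each carrier infects others only while still undetected."""
    new_per_carrier = transmission_per_day * days_until_detected
    total, current = 1, 1  # start with a single infected traveler
    for _ in range(generations):
        current = current * new_per_carrier
        total += current
    return round(total)

# Disease A: symptoms appear within hours, carriers isolated after ~0.5 days.
fast_onset = infections_before_isolation(0.8, 0.5, generations=10)

# Disease B: symptoms appear only after ~5 days of unnoticed spreading.
slow_onset = infections_before_isolation(0.8, 5.0, generations=10)

print(f"Fast-onset disease after 10 generations: ~{fast_onset} cases")
print(f"Slow-onset disease after 10 generations: ~{slow_onset} cases")
```

With the same contact rate, the fast-onset outbreak fizzles out after a couple of cases, while the slow-onset one snowballs to well over a million, simply because each carrier has ten times longer to pass the virus on.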
2. Climate change
Climate change could also have a severe impact on future generations. Humanity may not go extinct because of it, but the established order of the world may well collapse.
All it would take is for the ice in Antarctica to keep melting. That would raise sea levels, which in turn would disrupt ocean currents, which would then affect many other things.
When everything adds up in the end, part of the planet would be flooded, part would be gripped by harsh winters, and part would have to contend with frequent, violent hurricanes and tornadoes.
Ultimately, these changes would trigger migration on a scale this planet has never seen before and, of course, the collapse of civilization as we know it.
3. Technology
What helps us today could one day destroy us. A good example is artificial intelligence. Although it is still in its infancy, it grows more capable every year.
One can theoretically imagine a future in which computers decide it would be a good idea to launch a few nuclear warheads at an enemy.
There is no need to describe in detail what would likely follow. We can only hope that no one ever thinks of putting AI in charge of nuclear weapons.