Human cohesion and coordination at increasing scales (global institutions, for example) provide both good and bad actors with increasingly powerful platforms. It parallels the increasing risk of godlike technology without godlike wisdom.
IMO the best framework for managing such risks comes from the fields of “continuous improvement” and quality control, using some variation of the “OODA Loop”. And the downside of large-scale coordination is best addressed with the principle of subsidiarity.
James Lovelock (originator of the Gaia Hypothesis) suggested that people who want to survive climate change should form self-sufficient communities above the Arctic Circle. That not only moves with the shifting habitable zones but also gets out of the way of the clash of Titans continuing at lower latitudes.
However, I agree with the premise of Octavia Butler’s “Parable” novels: all such “life raft” communities will eventually be overrun by regressive forces, whether internal or external.
The solution to the Fermi Paradox may be that intelligent life is self-terminating. Like organisms, civilizations have life cycles. One civilization dies but is survived by others, until at some point a planet climaxes in a single global civilization. That planetary civilization too will die, but perhaps not before it sends its seeds into space. Those seeds may land among the ruins of other dead civilizations.