The role of caution


If the above ordering is correct, then the future of the galaxy looks better to the extent that:

  • Misaligned AI is avoided: powerful AI systems act to help humans, rather than pursuing objectives of their own.
  • Adversarial Technological Maturity is avoided. This likely means that people do not deploy advanced AI systems, or the technologies they could bring about, in adversarial ways (unless this ends up necessary to prevent something worse).
  • Enough coordination is achieved so that key players can “take their time,” and Reflection becomes a possibility.

Ideally, everyone with the potential to build something PASTA-like would be able to pour energy into building something safe (not misaligned), and carefully planning out (and negotiating with others on) how to roll it out, without a rush or a race. With this in mind, perhaps we should be doing things like:

  • Working to improve trust and cooperation between major world powers. Perhaps via AI-centric versions of Pugwash (an international conference aimed at reducing the risk of military conflict), perhaps by pushing back against hawkish foreign relations moves.
  • Discouraging governments and investors from shoveling money into AI research, encouraging AI labs to thoroughly consider the implications of their research before publishing it or scaling it up, etc. Slowing things down in this manner could buy more time to do research on avoiding misaligned AI, more time to build trust and cooperation mechanisms, more time to generally gain strategic clarity, and a lower likelihood of the Adversarial Technological Maturity dynamic.

The “competition” frame

(Note: there’s some potential for confusion between the “competition” idea and the Adversarial Technological Maturity idea, so I’ve tried to use very different terms. I spell out the contrast in a footnote.)

The “competition” frame focuses less on how the transition to a radically different future happens, and more on who’s making the key decisions as it happens.

  • If something like PASTA is developed primarily (or first) in country X, then the government of country X could be making a lot of crucial decisions about whether and how to regulate a potential explosion of new technologies.
  • In addition, the people and organizations leading the way on AI and other technology advancement at that time could be especially influential in such decisions.

This means it could matter enormously “who leads the way on transformative AI” – which country or countries, which people or organizations.

  • Will the governments leading the way on transformative AI be authoritarian regimes?
  • Which governments are most likely to (effectively) have a reasonable understanding of the risks and stakes, when making key decisions?
  • Which governments are least likely to try to use advanced technology to entrench the power and dominance of one group? (Unfortunately, I can’t say there are any that I feel great about here.) Which are most likely to keep open the possibility of something like “avoiding locked-in outcomes, and leaving time for general progress worldwide to raise the odds of a good outcome for everyone”?
  • Similar questions apply to the people and organizations leading the way on transformative AI. Which ones are most likely to push things in a positive direction?

Some people feel that we can make confident statements today about which specific countries, and/or which people and organizations, we should hope lead the way on transformative AI. These people might advocate for actions like:

  • Increasing the odds that the first PASTA systems are built in countries that are, e.g., less authoritarian, which could mean pushing for more investment in, and attention to, AI development in those countries.
  • Supporting and trying to speed up AI labs run by people who are likely to make wise decisions (about things like how to engage with governments, what AI systems to publish and deploy vs. keep secret, etc.).
