Pace University held an event in the Bianco Room from October 19th to 20th, hosted by a coalition of humanitarian disarmament campaigns and organizations. The issues discussed ranged from nuclear disarmament to mine action to humane tech and the role of AI in warfare.
Students from Dr. Matthew Bolton’s Global Politics of Disarmament and Arms Control course were offered the opportunity to join the conversation on disarmament, representing the voices of young people. I had the honor of being part of a group selected to attend the event on the 20th, where the discussion focused on advances in AI and their implications for weapons of mass destruction.
As someone who is just starting to understand the world of disarmament and arms control, it was refreshing to meet people who work so hard to build a safer future for the world. It was inspiring to watch leaders from various organizations share ideas and engage in open, respectful discourse, something we do not have enough of when it comes to AI. Conversations like these are a necessary step toward ensuring the safe use of AI in the future.
A panel of representatives from different campaigns discussed topics including open-source intelligence collection at the international level, emerging technologies such as drones, autonomous weapon systems, nuclear risks, and AI.
The threat AI poses to the future of warfare was discussed at length. AI can advance weapons immensely, rendering them fully autonomous and self-operating. Technologies like drones will no longer require humans to operate them; the machine will acquire a target and carry out the attack on its own, with no barriers in between.
This concept, known as digital dehumanization, is becoming increasingly central to discussions about AI. Targeting is already an issue in modern warfare, and it becomes even more of a concern when you add AI to the mix, since AI offers no verification: no one checks what it is targeting, how it is targeting, or whom it chooses to target. AI systems are already being used in the Israel-Hamas War. AI-powered tools like “Lavender” and “Where’s Daddy” are used to generate lists of possible targets and their locations in Gaza, and many of the locations the AI offers as targets are in civilian neighborhoods.
We discussed the gendered impacts of this kind of targeting and its effects on children worldwide. It will affect everyone, whether or not you are an active participant in a battle, and it poses a dangerous threat to the future of warfare. It is the responsibility of our generation to put an end to it.
I am a student who is only just entering the long fight for disarmament, and this is just the beginning for AI. We must all advocate for the prohibition of autonomous weapons, actively participate in these discussions, and educate ourselves and others about how AI is being used on the battlefield so we can pave a path to a safer future for everyone.
If you are interested in getting involved, the campaigns mentioned above are linked below, along with great sources for staying informed about the use of AI in warfare.
Disarmament Campaigns:
Trusted sources:
- Bellingcat, for learning more about open-source intelligence.
- Automated Decision Research, for learning more about autonomous weapons.
- Stimson, for learning more about disarmament and nuclear deterrence.
- The Center for American Security, a think tank discussing AI and the future of war.
- The Future of Life Institute, for learning more about tech advancements and their implications.