OpenAI Selected for Voice-Control Technology in US Drone Swarm Challenge
The Pentagon has selected two defence technology companies to compete to develop voice-controlled drone-swarming software for the US military, and OpenAI has partnered with both. OpenAI's technology would be used solely to convert spoken orders from military leaders into digital directives for the drones; it would not be involved in controlling the swarm, integrating weapons, or authorising targets.
Everyone involved requested anonymity to discuss private, non-public matters. The project is part of a $100 million prize challenge the Pentagon unveiled in January, which requires participating firms to deliver working prototypes of technology that can autonomously control groups of drones capable of making decisions and carrying out missions. The six-month competition will advance in stages depending on the competitors' performance and continued interest.
More Details on OpenAI’s Partnership
OpenAI's involvement has not been previously reported, and the chosen companies have not been named. Several sources indicated that OpenAI has not yet determined its next steps or finalised its agreements with the participating defence-tech firms. Rather than providing its state-of-the-art models, OpenAI would supply only open-source versions of its models.
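To give a sense of the limited voice-to-text role described here, below is a minimal sketch using OpenAI's open-source Whisper speech-recognition model as a stand-in for that component; the file name, command content, and downstream handling are illustrative assumptions, not details of the reported project.

```python
# Minimal sketch: transcribing a spoken order into plain text with the
# open-source Whisper model. This only illustrates the voice-to-text step;
# it is not the actual software used in the drone swarm challenge.
import whisper

# Load a small, locally runnable open-source speech-recognition model.
model = whisper.load_model("base")

# Transcribe a recorded spoken order (hypothetical file name).
result = model.transcribe("spoken_order.wav")
text_command = result["text"].strip()

# A separate system (not shown) would translate this text into structured
# directives for the drones; the sketch stops at transcription.
print(text_command)
```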
The company reportedly offers installation support as well. An OpenAI spokesperson said the company did not submit its own proposal for the prize and that its participation in the challenge would be minimal at best. Other AI companies have already entered the drone swarm challenge directly. Even so, OpenAI's involvement in the competition signals that its defence work is set to expand the tools the military already uses.
Defence Analysts Concerned Over Infusing AI into Defence
A number of defence experts are worried about chatbots and voice-to-text commands being integrated into weapon platforms, yet the Pentagon is eager to accelerate the adoption of AI and autonomy despite these concerns. Multiple sources familiar with the situation stressed that generative AI should be used only for translation, not for controlling the drones' actions, and warned of the dangers of converting speech into operational decisions without human oversight. This comes at a time when several employees at prominent AI labs have quit over ethical concerns related to artificial intelligence.
Despite these concerns, AI companies continue to pursue revenue to fund their research and development. Among those voicing unease is an OpenAI researcher who has criticised advertisements in ChatGPT, while a researcher at Anthropic resigned publicly, drawing further attention to issues surrounding AI research and development. Many chatbots, including OpenAI's ChatGPT, rely on large language models, which can be biased and prone to hallucinations: the model can present its outputs as grounded in reality when they are not.
Quick Shots
- OpenAI has been selected to support voice-control technology for the US military’s drone swarm challenge.
- The project aims to convert spoken commands into digital instructions for drone coordination.
- OpenAI’s role is limited to voice-to-text and translation, not weapon control or targeting.
- The initiative is part of a $100 million challenge launched by the United States Department of Defense.