On October 28, the Department of Commerce published a Request for Information (RFI) on the AI Exports Program, the forthcoming industry-led consortia program to deliver full-stack American AI export packages. The RFI is a preliminary step to inform the Department’s future request(s) for proposals related to the development of the program and the evaluation of submissions. Comments – covering topics from AI stack components and consortium formation to foreign market targets, business models, federal support, and national security considerations – are due November 28. You can find more of FGS Global’s analysis of the RFI here.
On October 28, a bipartisan group of Senators, led by Sens. Josh Hawley (R-MO) and Richard Blumenthal (D-CT), introduced the GUARD Act, legislation that would ban AI companions for minors, mandate that AI chatbots disclose their non-human status, and create new criminal offenses for companies that make AI for minors that solicits or produces sexual content. The bill’s introduction comes after tense congressional hearings over reports of chatbot-linked suicides and other harmful interactions with children, and as AI companies have preemptively taken steps to make their platforms safer. Recently, OpenAI updated ChatGPT to better detect and respond to signs of distress, expanded parental controls, and worked with mental health experts to train its model. Character.AI also announced it will block chatbots for minors before November 15, add stricter age checks and content filters, and launch an AI Safety Lab.
In recent weeks, official accounts of prominent elected officials and party organizations – including the National Republican Senatorial Committee, President Trump, and California Gov. Gavin Newsom (D) – have posted AI-generated content depicting their political opponents on social media platforms. The posts from some of the country’s most powerful politicians highlight the growing use of AI-generated media in mainstream U.S. political messaging and have raised concerns about the potential proliferation of political disinformation.
