The rapid evolution of Artificial Intelligence (AI) has left legal frameworks struggling to keep up. With generative AI models now deeply embedded in daily life, litigation is on the rise, making the development of AI law an urgent necessity.
Compliance and oversight remain complex due to conflicting interests, tangled regulations, and significant legal gaps.
Emergence of AI Law
The AI Foundation Model Transparency Act was introduced in response to several high-profile copyright lawsuits, including Doe v. GitHub, Inc., Getty Images, Inc. v. Stability AI, and Andersen v. Stability AI.
Stability AI, a defendant in two of these cases, spent $160,000 on lobbying in 2023 to influence AI-related legislation.
As the 2024 presidential election approaches, many organizations are lobbying for the Protect Elections from Deceptive AI Act. This bill aims to amend the Federal Election Campaign Act (FECA) to prohibit the distribution of “materially deceptive AI-generated audio or visual media” about candidates before the election.
The Laws of Robotics
Science fiction has long explored the ethical and moral dilemmas of AI, from Isaac Asimov’s writings to TV shows like Star Trek. Asimov’s famous Three Laws of Robotics illustrate foundational ethical principles for AI:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given by humans except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
These fictional guidelines highlight enduring concerns that are now being addressed in real-world AI legislation.
AI Legislation in the U.S.
In December 2023, Microsoft faced a lawsuit from The New York Times over allegations that it used copyrighted content to train its machine learning models. Microsoft responded by emphasizing the importance of democratic, law-making processes in creating principled and actionable norms for AI development and deployment.