Both the United States and China are considering policies aimed at controlling the development of artificial intelligence, or AI.
The Biden administration is seeking public comments on rules for guiding the artificial intelligence industry. China’s internet regulator recently announced proposed rules related to AI.
ChatGPT is a new AI program that has captured wide public attention. It has human-like ability to answer many different kinds of questions in writing. American lawmakers are looking into the product. ChatGPT is the fastest-growing app in market history. It has more than 100 million monthly active users.
The National Telecommunications and Information Administration (NTIA) advises the Biden administration on telecommunications and information policy.
The agency wants to know if there are measures that could be put in place to give assurance “that AI systems are legal, effective, ethical, safe, and otherwise trustworthy.”
NTIA Administrator Alan Davidson said responsible AI systems could bring great gains, but only if their possible harms and effects are dealt with. He added that for these systems to reach their full potential, companies and users “… need to be able to trust them.”
President Joe Biden last week said it remained to be seen if AI is dangerous. He said, "Tech companies have a responsibility, in my view, to make sure their products are safe before making them public.”
ChatGPT is made by California-based OpenAI and backed by the tech company Microsoft.
NTIA plans to create a report as it looks at efforts to make sure AI systems work as companies say they do and do not cause harm. The agency said the effort aims to ensure a federal method for supervising AI-related risks and benefits.
The Center for Artificial Intelligence and Digital Policy is an independent non-profit technology ethics group. It has asked American officials to block OpenAI from issuing new releases of GPT-4. The group says the app is “a risk to privacy and public safety.”
China’s AI regulation
China’s internet regulator has released a proposed law that would require makers of new artificial intelligence products to complete security reviews before public release.
The proposed law was released Tuesday by the Cyberspace Administration of China. The law says that content created by future AI products must represent the country’s values. It says the technology may not be used in opposition to state power.
The proposed law also says AI content must not discriminate based on ethnicity, race, or sex, and should not provide false information.
The proposed law is expected to take effect sometime this year. The regulations come as several China-based tech companies, including Alibaba, JD.com, and Baidu, have released new AI products. Those products create human-like speech and content including images and text. AI products have gained great popularity since OpenAI introduced ChatGPT last November.
I’m Gregory Stachel.
Gregory Stachel adapted this report for VOA Learning English using information from Voice of America and Reuters.
Words in This Story
artificial intelligence – n. an area of computer science that deals with giving machines the ability to seem like they have human intelligence
app – n. a computer program that performs a particular task (such as word processing)
benefit – n. something that produces good or helpful results or effects or that supports well-being
ethical – adj. following accepted rules of behavior: morally right and good