The U.K. government proposed a new online safety bill on Tuesday aimed at protecting children from online pornography, and one of its more controversial proposals is to use artificial intelligence to determine whether a user is of legal age to view pornographic content.
Under the new Online Safety Act, websites and apps that display or publish pornographic material will need to ensure that children are not normally able to encounter it on their services; users must be 18 or older to access pornography.
Source note: Image generated by AI and licensed via Midjourney.
According to research by the Office of the Children's Commissioner for England, the average age at which children first see online pornography is 13. Melanie Dawes, chief executive of media regulator Ofcom, said: "Whatever approach is adopted, we expect all services to provide strong protection for children from accidental exposure to pornography, and to ensure that the privacy rights and freedoms of adults accessing legal content are protected."
The regulator's proposal for facial age estimation involves using artificial intelligence to analyse a viewer's facial features, which may require taking a selfie on the device and uploading it. The proposed guidance also covers photo ID matching, in which users upload photo identification such as a passport or driving licence, as well as credit card checks. Another option is open banking, under which users consent to their bank sharing information with the pornography site to confirm they are over 18.
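To make the facial age estimation route more concrete, here is a minimal, purely illustrative sketch of how a service might gate access on an AI age estimate. The estimate_age_from_selfie stub, the AgeCheckResult type and the five-year "challenge buffer" are assumptions made for illustration, not part of Ofcom's guidance or any vendor's product; a real system would run a trained computer-vision model over the selfie and fall back to a stronger check, such as photo ID matching, when the estimate is close to 18.

```python
# Illustrative sketch only: model, threshold and buffer values are assumptions,
# not Ofcom's specification or any real provider's system.

from dataclasses import dataclass


@dataclass
class AgeCheckResult:
    estimated_age: float
    passed: bool
    reason: str


def estimate_age_from_selfie(image_bytes: bytes) -> float:
    """Placeholder for a facial age-estimation model.

    A real deployment would run a trained computer-vision model over the
    uploaded selfie and return a predicted age. Here a fixed value is
    returned so the gating logic below can be demonstrated.
    """
    return 22.4  # dummy prediction


def check_access(image_bytes: bytes,
                 legal_age: int = 18,
                 challenge_buffer: float = 5.0) -> AgeCheckResult:
    """Gate access on the estimated age.

    Because estimation is approximate, an operator would typically add a
    buffer: users whose estimated age falls below legal_age + challenge_buffer
    are escalated to a stronger method (e.g. photo ID matching) rather than
    being admitted on the estimate alone. The buffer value is an assumption.
    """
    age = estimate_age_from_selfie(image_bytes)
    if age >= legal_age + challenge_buffer:
        return AgeCheckResult(age, True, "estimated age clears the buffered threshold")
    return AgeCheckResult(age, False, "estimate too close to 18; escalate to photo ID or another check")


if __name__ == "__main__":
    print(check_access(b"<selfie bytes>"))
```

In a scheme like this, the substantive policy questions sit less in the model itself than in how wide the buffer is and what fallback check a borderline user is routed to.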
However, the Institute of Economic Affairs, a free-market think tank, said mandatory age verification could threaten user privacy and put users at risk of data breaches and misuse by increasing the amount of sensitive data held by third parties. The regulator said weaker methods, such as self-declaration of age, online payment methods that do not require users to be over 18, and disclaimers or warnings, would no longer meet the standards set out in its new guidance. Ofcom said it expected to publish final guidance in early 2025.