Bill in Philippines urges people to register their likeness as trademarks to combat deepfakes

05 September 2025

A politician in the Philippines has filed a bill in the House of Representatives that seeks to prohibit the creation, distribution and use of AI-generated deepfakes featuring a person’s likeness without that person’s written consent.

Among other measures, House Bill No. 3214, filed by Parañaque 2nd District Representative Brian Raymund Yamsuan, encourages individuals to register their likeness as a trademark with the Intellectual Property Office of the Philippines.

“Under Philippine trademark laws, people are already allowed to register their likeness as a trademark,” revealed Samantha Rosales, partner at Bengzon & Untalan in Manila.

The Philippine IP Code under Section 121.1 defines a “mark” as “any visible sign capable of distinguishing the goods or services of an enterprise.” Furthermore, Section 123.1(c) provides that a mark cannot be registered if it “consists of a name, portrait or signature identifying a particular living individual except by his written consent.” 

“Based on these two provisions, a person is allowed to register their likeness as a trademark,” said Rosales. “The limitation is that under current Philippine law, trademarks are limited to ‘visible signs.’” Hence, sounds and voices cannot be registered as trademarks.

“It is worth noting that, by its very nature, trademark registration protects against the unauthorized use of a person’s likeness for the promotion of goods or services. In the absence of such promotional use, trademark rights typically do not protect against libel, the spreading of misinformation, hoax calls and other forms of deceit,” Rosales pointed out.

However, she added that a person’s likeness and voice are still protected under the law even without trademark registration. Section 169.1(a) of the IP Code protects an individual’s publicity rights. “Under this provision, a person’s name, likeness and voice cannot be used to advertise any goods, services or other commercial activity without the person’s permission or consent,” she explained.

House Bill No. 3214, or the proposed Deepfake Regulation Act, also imposes the following:

  • Imprisonment and a fine ranging from P50,000 (US$880) to P200,000 (US$3,500) for anyone who knowingly creates, distributes or refuses to take down deepfake content.
  • A fine of P50,000 per day of non-compliance for online platforms that refuse to take down deepfake content despite a takedown order, among other penalties.

“I commend the recognition of the unique dangers presented by deepfakes and the proactive measures aimed at mitigating the harm caused by this emerging technology,” said Rosales. “However, focusing so specifically on deepfakes created through AI may be overly restrictive, given the broader, long-term challenges related to audio, video and other media that have been manipulated and/or edited without the use of AI, such as digitally altered photos and videos created through the use of non-AI photo and video editing software or applications. As written, House Bill 3214 does not appear to address issues unique to AI-generated deepfakes that could not also apply to other forms of unauthorized media manipulation created without the use of AI,” she noted.

According to a 2023 report by verification and monitoring platform Sumsub, the Philippines registered the highest increase in deepfakes in Asia Pacific from 2022 to 2023. In 2024, President Ferdinand Marcos Jr. became a victim of deepfake technology when an audio clip mimicking his voice surfaced. In the clip, the voice ordered the Philippine military to take action against China, an order Marcos never gave. More recently, Filipino actress Angel Aquino and content creator Queen Hera testified at a Senate hearing on the use of AI-generated deepfakes in pornography. Aquino’s face was used in a pornographic video, while an edited photo of Queen Hera’s baby daughter appeared in another pornographic deepfake on the dark web. Queen Hera shared that, as a content creator, she regularly posted videos and photos of her daughter.

“We need to have a strong law against deepfakes, which are used to scam others, malign reputations or spread fake news. If we do not act now, this technology may even be used to incite violence,” said Yamsuan, as reported by Inquirer.net.

- Espie Angelica A. de Leon

