Meta boss Mark Zuckerberg faces a potential revolt at the company's annual meeting on Wednesday as shareholders push the Big Tech firm to step up transparency regarding its efforts to protect kids online.

A group led by Lisette Cooper, vice chair of the Franklin Templeton subsidiary Fiduciary Trust International and the parent of a child sex abuse survivor, is backing a non-binding resolution urging Meta's board to publish an annual report tracking the company's performance on child safety and protecting young users from harm on its apps.

The report would require "quantitative metrics appropriate to assessing whether Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms."

"If they want to reassure advertisers, parents, legislators and shareholders that they're making a difference in dealing with this problem of harm to children, they need to have transparency," Cooper said in an interview with The Post. "They need better metrics."

The resolution is set for a vote at a time of intense scrutiny for Zuckerberg-led Meta, which faces a legal and regulatory crackdown in the US and abroad over its alleged failure to keep kids safe on Instagram and Facebook.

Zuckerberg himself recently apologized to the families of victims of online sex abuse during a high-profile Congressional hearing.

Meta's board of directors opposes the resolution, arguing in an April proxy statement that the requested report is "unnecessary and would not provide additional benefit to our shareholders."

Cooper and her allies cite a raft of pending litigation against Meta related to child safety. Last October, Meta was sued by dozens of states that alleged the company had ignored the sweeping damage these platforms have caused to users' mental and physical health, including poor sleep, disruption to schoolwork, anxiety and depression.

A separate suit from New Mexico's attorney general alleged that Meta has exposed underage users to purported sex predators.

As The Post reported earlier this month, Meta is also spearheading a massive lobbying campaign to kill or weaken a pair of New York bills aimed at protecting kids online.

"Children are going to be the users of the future. If they have a bad experience on the platform, they are not going to keep coming back. This makes a huge difference to us as investors," Cooper added.

Two of the largest proxy advisory firms, Institutional Shareholder Services and Glass Lewis & Co., have recommended that shareholders vote in favor of the resolution.

"We believe that the requested report and the adoption and reporting of targets will provide shareholders with valuable information, so they can better understand this sensitive issue in the context of the Company's efforts to minimize harmful content on its platforms," Glass Lewis said regarding the proposal.

ISS determined that shareholders "would benefit from additional information on how the company is managing the risks related to child safety."

Shareholder resolutions are essentially doomed without the support of Zuckerberg, who controls 61% of the company's voting power through his ownership of so-called super-voting Class B shares.

Proxy Impact, which filed the resolution on Cooper's behalf, noted in a filing that a similar proposal at last year's annual meeting received nearly 54% support from shares that weren't controlled by Meta management.

"This is like a basic first step for any business plan: get the data," said Proxy Impact CEO Michael Passoff. "What gets measured gets managed, and they're not doing that. Or if they are, they just aren't making it available to anyone."

In its proxy filing, Meta noted various steps it has taken to address online child safety concerns, including the creation of "more than 30 tools across our apps to help support teens and families" and existing policies that prohibit harmful content that seeks to exploit kids.

"We want people, especially young people, to foster their online relationships in a safe, positive, and supportive environment, and we work closely with a broad range of stakeholders to inform our approach to safety," the company said.

Meta's board also recommended that shareholders reject a number of other resolutions, including one requesting a third-party report that would assess the potential risks and benefits of instituting a higher minimum age for users of its social media products.

Meta's legal and regulatory headaches on the issue of child safety aren't limited to the US.

Earlier this month, the European Commission revealed it was investigating whether Meta had violated a sweeping new law called the Digital Services Act, which requires the largest tech firms to police content on their platforms.

European watchdogs expressed concern that Facebook and Instagram may "stimulate behavioral addictions in children" as well as create "rabbit-hole effects," in which kids stay glued to the apps despite harmful health effects.

Meta could face fines of up to 6% of its annual revenue if it is found to have violated the DSA.