Ireland needs to "step up to the plate" to ensure tech companies are properly regulated, a Facebook whistleblower says.
However, she believes Europe must work together to regulate the firms - saying placing the entire burden on Ireland alone isn't fair.
Frances Haugen - a former product manager at Facebook - made international headlines last year after she leaked internal company documents to US media.
She left her job and claimed that Facebook was guilty of putting profits ahead of public safety.
She has since spoken publicly about her concerns with Facebook’s practices - suggesting that the company’s products "harm children, stoke division and weaken democracy".
Ms Haugen will today appear before an Oireachtas committee to discuss online disinformation and media literacy.
She’s expected to call for an independent review of the Data Protection Commission.
Ahead of that appearance, Ms Haugen spoke to Newstalk Breakfast - saying Ireland has a "really unique opportunity" to address some of the concerns around big tech.
She said: "Because much of big tech is homed out of Ireland, Ireland plays a critical role in making sure things like the [EU's proposed] Digital Services Act are adequately enforced.
“I strongly encourage Ireland to step up to the plate and make sure a regulator is put in place to make sure these regulations are implemented.
"A law is only as good as its implementation.”
Ms Haugen believes an EU-wide regulator that shares the burden of regulation would “almost certainly” be a better way to go.
She observed: “If we want to do a good job of holding tech accountable - and I think Ireland holds a very special role on that - we have to adequately fund our regulator.
“I worry that placing the entire burden on Ireland isn’t fair. As we’ve seen with the current data protection authority in Ireland, there’s a huge backlog - it’s hard to get the resources to do these things adequately.
“By working together, you’ll get a much more robust and effective regulator."
She suggested that extra funding and resources would allow a regulator to hire experts or 'algorithmic specialists' away from tech giants.
Ms Haugen believes the regulation shouldn't focus on content - saying we can get "lured into a trap" by social media companies when talking about these issues.
She observed: “They want us to argue about censorship, but not the reality that the way their algorithms are designed today, the most extreme ideas - on the left, right or whatever dimension you want to look at - get the most distribution."
Ms Haugen explained that platforms have to make choices about what content people see - currently, that’s determined by “engagement-based rankings”.
She said: “They prioritise the content that is most likely to elicit a reaction from you.
"The only problem is when we assess the value of content based on the likelihood that it gets a reaction from us, it gives a bias to content that elicits strong emotional reactions.
"The fastest path to a click is hate.”
As a result, the system will keep “escalating” the type of content users see - even on seemingly innocuous topics such as healthy eating.
She added: “These systems are always looking at what rabbit hole they can pull you down.
"Just by clicking on the content that Facebook gives you, over the course of a couple of weeks it will lead you to things like eating disorder content. It’s super scary.”
Data Protection Commission
Speaking to Newstalk, Graham Doyle - Deputy Commissioner with the Data Protection Commission - extended an invitation to Frances Haugen to gain a better understanding of how the DPC works.
He said: "She [has been] honest and forthcoming about her lack of knowledge of any specific investigations that are being dealt with by the DPC, and indeed how agencies work in Europe.
"We extend an open invitation to Ms Haugen to meet with us to discuss our work and the specifics of the legal framework under which we regulate."
He said the DPC's annual report will be published tomorrow, adding that it will detail 'regulatory outcomes' achieved by the commission in the past year.
Meta, Facebook's parent company, insisted in a statement that it has "always had the commercial incentive to remove harmful content from our platform".
A spokesperson said: "While we have rules against harmful content and publish regular transparency reports, we don’t believe that businesses such as ours should be making these decisions on our own.
"We’re pleased that Ireland is progressing with the appointment of an Online Safety Commissioner."