Episode 29

Unpacking the Transatlantic Data Privacy Framework with Scott Marcus, Bruegel

Hosted by
Alexandra Ebert
The recently announced Transatlantic Data Privacy Framework will foster data flows between the US and the EU, addressing the concerns raised by the Schrems II decision. Within the new framework, the US made an unprecedented commitment to strengthen the privacy protections applicable to US signals intelligence activities. New safeguards will be implemented to protect citizens' rights while advancing cross-border data flows. The next step is to translate this framework agreement into legal documents that will be put into practice on both sides of the Atlantic. But what does this mean for data privacy in practice? What are the major challenges, and what does the future of compliant data sharing look like? We spoke to J. Scott Marcus, Senior Fellow at Bruegel, the EU's economic think tank, and an expert in digital policy, regulation, and US surveillance, about the history and future of transatlantic data flows. Scott shared valuable insights with us on surveillance and why it's so challenging to regulate. Listen to this Data Democratization Podcast episode to learn about:
  • the history of transatlantic data-sharing challenges,
  • how to make the transatlantic data privacy framework work,
  • the major barriers, and what needs to be fixed so the framework delivers what is expected.
We will be keeping an eye on the developments regarding the framework. Meanwhile, learn how safe cross-border data sharing is already possible with synthetic data! 

Transcript

Alexandra Ebert: Welcome to the 29th episode of the Data Democratization Podcast. I'm Alexandra Ebert, your host, and MOSTLY AI's Chief Trust Officer. Today, I have a super timely episode for you. We will dedicate this episode to what President Biden calls a breakthrough in transatlantic data flows. Namely, the new Transatlantic Data Privacy Framework, which was just announced last Friday on March the 25th.

My guest to discuss this is Scott Marcus, a Senior Fellow at Bruegel, one of the most important think tanks in the European Union. He not only has deep technical expertise as an engineer by trade and former CTO but has also closely followed digital policy and regulatory developments and written about them in his reports over the past decades. Of course, he's also currently highly involved in the digital roadmap that is being built in the European Union. I'm very much looking forward to this conversation.

You can expect a brief history lesson about data sharing between the EU and the US, and why this has been such a headache for companies, particularly in the past few years. Plus, you will learn about Scott's take on the new framework, and whether this will finally be something that provides a feasible solution to make data flows between European and US-based companies possible again. We will also dedicate some time to discussing US surveillance, and why this was something that made finding an agreement so difficult in the first place. Lastly, we will briefly discuss the Data Act and the Data Governance Act. Quite a few interesting topics to cover. Let's dive right in.

Alexandra: Welcome, Scott. It's so great to have you on the show today. Actually, we have an amazing date picked out for our recording, but we will come to that in a second. Before we jump into our discussion of trans-Atlantic data flows and maybe also some of the acts that are on the EU policy roadmap, could you briefly introduce yourself to our listeners, and maybe also share a little bit about what makes you so passionate about the work you do?

Scott Marcus: Oh, thank you for giving me that opportunity. Yes, I'm Scott Marcus. I'm a Senior Fellow at Bruegel, which is a very prominent economics think tank based in Brussels. As you can probably already hear, I'm an American by birth, but I've been living continuously in Europe since 2005. In my time in the US, I worked largely as an engineer. I was the Chief Technology Officer for GTE, which is one of the two large firms that merged to form Verizon. I was also the Senior Advisor for Internet Technology, in effect the Chief Technologist, for the US Federal Communications Commission.

That was effectively my entrée into regulatory matters. Around 2004, 2005, I really decided that what Europe was doing in terms of regulatory policy was more interesting and more promising, and so I packed up, pulled up my stakes, and moved. I've been working in Europe continuously ever since. I spent about 10 years as a director of an institute based in Germany that does regulatory economics: WIK, whose long-winded full name is Wissenschaftliches Institut für Infrastruktur und Kommunikationsdienste. Since then, I've basically been keeping busy.

I'm retirement age, but very far from retired. I like what I do. I do this kind of work because I enjoy it.

Alexandra: It absolutely sounds like it. Amazing that you continue to work in a field that you're so passionate about. Today, we are recording this episode actually on the 25th of March. Just earlier today, it was announced what President Biden calls a breakthrough in trans-Atlantic data flows. Namely, a preliminary deal between the EU and US on trans-Atlantic data sharing. I really want to discuss this and your takes on trans-Atlantic data flows and the new preliminary deal with the EU.

For our listeners who haven't followed the developments closely since the European Court of Justice's Schrems II ruling back in, I think, July 2020, can you take us on a brief history lesson on what has happened since this ruling, what the challenges were, and also the headaches that this caused for privacy and data professionals on both sides of the pond?

Scott: Indeed. It was a complicated decision and a complicated and long history. The actual case that led to Schrems II began even before the current European regulation, the GDPR, General Data Protection Regulation. It actually antedates that. I think 2012 is when the then young, today's slightly less young Austrian privacy advocate named Maximilian Schrems, very nice and very bright fellow, by the way, filed this seemingly hopeless case against Facebook saying that they were transferring his data to the United States.

Once it reached the United States, it was fair game for who knew what national security agencies and law enforcement agencies, and effectively, he didn't have the kind of protections on his data that a European normally expects. That was what the case was fundamentally about. Now, within the GDPR, interestingly, there are different rules for transfers of data between countries. Within Europe, there's a fairly loose formulation that effectively says that European countries should do what European countries routinely do in terms of protecting democratic process.

Interception is, in principle, permitted. There's no real process for challenging transfers within Europe because the GDPR was actually trying to free up the movement of data rather than to block it. A separate article within the GDPR deals with transfers to third countries. The idea was that the protection on data should follow the data when it moves. In principle, it's a great idea. Now, the case that Schrems had lodged said the US isn't doing this. There had been a Safe Harbor Agreement, going back to the year 2000, where the European Commission said, "Ah, the US is sort of good enough.

They are broadly comparable to European data protection, and so we'll have free exchange with the US." Of course, this enabled an awful lot of trans-Atlantic e-commerce, and allowed platforms like Facebook and Google to operate without restriction, which was good for business. One can then raise the question: is it also good for consumer protection? Now, what the court found in a first decision, in 2015 or '16, the so-called Schrems I, was, "There don't seem to be adequate protections here on the access to data by government authorities."

Despite that-- Well, I won't say relatively little was done. Actually, a lot was done. An agreement called Privacy Shield was reached between the European Union and the United States. It established some reasonably good mechanisms where US-based companies could file commitments, undertakings, with the US government. Actually, I guess it went to the Department of Commerce, but enforcement was handled by the Federal Trade Commission in the United States. Essentially, the companies then committed to follow practices that were, let's say, broadly in line with what is now enforced through the GDPR. It didn't have to be literally the same, but it should be broadly comparable.

That's really what the court decision also says. The second court ruling came about because Schrems went back to the European Court of Justice and said, "Look, this really isn't enough," and there's a lot to that case. One of the issues that the Court of Justice didn't reach is that the agreement between the European Union and the United States isn't really an agreement when it comes to security, relative to national authorities. What the US did was to provide copies of some executive orders that were already in force and some memoranda between US security agencies and the Department of Commerce, or the Office of the Director of National Intelligence.

They're memoranda within the United States government; there's nothing here that's enforceable from the European side. There's no actual agreement, so--

Alexandra: Sorry to interrupt you, but basically, for the European Union, it was then not really a commitment, but more a sign of goodwill or intent.

Scott: That's actually very well-stated. What I wrote in previous work is, "With good intent, maybe this could have some useful effect, but there's nothing that would be enforceable by a court." Schrems went back, and he said, "Look, my data is still not protected. This doesn't go very far." In the Schrems II ruling in July of 2020, the court agreed. They agreed with pretty much everything that he had stated. They argued essentially that US surveillance wasn't adequately constrained by US law, and also, that there were no meaningful rights of appeal for EU persons who had been improperly surveilled.

That's the core of it, but what did the ruling actually do? First, it said that Privacy Shield, whatever else was in effect, no longer constituted grounds for what's called an adequacy decision. An adequacy decision is a broad permission that says that European entities can transfer data as freely as they would to a European entity. That was struck down. Notably, this was all about surveillance. This is very widely misunderstood. It had nothing to do with conventional privacy law. It certainly did not require that the US change its privacy laws. It didn't require any change.

In fact, I'm not aware of any challenges to the commercial privacy parts of Privacy Shield. It was all about government surveillance, but basically, what the CJEU decided, the European Court, is that the adequacy decision was gone. Companies could, in principle, still transfer under other provisions of the GDPR, but it was actually up to the company to ensure that proper safety was provided for the data, including by US government agencies. Now, after this, we had decisions from the European Data Protection Board, which represents--

It's a group that collectively represents the national data protection authorities. They came up with some really pretty strict rules. At the point where these were finalized, they were made slightly looser, I believe under pressure from the Commission, but still almost unworkable for a lot of European companies.

Alexandra: Which aspects made those unworkable? Can you give a few examples?

Scott: To me, one of the biggest concerns is it essentially obliged the European company to make an assessment of US national security practices, and to ensure that the US government followed its stated practices.

Alexandra: Oh, okay.

Scott: Now, I really question how a private company is supposed to be able to assess what the NSA is doing in its deep shadowed corridors. Yet, the legal liability for the companies was enormous. Here you have enormous uncertainty. You also have a number of things that could, in principle, have been used to get around this, but basically aren't permitted. One of the most notable is informed consent on the part of the user. The data protection authorities have been very reluctant to give much scope to this. They worry that the user could be effectively forced to give consent.

It's a fair point, but in many other areas of European privacy law, like cookie rules, we rely totally on that, but it means essentially, I, as a European user, can't say, "I'm okay with Google having my data." Google has to instead, ensure that the government won't misuse my data. I'm totally at a loss how they would do that.

Alexandra: Even for a company like Google. It definitely sounds like a lost cause from the start.

Scott: I think so. At least that's the worry. I think that brings us mostly up to the present.

Alexandra: I think so too. What are your thoughts now on this, what Biden calls a breakthrough preliminary deal on data flows? Is everything smooth from now on, or are there some key issues that still need to be sorted out, since they said they had an agreement "in principle"?

Scott: It is hard to be 100% sure because, at the moment, we have only a political level announcement that an agreement was reached. We have no details as to what was agreed. I think, however, a judicially sustainable successor to Privacy Shield was really the only practical way forward. The guidance that was coming from the courts and from the European Data Protection Board is, I would say legally logical in terms of what the GDPR says. Parts of the law, I would argue, are structured a little too inflexibly, and we can come back to that.

I think basically, there wasn't much wiggle room for the courts, for the European Commission, for the national authorities. That's really what we were seeing. This is the reason why we were hearing threats that Facebook might actually have to shut down in Europe, which I think would not have been in anybody's interest. Whatever else you might think about Facebook, it was not in anybody's interest for that to happen. Now, the real question is, what do we actually have in the new agreement? One can make some speculation based on what was in the original Schrems II ruling.

If you go back to the ruling, they didn't actually object to that much. The main objections really fall in two categories. The first is simply that US surveillance agencies weren't sufficiently constrained by US law. Now, in fact, there were some good things in executive orders from the Obama era that reined this in a bit, essentially putting some more constraints on. One of Trump's first moves was an immigration act that included one article that came from nowhere and effectively undermined those provisions, but it never actually went into force.

They were trying to actually-- US law has some fairly good measures. With sensible enforcement, they could actually have some meaning. The other piece though, and I think the more readily remedied one, has to do with means for US persons to have redress if they're improperly surveilled. There, what the court noted, I would have to say rightly, is--

Alexandra: For US persons or for EU persons?

Scott: I'm sorry for EU persons.

Alexandra: Yes.

Scott: In fact, I would also argue that US persons have very little redress in effect either.

Alexandra: That's true.

Scott: I'll come back to that. It's actually part of the same problem. For EU persons, there was an ombudsperson put in place, and this was supposed to represent a point where people could file complaints. Now, in fact, this all came into place shortly before Trump was elected, and then Trump left the position vacant for two years. Again, not exactly a show of good faith.

Alexandra: Not necessarily, no.

Scott: Also, the point that the court made is that the person is an employee of the State Department. The person has no formal legal powers to deal with the intelligence agencies, and so it's unclear that this represents redress. The CJEU judgment specifically talks about the right to redress before an impartial tribunal. The argument is that this official in the State Department doesn't meet that standard. Fair enough. That's something that could be fixed. Now, the question of how much difference it makes in practice, I think, is still there. In practice, if you're surveilled--

By the way, I should add, my expert witness testimony has been used in more than 40 cases in the United States against phone companies, against the NSA, and against George W. Bush, for apparently illegal surveillance conducted during the George W. Bush years, going back to 2002 and 2003.

Alexandra: You're an expert on US surveillance, one can say?

Scott: Sort of, kind of. If you haven't worked in the agency, you don't know exactly what they're doing.

Alexandra: Sure.

Scott: What I did, in fact, was to report on information provided by a pre-Snowden whistleblower. This was an AT&T employee who handed over three wiring diagrams of things he'd been asked to implement in the San Francisco Point of Presence of AT&T. My testimony effectively reverse engineers what the NSA was presumably doing. Nobody has actually ever argued with that analysis. I'm pretty sure I got that right, so, I know how this works. I also know how difficult it is to get meaningful redress on a case in the United States.

Alexandra: Mainly because it's so hard to prove that you're being surveilled, or what's the problem?

Scott: That's actually one of the biggest problems. Let's say, how would you know definitively, either that you're being surveilled, or even that you're not being surveilled? How would you prove it before a court?

Alexandra: It's difficult unless something is leaked that tells me I'm in there, and this came from the NSA or somebody else; then, of course, it's [crosstalk].

Scott: Yes, exactly. There's actually, by the way, one absolutely remarkable case, Al-Haramain, where, in the discovery process of the trial, the FBI erroneously handed over to them a document that definitively showed that they'd been surveilled. They weren't allowed to use it in the trial. The FBI said, "Look, it's a classified document. You have to give it back, and once you give it back, you won't be able to prove it." Either you would be relying on unreliable facts, or you would be making-- It's really Alice in Wonderland, but even though they absolutely knew that they had been surveilled, they weren't able to use it in the court case. That's the first problem. To go to court, you have to show standing. To show standing, you have to show that you've been surveilled.

You have to show that there was injury in fact. Typically, when it comes to the NSA, unless you have a whistleblower, you're not going to know that. The second very high hurdle is something called the state secrets privilege. It's an evidentiary privilege that says that essentially the government can block things from being used in court if it would be damaging to US government interests. This evidentiary privilege has a rather ugly history. The best-known use is a case against the Air Force, where it was used to block a suit that actually had good grounds, and for which, it turns out, the national security argument was extremely weak. It was used to protect the government, not to defend real state secrets.

Alexandra: Understood.

Scott: The bottom line though, even for a US person, it's very difficult to get meaningful redress, and for a European, just that much harder.

Alexandra: Do I understand correctly, then, that the Schrems II decision mainly focused on the not-that-important part of empowering this ombudsperson? Because even though you say that's easy to fix, it wouldn't solve anything, since it's still so hard to come up with the evidence needed to make a court case in the US.

Scott: It was one of two bits. That one, indeed, I believe could easily be fixed, and probably has relatively little practical consequence. At least by itself, it has little practical consequence. The other piece, though, is insufficient controls over who gets surveilled and why. The court complained that the so-called FISA court in the US endorses programs, not individualized surveillance, and that doesn't really provide adequate protection of the rights of Europeans. That would be harder to fix. That's something that I think could have real consequence, positive consequence, not only for European persons but also for American persons.

I hope they can come up with something. The original intent of the US laws was to provide that kind of protection. Basically, over the past 20 years, US governments, under presidents of both parties, have ridden roughshod over these rules.

Alexandra: That's interesting. Earlier, I checked the statement from President Biden, and he stated that this new deal will also have unprecedented protections on data and security for US citizens. I'm really curious to see what they will come up with in the end. One other question: you mentioned earlier that there are some mechanisms for reining in US authorities. You also mentioned executive orders, and if I understand correctly, executive orders are something that can easily change from president to president.

Can this whole case be built on something that's achieved via executive orders, or do other processes and practices need to happen to make this more stable, regardless of who's president?

Scott: That's a brilliant question. Indeed, the problem with an executive order is that an executive order issued by one president has the force of law, but it can be changed at the stroke of a pen by the next president. Not only that, some of the executive orders, especially those that deal with national security, are not made public. It could change without anybody knowing. Now, what conceivably could be done, and wasn't done in Privacy Shield, was that the US government could not only provide-- Essentially, what they did in the past was to provide an executive order with no assurance, or very little, that it would remain in place.

There was one very weak statement in one document. If there were a stronger assurance that the US government wouldn't substantially change this without due consultation with the European Union, then maybe you would have a real basis for an agreement that is not only valid at the point when it's signed but hopefully remains valid unless something fundamentally changes. To me, that's probably the way to address it. If that were done in conjunction with creating a more meaningful redress mechanism, you would probably have a sound response to the Schrems II case that would provide a decent possibility for going forward.

Now, one point I should really make though, and in strong terms, when I say that not enough was done, I should also point out, I'm not aware of any other case in history where one country agreed to restrict its surveillance of another. Even friends surveil friends. That's been the pattern for hundreds of years, thousands of years maybe. The fact that anything at all was done is a great tribute both to the European Union and the United States. I also think the court was right to say it needs to go a bit further.

Alexandra: Why do you think that is? Why do you think the US is even agreeing to that? There were some speculations that the developments we've seen in privacy, with different national laws in the US, and also the privacy laws coming up in China, India, and other parts of the world besides the European Union, have changed the sentiment around privacy in the US. What's your thought on that?

Scott: Oh, I definitely think that, in the United States, there's a changing sentiment. There is a tech lash, a big resistance to the large digital platforms. Many people, particularly Democrats, particularly progressive Democrats, see the need for comprehensive privacy legislation within the United States. It would be actually wrong, by the way, to say that the US has no privacy rules, but the privacy rules in the United States tend to be in one of two categories. A lot of them are sector-specific. For example, HIPAA in the health sector. You also have financial services rules.

You also have some fairly strong rules at the state level, especially in states like California. There are rules, but you don't have a comprehensive overarching structure. That means that there are gaps, lacunae, in what the United States has. I think that there was a changing sentiment in the United States. I think that there are a lot of people who would like to see comprehensive privacy rules in the United States now.

Alexandra: On a federal level?

Scott: At a federal level. Many of them see the GDPR as being an interesting model. It's maybe a bit of a heavyweight model, but it's also a strong model. I think this is respected in many quarters. Now, when we talk about foreign rules, if you look, for example, at the privacy rules that came into force in China a few months ago-- Well, "came into force", one has to hedge a bit. In China, often legislation comes into play before the supporting rules are in place, so there are still a lot of gaps in what exists in China. If you actually read the rules, they were in large part inspired by the GDPR.

If China did a sensible implementation, they would actually be pretty good. The biggest question, of course, is whether there are large exceptions for national security in the rules as implemented, but there is actually something there. If you look also at what China agreed to in the RCEP Agreement, there too you see that they made privacy commitments, with, however, a big carve-out for national security. I actually don't think that it's quite as hopeless with China as a lot of people think, but it's certainly hard. Mechanisms for getting a more global agreement are limited.

To me, the most interesting option would be the plurilateral WTO discussions on e-commerce, where there could actually be some agreement that would be overarching. In any case though, back to your original question, yes, I think attitudes in the US have changed, but the other huge thing that's changed is a recognition that otherwise, we risk a big blow to trans-Atlantic e-commerce, trans-Atlantic data flows. This is bad for everybody. I think everybody recognizes this. Within US industry, within the sector, there's a widespread recognition that something needs to be done.

By the way, I would add, within European industry, I talk with people like Airbus. This isn't just about US-based online platforms. Multinationals based in the EU that have operations in the United States, also face huge hurdles as a result of the Schrems II case. If we don't get some kind of deal, they're also going to suffer.

Alexandra: Absolutely. I think it affects so many things. All the cloud services that are increasingly being used are affected. When I'm talking with senior leaders, no matter which industry, they're really suffering from the legal uncertainty that we have nowadays. Therefore, I also think having a deal in place is tremendously important. There's $7 trillion in value that can be realized by enabling and facilitating these data flows again.

What I'm still wondering is, are we now on our way to a new long-term solution with Max Schrems already now announcing, "Okay, I think we will be back in court in a few months' time"?

Scott: There was a fairly vocal announcement by Schrems, and by his friends at NOYB, None Of Your Business, but essentially, for him to go back to court, there has to be a basis to go back to court on. I think until we know more about what was actually agreed, we don't, in fact, know whether there's a basis for a complaint and whether it's a basis that the courts would find sound.

Alexandra: Sure.

Scott: At this point, we have some experience with these things. If there was good faith on both sides, my hope is that what was agreed actually could respond to the fairly narrow complaints that are visible in the Schrems II decision. My hope would be that something was actually done, something that would be judicially sustainable. I should also add, by the way, that one other thing we haven't talked about is the huge geopolitical shift that took place. If you go back to the Trump years, Trump was really very much disdainful of Europe, very disdainful of the European Union, very disdainful of NATO.

Suddenly, NATO looks to the United States to be something valuable, and even in the earliest days of the Biden administration, you had the initiation of the EU-US Trade and Technology Council. Clearly, the body language between Europe and the United States is that we should try to figure out ways to make all of this work. The US and the EU are one another's most important trading partners. Nobody really wants to see that relationship go down the drain. That's a big change.

Alexandra: Absolutely on the same page with you here. I think that's really a fundamental challenge for Europe, and also for its relationship with the United States. On the one hand, we want to have strong privacy laws; we want to know that our privacy is protected. But we need to balance this with the business needs and the societal needs of the digital economy. Therefore, I'm also in big favor of finding an agreement that's workable for both parties. For us, with our focus on synthetic data, we've also seen that this technology, and privacy-enhancing technologies in general, are now on the rise.

The council you just mentioned, I think, even issued a kind of prize or competition to find additional privacy-enhancing technologies, or case studies with the existing ones, to even better realize these data transfers in a privacy-preserving manner. So I'm really curious to see which direction this will go.

Scott: Yes. Now, if I could look back to something I raised at the very beginning of the discussion, it's coming back to the GDPR itself and its structure.

Alexandra: Sure.

Scott: When it lists its goals, it really lists only one goal, which is data protection, the protection of EU individuals. There is a recital, Recital 4 within the GDPR, that talks about the need to balance this against other goals and says that privacy is not an absolute right, that other things have to be taken into account. Now, that's a good principle, but it isn't visible anywhere in the operative language. It's only there in the recital. That means that there are very few levers for the courts or for the data protection authorities to actually take this into account properly.

If you look at other bits of EU legislation, I work quite a bit, for example, with the European Electronic Communications Code, and there are maybe 10 different objectives that national regulatory authorities are required to take into account. Now, here you have only one and no concrete mechanism to take that into account. I really see this as one of the-- I would call it a fundamental defect in GDPR. It's one of the reasons that it's so difficult for the commission, for the courts, for the data protection authorities to come up with a balanced view.

Now, actually, there's a way; it might be possible to do something through the back door. I can tell you that Parliament is very reluctant to open the GDPR up for major surgery right now. It was hard enough to get it through. Nobody really wants to go there, but a suggestion that I made was taken up by one of the MEPs, Axel Voss, who's really one of the most knowledgeable on these issues. My suggestion draws on a structure that the UK and a few other countries put in place not long ago, where they bring together several of the different agencies that have to look at these issues.

In the UK, the DRCF, the Digital Regulation Cooperation Forum, I think it's called, brings together Ofcom, which is the regulatory authority for media and telecoms. It brings together financial authorities, brings together a couple more, and tries to achieve a more joined-up policy.

Alexandra: Yes, I think that's the way to go forward. I'm also engaging a lot in digital policy issues on the European level, and now also with the AI Act, where there was this discussion about whether the European Data Protection Board should be the enforcement agency or some other body. Now they're going with the suggestion of a European AI Board. I think it's tremendously important that we find an authority that, on the one hand, has enough expertise to regulate AI and modern data issues, but that can also take a balanced perspective, because data protection and privacy is an important aspect.

I'm proud and happy to live in the European Union, where there is this perspective and we see it that way, but data is just such a fundamental resource for everything affecting the economy, society, national security, even democracy. I think it's irresponsible to have a single authority whose main perspective is privacy be the main authority deciding which direction we're going, because it can't take everything into account. I really hope that with the AI Act, but increasingly also with the other digital acts on the horizon, we will find a unified enforcement agency that is able to take different perspectives into account.

Scott: Indeed. This is exactly what I suggested. This prominent MEP is very much taking it on board and has suggested doing exactly that. He proposed that, within the AI Act, we could actually create this infrastructure. In my note, I suggested that rather than a full overhaul of things like GDPR, small surgical amendments could just expand, for example, the protection in Recital 4, so that some weight is actually given to a body where the different authorities with different perspectives come together.

Alexandra: Absolutely.

Scott: That would create a little space for the courts and the data protection authorities to think of things beyond data protection when they make the rules. Now, you don't want to cripple the effectiveness of the GDPR. Clearly, you don't want to go too far, but I think that a sensible approach could be substantially better than what we have today.

Alexandra: Yes, I think so too. Also, coming back to the points raised during this EPP hearing on the GDPR's shortcomings, where you also presented, I think in many cases it's not even necessary to open up the text and start changing it again; we can just go for the low-hanging fruit and maybe rethink how to achieve a better balance in interpreting GDPR. The example I gave back then was privileging anonymization, because, with synthetic data and other technologies, you can achieve an anonymization that has the utility of the real, original data set but is impossible to re-identify, and is therefore out of scope of GDPR.

The process of anonymizing data under GDPR is still perceived as quite burdensome by businesses. Making it easier, and incentivizing businesses to actually anonymize their data and get rid of personally identifiable information, is something I think is in the interest of both the data protection authorities and the European Union as a whole, because we want to be in a position where we can utilize data freely, and where we can innovate and scale AI initiatives in a way where we can take privacy off the table and know that it is safely protected.

I think these are a few examples where, simply through the way data protection authorities enforce and interpret GDPR, we can achieve so much for the economy and tackle this fundamental challenge of utility versus privacy.

Scott: I think you raised a really good point there. Anonymization is indeed one of the provisions that could be really useful. It's recognized by the data protection authorities, by the EDPB, but I think they would be quick to say, "We don't actually know what constitutes a compliant anonymization." Clearly, a sloppily done anonymization can easily be traced back, so some guidance, some practical means, would really be useful. Interestingly, in some of the meetings I take part in, you often have experts saying, "We don't really want the Commission to be providing guidance on things like this. It should come from the marketplace," but often the same experts, one or two minutes later, will say, "We need more guidance on anonymization." One way or another, it needs to come into play.

Alexandra: Yes, definitely. I think it's two things. On the one hand, you're completely right: guidance on what is sufficiently anonymous and what is not. With synthetic data, it's different, because it's an approach that's always the same no matter which data set, but traditional anonymization approaches are always case by case, where you have to assess which columns you actually need to perform a task. Then there are the processes some companies have in place to assess whether you can actually release those data points together, or whether there is some toxic combination that could lead to re-identification.

They're so terribly complex. The most surprising thing to me, when I once interviewed experts about this, was that oftentimes companies don't even have a process in place to assess whether something is re-identifiable, partly because it's so hard, from a computational privacy point of view, to actually make this assessment with traditional anonymization. This is one of the reasons why it's great to see more and more data protection authorities advocating for synthetic data.

Besides having guidelines on what is actually anonymous and what is not, a general privileging of the process of anonymizing data is needed, because the majority understanding today is that you need a legal basis for it, and it's not easy to argue that legal basis, because anonymization is seen as a processing operation, and you have to go through the same exercises as if you were processing the real data. I think that puts up hurdles in a way that doesn't really incentivize organizations to get rid of the sensitive information.

The goal is having anonymous data that's useful but still privacy-safe to use, and then being out of scope of GDPR to innovate with this data. I think, therefore, we need both guidelines and a shift in how the GDPR is interpreted, because there's room to do that.

You don't have to say that anonymization is always a processing operation that has to have a separate legal basis. This alone would change so much in our economy, because it would open up the data resources we so desperately need. It's also why we are now seeing the Data Act and the Data Governance Act: we're realizing there's a shortage of the data fuel needed to actually achieve all these ambitious goals of AI adoption and data-driven innovation that the European Commission set out.
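The re-identification risk from "toxic combinations" of columns that Alexandra describes can be made concrete with a small sketch. The toy records, column choices, and the k-anonymity check below are illustrative assumptions, not anything discussed on the show; the idea is simply that if any record is unique on its quasi-identifiers (like zip code, birth year, and gender), it can in principle be matched against outside data:

```python
from collections import Counter

# Hypothetical toy records: (zip_code, birth_year, gender, diagnosis)
records = [
    ("1010", 1985, "F", "flu"),
    ("1010", 1985, "F", "asthma"),
    ("1020", 1990, "M", "flu"),
    ("1030", 1972, "F", "diabetes"),
]

def k_anonymity(rows, quasi_identifier_indices):
    """Smallest group size when rows are bucketed by the chosen
    quasi-identifier columns. k == 1 means at least one record is
    unique on those columns, and hence potentially re-identifiable."""
    groups = Counter(
        tuple(row[i] for i in quasi_identifier_indices) for row in rows
    )
    return min(groups.values())

# Zip + birth year + gender together single out two of the four records.
print(k_anonymity(records, (0, 1, 2)))  # -> 1
```

Real assessments are far harder than this, which is exactly the point made above: they involve auxiliary data sets, many column combinations, and probabilistic attacks, not a single counting pass.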

Scott: I think that's really well said, and, by the way, in the hearing for the EPP I very much liked your point about anonymization. I think it is spot on. This is an area where more can be done, and I think you're also right that a lot more can probably be done with the interpretation of GDPR without going into changes to the text.

Although, as I said, with my suggestion about having essentially a grouping of the various multinational boards, dealing with everything from innovation and competition to privacy and security, I think it would be useful to make some surgical changes to the relevant legislative instruments that just give some weight to the recommendations such a body would produce. I don't think this represents a major overhaul. It creates a little space for that work to be recognized.

Alexandra: Yes, you're right, that would definitely be an important development. I think also with all the new acts on the horizon, the Data Act, the Data Governance Act, and so on, it's important to ensure that a holistic approach is taken, and that silos aren't working separately on their own acts, which may create implementation issues and increased legal burden for companies. I think this is really important to preserve, or create, the agility we need in the economy to speed up our processes.

Scott: That's well said. Now, on the Data Governance Act, generally, I like what the Data Governance Act says, but I wonder if there's enough attention paid to incentives.

Alexandra: That's a good point.

Scott: When I look, for example, at creating a data broker, the provisions in the Data Governance Act make it almost impossible to create a profitable business model for one. With the mechanisms that are there, I can easily understand why companies would like to take data out of the repository; I have some difficulty seeing what motivates companies to make altruistic contributions of data into a repository. There could be sector-specific answers. Apparently, the automotive sector has come together somewhat, but whether this will really produce much, I'm not yet convinced.

I think this is going to take a little more work, a little more thought, and a little more study.

Alexandra: Yes, I think these are important issues that you raised with the data brokers. One concern is definitely: who would be interested in doing this, given all the obligations you have to fulfill and the non-profit nature of being a data broker under the Data Governance Act? Another concern I have is, if data brokers really do have this non-profit nature, how likely is it that they will have the top-notch security and privacy experts, and the infrastructure, to safeguard this important treasure trove of compiled data from adversaries who want to access it?

So I'm not really convinced of this framework either. Since you mentioned incentives for data altruism, and for providing data to these various European data hubs and data spaces, and whatever else they are called: are there some things that you think would work well, or that you would recommend regulators consider?

Scott: I don't have snap answers on that. Again, I look at the legislation, and I feel that it creates mechanisms that won't necessarily get used, but I'm not sure how to fix it.

Alexandra: We will hopefully figure this out together as a society. Since we're coming to an end, maybe my last question to you, Scott: you already mentioned that you're advocating for a unified and more balanced regulating entity to balance these different perspectives. What else is on your wish list for all these data acts and AI acts on the horizon? What do you think would be important to achieve?

Scott: That's probably far and away my strongest request. Relatedly, the body that I talked about is about achieving joined-up policy across different thematic areas within the regulations. The related challenge is joined-up policy between the EU level and the member states, and also among the member states themselves. What we're seeing is a barrage of new legislation, where essentially each measure typically has its own set of mechanisms for the member states to coordinate with one another, and also with the European institutions.

In the GDPR, which, again, we started with as a poster child here, you have the one-stop shop. By almost all accounts, that's a mechanism that has worked pretty badly. Given that the largest digital platforms are headquartered mainly in two relatively small member states, Ireland and Luxembourg, a constant complaint has been, first, simply the number of staff of their data protection authorities.

Alexandra: The resources.

Scott: There's a resource problem. It means that the burdens are unfairly shared, the costs are unfairly shared. Also, if you look at the number of enforcement actions that have been brought and the level of fines imposed, you see that they're really pretty small in those member states compared to, let's say, what's been imposed especially in Germany, but also, to some extent, in France and Italy. Even though the biggest platforms are there, and you would tend to think that that would be where the bulk of enforcement takes place, once again we've got an incentives problem that shows up very clearly.

You ask yourself: essentially, Ireland and Luxembourg are in competition with other member states, and this is certainly true for other member states as well. Hungary, you name it.

Alexandra: Sure.

Scott: They compete with low corporate taxation, they compete with a favorable regulatory environment. Now, the taxation problem may be on the way to being fixed with the compromises reached at the OECD and G20 level, but on the regulatory side, this notion of competition among the member states really creates incentives for a race to the bottom.

Alexandra: Definitely.

Scott: That needs to be worked on. I also worry that we simply have too many different mechanisms for coordination, and the risk of them getting in the way of one another is considerable. At the same time, I don't think the answer is to make everything the same, because the nature of the problem differs across areas of digital policy. In areas like telecommunications, where I do a lot of work, markets for, let's say, fixed networking tend to be national markets. Whereas, if you look at many of the markets we deal with for digital platforms, those are at least Europe-wide.

In many cases, they're global markets, and so having regulation mainly at member-state level risks reaching incompatible and incoherent decisions.

Alexandra: Definitely.

Scott: Somehow we need a better solution here too.

Alexandra: Yes, I think that's one of the big challenges. Also, when talking with senior executives of multinationals, they fear that, at some point, they'll just have to operate in a vacuum because all these different pieces of legislation no longer fit together. Definitely not an easy challenge. Scott, thank you so much for everything you shared today. It was truly a thought-provoking discussion, and I think I've gained insights into the inner workings of legal aspects and surveillance in the US that I didn't have before. I found it really valuable. Thank you so much for being on the show.

Scott: My pleasure, and you're welcome. Thank you very much for some very thoughtful and insightful questions. I've enjoyed this very much as well. Thank you.

Alexandra: Happy to hear that. Thank you so much.

Alexandra: I hope you enjoyed this episode about the new Transatlantic Data Privacy Framework as much as I did. I definitely learned some new things about US privacy regulations and how everything works over there. If you have comments, questions, or even suggestions for future guests for us, don't hesitate to reach out to us via LinkedIn, or also via email at podcast@mostly.ai. Until then, see you in two weeks.

Ready to try synthetic data generation?

The best way to learn about synthetic data is to experiment with synthetic data generation. Try it for free or get in touch with our sales team for a demo.