Episode 32.

A journey through the global data privacy landscape with Omer Tene

Hosted by
Alexandra Ebert
Omer Tene is an experienced data privacy lawyer, a partner at the global law firm, Goodwin. He was Vice President and Chief Knowledge Officer at the IAPP, the International Association of Privacy Professionals, a Senior Fellow at The Future of Privacy Forum, and an Affiliate Scholar at the Center for Internet and Society at Stanford Law School. Omer has a deep and global understanding of the regulatory landscape. In this episode, we asked him to give us an overview of the state of privacy in the world in 2022. From the US to Europe and China, new data privacy regulations are popping up everywhere, making it increasingly important for companies to have tools like synthetic data to comply. Tune in to learn about:
  • the emerging privacy landscape in the US
  • the challenges and expected impact of federal data privacy legislation
  • how enforcement of data privacy laws differs in the US and in the EU
  • how sentiment about data privacy is shifting
  • why data privacy is a high priority topic in the US
  • how start-ups should handle data privacy from the get-go
  • regulating adtech companies and data brokers
  • the new data privacy law in China (PIPL) and its challenges
  • the hottest topics in privacy: crypto, NFTs, the metaverse, and how society is affected
  • who privacy generalists are, and why we need them


Alexandra Ebert: Welcome to the 32nd episode of the Data Democratization Podcast. I'm Alexandra Ebert, your host and MOSTLY AI's Chief Trust Officer. Today, we have a guest that presumably all of you that work in privacy will be familiar with. Omer Tene, the former VP and Chief Knowledge Officer at the IAPP, the International Association of Privacy Professionals, who is now a partner at Goodwin and works in the firm's technology group on data privacy and cybersecurity.

If that wasn't enough, he's also an Affiliate Scholar at the Stanford Center for Internet and Society, a Senior Fellow at the Future of Privacy Forum, and was also appointed to the Arbitration Panel on the U.S.-EU Privacy Shield Agreement. He was the Rapporteur for the 30-year review of the OECD Privacy Guidelines. You can definitely look forward to hearing from him. In our episode today, Omer will share his perspective on regulatory privacy developments in the United States. Why do we not yet have a comprehensive US federal privacy law?

Which challenges do US organizations encounter with the regulatory fragmentation and the emerging privacy laws on state level? Why is privacy protection becoming much more important for US organizations, regardless of the size? Besides the US, we also cover the new Chinese privacy legislation, and how it contrasts to the approach we're used to from GDPR and other Western privacy regulations. All in all, there's plenty to draw insights from. Let us dive right in.

Alexandra: Welcome to the Data Democratization Podcast, Omer. It's a pleasure to have you on the show today. I was very much looking forward to having this chat with you, but before we dive into the topics that we want to cover today, could you briefly introduce yourself to our listeners, or at least to those who don't yet know you, and also share a little bit about what makes you so passionate about privacy and the work you're doing?

Omer Tene: Sure. My current role is a partner at the Boston office of Goodwin, which is an international law firm, but I've been in this space for about 20 years now. I'm Israeli. I was a corporate lawyer actually for several years before switching to privacy back in 2003 or 2004. Initially, I just wanted to try something else, and I took a job at a think tank based in London, focused on data protection. At the time, I hardly knew what that was. It was really a nascent field, certainly in law, but more generally. Of course, fast forward 20 years, it just became a huge focus area.

The reason I'm passionate about it, and still love it, and think that that career change was the best move that could have happened to me, is that it's the best field to be in if you want to be at the intersection of technology, and policy, and diplomacy, and social norms. It's very fast-moving and interesting front-page news, of course, almost every week, or sometimes day now. It's a very exciting field.

Alexandra: I can only agree, and I think how you just described it hopefully gets many more lawyers interested in joining the privacy space because we need them, and it definitely is an exciting field to be in. Today, I want to cover a range of topics. You describe yourself as a privacy generalist. After hearing you speak on so many occasions and events in the past few years, even in person before the pandemic, I would say you're even a privacy polymath. There's quite a lot of ground that we could cover, but one thing that I want to start out with is actually US privacy laws. A very easy question for you: when will we have the first federal US privacy law?

Omer: Sure. Just to take a step back, it's funny that you say privacy generalist because I do describe myself as that now, but again, back when I started doing this, every privacy person was a privacy generalist, just because it wasn't such a broad and deep field yet. Of course, the Data Protection Directive in Europe had existed since 1995, and there were member state laws from the 1970s and '80s, and some legislation in the United States and work at the OECD. It's not that there was nothing, but it certainly wasn't a field where you'd expect people, for example, lawyers, to specialize.

Of course, these days, it's almost too much. I think it is too much to stay on top of privacy developments for every industry and across the globe. What you increasingly see is privacy specialists for the financial sector, for the healthcare sector, for different countries, for cross-border data transfers, for B2B or B2C. Privacy generalists are, I think, becoming fewer just because the field is much more specialized, like any field that matures. In terms of having a federal privacy law, of course, there are federal privacy laws in the United States, but for one that is an omnibus law that applies to every sector, I wouldn't hold my breath.

There have been a lot of fits and starts over the past few years with proposals that seemed mature, and even similar, coming from both sides of the aisle. Both Democrats, and Republicans, and bicameral also, so, both the Senate and the House. For various reasons, it hasn't happened yet. One reason, of course, is that it's just very difficult to get any legislation passed in Washington these days, given that politics have become so polarized, and there isn't much collaboration across the aisle anymore.

For an issue like this, which isn't really political, because it's not a signature Democratic or Republican issue and it doesn't really divide the parties, and you could see this in the bills that came out of Senate leaders from both parties, which were pretty similar, you would hope that there could be a compromise and a conclusion, but so far, there hasn't been. This year, on the one hand, there is a sense of urgency, I think particularly from the Democrats because they are concerned that the scales might tip the other way in a few months, and the Republicans will once again control some of the key committees.

There's talk about Senator Cruz becoming the Chair of the Commerce Committee if that happens. Democrats would obviously prefer to get something done while they're still in the driver's seat, but I guess it's like a mirror image, and sometimes might be opposite from the other side of the aisle. There are a few issues that still divide the two sides. In the past, we thought it was the private right of action and the preemption issue.

Now, it seems to be focused more on things like whether there should be room for mandatory arbitration clauses, which, of course, are very contentious, particularly in light of some Supreme Court jurisprudence that affirms their validity, and obviously, consumer advocates are strongly opposed to that. Another set of issues is the intersection of privacy and civil rights, which has been a very big emerging issue in privacy, and more generally in AI and tech policy, over the past few years. Business interests are basically saying, leave market regulation to focus on the market, on industry issues, and the traditional issues that the FTC has dealt with, and address civil rights concerns elsewhere.

Whereas consumer advocates or civil liberties advocates want it covered by this law. As you can see, there are forces in both directions, but I'm not particularly optimistic that it's going to happen in the short to medium term, but maybe I'll be surprised, and I hope I'm surprised.

Alexandra: I hope so too. For our listeners, what would you say would be the benefits of having this more comprehensive federal law in place, or also what are the consequences for businesses currently in the US, where we don't have that in place, and see more and more state-level privacy laws emerging?

Omer: For businesses, the advantage is obvious. It's harmonizing a space that's becoming increasingly fractured, with different states advancing their own privacy legislation. If you are a sizeable or even a medium-sized business in the United States, you're not dealing just with the laws of California, but also Virginia, and Colorado, and Utah just passed a law. Connecticut seems on the cusp, and maybe in Maine, we'll have a biometric law. That's a good point because you not only have state omnibus laws, you also have specific state privacy legislation.

From a business perspective, it's quite a headache to try to wrap your mind around the alphabet soup that privacy legislation has become in the United States, both at the state and federal level. There are also all of these federal statutes: HIPAA, and GLBA, and FCRA, and COPPA, and VPPA, and FERPA, and others. The advantage there is obvious.

Alexandra: Definitely. Just to briefly interrupt. For our listeners who are not familiar with all of these abbreviations, these are the more sector-specific laws, like HIPAA for the healthcare sector, if I'm not mistaken. The other ones we don't need to go through in detail, but basically.

Omer: Yes, GLBA for the financial sector, FCRA for credit reporting, FERPA for education, for students' privacy. VPPA for video, initially cassettes, but now streaming. COPPA for kids, and on and on the list goes. BIPA at the state level for biometric information. You need a good lawyer to--

Alexandra: Absolutely. I'm wondering if, at some point in time, they will bring out any games where you can really try to come up with a privacy law with any given letter of the alphabet because I think we're close.

Omer: It's a good idea, like a privacy Wordle.

Alexandra: Maybe for the IAPP or something like that, as the next gift for the conferences.

Omer: Yes. For individuals, the issue is that some areas in the US are still unregulated. Despite all of these laws, you do have residual FTC authority under Section 5 of the FTC Act, although that too has its limitations. For example, the FTC doesn't regulate non-profits, and political parties, and certain sectors of the economy. At the end of the day, some areas in the United States are unregulated. That means that data brokers have a lot of power, and the Ad Tech industry, some people would say, features data excesses. To any advantage, there are disadvantages, and businesses are concerned that a strict federal standard would weigh down on existing business practices and innovation.

If there is a private right of action, for example, businesses are concerned that we will see class action litigations, sometimes frivolous, and plaintiff attorneys chasing companies to make a quick buck for themselves. There are different interests on both sides, and for now, these massive scales haven't tipped to the action side yet.

Alexandra: Let's see once this happens. We're recording this on the 20th of April, and last week was the Global Privacy Summit from the IAPP, a professional privacy organization, in Washington, and one thing that was apparent is that particularly large organizations like Microsoft, Apple, and others advocated and asked for this type of federal law. Is the motivating factor for them also that they're global players and have to comply with GDPR anyway, so they don't expect that a new federal law would make that big of a difference for them, or what other factors do you see playing into these statements?

Omer: Yes. First of all, I think Microsoft and Apple specifically, are probably not representative of the industry at large, just because they're very unique companies, as are the other big tech platforms.

Alexandra: Sure, but their business model doesn't rest upon customer data as much.

Omer: Yes, they have different interests and considerations, and some of them are more geared to the competition side of things, so, some people have remarked that Apple's CEO is possibly raising privacy as a shield from additional competition in areas like the App Store or others. Without getting into the motivations of any specific companies, I would say that, look, industry, writ large, has supported federal privacy legislation in the United States. I think even the Chamber of Commerce came out in support of it. Of course, the devil lies in the details, and they want a federal privacy law because of all those things we mentioned earlier.

For harmonization, and reducing the transaction cost of having to deal with this sprawling patchwork of state and federal laws. At the same time, not at any cost, and they're certainly concerned about and protective of certain issues. I think it's an incredibly complex and complicated game, which, I guess, is not surprising given that we're talking about regulation that will govern every industry in a world and an economy which is so heavily data-focused and oriented. The engine of growth for the US economy has, for years now, been in technology and data.

It's not by coincidence it's the biggest economy in the world, and it has been incredibly successful. Doing anything that could disrupt that, I think isn't easy.

Alexandra: Definitely, and something that will undergo careful consideration. Once again, back to the new state laws that we saw. The California Consumer Privacy Act was already widely discussed, but with the newer ones in Colorado, Virginia, and now I'm forgetting the fourth one. Utah was the newest one. How different are those laws? Are they basically copying what the other states were doing, or are there some distinct differences which make it challenging to comply if you want to operate in the different states?

Omer: Each one has different features and flavors, sometimes models, so people do talk now about how it's those first three movers that get to have the models named after them: California, Virginia, and Colorado. California is a bit of an outlier because it wasn't legislated by a legislature. The CPRA passed through citizen-- I forget the word, but passed through the ballot, essentially. The CCPA was drafted by a citizen, by an activist, and almost passed as a ballot initiative. At the very last moment, it passed through the State Assembly as a compromise just before that.

It looks very different than any other piece of legislation in this space. There are differences, differences around opt-ins or opt-outs, and what you opt out of, which, of course, is very significant when considering online businesses that operate websites across state lines. Even a minute difference between opting out of data selling in California, or data sharing in Virginia, could make a big difference. All in all, these laws do follow the FIPPs, the Fair Information Practice Principles, so they have the rights that you see under GDPR, for access, for rectification, for deletion, to some extent.

They have breach notification, they have an opt-out of selling or sharing information with third parties. Some of them have sensitive data as a category. CCPA didn't really, but CPRA is adding that, and some of the other state laws do have it. CPRA, I should say, in California, is establishing a new privacy enforcement agency. That's something new because, in general, it's the State Attorneys General who enforce these laws. These are only four. When we have 20, or 30, there are obviously going to be more differences.

Alexandra: Definitely. Actually, I nearly took away my next question, which would be about enforcement, particularly in the context of the CCPA, because currently, of course, not much is happening, but how do you see this evolve? Also, maybe you can draw an analogy or a parallel to the European Union and enforcement there, where there are also some points of critique about data protection authorities not being as active, and also not as well resourced as needed, to really enforce GDPR. What are your predictions on enforcement of the CCPA?

Omer: First of all, there is enforcement of the CCPA by the State Attorney General. The new California Privacy Protection Agency is just being established as we speak, but the executive director of that agency is Ashkan Soltani, and everybody knows him in our space. He's very much a privacy advocate. He was one of the people who drafted the CCPA and CPRA and was also Chief Technologist of the FTC. We know where his heart is, and he's a very capable, knowledgeable person in the space. I certainly expect him to be a strong enforcer of the law.

Look, State Attorneys General are feared enforcement agencies.

Of course, they have a broad portfolio. They also enforce state criminal laws. That means that their priorities might not place privacy at the top of the list because they have drug enforcement and other big public policy issues. At the same time, this is a popular issue, as displayed by California's ballot initiative and the fact that this is passing through state assemblies in both red and blue states. Utah, of course, is very much a red state. Some of the states on the shortlist, like Florida and Alaska, which are on the cusp of passing new laws, are also red states.

As I said earlier, it's not a signature blue or red issue. That goes to say that I do think Attorneys General will enforce and have enforced, and it's taken very seriously in the US. Add to that the fact that there are some private rights of action under state laws. California's is limited to data breaches, but plaintiff attorneys try to expand the definition of what constitutes a data breach, for example, to unauthorized sharing, in the sense that it's not expected by a consumer. That starts to look more like a privacy violation than a security one.

There's still some room for discussion and litigation there. Then, among the state laws, Utah doesn't have a private right of action at all, but others might have stronger private rights of action. All that means that individuals too will be able to enforce these laws, including maybe class action attorneys. At the state level, there have already been big class actions involving privacy laws, for example, under BIPA, Illinois's biometric privacy law. Facebook was hit by, I can't remember the number, but it's something like a $650 million settlement in connection with facial recognition.

I think, all in all, privacy enforcement is very much a real thing and a risk for businesses. I'd say it's probably more so than in Europe. GDPR, of course, supercharged European DPAs with stronger enforcement powers, but to this day, the biggest privacy cases have still happened in the US. The biggest of all was the $5 billion fine that the FTC imposed on Facebook in connection with the Cambridge Analytica story. Of course, that same event impacted European individuals as well, but the enforcement actions in Europe haven't yet reached that magnitude in this space. I think we'll see more enforcement on both sides of the Atlantic, but the US is still leading, I think by a lot.

Alexandra: Yes, it sounds like that, and also, the different mechanisms that come into play apparently make it a whole different picture and story for US companies, or companies facing enforcement in the US. How did all these developments since the advent of the CCPA change the sentiment towards privacy in the US, on the business side, and maybe also on the public side?

Omer: What's nice about privacy is that everything is so complicated, and nuanced, and intricate.

Alexandra: Which is why it's such an exciting field to work in, as we stated earlier.

Omer: Right, so I don't even know where to start answering this question. I think on the business side, it's easier because it's very obvious that there's much more recognition among businesses that this is a serious thing that needs to be addressed, a risk that needs to be mitigated, and an opportunity to leverage data that can be enhanced by handling privacy and security right. There are just countless reasons for it on the security side: a constant drumbeat of major incidents, and ransomware is the new wave, impacting not just data, but also IT systems and critical infrastructure.

Alexandra: Absolutely. Just, sorry, to add to that before we go to the public side, absolutely plenty of reasons to take privacy seriously. I was just wondering, because when I talk with practitioners from the US, sometimes I hear people from the side of the spectrum which is more towards, "Okay, we don't care that much about privacy. It's more about moving fast, breaking things, and being the first." Then, coming back from [unintelligible 00:32:03] and the Global Privacy Summit, also many advocating for, "Okay, if you only comply with the laws, you're not doing enough.

You should do even more with your privacy program so that it doesn't really matter which laws come in place." Therefore, I was super curious to get your take on where you see the majority currently positioned.

Omer: Look, privacy occupies a major space in US media. It's covered by the New York Times and by more specialized tech journals every day. Every new technology, facial recognition, and biometrics, and COVID stuff, and virtual assistants, and virtual reality, and AI this or that, raises privacy issues. It's definitely front and center of the public debate. In terms of individual consumers, to what extent they weigh privacy considerations, or ascribe a value to them, when making decisions about buying a car, or clicking through a website, or using an app, there are surveys about that.

I don't know the answer. I'd say that it's complicated because as you well know, there's a bit of a privacy paradox where people often say that it's the most important thing in the world, but then the next second you offer them half a penny for their entire web history, and they give it to you.

Alexandra: Exactly.

Omer: There are different incentives, and there's a lot of research from the cognitive-behavioral economic side. It's a big issue and question. We could spend more than an hour on it.

Alexandra: I bet we can. This always brings to mind a conversation I had, also on the podcast here, with a friend of mine from a telco company in Switzerland. He's responsible for data governance and said he sees it as a cultural shift that needs to happen, because data is such a valuable asset nowadays, but it's something that's not quickly realized by individuals, because we just haven't grown accustomed to it yet. Coming from Switzerland, he drew the analogy to money.

If I were to give a person $10,000, this person would immediately know what to do with that amount of money, and what not to do, to keep it safe, and so on and so forth. He expects that it will still take a decade or more for people to really internalize how to keep their privacy safe. When I made this statement earlier, that some companies are more on the side of breaking things and moving fast and not caring that much about privacy, I was actually referring more to the business side, and not so much to the consumers who say, "I don't care about privacy, let them track me and look into me because I don't have anything to hide."

It's really the businesses where I was so curious to get your take because, as mentioned, I really see these extreme ends of the spectrum: those who say, "We can't just comply, we have to do more than the law requires," and those who say, "Pfft, privacy, we don't care about that."

Of course, you don't have a crystal ball next to you, but just from your conversations with organizations and businesses, would you say that, with all these developments, privacy on the business side is currently more on the "We need to get it right" side, or are many businesses in the US still saying, "Yes, privacy. Maybe we will look into that, but we're just doing the bare minimum necessary."?

Omer: I think it depends. First of all, there's the obvious issue of maturity because when you are a startup, you have existential challenges and needs. If you don't grow and get traction and get investments initially, you won't have a business. People sometimes defer thinking about privacy to a later stage. Obviously, when businesses are more mature and they have entire departments of lawyers, and compliance people, and a culture of compliance, then they take care of privacy, as well as the other things: environmental, and anti-money laundering if you're a bank, and export controls, and whatever areas of regulation they're subject to.

I do see differences, and I think it depends on the entrepreneurs. Some entrepreneurs are very forward-looking and recognize the sensitivity and also the strategic aspect of this issue, particularly if you are in a sensitive kind of data area. If your startup deals with facial recognition, or biometrics, or medical devices, or financial data, and you don't make sure that you're designing it right from a privacy point of view, I can almost assure you that it's going to come back to haunt you later on. It will be an obstacle and an impediment to raising the next round from a VC or a private equity firm, to selling the company, or to doing an IPO.

I definitely say to and urge these business leaders to pause and think about privacy upfront. We call it privacy or data protection by design because it's just much easier and much better to design it in from the start. You have engineers and computer scientists designing very complex systems. You don't want to come to them a year and a half down the road and tell them, "Now go back and undo everything because we want to include privacy." It doesn't make business sense. There's no way around it. If you're dealing with facial recognition, this will be the central defining issue for your business down the road.

If you're not going to take care of it, you're just doomed to fail, at least once you meet us, the big law firms that know this area and are mindful about doing a deep dive into and checking your privacy credentials. You'll meet these law firms when you want to raise funding, or when you want to sell the company to a larger business. You should come ready.

Alexandra: That definitely makes sense, and I think that that should be the case. Maybe let's zoom out from the US a little bit and zoom into China because I also wanted to get your take on the regulatory developments that we've seen there over the last couple of months. As many of our listeners might know or at least have seen the headlines, China passed its new Personal Information Protection Law in, I think November last year. Then they also have a new AI regulation on the horizon, and a few other developments going on there as well.

Maybe focusing first on the PIPL or PIPLE, whatever is the most common abbreviation for that, what stands out with this new regulation? What's particularly remarkable from your point of view?

Omer: China actually passed the PIPL in August, and it came into force in November. That in and of itself tells us something because the GDPR had a two-year implementation period, and the US state laws also have implementation periods of a year and a half to a couple of years. China passed the law and kicked it into force a couple of months later. Because of its different political system, it has the ability to do that. We talked about the incredible complexity and difficulty of just reaching privacy legislation in the United States. That has been a debate going on for more than a decade.

Here comes China, and in one fell swoop, they put in place the Cybersecurity Law, the CSL; the Data Security Law, the DSL; the PIPL for personal information; and a whole plethora of regulations, which they seem to be launching at a rate of one every week, or even every few days. There are new regulations about cross-border data flows, and about data security, and about the security assessments that you need to do in connection with cross-border data flows, and for AI recommendation systems, and connected cars, and for kids and apps. The list goes on and on.

They have been an incredibly active tech policy generator, which I guess matches their ambition and success. I have to say, in technology more generally, they are probably second only to the United States at this point in terms of the reach and power of their tech sector. Again, China is not a democracy, and its law looks very different from what privacy law looks like and means in the US or in the EU. It has a strong emphasis on national security, and not just in the security, military sense of the word, but also in the sense of the public interests of the People's Republic of China.

That's front and center of this law. We say that the European law is grounded in fundamental rights, the Charter, and the Lisbon Treaty, and the US law is grounded in consumer rights and consumer protection. China's law is grounded in the public policies and interests of the government of China, and in national security. That, I think, is the stark difference. It also has language around the trade issue and the trade interests of China. China is probably the number one trading bloc in the world, I think, yes.

Alexandra: I think so.

Omer: I think it's the biggest trade partner for Europe, and probably for the United States too, not to mention smaller countries. This law talks about trade, which European or US privacy laws don't address. It talks about reciprocity, and if countries discriminate "against Chinese companies," then China will reciprocate under this law. It has blacklists and whitelists. It does look different, but at the same time, it is a privacy protection law. I think, as with many issues, Chinese policy is first and foremost domestic policy. They certainly have a lot of consumers and citizens in China.

Alexandra: They sure do.

Omer: They increasingly care about what businesses do with their personal information. While this law may be more limited in terms of relationships between individuals and government, it certainly enables the government to tip the scales, balance the scales between individuals and corporates. Some corporates in China are incredibly large and powerful, and sprawling also. Particularly Tencent and Alibaba, who have tentacles in just dozens of economic sectors from chat to payment, to e-commerce, to-

Alexandra: Exactly. Many other areas.

Omer: -right, ridesharing and many other areas.

Alexandra: I actually wanted to ask here, because I've only scratched the surface and didn't look closely into these new laws that passed, but some of the provisions I've read about also gave me the impression that the intention behind them was also to tip the scale a little bit between the big companies that you just mentioned and the power of the Chinese government. Basically, when we look at the privacy laws we have in place in Europe and in the United States, it's oftentimes about protecting consumer information from businesses, but also making sure that privacy is preserved when the government is your counterpart.

Here, it looked a little more like a way to make sure that those tech giants don't become too powerful, while the Chinese government also has mechanisms in place that help it preserve, as you put it earlier, its national interests. Can you put it like that, or would you say it's a different scenario? You have more in-depth knowledge of these laws, or at least more than I do.

Omer: Absolutely, Alexandra. I think the policymakers in China have been very upfront about it. From President Xi down, there is a very clear mandate now to rebalance the scales and to rein in the power of the big tech platforms to some extent.

I don't think China is trying to destroy its golden goose. It knows that these companies are incredibly successful on the global stage, if you think about Huawei, for example, with the 5G rollout in Europe and other places, and AI and other technologies that China is really leading in. I don't think they're trying to do anything so drastic as decimate the tech sector, but absolutely, the final decision-maker and power in China is always the political class, the President, and the Congress. They are definitely intent on keeping it that way.

We saw it with some of the pushback against IPOs on Wall Street by some of the Chinese tech companies. The Chinese government said, "This is as far as this goes."

Alexandra: Right.

Omer: Jack Ma, of course, he disappeared from the global stage for a few months.

Alexandra: Yes, that definitely didn't go unnoticed. The other questions I have regarding China's developments on the regulatory front, I'll have to keep for another conversation or another episode, since we are approaching the end of our recording slot. Maybe as a last question for you: what else is on the privacy horizon that currently gets you excited, or what new challenges do you think privacy professionals can dive into in the coming years?

Omer: There are a few things that I find exciting both on the tech side and on the policy side. On the policy side, Europe is again driving a very ambitious policy and regulatory agenda. We are seeing the different pieces falling into place with DGA, DMA already, DSA perhaps as soon as Friday, the day after tomorrow. It's April 20th today. The AI Act and maybe privacy regulation shortly thereafter. There's tremendous interest in that because it's going to impact all the usual suspects like the tech companies and more. On the technology side, I think for me the most interesting, exciting developments are crypto and the metaverse, and AR, VR, XR.

Crypto has reached fever pitch. It's probably the hottest sector in technology, certainly in America now. It reaches beyond. It started with currency, of course, and with money, and the monetary system, but even there, it evolved from decentralized currencies to central bank digital currencies, which are basically on the cusp of replacing cash. That, of course, has significant privacy implications because cash is privacy, right?

Alexandra: Definitely. Then also societal implications, which again shows how wide-ranging this whole field is.

Omer: Right. Exactly, because it's money, right?

Alexandra: Yes.

Omer: It's society, and it's the economy, and the financial system, and the political system, really. Crypto is decentralized, cryptographically mediated, and trustless, which are all privacy aspirations, and some of this ideology came from privacy advocates. At the same time, it's also transparent and immutable, things that are anathema to privacy policy. It's a complicated kind of dance there, and I think it's very interesting to watch the developments of crypto. That's also related to the other topic, to the metaverse, because, of course, crypto, if you think about NFTs, they have use in the metaverse.

People talk about Axie Infinity, the play-to-earn platform. At the beginning, people didn't know what NFTs would be used for, and they thought it was maybe just for art collectors, but now we see their deployment in the context of multiplayer games and the metaverse. That brings up issues, but let me put it this way. While Europe is still debating and hammering out the law around cookies, and we see enforcement of the Google Analytics cases and the cookie banners, technology has leaped past mobile phones to a new platform.

AR, VR, or XR, extended reality, which includes not just location tracking like phones, but also tracking of eyeball movements, bodily movements, biometrics, and, ultimately, brain-machine interfaces. It just shows, I think, how much work there is yet to be done in terms of tech policy, beyond the issues we deal with every day.

Alexandra: Absolutely. I think we definitely should do an episode on that at some point, where I would love to have a conversation with you to go deeper into the challenges of crypto privacy and everything else that comes with it. Thank you so much for taking the time today and for everything you shared. It was a true pleasure having you here, and I'm very much looking forward to hopefully meeting you at one of the upcoming conferences in the next few weeks and months.

Omer: Thanks for inviting me, Alexandra. Always a pleasure to chat.

Alexandra: Thanks, Omer.

Ready to try synthetic data generation?

The best way to learn about synthetic data is to experiment with synthetic data generation. Try it for free or get in touch with our sales team for a demo.