Episode 10

Data ethics best practices - Nicolas Passadelis, Head of Data Governance, Swisscom

Hosted by
Alexandra Ebert and Jeffrey Dobin
Nicolas Passadelis is a lawyer and privacy expert who has been leading the Data Governance team at Swisscom for over four years. In this episode of the Data Democratization Podcast, Nicolas shares his data governance best practices and provides real-life examples of how Swisscom’s data ethics board works. The episode contains an amazing collection of actionable takeaways and great answers to the big questions of data governance, such as:
  • How to implement data governance in an enterprise setting successfully?
  • How to automate and scale compliance?
  • How to self-assess your data protection practices?
  • What is data ethics?
  • How to create a data ethics framework?
  • How to create a data ethics board?
  • What are the six ethics principles for assessment at Swisscom?
Subscribe to the Data Democratization Podcast on Spotify, Apple Podcasts, or wherever you get your shows! Listen to the previous episode and read the transcript: Fair synthetic data and ethical algorithms – the fairness conversation with Paul Tiwald, Head of Data Science at MOSTLY AI.

Transcript

Alexandra Ebert: Welcome to the 10th episode of the Data Democratization Podcast. I’m Alexandra Ebert, MOSTLY AI’s Chief Trust Officer. Here with me is Jeffrey Dobin, Privacy Expert from Duality Technologies. Hi, Jeff.

Jeffrey Dobin: Hey, Alexandra. It’s great to be back and see you virtually over Zoom here, over the computer. We have finally hit double digits. We’ve had some really tremendous conversations over these last few months, from privacy and security experts in banking and healthcare to business leaders from the world of finance. Data scientists and legal folks have joined us on the show too, all bringing their best frontline data stories to the table. Reflecting back, I am grateful that Agnes took the initiative to make this podcast happen. A shout out to our producer, Agnes. You are awesome. We are most appreciative that you really made this happen. Good work.

Alexandra: Absolutely. Agnes is awesome.

Jeffrey: Two of the reasons that I’ve personally enjoyed hosting this podcast with you, Alexandra, are that, number one, I’m learning a ton from our featured guests. Number two, I am having an absolute blast along the way with you. What about you?

Alexandra: I would say you just stole my two reasons. I’m also learning a lot and I’m having a lot of fun too. I think this podcast is really an excellent platform for having in-depth conversations that otherwise would have never happened. I really appreciate the space it creates for these insights and dialogues, and from the feedback we’ve received so far, I gather that our listeners actually feel the same way.

Jeffrey: Absolutely. I’ve received some positive feedback too, and also some suggestions on how we can make our podcast even better. I hope that we incorporate these well and the audience experiences this, but speaking of which, the number one way we reach new people is through reviews. If you, yes, you, the one listening right now, if you could please take 30 seconds to rate and review our podcast, that would be a big help.

I’m smiling your way and giving you a virtual high five right now. Thank you in advance for leaving us a review. Back to the show. Alexandra, who are we chatting with today?

Alexandra: Well, Jeff, we have a really great guest today. It’s Nicolas Passadelis, Head of Data Governance at Swisscom, which is Switzerland’s biggest telco. Nicolas and I actually go way back; we met each other years ago at a privacy conference in Brussels, back when you were still able to go to conferences. I can’t even remember that anymore. Anyway, Nicolas has a legal background and I just really admire his approach to implementing data governance practices, but the listeners will hear more about that in the upcoming episode.

Jeffrey: Two things you just mentioned. One, another lawyer. That’s awesome, and he’s got applicable experience. Then number two, traveling and going to conferences in real life. I just signed up for one and I’m really looking forward to seeing other people in the industry this fall.

Alexandra: Oh, I can imagine.

Jeffrey: Maybe I’ll see you there. Someone who can really share a few exciting data stories from the real world is Nicolas. I’m really excited to hear your interview with him. Let’s meet him in a moment. In 3, 2, 1, let’s go.

Alexandra: Nicolas, hi. Welcome to our Data Democratization Podcast. It’s so great to have you on the show. Where does this podcast find you?

Nicolas Passadelis: First of all, thanks a lot for having me, Alexandra. It’s a real pleasure to be with you today. Where does this podcast find me? It’s busy times for someone in my function as the data governance head of Swisscom, and not only today; it has been like this for the last few years. You know as well as I do that we are living in times which present quite a challenge to professionals in data protection and adjacent fields, who struggle with all the different demands our times present.

Alexandra: Yes, absolutely. There are actually so many things I want to talk about with you today: data governance, data ethics, and also the regulatory developments that we currently see, especially on the European Union side. Then also how to prepare not only your organization but also the people within the organization to deal with these changes and the increasing importance of privacy.

Maybe to get started: we all know, and you already mentioned it, we live in a time where artificial intelligence and other technologies are rapidly evolving, but where regulations are lagging behind. Many people find themselves struggling to keep up with the speed of these developments. Nicolas, what would be your advice on how to deal with this volatile environment as a responsible and proactive organization like Swisscom?

Nicolas: I think it’s all about focus. The big question is: what do you actually focus on, in your daily work as well as in your strategies? I think the focus is trust. In times where we have enormous insecurities among all players, the regulators, the users, and the companies, I feel that we have a lack of trust. People, I think, have come to believe that companies do not really cope with data protection, that they do not handle data properly as they should.

This leads to their trust decreasing over time. I think the focus which we really need to have is to build trust or to preserve trust depending on where you come from. For me, it’s really not something which is compliance work. I think compliance is the baseline, but it’s only a means to the end, which is trust, building trust, preserving trust, and showing to your stakeholders, be it regulators, be it the customers, be it the public, that you actually take data processing seriously. That you do everything which you need to do in order to make sure that the data is treated in the right way.

Alexandra: Yes, absolutely. Why are data protection and privacy protection important issues for Swisscom?

Nicolas: We have been in the spotlight for many decades if not centuries. Swisscom is, as I feel, almost public property, and practically everyone in Switzerland has an opinion of Swisscom. Hopefully a good one, but not always. They expect more from us than they would expect from a “normal” company. We are supposed to do things right. One reason for that is that we are 51% government-owned. We are in the public domain whether we like it or not. Therefore we are in the spotlight and we need to do things better than a normal company.

Alexandra: Would you say that trust and the trust of your customers is also an important asset for you as a company, since you deal with rather sensitive data like telecommunications data, sometimes financial transaction data?

Nicolas: I think it’s a key element of our company and also of our DNA. We have always been strong in technical skills. Our networks are supposed to be the best, and this is certainly a part of our DNA. We are an engineering company, to start with. However, because we are so big, because we provide the electronic backbone of communications in Switzerland, and because our service portfolio is quite a bit broader than just telecommunications, we have a very strong IT focus as well. And we have adjacent business areas, which we also deal with.

Therefore, let’s say from a data protection point of view, we are a very comprehensive data processor. On the quantitative side, we have enormous amounts of data produced every day, which is not something we seek; it is just a result of our services. When you do telecommunications, the data obviously comes by way of the service provision.

Alexandra: Yes, absolutely. Actually, coming to data governance with all these different data sources that you mentioned, one thing that I admire about your work at Swisscom is really the holistic approach to data governance that you introduced. Why would you say having this comprehensive and holistic data governance approach is so important for Swisscom, especially in today’s times?

Nicolas: I may be kind of blind, but for me, it’s the only way to go forward. I’ve seen compliance initiatives which focus solely on compliance. Probably a lot of colleagues will know that feeling when you come and hit someone over the head by saying, “You have not been compliant. You need to be compliant, it’s the law.” Most of our peers who do not come from the compliance world or the legal world have a certain reservation about what we do. This may be okay or not, but we can’t change it. You must find a way to preserve compliance, to build compliance processes and reach the goals you have, by telling people that there is another benefit in what they do. The way I put it to my stakeholders is that our prime aim is to help them be successful in a digital world. The second aim is to make sure that they don’t have any problems in doing so.

Alexandra: Can you be more precise? How do you help your business partners and stakeholders achieve this new perspective on compliance?

Nicolas: There are a lot of elements which we need to cater to. The first element is that we help. It’s in the DNA of our team and everyone who’s involved in this that the main message is: we are here to help. We are consultants more than we are legal people. We consult and try to find with them ways to achieve the goal without jeopardizing data protection or the trust in the company. Then we are building tools which make their life as easy as possible.

What we try to avoid is having big projects where you have months and months where you gather data through data mapping exercises, and then you sit at the table with your teams and go through every detail and so consume their time over a period of months. We have developed tools which help them to do self-assessments on all the relevant questions, and which take them, let’s say, three hours to complete.

What we say is: look, if you do that once a year, you should be okay. There may be single questions we need to deal with offline, but that’s the main expectation we have. Do the self-assessment, correct what you have to correct, and then it’s a limited-time exercise once a year.

Alexandra: Yes, absolutely. I think that’s definitely promising. Given this proactive and supporting approach that your data governance team has, is it also your responsibility to look for new solutions out in the market, like privacy-enhancing technologies? Or are you involved more on a case-by-case basis in projects, trying to come up with a solution for the specific case?

Nicolas: We certainly interpret our role like that. If we come across new technologies, like MOSTLY AI’s synthetic data, then we certainly propose that to our business teams and say, “Look, we’ve seen something here that may be interesting for us.” I think the main goal is to limit the exposure whenever you can, and one way of achieving this is by doing self-assessments. Another way is to minimize data. How do you minimize data? Perhaps you move to synthetic data for certain use cases.

Whatever helps is good. There is nothing we exclude from the outset; we try to do whatever is possible and whatever it takes to be successful and to achieve the goals I’ve been mentioning. One certainly important aspect is communication. You need to talk. You need to talk again, and you need to present and present again, and you need to hammer this message in as much as you can.

Alexandra: Absolutely. I think it’s really a promising approach to be that proactive, because what we know from many enterprises is that the compliance function is really more of a “you can’t do this, stop, not this” function, and so on and so forth. This hampers today’s digital innovation projects. Therefore, I really like what you’re doing at Swisscom. We talked about being proactive and supportive and making it as easy as possible.

With all these different people being involved in data protection from different hierarchical levels, what do you do to create awareness and really educate people about the importance of data protection and privacy? Do you see challenges here?

Nicolas: Indeed, I do. I’ve been in my role for about four years now, and I experience this as probably one of the hardest bits to achieve. I’ve been thinking quite a bit about it, and I’m not sure whether I have a final answer to the question. However, I feel that this is maybe a cultural thing or the state of development we are in. Many of us and many of the people in the company have not grown up with digitalization. Digitalization hit us somewhere along the way. I think digitalization is something which requires a completely new way of thinking.

That may sound a bit silly, but actually, for me, that’s what it is really about. We need to learn to see our world completely differently. I guess, and this is my interpretation and I’m far from having the truth, my interpretation is we just need time. There is an analogy which helps me a little to understand what’s going on, and it’s the analogy to money. Money is very deeply rooted in our culture and in all the different aspects of our society, and if you look back to when money was introduced into society, we’re talking about many centuries ago.

Obviously, through the centuries, we’ve developed a very refined way of coping with all the challenges money poses to us as individuals, as well as to society as a whole. We had time to develop these skills, and therefore we are reasonably skilled in dealing with money. Now, if you take data as an asset too, then I think it is going to take time until we develop all these skills and these refined ways of dealing with data.

I hope it doesn’t take many centuries. However, it’s probably going to take one or two generations until people are dealing with data as fluently as they’re dealing with money today.

Alexandra: It’s a great analogy and definitely true, especially if you look not only at the people side but also at organizations. They’re just starting to see data as an asset now and thinking about it in a way that’s more like thinking about money, as opposed to just data as data. If it takes two generations until we have this maturity to deal with data, are we facing two decades full of data breaches and issues around data? Any practical advice on how to avoid big bad things happening in the privacy space?

Nicolas: That’s a tricky question, Alexandra. I’m not sure if I can say yes, but deep down, I think it’s probably going to take that long, yes. It’s going to take that long until companies have reached the maturity to really reach a level where we could be satisfied. I ask you to think back: data protection is not new. We have had data protection since the ’90s, so it’s now three decades.

We may as well ask ourselves, “What have we achieved since then?” If you look back, probably it’s not as much as we could have achieved. Therefore, I think adding another two decades is maybe not too unrealistic.

Alexandra: Okay. Well, let’s see how this evolves. Data protection was around even longer, since we already had protection of financial and medical information more than two decades ago. Nevertheless, I think it’s super important to also find ways on how to increase awareness now. From the things and the measures that you take within Swisscom to educate your employees about the importance of data protection, are there any measures proven to be quite effective compared to others? Any practical advice you can give to our enterprise client listeners?

Nicolas: One of the strategic pushes we adopted from the beginning is automation and self-service. We never engage too much in one-to-one consulting on projects. We try to come up with tools and elements where people can assess what they do themselves. We’ve developed a tool, which is called Data loads, where people answer questions through a questionnaire. The answers are automatically risk-rated by the software.

The software determines which answers are more risky than others. Based on a certain risk level, the software produces requirements automatically. Therefore, if a team or a responsible person goes through the self-assessment, at the end of, let’s say, two to three hours, she or he will receive a set of requirements which are precise enough to tell people what they have to do in order to mitigate risk. There are obviously two goals involved in this.

The first goal is to scale our competence and our means. In a normal multinational company, you have hundreds if not thousands of different data processing activities. There is no way you can deal with that on an offline, analog basis. Therefore, you need to find a way to scale up these assessments without having to invest people into that. You know as well as I do, the trend is not to build bigger departments but to make them smaller, regardless of whether that’s warranted by data protection efforts or not; this is how things go. Therefore, you need to bridge the gap between the mass of things you need to look at and the people resources you have: hence automation and self-assessments. There is a second aspect to this, the aspect of data culture. In the beginning, people may feel frightened if you present them with a questionnaire of 70 questions. However, we’ve built the tool in a way which actually leads people through it and explains why we ask each question.

It’s not only the question; there is also an explanation with definitions, and we explain to them why this is important to know. If they use it even just once or twice, they feel much more comfortable doing this. We certainly don’t see that for all of them, but many of the users actually find it pretty interesting because it opens a new world to them. Many of the teams, and we tried this with hardcore developer teams in the beginning, said: well, now we finally understand what it’s all about.

You actually cater to compliance, but at the same time you build a data culture. People really understand that this is, for them personally, quite an interesting development, because they may leave the company at some point and go somewhere else. Then, for their own sake, it’s an interesting pitch to be able to say, “Look, I’m not only a great developer; I actually know why data protection is important. I know which questions to ask and I know how to answer them.”

I try to sell “this tool” to our teams by saying, “Look, this increases your ability to find great jobs on the market because you have an asset no one else has. You have a personal interest to do that.”
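To make the mechanics Nicolas describes concrete, here is a minimal sketch of how such a risk-rated self-assessment could work. Everything in it, the question IDs, risk weights, and requirement texts, is a hypothetical illustration for this transcript; Swisscom’s actual tool is not public.

```python
from dataclasses import dataclass

# Hypothetical sketch of a risk-rated self-assessment questionnaire:
# each answer carries a risk score, and mitigation requirements are
# emitted automatically based on the answers given.

@dataclass
class Question:
    qid: str
    text: str
    explanation: str              # why we ask (builds data culture)
    risk_by_answer: dict          # answer -> risk score
    requirements_by_answer: dict  # answer -> requirements to emit

QUESTIONS = [
    Question(
        qid="Q1",
        text="Does the project process personal customer data?",
        explanation="Personal data triggers data protection duties.",
        risk_by_answer={"yes": 3, "no": 0},
        requirements_by_answer={"yes": ["Document the legal basis for processing"]},
    ),
    Question(
        qid="Q2",
        text="Is data transferred to third parties abroad?",
        explanation="Cross-border transfers need extra safeguards.",
        risk_by_answer={"yes": 4, "no": 0},
        requirements_by_answer={"yes": ["Put contractual transfer safeguards in place"]},
    ),
]

def assess(answers):
    """Return (total risk score, auto-generated requirements)."""
    total, requirements = 0, []
    for q in QUESTIONS:
        answer = answers.get(q.qid, "no")
        total += q.risk_by_answer.get(answer, 0)
        requirements += q.requirements_by_answer.get(answer, [])
    return total, requirements

score, todo = assess({"Q1": "yes", "Q2": "yes"})
print(f"Risk score: {score}")  # -> Risk score: 7
for item in todo:
    print(f"- {item}")
```

The explanation field mirrors the point Nicolas stressed: the questionnaire teaches while it assesses.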

Alexandra: I think that’s a very promising way to package data protection and why it’s important to comply. In one of our earlier conversations, you also mentioned that you not only come from the compliance side, the perspective of “okay, you have to do it because it’s the law,” but that you really try to connect to the business goals and the values of a company. Can you share a little bit of that?

Nicolas: I’m very fortunate to work in a company which actually has it in its DNA to do things right. I see that whenever something goes wrong. As a telecom company, obviously, one of the worst things which can happen to you is that your network has an outage. People suffer. When that happens within Swisscom, people really suffer. It is, in a way, a good thing that they are really committed to doing things right.

It’s easier to talk to them than in a company which has a gold-digger approach to things, where commercial success is maybe the only thing which counts and there is nothing else. I think it’s then much, much harder to reach even the smallest goals compared to us. As we just talked about, trying to work out what is good for them and emphasizing that there is much benefit in doing what we expect them to do is the key.

As we said, since people are not that experienced yet in digitalization matters, depending on age as well, you sometimes need to state the obvious. It may be obvious to us professionals that certain things are like that, but it’s not obvious to all of our colleagues. Therefore, we need to spell out the benefits for the company, especially the reaction of regulators whenever we do something. The day something goes into the media, it’s going to take probably 24 hours until we have the first tweets questioning what we’ve been doing, questioning whether this is ethical.

Questioning whether data protection has been complied with, asking whether we are a company which tries to have as much data as it can, et cetera. We also have the response of the public, which then again helps to guide the company in the right direction.

Alexandra: I think that’s a super important discussion to have with your teams and colleagues so that they can really see it from that perspective. If I remember correctly, I think you also mentioned back then that you, for example, tie it to values like customer centricity and being customer-oriented. That protecting data is something that today’s customers simply expect from you as an organization, and this therefore is the logical next step.

Nicolas: You know as well as I do that data protection regulations always focus on the person concerned. The concept is basically the same as in the business world, where it’s called customer focus or customer-centricity; however, they do understand it better if you say customer-centric. Sometimes, I just tell them: read the newspapers. In the newspapers, until five years ago, you never read anything about data protection. Now you read something every day. There is practically every day some breaking news in that area.

I tell them: just read the newspapers and then you know what’s going on. It’s not really my job to tell you how the world looks; you can see that for yourself. Another little trick I use when they don’t really see what I’m trying to say is to say, “Okay, picture yourself on the main evening news show on Swiss television. You’re there for an interview and you’re there to defend what you do. Do you feel comfortable doing this or don’t you?”

If you prefer not to go, and if you prefer someone else to go, then you probably do not feel comfortable. If you don’t feel comfortable really defending this in broad public, then something’s wrong. Therefore, whatever we do, we should be able to go on the main news show on television and, completely relaxed, say what we do, why we do it, and why it’s a good thing. It’s a little, let’s say, psychological test to find out whether we think we are doing the right thing or not.

Alexandra: I think that’s a very good one, although I’m thinking of some of the engineers and developers I know and their comfort in front of TV and video cameras. Potentially, even if they’re convinced about what they’re doing, they would be rather resistant. Still, I think it’s a very good thought experiment to walk people through, to really enable them to see it from a different perspective.

You also mentioned that within Swisscom, of course, you have a somewhat easier job since you’re not the gold diggers and have an approach that’s a little bit different from those companies that solely focus on revenues and profits. I would like to talk about the data ethics framework that you introduced, because privacy protection, biases in artificial intelligence, and doing the ethically right thing with data are more and more in the news. We are working with synthetic data to contribute to mitigating biases in AI.

Also, organizations having data ethics frameworks is a super important step in moving in the right direction. If I remember correctly, Swisscom actually has the only functioning data ethics framework in production. I’m super curious to learn more about that.

Nicolas: Maybe it’s not the only one, but maybe it’s one of the more mature ones. Well, my role also requires me to listen to all the stakeholders, which are not necessarily the stakeholders we have in the company. As I call it, and this is certainly not meant in any way disrespectfully, I’m listening to activists. I’m following what they do internationally and nationally, and I talk to them; I know many of them personally. We talk about things, and they kind of criticize what we’re doing and ask me: why do you use this and this tool with just an opt-out and not an opt-in?

In the beginning, when I started the job, I had the tendency to get a little bit angry at them because I felt attacked and felt the need to defend myself. At some point, I managed to overcome that initial reaction and truly listen and reflect on what they say. I found out along the way that trust and regulation are probably not aligned, i.e., the regulation becomes more effective, becomes stricter on companies, data controllers as well as data processors. However, the correlation with trust is probably not as strong as it should be or could be.

In other words, in my humble opinion, people feel a certain distrust towards companies despite the regulation having become much more effective. I gathered from this that the anxiety which goes with digitalization, and all the pitfalls digitalization can have for us as a society as well as for us personally, creates a gap, and I started to think about how to cross that gap. Whether rightly or wrongly, I came up with data ethics. I said: actually, compliance is maybe not the key focus; it’s really building trust. If mere compliance is not enough to bridge that anxiety gap and really provide trust to your stakeholders, what do you have to do? The answer we came up with is: “Okay, we engage in data ethics.” We asked ourselves, “How do we want to behave if no one looks?” This is how we understand data ethics in our daily business.

Alexandra: What would you say are the success criteria of this data ethics framework?

Nicolas: One success criterion, probably the most important one, is really the value base of the undertaking. If you work in an undertaking which already has a strong set of values in place and has lived up to them for decades, then it’s obviously easier to come at things from a new angle. In our case, I tried to convey to the company that it’s nothing other than what we’ve been doing in the past, i.e., if trust is one of the key assets you want to preserve, then you also need to do the right thing when it comes to data.

It’s just, let’s say, a new world where you do the same as you’ve been doing in the old world. That was the basic start. Then it’s really about providing a framework which is operable, which is functional and effective without, and this is really important, without consuming too many resources, because resources, and I’m sure this is true for all the companies in Europe as well, are an absolute top criterion. Whenever you come up with something which uses a lot of time or money, you won’t get far. Therefore, you need to find a way to achieve goals while minimizing the use of resources.

Trying to figure out where this balance is, is certainly a key factor. Then, what else: focus on benefits when you present it, and not on “do the right thing,” because people want to do the right thing, but it also sounds like work. “I’ve already got enough things to do; therefore, I actually don’t want to do more.” So come from the other side, as we talked about with data governance, and say, “Look, there is a strong benefit.”

For instance, you may try to do things ethically when you bring a new tool or a new product to the market, or you disregard ethics and see what happens. We were lucky enough to have one or two examples which, from a data protection point of view, were completely problem-free. There was no problem whatsoever. Still, there was a huge discussion in the market about whether it was ethical, and the discussion was not warranted because it was not an ethical problem.

However, it was very helpful for the organization to see that these discussions come up regardless of whether they are really warranted or not. In fact, they had to withdraw a product for a few months before they came up with it again, and the product never had success. The question could be: is it really due to ethical issues that the product was never successful? We try to show them: “Look, there is a strong business benefit in doing that. That’s the very reason to do it, and not because you want to add some compliance tasks to your daily workload.”

Alexandra: I can absolutely understand how this is important. Coming back to the efficiency of this framework and also its operationalizability, what would you say are its most important pillars? Or could you give the top five things that a data ethics framework should include so that it works in practice?

Nicolas: I think there are not even five pillars; there are actually two. One is the principles. You need to have very clear principles you want to focus on. The way we understand ethics, it is not about trying to achieve some objective criteria which someone has defined, but actually about asking yourself, “What are the values we want to uphold in our business?” Then, and that’s the second pillar, you need a process where you review what you do against these principles.

Basically, those are the two factors in our framework: a clear set of principles, and a process which makes sure that these principles are lived up to.

Alexandra: Can you share with our listeners what this process looks like? Who is involved? When does this data ethics framework start? Is it involved directly from the beginning? How does it work?

Nicolas: The first step in the entire framework is basically consulting. My team, and all the stakeholders we train along the way, provide ethical guidance as well when they work with the teams. Historically, the responsible teams have been asking, “What do I have to do to be lawful?” They tended to say, “If it’s lawful, that’s fine for me.” Now, we go one step beyond and say, “Well, in order to be lawful you have to do X; however, in our opinion, if you want to be ethical you have to do Y and Z in addition.”

Obviously, that very often involves more work and more cost, and therefore people are, as I said before, reluctant to do it, and then you need to have a discussion with them about why you still think it’s beneficial. Once they do that, we may have cases which are completely okay; they’ve done what they’re supposed to do, it does not go further, and that’s fine. Then there are cases where maybe my team will not know exactly whether they are on the right side of things, whether it’s “ethical” or “unethical”, because it’s a boundary case.

In those cases, I have the authority to ask the board to look at it. The procedure in front of the data ethics board is basically built like a court trial. People come in and present their case, the board can ask questions, and at some point the team has to withdraw. The board, in a secret deliberation, talks about the case and does just one thing: it assesses whether the case meets our principles. This is the only thing it assesses. The board doesn’t assess whether its members think the case is good. It doesn’t assess whether people may think it’s good.

It’s really, “Does it meet our values? Yes or no.” Once this deliberation has been done, the responsible team or the sponsor is called in again and presented with the results. If the board has reached a decision where it thinks one or two principles have not been met, then the sponsor has three possibilities: first, to change the case and then present it again; second, to say, “I drop the case”; or third, to go to the board of directors and present the case there, because the board of directors, in the end, always has the final word.

The way the board is framed within Swisscom, it doesn’t change the organization of responsibility we had before. It’s, let’s say, a merely formalized sounding board, with a process which makes sure that we have a clear answer on whether our values are met. However, it’s the board of directors who takes the final business decision. In the extreme case, if they said, “We know that the case is not ethical, but we still want to do it,” then first of all, they have the legal competence to do that, and they also have the moral responsibility to deal with it when it turns out badly.

We do not proselytize too much with our ethics framework. It is really, again, a help: well-founded guidance on what we think is the right thing to do, but in the end the business needs to take decisions.
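As an aside, the court-like flow Nicolas walks through can be summarized in a few lines of Python. The structure below is one reading of his description, not Swisscom’s internal system; the principle name used in the example is illustrative.

```python
# Sketch of the review flow described above (illustrative only).

def board_assessment(failed_principles):
    """The board deliberates in secret and assesses exactly one question:
    does the case meet the company's principles?"""
    return len(failed_principles) == 0

def sponsor_options(meets_principles):
    """Options available to the case sponsor after the verdict."""
    if meets_principles:
        return ["proceed with the case"]
    # If one or more principles are not met, the sponsor may:
    return [
        "change the case and present it again",
        "drop the case",
        "escalate to the board of directors, who always have the final word",
    ]

verdict = board_assessment(failed_principles=["impact on privacy"])
for option in sponsor_options(verdict):
    print(option)
```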

Alexandra: Of course. Who are the members of the ethics board? How long do they serve in the function? Is there also a point in these discussions where you bring in perspectives from customers or outside people? Or are these decisions made only within Swisscom?

Nicolas: This is a very epic question. How do you compose a board? Is it composed of employees of the company, or of external people? We thought a great deal about it in the beginning. We decided we wanted to keep it internal. The sole reason for that is confidentiality. There are different ways of having an ethics board. Some provide more strategic guidance, for instance on business models or on technologies as such. We chose to have it more concrete, with real business cases dealt with by the board.

Therefore, we have a strong confidentiality issue on that level. It is true that when you have external people, you always have a confidentiality problem. Especially if it’s, let’s say, some kind of flashy engagement, the tendency to want to share it with other people is quite high, particularly if these people are professionals who want to engage in self-marketing; then there’s this danger. Having an NDA doesn’t solve the problem. We see in many, many other areas that NDAs are routinely disregarded, even though you have them.

Therefore, we opted for an internal board, and then we tried to compose the board as diversely as we could. We looked, from a diversity perspective, at age, seniority in the company, function, sex, education, and business function. We try to have people from communications, from AI development, from corporate business, from customer business, and from compliance. We try to have as much input as possible. For instance, we have a very young lady who comes from customer complaints. She knows what customers complain about every day; basically, her DNA is to deal with customers directly. Then we have people who have business responsibility for turnover in corporate business, so we try to have as many views as possible. To my surprise, it actually worked better than I expected. In fact, the board is stricter than I would have thought in the beginning.

Myself, I’m not part of the board; I’m just standing by, so to say, and I do not have a voting right. The reason is that I have a conflict of interest. As I said, my team also helps the teams beforehand to implement ethics, so it would be strange if I had a voting right on the same things we’ve been advising on. Therefore I’m there in a standby function. We have a secretary who is a trained ethics philosopher. So we have seven people formally on that board, trying to bring in as many views as possible. We’ve been doing this for one and a half years. I can’t exclude that we will have external people on the board at some point, but at the moment we don’t see any reason why we should, because we think it works.

Alexandra: I see that you have intermediaries in there who bring in the customer voice. But especially since you mentioned that increasing trust is one of your goals, are these ethics considerations made publicly available in some way whenever a new product is launched? Or do you do anything additional to bring it out to the public, such as communicating that you have a data ethics framework?

Nicolas: The answer is: not really. We have focused on the internal side so far. I think there is a strong interest in doing that publicly as well. However, our communications department has probably been a bit reluctant to do that. I haven’t talked to them about it recently, so I don’t know where they stand at the moment, but I cannot exclude it. It’s been at least on the list of ideas for the time to come, to be more open and more transparent about it. We’ve been publicly transparent that we have it, but we haven’t shared too much about the cases we dealt with.

Alexandra: Are there any publicly available resources on how the data ethics board works in practice that interested listeners could look at?

Nicolas: Not yet. We share that more on a one-to-one basis rather than publicly.

Alexandra: So everybody who’s interested in building up a data ethics framework can reach out to you after listening to this episode. Can you give us, if it’s possible to share, a practical example of a use case that initially didn’t pass the board’s decision, and perhaps one that was ethical and beneficial for your customers? Any example?

Nicolas: Maybe one of the examples which didn’t make it through the board was an initiative by our corporate responsibility department. They wanted to measure the ecological footprint of Swisscom employees. They wanted to do that by analyzing employees’ mobility data through their mobile phones, to see how much they travel and how they do so. That was also one of the goals: they said, “We want to measure whether they should have this train subscription.” We do provide many employees with a subscription where you can ride the trains in Switzerland for a yearly lump sum. The business sponsor of that project came to the board and presented the case, and the board was not convinced that this was the right thing to do. Why? Because, first of all, the impact on the personality of the employees from analyzing their mobility data is huge.

When you raise the bar on personality impact that much, you need a tremendously good story to match it. Even then, it’s not clear that we would say yes, but you raise the stakes: the deeper you dig into personal data, the better your justification needs to be to make it work. The board was not convinced that assessing the CO2 footprint of Swisscom employees is a convincing enough goal to analyze their personal data, and therefore the board said no. In fact, it has never happened since. Obviously, it’s very important to work towards sustainability and reducing the CO2 footprint, but there are so many things you can do as an alternative that analyzing the mobility of employees is probably the last thing you would do.

In fact, the question is how effective it would be if you did it. I.e., do employees in today’s world really use their means of mobility completely unreflectively? Is it really true that they need a lot of guidance on their personal mobility so that they can contribute their share to reducing CO2 levels? For me personally, the answer is no. This topic has been so prominent in the media, in public discussions, in private discussions, et cetera. Therefore, if you start to think the case through, you find out that the goal does not justify the means, because the goal itself is probably not as beneficial as one may think in the beginning.

This is exactly the way the board reflects: questioning very deeply. What are you trying to achieve? How effective is what you are trying to achieve, and does that still justify the impact it has on the people? In fact, it’s not about law, because legally you could probably find a way to make it work. It’s about ethics. One example which worked well was the introduction of the voice assistant in our TV product.

Alexandra: Quick question before we jump to that. Do I remember correctly that how a use case would impact the individual and society is one of the key criteria this data ethics committee uses in its assessment? Or what are the core ethical principles that are used to assess each and every use case?

Nicolas: We have six principles: transparency, self-determination, the impact on privacy, the benefit for society or the individual, the governance aspect, and the discrimination aspect. You may find these are not extremely original principles, because they are the ones you see in many ethical discussions. However, what is important for us is that these principles start where the law ends. When we talk about transparency, we don’t talk about meeting the GDPR criteria; meeting the GDPR criteria is the baseline, and then we ask: is the use case, and what you do with that data, transparent beyond GDPR?

The question is not “do you have a data protection policy or data protection information?” but rather “how do you make sure that people really understand what you do and really see what you do?” You go beyond what’s required by law.
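For illustration, the six principles can be read as a checklist applied on top of the legal baseline. Treating each principle as a simple pass/fail is an assumption made for this sketch; Nicolas describes a deliberation, not a scoring algorithm.

```python
# Illustrative checklist over the six principles named above.
# Reducing each principle to a boolean is a simplification.

PRINCIPLES = [
    "transparency",            # beyond GDPR: do people really understand what we do?
    "self-determination",
    "impact on privacy",
    "benefit to society or the individual",
    "governance",
    "discrimination",
]

def unmet_principles(assessment):
    """Return the principles a use case does not meet.
    Meeting the law (e.g. the GDPR) is only the baseline; each
    principle asks what the use case does beyond that baseline."""
    return [p for p in PRINCIPLES if not assessment.get(p, False)]

case = {
    "transparency": True,
    "self-determination": True,
    "impact on privacy": False,  # e.g. deep analysis of personal mobility data
    "benefit to society or the individual": False,
    "governance": True,
    "discrimination": True,
}
print("Not met:", unmet_principles(case))
```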

Alexandra: You mentioned that you’ve been doing this for one and a half years now. Could you imagine that this more ethical approach to data utilization, going beyond compliance, would also have positive effects on the profit side, since privacy and ethics are now top of mind for consumers and they start to expect this from companies? What’s your take on that?

Nicolas: I think it’s not that big yet. I think it’s seen by the public, but with the broad public, if you have several million customers, I don’t think that if we asked them, “Do you know that Swisscom has an ethics framework?” most of them would say yes. In fact, my expectation is that only a fraction of them would have heard anything about it. However, sometimes I think it’s not that important that people know about it consciously. I think what’s even more important is that they don’t feel uneasy. As you know as well as I do, anxiety is something you may or may not be aware of.

Therefore, I think the goal is that if you go around and ask people, “Do you feel comfortable with Swisscom dealing with your data?”, they answer, “Yes, I do feel comfortable. I don’t know why, but I do feel comfortable.” I think this is what we want to get at. The less criticism you meet, the less uproar you have, the less negative impact you produce. I think then you’re on the right track.

Alexandra: I would agree. It’s really about not knowing the individual measures that an organization takes, but having the general perception that you can trust them, that they handle your data responsibly, and that you have a good gut feeling about being their customer. Sorry, I interrupted you before when you wanted to share one of the success stories. I’m super curious to hear this one as well.

Nicolas: As I said, it was about the voice assistant of our TV product. Basically, the box which steers the TV product now has voice assistant functionality. We introduced it about two years ago, when the voice assistants of all the companies were very much in the media for making recordings you didn’t know of, et cetera. We worked very intensively with the responsible team on how to do that, how to implement it, but also how to structure the entire functionality in order to make sure that it met our criteria. The solution we came up with was that the customer actually needed to flip an electronic switch in order to make the voice assistant work.

Normally, the team would have preferred the switch to be set to on by the time you deliver it to the customers, so you just plug in and play. However, we convinced them that for ethical reasons it would be better to ship it switched off, so the customer needs to make a conscious switch to on in order to make it work. That wasn’t the only thing: on the screen, they then need to check two boxes in addition to make it work. It’s basically, let’s say, almost a double opt-in.

On the transparency side, we tried to explain it on screen in the best words we could find, and we had leaflets. In addition to that, we had a media campaign explaining it when we launched the product. We had a very strict governance regime within the company on how to deal with those voice samples. We took all the measures we possibly could in order to make sure that these voice samples are not used for anything other than steering the TV product. We’ve also provided the functionality that you can actually call the call center and say, “I want my samples to be deleted.”

We preserved, behind Chinese walls, a way to re-identify the voice samples which belong to you, so we can actually delete them when you want that. We tried to strike a balance there: complete anonymity would have been one possibility, but then you can’t delete the samples on request. We tried to find a way around that, and that’s, in a nutshell, what we tried to achieve.

This had all been done before the product went to the board. Obviously, the board then had no reservations in finding it in compliance with our principles, so that’s a success story. The market didn’t react at all on the data protection side; we had no negative feedback whatsoever when it was launched. I found that quite surprising, because voice and voice processing is a rather hot topic. Therefore, I think we did something right there.
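The deletion-friendly design Nicolas describes (voice samples re-identifiable only behind strict internal walls, so they can still be removed on request) is a common privacy-engineering pattern. Here is a minimal sketch under that reading; all names and structures are assumptions, not Swisscom’s implementation.

```python
import uuid

# Voice samples are stored under random pseudonyms; the mapping from
# customer to pseudonyms lives in a separately guarded store, the
# "Chinese wall" mentioned above.
voice_store = {}      # pseudonym -> voice sample
identity_vault = {}   # customer_id -> list of pseudonyms (restricted access)

def store_sample(customer_id, sample):
    pseudonym = uuid.uuid4().hex
    voice_store[pseudonym] = sample
    identity_vault.setdefault(customer_id, []).append(pseudonym)

def delete_on_request(customer_id):
    """Honor a call-center deletion request: resolve the guarded mapping,
    delete the samples, and forget the link. Fully anonymous storage
    would make this impossible; that is the trade-off discussed above."""
    pseudonyms = identity_vault.pop(customer_id, [])
    for p in pseudonyms:
        voice_store.pop(p, None)
    return len(pseudonyms)

store_sample("customer-42", b"voice sample bytes")
print(delete_on_request("customer-42"))  # -> 1
```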

Alexandra: I would definitely say so: not only having the digital opt-in but also the switch directly on the product, and then all the measures you went through to make this new functionality publicly known. I think this is very responsible and role-model behavior on a company’s side. Do you know anything about the adoption of this new feature? Was it as high as expected, or do you feel that there was a decrease in adoption due to this double opt-in process, which of course made it a little bit more complicated?

Nicolas: I think the success was reasonable. It was not an overwhelming success, but I think that’s not due to the product or the data processing; rather, there is a strong reservation among people about using their voice. We see that in other functionalities as well. For instance, people do not use the dictation functionality on their iPhones and other mobile phones as much as they could, although I think it really works well. People are reluctant to use their voice.

This is also, in my view, due to very negative feedback in the press, and I think people understood that you can do much more with voice than you think. This is exactly the anxiety I mentioned at the beginning of our podcast. This is really where I think the success of digitalization lies, and also the threats. It’s in our heads, in everyone’s head, and I think we need to take that very, very seriously. We try to do everything we can to cater to this anxiety and make sure that when people entrust us with their data, they have a very good reason to do so.

Alexandra: Absolutely. I think this is really one of the aspects we should approach in as secure and responsible a way as possible, but if we do, there’s huge potential that technologies like voice can make technology more accessible, also for groups of people who are not digitally native. I’m just thinking of, for example, my grandpa, who is 87 at the moment. He loves his voice functionality and can do so many things with his smartphone now that he wouldn’t be able to do otherwise.

He, of course, is not that critical and concerned about privacy matters because there’s just such a huge benefit for him, but there are also many people who are rather reluctant to use it. I think that doing it the way you described, this approach of Swisscom, will really help to build this trust and eventually also increase the adoption of services like that. One last point I wanted to come to before we end the episode: I would be curious to hear your take on the current regulatory developments that we see, especially on the European Union’s side. Are you happy? Do you think we’re moving in the right direction? Is there anything you’re missing on the regulatory side? What’s your take?

Nicolas: I listened to your last podcast with Axel and I thought that was a really, really good episode. There is not much I can add to what Axel has already said, so if anyone hasn’t heard it yet, it’s warmly recommended. Now, from a pure practitioner’s point of view, I’m slightly frustrated by the regulation. I’ve been quite a big fan of the GDPR. Why? Because, despite the fact that it’s burdensome and cumbersome, I think there’s a huge benefit in the direction we just discussed: alleviating anxiety, and making companies do the right thing, which they haven’t been doing in the past decades. So, a check mark behind that.

However, on the entire Schrems II matter, I’m completely unhappy. Why? Because I think there is a war between Europe and the US, a war about digital supremacy. I fully understand that Europe tries to catch up, which it has failed to do in the past two decades, as Axel, I think, very nicely laid out. However, I think it’s completely the wrong strategy to pull companies into that fight. We have that problem now, like many other companies I know, when we want to move into the cloud.

We have the challenge of trying to bring down costs and increase functionality at the same time by moving into the cloud, so this is a strategically extremely important push. Obviously, it’s also about risk management and risk assessment. Yes, it’s true that we have this Sword of Damocles hanging over our heads due to the US capturing this data with their intelligence services. However, there is nothing we can do. We are now left in the gap between the US and the EU, left with no solutions, but with an extreme push from the business side. What do we do?

I think it is completely unfair of the EU to push companies into that position. I read yesterday that Joe Biden is obviously in talks with the EU Commission to find a Safe Harbor three, a Privacy Shield number two, or whatever it will be. There is obviously a push in that direction now, which is fine. But if you want to digitalize your region, i.e., Europe, this is completely the wrong way to do it, because it doesn’t foster trust. It destroys trust. As we said, trust is key. If you want to be successful in digitalization, you need to build trust.

You need to do everything to build trust and avoid everything which destroys trust. That’s exactly what they did with Schrems II, although I have no criticism of Schrems’ organization, because even though they make our lives much harder, I think they’re doing a great job. We need those organizations which are extremely intelligent, savvy, committed, and determined to protect the rights of the people. We need those players in the game, but it’s certainly the wrong decision to put a transatlantic war on the shoulders of the companies and on the shoulders of the people in an undertaking who deal with privacy and data protection. I think I’ve made my point clear. I’m not a big fan of this initiative.

Alexandra: I’m seeing the frustration there. If you say it destroys trust, what could they have done differently, or what would you wish to see to make this not as hard on the companies?

Nicolas: They need to make it work. They need to make it work. I think the US as well as Europe have a strong interest in balancing their forces. The irony of it is that the European intelligence services are no different. All of them want to have that data. In fact, I think it may be politically neat to engage in the finger-pointing game and point the finger at the US, and to claim that the European foreign intelligence services stay out of the game as much as they can.

However, that’s simply not true. Therefore, we need to have a very open discussion in society about where there is a strong interest in taking the data, let’s say from a security point of view, and where it is just too much. This may be painful for governments, I appreciate that. However, I think it’s unavoidable. This is where we need to start, and not with saying whether the US are the bad guys or not, because in my world, they’re not. They just had the “unfortunate luck” of having Snowden. Europe hasn’t had its Snowden yet. I think that’s the difference.

If you hear that Denmark supported the US in spying on the German government, [crosstalk] what can you do? There are a lot of things we don’t know. Therefore, I think we need a workable solution which balances the interests. It’s not about Schrems II in my world.

Alexandra: Let’s see what they’re going to come up with and whether this will make it easier for companies or not. Let’s have some patience and see what’s going to happen.

Nicolas: I have my doubts.

Alexandra: I can imagine. Before we come to an end, any last remarks or words that you would like to share with our listeners or specifically practitioners in the field of data governance and compliance?

Nicolas: Yes. I think there is one thing which stands above everything else that compliance folks like me have to bring: resilience and patience. This is probably one of the most important, if not the most important, assets I can bring to the table. It’s a long way. It’s a stony way. One needs to keep up the good faith, the passion, and the commitment towards what we do, and not get too frustrated by all the little loopholes and obstacles along the way. There are quite many of them.

Therefore, I think it’s about focusing on your personal health and on a positive spirit, by being resilient and patient, and remembering that what we do is not just for our job, and not just for our lifetime. It’s something we do for society. It’s a little stone in the entire mosaic. I think this awareness is key for me. We all need to contribute to the right goal, on all levels: the activists, the professionals, people like you in technology, the whole community, and the regulators, of course. This is what we need to think about and not forget in the daily struggle.

Alexandra: Very wise words. I think this really helps to put the daily struggles into perspective: that this really is a journey that we are just in the middle of, or have just started, and that will continue over the next years and decades. Perfect. Nicolas, it was really wonderful to have you on the show. I think there are so many takeaways for our listeners.

Thanks a lot for your time. It was a pleasure.

Nicolas: Thank you for your patience, I greatly appreciated it. Your time and the possibility to talk about this was really interesting, and I’ve greatly enjoyed it.

Thank you.

Alexandra: Thank you very much.

Jeffrey: Fantastic interview, Alexandra. This conversation is a solid lesson in how to address data governance effectively. Let’s pull together the most important takeaways from today’s conversation. I’ll get us started with number one. Nicolas says to remember that compliance is only a means to an end: building and preserving trust.

Alexandra: That’s a good one. Number two, to make compliance work, we need to convince all involved parties of the benefits, and we need to find ways to build compliance into processes. How? By providing help. Data governance teams should act as consultants that people can turn to, and by providing tools for self-assessment and automating as many things as possible. Then, also by proposing new technologies that help your teams to achieve the same or better results while minimizing risks.

For example, with synthetic data that can help with data minimization and scaling compliance. Then also via communication: you need to drive home your message again and again to raise awareness. Nicolas also emphasized how important it is to be patient.

Jeffrey: Yes, yes, yes to all of this. On to number three: Nicolas says to conduct a bit of a thought exercise to assess yourself. What does he mean? Imagine having to defend your data protection practices publicly on the evening news, or if you’re a bit younger and you don’t watch the news, imagine having to defend this on your own Instagram Live or TikTok. If you feel uncomfortable, it means you’ve got some things to sort out.

Alexandra: I’d love to see more data privacy coverage on TikTok, definitely. Number four: what is data ethics? Data ethics goes beyond mere legal compliance. It governs how you behave when no one is watching. You should use your existing company values and make data ethics an extension of those values. Nicolas recommended that you really listen to activists and engage in conversations with them, because they’re a valuable resource for learning what you could do better. You should also focus on the business benefits of creating ethical products.

Jeffrey: Nice. I like number four. Here’s number five. How to create a data ethics framework. The first step is to add ethical guidance to the legal framework across the company. Go ahead and establish a data ethics board. It functions just like a court. Its job is to assess if a case meets a company’s principles. Diversity is key here to ensure you have the perspectives of people with, of course, various backgrounds. Include people on this board with various levels of seniority, with people from different divisions in the company, and with different job functions.

Of course, goes without saying, but ensure there’s inclusive representation of gender, race, and creed so that the ethics board reflects the point of views of your employees, but also your user or customer base.

Alexandra: Absolutely. Diversity is always important. Number six: the six ethical principles for the data ethics assessment at Swisscom are transparency, self-determination, impact on privacy, benefits to society or the individual, governance, and discrimination.

Jeffrey: I like this. I think Nicolas could write a book or a guidebook on data ethics. We already have some good chapters outlined here with these six ethical principles. Speaking of which, I think it would be a really good thought exercise for all of our listeners: what are the six ethical principles at your company? If you don’t have them outlined, what would you imagine they should be? Alexandra and I would love to hear. If you send us an email to podcast@mostly.ai, we’ll review them and share our favorite ones on a future episode. How does that sound, Alexandra?

Alexandra: That’s a great idea, Jeff. Looking forward to reading those.

Jeffrey: Awesome. Thank you, Alexandra, for hosting, and thank you, Nicolas, for being our guest. This was a lot of fun. We learned a lot and are looking forward to seeing you all next time.

Alexandra: See you next time.

The Data Democratization Podcast was hosted by Alexandra Ebert and Jeffrey Dobin. It’s produced, edited, and engineered by Agnes Fekete and sponsored by MOSTLY AI, the world’s leading synthetic data company.
