Jeffrey Dobin: Good morning and welcome to the Data Democratization Podcast! I'm Jeffrey Dobin, a privacy expert at MOSTLY AI. Organizations leverage our technology to democratize data by generating truthful, privacy-compliant synthetic alternatives to privacy-sensitive data. If you're unfamiliar - and I can't claim to have come up with this analogy, but I really like it - AI-generated synthetic data is a bit like diet soda. You get all the fizz without the calories. With synthesis, you get all the richness of data without exposing any personally identifiable information. Today is Data Privacy Day, and to celebrate, we're releasing the first episode of our brand new podcast dedicated to exploring inspiring stories from the frontiers of data and privacy management. This podcast will bring you the most influential data leaders, who will share their most valuable insights and advice. Our first guest is Shampa Chatterjee, Director of Data Privacy at Silicon Valley Bank. Shampa spent almost 18 years at American Express and was working in the field long before privacy was even a thing, which makes her an OG in the data privacy world. We're so grateful to have her. Today she is co-chair of the IAPP's chapter in Phoenix and also a member of the Women Leading Privacy Advisory Board. Shampa, Happy Data Privacy Day! Thanks for joining us.
Shampa Chatterjee: Thank you so much, Jeff. And thank you for having me. It's indeed an honor.
Jeffrey Dobin: So let's jump right into things. I understand you kind of fell into the privacy space, but that you've remained working in this area by choice. Why is this? Can you share your story?
Shampa Chatterjee: Yeah, sure. So I'm going to start off with a little bit of a disclaimer that the views I express here are my own and do not represent those of my employer. Of course, my opinions and viewpoints are shaped and informed by my work in the financial services sector here in the United States over the last 25 years or so.
Being a computer science major in grad school, I started as a programmer at American Express about 18-19 years ago, and I likely would have stayed along those same lines. Back in the day, data protection obviously wasn't a formal profession. Organizations would probably have one or two people sitting within either their CISO organization, compliance, or risk who would work on things related to privacy, among other things. Being in the financial services space, though, has been helpful for me.
So, as you know, here in the US we've had sectoral laws such as GLBA, which regulates consumer privacy and the concept of non-public personal information. Working for a big bank, I was certainly aware of the risks and obligations, which was definitely helpful. Very early on, I also started supporting international markets around the globe, which gave me an appreciation for the differences in the way privacy is perceived in non-US countries - for the regional nuances, if you will.
For example, in the EU nations, privacy is considered a basic fundamental human right, which is not the case here in the US. The awareness among laypeople on Main Street is so much more deeply rooted there than it is here, and the regulatory framework is so much more mature. The US doesn't even meet the EU member states' notion of adequacy, for example. Then, in the Asian countries, you're seeing a rapidly evolving privacy regulatory landscape. India has a sweeping national privacy law taking shape as we speak. In Latin America, even in the past few years, you've seen comprehensive privacy laws in countries like Brazil with the LGPD. Long story short, the evolution of the regulatory space has been fascinating to watch.
So going back to my days at American Express: very early on, it kind of fell into my lap to create a consent management platform for our international markets. At that time, this was being handled within the marketing department, so that's where I sat. But it was an opportunity for me to design customer digital journeys and UIs that complied with different international consent requirements. It allowed me to put myself into my customers' shoes. What would I like to see from my bank? What assurances would I like to get from my healthcare providers or my children's schools that they were doing the right things when it comes to handling my personal and sensitive data? It's a little bit about advocacy, isn't it? Being an advocate for the customer, which was very close to my heart even before then.
In my personal life, serving the community and advocating for those who are not as privileged as I find myself to be has always been very important to me. As my dad used to say, or likes to say, it's a bone you're born with, not something that you chance upon or find along the way. Over the years, I've volunteered extensively with organizations such as Habitat for Humanity, the Special Olympics for children with special needs, and several local food banks and shelters for the homeless, oftentimes raising grant monies to support these organizations financially. I've taught financial literacy and workplace readiness in classrooms to schoolchildren for many years. And you know what? It was not one aha moment that made this connection clear to me. It was really a gradual journey that unfolded one layer at a time. So here I am: that idea of customer advocacy, of advocating for someone, carried itself into my professional life. Of course, it was egged on by external triggers, not least the high-profile data breaches and cybersecurity incidents that regularly come to our notice, and supported by a company that truly cares about privacy and data protection.
I would also like to mention organizations such as the IAPP that you mentioned in the intro, the International Association of Privacy Professionals. It's really the only organization of its kind, and it has done a great deal for the privacy profession. In fact, I was in the first batch to be certified in CIPT, which is their technology certification, back in 2009. I've been with them since then, volunteering, and most recently, this year, becoming part of their Women Leading Privacy Advisory Board. I'm really excited to be involved - these are amazing women, and I learn a lot from their journeys and their work. So very exciting times for me, indeed.
Jeffrey Dobin: Absolutely. And I can tell it's definitely a bone you were born with, and that you've helped many others along the way, not just at your organization but also in your free time. That's a great quality: you've been able to empathize and really put yourself in the shoes of others, to look at things from their vantage point, whether it's privacy or whether it's someone who isn't as fortunate. I've worked with some of those organizations in the past, like Habitat for Humanity and the Special Olympics, and I think it's really great that you've done that.
Shampa Chatterjee: Empathy. That is the word that I was looking for. You hit the nail on the head, thank you!
Jeffrey Dobin: Let's go back 20 years or so for a minute. You mentioned something that sounds obvious now that you've said it, but I hadn't heard the term before: a consent management platform. Was this a new thing at the time, in the early 2000s, or was it something that already existed? And how did you go about creating it? Can you share a little insight into what that means for a layperson like myself?
Shampa Chatterjee: It was probably not even a term then. I think I picked that term up later in my career. It was just a marketing platform - they called it marketing because the idea of consent and choice plays a very big role in marketing and in how we use customer personal information to generate and disseminate marketing. So at that time, I don't think this was called a consent management platform.
It was an in-house capability that we were developing that basically allowed the customer to make their preferences known to American Express. What kind of marketing are they interested in? What kind of offers would they like, and what kind of offers do they not want to receive from American Express, and so on? So it meant designing the entire front end of the customer experience, internally creating a system of record for the data captured during that customer journey, and then disseminating that information to marketing so they could apply it and honor those choices in the way required by the laws of that particular country. That was the entire in-house project, or initiative, that I led.
I believe I traveled to 10 countries where we rolled that out, initially for requirements gathering, working with legal, compliance, and business folks in all of those markets. And then eventually, after rollout, I had to go back again to do the training and help with the implementation activities.
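To make the platform Shampa describes a bit more concrete, here is a minimal, purely hypothetical sketch of what such a consent system of record might hold per customer: channel-level preferences captured in the front end, plus the market whose consent rules apply. All names and fields are illustrative assumptions, not American Express's actual design.

```python
# Hypothetical sketch of a consent "system of record" entry: preferences
# captured in the customer-facing journey, stored centrally, and later
# checked by marketing systems that must honor them per market.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    customer_id: str
    market: str                                   # country whose consent rules apply, e.g. "GB"
    channel_preferences: dict[str, bool] = field(default_factory=dict)
    captured_at: str = ""

    def record(self, channel: str, opted_in: bool) -> None:
        """Capture a single opt-in/opt-out choice with a timestamp."""
        self.channel_preferences[channel] = opted_in
        self.captured_at = datetime.now(timezone.utc).isoformat()

    def may_contact(self, channel: str) -> bool:
        """Marketing checks this before sending anything on a channel."""
        # Default to False: no recorded consent means no marketing contact.
        return self.channel_preferences.get(channel, False)

# Example: a UK customer opts in to email offers but out of telemarketing.
rec = ConsentRecord(customer_id="C-1024", market="GB")
rec.record("email", True)
rec.record("phone", False)
print(rec.may_contact("email"), rec.may_contact("phone"))  # True False
```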
Jeffrey Dobin: That's so fun that you got to travel to 10 countries or so while also working. A question back about the consent management platform: it sounds like you put things in place that are now required by law in terms of getting people to opt in or consent to different things. Were you doing this because you were data driven and it was important to capture this information to better serve customers from a marketing standpoint, or because that's what was required for compliance, or somewhere in between?
Shampa Chatterjee: Great question. I think at the time it was more because we were customer centric. American Express as a brand is known for being a very customer-centric, customer-focused brand. Although some of the countries may have had certain consent requirements, I think we were doing it more to be customer-centric, the get-to-know-your-customer kind of brand. We wanted to know what you want and what you like, the idea being that your opted-in customers are probably your most engaged and most loyal customers. I think that was the incentive behind creating it, not so much a requirement driven by a compliance or regulatory need.
Jeffrey Dobin: Well, it's funny, or maybe it's not funny, how things have changed and evolved over the years. You mentioned before that privacy is considered a basic human right from one perspective if you're in the EU, versus here in the US, where we have the CCPA and other state laws coming down the pipeline. Now that we have some context and really understand what motivates you - and one of those things seems to be helping others - let's give the listeners some actionable takeaways they can apply themselves. What are your three recommendations for the audience?
Shampa Chatterjee: I think the first recommendation would be to create a policy framework that you can hang your hat on. Privacy can sometimes be viewed as a somewhat subjective, at times judgment-based concept. When you put pen to paper, you not only crystallize your thoughts, but you create something concrete that you can lean on, and you're able to connect it with other frameworks that may already exist in your company, such as your information security program or your data governance framework. A solid policy framework is also a good way to evidence to regulators that you are actually doing this right. So envision and create that concrete policy framework.
The second thing that is very important for privacy is to follow a principle-based approach. Privacy is about trust and fairness. Creating a program built upon a set of principles rooted in the core values of your company will keep you ahead of the curve. Principles are a set of promises that you're actually making to your client or customer: the ideas of collection minimization, retention limitation, transparency, allowing individual access, maintaining the integrity and quality of your personal data, and so on. Once you believe in these, you will not find yourself having to respond from scratch to each new regulation that comes along - the program you have built around these values and principles likely already accounts for it. Privacy then becomes more than a compliance checkbox exercise for you. That's the second thing.
The final thing I'd like to touch upon is privacy risk management. If you follow a risk-prioritized approach to privacy, it breaks things down for you and for your company. At the end of the day, privacy is a risk discipline, so being able to formally put it into your enterprise risk framework and risk management strategy is important. Privacy can be viewed not just as a compliance or operational risk, but also as a social and reputational risk for your organization. If you follow sound principles of privacy risk management, this will lend credence to something that can sometimes come across as being at odds with business objectives. As an example, if you're able to assess and quantify the inherent risk in, let's say, a particular process, system, platform, or even your vendors, this will give you a good sense of what kind of control environment you should be looking to build. So just applying, you know, Risk Management 101 principles is very helpful.
Jeffrey Dobin: There are some really good points here, so maybe we can dig a little deeper on a couple of these. Let's start with the last one. You just mentioned quantifying that risk. For someone who is new to privacy or to the space, how does one even go about thinking about this as a multi-step approach and creating a framework like that? What does that look like in practice? Can you share a bit of insight on how to create a framework, or an example of what a policy would look like, and then how you actually measure those things to make it more objective?
Shampa Chatterjee: Thankfully, there are models available that you can leverage very easily, especially for risk assessment. That's an easy one, because the two factors that go into quantifying inherent risk are the impact and the likelihood of that risk materializing. If you put numerical scores against those, you should be able to come up with a high, medium, or low inherent risk. There are models available that you can very quickly leverage. Once you have that, the next step is to quantify the effectiveness of your controls, because that will give you a good sense of what residual risk, if any, you are left with. The stronger your control environment, obviously, the less residual risk you have to deal with, and vice versa. So I think quantifying things in privacy makes a lot of sense. Privacy is not as mature a discipline as something like information security, so all the more reason to make it quantitative, so that you can show management or your C-suite the implications of, let's say, not following through on something you are recommending, or what the fallout could be in terms of regulatory sanctions, fines, and things of that nature. The more numbers you can add to your presentation, the better. This is very important, especially for privacy.
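As a minimal sketch of the scoring approach Shampa outlines - inherent risk from impact and likelihood, then control effectiveness reducing it to residual risk - here is one illustrative way to put numbers to it. The scales, thresholds, and function names are assumptions for illustration, not any specific risk model she uses.

```python
# Illustrative risk-scoring sketch: inherent risk = impact x likelihood,
# then control effectiveness discounts it down to residual risk.

def inherent_risk(impact: int, likelihood: int) -> int:
    """Both inputs scored 1 (low) to 5 (high); product ranges 1-25."""
    return impact * likelihood

def residual_risk(inherent: int, control_effectiveness: float) -> float:
    """control_effectiveness is 0.0 (no controls) to 1.0 (fully effective)."""
    return inherent * (1.0 - control_effectiveness)

def rating(score: float) -> str:
    # Illustrative cut-offs for high/medium/low on a 1-25 scale.
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# Example: a vendor process with high impact (4), moderate likelihood (3),
# and controls judged 60% effective.
inh = inherent_risk(impact=4, likelihood=3)          # 12 -> medium inherent risk
res = residual_risk(inh, control_effectiveness=0.6)  # 4.8 -> low residual risk
print(rating(inh), rating(res))
```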
Jeffrey Dobin: You mentioned info security, and we know organizations are spending lots of time, money, and resources on infosec. When you think of privacy, do you think of it more as something relating to compliance? Or do you think of it almost as insurance for the company and something to do proactively? Or do you see some organizations applying privacy rules or applications in-house more proactively?
Shampa Chatterjee: It started out as a compliance kind of thing that you had to do - almost a compliance burden. But now it is morphing into more of a business enabler, or a differentiator in the marketplace. Privacy is all about trust, and if your clients are able to trust you with their personal and sensitive information, that's where you differentiate yourself as a business in the marketplace. What started as a compliance burden, a compliance obligation, is slowly morphing into something that people are doing, or should look to do, proactively. And that goes back to my earlier point about building your framework, creating the policy structure, and building it around values and principles, not around a specific regulation or the requirements provided by certain regulations.
Jeffrey Dobin: What is your one prediction for 2021?
Shampa Chatterjee: A need for discipline and sound data management practices will become very important for organizations in 2021. The idea that data is your key strategic asset has been around for a while now. Knowing your data - as cliche as that might sound - is really key to the success of an organization. Organizations are realizing that in order to unlock the potential of the data they have or are collecting, they need a good data governance strategy, again founded upon those ethical principles we spoke about earlier. A chief data office founded upon strong data governance principles and a solid framework will be a successful data office. And when I say data, I mean personal and sensitive data. What are you collecting? What are you processing it for? How are you classifying it? Where are you storing it - in the cloud or on-prem? What countries is it going to? Is it flowing across borders? Who are you sharing it with - for example, is there third-party data sharing? All of those things are very, very important not only to know, but also to put sound governance strategies around. It's already out there, but I feel there will be further fine-tuning of this area in 2021 in a very big way.
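As a purely illustrative sketch of the "know your data" questions listed above, here is what one entry in a hypothetical data inventory might capture: categories collected, purpose, classification, storage location, cross-border flows, and third-party sharing. The field names and values are assumptions, not the schema of any particular governance tool.

```python
# Hypothetical data-inventory entry covering the governance questions above:
# what is collected, why, how it is classified, where it lives, where it
# flows, and who it is shared with.
from dataclasses import dataclass, field

@dataclass
class DataInventoryEntry:
    system: str
    data_categories: list[str]                  # what personal/sensitive data is collected
    purpose: str                                # what it is processed for
    classification: str                         # e.g. "sensitive", "confidential"
    storage: str                                # e.g. "cloud:us-east-1" or "on-prem"
    destination_countries: list[str] = field(default_factory=list)
    third_parties: list[str] = field(default_factory=list)

    def crosses_borders(self, home_country: str) -> bool:
        """Flag entries that may need cross-border transfer safeguards."""
        return any(c != home_country for c in self.destination_countries)

entry = DataInventoryEntry(
    system="loan-origination",
    data_categories=["name", "ssn", "income"],
    purpose="credit decisioning",
    classification="sensitive",
    storage="cloud:us-east-1",
    destination_countries=["US", "IN"],
    third_parties=["credit-bureau-x"],
)
print(entry.crosses_borders("US"))  # True: data also flows to IN
```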
Jeffrey Dobin: You mentioned some interesting things here around PII, personally identifiable information, knowing your data, and unlocking the potential of the data that's being collected. You talked about cloud versus on-prem and cross-border data sharing. In the same spirit of looking to the future: what privacy-enhancing technologies are you most excited about that will help you accomplish some of those goals?
Shampa Chatterjee: I sit in the risk and compliance organization, right? So on most days I'm telling the business what they cannot do, either because there is a law out there or there is a policy. What I would like to do is provide them with alternatives, ways they can achieve their objective but in a privacy-friendly way. This is a really exciting area for someone like me to totally geek out on. The idea of embedding privacy-preserving technologies in your most critical systems and business processes also really fulfills the idea of privacy by design and by default. One of the foundational pillars in the famous privacy by design framework that was created and socialized by Dr. Ann Cavoukian is exactly this: embedding privacy by design and by default. And there are so many technologies out there already.
I personally have used automation in a big way, and I cannot stress enough the value that automation brings to the table. Automated controls are better controls because they provide consistency, reliability, and a solid audit trail. Of course, a lot depends on exactly how you implement them and how you configure them to work most optimally in your environment to give you the best value.
Take automated privacy impact assessment tools as an example. We all use privacy assessments today. I started back when we used to do these on Excel spreadsheets: very manual, very tedious, very hard to version control. But now you have amazing tools with workflow management that let you manage the whole privacy impact assessment in an automated fashion.
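As a rough illustration of the spreadsheet-to-workflow shift described above, here is a minimal, hypothetical sketch of a PIA record with explicit states and an audit trail. The states, fields, and names are assumptions for illustration, not any specific vendor's tool.

```python
# Hypothetical sketch of what PIA workflow management replaces when you move
# off spreadsheets: a record with explicit states and a built-in audit trail.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class PIAStatus(Enum):
    DRAFT = "draft"
    UNDER_REVIEW = "under_review"
    APPROVED = "approved"
    REJECTED = "rejected"

@dataclass
class PrivacyImpactAssessment:
    system_name: str
    data_categories: list[str]
    status: PIAStatus = PIAStatus.DRAFT
    audit_trail: list[str] = field(default_factory=list)

    def transition(self, new_status: PIAStatus, actor: str) -> None:
        # Every state change is logged, giving the consistency, audit trail,
        # and version history that manual spreadsheets lack.
        stamp = datetime.now(timezone.utc).isoformat()
        self.audit_trail.append(f"{stamp} {actor}: {self.status.value} -> {new_status.value}")
        self.status = new_status

pia = PrivacyImpactAssessment("marketing-crm", ["email", "purchase_history"])
pia.transition(PIAStatus.UNDER_REVIEW, actor="privacy_analyst")
pia.transition(PIAStatus.APPROVED, actor="privacy_officer")
print(pia.status.value, len(pia.audit_trail))  # approved 2
```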
DSRs - you and I both live in a world where data subject requests have become huge, and there are solutions that help organizations facilitate these inquiries made by individuals who want to exercise their data rights. There are tools for data discovery - we talked about data discovery a little earlier - automated technology that helps organizations determine and classify the kind of PII and sensitive data they own and process. And there are deidentification solutions that help data scientists, researchers, and other stakeholders derive value from data sets without compromising the privacy of their data subjects or clients. All of these tools are out there.
This is a very exciting area, and one that has been growing steadily over the last few years. I think around GDPR, or right before GDPR, is when we saw a whole slew of these tools come out and hit the marketplace. Those were kind of the initial versions; now you can see the level of sophistication they have reached. It's a really exciting area - privacy-preserving technologies that are out there and that people are building on a regular basis.
Jeffrey Dobin: In the same breath as PII and deidentification - do you see yourself using a synthetic data solution? Are you already using other tools today to anonymize data?
Shampa Chatterjee: Yes, we use some tools. The idea of synthetic data is just fascinating to me. I just feel like that is a solution to all of my problems. Literally. Being able to get my hands on this AI-generated synthetic data - I know very little about it, but I know enough to know that it would solve a lot of the problems I deal with on a daily basis. This is data that is completely anonymous. It looks like my client data, but it doesn't relate back to any of my clients, which is awesome. And it can be leveraged in the same places where today I am having to use production data, because I don't have access to this synthetic, non-private, non-personal data. What gets me excited, gets me going, is knowing that there are companies really investing time, effort, and a lot of money to come up with these solutions. I see banks such as mine, other institutions, and even healthcare and all of these other areas - I'm sure these are absolutely essential strategies that organizations must embrace for success.
Jeffrey Dobin: The themes I'm picking up from your 2021 prediction and from the technologies you're most excited about really revolve around unlocking the data that is there, and also automating things to be more efficient in your work.
Shampa Chatterjee: Absolutely. It's all about data automation.
Jeffrey Dobin: Shampa Chatterjee is the Director of Data Privacy at Silicon Valley Bank, co-chair of the IAPP's Chapter in Phoenix and a member of the Women Leading Privacy Advisory Board. Thank you so much, Shampa. It was a pleasure getting to know you today.
Shampa Chatterjee: Thank you so much. I enjoyed it thoroughly.
Jeffrey Dobin: I hope you enjoyed listening to the very first episode of our show. We will be back soon with another exciting story from the frontlines of data and privacy. Please follow, download, and subscribe to the Data Democratization Podcast. If you have any questions, please send us a voice recording at podcast@mostly.ai! Happy Data Privacy Day, everyone! Make sure you take this opportunity to reflect on your privacy health, be it in a professional or personal setting. Thank you for listening and see you next time.