Episode 18

Data in action with Lisa Palmer from Splunk

Hosted by
Alexandra Ebert and Jeffrey Dobin
Lisa Palmer is the chief technical advisor of Splunk, the popular data platform designed to remove barriers between data and action. Besides working for Splunk, she is a university professor, a podcast host, and an author. She spent years at Microsoft, Gartner, and Teradata, building up her unique perspective on data and technology-related business opportunities. In this episode, Lisa shares her most exciting insights and data stories around tackling real-life problems with data, including:
  • fighting wildfires and winning F1 races,
  • how privacy-enhancing technologies like homomorphic encryption and synthetic data can be used for social good,
  • why we need truly complete datasets to address biases,
  • the role synthetic data plays in ethical AI and bias mitigation,
  • how to increase diversity, especially in tech, and
  • how to think about data opportunities in times of disruption.
Subscribe to the Data Democratization Podcast on Spotify, Apple Podcasts, or wherever you get your shows. Listen to the previous episode and read the transcript about Synthetic data engineering in insurance and banking!

Transcript

Alexandra Ebert: Welcome to the 18th episode of the Data Democratization podcast. I’m Alexandra Ebert, MOSTLY AI's chief trust officer. I'm here with my co-host, Jeffrey Dobin, lawyer and privacy tech expert at Duality. Jeff, today's episode is a real gem. You and Lisa Palmer spoke about the power of real-time data and how companies can leverage existing technologies like homomorphic encryption and synthetic data to create real impact and even do good with data. You also touched on the topics of bias in AI and why Lisa thinks that synthetic data has tremendous potential to help here. You spoke about increasing diversity in tech and many, many more exciting topics. Could you introduce Lisa to our listeners?

Jeffrey Dobin: Yes, absolutely. We had an amazing conversation. I think this episode is going to be one you're really going to enjoy. Lisa Palmer is the chief technical advisor at Splunk. She has spent time working for some incredible organizations like Teradata and Microsoft and was also with Gartner. She is also studying for her doctorate. So she is incredibly smart, she's passionate, and she's great at managing her time because she has a ton of responsibility.

She is also a university professor teaching marketing, finance, behavior, and ethics, a soon-to-be author with a book on the way, and the host of a podcast, Much Ado About Data, together with Splunk. She is really knowledgeable in this field. She loves using data to drive decisions. She can serve as an inspiration to many people who are listening. We're going to talk about how you can tackle some of the biggest challenges in data today.

Alexandra: Wow, what a powerhouse of a woman. I definitely envy her time management skills. I can't wait for this episode. Shall we dive in?

Jeffrey: Absolutely. Let's do it in three, two, one.

Jeffrey: Lisa, welcome to the Data Democratization podcast. It's great to be here with you today.

Lisa Palmer: Thank you for having me, Jeff. I'm excited to be here.

Jeffrey: Makes two of us. Let's get started and jump right into things. What's something that you want to hit our audience with? What's a good takeaway that our listeners can have right off the bat?

Lisa: I think it's really important for every one of us as consumers to remember that if you are using a service and you are not paying for it, then you are the product. The data that they gather from you as an individual person, that is what they sell. Keep that in mind when you are using platforms and services that you don't pay for. Nothing is really free.

Jeffrey: Things that come to mind on my side are social media platforms like Facebook or Instagram or TikTok. Are those the type of things that you have in mind or are there some other ones that aren't so obvious?

Lisa: Those are absolutely ones that come to mind, but there are other things that you may not think of. If you're ordering your groceries online, even though you're buying your groceries and you're paying for the actual products that you're receiving, it's very possible that they are selling that data about what it is that you purchased and what your habits are to other interested parties. It's obvious, I think, in many cases that our social media platforms are selling our data.

Really, anything that you're interacting with online, anything you're using an app to interact with, you just need to be aware as consumers that it's likely your data is being shared in some way, shape, or form. Of course, we can dig into some of the privacy laws that have been put in place, say, in California and in other states that are following suit, and then, of course, what's happened with the GDPR in the EU. Overall, you need to be cautious about where you're sharing your data because that is something that's being monetized by the organizations that you interact with.

Jeffrey: Interesting. What's the alternative, or what's your recommendation? Does that mean we should simply be cognizant and aware of it, or are there other options? If there are services that are free versus ones that you can pay for, do you recommend paying for the service instead, to ensure that your data isn't being used and exchanged, so that you're paying for the service that delivers your goods rather than having them sell your data elsewhere?

Lisa: I think that it's about choice. Awareness and choice. You need to be aware that that is likely happening and make a conscious choice about whether or not that's okay with you and you're willing to do that, or whether you would prefer to look for somebody that offers a paid service and assures you that that's not going to happen. My view of it, again, is not to be prescriptive. We're all adults.

You can make your own choices, but I think it's important that we elevate awareness and make sure that people are choosing what happens versus having the unintended consequences of going with something that is just comfortable to them without really knowing what the potential downstream impacts of that are. We're all adults. You can make your own choices, but be aware that if you're using something that's a free service, then they are getting something in return for it.

Jeffrey: Excellent point. We all have choices. In the podcast world, we have choices in what we want to listen to in the morning when we commute to work. That's what I used to do anyway in the olden days pre-COVID. Now, there's no more commute, but I still find time to listen to some podcasts while I walk the dog. I understand that you're launching a new podcast with Splunk called Much Ado About Data. Can you tell us a little bit about your podcast?

Lisa: I am really excited about our podcast. Very much along the same vein of people having the opportunity to choose what happens to them, we're working diligently to raise awareness about data and the human impacts of what happens with the data that is available and proliferating in the marketplace every day. Not everybody gets excited or interested in data as a topic, and yet it's impacting all of us every day. On our podcast, we talk to all kinds of people from all different walks of life, many of them professionals who work in and around the data space, and they give us their insights on the ways they have seen data actually impacting humans on a daily basis.

Then we dig into things like, “Hey, if you had a magic wand and you could fix the future, what's the one thing that you would like to see changed in the marketplace, or how would you like to see data or technology used differently to ensure that it will serve humanity in the long run?” That's the human spin on data that we're taking on my Much Ado About Data podcast.

Jeffrey: Sounds like a lot of fun. I'm curious to know on a personal note how you ended up in this role of hosting a podcast. Is this something that you wanted to do or did you just fall into it?

Lisa: I'll tell you that from a personal perspective, right before the pandemic really became entrenched in the US, around January of 2020, I went through my annual process wherein I pick a theme word for the year. I decided that it was time to try to impact women in technology in particular and to elevate my personal profile, to just be more visible. If you've ever seen the hashtag #SeeHerBeHer, that was very much what I wanted to do.

I wanted to change what I had done previously. I'm an introvert. I'm an engineer. I tend to be a little bit more quiet by nature and not always step into the limelight. I tried to embrace the idea that it's important for others to see you when you're at a certain point in your career, so that they have something that will encourage them or, best-case scenario, inspire them. I love to inspire people if it's possible. I chose #MyVoice at that point.

Then the pandemic really, really descended upon us. As a result of that, podcasts just exploded. I was invited to one podcast and then another one, and I ended up being on over 40 podcasts in 2020, which was just total insanity. I never expected something like that to happen. As a result, when I had the opportunity to join Splunk, they had been wanting to launch a podcast with this human twist on data.

When I joined the team, they were just excited about the fact that I was really comfortable on podcasts. I had a lot of experience at that point. They came to me and asked me to do it, and I was excited to be the host. The whole path was a little bit unexpected for me when I chose my annual theme. It became far bigger than I had ever dreamed it would.

Jeffrey: I love this idea of having an annual theme and picking a word or words that you can live by for the whole year. I'm curious and also excited to see what you come up with for 2022. You're also inspiring me and, hopefully, some of our listeners to do the same. Maybe we can use this opportunity to talk a little bit about your story, right? You shared a little insight on how you ended up becoming the host of this podcast, what you do about data, but can you tell us a little bit about your journey leading up to this point?

Lisa: I like to tell people that my career has really afforded me three different lenses for looking at the world. The first is that I was an IT practitioner for years; my undergraduate degree is in engineering. I started out in network planning, spent two whole months in an engineering function, learned all of the data structures, found myself in IT, and a career was launched, as they say. That's the first lens: I was in all different realms of IT, with my most recent role being chief innovation officer, which was a combo CIO/CMO role.

The second lens was that of an enterprise seller for Microsoft. When I was in my role as an IT practitioner, of course, I worked with significantly sized vendors all the time, and Microsoft was one of my core partners. They came to me when they had an opening in their sales organization and asked me to consider the role. I had never done anything like that before. I'd never been on the revenue-generating side of the business, and I thought, "What a fantastic opportunity to add some key skills to my portfolio of experience."

I worked for Microsoft in an enterprise seller role and then an enterprise sales leader role, learned a ton in that capacity, and was able to serve so many clients from that position. I really got the bug for being on the partner side of the relationship through my role at Microsoft. The third primary lens that I see the world through came during my tenure with Gartner, an IT research and advisory services firm, where I led a team that was responsible for advising executive clients on how to use technology to drive digital transformation in their organizations.

In that role, I had about 900 executive clients that were under my umbrella of responsibility. It was just such a fabulous opportunity to step even further into that role to serve many customers from a partner lens, so that's a little bit of how I got to where I'm at today. With regard to Splunk, I took a frontlines role for a period of time because my mother was very ill and I wanted to make sure that I had time to spend with her.

I couldn't do that from the giant role that I had at Gartner, so I took a frontlines role and really took a step back for about a year. During that period of time, Splunk was actively recruiting me for the role that I have today and I felt like it all just fell into place. I am truly just loving the space that I've fallen into at Splunk with the opportunity to continue to serve clients and to leverage data, which I have loved since the very beginning of my career.

Jeffrey: Sounds like an interesting path, tying things together with Gartner and Splunk. If I remember correctly, Splunk sits in the upper-right quadrant as a leader in the Gartner Magic Quadrant, and maybe some others as well.

Lisa: Yes, we do. It's a great place to be.

Jeffrey: Tell us a little bit about what it means to be a chief technical advisor at Splunk and maybe give a little context and background about what Splunk does for our audience.

Lisa: Yes, absolutely. In my role as chief technical advisor at Splunk, I focus on helping our customers to identify ways to leverage data in their environments to really drive their enterprise objectives. Of course, Splunk has core competency in real-time data, and during the pandemic it became more critical than ever that the information businesses are making decisions on is truly current. It's in the moment.

Using data that was warehoused or even in data lakes became very difficult because none of that data was current anymore. The disruption that came about during the pandemic really allowed Splunk to shine, to step forward into the light with regard to how businesses can use this real-time data to drive the decisions that they need to make, to provide better customer service, and to maintain business resiliency in their environments. Those are some of the topics that I work with our largest executive clients on, really helping them to drive outcomes in their business from the plethora of data that they have. Then, of course, there are two other key areas of expertise for Splunk. We're very well-known for security.

We've been very involved in the last couple of breaches that have happened, helping organizations to solve those problems, to trace what happened during their breaches, and to provide that information to the broader community wherever possible so that we can help to solve security issues at scale. Then there's observability, which is fairly new in the marketplace. It is the opportunity for us to help our customers really see what's happening inside of their applications, identify areas of concern or challenge, and remediate those challenges. That's something that's fairly new to the Splunk portfolio, and we're doing exceptionally well with it.

Jeffrey: Interesting. When I think of Splunk besides security, I think about visibility and data-driven insights. You tied it all together really nicely there. I can tell you're passionate about what you do and really enthusiastic about the space. Could you share a couple of use cases or examples about how Splunk is used today to help solve some customer challenges or to drive data-driven decisions?

Lisa: I want to share a couple of really fun ones. One is socially responsible and one is entertainment-based and just super fun. The first is wildfires. As we all know, we've seen in the news, sadly, for a long period of time, particularly along the West Coast in the US, that we have been fighting wildfires. One of the really creative ways that we have seen Splunk be used starts from the fact that electric transmission lines often spark fires in very rural or heavily forested areas. That makes it very difficult to see these fires until they become too large to fight.

It makes it very difficult to get resources there, et cetera. What if we knew sooner that a fire had started in one of these areas? This was the premise that was followed when applying Splunk in these situations. Imagine pieces of rebar with a sensor on the end, stuck in the ground along the entire route that the electric transmission lines follow, and every one of those sensors is tied back into a Splunk dashboard. Immediately, if there is a spike in temperature at any point, regardless of how remote the area is, that Splunk dashboard lights up and tells people that there is potentially an issue there.

Now, let's map weather data over top of that and you'll be able to see things like wind patterns. If there is a fire there, what direction would that be moving and how fast would it potentially be moving because of the weather patterns in the area? Then we could immediately send resources to check that out. Is there a fire in that area? You can automatically deploy resources in a very quick fashion versus waiting for somebody to actually see it before you're able to send resources to fight those fires. That's a super fun, socially conscious way that Splunk is being used.
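
To make the pattern Lisa describes a bit more concrete, here is a minimal, purely illustrative Python sketch of threshold-based alerting over line-side temperature sensors, with local wind data attached to each alert. This is not Splunk's implementation or API; the threshold, field names, and sample values are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical reading from a sensor staked along a transmission-line route.
@dataclass
class SensorReading:
    sensor_id: str
    mile_marker: float     # position along the line
    temperature_c: float   # temperature measured at the sensor

# Assumed alert threshold; a real system would baseline per sensor and season.
TEMP_SPIKE_THRESHOLD_C = 60.0

def detect_spikes(readings, wind_by_sensor):
    """Flag readings above the threshold and attach local wind data to each alert."""
    alerts = []
    for r in readings:
        if r.temperature_c >= TEMP_SPIKE_THRESHOLD_C:
            wind = wind_by_sensor.get(r.sensor_id, {"speed_kmh": None, "direction": None})
            alerts.append({
                "sensor_id": r.sensor_id,
                "mile_marker": r.mile_marker,
                "temperature_c": r.temperature_c,
                "wind_speed_kmh": wind["speed_kmh"],
                "wind_direction": wind["direction"],
            })
    return alerts

if __name__ == "__main__":
    readings = [
        SensorReading("s-101", 12.4, 24.0),
        SensorReading("s-102", 12.9, 71.5),  # anomalous spike
    ]
    wind = {"s-102": {"speed_kmh": 32, "direction": "NE"}}
    for alert in detect_spikes(readings, wind):
        print("Possible ignition near mile", alert["mile_marker"], alert)
```

In a real deployment, the readings would stream continuously into a platform like Splunk and the weather overlay would come from a live feed rather than a hard-coded dictionary, but the alerting idea is the same.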

Jeffrey: I love that, so it can be used proactively to potentially prevent fires. Then if there is a fire, it can be used as a resource or a tool to help put it out and also get people to safety if you're tracking the wind direction and other forces at play. It sounds really interesting.

Lisa: That's exactly right. There's so many ways that that data can be used to protect the environment, to protect humans, to protect financial assets, people's homes, et cetera. That one is a really feel-good example of what's possible when you harness real-time data.

Jeffrey: Love it. You mentioned that there was another example as well. What's the other one?

Lisa: The other one that I'll talk about is just super fun. This is McLaren F1 racing. If anyone is a racing fan, you know that the ability to keep your cars performing at the absolute top level possible through every moment of a race is really what separates the winners from the losers. It could be half an inch of difference when they're crossing that finish line, right? In the McLaren example, every intelligent piece of equipment on their cars has sensors connected back into a Splunk dashboard. Through the entirety of the race, that data is just flowing into our dashboard.

We're using machine learning and artificial intelligence to identify anything in that data that we need to make the team aware of so that they can make the adjustments that need to be made to keep their cars performing at the absolute top level possible throughout the race. Imagine every one of those intelligent pieces of equipment on the entire car feeding that sensor data back in. It's just this massive dashboard across multiple screens that creates this cockpit environment for problem-solving live during the race. It's just an incredibly fun example of what's possible with Splunk.

Jeffrey: That's really exciting. I'm sure that there are lots of race car fans and also sports enthusiasts who are thinking of other ideas. It sounds to me that Splunk can be used for a plethora of use cases. You talked about preventing wildfires and about operating efficiency and performance with race cars. Is there any limit to what Splunk can be used for, or is it really just about capturing data and then using it to make these data-driven and optimization decisions?

Lisa: I'd like to encourage people to think that anything is possible. We can record data in any format possible; it doesn't have to be structured. It can be highly unstructured data as well. Your imagination is really the only limit to what's possible.

Jeffrey: Then let's stay on theme here in terms of using our imagination that anything's possible. How can we use data to drive social good?

Lisa: I have some arguably grandiose ideas about what can be done with data for social good. Jeff, you and I've talked about this before. I get really excited about what I think is possible. If we expand our aperture and think about creative partnerships between public entities, private companies, and academia, we can really think about possibilities like how to solve water shortages around the globe, particularly in some of the globe's most arid climates.

What are creative ways that we could address things like water shortages, or farming and agricultural challenges that lead to world hunger? What are ways that we could potentially go after climate change in a really creative way, all fueled by data? If we roll back all of the assumptions we have that private companies won't want to collaborate with one another, what if we're able to peel that back and look for solutions that are both purpose-oriented and profit-driven? I don't think we have to divorce one from the other to be able to drive these massive social-good solutions.

Jeffrey: Thinking about this even further, as a leader at Splunk, I'm sure you work with many leaders at different organizations. How would you challenge them to think about data a little differently moving forward?

Lisa: There are some things that I think have real potential. Think about homomorphic encryption and what it could allow some of the biggest companies in the world to do. Let's pick companies like PepsiCo and Coca-Cola, which have been heavy competitors for many years. They're also both big consumers of water. What if we could use technologies that include things like synthetic data and homomorphic encryption to allow these big players in the marketplace to share their data without exposing any of the individual detail or specificity that would otherwise prevent them from being willing to share it?

I think there's a combination of a multitude of different technologies that are now available that if we would peel back the assumptions that these organizations won't want to cooperate or collaborate and bring a series of these different technologies all to bear at the same time, I think we can start to solve some of these massive problems while also creating profits for those organizations that are interested in being involved in these kinds of problem-solving activities.
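
For readers curious what "computing on data without exposing it" looks like in practice, here is a minimal sketch using TenSEAL, one open-source homomorphic encryption library (CKKS scheme). It only illustrates the general idea Lisa raises, not how any of the companies she mentions actually share data; the water-usage figures and parameter choices are assumptions for the example.

```python
import tenseal as ts

# Shared encryption context; in practice, key management and who holds the
# secret key are the hard governance questions.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

# Each party encrypts its own (hypothetical) monthly water-usage figures.
company_a = ts.ckks_vector(context, [120.5, 130.2, 118.9])
company_b = ts.ckks_vector(context, [98.3, 101.7, 95.0])

# An aggregator can add the encrypted vectors without ever seeing the raw values.
encrypted_total = company_a + company_b

# Only a holder of the secret key can decrypt the combined result.
print([round(x, 1) for x in encrypted_total.decrypt()])
```

Synthetic data attacks the same sharing problem from a different angle: instead of computing on encrypted records, each party releases statistically representative but artificial records.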

Jeffrey: You bring up some really interesting points about how organizations can embrace privacy-enhancing technologies in general to address different issues that can have an impact, not just on their business but on social good. I know that addressing diversity is also an important issue for you as well. Can you talk a little bit about different ways that diversity can be addressed and how we can do better or do more good at our own organizations?

Lisa: One of the most important things from my perspective with regard to diversity is ensuring that we have complete data sets. There is this fantastic book called Invisible Women. It talks about the fact that in many data sets, data is not disaggregated by gender, and so that leads to a myriad of problems. I'll give some examples from the book. One was an example for safety gear in manufacturing organizations.

All of the safety gear had been purchased for an average-sized man. In this specific example, they did a study around all of the individuals who were having their hands pulled into manufacturing equipment in accidents, and they found that this was happening to women far more often. It was because the gloves they were wearing were far too big.

In the ultimate irony, the safety equipment that had been given to them to keep them safe was actually putting them at higher risk, but we didn't know that because the purchasing had been done for men and that data wasn't being separated or analyzed by gender. It was difficult to uncover what the actual issue was. There's another example around safety equipment for police officers: something similar, where the equipment is designed for male anatomy.

In many cases, it's actually harmful or is leaving female police officers at greater risk because it doesn't fit them properly. When I think about things like how do we address diversity, equity, and inclusion, I want to make sure that we are thinking about building truly complete data sets, making sure that if you only have a portion of the data that you are looking for things like synthetic data to help you to fill those gaps.

If you are in a situation where you're trying to make sure that you're bringing more diverse talent into your environment, are you looking at your processes critically? For example, there was a great example in Amazon's hiring, where they had learned that one of the algorithms they had built to weed through applications had actually come to the conclusion that women were not a good fit for roles because, historically, men held most of the roles.

When they trained their machine learning models on their data, the models were learning that men were better suited. The system was automatically throwing out any woman's application on the assumption that she must not be a good fit. Now, of course, Amazon dealt with that and they don't use it anymore. There are lots and lots of examples of situations where these kinds of things happen, so we need to be looking for holistic data sets.

We need to make sure that we have a diverse set of eyes looking at the solutions that we're creating, meaning that you and I, for example, are probably going to see different things. We may see different possibilities. We may see different challenges. We may see different holes when we look at data or when we're trying to come to solutions. We need to make sure that we have as many diverse perspectives looking at our data and our solutions created by that data to elevate the level of diversity, equity, and inclusion potential and actually execute against that potential. Those are things that I think are really important.

Jeffrey: I love this. You're hitting on a really hot topic today around addressing bias and using ethical AI, and I want to make sure I understand and follow what you shared. By the way, that book that you mentioned, I actually purchased it after you mentioned it last time and it's on my list. Hopefully, the next time we speak, I'll have read it. It sounds to me like you're saying, “Hey, if you look at historical data, it's only going to capture what was available at the time.” If we have been discriminating against large portions of the population, in this example against women, then, of course, there's not going to be any data to train the AI.

Data that would show that women can be- I don't want to say “used,” but that women can work in this area, right? If they're serving on a police force, for example, and we need a model to determine what types of armor or vests are going to fit all sorts of people who are carrying weapons, we need to know what everybody looks like. That includes all shapes, sizes, genders, et cetera. I think you gave a really good way to think about completing data sets and making sure that they're fully inclusive and diverse. If we don't have data from the past, how are we going to make sure that data is shaped appropriately for the future?

Lisa: We're actually baking in the biases that have been in place before through our machine learning models. If you're using the data that is based on history only, then whatever biases existed in history are actually being systemically deployed through machine learning, through artificial intelligence. It's more critical now than ever that we make sure that our data sets are complete.

Jeffrey: Lisa, if someone is experiencing this or they just had an aha moment listening and they think, "Okay, we actually are dealing with this problem. We're only looking at historical data and it might be discriminating a certain group," how can one address this ethical AI challenge or this bias in their data?

Lisa: Well, I'm a big fan of synthetic data. I think it has tremendous potential to help to address some of the biases that we see in our data sets. I expect synthetic data to play a more and more significant role moving forward.
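
As a toy illustration of the gap-filling idea, the sketch below checks a hypothetical, historically skewed hiring dataset for representation gaps and then augments the underrepresented group. Plain resampling is used here only as a stand-in for proper synthetic data generation, and the column names and figures are invented for the example.

```python
import pandas as pd

# Hypothetical historical hiring data, heavily skewed toward one group.
history = pd.DataFrame({
    "gender": ["male"] * 90 + ["female"] * 10,
    "hired":  [1] * 70 + [0] * 20 + [1] * 2 + [0] * 8,
})

# A model trained on this would mostly learn patterns of the majority group.
print(history.groupby("gender")["hired"].agg(["count", "mean"]))

# Augment the underrepresented group so both groups carry comparable weight.
minority = history[history["gender"] == "female"]
needed = (history["gender"] == "male").sum() - len(minority)
augmented = pd.concat(
    [history, minority.sample(n=needed, replace=True, random_state=0)],
    ignore_index=True,
)

print(augmented["gender"].value_counts())
```

A proper synthetic data generator would learn the joint distribution of the minority records and produce new, privacy-preserving rows rather than duplicates, but the correction to the group balance is the same kind of effect.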

Jeffrey: You mentioned a book before, Invisible Women, in case any of our listeners want to check it out, but rumor has it that you're also writing a book. Is this true?

Lisa: This is true. I am writing a book.

Jeffrey: Well, then tell me more, Lisa. What's this about?

Lisa: It's a deeper dive into a little bit of what we talked about earlier, which is the combination of many of the technologies that are readily available today and how we can use them to solve some of the bigger challenges. Some of those are social challenges that we talked about earlier. I also believe that we have the opportunity to open entirely new ways of doing business, to create new business models, and to create new ways to serve constituents in the public sector.

I'll be diving into specific cases that I feel give us the opportunity to create new and creative solutions by combining many of the technologies that are available in the marketplace today. Those include things like the tremendous networking capability we have today through 5G. We've got so much potential around data. I heard a statistic today that the incredible amount of data being gathered on a daily basis is going to triple by 2025. You're talking about a lot of zeroes worth of data in this period of time.

We have this massive amount of data that we can use to train models. We have the ability to use synthetic data to create better outcomes with the models that we create. We have the ability to use things like homomorphic encryption to allow companies that are used to competing to work in collaboration to solve some of the world's bigger problems. There is a litany of these different types of technologies that I'm going to be weaving together in the book to talk to business leaders about how they can solve some of the problems that they never dreamed they could solve.

Jeffrey: I am personally excited to get my hands on this book when it comes out and will definitely read it. You're hitting on a really good point here. There is an abundance of data that companies are collecting today. You mentioned it's going to triple by 2025. Right now, companies are only using a fraction of this data. It seems like this fraction will only become smaller if they're collecting more and more data, but you just hit on a couple of key ways that organizations can potentially leverage this data.

We'll see if, over the next few years, more methods or technologies are developed or certain technologies that you mentioned today are fully embraced to then help these companies take advantage of all the data that they're collecting. It'll be interesting to see in the future too how a company like Splunk helps companies further embrace this data and to see what they can come up with and accomplish.

Lisa: I am really excited about the future, and there are so many opportunities for leveraging data even more holistically than has been done today. You've absolutely nailed it: in spite of the fact that there is a lot of data being captured today, much of it is not being actioned against.

To me, that's the next level of opportunity for many of our clients: let's get your data into a situation where you can take action against it, where your data can provide insights that immediately create value in your environment, helping you to serve your customers better, helping you to open new lines of revenue and new lines of business, and looking for opportunities to control costs and to work together more completely with your partners. Supply chain comes to mind heavily in that space. I just think there's so much potential, Jeff. We're at a really exciting time.

Jeffrey: I totally agree, Lisa. One of the reasons why I asked to interview you for this podcast is because you're so knowledgeable and passionate about this space. Personally, I think that there's a great opportunity to increase diversity in tech. It's great to see a woman like you rise through the ranks and take power and also help others as you're doing it. What do you think that we can do, meaning me and you and anyone else listening? How can we increase diversity in tech?

Lisa: I think that it's really important for women- If you're in tech, please get out there and be seen no matter how uncomfortable it feels. I understand. I really truly do. No matter how uncomfortable it might feel, there's someone who needs to see you. There's somebody who will be encouraged or inspired to follow a similar path to yours if they see that it's possible through seeing you.

If you're a woman in tech, please be seen. And if you're someone who has the opportunity to be an ally to others, help to grow them into technology careers. I was talking with a gentleman the other day, a CFO, who told me that he had a woman on his team who was just fantastic with every technology question that ever came up.

He said, “I just decided that I was really going to try to help her to grow her career in that direction because there was so much potential for her.” I was inspired by his perspective, that he identified that talent within his ranks and then was looking for opportunities to sponsor her. If you're in a position of power in any organization, remember that underrepresented groups don't just need mentors. They need sponsors.

They need you to invest in them, give them connections, give them access to your power networks, give them access to meetings and information that will help them to raise their game. If that's you and you're in a position where you can actually sponsor individuals from underrepresented groups, please do that. Please think of that. In a very practical, tactical way, I always tell people this: if you are in a conversation and you ever see someone being spoken over, it's very simple. Point the floor back to them.

If you see somebody get spoken over, it's very simple. The next time there's a lull in the conversation, you point the floor back to them and say, “Janice, I believe you were trying to say something. Would you like to finish your point?” That's something that every person can do to raise the profile of other individuals. As a result of that, we get to hear what they have to say. We get that diversity of perspective injected into our conversations, into our overall consciousness. That kind of engagement and involvement will encourage people to follow some of their dreams, and that could be in the tech space.

Jeffrey: I love this, Lisa. It sounds like you're saying, "Be heard. Put yourself out there," number one. Number two, if you're an ally, don't just say you're an ally, but be an advocate for that person. Try to put a spotlight on those other individuals. Help lift them up and put them in a position to succeed, and also show them the way if you've been there before. Make those connections. I love that last point you made, that, really, anyone in the room can do it, right?

If someone gets spoken over, which happens all the time, just make sure you pivot the conversation back to them and just give that person the same respect and dignity that you would want if you didn't get to share your idea either. It seems like a pretty common sense, straightforward idea. Even though it sounds like it's common sense, a lot of people don't do it, so thanks for sharing that.

Lisa: Thank you for giving me the opportunity to share it, Jeff. There are things that every one of us can do to make the future look brighter. I'd love for us to challenge everyone to think of what you as an individual can do. It doesn't have to be something grandiose; it can be something as simple as pivoting a conversation. You never know what it can lead to.

Jeffrey: Excellent. Lisa, before we hop off the line, I think that's a really good takeaway in itself. This could have answered the last question I was going to ask, but we'd love to put the mic right back in your hands and ask you, is there anything that our audience can do to help you or is there anything that you'd like to share with us while you have the floor?

Lisa: I would like for everyone to think of the possibilities for data because we are at a point in our history where there's so much disruption. Society has been disrupted. The way that businesses have been interacting between employers and employees has been disrupted. Our supply chains globally have been disrupted. There's so much change that's going on. This particular moment in time is what I call an inflection point and inflection points offer opportunity.

Think about problem-solving without all of those assumptions that we've had in the past. In many ways, the constraints that existed before the pandemic look very different today. I encourage you to think broader. Open your aperture in the way that you're thinking about solving problems, and really think about every interaction you have as a data-gathering and data-use opportunity. If we did that, what would be possible? I would say that pretty much anything we can dream of is possible.

Jeffrey: Amen to that. Lisa, I feel like we could probably talk for hours on the data subject, but I want to be respectful of your time. We are so grateful to have you on the show, so thank you for sharing some of your day with us, sharing some of your insight, experience, and also view on the world, and also giving us some really good tips and takeaways. I hope our listeners go ahead and check out your podcast, Much Ado About Data, which is brought to you by Lisa Palmer and Splunk. Then when that book comes out, we can't wait to read it. Thanks again, Lisa.

Lisa: Thank you for hosting me, Jeff. It's sincerely been my pleasure.

Jeffrey: All right, talk to you soon.

Alexandra: Wow, what a great conversation. I really love the part where Lisa talked about concrete steps each and every one of us can take to increase diversity in tech. Also, her approach of thinking more broadly when problem-solving and not letting assumed constraints stop you is truly inspiring.

Jeffrey: Indeed. Lisa was amazing to talk with and there are many insights and actions that she shared with us today, which makes this an action-packed episode. Alexandra, why don't we summarize some of these key takeaways for our audience? Shall we?

Alexandra: For sure. Takeaway number one of today's episode with Lisa Palmer. If you don't pay for a product, be aware that you are the product and that it's quite likely that your data is being sold. Once you are aware, you can make a conscious choice, whether you're willing to accept that or rather seek out a more data-friendly alternative even if it might cost something.

Jeffrey: Really makes you think if it's worth sharing all this information on these social media platforms that so many of us use today. After talking to her, I had some reflections myself and I was thinking, "Should I really post this next time?" There's a risk/reward and there's also just that being aware if that's a choice you want to make, so a really good point by Lisa there.

We also heard Lisa talk about her upcoming new podcast through Splunk that she'll be launching very soon. It's called Much Ado About Data and it's going to highlight the impact data has on humans today and what we should change in the way we use technology to have a more positive impact for individuals as well as a society tomorrow.

Alexandra: That sounds exciting. I'm definitely going to listen to that once it's out. Next up is takeaway number three. Especially due to the pandemic, real-time data is becoming more and more important, but, of course, it's not only about collecting it but also about bringing your organization into a position to take action based on it.

Jeffrey: True that. It's always about taking action. I think the example of how real-time data can help us prevent or even stop wildfires is an awesomely powerful one that illustrates the impact of being truly data-driven as a private or public organization.

Alexandra: That was a great example.

Jeffrey: Takeaway number four, data has tremendous potential to be used for good. According to Lisa, it's possible to find solutions that are actually both purpose-oriented as well as profit-driven. The secret to that creativity is combining existing privacy-enhancing technologies.

She listed a few, like homomorphic encryption and synthetic data, that bring companies together and into a position where they can collaborate with their competitors as well as their partners, and incorporate feedback from researchers at the same time, without exposing any sensitive data assets or putting this data at risk. By the way, how to use the technology that's readily available today to solve some of our society's biggest problems is also a hot topic, and one Lisa wants to talk about in her upcoming book.

Alexandra: You're right. I think that will also be a great read once it's out. Key takeaway number five: to address diversity, equity, and inclusion, especially in the context of AI, we need to make sure that we have truly complete data sets, because if a data set is lacking enough examples from minority groups, it's to be expected that the algorithm will perform worse on those groups and that you will end up with certain biases. Lisa also pointed out that she's a big fan of synthetic data in regard to algorithmic fairness and that she sees tremendous potential for it to help AI become less biased, as it can fill the gaps whenever you have incomplete data sets. I can, of course, only subscribe to that point.

Jeffrey: Absolutely, I'm with you there. We all know that bias in AI is really like a multifaceted problem. Besides getting your training data right, just having more diverse teams is also an important piece of the puzzle. I really like that Lisa shared and pointed out some easy-to-do steps and action items all of us could take to really increase diversity in tech and really help other people at the same time.

The first one was if you belong to a minority group and you made it in tech, get out there and be seen. Even if it's uncomfortable, you got to put yourself in uncomfortable positions or situations and be heard. There are other people out there who need to see you and need to hear you, so go inspire those people to take a similar path. That's part of your responsibility.

Second, if you have an opportunity to help someone grow into a tech career, then be an ally to them, even if you aren't part of a minority group yourself. Underrepresented groups don't just need mentors. They need sponsors who give them access to their connections, to their powerful networks, and to any meetings or information that they might be privy to that others aren't aware of.

This can be really helpful in propelling other people forward. Then third, she said, remember that it doesn't always have to be about the grand gesture, right? Simply noticing when someone is being hushed or spoken over in meetings and then pointing the microphone or attention back to them in a simple and polite way can encourage more people to speak up, give them the opportunity to be heard, and really empower them to become leaders and to follow their dreams.

Alexandra: That's such a good point, especially because today, so many people have the expectation that it has to be something grand, something really impactful, and if it's not that, they're not going to do anything. I think that's a pity, and Lisa really motivated me, and I hope many other people as well, to change that and to really appreciate the small steps and what they could lead to.

I think this was one of the aspects that made listening to Lisa's episode so inspiring for me. Let's get to the last takeaway, takeaway number seven. No matter which position you are in, Lisa encouraged you to think about the possibilities of data and the problems it can solve more broadly, because there are so many assumptions that tend to hold us back. Lisa highlighted that right now we are at an inflection point in history, where the pandemic has caused lots and lots of disruption for businesses, supply chains, and society as a whole.

At the same time, it also forced us to change the status quo and created many opportunities that we previously thought weren't possible. She said the constraints we have today look much different than the constraints we had before the pandemic. Therefore, we should really shift our perspective beyond these assumed constraints and look more broadly into the possibilities we have with data. I think this is also a really, really good point she made.

Jeffrey: I totally agree. We're collecting more data than ever before and we're still only using a very small percentage of it, but the amount of data we collect moving forward over the next few years is expected to grow exponentially. There's a tremendous opportunity. If we can figure out how to use more of this data to make data-driven decisions, we can do all sorts of things. There are tons of possibilities here and this is a really exciting and good point that she made.

Alexandra: Agreed.

Jeffrey: Those are our seven takeaways from today's episode with Lisa Palmer. Thank you to our audience for listening. Of course, if you have any questions or comments about the topics we discussed today, or suggestions on what we should talk about in the future or people we should interview, please write us an email at podcast@mostly.ai. As always, if you have a few seconds to rate our podcast or subscribe on Spotify, Apple, or wherever you're listening, we would be most grateful. Thank you so much.

Ready to try synthetic data generation?

The best way to learn about synthetic data is to experiment with synthetic data generation. Try it for free or get in touch with our sales team for a demo.