Alexandra: Hi, there, welcome to the Data Democratization Podcast. I’m Alexandra Ebert, MOSTLY AI’s Chief Trust Officer. In this episode, my co-host, Jeffrey Dobin, a data protection expert and privacy lawyer from Duality Technologies will be talking to an old friend of his.
Jeffrey: That’s right. We hosted John Frushour, Deputy Chief Information Security Officer from New York-Presbyterian Hospital. For those of you that don’t know, John previously managed enterprise security for a healthcare AI service provider, built large-scale government security architecture, and learned leadership skills as a Marine.
Alexandra: Wow, that’s a super impressive resume. Where do you know John from?
Jeffrey: We go back a few years. I’ve actually known John since, I think, 2016, when we first started working together on some security projects, which you’ll hear about later. He’s been doing some amazing work at New York-Presby, driving data innovation as a deputy CISO. I thought his story was not only something we should cover, but something that our data gurus out there would love to hear about.
Alexandra: Indeed. Data stories don’t get any more frontline than this. John shared some really exciting projects with us from fighting opioid abuse with data, to how they scaled operations during the COVID-19 crisis in New York. There’s so much food for thought here. We should jump right in.
Jeffrey: Good morning, John. Thanks so much for joining us on the Data Democratization Podcast. Good to see you again.
John: Is this where I’ll be speaking with Joe Rogan or? Okay.
Jeffrey: Obviously, John and I know each other. He’s a fun guy. We actually go back a few years. I first met John when I was working at Blackwood formerly known as BAI. We’ve developed a nice little friendship. I asked him if he’d join us in the podcast today, and he was nice enough to agree. Thank you for joining. Good to see you.
John: Of course.
Jeffrey: Let’s jump right into things, John. Can you briefly introduce yourself and share a little bit about your professional background with our listeners?
John: Yes, of course. As you said, I’m John Frushour. I’m the deputy CISO, the Deputy Chief Information Security Officer, at New York-Presbyterian Hospital. We are very proud of being the nation’s number four hospital according to the U.S. News & World Report. It’s a large healthcare conglomerate spread amongst most of the five boroughs of New York City. Let’s see, prior to that I was in a similar role at Nuance Communications. I handled enterprise security for their revenue-generating divisions. I think my title was Director of InfoSec.
Then before that, I ran a brief stint at a company called NetBrain. Then I worked at Motorola for a couple of years doing large-scale government public safety architectures. Before that, I was a Marine officer for almost 20 years. Yes, that’s me.
Jeffrey: I see that you’re representing right now.
John: Yes, I got my shirt on today. Got to look good for the camera.
Jeffrey: Very nice. I was going to ask this later in the interview, but maybe I’ll go right now. Since your career basically started in the military, first off, it goes without saying, but I think it’s important to acknowledge it: thank you for your service.
John: Of course.
Jeffrey: People like you give us the freedom to say what we want, do what we want, and we’re grateful. Appreciate that, John.
Jeffrey: Can you share a little insight or background on how you got started in the Marines, and what lessons you learned from that experience that have impacted you, either on a personal level or even in business?
John: How I got started was I was a screwball in college. I started college after high school, and I didn’t do very well. I didn’t pay attention. I was more concerned with sports and the opposite sex, and so I basically enlisted to get my head on straight. I had an uncle who was very influential. He was a Marine and he squared me away. That remedied that situation pretty quickly. I was a reservist for a while and took my time with college and made sure that I could balance all those things together.
Ultimately, I got some of the best leadership training in the world. I eventually, what we called, converted over to the dark side. It’s called a Mustang Officer, a prior-enlisted officer. I became a Commissioned Officer. I think that’s where I really got my knack for understanding what makes a good leader, the difference between management and leadership, and then that was all tested with many overseas deployments and a lot of troop leadership. I’d like to think that I got trained by the best in the world.
The technical side of my background really comes later in my career as an officer because when I enlisted first I was in the infantry which has nothing to do with what I do now. Later as an officer, I started to get intrigued by computers. I’ve always been a geek. Networking is where I centered, and then eventually InfoSec. As a normal progression, I think a lot of block and tackle IT specialists eventually mature into InfoSec.
Jeffrey: How did you segue into InfoSec while you were still in the Marines?
John: My last tour was at an acquisitions command. I was pretty good at networking. I had a bunch of Cisco certs. I had some VMware certs. I’d mastered large tactical networks, large-scale deployments, and large routing topologies. I’d gotten my master’s degree. I focused really hard on graph theory and encryption in my master’s program. I think it was just a natural progression of stimulation. There are only so many dynamic routing tables you can look at before you’re like, “Uh.” It gets boring.
InfoSec was a stimulant. It was like, “Whoa, what if an attacker were to change that routing table? What if an attacker were to insert something and spoof my hash or password spray?” Whatever it was. I think it just became a natural progression later at that last command I was at.
Jeffrey: Awesome. Are you still applying lessons learned or I guess managerial skills that you learned back then today in New York-Presby?
John: I think you should ask my team, probably, because they’re sick of me using military acronyms and calling things “force multipliers” and saying “Geronimo” and all this military, Marine-ese that comes out of my mouth. Luckily, I have a couple of people on the team that are also prior-enlisted or whatever, and luckily they tell me, “Calm down, calm down, no one understands that, John.” It’s the fundamentals of leadership, right?
John: You put your troops in front of yourself. My uber goal is to make sure that they’re well-fed, fat and happy, stimulated, motivated. If they do a good job, then my job is easy.
Jeffrey: I think that’s something that we can all apply: treat others the way you want to be treated, and put others before yourself. You’re saying you put your troops before yourself, right? That’s your team. You take care of them, they’ll take care of you and take care of the group.
John: Yes, absolutely. There’s a lot of leadership by example I think too. I think in a technical role if you go in my house upstairs, my daughter’s closet has a 36U rack in it with a bunch of virtual servers and ops and a 4U storage array, and that’s my little lab. I think that as a technical leader I’ve got to be 80% as competent as the engineers. The 20% I can accept. When they start really going off the rails, I’m like, “Yes, whatever, I don’t understand that but do good things.”
The 80%, I need to understand DNS. I need to understand DHCP. I need to understand threat modeling and threat hunting, and how an endpoint protection agent blocks threats, and how DLP manages regex expressions. I got to get the 80%, but the 20% that’s them.
Jeffrey: I’m surprised that you’re using your daughter’s closet because I imagine at a certain point she’s going to take that over 100%.
John: Nothing would make me more of a proud papa than her. She’s really interested in the Mars Helicopter right now. I keep showing her pictures from JPL, because I used to work at JPL when I was in the service, and her eyes get real big and I’m just like, “Oh, Daddy’s going to have the first Martian lander.” Or the first Martian astronaut, I should say, but anyways.
Jeffrey: That’s pretty cool. Let’s fast forward in your career. Right now, you’re at New York-Presbyterian as Deputy CISO. For those that don’t know, what does the Chief Information Security Officer do on a day-to-day basis? It sounds like you’re managing a team, but what does that entail?
John: In the security world, we have a few teams. One of our teams is our Security Operations Center, which is probably the easiest to recognize. These are the individuals that are watching the wires, they’re watching our perimeter. They’re getting alarms, they’re getting hundreds and thousands of events every day, and they’re parsing this and saying, “Okay, well, looks like Sally installed something bad on her computer, or it looks like Billy clicked on something he shouldn’t have. Or it looks like our CEO is getting a lot of spear-phishing mail.”
They’re watching our environment, and putting all these different events together to try to figure out where an incident is worthy of investigation. Events turn into incidents, incidents turn into a human investigating them. That’s the SOC. We have an engineering team that does kind of the block-and-tackle InfoSec stuff: proxying, mobile device management, certificate management, a lot of privileged account management, because you have a lot of that in IT.
We have a forensics and vulnerability team, they’re scanning, they’re trying to find the flaws, the holes, the weaknesses in our own infrastructure before the attackers do. They’re constantly probing and prodding and aggravating people because they’re like, “Hey, look that’s broke, you need to fix it.” We do a lot of forensic investigation, forensic recovery. Recovering computers and phones and servers and trying to do file carving and reconstruct the actions taken during one of those incident investigations.
We also have a risk team, and that’s a big one right now because of the SolarWinds hack and the SolarStorm campaign itself. They’re doing a lot of assessment of third-party risk. When we make an agreement with a vendor, how do we know that their environment is secure? We’re going to connect to them and open our doors to them, how do we know that they’re treating our information in the same way we would?
There’s a lot going on there. We have an identity management team that is really responsible for aggregating all elements of identity, which is pretty complex in a clinical environment, and then creating these little containers that have as much enriched information as we can so we can do a good job of saying that, “Nurse Sally rates access to this, nurse Billy doesn’t have that, so they don’t get access to that,” and nurse Petey’s birthday is in 30 years, and at that time, we’re going to take away their credentials, or whatever, making that up.
Jeffrey: You hit on a bunch of things and this reminds me of my time at Blackwood, where we saw a serious investment in security operations, digital identity, data protection, and you hit on a number of things here. Going back to maybe talking about the SOC, Security Operation Center, or even other aspects of your security team in general, how are you thinking about automation in the context of cybersecurity?
You mentioned before events occur there, there are incidents, and then people are doing these investigations. How do you balance automation and I guess, human manual effort, and how does that all tie in? What type of tools do you use to do that? How do you automate things? Can you share a little insight there?
John: It’s a good question. Security automation, if you look at a security InfoSec program, let’s say we’re going to look at the lifecycle of an InfoSec program. When you first start out, mostly you just focus on aggregating data, getting the data, collecting the data, data is events and this is everything from firewall alarms and traffic to endpoint alarms, to access logs and authorization and all the triple-A stuff, and you’re just collecting data that’s like level one.
Eventually, you’re going to mature into turning that data into events. I only care about these types of events, these particular pieces of data, eventually, you mutate those in the incidents and humans start investigating, humans say, “Well, I saw this event, that event, and this other event, I string together that looks like something that might be worthy of our attention, I’m going to go investigate it.” That has a very real operational tax on your staff, you can’t investigate everything.
You want to investigate everything because you don’t know, everything looks anomalous when you’re first starting out. Automation becomes a very critical need as your InfoSec program matures. Automation becomes very much a daily component of everything done. “Well, we saw this incident last night, we saw 10 of them again this morning. Why don’t we write a job or rule or create a script or do something that can filter out those incidents and let us focus only on the one that really matters?”
Maybe you weight them, maybe there’s a threshold, maybe there’s some enrichment done. Data curation is a big buzzword today; data curation is basically just funneling and focusing your data, events, or incidents that you want to investigate. SOAR tools, we have a SOAR tool at NYP, which is an orchestration tool used to implement that automation. A SOAR tool lets you build a workflow to say, “If this occurs, that occurs, and something occurs, gated by these other facts or these other API calls or this other information that I want to pull, do something.”
The end result of any job or playbook in a SOAR tool is to take action, do something, which is what you would expect a human to do. As that maturity keeps going and you get past the SOAR functionality, you’ll get to about where we are today, which is DevOps. You’ll start to commit more, beyond automation, into DevOps, and in our field, SecDevOps, security DevOps.
A DevOps team is like automation on steroids. They’re writing dashboards, and web front ends, and GUI panels, and they’re taking automation to a point where not just security is going to use it, trying to promote that sense of automation and data science outside of just InfoSec.
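The playbook pattern John describes, an “if this occurs, do something” workflow conditioned on enrichment facts, can be sketched in a few lines of Python. Everything below, the event fields, the reputation lookup, the isolate-host action, is hypothetical, standing in for what a real SOAR tool would do through API calls:

```python
# Toy sketch of a SOAR-style playbook. All names and fields are invented;
# a real SOAR tool would do the enrichment via threat-intel or asset APIs.

def enrich(incident):
    # Stand-in for an enrichment API call: look up the source IP's reputation.
    reputation = {"198.51.100.7": "malicious"}
    incident["reputation"] = reputation.get(incident["src_ip"], "unknown")
    return incident

def playbook(incident):
    incident = enrich(incident)
    # "If this occurs, conditioned by these other facts, do something."
    if incident["type"] == "phishing_click" and incident["reputation"] == "malicious":
        return f"isolate host {incident['host']}"  # the automated action
    return "no action"

incident = {"type": "phishing_click", "src_ip": "198.51.100.7", "host": "WS-042"}
print(playbook(incident))  # -> isolate host WS-042
```

The point of the sketch is the shape, not the logic: an event comes in, gets enriched with outside facts, and either triggers an action automatically or falls through to a human.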
Jeffrey: There’s a lot there, that’s a great response. Just to clarify for the audience too, when you say SOAR, I think you’re talking about Security, Orchestration, Automation, and Response?
John: That’s right.
Jeffrey: If my memory serves me correctly from my days working in cybersecurity. Then on the SecDevOps side, those are your words, automation on steroids, is that how you’d describe that?
John: Yes, a DevOps team. Our DevOps director would probably skewer me alive for describing it like that, but I think that’s a good layman’s-terms explanation. The kind of things that we’re sending to our DevOps team, for instance, what’s a good one? We’re doing a lot of data extraction from upstream identity systems. We have a large publicly accessible directory that we’re going to populate with our doctors’ information, so you can come to us and scan through all the doctors that work in the hospital.
A lot of people do this with their insurance site when they’re picking a doctor, but we’re going to promote a public site where you can see what the doctors’ specialties are, how long they’ve been practicing, view a photo of them, see if they float around to different campuses, whether you can catch them at one of our other hospitals.
We’re building this, but to do it, we need to grab data from 10 different sources. It’s not really an InfoSec task, other than that those data sources are InfoSec containers, but aggregating that data together, combining it, and then making it human-digestible, or human-readable, that’s all an automated job. A SOC analyst might be doing a similar task: pulling down an endpoint alarm saying the user opened something, a proxy alarm saying they visited a bad website, and then an email alarm that says after they clicked on the website, they received some type of bad email with a payload.
They’re in the same way pulling all those events together and saying, what do I do now? In the former case, we’re pulling a lot of data together, the DevOps team is, making it look pretty and then shipping it off to this front end for consumption by the public. The SOC analyst is pulling a lot of data together and then analyzing and saying, “I should probably turn off this person’s computer because they are not a very good employee.” It’s very much the same, this concept of automating a lot of data and putting it together to make a decision.
Jeffrey: Yes, it sounds like you’re talking about collecting this data, aggregating it, using it to make decisions. Also, you mentioned creating dashboards, so regardless of whether you’re creating something that you’re going to share with consumers or patients, like doctor information, which in some way is considered a dashboard, you’re providing information in an easy-to-read format, or you’re creating this dashboard of security events for your SOC analysts or any other members of your team to look at.
Jeffrey: Okay, cool. Maybe this is a good segue then, as we’re talking about the idea of collecting data and using it to drive decisions. Patient data is obviously super important at New York-Presbyterian, not just because of HIPAA, but also because your reputation relies on it. I think you mentioned that you’re number four, I don’t know, the fourth-best or the fourth-largest, or maybe both. I think you’re number one, John.
Jeffrey: Your reputation relies on this, right? You need to maintain trust with your patients, you also need to maintain trust with the doctors you work with, and also the external organizations. How do you balance the need to protect patient data while also using it to make data-driven decisions?
John: Complex question. Most of our concern around patient data is privacy-related, patient privacy. We’re certainly concerned when a clinician accidentally puts something on Twitter or takes a picture and adds it to Instagram and the background of the picture is like the patient chart for that day with all the medical record numbers and patient procedures on it. We’re very concerned about that kind of thing and making sure that doesn’t happen, so we do a lot of training.
We’re pretty proud of our annual security training to make sure people are aware that everything matters whether even when you don’t think it matters, the metrics for what is or isn’t electronic PHI is a pretty low bar, so be careful. Be careful of what you’re spreading around, but the patient privacy concern internally drives a lot of development and a lot of concern.
That’s really Nurse Sally looking at patient Billy’s record when Nurse Sally isn’t supposed to. Maybe Nurse Sally didn’t have patient Billy on their patient register, or they have no relationship to patient Billy, so Nurse Sally should not be looking at patient Billy’s record. I’m oversimplifying the relationship a little bit but this could result in some type of HIPAA-based privacy violation.
We’re very concerned about, there’s a term in healthcare called ‘break the glass’ which is you can imagine there’s a bunch of nurses waiting to take care of a patient, maybe they’re in the ICU, and they know who their patients are that are in the ICU room. All of a sudden here comes a new patient that is crashing, that nobody’s heard about and they’ve got to treat them, they’ve got to keep this patient alive.
They might need to get into that patient’s record immediately and see what’s going on. That’s an outlier break-the-glass scenario where they need to get in, see what’s going on with this patient, get the patient’s history, but that can be abused too. We’re very concerned with how we internally police patient privacy. How we treat patient data. We’re pretty confident in how we store it, and in our relationships with our electronic health record providers, we have several of them. We’re pretty confident in those relationships and that that data is encrypted, either in transit or at rest, and it’s secure.
Our own people are a big area of focus right now. We actually created something called the NYP privacy platform. It’s a homegrown tool based on the same tools that our security operation center is using every day. This homegrown tool is designed to watch just that, is to give us some assuredness that patient privacy is secure, and we’re handling patient data the right way.
Jeffrey: So many follow-up questions I have. One route I’m thinking of is what made you decide to build something internally versus purchasing an external product. Another one involves using the information you’re collecting to better serve your patients. But maybe first, this NYP privacy platform, is that based on another vendor or technology you’re already using, or did you create this internally by building upon something, or did you build it from scratch?
John: We went to industry a few years ago and looked around at the different privacy tools because there are commercial products that do this. I think the one thing that all commercial products suffer from is their ability to ingest many, many different types of data. We talked a lot about automation and we talked about how a SOC analyst is looking at all these different events.
We’ll take the case of Sally looking at Billy’s record. Let’s drive that one to depth. We might care when Sally logged in. We need her authentication event from Microsoft Active Directory. We might care that Sally logged in remotely, maybe from a machine in Minnesota. We’ll get her IP and the physical location from that IP address via our multifactor system.
We might care that Sally didn’t multifactor in, maybe they got into the system without a multifactor so they were logged in from a local network. We might care about Sally’s shift time. When Sally logged in to look at Billy’s record was she on shift or was she after working hours. That would come from our time clock system.
You’ve got all these different data sources that are providing one event, Sally logging in. Multifactor, Sally using the proxy, Sally using our virtual desktop environment to interface with the EHR. All these different events from these different systems, but come together and paint a picture. They tell us that Sally looked at Billy’s record off-shift or on-shift from Minnesota through a web browser, on a virtual desktop, whatever.
They paint a picture that we can go back and say, “Wait a minute. Sally hits Minnesota, has never been in Minnesota, knows no one in Minnesota, has no record. Why is Sally logging in from Minnesota? Okay, that might be an attacker.” With the privacy platform and the commercial products that we looked at, we always had more data that we want to bring in. More event types from more systems to enrich that picture. In some cases to automate our activities when Sally looked at Billy’s record.
That goes into the whole SOAR piece of it. With commercial products, there’s a limit. There’s a limit to what we could bring in. There was a limit to that structure. By building ourselves, we had the ability to bring in any type of data in any way we could prune it, we could dupe it, we could reduce it, we could normalize it. Then we could build dashboards that are keenly aware of the NYP way we do business, the unique kind of systems that we operate, the unique way that our connections do business.
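The multi-source correlation John walks through, authentication, multifactor, and time-clock events painting one picture of Sally’s login, might look something like the toy sketch below. The field names, schedule, and “usual locations” data are invented for illustration, not NYP’s actual schema:

```python
# Illustrative sketch: join events from three hypothetical source systems
# (directory auth, multifactor/geo, time clock) and flag what looks unusual.

from datetime import time

auth_event = {"user": "sally", "geo": "Minnesota", "mfa": False, "login": time(2, 30)}
shift      = {"user": "sally", "start": time(7, 0), "end": time(19, 0)}
usual_geos = {"sally": {"New York"}}  # locations this user normally logs in from

flags = []
if auth_event["geo"] not in usual_geos[auth_event["user"]]:
    flags.append("unusual location")          # never been to Minnesota
if not auth_event["mfa"]:
    flags.append("no multifactor")            # bypassed or local login
if not (shift["start"] <= auth_event["login"] <= shift["end"]):
    flags.append("off-shift access")          # from the time clock system

print(flags)  # -> ['unusual location', 'no multifactor', 'off-shift access']
```

Each flag on its own might be benign; it is the combination, the painted picture, that makes the event worth a human investigation.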
Jeffrey: I love this. Your story is actually a really good one. You’re talking about adding context to the data, because if someone logs in at a certain time or from a certain place, on its own that means nothing, but like you said, where does this person live, are they on vacation, are they working normal working hours? Is this a patient whose data they normally wouldn’t look at?
I think that’s really fascinating. It sounds like you’re grabbing all this data that you’re collecting, you’re aggregating it, and then you’re figuring out a way for it all to come together. You’re not necessarily joining the data in some structured format, but you’re figuring out the context and what the data actually means, given everything else going on, to see if there’s actually risk or not.
John: Yes, building relationships with these different events, and defining a structure such that if these events satisfy certain relationships, it warrants our investigation. It becomes actionable intelligence, which is another military term, but it’s appropriate.
Jeffrey: Actionable intelligence. What you’re talking about now reminds me of one of your publicly known success stories about how New York-Presbyterian has addressed the opioid crisis. For those that are unfamiliar, can you share some information about the problem that you were trying to address and basically how you use data to combat opioid addiction?
John: Absolutely. Pretty much everybody knows, if you don’t already, that there’s an opioid crisis in this country. It’s a runaway addiction problem. The question is sourcing. Where do all these drugs come from? A lot of these narcotics are clinical narcotics stolen from healthcare providers like us. Of course, there’s a lot of illegal product too, but for the stuff that we have in our hands, how can we curb that? How can we catch potential diversion? Diversion is a fancy word for theft, or moving things away from where they were supposed to end up. How do we become a good kind of social steward of curbing the opioid crisis? We think about data and we think about how, let’s call them Petey, how Petey might get Oxy out of a hospital, or how Petey might be able to redirect the logistics train for the delivery of Vicodin to a hospital.
A lot of the time, it’s very much what I call the Superman problem, the Superman effect. It’s scraping small amounts right off the top. I think it was Superman III where they rounded transactions up to the nearest penny and put the difference into an account. Nobody was the wiser because who cares about a penny, but three months later you’ve got a lot of money.
I think they did it in Office Space too. Is it Office Space where they did it? Anyways. Petey scrapes two 200-milligram pills of Vicodin from a pharmacist’s tray. It’s two pills out of 200 that were in that tray, kind of hard to notice, but if Petey does it every day for a week, you’ve got a pretty good supply of Vicodin.
All this stuff is tracked. Those 200 pills that go into that tray that the pharmacist is interacting with, those are counted, those are measured and there’s a digital record somewhere of those 200 pills. There’s a record of when Petey opened the tray or the pharmacists opened the tray. There’s a record of when the pills were delivered. There’s a record of when they were resupplied. There’s even a record of a patient and how much a patient is being administered.
Sometimes a doctor might prescribe, I frequently use the example, it’s very out of bounds, it’s not practical at all, but maybe a doctor might prescribe 200 milliliters of a suspended morphine drip, but when you look at the script, it says 2 liters of a suspended morphine drip. Now 2 liters would probably kill King Kong, but that’s a bad thing. You don’t want to see that. Again, it’s just fudging the system a little bit. It’s this little tiny Superman problem. It’s all data, it’s all data.
If you have a system that’s collecting every time Petey opens the tray and every time the pharmacist refills the tray and every time the particular narcotic is prescribed to a patient, you can put all those events together and you can build a story and you can say, “Okay, when we consolidated, or resolved, or checked how many pills were in that tray at the end of the day, it’s always short two, or the patient’s always getting prescribed double what it seems the doctor is putting into the EHR.”
Those are events. You put them into a system, you tie them together, and eventually, you’ll find the outliers. You’ll find the anomalies. You’ll find those little places where Petey has indeed been slicing off the top or many other ways to accomplish the diversion.
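The reconciliation John describes, comparing the digital record of a tray against the end-of-day count and flagging the pattern rather than the one-off, could be sketched like this. The numbers are toy data, and the three-day threshold is an invented rule of thumb:

```python
# Toy reconciliation sketch for catching the "Superman problem": a single
# short day might be an accident, but a recurring shortfall is an anomaly
# worth investigating.

# (day, pills recorded into tray, pills counted at end of day)
records = [
    ("mon", 200, 198), ("tue", 200, 198), ("wed", 200, 200),
    ("thu", 200, 198), ("fri", 200, 198),
]

short_days = [day for day, recorded, counted in records if counted < recorded]

if len(short_days) >= 3:  # illustrative threshold, not a clinical policy
    print("recurring shortfall, investigate:", short_days)
```

In a real system the same tying-together would span many more event types, tray openings, resupplies, prescriptions, administrations, but the outlier logic is the same idea.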
Jeffrey: In this situation, you’re saying Petey is potentially a pharmacist at the hospital and could have been skimming those one or two pills on a daily basis, over and over, and no one would have noticed. But with the system that you put in place, you’re keeping track of all this data: whichever pharmacist or person is opening up a drawer that contains the Vicodin or the OxyContin, whatever it is, how many pills are coming out of the bottle, and how many are going onto the tray.
That way, if one or two are missing, and this happens, if it happened just once, it could have just been an accident and it fell on the floor or something, but if you’re seeing a recurring event, as you’re calling it, over and over, then you can catch these anomalies and figure out actually what’s happening. Is that how you guys piece this together?
John: Yes. It’s incredible. That’s exactly it. I want to be careful not to implicate Petey or people like Petey, because we have easily the best pharmacy team in the world, we have the best pharmacists in the world. I don’t think any one of them would ever do this, but another scenario might be we order 10,000 50-milligram doses of Oxy and we only receive or sign for 8,000. Where’s the other 2,000? It wasn’t on the truck.
There are so many areas where narcotics can be diverted. I use the example of a bad actor. It’s not necessarily a pharmacist. Narcotics move through several different hands before they go to a patient. We want to limit that, of course, but there’s a lot of ways this can happen. I think finding those needles in the haystack, that’s what we’re trying to do.
Jeffrey: It sounds like it’s really about tracking this data as much of it as possible from the beginning process to end. Because there are all these different touchpoints where drugs can change hands. If you have a system in place, that’ll track this data, and if it can be automated, even better. Then you can use these security machines or whatever tools you’re talking about to then figure out where the anomalies are, and then pinpoint the actual problem to address if it were to occur either under your watch or someone else’s, at least you’d have some insight on where to go.
John: Yes. These are really just very advanced use cases of SIEM platforms, of log management platforms, of analytics platforms. I talked about the maturity of an InfoSec team earlier, but as you march along, you eventually get to a point where data science becomes as important to an InfoSec team as just patching. When you get that good and you get that much data and you see so many events, data science becomes just as important a function, if not more so, for a SOC analyst, a SOC engineer, a SOC architect, or really any InfoSec professional.
Jeffrey: I think that’s true now more than ever. There’s tons of data being collected, but if you don’t have a data scientist or data science team to make sense of it, then what are you doing with that data? I don’t know if this is a fair question to ask or not, but are there any other use cases or similar ideas you have, either in the works now or in the pipeline, that could be similar stories to how New York-Presby addressed opioids? Are you working on some similar projects?
John: We have some ideas, some are more vaporware and some are more to be built or constructed next month or something like that. One of the things that we’ve talked about before is telemetry data. When you come in and visit us at one of our institutions, let’s say you’re in one of our ICUs, I think on average there are 11 devices in that room that are going to be spitting out telemetry data, so your pulse-ox, your heart rate, maybe there’s a motion monitor.
Maybe there’s some type of infusion pump with a drip, maybe you’re vented and we’re providing oxygen at some interval to you. What else is in there? You’re on a dialysis machine and we’re cleaning your blood. I’m not a clinician as you may have been able to discern, but I know a dialysis machine cleans your blood. It’s measuring the relative dirtiness of your blood.
There are all these sensors and all these things that are hooked up in these rooms, and they’re all spitting out data. We have this really cool construct known as the CLOC, which is admittedly not the best acronym. The CLOC is a Clinical Operations Center where real clinicians, real nurses, real PAs are sitting there, and they’re watching all this telemetry data and they’re watching patient Timmy in his bed.
The theory is, the hope is they can catch something when the nurse is not at the patient’s bedside, because when a nurse is in there, they’re looking at all these telemetry monitors, they’re making decisions. They’re looking at heart rate, they’re looking at the patient, they’re getting pain scores, but it’s a lot of data.
One of the projects that we have is: how do we present this telemetry data? How do we bring it all into one system and then apply some of those correlation rules that we would use for, say, catching narcotics diversion, correlation rules that we would use for catching a bad actor trying to escalate privilege, achieve lateral movement on our network, or follow the MITRE ATT&CK chain? Take those rules, which is just stringing events together, and apply them to telemetry data and say, “Whoa, heart rate spiked, sodium levels are too high, and their temperature is way too low. That means they have, whatever, something, sepsis.” I have no idea, I’m not a clinician, but there’s something there for a clinician to investigate.
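The correlation-rule idea described above, stringing events together and firing only when several conditions hold at once, can be sketched roughly like this. Everything in the sketch (the thresholds, the vital-sign names, the rule itself) is a hypothetical illustration, not anything from NYP’s actual platform:

```python
# A toy correlation-rule check: a rule fires only when all of its
# conditions hold for a single telemetry reading.

def check_rules(reading, rules):
    """Return the names of every rule whose conditions all hold."""
    return [name for name, conditions in rules.items()
            if all(cond(reading) for cond in conditions)]

# Made-up rule: heart rate spiked AND body temperature is low.
rules = {
    "clinician-review": [
        lambda r: r["heart_rate"] > 120,
        lambda r: r["temp_c"] < 36.0,
    ],
}

print(check_rules({"heart_rate": 132, "temp_c": 35.4}, rules))  # -> ['clinician-review']
print(check_rules({"heart_rate": 70, "temp_c": 37.0}, rules))   # -> []
```

A real system would evaluate rules over time windows and event streams rather than single readings, but the core is the same: each rule is plain conditional logic strung together.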
We would catch that maybe when that clinician, that nurse, or the doctor is not at the bedside. That’s worth exploring: ideas on how to do this, and then how to present this information in a very easy way to a clinician. There are many techniques; augmented reality is one of them, or just some type of real quick presentation on their phone with some correlated data. That’s one of our active “can we do anything here?” projects.
Jeffrey: That is really interesting. It sounds like you’re taking all this data. You mentioned up to 11 devices in the patient’s room and figuring out a way to tell a story around it, and really better serve the patient. I guess this is beneficial for everyone involved, you can better serve your patients. You can reduce healthcare costs, you can reduce the burden on your doctors and nurses, right?
Jeffrey: That’s fascinating, and it’s 2021. As you’re saying this, you think, “Why doesn’t this already exist today?” Why does a nurse, who’s balancing many things on her plate, or a doctor, have to come in, and they might walk in and at that exact moment there might not be anything that’s alarming, right?
You’re saying if you have a machine take a look at this, look at everything in the aggregate, and also look at it over a period of time, adding more context to the data, the same way you were sharing in other stories before, you can get more insights and then maybe give a doctor or a nurse a very easy-to-digest dashboard or, as you mentioned, augmented reality. That’s a really cool idea.
John: We’re not the first to come up with this. The biomed device manufacturers are all trying to aggregate as many systems as they can into bedside monitors. It’s like one panel to rule them all. Philips has a really great system that we use, where you’ll see many of these different telemetry monitors on one panel, so a doctor or a nurse only has to look at one screen to make a conclusion, but they’re still factoring things in their head.
They’re still looking at three different images, and then in their human brain saying, “Well, if this, then this, then this.” It’s conditional logic. That’s good, but what if we could shortcut that? That’s the idea: what if we could build those thresholds, build those conditions into some type of automated activity, and then present that data immediately, so the doctor doesn’t even have to think? They could back-check and say, “If heart rate and temperature and sodium content, then this.” If we can do that in real time, even when they’re not there, I think that’s got a lot of potential.
Jeffrey: That’s some powerful stuff and a great way to use data to help your patients and your doctors. Outside of patient health and stress on medical professionals and really the entire healthcare system in general, I’m curious what impact COVID had on you and your team during this last year?
John: From an InfoSec perspective, the big impact was on identity. For a while there, we were the center of the world as far as daily infection rates, intubation rates, just COVID-positive patients. I think at one point we had seen more COVID-positive patients than any other hospital in the world, just a morbid data point. From an InfoSec perspective, with such an influx of patients also came a huge influx of clinicians.
We had the Naval hospital ship, the USNS Comfort I think it was, docked in New York Harbor. We converted the Javits Center, which is this massive conference center, basically into a triage facility; soccer fields and tent cities were going up everywhere. We put out a call to the city and said, “Hey, if you were ever a doctor, give us a call, because we need your help. Maybe your credentials are in another country, but it doesn’t matter, we can use you. If you have any medical training, we can use you.”
We had members of the armed forces that were here, and then we had this massive influx of volunteer nurses and clinicians, people coming from all over the country. It was just a wonderful, makes-you-feel-proud-to-be-an-American kind of thing. But from an InfoSec perspective, these are all newbies. I’ve not vetted them. I’ve not checked their credentials. They didn’t sign our end-user license agreement. They’re not employees, they’re not beholden to any employee credo, but we need their help.
How do you identify these people? How do you entitle them with the appropriate permissions and give them the right access? How do you give them all the IT utility they need in such a short amount of time? Then once they’re done and they go home, how do you clean all that up? How do you get rid of that access and make sure that there are no privileged actors or bad actors hanging around?
There was a really big push. Our identity team is world-class and was just working day and night to onboard these persons, entitle them, credential them, get them the right access, provide multifactor, and operate inside of a still relatively good set of security controls, so they weren’t just running around with domain admin passwords and re-imaging desktops all day. That was, I think, the big thing for us with COVID: a really, really huge uptick in identity management and a very quick how-do-we-do-this-at-scale challenge for NYP.
Jeffrey: Yes. It sounds like a serious challenge. I bet going back to your days in the military helped prepare you for an emergency type of situation like this, where you got to think on your feet quickly and figure out a game plan to help others.
John: Oh, yes. Yes, absolutely.
Jeffrey: You and I have had a number of conversations about AI, your thoughts on the term and how it’s sometimes overused or misunderstood, but then also how there’s a lot of benefit and value from artificial intelligence as well. I’m curious, from your perspective, how much potential do you think there is for AI to be leveraged in the medical field, and also, how do you think about governance around it?
John: It’s a big question. I’m fairly passionate about the use of the term AI. I think AI is a superset of many different things. I don’t believe that anyone’s close to a pure AI today; no one’s passing the Turing test. I think the term is overused a lot, probably not out of any type of malicious or ignorant intent, but just because it sounds cool to say it. There are a lot of movies about it and everybody wants to be the Terminator. I do think that the foundational components of artificial intelligence are here, and we do use them; machine learning is one of them.
Machine learning can be broken into many different levels, but the very simplistic machine learning, which is just conditional logic with some type of a holding state, I think is here. We use that every day; we use it in the privacy platform and in our narcotics platform (we call it the medication analytics platform) for opioid diversion.
We built some modeling using open AI tools and the ML toolkit in the particular tool that we use. We were looking for anomalies; we were looking for outliers, and then measuring standard deviations from some type of normalized data.
Making an inference on anomalies based on some type of base dataset is a component of machine learning. Where it gets off the rails is when you start to do simulation or synthetic data, when you’re adding synthetic data to your models. Then we start to get into places where stochastic modeling is a necessary tool, heuristic modeling, looking for some type of standard error or standard deviation. These statistical tools start to evolve machine learning. The more that we base our decisions off of a dataset, and then pivot away from that based on what the data tells us, the more machine learning creeps closer and closer to artificial intelligence, but we’re not there today.

Most of what we do is just taking large datasets and doing a lot of comparative analytics on them. Sometimes you’ll see varied insertion, or modeling what an anomaly would do to a dataset and then making an inference, but really it’s just a lot of comparative analytics. Synthetic data is an important component of that. We don’t do a lot of it in the healthcare system other than on the research side; on the operational side where I work, synthetic data doesn’t do a lot for us, because we already have the real patient data if we need to make any big-scale conclusions.
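The outlier hunt described here, flagging values that sit too many standard deviations from the mean, is essentially a z-score test. A minimal sketch, with invented numbers standing in for whatever the real platform measures:

```python
# Flag values that sit more than `threshold` standard deviations from the mean.
from statistics import mean, stdev

def zscore_outliers(values, threshold=3.0):
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Invented example: daily dispense counts for one clinician; 90 is far off the pack.
counts = [10, 12, 11, 9, 10, 13, 11, 10, 12, 90]
print(zscore_outliers(counts, threshold=2.0))  # -> [90]
```

One quirk worth noting: a single extreme value inflates both the mean and the standard deviation, which is why robust variants (median and MAD) are often preferred in practice.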
As far as regulation goes, that’s a very hard question to answer, because I don’t think we’re mature enough in our use of artificial intelligence, those block-and-tackle fundamentals. I don’t think anyone’s really mature enough to need the regulation just yet. There are other verticals, like autonomous vehicles, that seem to warrant regulation a lot more than us. You saw the Tesla crash just the other day; unfortunately, a couple of lives were lost in that. I think that’s probably an area much closer to regulation. But I do see this: vendors right now in the information security space are making big land grabs for our data; large companies that have healthy security portfolios in XDR, EDR, user behavior analytics, cloud-based management orchestration tools, endpoint and perimeter detection mechanisms, proxies, firewalls. These companies have a portfolio and a range of products, and they are collecting data.
This term “data lake” has been tossed around all over the place. We know that to build a good AI, you need a lot of data, and these companies are hoarding it. If you’ve got stock in Glacier or some big cloud storage company, you’re probably watching your stock slowly climb, because these companies are storing this data and creating these data lakes. Now, they give access to their customers in the form of: if you give us more data, we’ll be able to better detect when an event occurs, when a bad thing happens or is being passed around your community and vertical. I think generally that’s true, but the hoarding of this data then raises the question: wait a minute.
Some of that’s my data, and it’s being co-mingled with all these other people’s data. Do they see my data? Are they making calculations based on what I put into the system? Well, why don’t I get a piece of that? If you can see my data, I want to see your data. The question becomes, how do you regulate that? Is that even an appropriate mechanism? Wouldn’t it be great if all these companies had an open-kimono-style relationship, where everybody had access to this data and we could all use it to make some informed decisions? Right now that’s not the case. If regulation is going to rear its ugly head anywhere, probably before AI becomes a factor, it’s going to be on these data lakes: exactly what customers have access to; what the vendor is responsible to protect, encrypt, mask, or de-identify; and then how they share that and who gets access to what. That’s where I think we’re going to see regulation soon.
Jeffrey: That makes sense. That’s an interesting concept too: think of all that data being collected, and if it was shared and made available to the greater population, or for the greater good, what could really be accomplished with it? Maybe there are some in-betweens there too. Maybe it doesn’t have to be fully open, because you might not necessarily want your data shared with certain organizations or people, but maybe there are groups that could benefit from seeing this aggregate data to help others.
John: Absolutely. Why wouldn’t we want to share? Say a particular company has a bunch of financial-sector customers and they’re getting, let’s say, firewall data from all these public firewalls. All of a sudden, a threat actor pops up, and it looks like the threat actor is going after particular subsidiaries of these financial services companies. That might matter to me, because I might use that subsidiary and they’re being attacked right now, or maybe I need to strengthen my defense and make sure that particular threat actor’s source is blocked on my network. This information almost has a monetary significance to it. The question becomes, how do you share it, say regulation becomes a reality? Well, that’s where the promise of things like homomorphic encryption comes in: the promise that we can encrypt data, share it, and perform computation on it the same way as if it were plain text.
Right now, HE is more of a raw statistical-style utility. I just gather up a huge column of data, compare it to another column of data, and see if there’s matching, or check two proportions; basically just very vanilla statistical-style calculations. I’m not actually adding things together. I’m not doing computation (pluses, minuses, multiplication), not yet. The technique offers such a capability, but I think that’s going to be one of the factors in sharing this data in these data lakes: something needs to protect that data. It could just be simple de-identification; that’s a technique we use quite a bit. As long as you strip out the PHI, the data is safe in such a way that it can be shared within the research community, as an example, for clinical studies and clinical trials. De-identification, that’s one way. HE is another way. I don’t think traditional asymmetric encryption, maybe I should say traditional asymmetric encryption algorithms, are going to be the solution here. But that’s a big question: if you’re going to regulate it, you’ve got to enforce some type of protective mechanism around it.
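A toy version of the de-identification step mentioned here would strip direct identifiers and generalize quasi-identifiers before a record leaves the operational side. The field names below are hypothetical, and real HIPAA de-identification (Safe Harbor or expert determination) covers far more than this sketch:

```python
# Toy de-identification: drop direct identifiers, generalize quasi-identifiers.
# Field names are hypothetical; real HIPAA Safe Harbor lists 18 identifier types.

PHI_FIELDS = {"name", "mrn", "ssn", "address", "phone"}

def deidentify(record, current_year=2021):
    out = {k: v for k, v in record.items() if k not in PHI_FIELDS}
    # Generalize a quasi-identifier instead of dropping it outright:
    # exact birth year becomes a ten-year age band.
    if "birth_year" in out:
        out["age_band"] = f"{(current_year - out.pop('birth_year')) // 10 * 10}s"
    return out

record = {"name": "Timmy", "mrn": "12345", "birth_year": 1984, "heart_rate": 88}
print(deidentify(record))  # -> {'heart_rate': 88, 'age_band': '30s'}
```

The design choice worth noting is the distinction between direct identifiers (removed) and quasi-identifiers (generalized), since combinations of quasi-identifiers can still re-identify people if left exact.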
Jeffrey: For sure. Shifting gears a little bit and looking at some of my pre-production notes, what would you say is our biggest diversity gap in cybersecurity?
John: Now you’re really changing topics. We could talk about this for days. There are not enough women in cybersecurity. It is a thing that I think hurts us to a degree I don’t even know that we completely understand. I’ll be very caveman for a second: you don’t really solve world hunger with a bunch of guys in a locker room. I’m not saying information security is a locker room; I’m not pushing that analogy. I’m being overly dramatic to say that if a bunch of guys have the same discipline, come from the same background, and work in the same field, you might solve small problems and make progress in some pretty cool areas.
I just don’t think you’re going to solve meta-problems like world hunger without bringing in diversity, without bringing in people that come from different backgrounds, different fields of study, different approaches to solving problems. As I look across all of our diversity gaps, the lack of women in our field is a huge factor in solving the meta-problems, the big problems. I think we need to attract women. I have two daughters and I am uniquely sensitive to this issue, but I think it starts at a young age. We’ve got to stop vilifying girls that are technical and want to code or want to play video games. I just took apart a computer with my older daughter the other night and replaced a battery. She wanted to see how it worked, see all the little screws and fans and doohickeys in the computer. We’ve got to promote that. We’ve got to celebrate that and appreciate the fact that those women are empowering to our field.
Jeffrey: Yes, for one, you sound like an awesome dad, and I already know that you are, but it’s really cool that you take apart a computer with your daughter and put it back together. I understand where you’re coming from. I support a few different women-in-tech meetups in New York, where we used to meet up in person pre-COVID, and different female founders would pitch their tech companies; the idea was to get investors in the room and help them get more funding. You hit on something: it sounds like the fundamental problem really starts at an early age, in terms of these gender norms that we’re putting on young girls, right? Like, forget about math, do something else.
I mean, it’s got to start at an early age; we have to build a foundation, because otherwise, how are we going to fill the pipeline with strong women in this field who can apply themselves in tech? I’m sure there are many, many girls in college today that want to get involved in cybersecurity or technology, but maybe they’re starting too late and there’s not enough supply. You’re talking about a good way to change that early on, but what do you think we can do in the near term?
You’re offering a good long-term solution, and I totally agree, but what do we do in the short term, not just to help women get involved in tech, but also to help ourselves? If you want to build a more diverse InfoSec team with diverse thought, you want women and minorities on the team now, so how do we go about doing that?
John: InfoSec is kind of, well, it’s not a field that you really start out in, and I don’t think the best InfoSec engineers start out in InfoSec. It happens sometimes, but typically it’s what I call a field of maturity, where you start out in the block-and-tackle disciplines of IT. You master how to do an ETL from an Oracle structure. You master cabling in the networking field: what’s the A and B cable specification for Cat 5, like 568A or B or something like that, white-green, green, white-orange, blue, blue. You master customer service at the service desk, and how to troubleshoot the installation of Microsoft Word.
You master the block-and-tackle stuff, and then you come to InfoSec because it represents a further challenge to your knowledge. I think the way that we diversify InfoSec is we bring in those people and challenge them to solve a bigger problem than just running 100 meters of Cat 5, or establishing a BGP peer group, or, I don’t know, building a Hadoop cluster or orchestrating a Docker container distribution, whatever.
I think we bring in people that have a little bit of a taste of it in their mouths, and then we give them that hard InfoSec problem that maybe leans on their existing experience. Internships and fellowships are good pathways for people that have a little bit of IT experience, but I think it’s about challenging them with hard problems and then seeing: all right, because you’re good at this, you’re able to solve this problem; now this, now that, now an increasing challenge. The biggest thing, especially considering diversity, is that if they don’t make it, they’re not a reject.
If a young lady is stimulated by InfoSec, or a young man is stimulated by InfoSec, and they want to come and work with us and they don’t pick it up right away, you can’t say, “You’ll never make it in this field. Get out of here,” because InfoSec is diverse. We have our fingers in many different facets of IT. That mentality of saying no doesn’t work. We have to find a way to say yes.
Jeffrey: I totally agree with you and I’m glad you brought this up. John is a #girldad.
Jeffrey: John, at the end of each episode, we like to play a familiar game called This or That. Maybe we’ll add a couple of different variations to it and ask you some other questions or follow-ups. All right, cool with you? You ready?
John: Yes, hit me.
Jeffrey: Just feel free to share whatever comes to mind. It could be a one-word answer. It could be a full explanation, your call. First one GDPR or CCPA?
John: I know little of GDPR. I know a little more about that one than the other so GDPR.
Jeffrey: Okay, open source or proprietary?
John: Oh, it depends; depends on the team, depends on the problem we’re solving, whether I’m rich or poor at that point in the year. Either-or. I love the ingenuity of open source, but you’ve got to build it, and it takes time.
Jeffrey: Yes, I was going to ask you a similar question, and this kind of comes up with you and your daughter taking apart the computer, and also with you talking about the privacy platform you built internally. Do you build or do you buy?
John: I think it’s a very similar answer to open-source versus closed-source or commercial software. You’ve got to evaluate both scenarios. Sometimes it’s rent versus buy; sometimes it’s smart to do one, sometimes it’s smart to do the other. At least in the realm of raising young girls, I’d rather teach them, I’d rather build, because hopefully when they’re 25, they’ll remember, “I know how to do that. My dad made me do that, and it sucked.” I think building that construct, at least.
Jeffrey: It’s funny you say that, because the younger we are, the more fun it is to take things apart and build them, and then for whatever reason our days get too busy, or our minds fill with other activities or smartphones, and we want to take the easy route. I think we should do more of it. All right, Starbucks or Dunkin’ Donuts?
John: Starbucks for me. If I crapped on Dunkin’ Donuts, I think this thing would probably go viral and I’d get hate mail. I lived in Massachusetts for a little while, and there are a lot of Dunkin’ Donuts people out there. It’s not a bad place, it’s not a bad place. I just like my Starbucks.
Jeffrey: When you go to Starbucks, what’s your go-to order?
John: It’s a mocha. I have to say hot now, because there’s a cold mocha. I went to a Starbucks in Manhattan and I just said, “Grande mocha, almond milk, no whip.” The girl’s like, “And?” I’m like, “And I’d like to pay, and please make it.” She’s like, “No, do you want it hot or cold?” I didn’t know there was a cold one. I thought she was just trying to be nasty with me, and I said, “Well, hot,” and she said, “There is a cold one.” I’m like, “Oh, okay.” I don’t go to Starbucks much, as you might have deduced.
Jeffrey: Yes, I guess maybe next time you’re at the office.
John: Yes, yes.
Jeffrey: Speaking of which, I know you’ve spent time living in many different cities and states, so I’ll ask you: New York or Texas?
John: Well, I live in Austin, which is kind of like a borough of Brooklyn. It’s fancy and hippie and trendy and young; it’s the Williamsburg of Texas, you might say. I don’t know that there’s a huge amount of difference, but I do enjoy the rural cow pastures and horses; you drive two minutes from my house and you’re in Texas, Texas, and that’s fun. Seeing the stars at night, which you can’t do often in Manhattan, is really cool.
Jeffrey: Awesome. Speaking of, I guess, Hipsterville, you just mentioned Williamsburg, Brooklyn. I think there was a time where you, me, Matt, and Brett were out and we went to some hole-in-the-wall dive bar. My question to you is: tequila or mezcal?
John: I don’t ever want to taste mezcal for the rest of my life. For the audience, it was a speakeasy hidden behind an ice cream shop. The guy had 300 bottles of mezcal, and each one tasted like it was poured across eight pairs of dirty feet and some Kingsford charcoal. It was disgusting, so no, thank you. I will pretty much take anything besides mezcal. I’m going to get some hate mail from the mezcal manufacturers, I’ll let you know.
Jeffrey: I’m glad we had that experience, but I will agree with you on that one. I just tried some really good YaVe Tequila, which is a new brand that makes mango and jalapeño flavors. If you like something spicy, and summer’s coming up and you want a good margarita, I recommend it.
John: I’ll try that. No mezcal, no smokey feet nonsense.
Jeffrey: Yes, I’m with you. Another fun question here: Monterey Bay or the Gulf of Mexico?
John: Woof. I lived in Monterey for a little while before I went to grad school. I love Monterey Bay, but if you want to live in a treehouse on the coast of Mexico and just disconnect from the world, I’d say the Gulf is where you need to be.
Jeffrey: All right. Now, in the same spirit, so we’re taking the Gulf there. For a guy who has a pretty awesome at-home gym setup, and I’ve seen some photos, this is kind of on the fly, but if you were going to put up a picture for inspiration, do you go with Ronnie Coleman or Arnold Schwarzenegger?
John: It’s a no-brainer. Hey, much love to Ronnie Coleman, a great guy, great personality, great story, but it’s always Arnold. Show me an American man... no, I shouldn’t say a man; gender, sex, it doesn’t matter. Show me an American that sees a picture of Arnold and doesn’t do the voice: “Well, I’m getting a pump.” Every time you see it, that’s your motivation. You don’t get that with Ronnie. You don’t get the voice.
Jeffrey: It’s funny because I’m hearing Arnold in my mind and you brought up Terminator earlier. Terminator 2, one of my favorites. It’s a classic. When you’re lifting, do you lift for weight or do you lift for reps?
John: It’s always reps. I’m getting old, so it’s all reps.
Jeffrey: Nice. All right. Seeing your background, this is a Chicago kind of question, and I already know the answer, but for anyone listening in: Chicago White Sox or Chicago Cubs?
John: That’s a no-brainer. It’s my Cubbies, 100%.
Jeffrey: Who’s your favorite? I see Harry.
John: I have wonderfully fond memories of sitting there. If you were a kid in the late ‘80s or early ‘90s, when Harry was a legend, and you were sitting in Wrigley Field, you’d look up at the booth, the WGN booth where he’d hang out. It was like seeing Santa Claus fly over your house. It was that magical a moment to sit there and see Harry, his obviously inebriated self, drinking an Old Style beer, hanging out there and throwing that microphone out.
That’s why I use it as my background, because every time I see him, it makes you feel good.
Jeffrey: It makes you smile.
John: Yes. He’s the man.
Jeffrey: I’m assuming you read, and if you do, do you prefer a Kindle or a real book, or do you go for the audio version on Audible?
John: Actually, I think it depends on the book. I listened to The Martian on Audible; I shouldn’t say recently, because it was a few years ago, on a long drive before they made the movie, and it was fantastic. It was a perfect book for audio. History books, I think, generally lend themselves better to physical hard copies; there are maps and stuff in there you can constantly reference. But Kindle works too.
Jeffrey: You mentioned The Martian, but do you have any good cybersecurity books you’d recommend, or even non-cybersec books?
John: I probably should shamelessly plug here. There was an effort, and it still exists, called the Cybersecurity Canon. Rick Howard used to own it; he was the CSO at Palo Alto Networks, but now it’s been handed off. It’s a wonderful repo of user-submitted cybersecurity books that are canon, as in everybody needs to read them. If you’re getting into cybersecurity, or you like cops-and-robbers, Tom Clancy techie stuff, The Cuckoo’s Egg is mandatory reading. It’s a terrific, terrific book.
Jeffrey: Okay, The Cuckoo’s Egg. No, I haven’t heard of it, but I’ll have to add it to my list.
John: It’s a quick read. You’ll fly right through it.
Jeffrey: Good deal. Let’s shift gears to important sci-fi questions. Would you prefer Star Trek or Star Wars?
John: I go Star Wars, but these last few Star Trek movies are pretty good. What’s his name, the Captain Kirk guy? I can’t remember his name... Chris Pine, Chris Pine. They’re pretty good. But I love Star Wars. When I was a kid, that was all the rage, so I’ve got to go Star Wars.
Jeffrey: I remember watching Star Trek: The Next Generation, probably in the late ‘80s or early ‘90s. Then, obviously, the Star Wars movies when I was younger, at least the first three. Well, not technically the first three, but the first three produced.
John: The first three produced.
Jeffrey: Yes. Then let’s go. Maybe you can settle the debate then, as a Star Wars fan?
Jeffrey: Who shot first? Han Solo or Greedo?
John: It’s Han. I don’t understand why people even debate this. It’s always been Han. There was a special edition where that act just didn’t happen, which didn’t make anybody happy and only fueled this debate. The guy’s a gunslinger. He felt he was getting squeezed, and he just had enough of the guy.
He’d had enough of the lip service and said, “I’m done with this,” pa-pa. He’s a gunfighter. Let’s not try to convince ourselves that Han Solo is some kind of good guy. Look how his son ended up, right?
Jeffrey: Well done. Nice. Well, John, this has been awesome. Thank you very much for chatting today. Maybe before we wrap this up, is there anything that our audience can do for you, or is there anything you’d like to share with the greater community?
John: Yes. Don’t reuse the same password for anything; use a password manager and never use the same password. And please separate your business and your personal lives. You will thank me, because one of them will certainly get exposed or hacked in your lifetime, and it’s better to mitigate that by 50% than to commingle those two hemispheres of your world. That’s just the nature of digital life today: you’ve got to accept a certain degree of bipolarity in those two worlds. If you can separate those and control those passwords, I think that would help, for sure.
Jeffrey: Good pieces of advice.
John: Yes, absolutely.
Jeffrey: Awesome. Thanks, John, I appreciate you joining. Hope to see you soon.
John: Yes, thanks.
Alexandra: Wow, so many great stories. I think we can pull together a rich collection of takeaways, especially for CISOs and people working in InfoSec.
Jeffrey: Yes. Let’s do that.
Alexandra: Number one, mature CISO teams should be seriously multifaceted. The Security Operations Center monitors alerts, incidents, and events, while the forensics and vulnerabilities team proactively searches for threats. The risk team’s job is to assess the third-party risk of vendors while the identity management team takes care of people who need access to the data and services within the organization. As infosec matures, data science becomes extremely important for advanced analytics and privacy.
Jeffrey: Number two, numero dos: automation systems are critical to have in place. John talked about security DevOps being automation on steroids, and about promoting automation and data science outside of infosec. John’s DevOps group does a lot of data provisioning for downstream tasks, pulling data together and making it accessible for humans.
Alexandra: Number three, protecting patients’ personal and health data comes with a lot of data literacy and security training across the organization. In the healthcare environment, the system also needs to be prepared for so-called break-the-glass scenarios when a patient’s data is needed immediately. At New York Presbyterian, the data security team built their own data privacy platform to suit the hospital’s unique needs. They’re using machine learning every day, looking for anomalies and doing everything they can to protect the patients’ data.
Jeffrey: Here’s number four, one of my favorites from the show. One of the most unique ways in which John and his team use data and machine learning is to curb opioid addiction, looking for anomalies and patterns to discover ways in which opioids leak from the distribution system.
Alexandra: Number five, John shared another very advanced use case for analytics with us. They are building a system, like the ones they built for infosec, for the telemetry data coming from all the sensors tracking their patients’ vital signs, kind of like a DevOps platform for the ICU.
Jeffrey: Alexandra, we also heard about the superhuman achievement of the identity team during the COVID-19 crisis in New York City, when a huge influx of clinicians and volunteers had to be onboarded and offboarded in a really short period of time. It was a truly impressive operation at scale.
Alexandra: Absolutely, that was fascinating to hear. Towards the end of the conversation, you also talked about a topic that is personally very important to me: how we could systematically tackle diversity gaps and get more women into cybersecurity, both near term and long term. It’s a pressing issue that needs to be proactively addressed. I really enjoyed hearing John’s passion for inspiring his daughter, and how they took apart his computer and put it back together. Such a fun father-daughter activity.
Jeffrey: Yes, he’s clearly more than an awesome CISO. He’s an awesome dad. Don’t forget about how John feels about Star Wars. I think I said Creedo instead of Greedo, by the way. Total amateur move. John and all the Star Wars fans out there, please forgive me.
Alexandra: I hope they will. We also got some great tips for finding the best tequila out there too.
Jeffrey: That’s right. What a night, at least the parts that I can remember.
Alexandra: What an episode I would say. Thanks to everyone who listened. We’ll be back with another fascinating episode and another amazing data and privacy story in two weeks’ time. See you then.
Jeffrey: See you then, guys. If I may ask one quick favor. If you can please take 27 seconds and leave us a review, it’d be a huge help. It’s the quickest and easiest way for us to grow our subscriber base.
Alexandra: Yes, that would be awesome. Thank you, everyone.
Alexandra: The Data Democratization podcast was hosted by Alexandra Ebert and Jeffrey Dobin. It’s produced, edited, and engineered by Agnes Fekete and sponsored by MOSTLY AI, the world’s leading synthetic data company.