Jeffrey Dobin: Good morning. Hello and welcome to Episode Four of the Data Democratization podcast. Jeffrey Dobin here, along with my tag team partner Alexandra Ebert, Chief Trust Officer at MOSTLY AI. Today we’re joined by a true frontline warrior and data privacy champion. Alexandra, what should people expect?
Alexandra Ebert: Definitely a lot. Sang is a seasoned startup veteran with some interesting stories and a ton of insight. He’s currently working in Singapore, helping startups reach their full potential as Director of Digital Innovation at Temasek, an investment management company. Today, he talks about the data protection app he invented that was used by millions of people, but eventually was taken down by Apple.
He also talks about the current privacy landscape in Europe and the United States, as well as in Singapore and Southeast Asia. Plus, he shares with us his predictions for the tech sector for 2021 and beyond. There’s a whole bunch of other insights he shares during today’s episode. Listen in and enjoy.
Hi, Sang, where does this podcast find you? Can you briefly introduce yourself to our listeners?
Sang Shin: Hello. My name is Sang. You found me in this interesting space and time of my life. Currently, I’m in Singapore. I’m a Korean who was born and raised in the Philippines and moved to the United States to study in college. After that, I stayed in the States for a couple of decades, where I worked in investment management technologies, building different systems. Towards the last step of my stay, I ventured into the startup world. Moved over to California, and did a bunch of startups. Once that ended, I moved over here to Southeast Asia to see how I could take what I learned over the past few years and decades and help grow the startup ecosystem in this neck of the woods.
Alexandra: Perfect. It sounds like you got your fair share of traveling in before this pandemic hit and we were no longer allowed to travel. A wise choice back then.
Sang: You can never travel enough. Let me put it that way.
Alexandra: Absolutely agree with that. One thing I would love to talk about: you actually invented a data protection app that was not only downloaded but also used by millions of people. Then Apple pulled it from the App Store. Can you share the story with us?
Sang: Sure, yes. That was a very interesting part of my life. I was towards the tail end of my career in corporate America, working at a hedge fund at that point in time. An ex-colleague of mine and I decided to drop everything and tackle this problem: we just didn’t understand, back then and still today, how data was being collected and monetized in a way that was not explicit or clear to consumers. There’s a lot of money to be made doing that. We just thought it didn’t make sense, the way we had arrived there.
The path to that point was an interesting one. It started with the internet, the web, cookies, browsers, and Netscape. It continued to evolve into the mobile world when phones started coming up, with different types of tracking in the mobile setup. We decided to take a stab at changing it. We built an app that, by default, would block all trackers from taking your data, your history, and your activity. It would also block advertisers from sending you advertising based on your data, effectively putting you off-grid from that whole digital advertising and data collection space.
Our app really had only one major utility or function. It was a simple switch with an on or off state. The default state is that you’re off-grid. Should you choose to, you could flip the switch and join our ad network, our data sharing and monetization network. The difference with our network was that we were explicit about what data we were collecting, and we actually paid you for your data and for the right of advertisers to show you advertising based on the data you shared with them. This was a very different take on the way things were. It was a very explicit agreement. You as a user would effectively take control over the value of your data. You could sell it over and over to different advertisers and make money out of it.
That, essentially, was the app we built, called Been Choice.
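To make that opt-in model concrete, here is a minimal Python sketch of the switch Sang describes: off-grid by default, explicit consent, and payment per ad request. The class, field names, and payout values are illustrative assumptions, not Been Choice’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySwitch:
    """Illustrative model of the idea: off-grid by default, paid opt-in."""
    opted_in: bool = False                       # default state: off-grid
    consented_fields: set = field(default_factory=set)
    earnings_cents: int = 0

    def opt_in(self, fields_to_share):
        """User explicitly flips the switch and chooses what to share."""
        self.opted_in = True
        self.consented_fields = set(fields_to_share)

    def serve_ad_request(self, requested_fields, payout_cents):
        """Advertisers see only consented fields, and the user gets paid."""
        if not self.opted_in:
            return None                          # off-grid: block tracking and ads
        shared = self.consented_fields & set(requested_fields)
        self.earnings_cents += payout_cents
        return shared

# Example: nothing is shared until the user chooses otherwise.
switch = PrivacySwitch()
assert switch.serve_ad_request({"location"}, payout_cents=5) is None
switch.opt_in({"shoe_size", "favorite_brands"})
print(switch.serve_ad_request({"favorite_brands", "location"}, payout_cents=5))
```

The design point is that no data flows and no ads are served until the switch is flipped, and even then only the explicitly consented fields are exposed.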
Alexandra: Absolutely impressive. I think it’s not only about taking control, but also about getting paid for your data yourself. It’s definitely beneficial for users. What would you say were the benefits for the companies purchasing this type of data, where users gave explicit consent?
Sang: From the business side, there were a lot of issues in terms of options. Unless you were in the walled gardens of, let’s say, Facebook or Google, or Amazon, which was emerging at that time, you really had no choice. Where else would the data come from? You would be limited. You would have to rely on alternative data sources, which at that time were a hodgepodge of data collectors and sellers. Through some complex mechanisms, and sometimes very surreptitious ways, they would try to stitch microdata together to build a picture of who you are, which really wasn’t that accurate.
You would sometimes see ads you really didn’t care about popping up on your phone. That was a challenge for businesses, because their investment in advertising and marketing wasn’t really paying dividends. Unless you went to Facebook’s advertising platform, or one of the other walled gardens, where they control all the data as well as the marketing and advertising platform itself, you were very limited. The ad would then only appear when people used Facebook, for example. As a business, you were left without really great options as a result.
The value add for them would be access to very high-quality first-party data, meaning data where the user has specifically said, “I do want to share this data with you, your company specifically, because I’m actually interested in your products and services. I do want to see if you have any deals or discounts for your products and services.” For me, that would be, let’s say, Nike products, Nike shoes, for example. I wouldn’t mind sharing my data with them in order to get a discount, join a lucky draw, or take part in some other marketing thing they’re doing. They can target me because I want to be targeted by them. They would be happy too, because their spend would yield high results, since they’re now targeting people who actually want information and deals from them.
Alexandra: This actually sounds like an invention that was truly helpful in reconciling privacy protection with privacy-friendly data utilization. You mentioned, when we had a conversation before recording this episode, that the app was eventually pulled from the App Store. Why was that?
Sang: There are multiple stories around why the app was pulled. The official technical story, and if you Google it you will see this, is that it was pulled because of an infringement in the way it handled certain certificates. Without getting too technical, that comes down to how Apple and iOS on the iPhone control what access apps in general have, in our case, over some of the data.
The technology we used was nothing groundbreaking; ultimately, it was a VPN, a very special VPN. There were a lot of VPN apps out there doing exactly what we were doing, to be honest. It wasn’t something new that came up and that they disallowed. It was, in essence, something that had been going on forever, and for some interesting reason they picked that point in time, and us, to stop it.
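For readers curious how device-wide blocking works in principle, here is a minimal Python sketch of blocklist filtering, the general technique an on-device VPN can apply to every outgoing request. The domains and function names are hypothetical, not the app’s actual code.

```python
# Minimal sketch of blocklist-based filtering: route the phone's traffic
# through a local filter and drop requests to known tracker hosts.
# Domains and names here are illustrative only.
TRACKER_BLOCKLIST = {"tracker.example", "ads.example", "metrics.example"}

def host_of(url: str) -> str:
    # Crude host extraction for the sketch; a real filter would parse URLs properly.
    return url.split("//", 1)[-1].split("/", 1)[0].lower()

def should_block(url: str, blocklist=TRACKER_BLOCKLIST) -> bool:
    host = host_of(url)
    # Block the listed host itself and any of its subdomains.
    return any(host == d or host.endswith("." + d) for d in blocklist)

print(should_block("https://ads.example/pixel.gif"))   # True  -> request dropped
print(should_block("https://news.example/article"))    # False -> passed through
```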
Alexandra: I think you even mentioned that point in time was after a New York Times article.
Sang: It was when we were featured in the Financial Times. That publication led to a lot of interest in what we were doing and triggered an avalanche of media coverage about the potentially game-changing disruption that this business model and capability would bring about. That, I think, precipitated some difficult decision-making at the highest levels of the company.
If you remember, and I don’t know, most people won’t remember, but I do remember because I was in the thick of it. It was around iOS 9, if I recall, the iPhone 5 and iPhone 6 era. That was when Apple themselves released, for Safari on the iPhone, the capability for any developer to build something that blocks trackers and ads and add it onto the Safari browser. Effectively, you could also go off-grid there, meaning it would block the collection of data and block ads from being sent to you. A whole lot of these small little ad blockers came up using that capability that Apple enabled.
When we came out, which was actually not far from that, a month or two later if I recall, it caused an issue, because what we were able to do was do it not only for any browser, not just the Safari iOS browser, but for any app. In fact, your entire phone would be shielded, not just the Safari browser. Then it became a difficult discussion, because they had just allowed their browser to do this, but what were they going to do about apps and other browsers? Particularly apps where big companies were making huge revenue.
The question invariably boiled down to: are we going to allow this for browsers, particularly our browser, but not allow it for apps, and why would that be? The same company would have a website and also an app, displaying the same info and collecting similar info. The app would collect more info because it can, but what would be the delineation, the difference? That was the struggle they had. They had to make a decision.
Ultimately, after some months of back and forth, they made the decision that they couldn’t allow that in apps. The way they implemented that decision was through the technicality I described earlier, the certificate mechanism that had existed for many, many years, and they treated it as an infringement of their rules. They took us out of the store as a result.
Alexandra: Okay. So it can be assumed that somebody had something against millions of people having the option to go off the grid and opt out?
Sang: Yes, you could make that assumption. Look, if we didn’t have a lot of adoption, if it was going nowhere and nobody cared and nobody wrote about it, then we would have continued to exist. It was only because people were writing about it, people cared, and people were downloading it that it raised the concern.
Alexandra: Yes, I can imagine. I think people downloading it actually is a strong indicator that this is something people desire and wish to have. How did this whole story, and the way things turned out, change your views on privacy and data utilization?
Sang: You’ve got to remember this was back quite a while ago.
Alexandra: What year was it?
Sang: We started thinking about this six or seven years ago, so back then, unlike today, people didn’t realize they were the product. Now it’s almost common knowledge that there are data collection and monetization issues, but back then, that wasn’t the case. When we were doing it, it was quite eye-opening. I didn’t know what was really going on, and the most eye-opening part was when we were actually creating our app. To build the product, we had to do research and development into what data was actually being collected, how it was being collected, and by whom. When I did that work, it really blew my mind how much data was being collected that people had no idea about, even technical people.
At one point, I was like, “Wow.” I realized that over half of the data your phone uses over cellular, and back then it wasn’t cheap, it was 3G, there was no 4G LTE yet, so you actually watched how much data you used, more than half, 60% or 70%, was just tracker information and ads going in and out of your phone, and you had no idea you were paying for it. That’s why companies that targeted that, companies such as Onavo, which Facebook bought, had as their value add that they could reduce your data usage every month by blocking all that stuff out. People would download and install it because they would save money every month.
Alexandra: Yes, of course. Not only are you not getting paid for the data you share, you also have to pay for it, so you’re probably not going to be happy. I would argue that even with the onset of GDPR and all the emerging privacy regulations, the majority of people are still not aware of how much data is actually collected about them every day. Luckily, I think we are really moving in the right direction, with this new awareness and people demanding that their privacy is protected and respected.
You left Silicon Valley and moved to Singapore. What’s your perspective on the data landscape in Singapore, also in comparison to the United States and to Europe?
Sang: Singapore is somewhat unique. I don’t think it necessarily represents the Southeast Asian region in terms of data maturity, privacy, and policy. It’s more advanced. There is a national policy that’s been crafted, and whatnot, but I do think that generally speaking, in this region of the world it’s still a little early. It’s not as mature as, let’s say, Europe with GDPR, or even America, which is coming closer to that. It’s a little bit like the Wild West, especially in some of the more developing countries out there; what the data policies are and who’s actually enforcing them is a little loose. I don’t think it’s going to be loose forever.
What I envision is that in this region of the world, at least, there’s a window of time before it gets stricter. It’s both an opportunity and a challenge. It’s a challenge because we have to make sure things aren’t abused and we don’t go the same route we went in other parts of the world, but it’s an opportunity in the sense that we have a chance to make sure it doesn’t go there. It doesn’t have to evolve in that weird way it evolved in America, for example. We can help the evolution bypass some of those rabbit holes and end up with a better setup.
Right now, in the US for example, the history is that it’s so entrenched, and so concentrated in a few big tech companies, that they now wield enormous power, and some of that power is even somewhat of a threat to the politics of the country. There’s all this mixture of politics and data, using data to show you content, and, as the Netflix documentary everybody has watched shows, it can affect people’s version of reality. Hopefully, there’s a way we can get to a point here where we avoid such concentrations of power.
Alexandra: How confident are you that everything will move in the direction you described without these strict regulations being in place? Do you think it’s possible that organizations could make this decision themselves, without regulation forcing them, and opt for more privacy protection and less data collection and utilization?
Sang: Yes. On the surface, one would think it can’t work unless there is legislation, rules, and enforcement of the rules. I don’t necessarily think so. I think there’s another component that will affect the way it evolves, and that is, essentially, profit. If the market evolves to where there isn’t much profit to be gained anymore by doing that, and we’ve moved on to other things, then that could be a reason not to do it anymore.
Let me give you an example of that. Google recently said they’re not going to collect data from web browsers and sell ads on it anymore; they’re shutting that down. You’ve got to ask yourself why they’re doing that.
Alexandra: Yes, absolutely.
Sang: If you think they’re doing that because they woke up one day and decided, “We’re not going to do any more evil, like we promised a decade ago,” probably not. The reason, most likely, is diminishing returns in terms of the revenue stream. In my opinion, we’ve finished about a decade’s worth of doing that. And what was that? It was faking you out, using human psychology to addict you, to make you tap this and tap that, so they could get this data and that data. Basically, hacking human psychology and then using that to get as much data as you can.
A whole bunch of mobile apps were developed that way over the past decade, everything from Uber to location data. I think that era is closing. That way of making money with apps is slowly being replaced by a new interest in creating technology and apps that actually make an impact, because we’ve got serious issues coming up. You see a lot of investment money going into funds that focus on impact, such as sustainability, with a huge influx of money coming in there, for example. Investment money is pouring into these things, and not so much anymore into some whiz-bang app that’s going to collect your data. I think it’s going to change. I think over the next couple of decades it’s going to morph into those types of investments, and ultimately innovations and new technologies.
Alexandra: Absolutely. I think we have to move in these directions to save the planet. Still, I’m a little surprised at what you described about Google, because I always thought that the majority of their revenue is actually generated by selling the possibility to advertise in the search engine. It would be a big step, I assume, for them to really move past that and focus on other areas that would have a bigger impact, as you described.
Coming back to Singapore, have you noticed any notable privacy-tech innovations we should watch out for?
Sang: Sure, there are quite a bunch. The most obvious one I can think of, and the one I’ve been privy to help out with when I can, concerns the COVID situation. There’s this whole question of how we are going to re-enable travel in a secure way, and whether there is going to be some kind of digital passport. How do I know you got tested, with what test kit, and by whom? Eventually it’s going to be, how do I know you got the vaccine, for example? There have been trials here.
For example, with regard to privacy, one of the solutions that has been pushed and trialed, and that we’re somewhat involved in, is to not have the data go anywhere at all. Using DLT and the chain, you only record an attestation of the fact that the test was done by an accredited place, and you confirm that through wallets. That’s an interesting approach, where the data itself is preserved and what you’re exchanging is the attestation and the accreditation of what happened, rather than the actual thing itself.
I think that’s something you’re going to see more and more of, where the data itself isn’t transacted on.
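A rough sketch of that attestation idea, under the assumption that only a signed fingerprint of the test record is anchored while the record itself stays in the holder’s wallet. The HMAC below stands in for the accredited lab’s signature; a real deployment would use asymmetric signatures and a distributed ledger, and the record and key names are hypothetical.

```python
import hashlib
import hmac

LAB_SECRET = b"accredited-lab-signing-key"   # hypothetical key material

def attest(record: bytes):
    """Lab computes a fingerprint of the record and signs it; only these are shared."""
    digest = hashlib.sha256(record).digest()
    signature = hmac.new(LAB_SECRET, digest, hashlib.sha256).digest()
    return digest, signature

def verify(record: bytes, digest: bytes, signature: bytes) -> bool:
    """Verifier checks the presented record against the anchored fingerprint and signature."""
    if hashlib.sha256(record).digest() != digest:
        return False                                   # record does not match fingerprint
    expected = hmac.new(LAB_SECRET, digest, hashlib.sha256).digest()
    return hmac.compare_digest(signature, expected)    # attested by the lab?

test_record = b"traveler:ABC123;test:PCR;result:negative"   # stays in the wallet
d, s = attest(test_record)
print(verify(test_record, d, s))          # True
print(verify(b"tampered record", d, s))   # False
```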
Alexandra: Understood. Do you actually see a role for synthetic data in the space of enabling privacy-friendly data utilization?
Sang: Yes, totally. Synthetic data, I think, is, as you said, an enabler. It lets you use data as a proxy. You can generate certain types of insights and certain value out of that data because it’s synthetic, without actually using the real data, which runs into the issues we talked about. I do think so. Yes, definitely.
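As a toy illustration of that proxy idea, the sketch below fits a very simple statistical model to hypothetical “real” records and samples synthetic ones from it. Production synthetic data generators capture far richer structure; this only shows that an insight, here a correlation, can be computed without touching the original records.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "real" records with two columns: age and monthly savings.
real = rng.multivariate_normal(mean=[45, 800],
                               cov=[[100, 350], [350, 4e4]],
                               size=1000)

# Learn only simple statistics, then sample brand-new synthetic records.
mean, cov = real.mean(axis=0), np.cov(real, rowvar=False)
synthetic = rng.multivariate_normal(mean, cov, size=1000)

# Insights computed on the synthetic proxy track the real data closely.
print(np.corrcoef(real[:, 0], real[:, 1])[0, 1])
print(np.corrcoef(synthetic[:, 0], synthetic[:, 1])[0, 1])
```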
Alexandra: Perfect. Have you encountered any examples in the wild that you can share?
Sang: I’ve been involved in it, actually. One time we ran a datathon, a type of hackathon, where for the first time in Singapore we put data from the private and public sectors together into a sandbox, which had never happened before, and had hackers basically come in and see what they could build out of this dataset, and it was synthetic data. Synthetic data was the only way we were able to get agreement between all the entities, and you can imagine them: the government with retirement data, banking info from a major bank, travel data from a major airline, and so on.
Alexandra: It sounds impressive, and of course I think that’s a nice way to really make data accessible and also unlock some innovation. What was the outcome of this hackathon, any great innovations or insights?
Sang: Yes, it was actually interesting. Obviously, the hope was that out of the hackathon, or the datathon, some new unicorn would come out, but that’s clearly not what happened. What ended up happening is that we found some very interesting insights and correlations in terms of human behavior, let’s say people’s propensity to save for retirement, or different population segments and their ability to stay healthy. It’s not like you could build a company out of those insights, but there were a lot of very important insights that I think would help, let’s say, policymakers gain ideas about what’s going on, and also the private sector, in how they can better help people.
Alexandra: Yes, absolutely. Since you describe how people save for retirement, that reminds me of a study where they found that people are much more likely to save for their retirement if you actually show them a picture of themselves photoshopped to look older. I think this exact finding was picked up by a startup that’s now offering an AI-based financial savings service, where you can regularly see and also talk to your older self, and therefore increase your likelihood of saving for retirement.
Sang: One thing I recall from that was that there was a certain amount of traveling people did by a certain age, and if they hit some threshold, like having traveled a certain amount by a certain age, their retirement funds were not great.
Alexandra: Okay. What was the exact correlation? Was it just too much traveling that consumed all your retirement savings?
Sang: Kind of, yes. Essentially, your propensity to spend was high as a result of traveling. There were some leading indicators of that.
Alexandra: Understood. Of course, it’s a difficult choice to make while you’re young.
Sang: Oh yes. There are no right or wrong answers. Everybody decides how they want to live their life.
Alexandra: Absolutely. I think that’s always the best choice. What are your predictions for the tech sector in 2021 and beyond?
Sang: Oh, wow. This part maybe you don’t want to keep in the podcast, because this is how I get caned later on: “Sang predicted that, and it never happened.” Predictions. We’re at a very interesting point right now. Continuing on what I said earlier, gaming human psychology and making money through apps that make you tap and click here and suck you in through addiction, I think that is last decade. As I was saying, more and more technology is going into impact, whether it be through sustainability or life sciences. I do think the biotechnology and genomics areas are going to grow a lot. In fact, it may be where sustainability was two years ago, when people were just getting to know it and it was really explosive; over the last two years it just exploded, in terms of funding at least.
If anything, COVID has shown that if enough money and smart people are put onto a task, amazing things happen. We got messenger RNA vaccines in a year. That’s just crazy, just unheard of. In textbooks 100 years from now, when they look back, they’re going to remember this as that moment in humanity when we went, “Wow, if we work together on something, we can literally do amazing things.” I think it showed, and that’s why I think with biotechnology and genomics we’re going to find more cures and see more money pouring in over the coming years. It’s going to be very multidisciplinary.
The other thing COVID has shown is that it’s not just biologists and medicine and medical people and doctors; it’s machine learning experts working alongside the traditional disciplines. It’s the data, the huge open databases that now house all this information that people are using to do research. A very collaborative, multidisciplinary platform has opened up. People’s minds have opened too: what they thought was impossible, they now think is possible.
I think for those reasons, that area is definitely going to be an area of interest. Then there are the usual ones, like FinTech, which is democratizing finance for people as they get digitized, particularly in developing nations. AI is going to continue to do what it does; that’s something we have to monitor closely as it evolves. Then you have fringe things like space and quantum, a little further down the line, that we’ve got to keep our eyes on as well.
Alexandra: Definitely interesting. You mentioned you’re not sure if these predictions will come true, but I definitely hope they do, because I think there’s so much potential. If we really manage to work together and create technology that serves us, rather than keeping us in front of our screens for hours on social media, then I think we can really benefit from this on a global and humanitarian level. Let’s hope things continue in that direction.
What are your three best pieces of advice for people or companies that are looking into leveraging data, and privacy, for their success?
Sang: It’s super important to be transparent. This whole “I’m going to hide it away in small text” approach is gone, especially now that we have Netflix documentaries talking about it. That’s yesteryear. The more upfront you can be, and the more transparent and clear about why whatever product or service you’re providing needs certain things for certain reasons, the more you enable trust with people, which right now is broken. That trust is a competitive advantage these days.
The other thing is, rather than being so protective of the data, think about software development. It used to be the case that you had proprietary software, you put it into a vault, like the Windows code, and nobody could touch it. It was yours. You sold it, made money off it, licensed it, and whatnot. That got disrupted by open source. Now it’s all about open source: the more people using it, the more secure it actually is, like Signal, and the bigger the market opportunities because of the large user base. These types of things, I think, will apply to data as well. It’ll carry over and spill into the data world, where it’s no longer the Facebook walled garden, “This is my data,” but more like, “Okay, we’re going to have open databases. We’re going to have open anonymized databases.”
Similar to open-source software, these open anonymized databases will provide new business opportunities, new capabilities, and insights that you wouldn’t otherwise be able to get. That’s the direction I think things should go, and probably will.
Alexandra: Absolutely. This is also one thing I’m particularly excited about: the European Union is looking into ways to make open data possible in a privacy-friendly way and thereby give small and medium enterprises and startups the possibility to innovate at a much larger scale, so that it’s not only the large corporations that have all the data and all the innovation.
Sang: Look, we’re living in a decentralizing moment in human history. Decentralized finance, and it’s not just finance. It’s everything, and it’s enabled by technology.
Alexandra: Absolutely. Well, I think we could continue this conversation for hours, but I don’t want to keep you from your weekend too long. Let’s move to the last section of our recording. We usually like to play this game with our guests at the end. Just answer the first thing that comes to your mind. Don’t think about it too long. Are you ready?
Sang: Sure.
Alexandra: Perfect. First question: rainbows or unicorns?
Sang: Rainbows.
Alexandra: Why rainbows?
Sang: You can never find the end of the rainbow. You’ll always be going and looking for it.
Alexandra: Keep moving and keep learning. Traditional banks or Fintechs?
Sang: Fintechs.
Alexandra: Why Fintechs?
Sang: I’m just not traditional, to begin with.
Alexandra: Valid argument.
Sang: Yes.
Alexandra: No worries. Global or local?
Sang: Both, global and local. Glocal. My answer will be glocal.
Alexandra: What do you mean by glocal?
Sang: I’m a glocal guy. A local, but global. Again, enabled through technology. I can be local, but global at the same time.
Alexandra: Yes, definitely. That’s also a benefit now, with all this technology, that we have the possibility to have conversations with people all around the world and to collaborate. It’s really impressive to see how the pandemic has actually boosted these capabilities and enabled so many more things. Apple or Google?
Sang: You know I got banned by Apple, right?
Alexandra: Okay, so not a fan?
Sang: I think neither.
[laughter]
Alexandra: Okay. Let’s leave it like that. Privacy or data-driven innovation?
Sang: Can those two coexist?
Alexandra: I think so. That was actually the answer I wanted to hear. Thank you so much, Sang. It was really a pleasure talking to you, and it was so great to hear all your stories.
Sang: Happy to be on board and happy to have shared some of my thoughts. Thank you to anybody who’s listening.
Alexandra: Absolutely. Thanks for listening. Have a great day.
Jeff: I really enjoyed today’s episode, and hope you did, too. Sang provided much food for thought. Alexandra, can we pull together some takeaways for our listeners?
Alexandra: Sure, Jeff. I’ve got three. First, be transparent about how you use data. Transparency is actually a competitive advantage in today’s market. Consumers expect their privacy to stay protected.
Second, join the open data movement. That is going to be the next big thing, following the open-source software trend. Open data will bring new business opportunities and valuable insights.
Third, be on the lookout for more startups and tech with a mission to positively impact society and the planet. Whether it be climate change or social good, we are only at the beginning of innovation. We’ll see a lot of advancements over the next 10 to 20 years.
Jeff: Boom. Thanks, Alexandra. Thank you, Sang. To our listeners, I hope you enjoyed the show.