originally published: 2024-08-20 15:43:21
Hessie Jones
Hi everyone. My name is Hessie Jones, and welcome to Tech Uncensored. This is the third and final day at Collision, and I always say the third day we reserve for the best and the last. So I'm here talking to Kabir Barday, who is the CEO of OneTrust, and we're going to be talking about accountability. We're going to be talking about trust. We're going to be talking about responsibility, especially in this age of ML and AI. So welcome.
Kabir Barday
Yeah, let’s do it. Thanks for having me.
Hessie Jones
Awesome. So why don't you tell us a little bit about OneTrust, and especially the milestones that you've recently hit?
Kabir Barday
Yeah. At OneTrust, we've pioneered a new category of software around the responsible use of data and AI and everything it takes to do that, and that includes transparently collecting data, simplifying and automating compliance with all the different regulations around that data, and then enforcing the policies when you're using the data. And so we've built a platform around that. We've grown incredibly quickly over the last many years, first fueled by privacy regulations and now by AI. We're approaching $500 million of ARR, which is a recent announcement we made, and we have 14,000 customers all over the world. Those are some of the recent milestones.
Hessie Jones
So it looks like privacy is good for your business, right? So from your perspective?
Kabir Barday
Privacy is good for the world.
Hessie Jones
Privacy is good for the world, but for a long time, privacy flew under the radar, as you know, and a lot of companies were doing amazing things. They were innovating. Privacy was, I would say, a hurdle or a stumbling block for many companies.
Kabir Barday
Yeah.
Hessie Jones
We are reaching a phase where it's almost a dichotomy between speed of innovation and responsible technology. Can you speak a little bit about how you've seen that shift over the years?
Kabir Barday
Yeah, it's been interesting. When we work with companies, our product helps them manage risk and use data, and you want to bridge those gaps. But the challenge in the market is that there are almost two sides to that equation. When I meet with companies, there are business teams, marketing teams, data people trying to use more data. Their jobs depend on innovating and using more data. And then you have all these risk teams: privacy, security, ethics, legal. It's like eight different risk teams now. Their job depends on supporting the business, but also minimizing risk. And all of these teams are under pressure now because of these new demands, and that pressure sometimes even puts those teams at odds. The good news is there is a common goal and a common question that is bringing these teams together, and it's: how do we future-proof our data across all these risk angles we care about, at the speed and volume demanded by our data and AI initiatives? We're seeing those teams coming together but needing something new to help bridge that gap, and that's the gap we solve with our software.
Hessie Jones
So we spoke earlier offline about the need for marketers, or maybe marketers realizing that there is a need, to look at privacy in a different way, because the advertising industry and marketers have been known to target, and maybe they haven't used data in the best way. I have to say that when they're doing it, they're doing it with the best intentions. They're trying to find increased revenue and relevant customers, but sometimes the targeting they're doing uses variables that are not necessarily things they should have access to. So what's your perspective on the modern-day marketer, and why is there a need to be, I would say, more responsible?
Kabir Barday
Yeah, I love the question, and it's so in tune with the moment, because of almost every company I meet with now. Three or four years ago, I'd show up and it would be the privacy and legal teams in the room, and now it's the marketing and data teams in every meeting leading the privacy charge. And what we're seeing is that every marketing team is really going through three phases. First, it's regulatory compliance. Second, it's first-party data capture. And third, it's AI enablement. When you start in that first phase of regulatory compliance, these marketing teams are really thinking about cookies and cookie compliance, because that's the first thing the regulators are going to enforce. So great, that's usually where they start, but we know that's just the tip of the iceberg, and there are still lots of techniques being worked out in that world. But where companies shift is that marketing teams see reliance on third-party data is not their future. It's got to be first-party data. Your ability to capture first-party data is directly proportional to the trust you have with your customers, and that trust is activated through privacy and choice and control and transparency. So companies are deploying these technologies around consent and preference management. We're very well known and widely used for that as well. That's kind of the second phase. Now, the third phase is new with AI, and what marketing teams are realizing with AI is that it's driving a whole new urgency, to your point. And that's because there is no machine unlearning. When you put data into a machine learning algorithm, the data loss has happened. The issues have happened. You can't selectively remove it. That's different from a previous CDP or personalization project, where you can just delete the data from that environment. And regulators have figured this out. When I meet with regulators now, they're shifting their tool of choice from the fine to a data deletion order, and now an algorithmic deletion order, because deleting the data doesn't solve an issue in machine learning; the algorithm already learned it. You've got to delete the algorithm. And so marketing teams are seeing this existential issue to their marketing campaigns and trying to lay the foundation up front. And what's happening is that because global laws are everywhere now, there's a common playing field that everyone needs to comply with, and so now you can differentiate with privacy. In the past, when it was only certain states or certain countries with laws and it wasn't a level playing field, it was a problem. There's a second interesting concept, which is that this is why most companies were lobbying for privacy laws, not against them, because it's not fair for a Microsoft if they've got to comply with all these global laws and then some local company in the US doesn't. It's like Microsoft is lobbying for privacy law. And so it's really driving this interesting trend where if you can't beat them, you join them.
Hessie Jones
Well, I want to address a couple of things. So first off, you mentioned consent, and everybody would argue that in the age of advanced artificial intelligence, consent has gone out the door. We are now seeing a prevalence of data scraping. There is no consent for data scraping. There is no consent for data brokerages, and there's no provenance for when that data was actually received. So there are companies that are saying, maybe we go out there and we become the proxy for the consent mechanism, so that the consumer doesn't have to worry about where their data is and where they can control it, because from their perspective there is no more control. So I want your perspective on that.
Kabir Barday
Yeah, there are a lot of different concepts here, and I think this is an area where we're going to see a lot of interesting new business models and innovation. There have been companies that have tried to innovate in this model where you have almost a personal data vault for a consumer, and they put all their data in it. Those have been around for 10 years, and those businesses have gone nowhere. Now, will that change with AI? I think there are fundamental issues with that model, and there are issues with how companies compete with each other. These problems are very in-your-face with child consent as well. It's like, if you consent to use a service from company A, then company B, their direct competitor, gets that consent for free. So there are competitive issues and things that are preventing that market from surfacing. Now, more broadly, what companies are starting to realize is that for any data they have, whether it's scraped, whether it's collected first party, or whether it's from a third party, there's a new level of governance they need on that data, and that new level of governance is a purpose-specific attribute on the data. The industry used to talk about what's called data access governance, which just has to do with: is the data sensitive, and who needs to access it? Now there's a third dimension, and it's: do you have permission to use it for that purpose? And this is really overhauling how companies are thinking about data governance, to embed purpose specification in their data. So now, if a company goes and scrapes a bunch of data online, the question for that company is: that data you collected, how were the consumers behind that data able to give you the purpose specification, and what documentation do you have that proves you have the right legal basis to have it? This is something that companies are taking risks on today. And look, all companies take risks. Whether that's a smart risk to take or not is not for me to say. But what I would say is, if a company gets that wrong, the repercussions are dramatic and are going to be seen in the next few years, because of the repercussions of a data deletion order. Imagine if OpenAI has an enforcement action on one small thing they scraped that violates a regulation. For a tiny mistake that they make, a regulator can issue a data deletion order on their entire algorithm. That pretty much shuts down the entire general-purpose LLM. My prediction is that's going to happen in the next few years, and we're going to see significant disruption. Now, I don't think that's going to slow down companies' ability to use enterprise LLMs with their own proprietary data sets. I think that's the future. Companies that have proprietary, purpose-specific, consent-based data sets are going to win, and for companies that are doing things where they don't really have the right intellectual property, copyright, consent, and permissions: you're one data deletion order and one algorithmic deletion order away from being shut down. It's going to happen.
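(Editor's note: to make the purpose-specification idea concrete, here is a minimal sketch in Python of what a purpose attribute on a data record and a use-time check might look like. The field names and the may_use helper are hypothetical illustrations, not OneTrust's actual data model.)

```python
from dataclasses import dataclass, field

@dataclass
class DataRecord:
    """A record that carries its own governance metadata."""
    subject_id: str
    value: str
    is_sensitive: bool   # classic data access governance: sensitivity
    legal_basis: str     # e.g. "consent", documented at collection time
    allowed_purposes: set = field(default_factory=set)  # the new third dimension

def may_use(record: DataRecord, purpose: str) -> bool:
    # Access alone is not enough: the intended use must match a purpose
    # the data subject actually agreed to, backed by a legal basis.
    return record.legal_basis == "consent" and purpose in record.allowed_purposes

record = DataRecord(
    subject_id="u-123",
    value="jane@example.com",
    is_sensitive=True,
    legal_basis="consent",
    allowed_purposes={"order_fulfillment"},
)

print(may_use(record, "order_fulfillment"))  # True: a consented purpose
print(may_use(record, "model_training"))     # False: never consented for training
```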
Hessie Jones
It seems like you're saying companies like, I would say, Cohere, Anthropic with Claude, OpenAI, that are creating a lot of these general models, are probably the ones more at risk than the companies that are actually using the model to create sub-models. Is that right?
Kabir Barday
Right. You know, I think so, because the differentiation in AI, number one, comes from proprietary data, not the model itself, right? These models are going to be available everywhere, open source, so who has the most proprietary data? This is also interesting because privacy law and AI law are going to start overlapping with competition law. The companies that have the most data, the companies that have the best relationships with the first-party consumer, are going to be able to collect purpose-specific, consent-based data the best, and they're the only ones that are going to be able to collect it. And when these enforcement actions happen against the people that are collecting it illegally, you have the haves and the have-nots. Well, who are the big companies that have the most first-party data relationships? You could probably think of a few big platform companies that start to now have competition issues. So data is not just an ethical issue, a security issue, a fairness issue, and a consent issue; it's now a competition issue as well, and I don't even think we're scratching the surface yet on how hairy a problem this is going to be. That gets back to your original question, which is: what are companies doing today? Companies and marketers are realizing today that proprietary first-party data directly from their customers is the goal, and trust is the currency that allows you to exchange it. And so yeah, marketing teams, data teams, everybody's now upping their game and looking at privacy as a differentiator, not as a regulatory compliance issue.
Hessie Jones
That is so good to hear. I mean, I've seen it ever since the Privacy Commissioner told Facebook they had to change their privacy system in order for Canadians to actually go onto that platform. I think that was circa 2008 or 2010. It's actually changed so much since then, so thank you for that. OK, so let's shift gears a little bit. Let's talk about DORA, the Digital Operational Resilience Act. Can you, I guess in layman's terms, introduce what that means for financial institutions and their vendors?
Kabir Barday
Yeah. So DORA is an act around digital operational resilience that originated out of Europe and applies to financial institutions, but people interacting with financial institutions end up getting in scope indirectly. This is one of the biggest demands around third-party management that we're seeing with companies. A lot of a company's ability to be resilient operationally has to do with: who are all your third parties, who are all your suppliers, and do you have issues with them? If there are geopolitical issues in a certain region, if there are even climate-change issues that put a data center at risk, or even just security and privacy issues in that supply chain, and one of those entities goes down, what's your ability to stay in business? I mean, we learned this in the pandemic when businesses were shut down. DORA takes it a step further, rightfully so, and says it's not just your direct vendors, but the fourth parties as well. What starts to happen is you have a lot of vendors, let's say technology vendors that you're using, and maybe all of them are using the same back-end cloud, data, or security provider. You end up having a hidden concentration risk at a fourth party that DORA is now trying to bring forward. So those are some of the concepts in DORA. Now, more broadly, how this relates back to our AI topic is also interesting. When I meet with companies today, they'll say: I have two or three really big, really important AI projects, but we have our arms around them. We haven't gone live with them yet, and so our exposure is limited. What they really start to realize is they have 1,000-plus AI projects happening at their company that they don't realize are happening. It's coming through the software supply chain. The difference with AI versus the Internet or mobile as a new technology trend is that Internet and mobile took many years for technology companies to re-platform onto a new thing. AI is an API call. It took us a week to integrate AI into our application. And it's not a new vendor you signed on; it's the existing vendors for the company. So now you have thousands of vendors that just built AI into their products, and it's pumped into the supply chain of all these companies. And the difference is those vendors are being used directly by lines of business. It's like shadow IT, and those lines of business are putting data into it. That is a completely new angle that companies have no control over. So third-party management, not just DORA but third-party management in general, is a massive, massive trend right now when it comes to responsible data and AI, and it's an area we excel at.
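(Editor's note: as an illustration of the "AI is an API call" point, here is a hedged sketch in Python, using only the standard library, of how an existing vendor feature can start sending customer data to an AI provider in a handful of lines. The endpoint, model name, and response shape are hypothetical.)

```python
import json
import os
import urllib.request

def summarize_ticket(ticket_text: str) -> str:
    # A small feature bolted onto an existing SaaS product: from the customer's
    # side it is the same vendor, but their data now flows to a fourth-party
    # AI provider they never onboarded or reviewed.
    payload = json.dumps({
        "model": "example-llm",  # hypothetical model name
        "prompt": "Summarize this support ticket:\n" + ticket_text,
    }).encode("utf-8")
    request = urllib.request.Request(
        "https://api.example-ai-provider.com/v1/complete",  # hypothetical endpoint
        data=payload,
        headers={
            "Authorization": "Bearer " + os.environ["AI_API_KEY"],
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["text"]  # hypothetical response field
```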
Hessie Jones
I'm wondering, because of that, to what extent are you, the first-party company, liable for the breaches or the mishaps that happen down the line?
Kabir Barday
Yeah. So I'm a technologist, not a lawyer, but I do study these issues pretty heavily. Now, it depends. More broadly, this was one of the big shifts in GDPR: GDPR holds the controller, which is the company that's collecting and holding that data, responsible. So me as a company, I am responsible for what my vendors do. Now, if I want to shift that liability back to my vendors, I can try to do some of that in my contract with my vendor, but the regulation supersedes that and still holds me accountable. So I can try to say, hey, if I get in trouble, I'm going to put a $5,000,000 liability clause in my contract with my vendor and try to reclaim that from them. The regulator is still coming after me. My name is still going to be in the news. And if I want to drag my vendor through the mud, I can choose to do that or not, but the accountability is now on that first party. You used the word accountability to start; that's driving a new level of accountability, and a lot of these regulations are really accountability-based regulations trying to make the accountability clear. And they're trying to say you don't just need accountability, but you need to have the right people in your company accountable, in the right org chart where they can be accountable. GDPR and privacy law even go as far as being prescriptive on some of that.
Hessie Jones
I would love to speak to you more, but I'm getting the signal that you've got to get out of here. So I want to thank you so much for speaking to me. I think we're probably going to speak a little bit more offline, because I want more information from you. So everyone, thank you for joining us in this conversation.
Kabir Barday
We're getting signals, yeah. This was fun. Yeah.
Hessie Jones
Everyone, enjoy the rest of Collision, and this is Hessie Jones for Tech Uncensored.