GDPR + Data Management with Data Protection Expert David Clarke

GDPR and Data Management are key topics in today’s hi-tech world. The main issue we have is data privacy and how to navigate the abyss.

David Clarke shares so much value in this episode about GDPR and Data Management.

This is a must-listen for all executives and anyone wanting to understand the minefield of GDPR and Data Management.

It will dispel many of the myths around it, and some of the panic…

WARNING — AI Transcriptions May Cause Grammatically Correct People Serious Stress

In today’s episode, I’m interviewing David Clarke. He is a GDPR, data protection and cyber security expert, and the founder of the GDPR Technology Group. David has operated across FTSE 100s, SMEs and startups, within lots of different industries. It’s a very interesting conversation where we talk about data management, privacy, and all the key topics that anyone living in today’s society needs to think about. Especially if you’ve got kids, this is a really, really important episode.

Well, thanks for joining me, David.

David Clarke 1:14
Thank you. Pleasure to be here.

Nathaniel Schooler 1:16
So I know people are probably sick of the term GDPR. But, you know, that’s something that you’ve actually specialized in, and I think people would like to hear a bit more about it, because you’ve been focusing on that area for the past few years. And I know you’ve got a background in cyber security. And I think a lot of people are still kind of behind with GDPR.

David Clarke 1:41
They are, and there’s plenty of work to be done in many companies. And, you know, at the end of the day, I guess the key is in the title of the regulation. Data protection is looking after people’s data. And generally, to make that work, we do have to have a good basis in cyber security, because if we can’t put the locks on the door, there’s no point kind of talking about privacy. It’s almost like giving someone curtains when they don’t even have a window; you need the foundation there to make it work.

Nathaniel Schooler 2:12
Right. So where do you start?

David Clarke 2:18
Ideally, start at the top. We talk to the board or senior management to work out where the scope is and where the risks are. We then get agreement on where the risks are.

And then we put together a plan, starting with: where is your data?

And for many companies now, this is highly complex, because everyone’s using cloud services, and they may have a bit of legacy data. That could be anything that’s a few weeks old, in reality; it’s just maybe on a different system. And of course, how do you know how that data is being used? Who’s got access to it? Can you deliver the data subject rights? So we do this type of analysis.
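The “where is your data?” analysis David describes amounts to building a data inventory, similar in spirit to the GDPR’s records of processing. Here is a minimal sketch of one entry in Python; the field names are illustrative, not from any official template:

```python
# A minimal sketch of one data-inventory entry, the kind of row the
# "where is your data?" exercise produces. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    system: str                      # where the data lives (cloud, legacy, ...)
    data_categories: list            # what personal data is held
    purpose: str                     # why it is collected
    who_has_access: list             # roles, not individuals
    retention: str                   # how long it is kept
    can_fulfil_subject_rights: bool  # can access/erasure requests be delivered?

crm = ProcessingRecord(
    system="cloud CRM",
    data_categories=["name", "email"],
    purpose="customer support",
    who_has_access=["support team"],
    retention="2 years after last contact",
    can_fulfil_subject_rights=True,
)
print(crm.system)
```

In practice, every system a company uses gets a row like this, and the gaps (unknown retention, rights you can’t deliver) become the risk list that goes to the board.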

The GDPR talks about technical and organizational measures; I think there’s probably another layer in between, which is the operational measures. And we try and give a company a measure of that, because although you can’t really say this is at 50% or 60%, what you can do is say: we’ve made an improvement in the last three months, and we’ve now improved beyond where we were before. So it’s a relative measurement.

Nathaniel Schooler 3:31
Right. So it’s a bit like a due diligence procedure?

David Clarke 3:37
Actually, I think that’s a good way to describe it.

Having been on the wrong or right side of many big audits by the big audit companies, the questions are going to be similar. And, you know, can you answer the questions? And can you go down a number of levels? If you can fulfill that, you’ve done the best you possibly can. And I think that’s really all they’re asking for: that you understand it and you know what to do. Things do go wrong; I think it’s fully understood that things will sometimes go horribly wrong, but at least you understood what to do and how you control it.

Nathaniel Schooler 4:13
Yeah, it’s very difficult with things like data breaches and stuff like that. I mean, we hear about it all the time, with the big credit card companies and things like this. And it’s just like, well, whose responsibility is it? Say you are Sainsbury’s, or one of the big, big companies in the UK, and you have a data breach: surely it is your responsibility to tell the banks, and the banks need to lock it down from there, before something happens?

David Clarke 4:42
And I think that’s quite right. And actually, what the Information Commissioner’s Office says, and the GDPR says, is that if there is a risk to the data subjects, potentially in future, or there’s a detriment to them, you need to have done that risk assessment as part of the breach. And yesterday, or the day before, there was a great article; some company did a freedom of information request to the ICO on the data breaches that have been reported. And the figures were quite astounding. I think 90% didn’t know what the impact of the breach was on the data subjects, and 70% hadn’t filled the form out properly. I’m paraphrasing, but it was at that level.

So it’s not just reporting the data breach, it’s demonstrating how you handled it, because, as I said before, things will go wrong. But if you can demonstrate:- “Yes, we understand the risks, we know where our responsibility is, and we’re doing our best!” and mitigate it, you’re going to be in a good place. You know, in Singapore, they’re pretty strong on this type of data protection. And they often don’t fine companies for a data breach if they can actually demonstrate they did all the right things at the right time in the right way.

Nathaniel Schooler 5:56
Right. Yeah, it’s a very interesting one: who actually owns the data? The blockchain guys say:- “Come on, it’s about time you took control of owning your own data.” Well, actually, the horse has already bolted, as far as I’m concerned.

David Clarke 6:19
Absolutely. And actually, it’s that concept of owning data. When it comes to personal data, nobody really owns it other than the data subject; in using data, you just have the control of it, or maybe the right to use it for that period. So it’s a totally different concept. It’s not an entity’s data to do with as they wish; those days, as you can see from the Facebook and Google inquiries, are rapidly disappearing.

Nathaniel Schooler 6:45
Yeah, it’s something that I’m still sort of lost with, really, GDPR. I have a couple of websites, and I collect email addresses. But that’s all I really collect: email and name, sometimes not even the name. And it’s like, well, if I’ve got your email and you don’t like it, just unsubscribe. I was using a lot of LinkedIn emails, and I had a LinkedIn database of probably 9,000 or 10,000 contacts.

I was sending out a mail shot to them. And as soon as GDPR came along, I was like:- “You know what, I’m going to stop that altogether.” And then social media became more of my focus, even though you’re still giving away access to your data. But I’m in B2B, so I focus on business people; I’m not really sending someone an email and telling them to buy something. It’s quite different. For example, I bought some reading glasses a few days ago. And I don’t even think I subscribed to the updates from, I’m not going to mention the name, but a big retail company in the UK that deals with glasses. There’s only three or four, right? And the thing is, they began sending me emails:- “Oh, look at our contact lenses. Oh, here’s an Amazon voucher!” I opened one just to see if I could unsubscribe, and then I got bombarded with more emails. There isn’t an unsubscribe box on the email. In addition to that, I can’t even reply to the email, because it’s a no-reply email. How are you supposed to unsubscribe? That’s illegal, isn’t it?

David Clarke 7:10
Wow.

Pretty much. And I think the other thing is, it’s not just GDPR; there are other regulations around that. There’s a regulation called PECR that’s been around a long time, and that controls the electronic communication of data. So actually, it’s that one that says where and how you email and how you use electronic communication. And then GDPR kicks in on how you store and process that data behind the scenes. So actually, two levels of legislation, which makes it a little bit more complicated.

Yeah, and you’re quite right; most of these things should have an opt-out. There are ways: if you bought something from a company, there’s the concept of the soft opt-in. Or actually, maybe this was part of the contract; you know, when you do business with somebody, they say:- “Part of the contract is that I will tell you about my products.” It doesn’t all have to be based on the concept of consent. And I think that’s where a lot of companies went a bit crazy, especially on the 24th and 25th of May, when we all got inundated with a billion emails. I don’t know whether you got the emails; there were services that I buy off companies, and they said:- “If you don’t consent to the email, we won’t be able to bill you.” And I replied to them:- “You guys have gone kind of crazy; this is not what GDPR is about. GDPR is just to make sure data is safe, and it’s not exploited in a bad way. It’s not there to stop you doing business and receiving money and collecting invoices.”

Nathaniel Schooler 10:17
Right. So what are the sort of big myths around it, then, for people?

David Clarke 10:25
I think there are a number of myths, and everything has to be put in context. You know, you often see a lot of debate on LinkedIn, and sometimes it’s probably in the right direction, but without knowing the full context, you could be making a big mistake. I was doing a conference not long ago, where somebody said they had 4 million email addresses, and can they use them? And of course, the first question you should ask is:- “Are they existing customers?” Because that’s a different answer than if they scraped them off the internet, or bought them, or acquired them from another company, right? So the same data, different context, and a different way of handling it.

So it’s very contextual. I mean, I guess one of my favorite examples is, I don’t know if you remember, Jeremy Corbyn on the Virgin Trains, where he got his picture taken sitting on the floor, trying to prove the train was overcrowded. And Virgin then released pictures of the whole carriage showing it was pretty much empty, other than a couple of people. So they got taken to task by the ICO; they weren’t fined or anything, but the point made was they had the right to self-defense. But actually, the people on the train, their photos were then plastered all over the national newspapers. Their expectation was that these cameras are used for security, not that they’re going to be plastered all over the national press. So, you know, the same photo can have multiple different contexts and connotations and usages, and it’s really putting it in line with the context, which is why it’s difficult. So one scenario may look good; another one may look not so good.

Nathaniel Schooler 12:03
Right.

Yeah. But then it’s also back to the security of that data, isn’t it? Like you say, actually locking down that container, wherever it may be, and the transport of that data?

David Clarke 12:18
And it’s become, and in reality always has been, more than that: it’s who has access to that data, and for how long. The great example is, you know, a hospital worker inadvertently looks at the wrong patient’s medical details. Is that a breach? It probably is a breach. But then think about how you’ve got to fix this: when you have 10,000 records in 200 hospitals, with 100 staff, the data has hundreds of millions of combinations. So a 1% error in who gets to look at what could be millions of data points out. So although it sounds simple, technically it could be quite complex. And how do you manage that going forward? How do you know whether that person has changed roles and whether they still need access? And it’s making that as frictionless as possible. Which, I guess, sounds easy in 10 seconds, but the reality is a little bit tougher.
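The scale David gestures at can be made concrete with a quick back-of-the-envelope calculation, using his example figures of 10,000 records, 200 hospitals and 100 staff (the exact totals depend on how you count, so treat this as an order-of-magnitude sketch):

```python
# Back-of-the-envelope: every (record, hospital, staff member)
# combination is an access decision that can be got wrong.
records = 10_000
hospitals = 200
staff = 100

combinations = records * hospitals * staff
print(f"{combinations:,} access decisions")  # 200,000,000

# A 1% error rate in "who gets to look at what"...
error_rate = 0.01
exposed = int(combinations * error_rate)
print(f"{exposed:,} wrongly-scoped data points")  # 2,000,000
```

Even a tiny error rate multiplied across that many decisions produces millions of wrongly-scoped data points, which is why access management is the hard part of the problem.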

Nathaniel Schooler 13:13
Yeah, I mean, everything sounds easy until you really dig into it. I mean, if you’re documenting the process, it’s a massive task, isn’t it?

The record keeping of it is just massive. But it goes even further than this, it goes deeper than the sort of service level, because it’s actually everyone’s responsibility within that business, isn’t it?

David Clarke 13:40
It’s everyone’s responsibility. And I guess it also comes down to how we architect our systems and how we implement those systems for our staff and employees to use. Because quite often, it’s easy to tell them, you know, change your password, don’t leave the data lying around, don’t use USBs. But actually, if your database collection system has structures that say collect every piece of data on that person, that comes down to the database designer, and to whoever authorizes that data to be collected. Do you need to know what moon sign they were born under, for example? Maybe in some circumstances, but mostly the answer is no.

So the question would be:- “Why are you collecting it?”

I mean, then you come into the regulatory areas where there are regulatory requirements, say for companies in the UK dealing with the US, where they even have to collect people’s height and weight, which sounds crazy. When I first heard it, I thought it was madness, but financially, in the US, that was a requirement, so they had to do it. And because now you’ve got that data, you’ve got to manage it, you’ve got to look after it properly.

Nathaniel Schooler 14:46
Yeah, the amount of data that’s being created on everyone is just insane, isn’t it? I forget the figure; there’s some ridiculous figure, isn’t there, per day. But then it goes further if you think about the kind of video data that is taken on people. So, for example, you could be out in the street, and the government is looking at that data, and they’ve got artificial intelligence analyzing that video footage to make sure that you are not carrying a weapon, or walking in a way that looks like you’re carrying a weapon. So they are using this data for specific purposes. And the real issue is where it’s stored, and then the transport of it, and where it actually goes afterwards. And, like you say, how long they keep it for?

David Clarke 15:47
And how it’s used. Behavioral analytics is, you know, kind of a cool thing. But sometimes machines and computers make mistakes, and if you get profiled:- “Look, this guy looks like he’s carrying a huge weapon,” maybe it’s never used, but you get a mark against your name. And you’ll never know why, potentially.

You know, at my kids’ school, they don’t talk about having money for lunch; it’s “top up my thumb,” because they use their thumbprint to pay. You know, I’m not allowed in the school anymore; my wife won’t let me. Because I’m always asking:- “What are you doing with the data? How are you managing it? Are you going to delete it when they leave?”

And, you know, those questions are difficult to get a good answer to.

Nathaniel Schooler 16:24
Yeah, well, it’s sort of edging into the territory of Skynet, isn’t it?

Isn’t it? I mean, if you look at the new AI-powered drones in warfare and stuff like this, we are creating a huge security risk. In essence, I mean, I’ve just looked you up with this tool that I use. I just looked you up on LinkedIn just before. Here we go, it’s thinking now, and it basically tells me. It says:- “David is primarily driven by logic and efficiency, likely brings most conversations to a quick rational conclusion.”

It says you’re a skeptic, you like critique, and you avoid emotional decisions. And it’s telling me how I can approach you. It says:- “David tends to be an objective thinker who prioritises accuracy and results. He will likely pay attention to small details,” which I knew anyway, I could see that straight away. You don’t need to be a rocket scientist.

Yeah, and take a systematic approach to solving problems.

David Clarke 17:55
Yeah.

Nathaniel Schooler 17:55
Right. So, you know, it says:- “It comes naturally for David to highly value accuracy and question inefficient practices.” Yeah, and it emphasises the importance of quality. Right.

David Clarke 18:09
And all this just from your LinkedIn profile? Wow.

Nathaniel Schooler 18:12
And it goes even further. It’s amazing. But then the thing is that you share that information with the internet. And it’s a very strange world whereby people, I think, are only just starting to realize that the moment they go onto a website, they’re starting to be followed around the internet.

David Clarke 18:38
Absolutely, absolutely.

Nathaniel Schooler 18:39
And they’re still lost; they still don’t realize. The only time they actually get upset about that is when it’s their kids being followed around the internet by bots that are trying to work…

David Clarke 18:51
Absolutely. And that is kind of, you know, the next generation of regulation that’s coming in now: to protect vulnerable adults and children, really, and that probably needs to be done. And if you noticed, two weeks ago, YouTube and a couple of other companies, TikTok, totally changed how they deal with under-13s and stopped a lot of, you know, connectivity, following, comments, that type of stuff. Because it was, I guess, getting into the wrong hands, to put it mildly, and being exploited in, you know, horrible and terrible potential ways.

Nathaniel Schooler 19:28
Yeah. Well, I mean, it has, and it kind of opened up the realms of it being quite easy to do things to vulnerable people. For people who want to do that, it was very, very easy. It’s easier than it ever has been, because of social media. I mean, it’s very easy to contact someone: if you know their name, you might know their phone number, and then you can find them on social media. But the thing is, that said, it’s up to us as parents to educate our children to the point of knowing.

David Clarke 20:11
It is, but how much education can you do that’s so effective? Because, you know, I mean, I’ve got three children, and the biggest problem I had at home was:- “How do you control the internet?” It was really tough, having run huge security networks trading trillions of dollars a day.

And then you come home and you find you’ve actually got no control whatsoever, right? Very difficult. I managed to achieve a level of, in inverted commas, compliance at home, but it wasn’t easy. You know, giving everybody a username and password just doesn’t work at home, because everyone shares it; at work, you can enforce it. It does become very difficult. And then what do you do with the mobile phones, which are much more difficult to control? What do you do when they’re around friends’ houses?

It’s a hard problem to crack, and that is something that, you know, there are companies working on, and I’ve had some involvement with that, because it is such a challenge. Because, you know, I’ve not met a parent who says to me there isn’t a problem; but I’m sure it is a problem, because they just don’t know. And I’m not sure that I know, either.

Nathaniel Schooler 21:22
No, but the thing is, at least children are a lot wiser than they were in our day.

Right.

David Clarke 21:26
Yes, they are way more clued up, you know. I talk to my son; he was a big gamer. I’m always saying:- “How do you know these guys you’re playing with aren’t some kind of strange person?” And he said:- “I only talk about the game. If they talk about anything else, I just turn the mic off. I’m not interested in talking about anything but the game.”

But, you know, he may well be an exception. I don’t know how other kids may engage. But he kind of decided a long time ago that he was only going to chat about the game, and that’s it.

Nathaniel Schooler 22:02
Right. So, as far as security and GDPR and this sort of thing go… excuse me.

Oh, man. You know, like with the internet, surely it should be down to the internet service providers to actually do something about internet security. They should take more responsibility, as far as I’m concerned.

David Clarke 22:30
And I think you’re right there. And I think there is definitely a sort of groundswell now that corporate responsibility has to be taken a little bit more seriously.

You know, if you buy a mobile phone for your kids, for example, it has everything on it, and it’s very difficult to have any control beyond a cap on the bill and maybe some kind of very lightweight safe-internet filtering. You can’t monitor the communications, you can’t monitor for cyber bullying, any of those types of things. And of course, they won’t necessarily acknowledge that that device is used by a child, because you pay the bill as an adult; the child doesn’t pay the bill. So that type of stuff is definitely coming in; there is a lot of work going on in the background on that, and luckily, I’ve had some involvement with it, which is really interesting. And just to go back to another point you brought up earlier, about how you stopped emailing and sort of moved to social media: when you think about it, the social media you use is probably just a handful of platforms, you know, controlled by non-UK companies. So what we’ve done, inadvertently, is we’ve made legislation to protect people that has actually driven us to third-party tools that quite often are not even European-controlled, just to maintain business. Which sounds a bit weird now, really.

Nathaniel Schooler 24:01
Yeah, it’s very difficult to keep track of what’s going on. I mean, you connect your accounts to all sorts of things. And if you don’t set up the right protocols, you know, like third-party authentication and all these things, it’s kind of your responsibility, isn’t it, sort of locking down a lot of apps?

David Clarke 24:51
Yeah, and I guess, you know, social media has had a really good run, and it’s probably got a bit out of control for many companies and people. You know, so it’s a different world, and we’ve had to sort of work out: how does this work? And how do we stop bad things happening? And where does the line between stopping bad things happening and government interference start?

It’s a very good point. We want the best of both worlds.

Nathaniel Schooler 25:16
Yeah, I mean, we don’t want to stifle innovation, because we want continued growth, right. But then we want to protect people. So it’s a difficult one.

David Clarke 25:24
It is, and I think if you look at the history of regulation generally, it’s been a good thing. If you look at airplanes in the US, they were first regulated in 1923, because the death rate was too high. Look at travel now; it’s grown phenomenally, because it was regulated and it’s safe.

Food standards, same thing. Cars in the 70s: there was a risk calculation done in the 70s that said a life is worth, and I’m making these figures up, $50,000 or whatever, and they could afford to lose a lot of lives, because they make hundreds and hundreds of millions of dollars. That type of calculation is gone now; you are not allowed to write off the value of a human life against profit and the cost of doing business. Cars have to be safe, and cars are way, way safer now. You know, the concept of safety belts: 20 or 30 years ago, people thought, “You know what, I don’t need to worry about that.” That’s not really your decision anymore; if you get caught without one, you’re going to get fined, and if you don’t wear one, you can have a dreadful accident. So it’s not really a risk decision that people make anymore. The same is probably going to come to information security: certain things will have to be done, and that way the rate of breach will potentially go down.

Nathaniel Schooler 26:41
The rate of breach.

David Clarke 26:42
Yeah, it’s every day, multiple times every day now.

Big companies.

Nathaniel Schooler 26:47
Yeah. But I still go back to the point: the internet needs to actually take responsibility. You know, for instance, say you’re on Virgin, or on a specific internet service provider like BT or Verizon or whatever; they need to take responsibility and actually lock down their network.

David Clarke 27:19
And that may be the kind of direction that things need to go. How do we get that responsibility? I think that’s going to be a government initiative. And, you know, it may come to that.

Nathaniel Schooler 27:32
Right. But then it may stifle certain industries. Everyone in the tech world wants freedom, right? But at what cost?

David Clarke 27:44
Absolutely, and I think maybe that cost is getting too high now for many people.

Nathaniel Schooler 27:50
Really?

David Clarke 27:52
Yeah, the exploitation. I read somewhere about how Facebook was used in the Trump election; I can’t remember the volume of messages that were targeted to individual profiles, but I think it was in the hundreds of thousands. It wasn’t like:- “Vote for this party,” or whatever. It was a message tailored for micro-niches of, you know, two or three people.

Nathaniel Schooler 28:15
I know. Isn’t it so strange, though, how nothing’s been done about it? I find it particularly strange. Also, you know, whether you want Brexit or you don’t want Brexit is actually irrelevant now, because it’s happening, right? But the point I’m making is: how did we get to that point? Because people worked out how to use marketing more effectively. And I wrote an article about it, you know, three or four days after, and no one.

David Clarke 28:51
Really? Yeah.

Nathaniel Schooler 28:51
No one, no one really batted an eyelid.

David Clarke 28:53
But interesting. Yeah.

Nathaniel Schooler 28:55
The thing is, the company behind it, Cambridge Analytica, they were set up from the beginning to fall. Yeah, I mean, someone goes to a meeting with someone who’s an undercover reporter, not knowing that they’re an undercover reporter; you must be some kind of an idiot to sit there and go through all that. And it takes you back to the conversation of: well, a breach happened, people got hold of people’s data, but actually, it wasn’t a breach at all, because that data was freely given away!

You know, and that goes back to the whole ethics and corporate social responsibility of the social media networks themselves. And, actually, there’s a new blockchain-powered one coming quite soon, called Howdo, and on that one, you actually get to be in control of your data.

David Clarke 29:48
Interesting.

Nathaniel Schooler 29:49
And how much data you give away, or how little. But you will also be paid 50% of the ad revenue from all ads that sell through your account.

David Clarke 30:00
Interesting model.

Nathaniel Schooler 30:01
Cool, right.

David Clarke 30:02
That’s a very interesting one, yeah. And I think there’s the other thing, isn’t there?

They were kind of talking about the concept of permanence. The idea is that stuff you post on social media will have a limited life; it won’t be there forever. Because that’s the other problem with writing on social media: it could be there, you know, till kingdom come.

Nathaniel Schooler 30:25
Well, that’s why I sort of caution people about social media. We all make mistakes, don’t we?

But, you know, I think when emotions are high, it’s very easy to do things. But then we’ve got this other conversation: freedom of speech. Like, what happened to freedom of speech? We sit back, and it’s almost like we’re afraid to speak about things these days, because:- “Oh, that’s not politically correct.” Or:- “I can’t say that, because someone might think that I’m being racist!”

I’m not racist at all, in any way, shape, or form.

But, you know, it’s a funny world whereby people can get banned from social media platforms for speaking about certain things, when actually they’re just exercising freedom of speech.

David Clarke 31:25
I guess that’s kind of getting the balance right, because of the power of many of these entities. You know, unfortunately, it’s the Spider-Man thing: with great power comes great responsibility. And how do you manage that? We’ve probably not seen it on such a scale before.

Yeah, I mean, it is difficult, you know, because social media is really a way of doing business now as well.

Nathaniel Schooler 31:51
Yes.

David Clarke 31:52
You know, most companies have a massive social media presence and rely on it all the time. It’s necessary for them, absolutely. But how do you manage it and make sure that it doesn’t go into the dark places?

Nathaniel Schooler 32:10
Yes, this is the problem, isn’t it? I mean, I would rather have public discussions than have them go behind closed doors, because that’s when the extremists cause issues. If those conversations were public, we could have a good old argument with them publicly.

Or with the people who like to troll others, because there are many of them; then it would be slightly different, wouldn’t it? Because you don’t want to drive it behind closed doors, at the end of the day.

David Clarke 32:42
Yeah, I think there are also two sides of that: there’s the kind of political side, and there’s also the commercial side. But actually, you know, your data can be exploited both ways.

Nathaniel Schooler 32:54
Right.

David Clarke 32:57
And I don’t know if you’ve ever watched the TV ads where they’re selling some kind of cool gizmo, and, you know, by the end of it, you really want it, even if you have no use for it whatsoever. Because they tune those adverts: they watch every minute whether there’s been the right number of calls, and they modify and modify it, so it becomes irresistible. And the same happens on the internet. So you come across these word phrases that are particularly chosen for you. And, as you said, that light profile you found about me showed how I would respond, and you can customize it for each individual. So you’ve actually got a direct message, and it becomes very irresistible.

Nathaniel Schooler 33:35
Yeah, it does. But then, isn’t that also partly our responsibility, to educate the people we know to understand that? I mean, we ignore thousands, maybe tens of thousands, of marketing messages a day when we go out into the streets. But at the end of the day, you can’t have it both ways. You can’t have free TV without adverts. You can’t have free content without adverts. Consumers need to understand that nothing is free. If you’re clicking through something without reading it, fine, that’s your mistake for not reading it, but you are giving away your data. And if you need to, you just need to close your accounts.

David Clarke 34:23
I mean, I don’t know if you look at Netflix at all? Netflix, they reckon, makes an extra billion a year by profiling everybody for their likes and dislikes. I don’t look at it that often, but when I do, I love it. Because as soon as you finish one film, they’ve got another film that you just know you’ve got to watch!

Yeah.

Nathaniel Schooler 34:42
Right. But did you know that that’s based on only about ten clicks at the beginning, when you first set up your account? It’s a very clever recommender system. I did a little bit of study into machine learning and the recommender systems which sit behind all those things, and it’s super interesting, because they basically ask you certain questions, and as soon as they get to ten data points, they’ve got enough information. And then it gets better and better as it goes along. It’s very interesting.
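The idea behind those recommender systems can be sketched very simply. This is a toy illustration only, with invented user names and film ratings, not Netflix’s actual algorithm: with just a handful of ratings, a new viewer is matched to the most similar existing viewer, whose highest-rated unseen film becomes the recommendation.

```python
# Toy user-based recommender: NOT Netflix's real system, just an
# illustration of how a handful of ratings is enough to start
# matching a new viewer to similar existing viewers.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two {film: rating} dicts."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[f] * b[f] for f in shared)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

def recommend(new_user, users, top=1):
    """Suggest films the most similar user rated that new_user hasn't seen."""
    best = max(users, key=lambda name: cosine(new_user, users[name]))
    unseen = {f: r for f, r in users[best].items() if f not in new_user}
    return sorted(unseen, key=unseen.get, reverse=True)[:top]

users = {
    "alice": {"Heat": 5, "Se7en": 4, "Casino": 5, "Up": 1},
    "bob":   {"Up": 5, "Coco": 5, "Heat": 1},
}
# Two ratings already make the new viewer look like alice:
print(recommend({"Heat": 4, "Se7en": 5}, users))  # ['Casino']
```

Real systems use far richer models, but the principle is the same: a few data points are enough to place you near similar users, and every further click sharpens the match.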

David Clarke 35:16
That’s amazing.

Nathaniel Schooler 35:17
Yes, it is. And you are paying Netflix so you don’t have adverts, you see? It’s quite an interesting topic, really. So, back to GDPR. What do you think organizations really need to do?

David Clarke 35:41
I think you can probably boil it down to a couple of things. One of the first is that breaches are inevitable, unfortunately. But if you’ve got a great process for how you’re going to manage and deliver on one, you can still look good when bad things have happened. And for me, that’s something pretty much every company can do: having a good breach process, a good escalation model, and just taking the bull by the horns. The outcomes are generally way, way better if you have that in place and you can demonstrate you’ve got it in place.

And the other thing, and I know this is really tough, is to be able to deliver the data subject rights, and that is directly dependent on knowing where your data is and what the security levels are on that data. But that’s an ongoing process; it can be started and it can evolve. It will probably take companies quite a few years to get to a very safe level, but the journey can begin, and you can demonstrate that journey. You can show anybody that the intention of the business is to do good things with the data, and not to treat it badly, exploit it, give it away, and leave ourselves open to unnecessary breaches, should I say.

Nathaniel Schooler 37:09
Right. So, data subject rights, can you explain a bit more what that means? I was probably looking a bit confused there!

David Clarke 37:16
Sorry.

Nathaniel Schooler 37:17
It’s alright. That’s why I ask questions.

David Clarke 37:21
The data subject is, as you were talking about earlier, you, when you signed up for an email newsletter. You have rights on that: you’ve given consent, and you should be able to withdraw consent as easily as you gave it. You should also be able to know what else they’re doing with your data. Are they profiling you? Is the data stored in the EU, or is it stored somewhere else in the world? You have the right to know where it is. And also what their retention period is: how long do they intend to keep it? Are you on that email list for life, or do they review it every so often, delete people, or ask for your permission again? You know, consent is not forever. So all these rights are there, and via a subject access request you can ask for this information and ask them to deliver what information they have about you. Obviously, in certain areas they can’t give everything; if it’s a regulated financial firm, certain information cannot be divulged. But a lot can be. Employees have rights too, and a lot of their data can be sent back to the data subject to show what’s held on them.

And if it’s wrong, you have the right for it to be rectified. The ones I always get asked about are the right to be forgotten, stroke, the right to deletion, stroke, erasure, but really they’re two different things. If you look at the cases with Google, they were actually about the right to be unlinked from the search engine, because the data was stored on somebody else’s website. So you’d have to go to them if you wanted it deleted, not Google; all Google can do, for example, is make sure you won’t find it in the search engine. So that’s more of a right to be forgotten: it’s not easily findable. Although, funnily enough, I think one of the people who actually challenged it and won is now even more famous for his challenge and the right to be forgotten than for the original data.

And of course, with the right to erasure, it’s very difficult in a digital world to delete stuff. I mean, for companies that back up every 15 minutes online to a cloud service, you can have 30,000 copies of their data within a couple of years, which is very difficult to work out how to delete. That means you’ve actually got to work out your backup procedure: can you do it in such a way that it is possible, in the future, to delete the data? And if you can’t delete it, can you do virtual deletion?
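A quick back-of-the-envelope check of that figure, assuming one incremental snapshot every 15 minutes as described:

```python
# Back-of-the-envelope: how many snapshot copies pile up when a
# company backs up every 15 minutes, as mentioned above.
snapshots_per_day = (24 * 60) // 15        # 96 snapshots a day

for days in (30, 365, 730):
    print(days, "days ->", days * snapshots_per_day, "copies")
# A single year is already ~35,000 snapshots, so "delete this record"
# can mean touching tens of thousands of backup copies.
```

The exact count depends on retention and pruning policy, but the order of magnitude holds: tens of thousands of copies accumulate well within two years.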

Then there’s a discussion about what deletion actually is. You know, if I delete a file off my laptop, the data is not actually deleted; just the pathway to where that data is has been deleted. So what is true deletion? Are we talking military grade, where you encrypt it, overwrite it, encrypt it again, and delete it twenty times?

Or is it actually the concept of virtual deletion, where nobody has access to it, or if they do, it’s just so difficult that it’s as good as gone?

Nathaniel Schooler 40:17
Right. So it’s basically changed in its form, and no one can actually read it; is that what you’re saying?

Yeah, excuse me. It’s quite interesting, really, because there are all these different departments, aren’t there, within businesses and enterprises. Why would someone in one department need to know everything about someone? That data needs to be specifically boxed up, and access to it needs to be specifically given, doesn’t it?

David Clarke 40:24
That’s the ideal world. I think the reality is, if you can do a virtual deletion, where you can say that no one can look at this piece of data, and if someone does have to look at it, you need four eyes or six eyes, meaning two, three, or four people to authorize it. So it can’t be that just because you’ve got technical access, you can look at it; you actually use a system that will enforce that. You need multiple authorities to get to that data.
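The “four eyes” rule David describes can be sketched as a simple access gate. This is an illustration under assumed names, not any specific product: a restricted record refuses to be read until enough independent people have signed off.

```python
# Sketch of a "four eyes" access gate (illustration only): a restricted
# record can only be read once a minimum number of independent people
# have signed off, regardless of who has technical access.
class RestrictedRecord:
    def __init__(self, data, required_approvals=2):  # 2 approvers = "four eyes"
        self._data = data
        self.required = required_approvals
        self.approvers = set()

    def approve(self, person):
        self.approvers.add(person)

    def read(self, requester):
        # The requester's own sign-off never counts towards the quorum.
        quorum = self.approvers - {requester}
        if len(quorum) < self.required:
            raise PermissionError("not enough independent approvals")
        return self._data

record = RestrictedRecord("salary history", required_approvals=2)
record.approve("dpo")
record.approve("line_manager")
print(record.read("analyst"))  # granted: two independent sign-offs
```

In production this logic would live in an access-control layer that also logs every approval and read, so the controls can be demonstrated, not just asserted.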

Sure. And Germany is way ahead of most practices around Europe: even if you want to share data with a different department, you have to make sure you’ve got a proper provisioning system, or you can get into trouble. And you’re quite right. I guess we’re so used to setting up IT systems that pretty much allowed access from most places, and it’s a question of how we gain that control back and verify it. It’s easy when you’ve got ten people in the company; it’s a million times harder when you’ve got 10,000 people, and they all have jobs to do and things to deliver. But it’s not an excuse. There’s a thing on the Information Commissioner’s Office website about subject access requests. It says there, and I’m paraphrasing, what if I get too many and I can’t handle it? Their answer is that you’ve had about twenty years to get ready, because that kind of requirement is not new.

Nathaniel Schooler 42:05
Right. So basically everything needs to be an automated, seamless process. If someone wants their data, and they have a right to have that data deleted, they should be able to just click a button: firstly, request the data that’s held on them, and then say, you must delete this, I don’t want you holding it.

David Clarke 42:25
Absolutely. I mean, remember there are exceptions. There are liabilities if you’ve sold products or services, there’s financial regulatory stuff, and you may have to keep that data for much longer periods. Some parts of pensions data have to be kept until you’re 100, or equivalent. So it applies to a lot of data, but it doesn’t necessarily apply to it all. As you said, most of it is probably unnecessarily kept. But then, if you work in IT, you know the one thing that always goes wrong is that the moment you delete something, someone comes running around because you shouldn’t have deleted it. And because storage is now so cheap, we’ve not really worried about deleting data. Actually, deleting data is a real art now.

Nathaniel Schooler 43:11
Yeah, it’s just managing that, really, isn’t it? You know that if you delete it, you’re probably going to need it anyway. So you may as well keep it, and as long as you’re keeping it safely, and no one who shouldn’t have access to it is actually getting access to it, then you’re showing diligence, right?

David Clarke 43:33
I mean, strictly speaking, that isn’t in line with the GDPR: if you don’t need it, you should get rid of it. When we work with companies, what we recommend is a very structured deletion process. So if something is marked for deletion, it doesn’t actually get deleted for maybe nine months or a year, with a lot of processes and checks in order, just to check that someone in the company didn’t have a good reason why we needed to keep it.

So that’s really interesting: yes, this is the process, and it needs to go through it. In a large company, it can take months to get sign-off from risk, compliance, legal, and marketing, just to make sure that this data actually is okay for deletion. But in the meantime, it can be put on restriction, which is the next biggest problem after deletion. If you can put it on restriction, that’s probably a huge leap forward towards saying it’s ready for deletion.
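The structured process described here, mark for deletion, restrict immediately, purge only after a hold period, might be sketched like this (hypothetical names, and a nine-month hold assumed):

```python
# Sketch of a structured deletion process (hypothetical names, nine-month
# hold assumed): a record marked for deletion is restricted at once and
# only purged after the hold, giving risk, compliance, and legal time to
# object before the data is actually gone.
from datetime import date, timedelta

HOLD = timedelta(days=270)  # roughly nine months

class Record:
    def __init__(self, data):
        self.data = data
        self.restricted_since = None  # None = normal, readable record

    def mark_for_deletion(self, today):
        self.restricted_since = today  # restriction starts immediately

    def read(self):
        if self.restricted_since is not None:
            raise PermissionError("record restricted pending deletion")
        return self.data

    def purge_due(self, today):
        return (self.restricted_since is not None
                and today - self.restricted_since >= HOLD)

r = Record("old mailing-list entry")
r.mark_for_deletion(date(2019, 1, 1))
print(r.purge_due(date(2019, 6, 1)))   # False: still inside the hold window
print(r.purge_due(date(2019, 10, 1)))  # True: eligible for actual purge
```

The design choice is that restriction is instant while physical purging is slow and reviewable, which matches the point above: restriction is the practical leap forward even before true deletion happens.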

Nathaniel Schooler 44:30
Right. But social media outlets, they just keep it just because they can.

David Clarke 44:38
Absolutely. And I think that’s probably got to change sooner or later. And I think that’s Zuckerberg’s idea of impermanence, really, so that it can be back in the data subject’s control. Stuff you posted ten years ago of your party night out can be got rid of if needed. It doesn’t have to stay there.

Nathaniel Schooler 44:58
Yeah, it’s a whole can of worms, isn’t it?

Well, thank you.

I’ve really enjoyed talking to you. Really interesting; a real education, actually. And I think we should perhaps follow up and do another one.

David Clarke 45:16
Good idea.

Nathaniel Schooler 45:17
Because there’s so much that I didn’t realize. I just had a look at your profile, and there’s just so much stuff that you know about.

David Clarke 45:28
Thank you.

Nathaniel Schooler 45:30
Yeah, it’s pretty serious stuff, really, and quite exciting, actually. My brother works for Cisco, you know, so security was always on my mind, but I don’t think until you’ve actually had someone hack you and damage your stuff that you appreciate what it’s like to actually have a problem.

David Clarke 45:52
Absolutely. And the reality is, unfortunately, most of it isn’t like on TV; quite often you don’t even know you’ve been hacked.

Nathaniel Schooler 46:01
Exactly.

David Clarke 46:02
You know, you’ve only got a symptom, and actually you don’t know the scope, scale, or size of the problem. You’ve got to go: what are we going to do? We actually have no idea what’s going on. You’ve then got to whittle that down so you have at least a vague idea of what’s going on, and it could take months, maybe even longer.

Nathaniel Schooler 46:20
Oh, yeah. You just don’t know what’s going on; if you’re not a programmer, it’s very difficult. That’s why having good security software is important, whatever it might be, and updating it as often as possible. And just don’t annoy anyone who’s a really good hacker!

David Clarke 46:42
I don’t know if you saw that story about some car security product. I guess it was taken as a challenge, as these things often are: on their website they said the car security was unhackable, and within three days they got told: actually, it’s not.

Nathaniel Schooler 46:59
Oh, dear. We’ve got so many risks coming up with the Internet of Things. There’s the next level of security coming, you know.

David Clarke 47:10
Oh absolutely, yeah!

Nathaniel Schooler 47:11
It’s using machine learning to actually work out what’s going on on the network. And I think that’s most exciting, actually.

But it’s exciting, right?

David Clarke 47:23
And it means that level.

Nathaniel Schooler 47:24
Yeah, yeah.

Yeah, I think we could talk about that, and I’d like to; that’s one for another time, right!

David Clarke 47:44
Well, I appreciate it. Thank you so much. It’s been fun.

Nathaniel Schooler 47:47
Yeah, thank you.

Unknown Speaker 47:52
Thanks so much for listening. Please subscribe wherever you prefer and share with your friends. And if you enjoyed the show, drop a review on iTunes or wherever you listen.