The Evolving Landscape of Digital Security – [FULL INTERVIEW]

Kevin Prone from FourNet, together with Kevin Still, discusses the escalating threats in cybersecurity, highlighting the rapid advancement and sophistication of cyber-attacks, especially in light of recent events like the COVID-19 pandemic which accelerated digital transformation.

They discuss the challenges of ensuring digital security in this fast-evolving landscape, including the implications for both corporations and consumers, the importance of multi-factor authentication, governance frameworks, and the potential need for more in-person interactions to establish trust in the digital age.

Find out more about FourNet -> Here.

Key Points

  • Cybersecurity threats have significantly escalated in recent years.
  • The COVID-19 pandemic has accelerated digital transformation, increasing vulnerabilities.
  • Ransomware attacks, including on major institutions like the British Library, have become more common.
  • Nation-state attacks are becoming monetized and targeting corporations.
  • The adoption of AI and machine learning is both a boon and a bane for cybersecurity.
  • The discussion highlights the importance of multi-factor authentication as a defense mechanism.
  • Digital transformation has led to a broadened attack surface, with more devices and networks at risk.
  • The regulatory landscape around cybersecurity is tightening, with increasing liability for breaches.
  • Consumer education on cybersecurity risks and best practices is lacking but crucial.
  • Businesses need robust governance frameworks to manage cybersecurity risks.
  • Cybersecurity insurance is becoming more expensive and selective.
  • There’s a potential shift towards more in-person interactions to verify identities and establish trust.

Key Statistics

  • Digital transformation equivalent to 5 to 7 years has been compressed into a matter of weeks or months due to COVID-19.
  • A significant proportion of cyber attacks (75%) originate from inside the network.
  • Cyber insurance premiums are rising as the frequency and severity of claims increase.

Key Takeaways

  • Cybersecurity threats are rapidly evolving and increasing in sophistication.
  • The pandemic has significantly accelerated the shift to digital, expanding the cybersecurity attack surface.
  • Ransomware and nation-state attacks are prominent threats that organizations must contend with.
  • Advances in technology like AI and machine learning are double-edged swords in cybersecurity.
  • There is a critical need for multi-factor authentication to secure digital identities.
  • The expansion of the digital footprint requires a reevaluation of security practices.
  • Regulatory pressures and personal liability for breaches are increasing for corporate leaders.
  • Consumer education on cybersecurity is essential for mitigating risks.
  • Strong governance frameworks are vital for managing cybersecurity effectively.
  • The rising cost and stringent requirements of cyber insurance pose challenges for businesses.
  • In-person interactions may become more prevalent as a means of establishing trust in a digital world.
  • Continuous investment in cybersecurity defenses is necessary to keep pace with evolving threats.

Interview Transcript

0:02
Hi everyone, I’m here with Kevin Prone and Kevin Still today, the two Kevins. Kevin Prone is Chief Security Architect for FourNet in the cyber and security space, and Kevin Still is Director of Kevin Still Consulting and DEMSA. We know you both pretty well. Welcome, and thanks for joining me today.

0:22
Thank you.

0:24
So I suppose this discussion started off based on a couple of events. Kevin, I’d heard you speak a little around cybersecurity, and some of the things that shocked me, just as the average person in the street, weren’t that it happens, you know about it, but just how much it’s escalating in the last couple of years. It was exacerbated for me to a certain extent because I actually went to the British Library the other day, and obviously they’d had a cyber attack, a ransomware attack. So it’s top of mind: how have the dynamics around cybersecurity, in particular the probing of networks, changed over the last couple of years? Because it feels like it’s escalating.

1:04
Yeah, it’s certainly escalating, and we’re seeing that trend. I think one of the first pinch points really started with COVID. Obviously there was a lot more use of digital media, and enabling digital journeys happened quite quickly during COVID; we often quote five to seven years of digital transformation done in two or three weeks, sometimes two or three months. What’s happened as a result is that awareness has improved, so people are a little more aware now of the scaremongering that goes around the market. You only had to be waiting for a COVID test and suddenly we’d start seeing SMS messages coming through saying we’d been in contact with somebody who was potentially infected, and we very quickly moved towards those digital channels of engagement, because the NHS app enabled us to get access. So we’ve seen a massive increase in that. We’ve also seen a massive increase in cyber attacks due to, let’s just say, the state the globe’s in at the moment: a lot of nation-state attacks are actually happening, and those nation-state attacks have now been monetised and are moving down into corporate attacks. One of the biggest things we’ve seen in terms of acceleration is AI and machine learning, and toolkits that enable people with a cause, or what they see as a cause, to get up and running quite quickly with a relatively small amount of knowledge. We’ve also seen a massive shift as a result of the digital transformation journeys people are having: moving to contactless, seeing your doctor online, engaging digitally with a service organisation rather than face to face. And as a result of that, we’ve seen an increase in the use of digital technology. The increase is also based around monetisation.
Attackers, or ethical hackers like myself, would sit there and demonstrate how we could infiltrate an organisation from the outside, and the idea was that we were demonstrating where those issues were for the good of those organisations. Now there’s downloadable kit that enables you to be particularly sophisticated online, and, being a bit flippant, with a couple of hours on YouTube you can actually use these tools to start extracting money from individuals. So it is really, really interesting. I think the challenge we’ve seen is that the secure thinking inside organisations still hasn’t caught up. There’s a psychological view of “it won’t be me, it’s going to be somebody else”. I’ve worked on four or five high-profile cyber attacks in the last two or three years, and I can say they’ve really started to open individuals’ eyes to the breadth and depth of the problem.

4:24
How much do you think the pandemic, and I suppose the digital readiness that really came from the pandemic, played a part? Do we think about security enough when we’re implementing some of these systems? Kevin, you mentioned some of that, and it almost felt like we rushed to put some of this stuff in place. Now, with some of these new regulatory frameworks you’re talking about, it’s almost like we need to go back and re-implement them, just to make sure they’re robust enough.

4:46
I think it’s an interesting one, isn’t it? Because we’ve got to remember that with COVID we really didn’t have a lot of choice. We were talking about businesses doing what they needed to do to survive, and that sort of transformation was done pretty quickly. I don’t think the regulators would look at it the same way, though; ultimately you’ve still got a duty of care, you’ve still got to do things to a relevant standard. To give you the typical example: moving from an on-premise solution to a cloud-based solution means you no longer need to be pinned to an office, but what does that mean in terms of the attack surface? Suddenly you’ve gone from a locked office, with servers and computers inside it that only the people physically sat in that office could access, to an organisation that’s essentially cloud-based and can be accessed from anywhere. So this notion of trust comes into question. When you walk into an office, you scan in, they validate who you are, you log on, they know who you are, you’re physically sat in a location, CCTV can see you, access control can see you: we know it’s you. But now, when you’re consuming an application and you’re just a user at the other end of an internet connection, how do we know you are who you say you are? And equally, the adversaries: how do we know they are who they say they are? How do you actually know you’re interacting with your bank? How do you know you’re interacting with an organisation that you know and trust?

6:24
Yeah. And Kevin, I know you’ve done quite a bit of work around data security, but also process security. How do we get ahead of some of that in terms of planning? I know that in financial services in particular we’ve got greater degrees of regulation, but what do you think the elements are around trying to get ahead of it? Because at least some of the data suggests we’re really not as aware of all the things Kevin was talking about as we need to be.

6:50
Yeah, it’s very rare that you’re in a greenfield situation where you can do everything from privacy by design, do your record of processing activities and your data protection impact assessments, and say, I’m going to design this from the ground up, with a combination of all the cyber resilience alongside all the delivery mechanisms you want, effectively an omnichannel approach. Not everybody is a startup fintech. At best you’re likely to be brownfield, but more likely, as Kev indicates, you’ve got a mixture of on-prem and cloud, and there are a few bits that need joining together which probably haven’t been looked at in terms of where one entity, or one set of data, sits versus where the end-user engagement may well be. It’s very difficult if you’re in that hybrid environment, where you’ve got an awful lot of money invested in legacy systems, because you inevitably end up having to build different perimeters; it’s like layers of an onion. Kevin describes this very well when he talks about wanting to get to 100% resilience, and then confirms that’s not really achievable, but there are a number of steps you can take within that. I think you’ve got to go through a co-design process that puts security, and some of the aspects we’re looking at now particularly around consumer protection, alongside the other elements: in the FCA world, a consumer enjoying their experience. Sometimes that’s just not possible; you’ve got to put friction into journeys, particularly as we increasingly work with vulnerability, as in vulnerable consumers. And trying to distinguish between a victim and a perpetrator is getting increasingly difficult.
So you almost have to have a slightly cynical mind when you’re going through the design process, to look at it and ask which comes first: is it ultimately consumer protection, rather than that entire user experience? Because sometimes something’s got to give.

9:09
There’s a balance there, isn’t there? We’ve all been on banking apps that are so locked down you end up not being able to do the transaction, versus something that’s very easy to use and you absolutely love it, until something goes wrong, and at that point you feel exposed. The digital ecosystems we’ve got now, particularly coming out of COVID, seem to be trending towards ease of use rather than that super-tight process. How do we get that balance right, do you think?

9:42
It’s an interesting one, because it’s about striking that balance, isn’t it? You talk about secure-by-design principles; again, a lot of those are built on greenfield, but when I talk about cybersecurity, you’re never really done. It’s an iterative process, a cycle that you need to apply. Very much like when you’re going through an ISO certification, with its continual improvement cycle, you’ve got to have the same sort of cycle baked into everything you do. And you’ve got to get the right level of security to validate the app is your app; getting it onto things like the Play Store and the Apple marketplace is almost a tick of validation that the app is what it says it is. But we’ve also got to make sure that anybody can engage on the digital devices and digital channels they’re using. You’ve got to be able to protect the consumer and make sure we are actually talking to the consumer, and you’ve got to start having some form of digital charter, or digital validation. For instance, inside my online banking, believe it or not, I can’t set up a new payee without, practically, blinking four times in a non-shadowed room. Once that new payee is set up, again I’m warned via the app that this could potentially be fraudulent, and I have to validate who I’m paying and why. So I think, particularly when it’s an application with a lot of impact to it, like online banking, the organisations are building in those secure principles, because essentially within banking and finance those guards were always there in the first place: the vault was always behind locked doors, and only certain people could get access to it.
So it’s building this trust model, essentially, into your app journeys: you need to make sure the person is who they say they are, and you’re validating them continually, so they remain validated in case somebody else picks up the phone, not just authenticating them once. I think that is one of the key elements.
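The authentication factor behind most of the MFA discussed here is a time-based one-time password. As a hedged sketch (this is not FourNet’s implementation, just the standard RFC 6238 mechanism), a TOTP code can be generated and checked with only the Python standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(at if at is not None else time.time()) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def verify(secret_b32, submitted, at=None):
    # constant-time compare so the check itself doesn't leak timing information
    return hmac.compare_digest(totp(secret_b32, at), submitted)

# RFC 6238 test vector: secret "12345678901234567890", time 59s -> "94287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59, digits=8))  # → 94287082
```

Because the code changes every 30 seconds and is derived from a shared secret, a stolen password alone is not enough, which is exactly the “continually validated” property described above.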


12:05
Is there an inclusion angle to that as well? I know we’ve been talking about digital inclusion, making sure everyone is included as we put more controls in place. Is that an aspect we have to think about, Kevin? Legacy ways of accessing, say, financial services might be walking into your branch; now we’ve got digital access, but if new controls come in to make things safe and secure, are there potentially exclusion-type considerations you’ve got to think about, in terms of gathering information and looking for good outcomes?

12:37
I think one of the things I’ll say, and then I’ll let Kev step in, is that we’ve now got things like multi-factor authentication, but on the other hand, the moment one device, notably your mobile phone, gets compromised, pretty much everything else gets compromised. Where we are now, looking increasingly at the consumer vulnerability world, is at those outliers: maybe elderly people living in a rural area who don’t have access to the branch any more, or to more traditional authentication mechanisms like the bank manager recognising somebody when they walk in, and other old-fashioned values. There are a limited number of people in that outlier group, but there are a lot of advocates looking at what happens to people who, for one reason or another, don’t have access to the internet. That might be financial issues, where they lose access because they’ve missed payments, or there might be a power outage. So when we’re looking at essential services, what actually happens when somebody moves from being potentially vulnerable to critically vulnerable? As Kev has indicated, they may rely on the Internet of Things for somebody monitoring their pulse, their heart rate, whatever it might be, on a dialysis machine taking readings, and those devices inevitably become a weak point in somebody’s home security infrastructure. Then we get back into another discussion: what is the right balance between supervising people, making sure the right people have the right level of access, without some form of unintended consequence where one day something breaks down and somebody is left, and is found a week later, because of what might be a minor blip that passes all of the sensor-monitoring devices we have in play?

14:33
As the attack surface, I think as you describe it, Kevin, increases, with all these different kinds of IoT-connected devices, where’s the balance between educating people on what to look out for, versus firms locking down the security of the devices themselves? There’s a bit of a balance there. I was looking through some of your stats, and I think one was that 24% of staff lack the skills to deal with sophisticated threats, as an example. Where does the education piece for customers come in, versus making sure, as businesses, that we’ve got the threats locked down, particularly as the attack surface increases?

15:12
Yeah. From the corporate’s perspective, there are lots of toolkits out there to help get that secure thinking, and you’ve got things like the ICO keeping people on the straight and narrow because of the potential fines that come out of any data breaches. As a corporate, I think it’s almost easier: we expect layers of security to be in place, we have to have multiple layers of security in place, because, as Kev says, the likelihood of a successful attack with one layer of security is a lot higher than with multiple layers. The other key element corporates need to look at is visibility across the piece: you can only manage what you can see, you can only defend what you can see. Getting that balance from the corporate point of view comes down to the risk they carry. And then you start to think: my technology as a corporate can be leveraged by a consumer, and I can potentially put them in a vulnerable situation if I assume they know something like multi-factor authentication exists. If you look at it from a consumer point of view, there’s lots going on. When we talk about MFA, multi-factor authentication, almost everybody with a digital device now expects some form of it, be it “we’re going to send you a text to say you’ve logged on” or having to go to an app to validate you are who you say you are. When you log on to your Google account now, they recommend multi-factor authentication is switched on, and you authenticate yourself through YouTube or the Google app. So I think there’s a general increase in awareness, predominantly because of all the high-profile attacks that are happening. But there’s still this view that cybersecurity is a technologist’s problem. It’s not just an IT problem, it’s a huge problem.
The IT department and the corporates need to make sure we’re not the blocker, not the Department of No, so you’ve got to try to get that awareness up as much as possible. The cyber awareness platform we use actually has a great feature: it enables you not only to go through the training as a corporate user, but also to pass and share that expertise on to your friends and family, because everybody wants to be aware of how they can be scammed. We’ve also got to remember there are lots of tricks; fraudsters have been operating since Victorian times, extracting money from people in various ways. And we’ve got to realise, certainly when you start looking at things like digital inclusion, that as soon as you introduce technology it can become a point where people don’t want to engage on those platforms any more. I still want to go and talk to the bank manager when I want a loan. I’m a technologist, and when they want to engage with me on an online call, I actually want to go and sit with people in the branch, because I know that’s where I need to do my work. When I work with an investment firm, I will always go and work with the investors in their offices, because I have an inherent distrust of the internet where my money is concerned. So that’s a bit of a balance.
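Kev’s point that multiple layers beat one layer can be made concrete with a toy probability model: if each layer independently stops some fraction of attacks, the chance that an attacker slips past all of them shrinks multiplicatively. The layer names and numbers below are invented for illustration, not measured figures:

```python
from math import prod

def breach_probability(bypass_probs):
    """Chance an attacker slips past every layer, assuming layers fail independently."""
    return prod(bypass_probs)

# Illustrative per-layer bypass probabilities (made-up numbers)
layers = {"email filtering": 0.30, "endpoint protection": 0.20, "MFA": 0.10}

one_layer = breach_probability([0.30])
stacked = breach_probability(layers.values())
print(f"one layer: {one_layer:.0%}  three layers: {stacked:.1%}")
# → one layer: 30%  three layers: 0.6%
```

Real layers are rarely fully independent (a phished credential can weaken several at once), so treat this as a lower bound on why defence in depth matters rather than a risk calculation.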

18:43
Yeah, it sounds like you’ve got to have the right channel for the right task, to a certain extent. If it’s a large amount of money or a very complicated decision, that’s the one you want in person, whereas some of the more transactional pieces can become more and more digital. Having said that, one of the other things you’re making me think of, Kevin, is when you have lots of different channels and they all interlink. Say it’s a device in your kitchen that actually records video: if that’s not properly secured, you can gather information from it to access a completely different device, or a completely different service, even from a different company. The ability to link those together becomes very powerful, and there’s a real risk for consumers there. As businesses, we’ve probably got to think ahead of some of that.

19:28
I think one of the things we’ve perhaps highlighted in the events we’ve run through the course of March is that people aren’t tremendously good at coming up with innovative passwords. And indeed, if they’ve come up with one innovative password, that’s likely to be rare; it’s likely to be built around the pet’s name with a year added, and perhaps an exclamation mark on the end. So if there is a point of weakness, you’ve only got to look at it and ask: do the transactions going through this have any meaningful value? Probably not. But then you escalate through, and now you’ve gained access to a number of different levels. I think we made reference in the events to the film The Beekeeper, where a simple scam gets in with an expectation of maybe taking a few thousand dollars from somebody, and they suddenly find there’s a pension trust fund behind it, but everything’s got the same password, and the ripple effect takes hold. And I’m constantly reminded that not only do people often reuse the same password, they keep it in public view, either on a Post-it note or in a book right next to the device you’d use to access all the money. So there are many facets of this where consumer education is key. Just today I’ve had, from one of my credit card providers, a reminder of the things they won’t do, which I think is very important; it’s a message Kev talks about all the time. But in the world we’re in at the moment, where there’s a risk that people do the wrong things and don’t make the right choices, there has to be a balance to protect both sides: the commercial entity, if somebody does something really stupid, and consumers themselves, who may well have been duped into doing something in good faith, thinking they’re dealing with a trusted provider.
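The “pet’s name plus a year and an exclamation mark” habit Kevin describes is easy to screen for. Here is a minimal, illustrative check; the regular expression and sample passwords are hypothetical examples of the pattern, not a real blocklist or a substitute for a proper strength checker:

```python
import re

# Matches the formulaic shape: a single word, then a plausible year, then
# optional trailing punctuation (e.g. "Rex2019!").
WEAK_SHAPE = re.compile(r"^[A-Za-z]+(19|20)\d{2}[!?.]?$")

def looks_formulaic(password):
    """Flag passwords that follow the word + year + punctuation formula."""
    return bool(WEAK_SHAPE.match(password))

for pw in ("Rex2019!", "fluffy1987", "T7#kq9!vLp2m"):
    print(pw, "->", "weak pattern" if looks_formulaic(pw) else "no match")
# → Rex2019! -> weak pattern
# → fluffy1987 -> weak pattern
# → T7#kq9!vLp2m -> no match
```

A real deployment would combine this kind of shape check with breached-password lists and length requirements; the point here is only that the habit is mechanical enough for attackers (and defenders) to model.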

21:18
I think it gets interesting as well when you start looking at risk as a consumer. I bought my first Alexa device many years ago, thought it was really cool, put it into the house and started to play with it, using technologies like If This Then That, those decision-based automation engines, and then I very quickly started to disable the Alexa devices in my home, because people don’t necessarily have a view of the risk. Again, look at things like AI: everybody’s now using AI to do their homework. I’ve had to correct my son on more than two occasions where he’s saying, “I can just ask ChatGPT and it’s going to write it.” Yes, but think about the implications of what you’re doing. So I think people are becoming more aware. If you’ve grown up digital, born in the 2000s, you almost embrace this technology and use it without the same sense of fear that maybe I have in my early fifties, knowing what I do for a living. I embrace these new technologies, but you’ve got to get people mindful that there are risks. Back to Kevin’s point, though, the controls need to be proportional to the risk, and this is where some of the technologist terminology helps: using a framework like NIST now enables you to pick tiers, so a control can be proportional to the risk you carry, and the bigger the impact of that technology going wrong, the more controls you need to put around it. We don’t want to hold back digital innovation; we want to support it, but we want to make sure we’re doing it to a driving-test standard, if you like, of digital resilience.

23:13
I just remember, from my days running a fraud department, that the criminals, the organised crime groups, the fraudsters, invest in this as a business as well, and they seem to stay one step ahead, getting to some of these things faster than businesses can. So there’s that aspect of acceleration. But it also strikes me, from what you’re saying, that there’s the human factor too. These scams all play on our psychology, and they’re really, really persuasive: you’ve missed something, or we’ve just attempted a delivery, those kinds of things. It plays on your fears, doesn’t it?

23:52
Yeah, it does, it does. Human psychology is a big hobby of mine and something I’m really passionate about. There are things like authority bias: your tax return has to be done now or you’re going to be fined. And now, with things like ChatGPT, they’ve actually got the language right, so it’s a lot more believable. They’re using bit.ly links, and people obviously use bit.ly links; we use them on LinkedIn all the time. Don’t worry, you don’t want a big link, we’ll give you a bit.ly link, just click on it. We’re seeing that happening more and more, and it’s starting to get really concerning when you look at it from that perspective.


24:30
I think one thing I’d say, Chris, is that Kev talks about a number of different types of attacks and the motives for them. You can probably understand where something has a commercial intent to it, but where it’s deliberately designed almost as a sort of guerrilla attack, where there’s a machismo to it, “I’ve proven I can do this”, or it’s done for other, more destructive reasons, particularly when you go down to a non-corporate level, to individuals, it can be devastating for people’s lives. To me, it’s very difficult to apply human psychology to that sort of guerrilla tactic, where in some instances you’re probably just unfortunate or unlucky. In protecting against that, most consumers have an expectation that the infrastructure providers are going to be there to help them. Some of the recent FCA activity, going back to the Consumer Duty, has been around critical service providers, the likes of AWS, your big broadband providers and so on; there’s an expectation that they are culpable in some manner to provide some layer of security, even if we’re only talking about how you access the internet at home and what your basic firewall configuration is. Most people still don’t believe that on a smartphone you need virus protection in the same way you do on your PC. There are so many myths and legends we have to deal with, and somewhere within that critical service provision there should be a sort of minimum level of hygiene, a way of detecting that you just can’t transact safely at the moment if you’re not doing the following.

26:18
And am I right in saying that it’s not just your business, it’s your entire supply chain you’ve got to be aware of as well? Because they can come in at your weakest link, right at the end of the supply chain, which might seem a long way away as far as you’re concerned, but that can then ripple all the way through.

26:33
That’s happened very recently with things like the Progress MOVEit attack, where letter shops were producing critical personal data on behalf of big banks, but they were the supplier of the supplier. The Information Commissioner is very clear that intercompany agreements should have the same level of quality as the primary agreement, but that often isn’t the case. So what we ended up finding was millions of letters with personal information being acquired through an attack, and then the ripple goes all the way back: who’s the data controller? It’s you, Mr Tier One Bank.

27:11
It is interesting when you look at supply chain, because it’s quite a popular view that attacks come from the outside in, trying to exfiltrate data from the outside. But again, there are these principles of secure thinking, and one that’s always been very close to my heart is zero trust: you need to make sure you’re giving people the minimum level of access to do the job they need to do, and in your supply chain that the connectivity, the conversations and the transactions going through it are as secure as they need to be, on a need-to-know basis. You’ve only got to look at the SolarWinds attack. SolarWinds being a tool used to monitor networks and servers, lots of managed service providers used that software in their service offering, and overnight SolarWinds themselves were compromised, which meant that everybody that had SolarWinds and hadn’t got the appropriate controls in place, which we had at the time, was potentially compromised. It’s one of those interesting conversations where you need to start thinking about the entire supply chain, the entire digital journey, and making sure those controls are appropriate. So when we’re managing a customer, and those customers want us to look after them from a security point of view, we’re monitoring and defending that customer to keep them secure. We have to connect to them to do so, and we have to make sure we have the secure channels and secure agreements to allow that intercommunication to happen, and not allow malware or ransomware, or somebody on the inside (over 75% of attacks actually originate from inside the network), to move laterally across into your supply chain and attack them via you. It can be quite scary.
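The least-privilege idea at the heart of zero trust, deny by default and grant only what is explicitly needed, checked on every request rather than trusting the network, can be sketched as a simple policy lookup. The roles and resources below are invented for illustration; real zero-trust deployments layer this with identity, device posture and continuous verification:

```python
# Invented example policy: role -> {resource: set of permitted actions}.
# Anything not listed is denied.
POLICY = {
    "monitoring-agent": {"telemetry": {"read"}},
    "backup-service": {"telemetry": {"read"}, "customer-db": {"read"}},
}

def authorise(role, resource, action):
    """Deny by default: access is granted only if explicitly listed in the policy."""
    return action in POLICY.get(role, {}).get(resource, set())

print(authorise("monitoring-agent", "telemetry", "read"))    # → True  (explicitly granted)
print(authorise("monitoring-agent", "customer-db", "read"))  # → False (lateral move denied)
print(authorise("unknown-vendor", "telemetry", "read"))      # → False (unknown identity denied)
```

The key design choice is the default: a compromised monitoring agent, like the SolarWinds example above, cannot reach the customer database because nothing in the policy says it can, rather than because a perimeter happened to block it.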

29:14
How do we start thinking about AI and large language models? There are developments taking place in large language models, and particularly when you look at things like multi-factor authentication and some of the authentication methods out there, it feels like we’re on the cusp of being able to replicate humans, or human interaction, through the computer. And you don’t just do it once, and you don’t have to employ people to do it; you can just get the computer to do it. Cloning voices is one example I’d heard of before, but it could also be video, it can be all sorts of things now, and it’s becoming really quite convincing.

29:47
I think it ties back to what was talked about earlier on with Alexa and Google: potentially those can be used to collect all the voiceprints you want until you’ve got a big enough sample. Most of these phishing trips aren’t fly fishing by J.R. Hartley any more; it’s not one strike and you’ve caught your fish. It’s very patient. And what I’ve been impressed by, when Kevin talks about their approach, is the amount of reconnaissance work that goes on in advance. That can happen at large scale, with probably thousands of victims being worked on concurrently.

30:23
It makes it really believable. At the end of the day, you only need about 30 minutes of voice to create a reliable clone of an individual. There are places you can go on the internet (obviously I won’t quote them here) where you can download a digital face which is highly convincing, developed and delivered by AI models, and then you can automate it. Essentially you sit there with a camera, it takes the model, and it stitches your facial expressions and everything together. “My voice is my password”? Be very worried, because that attack can be very successful.

The other thing now is believability. I have a digital charter with my partner, where I say: I will not interact with you in a certain way, and I will never ask you to do certain things in certain ways. And if a request like that does come through, we have a sort of secret handshake so that we can authenticate we are who we say we are. Because if you think about it, lots of people are putting a lot of what they do out onto Facebook, Twitter and Snapchat; essentially they’re putting videos out there, and those videos can be used to train those models. Then you can create a very convincing version of somebody saying, “I’ve broken down, I need to put myself up in a hotel, I’m probably not going to be able to get back for weeks because I’m stuck in Europe, can you send me 5,000 pounds?” Before you even realise it wasn’t the individual they said they were, you’ve transferred the 5,000 pounds. When I talk about digital charters, banks and credit card companies are a really good example of this, publishing what they won’t do. You need to design the same thing into your own secret handshakes and secret authentications.
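The “secret handshake” idea, agreeing in advance on a way to prove identity that a cloned voice or face cannot fake, is essentially a challenge-response over a pre-shared secret. A minimal sketch, assuming both parties have agreed the secret in person beforehand (all names and the 8-hex-digit response length are illustrative assumptions):

```python
import hashlib
import hmac
import secrets

def issue_challenge() -> str:
    # The verifier reads this random nonce to the caller; it is different
    # every time, so a recorded or cloned response cannot be replayed.
    return secrets.token_hex(8)

def respond(shared_secret: bytes, challenge: str) -> str:
    # The person being verified derives a short code from the secret and
    # the fresh challenge. Truncated for ease of reading aloud.
    return hmac.new(shared_secret, challenge.encode(), hashlib.sha256).hexdigest()[:8]

def verify(shared_secret: bytes, challenge: str, response: str) -> bool:
    # Constant-time comparison avoids leaking how close a guess was.
    expected = respond(shared_secret, challenge)
    return hmac.compare_digest(expected, response)
```

Between two people the “handshake” would in practice be a memorised question and answer rather than a computed code, but the security property is the same: the response depends on a secret that never appears in any public video or voice sample.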

32:21
Do you think it’s going to almost revert back to in-person meetings? You talked a little earlier about meeting your bank manager. Have we gone so far towards digital, with all these threats coming in from the outside, that there are some things where, well, one of the ways you can actually trust that you’re talking to a human is by meeting them physically? Not even talking on video; I could actually just be an AI sitting here interviewing you both. The one way you can trust this is the fact that we’ve met, and that meeting, almost like a cone of silence if you remember that, is one way of guaranteeing the trust is there. Do you think we’re going to start to see some of those things come back?

32:59
I think, to a certain degree, we’re too far down that digital journey to go back. Obviously for things that deal with national security, of course those meetings have to happen, because it’s proportional to the risk that’s carried. But I think there’s a lot to be said for the advancements in digital validation: being able to authenticate that you are who you say you are via digital means. I think we’ll see significant improvement in multi-factor authentication. Multi-factor authentication is seen as the nirvana, but we all know (and I actually do a demo on this) that we can steal the cookie session of an individual’s authentication. Essentially, we get you to sign in to Office 365 via a phishing mail. Once you authenticate, I’ve actually got remote control of your machine, and I pick up your authentication cookie. When you log in to things like Microsoft 365, you’re authenticated for a certain period of time before you’re asked to log on again, so if I can pick that cookie up and then inject it digitally, again we have an issue: essentially, multi-factor authentication is torn down.

So there are lots of different digital methods now being employed. Not only do I want multi-factor authentication, I want to check that it’s on the phone registered as trusted, that it’s in a location I would expect the phone to be in, and that there’s a certain digital footprint on that phone to validate it is what it says it is. Again, it’s using more of those layers of technology for digital validation, and striking the balance between validating someone and friction.
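The layering described here, MFA plus device, location and session-binding signals, can be sketched as a simple risk decision. The signal names and thresholds below are illustrative assumptions; real conditional-access engines weigh far more factors, but the shape is the same: a stolen cookie replays a *post*-MFA state, so contextual checks must sit on top of MFA rather than be replaced by it.

```python
from dataclasses import dataclass

@dataclass
class LoginContext:
    passed_mfa: bool
    device_registered: bool            # is this a phone/laptop we have enrolled?
    location_expected: bool            # does the geolocation fit the user's pattern?
    session_ip_matches_cookie_ip: bool # crude defence against replayed session cookies

def access_decision(ctx: LoginContext) -> str:
    # MFA is necessary but not sufficient: an injected cookie arrives
    # already "authenticated", so we also score the contextual signals.
    if not ctx.passed_mfa:
        return "deny"
    score = sum([ctx.device_registered,
                 ctx.location_expected,
                 ctx.session_ip_matches_cookie_ip])
    if score == 3:
        return "allow"
    if score == 2:
        return "step-up"  # force fresh re-authentication rather than trust the session
    return "deny"
```

The interesting case is "step-up": rather than adding friction to every login, friction is applied only when one signal looks wrong, which is the validation-versus-friction balance described above.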

34:52
And Kevin, when we start thinking about governance around some of this, and about staying ahead of it, because it sounds like the landscape is changing all the time, having a good governance framework can help. Flagging the risks won’t solve everything, but at least it keeps you ahead of it. How do we think about that, particularly in the current environment with all of the regulations coming out? What’s the best kind of governance framework to have?

35:17
This is a strange one, because of the FCA. If you are an FCA-regulated firm, they like statements of responsibility; they don’t like shared responsibility. And there are some prescribed responsibilities, which include things like financial crime, so within an organisation of any size there will be somebody designated to look after that. Conversely, you can have virtually anybody you want as the data protection officer, and indeed many firms aren’t even required to have a DPO. Many that still have a DPO are stuck in an antiquated old world of branch networks, when this is what you did with data in filing cabinets and the like, rather than the world we’re actually in now.

So there’s this whole question, particularly in the cyber area: should this align more with the chief risk officer? Should it align more with the chief technology officer or the chief information officer? And a firm may not have all three of those roles. Then, from an accountability point of view, how is it reported at regular board meetings? If I compare it with somebody like a money laundering reporting officer or a compliance officer, they normally have a designated role and are meant to have a reporting slot at every board meeting. It will be interesting to see, with Kev’s role and the audits he does (and I know he talks to a lot of boards), when you look at things like the new NIS 2 requirements, who you’re facing off to when you tell the board this is their problem, because it isn’t intuitive. I think increasingly these roles are changing. Within the NIS 2 framework, which Kev will talk about in a second, governance is a major factor, and probably the starting point.

37:14
Yeah, absolutely, governance is a major factor. And like you say, Kev, it’s about baking that responsibility into the board, so that security is seen as baked into the fabric of everything you do. I do spend a lot of time with boards, and under some of the new regulations coming out around the European NIS 2 directive, interestingly, fines are getting so big that people are being held personally liable for data breaches. So I think it’s getting there. The issue we always have as technologists is that we need to speak the language of the board, and the language of the board is essentially risk, with mitigations and controls. Ultimately it’s the board’s decision as to their risk appetite, but more and more governance is going to come in, mandating the way they look at that.

For a lot of organisations, security is something that’s done when you join the firm: you have a security induction, you’re given the password, you’re told what to do and what not to do, and your cyber awareness training is done at induction. You’ve got to start thinking instead about the secure processes you run. When you bring on a new product, when you’re discussing it within a board meeting, it should be: hang on, how do we know we’re not opening ourselves up to a new channel of attack? We’re launching a new digital app; how resilient is that digital journey going to be?

I have close links to one organisation that was essentially responsible for issuing one-time passwords. When you log on to your online banking, you will sometimes be sent a one-time password via text, which you type into the app. An app that utilised those one-time passwords was compromised, and a massive bill was generated to the one-time password vendor. Well, whose fault is that? Let alone the risk that those one-time passwords can end up in the wrong hands, because SMS is incredibly easy to mimic. There are a number of things boards need to be thinking about, and really they should have an ongoing view at every board meeting of the digital risks they’re carrying.
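As an aside, the SMS weakness mentioned here is one reason app-based one-time passwords (RFC 6238 TOTP) are generally preferred: the code is derived locally from a shared secret and the clock, so there is no text message to intercept or mimic. A minimal sketch using only the Python standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (SHA-1, as most authenticator apps use)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second steps since the Unix epoch.
    counter = int((t if t is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)
```

Authenticator apps implement exactly this scheme (typically SHA-1, six digits, 30-second steps); the secret is provisioned once, usually via QR code, and never travels over the network again.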


39:50
And that’s particularly clear now within the FCA regime under Consumer Duty, where every product is meant to have its own product assurance framework. Even existing products should have gone back through the process of being effectively re-approved. Now, in one of the spaces I work in, which is debt advice, around 93 to 94% of the audience will have access to the internet and 84% to a smartphone, so that has to be intrinsically involved in the process. And where sludge creeps in is that it’s incredibly easy to onboard a client, but the back-end technology doesn’t live up to the front-end technology. It doesn’t use any of those tools at the back end; it goes back to quill-pen technology. So there needs to be consistency throughout the journey, where you almost use the onboarding process to educate the consumer, who may be regarded as financially vulnerable, about the risks, and to build trust right up front: this is what we will do, this is what we won’t do, and we’re going to build on that going forward.

At the moment I still don’t see that as an integral part of what people have done through the Duty implementation: going down to that granular level and saying, I’m going to unpick my product, which may have been around since the dawn of time, and think about what this means to Mr. Consumer. What does vanilla look like? What does good look like? That includes payment methods, your opening times, and what happens out of hours if you are closing the branch or restricting when the telephones are available. Does that create a risk if somebody engaging in an emergency finds that the channel they deal with at the weekend is nowhere near as robust as the one during the week? There’s a lot more you’ve got to think about when you go through a product assurance framework, and I think security, engagement and trust-building are integral parts of that proposition.

42:02
It very much feels like the heat is being turned up on this as we go, and it sounds like cybersecurity is something else we’ve got to think through as part of the whole resilience piece; the temperature just keeps increasing, probably even exponentially to a certain extent. What do you think the big watch-outs are in the next year to eighteen months? What have you got to think about now, and where do you think we’re going from here? What are the big things that keep you awake at night? Kevin Still, maybe you start first.

42:30
So I’m going to go back to basics, really. We are seeing such an inexorable journey towards digital, or at least a partially digital journey, in almost everything you do; even finding a telephone number online for some of your core providers is nigh on impossible now. So there’s a gigantic push in that direction, and consumers aren’t necessarily being brought along on the journey. I am fearful when I am told my journey is better on an app, or that I’ve got to go and sign on to a portal, or that doing this myself is perfectly feasible, when they don’t engender trust as you go along. I, for one, get very anxious if I’m doing something new; even if somebody changes their website, allegedly for the better, it’s a journey I don’t recognise, and therefore I’m often nervous about going ahead. I want a bit of reassurance as I go through.

But a lot of corners seem to have been cut. In that discussion we had around COVID, where things had to be rushed through, there should have been periods where things were unwound and then done correctly afterwards, because I think there are still loopholes in many systems, whether in central government or the private sector. And with this concept of omnichannel versus multichannel, I still think there are an awful lot of journeys that have been botched together. You can almost see it and say: somebody’s bolted this on to the side, whether it’s a chatbot, web chat or something similar; it doesn’t look integrated. And therefore there’s a potential weakness, where I don’t feel like I’m going through the same authentication process I would go through on another channel.

44:24
It does very much feel like that; this conversation is really highlighting the risks of not doing some of those things. Kevin, what’s your view on the near and mid-term future?

44:34
Our near and mid-term futures are quite scary if your digital journeys aren’t keeping pace with the rate at which cyber-attacks are increasing. We’re now conscious that things like AI and machine learning, coupled with advances such as quantum computing, enable things to be done at a tremendous rate. What I think is happening is that technology is advancing so quickly, with voice cloning, video cloning and this type of technology, that given the pace at which attacks are happening, you’re going to struggle to keep up with the defensive requirements you need to stay secure unless you implement the appropriate controls.

We see a lot of that in the idea that you’re only as good as your weakest link. We’ve got a massive acceleration in the number of attacks, and a massive simplification in the way an attack can be exploited. You can buy ransomware on demand now, directly off the dark web. If you take a particular dislike to a corporate, you pay a fee, normally in Bitcoin, and they release their ransomware to you; that ransomware can be executed really quickly, because they’re even using things like video tutorials to train the individuals and help them do it. And then there’s the pace and sophistication of the attacks happening now; it’s no longer just the very basic attacks that are easy to spot. When we talk about the work we do in the contact centre, a lot of the simple work has been taken away; a lot of what we’re doing now is complex, and a lot of complexity can now be done at speed. That’s the challenge.

So the thing to look out for, as you’re designing these digital journeys and looking at your risk profile, is making sure you’re engaging with people who can give you the right view of that risk. There is a lot of fear out there, and the only way really to be completely secure is to turn it all off, put it back in the box and send it back to the vendor. That’s one level of secure thinking, but essentially we’ve got to have proportional controls in place, and organisations have got to invest in the ongoing defence of their cybersecurity posture. The pace at which things are happening is accelerating, the complexity is being reduced for the adversaries, and the number of vectors we need to be vigilant about is increasing exponentially. We need to keep pace with that by defending everything we can with proportional controls.

47:24
And that comes at a cost, Chris. When you look at the regulatory space, we do a lot of work with debt buyers, debt collectors and debt advice firms, who are inherently small in the grand scheme of things relative to big insurance firms and banks. So the cost of things like cyber insurance is disproportionately high, and the core infrastructure for your core business, which is collecting money, can be readily compromised relative to firms that put highly sophisticated tools in place. One of my fears on the corporate side is that for small and medium-sized firms, the cost of entry, and of maintaining your position across this immense landscape, where you are doing things fundamental to your business, which is getting paid, and where those monies can easily be misdirected elsewhere, is becoming prohibitive; we don’t know where that’s going to end. What I do know is that whenever we look at bidding, the government wants a third of all spend to go to SMEs, but you’ve got to get through all of these hoops of fire to get there. It almost means you can’t be an SME and jump through the hoops, because you need every ISO accreditation under the sun, plus insurances that are disproportionately large compared with what you would have needed a few years ago.

48:47
The interesting quandary, then, is that the insurance may not pay out. We’ve actually seen that ourselves, where firms have been compromised and the insurance won’t pay out, because the insurers expect the level of digital resilience you stated in your application form to actually be present. If, say, endpoint security software was something you said was your mitigation for this attack, and it’s found not to be in place, then you’re nullifying your claim. So this is where, when I’m working with boards, there are a lot of tick-box exercises: I’ve done Cyber Essentials Plus, I’ve done that, I’ve ticked all these boxes. But you need to start scraping below those tick-box exercises and look at how deep and how wide those controls really go, because the insurers certainly will. And the premiums are getting more and more expensive for a reason: insurers are paying out more and more, so they’re demanding more and more proportional controls.

One firm we’ve been working with recently was described as uninsurable; that was how badly they were viewed, particularly in the finance sector. Now, that was disproportionate, and you’ve got to look at some of the technology being used by the insurers to assess risk: they themselves are now using tools that I was using five or six years ago. They’re an insurance company. So there’s a massive push now towards making security something you do in practice every day, so that it becomes part of the everyday fabric of your organisation.

50:41
I know this whole topic has been a bit of an eye-opener for me, just in terms of talking through some of these issues and how quickly everything is accelerating. Kevin and Kevin, thanks very much for joining me and helping illuminate all of this, over today but also over the last few weeks. I really appreciate it. Hopefully it’s been interesting for folks, because it feels like something that definitely has to be top of mind. Thanks very much.

#FourNet

