Why culture, care and communication are critical for the future of tech

At LEAP, we focus a lot of energy on cultivating ecosystems that are good for people. We want to see tech developments that support happier, healthier lives for people around the world, and innovation breakthroughs that have a positive impact on communities and the planet. 

When it comes to creating tech for good, we have to include cybersecurity in the conversation. Many of the same mindset shifts that will help us develop positive tech will also help us develop secure tech – and both are essential for our future. 

In cybersecurity, most breaches don’t start with elite hacker collectives. They start with human error. A distracted click, a misjudged invoice, or a message forwarded without thinking twice can open the door to millions in losses. 

The latest threat report from AI-powered cybersecurity platform Abnormal highlights just how extensive the problem is: employees in their survey engaged with 44% of vendor email compromise (VEC) attacks. And even more worrying, 98.5% of text-based attacks that were read weren’t reported at all – which means organisations often have no idea an attack has even landed. 

As the report puts it: “Every time an employee has to decide whether an email is legitimate, the risk of human error enters the equation.”

Why culture matters more than controls 

Tech alone can’t solve the problem of human error. In an interview with Black Hat MEA, Bernard Assaf (CISO at Airbus) said leaders must look deeper to understand what’s driving those mistakes:

“We have to look beyond simple compliance statistics to gauge employee behaviour and engagement.”

That’s because behaviour doesn’t change through policy documents. It changes through culture: the conversations that make security relatable, the safety to report mistakes, and the stories that bring threats to life.

The Abnormal report shows why this is so urgent. In one example, attackers hijacked a real vendor thread, made a nearly imperceptible change to the sender domain, and persuaded an employee not only to reply, but to loop in colleagues – all before the fraud was caught. 
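To illustrate how subtle that kind of domain change can be, here's a minimal sketch of a lookalike-domain check. This is a simplified, hypothetical example (the function names and the two-character threshold are our own assumptions, not anything from the Abnormal platform); real detection systems combine many more signals than edit distance alone.

```python
# Hypothetical sketch: flag sender domains that are a near-miss for a
# trusted vendor domain. Real platforms use far richer signals than this.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def is_suspicious(sender_domain: str, trusted_domains: list[str]) -> bool:
    """A domain close to, but not exactly, a trusted one is a red flag."""
    for trusted in trusted_domains:
        d = edit_distance(sender_domain.lower(), trusted.lower())
        if 0 < d <= 2:  # near-miss, e.g. one swapped or added character
            return True
    return False

# A capital "I" in place of a lowercase "l" – easy to miss by eye:
print(is_suspicious("acme-suppIies.com", ["acme-supplies.com"]))  # True
print(is_suspicious("acme-supplies.com", ["acme-supplies.com"]))  # False
```

The point of the sketch is the asymmetry it exposes: a machine can compare every sender domain against a trusted list in microseconds, while a busy employee scanning a familiar-looking thread almost certainly won't.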

When a person feels pressured or uncertain, or scared of being wrong, they’re more likely to make rushed decisions. And attackers exploit that. 

The workforce shortage in cyber comes with a cultural cost 

The ISC2 2025 study on hiring trends found that demand for cybersecurity talent continues to outpace supply worldwide. Employers report persistent difficulty filling roles, particularly those requiring hands-on skills, and many organisations admit they rely on under-resourced teams to manage growing threats.

That shortage creates cultural pressure. Cybersecurity practitioners are more likely to experience burnout, training is hurried, and mistakes multiply. Nearly three in five hiring managers (58%) told ISC2 they're concerned about attrition among entry-level and junior staff – even as 75% report having budget for professional development and 73% for staffing. The result is a vicious cycle: stressed teams are more likely to miss threats, which erodes trust across the business.

As Ivana Bartoletti (Global Chief Privacy Officer at Wipro) put it at LEAP:

“We know that if we want to transform as organisations…we really need to build on people’s trust.” 

If your workforce doesn’t feel supported with the training and tools they need, that trust crumbles – first inside your organisation, and then in the markets you serve.

The human factor is critical to security

The Abnormal data shows which roles are most exposed. Entry-level sales staff had an 86% engagement rate with VEC attacks, followed closely by project managers and account managers. 

These are people who are expected to respond to customers and colleagues quickly, resolve issues, and keep everyone happy. That very responsiveness becomes a vulnerability.

So across the tech industry (and every industry that engages with digital tech, which is…pretty much every industry today), we have to work hard to create the kind of culture where those employees can slow down, ask questions, and flag concerns without fear. 

Only safe tech can have a positive impact 

The Abnormal report warns against making employees the last line of defence. Instead, layered protection is key – combining AI-driven anomaly detection with cultural interventions that empower people, rather than punish them. 

But even the smartest tools fail without care and communication. The ISC2 study urges employers to close the gap not only with better recruitment pipelines, but also with continuous professional development and mentorship. Curiosity and training, supported by a culture of care, give people confidence to act wisely when confronted with suspicious requests.

To make tech safer we need to foster that curiosity. We have to make sure that cybersecurity training is adapted to specific job roles and cultural backgrounds. And the same goes for developing tech that can have a positive impact on the world – we need to bring everyone into the conversation so we can understand where assumptions or bias are creating gaps in reach and impact. 

At LEAP, we love exploring technology’s capacity to change lives. And over at Black Hat MEA, our colleagues confront the uncomfortable truth that people are always the weakest link – and simultaneously the strongest potential defence. In both cases, the human factor is the real key to resilience and growth.
