How Publishing Failure Became a Foundation for Trust

Episode 139 | 22.12.2025

What an early Salesforce decision reveals about transparency, leadership, and trust in AI-driven organisations.

Listen to the full podcast episode on YouTube, Spotify, and Apple Podcasts.

Scene and context

As artificial intelligence moves into the centre of organisational life, trust is becoming a design problem. Decisions once made quietly by managers are now mediated by systems that prioritise, score, and recommend at scale.

Much of the current debate focuses on adoption speed and productivity gains.

Less attention is paid to how trust is built when machines act on behalf of organisations.

That question sits at the heart of this episode of The Responsible Edge, where the discussion turns repeatedly to a counter-intuitive idea: trust is not built by hiding failure, but by making it visible.

 

A career shaped by things going wrong

For Steve Garnett, the mythology of seamless growth has never rung true. His career spans senior leadership roles at Oracle and Salesforce, both organisations that experienced moments of severe stress behind the scenes.

At Oracle in the early 1990s, weak discipline and misaligned incentives pushed the company close to collapse.

Survival depended on confronting uncomfortable truths rather than protecting appearances.

Those experiences shaped Steve’s instinct that systems fail, people make mistakes, and organisations reveal their values not when things work, but when they break.

 

The Salesforce decision

The most telling example came from Salesforce’s early cloud years. As customers moved critical data off-premise, system outages carried real consequences. When the platform went down, entire businesses felt it.

Leadership debated how much to disclose. The safer option was concealment. Instead, they chose exposure.

“We published all of it,” Steve said.

Every outage, every performance issue, every failure was made public. Not as a crisis response, but as a standing practice. Customers could see exactly when systems failed and for how long.

The decision was not framed as bravery. It was framed as consistency. Trust was a stated value. Publishing failure was how that value was operationalised.

 

Why this matters for AI

The article discussed during the episode, published by Cerkl, argues that AI is increasingly shaping company culture by filtering information and determining relevance.

Steve’s experience adds a sharper edge. When systems decide what people see, what they are measured on, or how they are prioritised, transparency becomes non-negotiable.

AI agents do not feel embarrassment. They do not intuit when silence erodes trust. If their decisions are hidden, confidence drains quietly.

What Salesforce learned through public failure now applies to AI systems operating inside organisations. If employees and customers cannot see how decisions are made, trust is replaced by suspicion.

 

Trust must be engineered

Steve argues that AI cannot be trusted by intention alone. It must be governed through what he describes as trust layers: clear rules, visibility, and constraints that mirror human judgement.

A human sales leader knows not to upsell a customer whose system has just failed. An AI agent does not. That restraint must be designed.

Publishing system performance was one way Salesforce encoded values into operations. With AI, leaders must decide what transparency looks like when decisions are automated.

Dashboards, explanations, audit trails, and visibility into failure are not optional extras. They are how trust survives scale.

 

The tension leaders avoid

Many organisations fear transparency because it exposes imperfection. Steve’s experience suggests the opposite. Concealment magnifies risk.

AI will make more decisions faster, with greater distance from human judgement.

Without deliberate openness, leaders lose the ability to explain outcomes they are still accountable for.

The temptation will be to smooth results, protect confidence, and manage perception. The harder choice is to let people see where systems fall short.

That choice, Steve suggests, is where values become real.

 

Closing reflection

Publishing failure did not weaken Salesforce’s credibility. It strengthened it. Customers stayed because honesty replaced uncertainty.

As AI systems increasingly act on behalf of organisations, the same logic applies.

Trust will not be earned by perfection, but by visibility.

The leaders who understand this will not ask whether AI works. They will ask whether people can see it fail, and still choose to trust it.

Sponsored by...

 

truMRK: Communications You Can Trust


👉 Learn how truMRK helps organisations strengthen the credibility of their communications.

Want to be a guest on our show?

Contact Us.

The Responsible Edge Podcast
Queensgate House
48 Queen Street
Exeter
Devon
EX4 3SR

Join 2,500+ professionals.

Exploring how to build trust, lead responsibly, and grow with integrity. Get the latest episodes and exclusive insights direct to your inbox.

© 2026. The Responsible Edge Podcast. All rights reserved.
The Responsible Edge Podcast® is a registered trademark.


Europe’s Tech Sovereignty Test: Can Values Withstand U.S. Pressure?

Episode 131 | 26.10.2025

The EU’s clash with Trump over digital regulation exposes the fault line between free-market power and value-based governance.

Listen to the full podcast episode on YouTube, Spotify, and Apple Podcasts.

Why Europe’s Tech Fight Matters

When Donald Trump threatened new tariffs on the EU for its digital laws, it looked like another trade dispute. But behind the politics lies a deeper question: who should set the rules for global technology, and whose values shape those rules?

For TECH by Handelsblatt, this debate is not about tariffs. It is about values. Managing Director Dale Rickert believes Europe must show that growth and integrity can exist together.

“Our rules are not for sale,” he says. “They protect the democracy we rely on.”

 

A Beginning in Music

Dale started his career far from politics. He was a professional cellist in Australia and Germany. In an orchestra, he says, you learn that harmony comes from teamwork, not ego.

“You play your part, and together you create something bigger.”

That experience shaped how he works today. Whether he is running events or building partnerships, he treats business as a collective performance. Everyone has a part to play, and success depends on collaboration, not competition.

 

Building a European Vision

When Handelsblatt created TECH, the goal was to build a European platform where business and technology leaders could meet, share ideas and defend common values.

The idea came from a simple question: what does it mean to be European in business? For Dale, it means having a clear moral compass and protecting what matters most — trust, transparency and fairness.

The TECH Congress in Heilbronn has become a space for that conversation. Leaders, innovators and ministers come not only to talk but to listen and connect.

 

Europe Says: Our Rules, Our Future

The article that inspired this episode — EU defends sovereign right to regulate tech against Trump’s latest tariff threat — described how the U.S. called Europe’s digital laws “unfair.”

The EU refused to back down. Its Digital Services Act and Digital Markets Act require tech giants to take responsibility for harmful content, data privacy and monopoly power. Europe’s answer to Trump was clear: the rules stand, and they exist for a reason.

 

The Moral Divide

Dale calls it a “paradigm war.” The U.S. often removes rules to help business grow. Europe creates rules to make sure growth stays fair.

It raises a bigger question: should technology be free from limits, or guided by ethics? Dale believes that regulation does not stop innovation. It makes sure innovation helps society instead of harming it.

 

The Future of Digital Sovereignty

A less visible issue in this debate is data. Most European companies still rely on American cloud services. Under U.S. law, those providers could be compelled to hand over user data at their government’s request.

Dale warns that Europe cannot build a strong digital future on foreign infrastructure. He supports the move toward “sovereign technology” — tools and systems built in Europe, under European law, with data protected by European values.

 

The Takeaway

For Europe to become a true tech power, it must balance growth with integrity. It needs to protect trust, even when progress feels slow.

Dale’s advice is simple: talk face to face. “It’s hard to hate someone when you sit with them,” he says.

“If we work together as people, not profiles, big problems start to shrink.”


Expansion, Emissions and the Valley of Death: Can Aviation Grow Responsibly?

Episode 119 | 6.8.2025

In a political landscape where economic growth is clutched like a lifebuoy, the decision to approve a third runway at Heathrow might seem pragmatic. But for Charlie Garner — Policy & Advocacy Lead at Cleantech for UK — it’s a case study in how policy ambition and climate reality continue to fly at different altitudes.

Listen to the full podcast episode on YouTube, Spotify, and Apple Podcasts.

Speaking on The Responsible Edge, Charlie offers a view that is nuanced but clear:

“It is a bit of a shame that the government is considering airport expansion in the face of growth,” he said, cautiously.

He’s quick to acknowledge the economic rationale but even quicker to flag the climate cost: an additional 2.4 million tonnes of CO₂ annually by 2050 — and that’s with optimistic assumptions about cleaner fuels.

 

The SAF Mirage: Betting on Technology That Isn’t There

Central to the government’s climate mitigation argument is the promise of Sustainable Aviation Fuel (SAF). It’s a drop-in replacement that relies on waste oils and bio-feedstocks. But there’s a problem: “In the UK, we don’t have any operational projects in the ground,” Charlie warns.

“We’re relying on imported feedstock, creating a huge risk.”

The UK’s SAF mandate may be a step forward, but it’s more of a policy scaffold than a solution. For Charlie, it’s not enough.

“If the government wants to consider expanding Heathrow and Gatwick, we need to do more than just SAF.”

 

Crossing the Valley of Death

The real bottleneck, Charlie argues, lies in the “valley of death” — the treacherous chasm between innovation and scale. Early-stage clean aviation firms struggle to attract funding just when it matters most. It’s a familiar tale: venture capital dries up, institutional investors won’t take the risk, and the UK loses homegrown technologies to more supportive climates.

“We think the National Wealth Fund should co-invest at this stage,” he says, citing the UK Infrastructure Bank’s potential to bridge this critical gap. It’s not about blank cheques — it’s about risk-sharing to unlock scale.

 

Vision vs Delivery: A Nation Adrift

Charlie’s criticism isn’t just of a single policy decision but of the systemic inconsistency in Britain’s clean tech trajectory.

“You can’t keep changing the strategy every four years and expect stakeholders to trust you,” he notes.

His example? The indecision over energy grid pricing models, which stalled investor confidence and delayed renewable projects.

For a nation that once imagined itself a climate leader, Britain has yet to articulate a coherent green industrial strategy — let alone deliver on it.

 

The Case for a National Mission

“What are we trying to be?”

Charlie asks, echoing a deeper concern. Could the UK rebrand itself as a global clean tech hub? Could green growth be our export advantage?

“There’s not one sector the UK doesn’t have an opportunity to succeed in,” he says — provided the government stops gambling on future tech and starts investing in the present.

It’s not a romantic vision of degrowth or techno-utopia. It’s something more grounded: building a policy infrastructure that enables cooperation, coherence, and continuity across government cycles.

 

Lobbying Without Tribes

In his most passionate moment, Charlie turns his attention to the corporate world.

“We need to remove the tribal nature of lobbying at the expense of climate progress,” he argues.

In his view, competitive sectoral lobbying undermines unified climate policy — and with it, public trust in politics itself.

To fix it? “We need a new model of climate lobbying. One that values long-term thinking and shared outcomes.”

 

The Takeaway

Charlie’s magic wand wish isn’t flashy. It’s not a moonshot. It’s a call for better coordination, pragmatic realism, and long-haul investment. For those looking to lead responsibly in today’s messy world, that might just be the most radical thing of all.


Why We Need a Beehive for the Truth

Episode 117 | 28.7.2025

There’s something hauntingly familiar in Rafael Cossi’s description of the modern information landscape: fragmented, hyper-emotional, and desperately short of systemic understanding. Cossi, co-founder of Beehive News, isn’t just building a business—he’s constructing a compass for a world that no longer knows which way is north.

Listen to the full podcast episode on YouTube, Spotify, and Apple Podcasts.

When he says, “We’ve lost touch with the sense of the whole,” it’s not a lament—it’s a warning. The disease of misinformation isn’t just poisoning our political discourse or undermining innovation; it’s chipping away at the glue that binds societies together.

 

News as Theory, Not Fact

Most things in the news, says Rafael, aren’t facts. “They’re theories.” And that matters. We don’t need everyone to agree, he explains:

“Consensus is not good for progress.”

But we do need everyone to follow the same logic. That’s what Beehive provides: a transparent, objective framework to assess news articles based on consistency, context, and credibility—not ideology.

His pandemic example is revealing. When UK media declared Brazil’s Covid response a catastrophe, the headlines were technically true. But the omission of critical context—Brazil’s much larger population, regional disparities, and urban/rural divides—meant the narrative was misleading. It’s not fake news. It’s just incomplete. And that, Cossi warns, is the most dangerous kind.

 

The Slow Collapse of Trust

“Information is soft power,” Rafael reminds us. It shapes not just opinions but entire economies, voting behaviours, and social contracts. Today, he observes:

“A lot of young people in the UK don’t believe in democracy anymore.”

It’s not hard to see why. When truth becomes a battleground, the casualties are cohesion and common purpose.

Misinformation doesn’t need to be believed to be effective. It just needs to be seen. Cossi explains how emotional anchoring—what psychologists call “knowledge neglect”—can distort perception even when we know something is untrue. The damage is already done.

 

A Better Incentive: Pay for Quality

What Beehive is trying to do is simple, yet radical: create a marketplace where quality journalism is not just a moral imperative but a commercial advantage.

“When people use our app to read news,” says Cossi, “they’re 35% more likely to click on well-rated articles.”

That data doesn’t just help readers—it gives publishers a reason to care.

And some do. Beehive collaborates with media regulators and has already started nudging some major outlets towards better standards. But others? “They say, ‘You’ve correctly identified the flaws—but we only care about engagement.’”

This isn’t cynicism—it’s systems failure.

 

Beyond the Printing Press

To make sense of today’s chaotic information ecosystem, Cossi turns to history. The invention of the printing press, he notes, was followed by centuries of chaos, propaganda, and ultimately, regulation. The same must now happen with digital content.

“We review hotels, we review restaurants—why not the news?” he asks.

But his vision is not authoritarian. Beehive doesn’t decide what’s true or false. It simply makes transparent what’s missing. In doing so, it reintroduces a sense of shared informational ground—without flattening the complexity of diverse perspectives.

 

A Magic Wand for Holism

When asked what he’d change about the commercial world, Cossi doesn’t mention regulation or AI. He wants to restore a “sense of the whole”—a worldview that connects individual decisions to collective impact. The metaphor he uses comes from the Apple TV series Severance, where workers forget their real lives the moment they enter the office.

“We’ve siloed ourselves,” he says. “We’ve lost our sense of purpose.”

This is the moral heart of the conversation. Not just how we rate news. But how we relate to one another.


The AI Catalyst: Why Humans Still Matter in the Age of Artificial Intelligence

Episode 109 | 30.6.2025

In an era where AI writes poems, drives cars, and diagnoses diseases, it’s easy to feel like the machines are taking over. But as AI expert, United Nations advisor, and serial innovator Neil Sahota reminds us on The Responsible Edge, technology is only part of the story. The real question is: how do humans lead in a world where machines increasingly make decisions?

Listen to the full podcast episode on YouTube, Spotify, and Apple Podcasts.

In a thought-provoking conversation with host Charlie Martin, Neil pulls back the curtain on the AI revolution—not with doom and gloom, but with pragmatic optimism.

“People always ask, will AI take our jobs? The real concern is whether we evolve fast enough to create the new ones.”

 

AI: Not Just for Coders

Neil’s background ranges from IBM’s Watson project to UN policy advising, but what sets him apart is his ability to cut through the tech jargon. AI isn’t just for Silicon Valley or data scientists—it’s rapidly becoming embedded in how businesses, governments, and individuals operate.

“AI is already shaping everything from healthcare to legal contracts. But without responsible leadership, it can easily exacerbate inequality or misinformation,” Neil warns.

For businesses, that means AI literacy is no longer optional. It’s a boardroom imperative.

 

🚀 The Leadership Test of Our Time

AI isn’t just about efficiency—it raises profound governance and ethical questions. Neil highlights three areas where business leaders must step up:

  • AI Ethics by Design: From bias in algorithms to accountability for machine-led decisions.

  • Reskilling at Scale: Preparing people for jobs that don’t yet exist.

  • Policy & Collaboration: Ensuring AI development aligns with human values globally.

“AI will challenge the very definition of what it means to be human in the workforce. That’s why leadership, not just engineering, matters most.”

 

🌍 The Global Perspective

Drawing on his work with the UN, Neil points to AI’s potential to tackle global challenges—from climate modelling to food security. But without collaboration across sectors and borders, these opportunities could be lost to short-termism or technological monopolies.

And while AI hype dominates headlines, Neil is refreshingly candid:

“The most dangerous myth is that AI is inevitable. Its impact depends entirely on human choices—what we prioritise, how we govern, and whether we include everyone.”

 

🎧 Listen to the Full Conversation

This episode of The Responsible Edge goes beyond the AI headlines. It’s a call to action for leaders, policymakers, and anyone curious about how we build a future where AI empowers rather than replaces us.


Reinventing Women’s Health: How 28X is Building a New Model for Ethical Tech

Episode 89 | 21.4.2025

In an era where digital health solutions are booming, Amber Vodegel is proving that innovation doesn’t have to come at the cost of ethics. Speaking to The Responsible Edge podcast, Amber shared the remarkable story behind her new venture, 28X—a revolutionary period tracking app aiming to transform women’s health by putting data ownership, accessibility, and integrity at its core.

Listen to the full podcast episode on YouTube, Spotify, and Apple Podcasts.

🎙️ “We need to give women the power back over their own data,” Amber explained, detailing a blueprint for business that challenges the profit-first model dominating today’s tech landscape.

 

The Problem with Current Women’s Health Apps

While millions of women globally rely on period tracking apps, Amber highlighted a worrying reality:

  • Most are VC-backed and male-owned, with data practices that often prioritise profit over privacy.

  • Some leading apps have faced lawsuits for mishandling sensitive user information, eroding trust.

  • Subscription models and employer-based access schemes often exclude the 80% of women who cannot afford expensive plans.

“These models are outdated,” Amber said. “They’re designed for the privileged few, not the many.”

 

28X: A Radically Different Approach 🚀

Rather than replicating the flawed systems already in place, Amber is building 28X around three radical principles:

  • No data collection by default: Users’ information stays on their device unless they choose otherwise.

  • Completely free access: No paywalls, no barriers.

  • Ethical funding model: Self-funded with selective, mission-aligned investors, avoiding the pressures that can lead to unethical compromises.

🦋 The name 28X reflects the 28-day cycle and the X chromosome, with a butterfly symbol representing transformation—a fitting metaphor for Amber’s ambitions.

 

A New Blueprint for Business 📈

Amber’s vision for 28X isn’t just about creating a better health app. It’s about showing the world that responsible, ethical businesses can still scale, still succeed—and still make a real difference.

Key features of her approach:

  • Circular impact: Future profits will be partially reinvested in supporting female founders and climate-focused businesses.

  • Longevity over exit: Rather than building for a quick sale, 28X is designed for long-term ownership, offering dividends rather than structuring for acquisition.

  • Open invitation for collaboration: Amber is actively calling for support from professionals who want to help build an ethical giant in women’s health tech.

“We have enough companies focused purely on extraction. Let’s build something different—something that gives back.”

 

Lessons from an Entrepreneurial Journey

Amber’s own story—rooted in resilience, creativity, and hard-earned lessons—shapes everything she is building today.

Key takeaways she shared:

  • Build around paradox: Entrepreneurship is the constant balancing of highs and lows.

  • Own your resilience: Setbacks are inevitable; persistence is essential.

  • Work when others watch TV: Amber attributes much of her early success to sacrificing downtime to build her ventures in the evenings.

  • Ethics must be baked in early: Retrofitting ethics doesn’t work; they must be foundational.

 

A Call to Action 📣

Amber’s goal is bold: reach 100 million women a month within five years, making 28X the world’s largest, most trusted period tracking platform.

“If we want to change the status quo, we need to think big—and do it ethically,” she said.

💬 Interested in helping? Amber is inviting skilled volunteers, collaborators, and mission-driven supporters to join the movement. “We can build this together,” she urged.
