AI Governance Labour and Profession: Introduction – Power Without Control Becomes Spasm
A body without muscles cannot move. A body without nerves cannot feel. Together they provide strength and sensitivity: the ability to act with force, but also to sense when to hold back. Organizations are no different.
AI brings speed, power, and reach — but unless labour and professions, the muscles and nerves of governance, are engaged, that power degenerates into spasm. Machines can calculate, but they cannot exercise judgment. Algorithms can recommend, but they cannot be accountable. Only people — the professionals at every level of an organization — can validate, document, and ultimately decide.
Muscles deliver the strength to execute. Nerves transmit the signals that prevent damage. Governance ensures they remain connected: that the organization moves powerfully, but never blindly.
Metaphor – Muscles for Strength, Nerves for Sensitivity
When muscles contract without signals from nerves, the body suffers seizures. When nerves transmit without muscles responding, the body lies paralyzed. Governance seeks the balance between the two.
- Muscles represent employees and professionals who execute strategy, operate processes, and carry the daily load of decision-making. They are accountants validating journal entries, nurses documenting treatments, compliance officers approving exceptions, engineers monitoring safety.
- Nerves represent sensitivity and culture — the ability to detect bias, to escalate when things feel wrong, to apply empathy in a decision that affects a customer, patient, or employee.
Without muscles, AI governance is lofty policy trapped in paralysis. Without nerves, AI governance becomes brute enforcement: rigid, mechanical, and often harmful.
Case Study 1 – Amazon’s Black Box Recruiter
Amazon’s experiment with AI recruiting is a classic muscle–nerve failure. The tool screened thousands of CVs with great efficiency — the muscles of HR were working at scale. But the nerves were severed: the model downgraded resumes that included the word “women’s,” replicating historical bias.
It was not lack of processing power that doomed the project. It was lack of sensitivity. No one tested the system for bias until whistleblowers noticed patterns. Governance requires both strength and nerves: HR professionals empowered to question results, compliance teams running fairness audits, and leadership insisting that automation never trumps equality.
Lesson: muscles without nerves cause injury. AI may appear efficient, but without cultural awareness and human oversight, it reproduces prejudice at scale.
Read more from the BBC: Amazon scrapped ‘sexist AI’ tool.
Case Study 2 – NHS and Diagnostic AI
In the UK, NHS pilots of diagnostic AI for sepsis and cancer faced clinician resistance. Doctors asked a simple governance question: How can I sign off on a diagnosis if I cannot see the reasoning?
Here the nerves were intact: professional skepticism and accountability pushed back against blind reliance. And governance improved when AI suppliers provided heatmaps of X-rays, ranked indicators for triage, and confidence levels. Muscles (clinicians) executed, but nerves (professional ethics) ensured they only acted when they understood.
Lesson: nerves without muscles cause paralysis. If professionals refuse to use AI because it is opaque, the system stagnates. But when explainability bridges the gap, muscles and nerves work together.
It also shows that hospitals learn. Read more on the BBC: AI ‘co-pilot’ used to speed up cancer diagnosis.
Case Study 3 – Barclays and Soft Controls
After the 2008 crisis, Barclays realized that technical controls alone were insufficient. Fraud detection and compliance systems could catch anomalies, but culture determined whether staff acted. Employees had to feel safe escalating issues, challenging questionable trades, and refusing pressure.

That is the nervous system of governance: soft controls, tone from the top, and psychological safety. Muscles enforce policy; nerves ensure it is done with integrity. AI systems today repeat the lesson: without a culture that encourages people to question the algorithm, organizations risk blind execution of flawed outputs.
Read more from Barclays Simpson in Auditing Culture: A New Look At Soft Controls, which examines how culture can be audited through soft controls.
Case Study 4 – Fujitsu and Upskilling in Japan
Fujitsu responded to AI adoption not with layoffs but with upskilling. Employees were trained to interpret AI dashboards, understand model limitations, and design new workflows. The muscles grew stronger — capable of heavier loads — because the nerves were trained alongside, teaching staff when to trust AI and when to override.
Governance lesson: AI adoption must include investment in professional education. Boards cannot assume staff will intuitively know how to validate AI. Skills in critical thinking, ethics, and data literacy are governance tools just as much as ERP audit trails.
Read more from Fujitsu in Fujitsu Embarks towards ‘New Normal’, Redefining Working Styles for its Japan Offices.
Case Study 5 – Infosys and Global Service Delivery
Infosys delivers AI-enabled services across continents. For European clients, Indian professionals act as extended muscles and nerves. Distance introduces risk: signals may degrade, accountability may blur.
Governance answers with contracts, dashboards, and shared audits. Clients demand transparency: reason codes, cross-border monitoring, and the right to inspect processes. The muscles may work abroad, but the nerves must still connect locally.
Lesson: global muscles need global nerves. Outsourcing multiplies the need for clarity and oversight; it does not reduce it.
Read more in the viewpoint document ‘Banking without borders’ by Infosys.
Case Study 6 – Petrobras and the Cost of Severed Nerves
Brazilian oil giant Petrobras collapsed into scandal when internal controls failed to detect corruption. AI compliance tools were later introduced, but the skeleton was already brittle, and the nerves — professional ethics, whistleblower channels — had been cut.
This case shows that technology cannot substitute for culture. Muscles without nerves are dangerous: they follow orders blindly, amplifying misconduct. Governance begins with integrity; AI only adds speed.

Read more in the Britannica entry on the Petrobras scandal.
Emotional Intelligence – The Hidden Nervous System
Boards often focus on technical skill (IQ) but neglect emotional intelligence (EQ). Yet EQ is the invisible nervous system of governance.
- A customer service agent explaining a loan rejection must do so with respect, or trust erodes.
- A doctor interpreting AI diagnostics must apply compassion, or patients lose confidence.
- A board discussing AI risks must listen to frontline staff, or signals of failure are missed.
EQ turns governance from compliance into legitimacy. Without it, even technically sound AI produces resentment and resistance.
Read more about this in the Brain of AI Governance.
Muscles Under Strain – The Risk of Burnout
Just as overworked muscles tear, overburdened professionals fracture under AI-driven workloads. Employees asked to “validate” endless AI outputs without time, training, or authority become fatigued. They rubber-stamp decisions instead of challenging them.
Governance must design workflows where validation is meaningful and manageable. That means clear thresholds: AI handles routine, low-risk cases autonomously, but humans review the material, ambiguous, or ethically sensitive ones. Muscles strengthen through exercise, not exhaustion.
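As an illustration, here is a minimal routing sketch in Python. The threshold value and field names are hypothetical, not a prescribed standard; the point is that the boundary between autonomous AI handling and human review should be explicit in the workflow, not left implicit.

```python
from dataclasses import dataclass

# Hypothetical threshold: each organization must calibrate its own.
AUTO_RISK_THRESHOLD = 0.2   # below this, AI may act autonomously

@dataclass
class AIRecommendation:
    case_id: str
    risk_score: float          # model-estimated risk, 0.0 to 1.0
    ethically_sensitive: bool  # flagged by policy rules, not by the model

def route(rec: AIRecommendation) -> str:
    """Return who handles the case: the system or a human professional."""
    # Ethically sensitive cases always go to a person, whatever the score.
    if rec.ethically_sensitive:
        return "human_review"
    if rec.risk_score < AUTO_RISK_THRESHOLD:
        return "auto_process"   # routine, low-risk: AI handles it
    return "human_review"       # material or ambiguous: a person decides
```

Because only the material, ambiguous, or ethically sensitive cases reach people, validation stays meaningful: professionals exercise judgment instead of rubber-stamping everything.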
Reflexes and Delays – The Importance of Speed in Nerves
Nerves work not only by transmitting signals but by doing so quickly. In governance, delays are dangerous. If whistleblower channels take months to investigate, signals die before reaching the brain. If compliance reports arrive only quarterly, risks escalate unnoticed.
AI can accelerate nerve signals: anomaly detection tools alert in real time; dashboards surface risks instantly. But governance ensures those alerts are acted upon, not ignored. Reflexes matter: muscles must respond swiftly when nerves fire.
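As a sketch of that reflex, assuming hypothetical severity levels and acknowledgement windows, an alert that nobody acts on within its window can escalate automatically instead of dying in a queue:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical reflex windows: how long an alert may wait, by severity.
ACK_WINDOWS = {
    "critical": timedelta(minutes=15),
    "high": timedelta(hours=4),
    "routine": timedelta(days=1),
}

def needs_escalation(raised_at: datetime, severity: str,
                     acknowledged: bool) -> bool:
    """True if an unacknowledged alert has outlived its reflex window."""
    now = datetime.now(timezone.utc)
    return not acknowledged and (now - raised_at) > ACK_WINDOWS[severity]
```

The specific windows matter less than the design principle: an ignored signal moves upward on its own, so the reflex does not depend on someone remembering to check a dashboard.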
Labour, Profession and Sensitivity – Keeping Muscles and Nerves Aligned
A body only functions when muscles and nerves move in harmony. If muscles contract without signals, there is seizure; if nerves send warnings without response, there is paralysis. In organizations, labour and professions provide the strength to act, while sensitivity — culture, trust, and ethics — ensures that strength is directed responsibly.
Keeping muscles and nerves aligned means giving professionals the authority to act, the tools to understand AI outputs, and the psychological safety to escalate concerns. Employees must not be reduced to rubber stamps for machine recommendations. Instead, they should function as interpreters: validating AI suggestions, documenting decisions, and applying judgment where technology cannot.
Labour provides the movement: accountants, doctors, compliance officers, engineers. Professions provide the standards: codes of conduct, ethical frameworks, and regulatory obligations. Sensitivity ensures both are connected: the culture that encourages staff to question, to pause, and to explain decisions in terms customers, patients, and regulators can understand. Without this alignment, AI either overwhelms staff with mechanical power or leaves them hesitant and inert.
What This Means in Practice
For governance, keeping muscles and nerves aligned requires three commitments:
- Empower employees as final decision-makers. Staff must be trained and authorized to validate AI, not sidelined by it. Authority to override, with clear documentation, is a non-negotiable safeguard.
- Integrate professional standards into workflows. Decisions should be consistent with legal and ethical codes, from medical protocols to IFRS and compliance rules. This anchors muscles in professional duty.
- Protect sensitivity as a governance asset. Culture and soft controls are not “intangibles” but the nervous system that makes AI trustworthy. Boards must monitor trust, tone, and safety as rigorously as financial controls.
In practice this means: HR ensuring staff are trained in bias awareness; compliance embedding escalation rules; audit verifying that documented explanations exist; and leadership signalling that questioning AI is not obstruction but responsibility.
Governance Functions – Who Controls the Muscles and Nerves?
Boards and Non-Executive Directors
They set the tone: AI must support people, not replace accountability. Boards ask whether employees understand outputs, whether whistleblower channels are trusted, and whether culture encourages escalation.
Compliance Officers
They ensure AI follows laws and ethics. Compliance translates regulation into workflows: mandatory bias checks, required documentation, and explainability templates.
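To make that translation concrete, a hedged sketch follows, with hypothetical artefact names, of how such requirements might be encoded as a release gate that blocks an AI use case until the mandated evidence exists:

```python
# Hypothetical pre-deployment gate: every AI use case must clear
# compliance-mandated checks before it may go live.
REQUIRED_ARTEFACTS = [
    "bias_audit_report",        # mandatory bias/fairness check
    "decision_documentation",   # required record of design decisions
    "explainability_template",  # standard wording for human-readable reasons
]

def release_approved(submitted_artefacts: set[str]) -> bool:
    """Deployment is blocked until all mandated artefacts are present."""
    missing = [a for a in REQUIRED_ARTEFACTS if a not in submitted_artefacts]
    if missing:
        print(f"Blocked: missing {', '.join(missing)}")
        return False
    return True
```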
Internal Audit
They verify muscles and nerves function. Audit samples AI decisions, ensuring explanations are recorded and acted upon. They confirm that staff have documented overrides and that alerts reached decision-makers.
Supervisors and Middle Management
They are the connective tissue. They must shield staff from overload, ensure training is applied, and escalate concerns upwards.
Works Councils and Employee Representatives
They provide collective nerve signals. When staff fear AI monitoring or job loss, governance means dialogue, agreements, and safeguards.
Read more about the monitoring activities these roles govern in Step 5 – COSO Monitoring Activities.
Global Governance Lessons
- Humans remain decision-makers. AI provides advice at speed; accountability rests with employees and boards.
- Soft controls matter. Culture, trust, and EQ prevent paralysis or spasm.
- Upskilling is governance. Training professionals to interpret and question AI is as vital as any audit.
- Global services need global oversight. Distance multiplies the importance of nerves.
- Explainability is proprioception. Awareness of movement prevents self-injury.
- Muscles need rest and support. Workflows must protect staff from fatigue and ensure validation is real.
FAQs – The Muscles and Nerves of AI Governance
Why describe labour and professions as the “muscles and nerves” of AI governance?
Employees are the muscles that provide the strength to execute strategies, operate systems, and ensure processes are carried out in daily practice. But muscles alone are not enough; they need nerves to direct and control their force. Culture, ethical awareness, and professional standards function as the nervous system, ensuring that the strength is applied in the right way, at the right moment.
Without this balance, AI governance either becomes blind enforcement — decisions made with speed but without reflection — or complete paralysis, where no one dares to act. This metaphor helps boards and managers visualize that governance is not only about policies and technologies, but about people who must be both empowered and guided.
The muscles give the organization its force; the nerves provide the sensitivity and feedback loops that make sure the force builds trust instead of breaking it.
Can AI replace human decision-making?
AI can process vast data volumes at speeds far beyond human capacity, and it can propose recommendations that appear highly accurate. But governance demands more than efficiency: it demands accountability, legitimacy, and the ability to explain why a decision was made.
These qualities are uniquely human. A doctor, for instance, may accept an AI suggestion, but only after reviewing the evidence and ensuring it aligns with professional standards of care. A compliance officer may follow an AI alert, but only after documenting why it fits the regulatory framework.
Without human validation, AI decisions are just automated outputs — powerful but legally and ethically empty.
Governance insists that humans remain the final decision-makers, because only they can carry responsibility before regulators, courts, and stakeholders. AI may accelerate analysis, but judgment, context, and accountability stay in human hands.
What role do soft controls play in AI governance?
Soft controls — culture, trust, and emotional intelligence — form the invisible nervous system of organizations. They ensure that when AI delivers an output, employees feel safe to question it, escalate concerns, and apply ethical standards. Technical controls alone cannot guarantee integrity; they only catch violations after the fact.
Soft controls influence behaviour proactively: if the tone from the top emphasizes transparency and fairness, staff will be more willing to challenge odd results instead of rubber-stamping them. For example, if a recruitment algorithm consistently overlooks minority candidates, only a culture of openness will ensure HR professionals speak up and demand adjustments.
In practice, soft controls turn governance from a rulebook into lived behaviour. They bridge the gap between formal compliance and actual practice, ensuring that AI is not only efficient but also trustworthy. Without soft controls, even the best-designed AI systems risk being used in ways that erode stakeholder confidence.
How should boards prepare employees for AI adoption?
Boards must recognize that AI adoption is not simply a technological upgrade but a profound change in how professionals work. Preparing employees means providing them with the knowledge, authority, and confidence to work with AI responsibly. Training in data literacy, bias awareness, and critical judgment must be built into professional development, just as financial literacy is for managers.
Governance also requires boards to ensure workloads are realistic: if staff are expected to “validate” endless AI outputs without time or authority, they will default to rubber-stamping, and governance collapses. Instead, boards should set thresholds where AI acts autonomously only in low-risk scenarios, while humans review and decide in high-stakes cases.
Moreover, boards must ensure that employee voices are heard during implementation. Works councils, unions, or professional associations can serve as governance partners, ensuring adoption does not erode trust but strengthens it. In short, boards prepare staff not only with tools but with a culture that reinforces professional responsibility.
How do global service providers fit into this metaphor?
When organizations outsource critical processes to global service providers, they extend their muscles across borders. Indian engineers processing European financial data, or Brazilian analysts monitoring supply chains, are effectively part of the corporate body. But if the nerves — oversight, feedback, and transparency — do not connect, the body suffers misalignment.
Governance requires contracts that demand explainability, shared dashboards that give clients visibility, and cross-border audits that ensure accountability. Without these, distance becomes opacity, and local problems may escalate before headquarters even notices. The metaphor highlights the risk: strong muscles abroad are useless if the nerves that carry signals home are cut.
Service providers must be integrated into the governance framework, with clear responsibilities and reporting lines, so that actions taken far away can still be trusted locally. In this way, global muscles remain coordinated by global nerves, ensuring that outsourcing strengthens rather than weakens governance.
What is the risk if employees cannot explain AI outputs?
If employees cannot explain AI outputs, two risks emerge. First, paralysis: professionals may refuse to use AI tools because they cannot justify the results, slowing down decisions and undermining innovation. Second, blind execution: staff may simply accept whatever the system delivers, turning into rubber stamps. Both outcomes damage governance.
Regulators, courts, and stakeholders demand explanations in human language, not technical code. A bank that declines a loan must tell the customer why; a hospital that recommends treatment must explain the reasoning; an HR department that rejects a candidate must defend its process. If employees cannot provide these explanations, trust erodes, legal challenges multiply, and legitimacy collapses.
Governance depends on explainability by design: systems must generate reasons, staff must be trained to understand them, and organizations must document both the AI’s output and the human’s judgment. Only then do muscles and nerves work together, ensuring that AI remains a tool of governance rather than a threat to it.
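One way to picture explainability by design is a decision record that pairs the machine's output with the human's judgment. The sketch below uses hypothetical field names; a real schema would follow the applicable regulatory regime and professional standards.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class DecisionRecord:
    """One auditable record pairing an AI output with a human judgment."""
    case_id: str
    ai_recommendation: str        # what the system proposed
    reason_codes: tuple           # machine-generated reasons, e.g. ("INCOME_LOW",)
    plain_language_reason: str    # wording a customer or regulator can read
    decided_by: str               # the accountable professional
    human_decision: str           # "accepted", "overridden", ...
    override_justification: str = ""  # required whenever the human overrides
    decided_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
```

A record like this captures both halves of the decision: the nerve signal (the AI's reasons) and the muscle's response (the professional's judgment), so that regulators, courts, and customers can trace each to a named, accountable person.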