Introduction
Apprenticeship units are short, flexible training courses designed to support employers to upskill their workforce in critical skill shortage areas. Units are for employed learners aged 19 and over, where their employer has identified that they need to upskill quickly to respond to emerging skills gaps and to support business growth and productivity.
Apprenticeship units are based on relevant knowledge and skills in existing employer-led occupational standards to ensure high-quality, targeted training. Each unit is short, with the length of training ranging from 30 to 140 hours delivered over a period of 1 to 16 weeks. This gives employers maximum flexibility to select a unit that meets their specific skill need and to deliver the training in a way that fits around their business.
Who is it for?
This apprenticeship unit is for individuals in leadership roles who are responsible for setting direction and have oversight of AI use, and who, with the support of their employer, need upskilling in the safe and effective delivery of AI-enabled organisational transformation. It is suited to those overseeing the implementation of AI and ensuring that AI solutions are integrated effectively into organisational processes and ways of working.
Learning outcomes
A learning outcome is a concise statement that describes what an individual should be able to do by the end of their course. It summarises a cluster of knowledge and skills in the course and provides a foundation for assessment.
- Lead delivery of AI-enabled organisational change, ensuring AI solutions are sustainable and aligned to long-term organisational objectives.
- Assess and manage workforce impacts of AI adoption, including reskilling, role redesign, anticipating and responding to potential job displacement and workforce reduction. Assess the impact on the organisation’s wider ecosystem, for example, suppliers and digitally excluded groups.
- Implement organisational level AI risk management, including monitoring, the use of tools, mitigation and escalation.
- Achieve audit requirements and regulatory compliance, including incident and security response planning.
- Monitor performance and risks of deployed AI systems, including bias, drift and security vulnerabilities.
- Communicate AI risks and opportunities internally and externally, including communication with non-technical stakeholders and regulators.
| Occupational standard | Related apprenticeships | Occupational map link |
|---|---|---|
| Artificial intelligence (AI) and automation practitioner | ST1512 V2.0 | OCC1512 |
| Chartered manager (degree) | ST0272 V1.1 | OCC0272 |
| Senior leader | ST0480 V1.2 | OCC0480 |
Entry requirements
Learners must be employed and must be 19 years or over.
Learners must be working in a leadership position within an organisation, with the autonomy to deliver technological change and inform investment decisions.
Technical knowledge
K1: AI and automation concepts and models that support leadership decision-making, and their limitations. The impact adoption may have on workplace culture and wellbeing.
K2: The capabilities, benefits and risks of automation, AI and digital tools, including responsible use, ethical considerations and the potential impact on the workforce.
K3: The role of organisational leadership in responsible AI adoption, including setting values, policy, and strategy. The business case for ethical AI adoption, including reputational risk, staff engagement and morale, and long-term sustainability.
K4: How to develop and implement organisational AI strategy and plans, including approaches to workforce development, taking and managing risk, monitoring and evaluation, and quality assurance.
K5: How to assess the viability of solutions when making acquisition decisions, for example, testing and evaluating solutions, using test data and results, feasibility (time, cost, data quality and process maturity), and user testing.
K6: The capabilities, risks and implications of adopting on-premise, cloud-based and third-party solutions.
K7: The capabilities, benefits and risks of automation, AI and digital tools, including responsible use, ethical considerations and the potential impact on the workforce.
K8: Governance principles to ensure accountability and compliance, including defining roles and responsibilities to identify, escalate and mitigate threats or risks to assets, data and cyber security.
K9: Crisis and risk management strategies including accountability and technological implications.
K10: Principles of human oversight and human AI collaboration to achieve shared outcomes.
K11: Principles for operationalising sustainable AI solutions to support organisational strategies and objectives.
K12: Governance principles to ensure accountability and compliance, including methods to identify system vulnerabilities and mitigate threats or risks to assets, data and cyber security.
K13: Engagement and training approaches used with non-technical staff to understand their roles, responsibilities, and concerns when AI automation solutions are proposed, in support of strategic AI governance decisions.
K14: Principles to support project and change management delivery.
K15: Principles and practices of algorithmic impact assessment and workforce equality monitoring, including methods to identify, assess, and mitigate potential disproportionate impacts of automation and AI systems on different workforce groups. Organisational responsibilities under equality and employment law, and methods to evidence fairness and transparency in adoption.
K16: Principles and practices for the long-term monitoring of AI and automation solutions, including detection and mitigation of risks such as model drift, emerging bias, degraded performance, and security vulnerabilities.
Technical skills
S1: Identify organisational improvements and opportunities for innovation and growth, using qualitative and quantitative analysis of information and data.
S2: Set strategic direction for AI and gain support for it from key stakeholders.
S3: Commission analysis to identify if AI adoption is viable. Evaluate assessments of risks and unintended consequences of AI automation projects, such as the impact on job roles.
S4: Use evidence to inform governance of AI adoption, outcomes and facilitate improvement.
S5: Ensure sustainable and efficient AI and automation solutions.
S6: Ensure business needs are aligned with technical capabilities, to ensure solutions are scalable, efficient, and aligned with the organisation’s strategic objectives.
S7: Apply principles relating to ethics and values-based leadership and governance, and regulatory compliance.
S8: Lead and respond in a crisis situation using risk management techniques.
S9: Use project management principles, techniques and tools to support the development of clear, balanced AI communications and briefings, articulating both opportunities and risks.
S10: Lead deployment of AI and automation strategies, including measures to deal with the impact of automation, for example, workforce engagement, retraining, redeployment, or upskilling of affected staff.
S11: Present and communicate information, including the translation of technical concepts into accessible materials to support clear dialogue with stakeholders.
Knowledge and skills outcomes
| Function | Learning Outcome | K & S mapping |
|---|---|---|
| AI-enabled change | Lead delivery of AI-enabled organisational change, ensuring AI solutions are sustainable and aligned to long-term organisational objectives | K3, K4, K11, K15, S1, S5, S6, S10 |
| Workforce and ecosystem impact | Assess and manage workforce impacts of AI adoption, including reskilling, role redesign, anticipating and responding to potential job displacement and workforce reduction. Assess the impact on the organisation’s wider ecosystem, for example, suppliers and digitally excluded groups. | K2, K7, K13, K16, S7, S10 |
| Risk management | Implement organisational level AI risk management, including monitoring, the use of tools, mitigation and escalation. | K8, K9, K10, K14, S8 |
| Audit, compliance & security | Achieve audit requirements and regulatory compliance, including incident and security response planning. | K9, K10, K12, K14, S8, S9 |
| Monitoring | Monitor performance and risks of deployed AI systems, including bias, drift and security vulnerabilities. | K10, K16 |
| Communication | Communicate AI risks and opportunities internally and externally, including communication with non-technical stakeholders and regulators. | K3, S9, S11 |
Funding
This apprenticeship unit is currently eligible for public funding.
Skills England will provide the Department for Work and Pensions with ongoing advice on critical skills needs, and the affordability and prioritisation of funding for apprenticeship units will remain under review.
The Department will give notice if funding for this apprenticeship unit is to be withdrawn. Funding for new starts will then cease four weeks after that notice is given.
Validation and assessment
Mandatory: As a minimum, learners will need to pass a skills test delivered by the training provider, to demonstrate that they have acquired the skills and knowledge set out in the apprenticeship unit. Employers will need to validate the result to confirm the learner has been successful.
Extended: In addition, employers (or learners) have the option to choose independent external assessment where they feel it is appropriate, for example, through the use of a non-mandatory qualification.
If the apprenticeship unit is in a regulated occupation and the role requires adherence to industry recognised standards and procedures, we would expect employers to choose an extended assessment.
Version log
| Version | Change detail | Earliest start date | Latest start date |
|---|---|---|---|
| 1.0 | | 28/04/2026 | Not set |
Crown copyright © 2026. You may re-use this information (not including logos) free of charge in any format or medium, under the terms of the Open Government Licence. Visit www.nationalarchives.gov.uk/doc/open-government-licence