
Artificial Intelligence (Regulation and Employment Rights) Bill

Report type: Research and reports

Introduction

In September 2023, the TUC set up a taskforce with the AI Law Consultancy at Cloisters Chambers and the Cambridge University Minderoo Centre for Technology and Democracy to manage the drafting of the Artificial Intelligence (Regulation and Employment Rights) Bill (“the Bill”).

The Bill was drafted by Robin Allen KC and Dee Masters of the AI Law Consultancy at Cloisters Chambers. The TUC was assisted in the administration of the project by the Cambridge University Minderoo Centre for Technology and Democracy.

The Bill benefitted from the input and expertise of a Special Advisory Committee, which met three times across 2023 and 2024. 

Members of the Committee included representatives from a diverse range of organisations and political parties, including the Ada Lovelace Institute, the Alan Turing Institute, Connected By Data, TechUK, UKBlackTech, the British Computer Society, CIPD, the RAI UK, Cambridge University, Oxford University, Prospect, Community, CWU/UTAW, USDAW, GMB and cross-party MPs. The policy expressed in the Bill is that of the TUC and should not be taken to express the policy of these organisations unless explicitly stated. 

Purpose of the Bill 

The Bill regulates the use of artificial intelligence systems by employers in relation to workers, employees and jobseekers to protect their rights and interests in the workplace.  

The Bill also provides for trade union rights in relation to the use of artificial intelligence systems by employers, addresses the risks associated with the value chain in the deployment of artificial intelligence systems in the field of employment, and enables the development of safe, secure and fair artificial intelligence systems in the employment field. 

The rights and obligations contained in the Bill will be enforceable in the Employment Tribunal which is ordinarily a ‘no cost’ jurisdiction (where the parties are responsible for their own costs regardless of outcome). 

Authored by Robin Allen KC and Dee Masters, AI Law Consultancy at Cloisters Chambers

Outline of AI Bill 

The Bill is divided into thirteen Parts, each covering a different aspect of artificial intelligence and employment, with four Schedules which complement these Parts.  There are also Explanatory Notes. 

Part 1: Preliminary  

This part introduces the structure of the Bill. 

Part 2: Core Concepts  

This part of the Bill defines the core concepts that determine the remit of the Bill, such as ‘artificial intelligence system’, ‘high-risk decision-making’, ‘data’, ‘processing’, ‘emotion recognition technology’, ‘employee’, ‘worker’, ‘jobseeker’ and ‘employer’.

The decision by an employer or its agent to deploy artificial intelligence systems for ‘high-risk decision-making’ is the trigger for most of the rights and obligations in the Bill.  

‘Decision-making’ means any decision made by an employer or its agent in relation to its employees, workers or jobseekers taken or supported by an artificial intelligence system.  

Decision-making is ‘high-risk’ in relation to a worker, employee, or jobseeker, if it has the capacity or potential to produce legal effects concerning them, or other similarly significant effects. 

The term ‘jobseeker’ is new and covers ‘a person who is actively seeking new employment, whether or not that person is already employed’.  

The definition of ‘employer’ includes an existing employer of a worker or employee and a prospective employer in relation to a jobseeker. 

Part 3: Transparency, Observability, and Explainability  

This Part enacts positive duties on employers and their agents in the employment sphere.  

A new type of assessment, the ‘Workplace AI Risk Assessment’ (WAIRA), is created.

An employer cannot undertake high-risk decision-making until a WAIRA has assessed the artificial intelligence system for risks relating to health and safety, equality, data protection and human rights.

There will need to be direct consultation with employees and workers before high-risk decision-making occurs; the WAIRA will be central to that consultation.  

Employers will need to establish and maintain a register of information about the artificial intelligence systems used in high-risk decision-making.

There will be a right to personalised explanations for high-risk decisions which are, or might reasonably be expected to be, detrimental to employees, workers or jobseekers.

Employees, workers or jobseekers will be entitled to a right to human reconsideration of a high-risk decision.   
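By way of illustration only, the information that the Part 3 register must hold could be modelled as a simple structured record. The field names and example values below are assumptions made for this sketch, not terms drawn from the Bill.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RegisterEntry:
    """Illustrative sketch of one entry in an employer's register of AI
    systems used in high-risk decision-making (field names are assumed)."""
    system_name: str                 # the artificial intelligence system in use
    use_commenced: date              # date on which use of the system began
    use_ended: Optional[date]        # date use ended, or None if still in use
    decision_categories: list[str]   # categories of high-risk decision-making
    purpose: str                     # purpose or aim in using the system
    data_categories: list[str]       # type or category of data processed
    waira_date: Optional[date]       # date of any WAIRA carried out

entry = RegisterEntry(
    system_name="CV-screening model",
    use_commenced=date(2025, 1, 6),
    use_ended=None,
    decision_categories=["shortlisting of jobseekers"],
    purpose="initial sift of applications",
    data_categories=["personal data"],
    waira_date=date(2024, 12, 1),
)
```

Any real register would also need to satisfy whatever format regulations the Secretary of State makes under the Bill; this sketch only mirrors the categories of information the Bill itself lists.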

Part 4: Prohibition on Detrimental Use of Emotion Recognition Technology 

This part prohibits the use of emotion recognition technology in high-risk decision-making that may be detrimental to a worker, employee or jobseeker.  

Part 5: Prohibition on Discrimination  

Existing rights in the Equality Act 2010 are amended to tailor them to the use of artificial intelligence systems by employers and their agents in relation to employees, workers and jobseekers. 

The amendments include that employees will not be liable for the discriminatory consequences of artificial intelligence systems used by their employers, and that employers will need to prove that systems are not discriminatory in order to avoid liability, subject to a new audit defence.

This new audit defence will allow employers and their agents to successfully defend a discrimination claim where they did not create or modify the artificial intelligence system and conducted thorough auditing before deploying it, including introducing careful procedural safeguards.    

Part 6: Health and Wellbeing 

There is a statutory right to disconnect which will be added to the Employment Rights Act 1996. 

Part 7: Dismissal 

This part states that it will be automatically unfair to dismiss an employee through unfair reliance on high-risk decision-making, or as a punishment because an employee has exercised their right to disconnect. The right to interim relief pending determination of any complaint will also be extended to these scenarios.

Part 8: Trade Unions 

There are provisions for the fair use of data so that trade unions can be provided with the data collected by employers in relation to their members.    

Existing collective consultation obligations in relation to trade unions are also extended to situations in which an employer is proposing to undertake high-risk decision-making.

The consultation must begin at least one month before the high-risk decision-making takes place and must be repeated every 12 months for as long as decision-making continues. 

The consultation must include consultation about the risks to the rights of employees and the measures envisaged to address the risks. 
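As a rough sketch of that timetable (treating “one month” as one calendar month, which is an assumption about how the period would be computed, as is the function name), the consultation dates for a system used over several years might be derived as follows:

```python
from datetime import date

def consultation_dates(deployment: date, years_in_use: int) -> list[date]:
    """Illustrative only: the latest date the initial consultation may begin
    (one month before high-risk decision-making starts), followed by the
    12-monthly repeats required while that decision-making continues."""
    # one calendar month before deployment (naive month step-back)
    month, year = deployment.month - 1, deployment.year
    if month == 0:
        month, year = 12, year - 1
    latest_start = date(year, month, deployment.day)
    repeats = [date(deployment.year + n, deployment.month, deployment.day)
               for n in range(1, years_in_use)]
    return [latest_start] + repeats

# A system first used for high-risk decision-making on 1 June 2025,
# and kept in use for three years:
print(consultation_dates(date(2025, 6, 1), 3))
# → [datetime.date(2025, 5, 1), datetime.date(2026, 6, 1), datetime.date(2027, 6, 1)]
```

The month arithmetic here is deliberately naive (it would fail for a day that does not exist in the previous month); statutory time limits would be computed under the usual rules for calendar months.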

Part 9: Auditing and Procedural Safeguards  

This section of the Bill sets out the auditing of artificial intelligence systems for discrimination, and the standards which an employer must meet to rely on the audit defence set out in Part 5.

Part 10: Regulators and Bodies in the Employment Field and Artificial Intelligence 

This part sets out regulatory obligations concerning artificial intelligence and details the principles that key regulators (identified within Schedule 3) must apply in any context concerning employment and the deployment of artificial intelligence systems.  

The AI Bill Project

This paper sets out the background to the TUC Artificial Intelligence (Regulation and Employment Rights) Bill (“the Bill”), the multi-stakeholder process behind the drafting, why the Bill is needed, and how it could improve the rights of working people.

AI is rapidly transforming our society and the world of work, yet there are no AI-related laws in place in the UK, nor any current plans to legislate soon.

Urgent action is needed to ensure that people are protected from the risks and harms of AI-powered decision making in the workplace, and that everyone benefits from the opportunities associated with AI at work. Employers and businesses also need the certainty offered by regulation.  

And the more say workers have in how technology is used at work, the more rewarding and productive the world of work will become for us all.  

British workers are overwhelmingly supportive of more worker consultation, with 69% of working adults in the UK agreeing that employers should have to consult their staff first before introducing new technologies such as AI in the workplace. 1

The Bill translates many of the principles and values that seem to attract near universal support (such as the importance of consultation, transparency, explainability and equality) into concrete rights and obligations. It represents a significant step forward in the movement towards the responsible adoption of AI. 


AI Bill: Part 1 - Preliminary

1. Overview

  1. This Act makes provision for the safe, secure, and fair use of decision-making based on artificial intelligence systems, by employers and prospective employers, in relation to workers, employees and jobseekers, and its provisions are to be construed accordingly.
  2. Part 2 defines the Core Concepts used in this Act.
  3. Part 3 enacts positive duties of transparency, observability and explainability on employers and prospective employers, and persons acting on their behalf.
  4. Part 4 enacts a prohibition on emotion recognition technology which is used to the detriment of workers, employees, and jobseekers.
  5. Part 5 tailors the prohibition on discrimination within the Equality Act 2010 to address the use of artificial intelligence systems by amending the burden of proof and introducing a new defence for employers and their agents, where they have audited the system for discrimination.
  6. Part 6 enacts a right for employees to disconnect.
  7. Part 7 extends the right of employees not to be unfairly dismissed to circumstances when artificial intelligence systems are used and provides protection in relation to the new rights contained in this Act.
  8. Part 8 extends the existing rights and obligations in the Trade Union and Labour Relations (Consolidation) Act 1992 to deployment of artificial intelligence systems and makes further provision for trade unions to secure the fair use of data collected by employers that relates to employees and workers.
  9. Part 9 enacts provisions for the auditing of artificial intelligence systems.
  10. Part 10 enacts enhanced responsibilities for regulators and bodies operating in the employment and artificial intelligence field.
  11. Part 11 enacts provisions to ensure that employers comply with recommendations made by an employment tribunal in proceedings under this Act.
  12. Part 12 enacts provisions to encourage innovation in relation to the use of artificial intelligence systems in the context of employment.
  13. Part 13 contains general and miscellaneous provisions including a power to exempt or modify the obligations within this Act for microbusinesses.
AI Bill: Part 2 - Core concepts
  2. The Core Concepts

    (1) The Core Concepts in this Act are those defined in this Part.

    (2) Cognate phrases to those defined in this Part are to be construed accordingly.

    (3) Regulations made under, and guidance and codes published in accordance with, the powers in this Act are to be construed accordingly.

  3. Artificial intelligence system

    1. In this Act an “artificial intelligence system” means a machine-based system that, for explicit or implicit objectives, infers from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. Different artificial intelligence systems vary in their levels of autonomy and adaptiveness after deployment.
    2. Such systems will have functions that include, but are not limited to -
      1. prediction,
      2. planning,
      3. classification,
      4. pattern recognition,
      5. organisation,
      6. perception,
      7. the recognition of speech, sound, or image,
      8. the generation of text, sound, or image,
      9. language translation,
      10. communication,
      11. learning,
      12. representation, and
      13. problem-solving.
    3. A system does not cease to be an artificial intelligence system solely because of human involvement in the system.
  4. Artificial intelligence value chain

    1. In this Act, “value chain” means the tools, services, code, components, and processes that are the steps by which an artificial intelligence system accrues utility before the ultimate deployment of an artificial intelligence system in decision-making.
    2. The steps in a value chain include –
      1. training data acquisition,
      2. creation of training data sets,
      3. data collection,
      4. data manipulation,
      5. data pre-processing,
      6. model selection,
      7. model training and re-training,
      8. model testing, validation, and evaluation,
      9. software integration,
      10. application configuration, and
      11. other similar steps in the development of the system.
         
  5. Decision-making

    1. In this Act “decision-making” means any decision, including profiling, whether to act or not to act, made by an employer or its agent in relation to its employees, workers or jobseekers taken or supported by an artificial intelligence system.
    2. In this Act, “profiling” means decision-making by any form of processing of data by an artificial intelligence system to -
      1. evaluate one or more personal aspects relating to a natural person, or
      2. analyse, compare, or make predictions.
    3. Profiling for the purposes of this Act includes decisions concerning any one or more of the following aspects of a person -
      1. their performance at work or potential performance at work,
      2. their suitability for work or employment more generally,
      3. their interest in work or employment opportunity,
      4. their membership or potential interest in membership of a trade union or other collective initiatives,
      5. their trade union activities or involvement in other collective initiatives,
      6. their state of health,
      7. their protected characteristics,
      8. their personal preferences,
      9. their interests,
      10. their reliability,
      11. their behaviour,
      12. their attitude toward an employment function,
      13. their location or movements, or
      14. any other similar attribute.
    4. Subsection (1) does not apply to decision-making in relation to the provision of a benefit, facility or service by the employer or agent (A) to employees, workers or jobseekers where A is concerned with the provision (for payment or not) of a benefit, facility or service of the same description to the public.
       
  6. High-risk

    1. In this Act, decision-making is “high-risk” in relation to a worker, employee, or jobseeker, if it has the capacity or potential to produce –
      1. legal effects concerning them, or
      2. other similarly significant effects.
    2. In this Act “legal effect” is to be construed and applied by reference to the rights and responsibilities of a worker, employee, or jobseeker, arising from or by reason of –
      1. the common law,
      2. contract law,
      3. the law of tort or delict, or
      4. any of the statutory provisions set out in Schedule 1.
    3. Unless an employer or their agent can prove otherwise, any decision-making listed in Schedule 2 is high-risk.
    4. The Secretary of State may make regulations to give further guidance as to the factors on which the employer may rely to show that decision-making is not high-risk; before making such regulations the Secretary of State shall consult with such organisations of employers and employees as he considers appropriate.
  7. Data

    In this Act -

    1. “Data” means “personal data”, “biometric data” and “synthetic data”.
    2. “Personal data” means any information relating to an identifiable living individual.
    3. “Biometric data” means personal data resulting from specific technical processing relating to the physical, physiological, or behavioural characteristics of a natural person, such as facial images or dactyloscopic data, which allow or confirm the unique identification of that natural person, and which is the product of decision-making or is used for decision-making.
    4. “Synthetic data” is data that has been generated using a purpose-built mathematical model or algorithm, with the purpose of using it in place of personal data and with the aim of solving one or more of a set of data science tasks.
    5. “Identifiable living individual” means a living individual who can be identified, directly or indirectly, by reference to –
      1. an identifier such as a name, an identification number, location data or an online identifier, or
      2. one or more factors specific to the physical, physiological, genetic, mental, economic, cultural, or social identity of the individual.
  8. Processing

    1. In this Act “processing”, in relation to information, means an operation or a set of operations which is performed on information, or on sets of information.
    2. “Processing” includes, but is not limited to, the following operations in respect of information –
      1. its collection, recording, organisation, structuring, or storage,
      2. its adaptation or alteration,
      3. its retrieval, consultation, or use,
      4. its disclosure by transmission or dissemination,
      5. its otherwise being made available,
      6. its alignment or combination,
      7. its restriction, erasure, or destruction, and
      8. any other similar operation.
  9. Emotion Recognition Technology

    In this Act “emotion recognition technology” means an artificial intelligence system used in whole or in part for the purpose of identifying or inferring the attention, emotions, or intentions of natural persons on the basis of their biometric data.

  10. Employees, workers, jobseekers, and employers

    In this Act -

    1. “Contract of employment” means a contract of service or apprenticeship, whether express or implied, and (if it is express) whether oral or in writing.
    2. “Employee” means an individual who has entered into, or works under (or, where the employment has ceased, worked under), a contract of employment.
    3. “Worker” means an individual who has entered into, or works under (or, where employment has ceased, worked under), —
      1. a contract of employment, or
      2. any other contract, whether express or implied and (if it is express) whether oral or in writing, whereby the individual undertakes to do or personally perform any work or services for another party to the contract whose status is not by virtue of the contract that of a client or customer of any profession or business undertaking carried on by the individual; and any reference to a worker’s contract shall be construed accordingly.
    4. “Jobseeker” means a person who is actively seeking new employment, whether or not that person is already employed.
    5. “Employer” means -
      1. in relation to an employee or a worker, the person by whom the employee or worker is (or, where the employment has ceased, was) employed, and
      2. in relation to a jobseeker, a person engaging in the process of identifying jobseekers with a view to entering into an employment relationship with one or more of them.
    6. “Employment”—
      1. in relation to an employee, means employment under a contract of employment,
      2. in relation to a worker, means employment under his contract,
      3. in relation to a jobseeker, means employment whether as an employee or as a worker.
  11. Trade union

    In this Act a “trade union” has the same meaning as in section 1 of the Trade Union and Labour Relations (Consolidation) Act 1992.

  12. Amendment of the Core Concepts

    1. The Secretary of State shall keep under review the developments of technologies based on, or associated with, the collection of data relating to employment, for the purpose of deciding whether to amend the Core Concepts.
    2. For this purpose, not less than every two years from the commencement of this Act, the Secretary of State shall consult with appropriate organisations of employers and of employees and workers, in order to keep under review the developments of technologies based on, or associated with, the collection of data relating to employment.
    3. The Secretary of State may by order amend the Core Concepts –
      1. In sections to 9 to take account of developments in the capacities of artificial intelligence systems, and
      2. In section and Schedule 2 to take account of developments in the assessment of risk in relation to the use and capacities of artificial intelligence systems.
  13. Guidance

    1. The Secretary of State shall publish guidance as required by the provisions of this Act by order.
    2. The Secretary of State may by order also publish guidance including technical standards to supplement this Act for the purpose of addressing how the Core Concepts may be assessed, evaluated, and understood.
    3. Guidance, whether published pursuant to subsection (1) or (2), may make different provision for different circumstances.
    4. An employment tribunal or regulator having functions under this Act, shall take into account any relevant guidance published by the Secretary of State, in exercising those functions.
AI Bill: Part 3 - Transparency, observability and explainability
  14. Workplace AI Risk Assessments

  1. An employer shall carry out Workplace AI Risk Assessments in accordance with the provisions of this section. In this Act such an assessment is referred to as a “WAIRA”.
  2. High-risk decision-making shall not take place unless the employer has carried out an initial WAIRA.
  3. The initial WAIRA shall, unless not reasonably practicable, contain at least –
    1. A description of the proposed artificial intelligence system,
    2. A description of the relevant value chain,
    3. The date from which it is proposed that the system will be used in high-risk decision-making,
    4. The categories of high-risk decision which it is proposed the system will take or contribute to,
    5. The proposed purpose or aim in using the system,
    6. The logic which will underpin the proposed decision-making,
    7. The proposed data that will be processed by the system in relation to high-risk decision-making,
    8. The way in which the personal data of employees, workers or jobseekers will influence the proposed decisions,
    9. A description of how it is proposed to monitor the artificial intelligence system for accuracy, including how that metric will be defined, when high-risk decision-making takes place,
    10. A description of how it is proposed to monitor the artificial intelligence system for the risks to the rights of workers, employees or jobseekers contained in the Health and Safety at Work etc. Act 1974, the Human Rights Act 1998, the Equality Act 2010, the Data Protection Act 2018, and the UK General Data Protection Regulation,
    11. An assessment of the risks to the rights of workers, employees or jobseekers contained in the Health and Safety at Work etc Act 1974, the Human Rights Act 1998, the Equality Act 2010, the Data Protection Act 2018, and the UK General Data Protection Regulation, and
    12. The measures to be taken with a view to eliminating the risks.
  4. Once high-risk decision-making starts, an employer shall carry out further WAIRAs in accordance with subsection (3) at intervals of not more than 12 months for as long as decision-making continues.
  5. After the initial assessment, subsequent WAIRAs shall also assess -
    1. the impact of the high-risk decision-making that has taken place on the protected characteristics set out in sections 4 to 12 of the Equality Act 2010 including the extent to which inaccurate decisions are made by the artificial intelligence system,
    2. how often decisions have been modified pursuant to section 18, and
    3. the extent to which there have been any incidents in which the high-risk decision-making has caused harm in the workplace.
  6. The Secretary of State shall by order give guidance as to the form that a WAIRA shall take, including as to how it shall address modifications to the functions of an artificial intelligence system, and such guidance may make different provision for different circumstances such as the size of the employer.
  7. Before preparing guidance under subsection (6), the Secretary of State shall consult with such of the following as he considers appropriate—
    1. trade associations,
    2. trade unions,
    3. the Equality and Human Rights Commission,
    4. the Information Commissioner's Office, and
    5. persons who appear to the Secretary of State to represent the interests of workers.
  15. Direct consultation with employees and workers

    1. High-risk decision-making shall not take place unless, at least one month before the high-risk decision-making takes place, the employer has taken into account the concerns and interests of workers or employees who are or may be affected by it.
    2. In this Act, the concerns and interests of workers and employees, include all legitimate concerns and interests, including –
      1. Understanding and minimising the deployment of detrimental high-risk artificial intelligence systems,
      2. The impact or potential impact of artificial intelligence systems upon workers and employees in relation to their well-being, and
      3. The potential for any diminution or other adverse effect on the degree of human connection with their employer.
    3. In order to take into account the concerns and interests of employees or workers pursuant to subsection (1), employers shall –
      1. Complete a WAIRA in accordance with section 14,
      2. Share that WAIRA with their employees and workers,
      3. Listen to the concerns and interests of their employees and workers in relation to the WAIRA, and
      4. Discuss how any adverse aspects identified in the WAIRA can be removed or modified.
    4. Once high-risk decision-making starts, the process in subsection (3) must be repeated every 12 months for as long as decision-making continues.
    5. The Secretary of State shall by order provide guidance as to the process identified in subsection (3), and how account is to be given to such guidance.
    6. Such guidance may provide for different ways for employers to proceed for different classes of employment, and for different levels of risk.
    7. Before preparing guidance under subsection (5), the Secretary of State shall consult with such of the following as he considers appropriate—
      1. trade associations,
      2. trade unions, and
      3. persons who appear to the Secretary of State to represent the interests of workers.
  16. Register of artificial intelligence systems used for high-risk decision-making

    1. To the extent that it is reasonably practicable to do so, employers shall establish and maintain a register of information about the artificial intelligence systems used in high-risk decision-making, in accordance with the provisions of this section.
    2. The information contained in the register must be available to workers, employees, and jobseekers, in a readily accessible format.
    3. The information in the register must identify, in so far as it contributes to high-risk decision-making -
      1. Each artificial intelligence system in use,
      2. The date that the use commenced and when the use ended,
      3. The categories of high-risk decision-making the system took or contributed to,
      4. The purpose or aim in using the system,
      5. The type or category of data processed by the system, and
      6. The existence and date of any WAIRA.
    4. The information in relation to the use of an artificial intelligence system must be set out in the register within three months of the day on which the system is first used.
    5. The register must be kept up to date as changes to the artificial intelligence system occur, and the date of such changes must be recorded in the register within three months.
    6. The Secretary of State may make regulations by order as to -
      1. what is a readily accessible format,
      2. the form which a register must take,
      3. the detail of the information to be set out in the register, and
      4. what is reasonably practicable,

      and such regulations may make different provision for different circumstances.

  17. Right to personalised explanations for a high-risk decision

    1. On request, made by an employee, worker, or jobseeker (A), in compliance with this section, an employer must provide an explanation of any high-risk decision which is, or might reasonably be expected to be, to the detriment of A.
    2. The explanation must -
      1. be readily understandable,
      2. address how the decision affects the worker, employee, or jobseeker personally,
      3. be in writing in a readily accessible format, and
      4. be free of charge.
    3. The obligation in subsection (1) arises only if A makes a request to the employer’s nominated contact in writing within 3 months of the date on which they become aware that a high-risk decision has been made, or such longer period as is agreed between the parties or is otherwise just and equitable.
    4. The explanation shall be provided within 28 days of a written request from A or such other period as is agreed between the parties or is otherwise just and equitable.
    5. Subsection (1) does not apply, if -
      1. It is not reasonably practicable for the employer to provide an explanation,
      2. It relates to a decision which has already been personally reconsidered by the employer,
      3. It duplicates a request which the employer has already properly personally reconsidered in relation to A within the last 3 months, or
      4. It is vexatious or excessive.
    6. Whether a request is vexatious or excessive shall be determined having regard to the circumstances of the request, including (so far as relevant)—
      1. The extent to which the request repeats a previous request of a similar nature, for which the employer has already provided an explanation.
      2. How long ago any previous request(s) were made, and
      3. Whether the request overlaps with other requests made by the employee or worker to the employer.
    7. In any proceedings where there is an issue as to whether a request is vexatious or excessive, it is for the employer to show that it is.
    8. An employer’s nominated contact is such person, as is nominated and competent to provide an explanation for the purposes of this section, or in default of such nomination the most senior person within the employer.
    9. An employer may nominate the contact by any means, provided that the name and address and contact details of that person are readily available to all its employees, workers, or jobseekers.
    10. The Secretary of State may make regulations by order as to -
      1. What is reasonably practicable,
      2. The contents of an explanation, and
      3. What is an acceptable accessible format for an explanation,

      and such regulations may make different provision for different circumstances.
  18. Right to human reconsideration of a high-risk decision

    1. Subject to subsection (2), on request made by an employee, worker, or jobseeker (A), in compliance with this section, an employer shall undertake a personal reconsideration, by a competent agent, of any high-risk decision which is, or might reasonably be expected to be, to the detriment of A.
    2. The obligation in subsection (1) arises only if A makes a request to the employer’s competent agent in writing within 6 months of the date on which they become aware that a high-risk decision has been made, or such longer period as is agreed between the parties or is otherwise just and equitable.
    3. An employer’s competent agent is such person as is competent to act for the purpose of this section in accordance with subsection (5) and is nominated to act for the purposes of this section, or, in default of such nomination, the most senior person within the employer.
    4. An employer may nominate the competent agent by any means, provided that the name and address and contact details of that person are readily available to all its employees, workers, or jobseekers.
    5. A person is competent to act as an agent for the purpose of this section only if they are -
      1. suitably trained,
      2. designated by the employer to conduct such reconsiderations,
      3. able to discuss and clarify the facts, circumstances and reasons that led to, or relate to, the high-risk decision to which A has been subject, and
      4. able to alter that decision.
    6. The reconsideration by a competent agent must take place, and be notified to A in writing, within 28 days or such other reasonable period as is agreed between the parties or is otherwise just and equitable.
    7. The employer may authorise the competent agent to delegate the task of reconsideration, provided that the person to whom the function is delegated also satisfies the requirements in subsection (5).
    8. Subsection (1) does not apply if –
      1. It is not reasonably practicable for the employer to undertake a reconsideration,
      2. It duplicates a request in respect of which the employer has already properly undertaken a personal reconsideration in relation to A within the last 3 months, or
      3. It is vexatious or excessive.
    9. Whether a request is vexatious or excessive must be determined having regard to the circumstances of the request, including (so far as relevant)—
      1. The extent to which the request repeats a previous request of a similar nature, which the employer has already reconsidered,
      2. How long ago any previous request was made, and
      3. Whether the request overlaps with other requests made by the employee or worker to the employer.
    10. In any proceedings where there is an issue as to whether a request is vexatious or excessive, it is for the employer to show that it is.
    11. The Secretary of State may make regulations by order as to -
      1. What is reasonably practicable,
      2. The training that is necessary,
      3. The nomination of a contact, and
      4. The form of the reconsideration,

      and such regulations may make different provision for different circumstances.

  1. Complaint to an employment tribunal

    1. A worker, employee, or jobseeker, who is personally affected may present a complaint to an employment tribunal that their employer has failed to comply with the rights and obligations set out in sections 14, 15, 16, 17 or 18.
    2. On a complaint under this section, it is for the employer to show that it has complied with its obligation under sections 14, 15, 16, 17 or 18.
    3. An employment tribunal shall not consider a complaint under this section after the end of –
      1. The period of 6 months beginning with the date of the failure to which the complaint relates, or
      2. Such other period as the employment tribunal considers just and equitable.
    4. Section 207B Employment Rights Act 1996 (extension of the time limits to facilitate conciliation before institution of proceedings) applies for the purposes of subsection (3)(a).
    5. For the purposes of subsection (3), a failure shall be taken to have occurred on the day after the last date by which an employer could have complied fully with an obligation in section 14, 15, 16, 17 or 18.
    6. For the purposes of subsection (3), in deciding what is just and equitable, the employment tribunal shall take into account-
      1. Any steps taken by a trade union or an employee, worker or jobseeker, to attempt to persuade an employer to comply with sections 14, 15, 16, 17 or 18 without recourse to litigation, and
      2. the extent to which the employer’s failure was observable and transparent.
  1. Remedy

    1. Where an employment tribunal finds a complaint under sections 14, 15, 16, 17 or 18 well-founded, the tribunal may –
      1. make a declaration to that effect,
      2. make an award of compensation to be paid to the worker, employee, or jobseeker, in respect of the act or failure to act to which the complaint relates, and
      3. make a recommendation in accordance with Part 11 to the employer as to the steps necessary to remedy the breach of this Act and to ensure that there is no repetition (a breach of which will attract additional compensation).
    2. The award of compensation may include compensation for the injury to the feelings of the worker, employee, or jobseeker.
    3. The amount of compensation pursuant to subsection (1)(b) shall be such as the employment tribunal considers just and equitable in all the circumstances having regard to the infringement to which the complaint relates but shall not exceed a maximum sum of £xx.
    4. An award of compensation pursuant to subsection (1)(b) will not prevent the worker, employee, or jobseeker from seeking a remedy for the infringement of their rights and entitlements under other legislation, but the principle of no double recovery for the same loss shall apply.
    5. An award of compensation pursuant to subsection (1)(c) shall be such as the employment tribunal considers just and equitable in all the circumstances having regard to the infringement to which the complaint relates but shall not exceed the maximum set out in section 157(5) of the Data Protection Act 2018.
    6. The Secretary of State shall make regulations by order to set the initial maximum compensation and for a mechanism to update the maximum sum in subsection (3) on an annual basis after consultation with such employer and employee organisations as he considers appropriate.
AI Bill: Part 4 - Prohibition on detrimental use of emotion recognition technology
  1. Prohibition on detrimental treatment due to emotion recognition technology

    High-risk decision-making using emotion recognition technology may not be used where it is, or might reasonably be expected to be, to the detriment of a worker, employee, or jobseeker.

  1. Complaint to an employment tribunal

    1. A worker, employee or jobseeker may present a complaint to an employment tribunal that an employer has acted contrary to section 21 in so far as they are personally affected by the alleged breach.
    2. An employment tribunal may not consider a complaint under this section after the end of —
      1. the period of 6 months starting with the date of the act to which the complaint relates, or
      2. such other period as the employment tribunal thinks just and equitable.
    3. For the purposes of subsection (2)—
      1. conduct extending over a period is to be treated as done at the end of the period, and
      2. a failure to do something is to be treated as occurring when the person in question decided on it.
    4. In the absence of evidence to the contrary, a person (P) is to be taken to decide on a failure to do something –
      1. when P does an act inconsistent with doing it, or
      2. If P does no inconsistent act, on the expiry of the period in which P might reasonably have expected to do it.
    5. Section 207B Employment Rights Act 1996 (extension of the time limits to facilitate conciliation before institution of proceedings) applies for the purposes of subsection (2)(a).
    6. For the purposes of subsection (2)(b), in deciding what other period is just and equitable, the employment tribunal shall take into account-
      1. any steps taken by a trade union or an employee, worker, or jobseeker to attempt to persuade an employer to comply with section 21 without recourse to litigation, and
      2. the extent to which the use of the high-risk decision-making has been observable and transparent.
  1. Remedy

    1. Where an employment tribunal finds a complaint under section 21 well-founded, the tribunal may –
      1. make a declaration to that effect,
      2. make an award of compensation to be paid to the worker, employee, or jobseeker, in respect of the act or failure to act to which the complaint relates, and
      3. make a recommendation in accordance with Part 11 to the employer as to the steps necessary to remedy the breach of this Act and to ensure that there is no repetition (a breach of which will attract additional compensation).
    2. The amount of compensation pursuant to subsections (1)(b) and (c) shall be such as the employment tribunal considers just and equitable in all the circumstances having regard to the infringement to which the complaint relates but shall not exceed a maximum sum of £xx.
    3. An award of compensation pursuant to subsection (1)(b) will not prevent the worker, employee, or jobseeker from seeking a remedy for the infringement of their rights and entitlements under other legislation, but the principle of no double recovery for the same loss shall apply.
    4. The Secretary of State shall make regulations by order to set the initial maximum compensation and for a mechanism to update the maximum sum in subsection (2) on an annual basis after consultation with such employer and employee organisations as he considers appropriate.
AI Bill: Part 5 - Prohibition on discrimination
  1. Discrimination in relation to high-risk decision-making

  1. The Equality Act 2010 is amended by the insertion after Chapter 1 of Part 2 of a new Chapter 1A as follows –

“CHAPTER 1A

Artificial Intelligence at Work

12A Definitions

  1. In this Act, “Artificial intelligence system” has the same meaning as in section 3 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024.
  2. The terms “value chain”, “high-risk”, “decision-making”, “data”, “processing”, “jobseeker” and “employer” have the same meaning as in sections 4 to 10 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024 when used in relation to artificial intelligence systems.”
  1. Jobseekers

    1. Section 39 of the Equality Act 2010 is amended in accordance with subsections (2) and (3).
    2. For the heading “39. Employees and Applicants” substitute “39. Employees, Applicants and Jobseekers”.
    3. After subsection (1)(a), add:

      “(aa) in identifying B as a jobseeker;

      (ab) in advertising to B as a jobseeker.”

  1. Liability of employers and principals

    1. Section 109 Equality Act 2010 is amended to add after (1) –

      “(1A) Any decision-making done by an artificial intelligence system which is deployed by an employer must be treated as done by the employer.”

    2. Section 109 Equality Act 2010 is amended to add after (2) –

      “(2A) Any decision-making within the meaning of the Artificial Intelligence (Regulation and Employment Rights) Act 2024 deployed by an agent for a principal, with the authority of the principal, shall be treated as also done by the agent and the principal.”

  1. Liability of employees

    Section 110 Equality Act 2010 is amended to add after section 110 (3) –

    “(3A) A does not contravene this section if A is an employee and he relied to any extent on decision-making within the meaning of the Artificial Intelligence (Regulation and Employment Rights) Act 2024 which is deployed by an employer.”

  1. Remedy

    Section 124 Equality Act 2010 is amended to add after (3) –

    “(3A) Where the contravention relates to the use of discriminatory artificial intelligence systems, in setting out an appropriate recommendation, the tribunal must have regard to Part 9 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024.”

  1. Burden of proof

    1. Section 136 of the Equality Act 2010 is amended as set out in subsections (2) – (3).
    2. Subsection (2) is amended to include at the start “Subject to subsection (3A),”.
    3. After Subsection (3), insert:

      “(3A) Where person (A) is alleged to have contravened Part 5 (work), as a result of reliance on an artificial intelligence system within the meaning of the Artificial Intelligence (Regulation and Employment Rights) Act 2024, unless A shows that the provision was not contravened (whether by A or by the artificial intelligence system), the court must hold that the contravention occurred.

      (3B) If A cannot discharge the burden of proof set out in subsection (3A), it is nevertheless a defence to a claim under this Act where A is the employer or its agent, and—

      1. A did not create or modify the artificial intelligence system,
      2. A audited the artificial intelligence system for discrimination at each stage in the artificial intelligence value chain before using it to make high-risk decisions, as set out in Part 9 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024, and
      3. there were procedural safeguards in place designed to remove the risk of discrimination after the audit was completed which included monitored steps to prevent employees or workers from using the artificial intelligence system in a discriminatory way as set out in Part 9 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024.

      (3C) If A successfully relies on the defence in subsection (3B), this does not preclude other persons in the artificial intelligence value chain from being liable under the Equality Act 2010 beyond Part 5 (work).”

  1. Amendment to Schedule 25 of the Equality Act 2010

    Schedule 25 to the Equality Act 2010 (Information Society Services) is amended by the deletion of paragraphs 1 and 2.

AI Bill: Part 6 - Health and wellbeing
  1. Statutory right to disconnect

    The Employment Rights Act 1996 is amended by the insertion after section 63K of a new Part as follows:

“PART 6B

STATUTORY RIGHT TO DISCONNECT

63L. Statutory right to disconnect

  1. For the purposes of this Part, “agreed working hours” means the period of time in respect of which an employer has agreed to remunerate his employee, and which is not holiday time or any other form of paid leave.
  2. Unless the employer can prove otherwise, an employee’s “agreed working hours” will be as stated in any statement produced in accordance with sections 1, 2, 4, 7A and 7B Employment Rights Act 1996.
  3. An employer shall not require an employee employed by him to monitor or respond to any work-related communications, or to carry out any work, outside of the employee’s agreed working hours unless, and to the extent that, a different arrangement has been agreed by way of collective agreement within the meaning of section 178 of the Trade Union and Labour Relations (Consolidation) Act 1992 or by a relevant workforce agreement.
  4. Subsection (3) does not apply where the employer can show that there is a genuine economic or functional emergency threatening the fair running of the employer which justifies work-related communications, or the carrying out of any work, outside of the employee’s agreed working hours.
  5. The employer must send a statement to each employee explaining that there is a right to disconnect save in an emergency; before sending such a statement the employer should consult with any recognised trade union on the terms of the statement and take into account any relevant guidance from ACAS.
  6. The Secretary of State shall make regulations as to the timing of the first statement in subsection (5).
  7. The statement in subsection (5) above must be re-issued every 12 months.
  8. For the purposes of this section an agreement is a “workforce agreement” if it meets the conditions for a workforce agreement set out in Schedule 1 to the Working Time Regulations 1998.
  9. ACAS must prepare and publish a code of practice for employers, employees and trade unions in relation to this section.
  10. Before preparing a code or amendments under this section, ACAS must consult the Secretary of State and such of the following as ACAS considers appropriate—
    1. trade unions,
    2. employers’ organisations, and
    3. persons who appear to ACAS to represent the interests of employees and employers.

63M. Enforcement

  1. An employee may present a complaint to an employment tribunal that there has been a breach of section 63L(3) by his employer.
  2. An employment tribunal shall not consider a complaint under this section unless it is presented:
    1. before the end of the period of 6 months beginning with the date of the act or failure to act to which the complaint relates or, where that act or failure is part of a series of similar acts or failures, the last of them, or
    2. within such further period as the tribunal considers reasonable in a case where it is satisfied that it was not reasonably practicable for the complaint to be presented before the end of that period of 6 months.
  3. For the purposes of subsection (2)—
    1. where an act extends over a period, the “date of the act” means the last day of that period, and
    2. a deliberate failure to act shall be treated as done when it was decided on,

    and, in the absence of evidence establishing the contrary, an employer, shall be taken to decide on a failure to act when he does an act inconsistent with doing the failed act or, if he has done no such inconsistent act, when the period expires within which he might reasonably have been expected to do the failed act if it was to be done.

  4. Section 207B (extension of time limits to facilitate conciliation before institution of proceedings) applies for the purposes of subsection (2).

63N. Remedy

  1. Where an employment tribunal finds a complaint under section 63M well-founded, the tribunal –
    1. Shall make a declaration to that effect, and
    2. May make an award of compensation to be paid to the employee in respect of the act or failure to act to which the complaint relates.
  2. Where an employment tribunal finds a complaint under this section well-founded, the tribunal shall order the employer to pay the employee damages in a sum not exceeding an amount equivalent to the employee’s pro rata daily wages for each day on which a breach has occurred.
  3. The Secretary of State shall make regulations by order for a mechanism to update the maximum sum in subsection (2) on an annual basis.”
  1. Right not to be subject to detriment in relation to the right to disconnect

    1. After section 47G of the Employment Rights Act 1996 insert:

      “47H. Right to disconnect

      1. An employee or worker has the right not to be subjected to a detriment by any act, or deliberate failure to act, by the employee’s employer done on the ground that the employee failed or refused to monitor or respond to any work-related communications or to carry out any work outside of his normal working hours.
      2. Subsection (1) does not apply if the detriment in question amounts to dismissal within the meaning of Part 10.”
    2. Section 48 of the Employment Rights Act 1996 is amended as follows: in subsection (1) for “47F or 47G” substitute “47F, 47G or 47H”.
AI Bill: Part 7 - Dismissal
  1. Automatic unfair dismissal: Unfair high-risk decision-making

    1. The Employment Rights Act 1996 is amended in accordance with paragraphs (2) to (4).
    2. After section 104G insert –

      “104H Unfair high-risk decision-making

      1. An employee who is dismissed is to be regarded as unfairly dismissed if the reason (or if there is more than one reason, the principal reason) for the dismissal is unfair reliance on high-risk decision-making within the meaning of the Artificial Intelligence (Regulation and Employment Rights) Act 2024.
      2. On a complaint that an employer has not complied with subsection (1) it is for the employer to show that the high-risk decision-making is fair.
      3. The tribunal must examine the extent to which the employer has complied with any obligations under sections 14 to 18 and 21 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024.”
    3. After section 104(4)(e) insert –

      “(f) the rights conferred by sections 14 to 18 and 21 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024”.

    4. After section 108(3)(gm) insert –

      “(gn) section 104H applies”.

  1. Automatic unfair dismissal: Right to disconnect

    1. The Employment Rights Act 1996 is amended in accordance with paragraphs (2) and (3).
    2. After section 104H insert –

      “104I Right to disconnect

      1. An employee who is dismissed is to be regarded as unfairly dismissed if the reason (or if there is more than one reason, the principal reason) for the dismissal relates to the employee’s failure or refusal to monitor or respond to any work-related communications, or to carry out any work, outside of the employee’s agreed working hours in so far as section 63L applies.”
    3. After section 108(3)(gn) insert –

      “(go) section 104I applies”.

  1. Remedy

    1. Section 112 Employment Rights Act 1996 is amended to add after (3) –

      “(3A) Where the dismissal is unfair pursuant to section 104H, the tribunal may make a recommendation to the employer to ensure that there is no repetition in accordance with Part 11 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024”.

    2. Section 118 Employment Rights Act 1996 is amended to add after (4) –

      “(5) Where the tribunal makes an award of compensation because the dismissal is unfair pursuant to section 104H, it will not bar the employee from seeking a remedy for the infringement of their rights and entitlements under other legislation, but the principle of no double recovery for the same loss shall apply.”

  1. Interim relief pending determination of complaint

    Section 128(1)(a)(i) of the Employment Rights Act 1996 is amended by substituting “, 103A or 104H” for “or 103A”.

  1. Definitions

    1. The Employment Rights Act 1996 is amended in accordance with subsection (2).
    2. After section 134A insert –

      “Section 134B

      1. In this Part, “Artificial intelligence system” has the same meaning as in section 3 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024.
      2. The terms “artificial intelligence value chain”, “high-risk”, “decision-making”, “data” and “processing” have the same meaning as in sections 4 to 8 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024 when used in relation to artificial intelligence systems.”
AI Bill: Part 8 - Trade unions
  1. Fair data use

    1. In accordance with this section, a trade union has the right to be provided with all the data collected by an employer that relates to its members that is used or is proposed to be used by the employer for artificial intelligence decision-making, to the extent that it is reasonably practicable.
    2. Where a trade union makes a request in accordance with this section the employer shall provide the data in an accessible form within 2 months of the date of the request.
    3. Except where the trade union’s members expressly agree in writing the data shall be provided in an anonymised form.
    4. The right in subsection (1) shall not apply in respect of any member of the trade union who notifies the employer in writing of their objection to its collection.
    5. The trade union shall notify the employer of the type of data it seeks and the date range to which the request relates; the range of data shall commence no earlier than 52 weeks prior to the request.
    6. The right conferred by subsection (1) shall be enforceable by complaint to the employment tribunal in accordance with subsections (7) to (10) below.
    7. A complaint brought under subsection (6) shall be commenced within 6 months of the last date by which the employer should have complied with the request.
    8. It shall be for the employer to prove that it was not reasonably practicable to provide any particular data.
    9. If the employment tribunal finds that the complaint is to any extent well founded it shall make a recommendation in accordance with Part 11.
    10. The Secretary of State shall make regulations by order as to –
      1. the manner in which a request under this section is to be made,
      2. the form by which an employee may lodge an objection to data being shared,
      3. what is reasonably practicable, and
      4. the steps to be taken to preserve the anonymity of the data.
  1. Trade union consultation

    1. The Trade Union and Labour Relations (Consolidation) Act 1992 is amended in accordance with paragraph (2).
    2. After section 198B insert –

      “Chapter IIA Artificial Intelligence Systems

      198C Duty to consult representatives

      1. Where an employer is proposing to do high-risk decision-making, it shall consult all the persons who are appropriate representatives of any of the employees who may be affected.
      2. The consultation shall begin in good time and in any event at least 1 month before the high-risk decision-making takes place.
      3. Once high-risk decision-making starts, the consultation in subsection (2) must be repeated every 12 months for as long as decision-making continues.
      4. For the purposes of this section the appropriate representatives of any affected employees are –
        1. if the employees are of a description in respect of which an independent trade union is recognised by their employer, representatives of the trade union, or
        2. in any other case, whichever of the following employee representatives the employer chooses –
          1. employee representatives appointed or elected by the affected employees otherwise than for the purposes of this section, who (having regard to the purposes for and the method by which they were appointed or elected) have authority from those employees to receive information and to be consulted about the high-risk decision-making on their behalf;
          2. employee representatives elected by the affected employees, for the purposes of this section, in an election satisfying the requirements of section 198D(1).
      5. The consultation shall include consultation about —
        1. the risks to the rights of employees contained in the Equality Act 2010, the Human Rights Act 1998, the Health and Safety at Work etc Act 1974, the Data Protection Act 2018, and the UK General Data Protection Regulation,
        2. the measures envisaged to address the risks, and
        3. the concerns and interests of workers and employees, which include all legitimate concerns and interests, including –
          1. Understanding and minimising the deployment of detrimental high-risk artificial intelligence systems,
          2. The impact or potential impact of artificial intelligence systems upon workers and employees in relation to their well-being, and
          3. The potential for any diminution or other adverse effect on the degree of human connection with their employer,

        and shall be undertaken by the employer with a view to reaching agreement with the appropriate representatives.

      6. For the purposes of the consultation the employer shall disclose in writing to the appropriate representatives—
        1. A description of the proposed artificial intelligence system,
        2. A description of the relevant value chain,
        3. The date from which it is proposed that the system will be used in high-risk decision-making,
        4. The categories of high-risk decision which it is proposed the system will take or contribute to,
        5. The proposed purpose or aim in using the system,
        6. The logic which will underpin the proposed decision-making,
        7. The proposed data that will be processed by the system in relation to high-risk decision-making,
        8. The way in which the personal data of employees, workers or jobseekers will influence the proposed decisions,
        9. A description of how it is proposed to monitor the artificial intelligence system for accuracy, including how that metric will be defined, when high-risk decision-making takes place,
        10. A description of how it is proposed to monitor the artificial intelligence system for the risks to worker, employee or jobseeker rights contained in the Equality Act 2010, the Health and Safety at Work etc Act 1974, the Human Rights Act 1998, the Data Protection Act 2018, and the UK General Data Protection Regulation,
        11. An assessment of the risks to worker, employee or jobseeker rights contained in the Equality Act 2010, the Health and Safety at Work etc Act 1974, the Human Rights Act 1998, the Data Protection Act 2018, and the UK General Data Protection Regulation,
        12. The measures to be taken with a view to eliminating the risks, and
        13. A copy of the register created pursuant to section 16 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024.
      7. That information shall be given to each of the appropriate representatives by being delivered to them or sent by post to an address notified by them to the employer, or (in the case of representatives of a trade union) sent by post to the union at the address of its head or main office.
      8. The employer shall allow the appropriate representatives access to the affected employees and shall afford to those representatives such accommodation and other facilities as may be appropriate.
      9. If in any case there are special circumstances which render it not reasonably practicable for the employer to comply with a requirement of subsection (2), (3), (4) or (5), the employer shall take all such steps towards compliance with that requirement as are reasonably practicable in those circumstances.
      10. Where-
        1. the employer has invited any of the affected employees to elect employee representatives, and
        2. the invitation was issued long enough before the time when the consultation is required by subsection (2) to begin to allow them to elect representatives by that time,

        the employer shall be treated as complying with the requirements of this section in relation to those employees if he complies with those requirements as soon as is reasonably practicable after the election of the representatives.

      11. If, after the employer has invited affected employees to elect representatives, the affected employees fail to do so within a reasonable time, he shall give to each affected employee the information set out in subsection (6).
      12. This section does not confer any rights on a trade union, a representative or an employee except as provided by this Act.

198D Election of representatives

  1. The requirements for the election of employee representatives under section 198C(4)(b)(ii) are that–
    1. the employer shall make such arrangements as are reasonably practicable to ensure that the election is fair;
    2. the employer shall determine the number of representatives to be elected so that there are sufficient representatives to represent the interests of all the affected employees having regard to the number and classes of those employees;
    3. the employer shall determine whether the affected employees should be represented either by representatives of all the affected employees or by representatives of particular classes of those employees;
    4. before the election the employer shall determine the term of office as employee representatives so that it is of sufficient length to enable information to be given and consultations under section 198C to be completed;
    5. the candidates for election as employee representatives are affected employees on the date of the election;
    6. no affected employee is unreasonably excluded from standing for election;
    7. all affected employees on the date of the election are entitled to vote for employee representatives;
    8. the employees entitled to vote may vote for as many candidates as there are representatives to be elected to represent them or, if there are to be representatives for particular classes of employees, may vote for as many candidates as there are representatives to be elected to represent their particular class of employee;
    9. the election is conducted so as to secure that –
      1. so far as is reasonably practicable, those voting do so in secret, and
      2. the votes given at the election are accurately counted.
  2. Where, after an election of employee representatives satisfying the requirements of subsection (1) has been held, one of those elected ceases to act as an employee representative and any of those employees are no longer represented, they shall elect another representative by an election satisfying the requirements of subsection (1)(a), (e), (f) and (i).

198E Complaint and compensation

  1. Where an employer has failed to comply with a requirement of section 198A or section 198B, a complaint may be presented to an employment tribunal on that ground–
    1. in the case of a failure relating to the election of employee representatives, by any of the affected employees;
    2. in the case of any other failure relating to employee representatives, by any of the employee representatives to whom the failure related,
    3. in the case of failure relating to representatives of a trade union, by the trade union, and
    4. in any other case, by any of the affected employees.
  2. If on a complaint under subsection (1) a question arises as to whether or not any employee representative was an appropriate representative for the purposes of section 198A, it shall be for the employer to show that the employee representative had the authority to represent the affected employees.
  3. On a complaint under subsection (1)(a) it shall be for the employer to show that the requirements in section 198B have been satisfied.
  4. If the tribunal finds the complaint well-founded it shall make a declaration to that effect and may also make an award of compensation to any affected employee to be paid by the employer.
  5. The amount of compensation under subsection (4) shall be such as the employment tribunal considers just and equitable in all the circumstances having regard to the infringement to which the complaint relates but shall not exceed a maximum sum of £xx.
  6. The Secretary of State shall make regulations by order to set the initial maximum compensation and for a mechanism to update the maximum sum in subsection (5) on an annual basis after consultation with such employer and employee organisations as he considers appropriate.
  7. An employment tribunal may not consider a complaint under this section after the end of –
    1. the period of 6 months beginning with the date of the failure to which the complaint relates, or
    2. such other period as the employment tribunal considers just and equitable.
  8. Where the complaint concerns a failure to comply with a requirement of section 198A or 198B, section 292A (extension of time limits to facilitate conciliation before institution of proceedings) applies for the purposes of subsection (7)(a).
  9. For the purposes of subsection (7), in deciding what is just and equitable, the employment tribunal shall take into account–
    1. any steps taken by a trade union or an employee to attempt to persuade an employer to comply with sections 198A and 198B without recourse to litigation, and
    2. the extent to which the employer’s failure was observable and transparent.
  10. If on a complaint under this section a question arises—
    1. whether there were special circumstances which rendered it not reasonably practicable for the employer to comply with any requirement of section 198A, or
    2. whether he took all such steps towards compliance with that requirement as were reasonably practicable in those circumstances,
    it is for the employer to show that there were and that he did.

198F Definitions

  1. In this Chapter, “artificial intelligence system” has the same meaning as in the Artificial Intelligence (Regulation and Employment Rights) Act 2024.
  2. The terms “artificial intelligence value chain”, “high-risk”, “decision-making”, “data” and “processing” have the same meaning as in sections 4 to 8 of the Artificial Intelligence (Regulation and Employment Rights) Act 2024 when used in relation to artificial intelligence systems.”
AI Bill: Part 9 - Auditing and procedural safeguards
  1. Auditing artificial intelligence systems for discrimination

  1. An employer or its agent can rely on the defence under section 136(3B) Equality Act 2010 in a discrimination claim.
  2. The employment tribunal must have regard to the extent to which the following matters apply when determining whether section 136(3B)(b) is satisfied:
    1. Compliance with statutory guidance from the Equality and Human Rights Commission concerning the audit of artificial intelligence systems for discrimination before they are used to make high-risk decisions;
    2. Compliance with relevant technical standards issued by an approved body; and
    3. Certification of the artificial intelligence system by an approved body.
  3. An employer or its agent cannot rely on the defence under section 136(3B)(b) Equality Act 2010 if it deploys artificial intelligence systems for use cases which were not originally envisaged when it acquired the system.
  4. The employment tribunal must have regard to the following matters when determining whether section 136(3B)(c) is satisfied:
    1. The extent to which there is compliance with any statutory guidance from the Equality and Human Rights Commission concerning the procedural safeguards required to remove the risk of discrimination after an audit for discrimination has taken place; and
    2. Any relevant WAIRA.
  5. The Secretary of State shall make regulations by order as to the identification of relevant-
    1. Statutory guidance from the Equality and Human Rights Commission,
    2. Technical standards, and
    3. Certification by an approved body.

    and such regulations may make different provision for different circumstances.

AI Bill: Part 10 - Regulators and bodies in the employment field and artificial intelligence
  1. Regulatory obligations concerning artificial intelligence

  1. A regulator within Schedule 3 shall apply the principles set out in this section in any context concerning employment and the deployment of artificial intelligence systems.
  2. The principles are that regulation should promote -
    1. safety, security and robustness,
    2. appropriate transparency and explainability,
    3. fairness,
    4. equality, diversity, equality of opportunity, and compliance with the Equality Act 2010,
    5. accountability and governance, and
    6. contestability and redress.
  3. For the purposes of this section a regulator within Schedule 3 may request and see information produced pursuant to sections 14 and 16 in accordance with Schedule 4.
  4. “equality”, “diversity” and “equality of opportunity” have the same meaning as in the Equality Act 2006.
  1. Statutory guidance on artificial intelligence and the principle of non-discrimination

    1. In accordance with this section, the Equality and Human Rights Commission shall publish guidance for employers (their agents and employees) every two years from the commencement of this section.
    2. The guidance shall set out the steps that should be taken to avoid a breach of the principle of non-discrimination in consequence of high-risk decision-making including in relation to the defence under section 136(3B) Equality Act 2010.
    3. The Equality and Human Rights Commission must consult with the regulators and bodies listed in subsection (4) and any other bodies or regulators that it considers to be relevant before publishing the statutory guidance referred to in subsection (1).
    4. The regulators and bodies referred to in subsection (3) are:
      1. The Advisory, Conciliation and Arbitration Service (ACAS),
      2. The Information Commissioner’s Office,
      3. The Trades Union Congress,
      4. The Department for Science, Innovation and Technology,
      5. The Chartered Institute of Personnel and Development,
      6. The Confederation of British Industry, and
      7. The British Computer Society.
    5. Not less than every two years from the commencement of this Act, the Secretary of State shall consult with appropriate organisations of employers and of employees and workers in order to assess whether amendment is required to the list of regulators and bodies in subsection (4).
  1. Data awareness and education

    1. ACAS must prepare a code of practice for employers, employees, workers and jobseekers which —
      1. explains how artificial intelligence is used within the employment field, and
      2. explains the rights and entitlements contained in this Act and how they can be used to ensure transparency and accountability.
    2. Where a code under this section is in force, ACAS may prepare amendments of the code or a replacement code.
    3. Before preparing a code or amendments under this section, ACAS must consult the Secretary of State and such of the following as ACAS considers appropriate—
      1. trade associations,
      2. trade unions,
      3. workers,
      4. the Equality and Human Rights Commission, and
      5. persons who appear to ACAS to represent the interests of workers.
AI Bill: Part 11 - Enforcement of recommendations
  1. Power to make recommendations

  1. A recommendation to an employer made by an employment tribunal under this Act must specify -
    1. The steps necessary to remedy the breach of this Act in relation to an employee, worker, jobseeker or trade union and ensure that there is no repetition, and
    2. The period or periods of time (not exceeding 52 weeks) within which the steps are to be taken.
  2. Such a recommendation may provide for interim or final reports on the implementation of the recommendations to be made to the employment tribunal.
  3. Where an employment tribunal has made a recommendation to an employer concerning the right of an employee, worker, jobseeker or trade union under this Act, the terms of the recommendation are not fully complied with, and the tribunal is satisfied that the non-compliance is culpable, it shall make an award of compensation to be paid by the employer to the employee, worker, jobseeker or trade union concerned.
  4. The award of compensation to an employee or worker under subsection (3) above shall be subject to a maximum of £xx.
  5. The award of compensation to a jobseeker under subsection (3) above shall be a maximum of £xx.
  6. The award of compensation to a Trade Union under subsection (3) shall be a maximum of £xx.
  7. The Secretary of State shall make regulations by order to set the initial maximum compensation and for a mechanism to update the maximum sums in subsections (4) to (6) on an annual basis after consultation with such employer and employee organisations as he considers appropriate.
AI Bill: Part 12 - Innovation
  1. Use of regulatory sandboxes

  1. In this Part, ‘regulatory sandbox’ means a concrete and controlled framework set up by an approved body which offers providers or prospective providers of artificial intelligence systems the possibility to develop, train, validate and test, where appropriate in real world conditions, an innovative AI system, pursuant to a sandbox plan, for a limited time and under the regulatory supervision of the Information Commissioner’s Office.
  2. In this Part, ‘a sandbox plan’ means a document agreed between the participating provider and the approved body describing the objectives, conditions, timeframes, methodology and requirements for the activities carried out within the sandbox.
  3. The provisions of this Act are suspended to the extent that they are identified within a regulatory sandbox, but no further.
  4. The Secretary of State shall make regulations as to the meaning of “an approved body”, and for the purposes of putting this Part into effect, and such regulations may make different provision for different circumstances.
AI Bill: Part 13 - General and miscellaneous provisions

Microbusinesses

  1. Power to disapply or modify obligations for microbusinesses

    The Secretary of State may by order disapply or modify the obligations in sections 14, 15, 16, 17, 18 and 38 for any employer who employs fewer than 10 employees.

Subordinate legislation

  1. Exercise of power

    1. A power to make an order or regulations under this Act is exercisable by a Minister of the Crown on behalf of the Secretary of State.
    2. Orders and regulations made under this Act must be made by statutory instrument.
    3. Orders or regulations under this Act—
      1. may make different provision for different purposes, and
      2. may include consequential, incidental, supplementary, transitional, transitory or saving provisions.
    4. Nothing in this Act affects the generality of the power under subsection (3)(a).
  1. Ministers of the Crown, etc.

    1. This section applies where the power to make an order or regulations under this Act is exercisable by a Minister of the Crown.
    2. A statutory instrument solely containing an order or regulations that supplements but does not amend this Act is subject to the negative procedure.
    3. A statutory instrument containing (whether alone or with other provision) an order or regulations that amend this Act is subject to the affirmative procedure.

Final provisions

  1. Money

    There is to be paid out of money provided by Parliament any increase attributable to this Act in the expenses of a Minister of the Crown.

  1. Commencement

    1. Subject to subsection (2), this Act will come into force two years after the day on which it is passed.
    2. The Secretary of State may by order bring different provisions of this Act into force at any time.
  1. Extent

    This Act forms part of the law of Great Britain [note it can easily be adapted to extend to Northern Ireland, however there are further and different provisions in Northern Ireland relating to the protection of human rights and prohibition of discrimination].

  1. Short title

    This Act may be cited as the Artificial Intelligence (Regulation and Employment Rights) Act 2024.
