103RD GENERAL ASSEMBLY
State of Illinois
2023 and 2024
HB5116

Introduced 2/8/2024, by Rep. Daniel Didech

SYNOPSIS AS INTRODUCED:

New Act

    Creates the Automated Decision Tools Act. Provides that, on or before January 1, 2026, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses or designs, codes, or produces that includes specified information. Provides that a deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision and provide specified information. Provides that a deployer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. Provides that, within 60 days after completing an impact assessment required by the Act, a deployer shall provide the impact assessment to the Department of Human Rights. Provides that the Attorney General may bring a civil action against a deployer for a violation of the Act.

LRB103 36408 SPS 66510 b

A BILL FOR

    AN ACT concerning business.

    Be it enacted by the People of the State of Illinois, represented in the General Assembly:

    Section 1. Short title. This Act may be cited as the Automated Decision Tools Act.

    Section 5. Definitions. As used in this Act:
    "Algorithmic discrimination" means the condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people based on their actual or perceived race, color, ethnicity, sex, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, reproductive health, or any other classification protected by State law.
    "Artificial intelligence" means a machine-based system or technology operating on datasets that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing a real or virtual environment.
    "Automated decision tool" means a system or service that uses artificial intelligence and has been specifically developed and marketed to, or specifically modified to, make, or be a controlling factor in making, consequential decisions.
    "Consequential decision" means a decision or judgment that has a legal, material, or similarly significant effect on an individual's life relating to the impact of, access to, or the cost, terms, or availability of, any of the following:
        (1) employment, worker management, or self-employment, including, but not limited to, all of the following:
            (A) pay or promotion;
            (B) hiring or termination; and
            (C) automated task allocation;
        (2) education and vocational training, including, but not limited to, all of the following:
            (A) assessment, including, but not limited to, detecting student cheating or plagiarism;
            (B) accreditation;
            (C) certification;
            (D) admissions; and
            (E) financial aid or scholarships;
        (3) housing or lodging, including rental or short-term housing or lodging;
        (4) essential utilities, including electricity, heat, water, Internet or telecommunications access, or transportation;
        (5) family planning, including adoption services or reproductive services, as well as assessments related to child protective services;
        (6) healthcare or health insurance, including mental health care, dental, or vision;
        (7) financial services, including a financial service provided by a mortgage company, mortgage broker, or creditor;
        (8) the criminal justice system, including, but not limited to, all of the following:
            (A) risk assessments for pretrial hearings;
            (B) sentencing; and
            (C) parole;
        (9) legal services, including private arbitration or mediation;
        (10) voting; and
        (11) access to benefits or services or assignment of penalties.
    "Deployer" means a person, partnership, State or local government agency, or corporation that uses an automated decision tool to make a consequential decision.
    "Impact assessment" means a documented risk-based evaluation of an automated decision tool that meets the criteria of Section 10.
    "Sex" includes pregnancy, childbirth, and related conditions, gender identity, intersex status, and sexual orientation.
    "Significant update" means a new version, new release, or other update to an automated decision tool that includes changes to its use case, key functionality, or expected outcomes.

    Section 10. Impact assessment.
    (a) On or before January 1, 2026, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses that includes all of the following:
        (1) a statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;
        (2) a description of the automated decision tool's outputs and how they are used to make, or be a controlling factor in making, a consequential decision;
        (3) a summary of the type of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;
        (4) an analysis of potential adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer's use of the automated decision tool;
        (5) a description of the safeguards implemented, or that will be implemented, by the deployer to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool known to the deployer at the time of the impact assessment;
        (6) a description of how the automated decision tool will be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision; and
        (7) a description of how the automated decision tool has been or will be evaluated for validity or relevance.
    (b) A deployer shall, in addition to the impact assessment required by subsection (a), perform, as soon as feasible, an impact assessment with respect to any significant update.
    (c) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
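[Editor's illustration, not part of the bill text.] For a deployer building compliance tooling, the seven required elements of subsection (a) and the small-deployer exemption in subsection (c) reduce to a checklist and a two-part threshold test. The class and field names below are illustrative only; they do not appear in the Act.

```python
from dataclasses import dataclass, fields

@dataclass
class ImpactAssessment:
    """Illustrative checklist mirroring the seven items of Section 10(a)."""
    purpose_and_intended_uses: str      # (1) purpose, benefits, deployment contexts
    outputs_and_their_use: str          # (2) outputs and their role in the decision
    data_collected_summary: str         # (3) data collected from natural persons
    adverse_impact_analysis: str        # (4) potential adverse impacts by protected class
    discrimination_safeguards: str      # (5) safeguards against algorithmic discrimination
    human_use_or_monitoring: str        # (6) human use or monitoring of the tool
    validity_evaluation: str            # (7) evaluation for validity or relevance

    def is_complete(self) -> bool:
        """True only if every required element is non-empty."""
        return all(getattr(self, f.name).strip() for f in fields(self))

def section_10_applies(employee_count: int, people_impacted_last_year: int) -> bool:
    """Applicability test from Section 10(c): a deployer with fewer than
    25 employees is exempt unless its tool impacted more than 999 people
    in the prior calendar year."""
    if employee_count < 25:
        return people_impacted_last_year > 999
    return True
```

For example, a 10-employee deployer whose tool impacted 1,200 people in the prior calendar year remains covered, while the same deployer at 999 people would be exempt.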
 
    Section 15. Notification and accommodations.
    (a) A deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision. A deployer shall provide to a natural person notified under this subsection all of the following:
        (1) a statement of the purpose of the automated decision tool;
        (2) the contact information for the deployer; and
        (3) a plain language description of the automated decision tool that includes a description of any human components and how any automated component is used to inform a consequential decision.
    (b) If a consequential decision is made solely based on the output of an automated decision tool, a deployer shall, if technically feasible, accommodate a natural person's request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation. After a request is made under this subsection, a deployer may reasonably request, collect, and process information from a natural person for the purposes of identifying the person and the associated consequential decision. If the person does not provide that information, the deployer shall not be obligated to provide an alternative selection process or accommodation.
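[Editor's illustration, not part of the bill text.] The notice required by Section 15(a) has exactly three elements, so a deployer's tooling can validate a notice before it is sent. The function and key names below are illustrative assumptions, not statutory language.

```python
def build_notice(purpose: str, contact: str, plain_description: str) -> dict:
    """Assemble the notice required by Section 15(a): (1) a statement of
    the automated decision tool's purpose, (2) the deployer's contact
    information, and (3) a plain-language description covering any human
    components and how automated components inform the decision.
    Raises ValueError if any required element is missing."""
    notice = {
        "purpose": purpose,
        "deployer_contact": contact,
        "plain_language_description": plain_description,
    }
    missing = [key for key, value in notice.items() if not value.strip()]
    if missing:
        raise ValueError(f"notice missing required elements: {missing}")
    return notice
```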
 
    Section 20. Governance program.
    (a) A deployer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. The safeguards required by this subsection shall be appropriate to all of the following:
        (1) the use or intended use of the automated decision tool;
        (2) the deployer's role as a deployer;
        (3) the size, complexity, and resources of the deployer;
        (4) the nature, context, and scope of the activities of the deployer in connection with the automated decision tool; and
        (5) the technical feasibility and cost of available tools, assessments, and other means used by a deployer to map, measure, manage, and govern the risks associated with an automated decision tool.
    (b) The governance program required by this Section shall be designed to do all of the following:
        (1) identify and implement safeguards to address reasonably foreseeable risks of algorithmic discrimination resulting from the use or intended use of an automated decision tool;
        (2) if established by a deployer, provide for the performance of impact assessments as required by Section 10;
        (3) conduct an annual and comprehensive review of policies, practices, and procedures to ensure compliance with this Act;
        (4) maintain for 2 years after completion the results of an impact assessment; and
        (5) evaluate and make reasonable adjustments to administrative and technical safeguards in light of material changes in technology, the risks associated with the automated decision tool, the state of technical standards, and changes in business arrangements or operations of the deployer.
    (c) A deployer shall designate at least one employee to be responsible for overseeing and maintaining the governance program and compliance with this Act. An employee designated under this subsection shall have the authority to assert to the employee's employer a good faith belief that the design, production, or use of an automated decision tool fails to comply with the requirements of this Act. An employer of an employee designated under this subsection shall conduct a prompt and complete assessment of any compliance issue raised by that employee.
    (d) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
 
    Section 25. Public statement of policy. A deployer shall make publicly available, in a readily accessible manner, a clear policy that provides a summary of both of the following:
        (1) the types of automated decision tools currently in use or made available to others by the deployer; and
        (2) how the deployer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently uses or makes available to others.

    Section 30. Algorithmic discrimination.
    (a) A deployer shall not use an automated decision tool that results in algorithmic discrimination.
    (b) On and after January 1, 2027, a person may bring a civil action against a deployer for violation of this Section. In an action brought under this subsection, the plaintiff shall have the burden of proof to demonstrate that the deployer's use of the automated decision tool resulted in algorithmic discrimination that caused actual harm to the person bringing the civil action.
    (c) In addition to any other remedy at law, a deployer that violates this Section shall be liable to a prevailing plaintiff for any of the following:
        (1) compensatory damages;
        (2) declaratory relief; and
        (3) reasonable attorney's fees and costs.
 
    Section 35. Impact assessment.
    (a) Within 60 days after completing an impact assessment required by this Act, a deployer shall provide the impact assessment to the Department of Human Rights.
    (b) A deployer who knowingly violates this Section shall be liable for an administrative fine of not more than $10,000 per violation in an administrative enforcement action brought by the Department of Human Rights. Each day on which an automated decision tool is used for which an impact assessment has not been submitted as required under this Section shall give rise to a distinct violation of this Section.
    (c) The Department of Human Rights may share impact assessments with other State entities as appropriate.
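[Editor's illustration, not part of the bill text.] Because subsection (b) makes each day of use without a submitted assessment a distinct violation, fined at up to $10,000 each, a deployer's maximum exposure grows linearly with the delay. A sketch of that arithmetic, with an illustrative function name:

```python
def max_fine_exposure(days_used_without_submission: int,
                      fine_per_violation: int = 10_000) -> int:
    """Maximum administrative fine under Section 35(b): each day an
    automated decision tool is used without the required impact
    assessment having been submitted is a distinct violation, fined
    at up to $10,000 per violation."""
    if days_used_without_submission < 0:
        raise ValueError("day count cannot be negative")
    return days_used_without_submission * fine_per_violation
```

For example, 30 days of use without submission could expose a deployer to fines of up to $300,000.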
 
    Section 40. Civil actions.
    (a) The Attorney General may bring a civil action in the name of the people of the State of Illinois against a deployer for a violation of this Act.
    (b) A court may award in an action brought under this Section all of the following:
        (1) injunctive relief;
        (2) declaratory relief; and
        (3) reasonable attorney's fees and litigation costs.
    (c) The Attorney General, before commencing an action under this Section for injunctive relief, shall provide 45 days' written notice to a deployer of the alleged violations of this Act. The deployer may cure, within 45 days after receiving the written notice described in this subsection, the noticed violation and provide the person who gave the notice an express written statement, made under penalty of perjury, that the violation has been cured and that no further violations shall occur. If the deployer cures the noticed violation and provides the express written statement, a claim for injunctive relief shall not be maintained for the noticed violation.