H.R. 6580 (117th): Algorithmic Accountability Act of 2022


The text of the bill below is as of Feb 3, 2022 (Introduced). The bill was not enacted into law.


I

117th CONGRESS

2d Session

H. R. 6580

IN THE HOUSE OF REPRESENTATIVES

February 3, 2022

Ms. Clarke of New York (for herself, Mrs. Watson Coleman, Ms. Norton, Mr. Espaillat, Mr. Grijalva, Mr. McGovern, Ms. Wilson of Florida, Ms. Moore of Wisconsin, Ms. Plaskett, Ms. Pressley, Mr. Payne, Mr. Butterfield, Mr. Veasey, Ms. Bass, Ms. Adams, Ms. Kelly of Illinois, Mr. Cohen, Ms. Omar, Mr. Khanna, Mr. Trone, Mr. Larsen of Washington, Mr. McNerney, Mrs. Trahan, Mr. Bowman, Mr. Jones, Ms. Jackson Lee, Mrs. Lawrence, Mr. Casten, Mr. Carson, Mr. Evans, Mr. Cleaver, and Mr. Huffman) introduced the following bill; which was referred to the Committee on Energy and Commerce

A BILL

To direct the Federal Trade Commission to require impact assessments of automated decision systems and augmented critical decision processes, and for other purposes.

1.

Short title

This Act may be cited as the "Algorithmic Accountability Act of 2022".

2.

Definitions

In this Act:

(1)

Augmented critical decision process

The term augmented critical decision process means a process, procedure, or other activity that employs an automated decision system to make a critical decision.

(2)

Automated decision system

The term automated decision system means any system, software, or process (including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques and excluding passive computing infrastructure) that uses computation, the result of which serves as a basis for a decision or judgment.

(3)

Biometrics

The term biometrics means any information that represents a biological, physiological, or behavioral attribute or feature of a consumer.

(4)

Chair

The term Chair means the Chair of the Commission.

(5)

Commission

The term Commission means the Federal Trade Commission.

(6)

Consumer

The term consumer means an individual.

(7)

Covered entity

(A)

In general

The term covered entity means any person, partnership, or corporation over which the Commission has jurisdiction under section 5(a)(2) of the Federal Trade Commission Act (15 U.S.C. 45(a)(2))—

(i)

that deploys any augmented critical decision process and—

(I)

had greater than $50,000,000 in average annual gross receipts or is deemed to have greater than $250,000,000 in equity value for the 3-taxable-year period (or for the period during which the person, partnership, or corporation has been in existence, if such period is less than 3 years) preceding the most recent fiscal year, as determined in accordance with paragraphs (2) and (3) of section 448(c) of the Internal Revenue Code of 1986;

(II)

possesses, manages, modifies, handles, analyzes, controls, or otherwise uses identifying information about more than 1,000,000 consumers, households, or consumer devices for the purpose of developing or deploying any automated decision system or augmented critical decision process; or

(III)

is substantially owned, operated, or controlled by a person, partnership, or corporation that meets the requirements under subclause (I) or (II);

(ii)

that—

(I)

had greater than $5,000,000 in average annual gross receipts or is deemed to have greater than $25,000,000 in equity value for the 3-taxable-year period (or for the period during which the person, partnership, or corporation has been in existence, if such period is less than 3 years) preceding the most recent fiscal year, as determined in accordance with paragraphs (2) and (3) of section 448(c) of the Internal Revenue Code of 1986; and

(II)

deploys any automated decision system that is developed for implementation or use, or that the person, partnership, or corporation reasonably expects to be implemented or used, in an augmented critical decision process by any person, partnership, or corporation if such person, partnership, or corporation meets the requirements described in clause (i); or

(iii)

that met the criteria described in clause (i) or (ii) within the previous 3 years.

(B)

Inflation adjustment

For purposes of applying this paragraph in any fiscal year after the first fiscal year that begins on or after the date of enactment of this Act, each of the dollar amounts specified in subparagraph (A) shall be increased by the percentage increase (if any) in the consumer price index for all urban consumers (U.S. city average) from such first fiscal year that begins after such date of enactment to the fiscal year involved.

(8)

Critical decision

The term critical decision means a decision or judgment that has any legal, material, or similarly significant effect on a consumer's life relating to access to or the cost, terms, or availability of—

(A)

education and vocational training, including assessment, accreditation, or certification;

(B)

employment, workers management, or self-employment;

(C)

essential utilities, such as electricity, heat, water, internet or telecommunications access, or transportation;

(D)

family planning, including adoption services or reproductive services;

(E)

financial services, including any financial service provided by a mortgage company, mortgage broker, or creditor;

(F)

healthcare, including mental healthcare, dental, or vision;

(G)

housing or lodging, including any rental or short-term housing or lodging;

(H)

legal services, including private arbitration or mediation; or

(I)

any other service, program, or opportunity, decisions about which have a comparable legal, material, or similarly significant effect on a consumer's life, as determined by the Commission through rulemaking.

(9)

Deploy

The term deploy means to implement, use, or make available for sale, license, or other commercial relationship.

(10)

Develop

The term develop means to design, code, produce, customize, or otherwise create or modify.

(11)

Identifying information

The term identifying information means any information, regardless of how the information is collected, inferred, predicted, or obtained, that identifies or represents a consumer, household, or consumer device through data elements or attributes, such as name, postal address, telephone number, biometrics, email address, internet protocol address, social security number, or any other identifying number, identifier, or code.

(12)

Impact assessment

The term impact assessment means the ongoing study and evaluation of an automated decision system or augmented critical decision process and its impact on consumers.

(13)

Passive computing infrastructure

The term passive computing infrastructure means any intermediary technology that does not influence or determine the outcome of a decision, including—

(A)

web hosting;

(B)

domain registration;

(C)

networking;

(D)

caching;

(E)

data storage; or

(F)

cybersecurity.

(14)

State

The term State means each of the 50 States, the District of Columbia, and any territory or possession of the United States.

(15)

Summary report

The term summary report means documentation of a subset of information required to be addressed by the impact assessment as described in this Act or determined appropriate by the Commission.

(16)

Third-party decision recipient

The term third-party decision recipient means any person, partnership, or corporation (beyond the consumer and the covered entity) that receives a copy of or has access to the results of any decision or judgment that results from a covered entity's deployment of an automated decision system or augmented critical decision process.
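
For illustration only (this sketch is not part of the bill text), the dollar thresholds in paragraph (7) and the inflation adjustment in paragraph (7)(B) could be expressed roughly as follows. All function names and the CPI figures are hypothetical, and statutory tests such as "substantially owned, operated, or controlled" involve legal judgments that no simple check can capture.

```python
# Hypothetical sketch of the covered-entity dollar thresholds in section 2(7),
# including the CPI-U inflation adjustment of subparagraph (B). Not part of the bill.

BASE_THRESHOLDS = {
    "large_gross_receipts": 50_000_000,   # clause (i)(I)
    "large_equity_value": 250_000_000,    # clause (i)(I)
    "small_gross_receipts": 5_000_000,    # clause (ii)(I)
    "small_equity_value": 25_000_000,     # clause (ii)(I)
}

def adjust_for_inflation(amount: float, cpi_base_year: float, cpi_current_year: float) -> float:
    """Increase a statutory dollar amount by the percentage increase (if any) in the
    CPI-U between the first fiscal year after enactment and the fiscal year involved,
    as described in section 2(7)(B)."""
    increase = max(0.0, (cpi_current_year - cpi_base_year) / cpi_base_year)
    return amount * (1.0 + increase)

def meets_large_entity_test(avg_annual_gross_receipts: float,
                            equity_value: float,
                            identifying_info_count: int,
                            cpi_base_year: float,
                            cpi_current_year: float) -> bool:
    """Rough check of clause (i)(I)-(II): receipts or equity above the adjusted
    thresholds, or identifying information on more than 1,000,000 consumers,
    households, or consumer devices."""
    receipts_cap = adjust_for_inflation(BASE_THRESHOLDS["large_gross_receipts"],
                                        cpi_base_year, cpi_current_year)
    equity_cap = adjust_for_inflation(BASE_THRESHOLDS["large_equity_value"],
                                      cpi_base_year, cpi_current_year)
    return (avg_annual_gross_receipts > receipts_cap
            or equity_value > equity_cap
            or identifying_info_count > 1_000_000)
```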

3.

Assessing the impact of automated decision systems and augmented critical decision processes

(a)

Acts prohibited

(1)

In general

It is unlawful for—

(A)

any covered entity to violate a regulation promulgated under subsection (b); or

(B)

any person to knowingly provide substantial assistance to any covered entity in violating subsection (b).

(2)

Preemption of private contracts

It shall be unlawful for any covered entity to commit the acts prohibited in paragraph (1), regardless of specific agreements between entities or consumers.

(b)

Regulations

(1)

In general

Subject to paragraph (2), not later than 2 years after the date of enactment of this Act, the Commission shall, in consultation with the Director of the National Institute of Standards and Technology, the Director of the National Artificial Intelligence Initiative, the Director of the Office of Science and Technology Policy, and other relevant stakeholders, including standards bodies, private industry, academia, technology experts, and advocates for civil rights, consumers, and impacted communities, promulgate regulations, in accordance with section 553 of title 5, United States Code, that—

(A)

require each covered entity to perform impact assessment of any—

(i)

deployed automated decision system that was developed for implementation or use, or that the covered entity reasonably expects to be implemented or used, in an augmented critical decision process by any person, partnership, or corporation that meets the requirements described in section 2(7)(A)(i); and

(ii)

augmented critical decision process, both prior to and after deployment by the covered entity;

(B)

require each covered entity to maintain documentation of any impact assessment performed under subparagraph (A), including the applicable information described in section 4(a), for 3 years longer than the duration of time for which the automated decision system or augmented critical decision process is deployed;

(C)

require each person, partnership, or corporation that meets the requirements described in section 2(7)(A)(i) to disclose their status as a covered entity to any person, partnership, or corporation that sells, licenses, or otherwise provides through a commercial relationship any automated decision system deployed by the covered entity in an automated decision system or augmented critical decision process;

(D)

require each covered entity to submit to the Commission, on an annual basis, a summary report for ongoing impact assessment of any deployed automated decision system or augmented critical decision process;

(E)

require each covered entity to submit an initial summary report to the Commission for any new automated decision system or augmented critical decision process prior to its deployment by the covered entity;

(F)

allow any person, partnership, or corporation over which the Commission has jurisdiction under section 5(a)(2) of the Federal Trade Commission Act (15 U.S.C. 45(a)(2)) that deploys any automated decision system or augmented critical decision process, but is not a covered entity, to submit to the Commission a summary report for any impact assessment performed with respect to such system or process;

(G)

require each covered entity, in performing the impact assessment described in subparagraph (A), to the extent possible, to meaningfully consult (including through participatory design, independent auditing, or soliciting or incorporating feedback) with relevant internal stakeholders (such as employees, ethics teams, and responsible technology teams) and independent external stakeholders (such as representatives of and advocates for impacted groups, civil society and advocates, and technology experts) as frequently as necessary;

(H)

require each covered entity to attempt to eliminate or mitigate, in a timely manner, any impact made by an augmented critical decision process that demonstrates a likely material negative impact that has legal or similarly significant effects on a consumer's life;

(I)

establish definitions for—

(i)

what constitutes "access to or the cost, terms, or availability of" with respect to a critical decision;

(ii)

what constitutes "possession", "management", "modification", and "control" with respect to identifying information;

(iii)

the different categories of third-party decision recipients that a covered entity must document under section 5(1)(H); and

(iv)

any of the services, programs, or opportunities described in subparagraphs (A) through (I) of section 2(8) for the purpose of informing consumers, covered entities, and regulators, as the Commission deems necessary;

(J)

establish guidelines for any person, partnership, or corporation to calculate the number of consumers, households, or consumer devices for which the person, partnership, or corporation possesses, manages, modifies, or controls identifying information for the purpose of determining covered entity status;

(K)

establish guidelines for a covered entity to prioritize different automated decision systems and augmented critical decision processes deployed by the covered entity for performing impact assessment; and

(L)

establish a required format for any summary report, as described in subparagraphs (D), (E), and (F), to ensure that such reports are submitted in an accessible and machine-readable format.

(2)

Considerations

In promulgating the regulations under paragraph (1), the Commission—

(A)

shall take into consideration—

(i)

that certain assessment or documentation of an automated decision system or augmented critical decision process may only be possible at particular stages of the development and deployment of such system or process or may be limited or not possible based on the availability of certain types of information or data or the nature of the relationship between the covered entity and consumers;

(ii)

the duration of time between summary report submissions and the timeliness of the reported information;

(iii)

the administrative burden placed on the Commission and the covered entity;

(iv)

the benefits of standardizing and structuring summary reports for comparative analysis compared with the benefits of less-structured narrative reports to provide detail and flexibility in reporting;

(v)

that summary reports submitted by different covered entities may contain different fields according to the requirements established by the Commission, and the Commission may allow or require submission of incomplete reports;

(vi)

that existing data privacy and other regulations may inhibit a covered entity from storing or sharing certain information; and

(vii)

that a covered entity may require information from other persons, partnerships, or corporations that develop any automated decision system deployed in an automated decision system or augmented critical decision process by the covered entity for the purpose of performing impact assessment; and

(B)

may develop specific requirements for impact assessments and summary reports for particular—

(i)

categories of critical decisions, as described in subparagraphs (A) through (I) of section 2(8) or any subcategory developed by the Commission; and

(ii)

stages of development and deployment of an automated decision system or augmented critical decision process.

(3)

Effective date

The regulations described in paragraph (1) shall take effect on the date that is 2 years after such regulations are promulgated.

4.

Requirements for covered entity impact assessment

(a)

Requirements for impact assessment

In performing any impact assessment required under section 3(b)(1) for an automated decision system or augmented critical decision process, a covered entity shall do the following, to the extent possible, as applicable to such covered entity as determined by the Commission:

(1)

In the case of a new augmented critical decision process, evaluate any previously existing critical decision-making process used for the same critical decision prior to the deployment of the new augmented critical decision process, along with any related documentation or information, such as—

(A)

a description of the baseline process being enhanced or replaced by the augmented critical decision process;

(B)

any known harm, shortcoming, failure case, or material negative impact on consumers of the previously existing process used to make the critical decision;

(C)

the intended benefits of and need for the augmented critical decision process; and

(D)

the intended purpose of the automated decision system or augmented critical decision process.

(2)

Identify and describe any consultation with relevant stakeholders as required by section 3(b)(1)(G), including by documenting—

(A)

the points of contact for the stakeholders who were consulted;

(B)

the date of any such consultation; and

(C)

information about the terms and process of the consultation, such as—

(i)

the existence and nature of any legal or financial agreement between the stakeholders and the covered entity;

(ii)

any data, system, design, scenario, or other document or material the stakeholder interacted with; and

(iii)

any recommendations made by the stakeholders that were used to modify the development or deployment of the automated decision system or augmented critical decision process, as well as any recommendations not used and the rationale for such nonuse.

(3)

In accordance with any relevant National Institute of Standards and Technology or other Federal Government best practices and standards, perform ongoing testing and evaluation of the privacy risks and privacy-enhancing measures of the automated decision system or augmented critical decision process, such as—

(A)

assessing and documenting the data minimization practices of such system or process and the duration for which the relevant identifying information and any resulting critical decision is stored;

(B)

assessing the information security measures in place with respect to such system or process, including any use of privacy-enhancing technology such as federated learning, differential privacy, secure multi-party computation, de-identification, or secure data enclaves based on the level of risk; and

(C)

assessing and documenting the current and potential future or downstream positive and negative impacts of such system or process on the privacy, safety, or security of consumers and their identifying information.

(4)

Perform ongoing testing and evaluation of the current and historical performance of the automated decision system or augmented critical decision process using measures such as benchmarking datasets, representative examples from the covered entity’s historical data, and other standards, including by documenting—

(A)

a description of what is deemed successful performance and the methods and technical and business metrics used by the covered entity to assess performance;

(B)

a review of the performance of such system or process under test conditions or an explanation of why such performance testing was not conducted;

(C)

a review of the performance of such system or process under deployed conditions or an explanation of why performance was not reviewed under deployed conditions;

(D)

a comparison of the performance of such system or process under deployed conditions to test conditions or an explanation of why such a comparison was not possible;

(E)

an evaluation of any differential performance associated with consumers' race, color, sex, gender, age, disability, religion, family status, socioeconomic status, or veteran status, and any other characteristics the Commission deems appropriate (including any combination of such characteristics) for which the covered entity has information, including a description of the methodology for such evaluation and information about and documentation of the methods used to identify such characteristics in the data (such as through the use of proxy data, including zip codes); and

(F)

if any subpopulations were used for testing and evaluation, a description of which subpopulations were used and how and why such subpopulations were determined to be of relevance for the testing and evaluation.

(5)

Support and perform ongoing training and education for all relevant employees, contractors, or other agents regarding any documented material negative impacts on consumers from similar automated decision systems or augmented critical decision processes and any improved methods of developing or performing an impact assessment for such system or process based on industry best practices and relevant proposals and publications from experts, such as advocates, journalists, and academics.

(6)

Assess the need for and possible development of any guard rail for or limitation on certain uses or applications of the automated decision system or augmented critical decision process, including whether such uses or applications ought to be prohibited or otherwise limited through any terms of use, licensing agreement, or other legal agreement between entities.

(7)

Maintain and keep updated documentation of any data or other input information used to develop, test, maintain, or update the automated decision system or augmented critical decision process, including—

(A)

how and when such data or other input information was sourced and, if applicable, licensed, including information such as—

(i)

metadata and information about the structure and type of data or other input information, such as the file type, the date of the file creation or modification, and a description of data fields;

(ii)

an explanation of the methodology by which the covered entity collected, inferred, or obtained the data or other input information and, if applicable, labeled, categorized, sorted, or clustered such data or other input information, including whether such data or other input information was labeled, categorized, sorted, or clustered prior to being collected, inferred, or obtained by the covered entity; and

(iii)

whether and how consumers provided informed consent for the inclusion and further use of data or other input information about themselves and any limitations stipulated on such inclusion or further use;

(B)

why such data or other input information was used and what alternatives were explored; and

(C)

other information about the data or other input information, such as—

(i)

the representativeness of the dataset and how this factor was measured, including any assumption about the distribution of the population on which the augmented critical decision process is deployed; and

(ii)

the quality of the data, how the quality was evaluated, and any measure taken to normalize, correct, or clean the data.

(8)

Evaluate the rights of consumers, such as—

(A)

by assessing the extent to which the covered entity provides consumers with—

(i)

clear notice that such system or process will be used; and

(ii)

a mechanism for opting out of such use;

(B)

by assessing the transparency and explainability of such system or process and the degree to which a consumer may contest, correct, or appeal a decision or opt out of such system or process, including—

(i)

the information available to consumers or representatives or agents of consumers about the system or process, such as any relevant factors that contribute to a particular decision, including an explanation of which contributing factors, if changed, would cause the system or process to reach a different decision, and how such consumer, representative, or agent can access such information;

(ii)

documentation of any complaint, dispute, correction, appeal, or opt-out request submitted to the covered entity by a consumer with respect to such system or process; and

(iii)

the process and outcome of any remediation measure taken by the covered entity to address the concerns of or harms to consumers; and

(C)

by describing the extent to which any third-party decision recipient receives a copy of or has access to the results of such system or process and the category of such third-party decision recipient, as defined by the Commission in section 3(b)(1)(I)(iii).

(9)

Identify any likely material negative impact of the automated decision system or augmented critical decision process on consumers and assess any applicable mitigation strategy, such as by—

(A)

identifying and measuring any likely material negative impact of the system or process on consumers, including documentation of the steps taken to identify and measure such impact;

(B)

documenting any steps taken to eliminate or reasonably mitigate any likely material negative impact identified, including steps such as removing the system or process from the market or terminating its development;

(C)

with respect to the likely material negative impacts identified, documenting which such impacts were left unmitigated and the rationale for the inaction, including details about the justifying non-discriminatory, compelling interest and why such interest cannot be satisfied by other means (such as where there is an equal, zero-sum trade-off between impacts on 2 or more consumers or where the required mitigating action would violate civil rights or other laws); and

(D)

documenting standard protocols or practices used to identify, measure, mitigate, or eliminate any likely material negative impact on consumers and how relevant teams or staff are informed of and trained about such protocols or practices.

(10)

Describe any ongoing documentation of the development and deployment process with respect to the automated decision system or augmented critical decision process, including information such as—

(A)

the date of any testing, deployment, licensure, or other significant milestones; and

(B)

points of contact for any team, business unit, or similar internal stakeholder that was involved.

(11)

Identify any capabilities, tools, standards, datasets, security protocols, improvements to stakeholder engagement, or other resources that may be necessary or beneficial to improving the automated decision system, augmented critical decision process, or the impact assessment of such system or process, in areas such as—

(A)

performance, including accuracy, robustness, and reliability;

(B)

fairness, including bias and nondiscrimination;

(C)

transparency, explainability, contestability, and opportunity for recourse;

(D)

privacy and security;

(E)

personal and public safety;

(F)

efficiency and timeliness;

(G)

cost; or

(H)

any other area determined appropriate by the Commission.

(12)

Document any of the impact assessment requirements described in paragraphs (1) through (11) that were attempted but were not possible to comply with because they were infeasible, as well as the corresponding rationale for not being able to comply with such requirements, which may include—

(A)

the absence of certain information about an automated decision system developed by other persons, partnerships, and corporations;

(B)

the absence of certain information about how clients, customers, licensees, partners, and other persons, partnerships, or corporations are deploying an automated decision system in their augmented critical decision processes;

(C)

a lack of demographic or other data required to assess differential performance because such data is too sensitive to collect, infer, or store; or

(D)

a lack of certain capabilities, including technological innovations, that would be necessary to conduct such requirements.

(13)

Perform and document any other ongoing study or evaluation determined appropriate by the Commission.
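
As a hedged illustration of the kind of differential-performance evaluation contemplated by paragraph (4)(E) above (the bill prescribes no particular method), per-group metrics might be computed along these lines. The accuracy metric, group labels, and data layout are assumptions for the sketch only.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute a simple accuracy metric for each demographic group.

    `records` is an iterable of (group_label, predicted, actual) tuples, for example
    drawn from a benchmarking dataset or representative examples from the covered
    entity's historical data. Returns {group_label: accuracy}, which can then be
    compared across groups to surface differential performance."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {group: correct[group] / totals[group] for group in totals}

# Hypothetical usage: compare decision accuracy across two groups.
sample = [
    ("group_a", "approve", "approve"),
    ("group_a", "deny", "approve"),
    ("group_b", "approve", "approve"),
    ("group_b", "deny", "deny"),
]
print(accuracy_by_group(sample))  # {'group_a': 0.5, 'group_b': 1.0}
```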

(b)

Rule of construction

Nothing in this Act should be construed to limit any covered entity from adding other criteria, procedures, or technologies to improve the performance of an impact assessment of their automated decision system or augmented critical decision process.

(c)

Nondisclosure of impact assessment

Nothing in this Act should be construed to require a covered entity to share with or otherwise disclose to the Commission or the public any information contained in an impact assessment performed in accordance with this Act, except for any information contained in the summary report required under subparagraph (D) or (E) of section 3(b)(1).

5.

Requirements for summary reports to the Commission

The summary report that a covered entity is required to submit under subparagraph (D) or (E) of section 3(b)(1) for any automated decision system or augmented critical decision process shall, to the extent possible—

(1)

contain information from the impact assessment of such system or process, as applicable, including—

(A)

the name, website, and point of contact for the covered entity;

(B)

a detailed description of the specific critical decision that the augmented critical decision process is intended to make, including the category of critical decision as described in subparagraphs (A) through (I) of section 2(8);

(C)

the covered entity's intended purpose for the automated decision system or augmented critical decision process;

(D)

an identification of any stakeholders consulted by the covered entity as required by section 3(b)(1)(G) and documentation of the existence and nature of any legal agreements between the stakeholders and the covered entity;

(E)

documentation of the testing and evaluation of the automated decision system or augmented critical decision process, including—

(i)

the methods and technical and business metrics used to assess the performance of such system or process and a description of what metrics are deemed successful performance;

(ii)

the results of any assessment of the performance of such system or process and a comparison of the results of any assessment under test and deployed conditions; and

(iii)

an evaluation of any differential performance of such system or process assessed during the impact assessment;

(F)

any publicly stated guard rail for or limitation on certain uses or applications of the automated decision system or augmented critical decision process, including whether such uses or applications ought to be prohibited or otherwise limited through any terms of use, licensing agreement, or other legal agreement between entities;

(G)

documentation about the data or other input information used to develop, test, maintain, or update the automated decision system or augmented critical decision process including—

(i)

how and when the covered entity sourced such data or other input information; and

(ii)

why such data or other input information was used and what alternatives were explored;

(H)

documentation of whether and how the covered entity implements any transparency or explainability measures, including—

(i)

which categories of third-party decision recipients receive a copy of or have access to the results of any decision or judgment that results from such system or process; and

(ii)

any mechanism by which a consumer may contest, correct, or appeal a decision or opt out of such system or process, including the corresponding website for such mechanism, where applicable;

(I)

any likely material negative impact on consumers identified by the covered entity and a description of the steps taken to remediate or mitigate such impact;

(J)

a list of any impact assessment requirements that were attempted but were not possible to comply with because they were infeasible, as well as the corresponding rationale for not being able to comply with such requirements; and

(K)

any additional capabilities, tools, standards, datasets, security protocols, improvements to stakeholder engagement, or other resources identified by an impact assessment as necessary or beneficial to improve the performance of impact assessment or the development and deployment of any automated decision system or augmented critical decision process that the covered entity determines appropriate to share with the Commission;

(2)

include, in addition to the information required under paragraph (1), any relevant additional information from section 4(a) that the covered entity wishes to share with the Commission;

(3)

follow any format or structure requirements specified by the Commission; and

(4)

include additional criteria that are essential for the purpose of consumer protection, as determined by the Commission.
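
Because section 3(b)(1)(L) requires summary reports to be submitted in an accessible, machine-readable format, a covered entity's report would ultimately be serialized as structured data. The sketch below is only a hypothetical shape for the fields enumerated in section 5(1); the Commission, not this bill, would define the actual schema and field names.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class SummaryReport:
    """Hypothetical machine-readable summary report drawing on section 5(1)."""
    covered_entity_name: str
    website: str
    point_of_contact: str
    critical_decision_description: str
    critical_decision_category: str          # one of the categories in section 2(8)(A)-(I)
    intended_purpose: str
    stakeholders_consulted: list = field(default_factory=list)
    performance_metrics: dict = field(default_factory=dict)
    differential_performance: dict = field(default_factory=dict)
    data_sources: list = field(default_factory=list)
    third_party_recipient_categories: list = field(default_factory=list)
    consumer_recourse_url: str = ""
    negative_impacts_and_mitigations: list = field(default_factory=list)
    infeasible_requirements: list = field(default_factory=list)

# Hypothetical example submission.
report = SummaryReport(
    covered_entity_name="Example Corp",
    website="https://example.com",
    point_of_contact="compliance@example.com",
    critical_decision_description="Tenant screening for rental housing applications",
    critical_decision_category="housing or lodging",
    intended_purpose="Rank rental applications for manual review",
)
print(json.dumps(asdict(report), indent=2))  # machine-readable payload
```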

6.

Reporting; publicly accessible repository

(a)

Annual report

Not later than 1 year after the effective date described in section 3(b)(3), and annually thereafter, the Commission shall publish publicly on the website of the Commission a report describing and summarizing the information from the summary reports submitted under subparagraph (D), (E), or (F) of section 3(b)(1) that—

(1)

is accessible and machine readable in accordance with the 21st Century Integrated Digital Experience Act (44 U.S.C. 3501 note); and

(2)

describes broad trends, aggregated statistics, and anonymized lessons learned about performing impact assessments of automated decision systems or augmented critical decision processes, for the purposes of updating guidance related to impact assessments and summary reporting, oversight, and making recommendations to other regulatory agencies.

(b)

Publicly accessible repository

(1)

In general

(A)

Establishment

(i)

Development

Not later than 180 days after the Commission promulgates the regulations required under section 3(b)(1), the Commission shall develop a publicly accessible repository designed to publish a limited subset of the information about each automated decision system and augmented critical decision process for which the Commission received a summary report under subparagraph (D), (E), or (F) of section 3(b)(1) in order to facilitate consumer protection.

(ii)

Publication

Not later than 180 days after the effective date described in section 3(b)(3), the Commission shall make the repository publicly accessible.

(iii)

Updates

The Commission shall update the repository on a quarterly basis.

(B)

Purpose

The purposes of the repository established under subparagraph (A) are—

(i)

to inform consumers about the use of automated decision systems and augmented critical decision processes;

(ii)

to allow researchers and advocates to study the use of automated decision systems and augmented critical decision processes; and

(iii)

to ensure compliance with the requirements of this Act.

(C)

Considerations

In establishing the repository under subparagraph (A), the Commission shall consider—

(i)

how to provide consumers with pertinent information regarding augmented critical decision processes while minimizing any potential commercial risk to any covered entity of providing such information;

(ii)

what information, if any, to include regarding the specific automated decision systems deployed in the augmented critical decision processes;

(iii)

how to document information, when applicable, about how to contest or seek recourse for a critical decision in a manner that is readily accessible by the consumer; and

(iv)

how to streamline the submission of summary reports under subparagraph (D), (E), or (F) of section 3(b)(1) to allow the Commission to efficiently populate information into the repository to minimize or eliminate any burden on the Commission.

(D)

Requirements

The Commission shall design the repository established under subparagraph (A) to—

(i)

be publicly available and easily discoverable on the website of the Commission;

(ii)

allow users to sort and search the repository by multiple characteristics (such as by covered entity, date reported, or category of critical decision) simultaneously;

(iii)

allow users to make a copy of or download the information obtained from the repository, including any subsets of information obtained by sorting or searching as described in clause (ii), in accordance with current guidance from the Office of Management and Budget, such as the Open, Public, Electronic, and Necessary Government Data Act (44 U.S.C. 101 note);

(iv)

be in accordance with user experience and accessibility best practices such as those described in the 21st Century Integrated Digital Experience Act (44 U.S.C. 3501 note);

(v)

include a limited subset of information from the summary reports, as applicable, under subparagraph (D), (E), or (F) of section 3(b)(1) that includes—

(I)

the identity of the covered entity that submitted such summary report, including any link to the website of the covered entity;

(II)

the specific critical decision that the augmented critical decision process makes, along with the category of the critical decision;

(III)

any publicly stated prohibited applications of the automated decision system or augmented critical decision process, including whether such prohibition is enforced through any terms of use, licensing agreement, or other legal agreement between entities;

(IV)

to the extent possible, the sources of any data used to develop, test, maintain, or update the automated decision system or augmented critical decision process;

(V)

to the extent possible, the type of technical and business metrics used to assess the performance of the augmented critical decision process when deployed; and

(VI)

the link to any web page with instructions or other information related to a mechanism by which a consumer may contest, correct, or appeal a decision or opt out of the automated decision system or augmented critical decision process; and

(vi)

include information about design, use, and maintenance of the repository, including—

(I)

how frequently the repository is updated;

(II)

the date of the most recent such update;

(III)

the types of information from the summary reports submitted under subparagraph (D), (E), or (F) of section 3(b)(1) that are and are not included in the repository; and

(IV)

any other information about the design, use, and maintenance the Commission determines is—

(aa)

relevant to consumers and researchers; or

(bb)

essential for consumer education and recourse.

(2)

Authorization of appropriations

There are authorized to be appropriated to the Commission such sums as are necessary to carry out this subsection.
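
Subparagraph (D) above asks the Commission to make the repository sortable, searchable, and downloadable by characteristics such as covered entity, date reported, and category of critical decision. Purely as an illustrative sketch (the bill specifies no implementation), filtering and multi-key sorting of repository entries might look like the following; the record fields and values are hypothetical.

```python
from datetime import date

# Hypothetical repository entries built from the limited subset of summary-report
# information described in section 6(b)(1)(D)(v).
entries = [
    {"covered_entity": "Example Corp", "date_reported": date(2025, 1, 15),
     "category": "housing or lodging", "decision": "tenant screening",
     "recourse_url": "https://example.com/appeal"},
    {"covered_entity": "Acme Lending", "date_reported": date(2024, 11, 2),
     "category": "financial services", "decision": "loan underwriting",
     "recourse_url": "https://acme.example/contest"},
]

def search(repo, **filters):
    """Return entries matching every provided field exactly, e.g. category=...."""
    return [e for e in repo if all(e.get(k) == v for k, v in filters.items())]

def sort_by(repo, *keys):
    """Sort entries by multiple characteristics simultaneously, e.g. category then date."""
    return sorted(repo, key=lambda e: tuple(e[k] for k in keys))

housing = search(entries, category="housing or lodging")
ordered = sort_by(entries, "category", "date_reported")
```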

7.

Guidance and technical assistance; other requirements

(a)

Guidance and technical assistance from the Commission

(1)

In general

The Commission shall publish guidance on how to meet the requirements of sections 4 and 5, including resources such as documentation templates and guides for meaningful consultation, that is developed by the Commission after consultation with the Director of the National Institute of Standards and Technology, the Director of the National Artificial Intelligence Initiative, the Director of the Office of Science and Technology Policy, and other relevant stakeholders, including standards bodies, private industry, academia, technology experts, and advocates for civil rights, consumers, and impacted communities.

(2)

Assistance in determining covered entity status

In addition to the guidance required under paragraph (1), the Commission shall—

(A)

issue guidance and training materials to assist persons, partnerships, and corporations in evaluating whether they are a covered entity; and

(B)

regularly update such guidance and training materials in accordance with any feedback or questions from covered entities, experts, or other relevant stakeholders.

(b)

Other requirements

(1)

Publication

Nothing in this Act shall be construed to limit a covered entity from publicizing any documentation of the impact assessment maintained under section 3(b)(1)(B), including information beyond what is required to be submitted in a summary report under subparagraph (D) or (E) of section 3(b)(1), unless such publication would violate the privacy of any consumer.

(2)

Periodic review of regulations

The Commission shall review the regulations promulgated under section 3(b) not less than once every 5 years and update such regulations as appropriate.

(3)

Review by NIST and OSTP

The Commission shall make available, in a private and secure manner, to the Director of the National Institute of Standards and Technology, the Director of the Office of Science and Technology Policy, and the head of any Federal agency with relevant regulatory jurisdiction over an augmented critical decision process any summary report submitted under subparagraph (D), (E), or (F) of section 3(b)(1) for review in order to develop future standards or regulations.

8.

Resources and authorities

(a)

Bureau of Technology

(1)

Establishment

(A)

In general

There is established within the Commission the Bureau of Technology (in this subsection referred to as the Bureau).

(B)

Duties

The Bureau shall engage in activities that include:

(i)

Aiding or advising the Commission with respect to the technological aspects of the functions of the Commission, including—

(I)

preparing, conducting, facilitating, managing, or otherwise enabling studies, workshops, audits, community participation opportunities, or other similar activities; and

(II)

any other assistance deemed appropriate by the Commission or Chair.

(ii)

Aiding or advising the Commission with respect to the enforcement of this Act.

(iii)

Providing technical assistance to any enforcement bureau within the Commission with respect to the investigation and trial of cases.

(2)

Chief Technologist

The Bureau shall be headed by a Chief Technologist.

(3)

Staff

(A)

Appointments

(i)

In general

Subject to subparagraph (B), the Chair may, without regard to the civil service laws (including regulations), appoint personnel with experience in fields such as management, technology, digital and product design, user experience, information security, civil rights, technology policy, privacy policy, humanities and social sciences, product management, software engineering, machine learning, statistics, or other related fields to enable the Bureau to perform its duties.

(ii)

Minimum appointments

Not later than 2 years after the date of enactment of this Act, the Chair shall appoint not less than 50 personnel.

(B)

Excepted service

The personnel appointed in accordance with subparagraph (A) may be appointed to positions described in section 213.3102(r) of title 5, Code of Federal Regulations.

(4)

Authorization of appropriations

There are authorized to be appropriated to the Commission such sums as are necessary to carry out this subsection.

(b)

Additional personnel in the Bureau of Consumer Protection

(1)

Additional personnel

Notwithstanding any other provision of law, the Chair may, without regard to the civil service laws (including regulations), appoint 25 additional personnel to the Division of Enforcement of the Bureau of Consumer Protection.

(2)

Authorization of appropriations

There are authorized to be appropriated to the Commission such sums as are necessary to carry out this subsection.

(c)

Establishment of agreements of cooperation

The Commission shall negotiate agreements of cooperation, as needed, with any relevant Federal agency with respect to information sharing and enforcement actions taken regarding the development or deployment of an automated decision system to make a critical decision or of an augmented critical decision process. Such agreements shall include procedures for determining which agency shall file an action and providing notice to the non-filing agency, where feasible, prior to initiating a civil action to enforce any Federal law within such agencies' jurisdictions regarding the development or deployment of an automated decision system to make a critical decision or of an augmented critical decision process by a covered entity.

9.

Enforcement

(a)

Enforcement by the Commission

(1)

Unfair or deceptive acts or practices

A violation of this Act or a regulation promulgated thereunder shall be treated as a violation of a rule defining an unfair or deceptive act or practice under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).

(2)

Powers of the Commission

(A)

In general

The Commission shall enforce this Act and the regulations promulgated under this Act in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this Act.

(B)

Privileges and immunities

Any person who violates this Act or a regulation promulgated thereunder shall be subject to the penalties and entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).

(C)

Authority preserved

Nothing in this Act shall be construed to limit the authority of the Commission under any other provision of law.

(D)

Rulemaking

The Commission shall promulgate in accordance with section 553 of title 5, United States Code, such additional rules as may be necessary to carry out this Act.

(b)

Enforcement by States

(1)

In general

If the attorney general of a State has reason to believe that an interest of the residents of the State has been or is being threatened or adversely affected by a practice that violates this Act or a regulation promulgated thereunder, the attorney general of the State may, as parens patriae, bring a civil action on behalf of the residents of the State in an appropriate district court of the United States to obtain appropriate relief.

(2)

Rights of Commission

(A)

Notice to Commission

(i)

In general

Except as provided in clause (iii), the attorney general of a State, before initiating a civil action under paragraph (1), shall provide written notification to the Commission that the attorney general intends to bring such civil action.

(ii)

Contents

The notification required under clause (i) shall include a copy of the complaint to be filed to initiate the civil action.

(iii)

Exception

If it is not feasible for the attorney general of a State to provide the notification required under clause (i) before initiating a civil action under paragraph (1), the attorney general shall notify the Commission immediately upon instituting the civil action.

(B)

Intervention by Commission

The Commission may—

(i)

intervene in any civil action brought by the attorney general of a State under paragraph (1); and

(ii)

upon intervening—

(I)

be heard on all matters arising in the civil action; and

(II)

file petitions for appeal of a decision in the civil action.

(3)

Investigatory powers

Nothing in this subsection may be construed to prevent the attorney general of a State from exercising the powers conferred on the attorney general by the laws of the State to conduct investigations, to administer oaths or affirmations, or to compel the attendance of witnesses or the production of documentary or other evidence.

(4)

Venue; service of process

(A)

Venue

Any action brought under paragraph (1) may be brought in—

(i)

the district court of the United States that meets applicable requirements relating to venue under section 1391 of title 28, United States Code; or

(ii)

another court of competent jurisdiction.

(B)

Service of process

In an action brought under paragraph (1), process may be served in any district in which—

(i)

the defendant is an inhabitant, may be found, or transacts business; or

(ii)

venue is proper under section 1391 of title 28, United States Code.

(5)

Actions by other State officials

(A)

In general

In addition to a civil action brought by an attorney general under paragraph (1), any other officer of a State who is authorized by the State to do so may bring a civil action under paragraph (1), subject to the same requirements and limitations that apply under this subsection to civil actions brought by attorneys general.

(B)

Savings provision

Nothing in this subsection may be construed to prohibit an authorized official of a State from initiating or continuing any proceeding in a court of the State for a violation of any civil or criminal law of the State.

10.

Coordination

In carrying out this Act, the Commission shall coordinate with any appropriate Federal agency or State regulator to promote consistent regulatory treatment of automated decision systems and augmented critical decision processes.

11.

No preemption

Nothing in this Act may be construed to preempt any State, tribal, city, or local law, regulation, or ordinance.