117th CONGRESS
1st Session
H. R. 3611
IN THE HOUSE OF REPRESENTATIVES
May 28, 2021
Ms. Matsui introduced the following bill; which was referred to the Committee on Energy and Commerce
A BILL
To prohibit the discriminatory use of personal information by online platforms in any algorithmic process, to require transparency in the use of algorithmic processes and content moderation, and for other purposes.
Short title
This Act may be cited as the "Algorithmic Justice and Online Platform Transparency Act".
Findings
Congress finds the following:
Online platforms have become integral to individuals' full participation in economic, democratic, and societal processes.
Online platforms employ manipulative dark patterns, collect large amounts of personal information from their users, and leverage that personal information for opaque algorithmic processes in ways that create vastly different experiences for different types of users.
Algorithmic processes are often used by online platforms without adequate testing and in the absence of critical transparency requirements and other legally enforceable safety and efficacy standards, which has resulted in discrimination in housing, lending, job advertising, and other areas of opportunity.
The use of discriminatory algorithmic processes causes disproportionate harm to populations that already experience marginalization.
Online platforms constantly engage in content moderation decision making, resulting in highly influential outcomes regarding what content is visible and accessible to users.
Online platforms’ content moderation practices have disproportionately significant repercussions for members of marginalized communities, who have historically been the target of nefarious online activity, including disinformation campaigns.
Users of online platforms should have access to understandable information about how online platforms moderate content and use algorithmic processes to amplify or recommend content.
Users of online platforms should be able to easily move their data to alternative online platforms, and the importance of this right is particularly significant given certain online platforms’ use of harmful algorithmic processes and engagement in ineffective content moderation.
In a variety of sectors, algorithmic processes also facilitate discriminatory outcomes on online platforms that individuals may not personally interact with, but which nonetheless process the personal information of such individuals and have significant, negative consequences.
The people of the United States would benefit from the convening of experts from a diverse set of governmental positions to collectively study and report on discriminatory algorithmic processes across the United States economy and society, with particular attention to intersections of harm.
Definitions
In this Act, the following definitions apply:
Algorithmic process
The term "algorithmic process" means a computational process, including one derived from machine learning or other artificial intelligence techniques, that processes personal information or other data for the purpose of determining the order or manner that a set of information is provided, recommended to, or withheld from a user of an online platform, including the provision of commercial content, the display of social media posts, or any other method of automated decision making, content selection, or content amplification.
Biometric information
The term "biometric information"—
means information regarding the physiological or biological characteristics of an individual that may be used, singly or in combination with each other or with other identifying data, to establish the identity of an individual; and
includes—
genetic information;
imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted;
keystroke patterns or rhythms, gait patterns or rhythms, and sleep, health, or exercise data that contain identifying information; and
any mathematical code, profile, or algorithmic model derived from information regarding the physiological or biological characteristics of an individual.
Commission
The term "Commission" means the Federal Trade Commission.
Content moderation
The term "content moderation" means—
the intentional deletion, labeling, or editing of user generated content or a process of purposefully decreasing access to such content through the human labor of any individual that is financially compensated by an online platform, an automated process, or some combination thereof, pursuant to the online platform's terms of service or stated community standards; and
such other practices as the Commission may identify under regulations promulgated under section 553 of title 5, United States Code.
De-identified
The term "de-identified", with respect to personal information, means information that has been altered, anonymized, or aggregated so that it cannot reasonably identify, relate to, describe, or be capable of being associated with or linked to, directly or indirectly, a particular individual or device.
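A non-normative sketch of how this definition might be satisfied in practice, assuming pseudonymization of direct identifiers combined with aggregation and small-group suppression; the Act prescribes no method, and the record field "zip" and the group-size threshold below are hypothetical:

```python
import hashlib
import os
from collections import Counter

SALT = os.urandom(16)  # secret value kept separate from the data set

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()

def aggregate_by_zip(records: list[dict], min_group_size: int = 10) -> dict:
    """Count records per ZIP code, suppressing groups small enough that an
    individual could reasonably be singled out (threshold is illustrative)."""
    counts = Counter(r["zip"] for r in records)
    return {z: n for z, n in counts.items() if n >= min_group_size}
```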
Demographic information
The term "demographic information" means information regarding an individual's or class of individuals’ race, color, ethnicity, sex, religion, national origin, age, gender, gender identity, sexual orientation, disability status, familial status, immigration status, educational attainment, income, source of income, occupation, employment status, biometric information, criminal record, credit rating, or any categorization used by the online platform derived from such information.
Group
The term "group" means a page or other subdivision of an online platform that functions as a forum for users to post or otherwise distribute content to, or communicate with, other users of such page or other subdivision.
Non-precise geolocation information
The term "non-precise geolocation information" means information regarding a country, State, county, city, or ZIP code.
Online platform
The term "online platform" means any public-facing website, online service, online application, or mobile application which is operated for commercial purposes and provides a community forum for user generated content, including a social network site, content aggregation service, or service for sharing videos, images, games, audio files, or other content.
Personal information
In general
The term "personal information" means information that directly or indirectly identifies, or could be reasonably linked to, a particular individual or device.
Reasonably linked
For purposes of subparagraph (A), information could be reasonably linked to an individual or device if such information can be used on its own or in combination with other information held by, or readily accessible to, a person to identify an individual or device.
Place of public accommodation
The term "place of public accommodation" means—
any entity considered a place of public accommodation under section 201(b) of the Civil Rights Act of 1964 (42 U.S.C. 2000a(b)) or section 301 of the Americans with Disabilities Act of 1990 (42 U.S.C. 12181); or
any commercial entity that offers goods or services through the internet to the general public.
Small business
In general
The term "small business" means a commercial entity that establishes, with respect to the 3 preceding calendar years (or since the inception of such entity if such period is less than 3 calendar years), that the entity—
maintains an average annual gross revenue of less than $25,000,000;
on average, annually processes the personal information of less than 100,000 individuals, households, or devices used by individuals or households;
on average, derives 50 percent or less of its annual revenue from transferring the personal information of individuals; and
has less than 50 workers at any time during such period.
Common control or branding
For purposes of subparagraph (A), the amounts at issue shall include the activity of any person that controls, is controlled by, is under common control with, or shares common branding with such commercial entity.
User generated content
The term "user generated content" means any content, including text, images, videos, reviews, profiles, games, or audio content, that is made or created (including through a form, template, or other process provided by the online platform) and posted on an online platform by a user of the online platform.
Transparency
Notice and review of algorithmic process
Beginning 1 year after the date of enactment of this Act, any online platform that employs, operates, or otherwise utilizes an algorithmic process to withhold, amplify, recommend, or promote content (including a group) to a user of the online platform shall comply with the following requirements:
Required notice
In general
With respect to each type of algorithmic process utilized by an online platform, such online platform shall disclose the following information to users of the online platform in conspicuous, accessible, and plain language that is not misleading:
The categories of personal information the online platform collects or creates for purposes of the type of algorithmic process.
The manner in which the online platform collects or creates such personal information.
How the online platform uses such personal information in the type of algorithmic process.
The method by which the type of algorithmic process prioritizes, assigns weight to, or ranks different categories of personal information to withhold, amplify, recommend, or promote content (including a group) to a user.
Language of required notice
Such online platform shall make available the notice described in subparagraph (A) in each language in which the online platform provides services.
Rulemaking
The Commission shall conduct a rulemaking to identify each type of algorithmic process for which an online platform is required to disclose the information described in subparagraph (A).
Review of algorithmic process
Record of algorithmic process
Subject to subparagraph (B), such online platform shall, for 5 years, retain a record that describes—
the categories of personal information used by the type of algorithmic process;
the method by which the type of algorithmic process weighs or ranks certain categories of personal information;
the method by which the online platform develops its type of algorithmic process, including—
a description of any personal information or other data used in such development;
an explanation of any personal information or other data used to train the type of algorithmic process on an ongoing basis; and
a description of how the type of algorithmic process was tested for accuracy, fairness, bias, and discrimination; and
if the online platform (except for a small business) utilizes an algorithmic process that relates to opportunities for housing, education, employment, insurance, credit, or the access to or terms of use of any place of public accommodation, an assessment of whether the type of algorithmic process produces disparate outcomes on the basis of an individual’s or class of individuals’ actual or perceived race, color, ethnicity, sex, religion, national origin, gender, gender identity, sexual orientation, familial status, biometric information, or disability status.
Additional requirements
Requirement to de-identify personal information
The record described in subparagraph (A) shall not include any personal information other than de-identified personal information.
Extension of record retention
An online platform shall retain the record described in subparagraph (A) for up to an additional 3 years if the Commission determines that the online platform poses a reasonable risk of engaging in repeated violations of this Act or of unlawful discrimination as a result of its use of an algorithmic process.
Review of record
Upon the request of the Commission, an online platform shall make available to the Commission the complete record described in subparagraph (A).
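The Act prescribes the contents of the retained record but not its format. Purely as an illustration, the record for one type of algorithmic process could be kept as a structured object; every field name below is an assumption, not drawn from the Act:

```python
from dataclasses import dataclass

@dataclass
class AlgorithmicProcessRecord:
    """Illustrative structure for the 5-year record described above;
    the Act prescribes the record's contents, not its format."""
    process_type: str                    # e.g., a hypothetical "feed ranking" process
    personal_info_categories: list[str]  # categories of personal information used
    weighting_method: str                # how categories are weighed or ranked
    development_data: list[str]          # data used to develop the process
    ongoing_training_data: list[str]     # data used to train it on an ongoing basis
    testing_description: str             # accuracy, fairness, bias, discrimination testing
    disparate_outcome_assessment: str | None = None  # required except for small businesses
```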
Notice of content moderation practices
Notice
In general
Beginning 1 year after the date of enactment of this Act, any online platform shall disclose to users of the online platform in conspicuous, accessible, and plain language that is not misleading a complete description of the online platform's content moderation practices, including a description of any type of automated content moderation practices and content moderation practices that employ human labor.
Language of required notice
Such online platform shall make available the notice described in subparagraph (A) in each language in which the online platform provides services.
Content moderation transparency reports
In general
Beginning 180 days after the date of enactment of this Act, any online platform (except for a small business) that engages in content moderation shall publish, not less than annually, a transparency report of its content moderation practices.
Requirements
In general
The transparency report required under subparagraph (A) shall include, if applicable:
The total number of content moderation decisions for the applicable period.
The number of content moderation decisions for the applicable period broken down by:
Relevant policy, type, or category of content moderation undertaken by the online platform.
Whether the content moderation decision occurred in response to information regarding organized campaigns or other coordinated behavior.
Aggregate demographic information of users who created the user generated content subjected to content moderation.
Aggregate demographic information of users targeted by an algorithmic process involving content subjected to content moderation.
Whether the content moderation occurred through automated practices, human labor by the online platform, labor by any individual that does not work as a paid employee of the online platform, or any combination thereof.
In the case of content moderation that occurred through human labor by any individual that does not work for the online platform, the nature of such individual’s relationship to the online platform (such as a user, moderator, State actor, or representative of an external partner organization).
The number and percentage of content moderation decisions subject to appeal or other form of secondary review.
The number and percentage of content moderation decisions reversed on appeal or other form of secondary review.
The number of content moderation decisions occurring in response to a government demand or request.
The number of government demands or requests for content moderation broken down by Federal agency, State, municipality, or foreign nation.
The types of content moderation decisions made.
Other information that the Commission, by regulation, deems appropriate.
The ability to cross-reference each of the different types of information disclosed pursuant to subclause (II).
Accessibility of report
The transparency report required under subparagraph (A) shall be—
publicly available to any individual without such individual being required to create a user account;
conspicuous;
accessible;
not misleading; and
available in each language in which the online platform provides services.
Accessibility of report data
The online platform shall—
provide any data in the transparency report required under subparagraph (A) in a machine-readable format; and
allow anyone to freely copy and use such data.
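As a hedged example of the machine-readability and free-use requirements above, a transparency report might be serialized as JSON; the schema, key names, and figures below are entirely hypothetical, since the Act fixes no format:

```python
import json

# Every key name below is an assumption; the Act requires machine-readable
# data but does not prescribe a schema.
report = {
    "period": "2024",
    "total_decisions": 125_000,
    "decisions_by_policy": {"spam": 90_000, "harassment": 20_000, "other": 15_000},
    "from_coordinated_behavior": 4_200,
    "decisions_by_method": {"automated": 100_000, "platform_staff": 20_000, "external_labor": 5_000},
    "appeals": {"count": 8_000, "percent": 6.4},          # 8,000 of 125,000 decisions
    "reversed_on_appeal": {"count": 1_200, "percent": 15.0},  # 1,200 of 8,000 appeals
    "government_demands": {"federal": {"DOJ": 12}, "state": {"CA": 3}, "foreign": {}},
}

with open("transparency_report.json", "w") as f:
    json.dump(report, f, indent=2)  # machine-readable and freely copyable
```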
Rule of construction
Nothing in this subsection shall require an online platform to collect personal information that the online platform would not otherwise collect.
Advertisement library
Beginning 180 days after the date of enactment of this Act, any online platform (except for a small business) that uses personal information in combination with an algorithmic process to sell or publish an advertisement shall take all reasonable steps to maintain a library of such advertisements. The library shall—
be—
publicly available to any individual without such individual being required to create a user account;
conspicuous;
accessible;
not misleading; and
available in each language in which the online platform provides services;
present information in both human- and machine-readable formats;
allow any individual to freely copy and use the information contained in the library;
at a minimum, be searchable by date, location, topic, cost, advertiser, keyword, information disclosed pursuant to paragraph (6), or any other criteria that the Commission, by regulation, deems appropriate;
contain copies of all advertisements sold or published by the online platform for 2 years following the sale or publishing of each advertisement; and
for each advertisement entry, include—
the content of the advertisement;
all targeting criteria selected by the advertiser, including demographic information and non-precise geolocation information (except in the event that including a specific criterion would disclose personal information);
any data the online platform provided to the advertiser regarding to whom it sold or published the advertisement, including demographic information and non-precise geolocation information (except in the event that including specific data would disclose personal information); and
the name of the advertiser, the cost of the advertisement, the dates the advertisement was displayed on the online platform, and any other information that the Commission, by regulation, deems appropriate.
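As a sketch of the searchability requirement only, a library entry and a minimal multi-criteria search might look like the following; the entry fields and the in-memory search are illustrative assumptions, not a prescribed design:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AdEntry:
    """One illustrative library entry; fields track the Act's minimum contents."""
    content: str
    targeting_criteria: dict   # advertiser-selected criteria, personal information excluded
    delivery_data: dict        # data given to the advertiser, personal information excluded
    advertiser: str
    cost_usd: float
    display_start: date
    display_end: date

def search(library: list[AdEntry], *, advertiser: str | None = None,
           keyword: str | None = None) -> list[AdEntry]:
    """Minimal search by advertiser and keyword; the Act also requires date,
    location, topic, and cost criteria, omitted here for brevity."""
    results = library
    if advertiser is not None:
        results = [e for e in results if e.advertiser == advertiser]
    if keyword is not None:
        results = [e for e in results if keyword.lower() in e.content.lower()]
    return results
```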
Certification
Not later than 30 days after making any disclosure required by subsection (a)(1), (b), or (c), and annually thereafter, an online platform shall certify the accuracy and completeness of such disclosure. Such certification shall—
be signed, under oath, by the online platform’s chief executive officer, chief privacy officer, chief operating officer, chief information security officer, or another senior officer of equivalent stature;
attest that the officer described in paragraph (1) has personal knowledge sufficient to make such certification; and
in addition to any annual certification, be issued with any material change (which shall not include routine additions to or maintenance of entries in the advertising library pursuant to subsection (c)).
Right to data portability
In promulgating regulations under this Act, the Commission shall require an online platform, if the online platform retains the personal information of a user, to provide to the user access to the personal information retained in the form of a portable electronic table that—
is in a usable and searchable format; and
allows the user to transfer such personal information from one online platform to another without hindrance.
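A minimal sketch of such a portable electronic table, assuming CSV as the usable, searchable format (the Act names no specific format); the column names are hypothetical:

```python
import csv

def export_user_data(rows: list[dict], path: str = "my_data.csv") -> None:
    """Write a user's retained personal information as a portable, searchable
    table; CSV is one plausible format, not one the Act specifies."""
    if not rows:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)

# Hypothetical records with assumed column names:
export_user_data([
    {"category": "email address", "value": "user@example.com", "collected": "2024-01-02"},
])
```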
Prohibited conduct
Public accommodations
It shall be unlawful for an online platform to employ any proprietary online platform design features, including an algorithmic process, or otherwise process the personal information of an individual in a manner that segregates, discriminates in, or otherwise makes unavailable the goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation on the basis of an individual’s or class of individuals’ actual or perceived race, color, ethnicity, religion, national origin, sex, gender, gender identity, sexual orientation, familial status, biometric information, or disability status.
Equal opportunity
It shall be unlawful for an online platform to employ any proprietary online platform design features, including an algorithmic process, or otherwise process the personal information of an individual for the purpose of advertising, marketing, soliciting, offering, selling, leasing, licensing, renting, or otherwise commercially contracting for housing, employment, credit, insurance, healthcare, or education opportunities in a manner that discriminates against or otherwise makes the opportunity unavailable on the basis of an individual’s or class of individuals’ actual or perceived race, color, ethnicity, religion, national origin, sex, gender, gender identity, sexual orientation, familial status, biometric information, or disability status.
Voting rights
It shall be unlawful for an online platform to process personal information in a manner that intentionally deprives, defrauds, or attempts to deprive or defraud any individual of their free and fair exercise of the right to vote in a Federal, State, or local election. Such manner includes:
Intentional deception regarding—
the time, place, or method of voting or registering to vote;
the eligibility requirements to vote or register to vote;
the counting of ballots;
the adjudication of elections;
explicit endorsements by any person or candidate; or
any other material information pertaining to the procedures or requirements for voting or registering to vote in a Federal, State, or local election.
Intentionally using deception, threats, intimidation, fraud, or coercion to prevent, interfere with, retaliate against, deter, or attempt to prevent, interfere with, retaliate against, or deter an individual from—
voting or registering to vote in a Federal, State, or local election; or
supporting or advocating for a candidate in a Federal, State, or local election.
Discriminatory advertising
In general
Not later than 2 years after the date of enactment of this Act, the Commission shall promulgate regulations to define and prohibit unfair or deceptive acts or practices with respect to advertising practices.
Periodic review of regulations
The Commission shall review such regulations not less than once every 5 years and update the regulations as appropriate.
Considerations
In promulgating regulations under this subsection, the Commission shall consider:
Established public policy, such as civil rights laws, to prevent discrimination and promote equal opportunity.
The state of the art of advertising.
Research of and methodologies for measuring discrimination in advertising.
The role of each actor in the advertising ecosystem.
Any harm caused by predatory or manipulative advertising practices, including practices targeting vulnerable populations.
Whether, and at what age, a minor is able to distinguish between editorial content and paid advertisements.
Methods for fairly promoting equal opportunity in housing, employment, credit, insurance, education, and healthcare through targeted outreach to underrepresented populations in a fair and non-deceptive manner.
The needs of small businesses.
Any other criteria the Commission deems appropriate.
Safety and effectiveness of algorithmic processes
In general
It shall be unlawful for an online platform to employ an algorithmic process in a manner that is not safe and effective.
Safe
For purposes of paragraph (1), an algorithmic process is safe—
if the algorithmic process does not produce any disparate outcome as described in the assessment conducted under section 4(a)(2)(A)(iv); or
if the algorithmic process does produce a disparate outcome as described in the assessment conducted under section 4(a)(2)(A)(iv), any such disparate outcome is justified by a non-discriminatory, compelling interest, and such interest cannot be satisfied by less discriminatory means.
Effective
For purposes of paragraph (1), an algorithmic process is effective if the online platform employing or otherwise utilizing the algorithmic process has taken reasonable steps to ensure that the algorithmic process has the ability to produce its desired or intended result.
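The Act does not specify how a disparate outcome is to be measured. Strictly as an assumed metric, an auditor might compute the selection-rate ratio associated with the EEOC four-fifths rule; the 0.8 threshold and the group figures below are illustrative only:

```python
def selection_rate_ratio(outcomes_by_group: dict[str, tuple[int, int]]) -> float:
    """Return the lowest group selection rate divided by the highest.
    outcomes_by_group maps group -> (favorable outcomes, total individuals).
    A ratio below 0.8 is a common (purely illustrative) disparate-impact flag."""
    rates = [favorable / total for favorable, total in outcomes_by_group.values()]
    return min(rates) / max(rates)

# Hypothetical audit of an ad-delivery process:
ratio = selection_rate_ratio({"group_a": (480, 1000), "group_b": (300, 1000)})
assert round(ratio, 3) == 0.625  # 0.30 / 0.48; below 0.8, so it would merit review
```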
Discrimination by users of online platforms
It shall be unlawful for a user of an online platform to utilize an algorithmic process on an online platform in a manner that—
withholds, denies, deprives, or attempts to withhold, deny, or deprive any individual of a right or privilege under title II of the Civil Rights Act of 1964 (42 U.S.C. 2000a et seq.);
intimidates, threatens, coerces, or attempts to intimidate, threaten, or coerce any individual with the purpose of interfering with a right or privilege under title II of such Act; or
punishes or attempts to punish any individual for exercising or attempting to exercise a right or privilege under title II of such Act.
Exceptions
Nothing in this section shall limit an online platform from processing personal information for the purpose of—
good faith internal testing to prevent unlawful discrimination, identify disparate outcomes or treatment, or otherwise determine the extent or effectiveness of the online platform’s compliance with this Act; or
advertising, marketing, or soliciting economic opportunities (which shall not be of lower quality or contain less desirable terms than similar opportunities the online platform advertises, markets, or solicits to the general population) to underrepresented populations in a fair and non-deceptive manner.
FTC advisory opinions
An online platform may request guidance from the Commission with respect to the online platform’s potential compliance with this Act, in accordance with the Commission’s rules of practice on advisory opinions.
Preservation of rights and whistleblower protections; rules of construction
No conditional service
An online platform may not condition or degrade the provision of a service or product to an individual based on the individual's waiver of any right guaranteed in this section.
No arbitration agreement or waiver
No pre-dispute arbitration agreement or pre-dispute joint action waiver of any right guaranteed in this section shall be valid or enforceable with respect to a dispute arising under this Act. Any determination as to the scope or manner of applicability of this section shall be made by a court, rather than an arbitrator, without regard to whether such agreement purports to delegate such determination to an arbitrator.
Whistleblower protection
An online platform may not, directly or indirectly, discharge, demote, suspend, threaten, harass, or in any other manner discriminate against an individual for reporting or attempting to report a violation of this section.
Rule of construction
Nothing in this section shall be construed to affect the application of section 230 of the Communications Act of 1934 (commonly known as "section 230 of the Communications Decency Act of 1996") (47 U.S.C. 230) to an online platform or otherwise impose on an online platform legal liability for user generated content.
Interagency Task Force on Algorithmic Processes on Online Platforms
Establishment
The Commission shall establish an interagency task force on algorithmic processes on online platforms (referred to in this section as the "Task Force") for the purpose of examining the discriminatory use of personal information by online platforms in algorithmic processes.
Membership
In general
The Task Force established under this section shall include representatives from—
the Commission;
the Department of Education;
the Department of Justice;
the Department of Labor;
the Department of Housing and Urban Development;
the Department of Commerce;
the Department of Health and Human Services;
the Department of Veterans Affairs;
the Equal Employment Opportunity Commission;
the Consumer Financial Protection Bureau;
the Federal Communications Commission;
the Federal Election Commission; and
the White House Office of Science and Technology Policy.
Chair
The Task Force shall be co-chaired by 1 representative of the Commission and 1 representative of the Department of Justice.
Staff
The Task Force shall hire such other personnel, including individuals with expertise in the intersection of civil rights and technology, as may be appropriate to enable the Task Force to perform its duties.
Study and Report
Study
The Task Force shall conduct a study on the discriminatory use of personal information by online platforms in algorithmic processes. Such study shall include the following:
Discriminatory use of personal information in the advertisement of (including the withholding of an advertisement) housing opportunities.
Discriminatory use of personal information in the advertisement of (including the withholding of an advertisement) credit, lending, or other financial services opportunities.
Discriminatory use of personal information in the advertisement of (including the withholding of an advertisement) employment opportunities.
Discriminatory use of personal information in the advertisement of (including the withholding of an advertisement) education opportunities.
Discriminatory use of personal information in the advertisement of (including the withholding of an advertisement) insurance opportunities.
Discriminatory use of personal information or biometric information by employers in the surveillance or monitoring of workers.
Discriminatory use of personal information on online platforms involved in hiring screening practices.
Discriminatory use of personal information or biometric information in education, including the use of—
student personal information for predictive forecasting on student ability or potential for purposes of admissions decisions; and
automated proctoring software that monitors, analyzes, or otherwise processes student biometric information to identify suspicious behavior, including any discriminatory outcomes associated with the use of such software.
Discriminatory use of user biometric information.
Use of personal information by disinformation campaigns for the purpose of political disenfranchisement.
Any other discriminatory use of personal information.
Report
Not later than 180 days after the date of enactment of this Act, and biennially thereafter, the Task Force shall submit to Congress a report containing the results of the study conducted under paragraph (1), together with recommendations for such legislation and administrative action as the Task Force determines appropriate.
Funding
Out of any money in the Treasury not otherwise appropriated, there are appropriated to the Commission such sums as are necessary to carry out this section. Amounts appropriated under the preceding sentence shall remain available until expended.
Enforcement
Enforcement by the Commission
Unfair or deceptive acts or practices
A violation of this Act or a regulation promulgated under this Act shall be treated as a violation of a rule defining an unfair or deceptive act or practice under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
Powers of the Commission
In general
The Commission shall enforce this Act in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this section.
Privileges and immunities
Any person who violates this Act or a regulation promulgated under this Act shall be subject to the penalties and entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).
Authority preserved
Nothing in this Act shall be construed to limit the authority of the Commission under any other provision of law.
Rulemaking
The Commission shall promulgate in accordance with section 553 of title 5, United States Code, such rules as may be necessary to carry out this Act.
Enforcement by States
Authorization
Subject to paragraph (2), in any case in which the attorney general of a State has reason to believe that an interest of the residents of the State has been or is adversely affected by the engagement of any person in an act or practice that violates this Act or a regulation promulgated under this Act, the attorney general of the State may, as parens patriae, bring a civil action on behalf of the residents of the State in an appropriate district court of the United States to—
enjoin that act or practice;
enforce compliance with this Act or the regulation;
obtain damages, civil penalties, restitution, or other compensation on behalf of the residents of the State; or
obtain such other relief as the court may consider to be appropriate.
Rights of the Commission
Notice to the Commission
In general
Except as provided in clause (iii), the attorney general of a State shall notify the Commission in writing that the attorney general intends to bring a civil action under paragraph (1) before initiating the civil action against a person subject to this Act.
Contents
The notification required by clause (i) with respect to a civil action shall include a copy of the complaint to be filed to initiate the civil action.
Exception
If it is not feasible for the attorney general of a State to provide the notification required by clause (i) before initiating a civil action under paragraph (1), the attorney general shall notify the Commission immediately upon instituting the civil action.
Intervention by the Commission
The Commission may—
intervene in any civil action brought by the attorney general of a State under paragraph (1); and
upon intervening—
be heard on all matters arising in the civil action; and
file petitions for appeal of a decision in the civil action.
Investigatory powers
Nothing in this subsection may be construed to prevent the attorney general of a State from exercising the powers conferred on the attorney general by the laws of the State to conduct investigations, to administer oaths or affirmations, or to compel the attendance of witnesses or the production of documentary or other evidence.
Action by the Commission
If the Commission institutes a civil action with respect to a violation of this Act, the attorney general of a State may not, during the pendency of the action, bring a civil action under paragraph (1) against any defendant named in the complaint of the Commission for the violation with respect to which the Commission instituted such action.
Venue; service of process
Venue
Any action brought under paragraph (1) may be brought in the district court of the United States that meets applicable requirements relating to venue under section 1391 of title 28, United States Code.
Service of process
In an action brought under paragraph (1), process may be served in any district in which the defendant—
is an inhabitant; or
may be found.
Enforcement by the Department of Justice
In general
The Attorney General may bring a civil action to enforce section 6(a), (b), (c), (e), (f), or (i) in an appropriate district court of the United States.
Coordination with the Commission
The Attorney General shall, when reasonable and appropriate, consult and coordinate with the Commission on a civil action brought under paragraph (1).
Relief
In any civil action brought under paragraph (1), the court may impose injunctive relief, declaratory relief, damages, civil penalties, restitution, and any other relief the court deems appropriate.
Enforcement by individuals
In general
Any individual alleging a violation of section 6(a), (b), or (c), or a regulation promulgated thereunder, may bring a civil action in any court of competent jurisdiction, State or Federal.
Relief
In a civil action brought under paragraph (1) in which the plaintiff prevails, the court may award—
an amount equal to $2,500 or actual damages, whichever is greater;
punitive damages;
reasonable attorney's fees and litigation costs; and
any other relief, including injunctive or declaratory relief, that the court determines appropriate.