Calendar No. 382
[Report No. 117–114]
IN THE SENATE OF THE UNITED STATES
July 29, 2021
Mr. Portman (for himself, Mr. Peters, Ms. Rosen, Ms. Hassan, and Mr. Warnock) introduced the following bill; which was read twice and referred to the Committee on Homeland Security and Governmental Affairs
May 24, 2022
Reported by Mr. Peters, with an amendment
Omit the part struck through and insert the part printed in italic
To establish the National Deepfake and Digital Provenance Task Force, and for other purposes.
This Act may be cited as the "Deepfake Task Force Act".
National deepfake and digital provenance task force
In this section:
Digital content forgery
The term digital content forgery means the use of emerging technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead.
Digital content provenance
The term digital content provenance means the verifiable chronology of the origin and history of a piece of digital content, such as an image, video, audio recording, or electronic document.
Eligible entity
The term eligible entity means—
a private sector or nonprofit organization; or
an institution of higher education.
Institution of higher education
The term institution of higher education has the meaning given the term in section 101 of the Higher Education Act of 1965 (20 U.S.C. 1001).
Relevant congressional committees
The term relevant congressional committees means—
the Committee on Homeland Security and Governmental Affairs of the Senate; and
the Committee on Homeland Security and the Committee on Oversight and Reform of the House of Representatives.
Secretary
The term Secretary means the Secretary of Homeland Security.
Task Force
The term Task Force means the National Deepfake and Digital Provenance Task Force established under subsection (b)(1).
Establishment of task force
The Secretary, in coordination with the Director of the Office of Science and Technology Policy, shall establish a task force, to be known as the National Deepfake and Digital Provenance Task Force, to—
investigate the feasibility of, and obstacles to, developing and deploying standards and technologies for determining digital content provenance;
propose policy changes to reduce the proliferation and impact of digital content forgeries, such as the adoption of digital content provenance and technology standards; and
serve as a formal mechanism for interagency coordination and information sharing to facilitate the creation and implementation of a national strategy to address the growing threats posed by digital content forgeries.
The following shall serve as co-chairpersons of the Task Force:
The Secretary or a designee of the Secretary.
The Director of the Office of Science and Technology Policy or a designee of the Director.
The Task Force shall be composed of 12 members, of whom—
4 shall be representatives from the Federal Government, including the co-chairpersons of the Task Force;
4 shall be representatives from institutions of higher education; and
4 shall be representatives from private or nonprofit organizations.
Appointment
Not later than 120 days after the date of enactment of this Act, the co-chairpersons of the Task Force shall appoint members to the Task Force in accordance with subparagraph (B) from among technical experts in—
secure digital content and delivery;
Term of appointment
The term of a member of the Task Force shall end on the date described in subsection (g)(1).
Any vacancy occurring in the membership of the Task Force shall be filled in the same manner in which the original appointment was made.
Expenses for non-federal members
Members of the Task Force described in clauses (ii) and (iii) of subparagraph (B) shall be allowed travel expenses, including per diem in lieu of subsistence, at rates authorized for employees under subchapter I of chapter 57 of title 5, United States Code, while away from their homes or regular places of business in the performance of services for the Task Force.
The Task Force shall develop a coordinated plan to—
reduce the proliferation and impact of digital content forgeries, including by exploring how the adoption of a digital content provenance standard could assist with reducing the proliferation of digital content forgeries;
develop mechanisms for content creators to—
cryptographically certify the authenticity of original media and non-deceptive manipulations; and
enable the public to validate the authenticity of original media and non-deceptive manipulations to establish digital content provenance; and
increase the ability of internet companies, journalists, watchdog organizations, other relevant entities, and members of the public to—
meaningfully scrutinize and identify potential digital content forgeries; and
relay trust and information about digital content provenance to content consumers.
The plan required under paragraph (1) shall include the following:
A Government-wide research and development agenda to—
improve technologies and systems to detect digital content forgeries; and
relay information about digital content provenance to content consumers.
An assessment of the feasibility of, and obstacles to, the deployment of technologies and systems to capture, preserve, and display digital content provenance.
An assessment of the feasibility of, and challenges in, distinguishing between—
benign or helpful alterations to digital content; and
intentionally deceptive or obfuscating alterations to digital content.
A discussion of best practices, including any necessary standards, for the adoption and effective use of technologies and systems to determine digital content provenance and detect digital content forgeries.
Conceptual proposals for necessary research projects and experiments to further develop successful technology to ascertain digital content provenance.
Proposed policy changes, including changes in law, to—
incentivize the adoption of technologies, systems, open standards, or other means to detect digital content forgeries and determine digital content provenance; and
reduce the incidence, proliferation, and impact of digital content forgeries.
Recommendations for models for public-private partnerships to fight disinformation and reduce digital content forgeries, including partnerships that support and collaborate on—
industry practices and standards for determining digital content provenance;
digital literacy education campaigns and user-friendly detection tools for the public to reduce the proliferation and impact of disinformation and digital content forgeries;
industry practices and standards for documenting relevant research and progress in machine learning; and
the means and methods for identifying and addressing the technical and financial infrastructure that supports the proliferation of digital content forgeries, such as inauthentic social media accounts and bank accounts.
An assessment of privacy and civil liberties requirements associated with efforts to deploy technologies and systems to determine digital content provenance or reduce the proliferation of digital content forgeries, including statutory or other proposed policy changes.
A determination of metrics to define the success of—
technologies or systems to detect digital content forgeries;
technologies or systems to determine digital content provenance; and
other efforts to reduce the incidence, proliferation, and impact of digital content forgeries.
In carrying out subsection (c), the Task Force shall consult with the following:
The Director of the National Science Foundation.
The National Academies of Sciences, Engineering, and Medicine.
The Director of the National Institute of Standards and Technology.
The Director of the Defense Advanced Research Projects Agency.
The Director of the Intelligence Advanced Research Projects Activity of the Office of the Director of National Intelligence.
The Secretary of Energy.
The Secretary of Defense.
The Attorney General.
The Secretary of State.
The Federal Trade Commission.
The United States Trade Representative.
Representatives from private industry and nonprofit organizations.
Representatives from institutions of higher education.
Such other individuals as the Task Force considers appropriate.
Staff of the Task Force shall be composed of detailees with expertise in artificial intelligence or related fields from—
the Department of Homeland Security;
the Office of Science and Technology Policy;
the National Institute of Standards and Technology; or
any other Federal agency the co-chairpersons of the Task Force consider appropriate with the consent of the head of the Federal agency.
The co-chairpersons of the Task Force may enter into an agreement with an eligible entity for the temporary assignment of employees of the eligible entity to the Task Force in accordance with this paragraph.
Application of ethics rules
An employee of an eligible entity assigned to the Task Force under subparagraph (A)—
shall be considered a special Government employee for the purpose of Federal law, including—
chapter 11 of title 18, United States Code; and
the Ethics in Government Act of 1978 (5 U.S.C. App.); and
notwithstanding section 202(a) of title 18, United States Code, may be assigned to the Task Force for a period of not more than 2 years.
An agreement entered into with an eligible entity under subparagraph (A) shall require the eligible entity to be responsible for any costs associated with the assignment of an employee to the Task Force.
The co-chairpersons of the Task Force may terminate the assignment of an employee to the Task Force under subparagraph (A) at any time and for any reason.
Task force reports
Not later than 1 year after the date on which all of the appointments have been made under subsection (b)(2)(C), the Task Force shall submit to the President and the relevant congressional committees an interim report containing the findings, conclusions, and recommendations of the Task Force.
The report required under subparagraph (A) shall include specific recommendations for ways to reduce the proliferation and impact of digital content forgeries, including the deployment of technologies and systems to determine digital content provenance.
Not later than 180 days after the date of the submission of the interim report under paragraph (1)(A), the Task Force shall submit to the President and the relevant congressional committees a final report containing the findings, conclusions, and recommendations of the Task Force, including the plan developed under subsection (c).
Each report submitted under this subsection—
shall be made publicly available by the Task Force;
shall be produced in an unclassified form; and
may include a classified annex.
The Task Force shall terminate on the date that is 90 days after the date on which the Task Force submits the final report under subsection (f)(2).
Upon the termination of the Task Force under paragraph (1), each record of the Task Force shall become a record of the National Archives and Records Administration.