
What Is Section 1557 and How Can You Prepare for It?

As healthcare organizations increasingly adopt software tools, including those incorporating AI, it’s critical to ensure these tools deliver equitable, fair, safe and effective outcomes for all patients. As part of such efforts, healthcare providers must comply with nondiscrimination laws, such as Section 1557 of the Patient Protection and Affordable Care Act (PPACA).

Please note that while covered entities must comply by May 1, 2025, shifting policy priorities, the removal of several online compliance resources, and a significant department reorganization have made HHS’ future enforcement intentions less clear.

Important Disclaimer: The information provided here is intended as general guidance and should not be considered legal advice. Readers should consult with legal counsel for specific questions regarding their obligations under the rule.

Understanding Section 1557 of the ACA

What is Section 1557?
Section 1557 is a civil rights law enforced by the U.S. Department of Health and Human Services (HHS). It prohibits discrimination in health programs and activities that receive federal financial assistance, are administered by an executive agency, or are established under Title I of the ACA.

The rule explicitly addresses discrimination based on race, color, national origin (including language access), sex (including pregnancy, sexual orientation and gender identity), age and disability.

It also expands on prior regulations by emphasizing protections against discrimination in the use of algorithms, clinical decision support tools and AI in healthcare.

How does it impact your healthcare facility?
Section 1557 mandates that all covered entities — including radiologists and radiology practices — be accountable for preventing discrimination, including any bias arising from the use of AI. The final rule applies to any healthcare entity receiving federal funding, such as hospitals, clinics and health insurance providers. Some important aspects:

  • AI and clinical algorithms: The rule directly addresses the use of “patient care decision support tools,” including clinical algorithms and AI-based tools in healthcare. The use of these tools must not result in discriminatory outcomes.
  • Accountability: Healthcare providers, including radiologists, are responsible for ensuring that tools like AI software don’t create or perpetuate bias.
  • Language access and accessibility: Facilities must ensure meaningful access for individuals with limited English proficiency and make services accessible for individuals with disabilities.
  • Nondiscrimination policies: Healthcare practitioners’ policies, procedures and training must be updated to align with Section 1557 requirements.

Key compliance deadlines

It’s important to note this timing reflects what was outlined by the Biden Administration and could change under the Trump Administration. We’ll update this information as more details become available. 

  • Nov. 2, 2024
    • Assign a Section 1557 coordinator and provide a notice of nondiscrimination to patients and the public.
  • May 1, 2025
    • Make reasonable efforts to identify uses of patient care decision support tools in your health programs or activities that employ input variables or factors measuring race, color, national origin, sex, age or disability.
    • Make reasonable efforts to mitigate the risk of discrimination arising from such use.
  • July 5, 2025
    • Ensure patient care decision support tools are nondiscriminatory. 
    • Develop and implement nondiscrimination policies and procedures. 
    • Train staff on new policies and procedures. 
    • Provide notice of available language assistance services and auxiliary aids.
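
As a minimal illustration of the May 1 identification step above, the sketch below scans a hypothetical inventory of patient care decision support tools and flags any that take a protected characteristic as an input variable. All tool names and input fields are invented for illustration; a real audit requires clinical, technical and legal review of each tool.

```python
# Hypothetical inventory audit: flag patient care decision support tools
# whose input variables measure characteristics protected under Section 1557.
# Tool names and input lists below are invented for illustration only.

PROTECTED_CHARACTERISTICS = {"race", "color", "national_origin", "sex", "age", "disability"}

tool_inventory = [
    {"name": "chest_ct_triage", "inputs": ["image", "age", "sex"]},
    {"name": "stroke_alert", "inputs": ["image", "scan_time"]},
    {"name": "risk_score_v2", "inputs": ["race", "age", "lab_values"]},
]

def flag_tools(inventory):
    """Return (tool name, sorted protected factors) for each tool that
    uses any protected characteristic as an input variable."""
    flagged = []
    for tool in inventory:
        used = PROTECTED_CHARACTERISTICS.intersection(tool["inputs"])
        if used:
            flagged.append((tool["name"], sorted(used)))
    return flagged

for name, factors in flag_tools(tool_inventory):
    print(f"{name}: review use of {', '.join(factors)}")
```

A flagged tool is not automatically noncompliant; the rule requires identifying such uses and then making reasonable efforts to mitigate any resulting risk of discrimination.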

What Actions Should You Take to Prepare?

To comply with Section 1557, healthcare facilities should:

  • Conduct governance and compliance reviews: Evaluate governance processes for AI deployment and monitoring. Document policies and procedures for addressing concerns about discrimination in clinically deployed AI algorithms.
  • Identify and mitigate bias: Understand the potential for bias in clinical algorithms and AI tools. For example:
    • Investigate the data sources used for AI training to ensure representation.
    • Identify tools that may utilize race, sex or disability factors in a way that could produce inequitable outcomes.
  • Educate and train staff: Train clinicians and staff on recognizing potential bias in AI outputs and emphasize the need for professional judgment in decision-making.
  • Enhance accessibility: Provide translation and interpretation services for patients with limited English proficiency and accommodations for individuals with disabilities.
  • Engage partners: Ensure third-party tools comply with Section 1557 by asking targeted questions about their compliance practices.
  • Monitor developments: Despite the recent executive order narrowing gender definitions, organizations should still prioritize equitable care, review their policies and provide training to uphold nondiscrimination standards while anticipating potential legal outcomes.
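
One way to approach the “investigate the data sources used for AI training to ensure representation” step is a simple demographic comparison between a tool’s validation dataset and a reference patient population. The sketch below is a hedged illustration under invented numbers; the groups, counts and tolerance threshold are all hypothetical, and real representation audits involve far more than proportion checks.

```python
# Hypothetical representation check: compare the demographic mix of an AI
# validation dataset against a reference patient population. All numbers
# and group labels are invented; real audits need clinical and legal review.

def representation_gaps(dataset_counts, reference_shares, tolerance=0.05):
    """Return groups whose share of the dataset deviates from the reference
    population share by more than `tolerance` (absolute proportion)."""
    total = sum(dataset_counts.values())
    gaps = {}
    for group, ref_share in reference_shares.items():
        share = dataset_counts.get(group, 0) / total
        if abs(share - ref_share) > tolerance:
            gaps[group] = round(share - ref_share, 3)
    return gaps

# Invented example: group_a is over-represented relative to the reference.
dataset = {"group_a": 700, "group_b": 200, "group_c": 100}
reference = {"group_a": 0.60, "group_b": 0.25, "group_c": 0.15}

print(representation_gaps(dataset, reference))
```

A gap flagged this way would prompt a closer look at how the training data were sourced and whether the tool’s performance holds up for the under-represented groups.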

Stay Current on Section 1557 Developments

The American College of Radiology is actively monitoring developments at the HHS Office for Civil Rights (OCR) and keeping members informed.


Amalia Schreier
Senior Vice President, Regulatory Affairs and Legal
Amalia Schreier serves as the Senior Vice President of Regulatory Affairs and Legal at Aidoc, guiding our company and product regulatory strategies and ensuring alignment with AI-focused medical device compliance requirements. Since her tenure began, she has streamlined our FDA clearance processes, emphasizing a meticulous approach that underscores our commitment to product and clinical quality. With a solid foundation from her legal background and leadership roles in AI startup regulatory departments, Schreier brings invaluable insights and expertise to our regulatory framework. Prior to her tech world experience, she worked as a human-rights lawyer and legal policy scholar, and holds a BA and LLM in law from the Hebrew University of Jerusalem.
Elinor Kaminer Goldfainer
Team Lead, Regulatory Affairs
Elinor Kaminer Goldfainer is the Regulatory Affairs Team Lead at Aidoc, where she oversees regulatory strategy and compliance for AI-driven medical imaging solutions. With a robust legal background, including an LL.B. and an LL.M. from Tel Aviv University, and experience in human rights law, she brings a unique perspective to the intersection of technology and healthcare.