
CFPB Acts to Protect the Public from Black-Box Credit Models Using Complex Algorithms

Companies relying on complex algorithms must provide specific and accurate explanations for denying applications

Washington, D.C. — Today, the Consumer Financial Protection Bureau (CFPB) confirmed that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models using complex algorithms. The CFPB published a Consumer Financial Protection Circular to remind the public, including those responsible for enforcing federal consumer financial protection law, of creditors’ adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).

“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said CFPB Director Rohit Chopra. “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”

Data harvesting on Americans has become voluminous and ubiquitous, giving firms the ability to know highly detailed information about their customers before they ever interact with them. Many firms across the economy rely on these detailed datasets to power their algorithmic decision-making, which is sometimes marketed as “artificial intelligence.” The information gleaned from data analytics has a broad range of commercial uses by financial firms, including for targeted advertising and in credit decision-making.

Law-abiding financial companies have long used advanced computational methods as part of their credit decision-making processes, and they have been able to provide the rationales for their credit decisions. However, some creditors may make credit decisions based on the outputs from complex algorithms, sometimes called “black-box” models. The reasoning behind some of these models’ outputs may be unknown to the model’s users, including the model’s developers. With such models, adverse action notices that meet ECOA’s requirements may not be possible.

ECOA protects individuals and businesses against discrimination when seeking, applying for, and using credit. To help ensure a creditor does not discriminate, ECOA requires that a creditor provide a notice when it takes an adverse action against an applicant, which must contain the specific and accurate reasons for that adverse action. Creditors cannot lawfully use technologies in their decision-making processes if using them means that they are unable to provide these required explanations.
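To make the notice requirement concrete, consider a minimal, hypothetical sketch of the kind of reason-code logic a creditor with an interpretable scoring model can apply: rank each input's contribution to the applicant's score against a baseline profile and report the largest negative contributors as the principal reasons for denial. The feature names, weights, cutoff, and baseline below are invented for illustration, and this is one common industry technique rather than a CFPB-prescribed method; the Circular's point is that creditors using more opaque models must still be able to produce explanations of this specificity.

```python
# Illustrative sketch only: deriving "specific reasons" for an adverse
# action from a simple, interpretable scoring model. All feature names,
# weights, and thresholds are hypothetical, not from any real model.
import numpy as np

FEATURES = ["credit_utilization", "months_since_delinquency",
            "inquiries_last_6mo", "length_of_credit_history"]

# Hypothetical model weights: positive weights raise the score.
WEIGHTS = np.array([-4.0, 0.05, -0.8, 0.03])
APPROVAL_CUTOFF = 0.5


def score(x: np.ndarray) -> float:
    """Logistic score in [0, 1] for an applicant's feature vector."""
    return 1.0 / (1.0 + np.exp(-(WEIGHTS @ x)))


def principal_reasons(x: np.ndarray, baseline: np.ndarray, top_n: int = 2):
    """Rank features by how much they pulled the applicant's score
    below a baseline (e.g., an approved-applicant profile). The most
    negative contributions become the stated reasons for denial."""
    contributions = WEIGHTS * (x - baseline)
    worst_first = np.argsort(contributions)  # most negative first
    return [FEATURES[i] for i in worst_first[:top_n]
            if contributions[i] < 0]


applicant = np.array([0.92, 3.0, 5.0, 24.0])
baseline = np.array([0.30, 36.0, 1.0, 90.0])

if score(applicant) < APPROVAL_CUTOFF:
    print("Adverse action. Principal reasons:",
          principal_reasons(applicant, baseline))
```

With a model this simple, the mapping from inputs to stated reasons is direct and auditable. A creditor whose model offers no comparable way to attribute an adverse decision to specific inputs cannot generate the explanation ECOA requires.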

Today’s Circular makes clear that:

  • Federal consumer financial protection laws and adverse action requirements should be enforced regardless of the technology used by creditors. For example, ECOA does not permit creditors to use technology that prevents them from providing specific and accurate reasons for adverse actions. Creditors’ use of complex algorithms should not limit enforcement of ECOA or other federal consumer financial protection laws.

  • Creditors cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new. Creditors who use complex algorithms—including artificial intelligence or machine learning technologies—to engage in credit decisions must still provide a notice that discloses the specific, principal reasons for taking adverse actions. There is no exception for violating the law because a creditor is using technology that has not been adequately designed, tested, or understood.
