Full-Text Articles in Law
The Input Fallacy, Talia B. Gillis
Faculty Scholarship
Algorithmic credit pricing threatens to discriminate against protected groups. Traditionally, fair lending law has addressed such threats by scrutinizing inputs. But in the world of algorithms, input scrutiny has become a fallacy.
Using a rich dataset of mortgages, I simulate algorithmic credit pricing and demonstrate that input scrutiny fails to address discrimination concerns and threatens to create an algorithmic myth of colorblindness. The ubiquity of correlations in big data combined with the flexibility and complexity of machine learning means that one cannot rule out the consideration of protected characteristics, such as race, even when one formally excludes them. Moreover, using …