Open Access. Powered by Scholars. Published by Universities.®

Civil Rights and Discrimination Commons

Articles 1 - 3 of 3

Full-Text Articles in Civil Rights and Discrimination

Inequality And The Mortgage Interest Deduction, Kyle Rozema, Daniel J. Hemel Jan 2017

Scholarship@WashULaw

The mortgage interest deduction is often criticized for contributing to after-tax income inequality. Yet the effects of the mortgage interest deduction on income inequality are more nuanced than the conventional wisdom would suggest. We show that the mortgage interest deduction causes high-income households (i.e., those in the top 10% and top 1%) to bear a larger share of the total tax burden than they would if the deduction were repealed. We further show that the effect of the mortgage interest deduction on income inequality is highly sensitive to the alternative scenario against which the deduction is evaluated. These findings demonstrate …


Data-Driven Discrimination At Work, Pauline Kim Jan 2017

Scholarship@WashULaw

A data revolution is transforming the workplace. Employers are increasingly relying on algorithms to decide who gets interviewed, hired, or promoted. Although data algorithms can help to avoid biased human decision-making, they also risk introducing new sources of bias. Algorithms built on inaccurate, biased, or unrepresentative data can produce outcomes biased along lines of race, sex, or other protected characteristics. Data mining techniques may cause employment decisions to be based on correlations rather than causal relationships; they may obscure the basis on which employment decisions are made; and they may further exacerbate inequality because error detection is limited and feedback …


Auditing Algorithms For Discrimination, Pauline Kim Jan 2017

Scholarship@WashULaw

This Essay responds to the argument by Joshua Kroll et al. in Accountable Algorithms, 165 U. Pa. L. Rev. 633 (2017), that technical tools can be more effective than insisting on transparency in ensuring the fairness of algorithms. When it comes to combating discrimination, technical tools alone cannot prevent discriminatory outcomes. Because the causes of bias often lie not in the code but in broader social processes, techniques like randomization or predefined constraints on the decision process cannot guarantee the absence of bias. Even the most carefully designed systems may inadvertently encode preexisting prejudices or reflect structural bias. For this …