Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Articles 1 - 4 of 4

Full-Text Articles in Entire DC Network

The Inconsistent Work Of Web Filters: Mapping Information Access In Alabama Public Schools And Libraries, Chris Peterson, Shannon M. Oltmann, Emily J. M. Knox Jan 2017

Information Science Faculty Publications

Recent popular and academic discussions regarding the Internet have raised the question of whether and how networked intermediaries have (dis)integrating social effects. In this study, we use public records of Internet filter configurations in Alabama public schools and libraries to show how different institutions implement nominally consistent content standards inconsistently. We argue that these varying implementations are both significant and troubling for two reasons: first, they overreach the stated goals of the legislation with which they in principle comply; second, they may contribute to a broader epistemic breakdown by fragmenting the kind of information made available through and …


Towards Merging Binary Integer Programming Techniques With Genetic Algorithms, Reza R. Zamani Jan 2017

Faculty of Engineering and Information Sciences - Papers: Part B

No abstract provided.


Auditing Algorithms For Discrimination, Pauline Kim Jan 2017

Scholarship@WashULaw

This Essay responds to the argument by Joshua Kroll et al. in Accountable Algorithms, 165 U. Pa. L. Rev. 633 (2017), that technical tools can be more effective in ensuring the fairness of algorithms than insisting on transparency. When it comes to combating discrimination, technical tools alone will not be able to prevent discriminatory outcomes. Because the causes of bias often lie not in the code but in broader social processes, techniques like randomization or predefining constraints on the decision process cannot guarantee the absence of bias. Even the most carefully designed systems may inadvertently encode preexisting prejudices or reflect structural bias. For this …


Data-Driven Discrimination At Work, Pauline Kim Jan 2017

Scholarship@WashULaw

A data revolution is transforming the workplace. Employers are increasingly relying on algorithms to decide who gets interviewed, hired, or promoted. Although data algorithms can help avoid biased human decision-making, they also risk introducing new sources of bias. Algorithms built on inaccurate, biased, or unrepresentative data can produce outcomes biased along lines of race, sex, or other protected characteristics. Data mining techniques may cause employment decisions to be based on correlations rather than causal relationships; they may obscure the basis on which employment decisions are made; and they may further exacerbate inequality because error detection is limited and feedback …