Open Access. Powered by Scholars. Published by Universities.®

Computer Engineering Commons

Articles 1 - 3 of 3

Full-Text Articles in Computer Engineering

Http 1.2: Distributed Http For Load Balancing Server Systems, Graham M. O'Daniel Jun 2010

Master's Theses

Content hosted on the Internet must appear robust and reliable to the clients that depend on it. As more clients come to rely on content from a source, that source can be subjected to high levels of load. A number of solutions, collectively called load balancers, try to solve the load problem through various means. All of these solutions are workarounds for problems inherent in the medium by which content is served, which limits their effectiveness. HTTP, or Hypertext Transfer Protocol, is the dominant mechanism behind hosting content on the Internet through websites. The entirety of …


Web-Dinar: Web Based Diagnosis Of Network And Application Resources In Disaster Response Systems, Kartik Deshpande Jan 2010

Masters Theses 1911 - February 2014

Disaster management and emergency response mechanisms have come of age post-9/11. Paper-based triaging and evacuation is slowly being replaced with far more advanced mechanisms using remote clients (laptops, thin clients, PDAs), RFIDs, etc. This reflects a modern trend to deploy Information Technology (IT) in disaster management. IT elements provide a great deal of flexibility and seamlessness in the communication of information. The information flowing is so critical that loss of data is not at all acceptable. Loss of data would mean loss of critical medical information portraying the disaster scenario. This would amount to a wrong picture being …


Some Skepticism About Search Neutrality, James Grimmelmann Jan 2010

Faculty Scholarship

In the last few years, some search-engine critics have suggested that dominant search engines (i.e., Google) should be subject to “search neutrality” regulations. By analogy to network neutrality, search neutrality would require even-handed treatment in search results: it would prevent search engines from playing favorites among websites. Academics, Google competitors, and public-interest groups have all embraced search neutrality.

Despite this sudden interest, the case for search neutrality is too muddled to be convincing. While “neutrality” is an appealing-sounding principle, it lacks a clear definition. This essay explores no fewer than eight different meanings that search-neutrality advocates have given the term. …