Degenerate epistemology

Luciano Floridi

pp. 1-3

Science, especially physics, has taught us to be very cautious about our naïve certainties (“that's the way it is!”), everyday intuitions (“it must be that way!”), and commonsensical rejections (“that's impossible!”). While reading this issue of Philosophy & Technology, just recall that we are all travelling at about 100,000 km/h around the sun. Indeed, we are getting so used to contemporary science supporting extraordinary claims that abrasively clash with what we would consider plausible that we might overreact and be inclined to believe almost anything. If tomorrow some credible source told us that unicorns had been biologically engineered in some lab, how many of us would be utterly incredulous? So when scientists come up with some incredible results, what should we believe? The problem is exacerbated by the fact that, these days, experiments churn out gazillions of data. The Large Hadron Collider, currently the largest and highest-energy particle accelerator, pumps out approximately 15 petabytes of data per year, which require a dedicated computational grid to be refined, analysed, and put to proper use. The more data and analysis we need, the more likely it is that something might go wrong in the process. Quality standards and safety measures are serious issues in the knowledge industry too.

Publication details

DOI: 10.1007/s13347-012-0067-6

Full citation:

Floridi, L. (2012). Degenerate epistemology. Philosophy & Technology 25 (1), pp. 1-3.
