I've long been engaged in a debate with close family members over objective or concrete evidence and its often-assumed absence in literary criticism. So I was struck by Ramsay's argument, in particular by his definition of data mining. According to Ramsay, who borrows a definition from Witten, "data mining is the extraction of implicit, previously unknown, and potentially useful information from data." This data, processed through algorithms, distills into "regularities" and "patterns" (185). The passage continues, "Many patterns will be banal and uninteresting. Others will be spurious, contingent on accidental coincidences in the particular dataset used. And real data is imperfect: some parts are garbled, some missing. Anything that is discovered will be inexact." This inexactness made me think of Derrida's Archive Fever, notably how the data in an archive is also far from perfect, subject to irregularity, and never approached from a neutral, purely rational, or objective standpoint. Interpretation will always be encoded in the human, in the moment, from our place of cultural understanding. It will never occur in a vacuum; it is always informed by prior data and previous patterns or discoveries.
Likewise, Ramsay comments on this lack of neutrality, as several of my classmates have already discussed. He states that software cannot be neutral because "there is no level at which assumption disappears," and argues instead that this "lack of neutrality" should be "assert[ed]" with "candor, so that the demonstrably non-neutral act of interpretation can occur" (182). If I see Derrida as looking to encode the cultural archive, I see Ramsay attempting to show how Derrida's truisms (or at least what I take as truth from Archive Fever) also function on a smaller scale, in data extraction. Both comment on the subjective human process of patterning and privileging information.