Whether in streams, flows or feeds, the message latent in the contemporary ubiquity of dashboards, decks, and push notifications appears unequivocal: the only time today is real-time. Or to put it another way, time has, finally, become real. Rather than empty and calendrical, the sands of the hourglass that govern our activities, interests and encounters appear to have been materialized into the (albeit only marginally material) bits and bytes of information flows. This, of course, has been prophesied for some time. The difference today is that the deluge has become explicit, undeniable, and ominous. It has arrived.
The contemporary informational sublime is not sequestered to early-adopting, tweeting, tumbling, liking and circling tweens alone. These new media, channels full of noise, are nonetheless the communication apparatuses that have become the de facto platforms for all political, economic and social exchange. It is increasingly difficult to insulate oneself from the buzz of the billion (and steadily growing) daily tweets or the four billion (and steadily growing) Facebook interactions per day. Whether in the collective form of an information ticker at the bottom of the 24-hour news channel or in the individually curated form of the carefully manicured streams we stay glued to through mobile devices, prosthetics replacing phantom limbs that never existed, the passage of time can be said to be measurable through the constant stream of updates we parse, edit and, increasingly less so, process.
This rough sketch of what has become a banal truism leads us, by corollary, to a more serious matter. In addition to being global, social, ubiquitous and cheap—the four-part mantra of media theorist Clay Shirky—information today is likewise fast, very fast. To risk a serious question: what are the stakes for knowledge in this environment?
Though perhaps not true throughout history, in modernity knowledge is not identical to information. However specious it appears through a contemporary lens, modernity's notion of progress, the monotonic accumulation of knowledge over time, lends the distinction. If knowledge was the remainder when the needles of truth were mined from the haystacks of empirical detritus, held up to the light, passed around, and agreed to be without defect, progress was the growing stack of needles. Putting aside the ideological implications and compulsions of progress, this mechanical metaphor leaves us with a distinction born of the input/output ratio—much information goes in, spurts of knowledge come out—while it somewhat obfuscates another fundamental difference, and thereby its consequence: information was certain. The more time we spent collecting it, the more we had. Knowledge was not. It was contingent, subject to chance alignments, breakthroughs and moments of clarity. A more banal, yet operative, distinction is the result: information was fast, knowledge was slow. The speed differential was not simply a by-product; it was structural. Progress progressed necessarily on the backs of bouts of endurance with testing, verification, and falsification. It was not only characterized but also propelled, methodologically and ideologically, by its laboriousness, the very longness and slowness of its march.
Fine for modernity, but does this incommensurate pacing still hold? Or more instructively, what if it does not? What if the apparatuses of our information age that provide for real-time information transfer can likewise accelerate the production of knowledge? That is, what if they are able to close the gap between the time it takes to disseminate the documentation of observations and the time it takes to parse these into principles and relationships? Recent history offers little purchase. Even as progress has become something of a cuss word in academia since the mid-twentieth century, we still lack an Einsteinian theory of knowledge, a theory of knowledge at great speed. Such a theory is well beyond the potential ambit and actual ambition of the current text. Where might such a theory—a theory of real-time knowledge—be found, and why has it yet to surface?
Being at once the house of theory and the house of knowledge, the university would be the obvious place to begin regarding the former. Yet it leads us first to the latter, for it is precisely the university that is the institution most threatened by the possibility of real-time knowledge. Indeed, the university was founded expressly to divorce knowledge from both time and reality. Originally, it provided the structure to divorce the scholar from the world, from the instability of everyday life, wherein “the sleep of reason produces monsters”. More recently, it has served the role of buffering the temporality of the market, which requires a timely return on its investment.
The former dislocation dates back to the founding of the first university in Paris in the twelfth century. “The freedom of wandering is divided into two,” wrote Stephen of Tournai, an early influence on and theorist of the university, “the movement of the body through different places and the movement of the mind through different images. The curious wander with their eyes going from place to place, kingdom to kingdom, city to city, province to province… Those fickle and unstable in mind also wander… This freedom of wandering in the mind through different images tires and impedes scholars in their studies and the cloistered in their prayers.” Which is to say, as Mark Wigley has argued, the university was thus an attempt to resist “the ‘wandering mind’ before actual buildings are constructed to resist the ‘wandering body’”. Before there was even a wall erected between the city and the scholar, the imaginary architecture of the university separated the scholar from the rhythms, needs and perturbations of real-time.
Separation from the market is registered in the late eighteenth century in Jean-François Lyotard’s history, wherein industry entered into a positive feedback loop with scientific knowledge. Not only did industry begin to rely upon the techno-scientific advancements of research, but the university itself became dependent on financing from industry. In Lyotard’s words, “no technology without wealth, but no wealth without technology.” In order to reintroduce distance into this relationship of uneasy proximity, industry began to fund private research institutions, which could then themselves fund the universities. This separation freed universities to research without necessitating immediate returns, a mechanism underwritten by “the theory that research must be financed at a loss for a certain length of time in order to increase the probability of its yielding a decisive, and therefore highly profitable, innovation.” That is, the separation of the university from the market, from real-time, was itself a market strategy.
So then, where to look? Lyotard’s history supports a theory of modernity governed by legitimation in the form of performativity. The union between knowledge production and the production of capital hinged on the idea that there was a stable relationship between the input of resources into the former—the university—and the outputs generated for the latter—industry. The market-driven modern condition was thus a break with the classical mode of knowledge production: “scientists, technicians, and instruments are purchased not to find truth, but to augment power.” The Postmodern Condition, then, the title of the text and the stakes for Lyotard writing at the end of the 1970s, was framed as yet another break in the ends or “legitimation” of knowledge. By this time, Lyotard claims, the modern relationship between industry and research had expired. Whereas industry once looked to experts to provide a “scientific secret,” to produce proprietary knowledge unavailable to competitors, the postmodern condition is rather a “game of perfect information.” In such a game, extra performativity, extra production, does not come from producing new knowledge, but rather from “arranging the data in a new way,” from separating and recombining data into patterns and trends connecting that which was previously thought to be independent. “This capacity to articulate what used to be separate can be called imagination.” More importantly, “Speed is one of its properties.”
Applying this caveat—speed—to Lyotard’s subsequent claim reveals a previously overlooked prescience in his futurology. Understanding that “computerization” would result in the abstraction of knowledge production away from a traditional mode of accumulation toward the re-arranging of pieces of information, Lyotard pleaded that information be made free. Whereas the anxiety of this recommendation speaks to the historical moment of his writing, its content provides an infrastructure for evaluating the present. “Give the public free access to the memory and data banks,” he recommended, actualizing the game of perfect information. For Lyotard, this sketched “the outline of a politics that would respect both the desire for justice and the desire for the unknown.” Yet, thirty-odd years hence, it is precisely through the realization of this type of game that a political economics has developed that undermines both social justice and tolerance for uncertainty.
In the American financial industry, over 70% of market transactions today take the form of high frequency trading (HFT), and the figure is rapidly growing. Rather than trading securities based on the economic fundamentals of the underlying asset or commodity, HFT is a second order operation. It effects a game of perfect information by leveling the field between participants. Often occurring in “dark pools,” proprietary exchanges in which both buyers and sellers are deprived of price information, HFT is driven by computers on hyper-fast networks typically located in co-location centers, such as the NYSE Euronext facility in Mahwah, New Jersey, where massive server banks installed alongside the exchange’s own servers are leased to trading shops to ensure the highest speed of transaction. Complex algorithms drive thousands of automated trades per second, placing and canceling orders, re-calibrating, and repeating in order to suss out market value in sophisticated processes of trial and error. In this game of poker, the underlying value of securities dematerializes, giving way to a system that increasingly references itself. The result is a milieu of immediacy in which any kernel of information inputted immediately results in a new state that invalidates the characteristics of the previous state. To put it another way, it is a real-time environment in which the very speed of its apparatus makes it impossible to distinguish between knowledge and information.
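The place-and-cancel cycle described above can be caricatured in a few lines of code. The sketch below is a deliberately toy model, assuming a hypothetical `probe_market` routine and a market reduced to a single hidden value; real HFT systems involve order books, latency arbitrage, and far more elaborate statistics.

```python
def probe_market(true_value, low, high, rounds=20):
    """Toy model of algorithmic price discovery: post a probe order at the
    midpoint of a price range, treat a fill as evidence the probe was at or
    above the hidden value, cancel, and repeat. The algorithm converges on
    the market value without ever observing it directly."""
    for _ in range(rounds):
        probe = (low + high) / 2        # place a limit buy order at the midpoint
        filled = probe >= true_value    # the order fills only if priced at or above value
        if filled:
            high = probe                # filled: cancel and probe lower next cycle
        else:
            low = probe                 # unfilled: cancel and probe higher next cycle
    return (low + high) / 2
```

Twenty rounds of this bisection pin the hidden value to within a fraction of a cent. The point is not the arithmetic but the structure: each order exists only to extract information, and is canceled the moment it has done so.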
Whereas HFT has been around for some time, a more recent financial phenomenon makes use of another type of game of perfect information, precisely the type of game for which Lyotard lobbied. In April of this year, the London-based £25 million Derwent Capital hedge fund launched, using a proprietary program to process the content of millions of daily Twitter messages in order to predict the movement of the market. Based on work done by Derwent consultant Johan Bollen at Indiana University, the seemingly inane stream of consciousness converted into 140-character blasts that is the Twitterverse was shown to be collectively able to predict the Dow Jones Industrial Average with 87 percent accuracy. In the fund’s press release of May 16, 2011, founder Paul Hawtin is quoted as saying, “For years investors have widely accepted that financial markets are driven by fear and greed, but we’ve never before had the technology or data to be able to quantify human emotion. This is the 4th dimension.” It is precisely because this information is open to the public, because Twitter provides for a game of perfect information by way of a contemporary form of the law of large numbers, that this 4th dimension is open to a small minority of wealthy investors. Or, to repurpose Lyotard’s words, “no technology without wealth, but no wealth without technology.”
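The basic move of such sentiment trading, scoring a stream of short texts and aggregating the scores into a collective mood index, can be sketched minimally. The word lists and the `mood_score` function below are hypothetical illustrations, not Bollen's actual method, which relied on a far more sophisticated mood-tracking instrument.

```python
# Hypothetical mood lexicons; real systems use large, validated word lists.
CALM = {"calm", "confident", "steady"}
ANXIOUS = {"fear", "panic", "worried"}

def mood_score(tweets):
    """Average per-tweet mood: +1 for each calm word, -1 for each anxious word.
    Aggregated over millions of tweets a day, such a score becomes a time
    series that can be correlated against subsequent market movements."""
    total = 0
    for tweet in tweets:
        words = tweet.lower().split()
        total += sum(w in CALM for w in words) - sum(w in ANXIOUS for w in words)
    return total / max(len(tweets), 1)
```

The triviality of the scoring is the point: each individual tweet is nearly worthless as information, and it is only the aggregation, the law of large numbers, that converts the stream into a tradable signal.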
Given the recent onslaught of crises toting the adjective “financial,” the claim that finance, in part, drives history could perhaps be considered among the least controversial claims of this text. In the context of such a claim, HFT would not simply be a hermetic game played in an isolated house of mirrors, but a real game with real stakes that reach out into the most remote parts of the world. More global, and thereby more troubling, is sentiment-based trading. Using systems built on knowledge established in universities to convert a real-time global archive of public information into for-profit transactions, sentiment trading not only potentially participates in the writing of history through its fallout; it is precisely a historiographical technology itself. Gathering and parsing millions of statements a day, sentiment tracking could perhaps be described, in the terms of another strain of post-war French thought, as a means by which to draw out the discursive formations of public discussion. Activated in real-time, sentiment tracking is thus perhaps the actualization of a cocktail of mid-nineteenth and mid-twentieth century fantasies: the conversion of social facts into historical facts in real-time.
Whereas the “avalanche of numbers” of early nineteenth-century France, or the impossibly long scrolls of imperial accounting that accrued in untouched piles in late nineteenth-century Britain, overran the capacity of their receivers to process them, today’s technology allows massive streams of information to become tractable, and thereby useful. The university has not only largely failed to embrace such processing technologies; it is precisely the institution developed to resist such speed. As such, the epicenter of the production and exploitation of real-time knowledge is not the house of knowledge, but finance. The stakes, and the opportunity, are rendered in high relief when recalling Wigley’s alert, now two decades old, that “the university does not examine the foundation of its own foundation. Its architecture is unstable, necessarily erected over an abyss.” Supported by the observation that “the networks of communication have become the new house of theory,” Wigley’s remarks can be read as an invitation to re-theorize the space in which knowledge is produced. This raises the question: where to start looking?