volume 6
march 2004

Going digital?

by Hans Durrer
  René Magritte's painting "The Treason of Images" (1928-'29; Los Angeles County Museum of Art)

Only four years ago, the Internet provider AOL merged with Time Warner, creating a huge global media and entertainment group. The new company, however, did not live up to the great expectations. Over the following years its debt spiralled to a depth of 30 billion dollars. In September 2003 the board of directors confirmed that the company would change its name back to Time Warner. At the time of the merger, Hans Durrer wrote this essay, which we reprint here for its early doubts about the whole operation.

 
1 Predicting the future of the Internet. When, in January 2000, it was announced that AOL, the world's biggest online service, and Time Warner, the biggest media conglomerate in the world, planned to merge, commentaries spoke of a new era, an online revolution. What had happened? A mega-fusion had occurred and would in a short time create a new type of business conglomerate in which AOL would contribute 22 million customers, and Time Warner would provide its cable network as well as "content": news, magazines, music, film and television, that is. Distribution opportunities would be endless: AOL customers would get Time Warner magazines at special rates, or they could download Warner music or Warner movies; conversely, magazine subscribers would get special offers if they were AOL members — welcome to the net world of AOL Time Warner Inc., where customers hop — online and offline — from offer to offer. Such, according to the German weekly Der Spiegel (Bredow and Jung, 2000: 93), is the vision of AOL's Steve Case and Time Warner's Gerald Levin. What may sound like paradise to professional salespeople may look more like a nightmare to those who fear a so far unparalleled uniformity looming on the horizon.
  Clearly, much is changing in the corporate world, and it does so at breathtaking speed. It is speed, more than anything else, that characterises our time. "We want the world and we want it now" is no longer limited to pubescent thinking and adolescent longings. It has become the norm. Immediate satisfaction, instantaneous gratification: that, if we are to believe the prophets of unlimited consumption, is what we are all here for. Are we really? An almost pointless question when one is surrounded by opportunities for online shopping, online travel, online banking, online auctions, and the like. However, are we at a turning point? Are we, as Newsweek (1999: 47) claims:
  "... at the beginning of a new way of working, shopping, playing and communicating. At Newsweek we're calling this phenomenon e-life, and it's just in time. Because the day is approaching when no one will describe the digital, Net-based, computer-connected gestalt with such a transitory term. We'll just call it life."
  The problem with predicting the future has always been that the future refuses to give itself to predictions. In addition, human beings are known to overestimate the importance of the times they are living in — this is only natural, since we all believe — at least until we turn thirty-nine and a half — that the world came into being when we were born. This essay will look at how the "digital age" has come about, and it will attempt to put the prevailing euphoria into perspective. It will argue that the impact of new technology will be far less significant than its prophets predict — centuries-old power structures, as well as the human condition, are very unlikely to change dramatically because of some new, fascinating, and convenient tools. Yet things do change — change is, after all, the only constant in life — and the more subtle changes risk going unnoticed, as this essay will also aim to demonstrate.
2 The network of networks. The Internet, "the network of networks," is said to be "amazing, uncontrolled, vast, an essential information tool, interactive and multi-media, inexpensive and easy to use" (Paul, 1996: 1-2). Although one can easily agree with such a characterisation, it is difficult to see how "uncontrolled" and "vast" can be compatible with the notion of an "essential information tool," which would, more likely, emphasise reliability over size. It goes without saying that the growth of the Internet will, most probably, lead to more of the "uncontrolled" and "vast" variety, and might, eventually, turn it into a widely appreciated entertainment tool like, say, television or video. When speaking of the size of the Internet, one needs to bear in mind that one is referring to "a phenomenon of small isolated countries" (Batty and Barr, 1994; cited in: Nicholas and Williams, 1999: 3). A calculation based on the number of hosts per 1,000 people showed Northern Europe and Switzerland at the forefront of connectivity. The overall inequality is striking. Victor Keegan (1997; cited in: Nicholas and Williams, 1999) of The Guardian notes that 96% of the hosts — geographically speaking — are located within the rich 27-nation OECD area.
  So what exactly is the Internet, and how did it come about? As is often the case, military concerns gave birth to this new technology. When, in 1957, the USSR launched Sputnik, there was concern in the United States that Americans might fall behind in the technological race between the two superpowers. The Department of Defense set up ARPA, the Advanced Research Projects Agency, "to ensure that the country stayed at the forefront of technology" (Jonscher, 1999: 157). Computers were bought and used for defence-related issues, and one fine day they were connected together: ARPANET, the direct predecessor of the Internet, was born. Although e-mail was already established at that stage, it took until the early 1990s for the general public to become aware that something extraordinary was going on.
  In 1989, the "World Wide Web" was conceived by Tim Berners-Lee at CERN, the European Particle Physics Lab near Geneva, Switzerland. Essentially, the Web was an extension of the already existing e-mail. As Charles Jonscher explains (1999: 161):
  "If users have mailboxes, at which they collect the private messages sent to them, why not give them "web sites," at which they can leave public messages for anyone who "visits" to pick up. Furthermore, there would be many interconnections called hyperlinks between pages of data, programmed by the creator of the site to enable visitors to jump to other websites. The hyperlinks could be to anywhere else on the Net: the Web was to be worldwide. The visitors would be "surfers," people going from site to site on the Internet to see what others had posted for them to see. The Internet would cease to be just a private domain where people sent each other messages — although that would still happen — it would become a very public domain in which, in effect, anyone could become a publisher. Put up your news on your website and then just sit back and wait for visits."
  The rapid, and — in the technological field — so far unprecedented (Jonscher, 1999: 161), growth in popularity that the Internet enjoys has been attributed "largely to the development of the World Wide Web and graphical browsers" (Stein, 1999: 10). This view however is not undisputed. Brian Winston (1997: 335), for example, states:
  "From the very beginning it has been clear that the most unambiguously valuable facility provided by the net is e-mail."
  If we were to believe recent media outputs (Newsweek, 1999; Bredow and Jung, 2000; Der Spiegel, 2000), the dominating factor will soon be e-commerce. We are told that there are no limits to what the Internet has on offer, and we are, furthermore, told that the new economy is based on total transparency. Moreover, a future life is envisioned in which we no longer have to leave our house — provided we own one. Everything is just a mouse click away — albeit not for those who will still have to deliver the goods to our doorsteps. As much as technological progress might be desirable to make our life easier, one cannot help but regard this display of unlimited optimism as somewhat naïve. It sounds very much like the gospel of "digital capitalism" (Peter Glotz; cited in Enzensberger, 2000: 93). As Richard Bolton (1989: 261) reminds us:
  "... much of the growth in the information economy is the result of increased Defence Department spending. And most importantly, it should be recognized that a pernicious class structure accompanies this economy, a structure that privileges the managerial class and the elite at the expense of the hourly wage-earner, the poor, and the less educated. In fact, the new economy has been accompanied by a general resurgence of authority across the board in business and government. Upon close examination, we find that the power base of America's post-industrial society is precisely the conservative business class currently in power."
  Moreover, Bolton (1989: 262) argues:
  "... we are offered the fantasy of high tech, a fantasy that obscures the real social consequences of post-industrial production. Capitalism seems more heroic than ever — information commodities seem to generate spontaneously from deep within the system, bodied forth as evidence of the utopia to come. We are surrounded by the corporate sublime, by the promise of a new age of pure information and pure capital. Communication seems boundless, and we count on its speed and abundance to bring about a better world. But in spite of this plentitude, we have only a narrow understanding of the world. Or behind this information economy, and behind our information itself, can be found the same viewpoint, the same class, the same owners."
3 The present into the future. Is this true? Have "competing opinions and conflicting realities ... [really] been excluded from the post-industrial revolution?" (Bolton, 1989: 262). The claimed transparency of the Internet, clearly, should indicate otherwise, as should the emergence of hundreds of thousands of overnight young millionaires. On the other hand: we all know — a lesson that life teaches us — that exchanging favours — and not publicly expressed opinions — keeps the world going. It has always been that way. Transparency, for instance, has never been, and still is not, desirable. What would we be without our secrets? What if we were to mean what we say? The first post-war German chancellor Adenauer said — as oral history has it — about his successor Erhard that he was totally unfit to be chancellor because he believed what he said. True, Adenauer was an old cynic, but he certainly knew what he was talking about. To further Bolton's argument: we occasionally hear of insider trading and believe it to be isolated cases. More likely, it is the rule. And where, one might ask, is the evidence for this assertion? The above-mentioned phenomenon of exchanging favours — which, of course, includes sensitive information — is practised in all cultures, and on all levels — it is a fact of life, and it stands in total opposition to transparency. Yet this is not to say that the Internet will not make a few things more transparent; it surely will, and it has done so already, as Matt Drudge demonstrated with the Monica Lewinsky story. Furthermore: wherever transparency is not seen as a threat to the established order, it very likely will be welcomed.
  There is no doubt that what the Internet offers is amazing, and so far it has not ceased to surprise us. Recently, Der Spiegel (2000) devoted 80 pages to CeBIT, the annual computer fair in Hanover, Germany. History, from now on, will be written differently — it is said. What counts economically is the number of hits on a web page, one reads, and one starts to wonder if one has been completely left behind or if the world has gone mad. Online banking, we are told, is cool, and online travel booking the thing to do. Max Anderson (2000: 9), a journalist for the Sunday Times, has tried it and found out that what sounds easy is still extremely time-consuming. The exercise of booking a last-minute trip to Amsterdam resulted in "eight hours of online aggravation." So far, from job search to buying a house, online exploring is an extremely time-consuming affair and not at all, as its propagandists would have it, just a mouse click away. This, however, might change, since we are still — we are told — in the very early stages of this new technology.
  On the other hand: the Web certainly looks like the future for classifieds; it generates new — hitherto unheard of — jobs like screen designers, web masters, or content managers; it has created online journalism that allows newspapers and magazines to compete with the breaking news programmes of radio and TV stations. An oddity: the German Spiegel Online reports as if it were an independent news agency, thus quoting news from Der Spiegel, its parent magazine, as if it were quoting news from dpa or afp. The Internet proves incredibly useful as an education tool — the Oxford English Dictionary is about to go online (The Guardian, 2000: 9); and it is, to give another example, most valuable when searching for, say, information about the ear, or the nose, or the throat: useful, non-privilege-threatening information, that is.
  The impact this technology will eventually have, simply cannot be predicted. This, of course, has much to do with the fact that human beings seldom act rationally. So far it seems that, judging from the use of e-mail and mobile phones, the need for communicating never has been greater. Often, however, one is tempted to agree with Henry Thoreau, who in Walden (1854) famously wrote:
  "We are in great haste to construct a magnetic telegraph from Maine to Texas, but Maine and Texas, it may be, have nothing important to communicate."
  Moreover, some developments might already cause a few eyebrows to be raised. Roger Fidler (1997: 1-2), for instance, tells of an encounter with a graphic designer who said that he could not even imagine how to create news graphics without a computer. One might also question whether a life spent in front of a computer (and that is what making all our transactions from home would amount to) is not robbing us of what makes life exciting: physical experience. Last but not least: "The debate is not about what technology can do," Jonscher (1999: 7) says, "... but about who we are in the digital age." The dominant ideology at present seems to be that greed is good, bigger is better, and more is best, as stock market fever and mega-mergers demonstrate. It could, however, be that these are clear indicators that our time is suffering from delusions of grandeur, for spiritually inclined people through the centuries have always known that less is more.
4 The real world. Why would we want a world in which we do not have to leave the house anymore? To go to the supermarket, or to the bank, or to the post office has always been, and it is still, an opportunity to get in touch with other people. We are, after all, social beings, and are meant to communicate — in real life, not via interactive electronic tools. As Winston (1997: 335-336) elaborates:
  "One of the sillier facets of Information Revolution rhetoric is the belief that technology is urgently required to help people avoid going shopping or travelling on business. People like shopping and travelling — just as they like being told, or reading, stories. So we do not need stories to be more "interactive" than they have been since the dawn of time; a liking for travel is why business people have avoided the lure of the video-conference phone for nearly two-thirds of the twentieth century; and we so love shopping we have made the shopping mall — as the latest incarnation of the nineteenth century arcade — into our emblematic public space."
  The stock market, however, defies logical reasoning and, for instance, highly values Amazon, the online book retailer, despite the fact that this company has so far not made a single buck. Furthermore: it is one of the ironies of the digital age that Amazon, so far, makes its virtual money with the kind of old-fashioned books that are envisioned to disappear. It is all, as the logic of the digital economy goes, about the future, which is why there is no evidence available to judge the claims that are made. The fact is that online buying "represents only a fraction of consumer sales these days, [but] it's a fraction that didn't exist a few years ago" (Newsweek, 1999: 51). Since nobody is able to predict the future, one can only wonder why one should trust the digital enthusiasts — who, of course, argue according to their own agenda. We seem to have entered an era where speculation — we might also call it gambling — is the accepted way of earning money.
  "Ceci n'est pas une pipe," "This is not a pipe," the painter René Magritte wrote on his picture of a pipe, thus making the point that the picture of a pipe is nothing but a picture of a pipe. Yet the preachers of the digital gospel did not, and do not, listen — they claim that what we call reality is simply a social or cultural convention and therefore, in matters of truth for example, irrelevant. As the photographer Pedro Meyer argues (Meadows, 1999: 234):
  "First of all, photography as captured by a camera, is all we have had to date to go by when thinking about the photographic image. But the brain doesn't work just like that. My brain creates images also ... not only the camera. And today the computer allows me to extend the horizon of what traditionally has been espoused as a picture. You are probably familiar with Quantum physics, and if so, you are probably also aware that the issue of time being linear is only a figment of our convention. That actually our good old Albert Einstein long ago proved that time was both flexible and non linear, and there was nothing to prove this at the time other than equations, however there are all of a sudden a slew of experiments in the labs that are able to prove his theories, and now in the world of art we are starting to have the tools with which to enter into a territory that is completely new. The creation of non-linear thinking photographs, which are also true ... and documentary ... albeit a new conception of what can be understood as photography."
  As Hans Magnus Enzensberger (2000: 101) says, such a degree of unworldliness can only be found in laboratories, science fiction, or university seminars. So let us have a university lecturer, Colin Jacobson, take issue with Pedro Meyer's assertions about truth and reality (Meadows, 1999: 251-252):
  "The world of imagination has never been expected to conform to our expectations of everyday reality ... I am certain that whatever differences you, Stephen Hawking and I might have about the nature of experience of physical reality, if we were tipped off the top of a skyscraper, all three of us would expect to fall downwards and not upwards."
  Apart from these philosophical and ideological notions, there are some very practical reasons why people who work in the media so fiercely argue for the importance of the media, and thus, by extension, of the virtual world. First of all, their jobs depend on it. Secondly, the tendency to overestimate one's own importance is rampant in every profession, yet the degree might be somewhat higher in domains where vanity is an indispensable job requirement. Enzensberger (2000: 101) argues that the unworldliness of journalists is almost surreal:
  "The next day's headlines are discussed with a fierceness as if the fate of the nation depended on them. Yet what might be of interest to the reader is seldom a topic; what matters more is the judgement of one's competitors."
  Equally out of touch with reality, Enzensberger (2000: 101) claims, are art directors whose only preoccupation is to be recognised as artists, and who, despite the fact, for example, that the buying power is not with the young, favour a youth cult that borders, economically speaking, on the absurd. The fact that media consumption is high does not imply that the recipient takes what he sees, or hears, or reads, for real. Modern media consumers are far too worldly to depend on what the media want them to believe (Enzensberger, 2000: 101). Nevertheless, being aware of being manipulated does not necessarily lead to consequences. Despite suspecting that what we are presented with often falls short of the truth, we still build on such "truth" — we seem to have to (Luhmann, 1996: 9-10). In other words: we might know — and most of us do — that the media show us a highly unreal world, yet we build — voluntarily, since there is not really an alternative — our knowledge of the world on it. Nevertheless, as Enzensberger (2000: 101) points out:
  "The ability to distinguish a pipe from the picture of a pipe is widespread, and toothache is not virtual. Moreover, old habits die hard. And yes, yes, there is life beyond the digital world: it is the only one we have."
5 An odyssey in cyberspace. John Seabrook is a staff writer for the New Yorker. In his book Deeper (1998) he describes what he calls a two-year odyssey in cyberspace. It becomes apparent, as the story unfolds, how new technology already, often subtly and barely noticed, affects our lives. In the case of Seabrook (1998: 30), for instance:
  "The desktop metaphor did not, in fact, always correspond to the way I used my real desktop — a jumble of books and overlapping mounds of paper, with scissors, pencils, tape, and erasers scattered around the edges of the pile — and in those moments I could feel the intuitive element in my thought process confronting the Boolean logic of computers, and squirming like a slug under a pinch of salt. But I found that after I had been using my Mac for a while, my real desktop got neater and more logically organized — more like the metaphorical desktop on my computer screen. I actually bought a filing cabinet and began trying to organize my papers into folders, which I had never done before. Having begun by using my brain as a metaphor to understand my computer, I was now using the computer as a metaphor to understand my brain."
  When, in 1993, Seabrook set out for Redmond, Washington, Microsoft's headquarters, on an assignment about the emerging digital media industry, he was confronted with a place that reminded him of a college campus. But apart from the fact that there were a lot of young millionaires at work every day, nothing seemed to be out of the ordinary except that "the phones hardly ever rang" (Seabrook, 1998: 40) — everybody seemed to communicate via e-mail. It goes without saying that the tools we communicate with will influence how we communicate. The styles of e-mail writing and letter writing, for instance, differ greatly. One of the reasons is that letters, usually, will be kept — and can serve as reminders as well as evidence — and that e-mail will often be deleted after a while. As Seabrook (1998: 47) observed:
  "I noticed right away that there seemed to be a peculiar kind of intimacy to talking on e-mail, a sense of being wired into each other's minds, that was not present in telephone conversations — mine, anyway. This was true in spite of the fact — or because of the fact? — that our dialogue was as elaborately stylized as a minuet, with no chance for sudden interruption or spontaneous give-and-take. The feeling of intimacy seemed also to come from the shorthand style with which the e-mail was written. It was a form of communication that was neither writing nor speech, but writing used as speech, a peculiar in-between kind of expression that I would come to think of as "speak-acting." (The French make a linguistic distinction between a word used as speech, "un mot," and a word used in writing, "une parole," but in the English language we don't recognize the difference linguistically, and e-mail seemed to thrive in the grey air between the two.)"
  New media will also affect the style of writing and editing, claims John Herbert (1999: 2). According to him, linear storytelling works well in print but not on the screen. Indeed, the language used in, say, online editions of magazines appears to be more functional, the sentences and paragraphs shorter than in the print editions. This is hardly surprising, for the dominating factor in online journalism is speed — what counts is to have the information before one's competitor has it. What is of interest is the short, clear, naked message. There is simply no time to construct long and, presumably, literary phrases. Besides: reading online is not really a treat, so short and — hopefully — clear sentence construction makes sense. However, this is not what Herbert (1999: 2) advocates:
  "Hypertext now enables journalists to write on-line stories that are multi-dimensional. The journalist can structure the story differently, and it allows readers to pick their own path through the story. Perhaps one reader will click onto a sidebar or to a set of definitions of technical terms; another to a related feature about a particular fact in the main body of the story. Every story published on-line can be read in many ways, and entirely as the reader wishes. The links are built on association of ideas, which is a very different approach to the way traditional journalism is compiled, logically and analytically."
  What Herbert displays here is basically the argument that whatever a reader does with a text is fine; he even encourages the reader to hop from item to item, to zap digitally. And journalists should write, and present their texts, in a way that allows the reader to do so. Well, why not? In the age of anything goes, nothing really matters anyway. Nevertheless, one is left wondering where we are heading when we are encouraged to give in to every whim that enters our brains. However, we are not there yet. So far, writing still has to do (or at least it should) with logic, analysis and the ability to tell a linear story. Should we continue to prefer printed words to those on screen, the traditional writing style will have to stay. Again, we may resort to Seabrook (1998: 31):
  "Sometimes, when I was in the middle of writing a piece, my editor would call and say: "Have you printed it out yet? How do you know how good it is until you print it out?" And although this didn't seem logical to me — the words themselves were the same, after all, whether on paper or on the screen — it did seem to be true. Sentences that had seemed to flow smoothly and logically when read in glowing pixels appeared weak and disjointed in cold type. I wondered if there were something about the light coming from the screen that dazzled my critical judgement and lit up some atavistic primate centre of my brain, the same place that had first been dazzled by fire."
6 Computers and the human condition. This essay has argued that, despite the possibilities that the Internet offers, the proclaimed changes will affect our lives to a much lesser extent than envisioned. It has argued that much of the cyber-world talk is nothing but hype, and that it is not very likely that the computer will lead us to more rational forms of life; greed, the lust for power et cetera — the human condition, that is — will see to that. Nevertheless, the channels, and the tools, of communication are changing. It is, so far, difficult to say to what extent the Internet, on the whole, has contributed to a better world — a more convenient way of life. Yet the prevalent ideology has made us — the economically privileged minority, that is — all buy computers, and we all hope now that they won't fail us and crash because we would have no clue as to how to fix them. Furthermore, it has been stated that the present megalomania — unforgettably the director of the movie Titanic, upon receiving his Oscar, exclaiming: "I'm the king of the world" — is not exactly an expression of a sound mental state. Less, this essay has argued, still is more.
   
  References
 
  • Anderson, Max (2000), "Trip-wired.com. How not to book a weekend break." In: The Sunday Times, Travel, February 20, 2000.
  • Batty, Michael M., and Bob Barr (1994), "The electronic frontier. Exploring and mapping cyberspace." In: Futures, 26, 7, 699-712.
  • Bolton, Richard (1989), "In the American East. Richard Avedon Incorporated." In: Richard Bolton (ed.), The contest of meaning. Critical histories of photography. Cambridge: MIT Press.
  • Bredow, Rafaela von, and Alexander Jung (2000) "Die Online-Revolution." In: Der Spiegel, January 17, 2000, 92-101.
  • Enzensberger, Hans Magnus (2000), "Das digitale Evangelium." ["The digital gospel."] In: Der Spiegel, January 10, 2000.
  • Fidler, Roger (1997), Mediamorphosis. Understanding new media. Thousand Oaks: Pine Forge.
  • Herbert, John (1999), Journalism in the digital age. Theory and practice for broadcast, print and online media. Oxford: Focal Press.
  • Jonscher, Charles (1999), Wired life. Who are we in the digital age? London: Anchor.
  • Keegan, Victor (1997), "What a web we weave." In: The Guardian, 1997.
  • Luhmann, Niklas (1996), Die Realität der Massenmedien. [The reality of the mass media.] Opladen: Westdeutscher Verlag.
  • Meadows, Daniel (1999), I think I've just been Pedroed. Unpublished manuscript.
  • Newsweek (1999), "The dawn of e-life." Newsweek Special, October 11, 1999, 44-78.
  • Nicholas, Dave, and Peter Williams (1999), Journalism and the Internet. The changing information environment. London: City University, Department of Information Science, October 8, 1999.
  • Paul, Nora (1996), "What is the Internet?" In: Computer Assisted Research, January 1996.
  • Seabrook, John (1998), Deeper. A two-year odyssey in cyberspace. London: Faber and Faber.
  • Spiegel, Der (2000), "Voll auf Draht. Sonderteil Cebit." ["Fully wired. CeBIT Special."] In: Der Spiegel, 8, February 21, 2000, 119-203.
  • Stein, Stuart D. (1999), Learning, teaching and researching on the internet. A practical guide for social scientists. Harlow: Longman (lecture handout).
  • Thoreau, Henry David (1854), Walden. Boston: Houghton Mifflin, 1995.
  • Guardian, The (2000), "Oxford English Dictionary to go online." In: The Guardian, National News, March 11, 2000, 9.
  • Winston, Brian (1997), Media technology and society: a history. From the telegraph to the internet. London: Routledge.
  2000 © Hans Durrer / 2004 © Soundscapes