Friday, October 11, 2019
Electronic Literature as an Information System Essay
ABSTRACT

Electronic literature is a term that encompasses artistic texts produced for printed media which are consumed in electronic format, as well as text produced for electronic media that could not be printed without losing essential qualities. Some have argued that the essence of electronic literature is the use of multimedia, fragmentation, and/or non-linearity. Others focus on the role of computation and complex processing. "Cybertext" does not sufficiently describe these systems. In this paper we propose that works of electronic literature, understood as text (with possible inclusion of multimedia elements) designed to be consumed in bi- or multi-directional electronic media, are best understood as 3-tier (or n-tier) information systems. These tiers include data (the textual content), process (computational interactions) and presentation (on-screen rendering of the narrative). The interaction between these layers produces what is known as the work of electronic literature. This paradigm for electronic literature moves beyond the initial approaches which either treated electronic literature as computerized versions of print literature or focused solely on one aspect of the system. In this paper, we build two basic arguments. On the one hand, we propose that the conception of electronic literature as an information system gets at the essence of electronic media, and we predict that this paradigm will become dominant in this field within the next few years. On the other hand, we propose that building information systems may also lead to a shift of emphasis from one-time artistic novelties to reusable systems. Demonstrating this approach, we read works from the _Electronic Literature Collection Volume 1_ (Jason Nelson and Emily Short) as well as newer works by Mez and the team gathered by Kate Pullinger and Chris Joseph. Glancing toward the future, we discuss the n-tier analysis of the Global Poetic System and the LA Flood Project.

INTRODUCTION

The fundamental attributes of digital narrative have been, so far, mostly faithful to the origin of electronic text: a set of linked episodes that contain hypermedia elements. Whether or not some features could be reproduced in printed media has been the subject of debate by opponents and proponents of digital narratives. However, as electronic media evolve, some features truly unique to digital narrative have appeared. For instance, significant effort has been invested in creating hypertexts responsive to the reader's actions by making links dynamic; additionally, there have been efforts to create systems capable of producing fiction, with varying degrees of success. Both approaches have in common that they grant greater autonomy to the computer, thus making it an active part of the literary exchange. The increasing complexity of these systems has directed critical attention to the novelty of the processes that produce the texts. As critics produce a flood of neologisms to classify these works, the field suffers from a lack of a shared language for them, as opposed to drawing on the available, well-articulated terminology of computer science and information systems. The set {Reader, Computer, Author} forms a system in which there is flow and manipulation of information, i.e. an _information system_. The interaction between the elements of an information system can be isolated in functional tiers: for instance, one or many data tiers, processing tiers, and presentation tiers.
In general we will talk about n-tier information systems; we will expand this definition in the next section. In such a system, a portion of the information produced (output) is taken, totally or partially, as input, i.e. there is a feedback loop, and therefore the process can be characterized as a cybernetic process. Of course, the field has already embraced the notion of the cybertext. The term cybertext was brought to the literary world's attention by Espen Aarseth (1997). His concept focuses on the organization of the text in order to analyze the influence of media as an integral part of literary dynamics. According to Aarseth, cybertext is not a genre in itself; in order to classify traditions, literary genres and aesthetic value, he argues, we should inspect texts at a much more local level. The concept of cybertext offers a way to expand the reach of literary studies to include phenomena that are perceived today as foreign or marginal. In Aarseth's work, cybertext denotes the general set of text machines which, operated by readers, yield different texts for reading. Aarseth (1997, p. 19) refuses to narrow this definition of cybertext to "such vague and unfocused terms such as digital text or electronic literature." For the purposes of this paper, we will use the phrase "electronic literature," as we are interested in those works that are markedly literary in that they resonate (at least on one level) through evocative linguistic content and engage with an existing literary corpus. While we find "cybertext" to be a useful concept, the taxonomies and schematics that attend this approach interfere with interdisciplinary discussions of electronic literature. Instead of using Aarseth's neologisms such as textons, scriptons and traversal functions, we will use widely accepted terminology from the field of computer science. This shift is important because the concepts introduced by Aarseth, which are relevant to the current discussion, can be mapped perfectly onto concepts developed years earlier in computer science. While the neologisms introduced by Aarseth remain arcane, the terms used in computer science are pervasive. Although the term cybertext adds a sense of increasingly complex interactivity, its focus is primarily on the interaction between a user and a single art object. Such a framework, however, insufficiently describes the constitution of such an object. Within his treatise, Aarseth is compelled to create tables of attributes and taxonomies to map and classify each of these objects. What is needed is a framework for discussing how these systems operate and how that operation contributes to an overall literary experience. We want to make a clear distinction between this notion of cybertext as a reading process and a more thorough description of a work's infrastructure. Clearly, there are many ways in which the interaction between a reader and a piece of electronic literature can happen; for instance, a piece of electronic literature could be written in HTML or in Flash, yet present the same interaction to the reader. In this paper, we adapt the notion of n-tier information systems to provide a scaffolding for reading and interpreting works of electronic literature. The fact that the field of electronic literature is largely comprised of cybertexts (in the sense described above) that require some sort of processing by the computer has made this processing a defining characteristic.
Critics and the public approach new works of electronic literature with the expectation of finding creativity and innovation not only at the narrative level but also at the processing level; in many cases the newness of the latter has dominated other considerations.

NEW, NEWER, NEWEST MEDIA

Until now, electronic literature, or elit, has been focused on the new, leading to a constant drive to reinvent the wheel, the word, the image, the delivery system, and consequently reading itself. However, such an emphasis raises a number of questions. To what extent does the "novel" requirement of electronic literature (as the field is currently defined) de-emphasize a textual investment in exploring the (post)human condition ("the literary")? How does this emphasis on the "new" constrain the development of New Media both for authors and for prospective authors? How does such an emphasis put elit authors into an artistic arms race, taking on the aesthetics of the military-industrial complex that produces their tools?

Literary essays that treat electronic literature focus on Flash movies, blogs, HTML pages, dynamically generated pages, conversation agents, computer games, and other software applications. A recent edition of _Leonardo Almanac_ (AA.VV. 2006) offers several examples. Its critics/poets analyze the "information landscapes" of David Small, the text art experiments of Suguru Ishizaki (2003), Brian Kim Stefans' 11-minute Flash performance, and Philippe Bootz's matrix poetry program. Though not all the objects are new, what they share most of all is the novelty of their surface or process or text. These works bear little resemblance to one another, a definitive characteristic of electronic literature (dissimilarity); however, their inclusion under one rubric reflects the field's fetishization of the new. This addiction, mimicking that of the hard sciences it so admires, must constantly replace old forms and old systems with the latest system. Arguably, therefore, any piece of electronic literature may only be as interesting as its form or its novel use of the form. Moreover, such an emphasis shifts the critical attention from the content (what we will call data) primarily to its rendering (or presentation plus processes).

Marie-Laure Ryan (2005) raised charges against such an aesthetic in her _dichtung-digital_ article. In this piece, she rails against a certain style of new media, net.art, elit art object that follows WYSINWYG (What You See Is _NOT_ What You Get), where the surface presents a text that is considered interesting only because of a more interesting process beneath the surface. This approach, according to Ryan, focuses on "the meta-property of algorithmic operation." For this aesthetic, "the art resides in the productive formula, and in the sophistication of the programming, rather than in the output itself" (Ryan). This means that literary, or artistic, value does not reside in what appears on the screen, but in the virtuoso programming performance that underlies the text. While Ryan goes too far in her dismissal of experimentation, her critique holds, inasmuch as electronic literary criticism that puts process über alles risks not only minimizing the textual to insignificance but also losing what should be one of elit's biggest goals: developing new forms for other authors to use and explore. Such an emphasis reveals a bias that has thus far dominated new media scholarship.
This same bias leads new media scholars away from literary venues for their discourse communities and instead to Boing Boing and SIGGRAPH, sites where curiosity or commercial technological development dominate the discussions. It is also what spells instant obsolescence for many authorware forms. The person who uses authorware as it was intended is not the new media artist; it is the person who uses it in a new way or who reconfigures the software to do something unintended. This trend means that electronic literary artists will constantly be compelled to drive their works towards the new, even while it means a perpetual pruning of all prior authorware, cutting them off from the "literary" tree. (We see this same logic in commercial software production, where the 4.0 release reconfigures the interface and removes some of the functionality we had grown to love.) A disproportionate emphasis on the new overlooks the tremendous areas of growth in authorship on the stabilizing, if rudimentary, authoring systems. The tide of productivity (in terms of textual output of all levels of quality) comes not from an endless stream of innovations but from people who are writing text in established authoring formats, from traditional print to blogs. It is through the use of stabilized and reusable information systems that the greater public is being attracted to consume and produce content through digital media. Blogging is the clearest example. This is not equivalent to saying that all blogging is literary, just as not all writing is; however, blogging has created a social practice of reading and writing in digital media, thus increasing the frequency at which literary pieces appear through that venue. This increased community activity would have been impossible if each blogger had to develop their own authoring system.

To help redistribute the scholarly priorities, we propose a reconsideration of electronic literature as an n-tier information system. The consequence of this shift will be twofold. First, it will allow us to treat content and processing independently, thus creating a clear distinction between works of literary merit and works of technological craftsmanship. While this distinction is at best problematic, considering the information system as a whole will move the analysis away from over-privileging processes. Second, we claim that this approach provides a unified framework with which all pieces of electronic literature can be studied. This paper is organized as follows: Section 1 (Introduction) describes the problem we intend to explore and the types of systems discussed in this paper. Section 2 (Information Systems) explores the components of an information system and compares the approaches of different researchers in the field. Section 3 (Examples) demonstrates that the n-tier information system approach can be used to describe a multifarious array of pieces of electronic literature. Section 4 (Discussion) explores the conclusions drawn from this study and sets future directions.

INFORMATION SYSTEMS

Since electronic literature is mediated by a computer, it is clear that there must exist methods to enter information into the system, to process it, and to render an output for readers; that is to say, a piece of electronic literature can be considered an _information system_. The term "information system" has different meanings.
For instance, in mathematics an "information system" is a basic knowledge-representation matrix comprised of attributes (columns) and objects (rows). In sociology, "information systems" are systems whose behavior is determined by the goals of individuals as well as by technology. In our context, "information system" will refer to a set of persons and machines organized to collect, store, transform, and represent data, a definition which coincides with the one widely accepted in computer science. The domain-specific twist comes when we specify that the data contains, but is not limited to, literary information.

Information systems, due to their complexity, are usually built in layers. The earliest antecedent to a multi-layer approach to software architectures goes back to Trygve Reenskaug, who proposed in 1979, while visiting the Smalltalk group at Xerox PARC, a pattern known as Model-View-Controller (MVC) that intended to isolate the process layer from the presentation layer. This paradigm evolved during the next decade to give rise to multi-tier architectures, in which presentation, data and processes were isolated. In principle, it is possible to have multiple data tiers, multiple process tiers, and multiple presentation tiers. One of the most prominent paradigms for approaching information systems in the field of computer science, and the one we deem most appropriate for electronic literature, is the 3-tier architecture (Eckerson, 1995). This paradigm indicates that processes of different categories should be encapsulated in three different layers:

1. Presentation Layer: The physical rendering of the narrative piece, for example, a sequence of physical pages or the on-screen presentation of the text.
2. Process Layer: The rules necessary to read a text. A reader of the Latin alphabet in printed narrative, for example, must cross the text from left to right, from top to bottom, and turn the page after the last word of the last line. In digital narrative, this layer could contain the rules programmed in a computer to build a text output.
3. Data Layer: Here lies the text itself. It is the set of words, images, video, etc., which form the narrative space.

In the proposed 3-tier model, feedback is not only possible, but also a _sine qua non_ condition for the literary exchange. It is the continuation of McLuhan's mantra: "the medium is the message." In digital narrative, the medium acts on the message. The cycle of feedback in digital narrative is: (i) readers receive a piece of information and, based on it, execute a new interaction with the system; (ii) the computer takes that input and applies logic rules that have been programmed into it by the author; (iii) the computer takes content from the data layer and renders it to the reader in the presentation layer; (iv) the cycle returns to step (i). Steps i through iv describe a complete cycle of feedback, and thus the maximum realization of a cybertext.

N-tier information systems have had, surprisingly, a relatively short penetration in the field of electronic literature. Aarseth (1997, p. 62) introduced a typology for his textonomy that maps perfectly onto a 3-tier system: scriptons ("strings as they appear to readers") correspond to the presentation layer, textons ("strings as they exist in the text") correspond to the data layer, and the traversal function ("the mechanism by which scriptons are revealed or generated from textons and presented to the user") corresponds to the process layer.
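This cycle can be made concrete in a short sketch. The following Python fragment is a minimal, hypothetical 3-tier reading loop; the lexia texts, link names, and function names are invented for illustration and do not correspond to any particular work.

    # Minimal sketch of a 3-tier work of electronic literature (hypothetical example).
    # Data tier: the lexias and the links between them.
    LEXIAS = {
        "start": ("You wake inside the archive.", {"door": "hall", "sleep": "dream"}),
        "hall":  ("A corridor of unread pages.", {"back": "start"}),
        "dream": ("The same dream comes again.", {"wake": "start"}),
    }

    # Process tier: the rules that turn a reader's input into the next lexia.
    def traverse(current, reader_input):
        _, links = LEXIAS[current]
        return links.get(reader_input, current)    # unknown input: stay in place

    # Presentation tier: on-screen rendering of the current lexia and its options.
    def render(current):
        text, links = LEXIAS[current]
        print(text)
        print("options:", ", ".join(links))

    # Feedback cycle: (i) reader acts, (ii) process applies the author's rules,
    # (iii) content is rendered, and (iv) the cycle repeats.
    def read():
        current = "start"
        while True:
            render(current)                        # step (iii)
            choice = input("> ").strip().lower()   # step (i)
            if choice == "quit":
                break
            current = traverse(current, choice)    # step (ii)

    if __name__ == "__main__":
        read()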
These neologisms, while necessary if we study all forms of textuality, are unnecessary if we focus on electronic literature. The methods developed in computer science permeate constantly, and at an accelerating rate, the field of electronic literature, especially as artists create pieces of increasing complexity. Practitioners in the field of electronic literature will be better equipped to benefit from the advances in information technology if the knowledge acquired in both fields can be bridged; without a common terminology, attempts to generate dialog are thwarted. The first reference that applied computer science terminology to electronic literature appeared in an article by Gutierrez (2002), in which the three layers (data, logic and presentation) were clearly defined and proposed as a paradigm for electronic literature. Gutierrez (2004, 2006) explored in detail the logic (middle) layer, proposing algorithms to manage the processes needed to deliver literary content through electronic media. His proposal follows the paradigm proposed by Eckerson (1995) and Jacobson et al. (1999): the system is divided into (a) topological stationary components, (b) users, and (c) transient components (processes). The processes in the system are analyzed and represented using sequence diagrams to depict how the actions of the users cause movement and transformation of information across different topological components.

The next reference belongs to Wardrip-Fruin (2006); he proposes not three but seven components: (i) author, (ii) data, (iii) process, (iv) surface, (v) interaction, (vi) outside processes, and (vii) audiences. This vision corresponds to extensive research in diverse fields, and the interpretation is given from a literary perspective. Even though Wardrip-Fruin does not use the terminology already established in computer science, nor does he make a clear distinction between topology, actors and processes, his proposal is essentially equivalent to, and independent from, Gutierrez's model. In Wardrip-Fruin's model, author (i) and audience (vii) correspond to actors in the Unified Process (UP); process (iii) and interaction (v) correspond to the process layer in the 3-tier architecture (how the actors move information across layers and how it is modified); data (ii) maps directly to the data layer in the 3-tier model; finally, surface (iv) corresponds to the presentation layer.

The emergence of these information systems approaches marks the awareness that these new literary forms arise from the world of software and hence benefit from traditional computer science approaches to software. In _The Language of New Media_, Lev Manovich called for such analysis under the rubric of Software Studies. Applying the schematics of computer science to electronic literature allows critics to consider the complexities of that literature without falling prey to the tendency to colonize electronic literature with literary theory, as Espen Aarseth warned in _Cybertext_. Such a framework provides a terminology rather than the imposition of yet another taxonomy or set of metaphors that will always prove to be both helpful and glaringly insufficient. That is not to say that n-tier approaches fit works without conflict. In fact, some of the most fruitful readings come from the pieces that complicate the n-tier distinctions.
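For quick reference, the correspondences described in this section can be collected in a small lookup structure; this merely restates the mappings already given above and introduces no new taxonomy.

    # Summary of the terminological mappings discussed above.
    TIER_EQUIVALENTS = {
        "data":         ["textons (Aarseth)", "data (Wardrip-Fruin ii)"],
        "process":      ["traversal function (Aarseth)",
                         "process (Wardrip-Fruin iii)", "interaction (Wardrip-Fruin v)"],
        "presentation": ["scriptons (Aarseth)", "surface (Wardrip-Fruin iv)"],
    }
    # Actors sit outside the tiers themselves.
    ACTORS = ["author (Wardrip-Fruin i)", "audiences (Wardrip-Fruin vii)",
              "users (Unified Process)"]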
EXAMPLES

DREAMAPHAGE 1 & 2: REVISING OUR SYSTEMS

Jason Nelson's Dreamaphage (2003, 2004) demonstrates the ways in which the n-tier model can open up the complexities and ironies of works of electronic literature. Nelson is an auteur of interfaces, and in the first version of this piece he transforms the two-dimensional screen into a three-dimensional navigable space full of various planes. The interactor travels through these planes, encountering texts on them, documentation of the disease. It is as if we are traveling through the data structure of the story itself, as if the data has been brought to the surface, though in strict terms the data is where it always was supposed to be. Each plane is an object, rendered in Flash on the fly by the processing of the navigation input and the production of vector graphics to fill the screen. However, Nelson's work distances us, alienates us from the visual metaphors that we have taken for the physical structures of data in the computer. Designers of operating systems work hard to naturalize our relationship to our information: opening windows and shuffling folders become not a visual manifestation but the transparent glimpse of the structures themselves. Neal Stephenson has written very persuasively on the effect of replacing the command line interface with these illusions.

The story (or data) behind the piece is the tale of a virus epidemic whose primary symptom is the constant repetition of a dream. Nelson writes of the virus' "drifting eyes." Ultimately the disease proves fatal, as patients go insane and then comatose. Here the piece is evocative of the repetitive lexias of classical electronic literature, information systems that lead the reader into the same texts as a natural component of traversing the narrative. Of course, the disease also describes the interface of the planes that the user travels through, one after the other, semi-transparent planes, dreamlike visions.

This version of Dreamaphage was not the only one Nelson published. In 2004, Nelson published a second interface. Nelson writes of the piece, "Unfortunately the first version of Dreamaphage suffered from usability problems. The main interface was unwieldy (but pretty) and the books hard to find (plus the occasional computer crash)" ("Dreamaphage," _ELC I_). He reconceived of the piece in two dimensions to create a more stable interface. The second version is two-dimensional, and Nelson has also "added a few more extra bits and readjusted the medical reports." In n-tier terms, his changes primarily affected the interface and the data layers. Here is the artist of the interface facing the uncanny return of his own artistic creation in a world where information systems do not lie in the stable binding of a book but in a contingent state that is always dependent on the environments (operating systems) and frames (browsers) in which they circulate. As the user tries to find a grounding in the spaces and lost moments of the disease, Nelson himself attempts to build stability into that which is always shifting. However, due to a particular difference in the way that Firefox 2.0 renders Flash at the processing layer, interactors will discover that the "opening" page of the second version is squeezed into a fraction of their window, rather than expanding to fill the entire window. At this point, we are reminded of the work's epigram: "All other methods are errors. The words of these books, their dreams, contain the cure.
But where is the pattern? In sleeping the same dream came again. How long before I become another lost?" ("opening"). As we compare these two versions of the same information system, we see the same dream coming again. The first version haunts the second as we ask when it, too, will become one of the lost. Though Nelson himself seems to have an insatiable appetite for novel interfaces, his own artistic practices resonate well with the ethos of this article. At speaking engagements, he has made it a practice to bring his interfaces, his .fla (Flash source) files, for the attendees to take and use as they please. Nelson presents his information systems with a humble declaration that the audience will no doubt be able to find even more powerful uses for these interfaces.

GALATEA: NOVELTY RETURNS

Emily Short's ground-breaking work of interactive fiction offers another work that, like its namesake in the piece, opens up to this discussion when approached carefully. Galatea's presentation layer appears to be straightforward IF fare. The interactor is a critic encountering Galatea, which appears to be a statue of a woman but then begins to move and talk. In this novel work of interactive fiction, the interactor will not find the traditional spatial navigation verbs (go, open, throw) to be productive, as the action focuses on one room. Other verbs will likewise prove unhelpful, as the user is encouraged in the help instructions to "talk" or "ask" about topics. In Short's piece, the navigational system of IF, as it was originally instantiated in Adventure, begins to mimic a conversational system driven by keywords, à la Joseph Weizenbaum's ELIZA. Spelunking through a cave is replaced with conversing through an array of conversational replies. Galatea does not always answer the same way. She has moods, or rather, your relationship with Galatea has levels of emotion. The logic layer proves to be more complex than the few verbs portend. The hunt is to figure out the combination that leads to more data. Galatea uses a novel process to put the user in the position of a safecracker, trying to unlock the treasure of answers. Notice how novelty has re-emerged as a key attribute here.

Could there be a second Galatea? Could someone write another story using Galatea's processes? Technically no, since the work was released under a No-Derivs Creative Commons license. However, in many ways, Galatea is a second, coming in the experimental wave of artistic revisions of interactive fiction that followed the demise of the commercially produced text adventures from Infocom and others. Written in Z-Machine format, Galatea is already reimagining an information system: it is a new work written in the context of Infocom's interactive fiction system. Short's work is admittedly novel in its processes, but the literary value of this work is not defined by its novelty. The data, the replies, the context they describe, the relationship they create are rich and full of literary allusions. Short has gone on to help others make their own Galatea, not only in her work to help develop the natural-language IF authoring system Inform 7 but also in the conversation libraries she has authored. In doing so, she moved into the work of other developers of authoring systems, such as the makers of chatbot systems. Richard S.
Wallace developed one of the most popular of these (the AIML-based A.L.I.C.E. bot), and his work demonstrates the power of creating and sharing authorware, even in the context of the tyranny of the novel. A.L.I.C.E. is the baseline conversational system, which can be downloaded and customized. Downloading the basic, functioning A.L.I.C.E. chatbot as a foundation allows users to concentrate on editing recognizable inputs and systematic responses. Rather than worrying about how the system will respond to input, authors, or botmasters, can focus on creating what the system will say. To gain respect as a botmaster/author, one cannot merely modify an out-of-the-box A.L.I.C.E. The user should further customize or build from the ground up using AIML, Artificial Intelligence Markup Language, the site-specific language created for Wallace's system. They must change the way the system operates, largely because the critical attention around chatbots follows the model of scientific innovation more than literary depth. However, according to Wallace, despite the critics' emphasis on innovations, users have been flocking to A.L.I.C.E., as tens of thousands of them have created chatbots using the system (Be Your Own Botmaster). AIML becomes an important test case because while users may access some elements of the system, because they are not changing fundamentals, they can only make limited forays into the scientific/innovation chatbot discussions. Thus, while our n-tier model stresses the importance of creating authorware and understanding information systems, novelty still holds an important role in the development of electronic literature. Nonetheless, interactors can at least use their pre-existing literacies when they encounter an AIML bot or a work of interactive fiction written on a familiar platform.

LITERATRONICA

Literatronica is yet another example of an n-tier system. Its design was based entirely on the concept of division between presentation, process and data layers. Every interaction of the readers is stored in a centralized database and influences the subsequent response of the system to each reader's interactions. The presentation layer employs web pages on which the reader can access multiple books by multiple authors in multiple languages. The process layer is rather complex, since it uses a specialized artificial intelligence engine to adapt the book to each reader based upon his/her interaction, i.e. an adaptive system. The data layer is a relational database that stores not only the narrative but also the readers' interactions. Since there is a clear distinction between presentation, data and process, Literatronica is a 3-tier system that allows authors in multiple languages to focus on the business of literary creation.
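A minimal sketch can suggest how such an adaptive process tier behaves. The page names, weights, and scoring rule below are invented for illustration; they do not reproduce Literatronica's actual engine, which uses Hamiltonian cycles (Gutierrez 2006).

    # Toy sketch of an adaptive process tier (hypothetical; not Literatronica's engine).
    # Data tier stand-in: pages and weighted links, plus each reader's history.
    PAGES = {"p1": "...", "p2": "...", "p3": "...", "p4": "..."}
    LINKS = {                      # (source, target): narrative-continuity weight
        ("p1", "p2"): 0.9, ("p1", "p3"): 0.6,
        ("p2", "p4"): 0.8, ("p3", "p4"): 0.7, ("p2", "p3"): 0.4,
    }
    HISTORY = {}                   # reader_id -> list of visited pages

    def next_page(reader_id, current):
        """Process tier: rank candidate continuations, discounting pages already read."""
        visited = HISTORY.setdefault(reader_id, [])
        candidates = [(target, weight) for (source, target), weight in LINKS.items()
                      if source == current]
        if not candidates:
            return None
        def score(item):
            target, weight = item
            return weight - (1.0 if target in visited else 0.0)
        chosen = max(candidates, key=score)[0]
        visited.append(chosen)     # the interaction flows back into the data tier
        return chosen

A presentation tier, here a web page, would then render PAGES[chosen], closing the same feedback cycle described in the Information Systems section.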
MEZ'S CODE: THE SYSTEMS THAT DO NOT USE A COMPUTER[1]

As with many systematic critical approaches, the place where n-tier is most fruitful is where it produces or reveals contradictions. While some works of electronic literature lend themselves to clear divisions between parts of the information system, many works complicate that very distinction, as articulated in essays such as Rita Raley's "code.surface||code.depth," which traces out codeworks that challenge distinctions between presentation and processing layers. In the works of Mez (Maryanne Breeze), she creates works written in what N. Katherine Hayles has called a creole of computer and human languages. Mez, and other codework authors, display the data layer on the presentation layer. One critical response is to point out that, as an information system, the presentation layer is the lines of code and the rest of the system is whatever medium is displaying her poem. However, such an approach misses the very complexity of Mez's work. Indeed, Mez's work is often traditional static text that puts users in the role of the processor. The n-tier model illuminates her sleight of hand.

    trEm[d]o[lls]r_ [by Mez]
    doll_tre[ru]mor[s] =
    var='msg' val='YourPleading'/>
    TREMOR

Consider her short codework "trEm[d]o[lls]r_," published on her site and on the Critical Code Studies blog. It is a program that seems to describe (or self-define) the birth pangs of a new world. The work, written in what appears to be XML, cannot function by itself. It appears to assign a value to a variable named "doll_tre[ru]mor[s]", a Mez-ian (Mezozoic?) portmanteau of doll_tremors and rumors. This particular rumor being defined is called the fifth world, which calls up images of the Native American belief in a perfected world coming to replace our current fourth world. This belief appears most readily in the Hopi tribe of North America. A child of this fifth world is "fractures," or, put another way, the tremor of the coming world brings with it fractures. The first, post 2 inscription, contains polymers: a user set to "YourDollUserName," a "3rdperson" set to "Your3rdPerson," a location set to "YourSoddenSelf," and a "spikey" set to "YourSpiKeySelf." The user then becomes a molecule name within the fracture, a component of the fracture. These references to dolls and 3rd person seem to evoke the world of avatars. In virtual worlds, users have dolls. If the first fracture is located in the avatar of the person, the second centers on communication from this person or user. Here the user is defined with "YourPolyannaUserName," and we are in the world of overreaching optimism, in the face of a "msg" or message of "YourPleading" and a "lastword." Combining these two fractures, we have a sodden and spikey self pleading and uttering a last word, presumably before the coming rupture into the fifth world.

As with many codeworks, the presentation layer appears to be the data and logic layer. However, there is clearly another logic layer that makes these words appear on whatever interface the reader is using. Thus, the presentation layer is a deception, a challenge to the very division of layers, a revelation that hides. At the same time, we are compelled to execute the presented code by tracing out its logic. We must take the place of the compiler, with the understanding that the coding structures are also meant to launch our allusive subroutines, that part of our brain that is constantly listening for echoes and whispers. To produce that reading, we have had to execute that poem, or at least step through it, acting as the processor. In the process of writing poetic works as data, she has swapped our traditional position vis-à-vis n-tier systems. Where traditional poetry establishes identity through I's, Mez has us identify with a system ready to process the user who is not ready for the fifth world, whatever that may bring. At the same time, universal or even mythical realities have been systematized or simulated. There is another layer of data that is missing, supplied by the user presumably.
The poem leaves its tremors in a state of potential, waiting to operate in the context of a larger system and waiting for a user to supply the names, pleading, and lastwords. The codework means nothing to the computer. This is not to make some sort of Searlean intervention about the inability of computers to comprehend, but to point out that Mez's code is not valid XML. Of course, Mez is not writing for computer validation but instead relies on the less systematic processing of humans, who work with a far less rigorously specified language structure. Tremors fracture even the process of assigning some signified to these doll_tre[ru]mor[s]. Mez's poem plays upon the layers of n-tier, exposing them and inverting them. Through the close-reading tools of Critical Code Studies, we can get at her inference and innuendo. However, we should not miss the central irony of the work: the data that is hidden, the notable lack of processing performed by this piece. Mez has hailed us into the system, and our compliance begins the tremors that bring about this fifth world even as it lies in potential. N-tier is not the fifth world of interpretation. However, it is a tremor of recognition that literacy in information systems offers a critical awareness crucial to these emerging forms of literature.

FUTURE PROJECTS

Two new projects give a sense of the electronic literature to come. The authors of this paper have been collaborating to create systems that answer Hayles' call at "The Future of Electronic Literature" in Maryland to create works that move beyond the desktop. The "Global Poetic System" and "The LA Flood Project" combine GPS, literary texts, and civic spaces to create art objects that rely on a complex relationship between various pieces of software and hardware, from mobile phones to PBX telephony to satellite technology. To discuss such works fully with the same approaches we apply to video games or Flash-based literary works is to miss this intricate interaction. However, n-tier provides a scalable framework for discussing the complex networking of systems to produce an artistic experience through software and hardware. These projects explore four types of interfaces (mobile phones, PDAs, desktop clients, and web applications) and three ways of reading (literary adaptive texts, literary classic texts, and texts constructed from the interaction of the community). The central piece that glues together the literary information is geolocation. When the interactor in the world is one of the input systems, critics need a framework that can handle complexity. Because of the heterogeneity of platforms on which these systems run, there are multiple presentation layers (e.g. phone, laptop, etc.), multiple parallel processing layers, and multiple sources of information (e.g. weather, traffic, literary content, user routes, etc.), thus requiring an n-tier approach for analysis and implementation. It is clear that as electronic literature becomes more complex, knowledge of the n-tier delineations will be crucial not only to the reception but also to the production of such works. Since the interaction of heterogeneous systems is the state of our world, an n-tier approach will allow critics to open up these works in ways that help identify patterns and systems in our lives.
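The kind of arrangement just described, in which one work spans several presentation tiers and draws on several data sources keyed to the reader's location, can be sketched as follows. The coordinates, fragments, and feeds are hypothetical placeholders, not the implementation of the Global Poetic System or the LA Flood Project; the sketch only illustrates why an n-tier description is needed.

    # Hypothetical sketch: one process tier serving several presentation tiers,
    # drawing on several data sources keyed to the reader's location.
    # (Placeholder data; not the actual LA Flood / Global Poetic System code.)

    # Data tier: literary fragments attached to approximate coordinates,
    # plus a stub standing in for live environmental feeds (weather, traffic).
    FRAGMENTS = {
        (34.05, -118.25): "The river under the city remembers the flood.",
        (34.10, -118.30): "Here the sirens first sounded.",
    }

    def environmental_feed(lat, lon):
        # Stand-in for external sources of information such as weather or traffic.
        return {"weather": "overcast", "traffic": "heavy"}

    # Process tier: select content for a reader at a given position.
    def select_fragment(lat, lon):
        nearest = min(FRAGMENTS, key=lambda p: (p[0] - lat) ** 2 + (p[1] - lon) ** 2)
        context = environmental_feed(lat, lon)
        return FRAGMENTS[nearest], context

    # Presentation tiers: the same selection rendered for different devices.
    def render_for_phone(lat, lon):
        text, _context = select_fragment(lat, lon)
        return text                        # terse: audio prompt or SMS-length text

    def render_for_web(lat, lon):
        text, context = select_fragment(lat, lon)
        return f"<p>{text}</p><p>Conditions: {context['weather']}, traffic {context['traffic']}.</p>"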
DISCUSSION

Let us bring down the great walls of neologisms. Let us pause for reflection in the race for newer new media. Let us collaborate on the n-tiers of information systems to create robust writing forms and the possibility of extending the audiences that are literate in these systems. In this paper, we have described an analytical framework that is useful for dividing works of electronic literature into their forming elements, in a way that is coherent with advances in computer science and information technology while using a language that can be easily adopted by the electronic literature community. This framework places creators, technicians, and critics on common ground. The field does not have a unified method for analyzing creative works; this void is a result, perhaps, of the conviction that works of electronic literature require an element of newness and a reinvention of paradigms with every new piece. Critics are always looking for innovation. However, the unrestrained celebration of the new or novel has led New Media to the aesthetic equivalent of an arms race. In this article we found common elements in all these pieces, bridging the gap between computer science and electronic literature in the hope of encouraging the production of sustainable new forms, be they "stand-alone" or composed of a conglomeration of media forms, software, and users. As works of electronic literature continue to become more complex, bringing together more heterogeneous digital forms, the n-tier model will prove scalable and nuanced enough to help describe each layer of the work without forcing it into pre-set positions for the sake of theory.

We have to ask at this point: how does this framework handle exceptions and increasing complexity? It is interesting to consider how the proposed n-tier model might be adapted to cope with dynamic data, which seems to be the most complex case. Current literary works tend to process a fixed set of data, generated by the author; it is the mode of traversing it that changes. Several software solutions may be used to decide how this traversal is left in the hands of the user or mediated in some way by the author through the presentation system. The n-tier model provides a way of identifying three basic ingredients: the data to be traversed, the logic for deciding how to traverse it, and the presentation which conveys to the user the selected portions at the selected moments. In this way, such systems give the impression that the reader is shaping the literary work by his/her actions. Yet this, in the simple configuration, is just an illusion. In following the labyrinth set out by the author, readers may feel that their journey through it is always being built anew, but the labyrinth itself is already fixed.

Consider what would happen when these systems leave computer screens and move into the world of mobile devices and ubiquitous art, as Hayles predicted they would at the 2007 ELO conference. How could the system cope with changing data, with a labyrinth that rebuilds itself differently each time based on the path of the user? In this endeavor, we would be shifting an increasing responsibility onto the machine which is running the work. The data need not be modified by the system itself. A simple initial approach might be to allow a subset of the data to be drawn from the real environment outside the literary work. This would introduce a measure of uncertainty into the set of possible situations that the user and the system will be faced with. And it would force the author to consider a much wider range of alternative situations and/or means of resolving them. Interesting initiatives along these lines might be found in the various systems that combine literary material with real-world information by using, for example, mobile hand-held devices provided with means of geolocation and networking.
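As a toy illustration of that simple initial approach, the sketch below refreshes one field of the data tier from the reader's environment (here, merely the local hour) on every reading, so that the process tier must react to data the author did not fix in advance. The variant texts, the choice of signal, and the selection rule are all hypothetical.

    # Toy sketch: a data tier with one environment-driven field (hypothetical).
    from datetime import datetime

    # Author-prepared variants of a single passage, keyed by part of day.
    VARIANTS = {
        "morning": "The text wakes with you; the flood is only a forecast.",
        "evening": "By now the water has a history; the text remembers it.",
    }

    def environment():
        """Data drawn from outside the work: here, simply the local hour."""
        return {"hour": datetime.now().hour}

    def select_variant(env):
        """Process tier: react to the environmental data when choosing content."""
        return VARIANTS["morning"] if 5 <= env["hour"] < 17 else VARIANTS["evening"]

    def render():
        """Presentation tier: unchanged in form, though its content varies per reading."""
        print(select_variant(environment()))

    if __name__ == "__main__":
        render()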
With respect to the n-tier model, the changes introduced in the data layer would force additional changes in the other layers. The process layer would grow in complexity to acquire the ability to react to the different possible changes in the data layer. It might be possible for the process layer to absorb all the required changes while retaining a version of the presentation layer similar to the one used when dealing with static data. However, this may put a heavy load on the process layer, which may result in a slightly clumsy presentation. The clumsiness would be perceived by the reader as a slight imbalance between the dynamic content being presented and the static means used for presenting it. The breaking point would be reached when readers become aware that the material they are receiving is being presented inadequately, and it is apparent that there might have been better ways of presenting it. In these cases, a more complex presentation layer is also required. In all cases, enabling the computer to deal with the new types of situation would require the programmer to encode some means of appreciating the material that is being handled, and some means of automatically converting it into an adequate format for communicating it to the user. In these tasks, current research into knowledge representation, natural language understanding, and natural language generation may provide very interesting tools. But, again, these tools would exist in processing layers and would be dependent on data layers, so the n-tier model would still apply.

The n-tier information system approach remains valid even in the most marginal cases. It promises to provide a unified framework of analysis for the field of electronic literature. Looking at electronic literature as an information system may signal another shift in disciplinary emphasis, from a kind of high-theory humanities criticism towards something more like Human-Computer Interface scholarship, which is, by its nature, highly pragmatic. Perhaps a better way would be to try to bring these two approaches closer together and to encourage dialogue between usability scientists and the agents of interpretation and meaning. Until this shift happens, the future of "new" media may be a developmental 404 error page.

REFERENCES

AA.VV. "New Media Poetry and Poetics Special." _Leonardo Almanac_ 14:5, September 2006. <http://www.leoalmanac.org/journal/vol_14/lea_v14_n05-06/index.asp> First accessed 12/2006.

AARSETH, Espen J. _Cybertext: Perspectives on Ergodic Literature_. Johns Hopkins University Press, Baltimore, MD, 1997.

CALVI, Licia. "'Lector in rebus': The role of the reader and the characteristics of hyperreading." In _Proceedings of the Tenth ACM Conference on Hypertext and Hypermedia_, pp. 101-109. ACM Press, 1999.

COOVER, Robert. "Literary Hypertext: The Passing of the Golden Age of Hypertext." _Feed Magazine_. <http://www.feedmag.com/document/do291lofi.html> First accessed 4 August 2006.

ECKERSON, Wayne W. "Three Tier Client/Server Architecture: Achieving Scalability, Performance, and Efficiency in Client Server Applications." _Open Information Systems_ 10, 1.
January 1995: 3(20).

GENETTE, Gerard. _Paratexts: Thresholds of Interpretation_. Cambridge University Press, New York, NY, 1997.

GUTIERREZ, Juan B. "Literatrónica: sobre cómo y por qué crear ficción para medios digitales." In _Proceedings of the 1er Congreso ONLINE del Observatorio para la CiberSociedad_, Barcelona. <http://cibersociedad.rediris.es/congreso/comms/g04gutierrez.htm> First accessed 01/2003.

GUTIERREZ, Juan B. "Literatrónica: Hipertexto Literario Adaptativo." In _Proceedings of the 2o Congreso del Observatorio para la Cibersociedad_, Barcelona, Spain. <http://www.cibersociedad.net/congres2004/index_f.html> First accessed 11/2004.

GUTIERREZ, Juan B. "Literatronic: Use of Hamiltonian cycles to produce adaptivity in literary hypertext." In _Proceedings of The Bridges Conference: Mathematical Connections in Art, Music, and Science_, pages 215-222. Institute of Education, University of London, August 2006.

HAYLES, N. Katherine. "Deeper into the Machine: The Future of Electronic Literature." _Culture Machine_ 5 (2003). <http://svr91.edns1.com/~culturem/index.php/cm/article/viewArticle/245/241> First accessed 09/2004.

HAYLES, N. Katherine. "Storytelling in the Digital Age: Narrative and Data." Digital Narratives conference, UCLA, 7 April 2005.

HILLNER, Matthias. "'Virtual Typography': Time Perception in Relation to Digital Communication." New Media Poetry and Poetics Special Issue, _Leonardo Electronic Almanac_ 14:5-6 (2006). <http://leoalmanac.org/journal/vol_14/lea_v14_n05-06/mengberg.asp> First accessed 25 Sep. 2006.

JACOBSON, I., BOOCH, G., and RUMBAUGH, J. _The Unified Software Development Process_. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA, 1999.

LANDOW, George P. _Hypertext 2.0_. Johns Hopkins University Press, Baltimore, MD, 1997.

MANOVICH, Lev. _The Language of New Media_. MIT Press, Cambridge, MA, 2002.

MARINO, Mark. "Critical Code Studies." _Electronic Book Review_, December 2006. <http://www.electronicbookreview.com/thread/electropoetics/codology> First accessed 12/2006.

MEZ. "trEm[d]o[lls]r_." _Critical Code Studies_, April 2008. <http://criticalcodestudies.com/wordpress/2008/04/28/_tremdollsr_/> First accessed 04/2008.

MONTFORT, Nick. "Cybertext." _Electronic Book Review_, January 2001. <http://www.altx.com/EBR/ebr11/11mon> First accessed 06/2006.

NEA. _Reading At Risk: A Survey of Literary Reading in America_. National Endowment for the Arts, 1100 Pennsylvania Avenue, NW, Washington, DC 20506-0001, 2004.

PAJARES TOSCA, Susana and Jill Walker. "Selected Bibliography of Hypertext Criticism." _JoDI_. <http://jodi.tamu.edu/Articles/v03/i03/bibliography.html> First accessed October 24, 2006.

RALEY, Rita. "Code.surface||Code.depth." _Dichtung Digital_, 2006. <http://www.dichtung-digital.org/2006/1-Raley.htm> First accessed 08/2006.

RODRÍGUEZ, Jaime Alejandro. "Teoría, Práctica y Enseñanza del Hipertexto de Ficción: El Relato Digital." Pontificia Universidad Javeriana, Bogotá, Colombia, 2003. <http://www.javeriana.edu.co/relatodigital> First accessed 09/2003.

RYAN, Marie-Laure. "Narrative and the Split Condition of Digital Textuality." _Dichtung Digital_ 1 (2005). <http://www.brown.edu/Research/dichtung-digital/2005/1/Ryan/> First accessed 4 October 2006.

VERSHBOW, Ben. "Flight Paths a Networked Novel." _IF: Future of the Book_.
December 2007. <http://www.futureofthebook.org/blog/archives/2007/12/flight_paths_a_networked_novel.html> First accessed 01/2008.

WALLACE, Richard S. _Be Your Own Botmaster_. 2nd ed. ALICE AI Foundation Inc., 2004.

WARDRIP-FRUIN, Noah. _Expressive Processing: On Process-Intensive Literature and Digital Media_. Brown University, Providence, Rhode Island, May 2006.

WARDRIP-FRUIN, Noah. "Christopher Strachey: the first digital artist?" _Grand Text Auto_, 1 August 2005. <http://grandtextauto.gatech.edu/2005/08/01/christopher-strachey-first-digital-artist/> First accessed 3 September 2006.

ZWASS, Vladimir. _Foundations of Information Systems_. McGraw-Hill College, NY, 1997.