Budding Cyber Romance

Our physical rapport with computers & virtuality
We’ve been struck by Cupid in our relationship with computers, especially over the past decade or so. In fact, the association between these inanimate objects and ourselves, a connection that previously was strictly utilitarian and impersonal, morphed into a courtship following the 1968 invention of the first virtual reality gear by one of the field’s leading pioneers, Ivan Sutherland. Still, Sutherland’s contribution, called an “ultimate display,” was more a sign of the times than a pivotal development in post-modern man’s increasingly corporeal contact with virtuality and intelligent machines.

Even after Sutherland’s interactive head-mounted display system came about, it was not until the early 1980s that virtual reality began to take off and move beyond the prototypical, and the function of computers slowly began to shift from their initial purpose of calculating data to a subsequent role as a means of simulation and interaction. The change made for a more immediate, sensory experience than wrestling with bits and bytes, which demanded technical savvy and played a big part in relegating computer use to the relative few. The move to a more graphics- and effects-driven technology also helped unleash computing into the hands of the masses, fueling a new way of using and thinking about computers, and about life.

“With the rise of a personal computer culture in the 1980s, more people owned their own machines and could do what they pleased with them. This meant that more people began to experience the computer as an expressive medium that they could use in their own ways,” author Sherry Turkle notes in her 1997 nonfiction book, Life on the Screen.

Swept off our feet
By the 1990s, aided by factors such as the diffusion of computer gaming, color display systems, CD-ROMs and the further development of virtual reality systems, the sparks really started to fly in terms of our increasingly hot-and-heavy kinship with PCs and information/communication technology. Simply put, the primary role of the computer went from crunching numbers to stroking the senses. And the switch was a long time coming.

The computer’s predecessor was invented by Charles Babbage in the early 19th century. His bulky, mechanical contraption, meant solely for processing numbers, eventually evolved into what was to become IBM’s hallmark 1970s-era product. Although manufactured more than a century after the Babbage “arithmetical mill,” the fundamental concept behind the old mold died hard, and the cultural mentality and product design aesthetic along with it, but ultimately it was recast thanks to a number of factors.

Among the key determinants in the changing face of the human-computer connection and the machinery’s evolution into a tool for visualization were market forces and increased access to the equipment, along with technological advancements and a fundamental shift in computer design.

For example, the Macintosh played a monumental part in refashioning the computer design aesthetic, most notably by concealing the machine’s technical inner workings in favor of conveniences such as user-friendly software and interactive desktop icons. Although locked in a market and cultural rivalry with the IBM-style design and its user constituency during the 1980s, the Mac’s post-modern aesthetic became the “standard” design by the 1990s, and its Luddite-calming blueprint had widespread mass appeal. It also had a far-reaching cultural impact in paving inroads for simulation and virtuality.

“The simulated desktop that the Macintosh presented came to be far more than a user-friendly gimmick for marketing computers to the inexperienced. It also introduced a way of thinking that put a premium on surface manipulation and working in ignorance of the underlying mechanism,” Turkle writes. “Even the fact that a Macintosh came in a case that users could not open without a special tool (a tool which I was told was only available to authorized dealers) communicated the message.”

Turkle adds that “The desktop’s interactive objects, its anthropomorphized dialogue boxes in which the computer ‘spoke’ to its user — these developments all pointed to a new kind of experience in which people do not so much command machines as enter into conversations with them. People were encouraged to interact with technology in something resembling the way they interact with other people.”

Convincing perception
In the realm of virtual reality systems in particular, applications have grown increasingly physical, offering engaging experiences on multiple levels, chiefly through graphical displays and audio effects that are ever more convincing. The same goes for gaming in the consumer electronics market.

“As we have seen, VR systems have extended the capabilities of existing machines to the extent that the worlds of computer gaming have become an experience that engrosses the user’s senses — and in professional visual computing for which VR has created more manipulability and realistic models,” explains Ralph Schroeder in his 1996 nonfiction book, Possible Worlds.

In a sense, virtual reality as a technology has come full circle, if you consider Morton Heilig’s 1960s-era invention, Sensorama. Users would duck into the machine, which resembled a game booth, to view cinematic images. True to the invention’s moniker, the machine’s immersive features appealed to the user’s senses and included aromas emitted at certain points in the viewing experience. Interestingly, Sutherland’s invention, although it incorporated the senses to a degree, represented a departure from Sensorama, since his “ultimate display” was meant to be a computer interface for data manipulation. Despite the difference, Sutherland’s contribution, and virtual reality systems in general, are considered part of the genesis of a lasting and wider trend to make computer technology and interaction more user-friendly and experiential.

– Noche Kandora

Photo ID top: The Virtual I/O “i-glasses,” among a number of head-mounted displays geared toward the consumer market.
Photo ID bottom: Charles Babbage’s “arithmetical mill.”
Photo sources: The Fourth Discontinuity by Bruce Mazlish for the “arithmetical mill”; Virtual Realism by Michael Heim for the Virtual I/O “i-glasses.”
