July 05, 2007
The Body in Space
"The body is our first habitation, the building our second. I wanted to use the form of this second body, architecture, to make concentrated volumes out of a personal space that carries the memory of an absent self, articulated through measurement … Bodies and buildings, cities and cells, monuments and intimacies, each of the “rooms ” in (allotment) is someone’s, is connected to the moving body of an individual, alive and breathing."
"Architecture is supposed to be the location of security and certainty about where you are. It is supposed to protect you from the weather, from darkness, from uncertainty. Blind Light undermines all of that. You enter its interior space, which is the equivalent of being on top of a mountain or at the bottom of the sea. It is very important for me that inside it you find the outside. Also you become the immersed figure in an endless ground, literally the subject of the work." - Antony Gormley
From the exhibition website:
Taking the body as its point of departure, the exhibition is an invitation to embark on a journey through different kinds of space. New works include Blind Light: lose yourself in light and vapour in this cloud-filled glass room that is cold, wet and disorientating; a spectacular series of suspended figures in light-infused webs of steel; and a colossal 27-ton structure, Space Station, that tilts precariously, dominating the gallery space.
The exhibition extends beyond the confines of the gallery with Event Horizon, an ambitious urban artwork featuring around 30 sculptural casts of the artist's body on rooftops and public walkways across central London, subtly punctuating the city skyline. Spanning outwards from The Hayward, all the figures face towards the gallery's three sculpture terraces, which form the main viewing platforms for the project as a whole. [via pruned]
June 13, 2007
Hold: Vessel 2, 2007
Lynette Wallworth: Hold: Vessel 2, 2007 :: BFI Southbank, London SE1 :: 23 June - 2 September :: Artist's talk: Lynette Wallworth in Conversation with renowned scientist, writer and presenter Mark Lythgoe - July 3, 18:20 NFT2 :: FREE :: 020 7928 3232
Lynette Wallworth's London debut sees the BFI commissioning Hold: Vessel 2, 2007, enabling the artist to further develop her critically acclaimed piece Hold: Vessel 1, 2001. An interactive, large-scale installation that explores the intimacy and immensity of the natural world and our relationship to it, this work uses moving image and technology to reveal the hidden intricacies of human immersion in the wide, complex world.
The piece is activated by the viewer, the interaction being a metaphor for our connectedness within biological, social and ecological systems. Upon entering the exhibition space, the visitor is encouraged to 'catch' falling projected images of astronomical and underwater life in lens-shaped glass bowls. With intimate moments of synchronised light and sound, the installation celebrates minutiae - microscopic views of marine life forms and photographic imagery of deep space - leaving the visitor with a sense of communal participation within a complex system of which we are a part.
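The 'catch' interaction described above can be sketched as a simple hit test between falling projected images and a tracked bowl. This is a hypothetical reconstruction for illustration only, not Wallworth's actual software; the names and parameters (bowl radius, fall speed) are invented.

```python
from dataclasses import dataclass

@dataclass
class FallingImage:
    x: float          # horizontal position in the projection plane (metres)
    y: float          # height above the floor; 0.0 means it has landed
    caught: bool = False

def step(images, bowl_x, bowl_radius=0.15, fall_speed=0.02):
    """Advance one frame: images fall, and any image that lands
    within the bowl's radius is marked as caught."""
    for img in images:
        if img.caught:
            continue
        img.y = max(0.0, img.y - fall_speed)
        if img.y == 0.0 and abs(img.x - bowl_x) <= bowl_radius:
            img.caught = True
    return [img for img in images if img.caught]
```

A real installation would replace `bowl_x` with camera-tracked bowl positions and render the caught imagery inside the glass vessel.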
The images in Hold: Vessel 2 come from current visioning technologies such as X-ray microtomography and remotely operated light-sensitive cameras, allowing us to see intricate detail inaccessible to the human eye. A reflection upon our own place in a complex and starkly beautiful world, the work helps us explore the interconnectivity between things that we do not always see or know.
This is the second in an ongoing series of art exhibitions at BFI Southbank exploring contemporary artists' use of the moving image.
Lynette Wallworth - Hold: Vessel 1, 2001
Courtesy of and commissioned by the Australian Centre for the Moving Image, Melbourne, Australia http://www.acmi.net.au
Lynette Wallworth - Hold: Vessel 2, 2007
Commissioned by the BFI, London, Produced by Forma Touring and Supported by Arts Council England
May 19, 2007
404 FESTIVAL - ON TOUR / EUROPE 2007
Einstein's Brain Project
[left: Alan Dunning, Paul Woodrow, The Einstein's Brain Project: The Errant Eye, 1997-2001. Virtual reality installation. The participant navigates around a recognizable visual environment, a forest whose outline faded into an abstract visual universe reflecting the variations in biological signals processed in real time by a computer module.]
Astas Romas & 404 Festival have decided to launch a European tour beginning on May 31, 2007. The directors and team of the 404 Festival will visit cultural centers and public and alternative venues, performing live concerts, projections and conferences, and presenting "404 selected" artworks by international authors. Artists from different countries will join the tour, among them SadMb (Japan), Synchdub (Belgium), Sample Mousse (Finland), Guillermo Giampietro & Lara Baracetti (Italy), Einstein's Brain Project (Canada), Vladimir Manovski and Aleksandar Secerov (Serbia), and Miha Ciglar (Slovenia).
"...The cycle of installations in The Einstein's Brain Project (1995-2001) is a major technological detour for Dunning that, nonetheless, re-examines his past conceptual concerns. In this long-term project begun in 1995 with Paul Woodrow and a team of scientists from different fields, Dunning probes the new epistemological models that have developed thanks to technological advances in virtual reality.
The artificial worlds summoned up by the immersive universes often rekindle the presuppositions of a naturalistic project whose aim is to simulate familiar experiences. The interfaces created by Dunning and Woodrow propose a critical counterbalance to the withdrawal to the Cartesian universe. In the wake of recent research in cognitive science, the two men are interested in how biological and brain processes shape our perception of the world.
[Left: Diagram showing the way biological data gathered in real time on the participant's body is altering the display parameters of a three-dimensional virtual universe.] An initial series of installations completed between 1997 and 2001 explored popular culture's fascination with the human brain. Evidence of this fascination is found in Roland Barthes's essay on the fetishism of Einstein's brain, a reflection that serves as a critical point of reference for the installations in this body of work. By reactivating obsolete systems of representation (phrenology, eugenics, etc.), this series also underlines the impact of pseudo-scientific projections on our knowledge of the body and psyche.
In The Fall, The Furnace, The Flesh (1997), participants underwent a sort of ritual as they crossed through a curtain made up of thin vinyl strips. These strips served as a screen for projecting an image of a blazing fire. Participants found themselves in a cubic space defined by four screens. In the middle of the space was an anatomically correct model of the human head covered with touch-sensitive pads (audio-digital). The location of the 55 pads replicated the brain map developed by the phrenologists Franz Joseph Gall and Johann Spurzheim. In the Victorian age, studying the skull's contours over these zones supposedly revealed a person's character traits and psychological predispositions. Dunning recycled the paradigm of phrenology as a means of accessing the installation's touch-sensitive interface. As participants pressed the pads, a series of video segments were projected on the wall. These segments, which came from various sources, showed irreconcilable objects and events that evoked the series of random associations produced by the brain as it assembles fragments of stored memory. Here, the unconscious content could not easily be distinguished from fragments of images from the media sphere. Images appeared erratically: a lunar eclipse, close-ups of the body, a political demonstration, a hall in a museum, a text flashing at a dizzying speed, barely perceptible abstract images. With the combination of images almost infinite, the screen constantly offered new sequences of juxtaposed images.
[Left: Alan Dunning, Paul Woodrow, The Einstein's Brain Project: The Furnace, 1997-1999. Video and sound installation with interactive components. Segments, from various sources, showing irreconcilable objects and events evoking the series of random associations produced by the brain as it assembles fragments of stored memory. Excerpt of the video documentary The Einstein's Brain Project, The Errant Eye, The Furnace, 1998-1999. Go here and click on the first segment under "Multimedia."]
The virtual environment installations The Errant Eye (1997-2001) and The Madhouse (2001) delved into Dunning and Woodrow's premise that the image captured on the retina doesn't always converge with brain activity. In The Errant Eye, the biological data gathered in real time on the participant's body altered the display parameters of a three-dimensional virtual universe. The participant donned a head-mounted display equipped with electroencephalogram (EEG) electrodes that recorded the changing amplitude of brainwaves from the brain's right and left sides. The participant then navigated around a recognizable visual environment, a forest whose outline faded into an abstract visual universe reflecting the variations in biological signals processed in real time by a computer module. Once the feedback process reached the balance sought, the participant could recognize recurring motifs that corresponded to certain types of reactions and perceptions.
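The feedback loop described here, biological signals altering display parameters in real time, can be sketched roughly as follows. This is an illustrative guess at the kind of mapping involved, not the Project's actual code; the parameter names and formulas are invented.

```python
def map_biosignal_to_display(left_amp, right_amp):
    """Map normalised EEG amplitudes (0..1) from the brain's left and
    right sides to display parameters: the calmer and more symmetric
    the signals, the more recognizable the forest remains."""
    arousal = (left_amp + right_amp) / 2.0
    asymmetry = abs(left_amp - right_amp)
    return {
        "scene_opacity": max(0.0, 1.0 - arousal),       # forest fades as arousal rises
        "noise_amount": min(1.0, arousal + asymmetry),  # abstract texture grows
        "drift_speed": asymmetry * 0.5,                 # viewpoint drift from L/R imbalance
    }
```

In an installation, a function like this would run every frame, so the participant's own physiology continuously re-renders the world they see.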
The Madhouse (2001), presented at the gallery Oboro (Montreal, Canada), allowed participants to pool their individual perceptions as they experimented simultaneously with feedback. A luminous life-size cast of the human body lay in the centre of a room and was surrounded by participants in an immersive state. The participants touched the surface of the body, which stored and displayed their handprints and fingerprints, as if the body's material presence were providing them with a kind of anchorage in the physical world. Behind their displays, the participants were catapulted into a virtual world, while viewers on the periphery could observe their erratic gestures, which resembled the spasms of mental patients (hence the work's title). Through this sharing of the immersive experience, which is often deemed autarchic, Dunning and Woodrow's project created a more complex model of a virtual community, one that didn't exclude the bodies of the participants.
A series of installations in development will further explore technological and conceptual aspects begun within The Einstein's Brain Project. Under the working title Worlds in Worlds (WIW), Dunning plans to put together an immersive environment whose boundaries will be defined by the real dimensions of the room the participant is in. Dunning is also interested in the Anatomically Lifelike Biological Interface, which operates via a model reproducing certain bio-anatomical functions. In this vein, he is pursuing research on the properties of ferrofluids, liquid matter that can be altered by an electromagnetic field and modified by biological signals from the human body.
V.B. © 2002 Fondation Daniel Langlois
May 11, 2007
Visual Voice Pro
An Ultra-Responsive Environment
Visual Voice Pro creates an immersive, reactive digital playspace. The installation comprises a sensitive microphone, a computer with a data projector, and custom software written especially for the space. The microphone listens to all of the sounds in the room, from tiny footsteps to laughter to singing or even the banging of a drum. The computer instantly processes the sounds to create abstract, beautiful graphics. Action and reaction are clearly, vibrantly displayed. When the room is quiet, the scene falls still and dark. When there is noise, there is activity. The louder the noise, the bigger the effect.
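The sound-to-graphics mapping described above can be sketched in a few lines. Visual Voice Pro's software is custom and unpublished, so this is only an illustrative approximation: each audio buffer is reduced to a loudness level that drives brightness and particle count, with silence producing a still, dark scene.

```python
import math

def rms(samples):
    """Root-mean-square loudness of one audio buffer (-1..1 samples)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def visual_response(samples, gain=4.0):
    """Louder sound -> bigger, brighter effect; silence -> still and dark."""
    level = min(1.0, rms(samples) * gain)
    return {"brightness": level, "particle_count": int(level * 500)}
```

The `gain` and the 500-particle ceiling are invented constants; in practice they would be tuned to the room and the audience.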
"The entire experience has been designed based on requests from parents and care-givers of people with autism, cerebral palsy, and all sorts of behaviour disorders," says Adam Montandon, "So I am very keen to hear what anybody thinks."
Adam Montandon is the Director of HMC Interactive. You can contact him at Grosvenor House, Belgrave Lane, Plymouth, PL4 7DA, UK :: + 44 (0)845 20 11 462 :: Mob: + 44 (0)772 17 36 021.
April 23, 2007
[PAM] the Perpetual Art Machine
[PAM] the Perpetual Art Machine Trifecta.
[PAM] TO BE FEATURED AT COACHELLA MUSIC AND ARTS FESTIVAL :: April 27 - 29, 2007 :: In its first appearance on the U.S. West Coast, Perpetual Art Machine [PAM] will be a featured multimedia installation at the Coachella Valley Music and Arts Festival, Indio, California, April 27-29.
This year, Coachella is presenting over 100 of the biggest names in music and art, including Björk, The Red Hot Chili Peppers, Rage Against the Machine and Sonic Youth. [PAM] is mounting its largest installation to date at the Empire Polo Fields, a desert oasis just outside Palm Springs, for this sold-out, three-day outdoor event attended by 70,000 people.
Combining touch screen and projection technology, [PAM] gives its users an immersive 5 channel interactive video experience. Entering into the [PAM] installation, the user becomes the curator of some of the most cutting edge video work made this century. [PAM] democratizes the curatorial process by inviting both the artist and the viewer/user to participate on site and online at perpetualartmachine.com.
[PAM] was created in December 2005 as a collaboration between the artists Lee Wells, Raphaele Shirley, Chris Borkowski and Aaron Miller and has had installations and screenings worldwide including the United States, England, Austria, Russia, Brazil, Canada, France and Croatia.
At this year's Coachella festival, [PAM] will premiere its first ever compilation DVD, titled "Perpetual Art Machine 2006-2007: Year One · Volume One," featuring 16 videos highlighting some (but not all) of the best work in the [PAM] archive.
For more information on Coachella go to http://www.coachella.com
For immediate assistance please call Lee Wells at 917 723 2524.
[PAM] TO BE FEATURED IN NEW MEDIA ART LOUNGE at artDC :: April 27 - 30, 2007 :: Perpetual Art Machine [PAM] will be featured as a new media project at the inaugural Modern and Contemporary Art Fair, Washington DC - artDC. Modern, contemporary, high-quality cutting-edge work defines this newest art fair in America.
[PAM] will present a single-channel interactive multimedia installation featuring the artwork of Bill Dolson, Miroslaw Rogala and Raphaele Shirley. The New Media Lounge is organized by Rody Douzoglou, and artDC takes place 27-30 April at the Washington Convention Center, Hall E.
[PAM] co-founder Raphaele Shirley will also be part of the New Media: Exploration and Innovations panel, which will explore new media, an area gaining more attention than any other medium in the current art market. The panel of experts:
Peggy Parsons, Moderator,
Film Department Head of the National Gallery of Art, DC
Chief Curator of the Hirshhorn Museum and Sculpture Gardens, DC
Senior Curator of the Phillips Collection, DC
Consulting Curator of Film and Media of the Smithsonian Museum, DC
Artist and co-founder of Perpetual Art Machine, NYC
For more information on artDC, go to: http://www.dc-artfair.com.
For immediate assistance please call Raphaele Shirley at 917 805 6320.
[PAM] PROFILED IN VIDEO AS URBAN CONDITION PROJECT IN AUSTRIA AND BRAZIL :: Exhibition @ Lentos Kunstmuseum/Museum of Modern Art Linz, Austria :: April 19 - May 27, 2007.
Screening: Fundação Clóvis Salgado/Palácio das Artes, Belo Horizonte, Brazil
April 28, 2007.
Compiled by UK artist, Anthony Auerbach, the Video as Urban Condition project explores how video shapes urban experience.
Video is a medium of mass production, mass participation and mass consumption. Video as Urban Condition recognizes the diversity of activity in the field and challenges the participant to reflect on how the relations of representation in society are mediated by video.
The project reflects on the mutability of video as it shifts between fact and fiction, entertainment and persuasion, urban fantasy and reality-TV, art and activism, surveillance and control - tracing the web of interactions between media and architecture, subject and commodity, identity and desire, the city and its phantasmagoria.
Video as Urban Condition examines a medium whose most distinctive characteristics are multiplicity and diversity, a form that is contained neither by the norms and institutions of art nor by the exclusive domains of professionals.
[PAM] Video Pool contributing artists: Anonymous (Iraq), George Barber (UK), Luis Berrios-Negron (Puerto Rico), Chris Borkowski (US), Josephin Böttger (Germany), Wilson Brown (Brazil), DnasaB (US), G.H. Hovagimyan (US), Stephanie Lempert (US), Aaron Miller (US), Motomichi Nakamura (Japan), Pierre St. Jacques (US), Sanotes (Spain), Raphaele Shirley (US/France), Jeremy Slater (US), Endre Tveitan (Norway), Lee Wells (US)
For more information on Video as Urban Condition go to:
For immediate assistance please call Raphaele Shirley at 917 805 6320.
April 19, 2007
Performance + Discussion
A tele-immersive cross-disciplinary performance piece called The Reception will be presented April 20, 21, 27, 28 at 8pm and April 22, 29 at 2pm as part of the Berkeley Dance Project 2007. The piece was created by the co-directors of SmithWymore Disappearing Acts, Lisa Wymore and Sheldon B. Smith, in collaboration with Ruzena Bajcsy of CITRIS (Center for Information Technology Research in the Interest of Society). Live performance and streamed real-time 3D tele-immersive technology are used to poetically examine the subject of presence. BDP is an annual collection of danceworks presented by UC Berkeley's Department of Theater, Dance and Performance Studies. Performances will take place at UC Berkeley's Zellerbach Playhouse.
The April 22 performance will be followed by a post-performance discussion: Being Here: Presence/Remote Presence within Live and Media Based Performance by N. Katherine Hayles. The discussion will feature a demonstration of a live bi-located dance utilizing the tele-immersion labs at UC Berkeley and the University of Illinois, Urbana-Champaign. Co-sponsored by the UC Berkeley Department of Theater, Dance, and Performance Studies, the Townsend Center Dance Studies Working Group, and the Dance Department and Intermedia Program at Mills College. The discussion is free and open to the public.
February 02, 2007
Truth and Artifice in Nature
Summerbranch is a new commission by igloo that explores movement and stillness in nature. Using camouflage and other disguises, a person or a computer character can blend into a 'natural' environment captured and treated through the moving image. The installation uses the tools of the military-entertainment complex (computer gaming, motion capture, 3D environments and special effects) to question what is truth and what is artifice in our attempts to reproduce nature. Through the creation of a computer-generated virtual world, Summerbranch addresses this through the use of disguise in dance and movement. igloo investigates not only the role of the 'real' in virtual environments but also the reproduction of nature in the history of art, particularly in landscape work.
Summerbranch is the result of a two-month residency at ArtSway in the New Forest. The exhibition, created for three gallery spaces, is a series of intermedia works that can stand alone as individual installations or together as a site-specific installation comprising three parts: a virtual environment, a video installation, and a set of lenticular prints and wallpaper. [via Rhizome]
January 24, 2007
Brigitta Zics: Out of Body Control
Interviewed by Julia Peck
Brigitta Zics discusses the development of a series of virtual reality projects with artist and writer Julia Peck. Zics reflects upon the different approaches she took and outcomes generated within her 'Mirror-Space' and 'Out of Body Control' projects, comparing the experience of working with programmers to develop custom-made software with the opportunity to transpose pre-existing high specification software.
Brigitta Zics, new media artist, discusses two recent projects and the collaborative process that facilitated their production with her colleague from the University of Wales, Newport, Julia Peck. Zics was born in Hungary and has studied in Germany, at the Academy of Media Arts Cologne, and in Britain. Zics' practice concentrates on virtual reality systems and responsive interfaces to provide an immersive and interactive environment for gallery and other audiences. Zics is currently a PhD candidate, researching interface development and real-time visualisation in creative artworks, for which 'Out of Body Control' will be the final major work. Zics and Peck met in the course of undertaking their research projects and have used the interview to extend their discussions of Zics' creative output and her theoretical models.
Julia Peck: Brigitta, we're going to talk about two recent projects, 'Mirror_SPACE' (2004-2005) and 'Out of Body Control' (2004-2006), and the collaborative and conceptual processes that you went through to create these two pieces of art. I'd like to start with 'Mirror_SPACE', as you initiated the project and then located programmers who could help you realise the end artwork. Did you find that the way you thought about the project was quite different from the programmers' way?
Brigitta Zics: Actually, to be honest, this was the first time I had this kind of artistic idea, and I wasn't really sure which form of creative process I would choose or who I was going to work with. I wasn't sure how many programmers I would need, or whether I would just do the concept myself. For me, this project started a new form of creating. Before, I was working by myself, rarely using external help. During the project I realised the whole idea was really complex, therefore I decided to work with people who have specialist knowledge. So as part of the process I was asking people to help me, and when I got these people, we worked together and they provided ideas for the project. But I had a very strong vision of how it should look, so I tried to implement that knowledge and lead them to it. It is also important to say that these people were students, not professionals, so I had to deal with problems such as their not having enough time, etc. It also depended on the money, because I had just a small budget for the project but I wanted to realise quite a complex thing. This was a new form of project management, a new way of expression. At the same time I have to admit that some of the naivety or freshness disappeared from the artwork. More >>
January 18, 2007
@ Tesla/Transmediale 2007
Schwelle is a new media and performance project using cutting edge acoustic and interactive technologies to explore the extreme threshold states of consciousness that constitute human experience. The multi-part project uses High Definition image and multi-channel sound, interactive installation and live performance to create states of consciousness in the spectator akin to the threshold states that one experiences at the edge of trance, sleep and death. For transmediale '07, Tesla presents parts 1 and 2.
Part 1 is a turbulent exploration, told by way of image and sound, of the experience undergone at the time of the dissolution of the body and of consciousness. Part 2 is a live performance in which the audience confronts a lone performer, Michael Schumacher, master improviser and longtime dancer with William Forsythe's Frankfurt Ballet, experiencing the traumatic transition period between death and rebirth. Utilizing wireless sensor networks in the room and on the dancer's body, Part 2 creates a stage environment where light, sound and objects take on their own choreography, performing with Schumacher, breathing and behaving alongside him. Where does the body end and the room begin? What happens in the threshold where body and room merge, mutually influencing and transforming each other?
Schwelle is a co-production between artists and researchers from cultural and scientific institutions in Canada, China, Germany, Holland, and the USA. Schwelle, Part 2 was partially developed during a project residency at Tesla in summer 2006 and will receive its world premiere at "tesla zur transmediale".
Concept/Direction/HD Video/Sound: Chris Salter
Collaboration Sound: Daniel Moody-Grigsby and Philip Viel
Premiere: Tesla/Transmediale 2007, Berlin, February 2007
Concept/Direction: Chris Salter, in collaboration with Michael Schumacher
Dramaturgy: Heidi Gilpin
Lighting Design: Lea Xiao
Sound Design and Programming: Marije Baalman, Daniel Moody-Grigsby, Chris Salter, Philip Viel
Interaction Design/Sensing Systems: Marije Baalman
Objects: Thomas Spier, Flora Luna
Production Stage Manager/Technical Director: Daniel Plewe
Production Assistants: Daniel Wessolek, Alexander Wilson, Brett Bergmann
Tesla: Medien-Kunst-Labor, Klosterstrasse 68, Berlin
Thursday-Saturday, February 1-3, 2007. 20:30
For tickets/more information, please visit http://firstname.lastname@example.org
Christopher L. Salter, Ph.D.
Asst. Professor of Computation Arts
Faculty of Fine Arts
January 15, 2007
37 Isolated Events
Contemporary Butoh Dance and Immersive Video Performance
this ocean is also the desert and I am walking into a minefield, into this installed landscape this land no longer part of the soul, I swallow, I listen, I can see your body cut into foreign lands.
37 Isolated Events begins with the normal running temperature of the human body and gradually fabricates a facsimile body. Within the noise of networked society, our intimate distance and distant intimacy induce a virtual, mediated sensibility. We are anesthetized - our breath mechanized - as the human biological system becomes hybridized with the global system. At thirty-seven degrees Celsius, in isolation, we have unprecedented potential to risk exposure and make contact inside the noise of a growing global network.
concept/direction | paige starling sorvillo
collaborating media artist | lucy hg (LA)
sound artists | imaginationandmymother (UK)
performance/choreography | sorvillo, monique goldwater, isabelle sjahsam, jez lee
lighting design | elaine buckholtz
photography | ian winters
San Francisco Asian Art Museum :: 200 Larkin, Samsung Hall (Civic Center BART)
37 Isolated Events is supported in part by the Zellerbach Family Foundation and Asian American Dance Performances. Paige Starling Sorvillo is honored to be a 2007 CHIME awardee with Marc Bamuthi Joseph.
January 10, 2007
Cinematic and Immersive VR Experiences
The purpose of the Neuronet will be to facilitate cinematic and immersive virtual reality experiences across distances. These will include almost every type of experience imaginable, some of the most obvious being real-time video chat, video streaming, virtual reality travel, history, adventure, gaming, entertainment, sports, hobbies, business, education, medicine and training, to name just a few.
The Neuronet will function similarly to the Internet in its ability to connect users in different locations, but instead of the user interface mechanisms associated with the Internet, it will use virtual reality (VR) technologies to facilitate cinematic and immersive virtual reality experiences for end-users.
Though similar to the Internet in its ability to connect individuals in different locations, the architecture of the Neuronet will be distinctly different. The entire Neuronet will be hosted on one central server system and then be mirrored to Neuronet metro servers in different geographic locations around the globe. This architecture is required to enable the localized transmission speeds required to facilitate immersive and cinematic virtual reality. The Neuronet will also support real-time peer-to-peer VR and gaming applications.
A special language or series of languages will be developed for transmitting VR data over the Neuronet, which may be referred to as Virtual Reality Over Neuronet Protocol (VRONP). In order to avoid data corruption and security risks, VRONP data will not be transferred over the Internet.
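The mirrored metro-server architecture implies that a client would attach to the geographically nearest mirror to keep latency low enough for immersion. As a rough sketch of that idea only (the server names, coordinates, and selection logic below are entirely invented; no VRONP implementation exists to reference):

```python
import math

# Hypothetical metro mirrors: name -> (latitude, longitude)
METRO_SERVERS = {
    "new-york": (40.7, -74.0),
    "london": (51.5, -0.1),
    "tokyo": (35.7, 139.7),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

def nearest_metro(client_lat, client_lon):
    """Attach the client to the closest metro mirror."""
    return min(METRO_SERVERS,
               key=lambda name: haversine_km(client_lat, client_lon, *METRO_SERVERS[name]))
```

In practice, network latency rather than raw distance would decide the attachment, but distance is a reasonable first proxy.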
November 13, 2006
Immersive Virtual Art and the Essence of Spatiality
Char Davies' Immersive Virtual Art and the Essence of Spatiality by Laurie McRobert :: In this first book-length study of the internationally renowned Canadian artist Char Davies, Laurie McRobert examines the digital installations Osmose and Ephémère in the context of Davies’ artistic and conceptual inspirations. Davies, originally a painter, turned to technology in an effort to create the effect of osmosis between self and world. By donning a head-mounted display unit and a body vest to monitor breathing and balance, participants are immersed in 3D-virtual space where they interact with abstract images of nature while manoeuvring in an artificial spatial environment.
Char Davies’ Immersive Virtual Art and the Essence of Spatiality explores spatiality through a broad scope of disciplines, including philosophy, mythology, biology, and visual studies, in order to familiarize the reader with virtual reality art – how it differs from traditional artistic media and why immersive virtual art promises to expand our imaginative horizons. This original study provides us with an important exposition of two of Char Davies’ acclaimed projects and an exploration of the future impact of digital virtual art on our worldviews.
Laurie McRobert is an independent scholar and a former lecturer in the Department of Philosophy at McGill University and the Thomas More Institute for Adult Education.
"What I am trying to translate to you is more mysterious; it is entwined in the very roots of being, in the impalpable source of sensations." - J. Gasquet, Cezanne, quoted by Merleau-Ponty, 'Eye and Mind'.
I have been working in "virtual space" for nearly 10 years, and during that time have produced two major works, the virtual environments Osmose (1995) and Ephémère (1998). Integrating full body immersion, interactive 3-D digital imagery and sound, and navigation via a breathing interface, these works embody a radically alternative approach to immersive virtual space, or what is commonly known as "virtual reality" or "VR". Rather than approaching the medium as a means of escape into some disembodied techno-utopian fantasy, I see it as a means of return, i.e. of facilitating a temporary release from our habitual perceptions and culturally-biased assumptions about being in the world, to enable us, however momentarily, to perceive ourselves and the world around us freshly.
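Navigation via a breathing interface, as used in Osmose and Ephémère (breathe in to rise, breathe out to descend, lean to change direction), might be sketched like this. The thresholds and scaling here are invented for illustration and are not Davies' actual implementation:

```python
def navigate(chest_expansion, lean_x, lean_z, baseline=0.5, speed=1.0):
    """Return a per-frame velocity from breath and balance sensing:
    chest expansion above the baseline (inhaling) makes the immersant
    rise, expansion below it (exhaling) makes them sink, and leaning
    steers horizontally."""
    breath = chest_expansion - baseline
    return {
        "vy": breath * speed,   # vertical: rise on inhale, descend on exhale
        "vx": lean_x * speed,   # sideways drift from lateral lean
        "vz": lean_z * speed,   # forward/back motion from forward lean
    }
```

A body vest would supply `chest_expansion` and the lean values each frame; the point of the design is that locomotion comes from the body itself rather than from a hand-held controller.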
 Osmose (1995) and Ephémère (1998) were constructed with the dedicated participation of the following individuals: John Harrison, custom programming; Georges Mauro, graphics; Dorota Blaszczak, 3-D sonic architecture; and Rick Bidlack, sound composition.
It should be noted that when I say virtual space, I am referring to immersive virtual space, i.e. a computer-generated artificial environment that one can seemingly, with the aid of various devices, go inside. I think of virtual space as a spatiotemporal "arena" wherein mental models or abstract constructs of the world can be given virtual embodiment (visual and aural) in three dimensions and be animated through time. Most significantly, these can then be kinesthetically explored by others through full body immersion and real-time interaction, even while such constructs retain their immateriality. Immersive virtual space is thus a philosophical and a participatory medium, a unique convergence in which the immaterial is confused with the bodily-felt, and the imaginary with the strangely real. This paradox is its most singular power. The firsthand experience of being bodily immersed in its all-encompassing spatiality is key: when combined with its capacity for abstraction, temporality, and interaction, and when approached through an embodying interface, immersive virtual space becomes a very potent medium indeed.
Between 1995 and 2001, more than 20,000 people were individually immersed in the virtual environments Osmose and Ephémère. A common response to the experience is one of astonishment: many "immersants" have described their experience in euphoric terms while others have inexplicably wept. As one participant wrote six months afterwards: '[This experience] heightened an awareness of my body as a site of consciousness and of the experience and sensation of consciousness occupying space. It's the most evocative exploration of the perception of consciousness that I have experienced since I can’t remember when.'
Such responses suggest that immersive virtual space, when approached in an unconventional way, can indeed provide a means of perceiving freshly. The medium’s paradoxical qualities may effectively be used to redirect attention from our usual distractions and assumptions to the sensations of our own condition as briefly embodied sentient beings immersed in the flow of life through space and time." Continue reading Virtual Space by Char Davies. Published in SPACE in Science, Art and Society. François Penz, Gregory Radick and Robert Howell, eds.; Cambridge, England: Cambridge University Press (2004), pp. 69-104, illus.
October 31, 2006
A Tele-Immersive Collaboration
Synthecology combines the possibilities of tele-immersive collaboration with a new architecture for virtual reality sound immersion to create an environment where musicians from all locations can interactively perform and create sonic environments.
Compose, sculpt, and improvise with other musicians and artists in an ephemeral garden of sonic lifeforms. Synthecology invites visitors into this digitally fertile space to create a musical sculpture of synthesized tones and sound samples provided by web inhabitants. Upon entering the garden, each participant can pluck contributed sounds from the air and plant them, wander the garden playing their own improvisation, or collaborate with other participants to create/author a new composition.
As each new 'seed' is planted and grown, sculpted and played, this garden becomes both a musical instrument and a composition to be shared with the rest of the network. Every inhabitant creates, not just as an individual composer shaping their own themes, but as a collaborator in real time who is able to improvise new soundscapes in the garden by cooperating with other avatars from diverse geographical locations.
Virtual participants are fully immersed in the garden landscape through the use of passive stereoscopic technology and spatialized audio to create a networked tele-immersive environment where all inhabitants can collaborate, socialize and play. Guests from across the globe are similarly embodied as avatars throughout this environment, each experiencing the audio and visual presence of the others.
Participants from the WWW use a browser interface to contribute sound elements to the garden environment for use as compositional items. All the while, this real-time composition is streamed through web broadcast of the virtual environment to illustrate the audio-visual transformation of the garden. Broadcast throughout the entirety of the festival, Synthecology will celebrate the possibilities of collaboration, improvisation, and distributed authorship that exist on the horizon of an increasingly interconnected world.
As current advances in networking become commonplace, the creation of collaborative environments connecting remote individuals will become less involved. By augmenting the possibilities for users to share sensory presence through tele-immersive interfaces, Applied Interactives intends to combine the possibilities of real-time collaboration and socialization with the dynamics of digital creation and manipulation. Synthecology is a speculative glance at how the technology of today may be utilized to create new autonomous zones for sampling & re-mixing culture.
Synthecology is being created as a collaboration of students and faculty from the Electronic Visualization Laboratory at the University of Illinois at Chicago, The School of the Art Institute of Chicago, and Columbia College Chicago, and art(n) through the Applied Interactives organization.
ABOUT APPLIED INTERACTIVES
The purpose of Applied Interactives, NFP is to educate the art and science community about the medium of Virtual Reality as an interactive, computer-generated, immersive computer graphics environment. Applied Interactives, NFP plans to advance the medium through research and experimentation as well as provide a bridge to bring the technology out of institutional labs and into more publicly accessible arenas. Applied Interactives, NFP intends to propagate the medium by providing support and direct access to the resources necessary for artists and scientists to exhibit and develop works in the medium.
September 06, 2006
Exploration of the Mind's Receptiveness to Suggestion
Silent Sound in Liverpool, U.K. :: PERFORMANCE: THURSDAY 14TH SEPTEMBER :: EXHIBITION: 15TH SEPTEMBER - 26TH NOVEMBER :: Artists Iain Forsyth & Jane Pollard invite you to a uniquely immersive experience which explores the mind's susceptibility to subliminal suggestion.
In 1865 Victorian performers The Davenport Brothers presented a public séance from inside their spirit cabinet on the stage of the Small Concert Hall in St. George's Hall, Liverpool. Unused for over twenty years, the Hall will reopen for Silent Sound, where the artists will perform on the same stage used by The Davenports. The evening will be introduced by Doctor of Parapsychology Ciarán O'Keeffe and will debut a specially commissioned original score written and performed by Jason Pierce of the band Spiritualized.
The performance takes place during the Liverpool Biennial on Thursday 14th September 2006. For more information regarding obtaining tickets for the performance please email: tickets[at]silentsound.info.
From Friday 15th September Silent Sound will feature in the inaugural programme at Greenland Street, A Foundation's major new contemporary art space - three former industrial buildings in the heart of Liverpool's Baltic Triangle which have been transformed into one of the largest and most challenging exhibition spaces for contemporary art in the country.
An ambisonic audio recording of the live performance, created in collaboration with Arup Acoustics, will be incorporated into a large-scale immersive installation specifically created by the artists to continue the event's exploration of the mind's potential receptiveness to suggestion. The recording will be played back in the installation in three dimensions, reproducing the sound as it was actually heard in St. George's Hall.
A limited edition signed and numbered Compact Disc recording will be available from Monday 18th September, with a DVD documentary to follow in 2007. All graphic design for the project has been executed by Chris Bigg and Vaughan Oliver at v23.
More information: http://www.silentsound.info
August 07, 2006
The Travels of Mariko Horo
Being Mariko Horo
The Travels of Mariko Horo by Tamiko Thiel: Sometime between the 12th and the 22nd centuries Mariko Hōrō, Mariko the Wanderer, journeys westward from Japan in search of the Buddhist Paradise floating in the Western Seas. She does find Paradise, but finds also a chilling, darker side to the West, an island where lost souls are held in an eternal Limbo. She encapsulates her impressions of the places she visits in a series of 3D virtual worlds and invites you to see the West through her eyes.
The Travels of Mariko is an interactive 3D virtual reality installation. The image is generated in real time on a fast gaming PC and projected on a large 9 x 12 screen to produce an immersive experience. Users move their viewpoint through the virtual environment with a joystick or similar navigational input device. Mariko is a fictitious character Thiel invented to incorporate the viewpoint for the project - users will never actually see Mariko, except perhaps in a mirror. In essence they will be Mariko, seeing the exotic and mysterious Occident through her eyes and experiences.
The virtual environment is sensitive to the user's presence, changing around them as a result of their movements and actions: An empty church fills with saints who vanish into the heavens. A basilica transports the user directly into the Western Cosmos, where angels sing the praises of the Goddess of Compassion. A pavilion takes users deep into the underwater realm of the Heavenly King. A plain wooden chapel leads into a Limbo of constant torment.
Music for Mariko Horo is embedded in the piece itself, localized to specific places within the 3D world. The composer Ping Jin, Professor of Music at SUNY/New Paltz, studied music both in his native China and in the USA. Ping describes the music as "creating a sonic dimension for Mariko's meditation on the mythic West. Created from both sampled and computer generated sounds, there are fusions and juxtapositions of Eastern and Western sounds to enhance the scene and mood of each section."
As the 2006 winner of the "Young Art/New Media" prize of the City of Munich, curated by Bettina Wagner-Bergelt and Dr. Stefan Urbaschek, "The Travels of Mariko Horo" will form an interactive 3D stage set for a specially commissioned Butoh dance performance by Shinichi Iova-Koga (www.inkboat.com) and Ishide Takuya during the Dance2006 Festival in Munich.
See also Mariko Horo's Logbook: This is a "preview" or "trailer" that Thiel did as a brief narrative commentary on the much larger, non-linear VR installation. It was published in the June edition of the online art journal mark(s).
July 18, 2006
An Absurd Quest
Human Trials -- by Josephine Anstey, Dave Pape, and Sarah Bay-Cheng -- is simultaneously a public / private and embodied / disembodied performance. One user enters an immersive VR and is led on an absurd quest. The challenges appear to be about control and the choices one makes with power; but the games are rigged, the characters are duplicitous, the quest is a decoy, and the underlying test is how to cope with disempowerment. Meanwhile the experience is screened for a voyeuristic audience primed by reality TV. The audience members simultaneously watch multiple viewpoints of the virtual world, while live performers, networked into the VE, attempt to entangle the protagonist in their improvisational machinations.
The performers play two characters, Patofil and Filopat, who engage the participant in a set of overt challenges involving computer-controlled characters and dynamic virtual sets. Beyond these obvious tasks, the participant must also interpret and negotiate a subtext about world views, relationships and alliances. The participant's reactions are logged, interpreted psychologically, and affect the characters' behavior, the presentation of further challenges, and the ending. Although we expect the story to follow a basic arc based on a storyboard, our script/improvisation notes for the actors are evolving during performances.
Human Trials is designed for CAVE or CAVE-like, tracked, immersive VR systems: 3-D stereo displays with one large screen or multiple screens, and/or HMDs. Ideally the participant and two human actors each enter the virtual environment from their own VR system. In effect the actors are manipulating life-size puppets, since their tracking systems animate the avatars of Filopat and Patofil that the participant sees. A fall-back position is to have the actors operate their puppets from monitors without tracking systems; in this case they can still navigate their puppets wherever they need in the virtual environment.
Human Trials builds on Josephine Anstey and Dave Pape's previous experience building dramatic VR, The Thing Growing, and Networked VR projects, and Sarah Bay-Cheng's experience with drama and puppetry. Human Trials is a sister project to The Trail The Trial, experimental research focused on building intelligent agents to take the place of the human actors.
July 07, 2006
Sensing Place Through a Complex Web of Relations
Presented by the Brisbane Festival 2006, and Centre Of Contemporary Art, Cairns and conceived and developed by Transmute Collective, Intimate Transactions is a new type of interactive installation that allows two people, located in geographically separate spaces, to interact simultaneously using only their bodies. As this highly immersive experience evolves through digital image, sound and tactile feedback, each person begins to sense their place in a complex web of relations that connects them and everything else within the work.
VENUES: The Block, QUT Creative Industries Precinct, Musk Ave, Kelvin Grove & Centre of Contemporary Arts Cairns. DATE & TIME: Sat 15 - Sat 22 Jul; 11am - 2pm & 3pm - 6pm daily; Individual sessions every half hour. DURATION: 30 minutes per session. TICKETS: FREE event. BOOKINGS: Brisbane: 07 3864 5495; Cairns: 07 4050 9493.
Intimate Transactions has been shown extensively in Europe and Australia. It was awarded an Honorary Mention in the Prix Ars Electronica Interactive Arts category, 2005.
July 06, 2006
Hyper-Instruments and Immersive Spaces
VIRTUAL MUSIC COMPOSITION: Unsatisfied with just creating virtual plant life, a provocative multiplayer game, and ambitious machinima, Robbie Dingo has also been hard at work creating virtual world musical instruments that actually play in-world in real time. He extensively documents the creation of his "Hyper-Instruments" on his blog here and here, and recently sent me word of a song-writing contest for music composed to be performed on his new SL cello.
ADVENTURES IN IMMERSION: Seasoned RL/SL architect Jauani Wu takes us on a personal tour of successfully immersive spaces in Second Life, accompanied by a 3D designer's manifesto written e.e. cummings style: immersion requires depth. it's not sufficient for one enclosed space to be finely constructed. it requires that the next space be so as well. and the space after that. immersion works better when the surface of one space frames the next. it reinforces the notion of world. By my lights, mandatory reading for builders looking to truly transform the world according to their vision. Read it all here. [posted on New World Notes]
June 28, 2006
TRG [Transreality Generators]
The 'Irreal' in New Media Art
TRG [Transreality Generators]--by FoAM--is a project that builds upon two previous immersive installations / responsive environments: TGarden and txOom. All three projects are concerned with the 'irreal' in new media art. By 'irreal' we mean artworks that provoke a tension or imbalance between tangible reality and imaginary worlds. With responsive environments, the line between the real and the imaginary can be very thin, allowing for the experience to become 'irreal' - where the participants fade in and out of physical reality, uncertain of their position in this 'reality continuum'. TGarden was designed to allow human gestures to use video and audio as calligraphic media, to 'write and draw' the immersive environment on the fly. txOom extended the concept and became an 'irreal ecology' where media would grow based on their interaction with the participants. TRG changes scale once more, to infinitely large and infinitely small 'irreal universes', whose existence is highly unstable and unpredictable, where minuscule local interactions can conjure up the lives of massive worlds.
The conceptual framework within which the artists develop their concepts and designs is 'simulation of physics and physicality of simulation'. In order to be able to 'mix' physical and virtual reality in interesting ways, the artists explore the basic principles that constitute our universe (gravity, electromagnetism, nuclear forces) and create media worlds as new universes, where physical laws become amplified, distorted, enlarged etc. To the audience immersed in the MR environment, these worlds should 'feel' familiar and intuitive, but upon closer inspection their strange and magical properties can become apparent. The TRG team creates rich media worlds consisting of materials, objects, interfaces and architectures that allow a seamless transition between physical reality and the computational worlds. By correlating some of the more interesting properties of physical systems (such as fundamental physical forces, string theory and membrane physics, tensegrity principles, etc.), and abstracting them, there is the potential to create new (coherent) laws within the media-worlds.
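The idea of keeping familiar physical laws but distorting their constants can be sketched in a few lines. This toy simulation is purely illustrative (the function name, numbers and amplification factor are invented, not drawn from TRG's actual media worlds): gravity behaves as expected, only stronger.

```python
def fall(height, g=9.81, amplify=3.0, dt=0.01):
    """Drop a particle under an 'amplified' gravitational constant.

    Hypothetical sketch: the law (uniform acceleration) stays familiar,
    but its constant is distorted by `amplify`, so the simulated world
    feels intuitive yet subtly strange.
    """
    y, v, t = height, 0.0, 0.0
    while y > 0.0:
        v += g * amplify * dt  # distorted gravity
        y -= v * dt
        t += dt
    return round(t, 2)  # seconds until the particle hits the ground
```

Comparing `fall(10.0, amplify=1.0)` with `fall(10.0, amplify=3.0)` shows the same qualitative behavior on a compressed timescale, which is roughly the perceptual effect the text describes.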
The project focuses on Mixed Reality (environments containing significant virtual and physical interaction possibilities, strongly intertwined) and exploring its implications in the cultural sphere. Mixed Reality (MR) enables the public to be immersed in multisensory, responsive media environments capable of reacting and evolving, influenced by human activity and interaction. The participants leave the role of the observers and become creators of a temporary autonomous reality. In TRG, an international team of artists and technologists explore the frontiers of this unique form of creative expression through the integration of self-contained systems of media, materials and structures, perceived as a distinct field of reality, embedded in the physical objects and architecture. This project examines the potential to extend their artistic practice into the field of situational experience design, in which the art-works become all-encompassing art-worlds. [Related]
June 08, 2006
"Monolith[s]" by Michael Takeo Magruder
Monolith[s] juxtaposes two icons of British culture: stone circles (Stonehenge in particular) and the British Broadcasting Corporation (BBC). "We are in a gravitational pull of past and future." (1)
Lacking declarative evidence of its original purpose, Stonehenge is a site of contested meaning. "It suffers from polysemia, in that it signifies a range of meanings, discursively contested through image and text." (2) Visitors are kept at a distance, no longer permitted to walk among the stones and physically experience their immense scale.
In "Monolith[s]," Magruder has appropriated the perspective that many images of the monument give: that of the majestic site at a distance, the glow of the sunset or moonrise radiating from the horizon, its backdrop. But Magruder's virtual world IS approachable—indeed, the user may immerse herself in it or fly above it. As she does, the temporal and spatial dimensions of her own immediate environment are absorbed and rearranged into a constantly evolving virtual realm in which the history of the Information Age materializes.
Formulated according to motifs and proportions of ancient architecture, infused with fundamental mathematics of modern digital communication systems, each genesis of the artwork's geometry is unique. Variables such as the time of day, the viewer's location on the Earth, and the position of the Earth around the sun are incorporated into the artwork, thus instilling into the realm functions of a rudimentary clock, global positioning system, and solar calendar.
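How environmental variables like these might parameterize a unique "genesis" of geometry can be sketched as follows. This is a hypothetical illustration only: Magruder's actual derivation is not published, and the function name and hashing scheme are invented. It simply shows time of day (clock), viewer coordinates (positioning), and day of year (solar calendar) folded into one deterministic seed.

```python
import hashlib
from datetime import datetime, timezone

def genesis_seed(lat: float, lon: float, now: datetime) -> int:
    """Fold clock, position, and solar-calendar data into one seed.

    Hypothetical sketch: the same place and moment always yield the
    same seed, so each generation of geometry is unique to its
    spatiotemporal circumstances.
    """
    day_fraction = (now.hour * 3600 + now.minute * 60 + now.second) / 86400.0
    year_fraction = (now.timetuple().tm_yday - 1) / 365.0  # position in orbit
    raw = f"{lat:.2f}|{lon:.2f}|{day_fraction:.4f}|{year_fraction:.4f}"
    return int(hashlib.sha256(raw.encode()).hexdigest(), 16) % (2**32)

seed = genesis_seed(51.5, -0.1, datetime(2006, 6, 8, 12, 0, tzinfo=timezone.utc))
```

A seed like this could then drive any pseudo-random geometry generator, making the virtual realm a rudimentary clock, positioning system and calendar at once.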
Requirements: The technical specifications are detailed on the Setup/Help page. Please read them before proceeding.
Michael Takeo Magruder is an American artist based in the UK who received his formal education at the University of Virginia, USA, graduating with a degree in biological science. His artistic production has been exhibited worldwide and encompasses an eclectic mix of forms ranging from futuristic stained-glass windows, digital light-screens and modular sculptures, to architectural manipulations, ephemeral video projections and interactive net-installations. His work seeks to reflect upon the dualistic nature of media as both information source and cultural stimulant.
(1) Jeanette Winterson, "Weight"
(2) Andy Letcher, Jenny Blain, & Robert J. Wallis, "Re-viewing the Past: Discourse and Power in Images of Prehistory."
May 03, 2006
Schwelle I: Bardo
HDV Tryptic for Three Large Screens and Eight Channel Audio
At Elektra7, Thursday, May 11, 2006, 9:00 PM; Usine C, Montreal, Canada.
Schwelle I: Bardo--by Chris Salter--is a turbulent exploration of the experience undergone at the time of the dissolution of the body and consciousness. It is part I of a larger three part performance project currently under development that uses new sensing, electronic, material and computational technologies to tangibly explore the extreme physical and emotional states that might take place in the thresholds before and after death.
Part 1, Bardo, is an extended audio/visual composition/performance lasting approximately forty-five minutes. Partially shot in High Definition video, the work consists of three overlapped and synchronized 720 x 480 projections that form a 2160 x 480 widescreen resolution image with 8 channels of digital surround sound. Over the duration of forty minutes, the spectators undergo a powerful journey through a transforming landscape of abstract, Rothko-like image and dense, tsunami-like walls of sound that builds toward peak intensity, transporting the viewer through the threshold stages of dying and dissolution.
February 27, 2006
Truth and Artifice in Reproductions of Nature
Summerbranch is a new commission by Igloo that explores movement and stillness in nature. Using camouflage and other disguises, a person or a computer character can blend into a 'natural' environment captured and treated through the moving image. This installation uses the tools of the military-entertainment complex: computer gaming, motion capture, 3D environments and special effects to question what is truth and artifice in our attempts to reproduce nature. Through the creation of a computer generated virtual world Summerbranch seeks to address this through the use of disguise in dance and movement. Igloo not only investigate the role of the 'real' in virtual environments but also that of the reproduction of nature in the history of art and particularly landscape work.
4 March - 30 April 2006: Reception for the artists: Saturday 4 March 2006, 2pm - 5pm. Summerbranch: A Capture 4 co-commission between ArtSway, SCAN & Arts Council England.
Exhibition Related Events at ArtSway:
Motion Capture Demo Day: Monday 27 March 2006, 10:30am - 3:30pm: An opportunity for college students, lecturers and heads of departments to visit ArtSway for a demonstration day of Motion Capture Technology by experts Animazoo. Igloo will also be demonstrating this equipment in relation to their work. FREE
Gallery Talk: Saturday 8 April 2006 at 2pm: Join Igloo artists Ruth Gibson and Bruno Martelli on an informal tour of the exhibition as they discuss their work with SCAN Director Helen Sloan. FREE
Digital Reality Flythrough: Tuesday 7 March, 6.30 - 8.30pm--DANA Centre, Queens Gate, London SW7 5HE: A seminar looking at realism in CG special effects and gaming. Representatives from software, gaming, arts and special effects will examine current trends in the real and hyper-real. As part of this seminar Igloo will present their research for Summerbranch. Organised by SCAN and Science Museum as part of Node.L Digital Media Festival, London (Free - Please contact DANA Centre for booking: 020 7942 4040 or tickets[at]danacentre.org.uk)
Summerbranch: Ruth Gibson & Bruno Martelli
Igloo Collaborators: Mark Bruce, Joanne Fong, Alex Jevremovic, John McCormick, Adam Nash, Alex Woolner
Many thanks to Henry Dalton, Lisette Punky Pixie, Matthew Andrews, Gillian Carnegie, Toby Zeigler, Verushka & everyone at ArtSway
Industry Support: Animazoo, RMIT, Coventry University, Bionatics, 3TRPD
PRESS + MEDIA:
If you require full press release, additional images or access to artists please contact Adelina Jedrzejczak on +44 (0)1590 682260 or by email adelina[at]artsway.org.uk
ArtSway, Station Road, Sway, Hampshire SO41 6BA
Tel: +44 (0)1590 682260 E: mail[at]artsway.org.uk
February 08, 2006
Keith Armstrong, Charlotte Vincent, Guy Webster
Shifting Intimacies is an interactive/media artwork by Keith Armstrong, Charlotte Vincent and Guy Webster that invites the participant to meditate upon and witness the human body disintegrating and transforming whilst in motion.
Each participant enters a large, dark space (20m x 8m) alone. They see two circles of projected film imagery, one on a floating disc of white sand and the other on a circle of white dust. Sounds sweep up and down the space through surround sound, whilst participants’ movements direct and affect the filmic image and audio experience. Throughout the work a layer of dust (an artificial life form) slowly eats away and infuses itself deep into the imagery. This immersive work invites differing states of meditation, exploration, stillness and play and moves through states of eternally shifting balance in ways that produce a heightened awareness of the body.
The work uses a range of technologies including interactive video (Very Nervous System), body heat sensors, custom built electronics, image databases, real time computational synthesis software (Opcode Max), networking software, real time audio digital signal processing (Max MSP) and real time show control protocols. Controllable actuators also move physical material through the air.
Historic Project Blog: http://www.embodiedmedia.com/SI/si.html
Event: Shifting Intimacies, Capture 4 Award, 2006
Venue: The Institute of Contemporary Art (ICA), The Mall, London
Times: 16 February (noon) until 21 February (7.30pm)
Contact: ICA Box Office
Phone: +44 20 7930 3647
Vincent Dance Theatre: http://www.vincentdt.com/curshifting.htm
C4 Home: http://www.portlandgreen.com/capture4/
January 19, 2006
Computational Poetics at Upgrade Vancouver
One River (running)
Join us Thursday, January 26th at 8pm for a discussion with Kenneth Newby, Martin Gotfrit, Aleksandra Dulic and Dinka Pignon, who will show us One River (running) and talk about their work with the Computational Poetics Research Group. As always, we will adjourn after the talk to the Whip Gallery [209 east 6th avenue] for a drink and some chatter.
One River (running) is an interactive, immersive audio environment designed to create both a visual and audio experience of rivers using a complex system of moving sound, moving images, and a physical structure designed to echo the river's undulating geographic form. The artwork's images originate from digital photographs of the mouths of people whom the team interviewed. These still images were then animated algorithmically: the mouths recognize voices and move as though they are speaking the words they hear, synchronized by a voice recognition software program the artists created. The video is listening and responding to the audio. The piece was recently exhibited at the Surrey Art Gallery.
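The principle of video that "listens" to audio can be sketched very simply. The artists' actual system used custom voice recognition, which is far more sophisticated; this hypothetical example (all names invented) shows the minimal version of the idea, where the loudness of each audio frame drives how far a still photograph of a mouth is opened.

```python
def mouth_openness(samples, frame_size=1024):
    """Map audio amplitude to per-frame mouth openness.

    Hypothetical sketch of audio-driven animation: RMS energy of each
    audio frame is normalized to 0.0 (closed) .. 1.0 (fully open).
    """
    frames = [samples[i:i + frame_size] for i in range(0, len(samples), frame_size)]
    rms = [(sum(s * s for s in f) / len(f)) ** 0.5 for f in frames if f]
    peak = max(rms) or 1.0  # avoid division by zero on silence
    return [r / peak for r in rms]
```

Each returned value would warp the mouth image for one video frame, so silence keeps the mouth closed and speech animates it.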
Kenneth Newby, Martin Gotfrit and Aleksandra Dulic are part of the Computational Poetics Research Group, a research project that works at the intersections of art, culture and computation and aims to articulate some of the features of an emergent poetics of digital art performance while developing a tool-set to enable artists working in the computational medium to create, present and document their work. A key objective of their work is to share the compositional process and the issues that arise in the work of interdisciplinary computational media performance. Contemporary computational techniques enable creative and performing artists to enter into new collaborative relationships with encoded systems.
When: Thursday, January 26 at 8pm
Where: Western Front, 303 East 8th Avenue, Vancouver, Canada FREE! Everyone welcome.
December 02, 2005
Move Freely in a Fully Immersive Environment
VirtuSphere revolutionizes the way humans interact with computers. The method and system provide infinite space and the most immersive experience for simulated training, exercise and gaming.
The VirtuSphere platform consists of a large hollow sphere that sits on top of a base and allows the sphere to rotate 360 degrees. Wearing a wireless, head-mounted display, users can step inside the sphere to fully interact in immersive virtual environments. The VirtuSphere enables 6 degrees of freedom – one can move in any direction; walk, jump, roll, crawl, run over virtually unlimited distances without encountering real-world physical obstacles.
VirtuSphere systems are made to client specifications and typically include an easy-to-assemble sphere, a base platform that enables it to rotate, a head-mounted display, 3D sensors, sphere rotation trackers, a computer, device drivers and 3D software applications. [via Rhizome]
October 24, 2005
Calibrate an Artificial Immune System
Fugue is the result of a collaboration between artists, a new music composer and computer scientists. The result is an on-going project which provides a new way of communicating complex scientific ideas to any audience. Immersive virtual reality and sound provide an interactive audiovisual interface to the dynamics of a complex system – for this work, an artificial immune system. As well as providing the greatest immersive effect currently available, this technology offers the potential to control and calibrate particular audio-visual elements.
The aesthetics of the Fugue is emergent, based upon the essential, fundamental and hidden beauty of the organic processes manifested through the dynamics of the real-time generated, unpredictable algorithm. The piece is set up as an interaction between a virtual (artificial) immune system and a human participant. Participants are able to see and interact with immune cells flowing through a lymphatic vessel and understand how the complex dynamics of the whole are produced by local interactions of viruses, B cells, antibodies, dendritic cells and clotting platelets.
The sound, envisaged as a ‘mental soundscape', a resonance of the function of the immune system in the body, provides a major channel for interaction. In the future, cycles such as circadian rhythms, or other inputs such as stress level, will be introduced by overlaying and modulating the sonic pulsation. The Artificial Immune System will ‘inhabit' the virtual space of the master server computer at CS UCL, which will run the Artificial Immune System continually, providing the possibility of being displayed on different interfaces. [via]
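The central claim here, that complex global dynamics emerge from purely local cell interactions, is the defining feature of agent-based immune models. A deliberately minimal sketch (this is not the Fugue system itself; the function, rates and population structure are invented for illustration) might look like:

```python
import random

def step(population, infection_rate=0.3, clearance_rate=0.5, rng=random):
    """One tick of a toy agent-based immune model.

    Hypothetical sketch: each virus may replicate, and each B cell that
    encounters a virus may clear it. No global equation is solved; the
    population curve emerges from these local stochastic events.
    """
    viruses, b_cells = population["virus"], population["b_cell"]
    new_viruses = sum(1 for _ in range(viruses) if rng.random() < infection_rate)
    cleared = sum(1 for _ in range(min(viruses, b_cells)) if rng.random() < clearance_rate)
    return {"virus": max(0, viruses + new_viruses - cleared), "b_cell": b_cells}

pop = {"virus": 100, "b_cell": 80}
for _ in range(10):
    pop = step(pop)
```

In a piece like Fugue, each simulated entity would additionally carry a visual and sonic representation, so the emergent dynamics become an audiovisual experience rather than a plotted curve.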
October 07, 2005
Improvisatory in Nature
Contained--by MIX--is a collaborative and interactive hybrid performance/installation that utilizes technology to blur artistic boundaries and challenge notions of presentation/audience. The artists have created a generative, improvising, real-time multimedia environment using both commercial and custom designed software in conjunction with dedicated hardware sensors, motorized robotics and real-time processing of text, video, images and sound. Contained is improvisatory in nature: principles of movement improvisation are employed in the creation of the imagery, sound, and movement generated by the system.
October 7, 8, 2005; 7:30pm; Doane Dance Performance Space; Denison University campus, Denison University, Granville, Ohio 43023 USA • 1-800-DENISON; Performance with Artists' Talk to follow.
Sandy Mathern-Smith, dance,
Alexander Mouton, Art,
Micaela de Vivero, Art,
Christian Faur, Art, Digital Technology
Aaron Fuleki, Music, Digital Technology
Marlon Barrios Solano, dance and interactive art
Mary Sykes, Lighting Designer
September 29, 2005
Interface [s] Montréal
[Immersion and Virtual Reality]
[Immersion and Virtual Reality] Beyond reality and interactive reality experience--From aerospace to surgery, and in all things game related, the simulators and immersion environments developed to serve humankind are indispensable tools that significantly improve human knowledge and enhance our reality experience.
Speakers: Yves Gonthier - Canadian Space Agency; Jean-Claude Artonne - Immervision; Jocelyn Faubert - Université de Montréal, École d’optométrie; Carl-Éric Aubin - École Polytechnique de Montréal, Dép. génie mécanique & CHU Sainte-Justine; Luc Courchesne - Université de Montréal et Ideaction.
TUESDAY, 4th October 2005 Interface [s] Montréal :: [Immersion and Virtual Reality]; 5:30pm - $30 at the door - package deals available.
[Yves Gonthier] Canadian Space Agency: A Real-Time Simulator for 3D Mental Image Reconstruction On-Board the International Space Station--The operations of manned and unmanned space vehicles and their associated docking and robotics systems require significant crew training, both on the ground and on orbit. A number of psychological and physiological factors are known to affect crew on-board performance. Because skills degrade over time, the frequency and depth of proficiency and refresher training need to be studied. A new experiment is currently being designed to study 3D mental image reconstruction for tasks involving the operation of all robotic components of the Mobile Servicing System of the International Space Station. The long-term goal of this research project is to gather data on the degradation and recovery of psychomotor and cognitive skills. This data will be analyzed to help define metrics that could be used to assess an operator's readiness to perform complex tasks.
To study performance degradation and skill recovery, a highly efficient simulator is required to ensure on-orbit real-time simulation and fast feedback to the operator. In this project, the challenge is the real-time simulation of Canadarm2 and Dextre while performing graphics rendering of the worksite environment on a single computer, in this case a 1.8 GHz Pentium 4 IBM ThinkPad. The simulator relies on modeling technology from SGDL to generate highly realistic images. For graphics rendering, the SGDL models are transformed into an approximate polygonal representation. This minimizes the computational load on the CPU and optimizes the rendering rate by using graphics-card hardware acceleration. At the same time, a collision detection algorithm is applied to the exact SGDL model to monitor any collision event.
It is planned that the experiment will be launched on the space station in April 2006.
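The dual-representation strategy described above can be sketched in a few lines: render from a cheap polygonal approximation while testing collisions against the exact analytic model. All class and function names below are illustrative assumptions, not the actual SGDL API; a sphere stands in for an arbitrary analytic primitive.

```python
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    """Exact analytic model (a stand-in for an SGDL primitive)."""
    cx: float
    cy: float
    cz: float
    r: float

    def distance_to(self, x, y, z):
        # Signed distance: negative means penetration, i.e. a collision.
        d = math.sqrt((x - self.cx) ** 2 + (y - self.cy) ** 2 + (z - self.cz) ** 2)
        return d - self.r

def tessellate(sphere, n=8):
    """Approximate polygonal representation used only for rendering."""
    verts = []
    for i in range(n):
        theta = 2 * math.pi * i / n
        verts.append((sphere.cx + sphere.r * math.cos(theta),
                      sphere.cy + sphere.r * math.sin(theta),
                      sphere.cz))
    return verts

def step(sphere, effector_pos):
    """One simulation frame: draw the cheap mesh, collide against the exact model."""
    mesh = tessellate(sphere)                       # sent to the GPU in the real system
    colliding = sphere.distance_to(*effector_pos) < 0.0
    return mesh, colliding
```

The point of the split is that the expensive exact geometry is queried only for the safety-critical collision test, while the per-frame rendering cost is pushed onto hardware-accelerated polygons.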
[Jean-Claude Artonne] Immervision: Panoramic technologies in everyday applications--During the last ten years, computer and optical sciences have evolved to open possibilities and create opportunities for new applications. Thanks to these technological advances, today we can experience a new “Immersive” dimension.
With today’s panoramic imaging technologies we can digitise and visualise a whole 360-degree environment in real time. Hosting a panoramic videoconference from your mobile phone, keeping an eye on every room of your house, preventing transportation accidents through 360-degree dynamic analysis, caring for loved ones from a distance and virtually immersing yourself inside the human body will soon all be applications of panoramic imaging technology, as common as the ultrasound, radar, telephone and many other technologies developed during the past 50 years.
A new 360-degree technology invented and patented by Jean-Claude Artonne and developed by ImmerVision meets the quality, flexibility and performance required by today’s rich multimedia applications and by the most rigorous security and aerospace applications. It is a combination of hardware and software that runs on today’s video camera and computer technology, from the smallest pocket PC to the most powerful system. The hardware includes new panomorph lenses, with increased resolution in specific areas, combined with software based on advanced imaging algorithms.
[Jocelyn Faubert] Université de Montréal, École d’optométrie, NSERC-Essilor Chair on presbyopia and visual perception: Understanding human behavior with immersive virtual environments--Fully immersive displays such as the CAVE system were originally developed for visualization and industrial uses. Using such an environment for the study of human performance involves a number of challenges, as it was not originally designed for such a purpose. For several years now our laboratory has adapted the CAVE technology to help us understand human behavior. We are interested in determining the effect of age-related changes on perception, posture and visual-motor control in ecological environments. In particular, we try to understand how presbyopes cope with visual deformations induced by corrective lenses. Presbyopia is an age-related change that affects our capacity to focus at near distances. The first signs usually appear in the 40s, and almost 100% of individuals in their 50s are presbyopic. Our initial results are extremely promising and demonstrate that a fully immersive environment is a very powerful tool for assessing human performance, but this is a technology that still requires enormous resources in both initial cost and maintenance and may not be generally accessible in the near future.
[Carl-Éric Aubin] École Polytechnique de Montréal, Dép. génie mécanique & CHU Sainte-Justine: Surgical Simulator for the Virtual Prototyping of the Surgical Instrumentation of the Scoliotic Spine--Prof. Carl-Eric Aubin Ph.D., ing.; Ecole Polytechnique de Montréal, département génie mécanique & CHU Sainte-Justine; Collaborateurs : Profs. H. Labelle, B. Ozell, F. Cheriet.
[Luc Courchesne] Université de Montréal and Ideaction: Panoscope 360°--Enter the Panoscope 360° to be fully immersed in a 3D world. A 3-axis joystick placed at the center of the viewing platform will let you and your friends (up to 8) fly through the space as in dreams. This single channel immersive display uses a PC and a custom designed hemispheric projector above your head to project in real time a rendering of your entire horizon onto the hemispheric screen.
September 19, 2005
the Space Between the Physical and the Virtual
Cognitive Agents in 3D Virtual Worlds
"Abstract: We present an agent-based model of virtual worlds in which the objects in the world have agency, that is, the objects can sense their environment, reason about their goals, and make changes to the environment. The agent-based model has the following reasoning processes: interpretation, hypothesizing, and action activation. This agent model is described and illustrated using a wall agent in a multi-user virtual world. We extend the illustration through a demonstration of a multi-agent world in which many of the objects in the world interactively reason about the use of the world and respond specifically to the people in the world." From Cognitive Agents in 3D Virtual Worlds by ML Maher, JS Gero, G Smith, N Gu, University of Sydney, Australia; International Journal of Design Computing, Vol. 6, 2003.
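The sense-interpret-hypothesize-act cycle the abstract describes can be sketched with the paper's own example of a wall agent. The sensor data, goal, and action below are invented placeholders; the paper's actual knowledge representation differs.

```python
# Minimal sketch of the reasoning cycle: sensation -> interpretation ->
# hypothesizing -> action activation, for a hypothetical "wall agent"
# whose goal is to ease circulation when its space gets crowded.
class WallAgent:
    def __init__(self):
        self.has_opening = False

    def sense(self, world):
        # Sensation: raw readings from the environment (a dict here).
        return {"people_nearby": world.get("people_nearby", 0)}

    def interpret(self, sensed):
        # Interpretation: turn raw data into a meaningful situation.
        return "crowded" if sensed["people_nearby"] > 3 else "quiet"

    def hypothesize(self, situation):
        # Hypothesizing: propose a change that would satisfy the goal.
        if situation == "crowded" and not self.has_opening:
            return "create_opening"
        return None

    def act(self, hypothesis):
        # Action activation: actually change the environment.
        if hypothesis == "create_opening":
            self.has_opening = True
        return self.has_opening

    def step(self, world):
        return self.act(self.hypothesize(self.interpret(self.sense(world))))
```

In a multi-agent world, every object (walls, doors, furniture) would run a loop of this shape, which is what lets the environment as a whole respond to the people in it.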
"Abstract: The cyberPRINT is a fully immersive, interactive virtual environment that is being generated in real time based on physiological data readings of a human body. In other words, the cyberPRINT is based on creating interfaces between physical and digital spaces and between biology and information technologies. The cyberPRINT is also an event, wherein a performer is connected to the cyberPRINT generator to create a self-sustaining feedback mechanism. Although the use of the body to electronically drive music and media events is not new, most of these works have paid little or no attention to the potential of interactive 3D virtual environments. Nor have they been so technologically advanced, interdisciplinary intensive (involving Architecture, Choreography, Modern Dance, Music, Bioengineering, Medicine and Computer Science), or architecturally focused as the cyberPRINT.
This project covers a wide and fertile territory that goes from the very technical and design oriented to the very theoretical and interdisciplinary. This paper is intended to (1) expand what has been already published about this project (Bermudez et al 2000a) and (2) establish potential areas for discussion before and after the performance."
1. Introduction: Why a Live Performance?
This paper provides background for the live performance of the cyberPRINT, a real-time, physiological-data-driven virtual architecture developed over the past five years by an interdisciplinary team led by two architects. The reason for this live performance and demonstration is simple: it is only through performance that we can show the true nature of the cyberPRINT. The demonstration will also provide empirical proof of the theoretical claims and technological details already published elsewhere (Bermudez et al 2000a). In addition, this version of the cyberPRINT will add some novelties developed since then (such as a new virtual world, real-time data-driven music, and a navigational data-glove). Images of a live performance are shown in Figure 1. [via]
August 26, 2005
Residual Memory Immersed, Materialized in a Real Environment
The Residual Data-Cloud--by Diogo Terroso--is an application that loads images from a networked source and generates a data-driven three-dimensional form. Images are collected via a digital camera or a mobile phone, by the author and participants, during the presentation. The resulting shape, which resembles a cloud of dust, is a metaphor of residual memory immersed, and somehow materialized, in a real environment.
The digital appears here as a parallel dimension in which the user’s perception is subjected to layers of abstraction and figuration. The user’s behaviour in real space, captured by a tracking device, affects the data display by revealing different properties of the cloud. Recognizable shapes appear and disappear through interaction. Movie.
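One plausible way to turn collected images into a dust-cloud-like form is to treat every pixel as a particle whose depth comes from its brightness, with random jitter supplying the "residual" scatter. This is only a guess at the general technique; Terroso's actual mapping is not documented here.

```python
import random

def image_to_cloud(pixels, width, seed=0):
    """Map a flat list of grayscale values (0-255) to 3D particles.

    Each pixel becomes one point: (x, y) from its position in the image,
    z from its brightness, all perturbed by a small dust-like jitter.
    """
    rng = random.Random(seed)  # seeded so a given image yields a stable cloud
    points = []
    for i, v in enumerate(pixels):
        x, y = i % width, i // width
        depth = v / 255.0
        jitter = rng.uniform(-0.1, 0.1)
        points.append((x + jitter, y + jitter, depth + jitter))
    return points
```

Feeding in successive camera-phone images would then accrete new particles onto the cloud, which is consistent with the "residual memory" reading above.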
August 25, 2005
StarChild + FlatWorld
Data sonification has been viewed as a tool with great potential for studying complex sets of scientific data. Sonification can provide the researcher with the ability to perceive variations and trends that are invisible to more established data analysis techniques. However, few have explored the potential artistic applications of sonification. In 1996, Jarrell Pair worked with Alec Robinson, using Matlab and Csound to prototype software that transduces data from various sources, such as images, temperature, and light intensity, into aesthetically pleasing audio. They used this work to develop the audio effects for StarChild. Using Csound and a custom C program, astronomical data from the Shoemaker-Levy 9 comet collision was used as input to create audio for portions of StarChild. Additionally, images of the collisions with Jupiter were transduced into audio effects using Hyperupic, an application running on a NeXT computer. 440k MP3 file (low sample rate) taken from the sample files created from the comet collision.
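The core of such a sonification pipeline is a mapping from data values to audio parameters. The sketch below maps a data series linearly onto pitch and renders one short sine tone per datum; the frequency range, duration, and linear mapping are arbitrary choices for illustration, not those used for StarChild.

```python
import math

def sonify(data, lo_hz=220.0, hi_hz=880.0, sr=8000, dur=0.1):
    """Transduce a data series into raw audio samples.

    Each value is mapped linearly into [lo_hz, hi_hz] and rendered
    as a `dur`-second sine tone at sample rate `sr`.
    """
    dmin, dmax = min(data), max(data)
    span = (dmax - dmin) or 1.0          # avoid division by zero on flat data
    samples = []
    for v in data:
        freq = lo_hz + (v - dmin) / span * (hi_hz - lo_hz)
        n = int(sr * dur)
        samples.extend(math.sin(2 * math.pi * freq * i / sr) for i in range(n))
    return samples
```

In the actual work this role was played by Csound and a custom C program, with richer mappings than a bare sine; the point here is only the structure data -> parameter mapping -> synthesis.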
From July 1995 through June 1996, Pair was extensively involved in the technical development of StarChild, a multimedia opera. The opera was composed and produced by Audio Research Team director James Oliverio. Alec Robinson and Pair created sound effects for the opera using data sonification methods they had developed as part of an ongoing team project. Pair was also involved in the installation, testing and evaluation of the eight-channel audio steering system used in the opera.
The StarChild production team included visual futurist Syd Mead (designer for the films Blade Runner, Aliens, Tron, and Star Trek: The Motion Picture), the internationally recognized lighting designer Lloyd Sobel, animator Steve Walker, and scientists and engineers from across the Georgia Tech campus. Students met and worked with the guest artists in workshops, lectures, and in the production of the opera itself. An internet MBone broadcast of StarChild took place on June 5th, 1996. Two live performances followed on June 6th and 7th.
FlatWorld: The Mixed Reality Simulation Space
Since 2001, Pair has overseen the design and development of the FlatWorld project at the University of Southern California's Institute for Creative Technologies (ICT).
FlatWorld is a mixed reality simulation environment merging cinematic stagecraft techniques with immersive media technology. Current virtual environments have severe limitations that have restricted their use. For example, users are often required to wear bulky head mounted displays that restrict a person’s freedom to physically move as they would in the real world. Furthermore, a person cannot touch or feel objects in the virtual world.
This project addresses these issues by developing an approach to virtual reality simulation which allows individuals to walk and run freely among simulated rooms, buildings, and streets.
August 24, 2005
Immersive 3D TV
Watch, Sniff and Feel the Big Game
Almost 77 years after the first demo of stereo TV in 1928 by John Logie Baird, there's evidence of a strong resurgence of interest and research in immersive TV:
"The Japanese Government is quietly throwing huge financial and technical weight into the development of three-dimensional, virtual reality television", reports the Times. It "has obtained an interim report from the Communications Ministry’s "Universal Communications" study group detailing the work in progress. Three-dimensional images apart, the ministry wants to develop the ability to send thousands of different odours through the new television to enhance the sense of reality. Its plans also call for the "recreation of tactile sensations", a hitherto elusive concept that would give viewers the ability to reach out and "feel" what they were seeing. Current projects are working on electrical stimulation for the fingers, ultrasound and air pressure." From The Times Online (via smartmobs) [blogged by sfisher on USC Interactive Media]
August 19, 2005
I-F-E-A-R (infrasound fear emotion audio reverb) is a project by Jodie Hancock, part of the CoEDD graduate exhibition. "This project is an exploration of emotion as a physical force and how that force can define an environment, often with more intensity than is defined by the senses. Fear is a dark and physical emotion which lends itself perfectly to conscious spatial composition."
Jo is using infrasound to see if specific emotions can be triggered. Just as walking down a dark corridor in a game can evoke fear, the installation built for the exhibition consists of a corridor. Outside it, the game Doom 3 is played by another participant. The two events are linked: the player controls the infrasound played to the person in the dark corridor. As Doom is played, the video signal is sent to Max/MSP, where the level of light in the game is monitored with Jitter. Each time the player shoots in the game, the frequency of the infrasound is changed based on the darkness of the game, creating a linked experience. project website / development blog. [blogged by Chris on pixelsumo]
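The control logic described above, tracking frame darkness and remapping an infrasound oscillator's frequency only when the player fires, can be sketched as follows. In the installation this runs inside Max/MSP/Jitter; the 5-18 Hz range, the linear darkness-to-frequency mapping, and all names here are assumptions for illustration.

```python
def luminance(frame):
    """Mean brightness of a grayscale frame (pixel values 0-255)."""
    return sum(frame) / len(frame)

def infrasound_freq(frame, lo_hz=5.0, hi_hz=18.0):
    """Darker scene -> lower infrasound frequency (assumed mapping)."""
    darkness = 1.0 - luminance(frame) / 255.0
    return hi_hz - darkness * (hi_hz - lo_hz)

class InfrasoundController:
    def __init__(self):
        # Start from a bright default frame.
        self.freq = infrasound_freq([255])

    def on_gunshot(self, frame):
        # The frequency only changes at the moment the player fires,
        # sampling the darkness of the current game frame.
        self.freq = infrasound_freq(frame)
        return self.freq
```

Between gunshots the oscillator simply holds its last frequency, which matches the event-driven coupling the entry describes.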
August 17, 2005
R&Sie(n)’s Dandy & Mutant A-life Architecture
I’ve heard about…©
R&Sie(n)’s Dandy & Mutant A-life Architecture by Joseph Nechvatal: R&Sie(n)’s exhibition “I’ve heard about…©” opened on the 6th of July at the Musee d’Art Moderne de la Ville de Paris‘s temporary space at the Couvent des Cordeliers - and I think it is one of the most relevant exhibitions to what is going on in art today. R&Sie(n) is an investigational architectural firm consisting of François Roche, Stéphanie Lavaux, and Jean Navarro; working here with Benoît Durandin. Together, they utilize generative heterogeneous mutations in the creation of proposed utopian city spaces. In fact what they propose at the Musee d’Art Moderne is the artificial growing of extruded urban housing (generative & robotic) - where new cities are constructed via robotic processes by feeding off the carcasses of older dying cities. Very viral. Envisioned is an approach to city planning based on growth scripts and open algorithmic procedures. Towards these ends the show itself includes some subtle audio tracks, model-sculptures, a fully immersive hypnosis chamber with video monitors, booking services, 3D movies and robotic drawings/plans that reveal the source code of the generative program at the heart of their work.
There is a definite tangled and intertwined approach to the city vector that reminds me of the dithyrambic visual hyper-logic which has manifested in all modes of decadent artistic periods; from the Hellenistic and Flamboyant Gothic, to the Mannerist, Rococo, and Fin-de-Siècle - as they all opposed dogmatically imposed ocular paradigms with hyper-engendering strategies of form. The multiplicity of its interwoven experiences challenges the now bogus idea of simplicity – a modernist-minimalist idea which has taken on the intensity of a righteous injunction in many cities where the implied equation between simplicity, surveillance and goodness obscures a less evident function: that of cognitive constraint. Such constraint runs counter to what Georges Bataille considered to be the non-hypocritical human condition, which he took as being roused non-productive expenditure (threshold excess) entangled with exhilaration. For the finest comprehensive overview of Bataille's thought in this regard, see his book Eroticism - but also Denis Hollier's book on Bataille's general postulates, Against Architecture.
Given the organic-like, biomorphic architectural forms R&Sie(n) spawn by their generative program, I could not avoid thinking about the Palais Idéal of Ferdinand Cheval (1836-1924) - which to my eyes appeared to be one huge budding edifice when I visited it a few years ago - as if the stupendous mannerist grotto-façade at villa Borromeo had been left to grow untrimmed and run amok. However, the Palais Idéal was constructed by the postman Cheval alone and by hand in Hauterives (Drôme, near Lyon) between the years 1879 and 1912; the result of 93,000 man hours of hard labour. R&Sie(n) rightly prefers the work to be accomplished through a computer-programmed emergence via artificial intelligence which directs robotic execution. Why should humans physically work when we might better be playing and dreaming?
The other inescapable reference for R&Sie(n)’s work is the visionary city-planning put forth by the Situationists. One thinks immediately of Guy Debord’s essay On Wild Architecture, for example. Like the Situationists, for R&Sie(n), the urban form no longer depends on the arbitrary decisions or control over its emergence exercised by the elite few. Ultimately R&Sie(n) leads us towards juicy Situationist-like complexities and engagements by way of immersion into an open-ended multiplex conceivable as a virtual environment: the Virtual Reality experience.
As R&Sie(n) say themselves, “Many different stimuli have contributed to the emergence of “I’ve heard about…©” and they are continually reloaded. Its existence is inextricably linked to the end of the grand narratives, the objective recognition of climatic changes, a suspicion of all morality (even ecological), to the vibration of social phenomena and the urgent need to renew the democratic mechanisms. Fiction is its reality principle…” Ahhhhh, the domain of decadent art, VR and artifice. Against Nature.
What has been somewhat poorly determined, however, is the degree to which a dweller feels totally immersed in an optically excessive space. And this depends to a large extent on personal psychological need and adaptability in accord with the proposed spatial depth cues. Cognitive-aesthetic space has to be coordinated phenomenologically with the proprioceptive space of the eye - and R&Sie(n)’s only failure is in maintaining the evident structural seams of the immersive faux-hypnotic chamber (the only enterable structure and the highlight of the show), because what the entire show is proposing is a seamless immersion into generative totality, and the visual seams take us out of that exquisite fantasy. So they are denied the loveliest of triumphs.
A pity, as one might otherwise imagine oneself totally immersed there somewhat like a 21st Century dandy. As at the birth of the 20th Century, this new hyper-dandy might constantly affirm his or her originality down to the decorative details of the home. In that the robots are doing the algorithmic planning and building, this work definitely proposes a new form of dandyism - if dandyism’s defining characteristic is remembered to be the making of one’s person a work of art while extolling laziness and displaying contempt for work. Evident here are the Baudelairean/Duchampian dandy ideals of impassivity, nonchalance, elegance, and inscrutability. What matters are the triumphs of a radical contempt for one’s “hand”.
Indeed one can say that “I’ve heard about…©” favorably extolled artificiality, indifference, impassiveness - the reign of an ironic causality and knotted ambivalence, while staying open to all transactions. Most importantly, a-life forms are embedded within it and its growth is artificial and synthetic. So R&Sie(n) maintains a version of transcendental phenomenological idealism, but they do not disavow the extant actuality of the material sphere. Instead they seek to elucidate the sense of the world-as-is today - that is viractualized – by stressing the embodied nature of human and artificial consciousness and bodily existence as the original and originating material premise of sense and signification.
All told, the show is well done – as proposition. However this proposition inevitably turns the mind to the actualized imposing suavity of Antoni Gaudí’s fully realized wavy architectural shapes in Catalonia. Although he did not travel about Europe, Gaudí was acquainted with fin-de-siècle Belgian/French avant-garde movements because of the intimate relationship between Barcelona and France, and with the pre-modernist movements of Arts and Crafts, Gothic Revival, and Impressionism, which were discussed in the intellectual proto-modernist circle he frequented. But it was Victor Horta's Art Nouveau movement that influenced Gaudí the most, stimulating him to experiment with new materials and new fluid shapes that appear grown. Gaudí's version of Art Nouveau is characterized by an overwhelming proclivity for the organic nature of women, beasts, and plants which he translated into immersive utility.
Antoni Gaudí is a chief exponent of R&Sie(n)-type open algorithmic building procedures precisely with his 1906 building Casa Batlló located at 43, Passeig de Gràcia, Barcelona - noticeable for its organic tactility of bones and shells within, and its external cocked surf façade and chimerical roof. With Casa Batlló, Gaudí accomplished an astute transformation of an existing building, transforming it into an enchanting immersive gesamtkunstwerk as Gaudí thoroughly undertook the design of every single element of the building, from the extravagantly protuberant façade to all aspects of the interior, including the gracefully gnarled furniture. On the exterior Gaudí was able to combine a flamboyantly surging façade (in an ingeniously cool-color orchestration) while maintaining a dialogue with the neighboring Casa Ametller (1900), built by Josep Puig i Cadafalch (1869-1956) four years earlier. Powerful pillars which resemble the substantiality of mammoth elephant legs accost the visitor at street level, protruding into the sidewalk, nigh tripping up an unaware pedestrian. These legs are bordered by a craggy vertebrae-like tier and the wavy façade extends upward between these two biologically evoking forms, culminating at the roof in a gargoylesque humping crescendo. The façade itself, coated in a layer of Montjuïc stone, shimmers seductively under the sun in multifarious chameleon-like colors; fraught with a scattering of small roundish plates resembling fish or reptilian scales. Affixed to this seething mass of swelling construction are a number of small, elegantly curved balconies with oval shaped portholes.
The entire structure feels unsharpened, flowing and smooth in opposition to the street itself on which the arrangement sits, with the exception of a few square windows up top. Even the walls are gently rounded in strained undulation and contraction, as if they too have entered into the oceanic female throes of a fluttering uterine orgasm. The walls appear to be made of a soft, smooth, supple, leathery material and this illusion of softness is carried through by the roundness of the inside forms of the building, where one has the feeling of being pleasantly encased in an expanse of hardened dripped honey. Turning, lunging stair railings are met, engulfed and supplemented by softly heaving honey-colored walls, wooden biomorphically shaped carved doors and irregularly shaped windows. There are no right-angled corners or straight lines, which offers an impression of being wrapped up in one continuous fluid wave motion, complementary to the exterior. By comparison, R&Sie(n) still has a ways to go in achieving a like sensuality for its avant-garde stance. But one only hopes for them an immediate success in doing so.
R&Sie(n) = François Roche, Stéphanie Lavaux, and Jean Navarro with Benoît Durandin
With the production and authorship of :
-Behrokh Khoshnevis (Contour Crafting Process, USC, LA)
-Francois Roustang (Hypnosis specialist, Paris)
-Chris Delaporte (Film director, 3D effect, Paris)
-Christophe Berdaguer & Marie Pejus (Artist, Marseille)
-Mathieu Lehanneur (Designer, Paris)
-Laurent Genefort (Science Fiction writer, Paris)
-CNRS Grenoble, Laboratoire de Spectrometrie (Nano Particules)
-M/M (Graphic designer, Paris)
-Gilles Schaeffer (mathematician, Paris)
-Michel Boulcourt (Landscape architect, Paris)
-Alexandra Midal (Author, Paris)
-Matthieu Kavyrchine (Video artist, Paris)
-Sebastien Szczyrk (Sound designer, Paris)
-Alexandre Merlet (Video producer, Paris)
-Stephan Henrich (Architect, Germany)
Prototype / installation / publishing
-Ufacto, David Toppani (prototype scale 1)
-One Star Press (neighbourhood protocol publishing)
-Christian Hubert Delisle (prototype)
-Thibaut Boyer (installation)
-Jean-Michel Castagné (electronic driver)
MAM Paris-Musée (F), MUDAM (L), De SINGEL (B), USC (USA), KANAZAWA 21st century museum (J), CNC / Dicream, (F), LAFARGE (F), Materialise (B), Next Limit Technologies (SP), DAPA (F), University of Architecture Innsbruck (OST), New-Territories (F)
Show closes the 9th of October. Musee d’Art Moderne de la Ville de Paris / Couvent des Cordeliers / 15 Rue de l'ecole de Medecine / 75006 / Phone : 0156813321
Images and more info available at: www.mam.paris.fr
Even more info available at: www.new-territories.com
August 10, 2005
Everyday Drama Enacted Moment by Moment
"My goal as an artist is to understand and intervene into the production of material power and technologies. My day job as MIT professor involves doing that at a very literal level. "Skin" and "Control" do this at a more symbolic level, in the form of installations that operate as immersive environments, walk-in tableaux with everyday drama being enacted moment by moment." –Chris Csikszentmihályi
On Skin and Control by Chris Csikszentmihályi--shown at Location One--Csikszentmihályi's conceptual and scalar ambitions locate him within a specific generation that has its own characteristic forms of artistic practice. For artists coming of age since the 1980s (Damien Hirst, Matthew Ritchie, and Anselm Kiefer come to mind), science and technology are not truth so much as culture; they form the mythic structures of our time. –Caroline Jones
What Csikszentmihályi offers is a way of visualizing a certain kind of power at work in the world. The power of control, of telemetry, of the ways and means of speed. It is a surprisingly abstract art. –McKenzie Wark.
August 07, 2005
Public Participation in Regional Planning
Sim Civics by Jeff MacIntyre, The Boston Globe, August 7, 2005: New game-like computer software is empowering ordinary citizens to help design better cities. Can the professionals and the public learn to play well together?
[Image: SIM ISLAND--This 3-D image of Honolulu was created using GIS software from the company ESRI, as part of Honolulu's six-year planning initiative. (Courtesy of ESRI/City of Honolulu).]
"FIFTEEN YEARS AGO, the future of urban planning arrived in the form of a wonkish but strangely addictive new computer game. In SimCity, a player assumed the twin roles of mayor and city planner, creating elaborate cityscapes, managing zoning, transportation, and growth, while fighting off poverty, crime, traffic, and pollution..."SimCity...has probably introduced more people to urban planning than any book ever has."...Today, a new generation of GIS applications, known as ''scenario planning" or ''decision support" tools--which allow users to visualize, project, and manipulate a wealth of environmental data--have made citizens into major players in the gaming of urban futures. Across the United States...these tools are enabling an unprecedented level of public participation in broad regional planning initiatives..."
August 04, 2005
Virtual Playa Project
The VIRTUAL PLAYA PROJECT is a navigable 3D digital Burningman environment using Microsoft Flight Simulator as a platform. It is intended to be an open-ended project that invites participation at various levels. It can be downloaded for home use, played on a giant screen at a Burningman event, or even used as a design tool by a theme camp or artist wishing to plan an installation before it ever gets to Black Rock City.
The ultimate wish for the project, however, is for the Virtual Playa to be the Burningman Cyber Regional. Using multiplayer technology, it can become a portal through which we can meet online and share experiences with other cyber burners anywhere in the world in real time. This takes the project from just being a cool piece of collaborative digital art to a true meeting place for the cyber-tribe. Download it for free, copy it, send it to pals, leave it on buses, give it away as a gift.....spread the word.
July 27, 2005
Ambient Experience Suite
and Kitten Scanner
"The Ambient Experience suite uses Philips' lighting and consumer electronics to create a welcoming and patient-friendly environment for children undergoing medical scans. Featuring a Philips Brilliance CT (computed tomography) scanner in a room with curved walls, it lets young patients choose a theme - or 'ambient environment' - for the room by waving a radio frequency card over a reader to project cartoons and animation themes onto the walls and ceiling using Philips technology. They can also use the Kitten Scanner.
Designed specifically for children scheduled for a CT scan, the Kitten Scanner will let them 'scan' a stuffed elephant or their own toys at the touch of a button. Animation appears on a screen that shows children what doctors are looking for inside the toy and tells them a story about each one. The aim is to show them how the machine works and help ease any anxiety they may be feeling." (Via PhysOrg)
July 08, 2005
Where are you?
Re-Enchanting the World
Enter the Panoscope 360° to be fully immersed in a 3D world. A 3-axis joystick will let you and your friends (up to 8) fly through the space as in dreams. The immersive display uses a PC and a hemispheric projector to project in real time a rendering of your entire horizon onto the screen.
At scale 0, the world looks like a simple XYZ grid defining the experience of the navigable space. At scale +1, the world turns into an archive of pictures, sounds, texts and objects. Zooming out at scale +2, elements of this archive become particles in a "molecular" world of self-organizing clouds of lights. Zooming yet further out at scale +3 reveals a landscape of mountains and valleys.
At any moment in the Where are you? world, visitors may come upon other beings: live ones through telepresence links, pre-recorded subjects in video windows and themselves when cameras transmit their own image in this constructed world. In Where are you? the visitor controls his/her position, the path and speed of his/her journey and the scale at which he/she is prepared to "exist". Video demo. [blogged by Regine on we-make-money-not]
June 28, 2005
Multi Sensorial Radio
Tune Me is an immersive conceptual radio based on tactile features. The sound and visuals are triggered by "touchy" interfaces. Visitors enter the ellipse-shaped space, immersing themselves in a new world in which to listen to the radio waves.
Alongside the sound, each channel provides light features as well as vibrating and pulsing experiences. When visitors choose between the different FM stations (news, sport, classical music and international pop), the overall space changes, taking on different moods according to the nature of the content. Each station triggers a different visual experience: the space vibrates, pulses and interacts with the visitors.
Developed by Line Ulrika Christiansen, Stefano Mirti and Stefano Testa (with Daniele Mancini and Francesca Sassaroli). More pictures by Stefano and Simone. Also part of Touch Me at The Victoria & Albert Museum (London) till August 29th 2005. [blogged by Regine on we-make-money-not-art]
June 14, 2005
Living Book of the Senses
In Living Book of the Senses--by Diane Gromala--users are able to see their physical surroundings while dynamically engaging with three-dimensional mixed realities which appear on their headsets. Users can interact with the book in dynamic ways. They can ask the book questions (via voice recognition), and can influence the book through their sensory (bio) feedback. Users wear a headset/head-tracker/color camera system that enables them to see physical reality enhanced with a virtual reality overlay. The camera inputs images/patterns and feeds them back into the ARToolkit software which then displays digital information associated with the physical markers onto the headset. The ARToolkit can calculate camera position and orientation relative to physical markers in real time for video-mediated reality.
Each reader can view AR scenes from their own visual perspective. Users can fly into the immersive world and see each other represented as avatars in the same virtual scene. Readers remaining in the AR scene have a bird's-eye view of other readers as miniature avatars in the virtual scene displayed through their headset. User-controlled dialog with the book elicits responses/answers from the book (expressed in digital data: visual, textual, auditory). As the users simultaneously interact with the book in the physical and virtual realms, the book responds to individual and multiple physical states (via biofeedback) to express resulting changes in narrative. The narrative is a cultural history of the senses. [Read Extensive Bodies, interview by Yvonne Volkart with media artist and theorist Jill Scott].
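The ARToolkit pipeline described above recovers the camera's position relative to a printed marker from how the marker appears in the video image. A minimal sketch of the underlying pinhole-camera relation is below; the focal length and marker size are illustrative assumptions, not values from the project, and a real ARToolkit pose estimate recovers full 3D orientation, not just distance and bearing:

```python
import math

def marker_distance(marker_size_m: float, pixel_width: float,
                    focal_px: float) -> float:
    """Pinhole model: distance = focal_length * real_size / apparent_size."""
    return focal_px * marker_size_m / pixel_width

def marker_bearing(center_x: float, image_width: float,
                   focal_px: float) -> float:
    """Horizontal angle (radians) of the marker relative to the optical axis."""
    offset = center_x - image_width / 2
    return math.atan2(offset, focal_px)

# An 8 cm marker that appears 80 px wide to a camera with an 800 px
# focal length sits about 0.8 m from the lens.
d = marker_distance(0.08, 80, 800)
```

From distance and bearing per marker, the software can place the matching virtual overlay at the right spot in the headset view each frame.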
June 10, 2005
Harnessing Cameras "In the Wild"
RealityFlythrough is a telepresence/tele-reality system that works in the dynamic, uncalibrated environments typically associated with ubiquitous computing. By opportunistically harnessing networked mobile video cameras, it allows a user to remotely and immersively explore a physical space. Live 2D video feeds are situated in a 3D representation of the world. Rather than try to achieve photorealism at every point in space, we instead focus on providing the user with a sense of how the video streams relate to one another spatially. By providing cues in the form of dynamic transitions, we can approximate photorealistic telepresence while harnessing cameras “in the wild.” [via]
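The "dynamic transitions" mentioned above move the virtual viewpoint from one camera's pose to another's instead of reconstructing the scene geometry. A minimal sketch, assuming a simplified pose of (x, y, heading) and plain linear interpolation (the actual system's pose model and easing are more involved):

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def transition_pose(cam_a, cam_b, t: float):
    """Interpolate between two camera poses (x, y, heading_deg), t in [0, 1].

    Headings are interpolated along the shorter arc, so a 350-degree to
    10-degree transition turns through 20 degrees, not 340.
    """
    (xa, ya, ha), (xb, yb, hb) = cam_a, cam_b
    dh = (hb - ha + 180) % 360 - 180  # shortest signed angular difference
    return (lerp(xa, xb, t), lerp(ya, yb, t), (ha + dh * t) % 360)
```

Rendering the nearest video feed projected at each interpolated pose gives the viewer the spatial cue of "walking" from one camera to the next.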
May 05, 2005
Full-Body Interfaces for Linking People
CINE--by Miro Kirov, Houston Riley, and James Tunick--is a new type of networked computing environment that exists in an immersive dynamic space instead of in a PC box. Its displays fill entire walls instead of being confined to small, isolating screens. It is our belief that the mouse and 2D desktop are dated and inadequate. CINE's full-body interfaces will allow richer modes of expression, and its creative 3D data organization will engage users, inspiring a sense of magic while also making information retrieval and collaboration more efficient.
Similar to the futuristic vision of the Holodeck, CINE is envisioned as an immersive visualization platform and advanced collaboration space equipped with an intuitive multi-user gesture interface. CINE, however, is not merely an illusionary virtual space; it is a mixed-reality space that augments group experiences by linking people in virtual spaces to people in real spaces. [via ColumnNetwork]
April 25, 2005
The Walking Experience
lifeClipper is an open-air art project. It offers an audiovisual walking experience in a virtually enhanced reality. Technically, it is based on portable computer equipment worn by an individual. When walking around a chosen culturally interesting area or impressive landscape, the visitor's position and viewing direction are measured by means of GPS, and the observed scene is augmented according to defined presets. Image and sound are displayed on an HMD (Head Mounted Display). Live captured image and sound are treated in real time by altering parameters as well as by adding music (composition, spoken text and sampling of documentary material) and photo and video material (documentary and fictively arranged).
Through interventions in habitual ways of listening and seeing, reality is called into question and day-to-day situations become an adventure. Users feel as though they are watching a film in which they participate as active observers and in which they receive attention from virtual players. The borders between subjective and objective perception become blurred as the user is immersed in space and action while also contemplating artistic compositions and cultural reflections. Read a review.
April 20, 2005
InterPlay: Loose Minds in a Box
Six by Six
InterPlay: Loose Minds in a Box is a collaborative work that explores the basic concept of the "box". The box is a metaphor for the physical, social, political or psychological constraints that we and/or others place upon us. The box also represents a sense of place in the realm of the virtual as well as in our subconscious. InterPlay is a multi-faceted telematic event that consists of six simultaneous performances that occur in six states throughout North America. The performances incorporate theater, text, music, performance art, virtual reality, and motion capture and are concurrently captured, mixed, digitized, encoded and streamed onto the network.
April 19, 2005
Immersed in Radical Shifts
Two UCLA professors--media and net artist Victoria Vesna and nanoscience pioneer James Gimzewski--are at the forefront of the intersection of art and science. Their groundbreaking project, NANO, presents the world of nanoscience through a participatory aesthetic experience.
The project seeks to provide a greater understanding of how art, science, culture and technology influence each other. Modular, experiential spaces using embedded computing technologies engage all of the senses to provoke a broader understanding of nanoscience and its cultural ramifications. The various components of "nano" are designed to convey the shifts of scale and sensory modes that characterize nanoscience, which works on the scale of a billionth of a meter. Participants can feel what it is like to manipulate atoms one by one and experience nano-scale structures by engaging in art-making activities.
April 13, 2005
Sensory Relationships Between Natural Media
Mocean is a musical immersive environment that invites people to touch, stir and play with water in a tank. The movement of the water is translated into movement of air in the organ pipes suspended above the water. The sound of the pipes envelops the person, its movement echoing the waves and ripples in the tank.
The organ pipes are connected to blowers via a tubing structure. A video camera, placed below the tank, sends images to a computer, which analyzes the movement of the waves and sends commands to a microchip board, which turns the blowers on and off. The air from the blowers is routed to the organ pipes. The spatial arrangement of the pipes reflects the physical dimensions of the tank, thus it is possible to create the sensation of moving sound around by moving water in a particular direction. [blogged by Regine on near near future]
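The pipeline above divides the tank into regions whose water motion drives corresponding blowers. A minimal sketch of that mapping, assuming frame differencing on a grayscale video of the tank and a per-region motion threshold (the grid size and threshold are illustrative, not the installation's actual parameters):

```python
def blower_states(prev_frame, frame, grid=(2, 2), threshold=10.0):
    """Turn each blower on when its region of the tank shows enough motion.

    Frames are 2D lists of grayscale pixel values. The tank image is
    divided into grid[0] x grid[1] regions, one blower per region, and a
    blower switches on when its region's mean pixel change exceeds the
    threshold.
    """
    rows, cols = len(frame), len(frame[0])
    gr, gc = grid
    states = []
    for i in range(gr):
        for j in range(gc):
            diff = count = 0
            for r in range(i * rows // gr, (i + 1) * rows // gr):
                for c in range(j * cols // gc, (j + 1) * cols // gc):
                    diff += abs(frame[r][c] - prev_frame[r][c])
                    count += 1
            states.append(diff / count > threshold)
    return states
```

Because the pipes are laid out to mirror the tank's geometry, stirring the water in one corner raises the sound from the pipes above that corner, which is what lets the sound "move" with the water.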
March 04, 2005
A Model Strategy for Collaboration
"Imaging Place--by John (Craig) Freeman--is a documentary virtual reality method, which uses a combination of panoramic photography and digital video to investigate place. Although the method borrows freely from the traditions of documentary still photography and filmmaking, the Imaging Place method departs from those traditions by using the emerging nonlinear narrative structures made possible by new interactive technologies and telecommunication apparatuses. It is exhibited primarily in alternative art exhibition spaces, museums and over the Internet. The work is projected up to nine by twelve feet in a darkened space with a podium and a mouse placed in the center of the space, which allows the audience to navigate throughout the project. When it is activated by the click of a mouse, the project leads the user from global satellite and vertical aerial perspectives to virtual reality scenes on the ground.
The user can then navigate throughout an immersive virtual digital video space. Rather than the linear structures of the novel or cinema, this new form allows the story to unfold in a meandering labyrinth of discovery and associations. The goal of the Imaging Place method is to document sites of cultural significance, which for political, social, economic or environmental reasons are under duress, at risk of destruction or undergoing substantial changes. This includes historic sites as well as sites of living culture which are being displaced by globalization and the collapse of industrial modernism.
Imaging Place is designed to accommodate interdisciplinary collaboration conducted across institutions and over distances. It uses new technology to bring disparate bodies of knowledge together through the investigation and documentation of place. The method attempts to bridge the gaps in understanding that exist between esoteric disciplines that have developed as a result of academic and industrial specialization. The technological tools are now available for bringing the work of experts together without sacrificing the depth and dimension of specialized knowledge and to connect the abstraction of highly specialized thinking with the visceral experiences of people on the ground. In addition to providing a form for the generation, dissemination and accumulation of interdisciplinary research and artistic production, the Imaging Place method provides a model strategy for collaboration."
February 16, 2005
When Rooms Respond, 2
3 minutes2, by French art collective Electronic Shadow, recreates an extremely reduced living unit that can extend beyond its physical borders via the image. The space permanently reconfigures itself according to its inhabitant's activities and also defines itself in time. The scenario presents, in a few minutes, a compression of most of the activities and functions taking place in the habitat, corresponding to its inhabitant's daily life: eating, sleeping, working, etc.
The inhabitant is represented in the image as a silhouette, and the habitat builds itself around him/her like a cocoon. 3 minutes2 tries to draw the shape of a daily life modified by technologies and the presence of the virtual. The hybridization of real and virtual is taken as a given and becomes the ground for proposing a habitat that anticipates the technological and social changes that would make it possible.
No screens, no visible interfaces: the two characters touch the walls and make movements, and the habitation responds to them. The technology has become totally invisible, and its effect then becomes magic. [blogged by Regine on near near future]
The Living Room
When Rooms Respond
The Living Room, by Christa Sommerer and Laurent Mignonneau, is an intelligent, interactive image, sound and voice environment. It becomes "alive" and starts to "sense" when users enter and interact with this room.
As in a perfect surveillance system, all sounds, voices, gestures and motions of the users are detected through state-of-the-art camera tracking as well as sound and voice recognition systems. When the various users start to interact and communicate with each other within this room, they also start to communicate with The Living Room.
As if it were an intelligent organism, The Living Room reacts to the users by interpreting the collected position and speech data in the form of images and image elements displayed on the room's four large projection walls. All images and image elements are derived directly from the Internet; they are The Living Room's interpretation of the users' interactions and conversations.
Since the users' position, movement and voice data are constantly changing, the images streamed from the Internet change constantly as well. Due to the almost unlimited amount of image data available on the Internet, the users become completely engulfed in this virtual image space of the Internet, displayed as live streams on the four projection walls. Besides interpreting the users' interactions and conversations visually, The Living Room also uses these data to generate and broadcast its own sound and voice output.
Conceptually, The Living Room thus plays metaphorically with ideas of surveillance, detection, intelligence, interpretation, misinterpretation and communication. To the users it provides a feeling of immersion in a constantly changing and dynamic data space, full of unpredictable images, sounds and voices.
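One step in the pipeline described above is turning recognised speech into queries for Internet imagery. A minimal sketch of that reduction, assuming a plain-text transcript from the voice recogniser and frequency-based keyword picking (the stopword list and scoring are invented for illustration; how the actual installation forms its queries is not documented here):

```python
# Illustrative stopword list; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "of", "in", "is", "to", "it"}

def image_query(transcript: str, max_terms: int = 3) -> str:
    """Reduce a recognised utterance to a short image-search query.

    Keeps the most frequent non-stopwords; ties keep first-seen order,
    since Python's sorted() is stable even with reverse=True.
    """
    counts = {}
    for word in transcript.lower().split():
        word = word.strip(".,!?")
        if word and word not in STOPWORDS:
            counts[word] = counts.get(word, 0) + 1
    ranked = sorted(counts, key=counts.get, reverse=True)
    return " ".join(ranked[:max_terms])
```

Feeding each utterance through such a reduction and displaying whatever images the query returns is one way a room could "interpret" (and productively misinterpret) its occupants' conversations.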