New Thursday Club Event: 25 January with Michael Young

**THURSDAY 25 JANUARY with MICHAEL YOUNG**
Supported by the Goldsmiths DIGITAL STUDIOS and the Goldsmiths GRADUATE
SCHOOL

25 JANUARY, 6-8PM, BEN PIMLOTT BUILDING, GOLDSMITHS, UNIVERSITY OF LONDON,
NEW CROSS, SE14 6NW. Seminar Rooms, Ground Floor, Right.

FREE, ALL ARE WELCOME

*AUR(O)RA: EXPLORING ATTRIBUTES OF A LIVE ALGORITHM*

This presentation proposes attributes of 'living computer music', the
product of live algorithmic behaviours. The improvisation system
"aur(o)ra", currently in development, illustrates how these attributes
can inform creative design.

A live algorithm (LA) is the function of an ideal autonomous system able
to engage in performance with abilities analogous (if not identical) to
those of a human musician. An LA is distinct from established AI
approaches that generate music from a rule base, and is most relevant
where structure and character are emergent properties, products of
interaction within the heterarchical group. Living computer music
differs from traditional live electronics and fixed-media work by
avoiding performer control or explicit a priori knowledge (compositional
design, notation). Instead, a number of other properties are desirable...

Dr Michael Young is a composer and Lecturer at Goldsmiths, University of
London. Michael completed a PhD in composition in 1995. He has lectured
at the University of Wales, Bangor and at Oxford Brookes University. His
music has drawn upon a range of live and electroacoustic resources; more
recent work has focused on interactive and generative music systems. An
undercurrent in his output is collaborative and interdisciplinary
practice: he has worked with jazz musicians and improvisers in the role of
pianist, laptop musician and/or composer, and has been commissioned to
provide electroacoustic music for performance in theatre and gallery
exhibitions. He is co-director, with Tim Blackwell, of the Live Algorithms
for Music Research Network, created with funding from EPSRC.

For more information on the Thursday Club check
http://www.goldsmiths.ac.uk/gds/events.php or email Maria X at
drp01mc@gold.ac.uk

New Thursday Club Season

Supported by the Goldsmiths DIGITAL STUDIOS and the Goldsmiths GRADUATE SCHOOL

6pm until 8pm, Ben Pimlott Building, Goldsmiths, University of London, New Cross, SE14 6NW

FREE, ALL ARE WELCOME

11 JANUARY with CHARLOTTE FROST:

NEW FUTURES IN NET ART

Charlotte is the editor of Furthertext as well as a net art critic and PhD candidate at Birkbeck. Visit http://www.charlottefrost.info/

25 JANUARY with MICHAEL YOUNG:

LIVING COMPUTER MUSIC? Recent Compositions and Explorations with Max

Michael is Lecturer at the Music Department at Goldsmiths. His music explores a variety of live and electroacoustic resources, and has reflected his interests in jazz and collaborative/interdisciplinary practice. His current research interests focus on interactive live electronics and improvisation. Michael is co-investigator with Tim Blackwell, Department of Computing, for the Live Algorithms for Music research network.

15 FEBRUARY with JON MEYER

Jon is a digital artist who specializes in computer graphics, animation, and user interfaces, and has worked as a Program Manager at Microsoft and as a Research Scientist at New York University’s Media Research Laboratory. Jon has taught multimedia classes at NYU and at the Surrey Institute of Art and Design. He is currently doing an MFA in Fine Art at Goldsmiths. Visit www.cybergrain.com

22 FEBRUARY with BILL GAVER

Bill is Professor of Design at Goldsmiths College. He has pursued research on innovative technologies for over 15 years, his work spanning auditory interfaces, theories of perception and action, and interaction design. Currently he focuses on design-led methodologies and innovative technologies for everyday life.

8 MARCH with SUE BROADHURST:
DIGITAL PRACTICES

Sue is Subject Leader and Reader in Drama and Technologies at the School of Art, Brunel University, West London. Sue is also a writer and practitioner in the creative arts. Her new book /Performance and Technology: Practices of Virtual Embodiment and Interactivity/ has just been published by Palgrave. She is co-editor of the /Body, Space and Technology/ online journal and is currently working on a series of collaborative practice-based research projects entitled “Intelligence, Interaction, Reaction and Performance”.

22 MARCH with IGLOO

International and award-winning artists Igloo, led by Ruth Gibson & Bruno Martelli, create intermedia artworks. “In the mid-sixties, Fluxus artists began using the term ‘intermedia’ to describe work that was … composed of multiple media. The term highlights the intersection of artistic genres and has gradually emphasized performative work and projects that employ new technologies.” Marisa Olson – Rhizome.org

Igloo projects are created with teams of highly skilled practitioners drawn primarily from performance, music, design, architecture, costume, computer science and technology backgrounds. Their work combines film, video, motion capture technology, music and performance with digital technology. The work is developed in a variety of formats and made for distribution across a range of platforms, including gallery installation, internet sites, large- and small-scale performance and CD-ROM.

THE THURSDAY CLUB is an open forum discussion group for anyone interested in the theories and practices of cross-disciplinarity, interactivity, technologies and philosophies of the state-of-the-art in today’s (and tomorrow’s) cultural landscape(s).

For more information check http://www.goldsmiths.ac.uk/gds/events.php or email Maria X at drp01mc@gold.ac.uk

Deptford.TV diaries out now!

Happy New Year! The Deptford.TV diaries reader is out now:

Deptford.TV is an audio-visual documentation of the regeneration process of Deptford (south-east London) in collaboration with SPC.org media lab, Bitnik.org, Boundless.coop, Liquid Culture and Goldsmiths College.

Since September 2005 we have been assembling AV material around the area, asking community members, video artists, film-makers, visual artists and students to contribute statements, feedback and critique of the regeneration process of Deptford.

The unedited as well as the edited media content is made available on the Deptford.TV database and distributed over the Boundless.coop wireless network. The media is licensed through open content licenses such as Creative Commons and the GNU General Public License.

This book is a compilation of theoretical underpinnings, interviews and written documentation of the project.

Contributors: Adnan Hadzi, Maria X, Heidi Seetzen, James Stevens, Erol Ziya, Bitnik media collective, Andrea Pozzi, Andrea Rota and Jonas Andersson, alongside selected public-license texts from Hakim Bey, Jaromil and Guy Debord.

To order (5 GBP for the book, 10 GBP for book & DVD) send an email to info@deptford.tv, go to OpenMute or Amazon, or download it for free here:

http://www.deptford.tv/about/diaries/DeptfordTV-diaries1.pdf
http://www.deptford.tv/about/diaries/DeptfordTV-diaries1-cover.pdf

Open Knowledge 1.0, 17th March 2007

Open Knowledge 1.0
Saturday 17th March 2007
Limehouse Town Hall
http://www.okfn.org/okforums/okcon/

Discussions of ‘Open Knowledge’ often end in licensing wars: legal arguments, technicalities, and ethics. While those debates rage on, Open Knowledge 1.0 will concentrate on two pragmatic and often-overlooked aspects of Open Knowledge: atomisation and commercial possibility.

Atomisation on a large scale (such as in the Debian ‘apt’ packaging system) has allowed large software projects to employ an amazing degree of decentralised, collaborative and incremental development. But what other kinds of knowledge can be atomised? What are the opportunities and problems of this approach for forms of knowledge other than Software?

Atomisation also holds a key to commercial opportunity: unrestricted access to an ever-changing, atomised landscape of knowledge creates commercial opportunities that are not available with proprietary
approaches. What examples are there of commercial systems that function with Open Knowledge, and how can those systems be shared?

Bringing together Open threads from Science, Geodata, Civic Information and Media, Open Knowledge 1.0 is an opportunity for people and projects to meet, talk and build things.

Each thread will have speakers to set the scene, with the rest of the day divided between open space formats and workshop activities.

If you have a presentation or a workshop you would like to give in the open space, or you would like to help organise Open Knowledge 1.0, please get in touch.

Atomization: the Fourth Principle of Open Data Development
==========================================================
Consider the way software has evolved to be highly atomized into
packages/libraries. Doing this allows one to "divide and
conquer" the organizational and conceptual problems of highly
complex systems. Even more importantly it allows for greatly increased
levels of reuse.

A request to install a single given package can result in the
automatic discovery and installation of all packages on which that one
depends. The result may be a list of tens or even hundreds of
packages in a graphic demonstration of the way in which computer
programs have been broken down into interdependent components.
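The dependency walk described above can be sketched in a few lines. This is a hypothetical, minimal resolver over an invented package index (not apt's actual algorithm or data), just to show how one install request fans out into many:

```python
# A toy package index: each package names the packages it depends on.
# All package names here are invented for illustration.
TOY_INDEX = {
    "photo-editor": ["image-lib", "gui-toolkit"],
    "gui-toolkit": ["render-lib"],
    "image-lib": ["compression-lib"],
    "render-lib": [],
    "compression-lib": [],
}

def resolve(package, index, installed=None):
    """Return the full install list for `package`, dependencies first."""
    if installed is None:
        installed = []
    for dep in index[package]:
        if dep not in installed:
            resolve(dep, index, installed)
    if package not in installed:
        installed.append(package)
    return installed

# One request pulls in every package it transitively depends on:
print(resolve("photo-editor", TOY_INDEX))
# → ['compression-lib', 'image-lib', 'render-lib', 'gui-toolkit', 'photo-editor']
```

Real package managers add version constraints, conflicts and alternatives on top of this basic recursive walk, but the "divide and conquer" reuse the text describes comes from exactly this structure.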


OKFN is supporting software allowing the incremental, decentralised,
collaborative and atomised production of open data. KnowledgeForge is
one Open Knowledge Foundation project to provide a platform for
collaborative data development and distribution. The "Open
Shakespeare" project is a prototype distribution of public domain
information with utilities for annotating and cross-referencing it.

--------------------------------------------------------------

Letter from Geospatial: Open Standards, Open Data, Open Source
==============================================================

The "open standards, open data, open source" mantra is not unique to
the geospatial community, but is core to it. Due to our high degree of
specialisation, socialisation and closeness to data, the open source
geospatial community has "incubated" some concerns that are coming to
be apparent in domains where software, knowledge and scientists are
not yet so close together.

Our standards consortium is like a networking club for proprietary
interests; its recent specifications are baggy monsters, filled with
extensions largely concerning access rights, limits and payment
mechanisms. Their older, core standards for RESTful web services *are*
widely used, and have helped the geospatial community to a new level
of "interoperability", as it is still quaintly known.

The new wave of web-based "neogeography" drove the development of
community-based specifications for the simple exchange of geographic
information, which have become de facto standards. There has been an
implementation-driven focus from open source projects seeking to make
it easier to contribute, distribute and maintain openly licensed
geographic information. Now our standards organisation has the bright
idea of a "mass market", "lightweight" standards programme to harness
the energy in this activity. Its established membership, with a lot
of time vested in the matter, is not happy with this.

In the decision-making bodies, following the advice of traditional
domain experts, much is made of "discovery", "catalog services"
and "service discovery services". Among the "grassroots" at the nexus
of open source, open standards and open data there is a call for a
"geospatial web" approach, re-using as far as possible existing
distribution mechanisms and toolkits, RSS/Atom in particular.

ISO standards for information exchange are not solving the problems
faced by the geospatial community. Yet they are being embedded in
international law; "risk management" and disaster recovery provide a
big political drive for exchanging more geographic information.
Through the Open Source Geospatial Foundation, the community is
attempting to influence decision-making bodies through the strength of
the open source / open data approach. "Open" standards are a gateway
to this, and it is a sad day when our official specification for
metadata exchange is an "add to my shopping basket" page.

There's always a lack of emphasis on contribution; transaction and
feedback are an afterthought. The traditional theory of "Public
Participation GIS" comes closer to implementable reality.
"Collaborative mapping" projects producing open licensed data are
becoming the stuff of business plans. The ISO moves in glacial time;
it would be of benefit to shorten the circuit.

How can we bring good status to "complementary specifications"?
Can we use open source software to influence decision-makers?
Can we help provide a good data licensing precedent for others?
Do our distributed storage and query problems look like yours?

Future of Film, 5th March 2007

1. General outline http://www.londonwestside.com/

The film and television industry is changing, but not fast enough. While
studios take fewer risks, fall back on old formulas and find their
traditional markets drying up, new, vibrant cultures and markets for
film are exploding all over the Internet: from video podcasting and
peer-to-peer networks to mobile media, live streaming and interactive
environments. For those with the imagination, curiosity, and passion for
film, there are more opportunities and niches for film-making than ever.

This four day programme will introduce film producers to emerging
techniques and technologies for creating, distributing and promoting
film on the Net. Mixing hands-on training with master-class
presentations, open discussions and public screenings, the focus will
shift from technological developments to creative potentials, and
crucially, to the economic realities: how to actually survive and make
money with all this stuff.

By taking these discussions and learning resources on-line through the
workshop website, the programme will create an ongoing forum for
information sharing and networking, and a showcase for the work of
participants.

———————————————————————

2. General outline for workshops

Video blogging, alternative distribution, peer-to-peer: these are
phrases that set the TV, Film and music industry quaking in their
boots… but needlessly. Like ‘home taping’, VCRs and DVD recorders,
these technologies are not bogeymen, they are business opportunities.
These workshops will focus on how to create, promote, fund and
distribute film totally on-line, without the cumbersome middlemen of the
distributors and promoters – until your film gains sufficient notoriety
for it to go mainstream, of course!

2.1 Technical Workshop (Adnan Hadzi)
The workshop will introduce participants to tools, technologies and available services for encoding, uploading and sharing their films and video blogs online using free and open source software such as Broadcast Machine (RSS feeds, Democracy Player, iTunes vodcasts, BitTorrent) and DyneBolic.
Participants will also be shown how to use x.264 encoding (for portable video devices such as the iPod, Sony PSP, Archos, etc.) to encode and prepare their movies, in conjunction with encoding tools that they can download and take home.

2.2 Production Workshop (Penny Nagle)
The production workshop will introduce participants (briefly) to the
terminology and areas of interest they’ll need to understand to manage
projects in this area. It will then delve into the business issues
involved in using p2p technologies – the advantages, dangers, and
possibilities it opens up. There will also be an overview of the kinds
of business models that are flourishing online, with examples of
cross-overs between established film-industry and new, emerging markets
in online distribution.

Deptford.TV workshops Phase II October – December 2006

The Deptford.TV workshops have entered phase II. For the first time we started to use the database content directly in the editing suites. We used the commercial software Avid Xpress Pro as well as Final Cut Pro and imported the x.264 files. Avid had problems importing the files: audio and video came out of sync, forcing us to export and import the sound separately.

The next test will be with http://cinelerra.org embedded in the http://dynebolic.org live CD. Our goal is to have the editing process running on FLOSS software by spring/summer next year!

The films were premiered on the pirate boat (mind-sweeper) together with a presentation by our partners of bitnik’s Download Finished system.

strictly for true file-sharers & river-bound culture vagabonds, YARRR!

– the premiere of the film “Strategies of Sharing”

– the premiere of four shorts: “Pipes”, “streetworks”, “Bookie Talks”, “Living Archive”

– and the premiere of “DOWNLOAD FINISHED”

Friday 24th of November, screening starts 9pm.

RSVP essential – places are limited – please book in advance by sending an email to info@deptford.tv!

Media collective bitnik and Sven Koenig:

DOWNLOAD-FINISHED – MAKE PEER-2-PEER CINEMA!

DOWNLOAD-FINISHED is an assistant for transforming and re-publishing films from p2p networks and online archives. Found footage becomes raw material for the transformation machine, which translates the data structure of the films onto the surface of the screen. The original pictures dissolve into pixels and overlap into a second layer; the hidden structure becomes visible. Through DOWNLOAD-FINISHED, file-sharers become authors and re-interpret their most beloved films.

This is the premiere of DOWNLOAD-FINISHED.

www.download-finished.com – the art of file-sharing

WARNING – enter the premises at your own risk: be aware that you will have to “squeeze” through the entrance gate (but we have informed the police and the owners of the industrial estate, set up a “mindsweeper” sign, and will put out guiding lights to the boat, so that you get there…)

transmission.cc, 13-15 October 2006, London, Limehouse

At the Transmission festival one of the discussions was around FLOSS Manuals (see http://www.flossmanuals.net) and the importance of documenting multimedia tools. One of the projects dealing with manuals is http://converge.org.uk, which looks at video distribution using the x.264 codec (vodcasts). We decided to hold a follow-up event at the Limehouse on April 27, 2007.

(quoted from http://transmission.cc/About)

Re-transmission was a three-day gathering of video makers, programmers and web producers developing online video distribution as a tool for social justice and media democracy. The two events at the British Film Institute were made up of presentations and screenings, first exploring citizen video reporting on the Net, then discussing how to collaborate and share content on the net in the new era of Open Source and WebTV. For the full Re-transmission programme see

http://retransmission.org.uk

for documentation of the event see:

http://www.archive.org/details/retransmission the film

http://wiki.transmission.cc/index.php/London%2C_October_2006 the wiki

Online video, 10 years in.

Metadata process update

Thanks to some funding from the alt-media-res project, we now have a draft metadata standard PDF prepared by JJ King and Jan Gerber:

This report and proposal is under consultation within Transmission-connected networks until Monday November 6th.

We are looking for substantive responses, feedback and proposals, in particular direct inputs from the other Transmission working groups – e.g. Translation/Subtitling, DoNoHarm, Aggregator R&D, Documentation…

I am posting summaries of responses in the wiki and hope that people with experience of this field of work can suggest practical ways forward to finalise and then begin to implement the standard.

After 6th November we will gather the working group together to review and improve the schema, spec out next steps, etc. Please add to and improve the metadata to-do list.

As part of this process, JJ King and I have drafted a proposal for implementation of this standard.

28th October 2006 (zoe)

http://wiki.transmission.cc/index.php/Responses_to_draft_schema

http://www.shiftspace.cc/j/meta/tx_report_0.2.pdf

http://www.clearerchannel.org/transwiki/index.php?title=Proposal_for_further_funding_for_implementation_of_RDF_schema

DIGITAL NARRATIVES

A presentation based on the Open the Space Guide produced by the Trace Online Writing Centre in 2002-2003, with many adaptations, additions and changes.

1. What is hypertext?

New Media Writing began as hypertext, which in turn began as a concept for the organisation of information.

In 1945, Vannevar Bush published an article entitled As We May Think in which he called for scientists to find new ways to store, process and access the massive amounts of knowledge available and constantly growing in the world. Libraries and their traditional methods of indexing and classification are no good for the navigation of such large data stores, he said, because they are not sufficiently intuitive: “The human mind operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain…”. (as quoted in http://tracearchive.ntu.ac.uk/transition/guide/origins.htm accessed 25/11/2006)

The concept was further developed twenty years later when American programmer and designer Ted Nelson invented a system called Xanadu because he realised that: “We need a way for people to store information not as individual “files” but as a connected literature.” (as quoted in http://tracearchive.ntu.ac.uk/transition/guide/origins.htm accessed 25/11/2006)

Nelson coined the terms hypertext and hypermedia:

“Hypertext was ‘nonsequential’ text, in which a reader was not constrained to read in any particular order, but could follow links and delve into the original document from a short quotation. Ted described a futuristic project, Xanadu, in which all the world’s information could be published in hypertext. (…) He had the dream of a utopian society in which all information would be shared among people who communicated as equals.”
[Berners-Lee, Tim Weaving the Web: the Past, Present and Future of the World Wide Web London and New York: Texere Publishing Ltd., 2000 (1st published: London: Orion Business, 1999), pp. 5-6]

In 1991 (26 years later), at CERN in Switzerland, Tim Berners-Lee developed the first global hypertext system: the World Wide Web.

“The fundamental principle behind the Web was that once someone somewhere made available a document, database, graphic, sound, video or screen at some stage in an interactive dialogue, it should be accessible (subject to authorisation, of course) by anyone, with any type of computer, in any country. And it should be possible to make a reference -a link- to that thing, so that others could find it. This was a philosophical change from the approach of previous computer systems. (…) Getting people to put data on the Web often was a question of getting them to change perspective, from thinking of the user’s access to it not as interaction with, say, an online library system, but as navigation through a set of virtual pages in some abstract space.” (Ibid, p. 40)

“When I proposed the Web in 1989, the driving force I had in mind was communication through shared knowledge, and the driving ‘market’ for it was collaboration among people at work and at home. By building a hypertext Web, groups of people of whatever size could easily express themselves, quickly acquire and convey knowledge, overcome misunderstandings and reduce duplication of effort. This would give people in a group a new power to build something together.” (Ibid, p. 174-174)

Today we talk about the Web 2.0, a second generation of Internet-based services –such as social networking sites, wikis, and communication tools– that emphasize online collaboration and sharing among users (Wikipedia, accessed 26/11/2006), and we continue to apply technology to art to make new meanings and to connect with each other.

2. Interactive storytelling

In recent years new forms of media writing have emerged, along with many different terms used to describe these: digital fiction; hypermedia; flash poetry; electronic literature; hypertexts; multi-media texts; web-based narratives . . . the list is long.

New media writing, being an emergent genre, does not even quite recognize itself yet. New media writers use different terms to refer to their work and to themselves. This is not unlike the broader debate about the terms and practices of new media art or media art or digital art or electronic art or art and new technologies….

Nevertheless, all new media writings have at least one thing in common: they must be viewed through the medium of an electronic display, usually a screen but sometimes just audio, via a computer, a PDA, mobile phone, data projector, or other device. Their uniting characteristic is that the computer is an essential and inherent component of the writing, and without it the work would not exist.

Another common feature of much new media writing is the use of hypertext, which structures information in such a way that related items are connected, or threaded, together by links called hyperlinks. The items so linked may be text, but increasingly include other media, such as graphics, sound, animation or video. In this way hypertext becomes hypermedia.
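A hypertext of this kind is, at bottom, a graph of items joined by links. A minimal sketch (all node names, contents and links here are invented for illustration) shows how two readers can derive two different texts from the same linked material:

```python
# A toy hypertext: each node holds some content and a list of outgoing links.
hypertext = {
    "start":  {"content": "It was a dark night...", "links": ["forest", "castle"]},
    "forest": {"content": "Trees close in.",        "links": ["castle", "start"]},
    "castle": {"content": "A light in a window.",   "links": ["start"]},
}

def read(node, path):
    """Follow one reader-chosen path of links, collecting the text read."""
    story = [hypertext[node]["content"]]
    for choice in path:
        # A reader may only follow links that actually exist at this node.
        assert choice in hypertext[node]["links"], "no such link here"
        node = choice
        story.append(hypertext[node]["content"])
    return story

# Two readers, two different orderings of the same nodes:
print(read("start", ["forest", "castle"]))
print(read("start", ["castle"]))
```

The nonsequential quality Nelson described lives entirely in the `links` lists: the "text" is whatever sequence a reader's choices assemble.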

Janet H. Murray, in her book Hamlet on the Holodeck, talks about authorship in a new media context as ‘procedural’, which means that the author writes not only the text, but also the rules by which the text appears, that is, the rules for the reader's/interactor's involvement. I would add that, within such a context, the writer sometimes does not write the text at all. Instead, s/he creates the conditions for the interactors to produce the text themselves, and sets the context and rules for what can be produced and how. According to Murray: “The procedural author creates not just a set of scenes but a world of narrative possibilities.” (Murray, Janet H. Hamlet on the Holodeck: the Future of Narrative in Cyberspace Cambridge, Mass.: MIT Press, 1997, p. 153)
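Procedural authorship can be made concrete with a toy sketch: the author supplies only grammar-like rules, and each reading expands them into a sentence the author never wrote verbatim. All rules and vocabulary below are invented for illustration:

```python
import random

# The "author" writes rules, not finished text: a tiny toy grammar.
RULES = {
    "STORY": [["HERO", "meets", "OTHER", "in the", "PLACE"]],
    "HERO":  [["a sailor"], ["a cartographer"]],
    "OTHER": [["a stranger"], ["an old friend"]],
    "PLACE": [["harbour"], ["archive"]],
}

def expand(symbol, rng):
    """Recursively expand a symbol using the author's rules; plain words
    (symbols with no rule) are emitted as-is."""
    if symbol not in RULES:
        return symbol
    production = rng.choice(RULES[symbol])
    return " ".join(expand(s, rng) for s in production)

# Each "reading" is one of the texts the rules make possible:
print(expand("STORY", random.Random(0)))
print(expand("STORY", random.Random(1)))
```

The author here controls the "world of narrative possibilities" (which heroes, which places, which sentence shape) while the particular text only comes into being at reading time.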

Another term Murray uses to describe the structure of new media writings is “kaleidoscopic”: the structure allows many actions to take place simultaneously, in multiple ways. (See ibid)

Finally, another element that Murray identifies as important for new media writing is the potential of enactment: the computer does not describe characters, like printed text does, nor does it observe them, like moving image does; instead, the computer “embodies and executes them” (ibid, p. 181), thus allowing us to explore this process of becoming. At the same time, the reader/interactor not only reads/ witnesses the story but, in some cases, s/he becomes its very protagonist.

3. Books vs. Electronic Media

According to the Trace guide, books have had several centuries to evolve and we have had all these centuries to become very sophisticated book-readers. We no longer ‘see’ the technology involved in book production, whereas we do ‘see’ the technology involved in the production of a hyper-novel or other piece of media writing.

When you see the physical object of a book, you know what to expect of this book from its very looks: its cover, the images and colours used, the type-face, the publishing company, along with the title and the name of the author, all convey information about what you should expect. In a glance you can judge if this is ‘serious’ fiction, a ‘thriller’, or an academic book.

In the same way, when reading a book you can easily assess where you are in the text overall. You know when you’ve just begun, you know when you’re half-way through, and you know how long it will take you to finish it.

We often think of interactive storytelling as something that can only happen on the web or through the use of hypertext. This is not the case. Some examples of interactive pre-web literature are:
– Jorge Luis Borges (1941) “The Garden of Forking Paths”, in Labyrinths
– Milorad Pavic (1988) Dictionary of the Khazars

These conventions most often don’t apply to new media writing. This can make the life of a non-experienced new media reader fairly complicated to start with. For example, often there is no way of determining how large or complex a piece of writing is before you actually start navigating your way through it, so authors often provide tools such as help-files or site-maps to guide the reader. Once you begin to navigate through the text, the level of complexity becomes clear, but there is still no obvious way of assessing the length of a piece. In many cases this question does not even have an answer as, often, a hypertext is as ‘long’ as you want to make it. Length quickly becomes irrelevant because new media works often do not reach an ending or resolution in any conventional sense. Some narratives end by taking the reader back to the beginning; others do not end at all, but rely on the reader to find a sense of completion through exploring all the links via their own self-created pathways through the work.

New media writing relies on reader input to a far greater extent than print fiction. This is not true of all works –with some new media pieces the only ‘input’ the reader has is the electronic equivalent of turning pages, clicking the mouse to move forward or to begin an animation /film. Other pieces offer myriad alternate routes for the reader, whereas some depend on the community of their readers for their very existence (e.g. Wikis).

The range of new media writing available now is vast. There is non-fiction, short fiction, novels, poetry, journalism and works that fuse several forms. There are pieces that use sound as well as moving images, pieces that require the reader to contribute to the text, literary games, collaborative works, and works-in-process that are constantly changing.

As a reader, you may be asked to contribute something of your own –a fragment of text, a sound, or a memory. You may be asked to provide your email address so that the characters can interact with you after you have stopped ‘reading’ the work. Indeed, the text you’re reading may be written by hundreds of other people, sometimes anonymously, sometimes named. Sometimes there is no text at all until you have helped create it.

Reading new media writing is all about exploring –exploring the web to see what’s out there, exploring the new technologies and how to use them, exploring new ways of reading, new ways of telling stories.

Short Bibliography:

Berners-Lee, Tim Weaving the Web: the Past, Present and Future of the World Wide Web London and New York: Texere Publishing, 2000

Bush, Vannevar “As We May Think” in The Atlantic Monthly July 1945. Available at: http://www.theatlantic.com/doc/194507/bush

Calvino, Italo Invisible Cities London: Harcourt, 1974

Calvino, Italo If On a Winter’s Night a Traveler London: Vintage, 1998

Murray, Janet H. Hamlet on the Holodeck: the Future of Narrative in Cyberspace Cambridge, Mass.: MIT Press, 1997

Pavic, Milorad Dictionary of the Khazars London: Penguin, 1989

Pavic, Milorad Last Love in Constantinople London: Peter Owen Publishers, 1998

Rieser, Martin and Zapp, Andrea (Eds) New Screen Media: Cinema / Art / Narrative London: BFI Publishing, 2002

Wardrip-Fruin, Noah and Harrigan, Pat (Eds) First Person: New Media as Story, Performance and Game Cambridge, Mass. and London: MIT Press, 2004

Links for this presentation at:
http://del.icio.us/mariax/dnarratives

CHArt 2006

CHArt

Last week, on Thursday 9 and Friday 10 November, I spent two days at the CHArt 2006 conference. The title this year was Fast Forward – Art History, Curation and Practice after Media.

Previous CHArt conferences –CHArt has been taking place since 1985!– were concerned with the practice, history and preservation of ‘computer arts’. This latest event, however, was broader, looking at current issues of curation along with questions of history and preservation. To have a look at the programme and abstracts visit http://www.chart.ac.uk/chart2006/index.html

New Club Night on Thursday 16th Nov. with TIM HOPKINS

Thursday November 16, 6-8pm in the Seminar Rooms, Ben Pimlott Building, Goldsmiths, University of London, New Cross, SE14 6NW

FREE, ALL ARE WELCOME

*ELEPHANT AND CASTLE*

Tim Hopkins will introduce a new lyric theatre / digital media work-in-progress called ELEPHANT AND CASTLE.

“Architecture is music, frozen.” Goethe

A new lyric theatre piece using the web to link audiences in two architectural spaces simultaneously: the Elephant and Castle Shopping Centre and the Aldeburgh Festival, Suffolk. It explores how human activity is directed by environment, in this case in two places that represent contrasting ideas of a designed society.

The Elephant was Britain’s first Drive-In Shopping Centre, opened in 1965, and, along with many other buildings of its generation, is being redeveloped or effaced. The Snape Maltings concert hall was opened in 1967.

Commissioned by LONDON ARTISTS PROJECTS
Research Phase funded by ARTS COUNCIL (UK)

*Tim Hopkins* works in two related areas: opera production and making new lyric theatre works with multimedia elements. He began making work in Opera and Theatre as a director from 1989, and additionally as a scenic designer and filmmaker from 1998.

He has been commissioned to direct opera repertoire for WNO, English National Opera, The Royal Opera Covent Garden, Opera North, Glimmerglass, Teatro dell’Opera Roma, Bayerische Staatsoper Festspiel, Theatre Basel, Graz Oper, Staatsoper Hannover, Wexford Festival, ETO, Alternative Lyrique Paris, Almeida Opera, Aldeburgh Festival and others. He has been commissioned to make original works, involving lyric theatre, moving image and digital media by Opera North, Aldeburgh Festival, ROH2, The Sage Gateshead, Birmingham Contemporary Music Group, Channel 4 TV, LAP.

In 2001 he was awarded a NESTA Fellowship for personal artistic development.