It has taken me quite a while to boil down a long discussion into this much. I hope it's useful, and I hope that any of the participants feel able to add comments and/or amendments to this text. Apologies for the unforeseen delays in getting this done.
Our session started with a presentation by Patrick Crogan on work by Bernard Stiegler, followed by a discussion that generated a list of questions/provocations both for future discussion and for pervasive media designers. N.B. My notes don't always differentiate between questions for software designers and questions for pervasive media designers.
Patrick focused on Stiegler's ideas on Autopilot behaviour and Grammars of Activity, and the following is a very brief outline of these.
Autopilot:
In a lot of our everyday life we function in Autopilot mode, where we respond without conscious thought to our environment and other people. If we did not have the ability to develop these patterns of activity then we would not be able to negotiate everyday life. Stiegler suggests that we should sometimes pay more attention to these autopilot activities, to be sure that we have an awareness of what we are doing.
Autopilots are based on “grammars of activity”. We learn these ways of doing/being; they are not innate/natural but are socialised behaviours that have at some point been written down and formalised as sets of rules of behaviour in society. The problem with embedding these grammars into smart technologies is that they then become fixed.
As makers, we find our creative focus influenced by both technical and social grammars, whether we realise it or not:
Technical specifications limit or condition the activity. Technological grammars are the outcome of previous histories, political decisions, etc.; Stiegler has looked at the development of the international MPEG standards as a case in point.
Any maker is always working within a milieu/company/industry/funding opportunities etc. Academics work within formats that mould projects and outcomes. They are also working within a social/cultural/broad technical culture. Culture has always been about the passing down of techniques – passing on, in a transmittable format, the earlier experiences of someone who may now be long dead.
(This made me think that I should look again at Judy Wajcman)
Stiegler talks of:
savoir faire – how to do/make things
savoir vivre – how to live well: to enjoy/appreciate art/wine/cinema/games ... i.e. evolve technical competencies
savoir penser – how to think well: to reflect on things, argue, propose a position, develop existence/morals/ethics etc.
These are all carried forward in our culture through artefacts, and internalised artefacts. (Memory is a form of artefact if you are human, which is what makes us different from animals.) In the creative moment we have all of these as part of our autopilot processes, before, during and afterwards.
To develop our ethical considerations around pervasive media it may be useful to think about these three states of Stiegler's.
Referring back to the paper on metadata that Stiegler presented at Goldsmiths: this was about promoting a more critical engagement with the metadata that we unwittingly produce.
Stiegler talked about Web 2.0 and how it combines top-down and bottom-up production of grammars.
Top-down could be characterised as being developed by 'experts', e.g. MPEG standards associations, governing bodies, etc. Stiegler feels that these are too dominated by market forces, based on an outdated model of capital that privileges short-term thinking, and that media industries are tied to this mid-twentieth-century style of thinking, where experts define what people should think. He says that this “battery farming of attention” should and could be challenged with a revival of 'enlightenment'-style essays to promote more critical thinking. The main point is that we need to develop critical thinking so that we are not dependent on other people for our ideas.
Discussion points:
The software developer is making the tools with which other people make the artefact; constructing the grammar of the tool so that users are enabled, but also constrained. The constructors of the tools need to be self-aware of the grammars they are explicitly/implicitly embedding in the tools.
With a book, you're not interested in the construction of the paper, but if your reading of it were being followed by someone, you would want to know. So how do we make the footprint of our reading visible?
Some things promote misunderstandings, e.g. using headlines to grab attention.
If a book could always automatically show that, how do we create a tool that creates books that always show it?
Can you build ethics into a tool, or at least create a set of conditions where the user can?
1. YouTube: anyone can flag offensive content (bottom-up)
2. Bots that scout the web looking for copyright violations (top-down control)
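As a rough illustration of what building those conditions into a tool might look like, here is a minimal hypothetical sketch in Python (names like ContentItem and automated_check are invented, not any real platform's API) combining a bottom-up flag that any user can raise with a top-down automated check. Neither route removes anything on its own; both just leave a visible trace for review.

```python
# Hypothetical sketch only: shows bottom-up flagging and a top-down bot check
# living side by side in one tool. Not any real platform's moderation API.

from dataclasses import dataclass, field


@dataclass
class ContentItem:
    item_id: str
    text: str
    flags: list = field(default_factory=list)      # bottom-up: raised by any user
    notices: list = field(default_factory=list)    # top-down: raised by automated checks

    def flag(self, user: str, reason: str) -> None:
        """Any user can flag content they find offensive (bottom-up)."""
        self.flags.append({"user": user, "reason": reason})

    def automated_check(self, watched_phrases: list[str]) -> None:
        """A crude top-down check, e.g. a bot scanning for known copyrighted phrases."""
        for phrase in watched_phrases:
            if phrase.lower() in self.text.lower():
                self.notices.append({"rule": "possible copyright match", "phrase": phrase})

    def needs_review(self) -> bool:
        """Neither route removes content by itself; both only queue it for a human decision."""
        return bool(self.flags or self.notices)


item = ContentItem("clip-42", "remix of the full original soundtrack")
item.flag(user="viewer_1", reason="offensive")
item.automated_check(watched_phrases=["original soundtrack"])
print(item.needs_review())  # True: flagged by a person and matched by the bot
```

The design choice the sketch tries to show is that both the bottom-up and top-down routes stay visible and contestable rather than deciding silently on the user's behalf.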
The ethics is in the nature of the experience – is there a danger in this way of interpreting technology that only a few people will understand? It's like the difference between people watching a film and watching a talk about the making of a film.
The ethic of an experience will automatically be inherited from the context in which it is used and the tools used to make it.
The Mscape ethic is in the technology – a decision was made early in the development process so that the user's position would not be delivered back to HP.
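To make that idea of an ethic coded into the technology concrete, here is a purely hypothetical sketch (not the actual Mscape code, which I haven't seen): position updates are consumed on the device to trigger located media and then discarded, and there is simply no code path that transmits them anywhere.

```python
# Hypothetical sketch only, not the real Mscape implementation. The point is
# that the ethical decision lives in the code path itself: nothing here ever
# sends the user's position off the device.

from dataclasses import dataclass


@dataclass
class Region:
    name: str
    lat: float
    lon: float
    radius_m: float


def inside(region: Region, lat: float, lon: float) -> bool:
    """Crude flat-earth distance check; good enough for a small play area."""
    metres_per_degree = 111_000
    d_lat = (lat - region.lat) * metres_per_degree
    d_lon = (lon - region.lon) * metres_per_degree
    return (d_lat ** 2 + d_lon ** 2) ** 0.5 <= region.radius_m


def on_position_update(lat: float, lon: float, regions: list[Region]) -> list[str]:
    """Position is used on the device to decide which media to trigger, then discarded."""
    return [r.name for r in regions if inside(r, lat, lon)]


# Illustrative coordinates only.
regions = [Region("harbourside_audio", 51.4495, -2.5990, 50.0)]
print(on_position_update(51.4496, -2.5991, regions))  # ['harbourside_audio']
```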
Indigenous Australian project – sensitivities about representations of people and artefacts in the territories, etc.
People from different media backgrounds won't necessarily recognise ethics from each other's backgrounds. We need to share examples and be explicit, so that the language is not impenetrable. One way would be to pose a set of ethical questions and give examples.
Farming of attention – whether technology promotes critical/intelligent thinking. Example of taking signage off a road to make it safer because people concentrate more – you can't be on autopilot when you have to stay aware of your surroundings.
We suspect there will be a demand for greater regulation of pervasive media in the future because you can bump into it anywhere, even though we think there are already too many rules. Principles are a better way to frame a project.
Provocations/general questions for future discussions:
What pervasive media have people already accepted?
e.g. traffic lights, CCTV, store cards ... Most will ignore the implications of these, i.e. will use store cards to get the discount at the supermarket and not worry about their personal data being sold. Have we accepted it as a business transaction?
What level of public awareness of pervasive media is there?
Talking to young people would be fruitful: they have grown up with these technologies that enable (?) differing notions of privacy and self
Not everyone wants to know about the author of a book, but they may like to know the decisions that were made in the construction of the artefact.
Stiegler says that legislative amendments in France, e.g. where minors can be tried for crimes as adults, represent a collapse of minority/majority that speaks a lot about the failure of the majority to take care of minors: not paying attention to them, or engaging with them, but leaving their development to games devices and TV. The result is that their education is primarily in becoming a consumer. He says this means there is a loss of the superego in our culture. We think we have recreated it in laws, CCTV and regulations, but this is not superego; it is not internalised. Pervasive media is more sensitive because it involves the idea of public space.
A bottom-up approach to the development of pervasive media does not in itself make a project more ethical, although it may be more critical.
Ethics is not absolute: it is about promoting Majority in Kantian terms – we have a duty as citizens to accept “rules” but must critique them. The tension then is what we do if we dispute that they are good laws: do we disobey bad laws?
Privacy is seeing who can oversee you – a natural condition of human activity. You can't tell in the public realm now that we have CCTV, which takes away the sense of mutuality where we see each other; we don't necessarily know who is seeing us. Reflexive awareness is what is needed. Is it OK to give your personal info if you can edit it yourself? Is it enough to know what I have given away?
What guidelines are there that are relevant from other sectors, e.g. advertising standards, education?
What defines a unique experience in a pervasive media project?
What are the specific challenges of pervasive media?
We can't make rules to get rid of rules – that is why we want to develop provocations and/or design principles for designers of software and applications.
Deep principles, not rules:
Respecting user choice:
promote anonymity as far as possible.
make it so that people are able to choose what to share (see the sketch after this list)
who has editorial control?
is editorial control with the maker or the audience, or does it shift between the two?
Build in capabilities for reciprocity, responsibility, citizenship:
mediate, don't automate
we are more flexible than the technologies we produce
A pervasive media experience is one that can move across boundaries of different technologies/live art:
a fundamental aesthetic is knowing the border of the game, not squashing it
build in options for users to bypass aspects of it
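As a sketch of what respecting user choice might mean inside the software itself, here is a minimal hypothetical example (names invented, not taken from any existing toolkit): anonymity is the default, nothing leaves the device without an explicit opt-in, and editorial control over what is shared sits with the person rather than the maker.

```python
# Hypothetical sketch of "respecting user choice" as a default in the tool:
# identity defaults to anonymous, every field defaults to not shared, and the
# only way data leaves the device is through an explicit opt-in.

from dataclasses import dataclass, field


@dataclass
class SharingPreferences:
    anonymous: bool = True                            # promote anonymity as far as possible
    shared_fields: set = field(default_factory=set)   # nothing shared unless opted in

    def opt_in(self, field_name: str) -> None:
        self.shared_fields.add(field_name)

    def opt_out(self, field_name: str) -> None:
        self.shared_fields.discard(field_name)


def export_profile(profile: dict, prefs: SharingPreferences) -> dict:
    """Only fields the person has explicitly chosen to share ever leave the device."""
    shared = {k: v for k, v in profile.items() if k in prefs.shared_fields}
    if prefs.anonymous:
        shared.pop("name", None)
    return shared


prefs = SharingPreferences()
prefs.opt_in("favourite_locations")
print(export_profile({"name": "Jo", "favourite_locations": ["harbourside"]}, prefs))
# {'favourite_locations': ['harbourside']} – the name stays local by default
```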
What is the appropriate cultural context/cultural setting eg school, public space?
What platforms does it run on/migrate across – from big screen to mobiles?
What behaviour does it elicit; how will it impact on others and how will this affect safety?
How might it be subverted by others, or changed in a public space?
How immersive is it?
What levels of consent are needed? e.g. if the people playing game A don't know that they are being tracked by people playing game B as part of that game
Does the technology itself raise ethical questions because of what is coded in the technology?
Don't want to have to click a disclaimer at the start of a game.
Don't want to have to do an ethical risk assessment every time you make something.
Thursday, 12 March 2009