Friday 27 March 2009

tweet-a-watt

A friend of mine shared this on her Facebook page, and I am trying to think of how one might use it. It's a wattage meter that sends information about power consumption via Twitter. Does this mean I could keep an eye on whether the kids have switched on the computer/TV/games console when they weren't meant to, even when I'm not at home? Hmmm.......
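As far as I understand it, the device pairs a plug-in power meter with a wireless link and something that posts readings to a Twitter account. A hypothetical sketch of the watching-the-kids idea (the threshold, the stubbed sensor read and the post_update function are all my assumptions, not the actual Tweet-a-Watt code):

    import time

    WATT_THRESHOLD = 60  # assumed: above this, something is switched on

    def read_watts():
        # Stub: the real Tweet-a-Watt receives readings over a wireless
        # link from a modified plug-in power meter.
        return 142.0

    def post_update(message):
        # Stub: a real version would post to Twitter with the account's
        # credentials via its web API.
        print("tweet:", message)

    while True:
        watts = read_watts()
        if watts > WATT_THRESHOLD:
            post_update("Living-room socket is drawing %.0f W" % watts)
        time.sleep(15 * 60)  # check every fifteen minutes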

Thursday 19 March 2009

google street view goes live

http://news.bbc.co.uk/1/hi/technology/7952317.stm

Thursday 12 March 2009

Is Privacy Possible in the 21st century?

These are my rough notes from the event organised by the University of Bristol, where Andrew Charlesworth (IT & Law) and Nello Cristianini (AI Dept) gave really engaging talks on the legal and technical perspectives on 21st-century privacy.

The talks fitted well with our discussions about metadata, ethics for developers, etc. It was really interesting to have the legal perspective too.

Patterns in personal data: we leave trails in transaction space, weblogs and so on.
AOL – can build up a picture of us through the search terms we use
Google – queries, position, email content, background information, youtube use, calendar, news preferences etc
upmystreet.com
Today it is possible to segment by past behaviour, e.g. people who bought a pizza and rented a video may buy a beer next (a toy sketch of this kind of rule follows below). Look at www.data-response.co.uk/consumer_response.html
The US presidential election used targeted/micro-marketing to identify potential voters.
What concepts, laws and values should we develop to safeguard ourselves and future generations?
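To make the pizza/video/beer kind of rule above concrete: a marketer only needs co-occurrence counts over past transactions. A toy sketch, with entirely made-up data and arbitrary thresholds:

    # Toy association-rule check: how often do customers who bought
    # pizza AND video also buy beer? (invented transaction data)
    transactions = [
        {"pizza", "video", "beer"},
        {"pizza", "video", "beer"},
        {"pizza", "video"},
        {"milk", "bread"},
        {"pizza", "beer"},
    ]

    antecedent = {"pizza", "video"}
    both = sum(1 for t in transactions if antecedent <= t and "beer" in t)
    ante = sum(1 for t in transactions if antecedent <= t)

    confidence = both / ante  # P(beer | pizza and video)
    print("confidence: %.0f%%" % (100 * confidence))  # 2 of 3 -> 67%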

Is privacy a 21st century delusion? – the legal perspective:
What is privacy? The balancing of individual versus community objectives is a modern invention: privacy was first mentioned in a US law journal in 1890, and is primarily a twentieth-century notion. In the 1950s there was a fear of dossiers; now the fear is data-mining of information about us.
Erik Larson, in The Naked Consumer, mentions that in the 1960s information about birthdays, collected by the Dairy Queen chain in order to give out free celebratory milkshakes, was used by the US draft board to determine whether young men were eligible to fight in Vietnam (i.e. over 18). This is an example of Larson's Second Law of Data Dynamics: data will always be used for purposes other than those originally intended.
Data protection is NOT the same as privacy protection. The Data Protection Act is out of date, too complex, places too much reliance on individual self-help, and is toothless. It was written in the days of mainframes and dumb terminals. The public sector is often exempted, and the private sector faces only minor limits. It only covers data identifying a living individual.
Phorm – BT – examines every webpage visited and generates adverts accordingly
Key questions
1. Can governments and corporations achieve legitimate goals without causing disproportionate harm?
2. Will changes in privacy affect us all equally, or continue divisions based on wealth?
3. Will we see a backlash against data-mining, profiling and clustering, and what form might it take? E.g. encryption, data pollution, physical/electronic attacks? The government is developing a new database that will contain all our travel details.


Usenet back archives were bought by Google and put online, searchable – so anything you looked at or wrote in the 1980s is now public. We need new laws on informational privacy.

We need a culture shift.
1. Without social/legal restraints, the collection and retention of personal data will increase in line with storage capacity.
2. There is no 'forgetting' of personal data – no clean slate for juvenile offenders.
3. Governments and businesses will increasingly interact with your 'data shadow'.
4. Individuals will modify their behaviour, either consciously or unconsciously, to maintain a favourable data shadow.

Questions from the floor:

On marketing making us homogeneous – Andrew was more concerned with political homogeneity.

We used to be able to rely on totalitarian states being inefficient – is that much harder to rely on now?

The importance of using consumer pressure – companies could market themselves as ones that don't log data, since e-commerce firms have an incentive to be perceived as trustworthy; but if they go bust, there is a financial incentive to sell any data they hold.

Discussion January 15th 2009

It has taken me quite a while to boil down a long discussion into this much. I hope it's useful, and I hope that the participants feel able to add comments and/or amendments to this text. Apologies for the unforeseen delays in getting this done.

Our session started with a presentation by Patrick Crogan on the work of Bernard Stiegler, followed by a discussion that generated a list of questions/provocations, both for future discussion and for pervasive media designers. N.B. My notes don't always differentiate between questions for software designers and questions for pervasive media designers.

Patrick focused on Stiegler's ideas on Autopilot behaviour and Grammars of Activity, and the following is a very brief outline of these.

Autopilot:
In a lot of our everyday life we function in autopilot mode, where we respond without conscious thought to our environment and other people. If we did not have the ability to develop these patterns of activity then we would not be able to negotiate everyday life. Stiegler suggests that we should sometimes pay more attention to these autopilot activities, to be sure that we have an awareness of what we are doing.
Autopilots are based on “grammars of activity”. We learn these ways of doing/being; they are not innate or natural, but are socialised behaviours that have at some point been written down and formalised as sets of rules of behaviour in society. The problem with embedding these grammars into smart technologies is that they then become fixed.

As makers, our creative focus is influenced by both technical and social grammars, whether we realise it or not:

Technical specifications limit or condition the activity. Technological grammars are the outcome of previous histories, political decisions, etc. Stiegler has looked at the development of the international MPEG standards as a case in point.

Any maker is always working within a milieu/company/industry/funding opportunities, etc. Academics work within formats that mould projects and outcomes. They are also working within a social/cultural/broad technical culture. Culture has always been about the passing down of techniques – passing on, in a transmittable format, the earlier experiences of someone who may now be long dead.

(This made me think that I should look again at Judy Wajcman)

Stiegler talks of:
savoir faire – how to do/make things
savoir vivre – how to live well: to enjoy/appreciate art/wine/cinema/games... i.e. to evolve technical competencies
savoir penser – how to think well: to reflect on things, argue, propose a position, develop existence/morals/ethics, etc.
These are all carried forward in our culture through artefacts, and internalised artefacts. (Memory is a form of artefact if you are human, which is what makes us different from animals.) In the creative moment we have all these as part of our autopilot processes, before, during and afterwards.

To develop our ethical considerations around pervasive media it may be useful to think about these three states of Stiegler's.

Referring back to the paper on metadata that Stiegler presented at Goldsmiths, this was about promoting a more critical engagement with the metadata that we unwittingly produce.
Stiegler talked about Web 2.0 and how it combines top-down and bottom-up production of grammars.
Top-down could be characterised as grammars developed by 'experts', e.g. MPEG standards associations, governing bodies, etc. Stiegler feels that these are too dominated by market forces, based on an outdated model of capital that privileges short-term thinking, and that the media industries are tied to this mid-twentieth-century style of thinking, where experts define what people should think. He says that this “battery farming of attention” should and could be challenged with a revival of 'enlightenment'-style essays to promote more critical thinking. The main point is that we need to develop critical thinking so that we are not dependent on other people for our ideas.

Discussion points:


The software developer is making the tools with which other people make the artefact; constructing the grammar of the tool so the users are enabled, but also constrained. The constructors of the tools need to be self-aware of the grammars they are explicitly/implicitly embedding in the tools.

With a book, you're not interested in how the paper was made, but if your reading of it were being followed by someone, you would want to know. So how do we make the footprint of our reading visible?
Some things promote misunderstandings, e.g. using headlines to grab attention.
If a book could always automatically show that it was being watched, how would we create a tool that makes books that always show it?
Can you build ethics into a tool, or at least create a set of conditions where the user can?
1. YouTube: anyone can flag offensive content (bottom-up)
2. Bots that scout the web looking for copyright violations (top-down control)
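As an aside on what that second, top-down mechanism looks like mechanically, here is a crude sketch of such a bot. This is purely illustrative and assumed rather than drawn from any real system: real detection uses robust content fingerprinting, not the exact hash matching shown here, and the fingerprint set is invented.

    import hashlib
    import urllib.request

    # Invented fingerprint set: SHA-256 hashes of known copyrighted passages.
    # (This one is the hash of the string "test", just as a placeholder.)
    KNOWN_FINGERPRINTS = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def fingerprint(text):
        return hashlib.sha256(text.encode("utf-8")).hexdigest()

    def scan_page(url):
        # Fetch a page and flag it if any line matches a known fingerprint.
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
        for line in html.splitlines():
            if fingerprint(line.strip()) in KNOWN_FINGERPRINTS:
                print("possible violation on", url)
                return True
        return False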

The ethics is in the nature of the experience – is there a danger in this way of interpreting technology, that only a few people will understand? It's like the difference between people watching a film and watching a talk about the making of a film.
The ethic of an experience will be automatically inherited from the context in which it's used and the tools used.
The Mscape ethic is in the technology – a decision was made early in the development process that the user's position would not be delivered back to HP.
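That kind of early decision shows up very concretely in code. A minimal sketch of the pattern, under my own assumptions (the class and region format are invented, not HP's actual Mscape implementation):

    import math

    def distance_m(lat1, lon1, lat2, lon2):
        # Rough equirectangular distance in metres; fine over short ranges.
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return 6371000 * math.hypot(x, y)

    class LocalOnlyLocationHandler:
        # Triggers media regions on the device itself; each GPS fix is
        # used and discarded, never logged or sent back to a server.

        def __init__(self, regions):
            self.regions = regions  # [(name, lat, lon, radius_m), ...]

        def on_gps_fix(self, lat, lon):
            for name, rlat, rlon, radius in self.regions:
                if distance_m(lat, lon, rlat, rlon) < radius:
                    print("trigger:", name)  # stand-in for playing the media
            # deliberately: no logging, no network call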
Indigenous Australian project – sensitivities about representations of people and artefacts in the territories, etc.

People from different media backgrounds won't necessarily recognise ethics from each other's backgrounds. We need to share examples and be explicit, so that the language is not impenetrable. One way would be to pose a set of ethical questions and give examples.

Farming of attention – does the technology promote critical/intelligent thinking? Example of taking signage off a road to make it safer because people concentrate more – you can't be on autopilot when you have to stay aware of your surroundings.

We suspect there will be a demand for greater regulation of pervasive media in the future, because you can bump into it anywhere – even though we think there are already too many rules. Principles are a better way to frame a project.

Provocations/general questions for future discussions:
What pervasive media have people already accepted?
e.g. traffic lights, CCTV, store cards... Most people will ignore the implications of these, i.e. they will use store cards to get the discount at the supermarket and not worry about their personal data being sold. Have we accepted it as a business transaction?

What level of public awareness of pervasive media is there?
Talking to young people would be fruitful: they have grown up with these technologies, which enable (?) differing notions of privacy and self.
Not everyone wants to know about the author of the book, but they may like to know the decisions that were made in the construction of the artefact.

Stiegler says of legislative amendments in France, e.g. where minors can be tried for crimes as adults, that this collapse of minority/majority speaks volumes about the failure of the majority to take care of minors: not paying attention to them or engaging with them, but leaving their development to games devices and TV. The result is that their education is primarily in becoming a consumer. He says this means there is a loss of the superego in our culture. We think we have recreated it in laws, CCTV and regulations, but this is not superego – it is not internalised. Pervasive media is more sensitive because it involves the idea of public space.

A bottom-up approach to the development of pervasive media does not in itself make a project more ethical, although it may make it more critical.

Ethics is not absolute: it is about promoting Majority in Kantian terms – we have a duty as citizens to accept “rules”, but we must critique them. The tension then is what we do if we dispute that they are good laws: do we disobey bad laws?

Privacy is being able to see who can oversee you – a natural condition of human activity. In the public realm we can no longer tell: CCTV takes away the sense of mutuality in which we see each other; we don't necessarily know who is seeing us. Reflexive awareness is what is needed. Is it OK to give away your personal information if you can edit it yourself? Is it enough to know what I have given away?

What guidelines from other sectors are relevant, e.g. advertising standards, education?

What defines a unique experience in a pervasive media project?

What are the specific challenges of pervasive media?

We can't make rules to get rid of rules – that is why we want to develop provocations and/or design principles for designers of software and applications.

Deep principles, not rules:

Respecting user choice:
promote anonymity as far as possible.
make it so people are able to choose what to share (see the sketch after this list)
who has editorial control?
is editorial control with the maker or the audience, or does it shift between the two?
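One way to make "choose what to share" concrete in software is an explicit, default-private consent structure. A sketch under assumed names (the profile fields and functions are invented for illustration):

    # Default-private sharing preferences: nothing leaves the device
    # unless the user has explicitly opted a field in.
    DEFAULT_PREFS = {
        "position": False,
        "play_history": False,
        "display_name": False,
    }

    def shareable(profile, prefs):
        # Return only the fields the user has chosen to share.
        return {k: v for k, v in profile.items() if prefs.get(k, False)}

    profile = {
        "position": (51.45, -2.59),
        "play_history": ["intro"],
        "display_name": "anon42",
    }
    prefs = dict(DEFAULT_PREFS, display_name=True)
    print(shareable(profile, prefs))  # only {'display_name': 'anon42'}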

Build in capabilities for reciprocity, responsibility, citizenship:
mediate, don't automate
we are more flexible than the technologies we produce

A pervasive media experience is one that can move across boundaries of different technologies/live art:
a fundamental aesthetic is knowing the border of the game, not squashing it
build in options for users to bypass aspects of it

What is the appropriate cultural context/cultural setting, e.g. school, public space?

What platforms does it run on/migrate across – from big screen to mobiles?

What behaviour does it elicit; how will it impact on others and how will this affect safety?
How might it be subverted by others, or changed in a public space?

How immersive is it?

What levels of consent are needed? E.g. if the people playing game A don't know that they are being tracked by people playing game B as part of that game.

Does the technology itself raise ethical questions because of what is coded in the technology?

Don't want to have to click a disclaimer at the start of a game.

Don't want to have to do an ethical risk assessment every time you make something.