Wednesday, 15 April 2009

interview with Turing Award winner Barbara Liskov

Interesting interview with Barbara Liskov, only the 2nd woman to win the Turing Award, sigh! She is currently working in the field of distributed/cloud computing, and so talks about data storage, privacy, security, etc. She mentions one ethical issue that I hadn't even thought about: military research into using robots in warfare....

Thursday, 9 April 2009

notes from march 2009 session

The session started with a presentation by a pervasive media studio resident who is developing an application for iPhone, as we had previously decided that having a specific project would be the best way to generate discussion.

We referred back to some earlier thoughts on design principles (not rules) for pervasive media producers, to see how they might apply to this project. As the project is still in development we won't go into detail here, but will report the aspects that we felt related to those principles.

Any application might be used in unintended ways, and it can also be hard to anticipate how applications might be used malevolently. This might be a useful role of Design Principles too – to help designers explore unintended use of their applications?

Aim to promote anonymity as far as possible: tagging location to place is about to explode. Does the application have to connect to other applications such as twitter/facebook? Passing data to other applications that are out of the maker's control makes it impossible(?) to erase any data trail that the user generates.

Can you design the application so that people are able to choose which information they share?

● What data is being collected/collated and commercially exploited by the application?

e.g. facebook usage generates data that then has commercial uses.

An application might be designed with no commercial intentions but the data produced might be usable for other purposes by 3rd party applications. One principle of open ethics is to have a clear relationship with your users so they know what is being used; if there is commercial usefulness to their data you tell the user that you are exploiting this.

Can your application use open-id rather than a 3rd party app like facebook? For the developer, using the facebook friends list is easier but control over the data trail is lost.

What are the borders of the game?
A pervasive media experience can move across boundaries of different technologies/live art: What platforms does it run on/migrate across – from big screen to mobiles, iPhones to website?
A fundamental aesthetic is knowing the border of the game, not squashing it. Designers need to build in options for users to bypass aspects of it.

What effect does participation have on the 'audience'?
As a user, some experiences invite you to cross ethical thresholds in ways that surprise you. So: what behaviour does the application elicit?

How will the behaviour of participants impact on others and how will this affect safety? e.g. are they running through a crowded space?

How immersive is it? Is it 'too' immersive to be used in a space that includes traffic?
How might it be subverted by others, or changed in a public space?

A significant consideration for the designer is the way that you manipulate others to behave in public space, and also, how you protect your participants in the game.

Where do the ethical thresholds lie in real space and virtual space?

What are the experiential aspects of the application? Pervasive media applications can be powerful tools that create an emotional response in the user. Developers need to recognise whether they engender 'appropriate emotions' with the application, i.e. appropriate for the context and the user: awareness of age group, background, ability/disability/access.

Those contexts can be rapidly out of designer control e.g. age boundaries are transgressed by children getting onto facebook and /or games.

What levels of user consent are needed?

e.g. if the people playing game A don't know that they are being tracked by people playing game B as part of that game. This also links to whether users know they are about to pay for use of services on their phone, or are about to send personal data to third-party software such as twitter or facebook.
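As a sketch of how that consent principle might look in code (the function and names here are hypothetical illustrations, not from the session or any real API), an application could refuse to pass data onward unless the user has explicitly opted in to that specific service:

```python
def send_to_third_party(data, service, consents):
    """Pass user data to a named third-party service, but only if the
    user has explicitly consented to that specific service.

    `consents` is the set of service names the user has opted in to;
    anything else is refused rather than sent silently.
    """
    if service not in consents:
        raise PermissionError(f"no consent recorded for {service!r}")
    return {"service": service, "payload": data}
```

The same gate could sit in front of any action with a cost to the user, such as a paid phone service, so that nothing is charged or shared without an explicit prior choice.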

Which design decisions can be left to the end user?

  • how public they make it themselves

  • could design it to be a personal reflexive tool?

Private journal – would it be more honest?

If it's private then the application wouldn't need to publish data to other applications, so could keep all info on the device.

Is the application addictive?

Many applications aim to develop a sense of belonging in the user, which can be problematic – see the Tavistock Clinic's work with game addicts. Do you deliberately make your application as addictive as possible?

World of Warcraft (WoW) could be a useful example for pervasive media designers to look at. WoW involves sophisticated dialogue with groups of people: belonging to the group is the important element for most people, and notions of inclusion/exclusion – the need to be part of it – are what hook users.

Games such as WoW have a specific arc of structured experience, whereby the player accrues points and status and becomes part of a hierarchy. Greater and greater top levels are introduced so that users keep buying in and playing.

What will the implicit prescribed tasks be in pervasive media?

They could be prescriptive, e.g. tasks to tick off, or the developer could make them other......

Are applications such as facebook too reductive to allow meaningful relationships?

What context makes it more likely for people to share their personal data?

From totally private (i.e. all data stays on the personal device) <====================> facebook-style data sharing (3rd parties can access the data)

Giving the user a variety of options – they can opt in to use the services they trust.

Any application could have different levels of engagement/privacy so you choose how much information to share; play solo, play with a group of chosen friends, play with a public group.
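Those levels of engagement could be sketched as a simple ordering, where an item is only ever shown to audiences the user has opted in to (all names here are hypothetical illustrations, not an actual design from the session):

```python
from enum import Enum

class PrivacyLevel(Enum):
    """Levels of engagement, from most to least private."""
    SOLO = 1     # data stays on the personal device
    FRIENDS = 2  # shared with a chosen group of friends
    PUBLIC = 3   # visible to any participant

def may_share(item_level: PrivacyLevel, audience: PrivacyLevel) -> bool:
    """An audience may see an item only if the user marked the item at
    least as widely shareable as that audience: a SOLO item never
    leaves the device, and a FRIENDS item stays out of the public feed.
    """
    return audience.value <= item_level.value
```

For example, `may_share(PrivacyLevel.FRIENDS, PrivacyLevel.PUBLIC)` is false: an item shared with chosen friends is still kept out of the public feed.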

Would a finite duration to what we are sharing make it easier for us to share? e.g. if a game takes place over a set period of time, will people find it easier to share their info with others?

How do we avoid developing pervasive chugging applications?

It was suggested that experiences such as Blast Theory's (Uncle Roy etc.) rely on novelty: once that becomes an iPhone app the novelty aspect is lost, and if the user was offered that every week it would become irritating, like having to avoid chuggers.

Not an ethics question but an interesting tangent.......

We discussed how applications might try to quantify emotional responses of the user, and whether and how developers could build in feedback to the person about how they feel.

Emotions are essentially human and un-computerlike, and hard to quantify/database objectively, especially if users have to stop and think and translate emotion into language. Pervasive media apps could use elements such as gesture when asking for input, but how do you compare two different users' inputs? i.e. when they shake a device, one may shake more strongly than the other yet have the same emotional response. Asking the user to choose a colour to represent their mood is also hard to quantify for a database (and the subsequent selling of useful data): two people may have different relationships to the same colour, e.g. they both choose black as an 'answer' but one has a negative connotation, one a positive... personal associations between emotion and colour can't be compared across users.

It is not only artists who explore emotion in their work – a researcher in the geography department at Bristol is making quality of life surveys using quantitative geographies of happiness.

Is it possible to make the capturing process automated, getting a device to “understand” emotions in the same ways as another person?

If data is not being used in relation to other data, then the way that the user inputs emotional response to that time and that place can be individual to them – perhaps in the form of colour that they associate with different emotions, and they can build up their own personal picture of their emotional responses over a period of time.
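A minimal sketch of such a personal, on-device journal (the class and its fields are hypothetical; the only assumptions taken from the discussion are that entries stay local and that the colour-to-emotion mapping is the user's own):

```python
import datetime

class EmotionJournal:
    """A device-local log of colour-coded emotional responses.

    Colours mean whatever they mean to this one user, so entries are
    never compared with, or published to, anyone else's data.
    """

    def __init__(self):
        self.entries = []  # kept on the device only

    def record(self, colour, place, when=None):
        """Log the colour the user associates with their mood here and now."""
        self.entries.append({
            "colour": colour,
            "place": place,
            "when": when or datetime.datetime.now(),
        })

    def picture(self):
        """The user's own picture over time: how often each colour recurs."""
        counts = {}
        for entry in self.entries:
            counts[entry["colour"]] = counts.get(entry["colour"], 0) + 1
        return counts
```

Because nothing leaves the phone, 'black' can safely mean something positive to one user and something negative to another; no cross-user comparison is ever attempted.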

Would this type of personal data be authentic? Probably more so if the user knew their data would be kept private, i.e. local to their phone. Would a system be swamped with erroneous information?

Wednesday, 8 April 2009


Interesting social software application using location + stories + shared input, using iPhone, twitter and facebook – and you can shake your iPhone to find out more at the end of the intro pages on the whrrl site. Needless to say, it didn't work when I shook my laptop. Trying to think how and when I would bother to use whrrl in daily life – but I don't have an iPhone and I haven't read all the details about whrrl yet. In our last ethics discussion we were looking at location + data and possible implications for designers of linking your software to facebook – where you lose control of data, i.e. can't erase the user data trails – so I was interested to see this appear.

Wednesday, 1 April 2009

Permanent tracking for cars

Not sure I like the look of this idea of giving cars a heartbeat trackable anywhere in Europe.