Tuesday 29 December 2009

more visions of the future

"the law says I own her data til she's 18"

presumably even if she has had children already?

Tuesday 17 November 2009

UK mobile phone data 'was sold'

BBC news report on the selling-on of phone data:

"Staff at one of the UK's major mobile phone companies sold on millions of records from thousands of customers, the information watchdog says.
Christopher Graham told the BBC that brokers had bought the data and sold it on to other phone firms, who called the customers as contracts neared expiry."

You just can't get the staff.

Wednesday 14 October 2009

event

Who's watching you? Facing up to the darker side of social networking
Speaker: Mr Andrew Charlesworth, School of Law, University of Bristol
Time & Date: 6pm, 28 October 2009
Venue: Folk House, 40a Park Street, Bristol, BS1 5JG
Free, Booking required (margery.lever@bristol.ac.uk) tel: +44 (0)117 331 8313

Thursday 13 August 2009

data in your palm

So I have been distracted by real life getting in the way of doing much apart from playing with facebook - a useful tool for family togetherness. Then I started noticing various things popping up in the news that I wanted to make a note of for future reference, so thought this could be a place to do that. Today I just read on the BBC website about a Palm smartphone which was sending user data back to Palm on a daily basis, without the user realising.
The article quotes a Palm spokesperson:
"Our privacy policy is like many policies in the industry and includes very detailed language about potential scenarios in which we might use a customer's information, all toward a goal of offering a great user experience........ We appreciate the trust that users give us with their information, and have no intention to violate that trust."
So perhaps the smartphone shouldn't be set up to send back private data as the default option?
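If nothing else, the fix is a different default. A minimal sketch of an opt-in telemetry setting (the setting name and the upload call are hypothetical, not Palm's actual code):

```python
# Hypothetical settings store - the point is the default value.
DEFAULT_SETTINGS = {
    "send_usage_data": False,  # opt-in: nothing leaves the device by default
}

def maybe_upload(settings: dict, payload: dict) -> None:
    """Transmit usage data only if the user has explicitly opted in."""
    if settings.get("send_usage_data", False):
        upload(payload)  # stand-in for the real network call
    # otherwise the data never leaves the device

def upload(payload: dict) -> None:
    raise NotImplementedError  # hypothetical transmission routine
```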

Tuesday 2 June 2009

games that respond to our emotional state

Just read on the BBC website about a new Microsoft video game controller, which offers the possibility of games that recognise the expressions on our faces and respond accordingly.
Steven Spielberg was at the launch:

"The video games industry has not allowed us the opportunity to cry, because we were too busy putting our adrenalin rush into the controller, or wherever we swing our arm with a Wii controller to get a result," he said.

"Because of that, there is no room for a video game to break your heart. We now have a little more room to be a little more emotional with Natal technology than we did before."

I found Steven Spielberg's comments slightly disturbing, but I suppose making video games that really upset people may be something that games designers want? It could be another design principle for pervasive media people to think about; not just how addictive we make experiences, but how much we manipulate people's emotions.

Wednesday 20 May 2009

Are Computers Transforming Humanity?

Been busy, but this article from Computerworld by Mary K. Pratt caught my eye and I wanted to share it:

"Researchers and technologists agree that computer technology is fueling significant changes in individuals and society as a whole, and the endpoint is unknown. Carnegie-Mellon University professor Dan Siewiorek says that our perception of privacy has changed in response to technology, and that a certain lack of concern about revealing personal details has infiltrated our lives. Jennifer Earl of the University of California, Santa Barbara observes that innovative uses of technology are enabling new communal efforts, in particular those that span heterogeneous groups. Social Solutions president Patricia Sachs Chess notes that technology also is creating changes in communication, such as the growing use of slang, jargon, abbreviations, phonics, and colloquial syntax in electronic discourse. This trend could augur a transformation of our values, skills, and capabilities, with some experts worried about declining grammar and writing proficiency, among other things. Others are concerned that meaningful interactions between people could be adversely affected by a digital narcissism, which Rochester Institute of Technology professor Evan Selinger describes as people's use of social networking and other electronic mediums "to tune out much of the external world, while reinforcing and further rationalizing overblown esteem for their own mundane opinions, tastes, and lifestyle choices." Another technology-driven change researchers are seeing is one of brain function, with Arizona State University professor Brad Allenby predicting that "once we get seriously into [augmented cognition] and virtual reality, the one who has the advantage isn't the one who is brilliant but the one who can sit in front of the computer screen and respond best." Researchers also wonder whether the unending stimulation and ongoing demands of technology, while perhaps making us better multitaskers, is lessening our deep thinking abilities."

Wednesday 15 April 2009

interview with Turing Award winner Barbara Liskov

Interesting interview with Barbara Liskov, only the second woman to win the Turing Award, sigh! She is currently working in the field of distributed/cloud computing, and so talks about data storage, privacy, security etc. She mentions one ethical issue that I hadn't even thought about: military research into using robots in warfare....

Thursday 9 April 2009

notes from march 2009 session

The session started with a presentation by a pervasive media studio resident who is developing an application for iPhone, as we had previously decided that having a specific project would be the best way to generate discussion.


We referred back to some earlier thoughts on design principles (not rules) for pervasive media producers, to see how they might apply to this project. As the project is still in development we won't go into detail here, but will report those aspects that we felt related to those earlier principles.


Any application might be used in unintended ways, and it can also be hard to anticipate how applications might be used malevolently. This might be a useful role for Design Principles too – to help designers explore unintended uses of their applications?


Aim to promote anonymity as far as possible: tagging location to place is about to explode. Does the application have to connect to other applications such as twitter/facebook? Passing data to other applications that are out of the maker's control makes it impossible(?) to erase any data trail that the user generates.

Can you design the application so that people are able to choose which information they share?


● What data is being collected/collated and commercially exploited by the application?

e.g. facebook usage generates data that then has commercial uses.

An application might be designed with no commercial intentions, but the data produced might be usable for other purposes by 3rd-party applications. One principle of open ethics is to have a clear relationship with your users so they know what is being used; if there is commercial usefulness to their data, you tell the user that you are exploiting this.

Can your application use OpenID rather than a 3rd-party app like facebook? For the developer, using the facebook friends list is easier, but control over the data trail is lost.
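One way a developer might keep that control, sketched loosely below (the interface and names are invented for illustration, not a real OpenID library), is to hide the identity provider behind a small abstraction so the application only ever stores an opaque, app-local identifier:

```python
import hashlib
from abc import ABC, abstractmethod

class IdentityProvider(ABC):
    """The app codes against this, never against facebook directly."""

    @abstractmethod
    def local_id(self, claimed_identifier: str) -> str:
        """Return an opaque, app-local identifier for the user."""

class OpenIdProvider(IdentityProvider):
    def local_id(self, claimed_identifier: str) -> str:
        # Assume the OpenID assertion has already been verified;
        # hash the identifier so the raw URL is never stored.
        return hashlib.sha256(claimed_identifier.encode()).hexdigest()

# A facebook-backed IdentityProvider could be swapped in later - or
# removed - without the user's data trail leaking through the app.
```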


What are the borders of the game?
A pervasive media experience can move across boundaries of different technologies/live art: What platforms does it run on/migrate across – from big screen to mobiles, iPhones to website?
A fundamental aesthetic is knowing the border of the game, not squashing it. We need to build in options for users to bypass aspects of it.

What effect does participation have on the 'audience'?
As a user, some experiences invite you to cross ethical thresholds in ways in which you surprise yourself. So: what behaviour does the application elicit?

How will the behaviour of participants impact on others and how will this affect safety? e.g. are they running through a crowded space?

How immersive is it? Is it 'too' immersive to be used in a space that includes traffic?
How might it be subverted by others, or changed in a public space?

A significant consideration for the designer is the way that you manipulate others to behave in public space, and also, how you protect your participants in the game.

Where do the ethical thresholds lie in real space and virtual space?


What are the experiential aspects of the application? Pervasive media applications can be powerful tools that create an emotional response in the user. Developers need to recognise whether they engender 'appropriate emotions' with the application, i.e. appropriate for the context and the user: awareness of age group, background, ability/disability/access.

Those contexts can rapidly move out of the designer's control, e.g. age boundaries are transgressed by children getting onto facebook and/or games.


What levels of user consent are needed?

e.g. if the people playing game A don't know that they are being tracked by people playing game B as part of that game. This also links to whether users know they are about to pay for use of services on their phone, or are about to send personal data to third-party software such as twitter or facebook.


Which design decisions can be left to the end user?

  • how public they make it themselves

  • could design it to be a personal reflexive tool?

Private journal – would be more honest?

If it's private then the application wouldn't need to publish data to other applications, so could keep all info on the device.


Is the application addictive?

Many applications aim to develop a sense of belonging in the user, which can be problematic – see the Tavistock clinic's work with games addicts. Do you deliberately make your application as addictive as possible?

World of Warcraft (WoW) could be a useful example for pervasive media designers to look at. WoW involves sophisticated dialogue with groups of people: belonging to the group is the important element for most people, and notions of inclusion/exclusion – the need to be part of it – are what hook users.

There is a specific arc of structured experience in games such as WoW, whereby the player accrues points and status and becomes part of a hierarchy. Greater and greater top levels are introduced so that users keep buying in and playing.

What will the implicit prescribed tasks be in pervasive media?

Could be prescriptive, e.g. tasks to tick off, or the developer could make it otherwise......

Are applications such as facebook too reductive to allow meaningful relationships?


What context makes it more likely for people to share their personal data?

From totally private (all data stays on the personal device) <====================> facebook-style data sharing (3rd parties can access the data)

Giving the user a variety of options – they can opt in to use the services they trust.

Any application could have different levels of engagement/privacy so you choose how much information to share: play solo, play with a group of chosen friends, play with a public group (see the sketch below).

Would a finite duration to what we are sharing make it easier for us to share? e.g. if a game takes place over a set period of time, will people find it easier to share their info with others?
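A rough sketch of how those two ideas – user-chosen sharing levels and time-limited sharing – might look in code (all names are invented for illustration, not from any discussed application):

```python
import time
from enum import Enum
from typing import Optional

class Audience(Enum):
    SOLO = 0     # data never leaves the device
    FRIENDS = 1  # shared with a chosen group
    PUBLIC = 2   # shared with everyone

def share(entry: dict, audience: Audience,
          expires_after: Optional[float] = None) -> dict:
    """Stamp an entry with the user's chosen audience and optional expiry."""
    entry["audience"] = audience.name
    if expires_after is not None:
        entry["expires_at"] = time.time() + expires_after
    return entry

def visible(entry: dict) -> bool:
    """An expired share behaves as if it had never been shared."""
    return entry.get("expires_at", float("inf")) > time.time()
```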


How do we avoid developing pervasive chugging applications?

It was suggested that experiences such as Blast Theory's (Uncle Roy etc.) rely on novelty: once that becomes an iPhone app the novelty aspect is lost, and if the user was offered it every week it would become irritating, like having to avoid chuggers.


Not an ethics question but an interesting tangent.......

We discussed how applications might try to quantify the emotional responses of the user, and whether and how developers could build in feedback to the person about how they feel.

Emotions are essentially human and un-computerlike: hard to quantify or database objectively, especially if users have to stop and think and translate emotion into language. Pervasive media apps could use elements such as gesture when asking for input, but how do you compare inputs between two different users? When they shake a device, one may shake harder than the other but have the same emotional response. Asking the user to choose a colour to represent their mood is also hard to quantify for a database (and for subsequent selling of useful data): two people may have different relationships to the same colour, e.g. they both choose black as an 'answer' but one means it negatively, the other positively. Personal associations between emotion and colour can't be compared to other users'.
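One partial answer to the shake-strength problem is to score each input against that user's own history rather than comparing raw values between users. A minimal sketch, assuming the device gives us a numeric shake magnitude (the names are illustrative, not from any real sensor API):

```python
from statistics import mean, stdev

class ShakeNormaliser:
    """Scores a shake relative to this user's own history, so 'hard'
    means hard for them, not hard in absolute terms."""

    def __init__(self) -> None:
        self.history: list[float] = []

    def score(self, magnitude: float) -> float:
        """Return a z-score: 0 is this user's typical shake,
        positive means stronger than usual for them."""
        self.history.append(magnitude)
        if len(self.history) < 5:  # not enough data to judge yet
            return 0.0
        mu = mean(self.history)
        sigma = stdev(self.history) or 1.0  # guard against zero spread
        return (magnitude - mu) / sigma

# Two users with different strengths become comparable:
alice, bob = ShakeNormaliser(), ShakeNormaliser()
```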


It is not only artists who explore emotion in their work - a researcher in the geography department at Bristol is making quality-of-life surveys using quantitative geographies of happiness.


Is it possible to make the capturing process automated, getting a device to “understand” emotions in the same ways as another person?


If data is not being used in relation to other data, then the way that the user inputs emotional response to that time and that place can be individual to them – perhaps in the form of colour that they associate with different emotions, and they can build up their own personal picture of their emotional responses over a period of time.
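As a very rough illustration of that kind of device-local emotional diary (a minimal sketch; the file name and fields are invented, and nothing is uploaded anywhere):

```python
import json
import time
from dataclasses import asdict, dataclass

JOURNAL = "mood_journal.json"  # lives on the device, never uploaded

@dataclass
class MoodEntry:
    timestamp: float  # when the feeling was recorded
    place: str        # the user's own label for where they were
    colour: str       # the colour *they* associate with the emotion

def record(place: str, colour: str, journal: str = JOURNAL) -> None:
    """Append one entry to the local journal file."""
    try:
        with open(journal) as f:
            entries = json.load(f)
    except FileNotFoundError:
        entries = []
    entries.append(asdict(MoodEntry(time.time(), place, colour)))
    with open(journal, "w") as f:
        json.dump(entries, f, indent=2)
```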


Would this type of personal data be authentic? Probably more so if the user knew their data would be kept private, i.e. local to their phone. Would a system be swamped with erroneous information?

Wednesday 8 April 2009

Whrrl

An interesting social software application using location + stories + shared input, via iPhone, twitter and facebook - and you can shake your iPhone to find out more at the end of the intro pages on the whrrl site. Needless to say, it didn't work when I shook my laptop. I'm trying to think how and when I would bother to use whrrl in daily life - but I don't have an iPhone and I haven't read all the details about whrrl yet. In our last ethics discussion we were looking at location + data and the possible implications for designers of linking your software to facebook - where you lose control of data, i.e. can't erase the user data trails - so I was interested to see this appear.

Wednesday 1 April 2009

Permanent tracking for cars

Not sure I like the look of this idea of giving cars a heartbeat trackable anywhere in Europe.

Friday 27 March 2009

tweet-a-watt

A friend of mine shared this on her facebook site, and I am trying to think of how one might use it. It's a wattage meter that sends info about consumption levels via twitter. Does this mean I could keep an eye on whether the kids have switched on the computer/tv/games console when they weren't meant to, and know even if I wasn't at home? Hmmm.......
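A back-of-envelope sketch of that parental use, assuming a hypothetical read_watts() that returns the meter's current reading (the real Tweet-a-Watt posts to twitter; a print stands in here for the alert):

```python
import time

IDLE_WATTS = 60.0   # assumed baseline with only 'allowed' devices on
CHECK_EVERY = 300   # seconds between readings

def read_watts() -> float:
    """Hypothetical: return the current reading from the meter."""
    raise NotImplementedError

def watch() -> None:
    while True:
        watts = read_watts()
        if watts > IDLE_WATTS:
            # The real project would tweet; print stands in for the alert.
            print(f"Consumption is {watts:.0f} W - something extra is on.")
        time.sleep(CHECK_EVERY)
```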

Thursday 19 March 2009

google street view goes live

http://news.bbc.co.uk/1/hi/technology/7952317.stm

Thursday 12 March 2009

Is Privacy Possible in the 21st century?

These are my rough notes from the event organised by the University of Bristol, where Andrew Charlesworth (IT & Law) and Nello Cristianini (A.I. dept) gave really engaging talks on both the legal and technical perspectives on 21st-century privacy.

The talk fitted well with our discussions about metadata, ethics for developers etc. It was really interesting to have the legal perspective too.

Patterns in personal data: we leave trails in transaction space, weblogs and so on.
AOL – can build up picture of us through search terms we use
Google – queries, position, email content, background information, youtube use, calendar, news preferences etc
upmystreet.com
Today it is possible to segment by past behaviour, e.g. people who bought a pizza and rented a video may buy a beer next (see the toy sketch below). Look at www.data-response.co.uk/consumer_response.html
The USA presidential election used targeted/micro marketing to identify potential voters
What concepts, laws and values should we develop to safeguard ourselves and future generations?
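The pizza-and-video example above is essentially a market-basket association rule. A toy sketch of the idea (the data is invented, not from the talk):

```python
# Customers whose purchase history contains every trigger item are
# flagged as likely buyers of the target item.
histories = {
    "cust_1": {"pizza", "video"},
    "cust_2": {"pizza", "salad"},
    "cust_3": {"video", "pizza", "nappies"},
}

triggers, target = {"pizza", "video"}, "beer"  # rule: triggers -> target

likely_buyers = [
    cust for cust, items in histories.items()
    if triggers <= items  # set inclusion: bought all the triggers
]
print(likely_buyers, "may buy", target)  # ['cust_1', 'cust_3'] may buy beer
```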

Is privacy a 21st century delusion? – the legal perspective:
What is privacy? The weighing of individual vs community objectives is a modern invention: privacy was first mentioned in a law journal in the USA in 1890 and is primarily a twentieth-century notion. In the 1950s there was a fear of dossiers; now the fear is data-mining of information about us.
Erik Larson's book The Naked Consumer mentions that in the 1960s, information about birthdays that had been collected by the Dairy Queen chain to give free celebratory milkshakes was used by the draft board in the USA to determine whether young men were eligible to fight in Vietnam (i.e. over 18). This is an example of Larson's Second Law of Data Dynamics – that data will always be used for purposes other than originally intended.
Data protection is NOT the same as privacy protection. The Data Protection Act is out of date, too complex, places too much reliance on individual self-help, and is toothless. It was written in the days of mainframes and dumb terminals. The public sector is often exempted; the private sector faces only minor limits. It only covers data identifying a living individual.
Phorm – BT – examines every webpage visited and generates adverts accordingly
Key questions
1. Can governments and corporations achieve legitimate goals without causing disproportionate harm?
2. Will changes in privacy affect us equally, or continue divisions based on wealth?
3. Will we see a backlash against data-mining, profiling and clustering, and what form might that take? E.g. encryption, data pollution, physical/electronic attacks? The government is developing a new database that will contain all our travel details.


Usenet back archives were bought by google and put online, searchable – so anything you looked at or wrote in the 1980s is now public. We need new laws on informational privacy.

We need a culture shift.
1. Without social/legal restraints, the collection and retention of personal data will increase in line with storage capacity.
2. There is no 'forgetting' of personal data – no clean slate for juvenile offenders.
3. Governments and businesses will increasingly interact with your 'data shadow'.
4. Individuals will modify their behaviour, either consciously or unconsciously, to maintain a favourable data shadow.

Questions from the floor:

On marketing making us homogeneous – Andrew was more concerned with political homogeneity.

Used to rely on totalitarian states being inefficient – is it much harder now?

Importance of using consumer pressure – companies could market themselves as ones that don't log data, as e-commerce firms have an incentive to be perceived as trustworthy; but if they go bust then there is a financial incentive to sell any data they have.

Discussion January 15th 2009

It has taken me quite a while to boil down a long discussion into this much; I hope it's useful, and I hope that any of the participants feel able to add comments and/or amendments to this text. Apologies for the unforeseen delays in getting this done.

Our session started with a presentation by Patrick Crogan on work by Bernard Stiegler, followed by a discussion that generated a list of questions/provocations, both for future discussion and for pervasive media designers. N.B. my notes don't always differentiate between questions for software designers and questions for pervasive media designers.

Patrick focused on Stiegler's ideas on Autopilot behaviour and Grammars of Activity, and the following is a very brief outline of these.

Autopilot:
In a lot of our everyday life we function in Autopilot mode, where we respond without conscious thought to our environment and other people. If we did not have the ability to develop these patterns of activity then we would not be able to negotiate everyday life. Stiegler suggests that we should sometimes pay more attention to these autopilot activities, to be sure that we have an awareness of what we are doing.
Autopilots are based on “grammars of activity”. We learn these ways of doing/being; they are not innate/natural but are socialised behaviours that have at some point been written down and formalised as sets of rules of behaviour in society. The problem with embedding these grammars into smart technologies is that they then become fixed.

As makers, our creative focus is influenced by both technical and social grammars, whether we realise it or not:

Technical specifications limit or condition the activity. Technological grammars are the outcome of previous histories, political decisions etc. Stiegler has looked at the development of the international MPEG standards as a case in point.

Any maker is always working within a milieu/company/industry/funding opportunities etc. Academics work within formats that mould projects and outcomes. Makers are also working within a social/cultural/broad technical culture. Culture has always been about the passing down of techniques – passing on, in a transmittable format, the earlier experiences of someone who may now be long dead.

(This made me think that I should look again at Judy Wajcman)

Stiegler talks of:
savoir faire – how to do/make
savoir vivre – how to live well: to enjoy/appreciate art/wine/cinema/games .... i.e. evolve technical competencies
savoir penser – how to think well: to reflect on things, argue, propose a position, develop existence/morals/ethics etc.
These are all carried forward in our culture through artefacts, and internalised artefacts. (Memory is a form of artefact if you are human, which is what makes us different from animals.) In the creative moment we have all of these as part of our autopilot processes: before, during and afterwards.

To develop our ethical considerations around pervasive media it may be useful to think about these three states of Stiegler's.

Referring back to the paper on metadata that Stiegler presented at Goldsmiths, this was about promoting a more critical engagement with the metadata that we unwittingly produce.
Stiegler talked about Web 2.0 and how it combines top-down and bottom-up production of grammars.
Top-down could be characterised as being developed by 'experts', e.g. MPEG standards associations, governing bodies etc. Stiegler feels that these are too dominated by market forces, based on an outdated model of capital that privileges short-term thinking, and that media industries are tied to this mid-twentieth-century style of thinking, where experts define what people should think. He says that this “battery farming of attention” should and could be challenged with a revival of 'enlightenment' style essays to promote more critical thinking. The main point is that we need to develop critical thinking so that we are not dependent on other people for our ideas.

Discussion points:


The software developer is making the tools with which other people make the artefact: constructing the grammar of the tool so that users are enabled, but also constrained. The constructors of the tools need to be self-aware of the grammars they are explicitly/implicitly embedding in the tools.

With a book, you're not interested in the paper construction, but if your reading of it was being followed by someone, you would want to know. So how do we make the footprint visible in our reading?
Some things promote misunderstandings, e.g. using headlines to grab attention.
If a book would always automatically show that footprint, how do we create a tool that creates 'books' that always show it?
Can you build ethics into a tool, or at least create a set of conditions where the user can?
1. YouTube: anyone can flag offensive content (bottom-up).
2. Bots that scout the web looking for copyright violations (top-down control).

The ethics is in the nature of the experience – is there a danger in this way of interpreting technology, that only a few people will understand? It's like the difference between people watching a film and watching a talk about the making of a film.
The ethic of experience will be automatically inherited from the context in which it's used and the tools used.
The Mscape ethic is in the technology – a decision was made early in the development process that user position data would not be delivered back to HP.
An indigenous Australian project raised sensitivities about representations of people and artefacts in the territories etc.

People from different media backgrounds won't necessarily recognise ethics from each other's backgrounds. We need to share examples and be explicit, so that the language is not impenetrable. One way would be to pose a set of ethical questions and give examples.

Farming of attention – whether technology promotes critical/intelligent thinking. Example of taking signage off a road to make it safer because people concentrate more – you can't be on autopilot when you have to stay aware of your surroundings.

I suspect there will be a demand for greater regulation of pervasive media in the future, because you can bump into it anywhere, even though we think there are already too many rules. Principles are a better way to frame a project.

Provocations/general questions for future discussions:
What pervasive media have people already accepted?
e.g. traffic lights, cctv, store cards .... Most will ignore the implications of these, i.e. will use store cards to get the discount at the supermarket and not worry about their personal data being sold. Have we accepted it as a business transaction?

What level of public awareness of pervasive media is there?
Talking to young people would be fruitful: they have grown up with these technologies that enable(?) differing notions of privacy and self.
Not everyone wants to know about the author of the book, but may like to know the decisions that were made in the construction of the artefact.

Stiegler says that legislative amendments in France, e.g. where minors can be tried for crimes as adults, represent a collapse of minority/majority that speaks a lot about the failure of the majority to take care of minors: not paying attention to them, or engaging with them, but leaving their development to games devices and tv. The result is that their education is primarily in becoming a consumer. He says this means there is a loss of the superego in our culture. We think we have recreated it in laws, cctv and regulations, but this is not superego; it is not internalised. Pervasive media is more sensitive because it involves the idea of public space.

Bottom up approach to development of pervasive media does not in itself make a project more ethical, although it may be more critical.

Ethics is not absolute: it is about promoting Majority in Kantian terms – we have a duty as a citizen to accept “rules” but must critique them. The tension then is what we do if we dispute that they are good laws: do we disobey bad laws?

Privacy is being able to see who can oversee you – a natural condition of human activity. We can't tell in the public realm now that we have cctv, which takes away the sense of mutuality: where once we saw each other, now we don't necessarily know who is seeing us. Reflexive awareness is what is needed. Is it ok to give your personal info if you can edit it yourself? Is it enough to know what I have given away?

What guidelines from other sectors are relevant, e.g. advertising standards, education?

What defines a unique experience in a pervasive media project?

What are the specific challenges of pervasive media?

Can't make rules to get rid of rules – that is why we want to develop provocations and/or design principles for designers of software and applications.

Deep principles, not rules:

Respecting user choice:
promote anonymity as far as possible.
make it so people are able to choose which information to share
who has editorial control?
is editorial control with the maker or the audience, or does it shift between the two?

Build in capabilities for reciprocity, responsibility, citizenship:
mediate don't automate
we are more flexible than the technologies we produce

A pervasive media experience is one that can move across boundaries of different technologies/live art:
a fundamental aesthetic is knowing the border of the game, not squashing it
build in options for users to bypass aspects of it

What is the appropriate cultural context/cultural setting eg school, public space?

What platforms does it run on/migrate across – from big screen to mobiles?

What behaviour does it elicit; how will it impact on others and how will this affect safety?
How might it be subverted by others, or changed in a public space?

How immersive is it?

What levels of consent are needed? e.g. if the people playing game A don't know that they are being tracked by people playing game B as part of that game

Does the technology itself raise ethical questions because of what is coded in the technology?

Don't want to have to click a disclaimer at the start of a game.

Don't want to have to do an ethical risk assessment every time you make something.

Monday 9 February 2009

January session

More info coming soon - been busy elsewhere but haven't forgotten

talk on privacy in the 21st century

Just been sent info about this talk, which may be of interest. It's one of Bristol University's 'Twilight Talks'.

6 pm - 8 pm, Thursday 12 February 2009

"Information is collected every time you use your mobile, pay for some shopping or even give out your postcode. There are laws about the ways in which this information can be used, but with advances in computer science, complex pictures of people’s lives can be legally built up very easily. How do we balance our desire for security and for privacy in an increasingly technological world? And are the laws on which we rely to protect our privacy fit for purpose?"

Speakers: Andrew Charlesworth, School of Law, and Professor Nello Cristianini, Department of Engineering Mathematics.

Venue:The Watershed, 1 Canon's Road, Harbourside, Bristol, BS1 5TX.

Free admission, but booking required in advance. Contact Margery Lever by email margery.lever@bristol.ac.uk, or telephone +44 (0)117 331 8313.

Tuesday 6 January 2009

legalised hacking in the UK

http://www.timesonline.co.uk/tol/news/politics/article5439604.ece