I’ll be speaking at Strata+Hadoop this year on UX, trust and privacy – and how these are becoming increasingly important in human-computer interaction. Expect a pro-privacy, pro-personalisation talk with plenty of practical tips for UXers who want to do privacy right.
The subtler elements used in game design to motivate gamers work just as well on users in everyday design. So whilst I’m retiring some of my notebooks to the shoebox, I’m not retiring my notes on gamification. If you promise to avoid the PBL triad as much as possible, here’s my Gamification Cheat Sheet PDF for you.
Some member of parliament must have severely misunderstood the meaning of ‘privacy by design’. These ‘standardised privacy icons’ and their logic are so disastrous that enforcing them will weaken, not strengthen, the new European privacy legislation. The icons and copy suggestions are unclear and unusable, and being forced to display them will especially punish the companies and organisations that do privacy right.
Show and tell alone is not enough to change people’s mobility behaviour towards something that benefits both the environment and their stress levels. Even when we want to, it’s hard to break our habits. Yet gamification can help.
▐ ・ ‿ ・▐
What if you take the user’s point of view, and a transparency approach that uses only data points that are actually useful? The obvious data usage principle (ODUP), as described in the paper ‘The data chicken and egg problem’ by Håkan Jonsson, is exactly what you need to be ‘deliberate’ from a user perspective.
These stories caught my eye (and actual attention) this week: major tech companies asking the US to mind Europeans’ privacy, branded emojis, a psychologist’s view of UX design, and two-thirds of my Holy Triad: Neal Stephenson (a Seveneves preview) and Sterling (on the convergence between humans and machines).
The Internet of Things Design Manifesto is a living guideline for responsible design in a connected world.
Sources of the quotes, statistics and some of the ideas presented in my Strata+Hadoop talk TrustUX: Balancing personalisation and privacy to create understanding and trust, in no particular order.
Additional tidbits of knowledge, gathered in April 2015 (aka what happens when you consider your blog to be one giant notebook). This month on measuring usability, user semantic time, privacy gap assessment, the connected service experience and emotion and activity markup languages.
As we are treated algorithmically (as a set of data points subject to pattern recognition engines), we are conditioned to treat others similarly. – Frank Pasquale
The first error turns computers into gods, the second treats their outputs as scripture. – Bogost
We should be careful not to teach robots our biases and discriminatory practices towards people who do not belong to ‘our’ community. After all, democratic robot learning would mean we actually have to lead by example.
How far do we allow technology to go? To what degree should it be allowed to steer our behaviour, or have agency over our bodies? Should it be allowed to decide for us?