The more subtle elements used in game design to motivate gamers work just as well on users in everyday design. So whilst I’m retiring some of my notebooks to the shoebox, I’m not retiring my notes on gamification. If you promise to avoid using the PBL triad as much as possible, here’s my Gamification Cheat Sheet PDF for you.
Some member of parliament must have severely misunderstood the meaning of ‘privacy-by-design’. These ‘standardised privacy icons’ and their logic are so disastrous that enforcing them will weaken rather than strengthen the new European privacy legislation. The icons and copy suggestions are unclear and unusable, and being forced to show them will especially punish the companies and organisations that get privacy right.
Show and tell alone is not enough for people to change their mobility behaviour towards something that benefits both the environment and their stress levels. Even if we want to, it’s hard to break our habits. Yet gamification can help.
Good dashboards appeal to us because they offer a sense of control. When designing a dashboard, pay attention to giving an overview of what is happening, the information needed to plan for the future, and assistance with getting things done on time.
▐ ・ ‿ ・▐
As we are treated algorithmically (as a set of data points subject to pattern recognition engines), we are conditioned to treat others similarly. – Frank Pasquale
The first error turns computers into gods, the second treats their outputs as scripture. – Bogost
We should be careful not to teach robots our biases and discriminatory practices towards people who do not belong to ‘our’ community. After all, democratic robot learning would mean we actually have to lead by example.
How far do we allow technology to go? To what degree should it be allowed to steer our behaviour, to have agency over our bodies? Should it be allowed to decide for us?
When users know of the existence of a certain algorithm, their satisfaction with the product increases over time, probably as they start to understand its workings better. Yet when users discover an algorithm they were previously unaware of, they feel betrayed.
There’s only limited value (and a limited audience) for the health features of the Apple Watch’s heart rate measurements. So Apple decided to sell it to us as ‘easy’ thoughtfulness. As ‘easy’ connectedness. But the whole point about being thoughtful and forging connections is that it should not be easy. What value does something have if it does not require at least some effort?
Data from many thousands of individuals, willingly contributed or our digital exhaust, allows designers to deliver products and services that are, or at least feel, unique. Deloitte calls this creation of unique customer products and services derived from crowd-based insights the “billion-to-one,” or B2ONE, experience. One of the five requirements? Build, and maintain, trust.
I have long been on ‘team iOS’ when it comes to the permissions management flow, and I would love my future car to follow the iOS (high) road. A system designed in such a way that it works with the minimum of data required. A system that is transparent about what data is collected, and for which purpose. A system that limits itself to the basics by default, and then asks me step-by-step if I would be willing to trade in more data in exchange for extra functionality.
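That step-by-step, purpose-bound flow can be sketched in a few lines. This is a hypothetical illustration, not Apple’s actual API: the `PermissionManager` class and its methods are invented names. Nothing is granted by default, every request must state a purpose at the moment the feature needs the data, and the grants remain auditable afterwards.

```python
class PermissionManager:
    """Minimal-data-by-default: no permission exists until the user,
    told what is collected and why, explicitly opts in."""

    def __init__(self):
        self.granted = {}  # permission name -> purpose it was granted for

    def request(self, permission, purpose, ask_user):
        # Ask only at the moment a feature needs the data, stating the purpose.
        if permission in self.granted:
            return True
        if ask_user(permission, purpose):
            self.granted[permission] = purpose
            return True
        return False

    def audit(self):
        # Transparency: exactly what was granted, and for which purpose.
        return dict(self.granted)


# Usage: the car's navigation asks for location only when routing starts.
pm = PermissionManager()
pm.request("location", "turn-by-turn navigation", lambda perm, why: True)
print(pm.audit())  # {'location': 'turn-by-turn navigation'}
```

The point of the sketch is the default: the basics work with an empty `granted` dict, and each trade of data for functionality is a separate, declinable question.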
Professor Pasquale writes that ‘gaming your score’ might even be dangerous, as trying to influence scoring systems could backfire: “If a person attached a fitness device to a dog and tried to claim the resulting exercise log, he suggests, an algorithm might be able to tell the difference and issue that person a high score for propensity toward fraudulent activity.”