Session "How are your morals? Ethics in algorithms and IoT" at StrataHadoop in Singapore, SGP, together with Joerg Blumtritt, Datarella.
The code that makes things into smart things is not objective. Algorithms embody value judgments: decisions about methods, or presets of a program's parameters. These are choices about how to handle a task according to social, cultural, or legal rules, or personal persuasion.
Obvious examples of "ethics codes" are credit scoring or differentiated pricing of a retail offer. However, there is a multitude of "hidden" ethics algorithms that are far more pervasive. When an ad network's targeting system selects which ads we see and which we don't, we might not find that too important. But when a search engine or news feed decides what it regards as relevant information to show us, and what not to, thereby shaping our view of the world without our knowing, it becomes far more important. And when we realize that self-driving cars will have to act according to some algorithm when a collision is inevitable, and that this may lead to injuries or even people getting killed, the question of ethics in algorithms becomes highly relevant.
This raises important questions about the transparency of these algorithms, including our ability, or just as importantly our lack of ways, to change or affect how an algorithm views us.
We need to address the end users, who need higher awareness, more education, and insight regarding the subjective algorithms that affect our lives. We also need to look at ourselves: data consumers, data analysts, and developers who, more or less knowingly, produce subjective answers through our choice of methods and parameters, often unaware of the bias we impose on a product, a company, and its users.
Sometimes the only way to understand these presumptions is to "open the black box", that is, to hack. Or to support the process by giving data a "label of content" and by using design patterns as guidelines.
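As a minimal illustration of such a hidden value judgment (hypothetical applicants and scores, not from any real system): the cut-off threshold in a scoring model is a choice about whom to reject, not an objective fact, yet it often ships as an unexamined default.

```python
# Hypothetical loan applicants with a model's estimated
# repayment probability for each (illustrative numbers only).
applicants = {
    "Ana": 0.62,
    "Ben": 0.55,
    "Chen": 0.48,
    "Dara": 0.71,
}

def approve(scores, threshold):
    """Approve everyone whose score clears the chosen threshold.

    The threshold itself is the value judgment: it trades off
    rejecting people who would have repaid against accepting
    people who will not.
    """
    return sorted(name for name, s in scores.items() if s >= threshold)

# The same model, two different ethics presets:
print(approve(applicants, 0.60))  # cautious lender: ['Ana', 'Dara']
print(approve(applicants, 0.50))  # inclusive lender: ['Ana', 'Ben', 'Dara']
```

Nothing in the data dictates 0.60 over 0.50; the developer who hard-codes one of them has made a moral decision on the lender's, and the applicants', behalf.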
We will present some of these value judgments with examples and discuss their consequences. We will also present possible ways to resolve the problem: algorithm audits and standardized specifications, but also more visionary concepts like a “data ethics oath,” “algorithm angels,” and ethics design patterns that could guide developers in building their smart things.
Stephen Few often works in greyscale to avoid perceptual distraction. Visit his site to see 'before and after' samples. (It's almost like the good old monochrome days.)
When looking at an analysis you have just made: can you answer the question 'So what?' If not, you're just counting, not analyzing.
Firm believer in accessing the hidden knowledge in data: there is no such thing as collecting data too heavily, only understanding it too slowly.
I am a highly analytical person, keen on the how-tos of transforming knowledge into action: making sure that data used as decision support actually presents itself as support for the organisation, leaving no path within analytics, visualisation, and the language of math unexplored.
I welcome Data Science as one way of creating the needed focus on a more data-driven approach to business. The 'how' is only a matter of technical solutions; it's the 'what' and the 'why' that are important.
Developing business through IT: merging IT with business, making sure IT will never be the limiting factor on the way forward. Always trying to find the best solution to the equation, whether the question is about customer loyalty, maximizing revenue, forecasting, fulfillment, or other business-related matters.
I have a background in both IT and economics, and have worked with business processes and analytics, Business Intelligence, IT, and databases for a couple of decades. I also enjoy teaching and speaking at international conferences, a two-way street of sharing knowledge and gaining inspiration.