Dilemmas posed to individual privacy, data sharing and fairness by the use of AI technologies were explored at the annual PI.lab conference in The Hague on 6 December 2019. Many issues were discussed, and several TNO experts contributed their views on how to ensure privacy, data sharing and fairness.

Preliminary findings

Gabriela Bodea discussed preliminary findings on these emerging trends. Linked to the increasing use of alternative data, algorithmic applications are slowly making their way into all categories of public and commercial services. By regulating and conditioning individuals' access to anything from social to financial services, and from employment to health and education, they change our relationship to digital technologies and pose new, far-reaching challenges to conventional understandings of individual autonomy, consent, privacy and other fundamental rights.

Privacy and security in the public space

Tjerk Timan delved into the issue of hardcoding human assumptions about what good behavior is or should be in public space. In public spaces, we collectively determine what acceptable behavior is and where its boundaries lie. In many digital security services, such boundaries are no longer decided collectively: they are datafied and hardcoded by a small group of individuals. Besides improving the general understanding of the often misinterpreted laws around privacy in and for public space, we also need to develop far more inclusive methods that allow for collective value judgements in the development of algorithms in and for public space.

Fairness in AI and machine learning

Cor Veenman (TNO/Leiden University) presented challenges around fairness in AI and machine learning. Traditionally, bias and discrimination can go unnoticed; when we use algorithms, these issues become apparent, because the unfair treatment of subgroups is measured and made explicit. However, the algorithms themselves are often considered the cause of the unfairness. The good news is that machine learning also gives us the ability to mitigate bias and discrimination.
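How unfair treatment of a subgroup can be "measured and made explicit" may be easier to see with a concrete metric. The sketch below computes the demographic parity gap, the difference in positive-outcome rates between subgroups; the decisions and group labels are purely illustrative assumptions, not data from the talk.

```python
# Illustrative sketch: measuring unequal treatment of subgroups.
# The predictions and group labels below are made-up example data.

def demographic_parity_difference(predictions, groups):
    """Largest gap in positive-outcome rates between any two subgroups."""
    rates = {}
    for g in set(groups):
        outcomes = [p for p, m in zip(predictions, groups) if m == g]
        rates[g] = sum(outcomes) / len(outcomes)
    values = sorted(rates.values())
    return values[-1] - values[0]

# Hypothetical model decisions (1 = approved) and subgroup membership
preds  = [1, 1, 0, 1, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

gap = demographic_parity_difference(preds, groups)
print(f"demographic parity gap: {gap:.2f}")  # group a: 0.75, group b: 0.25, gap 0.50
```

Once such a gap is quantified, mitigation techniques (reweighting training data, adding fairness constraints, or post-processing decisions) can be applied and their effect verified against the same metric.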

Multi-Party Computation

Thijs Veugen (TNO) delivered a talk on Multi-Party Computation, a cryptographic technique that enables data analysis without revealing sensitive data. As such, it resolves the tension between data confidentiality and data-driven decision-making. Multi-Party Computation enables even the most sensitive databases to be linked together in a privacy-preserving manner, opening the door to all kinds of new products and services. TNO possesses the unique expertise needed to assist you with these techniques and to advise you on customized solutions.
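To give a flavour of how analysis without revealing sensitive data can work, here is a minimal sketch of additive secret sharing, one common building block of Multi-Party Computation. The two "hospitals" and their counts are hypothetical; real MPC deployments use full protocol suites, not this toy.

```python
# Illustrative sketch of additive secret sharing over a finite field.
# Each share alone is uniformly random and reveals nothing about the
# secret; only the sum of all shares reconstructs it.
import random

MODULUS = 2**61 - 1  # a large prime; all arithmetic is modulo this

def share(secret, n_parties):
    """Split a secret into n random shares that sum to it mod MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares):
    return sum(shares) % MODULUS

# Two hypothetical hospitals each hold a sensitive patient count.
# Each distributes shares to three compute parties, who add their
# shares locally; only the joint total is ever reconstructed.
a_shares = share(120, 3)
b_shares = share(80, 3)
joint = [(x + y) % MODULUS for x, y in zip(a_shares, b_shares)]
print(reconstruct(joint))  # 200 — the sum, without either input revealed
```

Because addition of shares commutes with addition of secrets, the parties can compute aggregate statistics while no single party ever sees an individual input; multiplications and comparisons require additional protocol machinery.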