Telling stories about privacy and ethical risks in learning technology development

Tuesday, 18 April 2017

By David Barnard-Wills, Trilateral Research Ltd., UK


The DEVELOP project is building a personalised learning environment (PLE) for career development. The DEVELOP PLE will allow users to assess their transversal competencies and social capital, and will use these assessments to recommend learning opportunities that support users in meeting their career development goals. The ambition is that personalised visualisations of potential career paths can inform and guide learners towards realistic and attainable careers. The DEVELOP project is investigating the ways in which data from workplace social networks can be used to provide such recommendations and support career development. These data are a potentially rich source of information on the way that workplace relationships and communication patterns can influence job opportunities and career progression, but they are also sensitive and private.

To respond to this, the DEVELOP team is conducting a privacy and ethical impact assessment exercise (PIA+) throughout the project. Within the PIA+, the project team developed a set of vignettes to demonstrate potential privacy, ethical and legal impacts that might arise from a PLE such as DEVELOP if careful thought were not given to these issues in the design process. The aim is to identify problems before they occur, whilst there is still time in the design process to implement appropriate solutions. These short stories help to ground the sometimes abstract and legalistic ethical and privacy risks in the specific context in which the DEVELOP system will be tested and subsequently used.

Issues explored in the vignettes include:

  • misuse of the employer-employee relationship to gain "consent" for data processing;
  • misleading recommendations;
  • the difficulty employees might have in challenging or correcting the recommendations or assessments made by the system;
  • ways in which assumptions about confidentiality and privacy can be broken or exploited;
  • lack of transparency around data processing and how the system generates its recommendations;
  • the risk of assuming that what is easily measured is what matters within an organisation.

These issues can be addressed through technological solutions (such as the way data storage is set up, or access controls), as well as social and organisational solutions (such as training, guidance and consultancy on how organisations can best manage some of these issues). These solutions rest on a number of core principles: transparency and privacy visualisation; user control, voluntary use and consent; data minimisation and anonymisation; role-based access control and restrictions on the scope of data included; error-correction and revocability; and specific solutions for AI-planning and social network analysis.
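To make one of these technological solutions concrete, the sketch below illustrates the general idea of role-based access control for a PLE-style data store. It is purely illustrative: the role names, permissions and structure are hypothetical examples, not taken from the DEVELOP system.

```python
# Illustrative sketch of role-based access control (RBAC) for a
# personalised learning environment. All role and permission names
# are hypothetical, not drawn from the DEVELOP project.

# Each role maps to the set of permissions it grants. Note that the
# administrative role sees only aggregate statistics, never individual
# learner records -- an example of restricting the scope of data.
ROLE_PERMISSIONS = {
    "learner": {"read_own_profile", "read_own_recommendations"},
    "mentor": {"read_own_profile", "read_mentee_summary"},
    "hr_admin": {"read_aggregate_stats"},
}


def is_allowed(role: str, permission: str) -> bool:
    """Return True if the given role grants the requested permission.

    Unknown roles grant nothing (deny by default).
    """
    return permission in ROLE_PERMISSIONS.get(role, set())
```

For example, `is_allowed("learner", "read_own_profile")` would return `True`, while `is_allowed("hr_admin", "read_own_profile")` would return `False`: deny-by-default checks of this kind are one way a design can enforce, in code, the organisational principle that individual learner data stays with the learner.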

We discuss these potential problems and their solutions in detail in the DEVELOP project White Paper "Telling stories about privacy and ethical risks in technology development".

You can download the white paper here.