Technology and design
Social media accounts, government platforms and websites collect various types of information about users of or visitors to their services. How important is it that these services are designed to be reliable and handle data safely and securely?
Why the contribution is important
The Scottish Government has committed to engaging with citizens and public, private and third sector organisations and is interested to hear your thoughts on this topic.
by Sophie_ScotGov on December 08, 2020 at 08:44AM
Posted by Ingrid December 10, 2020 at 11:00
Posted by Anthony December 14, 2020 at 14:14
Posted by SOCITM December 16, 2020 at 18:07
Ethics by Design
This strand focuses on the design phase of digital and data tools. It directly concerns technology in all its technical complexity and the know-how of engineers, programmers and other practitioners. It therefore touches in particular on the deontology (duty-based ethics) of digital creators of all kinds (developers, digital designers, project managers, etc.). Indeed, they carry an ethical responsibility from the design stage onwards, insofar as data or algorithms may reproduce human biases, reveal new forms of discrimination (or reproduce existing ones on a larger scale), give rise to injustices, and so on.
Are the solutions and approach under examination proportionate and ethical?
• Ensure the system is appropriate for the application (the function, the problem) under consideration
• Will the solution work and be better than anything else for that purpose?
• Take care to make sure its use in the proposed context is lawful, safe, acceptable to stakeholders and wise, and does not have harmful side effects
Do you understand the data context, and are you looking to test it rigorously? This is critical to the integrity of any process. (A minimal data-quality check is sketched after this list.)
• What are the precise calculations and data processing methods used in the program?
• What is the exact nature of the data used in the development and testing of the program, and in its intended operation?
• What is the purpose for which the system will be used?
• Remember the social, political, professional, environmental and operational process and practice contexts within which the system will be used: its use will change these contexts, and they are where the consequences will be felt.
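As one illustration of what rigorously testing a data context might look like in practice, here is a minimal sketch in Python. The field names, expected types and missing-value tolerance are all hypothetical assumptions, not prescribed values.

```python
# A minimal sketch of a pre-deployment data-quality check.
# All field names, rules and thresholds are illustrative assumptions.

from typing import Any

EXPECTED_FIELDS = {"age": int, "postcode": str, "outcome": str}
MAX_MISSING_RATE = 0.05  # hypothetical tolerance for missing values


def check_records(records: list[dict[str, Any]]) -> list[str]:
    """Return a list of data-quality issues found in the records."""
    issues = []
    for field, expected_type in EXPECTED_FIELDS.items():
        values = [r.get(field) for r in records]
        missing = sum(v is None for v in values)
        if missing / len(records) > MAX_MISSING_RATE:
            issues.append(f"{field}: {missing} missing value(s) exceed tolerance")
        bad_type = [v for v in values
                    if v is not None and not isinstance(v, expected_type)]
        if bad_type:
            issues.append(f"{field}: {len(bad_type)} value(s) of unexpected type")
    return issues


if __name__ == "__main__":
    sample = [
        {"age": 34, "postcode": "EH1 1AA", "outcome": "approved"},
        {"age": None, "postcode": "G2 3BB", "outcome": "declined"},
    ]
    for issue in check_records(sample):
        print("WARNING:", issue)
```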
Are the risks of bias in the datasets identified and addressed?
• Train policy makers, designers and practitioners to identify, define and understand potential algorithmic risks from the outset, if we are serious about avoiding discriminatory and adverse personal outcomes.
• Carry out a rigorous impact assessment (in accordance with official guidance and the GDPR) in order to analyse the possible design-induced discriminatory impacts of algorithms
• Put in place checks and balances at each stage of development to ensure there is no bias in the results (one such check is sketched below)
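By way of illustration, one simple automated check is to compare the rate of favourable outcomes across groups in a dataset (the demographic parity difference). The Python sketch below uses invented group and outcome labels; the threshold at which a gap is flagged would need to be set by the organisation.

```python
# A minimal sketch of one "check and balance": comparing favourable
# outcome rates across groups (demographic parity difference).
# Group and outcome labels are hypothetical.

from collections import defaultdict


def outcome_rates(records, group_key="group", outcome_key="favourable"):
    """Rate of favourable outcomes per group."""
    totals, favourable = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[group_key]] += 1
        favourable[r[group_key]] += bool(r[outcome_key])
    return {g: favourable[g] / totals[g] for g in totals}


records = [
    {"group": "A", "favourable": True},
    {"group": "A", "favourable": True},
    {"group": "B", "favourable": True},
    {"group": "B", "favourable": False},
]
rates = outcome_rates(records)
gap = max(rates.values()) - min(rates.values())
print(rates, "parity gap:", gap)  # flag for review if gap exceeds a set threshold
```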
Can the operating rationale of the algorithms deployed for artificial intelligence be explained?
• Have a system explainability policy encompassing the whole chain (data provenance, explanation of the reasoning followed)
• Develop algorithms that are transparent by design, in order to make it easier to explain them and to analyse how they reason (a minimal sketch follows this list)
• Adopt a labelling (with an ethical scoring/rating system) and ethical support approach
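As a rough illustration of "transparent by design", the sketch below implements a decision rule as a plain, human-readable points system that records the reason for every point awarded, so the full reasoning chain accompanies each decision. The rules and weights are invented for illustration only.

```python
# A minimal sketch of a transparent-by-design decision rule: a simple
# points system that logs the reason for every point it awards.
# The criteria and weights are invented for illustration.

def score_application(applicant: dict) -> tuple[int, list[str]]:
    score, reasons = 0, []
    if applicant.get("years_at_address", 0) >= 2:
        score += 2
        reasons.append("+2: two or more years at current address")
    if applicant.get("income", 0) >= 20_000:
        score += 3
        reasons.append("+3: income at or above 20,000")
    return score, reasons


score, reasons = score_application({"years_at_address": 3, "income": 25_000})
print("score:", score)
for r in reasons:
    print(" ", r)  # the full reasoning chain accompanies the decision
```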
Does the organisation offer training programmes on ethics in the creation of digital tools?
• Set up training workshops and/or skills refresher courses within the IT and related Service Departments
Are solution designers representative of the social, ethnic and gender diversity of society?
• Draw up an HR policy ensuring social and gender diversity in the workplace
Are new projects evaluated for their impact on privacy and personal data?
• Consider setting up an ethics committee to approve sensitive projects
Do tools and solutions protect personal data by design, and is the right to be forgotten factored into the design chain?
• Adopt a privacy by design approach, in accordance with the requirements of the GDPR: this means building the protection of personal data into products and services by design, but also by default (notably by abiding by the data minimisation principle introduced by the GDPR).
• Remember that this is also a cultural challenge, because the concept needs to be factored into a project early on (a minimal technical sketch follows this list)
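To make the privacy-by-design ideas above concrete, here is a minimal Python sketch of data minimisation, pseudonymisation and built-in erasure. The field names, salt handling and in-memory store are simplifying assumptions, not a production design.

```python
# A minimal sketch of data minimisation and erasure built in by design.
# Field names, salt handling and the in-memory store are illustrative.

import hashlib

NEEDED_FIELDS = {"age_band", "service_used"}  # only what the purpose requires


def minimise(raw: dict) -> dict:
    """Keep only the fields the stated purpose requires."""
    return {k: v for k, v in raw.items() if k in NEEDED_FIELDS}


def pseudonym(user_id: str, salt: str) -> str:
    """Stable pseudonymous key so direct identifiers never enter the store."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()


store: dict[str, dict] = {}
key = pseudonym("user@example.com", salt="per-deployment-secret")
store[key] = minimise({"age_band": "25-34", "service_used": "permits",
                       "name": "Alice", "phone": "0131 000 0000"})


def forget(user_id: str, salt: str) -> None:
    """Right to be forgotten: erase everything held under the user's key."""
    store.pop(pseudonym(user_id, salt), None)


forget("user@example.com", salt="per-deployment-secret")
assert not store  # nothing about the user remains
```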
Does the correlation of data collected from various sources result in the production of personal information (as part of big data and AI projects, for instance)?
• Put in place a system that measures whether data becomes personal, i.e. re-identifiable, after processing operations (one such measure is sketched below)
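One possible way to measure this, sketched below under assumed column names, is to compute the k-anonymity of the merged records over a set of quasi-identifiers: if any combination of quasi-identifier values is unique (k = 1), the correlated data has effectively become personal.

```python
# A minimal sketch of measuring re-identifiability after correlation:
# k-anonymity over quasi-identifiers. Column choices are illustrative.

from collections import Counter

QUASI_IDENTIFIERS = ("postcode_area", "age_band", "occupation")


def k_anonymity(records: list[dict]) -> int:
    """Smallest group size sharing the same quasi-identifier combination.
    k == 1 means at least one record is unique, hence re-identifiable."""
    groups = Counter(tuple(r[q] for q in QUASI_IDENTIFIERS) for r in records)
    return min(groups.values())


merged = [
    {"postcode_area": "EH1", "age_band": "25-34", "occupation": "teacher"},
    {"postcode_area": "EH1", "age_band": "25-34", "occupation": "teacher"},
    {"postcode_area": "G2", "age_band": "65+", "occupation": "engineer"},
]
print("k =", k_anonymity(merged))  # k = 1 here: the G2 record stands alone
```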
Ethics of Use
These aim to examine how service users and employees, as well as the managers and partners of an organisation, use emerging technology and data. This entails conducting an ethical evaluation of how people use the technological resources at their disposal.
Is there a robust set of checks and balances built around political and executive scrutiny?
• Focus on the adoption of due diligence frameworks and appropriate standards and principles, together with accountable, public-good-focused audit and risk regimes that allow for an effective measure of public participation at all stages, from initial evaluation to implementation
Are ethical rules for data collection and processing defined and shared internally within the organisation?
• Raise staff awareness with data ethics training and workshops
Is there a framework for internal rights of access to personal and/or sensitive data?
• Clearly define procedures for access to sensitive data based on employees’ profiles and roles (a minimal sketch follows)
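A minimal sketch of such a framework, with hypothetical roles, data categories and audit logging, might look like this:

```python
# A minimal sketch of role-based access to sensitive data. Roles,
# data categories and the audit log format are illustrative assumptions.

ACCESS_MATRIX = {
    "caseworker": {"contact_details"},
    "clinician": {"contact_details", "health_records"},
    "analyst": set(),  # analysts see only de-identified extracts
}

audit_log: list[str] = []


def can_access(role: str, category: str) -> bool:
    """Check the matrix and record every request, granted or not."""
    allowed = category in ACCESS_MATRIX.get(role, set())
    audit_log.append(
        f"{role} requested {category}: {'granted' if allowed else 'denied'}")
    return allowed


print(can_access("analyst", "health_records"))  # False, and the attempt is logged
print(audit_log[-1])
```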
Are digital tools designed with the accessibility needs of disabled people in mind?
• By default, design solutions that are accessible to people with disabilities
Are ethics-related issues addressed on a cross-functional basis within the organisation?
• Consider appointing a Chief Digital Ethics Officer tasked with ensuring the overall coherence of the organisation’s “ethics and digital” policy
• Put in place an awareness-raising programme for all employees (information and examples of best practice)
Are employees informed of how their data will be stored and processed, and their rights in this area?
• Inform employees of how their data will be stored and processed, and their rights in this area (display, updating of internal regulations)
Are the consequences of the internal use of certain digital tools assessed?
• Carry out an assessment of the impact of digital tools on the day-to-day experience of employees in the organisation.
Are the users of personalised services given the option to manage their settings?
• Ensure that the information given to users is clear and transparent
• Make it easy for users to change their personal data management settings and to make informed choices (a minimal sketch follows this list)
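As a small illustration, the sketch below models user-managed personalisation settings that default to the most private option, so any data-sharing setting reflects an explicit, reversible choice. The setting names are invented.

```python
# A minimal sketch of user-managed settings that default to the most
# private option. Setting names are illustrative assumptions.

from dataclasses import dataclass, asdict


@dataclass
class PrivacySettings:
    personalised_content: bool = False   # off by default
    usage_analytics: bool = False        # off by default

    def update(self, **choices: bool) -> None:
        """Apply only settings the user has explicitly chosen."""
        for name, value in choices.items():
            if hasattr(self, name):
                setattr(self, name, value)


settings = PrivacySettings()
settings.update(personalised_content=True)  # an informed, reversible choice
print(asdict(settings))
```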
Are users informed of the terms of use of a digital solution or application?
• Consider drawing up a digital user charter setting out the ethical terms of use of a solution
• Look at setting out a framework for the use of a solution in contracts, and allow stakeholders to object to non-compliant use
Is there a policy allowing you to check, when digital solutions are built by a number of partners, that the whole process is ethical?
• Ensure that the ecosystem is trustworthy and give each partner a vision of the purpose of the overall solution
• Call on trusted third parties, certifications and/or labels, demonstrating the ethical commitment of each participating stakeholder
see more at https://socitm.net/inform/collection-digital-ethics/