Transparency and clarity can be important in reducing the perceived risks of adopting new technologies that collect data. The motives of the people and organisations driving the development of data usage can influence trust in products and services. How important is it for organisations to be transparent about their motives for data collection?

Why the contribution is important

The Scottish Government has committed to engaging with citizens and public, private and third sector organisations and is interested to hear your thoughts on this topic.

by Sophie_ScotGov on December 08, 2020 at 08:53AM

Current Rating

Average rating: 5.0
Based on: 7 votes


  • Posted by iainarthurmckie December 09, 2020 at 14:51

    There is a general failure to clearly explain how the information given will be used, and to be clear about the reasons for using it. Requests to use cookies, for instance, often lead to summaries of uses which are far from clear, requiring considerable time to unscramble how the information will be used. Companies understand that the more confusing and difficult the information is, the more likely users are to consent just to save time and effort. Mandatory clear language should be used to show exactly who can access your information and what it will be used for. Vague terms like 'third parties' and 'to make our site more efficient' tell us little. It should be made clear which cookies are essential to allow the site to operate, as opposed to those that allow companies to run algorithms on users. So much information is being gathered without any clarity about who is using it and how. There should be strict limits on the information companies can request, and standardised wording for the requests.
  • Posted by Ingrid December 10, 2020 at 10:41

    It is critical for organisations, public bodies in particular, to be transparent with customers, citizens, their employees and their supply chain. Building and retaining that trust is a critical factor in ensuring that relevant data is available and generates value for both the organisation(s) and the individual. Organisations that are mistrusted have more ground to cover, and organisations that are trusted must work hard to ensure that trust is sustained through transparency. Explaining what data is collected and why, in easy-to-understand terms, is a key component; enabling individuals to have ownership over that data is not only a legal obligation but simply the right thing to do.
  • Posted by SOCITM December 16, 2020 at 17:59

    This accords with Socitm’s digital ethical practice attribute of Explicability ("Operate Transparently"): be ready to explain a system’s workings as well as its outputs, and make all stages of the implementation process open to public and community scrutiny.

    Key themes: intelligibility, transparency, trustworthiness and accountability.

    • Transparency: including traceability, explainability and communication. As Smart Information Systems can be involved in high-stakes decision-making, it is important to understand how a system reaches its decisions. Transparency, and concepts such as explainability, explicability and traceability, relate to the importance of having (or being able to gain) information about a system (transparency), and being able to understand or explain why it behaves as it does (explainability).

    • Accountability: auditability, minimisation and reporting of negative impact, internal and external governance frameworks, redress, and human oversight. Given that Smart Information Systems act like agents in the world, it is important that someone is accountable for a system’s actions. Furthermore, an individual must be able to receive adequate compensation in the case of harm from a system (redress). We must be able to evaluate the system, especially in the situation of a bad outcome (auditability). There must also be processes in place for minimising and reporting negative impacts, with internal and external governance frameworks (e.g. whistleblowing) and human oversight.

    Areas of focus:

    • Traceability: the data sets and the processes that yield the AI system’s decisions should be documented.
    • Explainability: the ability to explain both the technical processes of an AI system and the related human decisions.
    • Interpretability: helping to give users confidence in AI systems, safeguarding against bias, meeting regulatory standards or policy requirements, and improving overall system design.
    • System accountability: any system, and those who design it, should be accountable for the design and impact of the system. As a minimum, this should include that you can:
      • ensure that systems with significant impact are designed to be auditable;
      • ensure that negative impacts are minimised and reported;
      • ensure internal and external governance frameworks;
      • ensure redress in cases where the system has a significant impact on stakeholders;
      • ensure human oversight when there is a substantial risk of harm to human values.
  • Posted by simonbarrow December 18, 2020 at 15:39

    Socitm’s digital ethical practice is both helpful and vital in this regard.
  • Posted by Pepper December 22, 2020 at 08:28

    Personable, empathetic, involve, listen, understand, work together.