CYIL vol. 16 (2025)

ARE DATA IN CLINICAL GENETICS SUFFICIENTLY PROTECTED?

iv) security and resilience of transport systems and mobility, critical infrastructure and networks; v) the efficiency and quality of public administration and public services;
b) the data processed are necessary for compliance with one or more of the requirements set out in Chapter III, Section 2, insofar as those requirements cannot be effectively met by processing anonymised, synthetic or other non-personal data;
c) there are effective monitoring mechanisms in place to identify whether any high risks to the rights and freedoms of data subjects referred to in Article 35 of Regulation (EU) 2016/679 and Article 39 of Regulation (EU) 2018/1725 may arise during the sandbox experimentation, as well as response mechanisms to mitigate those risks immediately and, where necessary, to stop the processing;
d) all personal data to be processed within the sandbox are located in a functionally separate, isolated and secure data processing environment under the control of the potential provider and are accessible only to authorised persons;
e) providers may further share the data originally collected only in accordance with Union data protection law; personal data created within the sandbox cannot be shared outside the sandbox;
f) any processing of personal data within the sandbox does not lead to measures or decisions affecting data subjects, nor does it affect the exercise of their rights under Union law in the field of personal data protection;
g) all personal data processed within the sandbox are protected by appropriate technical and organisational measures and are deleted once participation in the sandbox has ended or the retention period of the personal data has expired;
h) logs of the processing of personal data within the sandbox are kept for the duration of the participation in the sandbox, unless otherwise provided for by Union or national law;
i) a full and detailed description of the procedure and rationale for the training, testing and validation of the AI system is kept, together with the test results, as part of the technical documentation listed in Annex IV;
j) a brief summary of the AI project developed in the sandbox, its objectives and expected results is published on the website of the competent authorities; this obligation does not apply to sensitive operational data relating to the activities of law enforcement, border protection, immigration or asylum authorities. 21

Another key measure introduced by the Artificial Intelligence Act is transparency: users must be informed that they are interacting with AI systems and must have access to information about how their data are used. The Artificial Intelligence Act prohibits discriminatory practices that could lead to unfair treatment of individuals based on their personal characteristics. All AI systems must be designed to be safe and reliable, to minimise risks and to protect users from potential harm. The Act also sets out the rights of individuals, which include both the right of access and the right to rectification or erasure of their data. To conclude on the Artificial Intelligence Act, all the measures introduced or being introduced by this Regulation aim to ensure that the use of AI systems is in line with the values of the Union and protects the fundamental rights of individuals.


21 Ibid., Article 59(1).
