EU Streamlines Private Data Access for AI Training by Tech Giants

The European Commission presented changes regarding AI / Depositphotos

On Wednesday, November 19, the European Commission unveiled sweeping revisions to European digital laws. Critics warn the changes could make it easier for large tech firms to access private data and could erode privacy safeguards that have been in place for two decades.

Delo.ua reports this, citing Reuters.

The Digital Omnibus package includes postponing stringent artificial intelligence regulations until the end of 2027, simplifying data-access rules so that vast data collections, including sensitive health and biometric details, can be used for AI training without explicit consent, and measures aimed at reducing bureaucratic burdens on businesses.

Commission representatives say the changes, which still require approval from EU member states and the European Parliament, will maintain the region's rigorous privacy standards, despite warnings from civic organizations that they disproportionately favor the interests of major technology corporations.

Deferring High-Risk AI Rules Until 2027

Companies deploying so-called high-risk artificial intelligence systems will get an additional 16 months before stricter rules take effect, with the deadline moving from August 2026 to December 2027.

“High-risk AI” covers applications in fields such as law enforcement, education, judicial proceedings, asylum and immigration processes, public sector services, human resources management, critical infrastructure (water, gas, electricity), and the use of biometric data.

Simplified Access to Personal Data for Tech Companies

The European Commission is seeking to define the point at which data stops being classified as personal under privacy legislation. This clarification could make it easier for anonymized data about EU citizens to be used for AI training without explicit permission.

Under the proposal, anonymized data will not be treated as personal data provided that the entity processing it cannot identify the individual to whom it relates.

For AI training, companies will be permitted to use large datasets, even those containing sensitive personal details such as medical or biometric information, provided they take appropriate measures to remove those details.
