Running into the Future: How AI-Enhanced Wearables Are Redefining Paris Olympics
Posted on 25 July 2024 in News > Media, Data, Technologies & IP

The Paris Olympics are just around the corner, which gives us a great opportunity to explore the artificial intelligence used by connected devices in sport and to highlight the underlying legal issues.

During the 2023 Rugby World Cup, connected devices embedded in players’ jerseys had already revolutionised performance monitoring and injury prevention. Such technologies, and others like GPS vests in football (e.g. the Barça GPS Tracker and CityPlay), have paved the way for even more sophisticated applications in sport.

The Paris Olympic Games are no exception. Athletes will not only be in the spotlight for their exceptional skills but also as early adopters of AI-enhanced wearable technology. These devices are transforming how athletes train, compete, and improve. They track everything from heart rates and oxygen levels to stride patterns and fatigue, offering insights that were once the domain of high-tech labs.

This raises an important question: now that the Data Act (Regulation (EU) 2023/2854 on harmonised rules on fair access to and use of data) and the AI Act (Regulation (EU) 2024/1689 laying down harmonised rules on artificial intelligence) are in force, is identifying the newly applicable obligations, and categorising the risk levels of the AI systems used by Olympic athletes, just as challenging as coping with the current sweltering summer temperatures?

 

1. Delving into the AI Act:

The AI Act ensures that artificial intelligence systems (or “AI systems”) used in the EU are safe, transparent, and respect people’s fundamental rights. It applies to all AI systems, whether or not they are integrated into other technology, such as a wearable device.

The AI Act also introduces a risk-based approach, which means that the applicable rules depend on the systems’ risk levels.

AI systems are therefore classified into four categories:

  • low-risk AI systems,
  • high-risk AI systems,
  • prohibited AI practices that are considered unacceptable, and
  • certain AI systems that are subject to transparency obligations (e.g. when it may be unclear if users are interacting with an AI system or a human).

The use of AI systems at the Paris Olympics poses various challenges with regard to athletes’ privacy, the protection of sensitive data, and the transparency of algorithms. At the same time, identifying the categories of AI systems being used, and understanding the rules that apply to them, has become increasingly important.

While low-risk AI systems are largely outside the scope of the AI Act and only need to meet transparency requirements where applicable, high-risk AI systems must comply with various requirements, including:

  • A risk management system to spot and assess health and safety risks,
  • Detailed technical documents,
  • Automatic record-keeping of the AI system’s activities throughout its life,
  • Clear instructions for those setting up the AI system, and
  • Tools allowing for proper human oversight and control of the AI system, etc.

We believe that AI systems used for the Paris Olympics will mainly be categorised as low- or high-risk, requiring strict security and personal data protection controls, in the context of:

  • Performance analysis: AI systems can analyse athletes’ biomechanical and physiological data in real-time to optimise their training. Sensors integrated into sports equipment can collect detailed data on movements and efforts, helping trainers to adjust training programs in a personalised way. These systems are likely to be generally classified in the low-risk category, as they influence users’ decisions without impacting their safety or fundamental rights.
  • Injury prevention: AI algorithms can analyse health data and past injury records to predict injury risks and suggest preventive measures. This approach enables athletes to stay in better shape and avoids interruptions in their preparation. These AI systems will be classified as high-risk since they process sensitive personal data and have a direct impact on the health and safety of athletes.
  • Nutrition and recovery: AI and connected devices can also play a role in managing athletes’ nutrition and recovery. By analysing athletes’ nutritional needs and optimal recovery periods, AI systems can recommend tailored diet plans and recovery programs. Currently, these systems are generally low-risk AI systems. However, they may be subject to transparency obligations and requirements concerning the content they generate. Moreover, if these systems process sensitive health data or substantially influence athletes’ performance and physical recovery, they could also be classified as high-risk.

Data on health, nutrition, and recovery collected by wearable devices could subsequently be shared by the company selling or renting the devices, including with emergency and first aid services, as well as with insurance companies that provide coverage to athletes. The Data Act governs how this data is managed and shared with these entities.

 

2. Delving into the Data Act:

The Data Act aims to set up a harmonised framework for sharing and using data across all business areas within the EU. It ensures that all types of data (personal and non-personal data alike) are shared fairly, securely and in a way that respects individual rights.

For example, during training, connected devices gather large amounts of athletes’ data, which helps optimise their overall performance. This data may be shared with many different stakeholders for processing, in various contexts such as:

  • Business to consumer data sharing,
  • Business to business data sharing, and
  • Business to public sector data sharing (a public sector body, or an EU institution).

In this regard, the Data Act requires that data collected by devices and shared with others be subject to strict measures, ensuring sensitive information is secure and correctly used:

  • The company selling or renting the device must inform users what data it collects, including the data types and volume,
  • Users must be able to access their data quickly and free of charge,
  • If rules are not followed, users can complain to specific authorities,
  • Users can ask that their data be shared with a third party (e.g. an insurance company), and
  • The third parties that receive data can only use it for the agreed reasons, etc.

 

We believe that the use of AI in connected devices will be an integral part of these Olympic Games. This will undoubtedly mark an important milestone in the eyes of the public in terms of the undeniable benefits that AI can provide.

The flip side of the coin is the need for these technologies to comply with the AI Act, the Data Act and finally the GDPR, which leads to the conclusion that robots must, in their own way, pass an anti-doping test!

The legal framework applicable to new technologies and the protection of data, either personal or not, is complex – and an area in which MOLITOR’s Media, Data, Technology, and IP team can offer comprehensive expertise.

Please contact us if you require legal assistance in navigating and complying with the GDPR, the AI Act and the Data Act.

 
