
Change is inevitable

Happy (EU) Data Privacy Day! On January 28th, 1981, the Council of Europe opened its first modern privacy treaty, Convention 108, for signature. The purpose of this day is to better inform European citizens about their rights regarding the use of their personal data by governments, companies, and other organizations. Companies and organizations are also encouraged to use the day to improve their protection of personal data.

Within government and the business community, and also between the two, there is a growing trend toward sharing more data. Anonymous data about our living environment, for example, but also personal data about people. The reasons are diverse: the government wants to improve its services to citizens while combating and preventing fraud, and businesses are increasingly aware that personal data collected by others can also be put to valuable use for their own purposes.

That sharing of personal data does not go unnoticed. For the government, the ‘Data Processing by Collaborations Act’ is in preparation. In addition, the Artificial Intelligence rules currently being drafted demand greater transparency from all organizations about the origin of personal data. As a society, we will look more critically at proposals to share our personal data between different parties. Do we actually want that? What are the consequences? These are questions that will increasingly have to be answered in 2021.

Change that is needed
Google recently finalized its acquisition of Fitbit after reaching an agreement with the European Union. The announcement follows last year’s decision by the European Union to authorize the acquisition.

The EU launched an investigation into the acquisition in August, among other things to ensure that the privacy of Fitbit users is not compromised. Google could in principle link Fitbit users’ data to its advertising platforms, but it has committed not to use their health and wellness data for advertising purposes.

In addition, third parties will continue to have access to the Fitbit Web API, so that other companies can keep offering services built on the health and fitness data from Fitbit devices. Collected Fitbit user data will also be stored separately from other Google data. These requirements apply for ten years and can be extended for another decade after that period.
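
For readers curious what that continued third-party access looks like in practice, here is a minimal sketch of a call to the Fitbit Web API’s daily activity summary endpoint. It assumes the user has already authorized the app through Fitbit’s OAuth 2.0 consent flow; the token value and date below are placeholders.

```python
import requests

# Placeholder OAuth 2.0 access token obtained via Fitbit's consent flow;
# the user must explicitly authorize the app to read this data.
ACCESS_TOKEN = "your-oauth2-access-token"

def get_daily_activity(date: str) -> dict:
    """Fetch the authorized user's daily activity summary from the Fitbit Web API."""
    url = f"https://api.fitbit.com/1/user/-/activities/date/{date}.json"
    response = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    response.raise_for_status()
    return response.json()

# Example: steps recorded on Data Privacy Day 2021.
summary = get_daily_activity("2021-01-28")
print(summary.get("summary", {}).get("steps"))
```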

The technical separation of the two companies’ data was also a requirement. Users within the EEA are given the choice of whether they want to link Fitbit to other Google products, such as Maps.

Transparency of algorithms
Since 1995, European privacy legislation (the GDPR and its predecessors) has contained a remarkable rule, now found in Article 22 of the GDPR: making automated decisions about people based on their personal data is prohibited in principle. For a long time, that rule was somewhat obscure and hardly ever applied, because people, not machines, made decisions about people. That will change in 2021.

This has everything to do with the rise of algorithms that can make decisions based on data. In recent years, developers have become much better at building such algorithms. Given the advanced technology companies now have, and the growing awareness that these algorithms are actually being applied, you can expect to hear more about this topic in the coming year. Algorithms will increasingly be used to make judgments about people based on their personal data, which is why a lot of attention will be paid this year to a privacy rule that has so far been little known. What exactly does automated decision-making about people mean? When you use an algorithm, do you understand what it does? What part does the algorithm play in the decision? Is it based on the correct data sources? And can we as humans actually check how it works?
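
To make those questions a little more concrete, here is a purely illustrative sketch of an automated pre-screening decision that records the reasons behind its own outcome. The applicant fields, thresholds, and the pre_screen function are hypothetical and not drawn from any real system; the point is only that a decision rule can be written so that its inputs, and its share in the final decision, remain checkable by a human.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    # Hypothetical personal data feeding the automated decision.
    monthly_income: float
    existing_debt: float
    age: int

def pre_screen(applicant: Applicant) -> tuple[bool, list[str]]:
    """Return a decision plus human-readable reasons, so the logic
    can be audited and explained to the person it concerns."""
    reasons = []
    approved = True

    if applicant.age < 18:
        approved = False
        reasons.append("applicant is under 18")
    if applicant.existing_debt > 0.5 * applicant.monthly_income * 12:
        approved = False
        reasons.append("existing debt exceeds 50% of yearly income")

    if approved:
        reasons.append("all screening rules passed")
    return approved, reasons

# Every automated decision is logged together with its reasons,
# which is the kind of transparency the questions above point at.
decision, why = pre_screen(Applicant(monthly_income=3000, existing_debt=25000, age=30))
print(decision, why)
```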