According to the cited source, Italy's Data Protection Authority found numerous violations of EU privacy law, including the GDPR principles of transparency, notification, lawfulness of personal data processing, security, and confidentiality.

"The fine applied is extremely important from the perspective of how the employee-employer relationship is evolving, including for the Romanian market," the specialists say. "In recent years, ridesharing and food-delivery platforms have turned freelancing into a new normal on the services market. Of course, this easy way to find your next gig comes with serious risks for freelancers' rights. With an employment contract we are talking about a legally regulated relationship, in which the parties have clear mechanisms for verifying productivity and tools to prove each party's good faith; in the case of food-delivery workers, it is the platform's algorithm that decides whether a worker is suitable or not, based on a score produced by a mechanism that is not transparent and cannot be challenged or discussed," explained Cristiana Deca, Decalex founder and data protection expert.

According to Decalex, this creates a contractual imbalance from the start: the freelancer has no way to demonstrate whether a given customer assessment is accurate or representative, and no way to dispute it.

The quoted source mentions that the fine imposed on the Glovo-owned company in Italy is a first for the food-delivery industry because it calls into question the opacity of the algorithm and the imbalance between platform workers' rights and the algorithm's decisions. "Unfortunately, it is not a first as far as the employee-employer relationship is concerned; in recent years there have been a series of fines for excessive or incorrect processing of employee information. One example is the 35-million-euro fine imposed on the H&M group for illegally collecting data on employees' private lives in Germany," says Decalex. Also in Germany, the company Notebooksbilliger was fined 10.4 million euros for excessive monitoring of employees.
The main threats to personal data
According to Decalex specialists, a first problem concerns couriers who call customers from their personal phones and keep the customers' numbers and addresses in their call history once the orders have been delivered. In effect, platform workers become data controllers after the order has been delivered and can use the customer data left in their possession as they see fit.

Another concern relates to negative reviews and their impact on the end customer. For example, if a courier delivers only one order in a day, he will know exactly who left the negative review, along with that customer's personal information, namely home address and phone number.

From the perspective of the platform's collaborators, the most obvious problem remains the restriction of their rights and the lack of any way to challenge the algorithm's decision-making, since their access to orders and to a minimum income depends on the reviews and on how the algorithm interprets them.

"Entrepreneurs and investors have also shown, over the last year, a strong interest in software development built around the GDPR. We work closely with many investors from Romania and Europe who want to know, before investing, what compliance risks an algorithm is exposed to, so they can understand the real cost of correcting it and the risks the investment carries from this perspective. Today, teams developing algorithm-based solutions understand that, starting with the prototype and the testing phase, they must take the GDPR into account in both the architecture and the decision-making model they apply," said Cristiana Deca, founder of Decalex. In Romania, the software development industry is beginning to understand that what matters for the user is not only the functionality, but also respect for users' rights.

What do the new fines mean for new business models and users
According to Decalex representatives, the major problem is the software's lack of technical compliance with GDPR provisions. They argue that any software that enters the market and processes personal data should take into account, from the design phase and thus before the prototype, the principles of privacy by design and privacy by default, which are necessary and mandatory for building transparent software. "All software development companies should take these two principles into account and integrate them into the architectural development of the product. We are obviously talking about a development team that goes beyond programmers, one that also includes the UX designer and the data protection consultant or DPO, who help design the application and its functions in a way that respects users' rights," says the quoted source.

But the GDPR is not the only European-level regulation aimed at the transparency of algorithms: the European Commission has a specific legal framework for online platforms that mediate the sale of services and products to users, the so-called marketplace platforms. "From the user's perspective, the more opaque the algorithm, the more users will, in the long run, lose trust in the brand and stop using the platform or application in question. According to statistics, 96% of US users have opted not to be tracked by apps on their phones," say the specialists.
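To make the two principles more concrete, the following is a minimal illustrative sketch in Python, not the implementation of any platform mentioned in the article. The names (PrivacySettings, DeliveryContact, purge_expired) and the one-hour retention window are assumptions chosen for the example: optional processing is off by default, the courier only ever sees a masked phone number, and customer contact data is purged shortly after delivery.

from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Privacy by default (illustrative): optional processing such as tracking or
# profiling is switched off unless the user explicitly turns it on.
@dataclass
class PrivacySettings:
    allow_tracking: bool = False
    allow_profiling: bool = False

# Privacy by design (illustrative): the data model itself limits what the
# courier can see and how long the customer's contact data is kept.
@dataclass
class DeliveryContact:
    phone: str
    address: str
    delete_after: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc) + timedelta(hours=1)
    )

    def masked_phone(self) -> str:
        # The courier only sees the last two digits of the customer's number.
        return "*" * (len(self.phone) - 2) + self.phone[-2:]

def purge_expired(contacts: list[DeliveryContact]) -> list[DeliveryContact]:
    """Drop contact records whose retention window has passed."""
    now = datetime.now(timezone.utc)
    return [c for c in contacts if c.delete_after > now]

if __name__ == "__main__":
    contact = DeliveryContact(phone="+40721234567", address="Str. Exemplu 1")
    print(contact.masked_phone())          # what the courier would see
    print(len(purge_expired([contact])))   # still within the retention window -> 1

The point of the sketch is that the safeguards live in the data model and its defaults rather than being bolted on later, which is exactly what the privacy-by-design and privacy-by-default principles ask of development teams.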