19 March 2021 by

Three applicants v Ola Netherlands B.V. C/13/689705 / HA RK 20-258, District Court, Amsterdam (11 March 2021)

An Amsterdam court has ordered Ola (a smartphone-hailing taxi company like Uber) to be more transparent about the data it uses as the basis for decisions on suspensions and wage penalties, in a ruling that breaks new ground on the rights of workers subject to algorithmic management.

James Farrar and Yaseen Aslam, who won the landmark victory in the UK Supreme Court in February, led the action by a group of UK drivers and a Portuguese driver, who brought three separate cases against Ola and Uber seeking fuller access to their personal data.

The following is a summary of the case against Ola taxis. Anton Ekker (assisted by AI expert Jacob Turner, whom we interviewed on Law Pod UK here) represented the drivers. He said that this case was the first time, to his knowledge, that a court had found that workers were subject to automated decision-making (as defined in Article 22 of the GDPR), thus giving them the right to demand human intervention, express their point of view and appeal against the decision.

The Facts

Ola is a company whose parent company is based in Bangalore, India. Ola Cabs is a digital platform that pairs passengers and cab drivers through an app. The claimants work as ‘private hire drivers’ (“drivers”) in the UK. They use the services of Ola through the Ola Driver App, and the passengers they transport rely on the Ola Cabs App.

Proceedings are pending in several countries between companies offering services through a digital platform and drivers, over whether an employment relationship exists.

By separate requests dated 23 June 2020, the first two claimants asked Ola to disclose their personal data processed by Ola and make it available in a CSV file. The third claimant made an access request on 5 August 2020. Ola provided the claimants with various digital files and copies of documents in response to these requests.

Ola has a “Privacy Statement” in which it has included general information about data processing.

All references in this judgment are to the AVG, which is Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (GDPR).

The applicants asked the court to order that they be given access, in a common digital format, to a range of data, including personal data, the recipients to whom that data would be disclosed, and the existence of automated decision-making, including profiling as referred to in Article 22(1) and (4) of the AVG (GDPR). They also asked for appropriate safeguards in the case of transfer to a third country or an international organisation, in accordance with Article 46 AVG.

The claimants also sought an order from the court requiring Ola, within one month of notification of the decision, to provide them with their personal data in a structured, commonly used and machine-readable form, that is to say as a CSV file, in such a way that this data could be transmitted directly to another controller.
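To make the “structured, commonly used and machine-readable” requirement of the data-portability right concrete, here is a minimal sketch in Python of the kind of CSV export the claimants sought. The field names and values (trip_id, fare_eur, and so on) are invented for illustration; the judgment does not specify Ola’s actual data schema.

```python
import csv
import io

# Hypothetical records standing in for the kind of per-trip personal data a
# driver might request; every field name and value here is invented.
records = [
    {"trip_id": "T-1001", "timestamp": "2020-06-23T09:15:00Z",
     "fare_eur": "12.50", "rating": "4.8"},
    {"trip_id": "T-1002", "timestamp": "2020-06-23T11:02:00Z",
     "fare_eur": "8.75", "rating": "5.0"},
]

def export_csv(rows):
    """Serialise rows as CSV text: one header row, then one row per record."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = export_csv(records)
print(csv_text)
```

The point of the format requirement is that a file like this can be ingested directly by another controller without manual re-keying, which is what Article 20 GDPR means by transmitting data “without hindrance”.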

They asked the court to enforce the foregoing orders on pain of a penalty of €2,000 for each day, or part of a day, that Ola remains in default of compliance, and for Ola to pay the costs of the proceedings.

Arguments before the Court

The claimants maintained that Ola had not provided full access to their personal data in response to their access requests. Ola’s Privacy Statement and accompanying documents showed that the company processes a large number of categories of personal data, but the claimants were not able to obtain access to a large part of those categories. This, they claimed, was inadequate under the GDPR.

Ola uses automated decision-making and profiling in the performance of the contract with its drivers. When applying profiling, recital 71 of the AVG (GDPR) requires Ola to implement appropriate procedures and take measures to ensure fair and transparent processing for the data subject. Discriminatory effects of profiling should also be prevented. In order to be able to assess whether Ola had complied with the requirements of Article 22(3) of the AVG when using it, the claimants maintained that they should have had access to the automated decision-making and profiling, to information about the underlying logic, and to the expected consequences of such processing. They pointed out that proceedings were being conducted in various countries concerning the question of whether an employment relationship exists between providers of ‘ride hailing apps’ and drivers.

Of significance here is the extent to which such providers exercise management control through, inter alia, algorithms and automated decision-making.

Referring to the recent ruling by the UK Supreme Court that drivers are entitled to the minimum wage and holiday pay for every hour that they are logged on to a “ride hailing platform”, the claimants argued that they needed access to their data in order to calculate these wages. This data, they contended, was essential for drivers to “organise themselves and build collective bargaining power”. Transparency about data processing, they said, was necessary to protect the interests of drivers vis-à-vis platform providers; and when deciding on their licence to drive, drivers are assessed on the basis of their suitability, in which context their track record and conduct are relevant. Therefore, said the applicants, drivers have an interest in unrestricted access to their data.

Ola contended that the requests should be rejected, or granted (in part) with due regard to the circumstances and guarantees referred to by Ola, and that [applicant 3] be ordered to pay the costs of the proceedings (including subsequent costs), plus statutory interest.

The Court’s conclusions

The Amsterdam District Court found that the car-booking app had used a wholly automated system to make deductions from one driver’s earnings, a finding that attracts greater legal protection under Dutch law.

The judge was at pains to emphasise that, in principle, a data subject does not have to give reasons for, or substantiate, a request for inspection under the AVG (GDPR).

When exercising his right of inspection, the data subject does not have to put forward a particular interest or state the purpose he intends to achieve with the inspection. The mere fact that data relating to him or her is being processed is sufficient. It is up to the controller to demonstrate abuse of power.

The claimants argued that they wished to check the accuracy and lawfulness of their own data and that this was a precondition for being able to exercise other privacy rights. That was sufficient to satisfy the court. Contrary to Ola’s submission, the fact that the claimants (and the union to which they were affiliated) also had another interest in obtaining personal data, namely to use it to obtain clarity about their employment status or to gather evidence in legal proceedings against platforms, did not mean that the applicants were abusing their rights under the GDPR. The claim of abuse of the right of inspection was therefore rejected.

[The relevant section of the AVG/GDPR] allows the data subject to move, copy or transfer personal data easily from one IT environment to another, without hindrance, and regardless of whether the data are held on their own systems, on the systems of trusted third parties or on the systems of new data controllers. Ola rightly argues that an important objective of this right is to facilitate switching to another service provider and to avoid so-called ‘user lock-in’ with the original controller. However, this does not mean that the purpose pursued by the claimants, namely analysis of their own personal data or use for their own purposes, is excluded from the right to data portability. There is no support for this in the legislative history of the AVG, the recitals to the AVG itself or the Guidelines. The claim of abuse of the right to data portability was therefore rejected.

Furthermore, the court said Ola should give drivers access to anonymised ratings of their performance, to personal data used to create their “fraud probability score” and to data used to create an earnings profile that influences work allocation.

Here we come to the most interesting part of the judgment, where the Court grappled with the question of access to information related to automated decision-making and profiling. The claimants requested access to the existence of automated decision-making and profiling on the basis of Article 15(1) of the AVG. This article stipulates that the data subject has the right to obtain from the controller information about the existence of automated decision-making, including profiling, and, at least in such cases, useful information about the underlying logic as well as the significance and expected consequences of such processing for the data subject.

Under Article 12(1) of the AVG, the data controller must provide data subjects with concise, transparent, intelligible and easily accessible information about the processing of their personal data. The AVG defines profiling as:

any form of automated processing of personal data which evaluates, on the basis of personal data, certain personal aspects relating to an individual, in particular to analyse or predict the individual’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.

A data subject must be informed of the existence of profiling and its consequences (recital 60 of the AVG).

Under Article 22 of the AVG, individuals have the right, subject to certain exceptions, not to be subject to a decision based solely on automated processing or profiling which produces legal effects concerning them or significantly affects them in some other way. A decision based solely on automated processing is one where there is no significant human intervention in the decision-making process. Recital 71 of the AVG mentions as examples of automated decision-making the automatic refusal of a credit application submitted online, and the processing of job applications via the internet without human intervention.

In the case of these claimants, the Court observed that there had certainly been profiling within the meaning of Article 4(4) of the AVG, because the professional performance of the driver was being evaluated. This means that Ola had to allow access to the personal data of the claimants which it used to draw up the profile, and should also provide information about the segments into which the claimants were classified, so that they could check whether this information was correct.

The Court noted Ola’s automated decision-making process, which determines that rides are invalid, as a result of which penalties and deductions are imposed. It followed from Ola’s explanation of this decision-making process that there was no human intervention prior to such a decision. The [automated] decision to impose a reduction or fine affected the claimants’ rights under the agreement with Ola. This means that Ola was prohibited from subjecting the claimants to such decision-making unless it was necessary for the performance of the agreement between Ola and the claimants.

In conclusion, the Court ordered that Ola, within two months after service of this order, provide the claimants with a copy of, or access to, the (personal) data concerned.


Apart from being another chip in the wall erected by the “ride hailing” gig taxi industry, this judgment is a bellwether for the future approach of courts to “black box” automated decision-making in processes relying on AI. Jacob Turner of Fountain Court Chambers tweeted (@Jacobturner1):

These judgments [including the one I’ve summarised above] are the first in the world on the right to an explanation of automated decision-making under the GDPR. Uber must disclose data about alleged fraudulent activities by the drivers, on the basis of which Uber deactivated their accounts (‘robo-firing’), as well as data about individual ratings. Ola Cabs must provide access to ‘fraud probability scores’, earnings profiles, and data that was used in a surveillance system. In the case of one Ola driver, the court decided that a decision to make deductions from driver earnings using an algorithm amounted to an automated decision lacking human intervention.

The Financial Times [paywall] reported the Uber ruling by the same court. According to the FT, Uber responded that this was a “crucial decision”:

“The court has confirmed Uber’s dispatch system does not equate to automated decision-making, and that we provided drivers with the data they are entitled to. The court also confirmed that Uber’s processes have meaningful human involvement.”

However, not all of the ride hailing apps escaped the “automated decision-making” category in these judgments, and James Farrar, a representative of one of the drivers’ unions involved, said: “This is a hugely important first step … We are going to have to do an awful lot more.”

Winning access to data was crucial, he said, because as platforms’ contractual arrangements with workers came under greater scrutiny, they were shifting towards more opaque automated management systems. Greater transparency would not only help drivers contest unfair decisions against them but would also help to establish their average hourly earnings after costs.