Bad Robots: Algorithms Used by Chinese Food Delivery Apps Create Danger for Drivers and Others
Bad Robot Outcome:
Two of China’s most popular food delivery services came under fire after the dangers their practices pose to workers, as well as to other pedestrians and drivers, were widely publicized. The danger stems from the companies’ use of overly demanding algorithms that impose unreasonably tight delivery windows and penalize workers who cannot keep up.
Online food ordering in China is a massive industry. A staggering 395 million individuals had ordered food online as of March 2020. This number represents about 45% of Chinese internet users. By contrast, in the US, that figure sits around 9%. At issue here are the practices of two of the most dominant players in the Chinese food delivery space – Meituan and Ele.me (owned by Alibaba).
To avoid the penalties the algorithms imposed, the already underpaid delivery workers were forced to travel at unsafe speeds, violate traffic rules, and work extended hours. The results were, not surprisingly, less than ideal: in 2019, 325 delivery (food and parcel) drivers in Shanghai alone were killed or injured, with Meituan and Ele.me workers involved in almost 70% of those incidents.
In addition to the enormous safety concerns, the companies’ algorithms also created financial hardship for their drivers. “It’s all luck whether I make it or not,” lamented Mr. Wu, a 27-year-old Ele.me driver. Three of the seven lunchtime deliveries he made were marked late, costing him about a third of his daily income.
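As a rough illustration of that penalty economics, the sketch below uses entirely invented figures; the function, pay rate, and penalty amount are assumptions for illustration, not Ele.me’s actual pay scale:

```python
# Toy model: flat pay per order, minus a flat penalty per late delivery.
# All figures are hypothetical; the platforms' real pay scales are not public here.

def daily_earnings(deliveries: int, late: int,
                   pay_per_order: float, late_penalty: float) -> float:
    """Total pay for the day: base pay per order minus late penalties."""
    return deliveries * pay_per_order - late * late_penalty

# A driver completes 7 lunchtime orders at 10 yuan each,
# but 3 are marked late at an 8-yuan penalty apiece.
full_day = daily_earnings(7, 0, 10.0, 8.0)   # 70.0 yuan with no late marks
actual = daily_earnings(7, 3, 10.0, 8.0)     # 46.0 yuan after penalties

lost_share = (full_day - actual) / full_day
print(f"Income lost to penalties: {lost_share:.0%}")
```

Under these toy numbers, three late marks cost roughly a third of the day’s pay, in line with Mr. Wu’s experience.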
Meanwhile, the costs borne by Chinese companies like Ele.me and Meituan remain at only about 10-20% of those of their United States counterparts.
The practices described above came to light as a result of a September exposé in China’s People magazine. Within hours, the story had over 100,000 views and was being shared widely across social media.
Readers and viewers were quick to jump in with their thoughts and criticisms. “Most people won’t care if their order arrives two minutes sooner or ten minutes late,” reads one comment that got over 33,000 likes.
Not surprisingly, the two companies took notice of the “buzz.” Within 12 hours of the story breaking, Alibaba announced that Ele.me would add functionality allowing customers to voluntarily extend their wait time by 5 to 10 minutes. Meituan responded with a pledge to improve its algorithm to account for delays caused by factors outside drivers’ control, such as slow elevators and inclement weather.
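Neither company has published its scheduling algorithm, so the sketch below is purely hypothetical: it shows one way a delivery deadline could fold in a customer-granted extension and allowances for delays outside a driver’s control. Every name and number in it is an assumption.

```python
# Hypothetical deadline calculation; no parameter here comes from
# Meituan or Ele.me -- all allowances are invented for illustration.

def adjusted_deadline_minutes(base_eta: float,
                              customer_extension: float = 0.0,
                              bad_weather: bool = False,
                              slow_elevator: bool = False) -> float:
    """Start from the base ETA, then add time the driver cannot control."""
    deadline = base_eta + customer_extension  # e.g. the 5-10 minute opt-in
    if bad_weather:
        deadline += 10.0                      # invented weather allowance
    if slow_elevator:
        deadline += 5.0                       # invented high-rise allowance
    return deadline

# A 30-minute base ETA, with a 10-minute customer extension and bad weather:
print(adjusted_deadline_minutes(30.0, customer_extension=10.0, bad_weather=True))  # 50.0
```

The point of such a design is that slack is added for conditions the rider cannot change, rather than the rider absorbing the penalty for them.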
This particular “Bad Robot” example is a bit different from some of the others we’ve explored in this series. In the past, we’ve analyzed situations in which bias (either overt or incidental) existed within data sets and thus manifested itself in the algorithms that relied on that data.
Here, however, we’re looking at something quite different. This is not a case of bad data producing bad results. Instead, it is, quite frankly, the deployment of a patently unfair piece of technology.
While technology can, and should, be used by companies to increase efficiency and improve customer experience, these gains must be continuously weighed against the impacts that such technology will have on workers and the wider community. To go a step further, it can be argued that companies that employ technology unethically will be at a commercial disadvantage.
Consumer patterns are shifting. Led primarily by millennials and Gen Z, there is a strong interest in supporting companies whose morals align with those of their customers. In fact, according to the 2018 Conscious Consumer Spending Index, 32% of United States residents noted that they would not support a company that is not socially responsible.
This same sentiment was echoed by the hundreds of thousands of viewers, readers, and commenters in the wake of the Ele.me and Meituan fallout. Even if an algorithm seemingly improves a company’s bottom line on paper, if it is not deployed ethically and thoughtfully, the numbers might not add up quite right.