AI ASSISTANT
A recent incident involving the Boox e-reader has highlighted concerns over the use of Chinese AI models in global products.
The device, made by China-based Onyx International, reportedly integrated ByteDance's AI model, Doubao, sparking backlash for producing responses that mirrored Chinese government narratives.
In December 2024, Reddit users revealed that Boox's AI assistant, launched last summer, denied events like the Tiananmen Square crackdown and avoided criticising countries like Russia and North Korea.
Instead, it directed criticism towards Western nations, referencing topics such as French colonialism.
These responses, shared through screenshots, quickly gained attention and were covered by the AI publication The Decoder and The China Show on YouTube.
Here’s what you should know:
- ByteDance’s Doubao AI model in Boox devices echoed Chinese government viewpoints, leading to user outcry.
- Boox has reportedly replaced Doubao with OpenAI’s GPT-3, but no official statements confirm the current model in use.
- Experts warn of the cultural risks posed by AI tools shaped by regional political or cultural biases.
Oops, wrong assistant
ByteDance stated that Doubao was designed solely for use within mainland China.
The backlash has reportedly led Boox to switch to OpenAI’s GPT-3 via Microsoft Azure.
However, Boox has made no public announcements, and OpenAI has not commented on the matter.
This case highlights the broader risks of integrating localised AI models into international products.
Experts like Clement Delangue, CEO of Hugging Face, have warned about the potential for cultural influence when AI tools developed in specific regions dominate the market.
Delangue pointed out that such models might reflect perspectives that don’t align with those in Western countries, particularly when discussing sensitive historical or political issues.
Imagine asking your e-reader a question and getting back a government press release.