February 5, 2021

Holes in eBay’s keyword blocklist are leading to ads for far-right merchandise

This story was originally published in Digiday, Modern Retail's sibling publication.

After the attack on the Capitol on January 6, several online marketplaces pledged to crack down on products that promoted hate speech or were associated with far-right groups.

eBay still has some work to do.

Recent searches conducted by Digiday showed that merchandise branded with marks, logos and phrases associated with far-right nationalist groups, including gun parts, is readily available for sale on eBay, including in numerous listings promoted through eBay's advertising platform.

At the time of writing, typing the word "boogaloo" into the eBay search bar, for example, still auto-populates suggested results in the militaria category, and the search returns items including a shirt depicting George Washington holding a gun and wearing a Hawaiian shirt, a symbol of the far-right Boogaloo Boys.

Similar products appeared as promoted listings, eBay's version of self-serve ads. There were promoted listings for an Oath Keepers Glock cover plate (the Oath Keepers are also a far-right militia), a Kekistan "Kek" flag (loosely based on the Nazi flag) and shirts depicting the Punisher skull, a symbol that has been co-opted by the far right.

Searches for "iii percent" returned patches with the Roman numeral III, a reference to the Three Percenters, another militia group.

After being contacted by Digiday, eBay began reviewing and removing many of the items mentioned above, a spokesperson said. The company did not respond to a request for comment about how the items Digiday found came to be listed as ads.

Merchandise for Antifa, the far-left group, also appeared on eBay, including shirts, stickers and flags, although none came up as sponsored listings.

The deplatforming of former President Donald Trump has caused marketplaces to reckon with what is being sold on their platforms; eBay said that after the insurrection it banned additional merchandise related to the Boogaloo movement, the Oath Keepers and the Three Percenters, as well as "Stop the Steal" items. But while neural networks and keyword blocklists can catch and remove most prohibited material, observers said they won't be able to stop all of it.

“When companies have to scan millions of images, even if 1% aren’t caught, that’s a lot,” said Paul Bloore, cofounder and CTO of TinEye, a reverse image search engine. “Companies can’t fix that 1% without having a human look at everything. To achieve 100% review, you need people looking at 100% of images to catch anything bad. That’s just not feasible.”

eBay said that promoted listings go through the same filters as all other listings, and listings that violate the offensive materials policy cannot be promoted.

eBay sellers can opt to have their products appear as promoted listings by paying an additional percentage on top of eBay's 8% fee. For example, a seller could pay an additional 3% to promote a shirt listing. The seller pays eBay the extra 3% only if the promoted listing converts to a sale.
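As a rough illustration of that fee structure, the sketch below uses the 8% base fee and 3% ad rate cited above; it is not eBay's actual rate card, which varies by category, and the function name and rounding are illustrative assumptions.

```python
def seller_fees(sale_price: float, base_fee_rate: float = 0.08, ad_rate: float = 0.03) -> dict:
    """Sketch of the fee math described above: eBay's base fee plus an ad
    rate that is charged only when a promoted listing converts to a sale.
    Rates here are the ones cited in the story, not an official schedule."""
    base_fee = sale_price * base_fee_rate
    ad_fee = sale_price * ad_rate  # owed only because the promoted listing sold
    return {
        "base_fee": round(base_fee, 2),
        "ad_fee": round(ad_fee, 2),
        "total_fees": round(base_fee + ad_fee, 2),
        "seller_net": round(sale_price - base_fee - ad_fee, 2),
    }

# A $20 shirt sold through a promoted listing: $1.60 base fee plus $0.60 ad fee.
print(seller_fees(20.00))
```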

“eBay is pulling words from the seller’s listing description, and matching them with whatever was typed into the search bar,” explained Stuart Clay, associate director of marketplace strategic services at Tinuiti. “With promoted listings, sellers do not provide any keywords to eBay.”

When asked if the listings could be an issue with eBay's ad platform, or if the company was simply failing to ban certain keywords, Clay said it could be a bit of both. "eBay has historically run behind on advertising. Remember, they started as an auction site, so they are playing catch-up." But if far-right merchandise is still readily available, Clay said, that sounds like a blocklist that hasn't been updated.
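A keyword blocklist of the kind Clay describes is conceptually simple, which is also why it fails when it isn't kept current: any term that isn't on the list sails through. The sketch below is a hypothetical illustration, not eBay's actual filter; the term list and the normalization step are assumptions.

```python
import re

# Hypothetical blocklist -- a real one would be far larger and maintained continuously.
BLOCKLIST = {"boogaloo", "oath keepers", "stop the steal"}

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so trivial spelling tricks don't slip past."""
    return re.sub(r"[^a-z0-9\s]", " ", text.lower())

def violates_blocklist(listing_text: str) -> bool:
    """Return True if any blocked term appears in the listing title or description."""
    cleaned = normalize(listing_text)
    return any(term in cleaned for term in BLOCKLIST)

print(violates_blocklist("Boogaloo Hawaiian shirt, size L"))  # True: term is on the list
print(violates_blocklist("III percent morale patch"))         # False: "iii percent" isn't listed, so it slips through
```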

Platforms face two learning curves. On the product side, it can be difficult to keep up with all of the names, symbols and phrases of far-right groups that should be blocklisted, said Dr. Megan Squire, a computer science professor at Elon University and a senior fellow at the Southern Poverty Law Center.

On the technical side, the neural networks that scan product images can still miss what should be banned, as was the case with a Camp Auschwitz t-shirt and a Three Percenter ammunition magazine on Etsy.

Platforms usually use a combination of neural networks and humans to review listings and content, Bloore said. To train a neural network to detect, say, a swastika, companies feed it datasets containing hundreds of thousands of images so that the network learns what to identify.

Still, these neural networks will never be 100% accurate at blocking banned photos. For example, what if the swastika were mirrored? A human would still recognize it as a swastika, but a neural network might not, unless it has been fed images of reversed swastikas.
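One common way to close that particular gap is data augmentation: flipping, rotating or otherwise transforming training images so the network also sees mirrored variants of a banned symbol. The minimal sketch below uses the torchvision library; the specific transforms and probabilities are assumptions for illustration, not a description of any marketplace's actual system.

```python
from PIL import Image
from torchvision import transforms

# Augmentation pipeline: each training image is randomly mirrored and slightly
# rotated, so the classifier also learns reversed versions of a banned symbol.
train_transforms = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),  # mirrored copies
    transforms.RandomRotation(degrees=15),   # small rotations
    transforms.ToTensor(),                   # convert to a tensor for training
])

# Dummy stand-in for a product image; in practice this would come from the training set.
image = Image.new("RGB", (224, 224))
augmented = train_transforms(image)
print(augmented.shape)  # torch.Size([3, 224, 224])
```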

“If a network doesn’t have an example of something, it’s unlikely to catch it,” Bloore said. “Neural systems are not that flexible.”

“This is a difficult problem, and frankly, I don’t think it can be overcome,” said Bloore. “That said, I don’t get the impression that companies are trying to avoid content moderation.”