Twitter has disabled autocomplete in its search bar after users pointed out that the service had begun recommending gory videos of animal abuse and war footage.

Autocompleting search terms is de rigueur for most online services these days. Google often knows what you want to ask it before you’ve finished typing your question. Twitter was no different, but on Wednesday evening users began to notice something strange. Typing in once-innocuous search terms like “kitten” and “dog” began to return autocomplete suggestions for gore videos.

Motherboard tried this itself and found that searching for “dog” suggested “dog screwdriver video” and “kittens” suggested “kittens in a blender.” Searching for “wagner” would suggest “wagner hammer execution,” and “texas” would suggest “texas mall video.” Some of these search terms would link back to uncensored videos matching those descriptions on the platform. Twitter apparently “fixed” the problem by simply disabling autocomplete. As of this writing, neither the desktop nor the mobile version of the app autocompletes search suggestions.

Gore content has proliferated on Twitter in the past few weeks. In the aftermath of the shooting in Allen, Texas, a graphic video of children killed in the attack circulated on the platform for hours before being removed. It was easy for someone looking for information about the attack to stumble across the video, multiple times, while searching Twitter. That video has largely been removed from the platform, but others—including one of the dead shooter—still appear when searching for information about Allen.

Twitter did not immediately respond to Motherboard’s request for comment.
