It’s because the company incorrectly listed the term as one that is “typically associated with adult content” and therefore blocked, according to Twitter.
Users alerted the company to the issue over the weekend after this message appeared whenever they typed “bisexual” into the search bar.
Some tweeters accused the company of having double standards:
twitter: doesnt suspend racists, sexists, ban slurs, deactivate spam accounts
also twitter: blocks #bisexual from showing photos/results— alice (@afterglowsdmn) November 5, 2017
Twitter acknowledged the mistake on Sunday afternoon, and said it was “working quickly to resolve” it.
We’ve identified an error with search results for certain terms. We apologize for this. We’re working quickly to resolve & will update soon.— Twitter Support (@TwitterSupport) November 5, 2017
On Monday night, the company explained the root of the problem.
In a series of tweets, Twitter admitted that searches for “certain words related to sexuality,” including “bisexual,” “did not populate complete results.”
It was because the company incorrectly included the word on an outdated list of terms “typically associated with adult content.” Twitter promised to resolve the issue.
It had not been fixed by Tuesday morning, however.
“We apologize for anyone negatively impacted by this bug,” the company added. “It is not consistent with our values as a company.”
Check out Twitter’s full thread below:
1 / Late last week, we discovered a technical issue that affected search results: searches for certain words related to sexuality did not populate complete results. We apologize for anyone negatively impacted by this bug. It is not consistent with our values as a company.— Twitter Support (@TwitterSupport) November 7, 2017
2 / As outlined in our media policy, media that may be considered sensitive is collapsed in places such as search results, meaning that images and videos would be presented as a link, not automatically populated. https://t.co/4KYjAPrnM5— Twitter Support (@TwitterSupport) November 7, 2017
3 / One of the signals we use to identify sensitive media is a list of terms that frequently appear alongside adult content. Many of these words on the list are not inherently explicit, which is why they must be used alongside other signals to determine if content is sensitive.— Twitter Support (@TwitterSupport) November 7, 2017
4 / Our implementation of this list in search allowed Tweets to be categorized based solely on text, w/out taking other signals into account. Also, the list was out of date, had not been maintained and incorrectly included terms that are primarily used in non-sensitive contexts.— Twitter Support (@TwitterSupport) November 7, 2017
5 / When all Tweets containing certain terms were incorrectly collapsed on the photos, video and news search tabs, the search results in those tabs returned an error message indicating that no content was available.— Twitter Support (@TwitterSupport) November 7, 2017
6 / We have audited the list and removed terms that should not have been included. We are making changes during the next 24 hours to correct this mistake. Once we are confident it is completely resolved, we’ll share an update here.— Twitter Support (@TwitterSupport) November 7, 2017
- This article originally appeared on HuffPost.