Google's new "instant" search results turn into a blank screen when you enter certain forbidden terms. But why are terms like "lesbian" and "Latina" given this treatment? A Google employee has shed a little light on the situation.
The words are chosen by computer algorithm rather than by human editors, according to a Google employee's help forum post unearthed by Brian Ries at the Daily Beast. Employee "Kelly F" said both the term itself and its search results are analyzed in deciding whether to filter results — and even blacklisted searches can be completed simply by hitting "Return":
The algorithms we use to remove a prediction from autocomplete consider a variety of factors. Among other things, we exclude predictions for queries when the query itself appears to be pornographic, violent or hateful... Importantly, we also consider the search results themselves for given queries. So, if the results for a particular query seem pornographic, our algorithms may remove that query from autocomplete even if the query itself wouldn't otherwise violate our policies.
This doesn't entirely explain things: None of the first-page results for "lesbian" are pornographic, unless you take an extremely jaded view of the syndicated TV series Xena: Warrior Princess. And, as the website CarnalNation points out, the word "dyke" is allowed past the filter. Though it's been reclaimed by many proud lesbians, that term surely falls closer to Google's "hateful" standard than the more neutral "lesbian."
Yes, Google's block can be bypassed with a single keystroke. But if Google wants to get people hooked on Google Instant, it might consider making the user interface less erratic and homophobic, and more rational.