It’s not scalable. Sure, you could have humans comb through the 1,000 or so most common results, but there have got to be billions of unique searches every day.
Yeah, but downranking one AI-generated page downranks it across all search results, and you could add rules like downranking specific service providers or companies that are repeat offenders. I don’t think it would be easy, but I think it’s the only way to get something better than what we have using techniques that currently exist.
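The mechanism I mean looks roughly like this. A minimal Python sketch, not how any real engine ranks: the names (`PenaltyIndex`, the 0.5-per-strike decay) are made up for illustration, and it assumes some upstream scorer already produced per-query relevance scores.

```python
from dataclasses import dataclass, field

@dataclass
class PenaltyIndex:
    # domain -> number of confirmed "AI spam" judgments against it
    strikes: dict[str, int] = field(default_factory=dict)

    def flag(self, domain: str) -> None:
        # one human review of one page creates a domain-wide strike
        self.strikes[domain] = self.strikes.get(domain, 0) + 1

    def penalty(self, domain: str) -> float:
        # repeat offenders decay harder: halve the score per strike
        # (the 0.5 factor is an arbitrary choice for illustration)
        return 0.5 ** self.strikes.get(domain, 0)

def rank(results: list[tuple[str, float]], idx: PenaltyIndex) -> list[tuple[str, float]]:
    # results are (domain, base_relevance) pairs from some upstream scorer;
    # the same penalty applies to every query, so one judgment scales globally
    return sorted(
        ((d, score * idx.penalty(d)) for d, score in results),
        key=lambda pair: pair[1],
        reverse=True,
    )

idx = PenaltyIndex()
idx.flag("contentfarm.example")   # first reviewer judgment: half weight
idx.flag("contentfarm.example")   # repeat offender: quarter weight
print(rank([("contentfarm.example", 0.9), ("blog.example", 0.6)], idx))
# -> [('blog.example', 0.6), ('contentfarm.example', 0.225)]
```

The point is the leverage: one human review covers every query that domain would have ranked for, so you don’t need a reviewer per search.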
The amount of man-hours this would require would bankrupt even Google. You’d be better off building a new index of whitelisted sites.