We recently launched a B-side (business-facing) site and published more than 100 articles on it in less than a month. The goal was not so much to rank the articles themselves as to boost the rankings of the service pages with a large volume of content targeting long-tail keywords. In addition, thanks to the silo structure (content cluster structure), these blog posts also serve as supporting pages.
At first the data looked good, and many high-CPC keywords started to rank. Just when I thought the strategy was working, the situation took a sharp turn for the worse in less than a month: the indexing rate of the AI-generated articles kept falling, and almost all articles that were more than 80% AI-generated were dropped by the search engine.

After this trial, or rather this lesson, I basically no longer rely on so-called AI detection tools. There is a funny quote I saw on YouTube: "Use AI to fool AI, and then convince yourself the content was not generated by AI." Isn't that a modern version of the Chinese idiom "plugging your ears while stealing a bell"?
Most of these AI articles are now stuck in the "Crawled - currently not indexed" state, and once a page lands there it is very hard to get it indexed again. This is quite different from "Discovered - currently not indexed". Some people have managed to push a few articles back into the index by submitting them URL by URL, but they are quickly removed again.
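For reference, below is a minimal sketch of how one might check these coverage states programmatically, assuming access to Google's Search Console URL Inspection API through the google-api-python-client library. The property URL, credentials file, and page list are placeholders, and the response field names follow Google's documentation but should be treated as assumptions here.

```python
# Sketch: query the Search Console URL Inspection API to see whether a page
# sits in "Crawled - currently not indexed" or "Discovered - currently not indexed".
# Assumes a service-account key that has been granted access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://example.com/"    # hypothetical Search Console property
KEY_FILE = "service-account.json"    # hypothetical credentials file

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

pages = [
    "https://example.com/blog/long-tail-article-1",   # hypothetical URLs
    "https://example.com/blog/long-tail-article-2",
]

for page in pages:
    result = service.urlInspection().index().inspect(
        body={"inspectionUrl": page, "siteUrl": SITE_URL}
    ).execute()
    status = result["inspectionResult"]["indexStatusResult"]
    # coverageState is a human-readable string such as
    # "Crawled - currently not indexed" or "Submitted and indexed".
    print(page, "->", status.get("coverageState"))
```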
To address this, we are focusing on the following:
- Optimize the articles: improve their quality and readability;
- Use new URLs: republish each revised article under a new URL;
- Remove links to the old articles: this keeps the old articles from skewing the crawler's assessment of the site and saves crawl budget (see the sketch after this list).
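As a rough illustration of the last two steps, here is a hedged Python sketch (using the requests library; the URL mapping is hypothetical) that checks whether each old article URL has been taken down (404/410) or permanently redirected to its new URL, so the old pages no longer consume crawl budget:

```python
# Sketch: verify that old article URLs are gone (404/410) or permanently
# redirected (301/308) before republishing the content under new URLs.
# The URL mapping below is hypothetical.
import requests

old_to_new = {
    "https://example.com/blog/old-ai-article-1": "https://example.com/blog/rewritten-article-1",
    "https://example.com/blog/old-ai-article-2": "https://example.com/blog/rewritten-article-2",
}

for old_url, new_url in old_to_new.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    if resp.status_code in (404, 410):
        print(f"OK (removed): {old_url} -> {resp.status_code}")
    elif resp.status_code in (301, 308) and resp.headers.get("Location") == new_url:
        print(f"OK (redirected): {old_url} -> {new_url}")
    else:
        # Anything else means the crawler can still reach the old page,
        # which wastes crawl budget and can keep confusing indexing.
        print(f"CHECK: {old_url} returned {resp.status_code}")
```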
As a result, we now place much more emphasis on the contextual logic and readability of the articles. Many people are satisfied with the structure of the content ChatGPT generates, but when you translate it with DeepL you realize that although every sentence looks fine on its own, the text reads stilted and disjointed as a whole. Such content, even if it can pass AI detection tools (such as http://gtpzero.me), is still unlikely to get past Google's BERT algorithm.
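As one rough proxy for readability (it does not capture contextual logic, which still needs a human read), here is a hedged Python sketch using the textstat library to flag drafts that score poorly before publishing; the cutoff value is an arbitrary assumption, not a Google threshold.

```python
# Sketch: flag AI-assisted drafts whose readability score looks poor.
# Flesch Reading Ease: higher = easier to read. The 50.0 cutoff is an
# arbitrary assumption for illustration only.
import textstat

def needs_rewrite(draft: str, min_score: float = 50.0) -> bool:
    score = textstat.flesch_reading_ease(draft)
    print(f"Flesch Reading Ease: {score:.1f}")
    return score < min_score

draft = (
    "Our service helps small exporters automate customs paperwork. "
    "It connects to your ERP, fills in the declarations, and tracks each shipment."
)
if needs_rewrite(draft):
    print("Consider rewriting for fluency before publishing.")
```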
Google's research in natural language processing (NLP) runs so deep that trying to trick Google with AI-generated articles is a fool's errand. We now treat AI as just an aid when writing articles and focus on ensuring contextual relevance and readability, usually rewriting the AI-generated draft so the whole article reads fluently.
The same applies to SEO in general: no matter how fancy your technique is, if it is not centered on solving the searcher's problem, it will not rank well. Google's emphasis is on "helpful content", not on whether AI was used; as long as AI-generated content is genuinely helpful, it can still rank well.
For example, in one of our e-commerce projects, an employee wrote a "best ..." roundup listing and explaining more than a dozen categories; it was rewritten with ChatGPT, yet it ranked very high because Google considered it helpful to users.
ChatGPT is a double-edged sword: used well, it boosts efficiency; used poorly, it can be fatal. Next, I will keep testing our new strategy and share the progress with you if it works.
Retrieved from: https://zhuanlan.zhihu.com/p/10870802560