1. Server instability: if the server is unstable for long stretches, or the website is frequently down or unreachable, search engines may reduce the number of pages they keep indexed.
2. Pages without content: the website has a large number of empty pages, or pages with very little content, which makes many pages highly similar to one another. Search engines will then delete the pages they consider duplicates. For pages that are of little value to search, you can write a robots.txt file to block search engines from crawling them, which improves the overall quality of the content the site gets indexed for.
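As a rough sketch, a robots.txt blocking low-value sections might look like the following (the directory names here are only placeholder assumptions; substitute the thin-content paths on your own site):

```
User-agent: *
Disallow: /tag/
Disallow: /search/
Disallow: /print/
```

The file must sit at the root of the domain (e.g. example.com/robots.txt) for crawlers to find it.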
3. Scraped articles: does your site carry too much duplicate content because it was scraped from elsewhere, so that the duplicate pages get deleted? If you run a low-effort aggregation site, you obviously can't insist on writing everything originally, but it is advisable to sort through the collected content and edit it before publishing.
4. Excessive SEO and cheating: does your site over-optimize or use black-hat tricks (hidden text, keyword stuffing, deceptive redirects, etc.)? Don't count on shortcuts. Doing SEO is like being a person: stay down to earth.
5. Unhealthy content: does your website contain political, sensational, or reactionary material? These topics are risky enough offline, let alone on the internet. It is best to keep such sensitive subjects off your own website entirely.
OP, your site's indexed pages have dropped to 0. Check whether your website has any of the five problems above!