Prevent indexing
Posted: Wed Sep 30, 2020 5:46 am
Version 3.0.3.2
I have a problem with duplicate search pages being indexed by Google, and pages from the old OpenCart site are also still indexed.
I have set the following rules to prevent indexing:
User-agent: *
Disallow: /*&limit
Disallow: /*?sort
Disallow: /*&sort
Disallow: /*?route=checkout/
Disallow: /*?route=account/
Disallow: /*?route=product/search
Sitemap: https://www.medicalsooq.com/index.php?r ... le_sitemap
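As a side check, wildcard rules like the ones above can be tested locally before deploying. This is a rough sketch, not an official tool: it approximates Googlebot-style `*` matching (robots.txt path matching is case-sensitive and whitespace-sensitive, so the rules below assume the stray spaces in the directives have been removed), and the sample URLs are hypothetical OpenCart paths for illustration only.

```python
import re

def rule_to_regex(path_pattern: str) -> re.Pattern:
    # Approximate Googlebot-style matching: '*' matches any run of
    # characters, '$' anchors the end, and rules match from the
    # start of the path. All other characters are taken literally.
    regex = ""
    for ch in path_pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.compile("^" + regex)

# The Disallow paths from the robots.txt above, with stray spaces removed.
rules = [
    "/*&limit",
    "/*?sort",
    "/*&sort",
    "/*?route=checkout/",
    "/*?route=account/",
    "/*?route=product/search",
]

def is_disallowed(url_path: str) -> bool:
    # A URL is blocked if any Disallow rule matches it.
    return any(rule_to_regex(r).search(url_path) for r in rules)

# Hypothetical OpenCart URLs:
print(is_disallowed("/index.php?route=product/search&search=mask"))  # True
print(is_disallowed("/index.php?route=product/category&path=20"))    # False
```

A check like this only shows whether crawling would be blocked; it does not say anything about pages that are already in the index.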
The Coverage report in Search Console shows a large number of indexed pages caused by these search pages. How can I stop both the old and the new search pages from being indexed by Google and other search engines, and get the pages that are already indexed removed?
I have a huge number of indexed pages, all of them old duplicates and search pages.