# robots.txt
# If your user agent ignores these directives, there is a good chance that your IP will be blocked

User-agent: *
Sitemap: https://www2.gov.bc.ca/sitemap/sitemap_index.xml
Disallow: /zzz/
Disallow: /assets/gov/zzzz*
Disallow: /assets/gov/taxes/property-taxes/annual-property-tax/municipality-regional-district-resources/
Disallow: /en/search/
Disallow: /local/gov/media/carousel/images.xml
Disallow: /gov/search/results.page?*
Disallow: /gov/search/results.page?
Disallow: /gov/search*
Disallow: /search*
Disallow: /meia*
Disallow: /bcgov/content/*
Disallow: /gov/content/*/search-details-*
Disallow: /click*
Disallow: /gov/feedback*
Disallow: /gov/search?id=*&q=*
Disallow: */*.swf
Disallow: /gov/DownloadAsset?assetId=
Disallow: /gov/DownloadAsset?*
Disallow: /enSearch/detail*
Disallow: /enSearch/mhdetail*
Disallow: /enSearch/raspdetail*
Disallow: /enSearch/ccdetail*
Disallow: /enSearch/result*
Disallow: /cgi-bin/
Disallow: /query.html*

User-agent: GPTBot
Disallow: /

User-agent: LinkTiger
Disallow: /

User-agent: PaperLiBot
Disallow: /

User-agent: Sogou
Disallow: /