Which three scenarios are valid reasons for a customer to create their own robots.txt file?

Salesforce B2B Commerce natively provides a robots.txt file; however, a customer can also create their own version.

Which three scenarios are valid reasons for a customer to create their own robots.txt file? (3 answers)
A. The customer wants to reference multiple storefront sitemap indexes in a single robots.txt file.
B. The customer wants to reference a custom sitemap index.
C. The customer wants to have multiple robots.txt files in a single Salesforce Community.
D. The customer’s store is not located at the root of their domain.
E. robots.txt only works if there is one storefront in the org.

Answer: A, B, D

Explanation:

A customer can create their own robots.txt file for three valid reasons:

The customer wants to reference multiple storefront sitemap indexes in a single robots.txt file. This can be useful if the customer has multiple storefronts under the same domain and wants to provide a single entry point for search engines to crawl their sitemaps.
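
As a rough illustration (the domain and storefront paths below are hypothetical, not values generated by Salesforce), a single robots.txt served from the domain root could point crawlers to each storefront's sitemap index with one Sitemap directive per storefront:

  User-agent: *
  Allow: /

  Sitemap: https://www.example.com/us-store/sitemap.xml
  Sitemap: https://www.example.com/eu-store/sitemap.xml

Search engines read every Sitemap line, so one file can cover all storefronts that share the domain.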

The customer wants to reference a custom sitemap index. This can be useful if the customer has created their own sitemap index that contains custom sitemaps or sitemaps from other sources.

The customer’s store is not located at the root of their domain. This can be useful if the customer hosts their store under a subdirectory or a subdomain and wants to serve a different robots.txt file for the store than for the main domain.
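
As a sketch of the custom sitemap index scenario (the file location and URLs are assumptions for illustration only), a sitemap index is simply an XML file that lists individual sitemaps, so a customer-built index can combine the storefront sitemap with sitemaps from other sources such as a blog:

  <?xml version="1.0" encoding="UTF-8"?>
  <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <sitemap>
      <loc>https://www.example.com/store/sitemap.xml</loc>
    </sitemap>
    <sitemap>
      <loc>https://www.example.com/blog/sitemap.xml</loc>
    </sitemap>
  </sitemapindex>

The custom robots.txt would then reference this index instead of (or in addition to) the one Salesforce generates.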

Salesforce Reference: B2B Commerce and D2C Commerce Developer Guide, Robots.txt File
