Which out-of-the-box Salesforce B2B Commerce page can instruct web crawlers not to access specific Salesforce B2B Commerce pages?
A. CCCatSiteMap
B. cc_RobotsTxt
C. CCSiteIndex
D. CCPage
Answer: B
Explanation:
The out-of-the-box Salesforce B2B Commerce page that can instruct web crawlers not to access specific Salesforce B2B Commerce pages is cc_RobotsTxt. This Visualforce page generates a robots.txt file, a plain-text file that tells web crawlers which pages or files they can and cannot request from a site. The page uses the configuration settings CO.RobotsTxtAllow and CO.RobotsTxtDisallow to specify which paths are allowed or disallowed for web crawlers.
For example, the directives User-agent: * and Disallow: /CCCart instruct all web crawlers not to access the CCCart page.
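As a minimal sketch, the generated robots.txt output might look like the following, assuming the storefront is served from the site root, CO.RobotsTxtAllow contains /, and CO.RobotsTxtDisallow lists /CCCart and /CCOrder (the /CCOrder path is an illustrative placeholder, not a value taken from the guide):

User-agent: *
Allow: /
Disallow: /CCCart
Disallow: /CCOrder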
Salesforce Reference: B2B Commerce and D2C Commerce Developer Guide, Robots.txt File