Why is robots.txt blocking Googlebot from indexing site's images?
  • Issue:
    Google Merchant Center reports that it cannot correctly index images on my 3dcart store.
  • Cause:
    The robots.txt file used for the HTTPS version of the site is written to block Googlebot access.

When you generate a feed from your store for Google Product Listing Ads, the URLs used for images may follow your secure HTTPS path. However, the SSL robots.txt file used by your site is written to prevent bot access to the secure URL, which prevents the images from being indexed properly.

To correct this, you have two options available to you:

Option 1:
Manually update the feed (after it is generated) to use your store's main URL instead of the secure URL. Googlebot will then index images using your store's base URL as the path rather than the secure URL.
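The rewrite in Option 1 is a simple find-and-replace on the generated feed. As a minimal sketch, the snippet below swaps a secure image path for a store's base URL; both domain names are hypothetical placeholders, so substitute your actual shared secure URL and your store's main URL.

```python
# Placeholder URLs for illustration only -- replace with your store's
# actual shared secure URL and main store URL.
SECURE_URL = "https://secure.shared-ssl.example.com/mystore"
MAIN_URL = "http://www.mystore.example.com"

def rewrite_feed(feed_text: str) -> str:
    """Replace the secure image path with the store's base URL."""
    return feed_text.replace(SECURE_URL, MAIN_URL)

# Example feed fragment using the hypothetical secure path.
feed = "<image_link>https://secure.shared-ssl.example.com/mystore/assets/images/shirt.jpg</image_link>"
print(rewrite_feed(feed))
```

Running the same replacement over the whole feed file before uploading it to Google Merchant Center has the same effect as editing the URLs by hand.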

Option 2:
Purchase and install a dedicated SSL certificate. Your store's secure URL will then match the main URL, and Googlebot will be able to access the images.

If you already have a dedicated SSL certificate and Googlebot is still having trouble accessing the images, log into your 3dcart Online Store Manager, copy the regular robots.txt, and paste it into the Shared SSL robots.txt field. This allows the bot to access the images over either HTTP or HTTPS.

If you are still using the shared * URL for your store's secure URL, do not edit your robots.txt in this fashion, since doing so will likely cause duplicate content to be reported. Instead, edit the shared SSL robots.txt file so that it allows Googlebot access to the images directory, as shown here.
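The original illustration is not reproduced in this copy of the article. As a sketch, a shared SSL robots.txt that keeps bots out of the secure site generally but lets Googlebot fetch images might look like the following; the /assets/images/ path is an assumption, so use your store's actual image directory. Googlebot follows the most specific matching rule, so the Allow line takes precedence over the broader Disallow.

```
User-agent: Googlebot
Allow: /assets/images/
Disallow: /

User-agent: *
Disallow: /
```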
