Why is Google search showing "A description for this result is not available because of this site's robots.txt" for my site?
  • Issue
    When searching for your domain on Google, the result shows a message reading: "A description for this result is not available because of this site's robots.txt"
  • Cause
    You are using the *.3dcartstores.com URL as your main URL

Your store typically uses the shared *.3dcartstores.com URL as a placeholder web address so you can work on your store while your real domain name's DNS is being transferred over. Then, after the domain is transferred, the shared URL is used as the store's secure (https://) URL for when the site goes into SSL mode.

Because the shared *.3dcartstores.com URL is used for the store's secure pages, it uses the secure SSL robots.txt file that prevents search engines from accessing and indexing secure portions of the site. Therefore, if you are still using the *.3dcartstores.com URL as your main store URL, the robots.txt file will prevent search engines from indexing the entire site and generate the "A description for this result is not available..." message.
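
For illustration, here is a minimal Python sketch (not part of 3dcart; example.3dcartstores.com is a placeholder address) that parses the same kind of blanket rule found in the shared SSL robots file and shows that it blocks every page for every crawler, which is why Google cannot retrieve a description:

from urllib.robotparser import RobotFileParser

# Rules equivalent to the shared SSL robots_ssl.txt described in this article:
# block every path for every crawler.
ssl_rules = """
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ssl_rules.splitlines())

# example.3dcartstores.com is a placeholder store address. Both the home page
# and a product page are blocked, so Google cannot fetch a description for
# any page on the site.
print(parser.can_fetch("Googlebot", "https://example.3dcartstores.com/"))        # False
print(parser.can_fetch("Googlebot", "https://example.3dcartstores.com/widget"))  # False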

To correct this, simply update your store's main URL so that it uses your real domain name rather than the shared URL. Here's how:

  1. Log into your 3dcart Online Store Manager
  2. Using the left-hand navigation menu, go to Settings -> General -> Store Settings
  3. Under "Store Information" locate the field labeled "Store URL"
  4. Update this field to your real domain name
    (Be sure to include the http:// and www prefixes as applicable)
  5. Click "Save" at the top right to commit your changes.

After saving, your store will use your main domain as its primary URL and will be properly indexed by search engines.
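
If you would like to verify the change, the optional Python sketch below (www.your-domain.com is a placeholder for your real Store URL) fetches the robots.txt file that search engines will now see from your main URL and warns if it still contains the blanket "Disallow: /" rule:

from urllib.request import urlopen

STORE_URL = "http://www.your-domain.com"  # placeholder: replace with your actual Store URL

# Fetch the robots.txt served from the main store URL.
with urlopen(STORE_URL + "/robots.txt") as response:
    robots_txt = response.read().decode("utf-8", errors="replace")

print(robots_txt)

# The regular file should only disallow cart/checkout pages; a bare
# "Disallow: /" line means crawlers are still blocked from the whole site.
if any(line.strip().lower() == "disallow: /" for line in robots_txt.splitlines()):
    print("Warning: this robots.txt still blocks the entire site.")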

The secure URL may be left as the *.3dcartstores.com URL if needed. However, if you have a dedicated SSL certificate installed for your site, you can also change the secure URL to your own domain (see the next section for information on this).


Using the domain as the secure URL

As mentioned, if you want to use your real domain name as both the store URL and the secure URL, you will need a custom SSL certificate. Please click here for more information on purchasing a dedicated SSL certificate.

Also, when using the domain as both the store URL and the secure URL, you will need to update the site's robots.txt file so that search engines index the site properly. To do this, please take the following steps:

  1. Log into your 3dcart Online Store Manager
  2. Using the left-hand navigation menu, go to Marketing -> SEO Tools
  3. Locate and click on the link labeled "Edit Robots.txt File"

This page will have two distinct areas. Within the top half of the page, you will see the Robots.txt section containing your store's regular robots.txt file. It should look like this:

Sitemap: http://[store-url]/sitemap.xml

# Disallow all crawlers access to certain pages.
User-agent: *
Disallow: /checkout.asp
Disallow: /add_cart.asp
Disallow: /view_cart.asp
Disallow: /error.asp
Disallow: /shipquote.asp
Disallow: /rssfeed.asp
Disallow: /mobile/

Note
If the robots.txt file does not look like the above, you may click on the "Restore Default Robots.txt" link along the bottom of the window to revert it to default.

Within the bottom half of the page, you will see the Shared SSL .3dcartstores.com Robots.txt section containing your store's robots_ssl.txt file. It should look like this:

# Disallow all crawlers access to all pages. SSL
User-agent: *
Disallow: /

  4. Copy the content from the robots.txt section (top) and paste it into the robots_ssl.txt section (bottom)
  5. Click "Save" at the top right to commit your changes.

This will allow search engines to index your site properly.

Additional Information
The robots.txt file will still prevent indexing of certain pages like checkout.asp, add_cart.asp, view_cart.asp and others. This is because these are pages that require actions taken by real visitors (such as someone physically clicking the add to cart button on a specific product).

In other words, these are actions that cannot be performed by a bot and would result in an error if the page were accessed directly during indexing. To prevent these errors from being indexed, we disallow access to these specific pages.
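
As a quick illustration, the following Python sketch (www.your-domain.com and my-product.html are placeholders) parses the default rules shown above and confirms that ordinary pages remain crawlable while the action-only pages stay blocked:

from urllib.robotparser import RobotFileParser

# The default rule set shown earlier in this article.
default_rules = """
User-agent: *
Disallow: /checkout.asp
Disallow: /add_cart.asp
Disallow: /view_cart.asp
Disallow: /error.asp
Disallow: /shipquote.asp
Disallow: /rssfeed.asp
Disallow: /mobile/
"""

parser = RobotFileParser()
parser.parse(default_rules.splitlines())

# www.your-domain.com and my-product.html are placeholders.
# Ordinary catalog pages are crawlable...
print(parser.can_fetch("Googlebot", "http://www.your-domain.com/my-product.html"))  # True
# ...but action-only pages remain blocked.
print(parser.can_fetch("Googlebot", "http://www.your-domain.com/checkout.asp"))     # False
print(parser.can_fetch("Googlebot", "http://www.your-domain.com/add_cart.asp"))     # False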


Making the whole site HTTPS

An optional and somewhat common configuration is to set the entire site to use HTTPS. If you are interested in doing this for your site, please click here for more information.

 

