So, give Google some time to crawl the whole website/product pages...
Ok, this is a screenshot for Bing and Yahoo!
https://www.galaxyhomedecor.us/images/bing-yahoo.png
Can you tell me why they won't index these?
Have you tried actually clicking on the link and seeing what it says?
I am assuming this is where you wind up https://www.bing.com/webmasters/help...lines-30fba23a
It is pretty much step by step.
Certainly following those steps should help.
I also had a problem, but I resolved it today.
Here's how to do it:
Go to Admin->Configuration->Sessions
Change Force Cookie Use from "True" to "False"
Recently, Googlebot has been crawling without cookies.
If it is forced to use a cookie, it gets redirected to the cookie_usage page.
As soon as I set this setting to "False", Googlebot started crawling again.
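For anyone curious why that setting matters, here is a simplified sketch of the kind of check Zen Cart's session initialization performs (the constant and function names follow Zen Cart conventions, but this is not the exact core code):

// Simplified sketch only; not the actual Zen Cart core code.
if (SESSION_FORCE_COOKIE_USE == 'True' && !isset($_COOKIE['zenid'])) {
    // Crawlers like Googlebot don't send cookies, so they fail this check
    // and get bounced to the cookie-usage page instead of the product
    // pages the sitemap points to.
    zen_redirect(zen_href_link(FILENAME_COOKIE_USAGE));
}
// With Force Cookie Use set to 'False', the redirect never fires and the
// crawler reaches the real pages.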
Nice detective work! I've created a GitHub issue (https://github.com/lat9/sitemapxml/issues/45) and will update the admin tool to display a message identifying the condition, if found.
v4.0.2 of SitemapXML is now available for download: https://www.zen-cart.com/downloads.php?do=file&id=367
This release corrects the following GitHub issues:
#42: Correct reverse-logic when determining if Sitemap 'Execution Token' is correct.
#45: Issue warning message in admin tool if Configuration :: Sessions :: Force Cookie Use is found to be 'True'; search-engine crawlers won't be able to index the site.
#46: Correct PHP short-tag usage, e.g. <? should be <?=.
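For context on issue #46, here is a hypothetical before/after showing the kind of change involved ($loc is an illustrative variable, not the plugin's actual code):

// Before: bare short open tag; even with short_open_tag enabled this
// evaluates $loc but prints nothing, and with it disabled the tag is
// emitted as literal text in the sitemap output.
<? $loc ?>

// After: the short echo tag, equivalent to <?php echo $loc; ?> and
// always available from PHP 5.4 onward regardless of short_open_tag.
<?= $loc ?>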
So, does this issue with Google not crawling while Force Cookie Use is set to true only occur because this plugin is installed? Put another way, why is this warning message only being applied to this plugin?
No, and good point. Zen Cart GitHub issue opened: https://github.com/zencart/zencart/issues/6554
Running Zen Cart 1.5.6c, how do I update this module to get rid of the deprecated pings when using Sitemap 3.96?