I figured it out, https:// does not work, it only works in http:// mode
Hope someone can help. Everything works hunky-dory, but after signing up with Google Webmaster Tools and submitting the main XML sitemap, I get an instant warning saying all the URLs are basically blocked by robots.txt.
As I haven't blocked a thing with robots.txt, something odd is going on, and I don't know if it's something I've done or a glitch with Webmaster Tools?!
Hi a_berezin. It is:
http://thephotosite.com
Just to add, I have submitted my sitemap to Bing Webmaster Tools, and they don't seem to give any warnings or note any problems.
Last edited by Zarathustra; 18 Oct 2012 at 03:33 PM.
Thanks for the reply. That was my robots.txt during one of my tests to see if any changes would make a difference, but whether I simply allow everything, disallow specific folders, or remove robots.txt altogether, I keep getting this message from Google. It may be that I have to wait 24 hours for the robots.txt to refresh, but if I check 'Health' and 'Blocked URLs' it says robots.txt has never been downloaded by Google, so it shouldn't really be having any effect at all.
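For reference, a minimal robots.txt that blocks nothing at all looks like this (assuming it sits at the site root, e.g. /robots.txt):

Code:
User-agent: *
Disallow:

Note that an empty Disallow line allows everything, whereas "Disallow: /" blocks the whole site; if Google reports all URLs blocked despite a file like the above (or no file at all), the problem usually isn't the robots.txt content itself.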
Found 2 new warnings; could you please help me fix them? Thanks a lot.
Code:
PHP Warning: filesize(): stat failed for /public_html/sitemap/sitemap.xml in /public_html/includes/classes/sitemapxml.php on line 594
PHP Warning: file_get_contents(/public_html/sitemap/sitemap.xml): failed to open stream: No such file or directory in /public_html/includes/classes/sitemapxml.php on line 596
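Both warnings say the same thing: /public_html/sitemap/sitemap.xml doesn't exist when sitemapxml.php tries to stat and read it, so either the sitemap was never generated or the configured path is wrong. A hedged sketch of the kind of guard that avoids the warnings (readSitemap and $path are illustrative names, not the actual code in sitemapxml.php):

```php
<?php
// Hypothetical helper: read the sitemap only if the file actually exists.
// If it doesn't (not generated yet, or wrong path), return null instead of
// letting filesize()/file_get_contents() emit the warnings above.
function readSitemap(string $path): ?string
{
    if (!is_file($path)) {
        // Sitemap missing: regenerate it, or fix the path configured
        // in includes/classes/sitemapxml.php.
        return null;
    }
    return file_get_contents($path);
}
```

The real fix is to regenerate the sitemap so /public_html/sitemap/sitemap.xml exists (and check the web server can write to that directory); the guard just keeps the warnings out of the log in the meantime.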