Sitemap.xml was deleted every other day

I used a third-party service to generate the sitemap.xml file for my site, submitted it to Google, etc. Two days later the file is blank again. Dolphin somehow wipes out the sitemap.xml file and replaces it with a blank one every other day.

I also uploaded another sitemap.xml to a subdomain and that one stayed intact, but that subdomain isn't running Dolphin.

Quote · 13 Sep 2013

Dolphin 7.1 has its own built-in sitemap generator.

Admin -> Tools -> Sitemap

It should not be empty unless it's not functioning properly.

However, if you want to use your own, then disable the one Dolphin generates. If it still happens when that setting is off, then try setting the permissions on the sitemap.xml file so it is read-only, not writable.
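A minimal sketch of that permission change from a shell, assuming the sitemap sits in your web root (the path and filenames below are illustrative, not taken from this thread):

```shell
# Make sitemap.xml read-only so Dolphin (running as the web user)
# cannot overwrite it. On a live site, skip the touch and just chmod
# your existing file.
touch sitemap.xml            # stand-in for your real file
chmod 444 sitemap.xml        # r--r--r--: readable by everyone, writable by no one
stat -c '%a' sitemap.xml     # prints 444 to confirm
```

To undo it later, `chmod 644 sitemap.xml` restores the usual owner-writable mode.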

https://www.deanbassett.com
Quote · 14 Sep 2013

Just checked; the option is disabled. I tried using it to generate a sitemap and the result looks odd, just a few lines without any formatting.

I will use the last option now: set the file read-only and generate my own sitemap.xml using a third-party app.

Quote · 14 Sep 2013

Dolphin's generated sitemap only contains a few lines that reference other sitemap files in the cache_public folder. So there is a lot more to it than what you're seeing.


https://www.deanbassett.com
Quote · 14 Sep 2013

Ah, thank you for the information.

For some reason, my traffic drops by about 50% when my sitemap is not available to Google.

Quote · 14 Sep 2013

My sitemap has never worked for me. This is the error message I get every time I try to generate the sitemap:


Fatal error: Maximum execution time of 30 seconds exceeded in /home/your-root/public_html/inc/classes/BxDolSiteMaps.php on line 196

Quote · 14 Sep 2013

Your server has the maximum execution time for scripts set at 30 seconds, and it takes longer than that for the sitemap to generate. Change the max execution time in your master php.ini.
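For reference, the directive in question looks like this in php.ini (the 300-second value below is just an illustrative choice, not a recommendation from this thread); restart your web server or PHP-FPM afterwards for it to take effect:

```ini
; php.ini -- allow long-running scripts such as sitemap generation
max_execution_time = 300
```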

 

My sitemap has never worked for me. This is the error message I get every time I try to generate the sitemap:


Fatal error: Maximum execution time of 30 seconds exceeded in /home/your-root/public_html/inc/classes/BxDolSiteMaps.php on line 196

 

BoonEx Certified Host: Zarconia.net - Fully Supported Shared and Dedicated for Dolphin
Quote · 15 Sep 2013

I contacted my hosting provider and they raised the max execution time to the maximum, but it is still giving that error message. Because of that, Alexa and Google are not registering the site's traffic. For example, I have more than 100 people joining the site daily, but Alexa does not report them. In the past my Alexa Traffic Rank was 949,412 (Traffic Rank in US: 85,687); now it is 2,564,169 (Traffic Rank in US: 1,254,903). I don't think the problem is the maximum execution time for scripts; besides, it did not take 30 seconds to give that error, it took about 6 to 10 seconds.

Your server has the maximum execution time for scripts set at 30 seconds, and it takes longer than that for the sitemap to generate. Change the max execution time in your master php.ini.

 

My sitemap has never worked for me. This is the error message I get every time I try to generate the sitemap:


Fatal error: Maximum execution time of 30 seconds exceeded in /home/your-root/public_html/inc/classes/BxDolSiteMaps.php on line 196

 

 

Quote · 15 Sep 2013

Can you open Administration -> Tools -> Host Tools -> phpinfo and post the value for max_execution_time?

BoonEx Certified Host: Zarconia.net - Fully Supported Shared and Dedicated for Dolphin
Quote · 15 Sep 2013
  • PHP: 5.3.25 - OK
    • allow_url_fopen = On - OK
    • allow_url_include = Off - OK
    • magic_quotes_gpc = Off - OK
    • memory_limit = 536870912 - OK
    • post_max_size = 268435456 - OK
    • upload_max_filesize = 268435456 - OK
    • register_globals = Off - OK
    • safe_mode = Off - OK
    • short_open_tag = On - OK
    • disable_functions = - OK
    • php module: curl = curl - OK
    • php module: gd = gd - OK
    • php module: mbstring = mbstring - OK
    • php module: xsl = xsl - OK
    • php module: json = json - OK
    • php module: openssl = openssl - OK
    • php module: zip = zip - OK
    • php module: ftp = ftp - OK
  • MySQL: 5.1.70-cll - OK
  • Web-server: Apache/2.2.24 (Unix) mod_ssl/2.2.24 OpenSSL/1.0.0-fips DAV/2 mod_bwlimited/1.4
    • rewrite_module - OK
  • OS: Linux server2.bouchesocial.com 2.6.32-358.6.2.el6.x86_64 #1 SMP Thu May 16 20:59:36 UTC 2013 x86_64

Site optimization

  • PHP:
    • PHP accelerator = eAccelerator - OK
    • PHP setup = cgi-fcgi - OK
  • MySQL:
    • key_buffer_size = 8384512 - OK
    • query_cache_limit = 1048576 - OK
    • query_cache_size = 16777216 - OK
    • query_cache_type = ON - OK
    • max_heap_table_size = 16777216 - OK
    • tmp_table_size = 16777216 - OK
    • thread_cache_size = 1 - OK
Quote · 15 Sep 2013

Even if the sitemap was not generated when you enabled it, it should still be generated on the cron run; there is no time limit there, so it should generate without problems. Try checking the sitemap the day after you enable the setting.
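For context, Dolphin's background tasks (including sitemap regeneration) run from a single system cron entry; a typical crontab line looks roughly like the following, though the exact paths and PHP binary location are assumptions about your particular install:

```crontab
# Run Dolphin's periodic tasks every minute (paths are illustrative)
* * * * * cd /home/youruser/public_html/periodic; php -q cron.php
```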

My sitemap has never worked for me. This is the error message I get every time I try to generate the sitemap:


Fatal error: Maximum execution time of 30 seconds exceeded in /home/your-root/public_html/inc/classes/BxDolSiteMaps.php on line 196

 

Rules → http://www.boonex.com/terms
Quote · 18 Sep 2013

My sitemap is not generated, but I think it's a conflict with eAccelerator. The following error appears in the logs:

 

] [notice] EACCELERATOR(3913): PHP crashed on opline 33 of bx_sys_security_get_impact_threshold() at /home/tulugarf/public_html/social/inc/security.inc.php:123

Quote · 30 Nov 2013

 

I used a third-party service to generate the sitemap.xml file for my site, submitted it to Google, etc. Two days later the file is blank again. Dolphin somehow wipes out the sitemap.xml file and replaces it with a blank one every other day.

I've now been having this exact problem for a few weeks, and I've tried everything. My last thought is this: I did try the new Dolphin built-in sitemap but then disabled it. When I set up the sitemap, did it create a cron job? If so, even though I've disabled it, is there a chance it is still running a cron job, and that is what is wiping out my main sitemap.xml index file? If it is, how do I turn off the cron job? I really need help on this, please; I've gone from the top of page 1 on Google down to page 5 and I'm still dropping fast!

Quote · 24 Jan 2014

OK, it almost certainly looks like it's the Dolphin sitemap generator that is deleting the contents of my sitemap files. Although I've disabled it, it appears it still runs, and when disabled it is told by the file BxDolSiteMaps.php to delete the files. How can I stop the Dolphin sitemap generator from running? Do I have to delete the file BxDolSiteMaps.php, or delete everything in the database under sys_objects_site_maps? Will doing either of these affect other parts of my site?

As you can probably tell from reading this, I haven't got a clue what I'm doing and would really appreciate any help or advice on this matter as quickly as possible, before my site disappears from Google completely.

Quote · 25 Jan 2014

I quickly checked, and Dolphin does seem to overwrite the sitemap.xml file even though the "enable sitemap generator" setting is off in admin. Off or on, the cron job still runs.

For now you can change the permissions on sitemap.xml as Deano92964 stated above, or go into the database table sys_cron_jobs and edit the sitemap entry. I'm not sure what sitemap service you are using; is it auto-updated or manual?

Quote · 25 Jan 2014

Thanks for the reply. I've changed the permissions to read-only, so hopefully that will stop the sitemap.xml file being emptied. The problem for the future is that I cannot use the sitemap generator I have been using, because it's fully automated and needs to be able to write to the sitemap.xml file. So I really must sort out how to stop the Dolphin sitemap from running.

this is what i've been using: http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html 

Quote · 25 Jan 2014

 

database --  sys_cron_jobs  and edit sitemap

I just had a look. Can I simply delete the sitemap entry from sys_cron_jobs to stop it running, or will this create other problems?

Quote · 25 Jan 2014

If the sitemap generation is automated, then the permission change will not work.

 

database --  sys_cron_jobs  and edit sitemap

I just had a look. Can I simply delete the sitemap entry from sys_cron_jobs to stop it running, or will this create other problems?

Not sure, I haven't really looked at the sitemap functionality. I wouldn't delete it, as you might want to use Dolphin's sitemap generator later on. I will try to look at it more if I find a few minutes, but for now put the permissions back on the file and change the time in sys_cron_jobs for sitemap to 0 0 1 1 *, which I believe will run once a year, on January 1.
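A sketch of that schedule change as SQL, assuming the sys_cron_jobs columns are named `name` and `time` (those names are guesses; verify the actual column and row names in phpMyAdmin before running anything):

```sql
-- Reschedule Dolphin's sitemap cron job to once a year (00:00 on 1 January).
-- Column and row names here are assumptions; inspect the table first.
UPDATE sys_cron_jobs
SET `time` = '0 0 1 1 *'
WHERE `name` = 'sitemap';
```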

Quote · 25 Jan 2014

@rhimpr OK, I'll do what you suggest. Many thanks for all your help, you're a star.

Quote · 25 Jan 2014

Change the permissions on the sitemap and make it read-only. Then Dolphin can't overwrite it.

https://www.deanbassett.com
Quote · 25 Jan 2014
 
 