Index Search Engine job NEVER ends

When I go into System and Maintenance (/dashboard/system/jobs/), the Index Search Engine job just keeps running and running. My site is pretty small too; is there a way to reset it or something?

This has been going on for days and days, and it says the same thing every time I check it.

BTW - the Generate Sitemap File and Process Email Posts jobs both run and end normally.

dibbc
 
nteaviation replied on at Permalink Reply
jordanlev replied on at Permalink Reply
It's probably an error that isn't being shown to you (as opposed to the process actually running non-stop). This happened to me recently: I changed the name of a page attribute, and the indexing job got confused because it was looking for the old attribute name. What helped me track it down was going to the Dashboard System and Maintenance page, copying the URL at the bottom, pasting it into the browser address bar, and hitting Enter. An error message showed up that helped me figure it out.
dibbc replied on at Permalink Reply
When I pasted the URL I got this:

Fatal error: Call to a member function getInstance() on a non-object in /home/therap3/public_html/dibbvids/updates/concrete5.4.1.1/concrete/libraries/database_indexed_search.php on line 137

Not really sure where to go from here with this.
jordanlev replied on at Permalink Reply
Is this a live site, or just something you're in the middle of developing?

I *think* you can fix the problem by going into your server's phpmyadmin and running these two queries:
TRUNCATE TABLE CollectionSearchIndexAttributes

and
TRUNCATE TABLE PageSearchIndex


If it's just a development site then go ahead and try that and then re-run the search index job. If it's a live site, I'm not 100% sure if that's a safe thing to do (I am pretty sure it's okay, as it seems that re-running the index job fills up those tables with data again) -- might be best to get another opinion on it in that case.

-Jordan
senshidigital replied on at Permalink Best Answer Reply
This worked for me:

First, go into your SQL database, find the 'Jobs' table, and set jStatus = 'ENABLED' for the stuck job.
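Something like this query should do it (I'm assuming the stuck job's handle in the jHandle column is 'index_search' -- worth double-checking the handle and column names in your own Jobs table first):

-- assumes the index job's handle is 'index_search'; check your Jobs table first
UPDATE Jobs SET jStatus = 'ENABLED' WHERE jHandle = 'index_search';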

Then the file:

/concrete/libraries/database_indexed_search.php

Copy this into your top level libraries folder and change the following code:

$bi = $b->getInstance();   // getInstance() is called on $b before the is_object() check, which is what triggers the fatal error
if (!is_object($b)) {
   continue;
}


to this:

if (!is_object($b)) {
   // skip this block record if it isn't a valid object
   continue;
} else {
   // only call getInstance() once we know $b is an object
   $bi = $b->getInstance();
}


This should stop it happening again.
rbnz replied on at Permalink Reply
Wow. Thanks! Saved my ass!

Err, what did that bit of code do anyway?
invision replied on at Permalink Reply
Whew. Six months of pulling my hair out with this issue... almost out of hair. Now it works beautifully.

Thanks, Dojo. Nice work.

Why wouldn't they update the core with this fix?
senshidigital replied on at Permalink Reply
They may do.

One thing I have noticed, though, is that if your C5 has been updated and is not a 'clean install', the above sometimes does not work.
jordanlev replied on at Permalink Reply
I'm pretty sure this has been fixed in the core code (so will be addressed in the next release).
invision replied on at Permalink Reply
Any ETA on the next release?
foster replied on at Permalink Reply
jStatus = ENABLED was just what I needed. Thanks!
Steevb replied on at Permalink Reply
I've tried everything that has been suggested, but it still would NOT work!

I ended up doing a COMPLETE re-install?

Won't help you, but just saying.......

Bloody annoying!
senshidigital replied on at Permalink Reply
That's strange. The above worked like a charm for me.

Sorry to hear.
boomgraphics replied on at Permalink Reply
I had this issue as well. Maybe this should be filed as a bug? I know it currently seems to be a PHP memory limitation, but surely the indexing code can be optimized for larger sites?
rbnz replied on at Permalink Reply
It didn't seem to be a memory issue for me... I had about 50 pages and a PHP memory limit of 256MB.
boomgraphics replied on at Permalink Reply
Maybe I got memory and timeout confused... :-) I'm still learning.

Is this the same sort of issue the core team was having with their WordPress import script (watch their most recent weblog video), which they are now rewriting so timeouts don't occur?
ScottC replied on at Permalink Reply
Yeah, not sure. I gave them the script; not sure how much of that will end up in the released version.

--
Scott Conrad
ScottC replied on at Permalink Reply
It is usually a max execution timeout or something like that, so the job runs, dies, and the database table never gets updated to say that the job was completed. Your processes, regardless of what the dashboard says, are generally limited to your script timeout or max_execution_time; I believe one or both of those are 30 seconds by default based on php.ini.
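For what it's worth, here's a rough sketch of how you can see and (if your host allows it) raise that limit from a test script; whether set_time_limit()/ini_set() actually take effect depends on the hosting setup:

<?php
// max_execution_time is commonly 30 seconds by default (from php.ini).
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";

// Try to give this request more time -- silently ignored on some shared hosts.
set_time_limit(300);
ini_set('max_execution_time', '300');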
fastcrash replied on at Permalink Reply
So what is the best value to set max_execution_time to?
I got this error too; the spinner just runs and runs and runs (reminds me of the One Piece OST).

This is the error I get when I try Jordan's suggestion:

Request Timeout
This request takes too long to process, it is timed out by the server. If it should not be timed out, please contact administrator of this web site to increase 'Connection Timeout'.

This is just $2/month hosting crap :), so no access to php.ini.

I tried
set_time_limit(900);

but I still get the error.
BTW, can we turn it off? The spinner? It's a never-ending story :)

What is the benefit of running this job anyway?
jordanlev replied on at Permalink Reply
If you set the time limit to 900 seconds, that is a LOT of time. Nothing should take that long :)

I think maybe the setting is not working. If you look at the PHP manual (http://php.net/manual/en/function.set-time-limit.php... ) it says that the set_time_limit() function does not work if PHP is in "safe mode". Perhaps your host has safe mode turned on?
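A little test script like this would tell you (just a sketch; per that manual page, set_time_limit() is silently ignored when safe mode is on):

<?php
// An empty string or "0" here means safe mode is off.
var_dump(ini_get('safe_mode'));

if (!ini_get('safe_mode')) {
    set_time_limit(900);   // only takes effect when safe mode is off
}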
fastcrash replied on at Permalink Reply
This is the result when I echo "safe_mode : " . ini_get('safe_mode');

Result: safe_mode : 0 // so it is off, isn't it?


The hosting guy added this to my .htaccess:
php_value memory_limit 100M
php_value max_execution_time 300

I don't know if that actually worked; after I set the job back to 'ENABLED' and ran it again, I still got the error.
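One way to check whether those .htaccess values are actually being applied is to upload a tiny test script to the same folder and load it in the browser (a sketch; the expected values assume the overrides above are in effect):

<?php
// Should print 300 and 100M if the .htaccess overrides are applied.
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";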

So I think it's better to set the job status to 'ENABLED' again like dojo said, to stop the spinner running.
BTW, when I tried to change the code dojo suggested, the code was different.
I cannot find this in database_indexed_search.php:
$bi = $b->getInstance();
if (!is_object($b)) {
   continue;
}
Steevb replied on at Permalink Reply
For me it was a server issue.

So it was easier and quicker just to move folders and re-install.

Total time = 32 mins