Fixed Search index batch processing potentially bugged

DragonByte Tech

Affected version: 2.0.0 Beta 8
I fully admit this may be me reading the code wrong, but the code in XF\Job\SearchIndex looks wrong to me.

If I run this code:
PHP:
\XF::app()->jobManager()->enqueue('XF:SearchIndex', [
    'content_type' => 'post',
    'content_ids' => $postIds
]);

Where $postIds is an array of, let's say, 10,000 IDs, won't this block of code...
PHP:
        do
        {
            $batch = array_slice($contentIds, 0, $batchSize);
            $contentIds = array_slice($contentIds, count($batch));

            $search->indexByIds($this->data['content_type'], $this->data['content_ids']);

            if (microtime(true) - $s >= $maxRunTime)
            {
                break;
            }
        }
        while ($contentIds);
...run all 10,000?

My reasoning is that nothing inside the loop ever modifies $this->data['content_ids'], so if it originally contains 10,000 entries, it still contains all 10,000 by the time the $search->indexByIds() line runs, on the first iteration and on every one after it.
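
Here's a quick standalone illustration of what I mean (this is not the actual job code; I am assuming $contentIds is set up from $this->data['content_ids'] before the loop, which is how it reads to me):
PHP:
// Assumed setup, mirroring the job: the loop works on a local copy of the IDs
$data = ['content_ids' => range(1, 10000)];
$contentIds = $data['content_ids'];

// First iteration of the loop
$batch = array_slice($contentIds, 0, 500);              // 500 IDs to index now
$contentIds = array_slice($contentIds, count($batch));  // 9,500 IDs left locally

// array_slice() never modifies its input and $data['content_ids'] is never
// reassigned, so it still holds all 10,000 IDs when indexByIds() is called
var_dump(count($batch));                // int(500)
var_dump(count($contentIds));           // int(9500)
var_dump(count($data['content_ids']));  // int(10000)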

To me, it looks like the second argument to $search->indexByIds() should be $batch since that is the subset of IDs that should be processed at this point.
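
In other words, I would expect something along these lines (just a sketch of the quoted loop with that one argument changed, keeping the same variable names):
PHP:
        do
        {
            $batch = array_slice($contentIds, 0, $batchSize);
            $contentIds = array_slice($contentIds, count($batch));

            // index only the current batch, not the full original ID list
            $search->indexByIds($this->data['content_type'], $batch);

            if (microtime(true) - $s >= $maxRunTime)
            {
                break;
            }
        }
        while ($contentIds);
That way each pass only indexes $batchSize IDs, and the time limit check actually limits how much work is done before the job resumes.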

I have not tested this myself, so please forgive me if I am wrong. I only noticed it because I am building an application that uses a similar kind of batch process, and I am using the search indexer, including this job, as a starting point.


Fillip
 
I agree. It's actually processing everything in one go rather than in the defined batch size, and it isn't adhering to the job time limit either. We've fixed this for the next release; thank you, and good catch!
 