
Best practice for frequently-run, slow query?

Teapot

Well-known member
#1
I'm currently writing an add-on that needs to run a query against a separate (potentially remote) database fairly frequently - every two or three minutes or so. Naturally, this means that if I run the query whenever it's needed (i.e., on every page load), the whole application slows to a crawl.

Normally I would stick the query into a cron job, save the results into the simple cache, and call the job done, but I'm wary of setting a cron job to run every two minutes. Are there any better ways to grab this data and cache it across pageviews for a short time? The database may be relocated to the same server as the forum (or replicated there), but we'd still have the overhead of opening a connection to a completely separate database and doing some JOIN magic, which is still slow.
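One pattern I've been considering is lazy expiry: cache the result with a timestamp and only re-run the query on the first pageview after the TTL lapses, so no cron is needed at all. A minimal sketch - fetchIrcUsers(), the cache path, and the 120-second TTL are all placeholders, not real XenForo API:

```php
<?php
// Hypothetical time-gated cache: re-run the slow query only when the
// cached copy is older than $ttl seconds; otherwise serve the cache.
// fetchIrcUsers() stands in for the real (possibly remote) query.
function getCachedIrcUsers($ttl = 120, $cacheFile = '/tmp/irc_users.cache')
{
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl)
    {
        // Cache is still fresh - skip the expensive query entirely
        return unserialize(file_get_contents($cacheFile));
    }

    $users = fetchIrcUsers(); // the slow query, paid for at most once per TTL
    file_put_contents($cacheFile, serialize($users), LOCK_EX);
    return $users;
}
```

The trade-off is that one unlucky visitor per TTL window pays for the query, but every other pageview is a cheap file read.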

As always, thanks for reading :)
 

MattW

Well-known member
#2
What sort of data are you querying?

Would you be able to create a summary dataset for the results in a separate database table that you can query from XenForo? We do something similar at work: a Perl script runs the SQL query via cron and puts the results into their own table, which the PHP page then reads.
 

Teapot

Well-known member
#3
MattW said:
What sort of data are you querying?

Would you be able to create a summary dataset for the results in a separate database table that you can query from XenForo? We do something similar at work: a Perl script runs the SQL query via cron and puts the results into their own table, which the PHP page then reads.
I'm compiling a list of users currently online in an IRC channel - this is the query in question:

SQL:
SELECT `user`.nick, `user`.away,
       `ison`.`mode_la`, `ison`.`mode_lg`, `ison`.`mode_lh`,
       `ison`.`mode_lo`, `ison`.`mode_lq`, `ison`.`mode_lv`
FROM `ison`
INNER JOIN `user`
    ON (`user`.nickid = `ison`.nickid)
WHERE `ison`.chanid = 7
  AND `user`.nick NOT IN ('Porygon-Z', 'Pokedex')
ORDER BY nick ASC;
Essentially there are two tables: "ison", with a list of user IDs currently in a channel, and "user", with details of each user (like nick, which is what I'm after). I need to pull out an alphabetised list of those users. Because I'm compiling a list of online users, I want it to be fairly accurate - the database in question is updated in near-realtime - but obviously I'd need to sacrifice a little accuracy for performance.

Creating a summary dataset might be possible, but I'm not wholly sure how to best go about that.

Thanks for your help :)
 

MattW

Well-known member
#4
OK, so based on that, depending on what you're pulling back, you'll have an array of the users currently online?

What I'd try is putting all the results into their own table with a foreach loop, setting a timestamp on each row as it's inserted into that second table. You could then run your XenForo-side query against that new table instead of the realtime one.
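To make that concrete, here's a rough sketch of the cron-side write. The irc_online_summary table and its columns are made-up names, and $rows stands in for the result of your existing SELECT:

```php
<?php
// Hypothetical helper: write one snapshot of the online list into a
// local summary table, stamping every row with the same fetch time.
// Table and column names (irc_online_summary, nick, away, fetched_at)
// are placeholders - adjust to whatever schema you create.
function storeIrcSnapshot(PDO $db, array $rows)
{
    $now = date('Y-m-d H:i:s');

    // Throw away the previous snapshot, then insert the fresh rows
    $db->exec("DELETE FROM irc_online_summary");
    $stmt = $db->prepare(
        "INSERT INTO irc_online_summary (nick, away, fetched_at)
         VALUES (?, ?, ?)"
    );

    foreach ($rows as $row)
    {
        $stmt->execute(array($row['nick'], $row['away'], $now));
    }
}
```

Your XenForo query then only ever touches irc_online_summary, and the fetched_at column tells you how stale the snapshot is.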

Here is a really simple one I did a few years back, used to calculate the number of calls a certain 0800 number was taking per minute, for display on a monitoring page.

PHP:
$today = date("ymd");
$oneminute = date("Y-m-d H:i:s", time()-60);
 
// Calculate all calls in the last 60 seconds
$query = "SELECT ddi FROM inc_$today WHERE time > \"$oneminute\"";
$result = mysqli_query($dbc, $query)
        or die('Error querying the database');
$calls = mysqli_num_rows($result);
 
// Insert the total back into the database for separate graphing
$insert = "INSERT INTO all_inc (total_calls) VALUES ('$calls')";
mysqli_query($dbc, $insert)
        or die('Error inserting data into database');
I then executed that PHP script from a cron job every minute, and could get the updated information without bogging down the larger realtime database.
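For reference, the cron entry was just the stock every-minute form - the paths to the PHP binary and the script are examples, not the real ones:

```shell
# Run the summariser every minute, discarding output
* * * * * /usr/bin/php /var/scripts/call_summary.php >/dev/null 2>&1
```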