
Programming around heavy operations?

Discussion in 'XenForo Development Discussions' started by Jaxel, Mar 5, 2015.

  1. Jaxel

    Jaxel Well-Known Member

    Okay, let's say I have a prompt in my addon, and clicking the prompt starts a bunch of database operations that can take anywhere from 30 to 90 seconds to process. For a lot of people, an operation that takes longer than 30 seconds will time out.

    Does anyone have any ideas on the best way to handle this and present this to the user?
  2. EQnoble

    EQnoble Well-Known Member

    It really depends on the nature of what you are doing. I have done a few heavy operations, and under different conditions, things that were feasible in one situation became unusable in another.

    Two questions:

    Is the data being returned private, or publicly viewable?
    In what context is this information being returned, i.e. what is its purpose?

    Without knowing what you are actually doing, the actual load, and the frequency of the job, the best I can say is: do the back-end work in parallel, if that is appropriate in your circumstances.

    In some instances, rewriting something to run in parallel brought me from 35+ seconds of job completion time down to 3 seconds max, but it really depends on what you are building and how it flows, I suppose.
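    The parallel approach described above can be sketched like this (Python for illustration only; the thread's actual context is PHP/XenForo, and the job function and worker count here are hypothetical stand-ins). The speedup comes from running independent units of work concurrently instead of one after another:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def process_job(job_id):
        # Stand-in for one independent unit of back-end work
        # (e.g. recalculating a single event's results).
        return job_id * 2

    jobs = range(10)  # hypothetical work list

    # Running independent jobs concurrently is what can turn a 35+ second
    # serial run into a few seconds -- provided the jobs don't contend
    # for the same rows or locks.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(process_job, jobs))
    ```

    This only helps when the jobs really are independent; if each one writes to the same rows, lock contention can erase the gain.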
  3. Jaxel

    Jaxel Well-Known Member

    In XenTorneo, I am trying to recalculate the values of every event in my database.

    It will fetch the full table of EVENTS (let's say 1000 items).

    Then for each item, it will query for the RESULTS of each event (varied number of results). It will loop through each of these results, calculating various data. Then it will update each RESULT. Then it will update the EVENT with various data as well.

    If we say, for example, there are 20 results per event, what we have is a large number of queries:

    1 fetch events
    1000 fetch results
    20000 update results
    1000 update events

    This could take an extremely long time.
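    Much of that cost is the classic N+1 query pattern. One way to shrink the query count is to fetch every result in a single query, group them in memory, and write the recalculated values back with batched statements. A minimal sketch in Python/SQLite (table names, columns, and the doubling "recalculation" are all hypothetical, not XenTorneo's actual schema):

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE event (event_id INTEGER PRIMARY KEY, score INTEGER)")
    con.execute("CREATE TABLE result (result_id INTEGER PRIMARY KEY,"
                " event_id INTEGER, points INTEGER)")
    con.executemany("INSERT INTO event VALUES (?, 0)", [(e,) for e in (1, 2)])
    con.executemany("INSERT INTO result VALUES (?, ?, ?)",
                    [(1, 1, 5), (2, 1, 7), (3, 2, 3)])

    # One query instead of 1000: fetch every result, group by event in memory.
    by_event = {}
    for result_id, event_id, points in con.execute(
            "SELECT result_id, event_id, points FROM result"):
        by_event.setdefault(event_id, []).append((result_id, points))

    # Recalculate, then write back with two batched statements instead of
    # 21,000 individual round trips.
    result_updates, event_updates = [], []
    for event_id, rows in by_event.items():
        total = sum(p for _, p in rows)
        result_updates += [(p * 2, rid) for rid, p in rows]  # stand-in recalculation
        event_updates.append((total, event_id))

    con.executemany("UPDATE result SET points = ? WHERE result_id = ?", result_updates)
    con.executemany("UPDATE event SET score = ? WHERE event_id = ?", event_updates)
    con.commit()
    ```

    Batching doesn't remove the per-row work, but it removes most of the per-query overhead, which is often where the 30-90 seconds goes.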
  4. cclaerhout

    cclaerhout Well-Known Member

    I've solved this in the admin part by using a temporary table to queue data to be processed. See here. It's working.
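    The temporary-table idea amounts to queueing the pending work as rows and draining a small chunk per request, so no single request exceeds the timeout. A hypothetical sketch of the pattern (not cclaerhout's actual code; the table and chunk size are invented for illustration):

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE rebuild_queue"
                " (event_id INTEGER PRIMARY KEY, done INTEGER DEFAULT 0)")
    con.executemany("INSERT INTO rebuild_queue (event_id) VALUES (?)",
                    [(e,) for e in range(1, 11)])

    def process_chunk(limit=3):
        """Process up to `limit` queued events; return True while work remains."""
        rows = con.execute(
            "SELECT event_id FROM rebuild_queue WHERE done = 0 LIMIT ?",
            (limit,)).fetchall()
        for (event_id,) in rows:
            # ... recalculate this event here ...
            con.execute("UPDATE rebuild_queue SET done = 1 WHERE event_id = ?",
                        (event_id,))
        con.commit()
        return bool(rows)

    # Each admin-page refresh (or redirect) would run one chunk;
    # here a loop stands in for those repeated requests.
    passes = 0
    while process_chunk():
        passes += 1
    ```

    Because progress lives in the table, an interrupted run simply resumes from the first row still marked not done.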
    Xon likes this.
  5. Jake B.

    Jake B. Well-Known Member

    Any particular reason you can't use the Deferred system? I may have missed something as I've only skimmed through the post.
  6. Xon

    Xon Well-Known Member

    There is no guarantee the deferred system runs under a different PHP timeout, so it doesn't solve the 'long-running single task' problem.

    The deferred system is just a method for automatically scheduling sliced up operations.
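    The slicing idea can be sketched as a task that does a bounded amount of work per invocation and hands back its resume position (Python for illustration; XenForo's actual Deferred API is PHP, and the work list and time budget here are hypothetical):

    ```python
    import time

    EVENT_IDS = list(range(1, 101))  # hypothetical work list
    processed = []

    def run_slice(position, target_run_time=0.05):
        """Process events from `position` until the time budget is spent;
        return the next position, or None when all events are done."""
        deadline = time.monotonic() + target_run_time
        while position < len(EVENT_IDS):
            processed.append(EVENT_IDS[position])  # stand-in for recalculating one event
            position += 1
            if time.monotonic() >= deadline:
                break
        return position if position < len(EVENT_IDS) else None

    # The scheduler (cron tick or page-load trigger) re-invokes the task
    # with the saved position until it reports completion; a plain loop
    # plays that role here.
    position = 0
    while position is not None:
        position = run_slice(position)
    ```

    The key point, matching the caveat above, is that each individual slice must fit inside the timeout; the scheduler only strings the slices together.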
    batpool52! and cclaerhout like this.
