Solving Duplicate job processing – ShouldBeUnique

You can implement the ShouldBeUnique interface on a job class to avoid adding duplicate jobs to the queue while another job with the same unique ID is already on the queue or is still processing.

Without it, a different request can push the same job onto the queue even though an identical job is still queued or processing. ShouldBeUnique prevents these duplicates. The steps below show how to implement it on a job class.

  1. First, implement the ShouldBeUnique interface on the job class.
  2. Define a uniqueId() method (or a $uniqueId property) that returns the unique ID for the job.
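
Putting the two steps together, a minimal sketch of such a job class might look like the following. The ProcessPodcast class, the Podcast model, and the $podcast property are illustrative names, not part of the original text:

```php
<?php

namespace App\Jobs;

use App\Models\Podcast;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessPodcast implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public Podcast $podcast)
    {
    }

    // The job's unique ID: dispatching another ProcessPodcast job
    // for the same podcast while this one is queued or processing
    // will be discarded.
    public function uniqueId(): string
    {
        return (string) $this->podcast->id;
    }

    public function handle(): void
    {
        // Process the podcast...
    }
}
```

Dispatching is unchanged – `ProcessPodcast::dispatch($podcast);` – Laravel acquires the unique lock automatically before pushing the job onto the queue.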

ShouldBeUnique

  1. A duplicate job with the same unique ID is discarded while the earlier job is still queued or processing – regardless of which request dispatched it, since the lock is stored in the application's cache. A second request will only place another job on the queue once the first job has finished processing.
  2. If the application receives a second request while the same job is still processing, the new job is discarded, because a job with that ID still holds the unique lock.
  3. The uniqueFor property sets a timeout (in seconds) on the unique lock; it applies across requests, not only within a single request.
  4. The uniqueFor timeout is useful when a job might get stuck or never complete: once the timeout expires, the lock is released and a job with the same ID can be dispatched again.
  5. The unique lock for the job's ID is released once the job completes processing (or fails all of its retry attempts).
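
As a sketch of the timeout described above, the lock lifetime can be capped with a public $uniqueFor property on the same illustrative job class:

```php
class ProcessPodcast implements ShouldQueue, ShouldBeUnique
{
    // Release the unique lock after one hour even if the job
    // never completed, so a stuck job cannot block dispatching
    // new jobs with this ID forever.
    public $uniqueFor = 3600;

    public function uniqueId(): string
    {
        return (string) $this->podcast->id;
    }
}
```

Note that the lock requires a cache driver that supports atomic locks, so uniqueness holds across separate requests and workers.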

ShouldBeUniqueUntilProcessing

  1. ShouldBeUnique prevents another request from pushing the same job onto the queue until the job has finished processing. ShouldBeUniqueUntilProcessing relaxes this: the lock is released as soon as a worker begins processing the job, so another job with the same ID can be queued while the first one is running. The job is only unique from dispatch until it is picked up from the queue.
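
The only change from the earlier sketch is the interface; the same illustrative class with the relaxed lock looks like this:

```php
use Illuminate\Contracts\Queue\ShouldBeUniqueUntilProcessing;
use Illuminate\Contracts\Queue\ShouldQueue;

class ProcessPodcast implements ShouldQueue, ShouldBeUniqueUntilProcessing
{
    // The lock is released as soon as a worker starts processing
    // this job, so a duplicate with the same ID may be queued
    // while the first one is still running.
    public function uniqueId(): string
    {
        return (string) $this->podcast->id;
    }
}
```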

ShouldBeEncrypted

  1. Implement this interface to ensure the privacy and integrity of the job's data.
  2. The job's payload is encrypted before it is pushed onto the queue.
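
No extra configuration is needed beyond adding the interface; a minimal sketch (the class name is illustrative):

```php
use Illuminate\Contracts\Queue\ShouldBeEncrypted;
use Illuminate\Contracts\Queue\ShouldQueue;

class ProcessSecretReport implements ShouldQueue, ShouldBeEncrypted
{
    // Laravel encrypts the serialized job payload before pushing
    // it onto the queue and decrypts it before handle() runs, so
    // the raw queue store never sees the job's data in plain text.
    public function handle(): void
    {
        // Process the report...
    }
}
```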