You can implement ShouldBeUnique on a job class to avoid adding duplicate jobs to the queue while another job with the same unique ID is already queued or still processing.
Note that a different request can still queue the same job again once the previous job with that ID has finished processing; ShouldBeUnique only prevents duplicates while that job is pending or running. Below is how to implement it on a job class.
- First, implement the ShouldBeUnique interface on the job class.
- Then, define a uniqueId() method that returns the job's unique ID (a minimal sketch follows this list).
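The following is a minimal sketch of such a job class. The UpdateSearchIndex name and the $productId constructor argument are hypothetical and only illustrate where the interface and the uniqueId() method go.

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class UpdateSearchIndex implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public int $productId)
    {
    }

    // Unique ID for the job: only one UpdateSearchIndex job per product
    // can be queued or processing at any given time.
    public function uniqueId(): string
    {
        return (string) $this->productId;
    }

    public function handle(): void
    {
        // Re-index the product here.
    }
}
```

With this in place, dispatching UpdateSearchIndex::dispatch(1) twice while the first job is still queued or processing discards the second dispatch.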
ShouldBeUnique
- While the unique lock is held, duplicate dispatches with the same unique ID are discarded no matter which request dispatches them; the lock is stored in the cache, so it applies across requests. Once the first job finishes processing, a later dispatch will place a new job on the queue.
- If the application receives a second request while a job with the same ID is still queued or processing, the duplicate dispatch is discarded because the lock is still held.
- The $uniqueFor property sets a timeout, in seconds, on the unique lock. After that many seconds the lock expires and a new job with the same ID can be queued even if the original has not finished (see the sketch after this list).
- $uniqueFor is useful as a safety valve: if a job gets stuck or takes longer than expected, the lock will not block new dispatches of the same job indefinitely.
- By default, the unique lock on the job ID is released once the job completes processing (or fails all of its retry attempts).
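A sketch of the $uniqueFor timeout on the same hypothetical job; one hour (3600 seconds) is an arbitrary value chosen for illustration.

```php
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;

class UpdateSearchIndex implements ShouldQueue, ShouldBeUnique
{
    // Release the unique lock after one hour even if the job has not
    // completed, so the same job can be dispatched again after that.
    public int $uniqueFor = 3600;

    public function __construct(public int $productId)
    {
    }

    public function uniqueId(): string
    {
        return (string) $this->productId;
    }
}
```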
ShouldBeUniqueUntilProcessing
- ShouldBeUnique keeps the unique lock until the job finishes processing, so no other request can push the same job to the queue in the meantime. ShouldBeUniqueUntilProcessing releases the lock as soon as the job starts processing: duplicates are only blocked while the job is still waiting on the queue, and once a worker picks it up, another job with the same ID can be dispatched (see the sketch below).
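A sketch of the same hypothetical job with ShouldBeUniqueUntilProcessing swapped in; the only change from the earlier example is the interface it implements.

```php
use Illuminate\Contracts\Queue\ShouldBeUniqueUntilProcessing;
use Illuminate\Contracts\Queue\ShouldQueue;

class UpdateSearchIndex implements ShouldQueue, ShouldBeUniqueUntilProcessing
{
    public function __construct(public int $productId)
    {
    }

    // The unique lock is released as soon as a worker starts processing
    // this job, so another job with the same ID may be queued while it runs.
    public function uniqueId(): string
    {
        return (string) $this->productId;
    }
}
```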
ShouldBeEncrypted
- Use this interface to ensure the privacy and integrity of the job's data.
- The job payload is encrypted before it is pushed to the queue (see the sketch below).
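A sketch of an encrypted job; the ProcessPayment name and the $cardToken argument are hypothetical, used only to show where the interface goes.

```php
use Illuminate\Contracts\Queue\ShouldBeEncrypted;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\SerializesModels;

class ProcessPayment implements ShouldQueue, ShouldBeEncrypted
{
    use Dispatchable, SerializesModels;

    public function __construct(public string $cardToken)
    {
    }

    public function handle(): void
    {
        // The serialized payload (including $cardToken) was encrypted
        // before it was pushed to the queue and is decrypted before handling.
    }
}
```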