Execute only one of many duplicate jobs with Sidekiq?

My initial suggestion would be a mutex for this specific job. But since you may have multiple application servers working the Sidekiq jobs, I would suggest something at the Redis level.

For instance, use redis-semaphore within your Sidekiq worker definition. An untested example:

def perform
  s = Redis::Semaphore.new(:map_reduce_semaphore, host: "localhost")

  # Verify that this Sidekiq worker is the first to reach this semaphore.
  unless s.locked?
    # Auto-unlocks in 90 seconds; set to what is reasonable for your worker.
    s.lock(90)
    begin
      your_map_reduce
    ensure
      # Release the semaphore even if the job raises.
      s.unlock
    end
  end
end

def your_map_reduce
  # ...
end
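
If you would rather not add a gem, the same Redis-level guard can be sketched with a plain SET NX/EX lock through redis-rb. The key name and the 90-second expiry below are illustrative, not part of the original example:

require "redis"

def perform
  redis = Redis.new(host: "localhost")

  # SET with nx: true succeeds only for the first caller, so the check and the
  # lock happen in one atomic step; ex: 90 lets the key expire on its own if
  # the worker dies before cleaning up.
  if redis.set("map_reduce_lock", "1", nx: true, ex: 90)
    begin
      your_map_reduce
    ensure
      redis.del("map_reduce_lock")
    end
  end
end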

Alternatively, there is the sidekiq-middleware gem (https://github.com/krasnoukhov/sidekiq-middleware), whose UniqueJobs middleware provides uniqueness for jobs.

Usage

Example worker:

class UniqueWorker
  include Sidekiq::Worker

  sidekiq_options({
    # Should be set to true (enables uniqueness for async jobs)
    # or :all (enables uniqueness for both async and scheduled jobs)
    unique: :all,

    # Unique expiration (optional, default is 30 minutes)
    # For scheduled jobs it is calculated automatically from the schedule time and the expiration period
    expiration: 24 * 60 * 60 # 24 hours, in seconds
  })

  def perform
    # Your code goes here
  end
end
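
If I read the middleware correctly, enqueuing the same worker with the same arguments a second time should be dropped while the first job is still pending (or scheduled, with :all), until it runs or the expiration passes. Roughly, with made-up arguments:

UniqueWorker.perform_async(42)     # enqueued
UniqueWorker.perform_async(42)     # dropped as a duplicate while the first is still pending

# With unique: :all, scheduled jobs get the same treatment:
UniqueWorker.perform_in(3600, 42)
UniqueWorker.perform_in(3600, 42)  # dropped as a duplicate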