`Oban.insert_all/1` doesn't enforce unique job constraints unless you're on the pro license

amos-kibet

4 days ago

Configured unique constraints on your Oban worker but still seeing duplicate jobs when using bulk inserts? The Basic engine only enforces uniqueness with Oban.insert/2, not insert_all/1.


💡 This drop builds on @almirsarajcic’s drop about preventing duplicate Oban jobs. If you haven’t read it yet, check it out first!


You’ve followed the advice and configured your worker with unique constraints:

defmodule MyApp.EmailWorker do
  use Oban.Worker,
    queue: :emails,
    unique: [
      period: {2, :minutes},
      keys: [:user_id],
      states: [:available, :scheduled, :executing]
    ]

  # ...
end

And you’re bulk inserting jobs for efficiency:

users
|> Enum.map(&EmailWorker.new(%{user_id: &1.id}))
|> Oban.insert_all()

But duplicates still get inserted! 🤯

Why?

From the Oban documentation:

🌟 Unique Jobs and Batching

Only the Smart Engine in Oban Pro supports bulk unique jobs. With the basic engine, you must use insert/3 to insert unique jobs one at a time.

The unique option on your worker is silently ignored when using Oban.insert_all/1 with the Basic (free) engine.

Solutions

Option 1: Use individual inserts

Trade efficiency for correctness—uniqueness works as expected:

Enum.each(users, fn user ->
  %{user_id: user.id}
  |> EmailWorker.new()
  |> Oban.insert()
end)

⚠️ Caveat: This creates N database round-trips, which can bottleneck your DB at scale.
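A side benefit of individual inserts is that you can observe deduplication as it happens: when a unique constraint matches an existing job, Oban returns that job with its `conflict?` field set to `true` (verify this against your Oban version). A sketch, assuming the same `users` list as above:

```elixir
# Count inserted vs. deduplicated jobs. Assumes Oban v2+, where a unique
# conflict returns {:ok, job} with job.conflict? == true.
{inserted, skipped} =
  Enum.reduce(users, {0, 0}, fn user, {ins, skip} ->
    case Oban.insert(EmailWorker.new(%{user_id: user.id})) do
      {:ok, %Oban.Job{conflict?: true}} -> {ins, skip + 1}
      {:ok, _job} -> {ins + 1, skip}
      {:error, _changeset} -> {ins, skip}
    end
  end)
```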

Option 2: Custom ETS-based deduplication

Keep bulk insert efficiency with in-memory deduplication:

defmodule MyApp.EmailDedup do
  @table :email_dedup_cache
  @period :timer.minutes(2)

  # Call once at app startup (e.g. from Application.start/2)
  def init_table do
    :ets.new(@table, [:set, :public, :named_table])
  end

  # Enqueue only if we haven't seen this user_id within the period
  def should_enqueue?(user_id) do
    case :ets.lookup(@table, user_id) do
      [{_, enqueued_at}] ->
        System.monotonic_time(:millisecond) - enqueued_at >= @period

      [] ->
        true
    end
  end

  def mark_enqueued(user_id) do
    :ets.insert(@table, {user_id, System.monotonic_time(:millisecond)})
  end
end

# Filter duplicates before bulk insert
users
|> Enum.filter(&MyApp.EmailDedup.should_enqueue?(&1.id))
|> Enum.map(fn user ->
  MyApp.EmailDedup.mark_enqueued(user.id)
  EmailWorker.new(%{user_id: user.id})
end)
|> Oban.insert_all()
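One caveat with the sketch above: `should_enqueue?/1` followed by `mark_enqueued/1` is two separate ETS operations, so two processes bulk-inserting concurrently could both pass the check before either marks the id. `:ets.insert_new/2` collapses check-and-mark into a single atomic step (it inserts only if the key is absent and returns a boolean), at the cost of not handling expiry by itself — stale entries would need a periodic sweep:

```elixir
# Atomic check-and-mark with :ets.insert_new/2. Returns true exactly once
# per key, no matter how many processes race on it.
table = :ets.new(:demo_dedup, [:set, :public])

true = :ets.insert_new(table, {42, System.monotonic_time(:millisecond)})
false = :ets.insert_new(table, {42, System.monotonic_time(:millisecond)})
```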

Option 3: Upgrade to Oban Pro

Oban Pro’s Smart Engine supports bulk unique jobs natively—no custom logic needed.
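With Pro installed, enabling the Smart Engine is a one-line config change and `Oban.insert_all/1` enforces uniqueness just like `Oban.insert/2`. A sketch (the engine module name has changed between Pro releases, so check the Pro docs for your version):

```elixir
# config/config.exs — hypothetical app/repo names
config :my_app, Oban,
  engine: Oban.Pro.Engines.Smart,
  repo: MyApp.Repo,
  queues: [emails: 10]
```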

Which should you choose?

| Approach            | Best for                                |
| ------------------- | --------------------------------------- |
| Individual inserts  | Small datasets, simplicity              |
| ETS deduplication   | Large datasets, single-node deployments |
| Redis deduplication | Distributed systems (multiple nodes)    |
| Oban Pro            | Production systems with budget          |
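ETS is per-node, so on a multi-node deployment each node keeps its own cache and duplicates can slip through. Redis gives you one shared dedup window across all nodes. A minimal sketch using the Redix library (hypothetical connection and key names); `SET ... NX EX` is a single atomic command that succeeds only if the key doesn't exist and auto-expires it after the period:

```elixir
# Returns true the first time a user_id is seen within the 120s window,
# false for duplicates — consistent across every node sharing this Redis.
defp should_enqueue?(conn, user_id) do
  key = "email_dedup:#{user_id}"

  case Redix.command(conn, ["SET", key, "1", "NX", "EX", "120"]) do
    {:ok, "OK"} -> true
    {:ok, nil} -> false
  end
end
```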

