Having some trouble with a bulk update
I have something like:
For some context, I'm generating 4-character codes in `generate_unique_code/2`, and `existing_codes` is a list of all existing codes, so the function can retry and avoid hitting a unique constraint error on save.

This works, but only when I run it one record at a time. In a bulk update, the query for existing codes runs before the earlier updates have been committed, giving a bit of a race condition and hitting that unique constraint error.
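For reference, the retry approach described above looks roughly like this — a hypothetical sketch, since the actual `generate_unique_code/2` isn't shown; the module name, alphabet, and retry limit here are my own assumptions:

```elixir
defmodule CodeGenerator do
  # Hypothetical sketch of the retry approach described above;
  # the real generate_unique_code/2 may differ.

  # Generate a random 4-character code, retrying while it collides
  # with one of the already-existing codes.
  def generate_unique_code(existing_codes, attempts \\ 10)

  def generate_unique_code(_existing_codes, 0), do: {:error, :no_unique_code}

  def generate_unique_code(existing_codes, attempts) do
    code = for _ <- 1..4, into: "", do: <<Enum.random(?A..?Z)>>

    if code in existing_codes do
      generate_unique_code(existing_codes, attempts - 1)
    else
      {:ok, code}
    end
  end
end
```

Run one at a time this is fine, because `existing_codes` is re-fetched after each save; in a bulk update every record sees the same stale snapshot.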
I was thinking of maybe setting `batch_size` to 1, but that's getting me:

    ** (Spark.Options.ValidationError) invalid value for :batch_size option: expected integer, got: nil

Any ideas on how I can make this work? Or is my approach here fundamentally flawed?
