nvm, I wasn't even looking for that, I was looking for self.which_days.all()
Melamo joined the channel
Melamo
is there a way in Django 1.11 to take advantage of PostgreSQL's upsert functionality without dropping down to raw SQL?
was hoping to do something like update_or_create(), but atomic without needing a unique index
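(Editor's note: Django 1.11 has no built-in upsert, so the usual answer is PostgreSQL's `INSERT ... ON CONFLICT ... DO UPDATE` via raw SQL through a cursor. The sketch below uses the stdlib `sqlite3` module, since SQLite >= 3.24 supports the same upsert syntax, purely so the example is self-contained; the table and column names are made up for illustration.)

```python
import sqlite3

# PostgreSQL's upsert syntax (INSERT ... ON CONFLICT ... DO UPDATE) is also
# supported by SQLite >= 3.24, so the stdlib can demonstrate it here. In
# Django 1.11 you would run the same statement via connection.cursor().
# The "counter" table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE counter (name TEXT PRIMARY KEY, hits INTEGER)")

def upsert(name):
    # Insert a new row, or atomically bump the counter if the row exists.
    conn.execute(
        "INSERT INTO counter (name, hits) VALUES (?, 1) "
        "ON CONFLICT(name) DO UPDATE SET hits = hits + 1",
        (name,),
    )

upsert("home")
upsert("home")
hits = conn.execute(
    "SELECT hits FROM counter WHERE name = ?", ("home",)
).fetchone()[0]
print(hits)  # 2
```

The whole conflict check and update happen in one statement, so no unique-index race is possible between a SELECT and an INSERT.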
FunkyBob
isn't it already wrapped in a transaction?
and what do you mean "without a unique index"?
Melamo
think of the case of "create or update". If done w/ multiple SQL statements, you might do a SELECT, followed by an INSERT or UPDATE depending on the SELECT result. Wrapping those queries in a transaction does nothing to prevent a concurrent thread from doing an INSERT between your own SELECT and INSERT, unless the PG transaction isolation level is set very high, e.g. "serializable", as described in https://www.postgresql.org/docs/9
a unique index in the DB can prevent multiple rows w/ the same unique value set, thus preventing duplicate rows from being created. The first INSERT succeeds; the second throws an exception because of a unique constraint violation
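(Editor's note: the SELECT-then-INSERT race described above is exactly the one Django's get_or_create() handles, by attempting the INSERT, catching the IntegrityError raised by the unique constraint, and falling back to a second SELECT. A minimal sketch of that pattern without the ORM, using stdlib sqlite3 and hypothetical table names:)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tag (name TEXT UNIQUE)")

def get_or_create(name):
    # 1. Optimistic path: SELECT first.
    row = conn.execute("SELECT rowid FROM tag WHERE name = ?", (name,)).fetchone()
    if row:
        return row[0], False
    # 2. Try the INSERT; a concurrent writer may have won the race between
    #    our SELECT and INSERT, in which case the UNIQUE constraint raises
    #    and we fall back to re-reading the row that the other writer made.
    try:
        cur = conn.execute("INSERT INTO tag (name) VALUES (?)", (name,))
        return cur.lastrowid, True
    except sqlite3.IntegrityError:
        row = conn.execute("SELECT rowid FROM tag WHERE name = ?", (name,)).fetchone()
        return row[0], False

print(get_or_create("django"))  # (1, True)
print(get_or_create("django"))  # (1, False)
```

Without the unique constraint there is nothing for the database to raise, which is why "atomic without a unique index" is the hard part of the question.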
mcspud
I thought all write operations were FIFO
i.e. no parallel writes
schinckel is the man to ask for this
Melamo
it depends on the DB and its configuration, but sequential writes across all connections would be rather slow. It would basically be a global write lock, à la old MongoDB
mcspud
hrmm
schinckel
Melamo: I've got a nice little upsert-alike.
mcspud
The actual write may be deferred, but the transaction log should have the data in it regardless?
schinckel
(Actually, bulk_update, but you can manage to do it as bulk_update + bulk_create, in a single @transaction.atomic)