Are JSON arrays not supported for parameterization in the postgres driver?

Hey! I've been loving Kysely so far and have done a ton of great code cleanup and refactoring with it. But I just hit a big snag: I seem to be unable to use my JSON array columns in updates. The parameter serialization doesn't appear to work with JSON. The types look fine: https://kyse.link/?p=s&i=pc0iBgUrhjXxfAlSEh9i (note that my actual app has well-defined types instead of the object primitive types used in the example). But when I run it, I get:

{
  severity: 'ERROR',
  code: '22P02',
  detail: 'Expected ":", but found "}".',
  where: 'JSON data, line 1: {"{\"bar\":\"baz\"}"}\n' +
    "unnamed portal parameter $2 = '...'",
  file: 'jsonfuncs.c',
  line: '621',
  routine: 'json_ereport_error'
}

The only other related question I could find here suggested stringifying the values, but that breaks the column interface typing and forces me to lie about the type every time I use it. That doesn't seem right to me. One other minor, interesting note: an update with [] works; it only starts failing when you try to update [{}].
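To illustrate the failure mode described above, here is a simplified sketch of how node-pg turns a JS array parameter into a postgres array literal. This is a hypothetical imitation for illustration, not pg's actual code, but it reproduces the string seen in the error: each element is stringified and quoted inside braces, which is valid postgres array syntax but not valid JSON. An empty array happens to become {}, which is also valid JSON, which would explain why [] works while [{}] fails.

```typescript
// Simplified imitation of node-pg's array serialization (illustrative only):
// each element is JSON.stringify'd, then quoted as a postgres array element.
function toPostgresArrayLiteral(values: unknown[]): string {
  return '{' + values.map((v) => JSON.stringify(JSON.stringify(v))).join(',') + '}';
}

console.log(toPostgresArrayLiteral([]));               // {} -- also valid JSON, so [] "works"
console.log(toPostgresArrayLiteral([{ bar: 'baz' }])); // {"{\"bar\":\"baz\"}"} -- not valid JSON
```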
Solution
koskimas · 12mo ago
Yeah, the pg driver doesn't serialize arrays. I think the reasoning is that, when serializing, it doesn't know whether the target column will be a postgres array or JSON. If you make the update and insert types string, everything works correctly; the row type is still inferred correctly for output data. https://kyse.link/?p=s&i=NBfnbfzwNBlZ0YJpc7dj
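A sketch of what this suggestion could look like in table types. Kysely's real ColumnType<Select, Insert, Update> utility lets a column read as one type and write as another; a stand-in is inlined here so the snippet stands alone, and the Modifier shape, SheetRecordTable interface, and toJsonParam helper are hypothetical names, not from the thread.

```typescript
// Stand-in for kysely's ColumnType<Select, Insert, Update> so this compiles alone;
// in a real project you would `import type { ColumnType } from 'kysely'`.
type ColumnType<Select, Insert = Select, Update = Insert> = {
  readonly __select__: Select;
  readonly __insert__: Insert;
  readonly __update__: Update;
};

// Hypothetical JSON shape stored in the column.
interface Modifier {
  bar: string;
}

interface SheetRecordTable {
  // Selects come back as Modifier[]; inserts/updates must pass a JSON string.
  modifiers: ColumnType<Modifier[], string, string>;
}

// Stringify once at the write site so pg never sees a raw JS array.
function toJsonParam<T>(value: T): string {
  return JSON.stringify(value);
}

console.log(toJsonParam([{ bar: 'baz' }])); // [{"bar":"baz"}]
```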
significantotter · 12mo ago
I see! I'll update my Kanel types and build a helper, I think. Maybe I can use a utility type, like Kanel does for IDs, to make sure the value gets processed correctly before use. Thanks for the answer!
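One way that utility-type idea could be sketched (a hypothetical illustration in the spirit of Kanel's branded id types, not code from the thread): brand the serialized string so a column typed with the brand can't accept an arbitrary string that didn't go through the serializer.

```typescript
// Hypothetical branded type: a plain string at runtime, but the brand
// records which type it is the serialized JSON of.
type JsonOf<T> = string & { readonly __jsonOf: T };

function serializeJson<T>(value: T): JsonOf<T> {
  return JSON.stringify(value) as JsonOf<T>;
}

// A column typed as JsonOf<Modifier[]> won't accept a bare string;
// the value has to come through serializeJson.
const payload = serializeJson([{ bar: 'baz' }]);
console.log(payload); // [{"bar":"baz"}]
```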
significantotter · 12mo ago
Hm, I just found this PR from a year ago about a similar topic. I totally get your reasoning for not putting it into the main code, but I was wondering if you ever made progress on the generic serializer plugin? That kind of thing would be very helpful for my use case, consolidating the serialization work by type and by input.

For anyone who references this in the future, this is the solution I put together:
// in a helper file:

import { sql, type RawBuilder } from 'kysely';

/**
 * The node-pg driver automatically treats all array value parameters as postgres arrays.
 * In order to serialize JSONB arrays, we have to prepare them into a string-like value first.
 */
export function sqlJSON<T>(object: T): RawBuilder<T> {
	return sql`(${JSON.stringify(object)}::jsonb)`;
}

// in SheetRecordModel:

/** ... */
public async update(
	{ id }: { id: SheetRecordId },
	args: SheetRecordUpdate
): Promise<SheetRecord> {
	return await this.db
		.updateTable('sheetRecord')
		.set({
			...args,
			modifiers: args.modifiers !== undefined ? sqlJSON(args.modifiers) : undefined,
			actions: args.actions !== undefined ? sqlJSON(args.actions) : undefined,
			rollMacros: args.rollMacros !== undefined ? sqlJSON(args.rollMacros) : undefined,
		})
		.where('sheetRecord.id', '=', id)
		.returningAll()
		.executeTakeFirstOrThrow();
}
/** ... */
bombillazo · 12mo ago
We've come up with this issue as well. Is there any way to run ad-hoc transformations on insert columns? We'd like to keep our real typing for a column even when inserting/updating; making it a string when changing the column value makes it possible to pass any string without type checks. Perhaps the proper solution is to define the column as an array of JSON (jsonb[]) instead of a single jsonb.
Igal · 11mo ago
@significantotter have you tried the plugin in the last comment? You could design your system to only allow a single point of writing those inserts, and let TypeScript or some additional validation tool protect you. Anything else would be overkill, IMHO.
significantotter · 11mo ago
I didn't, because it was built into a SQLite driver package and I'm using postgres. I did, however, end up with a workable (if unappealing) solution: I just manually use a function to generate a SQL JSON array insert statement from the array. https://github.com/significantotter/kobold/blob/compendium/src/services/kobold/models/sheet-record.model.ts And here's that helper function: https://github.com/significantotter/kobold/blob/compendium/src/services/kobold/lib/kysely-json.ts