Swart
Drizzle Team
Created by Swart on 8/31/2024 in #help
Invalid Byte Encoding with Postgres JSONB columns
Minor update here - I've continued to be flummoxed by this. The errors have disappeared. The only change I currently have in place is sanitizing all input I get from APIs with the following:
const capturedParse = JSON.parse

...

const encoded = Buffer.from(responseText).toString('utf8')
return capturedParse(encoded)
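Filling in the elided parts, a minimal self-contained sketch of this sanitizing wrapper might look like the following (the function name and generic signature are mine, not from the original post):

```typescript
// Capture the original parser before any extension can monkey-patch it.
const capturedParse = JSON.parse

// Hypothetical wrapper: round-trip the raw response text through a UTF-8
// Buffer before parsing. For a well-formed JS string this is effectively a
// no-op, but it normalizes any lone surrogates to U+FFFD.
function parseSanitized<T>(responseText: string): T {
  const encoded = Buffer.from(responseText).toString('utf8')
  return capturedParse(encoded) as T
}
```

Capturing `JSON.parse` up front matters here because the whole theory of the thread is that some dependency replaces the global JSON functions at runtime.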
I'll likely stop posting updates here as this isn't a drizzle issue (that I know of), but one thing I never figured out was how to reproduce this locally. Even defining strings with unicode escapes like \u00f8 was handled correctly locally. Anyway, thank you for the help, and if anyone finds this in the future and runs into the same problem, please let me know as I'd love to understand what's going on.
10 replies
Drizzle Team
Created by Swart on 8/31/2024 in #help
Invalid Byte Encoding with Postgres JSONB columns
For completeness, I created the following type and used another serializer, which resolved the core issue.
import { customType } from 'drizzle-orm/pg-core'
import stringify from 'fast-json-stable-stringify'

export function customJsonb<TData>(name: string) {
  return customType<{ data: TData, driverData: string }>({
    dataType() {
      return 'jsonb'
    },
    toDriver(value: TData): string {
      return stringify(value)
    },
  })(name)
}
Unfortunately though, I am now seeing the same error when updating / inserting normal varchar columns. I feel something is deeply wrong outside of drizzle and will continue investigating.
Drizzle Team
Created by Swart on 8/31/2024 in #help
Invalid Byte Encoding with Postgres JSONB columns
Yea, I think it's something in one of the nuxt extensions I am using. Trying a different serializer is actually a great idea, I'll give that a shot tomorrow.
Drizzle Team
Created by Swart on 8/31/2024 in #help
Invalid Byte Encoding with Postgres JSONB columns
I've investigated more and began dumping the hex output of the string produced by JSON.stringify for the data going into my jsonb column. It turns out that when this error begins to occur, that output has the raw Unicode code point encoded into it, NOT the UTF-8 byte sequence. I've done this by using the following:
// eslint-disable-next-line node/prefer-global/buffer
const hex = Buffer.from(JSON.stringify(entries)).toString('hex')
// eslint-disable-next-line no-console
console.log('signups: detected failed update', { scheduleKey, error, hex, entries })
So for the error invalid byte sequence for encoding "UTF8": 0xe9 0x73 0x20, the hex for the word containing the incorrect symbol is 47 69 76 72 E9 73 20 instead of 47 69 76 72 C3 A9 73 20. Based on this, I think the error is actually in something hijacking the global JSON.stringify behavior.
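Both byte sequences above can be reproduced directly in Node. The good sequence is the UTF-8 encoding of "Givrés " (é is U+00E9, encoded as the two bytes C3 A9); the bad one is what Latin-1 encoding produces, where é becomes the single raw code-point byte E9:

```typescript
const word = 'Givrés '

// Correct UTF-8 encoding: é becomes the two bytes C3 A9.
const utf8Hex = Buffer.from(word, 'utf8').toString('hex')
// '47697672c3a97320'

// The broken output from the thread: é as the single byte E9,
// exactly what Latin-1 encoding yields.
const latin1Hex = Buffer.from(word, 'latin1').toString('hex')
// '47697672e97320'

// Postgres rejects the second form because 0xE9 0x73 0x20
// is not a valid UTF-8 byte sequence.
```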
Drizzle Team
Created by Swart on 8/31/2024 in #help
Invalid Byte Encoding with Postgres JSONB columns
Oh, one additional thing to mention here - I cannot reproduce this at all on my machine. I can't even create dummy data that triggers this Postgres error. If anything, I'd like to understand how this is even possible with an update clause through the ORM.