Gary, el Pingüino Artefacto
Drizzle Team
Created by Gary, el Pingüino Artefacto on 11/28/2024 in #help
Recommended way for managing Postgres functions
Hi, I need to create multiple Postgres functions, and they are very likely to change several times in the coming days/weeks. I read that the recommended way is to create an empty migration file and add the SQL there. That means for every change I would need to create a new empty migration file. I was thinking about keeping them as raw .sql files in a directory and (re)creating them every time the project starts. That way, no migration file is needed. Is this a good idea, or should I go with the empty migration?
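A minimal sketch of the run-on-startup idea, assuming the functions are written with `CREATE OR REPLACE FUNCTION` so re-running them is idempotent. The function names and the executor callback are hypothetical; the executor would be something like a `pg` Pool's `query` method:

```typescript
import { readdir, readFile } from "node:fs/promises";
import { join } from "node:path";

// Pick out the .sql files and fix the execution order (name order,
// so prefixes like 001_, 002_ control it).
export function orderSqlFiles(names: string[]): string[] {
  return names.filter((n) => n.endsWith(".sql")).sort();
}

// Apply every .sql file in a directory using any executor (e.g. a pg
// Pool's query method). Writing each function as
// CREATE OR REPLACE FUNCTION makes this safe to re-run on startup.
export async function applySqlDir(
  dir: string,
  execute: (sql: string) => Promise<unknown>,
): Promise<void> {
  for (const file of orderSqlFiles(await readdir(dir))) {
    await execute(await readFile(join(dir, file), "utf8"));
  }
}
```

The trade-off versus empty migrations: this keeps one file per function that you edit in place, but you lose drizzle-kit's history of what changed when.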
1 replies
Drizzle Team
Created by Gary, el Pingüino Artefacto on 11/16/2024 in #help
Brand ids?
Hi, is it possible to map a primary key to a branded type from Effect?
export type UserId = string & Brand.Brand<"UserId">
export const UserId = Brand.nominal<UserId>()

export const usersTable = pgTable("users", {
  // Possible?
  id: uuid().primaryKey().transform((value) => UserId(value)),
  createdAt: timestamp().notNull().defaultNow(),
  givenName: varchar({ length: 64 }).notNull(),
  familyName: varchar({ length: 64 }).notNull(),
})
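No runtime transform should be needed here: a brand exists only at the type level, so drizzle's `$type<T>()` column helper (which narrows the inferred type without touching the value) is worth checking against your drizzle version, e.g. `id: uuid().$type<UserId>().primaryKey()`. A standalone sketch of why a zero-cost cast is enough, mirroring what Effect's `Brand.nominal` produces:

```typescript
// A nominal (branded) string type and a zero-cost constructor.
type UserId = string & { readonly __brand: "UserId" };
const UserId = (value: string): UserId => value as UserId;

// The row shape drizzle would infer if the column is typed as UserId
// via .$type<UserId>() — the brand never exists at runtime, so rows
// coming back from the database already satisfy it.
interface UserRow {
  id: UserId;
  givenName: string;
}

const row: UserRow = { id: UserId("a1b2"), givenName: "Gary" };
```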
9 replies
Drizzle Team
Created by Gary, el Pingüino Artefacto on 11/1/2024 in #help
Conditional batch on Neon Batch API?
Hi, is it possible to do something like:
const deleteImages = true // can be true or false

await db.batch([
  db.update(users).set({ ... }).where(...),
  deleteImages ? db.delete(userImages).where(...) : false
])
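A common workaround, sketched under the assumption that `db.batch` rejects non-query entries: build the list first, then filter the skipped ones out before passing it in. `Op` stands in for a drizzle query object here:

```typescript
// Build the batch conditionally, then drop the skipped entries,
// instead of passing `false` into db.batch. `Op` is a stand-in for
// a drizzle query object.
type Op = { kind: string };

function buildBatch(deleteImages: boolean): Op[] {
  const ops: (Op | false)[] = [
    { kind: "update-users" },
    deleteImages && { kind: "delete-user-images" },
  ];
  // Type predicate so the result is Op[], not (Op | false)[].
  return ops.filter((op): op is Op => op !== false);
}
```

Note that `db.batch`'s signature expects a non-empty tuple, so you may need a length check or a cast like `batch as [Op, ...Op[]]` when the list is built dynamically.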
3 replies
Drizzle Team
Created by Gary, el Pingüino Artefacto on 10/14/2024 in #help
Applying drizzle migrations on Vercel with Hono
Hi, I'm trying to apply the migrations in this directory like this:
await migrate(drizzleDb, {
  migrationsFolder: "./src/databases/tenants/migrations",
})
I'm using Next.js, deployed on Vercel with a Hono API. The path is /api/hono/tenants/create. But every time I get this error: Error: Can't find meta/_journal.json file at /var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:42887 at qp (/var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:43323) at /var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:44022 at process.processTicksAndRejections (node:internal/process/task_queues:95:5) at async o (/var/task/.next/server/chunks/636.js:67:10412) at async Function.P [as begin] (/var/task/.next/server/chunks/636.js:67:9981) at async qf (/var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:43460) I have tried absolute paths, relative paths, using path.join, etc. Thanks for the help 😄
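The `/var/task/...` paths in the trace suggest the migrations folder simply isn't bundled into the serverless function, so no path spelling will find it. One possible fix (a sketch, not verified against every Next.js version: the option lives under `experimental` in some releases, and the route key glob may differ) is to tell Next's file tracer to include the folder:

```javascript
// next.config.js — ask Next's output file tracing to ship the
// migrations folder (including meta/_journal.json) alongside the
// serverless function for this route. Option name and placement vary
// by Next.js version — check your version's docs.
module.exports = {
  outputFileTracingIncludes: {
    "/api/hono/[[...route]]": ["./src/databases/tenants/migrations/**/*"],
  },
};
```

The alternative is to run migrations at build/deploy time instead of inside the request handler, which also avoids racing concurrent invocations.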
2 replies
Theo's Typesafe Cult
Created by Gary, el Pingüino Artefacto on 10/14/2024 in #questions
Applying drizzle migrations with Hono
Hi, I'm trying to apply the migrations in this directory like this:
await migrate(drizzleDb, {
  migrationsFolder: "./src/databases/tenants/migrations",
})
I'm using Next.js, deployed on Vercel with a Hono API. The path is /api/hono/tenants/create. But every time I get this error: Error: Can't find meta/_journal.json file at /var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:42887 at qp (/var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:43323) at /var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:44022 at process.processTicksAndRejections (node:internal/process/task_queues:95:5) at async o (/var/task/.next/server/chunks/636.js:67:10412) at async Function.P [as begin] (/var/task/.next/server/chunks/636.js:67:9981) at async qf (/var/task/.next/server/app/api/hono/[[...route]]/route.js:2467:43460) I have tried absolute paths, relative paths, using path.join, etc. Thanks for the help 😄
3 replies
Convex Community
Created by Gary, el Pingüino Artefacto on 9/10/2024 in #support-community
How to aggregate documents (more than 16K)?
No description
21 replies
Convex Community
Created by Gary, el Pingüino Artefacto on 9/8/2024 in #support-community
The filter helper is limited by the 16384 documents scanned?
No description
2 replies
Convex Community
Created by Gary, el Pingüino Artefacto on 8/28/2024 in #support-community
Spamming function calls on stream OpenAI responses
Hi, I was looking at the convex-ai-chat repo and found this https://github.com/get-convex/convex-ai-chat/blob/main/convex/serve.ts#L70
const stream = await openai.chat.completions.create({
  model: OPENAI_MODEL,
  stream: true,
  messages: [
    {
      role: "system",
      content:
        "Answer the user question based on the provided documents " +
        "or report that the question cannot be answered based on " +
        "these documents. Keep the answer informative but brief, " +
        "do not enumerate all possibilities.",
    },
    ...(relevantDocuments.map(({ text }) => ({
      role: "system",
      content: "Relevant document:\n\n" + text,
    })) as ChatCompletionMessageParam[]),
    ...(messages.map(({ isViewer, text }) => ({
      role: isViewer ? "user" : "assistant",
      content: text,
    })) as ChatCompletionMessageParam[]),
  ],
});
let text = "";
for await (const { choices } of stream) {
  const replyDelta = choices[0].delta.content;
  if (typeof replyDelta === "string" && replyDelta.length > 0) {
    text += replyDelta;
    await ctx.runMutation(internal.serve.updateBotMessage, {
      messageId,
      text,
    });
  }
}
Isn't a mutation triggered every time a new token gets streamed?
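Yes, that loop runs one mutation per streamed chunk. A common mitigation is to throttle the writes: persist at most once per interval, plus one final flush. A self-contained sketch (the `save` callback stands in for the `ctx.runMutation(internal.serve.updateBotMessage, ...)` call, and the clock is injectable so it can be tested):

```typescript
// Throttle persistence of a growing streamed text: flush at most once
// per `intervalMs`, then always flush the final state.
async function streamWithThrottle(
  tokens: AsyncIterable<string> | Iterable<string>,
  save: (text: string) => Promise<void>,
  intervalMs = 250,
  now: () => number = Date.now,
): Promise<string> {
  let text = "";
  let lastFlush = 0;
  for await (const token of tokens) {
    text += token;
    if (now() - lastFlush >= intervalMs) {
      lastFlush = now();
      await save(text); // intermediate snapshot
    }
  }
  await save(text); // final state is always persisted
  return text;
}
```

The UI still feels live (a flush every ~250 ms), but the number of mutations no longer scales with token count.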
2 replies
Theo's Typesafe Cult
Created by Gary, el Pingüino Artefacto on 8/9/2024 in #questions
Extending React Markdown with any component
Hi, I'm trying to implement charts in markdown. Currently I'm using Mermaid for some basic charts, but I want to implement my own. For example:
pie title Pets adopted by volunteers
  "Dogs" : 386
  "Cats" : 85
  "Rats" : 15
This markdown code renders a pie chart using Mermaid. How can I implement something that renders my own React components? I know Mermaid is open source and I could look into the code, but I wanted to ask first in case someone knows an easier way. Also, being able to render any React component would unlock some cool functionality.
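react-markdown supports this through its `components` prop: you can override the `code` renderer, and fenced blocks arrive with their language as a `language-xxx` className, so a ```pie fence is distinguishable from ordinary code. The routing step can be sketched as a pure function (the chart names here are hypothetical placeholders for your own widgets):

```typescript
// Decide which custom renderer a fenced code block should use, based
// on the className react-markdown passes to its `code` component
// (e.g. "language-pie" for a ```pie fence). Returns null for ordinary
// code blocks so they fall through to the default renderer.
const CUSTOM_CHARTS = new Set(["pie", "bar", "line"]); // your widget names

function customChartType(className?: string): string | null {
  const match = /language-(\w+)/.exec(className ?? "");
  if (!match) return null;
  return CUSTOM_CHARTS.has(match[1]) ? match[1] : null;
}
```

Inside the `components={{ code: ... }}` override you would then parse the fence body (the pie data in your example) and return your own React chart component when this returns a match, or the default `<code>` otherwise.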
2 replies
Drizzle Team
Created by Gary, el Pingüino Artefacto on 8/7/2024 in #help
Get table definition?
How can I get the SQL that creates a table from its table object? For example:
const files = pgTable(
  "files",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    createdAt: timestamp("created_at", { mode: "date" }).notNull().defaultNow(),
    url: text("url").notNull(),
    size: decimal("size"),
    contentType: text("content_type"),
    originalName: text("original_name"),
    path: text("path").notNull().default(""),
  },
  (table) => ({
    pathIdx: index().on(table.path),
  })
)

const definition = getTableDefinition(files);
Definition should be something like this:
CREATE TABLE IF NOT EXISTS "files" (
  "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
  "created_at" timestamp DEFAULT now() NOT NULL,
  "url" text NOT NULL,
  "size" numeric,
  "content_type" text,
  "original_name" text,
  "path" text DEFAULT '' NOT NULL
);
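There is no public `getTableDefinition` in drizzle-orm (the DDL generator lives in drizzle-kit), but `getTableConfig(table)` from `drizzle-orm/pg-core` exposes the column metadata, from which a statement can be assembled. A sketch of the assembly step, working from a plain description of that metadata rather than the drizzle objects themselves:

```typescript
// Assemble a CREATE TABLE statement from column metadata. In practice
// you would derive ColumnDesc[] from getTableConfig(table).columns.
interface ColumnDesc {
  name: string;
  sqlType: string;          // e.g. "uuid", "text", "numeric"
  notNull?: boolean;
  primaryKey?: boolean;
  default?: string;         // raw SQL default expression
}

function createTableSql(table: string, columns: ColumnDesc[]): string {
  const defs = columns.map((c) => {
    const parts = [`"${c.name}"`, c.sqlType];
    if (c.primaryKey) parts.push("PRIMARY KEY");
    if (c.default) parts.push(`DEFAULT ${c.default}`);
    if (c.notNull) parts.push("NOT NULL");
    return "  " + parts.join(" ");
  });
  return `CREATE TABLE IF NOT EXISTS "${table}" (\n${defs.join(",\n")}\n);`;
}
```

This intentionally skips indexes, foreign keys, and quoting edge cases; for anything production-grade, generating the SQL with drizzle-kit and reading the emitted migration file is the safer route.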
2 replies
Theo's Typesafe Cult
Created by Gary, el Pingüino Artefacto on 7/31/2024 in #questions
Loading server components on the client
Has anyone tried loading server components on the client? Maybe something like this:
<ServerComponentLoader
  Component={MyComponent}
  someRandomProp="213"
  anotherRandomProp={123}
/>
Or even with children:
<ServerComponentLoader
  Component={CreatePostDialog}
  suggestedTitle="Some title"
>
  <Button>Create post</Button>
</ServerComponentLoader>
2 replies
Drizzle Team
Created by Gary, el Pingüino Artefacto on 6/3/2024 in #help
Helpers for querying in the new PostGIS Geometry type
Hi, I was playing around with the new types and wondered if there are utility helpers for querying points. For example:
const stores = pgTable("stores", {
  id: uuid("id").primaryKey().defaultRandom(),
  name: text("name").notNull(),
  location: geometry("location", { type: 'point', srid: 4326 }),
});

const nearStores = await db
  .select({ id: stores.id, name: stores.name })
  .from(stores)
  .where(ST_DWithin(stores.location, [some_lon, some_lat], 1000))
The SQL should be something like this:
SELECT id, name
FROM stores
WHERE ST_DWithin(location, ST_MakePoint(some_lon, some_lat)::geography, 1000);
Thanks 🙂
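Drizzle ships no PostGIS operators, but the `sql` template makes such a helper easy to hand-roll, e.g. ``sql`ST_DWithin(${stores.location}, ST_MakePoint(${lon}, ${lat})::geography, ${meters})` `` used inside `.where(...)`. A standalone sketch of the same shape, returning the SQL text and bound parameters explicitly so the coordinates never get string-interpolated:

```typescript
// Hand-rolled ST_DWithin helper: SQL text plus parameter list, so lon,
// lat, and the distance stay bound parameters. `column` is the raw
// column reference (e.g. "location").
function stDWithin(
  column: string,
  [lon, lat]: [number, number],
  meters: number,
): { sql: string; params: number[] } {
  return {
    sql: `ST_DWithin(${column}, ST_MakePoint($1, $2)::geography, $3)`,
    params: [lon, lat, meters],
  };
}
```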
3 replies
Drizzle Team
Created by Gary, el Pingüino Artefacto on 4/30/2024 in #help
Filtering a jsonb with the shape Array<{id:string,name:string}>
Hi. I have a query that returns all the posts with their categories, like this:
type Posts = {
  id: string
  title: string
  categories: { id: string; name: string }[]
}[]
I'm trying to filter the posts based on their categories. This is what I have. Helpers:
function jsonBuildObject<T extends SelectedFields>(shape: T) {
  const chunks: SQL[] = []

  Object.entries(shape).forEach(([key, value]) => {
    if (chunks.length > 0) {
      chunks.push(sql.raw(`,`))
    }

    chunks.push(sql.raw(`'${key}',`))

    // json_build_object formats to ISO 8601 ...
    if (is(value, PgTimestampString)) {
      chunks.push(sql`timezone('UTC', ${value})`)
    } else {
      chunks.push(sql`${value}`)
    }
  })

  return sql<SelectResultFields<T>>`coalesce(json_build_object(${sql.join(chunks)}), '{}')`
}

function jsonAggBuildObject<T extends SelectedFields, Column extends AnyColumn>(
  shape: T,
  options?: { orderBy?: { colName: Column; direction: "ASC" | "DESC" } }
) {
  return sql<SelectResultFields<T>[]>`coalesce(jsonb_agg(${jsonBuildObject(shape)}${
    options?.orderBy
      ? sql`order by ${options.orderBy.colName} ${sql.raw(options.orderBy.direction)}`
      : undefined
  }), '${sql`[]`}')`
}

function coalesce<T>(value: SQL.Aliased<T> | SQL<T>, defaultValue: SQL) {
  return sql<T>`coalesce(${value}, ${defaultValue})`
}
Current query:
const postsQuery = db.$with("postsQuery").as(
  db
    .select({
      ...getTableColumns(posts),
      categories: jsonAggBuildObject({
        id: categories.id,
        name: categories.name,
      }).as("categories"),
    })
    .from(posts)
    .leftJoin(postTags, eq(postTags.postId, posts.id))
    .leftJoin(tags, eq(tags.id, postTags.tagId))
    .groupBy(posts.id)
)

const postsWithCategories = await db
  .with(postsQuery)
  .select()
  .from(postsQuery)
  .where(
    sql`${postsQuery.categories} @> '[{"id": "098b9acb-694d-4c00-b87f-64f9811f8810"},{"id":"46bba9b2-9b50-4e70-9284-6cddb2fe32d4"}]'`
  )
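One improvement over hand-writing the JSON literal into the query: build the `@>` containment document from the id list and pass it as a bound parameter, e.g. ``sql`${postsQuery.categories} @> ${payload}::jsonb` `` (column and CTE names here are the ones from the snippet above). The payload construction is a pure function:

```typescript
// Build the containment document for `categories @> $1::jsonb` from a
// list of category ids. jsonb @> matches arrays whose elements contain
// these partial objects, so {name: ...} can be omitted.
function categoryContainment(ids: string[]): string {
  return JSON.stringify(ids.map((id) => ({ id })));
}
```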
4 replies
Convex Community
Created by Gary, el Pingüino Artefacto on 3/28/2024 in #support-community
Syncing with Elasticsearch or any search engine
Hi. Is there any guide or blog post on how to sync Convex data with Elasticsearch? I have been developing an e-commerce app to learn Convex and I would like to add some analytics and facet filters. For example: most-sold items in a date range, orders with the most items, most active users per country. I have been using raw JavaScript with some indexes for the queries, but some of them need join-like behavior. I know there is an Airbyte connector, but I'm not very familiar with Airbyte.
16 replies
Drizzle Team
Created by Gary, el Pingüino Artefacto on 3/3/2024 in #help
sql operator with array of strings
This works
const example = await db
  .select({
    col: sql`ARRAY['1', '2', '3']`,
  })
  .from(users)
  .limit(1)
This also works
const example = await db
  .select({
    col: sql`ARRAY['1', '2', '3'] @> ARRAY['1', '2']`,
  })
  .from(users)
  .limit(1)
This crashes
const example = await db
  .select({
    col: sql`ARRAY['1', '2', '3'] @> ${["1", "2"]}`,
  })
  .from(users)
  .limit(1)
Is there any workaround to pass an array in the sql operator?
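One workaround is to expand the array into per-element placeholders, which in drizzle looks like ``sql`ARRAY['1', '2', '3'] @> ARRAY[${sql.join(values.map((v) => sql`${v}`), sql`, `)}]` `` (`sql.join` is drizzle's list-joining helper). A standalone sketch of the same expansion, showing the placeholder text and bound parameters it produces:

```typescript
// Expand a JS array into per-element SQL placeholders so each value is
// bound individually instead of being serialized as one parameter.
function arrayPlaceholders(values: string[], startAt = 1): {
  sql: string;
  params: string[];
} {
  const slots = values.map((_, i) => `$${startAt + i}`);
  return { sql: `ARRAY[${slots.join(", ")}]`, params: values };
}
```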
4 replies
Theo's Typesafe Cult
Created by Gary, el Pingüino Artefacto on 2/20/2024 in #questions
React Grid Layout automatic detection
Hi. I'm working with React Grid Layout in a responsive way, where I need to compute the correct layout based on each widget's "type". Here is an example. Input:
[
  { id: "1", type: "simple-value" },
  { id: "2", type: "simple-value" },
  { id: "3", type: "simple-value" },
  { id: "4", type: "list" },
  { id: "5", type: "table" },
]
Output:
{
  xs: [
    { w: 1, h: 1, x: 0, y: 0, i: "1" },
    { w: 1, h: 1, x: 0, y: 1, i: "2" },
    { w: 1, h: 1, x: 0, y: 2, i: "3" },
    { w: 1, h: 2, x: 0, y: 3, i: "4" },
    { w: 1, h: 2, x: 0, y: 5, i: "5" },
  ],
  sm: [
    { w: 1, h: 1, x: 0, y: 0, i: "1" },
    { w: 1, h: 1, x: 1, y: 0, i: "2" },
    { w: 1, h: 1, x: 0, y: 1, i: "3" },
    { w: 2, h: 1, x: 0, y: 2, i: "4" },
    { w: 1, h: 1, x: 1, y: 1, i: "5" },
  ],
  md: [
    { w: 1, h: 1, x: 0, y: 0, i: "1" },
    { w: 1, h: 1, x: 1, y: 0, i: "2" },
    { w: 1, h: 1, x: 2, y: 0, i: "3" },
    { w: 2, h: 1, x: 0, y: 1, i: "4" },
    { w: 1, h: 2, x: 2, y: 1, i: "5" },
  ],
  lg: [
    { w: 2, h: 1, x: 0, y: 0, i: "1" },
    { w: 2, h: 1, x: 2, y: 0, i: "2" },
    { w: 2, h: 1, x: 4, y: 0, i: "3" },
    { w: 4, h: 1, x: 0, y: 1, i: "4" },
    { w: 2, h: 2, x: 4, y: 1, i: "5" },
  ],
  xl: [
    { w: 3, h: 1, x: 0, y: 0, i: "1" },
    { w: 3, h: 1, x: 3, y: 0, i: "2" },
    { w: 3, h: 1, x: 6, y: 0, i: "3" },
    { w: 6, h: 1, x: 0, y: 1, i: "4" },
    { w: 3, h: 2, x: 6, y: 1, i: "5" },
  ],
  "2xl": [
    { w: 3, h: 1, x: 0, y: 0, i: "1" },
    { w: 3, h: 1, x: 3, y: 0, i: "2" },
    { w: 3, h: 2, x: 9, y: 0, i: "3" },
    { w: 9, h: 1, x: 0, y: 1, i: "4" },
    { w: 3, h: 1, x: 6, y: 0, i: "5" },
  ],
}
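The core of this is a packing function: give each widget type a size per breakpoint, then place widgets left-to-right and wrap when a row is full. A simplified sketch (the `WIDGET_SIZES` values are a guess at the pattern in the example above, not an exact reproduction of it, and it ignores the hole-filling a real packer would do):

```typescript
// Pack widgets left-to-right, top-to-bottom for one breakpoint.
type WidgetType = "simple-value" | "list" | "table";
interface Widget { id: string; type: WidgetType }
interface Item { i: string; x: number; y: number; w: number; h: number }

const WIDGET_SIZES: Record<WidgetType, { w: number; h: number }> = {
  "simple-value": { w: 1, h: 1 },
  list: { w: 2, h: 1 },
  table: { w: 2, h: 2 },
};

function packLayout(widgets: Widget[], cols: number): Item[] {
  let x = 0, y = 0, rowH = 0;
  return widgets.map(({ id, type }) => {
    const { w, h } = WIDGET_SIZES[type];
    const width = Math.min(w, cols); // clamp to the grid width
    if (x + width > cols) { x = 0; y += rowH; rowH = 0; } // wrap row
    const item = { i: id, x, y, w: width, h };
    x += width;
    rowH = Math.max(rowH, h);
    return item;
  });
}
```

Running this once per breakpoint (`xs: packLayout(widgets, 1)`, `lg: packLayout(widgets, 6)`, ...) gives the responsive `layouts` object React Grid Layout expects.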
5 replies
Theo's Typesafe Cult
Created by Gary, el Pingüino Artefacto on 1/13/2024 in #questions
Vercel stream keep alive???
I just found this Git repo for publishing messages with Next.js and Upstash, and I have some questions. https://github.com/rishi-raj-jain/upstash-nextjs-publish-messages-with-sse-example/blob/master/app/api/stream/route.js
// Can be 'nodejs', but Vercel recommends using 'edge'
export const runtime = 'nodejs'

// Prevents this route's response from being cached
export const dynamic = 'force-dynamic'

// Use ioredis to subscribe
import Redis from 'ioredis'

// Define the key to listen and publish messages to
const setKey = 'posts'

// Create a redis subscriber
const redisSubscriber = new Redis(process.env.UPSTASH_REDIS_URL)

export async function GET() {
  const encoder = new TextEncoder()
  // Create a stream
  const customReadable = new ReadableStream({
    start(controller) {
      // Subscribe to Redis updates for the key: "posts"
      // In case of any error, just log it
      redisSubscriber.subscribe(setKey, (err) => {
        if (err) console.log(err)
      })
      // Listen for new posts from Redis
      redisSubscriber.on('message', (channel, message) => {
        // Send data with the response in the SSE format
        // Only forward messages received on the channel we subscribed to
        if (channel === setKey) controller.enqueue(encoder.encode(`data: ${message}\n\n`))
      })
    },
  })
  // Return the stream and try to keep the connection alive
  return new Response(customReadable, {
    // Set headers for Server-Sent Events (SSE) / stream from the server
    headers: { 'Content-Type': 'text/event-stream; charset=utf-8', Connection: 'keep-alive', 'Cache-Control': 'no-cache, no-transform', 'Content-Encoding': 'none' },
  })
}
Isn't this going to time out, or burn a huge amount of function execution time? What happens when it's deployed to the edge? Does this even work without the bill going to the moon?
2 replies
tRPC
Created by Gary, el Pingüino Artefacto on 2/4/2023 in #❓-help
Extending middlewares
3 replies