neoptex
Kafka Streams .NET – Is this the right approach for merging product data from Debezium?
Hi everyone 👋
I’m currently migrating a legacy system where product information is spread across 8 different tables in a SQL Server database.
📌 Goal:
Merge all the data from those tables into a single enriched product message, and store it in a NoSQL database.
🛠️ What I’ve done so far:
- I’m using Debezium to capture real-time changes from the SQL Server tables.
- Each table has its own Kafka topic.
- I then use Kafka Streams (via Streamiz.Kafka.Net) to join the resulting KTables and produce the enriched product message (a minimal sketch of the topology is shown right after this list).
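Here is a minimal Streamiz.Kafka.Net sketch of that topology, simplified to two tables. Topic names, the string serdes, and the value joiner are placeholders only; real Debezium values carry a change envelope and would typically be unwrapped (e.g. with the ExtractNewRecordState SMT) and read with a proper JSON/Avro serde, and exact overloads can vary by Streamiz version:

```csharp
using System;
using System.Threading.Tasks;
using Streamiz.Kafka.Net;
using Streamiz.Kafka.Net.SerDes;

class Program
{
    static async Task Main(string[] args)
    {
        // Default key/value serdes are strings here only to keep the sketch short;
        // real Debezium payloads would need a JSON or Avro serde instead.
        var config = new StreamConfig<StringSerDes, StringSerDes>
        {
            ApplicationId = "product-enricher",   // also prefixes internal changelog topics
            BootstrapServers = "localhost:9092"
        };

        var builder = new StreamBuilder();

        // One KTable per Debezium CDC topic (topic names are placeholders).
        var products = builder.Table<string, string>("sqlserver.dbo.product");
        var prices   = builder.Table<string, string>("sqlserver.dbo.product_price");

        // KTable-KTable join on the record key (product id); every update on
        // either side re-emits a joined record downstream.
        var enriched = products.Join(prices, (product, price) => $"{product}|{price}");

        // Publish the enriched product to the output topic.
        enriched.ToStream().To("product-enriched");

        var stream = new KafkaStream(builder.Build(), config);
        Console.CancelKeyPress += (o, e) => stream.Dispose();
        await stream.StartAsync();
    }
}
```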
❓My questions:
- Is this a good fit for Kafka Streams?
Would you say this is the right approach for this type of use case?
- Is this normal behavior?
Each time I refresh Kafka UI and check the output topic, my app seems to have processed another 500 to 1000 messages. It feels a bit heavy and slow, so I’m wondering if I’m missing something in the setup.
To illustrate, I’m sharing my example above using only 2 tables, and it’s this version that generates between 500 and 1000 messages in the output topic.
When I say “refresh,” I mean I’m repeatedly hitting F5 in Kafka UI and watching the number of messages in the topic climb.
Thanks a lot in advance for any guidance! 🙏