Use cases
Real patterns from real engineering teams. Each one replaced days of custom infrastructure with a 2-minute setup.
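Every handler below destructures the same event payload. As a rough sketch of its shape — field names are inferred from the handlers themselves, so treat this as illustrative rather than the authoritative schema:

```javascript
// A hypothetical StreamHook event for an order UPDATE. Field names match the
// handlers below; the exact schema may differ, so check the docs before relying on it.
const event = {
  id: 'evt_123',                         // unique event id
  timestamp: '2024-05-01T12:00:00Z',     // when the change was captured
  operation: 'UPDATE',                   // INSERT | UPDATE | DELETE
  source: { table: 'orders' },           // which table changed
  before: { id: 42, status: 'paid' },    // row snapshot before the change
  after: { id: 42, status: 'shipped' },  // row snapshot after (absent on DELETE)
};

console.log(event.operation, event.source.table);
```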
Keep services in sync without building event publishers in every service.
You have 5 microservices that need to react when an order status changes. Today, you either publish events from application code (duplicating logic across services) or poll a shared database (laggy and wasteful).
StreamHook watches your orders table and delivers change events to each service's webhook. When an order moves to "shipped," your shipping service, notification service, and analytics service all know within a second — without any code changes to your order service.
// shipping-service/webhook.js
app.post('/streamhook', (req, res) => {
  const { before, after } = req.body;
  // Fire only on the transition into "shipped", not on every later update.
  if (after?.status === 'shipped' && before?.status !== 'shipped') {
    shipOrder(after.id, after.address);
  }
  res.status(200).send('ok');
});

Trigger emails, SMS, and push notifications the moment data changes.
Your users expect instant notifications — "Your order shipped!" "Payment received!" — but your notification system runs on a 5-minute cron job. Users complain about delays. Your team patches it with more frequent polling, burning compute.
StreamHook delivers a webhook event within 1 second of a database change. Wire it to your notification service and your users get instant alerts. No polling, no artificial delay.
// notification-service/webhook.js
app.post('/streamhook', (req, res) => {
  const { before, after, source } = req.body;
  // Notify only when an order first becomes "paid" ("after" is absent on DELETEs).
  if (source.table === 'orders' && after?.status === 'paid' && before?.status !== 'paid') {
    sendEmail(after.customer_email, 'payment-received');
    sendSMS(after.customer_phone, 'Payment received!');
  }
  res.status(200).send('ok');
});

Keep Elasticsearch, Algolia, or Typesense perfectly in sync with your database.
Your search index drifts out of sync with your database. Users get results for products that have since been updated or deleted. You run a full reindex every night, but that means up to 24 hours of stale results.
Every INSERT, UPDATE, and DELETE is streamed to your search service in real time. Your search index reflects your database within seconds, not hours.
// search-sync/webhook.js
app.post('/streamhook', async (req, res) => {
  const { operation, after, before } = req.body;
  if (operation === 'DELETE') {
    await elasticsearch.delete({ index: 'products', id: before.id });
  } else {
    await elasticsearch.index({ index: 'products', id: after.id, body: after });
  }
  res.status(200).send('ok');
});

Capture every change with before/after snapshots — without modifying application code.
Compliance requires a complete audit trail of data changes. Building audit logging into every service means scattered implementations, missed edge cases, and a maintenance nightmare.
StreamHook captures every change at the database level — including direct SQL updates, admin scripts, and migration changes that bypass your application layer. Every event includes full before/after snapshots.
// audit-service/webhook.js
app.post('/streamhook', async (req, res) => {
  const { id, timestamp, source, operation, before, after } = req.body;
  await auditStore.insert({
    event_id: id,
    table: source.table,
    operation,
    before_state: before,
    after_state: after,
    captured_at: timestamp,
  });
  res.status(200).send('ok');
});

Bust stale caches the instant underlying data changes.
"There are only two hard things in computer science: cache invalidation and naming things." Your Redis cache serves stale data because TTLs are a rough approximation. Setting them too short hurts performance; too long serves wrong data.
When a row changes, StreamHook tells your cache layer instantly. Invalidate the exact keys that need refreshing. No TTL guesswork. No stale data.
// cache-invalidator/webhook.js
app.post('/streamhook', async (req, res) => {
  const { source, after, before } = req.body;
  // DELETE events carry no "after" snapshot, so fall back to "before".
  const record = after || before;
  await redis.del(`${source.table}:${record.id}`);
  await redis.del(`${source.table}:list`);
  res.status(200).send('ok');
});

Stream operational data to your analytics warehouse without ETL delays.
Your analytics team relies on nightly batch ETL jobs. They are always working with yesterday's data. Product decisions are based on stale metrics. A/B tests take an extra day to evaluate.
Stream every database change to your analytics pipeline in real time. Feed ClickHouse, BigQuery, or Snowflake with live data. Make decisions based on what is happening now.
// analytics-pipeline/webhook.js
app.post('/streamhook', async (req, res) => {
  const { source, operation, after, timestamp } = req.body;
  await bigquery.insert('raw_events', {
    table: source.table,
    operation,
    payload: JSON.stringify(after),
    event_time: timestamp,
  });
  res.status(200).send('ok');
});

Start free. Connect your database in 2 minutes. Build whatever your team needs.
No credit card required. Free forever on the Free plan.