
The Notion API in Anger: Building Real Integrations That Last

Notion is wonderful as a workspace and frustrating as a database. The API works, but it has a personality. This is what we've learned shipping six Notion integrations into client workflows — the patterns that survive, and the ones we now refuse to use.

Codecanis Admin

9 min read
A Notion-backed knowledge base wired into Slack, Linear, and the engineering team's standup loop.

Notion is a wonderful product to live inside and a strange product to integrate against. The mental model is "rich text editor with embedded databases," and the API faithfully reflects that — which means every database row is also a page, every page is also a tree of blocks, and most operations require thinking about both layers simultaneously.

We've now shipped six Notion integrations into client workflows: a knowledge base for an engineering team that syncs with Linear, an editorial calendar that pushes to a CMS, a meeting-notes pipeline that posts summaries to Slack, an internal CRM, an OKR tracker, and a sales playbook. This is the playbook of what works, what doesn't, and what we now refuse to do.

The Block Model Is the API's Most Important Idea

Everything in Notion is a block. A page is a block that contains blocks. A database row is a page (which is a block) whose properties match the parent database's schema. A toggle is a block that contains other blocks. The API reflects this faithfully, which means almost no operation is a single API call.

Concrete example: "create a new task with a description, a checklist, and a code snippet." This is one logical operation, but it requires:

  1. POST /v1/pages with the database parent and the row's properties (title, status, due date).
  2. PATCH /v1/blocks/{page_id}/children to append the description paragraph, the checklist blocks, and the code block.

If you forget that the page-creation endpoint only sets properties and not content, you end up with an empty task and a confused user. We've seen this bug ship to production more than once.
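Wired up with the official SDK, the two-step create looks roughly like this. This is a sketch: the property names Name, Status, and Due Date are examples that must match your database's schema, and notion is an initialized Client from @notionhq/client, passed in so the helper stays testable.

```javascript
// Sketch: create a task row, then populate its body. Property names are
// examples; they must match the target database's schema exactly.
async function createTaskWithBody(notion, databaseId, { title, status, due }) {
  // Step 1: POST /v1/pages. This sets properties only, no page content.
  const page = await notion.pages.create({
    parent: { database_id: databaseId },
    properties: {
      Name: { title: [{ text: { content: title } }] },
      Status: { select: { name: status } },
      "Due Date": { date: { start: due } },
    },
  });

  // Step 2: PATCH /v1/blocks/{page_id}/children, the actual body.
  // Skipping this step is the "empty task" bug described above.
  await notion.blocks.children.append({
    block_id: page.id,
    children: [
      { paragraph: { rich_text: [{ text: { content: "Task description" } }] } },
      { to_do: { rich_text: [{ text: { content: "First checklist item" } }], checked: false } },
      { code: { rich_text: [{ text: { content: "npm test" } }], language: "shell" } },
    ],
  });

  return page.id;
}
```

Treat the two calls as one unit of work in your own code: if step 2 fails, retry it against the page ID from step 1 rather than creating a second page.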

Rate Limits and Pagination

Notion's official rate limit is 3 requests per second per integration, averaged over a short window. The API returns 429 with a Retry-After header, but in practice we see soft slowdowns before hard limits — the API takes a few hundred extra milliseconds per request as you approach the ceiling.

Pagination is cursor-based and asks you to actually follow the cursor:

import { Client } from "@notionhq/client";

const notion = new Client({ auth: process.env.NOTION_TOKEN });

async function queryAllPages(databaseId, filter) {
  const results = [];
  let cursor = undefined;

  do {
    const response = await notion.databases.query({
      database_id: databaseId,
      filter,
      start_cursor: cursor,
      page_size: 100,
    });

    results.push(...response.results);
    cursor = response.has_more ? response.next_cursor : undefined;

    // Soft self-throttle below the 3 req/sec ceiling
    if (cursor) await new Promise(r => setTimeout(r, 400));
  } while (cursor);

  return results;
}

The 400ms throttle is generous on purpose. Notion's rate limit is global across your integration's token, which means a sync job that hammers the API will starve any user-facing webhook handler running through the same integration. Reserve at least 30% of your rate budget for interactive operations.
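On top of the throttle, a retry wrapper that honors Retry-After is cheap insurance. The sketch below assumes the error object exposes status, code, and headers the way @notionhq/client's thrown errors do; adapt the checks to whatever HTTP client you actually use.

```javascript
// Sketch of a 429-aware retry wrapper. The error shape (status, code,
// headers) is an assumption about the SDK's thrown error; adjust as needed.
async function withRateLimitRetry(fn, maxAttempts = 5) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      const rateLimited = err.status === 429 || err.code === "rate_limited";
      if (!rateLimited || attempt >= maxAttempts) throw err;

      // Honor Retry-After (seconds) when present; otherwise back off exponentially.
      const header =
        typeof err.headers?.get === "function"
          ? err.headers.get("retry-after")
          : err.headers?.["retry-after"];
      const delaySec = header != null ? Number(header) : 2 ** attempt;
      await new Promise((r) => setTimeout(r, delaySec * 1000));
    }
  }
}
```

Wrap individual calls (withRateLimitRetry(() => notion.pages.update(...))) rather than whole jobs, so a retry replays one request instead of the entire sync.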

Schema Drift Is the Real Enemy

Notion databases let users add, remove, and rename properties through the UI at any time. Your integration hardcodes properties.Status.select.name; then one morning the property has been renamed to "State", every lookup returns undefined, and your code dutifully writes null everywhere. The sync fails silently.

Our defence is a two-part pattern:

First, retrieve the schema at startup and at a fixed interval, and verify that the properties your integration depends on still exist with the expected types:

const REQUIRED_PROPERTIES = {
  Name: "title",
  Status: "select",
  "Due Date": "date",
  Owner: "people",
  Priority: "select",
};

class SchemaDriftError extends Error {}

async function verifySchema(databaseId) {
  const db = await notion.databases.retrieve({ database_id: databaseId });
  const missing = [];

  for (const [name, expectedType] of Object.entries(REQUIRED_PROPERTIES)) {
    const prop = db.properties[name];
    if (!prop) {
      missing.push(`${name} (missing entirely)`);
      continue;
    }
    if (prop.type !== expectedType) {
      missing.push(`${name} (expected ${expectedType}, got ${prop.type})`);
    }
  }

  if (missing.length) {
    throw new SchemaDriftError(
      `Notion database schema drift detected: ${missing.join(", ")}`
    );
  }
}

Second, fail loudly on drift. Send a Slack alert. Pause the sync. Do not attempt to write partial data — silent partial writes are how you end up with thousands of corrupt rows that have to be reconciled by hand.
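Wired together, the guard is a few lines. This is a sketch: verify is a schema check in the style of verifySchema above, and alertSlack and pauseSync are placeholders for whatever alerting and job control you run.

```javascript
// Sketch of the fail-loudly guard. All callbacks are placeholders: verify
// throws on drift, alertSlack posts to your alerting channel, pauseSync
// halts the job until a human intervenes.
async function guardedSync(verify, runSync, { alertSlack, pauseSync }) {
  try {
    await verify();
  } catch (err) {
    await alertSlack(`Notion sync paused: ${err.message}`);
    await pauseSync();
    return false; // no partial writes: nothing runs until the schema is fixed
  }
  await runSync();
  return true;
}
```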

Rollups and Relations: Read-Only Land

Notion supports relations (foreign key-style links between databases) and rollups (aggregations across relations). The API can read both, but it cannot write rollups, and writing relations is awkward — you set the relation property to an array of page IDs, which means you need the linked record's UUID before you can create the link.

Practical implications:

  • If your integration computes derived values, store them in regular fields, not rollups. Update them yourself via the API.
  • When syncing data into Notion that has foreign keys, do a two-pass write: create all the entities first, then go back and set the relations once you have the page IDs.
  • Don't make rollups load-bearing in any external workflow — they're for human consumption only.
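The two-pass write can be sketched like this. The schema here is hypothetical: property names Name, External ID, and Parent are examples, and each row's parentExternalId points at another row in the same batch.

```javascript
// Sketch of a two-pass write for relations against a hypothetical schema.
async function syncWithRelations(notion, databaseId, rows) {
  const idByExternal = new Map();

  // Pass 1: create every page without relations, remembering Notion's UUIDs.
  for (const row of rows) {
    const page = await notion.pages.create({
      parent: { database_id: databaseId },
      properties: {
        Name: { title: [{ text: { content: row.name } }] },
        "External ID": { rich_text: [{ text: { content: row.externalId } }] },
      },
    });
    idByExternal.set(row.externalId, page.id);
  }

  // Pass 2: every page ID is now known, so set the relation properties.
  for (const row of rows) {
    if (!row.parentExternalId) continue;
    await notion.pages.update({
      page_id: idByExternal.get(row.externalId),
      properties: {
        Parent: { relation: [{ id: idByExternal.get(row.parentExternalId) }] },
      },
    });
  }
}
```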

Webhooks vs Polling

Notion shipped a webhooks API in 2024 and it works, but with caveats. Webhooks fire on page property changes within databases the integration has access to, but they do not fire on every block-level edit — you can update a page's body content without triggering a webhook if no database property changed.

For our integrations that need to react to content edits (the knowledge base that syncs page bodies to a search index, the meeting notes pipeline), we still poll. The pattern:

// Poll for pages edited since last sync, using Notion's last_edited_time
async function pollEditedPages(databaseId, sinceIso) {
  return await queryAllPages(databaseId, {
    timestamp: "last_edited_time",
    last_edited_time: { on_or_after: sinceIso },
  });
}

// Persist the "watermark" — the last time we successfully synced.
// On every run, fetch pages edited at-or-after watermark, then advance it
// to (now - 60s) to handle clock skew and Notion's eventual consistency.
const watermark = await store.getWatermark("notion_sync");
const editedPages = await pollEditedPages(databaseId, watermark);

for (const page of editedPages) {
  await syncPage(page); // idempotent by page id
}

await store.setWatermark(
  "notion_sync",
  new Date(Date.now() - 60_000).toISOString(),
);

The 60-second backstep on the watermark is critical. Notion's last_edited_time is eventually consistent — you can edit a page and not have the new timestamp visible to the API for several seconds. Without the backstep you lose edits at the boundary.

Idempotency Through External Keys

Notion's internal page IDs (the UUIDs in page.id) are stable for the life of the page, but they're not useful as your integration's primary key — you don't know them until after you've created a page.

The pattern: store an external key as a Notion property (a "rich text" field works well; we call ours External ID). Your sync logic queries by external key first, updates if a page exists, creates if it doesn't:

async function upsertByExternalId(databaseId, externalId, properties, children) {
  // Look up by external id (idempotency key)
  const { results } = await notion.databases.query({
    database_id: databaseId,
    filter: {
      property: "External ID",
      rich_text: { equals: externalId },
    },
    page_size: 1,
  });

  if (results.length) {
    const pageId = results[0].id;
    await notion.pages.update({ page_id: pageId, properties });
    // For children, replace strategy: delete existing, append new
    return pageId;
  }

  const page = await notion.pages.create({
    parent: { database_id: databaseId },
    properties: {
      ...properties,
      "External ID": { rich_text: [{ text: { content: externalId } }] },
    },
    children,
  });
  return page.id;
}

This survives retries, recovers from partial failures, and gives you a clean way to reconcile state between Notion and your source system. Every Notion integration we ship now uses this pattern.
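For the children side of the upsert, the replace strategy the comment alludes to can be sketched as follows, using the SDK's blocks.children.list, blocks.delete, and blocks.children.append methods (deleting a block archives it).

```javascript
// Sketch of the "replace children" strategy: collect all existing block IDs
// first (pagination-safe), archive them, then append the new content.
async function replaceChildren(notion, pageId, newChildren) {
  const existingIds = [];
  let cursor;
  do {
    const resp = await notion.blocks.children.list({
      block_id: pageId,
      start_cursor: cursor,
      page_size: 100,
    });
    existingIds.push(...resp.results.map((b) => b.id));
    cursor = resp.has_more ? resp.next_cursor : undefined;
  } while (cursor);

  // Delete (archive) after listing, so pagination isn't invalidated mid-walk.
  for (const id of existingIds) {
    await notion.blocks.delete({ block_id: id });
  }

  await notion.blocks.children.append({ block_id: pageId, children: newChildren });
}
```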

The Things We Refuse to Do

  • Two-way sync between Notion and another database. The conflict resolution problems are nightmarish given Notion's eventual consistency. Pick a direction. Make the other side read-only. Be at peace.
  • Use Notion as a customer-facing data store. Rate limits make any non-trivial read load infeasible. Use Notion for the internal team that authors content; use a real CDN or Postgres for the customer-facing surface.
  • Render arbitrary Notion blocks to HTML in production without an allow-list. Embeds, equations, file blocks, and the rest of the long tail will break your renderer in interesting ways. Limit yourself to the 8–10 block types you actually support and explicitly drop the rest.
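An allow-list renderer in that spirit might look like the sketch below: only explicitly supported block types render, and everything else is counted so you can see what's being dropped. The rich-text handling is deliberately minimal; plain_text comes straight off the API's rich text items, ignoring annotations.

```javascript
// Sketch of an allow-list block renderer. Unsupported types are dropped
// and counted rather than guessed at.
const escapeHtml = (s) =>
  s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");

const plain = (richText) => richText.map((t) => t.plain_text).join("");

const RENDERERS = {
  paragraph: (b) => `<p>${escapeHtml(plain(b.paragraph.rich_text))}</p>`,
  heading_1: (b) => `<h1>${escapeHtml(plain(b.heading_1.rich_text))}</h1>`,
  heading_2: (b) => `<h2>${escapeHtml(plain(b.heading_2.rich_text))}</h2>`,
  bulleted_list_item: (b) => `<li>${escapeHtml(plain(b.bulleted_list_item.rich_text))}</li>`,
  code: (b) => `<pre><code>${escapeHtml(plain(b.code.rich_text))}</code></pre>`,
};

function renderBlocks(blocks) {
  const dropped = {};
  const html = blocks
    .map((b) => {
      const render = RENDERERS[b.type];
      if (!render) {
        dropped[b.type] = (dropped[b.type] || 0) + 1;
        return ""; // explicitly drop unsupported types
      }
      return render(b);
    })
    .join("\n");
  return { html, dropped };
}
```

Log the dropped counts in production; they tell you which block types users actually need before anyone files a bug.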

Key Takeaways

  • Everything is a block. Page creation and content population are separate operations — never forget the second one.
  • The 3 req/sec rate limit is global per integration. Reserve 30% for interactive paths; throttle background jobs.
  • Verify the database schema at startup and fail loudly on drift. Silent partial writes are the worst outcome.
  • Rollups are read-only. Two-pass writes for relations. Compute derived values in regular fields.
  • Use webhooks for property changes; poll for body content edits with a watermark backstep for eventual consistency.
  • Idempotency through an External ID property. Upsert, never blind-create.
  • One-way sync only. Notion is a source or a sink — never both.