Integrations · MCP · API · BYOK · Network

Wired into every tool you already use.

Five day-one connectors. An MCP server so any AI client can tap into your company’s pulse. A public REST API and outbound webhooks for programmatic integration. Bring-your-own-LLM-key for procurement-friendly deployments. Federated insights without federated data.

Read MCP docs
at a glance
Connectors at GA
5
More by Q4
3
MCP tools exposed
9
Pulse Network customers
47
Note: All connectors are OAuth read-only · no agents installed · ACLs mirrored, never expanded.
Day-one connectors · D1

Five tools cover 85% of where work lives.

We could ship 100 connectors. We chose five: the ones modern teams actually live in. More land through Q3/Q4 based on demand, not to chase a feature checklist.

    Slack

    Channels, threads, DMs · ACL-mirrored

    GA

    GitHub

    Issues, PRs, reviews, commit metadata

    GA

    Notion

    Pages + databases · workspace ACLs

    GA

    Linear

    Cycles, projects, issues, comments

    GA

    Calendar

    Events you organize / attend · OOO

    GA

    Otter / Granola / Fireflies

    Meeting transcripts → Decision Memory

    Beta

    Google Drive

    Docs, Sheets, Slides · queued by demand

    Q3 2026

    Confluence

    Spaces, pages, comments

    Q3 2026

    Jira

    Issues, sprints, custom fields

    Q4 2026
MCP server · A3

Pulse in every AI tool you already use.

A standardized Model Context Protocol server at mcp.pulse.app. Any MCP client (Claude, Cursor, Cowork, ChatGPT) can connect via OAuth and query your company’s pulse. Read-only in v1, always permission-checked as the requesting user.

OAuth 2.0 · Per-user permission scope · 100 calls/hr free · 1k/hr paid
tools exposed via MCP
  • search_company

    Hybrid search across all entities · returns ranked results with citations

  • get_decision

    Fetch a decision by ID with full evidence trail

  • list_recent_decisions

    Last N decisions, filtered by team/topic/date

  • find_expert

    Given a topic or problem description, return ranked humans + availability

  • get_feature

    Full feature record: stages, owners, ship history, linked decisions

  • list_similar_features

    Given a one-paragraph description, return historically similar past features for Planner-style research

  • get_person

    Person profile + inferred skill graph + current availability

  • get_pulse

    User's current Pulse: briefing, diff, signals

  • list_commitments

    Open commitments by or to a given person, filterable by status

Read-only today · Write actions (create commitments, mark decisions resolved) are on the roadmap with explicit per-tool grants.

Claude Desktop config
{
  "mcpServers": {
    "pulse": {
      "url": "https://mcp.pulse.app",
      "transport": "sse",
      "auth": {
        "type": "oauth",
        "clientId": "$YOUR_CLIENT_ID"
      }
    }
  }
}
Public REST API · D7

Pull, post, subscribe.

Six core REST endpoints cover decisions, features, commitments, people, topics, and Ask, plus document push and GraphQL. Outbound webhooks fire for every entity event Pulse emits, signed with HMAC-SHA256 so your handler can trust the payload. Generate a workspace-scoped bearer token at /app/admin/api-keys; subscribe to events at /app/admin/webhooks.

Bearer token · 100 req/min · 10K/day · HMAC-SHA256
Endpoints
  • GET
    /v1/decisions

    Filter by since · topic · cursor · limit

  • GET
    /v1/features

    Filter by stage · status (shipped / in-flight)

  • GET
    /v1/commitments

    Filter by owner · status

  • GET
    /v1/people

    List workspace members

  • GET
    /v1/topics

    List recurring topics

  • POST
    /v1/ask

Body: { query, mode: "quick" | "deep" | "counter" }

  • POST
    /v1/documents

    Push a custom doc into Pulse · idempotent on retry · /docs/push-api

  • POST
    /v1/graphql

    Queryable knowledge graph · /docs/graphql

Responses are wrapped in { data, meta }; errors return { error } with a matching HTTP status. Cursor-paginated.
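The Ask endpoint is the one POST in the core six, and its mode values are easy to get wrong. A minimal sketch in Python of building its body; the helper name and the response-handling details are illustrative, not part of the documented API:

```python
VALID_MODES = {"quick", "deep", "counter"}

def ask_payload(query: str, mode: str = "quick") -> dict:
    """Build the POST /v1/ask body; modes per the endpoint list above."""
    if mode not in VALID_MODES:
        raise ValueError(f"mode must be one of {sorted(VALID_MODES)}")
    return {"query": query, "mode": mode}
```

With `requests`, the call itself would be `requests.post("https://api.pulse.app/v1/ask", json=ask_payload("why did Q3 slip?", "deep"), headers={"Authorization": f"Bearer {token}"})`, and the answer arrives inside the `{ data, meta }` envelope described above.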

curl example
curl https://api.pulse.app/v1/decisions \
  -H "Authorization: Bearer pk_live_xxxxxxxxxxxx"
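Cursor pagination is easiest to consume with a small generator. A sketch in Python, assuming the envelope carries the next cursor at `meta.next_cursor`; that field name is an assumption, not confirmed by the docs:

```python
def paginate(fetch_page):
    """Yield items across pages by following the cursor until it runs out.

    fetch_page(cursor) must return the { data, meta } envelope described
    above; 'next_cursor' is an assumed field name inside meta.
    """
    cursor = None
    while True:
        page = fetch_page(cursor)
        yield from page["data"]
        cursor = page.get("meta", {}).get("next_cursor")
        if not cursor:
            break
```

With `requests`, `fetch_page` is a thin wrapper, e.g. `lambda c: requests.get("https://api.pulse.app/v1/decisions", headers={"Authorization": f"Bearer {token}"}, params={"cursor": c}).json()`.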
Outbound webhooks

Every entity event Pulse emits: decision.*, commitment.*, feature.*, agent_action.*, access_request.*, customer.churn_risk.

Delivery is at-least-once with exponential backoff (1m / 5m / 15m / 1h / 6h). Verify the X-Pulse-Signature header before trusting the payload; the verification snippet lives in the API docs.
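Verification boils down to a constant-time comparison of the header against your own HMAC-SHA256 of the raw body. A sketch in Python; the header's encoding (a lowercase hex digest here) is an assumption, so defer to the canonical snippet in the API docs:

```python
import hashlib
import hmac

def verify_signature(secret: bytes, raw_body: bytes, header_value: str) -> bool:
    """Return True only if X-Pulse-Signature matches our own HMAC-SHA256.

    Assumes the header carries a hex digest of the raw request body; compare
    with hmac.compare_digest to avoid timing leaks.
    """
    expected = hmac.new(secret, raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, header_value)
```

Verify against the raw request bytes before JSON parsing: re-serializing the payload can reorder keys or change whitespace and silently break the signature.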

Bring-your-own-key · B5

Your model contracts. Your audit trail.

Pipe your own Anthropic, OpenAI, Azure, or Bedrock key. Pulse charges a flat per-seat fee with no LLM markup; your model bill goes directly to your provider. Procurement-friendly · cost-passthrough · audit-ready.

Anthropic

Claude Sonnet · Opus · Haiku

flat per-seat · no LLM markup

OpenAI

GPT-4o · o-series · embeddings

flat per-seat · no LLM markup

Azure OpenAI

Enterprise tenant · regional endpoints

flat per-seat · no LLM markup

AWS Bedrock

Claude · Llama · provisioned throughput

flat per-seat · no LLM markup
Pulse Network · on the roadmap

Federated insights, never federated data.

Pulse calibrates against an opt-in network of customers: 47 at last count. Aggregate signals like “companies your size estimate features at 4w but ship at 7w” become first-class context. Your data never leaves your tenant.

Opt-in only · Differential privacy · Aggregate · k ≥ 5

    Companies your size estimate features at 4w but ship at 7w on average

    based on 1,240 features across 47 customers

    67% of decision-to-execution gaps close within 21 days of being surfaced

    based on 3,820 decisions across 47 customers

    Teams with permission-aware AI see 2.4× the Ask Bar engagement

    based on longitudinal · 12 quarters

    Companies onboarding 3+ engineers simultaneously lose 1.5 quarters of velocity

    based on 82 hiring waves
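The k ≥ 5 rule above means a benchmark is published only when at least five customers contribute to it. A purely illustrative sketch of that suppression rule, not Pulse's actual pipeline (which also layers differential-privacy noise on top):

```python
def publish_benchmark(samples_by_customer: dict[str, list[float]], k: int = 5):
    """Return the cross-customer mean only if at least k customers contribute.

    Illustrative only: with fewer than k contributors the aggregate is
    suppressed entirely rather than published with a small-n caveat.
    """
    if len(samples_by_customer) < k:
        return None  # too few contributors to publish safely
    values = [v for samples in samples_by_customer.values() for v in samples]
    return sum(values) / len(values)
```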
Operating principles

What stays true across every integration.

Permission-aware everywhere

MCP queries run as the user · same Cerbos filter as in-app · the AI client only sees what the user can see in source systems.

Read-only by default

MCP today is read-only. Write actions are coming with explicit per-tool grants. Pulse never moves data without authorization.

Bring your own keys

Anthropic, OpenAI, Azure, Bedrock: pipe your own LLM key. Cost passthrough · audit-ready · no vendor lock-in.

Federated insights, never federated data

Pulse Network shares calibrated benchmarks across customers (“companies your size estimate at 4w, ship at 7w”) without data ever leaving your tenant.

Common questions

Procurement-grade answers.

Most of these come up in evaluations. If yours isn’t here, ask directly.

Does Pulse store my source data?

Pulse keeps an indexed mirror with embedding vectors and structured entity extracts. Raw content is fetched on demand from the source system using your OAuth token. ACLs are revalidated per query.

What happens if I revoke a connector?

Within 60 seconds, all Pulse retrievals stop hitting that source. Within 24 hours, the indexed mirror is purged. Selective amnesia (#F63) supports finer-grained per-thread / per-doc removal.

Can I run Pulse against a single tool?

Yes. Connect Slack alone if that's where work lives. Each connector adds value independently; Pulse degrades gracefully when sources are missing.

How does BYOK pricing work?

On BYOK plans, Pulse charges a flat per-seat fee with no LLM markup. Your model bill goes directly to Anthropic / OpenAI / Azure. Suits enterprises with existing model contracts.

What MCP clients work?

Anything that speaks Model Context Protocol: Claude Desktop, Cursor, Cowork, ChatGPT (when their MCP support ships), and any custom client. We test deeply against Claude Desktop today.

Wire it in. See the difference in days.

Two weeks. Five connectors. A calibrated proof of concept against your actual data.