Installation

Getting started

Install the CLI, verify the deterministic workflow first, then add an AI provider only when you want generation, healing, or crew workflows.

Requirements

What you need first

  • Node.js 20 or newer (an active LTS release)
  • Git for diff-aware analysis
  • A repo with Playwright or Cypress tests already in place
First verification

The safest first run is deterministic

Terminal window
npx impact-gate impact --path . --since origin/main
npx impact-gate plan --path . --since origin/main
npx impact-gate gate --threshold 80 --path .
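In CI you would typically chain these so the job stops at the first failing step. This is a sketch under one assumption not confirmed by the docs above: that gate exits non-zero when the score falls below the threshold, which is the usual convention for gating commands.

```shell
#!/usr/bin/env sh
set -e  # abort the job on the first failing command

npx impact-gate impact --path . --since origin/main
npx impact-gate plan --path . --since origin/main
# Assumption: gate exits non-zero below the threshold, failing the CI job
npx impact-gate gate --threshold 80 --path .
```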

Install

Add the package to your project when you want it versioned with the test suite:

Terminal window
npm install -D @yasserkhanorg/impact-gate

Install globally only when you want ad hoc CLI access across many repositories:

Terminal window
npm install -g @yasserkhanorg/impact-gate
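Before choosing global over local, it can help to see where the binary resolves from. The check below is a generic shell sketch, not part of impact-gate itself: if a global install is on your PATH it prints the location, otherwise it falls back to npx, which resolves the project-local dev dependency.

```shell
# Prints the global binary's path if installed, else suggests npx,
# which picks up the local node_modules install.
command -v impact-gate || echo "impact-gate not on PATH; run it via: npx impact-gate"
```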

Verify

Confirm the CLI is available

Start by checking that the binary resolves and the command surface is visible from the current project.

Terminal window
npx impact-gate --help

You should see the core commands including impact, plan, gate, train, and the optional AI workflows. The best first run is still impact, then plan, then gate.

Optional: LLM Provider

Add a provider only when you want it

Crew workflows, test generation, and healing need an LLM provider, but the core CI commands do not.

Free path

Core commands work without any API key

impact, plan, gate, train --no-enrich, cost-report, and feedback all work on the deterministic path alone.

To enable the AI workflows, set one of these environment variables:

Terminal window
# Anthropic
export ANTHROPIC_API_KEY=sk-ant-...
# OpenAI
export OPENAI_API_KEY=sk-...
# Ollama (free, runs locally)
export OLLAMA_BASE_URL=http://localhost:11434
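If a script needs to know whether the AI extras can run, it can check the same variables listed above. This helper is a hypothetical sketch, not something impact-gate ships; it only mirrors the documented precedence of setting exactly one provider.

```shell
# Hypothetical helper: report which provider variable is set.
# The deterministic commands run either way; this only gates the AI extras.
detect_provider() {
  if [ -n "${ANTHROPIC_API_KEY:-}" ]; then echo anthropic
  elif [ -n "${OPENAI_API_KEY:-}" ]; then echo openai
  elif [ -n "${OLLAMA_BASE_URL:-}" ]; then echo ollama
  else echo none
  fi
}

detect_provider
```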

Verify provider connectivity

Terminal window
npx impact-gate llm-health

This probes the configured provider, or the auto-detected provider if you rely on environment discovery, and reports whether it can accept requests and return responses.
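In a pipeline you might run this probe before any generation or healing step and skip those steps when it fails. The sketch below assumes llm-health exits non-zero when the provider is unreachable, which is typical for health-check commands but not confirmed by the docs above.

```shell
# CI step sketch (assumption: llm-health exits non-zero on failure)
if npx impact-gate llm-health; then
  echo "provider reachable; AI workflows can run"
else
  echo "provider unreachable; skipping generation and healing" >&2
fi
```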