
tl;dr

Overview

fault comes with two main capabilities in one CLI.

  • Fault Injection: operation-oriented features
  • AI Agent: LLM-based features

```mermaid
---
config:
  theme: 'forest'
---
mindmap
  root((fault CLI))
    Fault Injection
      Proxy
      Scenario
    AI Agent
      Review
      MCP
```

Getting started with fault injection

The core of fault is its fault injection engine. It allows you to:

  • Inject faults into your services

    Run fault run to start injecting network failures (a command sketch follows this list).

  • Automate these failures with YAML scenario files that can be run from your CI

    Run fault scenario generate and fault scenario run to create YAML-based scenarios that can be stored alongside your code and executed from your CI (a second sketch follows this list).
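
A minimal sketch of the proxy workflow, assuming fault is already installed. Only the fault run command comes from this page; the fault flags themselves are placeholders, so use --help to discover the real options.

```bash
# List the faults that fault can inject and the flags that configure them.
fault run --help

# Start the proxy and inject network failures into the traffic flowing
# through it; substitute real flags from the --help output above.
fault run <fault-flags...>
```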
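
The scenario workflow follows the same pattern. Both subcommands are named above, but any arguments they take are omitted here, so again rely on --help for the exact usage.

```bash
# Generate a YAML scenario describing the faults to replay, then commit
# the resulting file alongside your code.
fault scenario generate

# Later, for example from a CI job, replay that scenario against your service.
fault scenario run
```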

Getting started with the AI Agent

If you are keen to get started with the AI agent, the general steps are as follows:

  • Pick your favorite LLM

    fault supports OpenAI, Gemini, OpenRouter, and ollama. If you use any of the cloud-based LLMs, you will need to generate an API key. If you want to keep everything local and private, go with ollama (see the sketch after this list).

  • Configure your AI code editor

    Set up the editor of your choice so it knows how to find fault as an MCP server. Most of the time this means adding a mcpServers object to the editor's settings file (a configuration sketch follows this list).
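
For the cloud providers, the API key usually ends up in an environment variable. The sketch below assumes the providers' usual variable names, which are not confirmed by this page as what fault reads, and the model name is only an example.

```bash
# Cloud route: export the key for the provider you picked.
export OPENAI_API_KEY="sk-..."      # OpenAI
export GEMINI_API_KEY="..."         # Gemini
export OPENROUTER_API_KEY="..."     # OpenRouter

# Private route: run models locally with ollama.
ollama pull llama3.1
```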
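
Below is a sketch of what that mcpServers entry commonly looks like. The command/args shape is the usual MCP convention; the exact subcommand that starts fault's MCP server, and the location of your editor's settings file, are not covered here, so treat the args value as a placeholder and follow your editor's documentation.

```json
{
  "mcpServers": {
    "fault": {
      "command": "fault",
      "args": ["<subcommand that starts fault's MCP server>"]
    }
  }
}
```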

Next Steps

  • Start with our tutorials to gently get into using fault.
  • Then explore our How-To guides to dig deeper into fault's features.