camel explain
Explain what a Camel route does using AI/LLM
Options
| Option | Description | Default | Type |
|---|---|---|---|
| --api-key | API key for authentication. Also reads OPENAI_API_KEY or LLM_API_KEY env vars | | String |
| --api-type | API type: 'ollama' or 'openai' (OpenAI-compatible) | ollama | ApiType |
| --catalog-context | Include Camel Catalog descriptions in the prompt | | boolean |
| --format | Output format: text, markdown | text | String |
| --model | Model to use | DEFAULT_MODEL | String |
| --show-prompt | Show the prompt sent to the LLM | | boolean |
| --stream | Stream the response as it's generated (shows progress) | true | boolean |
| --system-prompt | Custom system prompt | | String |
| --temperature | Temperature for response generation (0.0-2.0) | 0.7 | double |
| --timeout | Timeout in seconds for LLM response | 120 | int |
| --url | LLM API endpoint URL. Auto-detected from 'camel infra' for Ollama if not specified | | String |
| --verbose | Include detailed technical information | | boolean |
| --help | Display the help and sub-commands | | boolean |
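For instance, assuming the option names listed above, a more deterministic response with a longer timeout could be requested like this:
camel explain my-route.yaml --temperature=0.2 --timeout=300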
Examples
The camel explain command uses AI/LLM to explain Camel routes in plain English. It supports multiple LLM providers including Ollama (local), OpenAI, Azure OpenAI, vLLM, LM Studio, and LocalAI.
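For example, a locally running Ollama instance can be targeted explicitly (the URL below is Ollama's usual default endpoint and the model name is only illustrative):
camel explain my-route.yaml --url=http://localhost:11434 --api-type=ollama --model=llama3.1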
Basic Usage
Explain a YAML route:
camel explain my-route.yaml
Explain a Java route:
camel explain OrderRoute.java
Explain multiple route files:
camel explain route1.yaml route2.xml MyRoute.java
Output Options
Use verbose mode for detailed technical information:
camel explain my-route.yaml --verbose
Output as Markdown for documentation:
camel explain my-route.yaml --format=markdown
Prompt Options
Include Camel Catalog descriptions for more accurate explanations:
camel explain my-route.yaml --catalog-context
Show the prompt sent to the LLM (useful for debugging):
camel explain my-route.yaml --show-prompt
Use a custom system prompt:
camel explain my-route.yaml --system-prompt="Focus on error handling and security aspects."
LLM Configuration
Use OpenAI or compatible services:
camel explain my-route.yaml --url=https://api.openai.com --api-type=openai --api-key=sk-...
Use environment variables for the API key:
export OPENAI_API_KEY=sk-...
camel explain my-route.yaml --url=https://api.openai.com --api-type=openai
Use a specific model:
camel explain my-route.yaml --model=llama3.1:70b
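Other OpenAI-compatible servers such as LM Studio or vLLM work the same way; the ports below are those tools' common defaults and may differ in your setup.
Use LM Studio's local server (commonly on port 1234):
camel explain my-route.yaml --url=http://localhost:1234 --api-type=openai
Use vLLM's OpenAI-compatible server (commonly on port 8000):
camel explain my-route.yaml --url=http://localhost:8000 --api-type=openai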