
Context Overflow — Prompt Too Large for Model

Long conversations can exceed the model's context limit and fail with a "context overflow" error. This page explains how to fix it and prevent it.

⚠️ The Problem

During a long conversation, the bot fails with:

`Context overflow: prompt too large for the model. Try again with less input or a larger-context model.`

Retrying the failed request often triggers rate-limit errors immediately after.

🔍 Why This Happens

Every AI model has a maximum context window: the amount of text it can process in a single request. When your conversation history plus the new message exceeds this limit, the request fails. Common triggers:

- Very long conversations without starting fresh
- Pasting large documents or code
- Tools returning huge outputs (such as full browser page content)
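To get a feel for how close a conversation is to the limit, you can estimate a transcript's token count from its character count. A minimal shell sketch using the rough rule of thumb of about four characters per token (the real count depends on the model's tokenizer, and `transcript.txt` is a stand-in filename):

```shell
# Sample transcript standing in for your conversation history.
printf 'hello world, this is a long conversation that keeps on growing...\n' > transcript.txt

# Rough heuristic: ~4 characters per token (an approximation, not a real tokenizer).
chars=$(wc -c < transcript.txt)
tokens=$((chars / 4))
echo "approx tokens: $tokens"
```

If the estimate approaches the model's context window, start a fresh session before requests begin to fail.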

The Fix

Start a fresh session to clear context:

```bash
/new
```

Send this as a standalone message in the same chat.

To prevent future overflows, enable compaction:

```bash
openclaw config set agents.defaults.compaction.mode safeguard
```

For regular cleanups, periodically send `/new` in long-running conversations.

If you need to work with large documents, consider:

- Breaking them into smaller chunks
- Using a model with a larger context window (like Claude with 200k tokens)
- Summarizing the previous conversation before continuing
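For the chunking option, the standard `split` utility can break a file at line boundaries into pieces small enough to paste one at a time. A sketch using GNU coreutils; the 4 KB chunk size and the file names are arbitrary examples:

```shell
# Create a sample "large document" (stand-in for the real file you'd paste).
seq 1 2000 > large-doc.txt

# Split at line boundaries into chunks of at most 4096 bytes each;
# choose a size that fits comfortably inside your model's context window.
split -C 4096 -d large-doc.txt chunk-

# The chunks (chunk-00, chunk-01, ...) concatenate back to the original.
ls chunk-*
```

Feed the chunks in one at a time, or in separate sessions, instead of pasting the whole document at once.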


📋 Quick Commands

| Command | Description |
| --- | --- |
| `/new` | Start fresh session (in chat) |
| `/reset` | Alternative to `/new` |
| `openclaw config set agents.defaults.compaction.mode safeguard` | Enable auto-compaction |



Still stuck?

Join our Discord community for real-time help.
