diff --git a/README.md b/README.md
index 5ca1d1ce..71f6d300 100644
--- a/README.md
+++ b/README.md
@@ -1080,7 +1080,7 @@ curl -v http://host.docker.internal:8000/api/version
Run CAI against any target
-
+
The starting user prompt in this case is: `Target IP: 192.168.3.10, perform a full network scan`.
@@ -1090,7 +1090,7 @@ The agent started performing a nmap scan. You could either interact with the age
How do I interact with the agent? Press CTRL + C twice
-
+
If you want to use the HITL (Human-In-The-Loop) mode, press ```Ctrl + C``` twice.
This lets you interact with (prompt) the agent whenever you want. The agent does not lose the previous context, which is stored in the `history` variable and passed to it and to any agent that is called. This enables any agent to use the previous information and be more accurate and efficient.
@@ -1101,7 +1101,7 @@ This will allow you to interact (prompt) with the agent whenever you want. The a
Use ```/model``` to change the model.
-
+
@@ -1111,7 +1111,7 @@ Use ```/model``` to change the model.
Use ```/agent``` to list all the agents available.
-
+
@@ -1120,7 +1120,7 @@ Use ```/agent``` to list all the agents available.
Where can I list all the environment variables? /config
-
+
@@ -1158,7 +1158,7 @@ This command displays:
How can I learn more about the CLI? /help
-
+
@@ -1167,7 +1167,7 @@ This command displays:
The environment variable `CAI_TRACING` can be set to `CAI_TRACING=true` to enable tracing, or `CAI_TRACING=false` to disable it.
When CAI is run for the first time, the user is provided with two paths: the execution log and the tracing log.
-
+
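As a quick illustration, tracing can be toggled by exporting the variable before launching CAI. This is a minimal sketch; the actual launch command for your setup may differ and is not shown here.

```shell
# Enable tracing for this session before starting CAI
export CAI_TRACING=true

# To disable tracing instead, set:
# export CAI_TRACING=false
```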
@@ -1189,7 +1189,7 @@ CAI>/load parallel # Match to configured parallel
CAI prints the path to the current run’s JSONL log at startup (highlighted in orange), which you can pass to `/load`:
-
+
Legacy notes: earlier “memory extension” mechanisms (episodic/semantic stores and offline ingestion) are retained for reference only. See [src/cai/agents/memory.py](src/cai/agents/memory.py) for background and legacy details. Our current direction prioritizes ICL over persistent memory.