That's transport and message passing. The response itself isn't streamed: it's delivered as a single message once the task is complete. Don't be confused by the word "Streamable". It's there because the transport uses SSE to stream a series of JSON-RPC messages from the Server to the Client, but the Response to any specific Request is still a single, monolithic message. In the LLM space, "streaming" means sending the response to a request as partials while they are generated, which lets you present results sooner and gives lower perceived latency. MCP *does not* support that in the current spec. As I said, you can extend MCP and deliver those partials in ProgressNotification messages, but then you're relying on a non-standard extension of the spec.
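To make that concrete, here's a rough sketch of what such a non-standard extension could look like on the wire. The JSON-RPC shapes follow MCP (a `tools/call` request with a progress token, `notifications/progress` notifications, and one final response), but smuggling partial LLM text into the notification's optional `message` field is my own assumption, not something the spec defines, and the helper names (`handleToolCall`, `llmChunks`, `send`) are made up for illustration:

```typescript
// Hypothetical sketch: emitting LLM partials as progress notifications.
// The JSON-RPC message shapes follow MCP; the function names are invented.

type ProgressNotification = {
  jsonrpc: "2.0";
  method: "notifications/progress";
  params: {
    progressToken: string | number;
    progress: number;
    total?: number;
    message?: string; // assumption: carry the partial text here (non-standard use)
  };
};

type ToolCallResult = {
  jsonrpc: "2.0";
  id: string | number;
  result: { content: Array<{ type: "text"; text: string }> };
};

// Pretend this is the server-side handler for a tools/call request whose
// client passed a progressToken in _meta. `send` writes one JSON-RPC message
// to the open SSE stream; `llmChunks` yields partials from your model.
async function handleToolCall(
  requestId: string | number,
  progressToken: string | number,
  llmChunks: AsyncIterable<string>,
  send: (msg: ProgressNotification | ToolCallResult) => Promise<void>
): Promise<void> {
  let full = "";
  let i = 0;
  for await (const chunk of llmChunks) {
    full += chunk;
    // Each partial goes out as its own notification on the SSE stream.
    await send({
      jsonrpc: "2.0",
      method: "notifications/progress",
      params: { progressToken, progress: ++i, message: chunk },
    });
  }
  // The actual Response is still one monolithic message at the end.
  await send({
    jsonrpc: "2.0",
    id: requestId,
    result: { content: [{ type: "text", text: full }] },
  });
}
```

The client then has to know to reassemble those `message` fields itself, which is exactly why I call this non-standard: a stock MCP client will just treat them as progress updates (or ignore them) and wait for the single final result.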