The separation often makes sense with long-lived data.
A lot of data is fleeting: when you construct a struct out of sub-pieces, for example, many of those pieces just live on the stack or are otherwise temporary. That kind of data gets passed around, read, and written to until it has the right shape. It goes through a little pipeline and then often becomes part of a bigger, longer-lived structure, or at least has some impact on one.
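A rough sketch of that shape, with made-up names (in Rust): the intermediate values exist only for the duration of one call, and the only lasting effect is on the long-lived structure.

```rust
// Hypothetical example: fleeting sub-pieces folded into a long-lived structure.
struct Order {
    customer: String,
    total_cents: u64,
}

struct OrderBook {
    orders: Vec<Order>, // the long-lived structure
}

fn place_order(book: &mut OrderBook, customer: &str, items: &[(u64, u32)]) {
    // Fleeting data: intermediate values that live on the stack for this call only.
    let total_cents: u64 = items.iter().map(|(price, qty)| *price * (*qty as u64)).sum();
    let order = Order { customer: customer.to_string(), total_cents };

    // The one lasting effect: the finished piece joins the long-lived structure.
    book.orders.push(order);
}
```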
Long-lived data is different. It's often global, or at least visible to entire modules. There it makes sense to think in terms of commands and queries. A database is a typical example.
The core idea is to make mutations obvious.
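A minimal sketch of what that can look like, with a hypothetical account type: queries borrow immutably and just return values, commands take `&mut self`, so the signature alone tells you which calls can change state.

```rust
pub struct Account {
    balance_cents: i64,
}

impl Account {
    // Query: read-only, takes &self, returns a value, changes nothing.
    pub fn balance(&self) -> i64 {
        self.balance_cents
    }

    // Command: the mutation is obvious from &mut self.
    pub fn deposit(&mut self, amount_cents: i64) {
        self.balance_cents += amount_cents;
    }
}
```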
This pattern emerges in different programming domains under different names and influences a lot of design decisions: UIs, databases, video games, programming paradigms, cloud architecture, managed caching, HTTP (safe vs. unsafe methods), and so on. There's a universality to it.
But... I think if we learned anything from the paradigm craze(s), it's that any pattern is useful until it isn't.
Long-lived data tends to carry additional value, and anything valuable gets loaded up with business rules.
When data sticks around and hardly ever changes, we start layering more interpretations on top of it. Eventually, for performance reasons, those interpretations turn into projections of the data that get pre-cooked instead of computed on every request: materialized views, data transformed into other tables, etc.
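A rough sketch of the pre-cooking idea, again with hypothetical names: the projection is maintained when a command runs, so queries read a precomputed answer instead of re-deriving it every time.

```rust
use std::collections::HashMap;

// Raw long-lived data plus a pre-cooked projection (a tiny "materialized view").
struct Store {
    sales: Vec<(String, u64)>,               // (product, amount): the source of truth
    totals_by_product: HashMap<String, u64>, // projection kept up to date on write
}

impl Store {
    // Command: records the sale AND refreshes the projection.
    fn record_sale(&mut self, product: &str, amount: u64) {
        self.sales.push((product.to_string(), amount));
        *self.totals_by_product.entry(product.to_string()).or_insert(0) += amount;
    }

    // Query: reads the precomputed total instead of scanning all sales.
    fn total_for(&self, product: &str) -> u64 {
        self.totals_by_product.get(product).copied().unwrap_or(0)
    }
}
```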