So, the viewpoint seems to be that high complexity is analogous to low entropy (and that complexity can be measured, and predictions about the system's behavior can be made, using methods similar to those used in statistical physics)?
I think (I would have to reread it more carefully) Parisi's argument is that probabilistic models can sometimes be necessary even for well-characterized deterministic systems, because those systems are extremely sensitive to information that might be difficult to obtain (i.e., measure).
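To make that concrete, here is a toy sketch (my own illustration, not something from Parisi's text): the logistic map is fully deterministic and fully known, yet a tiny error in measuring the initial condition wipes out point prediction after a few dozen steps, leaving only a probabilistic description of the state.

```python
# Illustrative sketch: deterministic dynamics, imperfectly measured initial state.
import numpy as np

def logistic(x, r=4.0, steps=50):
    """Iterate the (fully deterministic) logistic map x -> r*x*(1-x)."""
    xs = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        xs.append(x)
    return np.array(xs)

true_x0 = 0.3                      # the "real" initial condition
measured_x0 = true_x0 + 1e-10      # same state, measured with a tiny error

true_traj = logistic(true_x0)
measured_traj = logistic(measured_x0)

# After a few dozen steps the two trajectories are essentially uncorrelated,
# even though the dynamics are exactly known: the 1e-10 measurement error
# has been amplified to order-one differences.
print(np.abs(true_traj - measured_traj)[::10])
```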
I think compression/algorithmic-complexity frameworks are relevant in that they imply that, in complex systems, deterministic-like prediction with very narrow posteriors requires larger and larger computational resources. I.e., the concentration of the predictive posterior depends on the computational resources available; a toy sketch of this is below.
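Here is a rough way to see that dependence, again as my own toy construction rather than anything from the original argument: for the same chaotic map, each extra step of sharp prediction costs roughly one more resolved bit of the initial condition (and the corresponding measurement/simulation effort), so narrowing the predictive posterior gets progressively more expensive.

```python
# Toy illustration: spread of the predictive distribution at a fixed horizon
# as a function of how many bits of the initial condition we can afford to resolve.
import numpy as np

rng = np.random.default_rng(0)

def predictive_spread(bits, horizon=30, n_samples=2000, r=4.0):
    """Std. dev. of the horizon-step prediction when the initial condition
    is known only to `bits` binary digits."""
    width = 2.0 ** (-bits)                        # posterior width over x0
    x = 0.3 + width * (rng.random(n_samples) - 0.5)
    for _ in range(horizon):
        x = r * x * (1.0 - x)
    return x.std()

for bits in (10, 20, 30, 40):
    print(f"{bits:2d} bits of x0 -> predictive std at t=30: "
          f"{predictive_spread(bits):.3f}")
```

With 10 or 20 bits the 30-step predictive distribution is as wide as the map's whole attractor; only at around 40 bits does it start to concentrate, which is the resource-dependence of the posterior in miniature.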