If you try hard enough, you can do anything computable with a single integer parameter - if your model is a Turing machine, and the parameter encodes a program.
It's perhaps less clever or cool than the hack above (since we're all used to Turing machines), but it serves as a friendly reminder that you can encode a lot of information in a long enough number.
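To make that concrete, here's a toy sketch (illustrative only, not the Turing-machine construction itself, and the 16-bit field width is my assumption): pack an arbitrary list of bounded integers into one big integer and recover them, showing that a single long enough number can stand in for any number of "parameters".

```python
BITS = 16  # assume each packed parameter fits in 16 bits (illustrative choice)

def pack(values):
    """Encode a list of non-negative ints (each < 2**BITS) as a single integer."""
    n = 1  # leading 1 acts as a sentinel so leading zero values aren't lost
    for v in values:
        n = (n << BITS) | v
    return n

def unpack(n):
    """Recover the original list from the single packed integer."""
    values = []
    while n > 1:
        values.append(n & ((1 << BITS) - 1))
        n >>= BITS
    return values[::-1]

params = [3, 0, 65535, 42]
assert unpack(pack(params)) == params
```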
The real lesson here is that "number of parameters" is not a useful measure of information content. The only useful measure of information content is information entropy, which (for equally likely states) is the logarithm of the number of distinguishable states. The base of the logarithm is arbitrary, but by modern convention is almost always taken to be 2. The resulting unit is the bit.
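A quick illustration of that definition (my numbers, not the parent's): the information content of selecting one of N equally likely states is log2(N), so a 64-bit word carries at most 64 bits no matter how many decimal digits you print it with.

```python
import math

def bits(n_states: int) -> float:
    """Information, in bits, of picking one state out of n_states equally likely states."""
    return math.log2(n_states)

print(bits(2))       # 1.0   -- a coin flip
print(bits(10**12))  # ~39.9 -- one parameter capped at a trillion distinct values
print(bits(2**64))   # 64.0  -- a 64-bit word, however it's interpreted
```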
Number of parameters is fine when you're not trying to be sneaky with ridiculously precise constants. And it's more expressive in certain ways than a raw bit count. For general use you can impose reasonable limits, so that one parameter can't go overboard with bits. Something like a precision cap of one part per thousand and a magnitude cap of a trillion.
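Taking those caps literally (the geometric-grid encoding below is my assumption, not the parent's): a value capped at a trillion with one-part-per-thousand relative precision can only occupy a few tens of thousands of distinguishable states, i.e. roughly 15 bits per parameter, which is what keeps a single parameter from going overboard.

```python
import math

# Back-of-the-envelope under the suggested caps: treat the parameter as living on a
# geometric grid from 1 to 1e12 with multiplicative step 1.001 (1/1000 relative precision).
n_states = math.log(1e12) / math.log(1.001)  # number of distinguishable values
print(round(n_states))        # ~27600 states
print(math.log2(n_states))    # ~14.8 bits per parameter
```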
Your example is vacuous. You cannot determine the information content of a sequence of numbers in isolation. You can only determine the information content of a system with respect to a model that tells you how to use the system's state to distinguish among a set of possible states. The information content of the system is then the log of the number of possible states from which the observed state lets you select one.
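To spell out the model-dependence (toy numbers of my own, not from the thread): the same observed value carries different amounts of information depending on how many alternatives the model says it was selected from.

```python
import math

observed = 7  # the same "state of the system" in both cases

# Model A: the value is one of 8 equally likely possibilities (a 3-bit die, say).
bits_under_a = math.log2(8)      # 3.0 bits

# Model B: the value is one of 2**32 equally likely possibilities.
bits_under_b = math.log2(2**32)  # 32.0 bits

print(bits_under_a, bits_under_b)  # identical observation, different information content
```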
Yes, algorithmic information theory is a thing. But neither it nor your example refutes what I said (because what I said is in fact true).