Sigh; it seems it's only programmers who think CS has a monopoly on big-O notation, or keep calling it abuse of notation and trying to use ∈, when it's really = that's the standard notation in mathematics (and for good reason).
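(A quick sketch of the convention, roughly as de Bruijn explains it: "=" with O is read one-way, as "is", not as symmetric equality — the right side denotes some unspecified function bounded by a constant multiple of the argument.)

```latex
% "f(n) = O(g(n))" means: f(n) is SOME function bounded by C * g(n).
n = O(n^2)        % true:  n is of order at most n^2
O(n) = O(n^2)     % true:  anything of order n is of order n^2
O(n^2) = O(n)     % FALSE: the relation is not symmetric
% So read "=" here as "is", never as interchangeable equality.
```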
Before Knuth popularized Big O notation in CS and started the field of analysis of algorithms, already in 1958 N. G. de Bruijn wrote an entire book on Asymptotic Methods in Analysis (not CS): see a few of its leading pages here: https://shreevatsa.wordpress.com/2014/03/13/big-o-notation-a...
And the notation was already being used by Bachmann in 1894 and Landau by 1909 in analytic number theory, well before computers. It was perfectly commonplace to use big-O notation with the equals sign very quickly: see e.g. this paper by Hardy and Littlewood (https://projecteuclid.org/download/pdf_1/euclid.acta/1485887...) from 1914, well before even Turing machines or lambda calculus were formulated, let alone actual computers or analysis of algorithms.