No, because sqrt(n)/log(n) is not bounded by a constant as n approaches infinity. Consider, in base 10: if log(n) = 10, then sqrt(n) = 10^5; if log(n) = 100, then sqrt(n) = 10^50; if log(n) = 1000, then sqrt(n) = 10^500; and so on. The ratio keeps growing without limit.
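To make that concrete, here's a quick sanity check (plain Python, purely illustrative; the particular exponents are an arbitrary choice) showing the ratio running away rather than levelling off:

```python
import math

# Purely illustrative: if sqrt(n)/log10(n) were bounded by some constant,
# these ratios would level off as n grows.  They don't.
for exponent in (10, 100, 300, 600):
    n = 10 ** exponent                    # so log10(n) is exactly `exponent`
    ratio = math.isqrt(n) / exponent      # sqrt(n) / log10(n)
    print(f"log10(n) = {exponent:>3}   sqrt(n)/log10(n) ~ {ratio:.3g}")
```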
Just as (e^n) grows faster (in the limit) than any polynomial of n, such as n^2 or n^100; likewise, log(n) grows slower than any polynomial of n, such as sqrt(n) = n^(1/2), or even n^(1/100).
In practice, though, both are pretty slow-growing. It may well be that, for practical sizes on present computers, constant factors matter more than the asymptotic difference.
Well, you can get sqrt to nontrivial sizes pretty easily, whereas you can basically never grow log to a nontrivial size. Looking at 100 million, the square root is ten thousand and the base-2 log is about 26.6. The base-2 log of a googol is 332 (and, just for fun, the square root is 100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000).
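If you want to reproduce those figures, a few lines of Python will do it (nothing beyond the standard math module; this is just a sketch of the arithmetic above):

```python
import math

googol = 10 ** 100

print(math.isqrt(100_000_000))   # 10000 -- square root of 100 million
print(math.log2(100_000_000))    # 26.575... -- base-2 log of 100 million
print(math.log2(googol))         # 332.19... -- base-2 log of a googol
print(math.isqrt(googol))        # 10**50, the 51-digit number written out above
```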
But I actually wanted to respond to this:
> log(n) grows slower than any polynomial of n, such as sqrt(n) = n^(1/2)
It's correct to say that log(n) grows more slowly than n raised to any positive exponent, but what you've said is wrong in two ways:
- f(n) = n^(1/2) is not a polynomial, as a polynomial in n can only feature nonnegative integer exponents of n.
- f(n) = 300 is a polynomial in n, but it grows more slowly than log(n). (It's also larger than log(n) for any plausible value of n. So constant factors do matter, but the difference between log(n) and sqrt(n) is still quite noticeable.)
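To put a rough number on that parenthetical (a throwaway check, picking base 2 and the constant 300 purely because they appear above): log2(n) doesn't catch up to the constant until n = 2^300, which is already a 91-digit number.

```python
import math

# Throwaway check: how big does n have to get before log2(n) overtakes the
# constant 300?  Not until n = 2**300, which already has 91 digits.
n = 2 ** 300
print(math.log2(n))     # exactly 300.0
print(len(str(n)))      # 91
```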