I don't think the analogy holds, for two reasons (which cut in opposite directions from the perspective of Fourth Amendment jurisprudence, fwiw).
First, the dragnet surveillance that Google performs is very different from the targeted surveillance a drug dog can perform. Drug dogs are not used "everywhere and always"; they are mostly deployed in situations where people have a weaker reasonable expectation of privacy than they have over their cloud storage accounts.
Second, the nature of the evidence is quite different. Drug-sniffing dogs are inscrutable, non-deterministic, and known to pick up handler bias. Hashing algorithms can be interrogated, are deterministic, and carry no such bias-transfer problem; collisions do occur, but they are rare, especially because the "search key" set is minuscule relative to the space of possible hashes. The narrowness and precision of the hashing method preserves most of the privacy expectations that society is currently willing to recognize as objectively reasonable.
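To make the narrowness point concrete, here is a minimal sketch (Python, purely illustrative, with made-up hash values) of exact cryptographic-hash matching against a set of known hashes. Note the assumption: real scanning systems reportedly also use perceptual hashes, which are fuzzier than this, so treat it as a simplified model of the exact-match case only.

```python
import hashlib

# Hypothetical set of SHA-256 hashes of known contraband files,
# as might be supplied by a clearinghouse. Illustrative values only.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_sha256(path: str) -> str:
    """Deterministically hash a file's contents with SHA-256."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path: str) -> bool:
    # The scan reveals nothing about a file unless its hash is an exact
    # member of the known set -- that is the sense in which the search
    # is "narrow": same input always yields the same answer, and
    # non-matching files are never inspected further.
    return file_sha256(path) in KNOWN_HASHES
```

The design point is that every step here can be audited and rerun: given the same file and the same known-hash set, the result never changes, which is exactly what you can't say about a dog alert.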
Here we get directly to the heart of the problem with the fictitious "reasonable person" used in tests like the Katz test, especially in cases where societal norms and technology co-evolve at a pace far more rapid than that of the courts.