I think it depends on use cases and advancements in our understanding of time's effect on the DNA captured.
I.e. I could imagine that in the not-too-distant future, we know that DNA in air (exposed to sunlight) degrades at a known rate over time, and could therefore be used to determine whether a person was near a given location recently. Sort of like carbon dating.
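To make the carbon-dating analogy concrete, here is a toy sketch assuming simple first-order (exponential) decay. The half-life value is entirely made up; real environmental DNA degradation would depend on sunlight, humidity, surface, and much else:

```python
import math

def hours_since_deposition(fraction_remaining, half_life_hours):
    """Estimate elapsed time from the fraction of DNA still intact,
    assuming first-order exponential decay (the same math as carbon
    dating). Both the model and the half-life are hypothetical."""
    decay_constant = math.log(2) / half_life_hours
    return math.log(1 / fraction_remaining) / decay_constant

# With an assumed 24-hour half-life: half the DNA remaining implies
# deposition ~24 hours ago; a quarter remaining implies ~48 hours.
print(hours_since_deposition(0.5, 24.0))
print(hours_since_deposition(0.25, 24.0))
```

The point is only that *if* a reliable decay curve existed, inverting it gives a time estimate, which is what would make "was this person here recently?" answerable.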
You could imagine law enforcement using this as a tool to find suspects: drive around with a device that constantly samples air and checks it for the DNA of a suspect (DNA which could have been found at the scene via other, more traditional methods), allowing them to narrow down the person's location.
I think the precise location where the DNA is found matters more to proving a case than merely finding it at all, and environmental DNA lacks this information. When collecting DNA from a scene, I believe there are guidelines, and really just plain common sense, about where you'd want to swab.
That is, finding someone's DNA in a common area is far less convincing than finding it on the handle of a weapon. If environmental DNA is abused by an overzealous prosecution, it may call into question the general credibility of DNA evidence.
I remember a case study where investigators identified the accused via a DNA match, only to find out that, while the match was good, the matched individual had been dead for a couple of years and couldn't have been the perpetrator.
"False positives" like these would erode confidence in the system.