I think access to government services is an important part of democracy. Because you have no alternative, they should be held to higher standards of privacy. Google doesn't need to know everything.
It would require a dev every time someone wants to track something new.
It would require rerunning CI/CD, testing, and QA to bake it in, in case it fails and breaks something.
All of that is hours of resources which translates directly into money.
With GTM, planning and organization still happen, but someone can try something, iterate on it with a debugger, and hit publish when it's done. No need for dev time, testing, QA, CI/CD, breaking, reverting, etc.
• Seeing device statistics to know which browsers/devices to support/optimize for.
• Reviewing page flows to understand how users navigate/understand the site. Is the navigation easy to understand? Are the right pages highly-visible?
• Seeing which pages have high drop-off rates, indicating that users either resolved their issue or gave up.
• Analyzing trends over time to better understand users and the topics they're focused on. Is there high traffic to COVID-19 symptom pages? Or maybe student loan forgiveness resources?
I can see a lot of meaningful and actionable data being gleaned from such systems. It's much more difficult to make improvements without supporting data.
How can this not come from a self-hosted, secure and privacy-respecting, analytics tool?
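To illustrate, the drop-off analysis above doesn't need Google at all. Here's a minimal sketch assuming a self-hosted store records page views as (session, page) pairs; the sessions and paths are made up for the example:

```python
from collections import Counter

# Hypothetical page-view events from a self-hosted analytics store:
# each tuple is (session_id, page), in visit order.
events = [
    ("s1", "/home"), ("s1", "/loans"), ("s1", "/loans/forgiveness"),
    ("s2", "/home"), ("s2", "/loans"),
    ("s3", "/home"),
]

views = Counter(page for _, page in events)

# The last page seen in each session is where that visitor dropped off.
last_page = {}
for session, page in events:
    last_page[session] = page
exits = Counter(last_page.values())

# Drop-off rate: the share of a page's views that ended the session.
drop_off = {page: exits[page] / views[page] for page in views}
```

A real implementation would read from a database and handle time windows, but the point stands: this is a few lines against first-party data.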
Their existing, ordinary HTTP/HTTPS Web server access logs already give them all of the things you listed. They did even in the earliest days of the Web.
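As a sketch of that point: plain access logs in the common Apache/nginx "combined" format already yield page counts and browser/device stats. The log lines below are synthetic, and the user-agent classification is deliberately crude:

```python
import re
from collections import Counter

# Two synthetic lines in "combined" log format (example.gov is made up).
log_lines = [
    '203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] "GET /benefits HTTP/1.1" 200 5123 '
    '"https://example.gov/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/118.0"',
    '198.51.100.2 - - [10/Oct/2023:13:56:01 +0000] "GET /benefits/apply HTTP/1.1" 200 2048 '
    '"https://example.gov/benefits" "Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X) Safari/604.1"',
]

pattern = re.compile(
    r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d+) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

pages = Counter()
devices = Counter()
for line in log_lines:
    m = pattern.search(line)
    if not m:
        continue
    pages[m.group("path")] += 1
    # Very rough device split from the user-agent string.
    agent = m.group("agent")
    devices["mobile" if ("iPhone" in agent or "Android" in agent) else "desktop"] += 1
```

The referrer field in the same log also gives you the page-flow information mentioned upthread, with no third-party JavaScript involved.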
Google Tag Manager is a surveillance tool for the benefit of Google. And they pitch it to companies as:
> Google Analytics lets you measure your advertising ROI as well as track your Flash, video, and social networking sites and applications
That's the first blurb on the first Web search hit. It's pitched as a tool for marketing people who brag about the size of their ad spend on Google AdWords, and then need to make PowerPoints to justify it.
And who are often mimicked by people who don't know any better, but think it's best practice because they saw a grownup doing it. Or they copy-pasted it from somewhere without understanding it.
And when it's on a gov't public information system, it's leaking data about citizens to a private company known to snoop on everything it can, even secretly and against reasonable expectation of privacy.
(For example, in this case, who would know that by using a prominent Web site of the federal government, their behavior on that site is leaked to Google, who, due to other snooping, can attribute it to them personally as an individual. Like, if they walked into a Federal building, to consult an official, and Google had placed hidden cameras and microphones, that it controlled, throughout the building, and even followed them to and from the federal building.)
And, technically, it introduces an additional security weakness, by loading and running code from some site not under gov't control. Which, as we just saw for the nth time yesterday, is almost never well-placed trust. And for no good reason; only mistaken-at-best reasons.
That's just an example. Most other techbro "best practice" third-party requests have similar problems, or even worse, and are similarly unnecessary.
All of these concerns can also be leveled against private enterprise, which Americans are loath to actually regulate. If you want any hope of government services undercutting private enterprise (as you should), this attitude will just hamstring the effort.
Third-party services that remain involved aren't "tooling". They're part of the final site, dragging in all of the terrible behavior of the surveillance industry. So yes, it's reasonable to ask why one should have to suffer that to access a public service, and/or by government requirement. If we had a US GDPR and some societal expectation of privacy, letting us be reasonably sure those vendors were prohibited from creating surveillance dossiers on us, it would be more reasonable; but US "governance" is actually skewed the exact opposite way.