Also, there are APIs for acquiring real-world mailing addresses inside of "subdivided" warehouses. The only way to verify that a person isn't a bot is to make babies with them.
I don't know - a guy on InfoWars was talking about how Hitler is still alive (sorry can't find the link). Getting rid of conspiracy type stuff like that would be a good start.
Who decides what's a conspiracy and what's an uncomfortable truth that powerful lobbies are effectively covering up? e.g. the broadest NSA SIGINT collection efforts, sub-concussive hits in football causing CTE, and sugar-rich "low fat" diets causing rampant diabetes were all, at one point, dismissed as baseless conspiracy theories by experts in those fields.
Actual evidence decides that. My general rule of thumb is how willingly supporters/detractors of an 'idea' let a third party audit or research their findings, and how they react to the results.
I'm sure Azure has great stats on paper; it's not the simple, easy-to-measure data that concerns us. I'm also sure that Azure is currently treating their clients really well, in the spirit of "the first hit is free". What is concerning is not the effect of their current actions, but rather what those actions say about their future plans, given MS' history of "embrace, extend, extinguish".
With MS, the EEE process should be your null hypothesis when you are trying to predict their behavior.
Having said that, I am not against the idea of taking advantage of an opponent who prostrates themselves; just make sure that you can safely extricate yourself from the trap. If you want to deploy on Azure, make sure it's a multi-cloud deployment with some insulation above the infrastructure, like Mesos or Kube. That precludes using the more specialized services, but seriously, who pipes unencrypted data through third-party message queues anyway? That's just crazy.
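To be concrete about that last point: if you do lean on a managed queue, encrypt the payload yourself before it leaves your process, so the queue only ever carries opaque blobs. A minimal sketch in Node/TypeScript, assuming AES-GCM and a key loaded from your own secret store (the helper names here are made up for illustration):

    // Sketch: envelope-encrypt a payload before handing it to any managed queue.
    // Key handling is a placeholder; in practice load the key from your own
    // KMS / secret store, never hard-code or generate it per process.
    import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

    const KEY = randomBytes(32); // placeholder for a real 256-bit key

    function encryptForQueue(payload: object): string {
      const iv = randomBytes(12);
      const cipher = createCipheriv("aes-256-gcm", KEY, iv);
      const body = Buffer.concat([
        cipher.update(JSON.stringify(payload), "utf8"),
        cipher.final(),
      ]);
      const tag = cipher.getAuthTag();
      // The queue only sees base64 of iv || tag || ciphertext, never your data.
      return Buffer.concat([iv, tag, body]).toString("base64");
    }

    function decryptFromQueue(message: string): object {
      const raw = Buffer.from(message, "base64");
      const iv = raw.subarray(0, 12);
      const tag = raw.subarray(12, 28);
      const body = raw.subarray(28);
      const decipher = createDecipheriv("aes-256-gcm", KEY, iv);
      decipher.setAuthTag(tag);
      const plain = Buffer.concat([decipher.update(body), decipher.final()]);
      return JSON.parse(plain.toString("utf8"));
    }

Whatever queue client you use, publish the string that encryptForQueue returns and call decryptFromQueue on the consumer side; the provider never holds plaintext.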
In Qatar, which has a national firewall, I was able to use Tor to browse torrent sites, and then use a regular BitTorrent client to fetch the contents of the magnet links (encrypted connections only). That worked great for the few years I was there. The Qatari internet was a fair bit faster than what I'm used to in Canada: about 30-50 MB/s at peak torrent speed.
A CEO once told me, "If you owe someone $1 million, then they own you. If you owe someone $1 billion, then you own them." I think the orgs that are owed by the US are primarily concerned with maintaining the status quo, rather than gutting the golden goose.
Edit: Since this was downvoted, I want to be clear: this is not snark. It was meant as a very real commentary on how important it is that we have the ability to inflate our way out of debt. That's critical to understanding the power dynamics at play here.
It reflects poorly on us that this is being downvoted. The parent shared a simple cautionary tale without baiting or flaming or anything like that. It is a good thing to be aware of the worst-case scenario; IME, projects that fail badly do so mainly for people-reasons / org-reasons, just like this one.
It doesn't reflect poorly on anybody beyond said company, which took six months to build a <form> with five <input>s and was clearly blighted by major issues regardless of which technology it chose for that <form>.
What's interesting is that, despite regurgitating the writing on the wall, OP still blames a JavaScript framework.
Yep, and personalization and A/B testing can be accommodated with query params. By whitelisting those A/B params in the caching layers, you can mostly eliminate any performance difference between A and B, something that may have skewed our results with a third-party A/B testing service.
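The whitelisting can be as simple as normalizing the cache key. A rough sketch in TypeScript, assuming hypothetical param names rather than any specific CDN's config:

    // Build the cache key from the path plus only the query params the cache
    // is allowed to vary on. Param names here (abTest, abVariant) are made up.
    const CACHE_VARY_PARAMS = new Set(["abTest", "abVariant"]);

    function cacheKeyFor(url: string): string {
      const u = new URL(url, "https://example.com");
      const kept = [...u.searchParams.entries()]
        .filter(([name]) => CACHE_VARY_PARAMS.has(name))
        .sort(([a], [b]) => a.localeCompare(b)); // stable ordering
      const qs = kept.map(([k, v]) => `${k}=${v}`).join("&");
      return qs ? `${u.pathname}?${qs}` : u.pathname;
    }

    // Both variants stay cacheable, and everything else (tracking params, etc.)
    // collapses onto the same entry:
    // cacheKeyFor("/pricing?utm_source=x&abVariant=B") -> "/pricing?abVariant=B"

Variant A and variant B each get their own cached entry, while stray tracking params stop fragmenting the cache.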
Yes, just like Intel. We engineered our way into this, we'll engineer our way out, and then smooth it all over with more engineering. Entropy wins again.
I've been ranting here occasionally about the infuriating unprofessionalism of a web app that can't operate without JS. There is very little that you can do with JS that can't also be done with the help of a server, unless you start making up overly specific requirements about which technologies are used, or writing user stories for robots. As a diehard NoJS guy and a developer who uses React professionally, I find my community's acceptance of the problem, and its obliviousness to the solutions established by the React devs themselves, pretty embarrassing. And it's not like I hate the language; Node has been my go-to application server for years now (Clojure is displacing it for me, but I digress).
Last year, I was building a simple SVG-based chart dashboard for internal use. Being a NoJS guy, I would sometimes disable JS while developing, on purpose or otherwise, and aside from forcing me to manually hit refresh in the browser, things generally worked. I added a couple of links (styled as buttons) for zooming, to supplement the JS-based drag-window zoom, and let the browser scroll the potentially very wide SVG chart within a div. If necessary, I could even have embedded the whole thing in an iframe to avoid triggering whole-page reloads, but our caching story was tight enough to compensate. Also, the React-rendered SVG represented the bulk of the markup on the page anyway.
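For a sense of the shape of it, here is a rough sketch, not the actual code: the component and prop names are made up, and it assumes an SSR stack that reads a zoom value from the query string and renders the zoom controls as plain anchors.

    // Server-rendered SVG chart with link-based zoom; works with JS disabled.
    // Assumes `points` are values already normalized to a 0-100 range.
    import React from "react";

    type ChartProps = { points: number[]; zoom: number };

    export function Chart({ points, zoom }: ChartProps) {
      const width = 800 * zoom; // wider SVG at higher zoom, scrolled inside the div
      const path = points
        .map((y, i) => `${i === 0 ? "M" : "L"} ${(i / points.length) * width} ${100 - y}`)
        .join(" ");
      return (
        <div style={{ overflowX: "auto" }}>
          {/* Plain links double as zoom controls when JS is off */}
          <a href={`?zoom=${Math.max(1, zoom - 1)}`}>Zoom out</a>{" "}
          <a href={`?zoom=${zoom + 1}`}>Zoom in</a>
          <svg width={width} height={120} viewBox={`0 0 ${width} 120`}>
            <path d={path} fill="none" stroke="currentColor" />
          </svg>
        </div>
      );
    }

With JS enabled, the drag-window zoom hydrates over the top of this; with it disabled, the anchors just trigger a fresh server render at the new zoom level.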
Interactive visualisation, no JS needed. It added maybe an extra 10% to my workload (we already had the SSR stack), and helped me avoid a variety of little glitches that plague many client-only apps, glitches that users learn to tolerate with mild disgust. The satisfaction of seeing our in-house dashboard pop up "instantly" with data, while Parsely and New Relic were still churning spinners or stuttering while waiting on JS, or even waiting for initial data after waiting on JS, was very cathartic. TTI (time to interactive) can equal TTR (time to render); we have the technology, and we've had it for a decade or so.
The parameter you are looking for is delta-V, the total change in velocity that the craft can impart to itself. It relates to specific impulse (effective exhaust velocity), the parameter discussed earlier comparing slow, heavy exhaust to fast, light exhaust.
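For reference, the Tsiolkovsky rocket equation ties the two together, where v_e is the effective exhaust velocity, I_sp the specific impulse, g_0 standard gravity, and m_0/m_f the fueled-to-dry mass ratio:

    \Delta v = v_e \ln\left(\frac{m_0}{m_f}\right) = I_{sp}\, g_0 \ln\left(\frac{m_0}{m_f}\right)

For a fixed mass ratio, faster (lighter) exhaust buys more delta-V; slower, heavier exhaust trades that away for raw thrust.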