Migrating away from Google Maps and cutting costs (eventsofa.de)
299 points by ashitlerferad on Nov 6, 2018 | 111 comments



In case it helps anybody else - we were recently looking for alternatives to Google Maps for reverse geocoding. We took property data from a local county, ran the centroids of a bunch of randomly selected parcels through 4 different reverse geocoding services, and compared the addresses we got back to the actual address of the parcel, as listed by the county.

Google was 97% accurate, at $4 per 1000 requests. Mapbox was 95% accurate, at $.50 per 1000 requests. TomTom was 73% accurate, at $.42 per 1000 requests. Location IQ (one of many providers simply running the open-source Nominatim project) was 12% accurate, at $.03 per 1000 requests.

To be fair to Location IQ / Nominatim, they had the right street the vast majority of the time, but were usually wrong about house number due to interpolating between address boundaries at cross streets defined in Census Data. If you need exact addresses, Nominatim probably isn't for you, but if you need a general location, it might work fine.

Also - this was one county of test data, so take it with a grain of salt.

Nevertheless, it gave us confidence that we could move away from the Google APIs, save 90% of our costs, and still have a high level of accuracy.
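
For anyone who wants to run a similar test, here's a rough sketch of the harness in Python (the Nominatim endpoint and the CSV columns are illustrative, not our exact code):

    # Sketch of the comparison harness: reverse geocode parcel centroids
    # and compare against the county's own address records.
    import csv
    import requests

    def reverse_geocode(lat, lon):
        # Nominatim-style reverse endpoint; swap in any provider here
        r = requests.get(
            "https://nominatim.openstreetmap.org/reverse",
            params={"lat": lat, "lon": lon, "format": "jsonv2"},
            headers={"User-Agent": "geocoder-accuracy-test"},
            timeout=10,
        )
        r.raise_for_status()
        addr = r.json().get("address", {})
        return addr.get("house_number"), addr.get("road")

    matches = total = 0
    with open("parcels.csv") as f:      # hypothetical export of county parcel data
        for row in csv.DictReader(f):   # columns: lat, lon, house_number, street
            total += 1
            if reverse_geocode(row["lat"], row["lon"]) == (row["house_number"], row["street"]):
                matches += 1

    print(f"accuracy: {matches / total:.1%}")

(A real run would normalize abbreviations like "St" vs "Street" before comparing, and throttle requests per the provider's usage policy.)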


Have you tried HERE APIs[1] by any chance? I am curious how it stacks up against the other services.

[1] Link to example -> https://developer.here.com/api-explorer/rest/geocoder/revers...


We sued HERE API at a previous gig. We used it for standard geocoding (also after Google started asking high prices to use it) and the quality was very good.


sued -> used hopefully.


In America, it is considered polite to exchange suits and countersuits prior to engaging in business, similarly to laying down a friendly barrage of suppressing fire when entering a room.


So "sue yourself" is american english while "suit yourself" is british? Sorry I'm not a native speaker...


It was a joke :)


Geez, do people sue for anything in America?


Always remember to send a Thank You subpoena.


No, just in Florida.


Not surprised - Nominatim is based on OpenStreetMap, which simply hasn't collected exact house-level address data for most places. It wasn't a particular priority; getting approximately the right location along the street is more or less good enough for navigation purposes.


Another one to try is the geocoder team that spun out of Mapzen when it shut down.

https://geocode.earth/


Thanks Andrew! (one of our first customers)

geocode.earth founder here, happy to answer questions.

We have great reverse geocoding with OSM and OpenAddresses data.

Feel free to shoot me an email for an invite: julian@geocode.earth


@dbatten, LocationIQ team member here. We have a new API backed by a new geocoding engine that uses additional datasets (OA, GNAF, etc.), currently in the final stages of beta. Shoot us an email at hello@locationiq.com and you can try it out. This should give you rooftop accuracy in a number of countries (the US for sure) and street-level accuracy in most others.


Saw your email... thanks! I'll check it out when I get a chance.


Would it be possible for you to generate an “accuracy map” and select a provider based on a cost/accuracy trade-off, using a cheap provider in the common case but the most accurate provider in regions where the best provider is “much better”?

probably not worth the effort


To make a nation-wide accuracy map, you'd need to compile property data from every county (or equivalent) in the nation. Assuming every county publishes it publicly (I'm sure some don't), it would still be a ton of work...

The real catch is this, though: if you actually pulled it off, you would have just built a nearly flawless reverse geocoder, and you wouldn't need to use an external API at all. You could just look locations up in your huge property data set.


Couldn’t the OP use the same method to determine accuracy that they used to compare providers?

It doesn’t need to be an accurate decider, just “good enough” in a majority of cases.


The methodology used to determine accuracy involved loading GIS data on all property in a given county. Therefore building a nation-wide accuracy map using that methodology would, by necessity, involve loading all property data nation-wide. And if you had all property data nation-wide, you wouldn't need a reverse geocoder. You'd have one.


We've put together an (objective) comparison tool here: https://www.geocod.io/compare/


Why do we even use server-side rendered tiles at all? Why isn't the map rendered on the client, ideally with SVG? Is the necessary code too big? Can't we build (or doesn't there already exist) some lighter version to render land, sea, roads and rivers? Is the most difficult part placing names without overlapping others (and deciding which names to render)?


This is where maps are headed. Good examples of this style of rendering are Apple Maps and Mapbox GL (with JS, C++, et al. bindings).

The issue is that the rendering is actually somewhat resource-intensive and browser support is thus far incomplete. But we’re getting there!


There is "Vector Tiles" which do client side rendering, look at OpenLayers, Tangram or MapboxGL.

Server side rendering can be less intensive for the client, since it just has to show an image.


Do you have the random list of parcels available by any chance? I would love to compare against Geocod.io. I suspect that we might be able to achieve high 90s as well.



Thanks for posting this! Here's the result of geocoding the above list with geocod.io for anybody interested:

Forward Geocoding (Converting the addresses to latitude/longitude): https://gist.github.com/MiniCodeMonkey/13c02d45089478182c3d1...

Reverse Geocoding (Converting the lat/lon to addresses): https://gist.github.com/MiniCodeMonkey/13c02d45089478182c3d1...

Based on the above definition, geocod.io's accuracy for reverse geocoding of this list is 95.2%

Putting this together with dbatten's original findings, the list would look like this:

Google: 97% ($4 per 1,000)

Geocod.io: 95.2% ($0.50 per 1,000)

Mapbox: 95% ($0.50 per 1,000)

TomTom: 73% ($0.42 per 1,000)


How many queries do you do to make the savings worth the switch?


Our experience with the Google cost increase was similar. We were paying around $550/mo for the GMaps Platform pre-price-change. Over 80% of this cost was for dynamic maps usage. The rest was for use of the Google Places API.

When I got the notification in May that the rates were increasing, I didn't take it that seriously. We have a profitable company and I would have been more than happy to pay Google twice their rate for the premium services they offer. Nobody ever expects a sudden rate increase to be more than 20%, 30% or 40%, right???!

It was in July that the gravity of the situation hit me. I was seeing tweets and articles lamenting the rate increase. I thought to myself "huh, I better check this out". I did some quick math using the new rate card and nearly had a heart attack! Our bill was estimated to be ~$14,500 and the CLOCK WAS TICKING. We were facing a 2600% rate increase and had only 2 months to figure out a game plan. Our business was on the line!

I immediately determined that we were eligible for bulk pricing. However, Google will not sell you the bulk rates directly. You have to contract with an authorized 3rd party (re-seller, basically) to get the rates. So, we found a reseller and locked in the bulk rates. That brought our estimate down to around $12,000/mo. Better, but still a huge shock.

The next step was optimization. There was no way for us to reduce dynamic maps usage because it is such a core part of our products. So, we cut off almost all of our Places API usage and started using other services. Our estimated bill was now down to $9,000.

That's where we are today. We just got our first full month bill. It was a huge hit to our business, wiping out a significant portion of our profitability.

To be frank, I am pissed about this. I was more than happy to pay Google far more than they were charging us. We were always under the impression that rates would increase one day. But, to force an increase of this magnitude with such a short amount of lead time is pretty f&^%ing sh*&$tty coming from a billion dollar giant.

We're starting to test other mapping services. I met with a team member from Mapbox last week and am planning on testing their platform in December. Their quote for our usage needs? $550/mo.


Going from $550 to $14,500 per month is totally unacceptable.

This is exactly why I'll always be wary to depend on Google's services. And it isn't the first time they do this either.

First they monopolize the market via cheap rates and branding, then they make it exorbitantly expensive.


Amazon isn’t too far behind with that model.


What are you referring to? Amazon's been pretty good about reducing the cost of AWS features over time.


GCP also reduces costs as far as I know. The difference is that the cloud market has more competition.


I'm referring to Amazon retail.


That business plan sounds like any other low-profit, or unprofitable, free IT product company's.


>Going from $550 to $14,500 per month is totally unacceptable.

Wrong. It's completely acceptable. It's their business, and this is the risk you take when you make your business dependent on another business.

>This is exactly why I'll always be wary to depend on Google's services. And it isn't the first time they do this either.

>First they monopolize the market via cheap rates and branding, then they make it exorbitantly expensive.

Google isn't the only company that does stuff like this. Almost any company would do this if they had the opportunity.


This doesn't make it acceptable. It sounds like what you're proposing is that a company must build its own streets, electrical grid, telecom network, generation plants, etc.

Very few businesses would do well in such an environment.


Also, I'll add that your comparison is apples-to-oranges. For the electrical part in particular, electric utilities in almost every country are highly regulated. They can't raise prices on a whim, even though they're monopolies; the government won't allow it. This doesn't apply to Google Maps. You're comparing services that are provided by a government, or are highly regulated, to a service that has no governmental constraints whatsoever. If you don't like the price, don't buy it.


You need to define "acceptable". I see this a lot: someone makes a bold statement saying "xyz isn't acceptable!!!" Why not? Who are they to make that pronouncement? And how exactly are they going to enforce this thing not being acceptable?

If Google raising prices obscenely "isn't acceptable", then what are YOU going to do about it? Nothing, right? Is Google going to change their ways because a few people proclaim "this isn't acceptable!"? No. Therefore, it IS acceptable.

I'm waiting to be proven wrong.... which will only happen if this price-raising is canceled, which I don't think it will be.


No, he's saying that the business must weigh the cost of backups against the risk of needing them and determine if it is necessary to fund creation of said backups.


Maybe intolerable is a better word.


We had a similar experience. The first notices from Google said we would be below the free tier, so no problem. Turns out we passed the free tier in the first hours of the first day of the month. Needless to say, we are moving away from all Google services we have used, either setting up dedicated servers or paying some other company for the service. In the case of the maps, it's much cheaper to pay for a dedicated server with 1 Gbps bandwidth than to continue with Google Maps. And we can update the maps ourselves if something isn't up to date. So the OpenStreetMap organisation might be happy at least :-)


Stadia Maps founder here.

We'd be happy to discuss how we can meet your needs as well, if you're interested. Feel free to reach out directly: luke@stadiamaps.com.


Another problem with Google is that you cannot really experiment with it for free. You have to add your credit card, and there is no concept of using only the free quota.

E.g., I'd put up an experimental site which uses only the Google Maps free quota, and since it's only an experiment I don't want to pay anything; I'd be happy with my access blocked automatically for the rest of the month after the free quota is exhausted.

But AFAIK it cannot be done. You have to track your spending and shut down the site manually when you are near the end of the free quota.

Google in the past helped people experiment with its platform. This is no longer the case with Google Maps.


I don't think that's completely fair. They do provide the $300 credit, valid for 12 months after sign-up. And in response to another commenter below, in many countries I believe they allow sign-up for this trial with a bank account (they deposit a small sum and you verify receipt with the amount deposited). I have not tried this.

I provided my credit card for the trial. I was notified of the impending end of the trial, and was billed no further.

I do agree that the inability to set hard limits on usage is frustrating and somewhat concerning, however. Especially as they're apparently able to do so through the aforementioned trial termination mechanism.

And I further agree that, outside of this trial period, this policy does discourage experimentation. And that it is somewhat un-Google-like; somewhat surprising as I believe they're considered an underdog relative to AWS; and a bit disappointing.


This is true for all Google Cloud services. You have to enter a credit card number. You may 'intend' to use just $20 a month, and I believe you can set quota alerts, but you can't set a hard budget. Google will just bill for whatever you consumed.

I think that's ridiculous. It should be a deal-breaker for anyone running a side project.


Can't you use a throwaway/generated credit-card?


Privacy.com and other virtual cards register as prepaid cards and are blocked by google.


That sucks. If inflexibility already shows at the entrance, then I'm not going to enter the place.


Hey, just to chime in. I'm a MSFTie, so no relevance to G, but I have worked on products with similar limitations, and was usually the one arguing _against_ the CC friction at login.

What has become abundantly clear however is that CC gating is a fast way to reduce your malicious traffic _exponentially_; and especially if you're enterprise focused, this doesn't even impact your core strata much.

As a non-corporate-techie in my off-work-time, I absolutely understand the frustration at this inflexibility. During my day job where I'd have to clean up the fires that result from a fully open door policy however? I'd have a somewhat different take.

It's not 100% clear cut "the CC is all just to add spurious friction", is all I mean to say :)


Also I don't know if I'm missing something, but I just set up a small test site with a small amount of user traffic (<1k requests in total).

I totally can't tell how much this could cost me; I'm pretty sure I'm still under the free quota limit, but I can't see how far I'm into it. I see that there are requests being logged in the API metrics, but the billings pages show 0 usage across all the same metrics.


Right, and to even get an API key for experimentation you have to provide a credit card number, which a lot of students wouldn't have.


At this point I don't think they're really interested in accommodating students.


Couldn't you provide an e-card with a 1€ limit?

You would not be charged more, but then you would maybe be in breach of your contract with Google (?)


Sounds like a good way to get your Google account banned.


Which will then proceed to ban all related accounts that they suspect are yours too (so they try to ban a person, not an account). And god help you if it was work account or a linked one.


> Another problem with Google that you cannot really experiment with it for free. You have to add your credit card and there is no concept of using only the free quota.

We're here to help on that one. :) We strongly believe development and educational use should be painless, so testing locally (just use the right link!) or signing up for the free tier is as simple as possible.


Author here ( @codemonkeyism ), ready to answer questions. Blog seems down; we haven't been slashdotted for a long time, sorry!

[Added some more caching, site works again slowly]


> Thanks to Google for giving us free Maps for some years

This attitude is something I truly appreciate. So often posts like these are angry rants about 'how dare they do this to me'. You seem to presume good intentions, which should be the default reaction in life.


One could argue that Google used search revenue to subsidise Maps and other services via predatory pricing, which breaks antitrust laws, in order to build a monopoly.


It seemed a bit passive aggressive to me, but maybe I'm reading into it too much.


Sorry, English isn't my native tongue; I didn't intend it to be passive-aggressive, I meant it.


It kinda reminds me of someone thanking a cocaine dealer for giving them some free hits.


https://web.archive.org/web/20181102141738/https://www.event...

> we haven't been slashdotted for a long time, sorry!

The MapExample.png file is 1.1MB; it shouldn't be more than 100KB given how little detail that picture has. Also, 4.1MB of CSS for a page that has barely any styling?

Whatever you are doing, you might be losing potential customers who are unable to access the page through HN because of poor optimization and scaling.


We use a CDN which reduces image size for our main site, though not for the blog[0]. The CSS is usually what the theme brings with it, but you're right of course.

[0] something on my agenda


What I don't understand is that you basically use an embedded map with a marker. But this remains free under the new pricing (I am using it on some of my projects), both on mobile and desktop; see here: https://developers.google.com/maps/documentation/embed/usage... https://mapsplatformtransition.withgoogle.com/calculator

So what were you paying for?


My understanding is this works only for addresses and places; many of our customers want to move the marker around because the default mapping doesn't reflect the location of their venue (ship, museum entry, backyard, ...).

Will add this clarification and your links to the article when the site works again, thanks.

[Also currently not online are maps with search results; we removed them to cut costs but will add them back]


Hi, why not set up Cloudflare to handle the traffic? Sorry if this sounds off topic.


Oh, this isn't that easy for GDPR reasons ;-) Also, this is just the blog; I don't want to play with DNS and risk site problems for our blog. But I've thought about it, thanks!


Could you expand on that? What exactly is preventing you from proxying your blog through Cloudflare?

You're not processing customer data there, are you?


I'm no Cloudflare specialist, but doesn't Cloudflare rely on DNS mappings, sending all domain traffic through Cloudflare? How would we send only www.eventsofa.de/blog through Cloudflare?


You could switch to blog.eventsofa.de.

That wouldn't solve today's traffic spike, and there is probably some lower-hanging fruit that should at the very least keep the site online. Anything sold in the last five years with an ethernet port should easily be able to handle a static site, even during rare spikes in traffic.

But the Cloudflare DNS configuration isn't something to be scared of: I used to feel similarly nervous when fiddling around with DNS, but haven't had a single problem configuring 20+ sites with Cloudflare. (I would, however, be slightly more nervous about google's reaction to the move to a subdomain this would require.)


Thanks for the long feedback; we have a long tail of content in the blog, so we won't migrate to a subdomain. Google says no problem, but we had SEO problems in the past when migrating subdomains, although everything was done by the book with redirects.


Doesn't work, but the assumption was that most blogs have a separate subdomain, so that the subdomain could be redirected to Cloudflare. That said, if you can't handle a mild HN attack, you should be wary of your DDoS resistance and consider ways to do more caching in front. It's a lot easier to just tank DDoS traffic than to reject connections based on some guessed measure of whether they are legitimate or evil, and in this case the connections are legitimate, so these filters wouldn't work anyway.


Thanks for the feedback, but we do handle our blog differently from the rest of our platform; it's low-specced and hosted separately, and we do not consider the blog essential to our service.


Nice post! Seeing that you're integrating with two different services, I'm wondering if you looked at other, more integrated options? Mapbox, HERE and LocationIQ (shameless plug) are options that offer both geocoding and maps.


Any plans to expand to other countries? Would love to see something like this in Ireland.


Event venue marketplace? I already have stock images for Ireland and Irish cities, but sadly it's not the top spot for expansion (though I would love to go there!)


Any way you could copy the text in here or are there more relevant graphics?


> In under 30 minutes we migrated our code to Stadiamaps without prior knowledge of their API, including deployment.

This surprised me. I'm completely ignorant with respect to geoinformation providers, but for some reason, I somewhat expected vendor lock-in to be much tighter.


Geocoding took longer, as the meta information about city districts etc. that needs to be mapped is different, but changing maps was easy. That said, OpenCage was nice, as their data was cleaner and we didn't have to manually change data as we had to do with Google.


OpenCage (https://opencagedata.com/) founder here if anybody has questions. Putting country hierarchies in a set of keys is a challenge for all geocoders and the output differs slightly (though we have a google-compat mode). Sometimes it's just road vs street or town vs city. England has parishes, Paris has arrondissements (equivalent to a city within a city), Berlin is a city, county and state (same outer boundaries) for example.


FAQ has:

>Just pass the coordinates as a latitude and a longitude, separated by either a comma or a (URL encoded) space and the API will automagically work out that you want to reverse geocode.

Is the reverse geocoding based on 1. government-source boundary data, or 2. proximity to tagged points from OSM or other non-authoritative sources?


Both, where a government provides open boundary data, though in most cases that would've been imported into OpenStreetMap already: https://opencagedata.com/credits
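
To make the FAQ quote above concrete, a reverse query is just the forward endpoint with "lat,lng" as the query. A minimal Python sketch (the API key is a placeholder):

    # Minimal OpenCage reverse-geocoding call (key is a placeholder)
    import requests

    resp = requests.get(
        "https://api.opencagedata.com/geocode/v1/json",
        params={"q": "52.5200,13.4050", "key": "YOUR-API-KEY"},
        timeout=10,
    )
    resp.raise_for_status()
    best = resp.json()["results"][0]
    print(best["formatted"])                # full postal address as one string
    print(best["components"].get("road"))   # component keys vary by country, as noted above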


They probably have nicely decoupled code and not too complex an integration. If you're just showing some icons on the map, all you need to know is where to place them on the overlay layer, and then query and set the bounding box for the map. That's pretty much it. The rest can be in your own code.


Exactly.


The core principles of geoinformation are the same across providers: there is a base map of geographic features, and you place your own special bits in "layers" on top. Leaflet.js is a great example of a library that works pretty much the same regardless of which provider you choose.

The competitive set is differentiated in some of their niche work (like specific-device-embedded maps, such as the ones in Tesla), but the first principles for the common developer are all the same.
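
As a concrete illustration: with a Leaflet wrapper like Python's folium, swapping providers is essentially just changing the XYZ tile URL template; your own markers and layers stay the same. A minimal sketch using the standard OSM tile server:

    # The provider is just a tile URL; your own data lives in layers on top.
    import folium  # Python wrapper around Leaflet.js

    m = folium.Map(
        location=[52.5200, 13.4050],  # lat, lon of the map center
        zoom_start=13,
        tiles="https://tile.openstreetmap.org/{z}/{x}/{y}.png",  # swap providers here
        attr="&copy; OpenStreetMap contributors",
    )
    folium.Marker([52.5200, 13.4050], popup="Venue").add_to(m)
    m.save("map.html")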


Has anyone figured out WHY Google maps did this double-digit multiplier price increase?

Were they vastly under pricing it before? Did they want to stop providing it so they priced it at a point that would make almost everyone go away?

I'm just curious about their motivation and surprised that this doesn't seem to have been figured out or even discussed.


My guess is that they realized the enterprise market was more profitable for them, so they decided to focus more exclusively there. They know their enterprise customers aren't about to jump ship, so they can do pretty much as they please due to their dominant market positioning.



Thanks.


In case it's not known by everybody: you can geocode (and maybe also reverse geocode, I didn't check) US addresses in batch with TIGER census data and PostGIS.

It's not directly related to the topic (dynamic maps, online geocoding), but somewhat relevant.

With about $300 in Amazon EC2, I geocoded 18 million addresses in 3 months. You could probably do it on your own PC in similar time (the only real hardware requirements are RAM and an SSD).

http://dracodoc.github.io/tags/Geocoding/
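
If you're curious, the core of the batch loop is just a SQL call to the TIGER geocoder; a rough Python sketch (assumes the postgis_tiger_geocoder extension and TIGER data are already loaded; the connection string is illustrative):

    # Batch geocoding against the PostGIS TIGER geocoder.
    import psycopg2

    conn = psycopg2.connect("dbname=geocoder")  # illustrative connection string
    cur = conn.cursor()
    cur.execute(
        """
        SELECT g.rating,                      -- lower = better match
               pprint_addy(g.addy),           -- normalized address as text
               ST_X(g.geomout), ST_Y(g.geomout)
        FROM geocode(%s, 1) AS g;             -- 1 = return only the best match
        """,
        ("1600 Pennsylvania Ave NW, Washington, DC 20500",),
    )
    for rating, address, lon, lat in cur.fetchall():
        print(rating, address, lon, lat)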


Did you do any verification to show the effective accuracy/spatial resolution?


There is an accuracy score in the PostGIS TIGER geocoder plugin. Usually a low value means a pretty good match. Depending on address format and quality, 70-80% of addresses got a good match.

For resolution you cannot expect too much, because street numbers are usually interpolated, and the TIGER database definitely will not be as good as the best commercial databases. However, you don't have many other solutions if you have millions of addresses to geocode.


I've migrated much of our mapping capabilities to Mapbox's API, but we're still facing mapping bills in the hundreds of dollars.

My next step is going to be to set up a caching OpenStreetMap tile server and use the https://leafletjs.com/ JavaScript library. The costs now for mapping solutions are going to drive us to run our own GIS infrastructure.


We also looked at MapBox but ended up going with local tile server from OpenStreetMap data. We are using the geofabrik.de data bundles, mod_tile and renderd to serve up the tiles, and we pre-generate the tiles so all our requests are served from disk rather than being rendered (which really slows down mapping). We are lucky that we are only dealing with one state, so our data is around 25GB for map tiles and another 10GB for "street overlay" tiles that we lay on top of aerial tiles that we generate from NAIP data.

We customized the CartoCSS a bit to get the maps closer to the way we wanted them. For example, we took out building outlines, and reduced the frequency of highway shields, which required removing exit numbers.

It's a good solution for us, but we already had some mapping experience and lots of system deployment experience, so YMMV.


Be sure you've considered OpenLayers if you're going to be doing GIS and not just mapping. It's my go-to for the front end of all my Geomatics-Robotics tools.


Stadia Maps founder here to answer any questions!


Any reason why you wouldn't self-host your tiles using something like OpenMapTiles?[0] If you're geocoding mostly US addresses, it's actually not impossible to roll your own geocoder.* If you're insane enough, I'm sure you could find some German spatial data and follow along with how the TIGER geocoder loads and normalizes information if you wanted to run an entire GIS operation in-house.

[0]: https://openmaptiles.org

* Provided you're familiar with PostgreSQL and PostGIS.


It makes sense if it's your core competency, but when it's not it's usually worth it to pay someone else to deal with it, like email.


Not VC funded, therefore no money (sadly) for such engineering; we focus on core features for our customers and only write or set up stuff we could not get anywhere else. But I would love to do some deeper GIS and mapping on top of PG.


Piggybacking off this thread for my own question.

I've a small webapp to show me my elevation based on my lat/lon. I built it with Google's API as I've used it before and it's what I know. It doesn't get much traffic so it's unlikely to ever cost much once my free credit expires but I'd like to explore other options.

What alternative (preferably free) services are there out there? I need to be able to pass lat/lon and get back the elevation above sea level of that point on earth.


That would depend on what your resolution and accuracy requirements are. Ground level or treetop? If you are willing to DIY there are freely available DEM (Digital Elevation Model) datasets that might work for you. For example ASTER[1] has global 90 meter resolution and 30 meter resolution in the USA. There are smaller, embeddable DEMs with ~1km resolution. Lots of options out there. Grab a coffee and start with an internet search for DEM.

1: https://en.m.wikipedia.org/wiki/Advanced_Spaceborne_Thermal_...
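
Once you've downloaded a DEM tile, sampling it locally is only a few lines, e.g. with Python's rasterio (the file name is a placeholder for whatever DEM you grab):

    # Look up ground elevation at a point from a local DEM GeoTIFF.
    import rasterio  # pip install rasterio

    def elevation(lat, lon, dem_path="dem.tif"):  # placeholder DEM file
        with rasterio.open(dem_path) as dem:
            # sample() takes (x, y) pairs, i.e. (lon, lat) for geographic rasters
            value = next(dem.sample([(lon, lat)]))[0]
            return None if value == dem.nodata else float(value)

    print(elevation(46.5383, 12.1357))  # e.g. a point in the Dolomites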


Cheers, I'll take a read. I'm looking for ground level (I use it to calibrate the barometer on my watch at the start of a hike/trail run/climb)


We (https://opencagedata.com/) have such an API in beta. We're not happy with the performance yet, but it might be enough for your use case.


I've been doing some interesting experiments with elevation compression recently that make it realistic to have large swathes of the earth memory-resident. If you're interested you know where to find me!


We did something similar and used Maptiler.com's vector tile server offering + MapboxGL.js on the frontend.

Using mapbox's own backend would've been too expensive for us since our app is able to do asset tracking, a use case almost all of the bigger mapping providers (we checked google, bing, here, mapbox) want to earn an extra buck on.


What most (all?) of those alternatives are missing is the satellite view and street view. I look at them almost every time I search for a place, to get an idea of the surroundings for when I'll be there. However, links to the same location on Google Maps can fix that.


Google invested hundreds of millions driving Street View cars around the world. That's hard to match: "Street View cars have traveled almost 10 million miles, covering every single continent and 83 countries in total" (http://geoawesomeness.com/googles-street-view-turns-10/). Mapbox does have satellite view.


Yeah, but Stadia Maps aren't that good. They have really bizarre issues with labeling. For instance, they would display "Bridgeport", but not "New York" at some zoom levels. And Long Island looks completely disconnected, and Manhattan is missing altogether.

https://i.imgur.com/O6v1X03.png


I don't think that's their fault but issues with the map services they rely on (which is OSM from what I gather [0]). I remember this being discussed recently on HN but can't find an article/thread right now.

[0]: https://news.ycombinator.com/item?id=17010831


Yes, it's a mix of the underlying data and Mapbox GL. This is an active area of development, both on our end and in the Mapbox open source community.



