Mapping in R seems a bit quaint at this time. If I need to do mixed visualizations including geo, I am reaching for Observable. If I only need the choropleth, Felt. If I need more spatial analysis, ArcGIS Online (it’s free!). If I need full-custom spatial or mixed statistics, geopandas.
That's a pretty good deal too, but if you just roll into arcgis.com they have a free, zero-friction signup. You log in with one of their identity providers (GitHub, Google, Apple, etc.) and get started. They give you one "Creator"-level user. I've been using it for years without a hint of being charged.
You can sign up for the personal use subscription and stick to Online. You can’t use Pro on a Mac, but the web map viewer has grown into a full-featured editor and does a lot more than viewing (data editing, styling, analysis, charts, filtering, etc.).
Bookmarked. Good post. Choropleths in R are hit or miss, and merging data into the mapping data frame remains annoyingly complicated (alluded to in the post).
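For anyone fighting the same thing, the pattern is roughly join-then-plot; a minimal sketch (the GeoJSON file and the `county_stats` data frame with its `fips`/`rate` columns are hypothetical):

```r
# Join-then-plot pattern for an R choropleth. The GeoJSON file and the
# county_stats data frame (with fips and rate columns) are hypothetical.
library(sf)
library(dplyr)
library(ggplot2)

counties <- st_read("counties.geojson")        # geometries, one row per county

map_df <- left_join(counties, county_stats,
                    by = c("GEOID" = "fips"))  # join onto the sf object so geometry survives

ggplot(map_df) +
  geom_sf(aes(fill = rate), colour = NA) +
  scale_fill_viridis_c(na.value = "grey90") +
  theme_void()
```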
Data being placed on a map often has a hierarchical element (country, state/province, county/district, town/city, etc.), and many of the data standards, such as GeoJSON, allow nesting.
While R can handle hierarchical data, it does much better with rectangular data sets.
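Concretely, the usual move is to flatten the hierarchy at read time; for example, sf's st_read() turns nested GeoJSON features into a one-row-per-feature table, which is the rectangular shape the rest of R expects (the file name below is hypothetical):

```r
# sf reads nested GeoJSON into a flat table: one row per feature, properties as columns.
library(sf)

areas <- st_read("admin_areas.geojson", quiet = TRUE)
nrow(areas)    # one row per feature
names(areas)   # feature properties become ordinary columns, plus a geometry column
```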
I make extensive use of geopandas and plotly for mapping, and it’s very difficult for me to imagine it getting all that much easier. Dump my GeoDataFrame in, specify which columns to use, and boom, you’ve got a fully interactive map with OpenStreetMap data underneath. It even has a variety of themes to choose from!
I've had a lot of trouble with plotly maps once there are too many layers, so I moved to folium, which isn't easier, but the results are more stable. (It is easier for simple plots when using `gdf.explore()`, and you can keep adding layers with `m = gdf.explore(); gdf2.explore(m=m)`.)
That being said, the tidyverse is much easier than pandas for EDA (though its less stable API is something to keep in mind when it comes to production).
ggplot2 is good, but leaflet is better at mapping these days. If you want a dynamic map, Leaflet for R is where to look: https://rstudio.github.io/leaflet/
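A minimal sketch of what that looks like, using the built-in quakes data set (the tile provider and styling are just examples):

```r
# Dynamic map with the leaflet R package; quakes ships with base R.
library(leaflet)

leaflet(quakes) |>
  addProviderTiles(providers$CartoDB.Positron) |>
  addCircleMarkers(lng = ~long, lat = ~lat, radius = ~mag,
                   popup = ~paste("Magnitude:", mag))
```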
ggplot2 is significantly more flexible. Leaflet is good for quick results that don't need to strictly adhere to cartography standards; it also handles custom projections pretty poorly.
I've been meaning to use leaflet more for the iterative parts of data exploration and map design, the kind of work that a GUI and fast refresh times can speed up by an order of magnitude.
I scrolled through the article looking for examples or instructions on how to plot maps for places other than the US, but sadly that doesn't seem to be covered.
Plenty of options, e.g. the maps package makes it trivial to pull down data for another region, and there is no shortage of projections that work well for other extents and latitudes, a number of which you can apply with ggplot2's coord_map(). My main qualm would be geom_raster(): its fast path for gridded data doesn't work with projections too far removed from Cartesian.
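For example, roughly (New Zealand and the projection choice here are arbitrary):

```r
# Non-US outlines from the maps package, reprojected with coord_map().
library(ggplot2)

nz <- map_data("nz")   # requires the maps package

ggplot(nz, aes(long, lat, group = group)) +
  geom_polygon(fill = "grey85", colour = "grey30") +
  coord_map("azequalarea", orientation = c(-37, 175, 0)) +
  theme_void()
```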
If this is for an application you're building for browsers, use Leaflet (easy, but not as flexible) or OpenLayers (more flexible, more complicated). There are also database libraries, though I've typically interacted with my database through an API I developed rather than directly from the browser. Cesium exists as well, but it's a resource hog.
If you're playing around on a local machine, R and Python have SQL interfaces that let you load the data for whatever local processing you want.
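On the R side that can be as simple as the sketch below (file, table, and column names are all hypothetical):

```r
# Pull rows out of a local SQLite file with DBI, then hand them to sf for spatial work.
library(DBI)
library(sf)

con <- dbConnect(RSQLite::SQLite(), "observations.sqlite")
obs <- dbGetQuery(con, "SELECT id, lon, lat, value FROM observations")
dbDisconnect(con)

obs_sf <- st_as_sf(obs, coords = c("lon", "lat"), crs = 4326)  # ready for local plotting/analysis
```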
There's a basic map display in SSMS if I recall correctly. But otherwise you can generate a map with something like Tableau; less of a learning curve than R.
10 thoughts on data visualization best practices and tools:
1) For interactive visualizations of data on 3D globes, I use a mix of C++, Python (for data cleaning), and Unreal Engine (with a plugin called Cesium). An example of this is at https://youtu.be/9i-tQ8Sr80o.
4) If you are trying to do 3D population density maps in R, many in the community recommend using https://www.rayshader.com/ (see the sketch after this list).
6) If you are doing data vis for urban planning, odds are they are already using ArcGIS, and odds are you will end up using something like that too.
7) If you are trying to do data vis that relates to architecture, I would actually suggest starting with Twinmotion (which is part of the Unreal Engine ecosystem).
9) If you want to show some high-end maps fast, use Geolayers 3. There is a YouTube channel called "Boone Loves Video" (https://www.youtube.com/channel/UCXyGw2OkrAzLhq1r7hyDZkA); Boone often walks through Geolayers in his videos.
10) If you are trying to get to next-gen data visualization, my best guess is that you would use a mix of Blender, Nuke, Houdini, or After Effects. I have personally only used Blender and After Effects so far.
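To make item 4 a bit more concrete, here is a minimal rayshader sketch using the package's built-in montereybay elevation matrix; a population-density map follows the same pattern but starts from a rasterized population grid, which isn't shown here.

```r
# Minimal rayshader sketch with the package's built-in montereybay height matrix.
# The texture and zscale values are just examples.
library(rayshader)

montereybay |>
  sphere_shade(texture = "desert") |>   # shade the height matrix
  plot_3d(montereybay, zscale = 50)     # extrude it into an interactive 3D scene

render_snapshot("monterey.png")         # save a still of the current view
```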