Pretty darn critical! I built a service a few years ago to scrape natural gas volumes from different refineries and my client wanted that data plotted over time. Another project collected and displayed wellhead pressure data.
Oh yeah, maps are also super important, and so is being able to overlay different statistics and/or graphics on them.
thank you for sharing! i think we'll get there eventually and add components beyond tables, such as charts and maps. the current focus is to get the table+sidebar experience right first. what differentiates us from dashboarding tools is that an end user can trigger a task that fetches or acts on data, as opposed to just rearranging existing data in the UI.
could you elaborate on cron jobs being a pain point? would you use them as scheduled tasks to fetch/prepare data before a user logs in, or let users schedule jobs on their own to run async? any use cases you could share?
> could you elaborate on cron jobs being a pain point? would you use them as scheduled tasks to fetch/prepare data before a user logs in, or let users schedule jobs on their own to run async? any use cases you could share?
Sure, I've used cron jobs to collect data, send reports, and check if any users should receive push notifications or alert emails.
The pain point is that Celery can be a little tricky to configure when you're working with docker-compose. It works, but if I had to figure it all out again, it would take me a while.
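For context, a minimal sketch of the kind of setup this involves: a Celery beat schedule standing in for crontab entries. The broker URL, task names, and schedules here are illustrative assumptions, not details from the original projects.

```python
# Minimal sketch of cron-style jobs via Celery beat.
# Broker URL, task names, and schedules are illustrative assumptions.
from celery import Celery
from celery.schedules import crontab

app = Celery("jobs", broker="redis://redis:6379/0")  # assumes a Redis broker

@app.task
def collect_data():
    """Fetch the latest readings and store them."""
    ...

@app.task
def send_reports():
    """Email summaries / alert notifications to subscribed users."""
    ...

# Beat enqueues these on schedule; a separate worker process executes them.
app.conf.beat_schedule = {
    "collect-every-15-min": {
        "task": "jobs.collect_data",
        "schedule": crontab(minute="*/15"),
    },
    "daily-report": {
        "task": "jobs.send_reports",
        "schedule": crontab(hour=6, minute=0),
    },
}
```

Under docker-compose, the fiddly part is typically that this one module ends up backing three services (the app, `celery worker`, and `celery beat`) that all need the same image, code, and broker settings.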
I've had the need for things like this very often. Since we had multiple metrics to store and visualize, we went with InfluxDB and Grafana for that purpose. Maybe one could create an integration with at least InfluxDB, because it is great for time series.
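To make that concrete, here's a rough sketch of pushing one metric into InfluxDB 2.x with the official `influxdb-client` Python package. The URL, token, org, bucket, and the wellhead-pressure measurement are placeholder assumptions borrowed from the examples upthread:

```python
# Sketch of writing a time-series point to InfluxDB 2.x.
# url/token/org/bucket and the measurement names are placeholders.
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
write_api = client.write_api(write_options=SYNCHRONOUS)

# One measurement per metric; tags identify the source, fields hold the values.
point = (
    Point("wellhead_pressure")
    .tag("well", "A-12")
    .field("psi", 1432.5)
)
write_api.write(bucket="metrics", record=point)
client.close()
```

Grafana then just needs an InfluxDB data source pointed at the same bucket to chart the metric over time.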
Great, much appreciated - I'm trying to understand the screen scraping of gas volumes in order to trade nat gas. My email is my username at the very popular Google email service.