200 MB of data is not a large file, and Chromium tabs have a ridiculously low memory limit, so actual large 20-100 GB datasets render this useless.
This echoes my thoughts exactly. Right now we're actually more limited by the JS UI: a couple hundred MB is the most you can do in a browser before the UI becomes really slow. There's a lot of room for improvement - we're using React and it's triggering a lot of unnecessary re-renders at the moment. We'll probably need to build our own DAG-based task management system and use Canvas to render everything; with all that, workflows on much larger files will hopefully become usable.
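To give a rough sense of the direction (this is just a sketch with hypothetical names, not our actual code), the DAG side mostly boils down to walking task dependencies in order and only ever running each one once:

```typescript
// Sketch of a DAG task runner (hypothetical names, not our real API).
type TaskId = string;

interface Task {
  id: TaskId;
  deps: TaskId[];            // tasks that must finish first
  run: () => Promise<void>;  // the actual work, e.g. parse/filter/aggregate
}

async function runDag(tasks: Task[]): Promise<void> {
  const byId = new Map(tasks.map(t => [t.id, t] as const));
  const started = new Map<TaskId, Promise<void>>();

  function runTask(id: TaskId): Promise<void> {
    const existing = started.get(id);
    if (existing) return existing; // each task runs at most once
    const task = byId.get(id);
    if (!task) return Promise.reject(new Error(`unknown task: ${id}`));
    // Assumes the graph is acyclic; a real version would detect cycles.
    const p = Promise.all(task.deps.map(runTask)).then(() => task.run());
    started.set(id, p);
    return p;
  }

  await Promise.all(tasks.map(t => runTask(t.id)));
}
```

The hoped-for win over the current React setup is that a change to one input would only invalidate the tasks downstream of it, rather than re-rendering everything.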
This is certainly true - I'm not saying "large file" in the "big data" sense, but rather a file you might want to open in Excel/Google Sheets. I've worked with actual large datasets - upwards of 500 GB - pretty often before, and I really wouldn't think about using my laptop for such a thing!
We are thinking of building data connectors to major DBs, though, so you should be able to do the same style of visual analysis while keeping the compute on your DB.
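As a very rough sketch of what that could look like (the step and function names here are made up, not a real API), each visual step would compile down to SQL so only the small result set ever reaches the browser:

```typescript
// Hypothetical sketch: compiling a visual "filter + aggregate" step to SQL
// so the database does the heavy lifting instead of the browser.
interface FilterStep {
  column: string;
  op: ">" | "<" | "=";
  value: number;
}

interface AggregateStep {
  groupBy: string;
  metric: string;                 // column to aggregate
  fn: "sum" | "avg" | "count";
}

function compileToSql(table: string, filter: FilterStep, agg: AggregateStep): string {
  // A real connector would use parameterized queries, not string interpolation.
  return (
    `SELECT ${agg.groupBy}, ${agg.fn}(${agg.metric}) AS value ` +
    `FROM ${table} ` +
    `WHERE ${filter.column} ${filter.op} ${filter.value} ` +
    `GROUP BY ${agg.groupBy}`
  );
}

// Example: only the grouped result ever gets pulled into the tab.
console.log(compileToSql(
  "orders",
  { column: "amount", op: ">", value: 100 },
  { groupBy: "region", metric: "amount", fn: "sum" },
));
```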
I looked up the limit, and as of 2021 tabs seem to be capped at 16 GB, which is a moderate size for an in-memory dataset. However, I know WASM has a hard limit of 4 GB without Memory64. Data size is all relative.
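If anyone wants to see what their own browser will actually grant, a quick probe against plain 32-bit wasm memory (my own throwaway snippet, nothing official) looks roughly like this:

```typescript
// Probe how much 32-bit wasm memory the current tab will actually grant.
// Pages are 64 KiB; 65536 pages is the 4 GiB ceiling without Memory64.
const PAGE_BYTES = 64 * 1024;

function maxGrowablePages(): number {
  const memory = new WebAssembly.Memory({ initial: 0, maximum: 65536 });
  let pages = 0;
  try {
    // Grow in 1024-page (64 MiB) steps until the engine refuses.
    while (pages < 65536) {
      memory.grow(1024);
      pages += 1024;
    }
  } catch {
    // RangeError: the engine would not hand over more memory.
    // In practice the tab can also just get killed by the OS instead.
  }
  return pages;
}

console.log(`granted ${(maxGrowablePages() * PAGE_BYTES / 2 ** 30).toFixed(1)} GiB`);
```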