Loading your Rust code into your existing KDB data lake and periodically updating it will be a significantly smaller lift than rewriting your data lake.
It sounds like you are some sort of Quant Dev on a desk, so it's really up to you what you want to do. If you push against the grain and do a data lake rewrite, you'll own the time, effort, and outcome of a big Data Engineering project. So you'd better be very right and also very fast.
If you are looking for solutions within your existing data lake, I've posted a few sources/thoughts so you can get on with your Quant Dev work.
You sound very set on some sort of rewrite, so you should do what your heart desires. Just make sure you deliver value to your desk.
> Loading your Rust code into your existing KDB data lake and periodically updating it will be a significantly smaller lift than rewriting your data lake.
This is probably going to be what I do until KDB keels over.
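For anyone in the same spot, the usual shape of that interop is compiling the Rust code as a `cdylib` and loading it into q with `2:`. A real binding would take and return kdb+'s `K` object type from `k.h` (or use a wrapper crate); the sketch below simplifies the signature to plain C doubles purely to stay self-contained, and the function name `vwap` is illustrative, not from this thread.

```rust
// Sketch of a Rust function exported from a cdylib that q could load, e.g.
//   vwap:`:libvwap 2:(`vwap;3)
// NOTE: a production kdb+ binding passes the K type from k.h for every
// argument; raw f64 pointers are an illustrative simplification here.

/// Volume-weighted average price over two parallel arrays of length n.
#[no_mangle]
pub extern "C" fn vwap(prices: *const f64, sizes: *const f64, n: usize) -> f64 {
    if prices.is_null() || sizes.is_null() || n == 0 {
        return f64::NAN;
    }
    // SAFETY: the caller guarantees both pointers are valid for n elements.
    let (p, s) = unsafe {
        (
            std::slice::from_raw_parts(prices, n),
            std::slice::from_raw_parts(sizes, n),
        )
    };
    let notional: f64 = p.iter().zip(s.iter()).map(|(px, sz)| px * sz).sum();
    let volume: f64 = s.iter().sum();
    notional / volume
}
```

A periodic q job can then call the loaded function against the day's partitions, so the Rust analytics live inside the existing lake rather than beside it.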
> You sound very set on some sort of rewrite
I vacillate between the two. I'm personally used to data engineering with Parquet and Spark, which are widely used outside of finance and don't have expensive vendor lock-in.
And then I realise that I'd have to own all of that, and my job isn't data engineering; I'm a quant dev.