It's an artifact of how Rough.js renders sketchy lines: it draws them by applying small random offsets to the path. For some random values, this produces a visible wobble on long lines, which is not ideal; the effect is much less pronounced on shorter lines. The latest release tries to reduce it, but there is still a small probability it can happen: https://roughjs.com/posts/release-4.0/
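A simplified illustration of the idea (this is not Rough.js's actual algorithm, and `sketchyLine` is a made-up name): if the endpoints are jittered and the midpoint is bowed by an amount that scales with line length, long lines visibly wobble while short ones barely move.

```javascript
// Hypothetical sketch: jitter the endpoints a little and displace
// the midpoint perpendicular to the line. Because the bow scales
// with line length, the artifact grows on long lines.
function sketchyLine(x1, y1, x2, y2, roughness, rand = Math.random) {
  const length = Math.hypot(x2 - x1, y2 - y1);
  // Midpoint displaced perpendicular to the line, scaled by length.
  const bow = (rand() - 0.5) * roughness * length * 0.05;
  const mx = (x1 + x2) / 2 - bow * (y2 - y1) / length;
  const my = (y1 + y2) / 2 + bow * (x2 - x1) / length;
  const jitter = () => (rand() - 0.5) * roughness * 2;
  return {
    start: [x1 + jitter(), y1 + jitter()],
    mid: [mx, my],
    end: [x2 + jitter(), y2 + jitter()],
  };
}
```

With a 1000px line the midpoint can drift dozens of pixels, while a 20px line stays within a pixel or so; clamping that bow is presumably the kind of fix the 4.0 release made.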
It doesn't redraw the whole canvas. The optimizations are really only needed when 'filling' a shape: if you draw a large polygon, filling it with bricks can take a little longer.
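A naive sketch of why filling is the expensive part (assumed for illustration, not the project's actual code): every block cell in the shape's bounding box has to be tested and painted, so cost grows with the filled area rather than the outline length.

```javascript
// Hypothetical brick fill: walk the bounding box cell by cell,
// run a point-in-shape test, and paint a brick for each hit.
// A large polygon means many more cells to test and paint.
function fillWithBricks(insideFn, width, height, paint) {
  let bricks = 0;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      if (insideFn(x, y)) { // point-in-shape test per cell
        paint(x, y);
        bricks++;
      }
    }
  }
  return bricks; // number of bricks painted
}
```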
Optimizations can be made by using WebGL instead.
That's the goal. It's mainly the 2D Path API that's missing.
There are some interesting technical limitations around fill vs. stroke. But for a fun hobby project, there's only so much time. :)
Can't you just replicate the entire 2d context api and pass through to an appropriately scaled (1px per block) offscreen canvas? Then pull the image from that and legoify it up to the display scale. That seems like a lot of unfun boilerplate to write, but it's ultimately an easier way to implement this, and it would give you all the gnarly parts of canvas without having to pull in external libraries ¯\_(ツ)_/¯
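A hypothetical sketch of that passthrough idea (none of these names come from the project): forward every 2d-context call and property to an offscreen context drawn at 1px per block; a separate step would then read the pixels back and paint each one as an enlarged brick. The offscreen context is mocked here as a call recorder so the sketch runs without a browser.

```javascript
// Assumed approach: a Proxy forwards the whole 2d-context surface,
// so most of the "boilerplate" collapses into one trap.
function makePassthroughContext(offscreen) {
  return new Proxy(offscreen, {
    get(target, prop) {
      const value = target[prop];
      if (typeof value === "function") {
        // Forward method calls (fillRect, arc, fill, ...) untouched;
        // coordinates are already in block units (1px = 1 block).
        return (...args) => value.apply(target, args);
      }
      return value;
    },
    set(target, prop, v) {
      target[prop] = v; // fillStyle, lineWidth, ...
      return true;
    },
  });
}

// Mock standing in for an OffscreenCanvas 2d context.
const calls = [];
const mockCtx = {
  fillStyle: "#000",
  fillRect(...args) { calls.push(["fillRect", ...args]); },
};

const lego = makePassthroughContext(mockCtx);
lego.fillStyle = "red";
lego.fillRect(0, 0, 4, 4); // a 4x4-block square on the offscreen grid
```

In a real implementation the target would be a genuine `OffscreenCanvas` 2d context, and `getImageData` on it would supply the per-block colors for the enlarged redraw.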
True, but having the raw parts gives more flexibility when animating or changing sections of the image. And when the canvas size is large, a full passthrough wouldn't be as efficient.