Don't think it's the case - there will be more desired ML projects than people capable of implementing them. ML is like electricity 100 years ago or programming 40 years ago, we haven't applied it yet to most problems of society.
The problem is it's not as useful as many people seem to think. I often hear my colleagues suggest ML for anything remotely complicated, even something like "measuring body fat percentage using electricity", which in reality only needs a physical equation.
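To make the body-fat example concrete: consumer scales use bioelectrical impedance analysis, where the measured impedance feeds a simple published regression rather than any learned model. A minimal sketch of that shape, with placeholder coefficients (the real ones come from validated studies and also include age/sex terms):

```python
def body_fat_percent(height_cm, weight_kg, impedance_ohm):
    """Estimate body fat % from a bioimpedance reading.

    The core BIA term is the "impedance index" height^2 / Z.
    The coefficients below are illustrative placeholders, NOT a
    validated equation -- real devices use published regressions.
    """
    lean_mass_kg = 0.5 * (height_cm ** 2) / impedance_ohm + 0.2 * weight_kg + 5.0
    fat_mass_kg = weight_kg - lean_mass_kg
    return 100.0 * fat_mass_kg / weight_kg
```

The point is that the whole "model" is three coefficients in a closed-form equation; there is nothing for ML to add here.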
I've even heard people suggest it for web scraping which seems absolutely crazy to me.
It can make a lot of sense for web scraping. If you have lots of target sites, you can either build strict extraction rules and update them constantly, hand-build something generic (often very hard), or train classifiers for the content you want.
I could actually see a use case for web scraping. If you're after particular pieces of content that aren't accessed in a structured way, on a site that rate limits you to the point of being restrictive, maybe using a bit of NLP could help you rank links to click.
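The link-ranking idea doesn't need anything fancy to start with. A minimal sketch, assuming you have the candidate links' anchor text and a handful of terms describing the content you're after (a real system might swap the raw token overlap for TF-IDF or a trained classifier):

```python
import re
from collections import Counter

def tokenize(text):
    # Lowercase alphanumeric tokens; crude but good enough for anchor text.
    return re.findall(r"[a-z0-9]+", text.lower())

def score_link(anchor_text, target_terms):
    # Score = how many target terms appear in the link's anchor text.
    tokens = Counter(tokenize(anchor_text))
    return sum(tokens[t] for t in target_terms)

def rank_links(links, target_terms):
    # links: list of (url, anchor_text) pairs; best candidates first,
    # so a rate-limited crawler spends its budget on likely hits.
    return sorted(links, key=lambda l: score_link(l[1], target_terms), reverse=True)
```

Usage with hypothetical URLs: `rank_links([("/reports/2023", "2023 annual financial report"), ("/gallery", "photo gallery")], ["financial", "report"])` puts the report link first, which is exactly the "which link do I click next" decision under a tight rate limit.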
I tried using OCR to scrape Facebook profiles while simulating web-browsing behavior. It helps a lot in avoiding account blocks but is still too slow to be practical.
I'm really just curious about this approach and wanted to test it, since most older scraping methods fail on Facebook. My take is that it's possible with enough resources, since it's actually pretty hard to distinguish this kind of traffic from real usage.