Anyone who thinks this will work has never tried to index a site. A huge amount of effort goes into figuring out whether a site is serving different content to users than to crawlers, or whether it is coded to appear visually different to humans than to machines. If you ask sites to index themselves, you will get nothing but lies.
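To make the "users vs crawlers" problem concrete, here is a crude cloaking check: fetch the same page once with a browser user-agent and once with a crawler user-agent and compare what comes back. This is only a sketch under simple assumptions; the URL is a placeholder, and real pipelines also render JavaScript, crawl from different IP ranges, and diff the rendered DOM rather than raw HTML.

```python
# Rough cloaking check: does the site serve a crawler the same bytes as a browser?
# The URL is a placeholder; real detection uses many more signals than this.
import difflib
import urllib.request

URL = "https://example.com/"  # placeholder target

UA_BROWSER = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36")
UA_CRAWLER = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url: str, user_agent: str) -> str:
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

as_browser = fetch(URL, UA_BROWSER)
as_crawler = fetch(URL, UA_CRAWLER)

# A low similarity ratio suggests the crawler is being shown something
# different from what a human visitor would see.
ratio = difflib.SequenceMatcher(None, as_browser, as_crawler).ratio()
print(f"similarity between browser and crawler responses: {ratio:.2f}")
```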
I index sites all the time and I think it could work. There will be other problems, of course, but we are already partly there with XML sitemaps. Relying on the large search engines to enforce "honesty" from websites puts them into a mediator role that has a number of negative effects, both for search in general and, increasingly, for society at large.
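The "partly there" point is that sites already publish a machine-readable list of their own URLs that crawlers consume on trust. A minimal sketch of reading one follows; the sitemap URL is a placeholder, and this ignores gzipped sitemaps and sitemap index files, which real consumers have to handle.

```python
# Read a site's self-published sitemap: URL list plus last-modified hints.
# SITEMAP_URL is a placeholder for illustration.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    root = ET.fromstring(resp.read())

# Each <url> entry is the site's own claim about what it hosts and when it changed.
for url_el in root.findall("sm:url", NS):
    loc = url_el.findtext("sm:loc", default="", namespaces=NS)
    lastmod = url_el.findtext("sm:lastmod", default="n/a", namespaces=NS)
    print(loc, lastmod)
```

The trust model is the same one being debated here: the crawler has no guarantee the listed URLs, or the content behind them, match what users actually see.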
Relying on sites to be honest about themselves is even less likely to work. Many of them have monetary incentives not to be. Plenty of sites already host dishonest, clickbait content with extreme levels of SEO, and the cost of dishonesty only drops further if they can write directly into the index.
I think that is primarily a symptom of the bottleneck we have on search interface providers. If it were easier and cheaper for new search engines and rankers to enter the market, they could fairly easily filter out unscrupulous domains.