This response has nothing to do with anything. The singularity as Kurzweil describes it is marked by tangible events, like the development of a true AI or at least a machine learning algorithm capable of discovering and proving its own concepts. I have no idea how this relates to your very abstract singularity (a trick of perspective?).
Kurzweil attempts to privilege certain technological milestones as substantively "different" from other technological milestones (e.g. "true" AI)--and thus claims that their consequences for human culture are uniquely unpredictable.
My point is that this is just an assertion, not a prediction. Every future development obscures, to some extent, our ability to predict the future of human culture. Look far enough into the future along any line of inquiry (legal, artistic, religious, energy, biology, etc.) and there is a singularity beyond which we cannot predict. That's just a property of trying to predict the future in general, not some special property of AI.
We may not be able to predict future developments in musical composition, but we can predict that song writers will probably not convert the entire mass of the solar system into musical instruments over a two week period.
The same cannot be said of self-improving AIs. Kurzweil's Singularity is not about the difficulty of long-term predictions; it is about the progress function becoming so steep that even short-term predictions become impossible.
Why can't the same be said of self-improving AIs? This strikes me as exactly the sort of awesome-sounding but unsupported assertion that leads people to roll their eyes at the Singularity crowd.