Historians seem to generally think that Hitler's regime never would have existed if the West had not bent Germany over a barrel in 1918 and immediately after.
I mean, if we're going to ruminate over alternate timelines, why fast-forward to the 1930s?
Not exactly; it was rather that the Treaty of Versailles was painful enough to breed resentment but not harsh enough to cripple Germany. Even so, Weimar Germany managed to stabilize the situation for about a decade; it was only the Great Depression that finally broke the Republic's back (and even then, there were all sorts of political shenanigans that could have been maneuvered better).
Furthermore, the foreign policy of the Nazis was informed more by their ideological myths than by external events. After all, the Nazis admired the great imperialist powers like the British Empire as part of the "Aryan race". Their enmity was directed at Eastern Europe and the Communists, which had little to do with the imposition of post-war reparations on Germany.