More than just tradition, it’s because (at least in C) if a is an array, its name decays to a pointer to its first element, and a[i] is defined as *(a + i) (that is, the i’th element of a is just the contents of memory address a + i). In particular, a[0] = *a.
The first element of the array lives at zero offset from the address pointed to by a, so is considered the ‘zeroth’ element.
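To see that concretely, here’s a minimal sketch in C (the array name and values are just illustrative); it prints the same value for a[i] and *(a + i) at every index, and for a[0] and *a in particular:

    #include <stdio.h>

    int main(void) {
        int a[] = {10, 20, 30, 40};

        /* a[0] is the element at zero offset from the start of the array */
        printf("a[0] = %d, *a = %d\n", a[0], *a);   /* both print 10 */

        for (int i = 0; i < 4; i++) {
            /* a[i] is defined as *(a + i): the value i elements past a */
            printf("a[%d] = %d, *(a + %d) = %d\n", i, a[i], i, *(a + i));
        }
        return 0;
    }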
Yeah. It has nothing to do with intuition or mathematical sense; quite the contrary. It’s just a quirk of one language that came to be treated as “the natural way” after decades of reinforcement. There were other languages before C where array indices started at 1.
That's right. Before, and after too. R, for example, has indices starting at 1. But I was thinking about the specific 0-based convention. I guess I was wrong about it though.