In Java/C++, for example, do you casually say that 'a' is the first character of "abc", or the zeroth?
Do people say both and it's always going to be ambiguous, or is there an actual convention?
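For concreteness, here's the situation I mean, as a trivial Java sketch of my own (not taken from any reference):

```java
public class ZerothOrFirst {
    public static void main(String[] args) {
        String s = "abc";
        // charAt(0) returns 'a', the character at index 0.
        // Is that the "first" character or the "zeroth"?
        System.out.println(s.charAt(0)); // prints: a
    }
}
```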
A quote from the Wikipedia article on Zeroth:
"In computer science, array references also often start at 0, so computer programmers might use zeroth in situations where others might use first, and so forth."
This would seem to support the hypothesis that it's always going to be ambiguous.
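As a quick illustration of what the quote means by array references starting at 0 (again, just a sketch of my own in Java):

```java
public class ArrayStart {
    public static void main(String[] args) {
        int[] a = {10, 20, 30};
        // a[0] is the initial element: "zeroth" to some people, "first" to others.
        System.out.println(a[0]); // prints: 10
    }
}
```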
Thanks to Alexandros Gezerlis (see his answer below) for finding this quote, from How to Think Like a Computer Scientist: Learning with Python by Allen B. Downey, Jeffrey Elkner and Chris Meyers, chapter 7:
"The first letter of "banana" is not a. Unless you are a computer scientist. For perverse reasons, computer scientists always start counting from zero. The 0th letter (zero-eth) of "banana" is b. The 1th letter (one-eth) is a, and the 2th (two-eth) letter is n."
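Downey's banana example, transcribed into Java as a sanity check (my own sketch; the book's original is Python):

```java
public class Banana {
    public static void main(String[] args) {
        String s = "banana";
        System.out.println(s.charAt(0)); // prints: b  (the "0th" / zero-eth letter)
        System.out.println(s.charAt(1)); // prints: a  (the "1th" / one-eth letter)
        System.out.println(s.charAt(2)); // prints: n  (the "2th" / two-eth letter)
    }
}
```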
This seems to suggest that we, as computer scientists, should reject the natural semantics of "first", "second", etc. when dealing with 0-based indexing systems.
This quote suggests that perhaps there ARE official rulings for certain languages, so I've made this question [language-agnostic].