It might help you to think about this in terms of how tail-call optimisations are actually implemented. That's not part of the definition, of course, but it does motivate the definition.
Typically when a function is called, the calling code will store on the stack any register values that it will need later. It will also store a return address, indicating the next instruction after the call. It will do whatever it needs to do to ensure that the stack pointer is set up correctly for the callee. Then it will jump to the target address[*] (in this case, the same function). On return, it knows the return value is in the place specified by the calling convention (register or stack slot).
For a tail call, the caller doesn't do all of this. It doesn't save any register values, because it knows it won't need them later. It sets up the stack pointer so that the callee will use the same stack space the caller did, and it doesn't set itself up as the return address; it just jumps to the target address. Thus, the callee will overwrite the same stack region, it will put its return value in the same location that the caller would have put its return value, and when it returns, it will not return to its caller, but to its caller's caller.
Therefore, informally, a function is tail-recursive when it is possible for a tail call optimisation to occur, and when the target of the tail call is the function itself. The effect is more or less the same as if the function contained a loop, and instead of calling itself, the tail call jumps to the start of the loop. This means there must be no variables needed after the call (and indeed no "work to do", which in a language like C++ means nothing to be destructed), and the return value of the tail call must be returned by the caller.
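To make the "no work left to do" condition concrete, here's a small sketch (the function names are made up purely for illustration):

// f stands in for any function; its body doesn't matter for the example
int f(int x) { return x * 2; }

int not_a_tail_call(int x) {
    // Not a tail call: the addition still has to happen after f returns,
    // so the caller's stack frame must be kept alive across the call.
    return 1 + f(x);
}

int is_a_tail_call(int x) {
    // Tail call: f's result is returned unchanged and nothing remains to do,
    // so the caller's frame can be reused and the call can become a plain jump.
    return f(x);
}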
This is all for simple/trivial tail recursion. There are transformations that can make a function tail-recursive when it isn't already, for example introducing extra parameters that carry information down to the "bottom-most" level of the recursion, so that work which would otherwise be done on the "way out" is instead done on the way down. So for instance:
int triangle(int n) {
    if (n == 0) return 0;
    return n + triangle(n-1);
}
can be made tail-recursive, either by the programmer or automatically by a smart enough compiler, like this:
int triangle(int n, int accumulator = 0) {
    if (n == 0) return accumulator;
    return triangle(n-1, accumulator + n);
}
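Written out by hand, the loop that the optimised version of this second triangle effectively becomes would look something like the following (my own sketch of the idea, with triangle_loop as an illustrative name, not actual compiler output):

int triangle_loop(int n) {
    int accumulator = 0;
    while (n != 0) {          // each iteration stands in for one recursive call
        accumulator += n;     // accumulator + n, as in the tail-recursive version
        n -= 1;               // n-1 becomes the new n, reusing the same stack frame
    }
    return accumulator;
}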
Therefore, the former function might be described as "tail recursive" by someone who's talking about a smart enough language/compiler. Be prepared for that variant usage.
[*] Storing a return address, moving the stack pointer, and jumping may or may not be wrapped up in a single opcode by the architecture, but even if not, that's typically what happens.