Since most CLIs are really only terminals (pretty dumb ones mostly, but sometimes with color), the only cross-platform way I've ever done this is by allocating multiple physical lines per virtual line, such as:
        2
f(x) = x  + log x
               2
It's not ideal but it's probably the best you're going to get without a GUI.
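As a rough sketch of that approach (the render_super_sub helper is just a name I made up for illustration, not anything standard), you could build each virtual line out of three physical ones:

#include <stdio.h>

/* Hypothetical helper: renders one "virtual" line as three physical
   lines, with sup placed at column sup_col on the line above base,
   and sb at column sub_col on the line below. */
static void render_super_sub(const char *base,
                             const char *sup, int sup_col,
                             const char *sb, int sub_col) {
    printf("%*s%s\n", sup_col, "", sup);  /* superscript line     */
    printf("%s\n", base);                 /* the base line itself */
    printf("%*s%s\n", sub_col, "", sb);   /* subscript line       */
}

int main(void) {
    /* Prints the f(x) example from above. */
    render_super_sub("f(x) = x  + log x", "2", 8, "2", 15);
    return 0;
}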
Following your extra information as to which platforms you're mainly interested in:
With Ubuntu at least, gnome-terminal runs in UTF-8 mode by default, so the following code shows how to generate the superscripts and subscripts:
#include <stdio.h>

/* UTF-8 encodings of the superscript digits: U+2070 and U+2074..U+2079,
   with one, two and three coming from Latin-1 (U+00B9, U+00B2, U+00B3). */
static const char *super[] = {"\xe2\x81\xb0", "\xc2\xb9", "\xc2\xb2",
    "\xc2\xb3", "\xe2\x81\xb4", "\xe2\x81\xb5", "\xe2\x81\xb6",
    "\xe2\x81\xb7", "\xe2\x81\xb8", "\xe2\x81\xb9"};

/* UTF-8 encodings of the subscript digits U+2080..U+2089. */
static const char *sub[] = {"\xe2\x82\x80", "\xe2\x82\x81", "\xe2\x82\x82",
    "\xe2\x82\x83", "\xe2\x82\x84", "\xe2\x82\x85", "\xe2\x82\x86",
    "\xe2\x82\x87", "\xe2\x82\x88", "\xe2\x82\x89"};

int main(void) {
    int i;

    /* The formula from above, on a single line. */
    printf("f(x) = x%s + log%sx\n", super[2], sub[2]);

    /* Every superscript and subscript digit in turn. */
    for (i = 0; i < 10; i++) {
        printf("x%s x%s ", super[i], sub[i]);
    }

    /* Multi-digit superscripts and subscripts. */
    printf("y%s%s%s z%s%s\n", super[9], super[9], super[9], sub[7], sub[5]);

    return 0;
}
The super and sub char* arrays are the UTF-8 encodings for the Unicode code points for the numeric superscripts and subscripts (see here). The given program will output my formula from above (on one line instead of three), then another test line with all the choices, and a y-super-999 and z-sub-75 so you can see what they look like.
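On a UTF-8 terminal whose font has glyphs for these code points, the two lines of output look like:

f(x) = x² + log₂x
x⁰ x₀ x¹ x₁ x² x₂ x³ x₃ x⁴ x₄ x⁵ x₅ x⁶ x₆ x⁷ x₇ x⁸ x₈ x⁹ x₉ y⁹⁹⁹ z₇₅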
MacOS doesn't appear to use gnome-terminal as its terminal program, but references here and here seem to indicate that the standard terminal understands UTF-8 (or you could download and install gnome-terminal as a last resort).
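If you want to be defensive about it, one option (my addition, not part of the code above) is to query the current locale's codeset with POSIX nl_langinfo before deciding which style of output to use:

#include <langinfo.h>
#include <locale.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Adopt the user's environment locale, then ask for its codeset. */
    setlocale(LC_ALL, "");
    if (strcmp(nl_langinfo(CODESET), "UTF-8") == 0) {
        /* Safe to emit the UTF-8 superscripts/subscripts. */
        printf("f(x) = x\xc2\xb2 + log\xe2\x82\x82x\n");
    } else {
        /* Fall back to the multi-line rendering shown earlier. */
        printf("        2\nf(x) = x  + log x\n               2\n");
    }
    return 0;
}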