A little while back I decided to learn C#, which has been a fairly rewarding experience, as the language seems very easy to pick up.
That is, with the exception of the terminology. It's not that there's loads of jargon (that comes with learning any new language/technology) - it's that the terms used seem to be awkward, unclear, or unnecessarily complex. An "array" seems like quite an easy thing to explain to someone because of its name alone, compared to (say) IEnumerable collections.
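To make that concrete, here's a rough C# sketch (the class and variable names are just made up for illustration) showing an array and an IEnumerable<int> being consumed in exactly the same way - the concepts overlap heavily, but only one of the names describes itself:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class NamingExample
{
    static void Main()
    {
        // An "array" - the name alone suggests an ordered row of items.
        int[] numbersArray = { 1, 2, 3 };

        // An IEnumerable<int> - the same basic idea of "things I can walk through",
        // but the name takes a paragraph to explain to a newcomer.
        IEnumerable<int> numbersEnumerable = Enumerable.Range(1, 3);

        // Both are consumed identically.
        foreach (int n in numbersArray)
            Console.WriteLine(n);

        foreach (int n in numbersEnumerable)
            Console.WriteLine(n);
    }
}
```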
But this goes beyond any specific language/platform. Take the pairs below:
- functions/subroutines vs. methods
- errors vs. exceptions
- temporary tables vs. common table expressions
- user-owned tables vs. schemas (in SQL Server)
I appreciate these aren't exactly equivalent terms, but the "original" terms just seem to fit better and are therefore arguably easier to explain or understand.
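For what it's worth, here's a minimal C# sketch of the first two pairs (purely illustrative, with made-up names like `Square` and `TryParsePositive`): what older languages would call a "function" or "subroutine" becomes a static "method", and what might once have been an error code becomes a thrown "exception":

```csharp
using System;

static class TerminologyDemo
{
    // In BASIC/C terms this is just a "function"; in C# it's a (static) "method".
    static int Square(int x) => x * x;

    // The older style: signal a problem with a return value (an "error").
    static bool TryParsePositive(string text, out int value)
    {
        return int.TryParse(text, out value) && value > 0;
    }

    static void Main()
    {
        Console.WriteLine(Square(4)); // 16

        if (TryParsePositive("42", out int parsed))
            Console.WriteLine(parsed); // 42

        // The newer style: signal a problem by throwing an "exception".
        try
        {
            int value = int.Parse("not a number");
            Console.WriteLine(value);
        }
        catch (FormatException ex)
        {
            Console.WriteLine($"Parse failed: {ex.Message}");
        }
    }
}
```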
So ... Are all the "good" names for things already taken?
Or are the concepts simply becoming more complicated and therefore more distant from day-to-day language? Are there counter-examples to this trend? (Obviously there are lots of tasks which are now much simpler, but that is generally the result of an external library/the OS/etc. doing things a programmer previously had to do, rather than the same work being better described.)
Or am I utterly wrong, and is my mind simply warped from learning BASIC as a kid (as per Dijkstra)?