Dijkstra once noted that a programmer can reasonably expect to have to work across a range of at least thirty orders of magnitude: from dealing with individual bits all the way up to gigabyte-sized units.
Let's test this. In your career, what is the smallest level of data manipulation you have worked at, and the largest? In which direction has your career moved: toward the bare metal, or toward inhumanly large constructs?
Extra kudos to those brave punch-card veterans of the days of Mel who have survived, and even thrived, in the transition from raw binary to massive software architectures. We salute you.