Dijkstra once noted that a programmer can reasonably expect to have to work within a range of at least thirty orders of magnitude, from dealing with individual bits all the way up to gigabyte-sized units.

Let's test this. In your career, what was the smallest level of data manipulation you have worked on, and the largest? In which direction has your career moved: toward the bare metal, or toward inhumanly large constructs?

Extra kudos to those brave punch-card veterans of the days of Mel who have survived and even thrived in the transition from raw binary to massive software architecture. We salute you.


Lowest: hand-soldering chips. Up through VHDL, hand-assembled machine code, burning my own ROMs, assemblers, macro assemblers, C, C++, and the highest level would probably be Python.

But I'm still going to vote this question down because it is too subjective.

Adam Pierce

Lowest level

ARM assembler, and even below that: some sort of HDL (hardware description language) whose name I can't remember

Highest level

Business process orchestration with BPEL; domain-specific languages

Nicolai Reuschling
+2  A: 

Lowest level: debugging my code with an oscilloscope (did that recently, that was fun)

Highest level: not sure how to measure this, but the Wikipedia database dump is something like 2 terabytes uncompressed. I also built a compiler and runtime system for a proprietary scripting language in the early 1990s (and then wrote a whole pile of code on top of that).

Greg Hewgill
Now that is a huge range!
Had to do that oscilloscope debugging a year ago. It was not really fun, but at least it proved that the bug was not within my code.
+13  A: 

I am and always have been a software engineer; I have no formal hardware training. During university I always expected to go into something really abstract like language design. In reality I stumbled into the embedded C industry and found, to my great surprise, that I enjoy it.

On the low end:

  • I've used a PCI sniffer to debug memory issues.
  • I once wrote a chunk of code that had to execute in 10 µs with a tolerance of less than 1 µs.
  • I had one issue where I ended up having to prove to the digital designers that the actual hardware behavior differed from the Verilog code used to generate it.
  • I was party to a bug fix that involved using command-line poke calls to rewrite the body of a function so it was a byte shorter, letting us insert a NOP at the particular point where the CPU was prone to double-incrementing the PC.

On the high end:

  • I used FUSE to write a filesystem to manage the 2 TB we have floating around on our home network. (High for the 2 TB; I guess since it's a filesystem it could also go on the low list.)
  • I recently wrote a Python RPC system that uses introspection to implement the RPC in a way that is almost totally transparent to the code on either end.
  • I've written a couple of machine learning systems in Lisp and Prolog.
  • And finally, I wrote some code that was directly responsible for power cuts in a smallish South American country. For the record, my implementation was perfect; the design was flawed.
+1 for "my implementation was perfect the design was flawed."
Chris Lutz
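The introspection-based RPC idea mentioned above can be sketched in a few lines of Python. This is a minimal local illustration (a JSON string standing in for the network transport, with hypothetical class names), not the author's actual system:

```python
import json

class Service:
    """Example server-side object; its methods are discovered by name."""
    def add(self, a, b):
        return a + b
    def greet(self, name):
        return f"hello, {name}"

class RpcServer:
    """Dispatches a serialized request to the target object via introspection."""
    def __init__(self, target):
        self.target = target
    def handle(self, request):
        call = json.loads(request)
        method = getattr(self.target, call["method"])  # look up the method by name
        return json.dumps(method(*call["args"]))

class RpcProxy:
    """Client-side stub: any attribute access becomes a remote call."""
    def __init__(self, transport):
        self.transport = transport
    def __getattr__(self, name):
        def call(*args):
            reply = self.transport(json.dumps({"method": name, "args": args}))
            return json.loads(reply)
        return call

server = RpcServer(Service())
client = RpcProxy(server.handle)  # a real system would put a socket here
print(client.add(2, 3))           # 5
print(client.greet("world"))      # hello, world
```

The transparency comes from `__getattr__` on the client and `getattr` on the server: neither side needs stubs generated ahead of time.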
+1  A: 

Lowest: building a simple CPU and memory in VHDL, then writing an assembler for the thing.

Highest: a web portal aggregating WS-*-based services.

Torbjörn Gyllebring

The lower level is easy: after learning to program on my TI-57 calculator, I learned assembly language on one of the early microprocessor boards, with only a hex keypad and an LED numeric display. It was really low level because we had to write the opcodes on paper, use a table to convert them to hex digits, and compute the offsets by hand. It was fun, except there was no save facility (later, programs could be saved to audio tape...).
You had to send bits to certain hardware addresses to light the digit segments. Of course, I made a cyclic animation...

I suppose there was an even lower level, when you had to flip switches to set the bits and read them back on little lights...

I have also programmed radio altimeters, analyzing analog signals (thus using an oscilloscope for "debugging") and using a digital analyzer to check signals on the processor's pins.

Highest level? Not sure. Professionally, I am working on a large Java system that analyzes raw data from mobile phone networks. It has several levels of abstraction.
Using high level scripting languages like Lua or JavaScript (although not on complex systems) gives a taste of this level.


I transition from high levels of abstraction down to the bit level for ASIC verification, then even lower when I need to see the physical waveforms that the software produces. The high level consists of managing automation systems that gather large amounts of data from unit tests, managing different hardware configurations, and so on. Making tools that help debug and manage this process is challenging. Since I've done memory characterization, I guess I've worked in the region between bit transitions, where the clock meets the "eye". I find data mining and report generation easy compared to UI design (command line). Trained as a hardware engineer, I learned software and moved from programming the metal up to systems and then software architecture. I can still enjoy days of debugging a bit on intermittent hardware, though.


The lowest level was actually writing machine code into a Commodore PET for a high school computer science class, though others close behind would include working with breadboards using AND, OR, NAND, NOR, NOT and XOR gates, Commodore PET assembly, and LOGO.

On the highest level, there was the abstractly abstract model I had at one place, where ASP code on the page led to ASP classes, which led to C/C++ object-model code, which led to Oracle functions that updated the database. The other was what we called the triplet database: a few tables, the big one being four IDs to a row, namely an identity, an item ID column, and then a pair of string IDs that encoded a name/value pair for that item. That was rather funky to see, and it became a nightmare once there were a few hundred thousand rows in it.

JB King
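The "triplet database" described above is what is now usually called an entity-attribute-value (EAV) layout. A minimal sketch (hypothetical table and column names, using Python's built-in sqlite3) shows why reassembling a record gets funky: it takes one self-join per attribute.

```python
import sqlite3

# The "triplet" layout: each row stores one attribute of one item.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE triplet (id INTEGER PRIMARY KEY, item_id INTEGER, name TEXT, value TEXT)"
)
rows = [(1, "color", "red"), (1, "size", "large"), (2, "color", "blue")]
con.executemany("INSERT INTO triplet (item_id, name, value) VALUES (?, ?, ?)", rows)

# Reassembling one logical record needs a self-join (or pivot) per attribute:
cur = con.execute("""
    SELECT c.value, s.value
    FROM triplet c JOIN triplet s ON c.item_id = s.item_id
    WHERE c.name = 'color' AND s.name = 'size' AND c.item_id = 1
""")
print(cur.fetchone())  # ('red', 'large')
```

With a handful of rows this is fine; with hundreds of thousands of rows and many attributes, every query turns into a stack of these self-joins, which is the nightmare the answer describes.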
+1  A: 
Michael Kohne
+2  A: 

Lowest: worked on an experimental computer that had reprogrammable microcode, designed for high-performance floating point (intended to rival Cray- and CDC-style vector performance at lower cost). One could redefine some assembler opcodes to use the underlying stack-based microcode on the FP functional units, making A·X+Y a single op, for example.

Now try to write an optimizing Fortran compiler for that beastie...

Or, if you mean the smallest data unit: that was some other floating-point code where I emulated specialized hardware that kept special codes in the last two bits of otherwise-IEEE-754 floats. All FP operations were defined in terms of how those last two bits would combine in their results, and those results were independent of the arithmetic involved.

Largest: Enterprise backup schemes.
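The tagged-float scheme above can be illustrated with a short Python sketch, assuming the tag lives in the two low-order mantissa bits of a 64-bit IEEE 754 double; the combining rule shown (bitwise OR) is purely an illustration, not the actual hardware's rule:

```python
import struct

def float_bits(x):
    """Raw 64-bit pattern of an IEEE 754 double."""
    return struct.unpack("<Q", struct.pack("<d", x))[0]

def tag_of(x):
    """The two low-order bits, treated as an out-of-band code."""
    return float_bits(x) & 0b11

def with_tag(x, tag):
    """Overwrite the two low-order bits (perturbs the value by at most 3 ULPs)."""
    bits = (float_bits(x) & ~0b11) | (tag & 0b11)
    return struct.unpack("<d", struct.pack("<Q", bits))[0]

a = with_tag(1.5, 0b10)
b = with_tag(2.25, 0b01)
# Ordinary arithmetic on the values; the tags combine by a separate rule
# (here: bitwise OR, an assumption for illustration only).
result = with_tag(a + b, tag_of(a) | tag_of(b))
print(tag_of(result))  # 3
```

The key property from the answer holds: the tag of the result depends only on the input tags, not on the arithmetic performed on the numeric values.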


Using an oscilloscope to verify a serial connection was my lowest level.

The highest-level code would be a PHP interface to SQL.

Paul Nathan

Lowest: using x86 assembly to simulate the functions of an elevator. Highest: some buzzy orchestrator using J2EE/JMX/MBeans, etc. Didn't like it all that much.

Now: C/C++-based projects involving large-scale data analysis.

Sridhar Iyer

Assembly on an old-school 8-bit single-board computer, assembly drivers, embedded C code, all the way up to working on expert systems on TI Lisp Machines.

It's all interesting.

Will Hartung
+3  A: 

Lowest: writing out the microcode on paper, converting it to binary, and then hand-loading it through an 8-switch (one-byte word) panel on a Speer Micro-Linc 300 computer.

Highest: SAS, Java, C#, COBOL


Coding .com files in an editor ("copy con") using an assembler opcode reference.

+3  A: 

Lowest: I've made my own transistors in a physics lab.

Highest: Haskell -- patches to darcs, xmonad.


lowest level: wrote my own disk I/O library in 6502 assembly language for the Apple ][ to increase the amount of data we could store on a floppy disk

median level: wrote object-oriented languages and language extensions, developed frameworks and class libraries for a new OOPL

highest level: wrote a near-real-time monitor to watch all other systems and applications across all servers in 7 states (a poor man's event bus)

Steven A. Lowe

Lowest level:

  • MACRO32 code to simulate a "ramdisk" for an application otherwise written in VAX-COBOL (with DCL and Datatrieve thrown in for good measure)

  • bit-twiddling in COS-310's DIBOL (Digital's Business-Oriented Language; think COBOL without the words) to shave a byte or two off the record length so we could fit another couple of records on a floppy disk.

  • Writing a code-generator in VAX-BASIC that would interpret output from FMS (a screen/forms package) and generate the COBOL code to be INCLUDEd for reports and maintenance programs.

Highest level:

  • Almost anything in .NET qualifies, since it's built on such high levels. It was almost too easy to write a program that scanned my MP3s and copied them to directories based on their ID3 tags when there was an ID3-tag-decoding library to build upon.

  • Anything in SQL

  • Any report writer. I've used Crystal a lot over the past two years.
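The MP3-sorting idea above boils down to mapping tags to a destination path. A hedged sketch in Python (the tag-dict layout and the `dest_path` helper are hypothetical; a real program would read the tags with an ID3 library and then copy each file to the computed path):

```python
import re
from pathlib import PurePosixPath

def dest_path(tags, root="Music"):
    """Build an Artist/Album/Title.mp3 path from an ID3-style tag dict."""
    def clean(s):
        # Replace characters that are unsafe in file names, with a fallback.
        return re.sub(r'[\\/:*?"<>|]', "_", s).strip() or "Unknown"
    artist = clean(tags.get("artist", "Unknown"))
    album = clean(tags.get("album", "Unknown"))
    title = clean(tags.get("title", "Untitled"))
    return str(PurePosixPath(root) / artist / album / f"{title}.mp3")

print(dest_path({"artist": "AC/DC", "album": "Back in Black", "title": "Hells Bells"}))
# Music/AC_DC/Back in Black/Hells Bells.mp3
```

The point of the answer stands: with the tag decoding handled by a library, the remaining logic is just this kind of string manipulation.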

+1  A: 

IBM 1620 assembler: it wasn't a fixed-word-length computer.


Low level:

  • Z80 Assembly (very simple scrolling routines and the like)
  • x86 Assembly (college assignment - DOS mouse routine calls, screen writing)
  • Hex-editing the startup message on an MS-DOS 6 box ;) (Hey, I was young...)

High level:

  • VBA Office 97 Macros
  • Adding an additional abstraction and API on top of a FUSE module for pluggable features
  • Mainframe and minicomputer programming in various 4GLs (including the dreaded COBOL). This was government work that paid out thousands of cheques in a nightly run.

My career is moving in whatever direction allows me to code in C, Perl and PHP on *nix (although I will, of course, use whatever the job requires).

John Barrett

The lowest level was machine language for a simple CPU I coded in VHDL, with a Harvard architecture and a pipeline. I could actually modify the architecture to fit my instruction set... that's low. XD

yan bellavance

lowest: machine language (binary).

highest (level of abstraction): writing builders in Groovy

Ray Tayek

When I was in school, I was coding "Demos" with a friend of mine for the PC platform.

We had no 32-bit assembler at the time (software was prohibitively expensive, and Turbo Pascal's integrated assembler could only do a meager 80286), but we needed a VERY fast implementation of Bresenham's line-drawing algorithm for texture mapping. (This was DOS times, so we had neither DirectX nor real graphics cards, only the terrible memory controllers with attached DACs/CLUTs available back then... they were called "VGA cards", lol...)

So what we did was write the whole thing out in pseudocode during a history class (it was about ancient Rome, and I didn't care about that), then translate it BY HAND into 80386 assembler and HAND-ASSEMBLE the whole thing on paper into hexadecimal code using the 80386 reference book.

It took us about a week, including the necessary optimizations, but when I finally emitted the code into the Turbo Pascal wrapper, it worked like a charm.

Turing Complete
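For reference, here is the algorithm they hand-assembled, rendered in a high-level language: a straightforward Python version of Bresenham's integer-only line rasterizer (a generic textbook sketch, not their 80386 code):

```python
def bresenham(x0, y0, x1, y1):
    """Classic Bresenham line rasterization: integer arithmetic only,
    one pixel per step, works in all octants."""
    points = []
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy  # running error term; no division, no floats
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return points

print(bresenham(0, 0, 4, 2))
# [(0, 0), (1, 1), (2, 1), (3, 2), (4, 2)]
```

The appeal for hand assembly is visible here: the inner loop is nothing but additions, shifts, and comparisons, all of which map directly onto 80386 instructions.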

Lowest: debugging on-the-wire communications for a consumer banking application back when we couldn't assume that folks were just "on the internet"... so I had to deal with dial-up, handshaking, etc. (I still remember reading the raw bytes from the COM port monitor in real time and understanding what was going on.)

Highest: doing analysis and research on a multi-billion-document web corpus (terabytes of data).