Have you ever programmed raw machine code (not for class)? Examined a hex dump with just a hex editor (or, heck, without)? Written your own software floating-point library? Division library? Written a non-school-assignment in Lisp or Forth?

What sort of "lost arts" have been forgotten? And what reason (if any) would there be to resurrect them?

+39  A: 

Reverse engineering through disassembly... it would come back to me, though. SoftICE, anyone? ;-)

Why would it be useful? It gets one to understand the inner workings of a machine rather well... and allows one to understand client languages (e.g. C, C++) more easily.

+1 softice was amazing.
I'm doing this nearly every day. Working on SharePoint, have to know what's inside microsoft.sharepoint.dll ;-) Of course, I'm talking about .net reflection which is a modern way of saying "disassembly".
+1 I loved the advanced full-screen debugger
This is still very much alive and kicking. IDA-Pro with HexRays is tough to beat for application level reversing.
@naivists I agree, I often have to use WinDbg to debug SharePoint weirdness, so knowing assembler is very useful. Also knowing .NET IL (Intermediate Language) is crucial for tasks like this.
Jason Evans
+84  A: 
  • Free thinking
Noon Silk
Why, exactly? I'd agree with common sense, or even just freedom, but what exactly constrains our thinking?
pav: It is my experience that a lot of programmers rely on other people's opinions too strongly, and are afraid of forming their own. The world of blogging has brought this on: people are able to seem intelligent, and then end up with a bunch of followers who all take their advice as gospel, without considering it. Ironically, the same people often refer to this as 'Cargo Cult' programming. The point I am trying to get across is that you can take in comments, but the best thing is to research and decide for yourself, in your situation, what is most appropriate.
Noon Silk
"best practice is to..." says who? "..this web site that i take as gospel"... ;)
I see. Perhaps it tends to happen more in your neck of the woods. A pity -- I thought that the simple determinism of computers would make programming more, not less, like a science.
pav: You must be new to programming :) (And just to be clear, it's not at my work I notice this, it is on the forums and the internet, in general).
Noon Silk
pavpancheckha: Programming is about **people** as much as computers. Possibly more so. `</sententious>`
Personally, I consider programming to be more of an art than a science.
Anon: Some people would consider science an art :)
Noon Silk
Well, I tend to take the internet with a grain of salt. According to it, Ron Paul won the US presidency. I was thinking, with the whole "science" thing, about questions along the lines of "hash tables are always better than RB trees, some blog said so" or "that CodingHorror guy likes C#, so I refuse to use anything else."
pav: Clearly, you don't take the internet with that much disrespect or you wouldn't be asking your question on it. It's a sad fact of society, or human nature, that we need to get information from people, as it's too hard to review it all personally. But on some matters, important matters, your own personal research and thinking are critical. I think this has been lost, a bit, in our field, because we lean towards popularity, as opposed to best (this site itself is the classic example of that, it's a democracy).
Noon Silk
The irony is, if you decide my answer is correct, then did you really decide that, or did you research it and come to the same conclusion? :P It is my opinion (get ready for more irony) that too many people take opinions as research, and don't check the facts. This is what I have an issue with.
Noon Silk
Great answer. Many programmers today are in little boxes. Come out! Try stuff. Fail. Therein lies the path to enlightenment.
Richard Pennington
@silky - I was going to research the statement you made in the comments, but then I decided I didn't feel like it and I'm just going to assume you are correct. ;-)
@Richard Pennington: not when I'm paying your salary, it doesn't ;)
You could make this answer even shorter (changing a little bit the meaning): "Thinking".
Bruno Reis
+1 Yeah me too :P
+9  A: 

After five years of Eclipse, my vi-fu is seriously degrading. Combine that with a terminal without cursor keys, and I am in a bit of trouble.

that's not a lost art man, people are using it
yeah, other people ;-)
vi lives. (well in the form of gvim and vi-input mode in programs like QtCreator and KWrite)
Ali Lown
+5  A: 
  • Threading reel-to-reel tapes
  • Unjamming reel-to-reel tapes
  • Going through 1000s of reel-to-reel tapes, rewinding them periodically.

Then later we progressed to:

  • Taking TK-50 tapes apart when they go wrong
  • Taking TK-50 drives apart when TK-50 tapes go wrong
  • Asking people if they have another copy of their work when TK-50 tapes go wrong.
Martin Beckett
I assume that you look nostalgically, but not necessarily fondly, at those days. I can't imagine anyone eagerly wanting to return to the era of punch cards.
Robert Harvey
funny answer :)
+34  A: 
  • Knowing how memory pointers work (no one cares any more thanks to Java, .NET, etc.)
  • Knowing how to debug real issues. Garbage collection, taking heap dumps, etc.
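For readers who have only used managed languages, pointer semantics can still be demonstrated from Python via `ctypes`. This is a toy illustration (the array and values are invented), not a claim about how the JVM or CLR works underneath:

```python
import ctypes

# A 3-element C int array and a raw pointer to its first element
arr = (ctypes.c_int * 3)(10, 20, 30)
p = ctypes.cast(arr, ctypes.POINTER(ctypes.c_int))

assert p[0] == 10      # dereference: *p
assert p[1] == 20      # pointer arithmetic: *(p + 1)
p[2] = 99              # write through the pointer...
assert arr[2] == 99    # ...and the underlying array changes
```

The last two lines are the part garbage-collected languages hide: two names referring to the same bytes, with no reference semantics in between.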
Yes, because Lord knows we don't have enough pain already.
Robert Harvey
I program .Net and I know what pointers are and how they work. You'd be hard pressed explaining the difference between an array and a collection if you didn't.
pointers are hardly arcane. Many schools still teach C++ and many projects (legacy and new) continue to use C++.
@spence and john, that's great, I'm glad you still know. I've been working primarily in Java and .NET for the last 6 years, and have worked with plenty of contractors that have never programmed in an unmanaged language, and hardly understand how any of it works underneath. I've always been very happy that I learned C/C++ and ASM in school, and I think it'll be a terrible day when schools drop all that and just teach a managed language and that's it.
I don't think ASM/C will be dropped very soon at all. In ASM, you're completely free to do nearly anything you want. Though it may be difficult, it's an adventure to hobbyists like me. Even if they stop teaching it in school, the information will still be available on the web (that's where I learned it).
Sort of like how we don't care how a spark plug works inside. I think these types of lost arts aren't really lost arts. They're evolution. Expressive languages are like frameworks: they remove the menial tasks.
Long live C++!!!
George Edison
@orokusaki, good point. It's like no one knows or cares how the engine in their car works, and suddenly you find yourself broken down on the side of the road with the hood open, staring blankly at all the wires and hoses. So, let's all step up and be the code tow-truck! :)
You know, there are many people that program that do not build boiler plate business CRUD applications and Web based social networking clones.
Your point being?
@orokusaki: When my engine fails me, I go to a specialist, because I have neither time nor desire to learn how modern engines work (they're a lot different from the slant six in my first car). When I was young, a lot of programmers knew assembler, and now it's a specialty.
David Thornley
@David Thornley Tell me about it. I don't even clean my house. I have a lady who comes each week 3 days a week and does my laundry, and everything (and I'm not rich). If everybody and everything did what it/he/she was best at, the world would spin faster. Note: I am a fan of a large breadth of knowledge and the micro tasks have to be done by somebody just like how somebody has to flip burgers until McDonalds figures out a way to do it cheaper by machine. I'm just not a fan of solving problems that have been solved for 15 years.
+4  A: 

For a physics class I once had to enter the machine op codes using a hex keyboard.

Eric J.
Damn. A hex keyboard. I've gotta get me one of those. My CompSci teacher had a large keyboard with just the buttons 0 and 1, and Shift/Ctrl/Alt/Meta. It'd buffer every eight presses and send that code to the computer. My teacher was rather adept at using it.
You guys had it easy, all I had was toggle switches.
toggle switches? Luxury!
You had toggle switches? When I were a lad we had to individually solder each bit to the breadboard. And to set it back to zero again we had to scrub it off wi' a bit of pumice. (etc. etc. *Four Yorkshiremen* ad nauseam)
Pumice? We had to set fundamental concepts just to get *that*.
You had zeros AND ones - you were spoilt
Martin Beckett
You had zeros? all i had were thoughts i couldnt put into code
I once entered machine code with a text editor, back when text editors were text editors. (Only about 16 bytes, to reset the serial port after it had hung.)
Tom Hawtin - tackline
About 2 weeks into the class, I realized the board we were given to work with had a serial port. And there was a cross compiler available for that CPU... but I did wear down my fingers a good deal in those first two weeks ;-)
Eric J.
I once met a guy who actually coded on the C64 in this way. He knew the hex value for every opcode by heart and said it was faster to code that way...
Jonas Elfström
Sure, as long as the program he's writing doesn't require any abstraction. And branching is minimal. And he doesn't need functions :-) I know people that don't want to learn new stuff so claim the old stuff is faster, too.
Eric J.
+32  A: 

The Story of Mel is a classic of the lost arts... in blank verse, yet.

Mel loved the RPC-4000
because he could optimize his code:
that is, locate instructions on the drum
so that just as one finished its job,
the next would be just arriving at the "read head"
and available for immediate execution.
There was a program to do that job,
an "optimizing assembler",
but Mel refused to use it.
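Mel's trick, done by hand, is exactly what the optimizing assembler automated: place each instruction at the drum address the read head will reach just as its predecessor finishes executing. A toy sketch (drum size, instruction names, and timings are all invented):

```python
DRUM_WORDS = 64   # hypothetical drum circumference, in words

def place(instrs):
    """Lay out instructions so the head arrives at each one exactly
    when the previous instruction retires (times in word-times)."""
    pos, layout = 0, {}
    for name, exec_time in instrs:
        layout[name] = pos
        # While this instruction executes, the drum rotates exec_time
        # words, so that's where the next instruction should live.
        pos = (pos + exec_time) % DRUM_WORDS
    return layout

layout = place([('load', 3), ('mul', 14), ('store', 3)])
assert layout == {'load': 0, 'mul': 3, 'store': 17}
```

A naive sequential layout would instead leave the processor idle for nearly a full revolution between instructions.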

Agreed, I love this article.
Noon Silk
Hrmm... was Mel a real programmer...
Technically, I believe that's free verse (not blank) because the lines don't have ten syllables.
Michael Myers
+21  A: 

My first assembler was pirated from an Intel 8080 development system. I printed the hex dump on an ASR33 teletype and entered it by hand into my SOL80. I hand edited the I/O functions to use the SOL's cassette drive.

The SOL80 was built from a bag of 74LSxx parts and an 8080.

I relocated the SOLUS monitor from $C800 to $F800 so I'd have more room to run a (modified) UCSD Pascal system.

I don't think that hex editors existed back then. At least I didn't have one.

I wrote my first C compiler to run in 48K of memory: 5 passes: pre-processor, parser, improver (not really), code generator, and assembler. For the 6809. Yes, floating point in software and divide as well.

I ran (almost) version 7 Unix on a PDP11/23 (Venix), with a compiler written by Ritchie.

I ran BSD Unix on a VAX. With a 9-track tape drive.

I had a SparcI, a Lisa, the original Mac.

That was in the first 5 years. A lot happened since then.

You guys (at least some of you) have missed a lot.

Richard Pennington
I forgot to mention Leor Zolman and BDS C.
Richard Pennington
I wish I could give you more than +1. I had one chance to program an EPROM for a self-created 6800 monitor which included an assembler and debugger. In less than 8192 bytes. No chance to debug it—it had to work the very first time. And it did.
Yes, we indeed have missed. I just saw X11R7 on a friend's NetBSD and said (without thinking) that "but it has **ALWAYS** been R6"
+1 Wow! Wish I had done half of that :(
SOL80? Are you sure you don't mean SOL-20? That was my first computer too.
I. J. Kennedy
@I. J.: You're right. It *was* a long time ago. ;-)
Richard Pennington
+20  A: 

I still have a warm and fuzzy for Peeks and Pokes.
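For anyone who never met them, PEEK and POKE were simply raw reads and writes of the machine's address space. A minimal sketch over a fake 64 KB memory (53280/53281 are the C64 border and background colour registers, as in the comment below):

```python
mem = bytearray(64 * 1024)     # pretend flat 64 KB address space

def poke(addr, val):
    mem[addr] = val & 0xFF     # one byte, no type system, no mercy

def peek(addr):
    return mem[addr]

poke(53280, 0)                 # C64: border colour register
poke(53281, 1)                 # C64: background colour register
assert peek(53280) == 0
assert peek(53281) == 1
```

On the real machine those addresses were memory-mapped hardware, which is why a mistyped POKE could hang the box.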

Neil N
Not so fun when you mistyped an address and caused the machine to hang *hard*! ;-)
Mauricio Scheffer
POKE 53280,0 : POKE 53281,1
Jason S
+5  A: 

Paging through the disk device to recover your work after `rm * .o`
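The comments mention `strings` as the go-to recovery tool; its core idea, scanning raw bytes for runs of printable text, fits in a few lines. A simplified reimplementation (the sample data is invented):

```python
import re

def strings(blob, minlen=4):
    """Return runs of printable ASCII, like the Unix `strings` tool."""
    return [m.group().decode('ascii')
            for m in re.finditer(rb'[\x20-\x7e]{%d,}' % minlen, blob)]

# Pretend this is a chunk read straight off the disk device
raw = b'\x00\x7f\x01Makefile\xff\x00int main(void)\x02\x03ok\x00'
assert strings(raw) == ['Makefile', 'int main(void)']
```

Running something like this over the raw device node is essentially the "paging through the disk" technique, minus the squinting.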

I still consider `strings` my go-to tool for disk recovery.
ext3undelete works, though I can never get it to recover just the files I want, so I have to wait while it 'recovers' the entire partition onto a smaller partition while I delete it all again!
James Morris
+26  A: 

Debugging a program via blinking lights.

(Or blinkenlights, if you prefer)

Note to self: Hook up system that connects christmas lights to debugger state.
Noon Silk
Vote up for the Jargon File reference. Had my first read 25-and-some years ago.
This and debugging via logging, understanding the bugs logging can hide (e.g. it can make race conditions less visible). I do embedded work that interacts with other real-time systems, so setting a breakpoint generally causes more problems than are solved (since a halted program looks like a hung program to the other end of, say, a socket).
Mike DeSimone
Yes, using a toggle switch to step through tape-loaded instructions and observe the state of lights grouped by threes to get the octal codes, and using that to find which card is bad.
I actually laughed out loud at *"Das rubbernecken sichtseeren keepen das cotten-pickenen hans in das pockets muss"*.
Michael Myers
+1: you should have added toggle-switches and alligator clips ;-)
Hah, if we had a light bulb for every register... :)
+85  A: 

Programming for space efficiency seems like a lost art. Everyone nowadays assumes memory is cheap. It is unless you're working with sufficiently large datasets. I'm currently working on a project involving about a 54,000 x 32,000 matrix. Needless to say, there was some creativity involved in making everything fit in memory.
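The answer doesn't say which trick was used; one classic way to make a huge, mostly-empty matrix fit is a dictionary-of-keys sparse representation, storing only the nonzero cells. A sketch (class name and indices are invented):

```python
class Sparse:
    """Dict-of-keys sparse matrix: only nonzero cells cost memory."""
    def __init__(self, rows, cols):
        self.shape = (rows, cols)
        self.data = {}                  # (row, col) -> value

    def __setitem__(self, rc, v):
        if v:
            self.data[rc] = v
        else:
            self.data.pop(rc, None)     # storing 0 frees the cell

    def __getitem__(self, rc):
        return self.data.get(rc, 0)

m = Sparse(54_000, 32_000)              # ~1.7e9 cells, near-zero RAM used
m[40_000, 12_345] = 7.5
assert m[40_000, 12_345] == 7.5
assert m[0, 0] == 0                     # absent cells read as zero
```

A dense float64 version of the same matrix would need on the order of 13 GB; the sparse one pays only for what it touches.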

+1. I still optimize mainly for the challenge and fun than actually needing to.
One of the big gotchas is knowing when to pack data (e.g. to save disk, communication bandwidth, or cache space) and when not to (because packing/unpacking can slow you down). Also, I'm amused how little code deals with out-of-memory conditions; it's like they expect the OS to kill the process cleanly, and don't realize that they could corrupt a bunch of stuff before the OS realizes they're playing with null pointers.
Mike DeSimone
I'm sure people who develop embedded applications have to worry about space (though not as much as before).
Noufal Ibrahim
@Noufal - as one of those "embedded applications" developers, we still have requirements about code space size that limits us to KB. It is not unusual to be limited to less than 100KB of code space. That is pretty roomy compared to 10 years ago but it still requires some interesting optimizations.
I always try to keep my code optimal the way I know how, while my co-workers say "come on who cares!"
Mobile development is also still pretty space constrained (though nothing like it used to be).
Kendall Helmstetter Gelner
We game developers optimise for space efficiency quite a lot, although it's for the performance gain rather than the reduced space as such. Cache access is paramount and we often use O(N) structures instead of O(logN) because the former is faster if it's kept small enough.
I just finished a program that is supposed to run as part of our auto-update process that uses upwards of 1 gig of ram. And (with the exception of some VM weirdness) it works in the wild! +1 for you in theory and ideals anyway.
Peter Turner
+70  A: 

Punching holes in floppy disks to make them double-sided.

No thanks. The day I got my first CD-ROM is the day I stopped using floppies (well, almost).
Robert Harvey
+1 good one I had already forgotten; and if you're still on CD-ROMs, give USB sticks a try
Lol, I already forgot that. +1 'cos im smiling right now :)
that's awesome! I remember having some purchased software that came on floppies that didn't have a cut in them at all and were "read-only", which was of course circumvented with a pair of scissors :)
Really? Is that possible? I don't use floppies, but that's interesting. Anybody give me a link, please?
Basically, go look up a picture of an old 5.25" floppy. They would cut a square notch at the top of a side to let the drive know that the disk was writable (i'm not sure if something mechanically moved there, or if an optical beam passed through it) so to make a disk "read-only" you could cover it with tape. (the same thing was basically true on 3.5" floppies, they had that little plastic slider instead though). Both sides of the magnetic film inside the disk could be used, so you could cut a 2nd notch in the opposite side, and put it in a drive upside-down, and write data to the back!
why didn't anyone tell me this years ago? hehe
+1: Those sweet memories. I ended up just cutting the corner off as it was faster. But you had to be careful not to cut the disc inside. I once did but managed to use only the inner sectors. Nothing was going to waste back then... Ahhh...
-1 Belongs on Super User. :)
Andrew Grimm
@Robert Harvey. Death to all optical media! CD, DVD, Blu-ray... they are all awful.
At least when I did double-siding you had to not only cut the notch but punch the two holes to allow the light through the start-of-track marker hole. Since one end of the punch had to be inside the case to do that you had to be careful not to scratch it.
Loren Pechtel
rally25rs: the floppy drive of the c64 had a non-mechanical sensor. My - failed - invention was to misplace this sensor, so the two parts always saw each other. But the sensor's job was to check if I took out the disk.
I've yet to find the correct hole position for DVDs to make them double sided...
bit bleed was a big problem for me when using bulk fd. @vili my second real programming paycheck was spent on a c1541. the first was spent on a c64 and both came from games i wrote on a v20. ;-)
Sky Sanders
__what__??? I need to go back to 1990, __now__.
@Alex: Agreed. I'm still waiting for the perfect backup media. Lately I have been using WD Passport drives.
Robert Harvey
+43  A: 

Performance as a feature, not as an afterthought.

Robert Harvey
Security as a feature, not as an afterthought. Same thing really, it's all about good design; both security and performance should really be just side effects of it.
Depends on what sort of software. In most cases, performance isn't really an issue. In some cases, it's critical. Designing security in strikes me as more generally valuable.
David Thornley
I think a lot of people these days assume performance won't be important when they're initially writing and testing it because it's always "good enough", but then when the software is out in the real world and creaking under the strain of thousands of users and millions of lines of data, opinions may change!
+8  A: 

Writing code by wiring individual diodes because (a) you happen to have them lying around and (b) your parents don't actually give you enough allowance to let you buy an EPROM burner, and you're not quite yet skilled enough to build one yourself. Unfortunately the resulting programs are rather short. But the output can be read on the homemade logic probe!

Ah, misspent youth.

Sarah K
Is the output smellable?
James Morris
... So if you used LEDs, you could watch your code execute? Sounds like a trippy way to analyze code coverage.
Mike DeSimone
Well, you could watch the address and data buses. And single-step the CPU clock using individual pulses, provided you were playing with a CPU with static registers. Output was only smellable if something went wrong and the magic smoke got out...
Sarah K
+21  A: 
loool... +1 for ingenuity and constructive criticism.
+1 This is not a lost art for me :) I do lot's of this :)
Like recognizing XR a,a does the same as LR a,[0] but takes less space and does not reference memory.
+9  A: 

Classic viruses that embed themselves in other executables. I haven't heard of any modern malware that's not related to controlling botnets and sending spam, and certainly no malware comes close in complexity to ZMist.

P. S.

No, this should not really be resurrected. Only for academic purposes.

Alex B
+1 for metamorphic code
Executables? *Boot sector* viruses, man! I think I still have an LHA archive containing my collection of Amiga viruses somewhere...
Michael Borgwardt
+16  A: 

Using DOS extender (such as EMX/XMS) drivers to access memory beyond 640KB.

Not as dead as you might think. DOS still has its niches.
+4  A: 

Woz did some really genius things in his designs. Creative ways to share memory. Also, the way he found out how to have the Apple II display in color using a quasi-phase-shifting technique. Building new hardware and writing drivers for it (graphics display, keyboard/driver).

He did some really cool stuff. Very inspirational!

Also, cool when guys repurpose wii remotes and other hardware.

How is repurposing a wii-mote a lost art?
Pete Kirkham
Yeah, you're right, it's not a lost art, but it shows people doing similar things to Woz, just not necessarily for business purposes. It displays free thinking (like the earlier answer to this question)
+15  A: 

In the early days we did really weird things:

  • Listening to a speaker which was connected to a register (pc) to debug a mainframe (no joke).
  • Replacing paper tape reader by hard and software.
  • Incorporating the VIC-20 floppy drive into calculations, because its processor was just as capable.
  • Writing interrupt routines to switch between text and graphics mode in one screen.
  • Drawing and erasing lines by xor.
  • Using video RAM for code since 3.5 KB were not enough.
  • Rewriting machine code lost by a chewed-up tape.
  • Writing code for self soldered acoustic coupler (predecessor of modem).
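The XOR line trick from the list works because XORing the same pattern twice restores the original pixels, so erasing is just a second draw: no need to save and restore the background. A tiny sketch on a fake 1-bit framebuffer (dimensions and coordinates invented):

```python
W, H = 8, 8
fb = [[0] * W for _ in range(H)]    # toy 1-bit framebuffer

def xor_hline(y, x0, x1, pat=1):
    """Draw a horizontal line by XOR; calling it again erases it."""
    for x in range(x0, x1 + 1):
        fb[y][x] ^= pat

xor_hline(3, 1, 6)                  # draw
assert fb[3][4] == 1
xor_hline(3, 1, 6)                  # XOR again: line is gone
assert all(v == 0 for row in fb for v in row)
```

This is why XOR was the standard for rubber-band lines and cursors: the restore path is free and exact.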
Using video RAM didn't stop all that long ago. I had some DOS software that used the 1 MB on my VGA card in my 386 with 4 MB of RAM.
The awesome things I saw people do with Commodore floppy drives... The one I remember was a Huffman compression algorithm for the serial port to double the 250 byte/sec bandwidth. (Yes, it was that slow; it was implemented with bit-banging.) But then that got trumped by Don Lancaster using his Postscript printer as a computer. He'd write programs in PS, send them to the printer, and the output (like a Bode plot of an active filter response) comes out on paper with more resolution than the screens could do then.
Mike DeSimone
+1 for Rewriting machine code lost by a chewed-up tape.
The video memory for code thing is coming back with CUDA and other stream processors.
The video RAM for code thing: back in the C64/VIC-20 days, those CPUs were synchronous: a memory access happened in one cycle, with no support for wait states. So some designers made the video circuit access the RAM on the opposite clock phase from the CPU, since the CPU finished a read on the rising clock edge. Another video/code reuse story I heard was that in some Atari 2600 games, the programmers would reuse sections of code as the sprite bitmaps for things like explosions and wreckage, to save ROM.
Mike DeSimone
Changing hard drives (looked like taking the agitator drum out of a top-loading washing machine). Loading up a nine-track tape.
David Thornley
+2  A: 

The ability to take a program that was approx. 40k assembly instructions and shrink it by half to fit on a smaller computer (I can't remember the name of the person who did this, but it was someone who worked on the Macintosh, I think).

+15  A: 

"Think before act"

Not so long ago, processing time was expensive and you had to make sure that your program would run on the first try. You were asked to write basic paper documentation, get it reviewed, code your program and, finally, run it.

Now, everybody has a hammer.

Rubens Farias
and nowadays everything is multi-core so when your code goes haywire and busyloops on a server shared by other people you can actually kill the process yourself before waking up the scary rootperson.
+21  A: 

Parking hard disk's heads :)

Landing zones and load/unload technology

Do you remember DOS park utility?

Nick D
I once forgot to park the heads on a Ferranti 286 PC and then proceeded to take it across a bumpy road on a small trolley with solid wheels. You can imagine the result :(
Anything to keep your precious 20 MB of data safe ;-).
+1 for dos park
my first harddrive was a used seagate st220 (20mb) for only 500 bucks.
@stacker: HA! My HD was 30 MB! :-) BTW, I didn't have a PC but a PS computer. It was (I still have it) a good computer, but I could buy hardware/peripherals only from IBM :/
Nick D
+24  A: 

Writing self modifying code !
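The flavour of self-modifying code can be shown safely with a toy von Neumann interpreter where code and data share one memory, so a store can patch an upcoming instruction's operand, much like the mainframe "gates" described in the comments. Entirely invented instruction set:

```python
def run(mem):
    """Toy machine: code and data share `mem`, three cells per
    instruction, so STORE can rewrite a later instruction."""
    pc, acc = 0, 0
    while True:
        op, a, b = mem[pc], mem[pc + 1], mem[pc + 2]
        if op == 'HALT':
            return acc
        if op == 'ADD':
            acc += a
        elif op == 'STORE':
            mem[a] = b          # the self-modification happens here
        pc += 3

prog = ['ADD', 5, 0,
        'STORE', 7, 100,        # patch the operand of the next ADD...
        'ADD', 1, 0,            # ...which now adds 100, not 1
        'HALT', 0, 0]
assert run(prog) == 105
```

Reading such a program means tracking not just where control goes, but what the code will say by the time control gets there; that's exactly why it's both fun and frowned upon.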

This is one of those things that should never come back.
Mike DeSimone
Sure it breaks all today's coding practices, but that was a lot of fun.
It never went away.
Pete Kirkham
A large company I work for now has some production code that is self-modifying. It's in mainframe assembler, and the program is written in a clever way - it switches branches off/on at another areas of the program (such a branch is called "gate"). Reading it is kinda like playing Chaos Strikes Back (a PC RPG game famous for the teleports and changes that affect other part of the game).
Was anyone else forced to write self modifying code to get a common `Interrupt(int x)` function? Intel was nice enough to *only* put in `int imm8` for the interrupt instruction, so if you needed a dynamic interrupt number, it was pretty much forced upon you.
Still used for JIT compilation, dynamic recompilation, and malware.
+8  A: 

Fixing the partition table and FAT file system by hand using a simple hex editor.

Writing a small linux application in C from scratch to restore a damaged RAID set (3 of 4 disks had hardware errors when the server crashed).

+1 for Threading reel-tapes. Been there, done that. (Tandberg tape drive for Norsk Data servers)
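Fixing a FAT volume by hand amounts to poking the right little-endian fields at the right offsets of the boot sector. A small sketch (the values are invented; the offsets follow the standard FAT BIOS Parameter Block layout: bytes/sector at 0x0B, sectors/cluster at 0x0D, boot signature 0x55AA at 0x1FE):

```python
import struct

boot = bytearray(512)                      # blank boot sector

struct.pack_into('<H', boot, 0x0B, 512)    # BPB: bytes per sector (LE)
boot[0x0D] = 4                             # BPB: sectors per cluster
struct.pack_into('<H', boot, 0x0E, 1)      # BPB: reserved sector count
boot[0x1FE:0x200] = b'\x55\xAA'            # boot signature

bps, = struct.unpack_from('<H', boot, 0x0B)
assert bps == 512
assert boot[510:512] == b'\x55\xaa'
```

With a hex editor you did the same thing by hand: find offset 0x0B, type the bytes backwards, and hope you counted right.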

Christian Vik
I used to modify FAT12 on floppy disks and create *magic* folders :-)
Nick D
+1 I remember now!
eh, I recently had to fix a partition using a hex editor cause windows 7 messed it up :-/
  1. Algorithm
  2. Edlin
Square Rig Master
+9  A: 

Classic BASIC (before QuickBasic), with line numbers and GOTOs and GOSUBs, but without any loops other than FOR loops (because you used IF and GOTO to implement WHILE loops). Allowed for the best spaghetti in town.
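The IF/GOTO-as-WHILE pattern can be mimicked with an explicit line-number dispatch. Here's a sketch of a classic BASIC sum loop (the program and variable names are invented):

```python
def run_spaghetti():
    # 10 LET I=1 : LET S=0
    # 20 LET S=S+I
    # 30 LET I=I+1
    # 40 IF I<=5 THEN GOTO 20
    # 50 END  (result in S)
    i = s = 0
    line = 10
    while True:
        if line == 10:   i, s, line = 1, 0, 20
        elif line == 20: s, line = s + i, 30
        elif line == 30: i, line = i + 1, 40
        elif line == 40: line = 20 if i <= 5 else 50
        elif line == 50: return s

assert run_spaghetti() == 15    # 1+2+3+4+5
```

Every jump target is a bare number, which is why inserting a line in the middle (and the RENUM command mentioned below in the comments) was such an adventure.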

You give QuickBasic too much credit. GW-Basic had WHILE loops, and there were 8-bit BASICs at the same time that even had proper named subroutines.
I remember doing basic on my Apple IIe. Always code in multiples of 10 in case you make a mistake.
Scott Chamberlain
Scott: some versions of BASIC had a RENUM command as a remedy
@ammoQ: Yes, and unless you were really careful, there went your table of contents (this routine starts at line 1000, that one at 1400, etc.).
David Thornley
+41  A: 

Tweaking your config.sys & autoexec.bat files to squeeze every free byte you possibly could from your 640 KB, so a game would run.

Very similar to that scene in Apollo 13 where they are trying to cut as much as possible out of the normal startup process due to low power!

Spending all night changing the order drivers were loaded in config.sys to get enough low memory for AutoCAD to run
Martin Beckett
Falcon 3.0 had brutal demands...
I managed to run Wing Commander 3 with just 4MB of RAM!
Frerich Raabe
I remember finding a smaller mouse driver from somewhere to free up 5 or 6 precious kilobytes.
And I remember having a keyb.sys-replacement which needed under 1k!
I remember this too. I actually got so good at it that I was **PISSED** when Win95 came out and "everyone" could just do as they pleased without needing to know how to config their memory. Origin games were the worst for needing this back then, too.
+2  A: 

Word alignment and endianness. Two bugbears that I repeatedly watch people come a cropper over while starting out in embedded systems programming (and then have to help out with).

Jon Ward
Agreed! I have seen this come up a lot outside of embedded systems programming as well.
+7  A: 
  1. Programming to handle out-of-memory errors and allow the user to recover gracefully.

    When I was programming the mac, each call to malloc would be followed by a check to see if an out of memory error occurred, and if so, to handle the error in a way that would warn the user, allow them to save work, and quit. Basically we would keep around a block of memory for this purpose, and free it up so it would be available for saving.

    Of course, if the user ran out of memory a second time, he /she was SOL.

  2. Locking main memory to prevent concurrent access.

    The mac had these things called "Handles". Basically they added an extra level of indirection to all memory access, so that they could recompact memory - move it around. Since you only stored the "Handle" - a pointer to a pointer to memory - and not a pointer itself, this was possible.

    However, you would need to lock memory down whenever accessing it, to prevent it from moving around, then release the lock afterwards.

    It was a royal pain in the butt :)

  3. Squaring up punch cards to optimize the chance of them making it through the feeder without flying all over the place.

  4. line-oriented editors

    At my undergraduate school, we had to write a compiler - it was basically a 2 foot high printout of code, written from scratch, by a team of 3-4. To make things worse, we didn't use yacc, but a tool written by the instructor, who also worked on compilers for IBM.

    We had to do this entirely with "ed" as "vi" was deemed too computationally intensive to be used by everyone.

    I did get really good at regular expressions though, as the fastest way around a file was by search, and it was even faster to do a text match, bind, and substitution. Of course I was doing this pretty much blind, as I didn't even bother to see what I was working on. It was really a lot faster than using a mouse, cut and paste, etc. and luckily vi has the same capabilities.

    It's a lot harder to do it, though, without being able to see the screen :) You COULD print out lines of code, but you weren't really "operating" on it.
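The Handle scheme from item 2 is just double indirection plus a lock bit: the application keeps an index into a master-pointer table, so the heap can move blocks during compaction as long as nobody is holding a raw pointer. A sketch with invented API names (not the real Mac Memory Manager calls):

```python
class HandleHeap:
    """Sketch of relocatable memory: a handle is an index into a
    master-pointer table, so compaction can move unlocked blocks."""
    def __init__(self):
        self.blocks = []                 # master pointers
        self.locked = set()

    def new_handle(self, data):
        self.blocks.append(bytearray(data))
        return len(self.blocks) - 1      # the handle itself

    def deref(self, h):
        return self.blocks[h]            # only valid until compaction!

    def lock(self, h):   self.locked.add(h)
    def unlock(self, h): self.locked.discard(h)

    def compact(self):
        # Simulate moving every unlocked block to a new "address".
        for h, blk in enumerate(self.blocks):
            if h not in self.locked:
                self.blocks[h] = bytearray(blk)

heap = HandleHeap()
h = heap.new_handle(b'hello')
heap.lock(h)                 # pin before taking a raw reference
buf = heap.deref(h)
heap.compact()
assert heap.deref(h) is buf  # locked block did not move
heap.unlock(h)
heap.compact()
assert heap.deref(h) == bytearray(b'hello')  # same bytes, new place
```

Forgetting the lock/unlock pair around a dereference was the classic way to earn a crash that only reproduced under memory pressure.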

Larry Watanabe
Isn't #2 basically what C#'s `fixed` blocks do?
Michael Myers
The idea is similar; however, it is only required in C# within unmanaged code. Imagine programming in C# where ALL your code was unmanaged and you had to do this for every memory access!
Larry Watanabe
+1 ed is the standard text editor!
+15  A: 
  1. TSRs!
  2. ModeX
  3. Staring at my old PC-XT wide mouthed when I first played a game that used RealSound - That was a neat hack. Digitised speech on the PC speaker!
Noufal Ibrahim
Do you not mean the "Terminate and Stay Resident" TSR's? ;)
Yes. That's the one!
Noufal Ibrahim
+1 for TSR's, they're fun!
+7  A: 

Palette-based graphic techniques like palette-cycling animation. I miss the 'old days' of graphics programming until I have to do bitmap rotations and alpha transparency at a remotely usable speed :)
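Palette cycling animates by rotating the colour table while the index bitmap stays untouched, so a full-screen "animation" costs a handful of palette writes per frame. A minimal sketch with invented colours:

```python
# Pixels store palette indices and never change; animating means
# rotating the palette, not redrawing the bitmap.
pixels = [0, 1, 2, 3, 0, 1, 2, 3]
palette = ['red', 'green', 'blue', 'black']

def shown():
    """What the screen would display right now."""
    return [palette[i] for i in pixels]

def cycle():
    palette.insert(0, palette.pop())    # rotate palette one step

first = shown()
cycle()
assert shown() != first                 # whole frame 'moved' for free
for _ in range(3):
    cycle()
assert shown() == first                 # a full cycle restores the frame
```

Waterfalls, fire, and marquee effects in old games were often exactly this: static pixels, spinning palette.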

With a full 256-colour palette! I made my first ball-and-paddle game with palette rotation. Wish I could find that code :)
+1  A: 

Programming in AMOS, that's what got me hooked on programming.

+3  A: 


[image]

David Johnstone
+28  A: 

The ability to work without constant access to online reference material (and Google and Stackoverflow).

+1 This might be a lost art, but in today's world, someone who knows everything about a specific language has to have some kind of super memory. Take PHP, for example. It has loads of functions, loads!!! I can remember some of them, but not most, and the ones I do remember I have to look up the parameters for :P
Yes, but even for stuff that has full documentation, figuring out how to do something can take several days... with SO it can sometimes get an answer in 10 min, or it might take days, but you can "spawn a parallel SO process" while you work on other stuff yourself!
+2  A: 

Typing over 50 pages of code from dot matrix paper into a Tandy 1000 to reprogram it completely after it was corrupted... Only to find that it didn't fix anything.

+3  A: 

We used to debug via different frequency Beeps from the PC speaker. So the windows service would play a variety of tunes depending upon what it was doing (seriously, and not my idea). It even made it into a rather large mining company.

Sidenote: The funny thing is, when we eventually took it out, the customer complained and we had to reinstate the functionality!!!

+12  A: 

Designing Ansi/Ascii screens for BBS's!!!

+4  A: 

Handling the ray-return (vertical retrace) event on an EGA monitor to make UI animations smooth.

George Polevoy
Seems like most knowledge relating to EGA graphics is gone. :)
And yet EGA graphics are still the future of TI calculators... :)

Debugging DSP software with only a logic analyser to watch the program address & data bus, then manually disassembling the hex into TMS320 assembler to work out where we'd got to. You had to remember how deep the instruction queue was so that you knew how many instructions to ignore when you hit a branch.

Plus the beautiful old TMS320s had a pin called "XF" (External Flag) and you would use the SXF and RXF instructions to flip it to a 1 or a 0, and then stick a logic probe on the pin to see what was happening -- a 1-bit debugging interface!

Erasing and reprogramming UV EPROMS.

+1  A: 

Using a DSL framework to generate a generator-DSL, to code a generator which generates DSLs for the domain model.

George Polevoy
+1, Yay, did this!
+9  A: 

Charging money for your code!

Square Rig Master
+3  A: 

Whistling into the phone to check whether the mainframe was "up" before connecting the phone to the acoustic coupler.

Buying a terminal emulator package only to read in manual that I had to write a Z80 device driver before the program would work (and then writing it).

The important difference between byte pointers and word pointers on Data General machines.

Debugging IBM Assembler code from an ABEND dump.

+6  A: 

Opening COMMAND.COM in an editor and changing the error messages.

'Bad Command or Filename' to 'Deleting C:*.*. Please Wait'.

A couple of friends went through and changed virtually every OS error once--this was back on the TRS-80.
Loren Pechtel

From what I've seen:

  • reading the manual
  • deep suspicion of IDEs, wizards, ...
  • integration testing
  • patience
  • treating design, coding, compiling and testing as separate steps
  • code reviews
  • thinking beyond the current version of a product
+4  A: 

Two things:

  • the ability to focus on a problem without giving in to distractions easily (or, as a corollary, the ability to willingly tear yourself away from IMs, mail, news, the phone, etc.).
  • the ability/will to think about what can be wrong when debugging a program, and not just mindlessly fire up the debugger and start tracing threads around and changing variables on-the-fly.

To rephrase the last point: more often than not I find myself doing trial-and-error bugfixing rather than trying to understand why things are happening.

Just because computers are fast does not always mean it will be faster to try out all possible combinations. Even though I never lived through them, I miss the discipline of "the old days", when you wanted to make sure your program was correct because you might have only one single chance to compile it and run it that day.

+2  A: 

There are none.

The golden age of computing, and hence of the lost and arcane arts, was the brief period between Babbage getting the Difference Engine on-line and Ada Lovelace saying to him, "Hey Charley, we can probably use this to win on the horses!".

Since then it has been either downhill, or uphill, or pretty well flat, depending on your point of view and how rose-tinted your spectacles are.

+8  A: 

JavaScript seems like a lost art these days. I couldn't count the number of questions on Stack Overflow which read something like this:

How do you add two numbers using jQuery?

You've got to be kidding me.
well, thankfully not literally, but in essence; some people are learning jQuery and not JavaScript, and don't even seem to realise the two are related.
JavaScript a lost art? Man, you must be young; or I am significantly older than I'd like to think :)
Noon Silk
+1  A: 

Using TSRs and switching between operating systems within a running program! I had to acquire data on an old CAMAC system running AMCA-80 on a Z-80. There were no data storage devices for that system. The machine could run CP/M as well, and for that we had a single floppy drive. We managed to switch from AMCA-80 to CP/M in order to store the acquired data on the floppy, and then jump back for the next run. This was the time when I knew every single byte by name ;-) and used tokens and short variable names to refer to string fragments in order to conserve RAM. Those were the times...

+1  A: 

Embedding assembly programs inside Basic programs on the TRS-80. It had to be done very carefully because you never knew where it would be executing--relative jumps only. Furthermore there were a few values that would break the structure and likely take out your whole program. Writing code without being allowed to use a zero was a pain.

Getting the teacher to leave alone any line with embedded stuff was even more of a pain. On the Model Is they displayed as gibberish and couldn't be edited by the normal editor. In class we only embedded graphics, but she was determined to figure out how it worked and couldn't get it through her head that making no changes in the editor would still trash the line.

Loren Pechtel

How to program keypunch drums.

Getting Domino's to deliver to the keypunch line (the line of people waiting for their turn at a keypunch)

What to do with buckets of chaff (the holes punched out of cards).

David Zimmerman
+1  A: 

Graphics programming with BGI

int gdriver = DETECT, gmode, errorcode;
initgraph(&gdriver, &gmode, "");   /* autodetect; BGI driver files in cwd */
errorcode = graphresult();
if (errorcode == grOk) {
    line(0, 0, getmaxx(), getmaxy());
    closegraph();
}

Mode 0Dh EGA graphics programming for DOS. Among the plethora of examples out there covering Mode 13h VGA (which, I freely admit, kicks ass), good luck finding a working example(*) that shows how to plot pixels in EGA.

Now you might be thinking, "yeah, whatever, EGA deserves to die." Fair enough, but perhaps you haven't played Quest for Glory, one of the greatest DOS games ever made. Or what about King's Quest IV, released in 1988, a cutting edge game that pushed the limits of home user PC hardware at the time. This stuff is a huge part of my heritage as a programmer.

(*) P.S. If you know of such an example, please help. :)

I added some useful info for you there... ;)
That is fantastic! Thanks.

Mine would have to be writing a TSR to trap the Ctrl+Alt+Del key... I still have it lying around somewhere...

The other is writing a Screen Design Aid in Borland C. This was inspired by my computer diploma course back in college days (1996), which had an AS/400 system with a program called SDA, in which you could draw a data-entry screen for inputs that interacted with COBOL and RPG IV.

That was what inspired me to write a program that wrote directly to video memory (remember 0xB800:0000 for colour and 0xB000:0000 for monochrome?) for fast writing/reading of the memory, dumping it to disk; then I wrote a small library to read the dumped data back out into video memory, including the ASCII lines and corners...

Remember those flashy boxes generated by entering ALT+240-250 somewhere...still have it on my netbook as I type! :D And creating a file on the DOS command line that was impossible to delete because there was a hidden ALT+254 character which was an ASCII blank and deleting it would fail. :)

edlin FILE[ALT+254].TXT
Invalid command or filename.

Edlin was some editor; in fact, I did grow to like it...

Happy days, Best regards, Tom.


In my experience, strong kung-fu at the command line is a lost art among many IDE-oriented developers. I often help people with problems that are simple if one knows (a) Bash (b) the Unix find command and especially (c) regular expressions.

Michael Easter
+2  A: 

Writing an intro for the C64, where each CPU tick was aligned to 8 pixels drawn on the screen, and every instruction took 1-3 CPU ticks. You wrote an interrupt for the screen hitting a given horizontal line, then read the current horizontal position of the cathode ray in the monitor (which could be in one of three positions, depending on the length of the instruction the CPU was executing when the interrupt fired). Then you jumped into a list of NOPs (one instruction each), depending on the cathode-ray location, to pixel-align the execution flow of your program to the redraw of the screen. Then you switched background colours on each screen line to implement bouncing bars for your scrolltext.

Great great times.

+5  A: 

Bootstrap loading a computer...

A lot of flight simulators still use the old Encore workstations in a Master/Slave configuration where you have to punch in a series of instructions in Hex on the keypad to load them.

Watching someone do this, or doing it myself (I'm only 25), is like stepping back in time into the computer room in GoldenEye, where you're surrounded by blinking lights, tape backup drives, and Printronix printers.

Let's just say, when you do finally get it to load right (because it doesn't always take on the first permutation), it's hard not to do an "I am invincible!".

Evan Plaice
+2  A: 

Editing punch cards with sticky tape!

Implementing simple "delete" and "insert" edits on a card copy machine by stopping the appropriate card from moving with your finger.

James Anderson

Compiling from unsaved code in the editor's buffer - Turbo Pascal had this feature. If the code ran, you saved the file. Otherwise your system crashed; you would then have to reboot and start over.

Paul Keister