Does anyone have any experience designing an organization-wide backup strategy?
I run the network for a school with a mix of Linux and Windows machines. We have around 800 pupils and 100 staff to cater for, which is probably larger than your University department, but the same general solutions should still apply.
There are (at least) two types of backup: disaster recovery (the building burns down and you need to get everything back up) and revision control (someone accidentally deletes a Word document and needs it back).

I handle disaster recovery by running all our servers as virtual machines, so "snapshots" of their disk images can be taken. A snapshot of each server is written to a removable hard drive (not necessarily every night, as that would take a lot of bandwidth, but every time you make a configuration change to the server), then removed and placed in a fireproof safe or taken off site (another advantage of using virtual machines: you can encrypt the data before it leaves the building). Each server VM is also mirrored in real time to a separate physical machine, so if one machine goes down the other takes over right away.
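If your VMs run under something like libvirt/KVM, the copy-to-removable-drive step is easy to script. This is only a rough sketch, not what we actually run: the image directory, mount point, and passphrase file are placeholders, and it assumes the images are in a consistent state (VM shut down or snapshotted) before it runs. It encrypts each image with GPG on the way out, so the drive is safe to take off site.

```python
#!/usr/bin/env python3
"""Copy VM disk images to a removable drive for off-site storage.

Sketch only: assumes qcow2 images in /var/lib/libvirt/images and a
removable drive mounted at /mnt/backup; all paths are placeholders.
"""
import datetime
import pathlib
import subprocess

IMAGE_DIR = pathlib.Path("/var/lib/libvirt/images")   # assumption: where disk images live
BACKUP_ROOT = pathlib.Path("/mnt/backup")              # assumption: removable drive mount point
PASSPHRASE_FILE = "/root/backup-passphrase"            # assumption: key used for off-site encryption

def main() -> None:
    dest = BACKUP_ROOT / datetime.date.today().isoformat()
    dest.mkdir(parents=True, exist_ok=True)
    for image in sorted(IMAGE_DIR.glob("*.qcow2")):
        encrypted = dest / (image.name + ".gpg")
        # Encrypt each disk image as it is copied, so the drive can leave the building safely.
        subprocess.run(
            ["gpg", "--batch", "--yes", "--symmetric",
             "--passphrase-file", PASSPHRASE_FILE,
             "--output", str(encrypted), str(image)],
            check=True,
        )
        print(f"Backed up {image.name} -> {encrypted}")

if __name__ == "__main__":
    main()
```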
Revision control is handled by a Python script that scans a file system and uploads changed files to a central server. The file system on that server makes extensive use of Unix-style symbolic links: only one copy of a given file is ever stored, and subsequent copies are simply linked to it. This lets you present a full file system for each day's backup while using only a fraction of the disk space a full copy would take (you just need enough space for the files changed since the last backup, plus the links themselves). This is the same general principle behind things like the Mac's Time Machine. Users needing to restore an old file can simply browse that day's file system.
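To make the idea concrete, here is a rough sketch of that kind of snapshot script (not our actual one): each day gets its own dated directory, unchanged files are linked back to the previous day's copy (hard links here, which is how rsync's --link-dest and Time Machine do it, though the structure is the same with symlinks), and only changed files take up new space. The source and backup paths are placeholders.

```python
#!/usr/bin/env python3
"""Daily snapshot directories that only store changed files (sketch)."""
import datetime
import filecmp
import os
import pathlib
import shutil

SOURCE = pathlib.Path("/srv/homes")       # assumption: tree to back up
BACKUPS = pathlib.Path("/srv/backups")    # assumption: where dated snapshots live

def snapshot() -> pathlib.Path:
    BACKUPS.mkdir(parents=True, exist_ok=True)
    today = BACKUPS / datetime.date.today().isoformat()
    # The most recent earlier snapshot, if any (ISO dates sort correctly).
    previous = max((p for p in BACKUPS.iterdir() if p.is_dir() and p != today),
                   default=None)
    for src in SOURCE.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(SOURCE)
        dest = today / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        old = previous / rel if previous else None
        if old and old.exists() and filecmp.cmp(src, old, shallow=True):
            os.link(old, dest)        # unchanged: link to yesterday's copy, no new space used
        else:
            shutil.copy2(src, dest)   # new or changed: store a real copy
    return today

if __name__ == "__main__":
    print(f"Snapshot written to {snapshot()}")
```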
All your Windows machines should hopefully be joined to a Windows domain, which gives users access to a network file area. That network area can be backed up easily enough with the methods above. At the school we set up the domain to stop users storing files locally on the machines, forcing them to use the network areas only, but that might not be practical in your situation (your users might need to work with files too big to be usable over your network connection). In that case you could run something like the backup script on each workstation; just get people to remember to leave their PCs on at night (or set them up so they can be woken over the network via their Ethernet cards, and have the script switch them off afterwards if you want to save power).
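Waking the machines up is simple enough to script too: a Wake-on-LAN "magic packet" is just six 0xFF bytes followed by the machine's MAC address repeated 16 times, sent as a UDP broadcast. A minimal sketch follows (the MAC address is a placeholder, and Wake-on-LAN has to be enabled in each PC's BIOS/NIC settings):

```python
#!/usr/bin/env python3
"""Wake a workstation over the network before running its backup (sketch)."""
import socket

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    # Magic packet: 6 bytes of 0xFF, then the target MAC repeated 16 times.
    payload = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + payload * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))

if __name__ == "__main__":
    wake("00:11:22:33:44:55")   # hypothetical workstation MAC address
```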