What things should a programmer implementing the technical details of a web site address before making the site public? If Jeff Atwood can forget about HttpOnly cookies, sitemaps, and cross-site request forgeries all in the same site, what important thing could I be forgetting as well?

I'm thinking about this from a web developer's perspective, such that someone else is creating the actual design and content for the site. So while usability and content may be more important than the platform, you the programmer have little say in that. What you do need to worry about is that your implementation of the platform is stable, performs well, is secure, and meets any other business goals (like not costing too much, not taking too long to build, and ranking as well with Google as the content supports).

Think of this from the perspective of a developer who's done some work for intranet-type applications in a fairly trusted environment, and is about to have his first shot at putting out a potentially popular site for the entire big bad World Wide Web.

Also: I'm looking for something more specific than just a vague "web standards" response. I mean, HTML, JavaScript, and CSS over HTTP are pretty much a given, especially when I've already specified that you're a professional web developer. So going beyond that, which standards? In what circumstances, and why? Provide a link to the standard's specification.

This question is community wiki, so please feel free to edit that answer to add links to good articles that will help explain or teach each particular point. To search in only the answers from this question, use the inquestion:this option.

+22  A: 

How to work with absolute and relative paths.
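For instance, Python's standard `urllib.parse.urljoin` mirrors how a browser resolves an href against the current page URL (a framework-agnostic sketch; the URLs are made up):

```python
from urllib.parse import urljoin

base = "https://example.com/blog/post"

# Relative path: resolved against the current "directory"
assert urljoin(base, "images/a.png") == "https://example.com/blog/images/a.png"

# Absolute path: resolved against the site root
assert urljoin(base, "/images/a.png") == "https://example.com/images/a.png"

# Scheme-relative: keeps the protocol but replaces the host
assert urljoin(base, "//cdn.example.com/a.png") == "https://cdn.example.com/a.png"
```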

Andrew Taylor
+19  A: 
  • Valid (X)HTML - with the appropriate tags.
  • No broken links (See above about relative links)
+40  A: 

It might be a bit outside of the scope, but I'd say that knowing how robots.txt and search engine spiders work is a plus.
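As a quick sketch of how a well-behaved spider interprets those rules, Python's standard `urllib.robotparser` can be fed a robots.txt directly (the rules below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler skips disallowed paths and fetches everything else
assert not rp.can_fetch("*", "https://example.com/admin/users")
assert rp.can_fetch("*", "https://example.com/blog/post")
```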

+17  A: 
  • Consider URLs: a URL design with REST in mind could make exposing APIs easier in the future. It's definitely much easier to get your URLs right the first time than to change them in the future and deal with the SEO consequences.
  • Read Josh Porter's book Designing for the Social Web.
  • Have some way to accept criticism and suggestions.
  • Know what progressive enhancement and graceful degradation are; JavaScript is NOT a requirement to operate the web and should be treated as such.
Eric DeLabar
+47  A: 
  1. Web standards
  2. Awareness of browsers
  3. Awareness of accessibility
  4. Awareness of usability
These are all true, but not really in the spirit of the question. They are things a _designer_ should know. I'm looking more for things a _programmer_ who is merely realizing a pre-built design should know.
Joel Coehoorn
I take it back: the accessibility item earns a vote because it may be a _legal requirement_ for some developers.
Joel Coehoorn
+50  A: 

I'll add one:

  • how to do caching
Joel Coehoorn
+22  A: 

I would think that knowing all you can about your deployment environment would rank up there.

IIS, MSSQL or Apache, MySQL, etc? ASP.NET, PHP, etc.?

Perhaps this is a no-brainer, but surely someone out there has written code that relies on [insert dependency] only to find out their client's server was missing [aforementioned dependency].

+6  A: 

Cross-browser support, particularly with respect to CSS.

Graeme Perrow
+12  A: 

Ensure that whatever framework/server-side scripting/web server/other you're using doesn't expose error messages directly to the user.

Check that whatever was put in place to facilitate the above during development is switched off or reversed. Obviously the preference is to have this stuff properly configured in the first place - but it will still occur time and time again.

That's mainly written from a security standpoint, but very much related is the usability issue of ensuring that, should errors occur, the user gets something that makes sense to them and that tries as best as possible to get them back to what they were doing.
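A minimal, framework-agnostic sketch of the idea (the `safe_handler` wrapper and its return shape are invented for illustration): log the full traceback server-side, and give the user only a generic message with an opaque reference they can quote to support.

```python
import logging
import traceback
import uuid

log = logging.getLogger("app")

def safe_handler(request, handler):
    # Wrap any page handler: full details go to the server log,
    # the user sees only a generic message with an opaque reference
    try:
        return 200, handler(request)
    except Exception:
        ref = uuid.uuid4().hex[:8]
        log.error("unhandled error %s\n%s", ref, traceback.format_exc())
        return 500, "Something went wrong (ref %s). Please try again." % ref

def broken_page(request):
    raise ValueError("connection string = secret!")  # must never reach the user

status, body = safe_handler(None, broken_page)
assert status == 500 and "secret" not in body
```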

Gareth Jenkins
+1  A: 
Mark T
Safari and Chrome use the same rendering engine, but when did a bit of redundancy kill anyone?
+138  A: 
  • Never put email addresses in plain text because they will get spammed to death
  • Be ultra paranoid about security.
  • Get it looking correct in Firefox first, then Internet Explorer.
  • Avoid links that say "click here". It ruins a perfectly good SEO opportunity.
  • Understand that you'll spend 20% of the time developing, 80% maintaining, so code it accordingly.
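On the email-address point: if an address absolutely must appear on a page, one common (and admittedly imperfect) mitigation is entity-encoding it so the crudest scrapers miss it. A hedged sketch:

```python
def entity_encode(text):
    # Encode each character as a decimal HTML entity: browsers render it
    # normally, but naive address-harvesting scripts looking for "@" miss it
    return "".join("&#%d;" % ord(ch) for ch in text)

encoded = entity_encode("a@b.com")
assert encoded == "&#97;&#64;&#98;&#46;&#99;&#111;&#109;"
```

This only raises the bar; determined harvesters decode entities, so a contact form is still the safer choice.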
up-vote for the e-mail address item
Joel Coehoorn
‘Click here’ links are ugly regardless of SEO.
Roman Odaisky
"Click here to blah" may be good if you expect many inexperienced users who may not realize where exactly they can click.
I think it should be "Get it looking correct in a standards-compliant browser first, then IE". It wouldn't be fair for other developers who are fans of other browsers.
I agree with everything except Firefox before IE. I'm a realist.
Standards compliant browsers should always come first if we're to ever get IE to change its ways.
+1 for Fox -> IE : you don't make a car that works in the arctic and reverse engineer for city driving
I do IE before Firefox because I found easier to change stuff in Firefox with Firebug ;)
I have my email address on my web site, and my spam filters have been perfectly adequate for the task, except for the backscatter spam which was generated from domains I owned.
David Thornley
Lol for 'Firefox first'. Sorry, but you're wrong, I'm sure.
@David Thornley: Hey, it's great that you have efficient ways to block spam for *your* email address, but the point was not to expose the *users'* email addresses. Do that and you'll have an angry mob at your doorstep faster than you can say 'Spam'! BTW, this sounds to me like basic privacy courtesy.
Cristi Diaconescu
Get it working in Chrome first. It's WebKit, and there's a good chance you are doubling up on the Mac users of your site. Then FF, then IE.
In Germany it's the law to put the e-mail address on the page. I wouldn't risk it by using special tricks to hide it.
The way a lot of web apps are written (constantly evolving), development and maintenance are pretty much the same thing.
Noel Walters
+44  A: 

In addition to caching

btw, the whole linked article is a must read.
+11  A: 

Good knowledge of HTTP, including caching and expiry headers
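As a sketch of what knowing HTTP caching buys you (the `respond` helper and its `(status, headers, body)` shape are invented for illustration): a strong `ETag` validator plus `Cache-Control` lets the server answer a repeat request with an empty `304 Not Modified` instead of re-sending the page.

```python
import hashlib

def etag_for(body):
    # Strong validator derived from the response body
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def respond(body, if_none_match=None):
    # Returns (status, headers, body) for a GET of a cacheable resource
    tag = etag_for(body)
    if if_none_match == tag:
        return 304, {"ETag": tag}, b""   # client's cached copy is still good
    return 200, {"ETag": tag, "Cache-Control": "public, max-age=3600"}, body

status, headers, _ = respond(b"<html>hello</html>")
status2, _, body2 = respond(b"<html>hello</html>", if_none_match=headers["ETag"])
assert (status, status2, body2) == (200, 304, b"")
```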

+10  A: 

How to avoid Cross site request forgeries (XSRF) (this is not cross site scripting (XSS))

Now I'll probably be modded down for overuse of parentheses.
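A common XSRF defense is the synchronizer-token pattern: issue an unguessable token at login, store it in the server-side session, embed it in every form, and reject any POST whose submitted token doesn't match. A minimal Python sketch (the function names are mine):

```python
import hmac
import secrets

def new_csrf_token():
    # Unguessable token; store it in the server-side session
    return secrets.token_urlsafe(32)

def check_csrf(session_token, submitted_token):
    # Constant-time comparison; reject the request on mismatch
    return hmac.compare_digest(session_token, submitted_token)

token = new_csrf_token()
assert check_csrf(token, token)            # the form's own token passes
assert not check_csrf(token, new_csrf_token())  # a forged request fails
```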

Marc Gear
Nah, programmers are generally fine with lots of parentheses, :)
+17  A: 

How to build a scalable design in the off chance that the site becomes really popular.

*cough*twitter*cough* :-)
+1  A: 
  • One of the key things is to understand how you are going to debug your system. This means understanding the 'big picture'. So know your environment (OS, database, framework, networking, et al.) and at least know where to 'look' if you have ten users each calling with their own issue, even if you did not write all that server-side code.

  • Oftentimes, good user interface design (error logging with the right amount of detail, log levels, hooks to display some details on demand) will go a long way.

+5  A: 

You should consult the OWASP web site and understand the vulnerabilities listed there. Keep in mind OWASP does not talk about issues like scalability, session state management issues, and browser compatibility. Those areas will need to be understood as well. But I would argue that they certainly are less important than security.

Johnny Bravado
+3  A: 
  • Web standards
  • CSS
  • Interface Design

If it's unusable, you have no chance!

Mr. Matt
These are all true and important, but not really in the spirit of the question. They are things a _designer_ should know. I'm looking more for things a _programmer_ who is merely realizing a pre-built design should know.
Joel Coehoorn
+35  A: 

Well, everyone else has already mentioned most things I thought of - but one thing I always forget is a favicon. Sounds stupid, I know, but I think it's one of those little things that helps to emphasise your brand, and I never seem to remember it. Please check Scott Hanselman's post about how to use it carefully.

I agree with some of the rest too - I think it's important to know as much as possible about your chosen language, so that you can code it with best practices and maintainability in mind. I've come across functions and patterns that I wish I'd known about when I did my first few crappy, amateur projects, as they would have saved me writing some ridiculous WTF-ey workarounds!

+4  A: 
Peter GA.
+191  A: 

Rule number one of security:

Never trust user input.

Jonny Barnes
Added to which: Cookies count as user input.
Colonel Sponsz
+1 users are evil
Assume users are idiots. If the user gives you input as expected, treat that as the exception and not the rule. A corollary: don't expect users to read instructions either, no matter how short, simple, and obvious they are.
HTTP headers count as user input too (including the referer).
Also, some server values count as input (like `$_SERVER` in PHP)
@Yassir: Not all users are evil, but those who are evil are still users.
Bill Karwin
I thought the first rule of security was don't talk about security.
+47  A: 

Here are a couple of thoughts.

First, staging. For most simple sites, developers overlook the idea of having one or more test or staging environments available to smoothly implement changes to architecture, code, or sweeping content. Once the site is live, you must have a way to make changes in a controlled way so the production users aren't negatively affected. This is most effectively implemented in conjunction with the use of a version control system (CVS, Subversion, etc.) and an automated build mechanism (Ant, NAnt, etc.).

Second, backups! This is especially relevant if you have a database back-end serving content or transaction information. Never rely on the hosting provider's nightly tape backups to save you from catastrophe. Make triple-sure you have an appropriate backup and restore strategy mapped out just in case a critical production element gets destroyed (database table, configuration file, whatever).

Ed Lucas
+3  A: 

From a systems perspective, document how the application works and the subsystems involved, and add instrumentation to the application for the systems on which it will run (e.g. the event log or performance monitors in Windows).

The application has to be run by some support personnel and they need tools to track possible problems that may appear.

Alejandro Mezcua
+5  A: 

Setting aside all the technical aspects, skills, and security, I would make sure that the site is easy to use and really does what the user expects. Human-computer interaction is important. Layout and flow are important. Otherwise no one will use it, other than scammers, spammers, and robots.



+3  A: 

Consider your design from your potential users' perspectives. How will they use the site? What will benefit them most? What will annoy, frustrate, or keep them from using it? If you're trying to decide on a design element that will benefit you, but not the user, scrap it.

This is true and important, but not really in the spirit of the question. It's something a _designer_ should know. I'm looking more for things a _programmer_ who is merely realizing a pre-built design should know.
Joel Coehoorn
+1  A: 

Know how to resist session hijacking. HttpOnly is only one aspect of this, and not necessarily the most likely for some threat models (it applies when people can insert HTML onto your site).

There are session hijacking attacks which are regarded as remotely executable by NIST, and exploits are in the wild today. Here are some refs:


+2  A: 

I agree with "The Professor": there's no point in having a beautifully built site that validates correctly and is accessible to all if the content is rubbish. In addition to his comment, though, I'd add spell-checking and proofreading. I find that the majority of tweaks that have to be made after the site has gone live are down to spelling/grammatical issues.

This is true and important, but not really in the spirit of the question. It's something a _designer_ should know. I'm looking more for things a _programmer_ who is merely realizing a pre-built design should know.
Joel Coehoorn
+1  A: 
  • how SSL works
  • how PKI works
  • how cookies are used to manage sessions
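On the cookie point, a sketch with Python's standard `http.cookies` showing the flags a session cookie should normally carry (`HttpOnly` hides it from scripts, `Secure` keeps it off plain HTTP, `SameSite` limits cross-site sends; the value here is a placeholder):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "opaque-random-id"   # never put meaningful data in the value
cookie["session"]["httponly"] = True     # invisible to document.cookie
cookie["session"]["secure"] = True       # only ever sent over HTTPS
cookie["session"]["samesite"] = "Lax"    # withheld on most cross-site requests
cookie["session"]["path"] = "/"

header = cookie.output()  # the full "Set-Cookie: ..." header line
print(header)
```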
upvote for the SSL part.
Joel Coehoorn
+4  A: 

HTTP Compression is often overlooked and can drastically speed up a website.
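A quick illustration of why: repetitive markup compresses dramatically. (In practice the web server negotiates this via the `Accept-Encoding`/`Content-Encoding` headers; this sketch only shows the size difference.)

```python
import gzip

body = b"<html><body>" + b"<li>hello world</li>" * 500 + b"</body></html>"
compressed = gzip.compress(body)

# The compressed payload is a small fraction of the original
assert len(compressed) < len(body) // 10
print(len(body), "->", len(compressed))
```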

+2  A: 

If you are going to accept user input, learn input validation. This is the biggest thing programmers make mistakes on: they accept user input in random locations, and it allows script kiddies to come along and remotely include a file that then gives them full control over your machine.

"Be lenient in what you accept, but strict in what you output"

However, don't trust any user generated input in any way shape or form. Don't trust it!
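A whitelist-style sketch in Python (the field name and regex are hypothetical): define exactly what an "amount" may look like and reject everything else, rather than trying to enumerate bad inputs.

```python
import re

# Whitelist: up to 7 digits, optionally a dot and up to 2 decimals
AMOUNT_RE = re.compile(r"\d{1,7}(\.\d{1,2})?")

def parse_amount(raw):
    # Accept only what we explicitly allow; reject everything else
    if not AMOUNT_RE.fullmatch(raw.strip()):
        raise ValueError("invalid amount")
    return round(float(raw), 2)

assert parse_amount("19.99") == 19.99
for bad in ("19.99; DROP TABLE orders", "1e9", "abc", ""):
    try:
        parse_amount(bad)
        raise AssertionError("should have been rejected: %r" % bad)
    except ValueError:
        pass
```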

+31  A: 

When to say "no" to the designer or client, and how to do so gracefully and diplomatically.

+2  A: 

Understand how to monitor a site for intrusion and make it easy for the person who manages the site to recover to a known-good state. Even if you aren't going to be managing the site you should educate the site-owner in this regard before handing it over.

Even if your code is bulletproof, the server that the site is hosted on can be compromised (especially in a shared-server environment), so it seems like it's not so much a question of whether your site will be hacked, as when it will happen and how much pain will be involved in cleaning it up.

So you'll want to design with this in mind; e.g., craft your URL scheme such that it is easy to spot malicious requests in the access logs; think carefully before storing page templates in a database; and so forth.

Ben Dunlap
upvote for the 'how to monitor' part
Joel Coehoorn
+2  A: 

Design and develop the site with localization for other languages in mind.

+84  A: 
  • Web standards: it's cheaper to aim for standards than testing for every browser available (and in a public website you will see a lot of different browsers/version/OS combinations (30+))
  • SEO-friendly URLs: changing URLs later in the game is quite painful for the developers and the site will most probably take a PageRank hit.
  • Understand HTTP. If you have only worked with ASP.NET webforms, then you probably don't really understand HTTP. I know people that have worked with webforms for years and they don't know the difference between a GET and a POST, let alone cookies or how session works.
  • HTTP Caching: Understand what to cache and what NOT to cache.
  • Optimize image weights. It's not cool to have a 20 KB image for a repeating background...
  • Read and understand Yahoo's best practices. Not every rule applies to every website.
  • Use YSlow for guidance, but understand its limitations.
  • Understand how JavaScript is processed on the browser. If you put tons of external scripts at the beginning of your page, it's going to take forever to load.
  • Consider cell phone usability: some users will access your site using their native cell phone browser (I'm not talking about iPhones or Opera Mini). If your site is pure Ajax, they will probably be out of luck.
  • Learn the difference between 301 and 302 redirects: it's not the same for search engines.
  • Set up Google Analytics (or any other analytics package) right from the start.

Not specific to public websites, but useful nevertheless:

  • Server caching: identify and exploit any caching opportunities, it makes a big performance difference. It's often overlooked on non-public websites.
  • Set up a good error reporting solution, with as many details as possible. You will get a lot of errors when you launch, no matter how much you tested, so you better get all the details you can.
  • Set up an Operation Database so you can quickly identify bottlenecks.
  • Set up a good deployment strategy. You will probably deploy more often than non-public sites (we deploy daily).
  • Realize that web applications are inherently multi-threaded, you will have lots of visitors (typically much more than in non-public websites), and threads are not unlimited.
Mauricio Scheffer
That link doesn't have much information on what an Operation Database actually *is*. Do you have more literature?
It's a concept from the book "Release It!".
Mauricio Scheffer
+28  A: 

You also have to:

  • Keep your system up to date with the latest patches.
  • Keep yourself up to date with knowledge of new vulnerabilities affecting your platform, and attack techniques in general.

I follow several security related blogs and podcasts.

In addition, I get email alerts from SANS (you need to register, but it's a great source).

(I'm always interested in learning about other good sources, too.)

+28  A: 

If you have any influence on design, please read, "Don't Make Me Think" by Steve Krug. It is an easy read, and will almost certainly make you think...

Yes, I've read this book through and it makes a lot of sense. It's to the point as well, which is what your website should be like.
+14  A: 

Found a new one today:

  • Reset Style sheets

These are style sheets you include as a baseline when starting a project to give you more consistent behavior across different browsers. See this question:

Joel Coehoorn
+13  A: 

On a public site, make sure you are using an XML sitemap to help search engine crawlers crawl your content more intelligently.

If you have non-HTML content on your site, you should also look into Google's extensions of the sitemap protocol to make sure you are using whatever is appropriate. They have specific extensions for News, Video, Code, Mobile-specific content and Geospatial content.

One thing I learned that was not obvious in the Google help, is that each of these content-specific sitemaps should be a separate file and joined together at the root with a sitemap index file. For some reason Google doesn't like you to mix content in one sitemap. Also, when you use Google Webmaster tools to tell Google about your sitemaps, tell it about each of the special sitemaps you have separately and use the drop-down to specify the type. You would think the crawler could use the XML to auto-detect this stuff, but apparently not.
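Generating a minimal sitemap is straightforward with any XML library; a Python sketch (the URLs and dates are placeholders a real generator would pull from your CMS or router):

```python
from xml.etree import ElementTree as ET

pages = [("https://example.com/", "2009-01-01"),
         ("https://example.com/about", "2009-01-02")]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

sitemap_xml = ('<?xml version="1.0" encoding="UTF-8"?>'
               + ET.tostring(urlset, encoding="unicode"))
```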

Tim Farley
+4  A: 

A web developer should know:

  • Jakob Nielsen's Alertbox
  • That less is more
  • How to decouple presentation (HTML, CSS) from business logic (JavaScript, backend)
Ates Goral
+73  A: 

Security
  • Filter and validate incoming user input ('amount' does not need to accept alphabetical characters) and escape outgoing user input (a ' in user input is NOT the same as an SQL ').
    Never trust any data given by the user.
  • And the above will help with protecting against SQL injection.
  • Understand SSL
  • Keep your systems up to date with the latest patches.
  • Protect yourself from cross site scripting
  • How to resist session hijacking
  • Find out about HTTPOnly cookies
  • How to handle authentication/permissions
  • Understand PKI (public keys)
  • Keep up to date! This is the most important thing, make sure to follow all the latest information about possible security issues and vulnerabilities that affect your platform.
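The SQL injection point in practice: never splice user input into SQL strings; use parameterized queries so the driver treats the value as data, not as SQL. A self-contained sketch with SQLite (table and payload are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, amount INTEGER)")

evil = "x'; DROP TABLE users; --"

# WRONG: "INSERT ... VALUES ('%s', %d)" % (evil, 10)  -- injectable
# RIGHT: placeholders; the driver escapes the value for you
conn.execute("INSERT INTO users (name, amount) VALUES (?, ?)", (evil, 10))

row = conn.execute("SELECT name FROM users WHERE amount = ?", (10,)).fetchone()
assert row[0] == evil   # stored verbatim; the table still exists
```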

SEO
  • Create SEO-friendly URLs
  • Use an XML sitemap so that site engines can crawl your site more intelligently
  • Set up Google Analytics (or another analytics package) from the start
  • Learn the difference between 301 and 302 redirects: it's not the same for search engines.
  • Set up a robots.txt file


Development
  • Documentation!
  • Code from the beginning with maintainability in mind
  • Have a good deployment strategy - don't save it to the very end to figure this out.
  • URLs designed with REST in mind could save you a headache in the future.
  • Use patterns like MVC to separate your application flow from your database logic.
  • Be aware of the many frameworks out there that will speed up your development
  • Use staging and a version control system to deploy updates so that your users won't be affected
  • Set up an error logging system. No matter how well coded it is, your website will have errors when it is released. Don't wait for the user to let you know; be proactive in identifying errors and bugs
  • Have a bug tracker
  • Know your environment: your OS, language, and database. When you need to debug, it will be important to understand, at the very least, how these things work at a basic level.

User experience

  • Be aware of accessibility. This is a legal requirement for some programmers in some jurisdictions. Even if it's not, you should bear it in mind.
  • Never put email addresses in plain text, or they will be spammed to death.
  • Have some method for users to submit their comments and suggestions
  • Catch errors and don't display them to the user; display something they can understand instead
  • Remember that cell phones and other mobile devices with browsers are becoming more common. Sometimes they have very poor JavaScript support. Will your site look okay on one of these?

Core Web technologies

  • Understand HTTP, and things like GET, POST, cookies and sessions.
  • How to work with absolute and relative paths
  • Realize that web applications are inherently multi-threaded, you will have lots of visitors (typically much more than in non-public websites), and threads are not unlimited.
When did this come back? It was deleted and (I thought) lost forever.
Joel Coehoorn
I prefer this to the revised/accepted version
+3  A: 

I am new to web development, and the things I had problems with are:

  1. Detailed knowledge of JavaScript and Ajax.
  2. Security, especially XSS, CSRF, etc.
  3. Some knowledge of CSS, even if there are dedicated designers for it.
  4. Adherence to W3C standards or others.
  5. Deployment issues and how to solve them.
  6. Browsers and how they work. The same origin policy and why it is important.
+1 for same origin policy.
Sean McMillan

Need to know what is easy to use for the public, not for an IT professional or software developer.

+5  A: 

How to handle the Slashdot effect.

+915  A: 
It would be great if you could suggest some good books on these topics.
If you can recommend good books, please feel free to edit the post with links for them.
Joel Coehoorn
With regard to security and login: Make sure your login page is https, not http, as Bruce Schneier says. It doesn't matter that you submit the information encrypted if the user can't be sure your site hasn't been hijacked. HTTPS certificates give them that assurance.
Woowww.. great answer
Andrei Rinea
Some of your SEO suggestions are bad. It doesn't matter if you use tables or divs (Google confirmed this themselves). That SEF URL thing... I hate those "fake URLs", where the ID is the only thing that actually determines the page. "45-blah" would be the same page. It's not user-friendly either.
Then edit it. I didn't write most of this: I'm only maintaining it -- a job which I've inherited because I asked the question, solicited this larger answer specifically, and I'm genuinely interested in seeing what we can come up with. The more contributions the better.
Joel Coehoorn
One more note: if you do come back and edit this, try to be respectful of what was written. Don't just remove the parts you disagree with: actually take the time to address the short-comings and provide something better.
Joel Coehoorn
This is awesome, thanks to everyone involved in editing it.
Fire Crow
This one is soo true, good job man. +
complete and explanatory reply...
Syed Tayyab Ali
To the SEO/"don't use 'click here' links" topic: use "'log in' to post comments" form instead.
Absolutely fantastic! Are there similar lists on SO for other application domains too?
Vadim Ferderer
Look for the 'best-practices' tag, found on excellent posts everywhere.
Craig Trader
This may be the best answer to a question I've seen so far.
@Disgruntled Goat - You're wrong about the links. The text in the links gets highlighted, and is used in the link keyword weight. Also, this is not just SEO-friendly, it is USER-friendly for bookmarking. I can't say how much it drives me nuts to have a bookmark or a pasted link in an email that just has the ID and no description.
The OWASP link is broken :-(
Really? All this? Shouldn't all that knowledge be split in a team?
@Carlo: If you really want to be a web developer then yes you really should as an individual understand everything in this list.
Chris Lively
You're right, I got the question in a completely different way. It says "know" not "do" I thought this answer implied a web developer should "do" all of that; that's why I mentioned it's more of a team effort.
+1 Great details, nice to know some of the things i was missing. Thanks
Great answer, it looks like a reference answer. Thanks
+3  A: 

Regarding credit cards and debit cards, at least within the United States, be aware of PCI compliance and the various rules and responsibilities that it covers. Accepting credit cards for a small e-commerce application can open a very nasty can of worms if the proper security measures are not in place. It goes way beyond having SSL enabled on the web site. Search for PCI-DSS on your favorite search engine and make sure you, and your clients, understand the regulations that they will need to follow. Other locales have similar rules under different names, but all of the major payment card players are getting serious about securing cardholder data.

Justin Scott
+3  A: 

What should a developer know before building a public web site?

What about the data?

  1. Data normalization
  2. Design the query structure of the data carefully
  3. Optimize it and understand where to cache or not
+40  A: 
+4  A: 
  1. Cross Browser Compatibility

  2. SEO

  3. Horizontal /Vertical Scaling

  4. Advantages/ Disadvantages of Caching

K Man
+3  A: 

Duplicate slashes in a path are normally harmless, but <a href="//index.html"> does not mean what you think it means.

Ant P.
What does it mean, then? Can you post a complete explanation? A name for effect + a link that can be included in the main answer?
Joel Coehoorn
+2  A: 

Take a look at a good web usability book, e.g. Don't Make Me Think: A Common-Sense Approach to Web Usability, by Steve Krug.

+5  A: 

Most of the essentials have been covered by the top 10 answers, but here are a few of the ones I missed up there:

  • For browser compatibility testing, use a free multi-browser screenshot service

  • For stress testing, use the command line tool ab - ApacheBench (on Linux/Mac OS X). It will let you find the 'heavier' pages, so you can do your performance tweaking where it will matter the most (that is, caching!). "A slow page is a DoS attack waiting to happen."

  • If you, like most, will be using a web host rather than hosting your own web server, spend a couple of weeks (yes, weeks!) on the WebHostingTalk forum to get a feel for which hosting providers are currently the best in the lands. That forum is THE one and only gathering place for serious web hosting nerds, and these cats have the dirt on everyone. If you are serious about your web sites, you need to background-check your hosting providers on WebHostingTalk.

  • Use a remote, distributed system for monitoring your uptime (e.g. to determine whether it's time to move to a different hosting provider); several such services exist

  • Do not write your own CAPTCHA. I repeat: Do NOT write your own CAPTCHA!

Jens Roland
Some good notes here. Regarding CAPTCHA: you're right, don't write your own from scratch. However, don't rely on a stock implementation either. You want something that you can customize enough that if the stock implementation is attacked and cracked your customizations still hold up.
Joel Coehoorn
Of course, combining a well-tested stock implementation with a little homebrew security-through-obscurity to fend off generic (implementation-specific) bots is always a good idea. My main point is to avoid entering the CAPTCHA arms race unless all you want to do is work on your CAPTCHA all day.
Jens Roland
Thierry-Dimitri Roy
You're right Eldimo, thanks for the correction
Jens Roland
I would suggest not using captchas at all, and instead trying hidden form elements and JavaScript cookie validations.
+5  A: 

Begin by designing your page as if HTML was your only tool and JavaScript and CSS didn't exist, and make sure it validates. (This is not an excuse to use <font> tags, I'm talking about making good semantic code here!)

Then, add CSS (from an external file), and gently style your work, adding as little extra HTML as possible.

Finally, make your JavaScript (I'd use jQuery) enhance the user experience - again adding as little extra markup as possible.

+3  A: 

Especially for SEO, but for some other reasons as well: remove session IDs from (public) URLs. They might have been added by the web framework for cookie-less browsers, but they may not be required for public browsing anyhow.

+2  A: 

This may have already been mentioned, but know how the client plans on updating the site. If the client has someone who "knows HTML", then prepare for problems. It's best to have a good CMS in place if the client wishes to update the website themselves; NEVER let them have access to all of your code.

+1  A: 

Make sure someone in the organization already has the content maintenance, ongoing SEO and marketing plan worked out fully. Because if they haven't, they're going to default to you to provide all of those things (possibly with little compensation).

Alex Burr
+2  A: 

Read about the Principle of Least Astonishment; the articles "Principle of least astonishment" and "User-Friendly Programming" are good starting points.

+3  A: 

One very important thing for UI Heavy Sites is taking care of screen resolution. It can totally make or break the UI experience of your site.

+3  A: 

Develop for Gecko and Webkit browsers first, then use conditional comments to address IE issues that cannot be fixed by tweaking CSS (e.g. for more specificity, rules that trigger IE's 'hasLayout', etc.)

Dave Everitt
+17  A: 

Know how to hinder denial-of-service (DoS) attacks on user login forms by keeping track of the number of failed logins over a given period of time. In the event you hit a certain threshold above the running average, increase the delay imposed on all subsequent login attempts by a particular amount (say, 5 seconds).

Someone feel free to modify for clarity :)
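A toy in-memory sketch of that idea (the window, threshold, penalty, and storage are all illustrative; production code would track failures in shared storage such as a database or cache):

```python
import time
from collections import defaultdict

WINDOW = 600      # only count failures within the last 10 minutes
THRESHOLD = 5     # failures before throttling kicks in
PENALTY = 5.0     # extra seconds to stall each further attempt

failures = defaultdict(list)   # source IP -> timestamps of failed logins

def record_failure(ip):
    failures[ip].append(time.time())

def login_delay(ip):
    # Seconds to sleep before processing this login attempt
    now = time.time()
    failures[ip] = [t for t in failures[ip] if now - t < WINDOW]
    return PENALTY if len(failures[ip]) >= THRESHOLD else 0.0

assert login_delay("10.0.0.1") == 0.0
for _ in range(THRESHOLD):
    record_failure("10.0.0.1")
assert login_delay("10.0.0.1") == PENALTY
assert login_delay("10.0.0.9") == 0.0   # other clients are unaffected
```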

+3  A: 

Nothing. If you're building your first site, just build it. Get dirty, make mistakes and learn. Because after you've built hundreds and lots of advanced tricks are second nature to you, after you've done it all, seen it all, the one thing you'll always need to remember is the one thing you know when you start: you don't know everything. Especially if you're worried about security. Even if you cover all the bases, someone will come up with something new. It's the downside to being one of the Good Guys.

I think this is true if that first site doesn't have much economic import (i.e., a hobby site vs. a commercial engagement). If you're being paid to build it, then you should assess the tradeoffs inherent in each of the issues and invest your time based on the ROI (example: if the cost of a day or two of downtime is minimal, you may not want to invest too much up-front time in dealing with BOT defense). Make sure you really understand the implications up front when doing this, otherwise the likely outcome will be "we didn't mean it could *actually* be down for two days"
+28  A: 

The cruel, hard facts:

Users spend about as much time on your website as an interviewer spends reading your resume when it's submitted in a pile with thousands of others.

  • Users spend very little time on your website. Read: seconds.
  • Users are lazy and they would rather be somewhere else
  • If the user can't find what they are looking for within seconds, they leave
  • If the user cannot identify what the website is all about, they leave
  • If the website does not 'just work', they leave
  • If the website annoys the user or does not appeal to them aesthetically, they leave

Everything about websites and website design revolves around these facts.

  • Clear Navigation
  • Conciseness
  • Branding strategies
  • Colors, schemes, aesthetics, text placement, text formatting
  • Helpful, not hindering, Ajax/JavaScript
  • Not reinventing the wheel when it comes to website use, navigation, etc.

This is just an outline of why it is so important to adhere to standards and read those website design books.

Agree very much, but you missed the point of the question. For most web developers, the clear navigation, layout, and concise content are the responsibility of someone else. They're only implementing an artist's design. Given that, what then do you need to know to implement the technical details correctly?
Joel Coehoorn
+15  A: 


  • Consider using an Application Firewall such as UrlScan that works by blocking specific HTTP requests, UrlScan helps prevent potentially harmful requests from being processed by web applications on the server.
  • Disable Directory listing.
  • Consider using a lower privilege identity.
  • Don’t use Blacklists, but use Whitelists instead, teach your application what to accept, not what to avoid.
  • If using ASP.NET, encrypt your connection strings using aspnet_regiis. This tool is easy to use, and encrypting and decrypting connection strings takes only a few simple steps.
  • Pages with sensitive data should not be cached: page content is easily accessed using the browser’s history.
  • Validate user inputs in the application; promote the use of Regular Expressions (and make sure they actually work the way they are meant to).
  • Avoid, at all costs, relying solely on client-side validation (e.g. Ajax or JavaScript-based validation libraries). JavaScript can and will be turned off, and so will your protections.
  • Do NOT use GET for anything that changes the server state or contains sensitive information. GET requests are logged in the web server access logs. They are also shown in the browser history.
  • DO use POST for every action that changes the server state and reject all non-POST methods. POST prevents unintentional actions, most search engines won’t crawl POST forms and it also helps prevent duplicate submissions.
  • If using Cookies, mark them as HttpOnly using System.Net.Cookie. Set the httpOnlyCookies attribute on the authentication cookie. Internet Explorer 6 Service Pack 1 and later support this attribute, which prevents client-side script from accessing the cookie through the document.cookie property.
  • Robots.txt is one of the first places attackers look, since it often lists the very paths you least want visited. Don't rely on it to hide anything; protect those paths with real access controls.
  • When constructing SQL queries, use type-safe SQL parameters, i.e. use stored procedures and stored procedures only. Using stored procedures is always the best approach from both a technical and a security point of view.
  • If using SQL Server, use a least-privilege user. Create a SQL Server login for the account. Map the login to a database user in the required database. Place the database user in a database role. Grant the database role limited permissions to only those stored procedures or tables your application really needs. By using a database role, you avoid granting permissions directly to the database user. This isolates you from potential damage to the database.
  • Use SSL where possible; this will encrypt and protect your data while on the wire. Using SSL doesn't necessarily mean you are secure. It simply means your data is encrypted in transit. If using SSL, restrict authentication tickets to HTTPS connections only.
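The point about type-safe SQL parameters (whether or not you route everything through stored procedures) can be illustrated with Python's standard sqlite3 module; the table and data are made up for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

# UNSAFE: string concatenation lets crafted input alter the query itself
# query = "SELECT email FROM users WHERE name = '" + user_input + "'"

# SAFE: the driver sends the value as data, never as SQL text
user_input = "alice' OR '1'='1"  # a classic injection attempt
row = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)
).fetchone()
print(row)  # None: the malicious string matched no user
```

With the parameterized form, the injection attempt is just an odd-looking user name that matches nothing; with string concatenation it would have matched every row.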


  • Minimize HTTP Requests
  • Add an Expires or a Cache-Control Header
  • Gzip Components
  • Put Stylesheets at the Top
  • Put Scripts at the Bottom
  • Minify JavaScript and CSS
  • Optimize Images
Some of this is dated. Parameterized queries now give the same benefits as stored procedures, you _want_ to do validation on the client side to help avoid unneeded http requests from benign users (you just have to duplicate that work server-side as well), and there are often better ways than regex for your validation logic.
Joel Coehoorn
Client-side validation should not be "avoided at all costs". It is a handy way to make sure the client doesn't waste time submitting incorrect info to the server. HOWEVER, while client-side validation is good to use, you MUST NOT trust the data sent, because client-side validation can and will fail. ALWAYS use the proper tools to validate user input on the server side.
+5  A: 

If you implement a "I forgot my password" feature, don't email their password back in plaintext. Instead, email them a time-expiring link which will take them to a page that allows them to select a new password.
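A minimal sketch of the time-expiring link, assuming you store a hash of the token and an expiry timestamp alongside the user record (the function names and the one-hour TTL are illustrative):

```python
import hashlib
import hmac
import secrets
import time

RESET_TTL = 3600  # illustrative: links expire after one hour

def issue_reset_token():
    """Return (token_for_email_link, record_to_store).

    Only the hash is stored, so a database leak doesn't expose live links.
    """
    token = secrets.token_urlsafe(32)
    record = {
        "token_hash": hashlib.sha256(token.encode()).hexdigest(),
        "expires_at": time.time() + RESET_TTL,
    }
    return token, record

def verify_reset_token(token, record, now=None):
    """True only if the token matches the stored hash and hasn't expired."""
    now = now if now is not None else time.time()
    if now > record["expires_at"]:
        return False  # link has expired
    expected = hashlib.sha256(token.encode()).hexdigest()
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(expected, record["token_hash"])
```

Only after verification succeeds should the user be shown the form to choose a new password, and the record should be deleted once used.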


Even though there are many things to consider here, you should not neglect the performance of the website, especially the loading time. This can be achieved through caching, gzip/deflate compression, etc. Check a list of things you must do for a faster website at the URL below: 7 steps to Speed up website loading – Website Optimization Tips - Part 1


If for some reason you don't trust Google or you want to have more control over the collected data, try Piwik as analysis tool. It is open source and extensible via plugins.

Felix Kling
+4  A: 

Having a backup strategy is really important (as has already been mentioned), but checking the backup is equally important. There is no point having hundreds of backups if they are all corrupt. Your restoration strategy should be known and tested according to the needs of the business.
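One way to automate part of that check is to compare checksums of source and backup. This is only a sketch, under the assumption that a byte-for-byte copy is what you expect; it is no substitute for an actual test restore:

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file, read in chunks so large backups fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(original: Path, backup: Path) -> bool:
    """True only if the backup exists, is non-empty, and matches the source."""
    return (
        backup.exists()
        and backup.stat().st_size > 0
        and checksum(original) == checksum(backup)
    )
```

For database backups the equivalent step is restoring the dump into a scratch instance and running a few sanity queries, since a dump can be intact as a file yet unrestorable.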

+2  A: 

You need very little knowledge to put a site out to the public. Don't forget that there are billions of sites out there, and you don't want to spend months of your valuable time building something that nobody wants.

All the skills you need are basic HTML, CSS and JavaScript to quickly throw up a prototype and put it out in front of the big bad web. Think about it this way - if you build out something really awesome in several months, let's say, and you put it out on web, and nobody clicks on that link to Get Started, then something has gone terribly wrong.

Either you were working on the wrong problem, a problem that nobody had, or a problem they didn't know they had. You could simply test your early hypothesis by putting up a nice fancy mockup landing page with a link saying "Get Started", and when users click it, you take them to a thank-you page asking for their email/contact information so you can notify them when you do actually go live.

I have recently been introduced to the idea of a Minimum Viable Product (MVP), which is quite radical in what it asks of you. It's not a minimum viable product in the sense most developers would think of it. Here's a nice interview with Eric Ries that talks about the idea in detail -

Kent Beck, the creator of Extreme Programming methodologies had an interesting story to share in the Startups Lessons Learnt conference today in San Francisco. He had an idea of introducing a payment gateway to charge users for unlocking higher levels of a game he was building. They estimated it was going to take a little while to implement the whole thing, so he decided to just put up a button saying "Buy the Next Level" on the game page. When users clicked that link, they just let them into the next level without charging or anything. But it didn't hurt them at all as they didn't have a million+ user base, and they collected valuable information about how many users were actually willing to buy the next level.

So I would recommend you don't wait until you build a nicely polished and finished product before reaching out to your users. And to get started with that, you don't need a whole lot of knowledge. Basic HTML/CSS/JavaScript skills are more than sufficient to get started.

-1 Until your basic HTML/CSS/JS scripts screw up and hackers get your users' emails and passwords, because (since you have only basic skills) you didn't hash and salt those passwords, and now they are not only hacking your little customer base's bank accounts (because users do that stupid thing of reusing passwords) but also spamming them to hell.
Fabricio Araujo
@Fabricio - please read the answer more carefully. It doesn't matter if the security of the site is sub-standard when nobody is using it. I am still advocating that a better approach is to get out with a product (even if half finished), rather than build it over a year with awesome security and the best development practices, just to find out that nobody wants to use the damn thing.
If you are building a public site, put someone on your team who is already an experienced web developer. People with only basic skills should work on intranet sites to gain experience, and bring in an experienced web developer when building that public site. I wish I could downvote this irresponsible answer more!!!
Fabricio Araujo
Wait a minute. A public website does not mean you are launching a new company or a product. All that is needed is a public IP address, a web server, and you're golden. You want people to hold back from making their sites public before they become experts in the field of web dev? That's the exact opposite of the principles on which the Internet is built - on openness. And wow I'm so glad that an answer can be down voted only once.
Yes. If you are dealing with other people's sensitive information (CC numbers, passwords, and - somewhat less so - email addresses), then yes, contract an expert to review wtf you're doing. If you are at a stage where you don't handle such information, you don't strictly need it, but it's still advisable.
Fabricio Araujo
And dealing with such information in a naive way can land the developer in legal trouble. See Justin Scott's answer below.
Fabricio Araujo
+9  A: 

I don't have enough karma to edit, so here's my contribution: looks like everyone here is from the US :)

i18n and l10n

  • Use the correct character encoding for your web page (charset encoding)
  • Read the Accept-Language header to choose the page's rendering language. I have seen too many web sites that localise based on my IP address and ignore the Accept-Language header! Painful, as I then have no idea how to view the site in English anymore.
  • Localise for the user's time zone. It's difficult to get this right, but users will appreciate it.
  • Format numbers, currency, dates, times, addresses and names per the user's region. Default to IP-address-based localisation only if you don't have region information in the user's profile.
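Reading Accept-Language mostly comes down to parsing quality values; the list above can start from a simplified sketch like this (it ignores some RFC edge cases such as wildcards and malformed headers):

```python
def preferred_languages(accept_language: str):
    """Return language tags sorted by their q-value, highest first.

    Simplified: unparseable q-values fall back to 1.0, and wildcard
    ("*") entries are treated like ordinary tags.
    """
    langs = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            tag, q = piece.split(";q=", 1)
            try:
                quality = float(q)
            except ValueError:
                quality = 1.0
        else:
            tag, quality = piece, 1.0  # no q-value means q=1.0
        langs.append((tag.strip(), quality))
    langs.sort(key=lambda item: item[1], reverse=True)
    return [tag for tag, _ in langs]

print(preferred_languages("de-DE,de;q=0.9,en;q=0.8"))
# ['de-DE', 'de', 'en']
```

In practice you would then intersect this ordered list with the languages your site actually supports and fall back to a default.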
+6  A: 

Make sure (unlike me) you don't develop your site using FF3 and IE8 and then, at the end, check IE7 and see that it looks a mess and you need to spend days tweaking it.

Always check that the site renders OK in a number of different browsers during development; don't leave it till the end.

+17  A: 

Good thread. Here are some areas I think no one's mentioned:

Accessibility (a11y), WAI-ARIA tags and so forth, and since it's 2010, why not start adding some HTML5 into the mix as well.

Check out Selenium for JUnit-style client-side testing.

And lastly, Content Distribution Networks: don't host your static files yourself if you can avoid it, e.g. use Akamai or Google's hosted instance of jQuery.

Francis Shanahan
My employer uses Limelight for CDN. Their service seems pretty straightforward, but they're a big investment so you need to be pulling serious bandwidth before you need them.
+45  A: 

I, personally, avoid using extensions like .php in my URLs. For example, http://example.com/about instead of http://example.com/about.php.

Not only does the first URL look cleaner, but if I decided to switch languages, it would be less of an issue.

So how does one implement this? Here is the .htaccess code I found works best:

# Enable the rewrite engine (needed once per .htaccess)
RewriteEngine On
# If requested URL-path plus ".php" exists as a file
RewriteCond %{DOCUMENT_ROOT}/$1.php -f
# Rewrite to append ".php" to extensionless URL-path
RewriteRule ^(([^/]+/)*[^.]+)$ /$1.php [L]


+1: Conceptually dividing the interface (the URL) from the implementation (how you deliver the content) has been a theme of good system design for a long time. (If only all browsers were equally happy with the consequences, but at least it's not as big a problem as it used to be…)
Donal Fellows
I would personally not do the `.htaccess` file like that, but it is a good start at least. On a few sites I route everything through one file (I've chosen `r.php?r=$1`), from where I load templates and, where possible, whole cached versions of pages (a page like /about rarely changes, so I shave some milliseconds off the load time by not regenerating it every time).
It may not be a huge deal, but it's also one thing that can help secure your site just a little more. For example, a little while ago, a vulnerability was discovered in ASP.NET, so anyone who ran across a site with URLs ending in .aspx knew it was probably written in ASP.NET and might have tried the exploit against it. With no extensions in your URLs, besides looking cleaner, it can decrease the likelihood of people discovering what language(s) it was written in, which could be a (small) factor in situations like this.
@Frank: good idea - I do that as well, but that's more of a framework function than simply hiding the file extension.
@Andy: Security through obscurity.
@Chris: No substitute for real security, of course, but kind of a "nice to have" in addition.
@Andy, make sure to also turn off silly HTTP headers like "X-Powered-By: ASP.NET" :)
@Constantin: Chortle. Yes, that would also be a dead giveaway. :)
Apache's MultiViews feature can also be used to obtain this result, since by default it tries to supply an extension if the requested path doesn't have a literal match.
Søren Løvborg