I am really close to finishing up on a project that I've been working on. I have done websites before, but never on my own and never a site that involved user generated data.

I have been reading up on things that should be considered before you go live and I have some questions.

1) Staging... (Deploying updates without affecting users). I'm not really sure what this would entail, since I'm sure that any type of update would affect users in some way. Does this mean some type of temporary downtime for every update? Can somebody please explain this, and a solution to it as well?

2) Limits... I'm using the Kohana framework and I'm using the Auth module for logging users in. I was wondering if this already has some type of limit (on login attempts) built in, and if not, what would be the best way to implement this (save attempts in the database, a cookie, etc.). If this is not what's meant by limits, can somebody elaborate?

Edit: I think a good way to do this would be to freeze logins for a period of time (say 15 minutes), or to display a captcha after a handful (10 or so) of unsuccessful login attempts.
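That freeze-window idea can be sketched roughly as follows. This is not Kohana's Auth API; the table and function names are hypothetical, and an in-memory SQLite database stands in for the site's MySQL connection:

```python
import sqlite3
import time

LOCKOUT_THRESHOLD = 10     # failed attempts before freezing
LOCKOUT_SECONDS = 15 * 60  # 15-minute freeze window

# In-memory DB for illustration only; a real site would use its MySQL connection.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE login_attempts (username TEXT, attempted_at REAL)")

def record_failure(username):
    """Log one failed login attempt for this username."""
    db.execute("INSERT INTO login_attempts VALUES (?, ?)",
               (username, time.time()))

def is_frozen(username):
    """True if the user has too many failures inside the freeze window."""
    cutoff = time.time() - LOCKOUT_SECONDS
    (count,) = db.execute(
        "SELECT COUNT(*) FROM login_attempts "
        "WHERE username = ? AND attempted_at > ?",
        (username, cutoff)).fetchone()
    return count >= LOCKOUT_THRESHOLD
```

The login handler would call `is_frozen()` before checking the password, and `record_failure()` after a bad one; old rows age out of the window automatically, so no cleanup is strictly required.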

3) Caching... Like I said, this is my first site built around user content. Considering that, should I cache it?

4) Back Ups... How often should I backup my (MySQL) database, and how should I back it up (MySQL export?).

The site is currently up, yet not finished, if anybody wants to look at it and see if something pops out to you that should be looked at/fixed. Clashing Thoughts.

If there is anything else I overlooked that's not already in the list linked to above, please let me know.

Edit: If anybody has any advice as to getting the word out (marketing), I'd appreciate that too.

Thanks.

EDIT: I've made the changes, and the site is now live.

+1  A: 

1) Most sites that push frequent updates, or that have a massive update that will take some time, use a beta domain such as beta.example.com that is restricted to staff until the changes are released to the main site for the public.

2) If you use cookies then they can just disable cookies and have infinite login attempts, so your efforts will go to waste. So yeah, use the database instead. How you want it to keep track is up to you.

3) Depends on what type of content it is and how much there is. If you have a lot of different variables, you should keep only the key variables that identify the data in the database and move the additional data into a cache, so that database queries run faster. You will be able to quickly find the results you want and then just open the cache file associated with them.

4) It's up to you, it really depends on traffic. If you're only getting 2 or 3 new pieces of data per day, you probably don't want to waste the time and space backing it up every day. P.S. MySQL exports work just fine, I find them much easier to import and work with.

animuson
+1  A: 

1) You will want to keep downtime for updates to a minimum. I tend to let jobs build up, and then do one big update at the end of the month.

2) In terms of limiting login attempts: cookies are simple to implement but not fool-proof; they will stop the majority of users but can be easily circumvented, so it would be best to choose another way. Using the database would be better, though a bit more complicated to implement, and it could add more strain to the database.

3) Caching depends greatly on how often content is updated or changes. If content changes a lot it may not be worth caching, but if it is largely static then something like memcached or APC will be of use.
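To illustrate the trade-off: a minimal in-memory TTL cache (a stand-in for memcached's get/set, with hypothetical names) only pays off when an entry lives long enough to be read again before it expires:

```python
import time

class TTLCache:
    """Minimal memcached-style cache: values expire after ttl seconds."""
    def __init__(self, ttl):
        self.ttl = ttl
        self.store = {}  # key -> (value, expires_at)

    def set(self, key, value):
        self.store[key] = (value, time.time() + self.ttl)

    def get(self, key):
        """Return the cached value, or None if missing or stale."""
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() > expires_at:
            del self.store[key]  # stale; drop it so the caller regenerates
            return None
        return value
```

A page handler would try `cache.get("page:123")` first and fall back to the database (then `cache.set(...)`) on a miss; the shorter the TTL, the fresher the content but the more misses you pay for.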

4) You should always make regular backups. I do one daily via a cron job to my home server although a weekly one would suffice.
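A nightly cron entry can be as small as a script like this one (database name, user, and paths are placeholders; it assumes `mysqldump` is installed and on the PATH):

```python
import datetime
import subprocess

def dump_command(database, user, backup_dir):
    """Build a mysqldump invocation that writes a dated .sql file."""
    stamp = datetime.date.today().isoformat()  # e.g. 2010-05-30
    outfile = f"{backup_dir}/{database}-{stamp}.sql"
    return ["mysqldump", "-u", user, f"--result-file={outfile}", database]

# The cron job itself would then run something like:
#   subprocess.run(dump_command("mysite", "backup", "/var/backups"), check=True)
```

Writing a dated file per run also gives you point-in-time restores for free; just prune old dumps occasionally so the backup directory doesn't grow without bound.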

Jamza
A: 

Side notes: YSlow indicates that:

  • you are not serving up expires headers on your CSS or images (causes pages to load slower, and costs you more bandwidth)
  • you have CSS files that are not served up with gzip compression (same issues)
  • also consider moving your static content (CSS, images, etc.) to a separate domain (CDN) for faster load times
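The first two points are usually a few lines of Apache config (this sketch assumes mod_expires and mod_deflate are enabled; adjust the types and lifetimes to taste):

```apache
# Far-future expires headers for static assets (mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
</IfModule>

# Gzip-compress text content on the way out (mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```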
scunliffe