From Wikipedia:
To say that a company "eats its own dog food" means that it uses the products that it makes.
So the question is: do you use the software you are developing?
If you don't, you should. There are so many things you can only learn about your software by using it yourself, or by having someone in your company who can give you critical feedback from a real user's perspective. You build better software when you really understand the pain your users feel.
We all use our products as much as possible. Users give great feedback, so: use!
Well, I once joined a game company just to work on the MMO I was currently playing. BTW, I don't recommend this, it kind of ruins the game.
Currently? Not so much. We make kind of specific custom web-apps for other divisions, so it's not really apropos. I'd like to say I try to code as if I'd be using it (feature-wise, etc.), but what I do isn't always what the customer would do, for some strange reason.
I do not dogfood the software that I write. I feel like it is because I write some special case software that I am not in the core user base for. But in reality it is because I/we are too lazy. I wish I did.
Now, I do use other software that the company I work for writes. I am involved in several alpha and beta programs for other software that the company writes.
I work in a very specialized domain, so personally replicating the workflows of our users is a big challenge, especially for new developers on the team. Most of our workflow research is therefore based on on-site observations and on the knowledge of devs who have been on the project longer.
It would be extremely helpful if I had the domain knowledge of our users, but it's not always practical.
ABSOLUTELY!
...I was once at Ingres, competing with Oracle, and we won a HUGE sale because the prospect learned that they weren't using their own database to run their financial system! (Yes, this was LONG ago, and yes, I'm sure it has long since been "corrected!")
I / we use BOTH the systems we make and -ahem- our system is written in our system! -smile- Neat trick, eh? (Elementary, really.)
No. I write software that people use to build ships. Eating my own dogfood would be hard to do.
We are required to use our programs. Our CEO also must use each program before our users can. This helps a lot.
To the extent that we can, we do. We work in a domain that has nothing to do with developing software, so there's not a lot of overlap. Still, we continue to use what used to be our flagship product -- an IMAPv4 server -- even though we've deprecated it and no longer develop it actively.
Yes, on my current project, we use the software that I am developing. Since it's a timekeeping and attendance application, our development team uses it to punch time so that we can get paid.
Unfortunately, our use case is super simple: just put in between 40 and 80 hours of flex time. In the future, we hope to charge time to the features we've been working on.
That said, eating your own dog food isn't a silver bullet. In our case, the other employee populations have such radically different requirements from ours that it's a lot of work to think about what they need. Another caution: upper management often has a very different view of the system than the folks in the trenches, or, even worse, than the prospective user populations.
Not as such - but our team does end up coding a fair few dev tools that we obviously use - does that count?
Yes and no -
Yes - when I am writing the spec (usually in real time as I go), I make constant changes every step of the way depending on the usability feedback I get just from running the software myself.
Not so much - when I am working to a very tight published spec, because it's usually frustrating how unusable the software is as designed!
I think that if you're working on something that you could potentially use, then eating your own dog food is crucial.
This is fairly easy for me since I'm developing IDE plugins for java developers; by using my tool all the time I notice many opportunities for improvements, and find rare corner-case bugs that are not apparent in explicit testing.
Of course, much software cannot be used by its own developers, in which case (especially if users are nontechnical) it's important for the developer to go and observe the users use the product for a while. Nontechnical users tend to blame themselves or the operating system rather than the program.
Sure, that's one of the nice things about writing development tools: we can integrate them into our own development process. :-)
Yes! Everyone should - though you should try to use it with an open mind. There's always going to be bias when using your own product. Since we're very passionate about what we build, we can get really defensive about the products' shortcomings.
Yes, though it's not all sweetness and light. We produce completely disparate systems, and since I work on one but use the others, my feedback doesn't always (read: never) reach the guys developing those other products. Sometimes it can be really frustrating, when a much leaner, less over-featured open source product would nicely replace the use I get out of our own product.
That said, any time I get feedback from anyone using our product, I try to incorporate it into the next version of our product. I figure it's free insight, so let's take advantage of it. I suspect that the organisation is shielding me from a lot of this useful feedback.
Not all the stuff we develop. As a web solutions house, a lot of what we develop is customised for a particular client's business requirements and isn't something we can either reuse or simply use internally.
Where we can, we do: we're in the process of building common libraries of internally maintained software which will be used on most (if not all) client work.
Since I develop web applications, I'm rather used to eating my own dog food. In this field, I've found it's the only true way to improve a product's usability, since being sensitive to the problems yourself lets you improve the features.
Beyond whether it is practical for the developer to use the software, it also depends on the company culture. In places where people are just busy shipping deliverables and do some testing only to have a process to point at (all signs that the company is taking shortcuts, with no respect for process, no dignity of labor, and no awareness of its own intellectual property), you will see people not even thinking about using their own products.
However, a company with a strong, positive work culture, good policies for customer satisfaction, genuine appreciation of developers' contributions, and a correct understanding of why testing matters will create ways for developers to use their own products.
There are two things you must do to make sure your products are worth releasing. One is to use them yourself in as many real-world scenarios as possible. For example, if you build a product that helps in making websites, then use it to build your own company's website. This is eating your own dog food.
The other, related, thing is that you should use real-world datasets to test your products. There is no substitute for that either.
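To make the second point concrete, here is a minimal pytest sketch of the idea; the `invoicing` module, the `parse_invoice` function, and the sample directory are all hypothetical, invented for illustration. The point is simply to parameterize one test over a folder of real-world samples rather than a single synthetic fixture:

```python
from pathlib import Path

import pytest

# Hypothetical module and function under test, invented for illustration.
from invoicing import parse_invoice

# A directory of anonymized real-world samples (path is an assumption).
REAL_SAMPLES = sorted(Path("tests/data/real_invoices").glob("*.json"))


@pytest.mark.parametrize("sample", REAL_SAMPLES, ids=lambda p: p.name)
def test_parse_invoice_on_real_data(sample):
    # Real files tend to exercise encodings, locales, and edge cases
    # that hand-written synthetic fixtures rarely cover.
    result = parse_invoice(sample.read_text(encoding="utf-8"))
    assert result.total >= 0
    assert result.line_items, "a real invoice should have at least one line item"
```

Every real file you add to that directory becomes another test case for free, which is about as close as a test suite gets to dogfooding.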
I do for most of my personal hobby projects. The main reason being that I usually create those projects because I need to solve a problem or make something easier, and I wouldn't be creating them in the first place if I wasn't going to use them. The same goes for internal tools and utilities I create while working on commercial projects.
However, when developing commercial software this is not always the case. As others have mentioned, there are many types of software for which the developer is not a target user. For example, if I was developing a medical software package for a hospital, why would I be using it if I didn't work at a hospital?
On a side note, as much as "eating your own dog food" is generally considered a good thing, it will not always help improve the quality of your software. In some cases it may even make the user experience worse.
For example, if I was developing a compiler and I often used inline assembly but never used generics, it is very likely that the compiler's support for assembly would be better than its support for generics. Now, if I was the only one who actually used the inline assembly feature, while most other users were programming with generics, I don't think my using the compiler would benefit them much. In fact, if I was the only developer, my usage of the compiler might actually hurt them, as I may decide to prioritize the development of the assembly feature instead of working on the buggy generics implementation.
Another example would be software which is easy to use once installed, but very hard for a non-technical person to install or configure.
Don't get me wrong, I encourage everyone to use their own software, but I think it's naive to believe that using the software yourself is a "guarantee" of quality.
If I'm not using the software I'm writing, nobody else is going to.
Obviously, this proverb cannot apply to every domain of software development: you can design software to manage a nuclear plant, but you don't use it daily. Of course, the software is tested (in a simulator), but that's not the point.
I am not sure game programmers can play all day with the excuse that they're testing the software... etc.
Now, on the personal side, I have helped a lot to improve Scintilla and SciTE, a source code editing component and the "test" (now fully functional) editor built around it. And of course, I use it extensively all day. That's the point: most of the improvements and bug fixes I made were done to scratch my own itches.
At my last job we tried using our bespoke, highly configurable database and user interface for all sorts of tasks; however, it was overkill and/or ill-suited to whatever we needed to do.
Quite often, because people were just used to solving customers' problems with our in-house tool, they'd come up with really obtuse ways to use the systems we'd developed to "enhance" the way we worked. Most of the time it just got in the way, and we could have simply used an off-the-shelf product.
Now I work on software for call centres, so unless I plan on managing a call centre, it's not very likely...
Eating your own dogfood is hard... Let's go shopping...
I suggest reading "Dissecting a C# Application: Inside SharpDevelop": http://www.icsharpcode.net/OpenSource/SD/InsideSharpDevelop.aspx
It explains how the makers of SharpDevelop, the open source .NET IDE, built it from scratch, starting in Notepad, until they could use the IDE itself to continue developing it.
The company that I work for is in the building automation controls (HVAC, lighting and access controls) industry and has its products installed in both of its buildings. These buildings are, in fact, always one of our beta sites.
As much of the user interface of the program that I'm working on was created using the drawing portion of that program, we have to use that whenever we need to create new user interface graphics or dialogues, so in that sense, the developers on that project "eat their own dog food". As for using the program for its main purpose (as a user interface to our controller devices), we have a network set up with several of those devices connected to it, so we can at least see how the program is working on that side of things, but it's not really the same as hooking into a real site. And since the program is a vertical market type of application, there's not really much use for it outside of our industry, so you won't be seeing anyone in accounting or shipping & receiving (to use a couple of examples) using it any time soon.
I develop plugins for Eclipse for my research, and yes, I use them; I do like what they do :)
No. We make a financial trading system, and I don't trade. Of course I use the system to test my code and get experience with it, but that's not dogfooding.
As a freelancer, I work on two types of software:
I feel like an efficiency ninja! If I can't do something simply, I create a way and then use it until I can come up with a better way. So yes, I do eat my own dog food. It's just very simple dog food with mega flavor!
If a software house writes, e.g., embedded software for controlling manufacturing machines that assemble electronics, should it purposely start an electronics manufacturing division to eat its own dog food?
Ridiculous.
By all means, if a software house makes generalized software that it can use in its own business, it should. But I would argue those cases are rare. Of the companies I have worked for in my career, only 20% or so could have eaten their own dog food.
More important is actively soliciting, and listening to, feedback from users, and rigorous testing.