Which is more important in Software Development?
Fixing bugs, or adding features?

A colleague recently broached this topic, phrasing it as "we should all focus our priorities on fixing bugs vs. adding features". This is wrong enough as a generalization that entire tomes could be (and probably have been) devoted to the subject.

The ONLY possible answer is that it depends, and that balance is required. There are good and valid reasons to focus on either, and these depend largely upon both the context and the circumstances. Either extreme is just plain wrong.

So let's get to the caveats and disclaimers:
Many of these views come from a family tradition of having my last name on the company. In that situation, it truly is my professional reputation on the line (which affects the bottom line) if I screw up badly. See the rest of the top-level web site - that was the family company for many years.
Another significant contributor to these views is the requirements of the software I have worked on over the years: compilers, binary translators, dynamic binary optimizers, in-kernel drivers, and system-wide [libc] interception and interposition. Errors in the first two are painful, as user-space applications may misbehave or crash in the hands of other developers. A user-grade DBO has even higher correctness requirements, in that novice users may be the consumers. For the last two, an error can easily lead to total system failure, up to and including data corruption that requires a complete OS reinstall with loss of all data. On a client machine, that is annoying. On a mission-critical server, it is completely unacceptable.
Some of the issues to consider (in no particular order) when deciding whether to focus on fixing bugs or adding features: