At my workplace we want to outsource some parts of our software development project to an external contractor (another company with multiple developers). Our greatest fear is that although all functional requirements will be met, the code may be very bad, as is often the case with outsourced software projects.
We can't use any of the conventional methods to ensure code quality, like peer review or pair programming, so our first idea was: "Hey, make code quality part of the requirements."
But how can we actually do that?
Even if we review the code, we must be able to say: This is not what we agreed. So we need something which can be quantified, something that isn't open for debate.
On the other hand, we don't want to scare off every contractor with definitions that are impractically strict. We don't need perfect code, but at least a minimum of code quality should be ensured.
If you want a particular project quality, you will have to:
To help your contractor comply with your objective, translate your expectations into concrete technical requirements tailored to your project.
(These are indicative thresholds; adapt the values to the quality you expect. Over-engineering quality isn't good for your project's budget either.)
In your contract, ask for regular deliveries. Evaluate each delivery from your contractors to ensure that they are following your requirements and will produce the level of quality necessary for the success of the project. If not, pull them back on track.
You could use a set of tools (for Java: FindBugs, JavaNCSS, Simian...) and manually analyze the project with the right configuration (the specified one). Your contractor will do the same to ensure the quality of the project before delivery.
The easiest way to do all that is to use existing quality management software. This software must be available to both you and your contractor. It will be used as an independent system that acts as a trusted third party to validate the level of quality.
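For the "specified configuration" idea, a minimal sketch of what you and the contractor could agree on is a shared PMD ruleset checked into the repository. The rule names below are from PMD's standard Java rulesets; the threshold values are illustrative and should be adapted to your project:

```xml
<?xml version="1.0"?>
<!-- quality-gate.xml: ruleset agreed with the contractor, versioned with the code -->
<ruleset name="contractor-quality-gate"
         xmlns="http://pmd.sourceforge.net/ruleset/2.0.0">
  <description>Minimum code quality bar agreed in the contract</description>

  <!-- Flag methods that are too complex (threshold is illustrative) -->
  <rule ref="category/java/design.xml/CyclomaticComplexity">
    <properties>
      <property name="methodReportLevel" value="15"/>
    </properties>
  </rule>

  <!-- Flag oversized methods -->
  <rule ref="category/java/design.xml/ExcessiveMethodLength"/>
</ruleset>
```

Because both sides run the same ruleset file, "this is not what we agreed" becomes a tool report rather than a debate.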
Use a test balloon: let them start with a small package containing only a small feature set (equivalent to a very first milestone). State clearly in the contract that you expect high code quality.
They may well realize that a successful delivery will lead to a follow-up contract, and deliver high quality accordingly. If you are not satisfied, you can decide to do the work yourself or find somebody else without having wasted too much money yet.
In addition, it might be useful if you have someone regularly reviewing the code. It's better to define smaller work packages that can easily be reviewed than having one big milestone where it would be too late to ask for any changes.
Have you checked out FxCop [1] or StyleCop [2]?
They are good at sniffing out 'bad smells' and can be integrated into your build.
[1] http://bit.ly/7XUOqq

If you want particular code quality, then you'll have to provide your outsourcer with your coding guidelines and ensure your contract includes a clause stating that these must be followed or the code may not be accepted.
This is not a technical issue. It is about trust.
I suggest concentrating your efforts on finding the right partner. Attempting to manage risks by focusing on effects instead of causes is not a very good approach.
Being open about your concerns and speaking frankly with your potential subcontractor helps also.
Take a couple of days to brainstorm with your team and come up with, say, 10 useful guidelines for code quality as you see it, and present those to the other side. Take note of their reactions.
Find the right people. When you are not sure, either accept the risks or walk away. Communicate. There is no silver bullet on the technical side for ensuring code quality.
EDIT: I realise that I haven't directly answered the question at all, but I think that if you want to make your subcontracting effort work and get your work done on time and well, the points below are probably more important than any contract you can sign.
As a person who has worked on both sides of the fence, I think the most important things are to:
1. Communicate the architectural decisions made and their impact
2. Ensure that you have people who have the same mindset as you when it comes to the code on the other end
The only way to do this is to interview people. It is not at all uncommon to interview subcontractors before they join the team. Interview them not just with technical questions (that is not enough) but by giving them practical, real-life examples involving code cleanliness, patterns, refactoring, etc., and gauging their responses. This goes a long way towards getting people with a matching mindset on your team.
Most important is to get the right person leading the subcontracting team. You don't necessarily need every developer on the subcontracting team to be a mirror image of you. But if you can get a LEAD who is at least as close as you'd like to someone you actually work with (or to yourself), and make sure he is accountable for the rest of the team, most of the time he will automatically ensure that he gets the right people for his team. I would never, EVER, recommend accepting a non-technical person as your outsourcing team lead. He will never be able to staff and control the team correctly, especially on code quality and related issues.
You could use some form of metrics to measure cyclomatic complexity and coupling and give an overall target, say, 60/100 (in the VS default metrics tool). It's not foolproof, however, and the only way to really know the quality is to review it.
Edit: Actually, on second thoughts, you might be better off taking a different approach. Try to gauge the quality of other code that company has produced and estimate from that what your product will be like. I read something yesterday about outsourcing to India and how it generally produces poor code. Go for a company with a good reputation; you'll get what you pay for. I think it's too difficult to measure otherwise.
Require a minimum of 100% unit test coverage (or as close as is reasonable for your situation) and establish a baseline cyclomatic complexity score. Class/method/variable naming standards should also be established and adhered to, and all methods and classes should be documented with standard comments (I'm not normally a fan, but in this case it will help ensure they're at least reading what they're copying/pasting from Experts Exchange and here).
All those things can be quickly checked and, while not perfect, can at least establish a baseline. I would also require real-time access to the work, daily check-ins, and do a weekly code review of one or two random files. That way you have some chance of catching stuff before it goes horribly wrong.
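A coverage floor like the one above can be enforced automatically rather than checked by hand. As a sketch, assuming a Maven build, the JaCoCo plugin's `check` goal can fail the build when coverage drops below the agreed minimum (the version and the 90% threshold here are illustrative):

```xml
<!-- pom.xml fragment: fail the build if line coverage drops below the agreed floor -->
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.11</version>
  <executions>
    <execution>
      <goals><goal>prepare-agent</goal></goals>
    </execution>
    <execution>
      <id>check-coverage</id>
      <goals><goal>check</goal></goals>
      <configuration>
        <rules>
          <rule>
            <element>BUNDLE</element>
            <limits>
              <limit>
                <counter>LINE</counter>
                <value>COVEREDRATIO</value>
                <minimum>0.90</minimum>
              </limit>
            </limits>
          </rule>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Putting the threshold in the build (which the contractor also runs) means nobody has to argue about it at delivery time.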
To solve the copy-paste problem you could ask for a specific code-formatting standard, variable naming, and such. At least they'll have to re-read what they pasted and, who knows, maybe they'll think twice before doing it.
Code quality is so subjective, especially if all the functional requirements are actually met. The best you can do is request certain concrete things like using an MVC pattern.
Perhaps you can require checkpoints for code review.
But a large part depends on hiring the right people to do the job.
Edit: I tried to find quantifiable items/checklists for code quality, and the best I could do is Wikipedia's list of code quality factors [1]. If the developers still don't want to fix their code, these factors might not hold up, but they're at least more specific than just saying "quality".
[1] http://en.wikipedia.org/wiki/Software_quality#Measurement_of_software_quality_factors

I just went through this [1], and it's a crapshoot. As the example that worked to explain this to management: management here definitely requires a four-year bachelor's degree to write code, and hires senior developers with 5-7 years of experience as a minimum for a technical lead.
So, if you have your best developer, who has a minimum of nine years of experience doing just this, you're asking that developer to distill everything they've learned into a single document that can be easily used to verify other developers' code.
It's not impossible, but it's a really, really hard thing to do well.
FindBugs [2] and PMD [3] are code checkers that report back possible problems. Make absolutely sure, above all else, to run something like PMD-CPD, a copy and paste detector. Have a cyclomatic complexity [4] cap.
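One way to make the copy-and-paste check non-negotiable is to fail the build on duplication. A sketch using the Maven PMD plugin's `cpd-check` goal, assuming a Maven build (the 100-token threshold is the plugin's usual default and is illustrative):

```xml
<!-- pom.xml fragment: cpd-check fails the build when duplicated code is found -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-pmd-plugin</artifactId>
  <version>3.21.2</version>
  <configuration>
    <!-- a "duplicate" is any run of 100+ matching tokens -->
    <minimumTokens>100</minimumTokens>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>cpd-check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```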
If it's an option, split requirements, coding, and testing to three separate contracts and companies; do not let the testers write the code or requirements, or have close contact with those that do.
Require enough comments in the code that someone can follow what's going on. Make incorrect comments (those copied from another place without being updated) something that will slow down payment to the contractor.
In general, make quality a requirement for them getting paid, and make that very clear up front. Review a potential contractor's sample code for quality before hiring them, if that's an option. Largely, we had to teach our outsourced team quality from scratch, as they'd really, really, really never written any maintainable code before; each new version that they were paid for was previously largely a rewrite, so paying them to improve the software meant paying their cost for redoing everything.
That works fine... on simple applications; the problem is several versions in, when they didn't quite plan for any future of your project.
[1] http://stackoverflow.com/questions/1791075/what-to-avoid-when-a-project-has-been-partially-outsourced/1791175#1791175

I'll suggest another carrot/stick approach.
First of all you have to define what is good code quality. If you can't do that then you can't measure it. (There have been some other replies that have mentioned various tools for measuring complexity etc.) This is probably the hardest thing to actually do.
Then you have to define your contract. Not a code contract, but a business-to-business, real-life, in-the-flesh contract that is physically signed by both parties. This contract needs to state that the outsourced code has to meet your minimum metrics for quality, or you will either apply a payment penalty or withhold payments until they correct their work and bring it up to your standards.
The carrot part of this contract would have clauses like bonuses for meeting quality requirements in under the agreed time frame.
If you don't have a physical contract with penalties then there is no incentive for the outsourcing company to supply code to meet your standards.
+1 for Lazarus.
Provide coding & development standards and make them sign up to follow those. Allow them to review those standards in advance. What you are trying to achieve is cheaper coding to the same standard that you would deliver yourself. Make sure that those standards tie up with what you do and don't go easy on them or try and make life harder for them than it is for you. If you feel comfortable with it, show them evidence that you review compliance yourselves but you don't have to do that because you are the customer.
The standards can include test case coverage, review coverage, pass rates, or indeed anything that is right for you in your organisation.
Reserve the right to audit compliance with your standards where there are issues. If they deliver quality on time, leave them to it. Where they deliver defects, audit the code, the code review and their standards compliance. Reserve the right to penalties or termination once they fall below a compliance threshold. The same applies if they are late but take care not to put delivery ahead of quality.
If they won't sign up to that, walk away. Outsourcing companies are ten a penny.
I'd argue that if you put a quality metric in a contract, it's always possible to write low-quality code that formally fits these metrics. Simply because a quality metric in a contract has to be a formal criterion, and quality isn't a formal criterion.
My advice would be: Split the projects into small parts. Review each part when it is completed. If the part is up to your coding standards, fine, the contractor can do the next part too. If it's not, give the contractor two choices: Either (a) she improves the code or (b) she won't get a contract for the next part. That way, you can say "this isn't good enough, please improve it", even if there's no contract-ready quality criterion.
Whatever tools one uses, the following will always be a problem: without proper analysis, a reviewer will never be able to distinguish code that was designed well from code that seems OK but is wrong from an architectural point of view. Proper analysis means involving technical experts, and that in turn means more money and time spent. I can't see any other way if one wants flexible, reliable and maintainable code from outsourcing.
You could define a maximum error/warning rate for tools like Checkstyle [1] or PMD [2].
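As a minimal sketch of such a setup, a Checkstyle configuration could pin down the agreed limits (module names below are standard Checkstyle checks; the thresholds are illustrative and should be adapted):

```xml
<?xml version="1.0"?>
<!DOCTYPE module PUBLIC
    "-//Checkstyle//DTD Checkstyle Configuration 1.3//EN"
    "https://checkstyle.org/dtds/configuration_1_3.dtd">
<module name="Checker">
  <module name="TreeWalker">
    <!-- keep methods short and simple -->
    <module name="MethodLength">
      <property name="max" value="60"/>
    </module>
    <module name="CyclomaticComplexity">
      <property name="max" value="12"/>
    </module>
    <!-- enforce the agreed naming conventions -->
    <module name="MethodName"/>
    <module name="LocalVariableName"/>
  </module>
</module>
```

The maximum error/warning rate itself is then enforced in the build tool that runs Checkstyle, so both sides see the same numbers.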
[1] http://checkstyle.sourceforge.net/

If it's not going to be reviewed, it's not possible. There's no automated program that can determine code quality.
Defining something like maximum errors in a static analysis tool doesn't buy you a thing to determine code quality.
The problem with quality metrics as a criterion for acceptance is that a low quality developer will simply write the code in the way that they know best to write it, then muck with it until it passes all of the metrics. Metrics also can't capture some of the most important aspects of code quality, such as whether the names of symbols are meaningful and lead to easy comprehension and maintenance of the code.
It's reasonable to expect a certain standard of quality and even to include it in the contract, but the consulting relationship requires a certain degree of trust. If you try to reduce risk by giving yourself 'escape hatches' to avoid low quality code, this will likely discourage higher-quality developers from bidding on your project, as they are also interested in managing risk.
NASA Documents Online [1] lists some of the industry/academic articles on software quality. Scroll down to the Software section.
[1] http://www.hq.nasa.gov/office/hqlibrary/find/nasadoc.htm#S

Use NDepend and its Code Query Language (CQL) [1]. CQL is dedicated to writing code quality rules that can be verified live in Visual Studio, or verified during the build process and through a report. Here are a few samples of CQL rules:
Complex methods should be sufficiently commented:
WARN IF Count > 0 IN SELECT METHODS WHERE CyclomaticComplexity > 15 AND PercentageComment < 10
I don’t want my User Interface layer to depend directly on the DataBase layer:
WARN IF Count > 0 IN SELECT NAMESPACES WHERE IsDirectlyUsing "DataLayer" AND NameIs "UILayer"
Static fields should not be named m_XXX (Custom naming conventions):
WARN IF Count > 0 IN SELECT FIELDS WHERE NameLike "^m_" AND IsStatic
Methods out of MyAssembly and MyAssembly2 should not have more than 30 lines of code:
WARN IF Count > 0 IN SELECT METHODS OUT OF ASSEMBLIES "MyAssembly1", "MyAssembly2" WHERE NbLinesOfCode > 30
Public methods should not be removed to avoid API breaking changes:
WARN IF Count > 0 IN SELECT METHODS WHERE IsPublic AND IsInOlderBuild AND WasRemoved
Types tagged with the attribute MyNamespace.FullCoveredAttribute must be thoroughly covered by tests:
WARN IF Count > 0 IN SELECT TYPES WHERE HasAttribute "MyNamespace.FullCoveredAttribute" AND PercentageCoverage < 100