Thursday, October 26, 2006

The Guerrilla Guide to Interviewing

Joel's latest article on interviewing is up. It makes some good points, although he continues with the down-to-the-metal idiosyncrasy that I posted about last year. But here are the points I thought were really good:
  • Hire/No-Hire. Make a decision. If you don't know, the answer is No Hire. I've run into this before when interviewing an entry-level guy for a position that required more skills than he had. We recommended he be hired for Support instead. I'm not sure that wasn't the right decision, but as a principle I like this one.
  • You want people who are smart, and who get things done. Joel describes people who fail at one or the other, and I think I've worked with most of them before.
  • A programmer should understand pointers, and recursion. Joel comments that a lot of people are coming out of school without learning a language that requires pointers, which is a problem. Less so with recursion. He says that pointers are an aptitude rather than a skill.
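A throwaway C++ sketch of my own (not something from Joel's article) to show what I mean about pointers-plus-recursion being a single aptitude: walking a singly linked list recursively exercises both at once.

#include <cstddef>
#include <iostream>

struct Node {
    int value;
    Node* next;
};

// Recursively count the nodes in a singly linked list. Seeing why the
// recursion terminates - the NULL pointer at the end of the list - is
// exactly the pointer/recursion aptitude in question.
int Length(const Node* head) {
    if (head == NULL)
        return 0;
    return 1 + Length(head->next);
}

int main() {
    Node third  = { 3, NULL };
    Node second = { 2, &third };
    Node first  = { 1, &second };
    std::cout << Length(&first) << std::endl;  // prints 3
    return 0;
}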

At the end he says, confidently,

If your resume and phone-screening process is working, you’ll probably have about 20% hires in the live interview.

True at Fog Creek, no doubt. I've not really seen it here in Indianapolis, where the local talent pool is so small. But you never know, we might get lucky!

Want a job at an up-and-coming medical imaging company? Drop me a line!

Wednesday, October 25, 2006

Bloggers are people too

Ordinarily for me, reading and writing blogs is an intellectual exercise. I'm more comfortable and interested in discussing software processes, languages, and testing than I am with the emotional appeal of the typical network news sob story. So when worlds, and cars, collide, I find it affects me especially. I envision my own five-year-old son coming home from his karate class and I feel like crying. Good luck to you, Nick and Josh.

Tuesday, October 17, 2006

IQAA: Regression Testing

Dr. Hanna's Practice #8: Perform regression testing that is based on impact analysis. This is the first practice in the list that you can't just nod your head at, since you have to have a good idea of what regression testing is (I do) and what impact analysis is (I didn't). But I did like Practice #7: Testers should attend design and code reviews. It's not something I had heard before, but it's obviously a good idea if you are interested in facilitating communication within your company.

So what should a tester do at a code review? Primarily they will want to come up with test ideas: examine the code paths and ask how each one can be exercised. But they can also ask a very fundamental question: what other parts of the code is this project going to affect? That is an impact analysis. If I remember correctly, it was recommended that this analysis be done formally, meaning developers have to write up a statement or report analyzing what other parts of the product will be affected. Not a bad idea, but probably not for smaller companies like Prosolv.

So based on the Impact Analysis, testers should be able to come up with a set of requirements that need to be retested, and there's your regression suite. Of course, every build that goes to testing should be tested on the critical path (or as I prefer, the "Happy Path"). Dr. Hanna suggested a 90% pass goal, but I'm not sure why that should be. Some tests will be showstoppers, others will be...well, whatever. I suppose if you have more than 10% "whatevers" failing, you've got an issue, though.
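To make that concrete, here's a rough C++ sketch of the bookkeeping I have in mind - the component and requirement names are invented, and it isn't anything Dr. Hanna showed - turning "components touched by this change" into "requirements to retest":

#include <cstddef>
#include <iostream>
#include <map>
#include <set>
#include <string>
#include <vector>

// Toy impact analysis: map each component to the requirements it implements,
// then turn the set of components touched by a change into the set of
// requirements that need to be retested - i.e. the regression suite.
int main() {
    std::map<std::string, std::vector<std::string> > reqsByComponent;
    reqsByComponent["MeasurementEngine"].push_back("REQ-012 Calibrated measurements");
    reqsByComponent["MeasurementEngine"].push_back("REQ-031 Unit conversion");
    reqsByComponent["ReportWriter"].push_back("REQ-044 PDF report layout");
    reqsByComponent["DicomImport"].push_back("REQ-007 Load DICOM studies");

    // Output of the (hypothetical) impact analysis for this change.
    std::vector<std::string> impacted;
    impacted.push_back("MeasurementEngine");
    impacted.push_back("ReportWriter");

    std::set<std::string> regressionSuite;
    for (std::size_t i = 0; i < impacted.size(); ++i) {
        const std::vector<std::string>& reqs = reqsByComponent[impacted[i]];
        regressionSuite.insert(reqs.begin(), reqs.end());
    }

    for (std::set<std::string>::const_iterator it = regressionSuite.begin();
         it != regressionSuite.end(); ++it)
        std::cout << "Retest: " << *it << std::endl;
    return 0;
}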

Just a couple of other notes:

- Regression testing doesn't do any good if you do it at the beginning of a project - it is certainly to be hoped that there will be few failures then!

- Impact analysis is also necessary when a requirement is changed. Go to a developer if necessary!

- Which led to the question, what if the developer doesn't know? Dr. Hanna's response: Find new developers ;)

Monday, October 16, 2006

IQAA: Integration Testing

Dr. Hanna's Practice #3: Test for both functional and quality requirements. I would have thought state charts and truth tables were familiar to everyone, but I think the typical Indianapolis tester has a lot less experience than, say, one in San Jose, so a lot of people there weren't familiar with them. But Dr. Hanna had some good advice on turning a requirement, written in English, into a model by splitting it up into actions and results. He took a typical requirement statement, teased four predicates and four consequences out of it, and showed the truth table that resulted, with the 16 possible states of the predicates and the expected consequences of each.
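Here's roughly what that exercise looks like if you turn the crank in code. The requirement is one I made up (not the one Dr. Hanna used): saving is enabled when the user is logged in, the license is valid, a study is loaded, and the study is not read-only. Four predicates, sixteen rows.

#include <iostream>

// Invented requirement, split into four predicates: saving is allowed when
// the user is logged in, the license is valid, a study is loaded, and the
// study is not read-only.
bool SaveEnabled(bool loggedIn, bool licenseValid, bool studyLoaded, bool readOnly) {
    return loggedIn && licenseValid && studyLoaded && !readOnly;
}

int main() {
    // Enumerate all 16 combinations of the four predicates - one row of the
    // truth table per iteration - and print the expected consequence.
    for (int row = 0; row < 16; ++row) {
        bool loggedIn     = (row & 1) != 0;
        bool licenseValid = (row & 2) != 0;
        bool studyLoaded  = (row & 4) != 0;
        bool readOnly     = (row & 8) != 0;
        std::cout << loggedIn << " " << licenseValid << " "
                  << studyLoaded << " " << readOnly << " -> "
                  << SaveEnabled(loggedIn, licenseValid, studyLoaded, readOnly)
                  << std::endl;
    }
    return 0;
}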

I was a bit itchy during this part, because I always feel that you can go through and test your application like this, and then go through and test a separate bit of your application in a way that contradicts it. To help my understanding, I went on the second day to a talk on integration testing, which I thought would more or less cover my confusion. You've got one requirement, you've got another requirement, testing them both is integration testing, right?

Well, no. Integration testing is the actual bit where you take two components of the system and make sure they talk to each other properly. Testing the inputs and outputs of one component is mostly a unit test, since those are usually easy to test and verify with the automated tests the developer should have written. But you need integration testing to avoid the "operation was successful, but the patient died" phenomenon, where the interface of component A is not clearly understood by the developer of component B, so he writes and tests a very nice component that doesn't do at all what component A expects.
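A tiny made-up example of the difference, with the classic unit mix-up standing in for the misunderstood interface (the components and numbers are mine, purely for illustration):

#include <cmath>
#include <iostream>

// Made-up components. "Component A" measures an angle and reports it in
// degrees; "component B" was written and unit tested assuming radians.
double MeasureAngleDegrees() { return 90.0; }
double HorizontalOffset(double angleRadians, double radius) {
    return radius * std::cos(angleRadians);
}

int main() {
    // Unit test of B against its own contract: passes.
    bool unitOk = std::fabs(HorizontalOffset(0.0, 2.0) - 2.0) < 1e-9;

    // Integration test: wire A's output into B the way the product does.
    // A 90-degree angle should give an offset of ~0, but B was handed
    // degrees where it expected radians, so this check fails - the
    // "operation successful, patient died" case the unit tests missed.
    bool integrationOk = std::fabs(HorizontalOffset(MeasureAngleDegrees(), 2.0)) < 1e-6;

    std::cout << "unit test:        " << (unitOk ? "pass" : "FAIL") << std::endl;
    std::cout << "integration test: " << (integrationOk ? "pass" : "FAIL") << std::endl;
    return integrationOk ? 0 : 1;
}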

But with that clarification, I guess I see the real issue: the thing that is worrying me is a pair of contradictory requirements. Given a clear requirements document, it is no longer the tester's problem, and it is no longer the developer's problem. It's a business problem, and someone with knowledge of the problem domain is required to clarify the contradictory requirements, which allows us to update the requirements doc, and guess what - now the testers can redesign their test plan and the developers can redo their code.

Here's a list of books recommended at the conference.

Friday, October 13, 2006

IQAA: Changing Requirements

For me, the whole conference I was at this week revolved around the requirements management process. Partly because many companies I've worked at have had trouble with this part of the process; that is, they followed the process of creating a requirements document, then shut it away in a drawer and never looked at it again, while developers and testers went along their merry way and coded up a mishmash of the requirements, what they thought the requirements might mean, and any customer requests that weren't too difficult and/or came from important customers.

But I think a main thrust of Dr. Hanna's talk was that the requirements document is very important. I'm used to this very static, dull requirements document, and so I kept wanting to raise my hand and say, "How can you do that when the requirements phase is already complete?" But I have to conclude that he doesn't think it is static at all, and that it has to be dynamic and updated continually. (It was interesting that he said several times that testing is a process, not a step in the process, but he never quite said the same about requirements.)

The typical software company tends to communicate rather informally. Write up a vague requirements document, then have the developers implement it any ol' way that seems right. If they're good, or at least social, developers, they'll talk to customers or managers or somebody who can clarify the requirement. A lot of developers will just guess, though. (Combined with receiving fast feedback from a Customer, this is just fine, of course.) But this is why developer/customer communications need to be shared with testers (in a typical software environment) or made part of the process (in a regulated environment, or one with traceability requirements). When it is part of the process, the correct process, I think, is to modify the requirements doc based on the customer communication. This gives testing a chance to update their tests. Dr. Hanna came back many times to the diagram:

Requirement -> Test Scenario -> Test Case -> Script

So if the Requirements are up to date, the tests can be up to date as well.
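One way I picture that chain is as plain data you can walk when a requirement changes; the names below are invented, not anything from the talk.

#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// A bare-bones picture of the Requirement -> Test Scenario -> Test Case ->
// Script chain as data, so a changed requirement can be walked down to every
// script that needs revisiting.
struct TestCase    { std::string name; std::string scriptFile; };
struct Scenario    { std::string name; std::vector<TestCase> cases; };
struct Requirement { std::string id;   std::vector<Scenario> scenarios; };

int main() {
    Requirement req;
    req.id = "REQ-044 PDF report layout";

    Scenario s;
    s.name = "Export a report with measurements";
    TestCase tc1 = { "Single measurement", "scripts/report_single.txt" };
    TestCase tc2 = { "Twenty measurements", "scripts/report_many.txt" };
    s.cases.push_back(tc1);
    s.cases.push_back(tc2);
    req.scenarios.push_back(s);

    // The requirement changed - list everything downstream of it.
    std::cout << "Impacted by " << req.id << ":" << std::endl;
    for (std::size_t i = 0; i < req.scenarios.size(); ++i)
        for (std::size_t j = 0; j < req.scenarios[i].cases.size(); ++j)
            std::cout << "  " << req.scenarios[i].cases[j].name << " ("
                      << req.scenarios[i].cases[j].scriptFile << ")" << std::endl;
    return 0;
}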

I'm not sure that every attendee thought this was the emphasis, but I also went to a couple of talks on this topic.

Thursday, October 12, 2006

IQAA: Quality enrichment conference

I've posted about the IQAA before, and I regret that I haven't been able to make it to their talks regularly. There is a bit of a disconnect between the organizers and the members, I think, because the organizers are creating a fair amount of high-quality content, and I'm not sure that the Indy software testing scene really is vibrant enough to appreciate it. Today I'm attending a free seminar given by Dr. Magdy Hanna of the IIST on Software Testing Discipline and Software Testing Management, and it's very good.

The intent was for Dr. Hanna to give two seminars, one in the morning more or less aimed at testers, and one in the afternoon aimed at test managers, but in practice they all sort of collapsed together. The majority of attendees were there for both sessions, which was good, because they ran together pretty much. Dr. Hanna is a good, knowledgeable, and confident speaker, and when you have one of those you're guaranteed to run over. We got to hear a little more than half of the practices before lunch, and a couple more afterwards, so what was billed as the "afternoon session" started around 2:00. But it covered basically the remaining practices anyway, and around 3:15 he looked up, said, "How much time do we have left?" and burned through the rest of his slides as if they were a kaleidoscope :) I'll put together a few posts over the next few days on my impressions of the conference and speakers. I'm not going to summarize all of the practices he named, just some of the things that made me think. For example,

Practice 1: Requirements are crucial, with a couple of subheaders: You can't test what you don't know, and Users will always change their minds. This was the point when he went all Steve Yegge on us and explained how he was opposed to the agile movement. Of course, as is usual in such cases, we find out that he's not actually opposed to the practices of agile, or at least many of them, but only to calling it agile, or something (I've never been quite clear on exactly what the opposition is to).

I mention this in passing because it seemed to me that those two headers absolutely contradict each other. How do you know what to test, when the users are calling the developers daily with new requirements? But his overall point, I concluded, was that (a) requirements documents should be kept accurate and up-to-date, and (b) they should be your main avenue of communication between developers and testers. I had assumed, when he said he didn't approve of agility, that he wanted nice static requirements docs before testing ever started. This, of course, never happens in the real world. More later.

Thursday, October 05, 2006

WIX, IIS, and CPPUnit Nano

Shoutouts to a couple of pages that have made my life easier in the last few days. We use CPPUnit to run unit tests on some of our VC6 applications, but now it's time to start compiling those applications, and their tests, in Visual Studio 2005. I messed around with trying to get CPPUnit to compile and link in VS2005 for a while, but was unsuccessful; and in any case CPPUnit isn't getting any love from anywhere any more. So what's a unit tester to do? Enter Nano CPP Unit, a little unit testing page with all of the source right there on the page. Copy it into the correct files, change a few other lines, and a bunch of tests were running right off. Very handy.
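For what it's worth, the appeal of a framework that fits on a single page is how little machinery a usable test runner really needs. This is not Nano CPP Unit's actual interface, just a hand-rolled sketch of the general shape: a check macro, some test functions, and a loop.

#include <cstddef>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// The macro assumes a local 'failures' counter is in scope in each test.
#define CHECK(cond) if (!(cond)) { ++failures; \
    std::cout << "  FAILED: " #cond << std::endl; }

static void AdditionTest(int& failures) { CHECK(2 + 2 == 4); }
static void StringTest(int& failures)   { CHECK(std::string("ab").size() == 2); }

int main() {
    typedef void (*TestFn)(int&);
    std::vector<std::pair<std::string, TestFn> > tests;
    tests.push_back(std::make_pair(std::string("AdditionTest"), &AdditionTest));
    tests.push_back(std::make_pair(std::string("StringTest"), &StringTest));

    int failures = 0;
    for (std::size_t i = 0; i < tests.size(); ++i) {
        std::cout << tests[i].first << std::endl;
        tests[i].second(failures);
    }
    std::cout << failures << " failing check(s)" << std::endl;
    return failures;
}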

Second, trying to configure IIS through an installer built with WIX. The docs explain more or less clearly how to set up the custom actions, so I did that using the code below, ran the installer, and...nothing.

<WebSite Id="MyWebServer"
         Description="My Web Server"
         Directory="MyLicenseServer">
  <WebAddress Id="LicenseManagerWebAddress"
              Port="80"/>
  <WebVirtualDir Id="LicenseManagerVirtualDirectory"
                 Directory="MyLicenseServer"
                 Alias="LicenseServer">
    <WebApplication Id="MyLicenseServer"
                    Name="MyLicenseServer" />
  </WebVirtualDir>
</WebSite>


I ran across this Strange Blog entry detailing more or less how to do the same thing, but a comment on the post also mentioned the bit I hadn't seen before: link in the provided sca.wixlib library to set up all the custom action scheduling the way you need it. Thanks to that commenter, the Strange Blog author, and the author of Nano CPP Unit for their help!
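For anyone landing here with the same problem, and assuming I'm remembering the WiX 2 command line right (the file names below are mine, so check the switches against your version), it amounts to handing sca.wixlib to light along with your compiled output:

rem Compile the authoring, then link sca.wixlib in with the resulting object.
candle.exe product.wxs
light.exe -out product.msi product.wixobj sca.wixlib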