Tuesday, March 06, 2007

Quality of Local Political Blogs: Compare and Contrast

I've been writing various articles for Bloomingpedia in the last month or so, in an effort not so much to improve that site as to understand better the town in which I live. One of the keys to understanding a city, I think, is to gather a lot of different perspectives from a lot of different individuals with a stake in the matter. Take Indianapolis, for example: there are a lot of places you can go to get an impression of how the city is doing. Ruth Holladay; Matt Tully. Taking Down Words; Indy Undercover. You have to gather them all together before you can make a critical analysis of what's really going on; but they're there, that's the important thing.

Or you could read the Indianapolis Star. But I don't have much trust in the Main Stream Media. Their goal never seems to be so much the truth as it is finding someone who disagrees, no matter how foolish or inane that person may be, and unless you already know the subject matter pretty well, you can't tell from the way the article is written which is the inane perspective and which is sensible. So that leads you back to blogs.

Here are four local politicians who have been on my mind lately: Marty Hawk, Dave Rollo, Scott Tibbs, Sophia Travis. How easy is it to get their perspectives on local issues?

Far and away the best online writer in this group is Sophia Travis. If you just looked at the MSM, you wouldn't think much of her except that she's a little flaky (an accordion player with political aspirations? Weird!). But when you read her blog, not only is she talking about the tough political issues, but she's following up on comments people leave; leaving comments on other local blogs; sending in questions to local online chats; really being a part of the conversation about what Monroe County is, and what it should be. It would be great if every politician had an online presence like Sophia's.

Second best is Scott Tibbs. I actually started this post thinking about what I don't like about Scott's blog: there's no real comment area on it, just a link to a bulletin board, which I assume is also run by him, and which you have to register on before you can comment. He says that's to avoid spammers, but obviously a lot of bloggers manage to allow real comments without going to that extreme. But the point is, he writes, and discusses, and allows discussion of his views in some form. So I can't take too much umbrage, especially compared to:

Dave Rollo. He's got a web page; it's a start. The page is very static; the main page has a "last updated" date on it, but there's no way to see what was there before. There are only a few paragraphs discussing his views, there's no way to leave public comments, and if he's ever left a comment online I haven't seen it. Start a blog, Dave. He did participate in an online chat recently, and having a web page puts him ahead of:

Marty Hawk. Not much to say here, because I really couldn't find out anything. She gets quoted in the local paper from time to time, and you can go read the minutes of the Monroe County Council meetings and find some things she said. But right now, the number 2 hit on Google when you search for her name is the article I wrote on her last week. So we really don't know too much about her at all. It leaves me defining her, rather than having her define herself. If that's what she wants, then that's fine.

So that's where we are in online local politics in Bloomington. It's a start. But I wish there were a lot more politicians in the conversation.

Thursday, February 22, 2007

Bookplates

(Hey, this is my 200th blog post! And it only took me three years!)



One principle of agile development that doesn't get a lot of attention is Sitting Together. The point of the principle is simple: agility requires communication, and there's no faster communication than shouting over your shoulder to the guy behind you! I think it's a bit overblown; communication is hugely important, but with the advent of instant messaging, not only do you know that Dan down the hall is sitting at his desk, but you even know that Mike down in Dallas is, and they're just as likely to respond to your ten-second query as Jennifer two desks away is. The participants have to be in pretty close time zones, though; Suresh in India just isn't gonna respond to your IM no matter how many times you check his status during the working day!



In my new company we sit together, which is something I've never done anywhere else. I've found that one disadvantage is that my desk doesn't have half the space I need for my programming library, which I like to keep at the office for easier reference. (Okay, so I haven't referred to the Differential Equations textbook since I left the videogame industry. Nevertheless.) So I'm taking over a couple of shelves nearby, but instead of just writing my name in all my books, I thought it would be more fun to make bookplates for them. Here's the design I made:

I'm no graphic designer, but I thought it was OK. If you want to modify it for your own use, feel free; I've made a Word template available for use with the Avery labels that come six to a page; you can get it here, or download the Avery bookplate for the four-a-page labels. Hey, my favorite book site LibraryThing, why don't you provide some of these? I'm sure there are dozens of people who can do better!

Friday, February 16, 2007

Government RSS Feeds

Sophia Travis is a Monroe County Council member - and, remarkably, one who is sophisticated enough to have her own blog. In one post, she asks what information we'd like to see on the Monroe County web pages. I would like to see an RSS feed. Here's why: government business doesn't lend itself well to the regular web page format. The people's business does - the most important things for the web site will always be ways to contact government officials, how to apply for permits, pay parking tickets, vote, and so on. But government business consists mostly of a neverending string of public meetings, each one with an agenda beforehand and minutes afterwards. The best way to present a stream of information like that is with a feed.

For example, I've already created a feed for the Council meetings using the very nice, if complex, Feed43 service. My feed will do the job, giving me an update through my feed reader whenever a new meeting agenda or minutes are posted, but it's pretty content-free, since the feed can't do much except monitor each row of the table of meetings on the page. But suppose the county tech services people set up an easy way to post updates using TypePad or Blogger - suddenly it's easy for them to update the site, and there's a good description of each update in the feed. From there it could be expanded, using the same feed, to give information on other public meetings, notice of events the council members are participating in, and any other kind of information that has a time element. I'm thinking this is actually a time saver, at least for that one poor soul whose responsibility it is to go in and edit the HTML table on the page whenever new meeting minutes are available!
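
Just to make it concrete, a meeting announcement in a feed is nothing more than an item like this - the URL and details here are invented for illustration:

<item>
  <title>County Council meeting agenda - March 13, 2007</title>
  <link>http://county.example.org/council/agenda-2007-03-13.html</link>
  <pubDate>Tue, 06 Mar 2007 09:00:00 EST</pubDate>
  <description>Agenda posted for the regular session: budget amendments,
    road contracts, and a public comment period.</description>
</item>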

What would you like to see on your local government web site?

Thursday, February 08, 2007

Change is good

I've accepted a new job with Envisage Technologies, a small software company in Bloomington. I'm excited about it as it's a company with a firm interest in agile principles:

Do ideas by the Gang of Four, Steve McConnell, Martin Fowler, Tom DeMarco and Kent Beck resonate with you? Join an experienced team of developers in an Agile environment...

So I'm no longer working in Indianapolis for the first time in more than ten years - I'm not sure what I'm going to do with all the extra time!

(I've also set up a LinkedIn account as per Guy Kawasaki's suggestion. Drop me a line if you want to connect to me.)

Tuesday, February 06, 2007

Prius anti-skid props

The problem with having six inches of snow dumped on us is that the hill that leads to our house has a slope that isn't quite a vertical wall, but that's pretty close. So as I was heading home my wife assured me she'd seen the plow go by, and I decided to take the hill, which in good weather would be five minutes, as opposed to going the long way round and taking half an hour.

Up I started, accelerating to about 25 MPH and getting at least 30 or 40 yards before realizing that the plow hadn't been by recently enough to make a difference. It was easily the worst snow I'd ever tackled on the hill before, and it's not fun having to back down that slope, let me tell you. Especially with the literal vertical drop on the side that sends you ten feet straight down before the drop is conveniently stopped by a tree.

But here's what the Prius does, straight from the brochure:

Motor Traction Control (TRC) – TRC uses sensors which automatically apply the brake to any slipping wheel while delivering more power to the wheels with greater traction.
Vehicle Stability Control (VSC)* – VSC senses oversteer (tail slide) and understeer (nose pushing forward), and manages the power delivered to each wheel.


It was a beautiful thing. I kept the accelerator right around 25 and the car took over from there. It never slipped sideways, never fishtailed, and actually applied acceleration to the wheels in bursts of a couple of hundred milliseconds at a time, followed by coasting to grab what little traction it could, and then accelerating again, and I was at the top of the hill as nice as pie. I only felt guilty for not stopping the cars I passed and telling them, "Your car got TRC? Got VSC? Then DON'T try the hill tonight! Just because my car can do it doesn't mean yours can!" What a beautifully engineered vehicle.

Thursday, February 01, 2007

WiX installer and Error 2708 (No entries found in the file table)

I had to add different versions of a file to my installer today. Seemed easy enough to do - the files were available in the right places and everything, so I added a version number on to the old file ID, added a new file with the new version number, and called it a day, right?

Not so fast. Compile up the install and run it:

Error 2708: No entries found in the file table.

Say what? Must have been a file system glitch. Open up the MSI with Orca and check the file table; well, yes, it has lots of entries, no trouble there. What's going on here?

Buried deep in the search results for the error code I found this page. The comment from Jane D pointed out that she'd seen this error while having problems with the DuplicateFile table rather than the File table - and that jogged my memory. In a separate component I had a CopyFile element that was pointing to my file, and it still had the old file ID reference, now orphaned. Update the reference, recompile, and bingo. Working install.
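
To illustrate (with made-up IDs - this isn't our actual installer source), the problem boiled down to something like this:

<!-- The file element had been renamed, say from Id="MyFile" to Id="MyFile_2_0",
     but a CopyFile element in another component still pointed at the old Id,
     which is what produced Error 2708 at install time: -->
<CopyFile Id="CopyMyFile" FileId="MyFile" DestinationDirectory="BackupDir" />

<!-- Updating the orphaned reference to the new Id fixed it: -->
<CopyFile Id="CopyMyFile" FileId="MyFile_2_0" DestinationDirectory="BackupDir" />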

I see this as a bug in the WiX linker: why did it build the MSI with this unresolved reference? I'll have to post something to the mailing list at some point.

Thursday, January 25, 2007

Credibility redux

Y'know, I'm not affiliated with Microsoft at all, although I own some stock and use their products happily. So I'd like to think that when I say something like, this Microsoft employee has credibility, it's because the employee actually has credibility and not because I'm biased or brainwashed towards the Borg.

I've been a loyal subscriber to Dare Obasanjo's blog for at least a couple of years now, and a happy user of RSS Bandit, although I'm now drifting toward Google Reader for its mobile capabilities. So when I read his article about changing Wikipedia I didn't think much about it; mildly interesting but not a big deal, and his changes in the TechCrunch entry certainly deserved reverting under the Wikipedia "No experimenting" clause. But Michael Arrington's reaction was out of line:

A Microsoft employee, who took issue with this blog post, vandalized the TechCrunch Wikipedia entry and wrote about it on his blog.

That is a misuse of the word "vandalized," no matter how far you stretch the imagination. Dare added maybe a couple of sentences with a dry, unemotional tone. He put up an apology in the comments, too, but in two or three comments (which have now disappeared) Arrington repeated the vandalism charge, and he's showing no signs of backing down. IMO, there is a serious credibility cost in repeating an emotionally charged word like that in response to some rather minor issues. I'd never heard of Arrington before, or read TechCrunch. This little flap doesn't make me want to, either. Michael Arrington joins Andrew Orlowski in my credibility book.

Wednesday, January 24, 2007

B2B 2.0

Chris has left LibraryThing. I don't know him, but apparently he's done some really good work for one of my favorite Web 2.0 plays: a social site based around book collecting.

There are lots of definitions of Web 2.0, but at least one of the principles that seems to define it is "Online Community". Flickr, YouTube, Yahoo! Answers. Online communities have been around since the beginning, of course, at first through mailing lists and NNTP servers, later through applications and, eventually, web sites. When we at Sunstorm were working on a version of Deer Hunter that was going to have a multiplayer mode - we had only the vaguest idea how that might work - I went to a seminar at the Game Developer's Conference on the topic of building online communities. We did a little work towards it; our web site ran some decent forum software, but in 1998 the Deer Hunter target market did not actually overlap much with people who spent a lot of time online, which was a real handicap.

At least we had a good size target market. Combine the lush outdoor scenery of Deer Hunter 3 with a visionary concept of online communication, and we might have had our own version of Second Life on our hands, five years before anyone else. But of course, we didn't have the vision thing. It's still the hardest part of launching a consumer oriented web site. Wal-Mart tried it. Xanga was hip for a while. Not much there there, now.

But what about an online community as part of a B2B play? Not a corporate MySpace, but a self-selecting group made up of users of your product. If your target is geeks you might have a leg up here; Kinook has a nice online forum. Axosoft has forums and bloggers. The forum we put up at Interactive Intelligence seems to be buzzing along nicely. When I was there the customer base was very technical; that may be less true now that it has grown. But I think an actual, product-based online community is very workable for a business-to-business company. More later.

Thursday, January 18, 2007

The blogging split

I got into a discussion with someone concerning the blogosphere and its effect on corporations today. Here's something I hadn't really thought about in a couple of years: when was the last time someone was dooced? It's been a while, I think; or at least when it does happen it's not much of a story and the mainstream media doesn't pick it up.

So what happened? Did everyone just sort of "get it"? I would say more that the world is sort of partitioning itself off now. On the corporate side, corporations are splitting into sort of "New Media" companies, Microsoft and Sun, where bloggers are allowed almost free rein, and "Old Media" companies, Wal-Mart say, or GM, where they feel it's very important that the company try to keep absolute control of its image and don't allow their employees much say. That's not to say there isn't crossover; I understand one Microsoft division wanted Robert Scoble fired after he said something critical about the company, while GM actually has a blog...a rather corporate-oriented one, to be sure, but it does allow comments and they don't appear to censor them for criticizing the company.

On the other hand bloggers, or better I should say people, are splitting off as well. You see a lot of blogs around where someone started the blog, posted a few things, then apparently dropped off the face of the earth. Or possibly they write an article once a month or so apologizing for not blogging more and promising to do better from now on. Hey, blogging is hard, and most of us aren't getting paid for it. I've been known to go a month or two without posting. So there's more of a split between people who blog and people who read.

So I suspect what's happening is that people who blog are moving over to work for companies that support blogging! Maybe not a momentous insight, but I can't think of anyone else who's come out and said it. People who don't blog can stick around with the companies that are trying so hard to control their messages. That's why, I suspect, you haven't heard much noise about doocing recently. People have sorted out where they belong; companies have clearer policies about what they expect, and employees have a clearer understanding of what they're looking for.

(If they don't, I guess they'll have to buy Shel Israel's new book to clarify things.)

Wednesday, January 10, 2007

Building a cathedral

A traveler visited a city where many stone cutters were working. Approaching several, he asked the same question: "What are you doing?" The first stonecutter he met replied, "I'm cutting stone. It's dull work, but it pays the bills." A second stonecutter responded, "I'm the best stone cutter in the land. Look at the smoothness of this stone, how perfect the edges are." A third pointed to a foundation several yards away, and said, "I'm building a cathedral."

Thanks to Grady Booch and Joe Marasco for the story!

Wednesday, December 13, 2006

New Technology High School

This High School is a model for high tech high schools that's being pushed by the Gates Foundation and probably some other places, and they're thinking about trying one out in Bloomington, or at least they've gotten 50 G's in order to study the possibility. Some of the money went to bringing in a speaker to discuss the possibility, so along with a couple of hundred other people, I went to check it out. My initial reaction is, Groovy. I hated high school, and it's only ten years from now that my son will start it. I'd like him to have some choices about where to go, and this model seems pretty nice. They want to keep it to 400 students, as opposed to the thousand or so at each Bloomington high school now. I've commented elsewhere about not really understanding the Indiana charter school system; I suppose this would be one of those.

I'm hearing some contradictory things about the school, though. For example, a questioner asked last night about the per-student cost of the school. The response was that the school doesn't get any more from the state than any other school would, and that technology was the biggest expense. But the little handout we got actually says, "small school and class size allows students to take responsibility for their own learning"...so I wonder which it is. I'd guess that any school would find that graduation rates correlate inversely with class size. Also, two separate articles in the paper (subscription required) tell us that the school (a) caters to students headed for the job market, and (b) sends most of its graduates on to higher education. What the heck does that mean?

The speaker explained a little bit of what the school was about; all very nice; focus on communication skills and working as a team, computers for everyone, community internships. I think you can have two kinds of high schools: the kind where kids are motivated and enthusiastic about doing stuff, and the kind where the kids are biding their time until they can get out and go do something else. When you have the first kind, the students are going to be self-selecting - they have to want to go to the school. This is why I think charter schools and school choice are good ideas. So for that reason alone I think this school would be a good idea.

But the audience had a lot of good questions; some sublime; some ridiculous; all very practical. The inevitable "What about sports?" question was asked, which of course really means, "What if my kid wants to go there but he's also a basketball star?" The responder didn't really pick up on that dynamic, mentioning that the schools in California play Ultimate Frisbee against each other. Yeah, great. But the local guy did mention that allowing the students to play on the big school teams was a possibility.

A lot of the questions made me think, though, that either by state law or by educator attitudes, the school system isn't really ready to shift paradigms. I don't necessarily blame them; it's not an easy thing to do. But there were questions about honors degrees and demographics. The California panelist pointed out that an honors degree is a pretty divisive thing, and how can you teach teamwork in that sort of environment? The local panelist said that he thought the demographics would have to mirror those of the local high schools, so this school would have the same proportion of special needs students, minorities, gifted students, etc. I don't see how they can do that and still have the students be self-selecting, not to mention I find it extremely irritating when people are classified into "black," "poor," or "special ed" groups, even when the goal is to create balance.

So there's plenty to think about still. But I hope they do it. And if I'm still around town in ten years, I'll probably be pushing my kid to go there. If you see a chance, take it.

Thursday, October 26, 2006

The Guerrilla Guide to Interviewing

Joel's latest article on interviewing is up. It makes some good points, although he continues with the down-to-the-metal idiosyncrasy that I posted about last year. But here are the things that I thought were really good points:
  • Hire/No-Hire. Make a decision. If you don't know, the answer is No Hire. I've run into this before when interviewing an entry-level guy for a position that required more skills than that. We recommended he be hired for Support instead. I'm still not sure whether that was the right decision, but as a principle I like this one.
  • You want people who are smart, and who get things done. Joel describes people who fail at one or the other, and I think I've worked with most of them before.
  • A programmer should understand pointers, and recursion. Joel comments that a lot of people are coming out of school without learning a language that requires pointers, which is a problem. Less so with recursion. He says that pointers are an aptitude rather than a skill. (I've sketched the classic sort of exercise just below this list.)
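
Here's my own sketch of the kind of thing interviewers use to probe both at once - nothing from Joel's article, just the classic reverse-a-linked-list exercise:

#include <cassert>

struct Node {
    int value;
    Node * next;
};

// Classic interview exercise: reverse a singly linked list recursively.
// It exercises pointer manipulation and recursion at the same time.
Node * Reverse( Node * head ) {
    if( head == 0 || head->next == 0 )
        return head;                        // empty list or last node: already reversed
    Node * newHead = Reverse( head->next ); // reverse everything past the head...
    head->next->next = head;                // ...then hook the old head on at the end
    head->next = 0;
    return newHead;
}

int main() {
    Node c = { 3, 0 };
    Node b = { 2, &c };
    Node a = { 1, &b };
    Node * r = Reverse( &a );               // list is now 3 -> 2 -> 1
    assert( r == &c && c.next == &b && b.next == &a && a.next == 0 );
    return 0;
}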

At the end he says, confidently,

If your resume and phone-screening process is working, you’ll probably have about 20% hires in the live interview.

True at Fog Creek, no doubt. I've not really seen it here in Indianapolis, where the local talent pool is so small. But you never know, we might get lucky!

Want a job at an up-and-coming medical imaging company? Drop me a line!

Wednesday, October 25, 2006

Bloggers are people too

Ordinarily for me, reading and writing blogs is an intellectual exercise. I'm more comfortable and interested in discussing software processes, languages, testing than I am with the emotional appeal of the typical network news sob story. So when worlds, and cars, collide, I find it affects me especially. I envision my own five-year old son coming home from his karate class and I feel like crying. Good luck to you, Nick and Josh.

Tuesday, October 17, 2006

IQAA: Regression Testing

Dr. Hanna's Practice #8: Perform regression testing that is based on impact analysis. This is the first practice in the list that you can't just nod your head at, since you have to have a good idea of what regression testing is (I do) and what impact analysis is (I didn't). But I did like Practice #7: Testers should attend design and code reviews. It's not something I had heard before, but it's obviously a good idea if you are interested in facilitating communication within your company.

So what should a tester do at a code review? Primarily they will want to come up with test ideas; examine the code paths; ask how each one can be exercised. But also they can ask a very fundamental question: What other parts of the code is this project going to affect? This is an impact analysis. If I remember correctly, it was recommended that this analysis be done formally, as in developers have to write up a statement or report analyzing what other parts of the product will be affected. Not a bad idea, but probably not for smaller companies like Prosolv.

So based on the Impact Analysis, testers should be able to come up with a set of requirements that need to be retested, and there's your regression suite. Of course, every build that goes to testing should be tested on the critical path (or as I prefer, the "Happy Path"). Dr. Hanna suggested a 90% pass goal, but I'm not sure why that should be. Some tests will be showstoppers, others will be...well, whatever. I suppose if you have more than 10% "whatevers" failing, you've got an issue, though.

Just a couple of other notes:

- Regression testing doesn't do any good if you do it at the beginning of a project - it is certainly to be hoped that there will be few failures then!

- Impact analysis is also necessary when a requirement is changed. Go to a developer if necessary!

- Which led to the question, what if the developer doesn't know? Dr. Hanna's response: Find new developers ;)

Monday, October 16, 2006

IQAA: Integration Testing

Dr. Hanna's Practice #3: Test for both functional and quality requirements. I would have thought state charts and truth tables were familiar to everyone, but I think the typical Indianapolis tester has a lot less experience than, say, one in San Jose, so there were a lot of people they weren't familiar to. But Dr. Hanna had some good advice on turning a requirement, written in English, into a model by splitting it up into actions and results. He took a typical requirement statement, teased four predicates and four consequences out of it, and showed the truth table that resulted, with the 16 possible states of the predicates and the expected consequence of each. I was a bit itchy here, because I always feel that you can go through and test your application like this, and then go through and test a separate bit of your application in a way that contradicts it. To help my understanding, I went on the second day to a talk on integration testing, which I thought would more or less cover my confusion. You've got one requirement, you've got another requirement, testing them both is integration testing, right?

Well, no. Integration testing is the actual bit where you take two components of the system and make sure they talk to each other properly. Testing the input/output of one component is mostly a unit test, since they usually are easily testable and verifiable based on the automated testing that should have been written by the developer. But you need integration testing to avoid the "operation was successful but the patient died" phenomenon, where the interface of component A is not clearly understood by the developer of component B, so he writes and tests a very nice component that doesn't do at all what component A expects.

But with that clarification, I guess I see the real issue: the thing that was worrying me is two contradictory requirements. Given a clear requirements document, that's no longer the tester's problem, and it's no longer the developer's problem. It's a business problem, and someone with knowledge of the problem domain is required to resolve the contradiction, which allows us to update the requirements doc, and guess what - now the testers can redesign their test plan and the developers can redo their code.
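
To tie this back to the truth-table exercise from the first day, here's a quick sketch of what I have in mind. The requirement is made up (it's not Dr. Hanna's example), and a stub stands in for the real code under test:

#include <cassert>
#include <cstdio>

// Hypothetical requirement: an order ships free if the customer is a member
// AND the total is over $50, OR a promotion is running - unless the address
// is international, which always disqualifies it.
static bool ExpectedFreeShipping( bool member, bool overFifty, bool promo, bool intl ) {
    return !intl && ( ( member && overFifty ) || promo );
}

// Stand-in for the application code being tested; a real test would call
// into the product here instead.
static bool SystemUnderTest( bool member, bool overFifty, bool promo, bool intl ) {
    return ExpectedFreeShipping( member, overFifty, promo, intl );
}

int main() {
    // Walk all 16 rows of the truth table formed by the four predicates and
    // compare the actual consequence against the expected one.
    for( int row = 0; row < 16; ++row ) {
        bool member    = ( row & 1 ) != 0;
        bool overFifty = ( row & 2 ) != 0;
        bool promo     = ( row & 4 ) != 0;
        bool intl      = ( row & 8 ) != 0;
        assert( SystemUnderTest( member, overFifty, promo, intl )
                == ExpectedFreeShipping( member, overFifty, promo, intl ) );
    }
    printf( "All 16 truth-table rows pass.\n" );
    return 0;
}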

Here's a list of books recommended at the conference.

Friday, October 13, 2006

IQAA: Changing Requirements

The whole conference I was at this week, for me, revolved around the requirements management process. Partly because many companies I've worked at had trouble with this part of the process; that is, they followed the process of creating a requirements document, then shut it away in a drawer and never looked at it again, while developers and testers went along their merry way and coded up a mishmash of the requirements, what they think the requirements might mean, and any customer requests that aren't too difficult and/or are from important customers.

But I think a main thrust of Dr. Hanna's talk was that the requirements document is very important. I'm used to this very static, dull requirements document, and so I kept wanting to raise my hand and say, "How can you do that when the requirements phase is already complete?" But I have to conclude that he doesn't think it is static at all, and that it has to be dynamic and updated continually. (It was interesting that he said several times that testing is a process, not a step in the process, but he never said requirements were too.)

The typical software company tends to communicate rather informally. Write up a vague requirements document, then have the developers implement it any ol' way that seems right. If they're good, or at least social, developers, they'll talk to customers or managers or somebody who can clarify the requirement. A lot of developers will just guess, though. (Combined with receiving fast feedback from a Customer, this is just fine, of course.) But this is why developer/customer communications need to be shared with testers (in a typical software environment) or made part of the process (in a regulated environment, or one with traceability requirements). When it is part of the process, the correct process, I think, is to modify the requirements doc based on the customer communication. This gives testing a chance to update their tests. Dr. Hanna came back many times to the diagram:

Requirement -> Test Scenario -> Test Case -> Script

So if the Requirements are up to date, the tests can be up to date as well.

I'm not sure that every attendee thought this was the emphasis, but I also went to a couple of talks on this topic.

Thursday, October 12, 2006

IQAA: Quality enrichment conference

I've posted about the IQAA before, and I regret that I haven't been able to make it to their talks regularly. There is a bit of a disconnect between the organizers and the members, I think, because the organizers are creating a fair amount of high-quality content, and I'm not sure that the Indy software testing scene really is vibrant enough to appreciate it. Today I'm attending a free seminar given by Dr. Magdy Hanna of the IIST on Software Testing Discipline and Software Testing Management, and it's very good.

The intent was for Dr. Hanna to give two seminars, one in the morning more or less aimed at testers, and one in the afternoon aimed at test managers, but in practice they all sort of collapsed together. The majority of attendees were there for both sessions, which was good, because they ran together pretty much. Dr. Hanna is a good, knowledgeable, and confident speaker, and when you have one of those you're guaranteed to run over. We got to hear a little more than half of the practices before lunch, and a couple more afterwards, so what was billed as the "afternoon session" started around 2:00. But it covered basically the remaining practices anyway, and around 3:15 he looked up, said, "How much time do we have left?" and burned through the rest of his slides as if they were a kaleidoscope :) I'll put together a few posts over the next few days on my impressions of the conference and speakers. I'm not going to summarize all of the practices he named; just some of the things that made me think. For example,

Practice 1: Requirements are crucial, with the couple of subheaders: You can't test what you don't know, and Users will always change their minds, and this was the point when he went all Steve Yegge on us, and explained how he was opposed to the agile movement. Of course, as is usual in such cases, we find out that he's not actually opposed to the practices of agile, or at least many of them, but only to calling it agile, or something. (I've never been quite clear on what exactly the opposition is to).

I mention this in passing because it seemed to me that those two headers absolutely contradict each other. How do you know what to test, when the users are calling the developers daily with new requirements? But his overall point, I concluded, was that (a) requirements documents should be kept accurate and up-to-date, and (b) they should be your main avenue of communication between developers and testers. I had assumed, when he said he didn't approve of agility, that he wanted nice static requirements docs before testing ever started. This, of course, never happens in the real world. More later.

Thursday, October 05, 2006

WIX, IIS, and CPPUnit Nano

Shoutouts to a couple of pages that have made my life easier in the last few days. We use CPPUnit to run unit tests on some of our VC6 applications, but now it's time to start compiling those applications, and their tests, in Visual Studio 2005. I messed around with trying to get CPPUnit to compile and link in VS2005 for a while, but was unsuccessful; and in any case CPPUnit isn't getting any love from anywhere any more. So what's a unit tester to do? Enter Nano CPP Unit, a little unit testing framework with all of the source right there on the page. Copy it into the correct files, change a few other lines, and a bunch of tests were running right off. Very handy.
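
I won't reproduce the Nano CPP Unit source here, but the whole source-on-the-page idea boils down to something like this stripped-down sketch of my own (not the actual NanoCppUnit code):

#include <cstdio>
#include <vector>

// Tests register themselves in a global list at static-init time,
// and main() just runs everything in the list.
typedef void (*TestFn)();
static std::vector<TestFn> & Tests() { static std::vector<TestFn> t; return t; }
struct Register { Register( TestFn f ) { Tests().push_back( f ); } };

#define TEST(name) \
    static void name(); \
    static Register reg_##name( name ); \
    static void name()

static int g_failures = 0;
#define CHECK(cond) \
    if( !(cond) ) { printf( "FAILED: %s (%s:%d)\n", #cond, __FILE__, __LINE__ ); ++g_failures; }

TEST( Addition )    { CHECK( 2 + 2 == 4 ); }
TEST( Subtraction ) { CHECK( 5 - 3 == 2 ); }

int main() {
    for( size_t i = 0; i < Tests().size(); ++i )
        Tests()[i]();
    printf( "%d failure(s)\n", g_failures );
    return g_failures == 0 ? 0 : 1;
}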

Second, trying to configure IIS through an installer built with WIX. The docs explain more or less clearly how to set up the custom actions, so I did that using the code below, ran the installer, and...nothing.

<WebSite Id="MyWebServer"
Description="My Web Server"
Directory="MyLicenseServer">
<WebAddress Id="LicenseManagerWebAddress"
Port="80"/>
<WebVirtualDir Id="LicenseManagerVirtualDirectory"
Directory="MyLicenseServer"
Alias="LicenseServer">
<WebApplication Id="MyLicenseServer"
Name="MyLicenseServer" />
</WebVirtualDir>
</WebSite>


I ran across this Strange Blog entry detailing more or less how to do the same thing, but a comment in the post also mentioned the bit I hadn't seen before: Link in the provided object file sca.wixlib to set up all the custom action scheduling the way you need it. Thanks to that commenter, the Strange Blog author, and the author of Nano CPP Unit for their help!
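
For anyone else hitting this, the fix amounts to handing sca.wixlib to light along with your object file. With made-up file names, the build step looks something like this:

candle.exe Product.wxs
light.exe -out Product.msi Product.wixobj sca.wixlib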

Friday, August 04, 2006

Evidence open source project

I got a CodePlex project approved, whee. I'm pretty sure I must have been grandfathered in since I submitted my request a couple of weeks ago - it's just an idea in my mind, not an actual project, and therefore it probably doesn't come under the latest requirements:

Most project requests that we approve have two or more project admins, two or more committed developers, and a recent history of active check ins, opened and closed work items, and at least one release. We sometimes make exceptions for individual project applicants (with or without code) who have a proven history of success in creating successful online development projects, or startups.

Uh, yeah, that's me all right. But, my project idea has to do with genealogy and collaboration; it's an attempt at raising the standards of online genealogical research. As I wrote on the project wiki:

The typical Internet genealogical researcher today works as follows:
  1. Search on genealogy sites for published databases that have matches for someone already in their database
  2. Copy the information to their database
  3. Publish the database

So there's a lot of room for improvement. I'll write more about my ideas soon - right now I have to go figure out how one checks in code using Team System...

Drop me a line if you're interested in the project!


Wednesday, August 02, 2006

Credit card frauded

Crud! I got a call from MBNA to verify some "suspicious activity" on my credit card, and sure enough someone managed to get to their online banking site, use my account number to log in, change my address to somewhere in Maryland, and buy an $11,000 computer from Dell.

Luckily Dell was on the ball and asked MBNA to verify the purchase, so the account will be closed immediately. Still, I wonder how they came up with the number? A little scary; but changing that account number is something I should have done years ago, as I use it for way too much stuff. (Which is exactly why I haven't changed it.)

When I called MBNA they asked for my mother's maiden name as verification. I bet that's easy to find - wonder if the online banking site used that same security?

Hopefully changing the account number is the end of it. Then I need to start splitting up my accounts: one credit card for online purchases, one for monthly charges, one for gas, etc. Be careful out there!

Monday, July 31, 2006

Death of NDOC

I caught the news first from Bill Wagner's blog that NDoc 2.0, the documentation tool for .Net developers, was losing its main developer and motivating force, Kevin Downs. (And incidentally, I never saw anything useful on either Digg or Technorati about it. Simple Google is still the best place to look.) Here's what Kevin had to say in an email that was quoted in dozens of blogs:

As some of you are aware, there are some in the community who believe that a .Net 2.0 compatible release was theirs by-right and that I should be moving faster – despite the fact that I am but one man working in his spare time...

This came to head in the last week; I have been subjected to an automated mail-bomb attack on both my public mail addresses and the ndoc2 mailing list address. These mails have been extremely offensive and resulted in my ISP temporarily suspending my account because of the traffic volume.


The standard line of bloggers has been, more or less: What a shame, what a loss to the community, why aren't these mailbombers contributing, that's what happens to open source projects.

It certainly is a big loss. But to be honest, I don't see it as a huge deal. Bill sees it as a problem with the whole open source software model, which I disagree with - I think the Asterisk project is one counterexample. The email, to me, has a bit of a defensive tone, like the writer's lost all his enthusiasm for the project and is looking for an excuse to get out of it. (I've sure been in that position, and it's got nothing to do with open source!) Is NDoc really that heavily used? Doxygen has the advantage of working with more languages, so it's my preferred tool, but I would think if there are that many people interested in using it, surely someone can step up as a new administrator, even if the project languishes for a while. And a mailbomb attack? Do those really still work? I would have thought any administrator would have been able to block some IP's and stop it. I guess it was the product of someone's bot army; but that brings up another point: anyone can launch a mailbomb or DOS attack. You can make one person mad online, even for a perceived rather than an actual insult, and the attack can come. If you're a small organization, you just have to weather the storm and move on.

I'm not saying Mr. Downs made the wrong decision; far from it. It's his life and his work and we should be grateful for whatever he is willing to donate to the community. But let's accept it and move on without getting huffy about it.

Oh, and maybe I better see if Doxygen could use any extra coders...




Customer Affinity and UI design

Martin Fowler discusses the importance of being attuned to the business side of software development. I especially liked this quote:

I've often heard it said that enterprise software is boring, just shuffling data around, that people of talent will do "real" software that requires fancy algorithms, hardware hacks, or plenty of math. I feel that this usually happens due to a lack of customer affinity.

I've heard this too, in spirit at least, and one of the reasons is that those people of talent don't believe that UI design is "real" software. Of course, the place you have the most opportunity to affect how the customers work, and whether they enjoy your software, is the user interface. In the last few years, UI design has started to gain a little more respect in the community, but the fact remains that it is one of the areas of software design that remains an art rather than a science. What are your favorite sites for discussing UI design?

Tuesday, July 18, 2006

Finding holes in the process

Ever done a process review? It's one of those things that gets done, formally or informally, when a software company is trying to grow from small to large. In my experience, the most likely way it happens is, a manager or two or three get together and decide on some tool that they like, or have used before, and that they think would be useful for source control, or bug tracking, or building, and then they pass the edict down to the programmers: "Okay guys, from now on we use OnTime for all bug reports." The programmers nod politely and get on with the business at hand, and may even enter a few things into OnTime if they remember.

In a few months, the managers realize that nobody's paying much attention to OnTime, and they go and bug the programmers. "Hey guys, let's use this bug tracker, ok? We paid a lot of money for it." The programmers start entering a few more things into OnTime, if they remember, but they grumble about it. Why waste time on this busywork, they think? The programmers aren't happy, the managers aren't happy, and communication is breaking down badly.

How do you avoid this? Don't just nod politely when the tool is introduced; attack it. Of course, if it's a tool you've not used before, you won't be able to see what any weaknesses are. But try to understand the workflow. Bug the manager until he makes it clear for you. He'll probably end up saying something like "Each bug goes from Entered to Accepted to Fixed to Tested to Released".

That's a pretty standard workflow. But now you can start to poke holes in it. Has anyone thought through the failure steps?

"Okay, so what if it's a bogus bug? I'm not going to accept it then."

"Hmm, that's true. Maybe we should add a Rejected state."

"Sounds good. What if Testing fails?"

"Umm, the test group should just set it back to Entered, and it can cycle through again."

"Okay, but what if that happens the week before the release? Do we need to put off the release until the bug gets fixed? Or can we hold off on it until the next release?"

"Ummm..."

Processes tend to break down around the failure points. If every bug took the path Find/Fix/Test/Release, software development would be very simple, and the workflow would be completely linear. But at every step of the line, it needs to be clear what will happen on a failure. Does it go back to the previous step? Farther? Can we ignore it? A clear workflow with known failure paths will go a long way towards making any software project smoother.
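
To make that concrete, here's a rough sketch - my own invention, not any particular bug tracker - of what it looks like when the failure paths are spelled out as explicit transitions:

#include <map>
#include <set>

// Bug workflow with the failure paths made explicit: Rejected handles the
// bogus-bug case, and a failed test sends the bug back around the loop.
enum BugState { Entered, Accepted, Rejected, Fixed, Deferred, Tested, Released };

typedef std::map< BugState, std::set<BugState> > Workflow;

Workflow MakeWorkflow() {
    Workflow w;
    w[Entered].insert( Accepted );
    w[Entered].insert( Rejected );   // bogus bug: nobody has to accept it
    w[Accepted].insert( Fixed );
    w[Fixed].insert( Tested );       // testing passed
    w[Fixed].insert( Entered );      // testing failed: cycle through again
    w[Fixed].insert( Deferred );     // failed the week before the release: hold it
    w[Deferred].insert( Entered );   // picked back up for the next release
    w[Tested].insert( Released );
    return w;
}

bool CanMove( const Workflow & w, BugState from, BugState to ) {
    Workflow::const_iterator it = w.find( from );
    return it != w.end() && it->second.count( to ) > 0;
}

int main() {
    Workflow w = MakeWorkflow();
    // A failed test can push a bug back to Entered, but nothing is allowed to
    // jump straight from Entered to Released.
    return ( CanMove( w, Fixed, Entered ) && !CanMove( w, Entered, Released ) ) ? 0 : 1;
}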

Thursday, July 13, 2006

Build part 3

Looks like the build is finally up and running, and we've completed a few builds that testing seems to approve of. I finally moved the virtual machine over to a machine with a decent amount of power behind it and that got things to pick up a little; but certainly we were far from the James Shore ideal of being able to download and build immediately...at least not after I implemented my idea of moving source code to a different drive so the C drive could be more easily restored if needed. It took quite a bit of time to finally dig out all the references to c:\Prosolv\build and replace them with environment variables!

But our mishmash of Ruby scripts is going again. We have our summer intern working on a new build process: he's evaluated various tools and chosen one called Visual Build, which we'll move to at some point, when he's declared it ready.

My friends Andy and Sushil have both just had babies. Congratulations, you guys!

Thursday, June 29, 2006

Build machine still dragging

(Backstory here.) Came in the next morning; VS2005 is still installing. Curse. The bright side was, as I watched it, it switched from actually installing VS2005 to installing one of the CE framework packages that I really didn't care about. So, I spent some time unsuccessfully trying to get it to cancel out at that point, and eventually was able to get the machine to start shutting down, which allowed me to kill the VS install. How that will affect the machine, I'm not sure. So, around 9AM the machine started shutdown - then, the automatic updates kicked in. Thirty of them. Curse. So here it is, about three hours later, and just about half the updates are complete.

This is not a good circumstance when trying to get a release out.

Wednesday, June 28, 2006

The perils of slow build machines

We had a hard drive die on our build machine. Not to worry; as we learned from the rubber chicken, source code should be buildable and shippable anywhere, anytime. But then, I don't have a great deal of trust in that ideal concept, so we decided to take advantage of the situation to create a virtual build machine instead of a real one. Here are roughly the steps I followed:

  1. Install Virtual PC.
  2. Grab an existing hard drive image with Windows XP SP2 and copy it.
  3. Install a couple of things on it; then attempt to install Visual Studio 2005.
  4. It blows up with an error. Huh?
  5. Try it again, same error.
  6. Realize that the image is limited to 4 gigs, and VPC doesn't allow modifying the existing size of a virtual disk, as far as I can tell.
  7. Curse.
  8. Create a brand new image, and install XP SP2 on it. This process takes plenty of rebooting, and "Press Enter to continue" style dialogs; not to mention several hours just to copy all the files.
  9. Install Visual Studio 6; hopefully it will be quicker and we'll need it for some legacy stuff anyway. Many more reboots, but eventually it's installed.
  10. My boss comes by and asks how much longer it will be until the next build.
  11. Curse.
  12. Attempt to install VS2005 again. Many more reboots.
  13. My boss comes by again and tells me he's arranged for a much faster machine with more memory. Cheer.
  14. It's around 5:00 that day, so I decide to leave the VS2005 install running overnight, then I can transfer the virtual machine to the new machine in the morning.
  15. Come back the next morning. VS2005 is still installing.
  16. Curse.
  17. It looks like it's nearly done, though. Hopefully it's within an hour or two of finishing and I can move it over to the faster machine.
  18. Wait eight hours. VS2005 is still installing.
  19. Curse.

I'll leave soon. Hopefully VS2005 won't take 48 hours to install, and I'll be able to get back to it in the morning. We're now at eight days without a new build. Curse.

Monday, June 26, 2006

Death of Agile

Jonathan Kohl writes on the value of pragmatism, as opposed to process zealotry, and asks what we think. Jonathan, I think you should enable comments on your blog :) But I'll do a quick post here instead. I'm not sure whether I agree or disagree. Absolutely you should use whatever works for your project; I have no issue with that. But I have a lot of trouble imagining a project where I would say, "In this situation, writing unit tests would be a very bad idea" or "It's clear that we should not have a daily build for this project. One a month, absolute max."

In other words, the point of agile processes is that they are good processes. You use them because they are unquestionably an advantage to your project. Maybe I'm a zealot. Is there an argument to be made against unit tests? To me, the whole zealotry issue comes across like saying, "Sure, I really like transistors, but hey, if vacuum tubes are what your stereo requires, you go right ahead and use them!"

Friday, May 12, 2006

Internship available

If you are a student in a computer-related field at an Indiana college and looking for a summer internship, drop me a line with a resume and I'll see that it gets to the right place. Prosolv is a medical software company in Indianapolis.

Wednesday, April 05, 2006

Podcast list

With basketball season finally over, I plan on updating this blog more often. I have a couple of series ideas in mind: first, I'm looking into presentation systems for 3D medical graphics; ultrasounds, for example, and I'm very interested in the Visualization Toolkit (VTK). I've not been successful in importing any of our own sample DICOM sets yet, though, so I need to poke around and try to find some online that will work.
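
If I ever get a data set that imports, my reading of the VTK docs suggests the loading step itself should be short - something like this sketch, which I haven't run yet, with an obviously made-up directory path:

#include <vtkDICOMImageReader.h>
#include <vtkImageData.h>
#include <cstdio>

int main() {
    // Point VTK's DICOM reader at a directory holding one series of slices.
    vtkDICOMImageReader * reader = vtkDICOMImageReader::New();
    reader->SetDirectoryName( "C:\\data\\sample_dicom_series" ); // hypothetical path
    reader->Update();

    // The slices come back as a single vtkImageData volume.
    int dims[3];
    reader->GetOutput()->GetDimensions( dims );
    printf( "Volume is %d x %d x %d voxels\n", dims[0], dims[1], dims[2] );

    reader->Delete();
    return 0;
}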

Second, I want to do a podcast review of the nine or ten podcasts I listen to regularly. I have a long commute, and I typically go to the gym over lunch, so I have a good three hours to listen to podcasts per day if I want.

I use Juice as my podcatcher, and an iRiver attached to the auxiliary input to the sound system in my car to listen to the podcasts. Here's an OPML of my subscriptions. In no particular order, they include:

Academic.net
HanselMinutes Mp3 Direct
QA Podcast
Polymorphic Podcast
Perlcast
Chris Pirillo Show
Software As She's Developed
Security Now!
Major Nelson
Channel 9
this WEEK in TECH
Congressman John Hostettler -- Capitol Update

Friday, March 10, 2006

Team Foundation Server

So Dave Bost gave a presentation on Team Server last night. It was pretty interesting; it was the first time I had really seen a Team System presentation that didn't focus on the different clients. He made it clear that he didn't want to discuss licensing, so I didn't ask the question that really was blazing through my head: why the heck do I and the thirty people in my company give a rip? Team System is for big people.

Anyway, Dave is the new developer evangelist for Indiana. It used to be Chris Mayo, but I guess he's moved on to other things. Dave, are you going to the Continuous Integration conference? Maybe I'll see you there!

There was also a semi-organizational meeting for a C# special interest group. I think that might be interesting, and we'll probably pull out the computers for the next one. A guy from the Advanced Visualization Lab was there too; their work might be relevant to some of the new 3D ultrasound machines that we're examining at Prosolv. Should be very interesting!

Wednesday, March 08, 2006

Volume texture DDS files

I had reason to try to create a Direct3D volume texture this week. This article gave instructions on how to do it, but I think it must have been out of date, because running the code they gave did not result in a DDS file that was loadable by the texture viewer. (I messed around with modifying the article, but then I had to register on the site, and blah blah blah) So I studied the header that was generated by the texture viewer, and eventually wrote this code:


#include <ddraw.h>
#include <D3d8types.h>
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <fstream>

int main( int argc, char * argv[] ) {
    if( argc < 2 || !strstr( argv[1], ".dds" ) ) {
        fprintf( stderr, "Usage: noise output.dds\n" );
        return 1;
    }

    DDSURFACEDESC2 desc;
    memset( &desc, 0, sizeof(desc) );
    desc.dwSize = 124;                  // size of the DDS header (DDSURFACEDESC2)
    // DDSD_CAPS | DDSD_HEIGHT | DDSD_WIDTH | DDSD_PIXELFORMAT | DDSD_DEPTH
    desc.dwFlags = 0x00801007;

    desc.dwDepth = 64;                  // a 128 x 128 x 64 volume
    desc.dwWidth = 128;
    desc.dwHeight = 128;
    desc.dwBackBufferCount = 64;        // shares a union with dwDepth, so this is the same field as above
    desc.ddsCaps.dwCaps = 0x00001002;   // DDSCAPS_TEXTURE plus 0x2, copied from the viewer-generated header
    desc.ddsCaps.dwCaps2 = 0x00200000;  // DDSCAPS2_VOLUME
    desc.dwFVF = 32;                    // copied from the viewer-generated header

    desc.ddpfPixelFormat.dwSize = 0x20;  // sizeof(DDPIXELFORMAT)
    desc.ddpfPixelFormat.dwFlags = 0x41; // DDPF_RGB | DDPF_ALPHAPIXELS
    desc.ddpfPixelFormat.dwRGBBitCount = 0x20; // 32 bits per voxel

    // These three share a union with dwRGBBitCount, so they're redundant,
    // but they match the header the texture viewer produced.
    desc.ddpfPixelFormat.dwLuminanceBitCount = 0x20;
    desc.ddpfPixelFormat.dwBumpBitCount = 0x20;
    desc.ddpfPixelFormat.dwPrivateFormatBitCount = 0x20;

    // Standard A8R8G8B8 channel masks.
    desc.ddpfPixelFormat.dwRBitMask = 0x00ff0000;
    desc.ddpfPixelFormat.dwGBitMask = 0x0000ff00;
    desc.ddpfPixelFormat.dwBBitMask = 0x000000ff;
    desc.ddpfPixelFormat.dwRGBAlphaBitMask = 0xff000000;

    // Fill the volume with random noise, one byte per channel.
    unsigned int cnt = desc.dwWidth * desc.dwHeight * desc.dwDepth * 4;
    unsigned char * buf = new unsigned char[ cnt ];
    while( cnt-- ) {
        buf[cnt] = rand() >> 7;
    }

    // Write the magic, the 124-byte header, then the raw voxel data.
    // The stream must be opened in binary mode or the data gets mangled.
    std::ofstream ofst( argv[1], std::ios::binary );
    ofst << "DDS ";
    ofst.write( (const char *)&desc, 124 );
    ofst.write( (const char *)buf,
                desc.dwWidth * desc.dwHeight * desc.dwDepth * 4 );

    delete [] buf;
    return 0;
}


To be a really useful sample, I need to replace the flag values with constants...I need to figure out what they are, first, though!

Tuesday, February 28, 2006

Honestly Subjective Performance Reviews

(Thanks Lasse.) Corporate performance reviews are for the most part a waste of time. At my last job, I worked with the same set of peers for around four years, and we did peer reviews on their anniversaries. The first year, I tried to provide constructive feedback on how I thought people were doing, what they could do better, etc. The second, third, and fourth years, I had no idea what to write. Reiterate what I wrote the year before? Try to comment on what they were doing better than they did last year? I didn't have a clue. Once (with a really cool boss) I wrote my evaluation as a limerick.

This article discusses what's wrong with reviews, and how they can be better. First of all, just bail on the idea that reviews can ever be objective, any more than journalists can. Then focus on the future, not the past. It's a very agile idea. I'm excited about the future of work. I think corporations of the 21st century can work so much better. But how many companies are willing to give it a shot?

Saturday, February 18, 2006

Coding Standards and Reviews

At the IQAA on Thursday, there was a good presentation on coding standards and reviews. I've always had a strong sense that code reviews were important, but I've never actually been to a code review that was worth the time it took. My current job has a standard requirement that all code should be reviewed, though, so I've sort of been casting about for a good style of review. Ed Gibbs has thoughts on the subject; so does Macadamian. But I definitely thought that as far as my company was concerned, Robert Bogue's talk got to the heart of what a code review should be. Not that we'll actually change our process, of course, but at least I'll have some talking points when the subject comes up :) (I'm a touch underutilized in my company, I feel. I have to persuade rather than insist. Maybe after another year or two.)

So here is what I took away as the most salient points:

  1. Code reviews should not be painful. Bring cookies; have balloons.
  2. Code reviews should have a point. Don't just bring everyone in and show them the code. Then they all say, "um, sure, looks good." Have points of emphasis; exception handling, say, or readability.
  3. It's OK for junior developers to comment on senior developer's code. I'm still groping on this one; not that I ever thought that they shouldn't, but the question is more, how do you get them to do it? I've known developers that come to code reviews, sign all the forms, but then don't ever say a word about the code. I brought the question up at the time but didn't state the issue as clearly as I would have liked.
  4. Code reviews and code standards are related. This one had never occurred to me before, even after, at my last job, I wrote a short article called, "How to get your code past a review". Now I realize that that document was actually a coding standard. I think we've got a coding standard around somewhere at this company, but I'm not sure where it is. I'll probably resurrect it at some point.

So it was definitely a learning experience for me, and hopefully a springboard to learn more about the subject. Mr. Bogue has a blog as well, subscribed!

Tuesday, February 14, 2006

Yahoo! Patterns

Wow, this is a handy little pattern library. (Thanks to Grady Booch.) I'm spending more and more time with Yahoo these days, for stock updates, Yahoo! Answers, Flickr, and other things. A good sign for them, I guess.

Monday, February 13, 2006

The irrelevant Joel Spolsky

For a guy who has written so much good stuff on software development, I think Joel is falling behind the times. His latest post talks about not being able to find an online calendar that he likes, which is fine - I don't think anyone has, yet - but then he uses that as a springboard to decry the new software technique that he refers to as, "Ship early and often".

I did a web search for that phrase to find an alternate viewpoint, and Joel is already in the top three sites for it; he's quite an influencer. But he seems to have a lot of disagreements with the techniques of agile programming, which include this technique, there referred to as "Frequent Releases". Joel - and the article I linked - claim that releasing half-baked software isn't a good idea; true enough in itself, but I'm guessing that the calendars he checked out weren't buggy (bugs are bad things, and no one wants to use buggy software) - they simply didn't have all the features he was looking for. Releasing a calendar that has actual business value isn't releasing half-baked software; it's getting something out there that people can use, evaluate, give feedback on. It's a starting point for a conversation with the users. Look at Flickr; the most popular photo-sharing site on the planet started its life as a game tool, and evolved into its current incarnation by listening to the users and giving them what they want. That's how you create software.

"But", Joel says, "I'm not going to look at 30 Boxes again -- I've spent enough time evaluating it." He won't be back to see next week's version, or even next year's. (I wonder what calendar program he'll be using in the meantime?) I suspect he says this as a recognized authority on good software, in the belief that if he doesn't like it, it's probably not much good. That's probably true, too, but, there are one heck of a lot of other folks out there. They have blogs. They write about stuff they don't like too, and they also write about stuff they do like, if not nearly as often. I didn't look at any of these Ajax calendars at all, myself. But eventually, I suspect, one will turn out to rock the world, and at that point it will be all over Technorati, Icerocket, Memeorandum, Tailrank. At that point I won't care about Joel's opinion of them today. Joel probably won't either. When one of them wins out, he'll know by word of mouth, as we all will. Two or three of the others will have fallen apart by then, spending too much time writing features that no one wants, not getting anything released out on their website, not getting any buzz. And that is why, if you're writing software today, you should release early, and you should release often.

Friday, February 10, 2006

Generics at user meeting

I went to the Indianapolis .Net User Group meeting last night. They advertised Generics as the topic, and since I really didn't know anything about them, I was looking forward to it.

It turned out to be more interesting as a group dynamic than as a presentation. As a presentation, what I gleaned was that, from a user's perspective, generics are precisely identical to C++ templates. You declare a variable of a type that takes generics, and put the specific type after the type name, in angle brackets: List<int> foo = new List<int>(). Or you declare your own class and put a <T> after its name to create your own generic.

From an implementer's standpoint, they're pretty darn different from C++ templates, as you might expect. And working out exactly what those differences might be engendered more discussion from the group than any topic I've yet seen. People were interested in how they were implemented, whether they would really avoid boxing, whether it was done at run time or compile time. There appeared to be two or three people who really knew their stuff, too - they were discussing what the IL that was generated looked like and that sort of thing. By the 45 minute mark, I was pretty sure that I was in for a two-hour or more night.

But amazingly, the entire presentation couldn't have been more than 35 minutes. Add in 25 minutes of discussion, and the whole thing would have been over before seven, with plenty of time to draw door prizes and be out of there by 7:30.

But I don't know if that's what happened or not. The Q&A period was still going strong at 7:05 and I decided to bail. I hope my ticket didn't win a new car or something :)

Wednesday, February 08, 2006

ISO

I wrote a while ago about how ISO can actually be used as a positive thing for a company, which I suppose most developers at the grunt level would disagree with. It's true though: you just have to use it to describe your processes, rather than prescribe them.

There is a basic problem, however: company management may not have the least interest in improving the processes. They just want the pretty sticker for the front door that says, "Yes indeed! We're ISO approved! You can do business with us!" After that, they may not give a fig whether the processes are actually being followed, except to the extent that ignoring them would get the company into legal trouble. This is why so many developers hate ISO. For ten months out of the year, they're told to bypass the process, sneak around it, don't bother with it - we have to get those customers happy. Or if they do follow a process, they may get penalized for it. "What do you mean it'll take you two months to do that? We can't put that on the form! Put down three weeks!" Then, of course, when it does take two months, everyone has to work overtime since the project is so far behind schedule.

For the other two months of the year, they're told, "OK, here's the process. You have to have it memorized. If an auditor comes by, make sure you have the document in front of you. Just read it to the auditor. Don't make trouble. Don't volunteer anything. We just want our little sticker; we don't care about the process."

It's a shame. There's real value in ISO. I wonder if there are any companies that can find it?

Tuesday, January 31, 2006

A good Informatics web site

When I complain about something, I do like to follow up if the problem is addressed, or corrected, or even vaguely not quite as bad as I thought. I wrote here about the IU School of Informatics web site and how unimpressed I was with it, for what should really be a school on the cutting edge. Well, even if you can't find them from the main page, some good sites are bubbling up from the bottom. This one, on a talk series about complex systems, is very nice - it has the expected abstracts, speakers, times and dates, but it also includes links to the slide decks and podcasts of the talks that have already happened! Now that's what I'm talking about! It's not perfect, of course - I'd like to have forums or comment sections for each individual talk, as well as an RSS feed so I can grab the talks with a podcatcher - but it's one heck of a lot more interesting than the Informatics main page. Well done, Katy Börner, and thanks to Justin Donaldson for the link.

Friday, December 23, 2005

Software testing job opportunities

Are you a software tester? Come to Indianapolis and join ProSolv, which will be adding 50 new jobs next year, and is immediately adding a quality manager and a software tester. Visit the job descriptions on Monster, or just send me a note and I'll see that your resume gets to the right place!

Wednesday, December 21, 2005

Responses from the Senators

I posted here about writing a letter to the Indiana senators, Richard Lugar and Evan Bayh, about the Coburn anti-pork amendments. Finally this week I heard back from both of them. Senator Lugar sent a form letter, although it was right on topic, and asserted that the "Bridge to Nowhere" was not getting any money, although I haven't verified that yet. A staffer of Senator Bayh's wrote me, though, just saying he received my inquiry and wanted me to call him. Huh! I'll have to do that and see what he has to say.

Tuesday, December 20, 2005

Major victory for science

Judge John Jones determined in the Dover, PA court case that Intelligent Design should not be taught as a reasonable scientific alternative to evolution. Well done! But the judge went beyond that, finding that ID is not good science, that the yahoos on the school board whose policy forced the case were wasting everyone's time, and all in all wrote a decision that I agree with completely. Can we just project him straight to the Supreme Court? Timothy Sandefur posts a succinct summary of the decision over at the Panda's Thumb. Congratulations to all involved!

Tuesday, December 13, 2005

Miscellania

Things I have been doing while not blogging:
  • Sudoku. The local paper finally started putting one in, starting with an easy one on Mondays and getting harder each day until Saturday. I got about half of them the first week. I need to go over this paper in more detail though. I figure another week or two until I get bored :)
  • Jigsaw. Got a new jigsaw puzzle (1000 pieces) and my four-year-old and I started to work on it. Haven't looked at it since the Sudoku though!
  • LibraryThing. A fun site that lets you catalog and tag books. You're only allowed 200 free entries, but the $25 lifetime fee is pretty enticing. I can just see it becoming so popular that even the lifers have to pay extra for new features, though.
  • Yahoo Answers. An evil combination of points, social software, trivia and opinion that I can't resist coming back to several times a day to check on the new questions. Haven't gotten any hugely useful new information out of it, but I bet I get to level 2 soon.
  • Ars Indiana. Don't know if this will go anywhere, but it's my new blog where I intend to put all my cultural-type posts. Put one up on the B.B. King concert last week.

Update: It's 200 books, not 100. Sorry, Tim!

Thursday, December 08, 2005

Way To Go Indiana!

Thanks to the Panda's Thumb for referencing the Fordham State of Science report. In it, Indiana receives an A and an overall rating of 91%, fifth in the nation. That is an amazing achievement, especially considering how close the forces of darkness are. Nice job, Hoosiers!

Bloomington as energy pill

Wocka wocka wocka. Looks like Bloomington is about to be eaten by Pac-Man.

Tuesday, November 22, 2005

No images allowed in the app_data folder

I had my first opportunity today to try to create a web site with Visual Studio 2005. I made it about two steps before running into a brick wall. The concept of Master Pages seems like a really cool one, but I'm not sure how to make it work together with CSS and maybe two or three user controls. So I tried to set up a master page; no problem; tried to associate default.aspx with it; problem. The property is grayed out. A little research told me that the field could be filled in through code in the @page declaration, so I tried adding that with trepidation. I was pretty sure that the field was grayed out for a reason, and sure enough, that didn't get me what I was looking for.

I quickly found out that "Content" pages are the only kind of page that can be associated with master pages, but all the intuitions I've built up about finding things don't apply to 2005. For example, I tried to "Add New Item" to the solution, expecting to find "Content Page" as an option. No dice. I tried looking through the toolbox for something I could drag onto the page to make it a content page; nothing. So I did quite a bit of additional research and poking around on the web, and in Dev Studio, and in the Dev Studio help. But it took me quite a long time to actually find the solution, which I did by trial-and-error: right-click in the content placeholder in the master page and choose "Add Content Page". I'm sure I could have found help on this...somewhere.

This was directly followed by problem #2: I simply tried to add an image to my page. Copy-and-pasting from VS 2003 on my system failed, to my mild surprise, but then I noticed this nice "App_Data" folder in the project, and it seemed clear that any images or sounds should go there. Right-click on it, add new item, and drag the picture to my page - no problem!

Except that when I hit F5, no image is to be seen. Back to the web. This time, there's a fundamental problem: I can't figure out any terms to search for that describe my problem with any hope of clustering to the right solution. I tried "visual studio app_data image doesn't show up", "asp.net image failing", "Visual Studio 2005 images", "visual studio 2005 add jpg to web page", but all these terms give me much too generic results back. If the solution to my problem was in one of those result sets, it must have been on page 37 at least.

Finally, I went to the Microsoft forums, and specifically to forums.asp.net. Here, a search for app_data turned up dozens of results, and I tried to narrow it down by searching for "app_data debugging", which was what I was trying to do. Bingo! By sheer luck it turned out that the problem involves permissions and running inside the debugger, but the fact of the matter is that the app_data folder is not supposed to hold images at all, only databases.

I added a separate folder for the images and everything worked fine. Whew. What adventures await me in Visual Studio tomorrow?

Thursday, November 17, 2005

Indianapolis Workshop on Software Testing

This looks like an interesting, if tiny, local software testing group. I found it through this post, from Mike Kelly, who appears to be the ringleader, with an impressive list of publications on his site. Mike, how about blogging more than once a month though, huh?

They say the best way to get an invite to their meeting is to submit a paper. I think I'd like to sit in on just one or two first though; maybe I'll try to finagle an invite from somebody. I joined this group anyway, which was free. Don't suppose I'll ever hear from them though.

Edit: fixed link

Monday, November 14, 2005

Customer Service: Compare and Contrast

Way back when, I posted about sending a note to the Bloomington newspaper about adding an RSS feed to their website. I did that again this week, at the site Macadamian.com, which has a regular column called Critical Path, with tips for software managers. I got an almost identical result, actually: a quick response from an editor saying they were considering it, followed a few hours later by a link, and now I'm subscribed to it in my feedreader. Very nice; great customer service.

Now a month or so ago, I sent not the exact same question, but a similar one, to the Indiana University School of Informatics, where I think I might be an alumnus. (I was for a while, and then I wasn't again, but now I hear that the IU Computer Science department has been assimilated, and so I must be again. Unless I'm not. Anyway.) The school has an RSS feed, which is good, but the unfortunate bit is that the feed is just old-school marketing and PR stuff. IU Research in Spotlight at Seattle Supercomputing Conference. Now, by no means do I object to reading that stuff; a lot of it is important and interesting. But if this is a cutting-edge school, I want a cutting-edge web page. I want to read student and professor blogs, with comments, using new technology to block spam. I want to see wikis, and web pages with Ajax components. I want podcasts of lectures and symposiums. I don't want a bunch of static web pages that no one is ever going to look at, except for the one time a month they need to look up an email address.

So, on their comment page, I wrote up my request.

No response. At all. My comment was ignored completely.

So what's the deal here? Does the industry just change too fast for universities to keep up with? Is it a problem specific to Indiana University? Or is it just that they're not a business and therefore have no interest in responding to customer requests?

I don't know. I'd like to know.

Saturday, November 12, 2005

Midsummer Night's Dream at IU Opera Theater

I went and read the Peter Jacobi article in the Herald-Times before writing this. He's got something of a reputation for sticking to positive items in his reviews, and if that's true, this opera must have been deeply troubled, since he presents a series of negative comments from the director, even if you have to read between the lines a little to get them. "Lack of stagecraft", "Not enough rehearsal time", "Children may need to be miked", were some of his comments.

All that said, I thought the opera was stunning. Now, when I write reviews, I write them not just of the production, but of the opera. I can't compare this production to the debut at the Aldeburgh Festival in 1960, or even the one at the London Coliseum in 1994. So I wasn't sure if the role of Oberon was always played by a countertenor or if that was just someone's cool idea (it always is) or if Puck is always a ballet dancer who shouts his lines (he shouts, but isn't necessarily a dancer).

And I wasn't blown away by Oberon at first. It took me a while to get used to the rhythm of his lines, but once I did, I thought it fit in perfectly with the beautiful mystic green in the abstract sets and lighting. The fairy costumes were done in modern punk, which was bright and colorful enough to work perfectly with the sets, and the four lovers were dressed in street clothes. Lysander came out in a T-shirt with the name of a fraternity on it, which got a big laugh when Oberon instructed Puck to "Look for a mortal in Athenian dress." (Athenian, fraternities, Greeks, get it?).

Some of the children did have to be miked; of the four majors, I think it was two and two, but that didn't matter. As far as stagecraft goes, it's certainly been a long-held belief of mine that singers can't act, and that has held true here at IU at least. So the fight scene was drab at best. The rude mechanicals weren't bad - I suppose you can tell a good actor by how convincingly he can act badly - and Bottom was pretty good, although my "feel" for the character has always been a bit more boorish.

Of course, that's a judgment on the play. And while I'm at it, I could have happily left after Act 2, because I always feel badly for the mechanicals when everyone makes fun of them. But the music, the countertenor, the costumes, the dancer, in the first two acts, all combined together to make this one of the three or four best operas I've seen at IU. Dreamy.

Friday, November 11, 2005

Automated testing using Ruby

So here’s the problem statement: Write a Ruby script that will open a database, check it for accuracy, and if it is NOT accurate, send an email describing the issues.

So this will require (a) opening a database in Ruby, (b) running a test in Ruby, and (c) sending an email in Ruby. None of these is probably very difficult, but not being a Ruby expert I went searching for examples on the web. I wasn’t thrilled by the examples I found for these tasks, so I thought I’d write up what I did.

Databases: This is code that will open an Access database and grab all of the rows in the Exam table:


require 'dbi'

# Open the Access database via ODBC and grab every row in the Exam table
DBI.connect("DBI:ODBC:driver=Microsoft Access Driver (*.mdb);dbq=" +
            ENV['TESTINSTALLDIR'] + "db1.mdb") do |dbh|
  rows = dbh.select_all('select * from Exam')
end

Tests: I started by writing my own little test procedures, until I stepped back and looked at what I’d done – I’d developed a rudimentary RUnit, along the lines of NUnit or CPPUnit. At that point I was sure that it had been done before, and it had – and not only that, but it turned out to be part of the Ruby standard library. Although what I’m doing here isn’t really what I would call unit testing, it’s close enough that I decided to use that instead.


require 'test/unit'
require 'test/unit/ui/console/testrunner'

class DatabaseTest < Test::Unit::TestCase
  def test_dbContents
    assert(rows[1]["Media Type"] == "Image Server")
  end
end

Test::Unit::UI::Console::TestRunner.run(DatabaseTest)
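
One wrinkle worth noting: as written, rows is local to the DBI block, so the test above can't actually see it. Here's a minimal sketch of one way to stitch the two fragments together - the EXAM_ROWS constant is just for illustration, not necessarily how the real script does it:

require 'dbi'
require 'test/unit'
require 'test/unit/ui/console/testrunner'

# Run the query first, then stash the rows where the TestCase can see them
rows = nil
DBI.connect("DBI:ODBC:driver=Microsoft Access Driver (*.mdb);dbq=" +
            ENV['TESTINSTALLDIR'] + "db1.mdb") do |dbh|
  rows = dbh.select_all('select * from Exam')
end
EXAM_ROWS = rows

class DatabaseTest < Test::Unit::TestCase
  def test_dbContents
    assert_equal("Image Server", EXAM_ROWS[1]["Media Type"])
  end
end

Test::Unit::UI::Console::TestRunner.run(DatabaseTest)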

Email: There are some good email sending examples around. I started with this one and ended here:


require 'net/smtp'

class FailCounter

  # Build the email body from the project name and accumulated errors
  def TextBody()
    email_text = <<END_EMAIL
To: "Ben Fulton" <#{@to_addr}>
From: #{@from_addr}
Subject: #{@project} automated test failure

An automated assertion failed for the project #{@project}

#{@errors}

END_EMAIL
    return email_text
  end

  # Send the email, but only if at least one failure was recorded
  def Finalize
    if (@counter > 0)
      Net::SMTP.start("myprovider.net") do |smtp|
        smtp.sendmail( TextBody(), @from_addr, @to_addr )
      end
    else
      puts "No failures!"
    end
  end
end
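
The class above also leans on an Add method and a handful of instance variables - @counter, @errors, @to_addr, @from_addr, @project - that I haven't shown. A minimal sketch of what they might look like (the addresses and project name here are just placeholders):

class FailCounter
  # Placeholder values, for illustration only
  def initialize
    @counter   = 0
    @errors    = ""
    @project   = "Nightly database check"
    @to_addr   = "me@example.com"
    @from_addr = "buildmachine@example.com"
  end

  # Record one failure message; Finalize only sends the email when @counter > 0
  def Add( message )
    @counter += 1
    @errors  += message + "\n"
  end
end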

Now, my goal was for the results of the test to be put into the email. That took a long time to figure out. Step 1 of the solution was to realize what the automated test runner was doing under the covers, and take advantage of it. So I replaced the run(DatabaseTest) line with this:


tr = Test::Unit::UI::Console::TestRunner.new( DatabaseTest )
passed = tr.start()

Now I have the results back in a TestResult, which I can examine for failures, so emails only go out if some tests actually failed:


if (passed.failure_count() > 0 || passed.error_count() > 0)
  fc = FailCounter.new
  fc.Add( "Failures found" )
  fc.Finalize()
end

Step 2 of the solution is to get the information from the test in a format that I can put in an email. It turns out that TestRunner.new can take a parameter defining where output should go, which defaults to STDOUT. I could have redirected it to a file, but that seemed like unnecessary work, so after a lot of searching I came up with what I was looking for, StringIO, which takes output and writes it to a string:

require 'stringio'

sio = StringIO.new
tr = Test::Unit::UI::Console::TestRunner.new( DatabaseTest, Test::Unit::UI::VERBOSE, sio )

I also changed the default NORMAL verbosity parameter to VERBOSE. Then I replaced the FailCounter "Failures found" line like this:

fc.Add( "Failures found: " + sio.string )

And that was it. I’m not going to glue all this code together here, since this post is already too long, but hopefully if you’re interested it should be straightforward. Good luck!


Wednesday, November 09, 2005

Kansas school board redefines science

Kansas, for shame!

I hereby pledge that I will never move to Kansas, nor allow any member of my family to attend any school in that state.

Build/Test machine

I posted here about our plans for updating the ProSolv build process. It's been going pretty well; the hallway machine is up and running, although I had to bring a table from home to set it up on, and now someone wants to buy it from me :)

Builds are scheduled for 6 PM each night, and an automated test script runs all day. Right now we just have a single script that takes about 15 minutes to run. It's powered by Ruby and by AutoHotKey, which works nicely as an automator. I especially like that the scripts are simple text files.

A lot of people don't quite understand what I'm trying to do. They look at the machine and say, "What's the point of running a test that doesn't log any results?" The answer is that there's a lot of value in just exercising the UI. If we have a build one day where you click on a study image and the application crashes, this test process will find that.

Nevertheless, as long as this machine is running scripts, there's no reason for it not to log results. I thought for a while that I would have to add code to the application to write out sensible log results, which is not a process to undertake lightly, but it occurred to me recently that the GUI manipulations that the script is doing mostly result in predictable changes to the file system and database. So I spent a little quality time with Ruby's DBI and Test/Unit modules, and wrote up some assertions that will send an email to me at the end of the script if the database isn't in the state I expect. It's only a start, but now I can add more assertions in the middle of the process, or add new assertions as I extend the test scripts. It's coming together very nicely!
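
For a flavor of what those assertions look like, here's a sketch - the DSN, table name, and file name are invented for illustration, not the real ProSolv schema:

require 'dbi'
require 'test/unit'

class PostScriptCheck < Test::Unit::TestCase
  # Did the GUI script leave the database in the expected state?
  def test_study_was_imported
    DBI.connect("DBI:ODBC:TestDatabase") do |dbh|
      row = dbh.select_one("select count(*) from Study")
      assert_equal(1, row[0].to_i, "expected exactly one imported study")
    end
  end

  # Did the GUI script leave the expected file behind in the test folder?
  def test_export_file_was_written
    assert(File.exist?(File.join(ENV['TESTINSTALLDIR'], "export", "study1.xml")))
  end
end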

I'm thinking also about modifying the machine to alternate test runs with kiosk-style data updates, such as how many files were compiled last night, or how many support calls were handled yesterday. It'll be interesting to see how people respond to that :)

Thursday, November 03, 2005

Gadgets and Office alive alive-o

Microsoft previewed its new Windows Live strategy yesterday. My reaction, along with a lot of other people's: It's a portal, and we don't want or need another portal, no matter how skillfully it's put together and how many neat gadgets are available on the site.

That said, here's what really got my attention in the announcement:

Windows Live™ is a set of personal Internet services and software...

So what exactly are we talking about, Internet services? Are we talking web services here? That would be cool. Here's what I want: the ability to add, to my site and not to Microsoft's, a Word document that can be edited by approved people. The document would ideally be stored on my site, but could then be bounced to a Microsoft service for some Ajax magic and editing. Is this the sort of thing that Office Live is going to make available? That would be awesome!

But I've gone searching around the web looking for any evidence that anything on Live is going to be addable to other web sites. Scoble said something - when does Scoble not say something? - but he didn't go into any details other than, "I’m still struggling to understand what I’ll get by putting a new Windows Live service on my blog or business site".

Robert, it depends on which direction it goes. I'd be thrilled to call out to a Windows Live web service as part of a mashup for my site - maybe a Click-To-Talk button using Messenger to dial my phone directly? - but if you're expecting me to make something available that users can only reach through the Live site, forget it.

So for me, the jury is still out until we get more details for developers.

Disclaimer: I own stock in Microsoft.

Wednesday, November 02, 2005

Code Reviews

Ed Gibbs says his team is about to institute code reviews. Of course, if you do pair programming regularly, code reviews are pointless, since - turning all the knobs up to 11 - all of the code is reviewed all the time. But I've never worked in a shop where pair programming really took off. I'd be curious to hear how prevalent it is.

As I understand it, we at ProSolv are required by FDA regulations - perhaps here? - to do design and code reviews, although, especially for small projects, we often combine them into a single review. Currently I'm not convinced that they add anything to the quality of our software, although, as I've stated before, I think ISO can potentially be a big gain for a company and not just overhead. All the usual difficulties of code reviews apply - what sorts of things are worth bringing up? Is coder A receptive to constructive criticism? Is coder B tearing things down for the sake of doing it? Is coder C reluctant to make a great suggestion for fear of hurting feelings? Should the code be perfect, or just good enough? - and in the final analysis the review is either marked passed or failed.

I'm sure this process can be improved, but I'm not sure how. Maybe design reviews could be accompanied by UML diagrams. Maybe we just need a big slab of coding standards that have to be applied. For example, a review I'm looking at now introduces two new global variables to a C++ application. I think the industry consensus is that global variables are bad, but certainly the code works. Do we need a coding standard that says to avoid global variables? If we did that, how much extra overhead would be added to the process?

I'm seriously considering offering a bounty of ten cents a line for any project that can remove lines of code from an application rather than adding them. I bet that would be more effective than fifty code reviews!

Tuesday, November 01, 2005

RootkitRevealer

After my post yesterday on SysInternals and listening to the RootKit episode of Security Now, I decided to give RootkitRevealer a whirl on my system. It turned up a slab of hidden registry class ID keys underneath HKLM\SOFTWARE\Classes\CLSID:

{47629D4B-2AD3-4e50-B716-A66C15C63153}
{604BB98A-A94F-4a5c-A67C-D8D3582C741C}
{684373FB-9CD8-4e47-B990-5A4466C16034}
{74554CCD-F60F-4708-AD98-D0152D08C8B9}
{7EB537F9-A916-4339-B91B-DED8E83632C0}
{948395E8-7A56-4fb1-843B-3E52D94DB145}
{AC3ED30B-6F1A-4bfc-A4F6-2EBDCCD34C19}
{DE5654CA-EB84-4df9-915B-37E957082D6D}
{E39C35E8-7488-4926-92B2-2F94619AC1A5}
{EACAFCE5-B0E2-4288-8073-C02FF9619B6F}
{F8F02ADD-7366-4186-9488-C21CB8B3DCEC}
{FEE45DE2-A467-4bf9-BF2D-1411304BCD84}


I was mildly worried and spent a bit of time tracking down these keys. I think I can say pretty definitively what they're for now: it's Pinnacle Studio 9 hiding its registration keys. Irritatingly, Studio doesn't handle logging in as a non-admin properly, either - every time I start it I have to click the little message that says "Don't show this screen again".
