This was related to me by J.
J. headed along to the coffee break area, a couple of couches and a coffee machine round a corner in the office. There he found B. working with a pencil and paper. "What are you doing, B.?" he asked. B. explained that he had a large number of files which were wrong, but all wrong in the same way. B. described the changes and said he was working out an edit with a Powerful Editor to make the same change to all the files.
This was at a time when there were many more types of systems than the Unix-like ones of today. However, most of them had their own Powerful Editor, of utility similar to Emacs and vi, of which it was claimed that you could do anything with it; the only problem was working out how.
J. asked B. how many files were involved, and the number turned out to be something like 80 or 100 or so. J. remarked that the changes could all be done in 20 or 30 minutes and headed off to make the changes to demonstrate this. J. was a much more dynamic, action-oriented character than B., who was more contemplative about handling problems. B. carried on working out the Powerful Editor commands to apply the changes.
Half an hour or so later, B. had finished preparing the edit and headed off to see how J. was getting on. J. had pretty much finished applying the changes by then. Of course, the automatic edit by B. took only a few seconds to apply.
Now both J. and B. well understood the importance of only making changes like this on a copy of the data, so that everything could be thrown away and another attempt made on a fresh copy if necessary. It also meant the results of both attempts could be compared with the original and with each other. Both sets of changes were pretty much what was wanted, but they weren't quite the same: J.'s manual changes turned out to have a few typos, highlighted by comparison with B.'s automatic changes. This is the real problem with manual changes: occasional, non-obvious typos.
With B.'s automatic edit, any error in the edit will be egregiously wrong, because all the files will be wrong in the same way. That makes the error easy to find and to fix. Manual changes, on the other hand, all need to be checked individually, and even then there's no guarantee that a mistake hasn't been missed.
This was the lesson J. learnt: it's worthwhile expending additional time and effort to devise an automatic process to make changes, rather than making manual changes, which all have to be checked and may still be wrong.
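In modern terms the same idea might look something like the following sketch, in Python rather than a Powerful Editor; the directory names, file pattern and the old/new text are all hypothetical, standing in for whatever uniform change was actually needed.

    import pathlib
    import re
    import shutil

    # Work on a copy, as J. and B. did, so the originals stay untouched
    # and the result can be compared against them afterwards.
    SRC = pathlib.Path("originals")   # hypothetical directory of wrong files
    DST = pathlib.Path("work-copy")   # hypothetical working copy
    shutil.copytree(SRC, DST, dirs_exist_ok=True)

    # Hypothetical uniform mistake: every file says old_name where it
    # should say new_name.
    pattern = re.compile(r"\bold_name\b")
    for path in DST.rglob("*.txt"):
        path.write_text(pattern.sub("new_name", path.read_text()))

A recursive diff of work-copy against originals afterwards plays the same role as the comparison J. and B. made between their two sets of changes: any error in the pattern shows up identically in every file, which is exactly what makes it easy to spot and fix.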
"Bumper sticker computer science" already has this covered where it says
If you have too many special cases you're doing it wrong.This is an example of making too many special cases.
Before there were PCs there were terminals, which nowadays are usually just terminal emulators running on PCs. This particular type of terminal was programmed as a terminal by a firmware team, but it could also run software downloaded into it. The software team produced word processing software to download into it. The firmware and software teams were based on different continents, so they mostly communicated by e-mail.
Keystrokes on the terminal went through a three-stage lookup in translation tables to convert keystrokes into characters according to the localised language requirements. For various reasons two of the tables were shared between the firmware and the software (the firmware provided pointers to them), but one table with localisation entries was duplicated between the firmware and the software.
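As a rough illustration only: the real tables and indices aren't described here, so the stages, key codes and characters below are invented, but the shape of a table-driven, multi-stage lookup would be something like this.

    # Stage 1: physical key code -> layout-independent key number (invented values).
    KEYCODE_TO_KEYNUM = {0x2C: 21, 0x2D: 22}

    # Stage 2: (key number, shift state) -> index into the localisation table.
    KEYNUM_TO_INDEX = {
        (21, False): 0x79, (21, True): 0x59,
        (22, False): 0xFF, (22, True): 0x9F,
    }

    # Stage 3: localisation table, index -> character in the local character set.
    # This is the kind of table that was duplicated between firmware and software.
    INDEX_TO_CHAR = {0x79: "y", 0x59: "Y", 0xFF: "ÿ", 0x9F: "Ÿ"}

    def translate(keycode: int, shifted: bool) -> str:
        keynum = KEYCODE_TO_KEYNUM[keycode]
        index = KEYNUM_TO_INDEX[(keynum, shifted)]
        return INDEX_TO_CHAR[index]

    print(translate(0x2D, True))   # -> "Ÿ"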
At some point, to be able to localise for a new market, some empty entries in the keystroke translation tables were to be filled in with newly defined characters such as Ÿ (Y-umlaut - HTML entity &Yuml; - Unicode character Latin Capital Letter Y with Diaeresis), although this was before either HTML or Unicode existed. Also by this time the original firmware team had moved on to newer, more interesting projects, and the firmware was in the charge of a specialist maintenance team whose aim was to minimise support effort.
Because the translation tables were well understood by the software team, and the specification of which keyboard key produced which character was known, the software team just proceeded with the changes to match what they knew the firmware team would make to the firmware copies of the tables. However, when the time came to test the software with an updated terminal, there was a problem: the characters were wrong.
The software team immediately worked out what the problem was. The firmware maintenance team were wary of making changes to the big translation table and so had decided to test for these characters as special cases in the code, because that seemed to be a smaller change. It really isn't a smaller change, and as soon as cases that have to be handled differently turn up, the change turns into a mess of spaghetti code that may be impossible to get right.
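To make the contrast concrete, here is a hedged sketch reusing the invented tables above: bolting the new characters on as special cases in code versus simply adding the missing entries to the table.

    TABLE = {0x79: "y", 0x59: "Y"}          # existing, well-understood entries

    # Special-case approach: each new character becomes another branch in code,
    # and every future addition or exception grows the spaghetti further.
    def translate_special_cased(index: int) -> str:
        if index == 0x9F:
            return "Ÿ"
        if index == 0xFF:
            return "ÿ"
        return TABLE[index]

    # Table-driven approach: the new characters go where all the others live,
    # and the lookup code never has to change.
    TABLE_EXTENDED = {**TABLE, 0xFF: "ÿ", 0x9F: "Ÿ"}

    def translate_table_driven(index: int) -> str:
        return TABLE_EXTENDED[index]

Every future character is one more branch in the first version, but just one more entry in the second.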
However, it wouldn't be a good idea for the software team to just tell the firmware team that they had done it wrong. That would only get their backs up and generate resistance to changing something that seemed to be working for them.
So cue an e-mail which read something like:
"We thought you would be changing the translation tables like this [ insert detailed description of how the translation tables should be changed ] but apparently you've done something else. Please can you tell us how you changed it so we can adjust our change to match?"

This had the desired effect, and the firmware team replied:
"Oh, your way is better. We'll redo the firmware change to match."

Perhaps knowing that the software team was prepared to change the table entries gave the firmware team confidence that changing the tables was the correct way to do it. Anyway, this had the desired result, and the updated tables supporting the additional characters worked fine.
A couple of months later the software team heard of another consequence of these changes. The terminals, just running as terminals without the downloaded software, had been displaying some of these additional characters as blobs on the screen. But as terminals on their own were probably mostly used for data entry and report printing, and as long as the data in the database and the printed characters were correct, a character displaying as a funny blob on the screen could be ignored as a quirk with no adverse effect.
Once the table changes devised by the software team were applied by the firmware team, this problem just went away on its own. The software team had fixed a bug they didn't even know existed.