
Thursday, December 04, 2008

Are modularity and refactoring a law?

Modularity and refactoring relate to modular programming, which breaks a program into modules. Refactoring (breaking code into smaller factors/parts) refers to the technique, and modularity refers to the property (IMHO, high modularity means low dependency between modules).

This is normally advised when we do structured or OO programming; it's a kind of rule of thumb in software development. The benefits:
1) reduced complexity
2) simplified program structure
3) etc.

In my case, this paradigm turned out to be a golden egg!

Last night, while I was writing my C program, I found that somehow this is not just an option. In my program, I want to represent my data using 2 bytes (chars), which together hold a row of bits, as follows:

[ 7 6 5 4 3 2 1 0 ][ 7 6 5 4 3 2 1 0 ]
------- byte 1 ----------- byte 2 -----

Each bit represents a flag (1-16). To set a flag, I pass its number, e.g. 4 for the 4th flag (bit 3 of byte 1), and 15 for the 15th flag (bit 6 of byte 2).
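
As a minimal sketch of that mapping (assuming flags are numbered from 1 and bits from 0, as in the diagram above; the function names and exact formula are my illustration, not the original code):

#include <stdio.h>

/* Map a flag number (1-16) to its byte (0 or 1) and bit (0-7). */
static int flag_byte(int flag) { return (flag - 1) / 8; }
static int flag_bit(int flag)  { return (flag - 1) % 8; }

int main(void)
{
    printf("flag 4  -> byte %d, bit %d\n", flag_byte(4),  flag_bit(4));   /* byte 0, bit 3 */
    printf("flag 15 -> byte %d, bit %d\n", flag_byte(15), flag_bit(15));  /* byte 1, bit 6 */
    return 0;
}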

Luckily, I use modules: module A to allocate the appropriate flag (bit), and module B to split the value into bytes. However, to calculate the correct bit index, I have to delegate the job to module A, since I can't put the calculation in B. Although it might be possible, it would become too complex.

Why bother having both A and B? Why not just one? (someone might ask)
The bit-index calculation (a function in A) is repeated multiple times and called from B. My point is, the calculation needs to live in the appropriate place (in A, not in B). More importantly, the formula is easily revised and traced (because I know it lives in A). So modularity saves the day (or night)! A possible shape of that A/B split is sketched below.
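
Here is one way such an A/B split could look. The file and function names are just my guesses for illustration, not the actual code from this project:

/* flag_index.h -- "module A": knows how a flag number maps to a bit. */
#ifndef FLAG_INDEX_H
#define FLAG_INDEX_H
int flag_bit_index(int flag);   /* 0-based bit index for flags 1-16 */
#endif

/* flag_index.c -- the formula lives in one place only. */
#include "flag_index.h"
int flag_bit_index(int flag)
{
    return flag - 1;
}

/* flags.c -- "module B": splits the bit index across the two bytes. */
#include "flag_index.h"
void set_flag(unsigned char bytes[2], int flag)
{
    int i = flag_bit_index(flag);                 /* delegate the calculation to A */
    bytes[i / 8] |= (unsigned char)(1u << (i % 8));
}

If the mapping formula ever changes, only module A needs to be edited; module B keeps calling the same function.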

ps: btw (in case you're wondering), I'm developing a transition table for a transducer.

-EOF
