[Prog] Trade-offs

* Space vs. Speed
– The most famous one (memory space vs. execution speed); details are omitted.

* Modularity vs. (Performance & Space)
– A well-designed module should be stable. To be stable, it may contain many parameter checks, condition checks, sanity checks, and so on. Besides, for safety reasons, it may not trust memory passed in from outside, so it may allocate its own memory and copy the passed data before using it. These are what create the trade-off: a well-designed module hides its information and stays stable on its own, but to do so it spends speed and space. (A small sketch of this follows this list.)

* Abstraction vs. Debugging
– When software is stable, abstraction greatly reduces maintenance costs. But when it is not, debugging is harder than with straightforwardly designed software, especially for engineers who are not yet familiar with the code.

* Low possibility of misuse vs. rich and varied functionality
– It is clear…

* Backward-Compatibility vs. Innovation
– …
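
Below is a minimal sketch of the modularity trade-off described above; the module name, size limit, and error codes are made up for illustration. The defensive entry point checks its parameters and copies the caller's memory instead of trusting it, which is exactly where the extra speed and space go.

#include <stdlib.h>
#include <string.h>

#define MYBUF_MAX_SIZE 4096   /* hypothetical limit */

struct mybuf {
    unsigned char *data;   /* NULL until first successful set */
    size_t         size;
};

/* Defensive setter: parameter checks, sanity checks, and a private copy. */
int mybuf_set(struct mybuf *b, const void *src, size_t size)
{
    if (b == NULL || src == NULL)            /* parameter check */
        return -1;
    if (size == 0 || size > MYBUF_MAX_SIZE)  /* sanity check */
        return -1;

    unsigned char *copy = malloc(size);      /* own memory: space cost */
    if (copy == NULL)
        return -1;
    memcpy(copy, src, size);                 /* copy-in: speed cost */

    free(b->data);                           /* replace previous contents */
    b->data = copy;
    b->size = size;
    return 0;
}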

[Prog] Software design fundamentals (my opinion)

I want to say only one sentence about software design.

“Software design starts from, and ends with, [ separating the things that are likely to change from the things that are not ].”

From a macro point of view, the software requirements are the main subject to classify. From a micro point of view, function parameters, algorithms, data structures, and so on can be the targets.
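
A minimal sketch of this idea in C, assuming a made-up logging module: the output target is judged likely to change, so it is hidden behind a function pointer, while callers of log_msg() stay untouched when the target changes.

#include <stdio.h>

typedef void (*log_sink_fn)(const char *msg);

/* Likely to change: where the message goes (console, file, UART, ...). */
static void sink_console(const char *msg)
{
    fprintf(stderr, "%s\n", msg);
}

static log_sink_fn g_sink = sink_console;

/* Unlikely to change: the interface callers depend on. */
void log_set_sink(log_sink_fn sink)
{
    if (sink != NULL)
        g_sink = sink;
}

void log_msg(const char *msg)
{
    g_sink(msg);
}

int main(void)
{
    log_msg("started");   /* this call never changes, even if the sink does */
    return 0;
}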

[Prog][Quotation] Extensive computer use is correlated with low productivity.

A study at the Software Engineering Laboratory – a cooperative project of NASA, Computer Sciences Corporation, and the University of Maryland – found that extensive computer use (edit, compile, link, test) is correlated with low productivity. Programmers who spent less time at their computers actually finished their projects faster. The implication is that heavy computer users didn’t spend enough time on planning and design before coding and testing (Card, McGarry, and Page 1987; Card 1987).

[Prog] Do not underestimate the cost of refactoring.

Refactoring itself costs a lot. In particular, verifying the result – the refactored software – is extremely expensive. So, to minimize this cost, an automated test system is essential. Don't underestimate the cost: building an automated test system usually saves more than you expect.
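
A minimal sketch of such an automated test, assuming a hypothetical checksum() function that is about to be refactored: the expected values are recorded from the current version, so the refactored version can be re-verified cheaply any number of times.

#include <assert.h>
#include <stdio.h>

/* Function under refactoring (hypothetical). */
static unsigned checksum(const char *s)
{
    unsigned sum = 0;
    while (*s != '\0')
        sum = sum * 31u + (unsigned char)*s++;
    return sum;
}

int main(void)
{
    /* Expected outputs captured before refactoring started. */
    assert(checksum("") == 0u);
    assert(checksum("a") == 97u);
    assert(checksum("ab") == 97u * 31u + 98u);

    printf("all checksum regression tests passed\n");
    return 0;
}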

[Prog][Quotation] Ratio of High-Level-Language Statements to Equivalent Assembly Code

Language                     Ratio
-----------------------------------------------------------
Assembler                    1 to 1
Ada                          1 to 4.5
Quick/Turbo Basic            1 to 5
C                            1 to 2.5
Fortran                      1 to 3
Pascal                       1 to 3.5

< Source : Applied Software Measurement (Jones 1991) >

[Prog] Steps of progress (programming skill).

This is 100% personal opinion…
Steps of progress in terms of programming skill (excluding co-working skills).

1. Coding while paying attention to the language syntax.
2. Coding his/her own thoughts quickly. (Many lines of code per hour, but very buggy and poorly designed.)
3. Realizing the difficulty of debugging, so coding with debugging in mind. Initial speed becomes slower than in (2).
4. Realizing the importance of software design. Coding with design in mind too. (Slower than (3).)
5. Knowing design techniques and the importance of interface design, but still immature, so the software tends to be over-engineered. Lots of time is spent designing over-engineered software, interfaces, and so on. At this stage, the programmer knows what should be considered for good software, but doesn't yet have the ideas or experience to reach solutions. That's why initial speed is very slow.
6. Becoming mature. Getting faster step by step.

[Prog] Basic knowledge about HW is required even for the application engineer.

((To avoid misunderstanding) I am also an application engineer.)
Application engineers tend to ignore HW characteristics. But without HW, SW is useless. Even application engineers working on embedded software had better know the basics of the lower layers. Here are some examples of taking these low-layer characteristics into account; they are meant to appeal to application engineers.

* Each assembly instruction may take a different number of CPU clocks, so counting lines of assembly code is useless for judging CPU performance.
* Code optimized for “2 CPUs + 1 shared RAM” is totally different from code optimized for “2 CPUs + 1 RAM each”. Besides, DMA, the bus arbiter, etc. may also affect the code.
* In some cases, using compressed data is faster than uncompressed data. For example, using highly-compressed RLE data on a platform with a very fast CPU but slow NVRAM: the dominant performance factor there is the cost of accessing NVRAM, so the compressed data has the advantage, and the decompression overhead of RLE is very low. (See the sketch after this list.)
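
Below is a minimal sketch of why the RLE overhead is so low, assuming a simple (count, value) byte-pair encoding; real formats differ. Decoding is just one memset per run, so when reading from slow NVRAM dominates, the smaller compressed image wins.

#include <stddef.h>
#include <string.h>

/* Decode (count, value) pairs from src into dst.
 * Returns the number of bytes written, or 0 if dst is too small. */
size_t rle_decode(const unsigned char *src, size_t src_len,
                  unsigned char *dst, size_t dst_cap)
{
    size_t out = 0;

    for (size_t i = 0; i + 1 < src_len; i += 2) {
        size_t count = src[i];
        if (out + count > dst_cap)
            return 0;
        memset(dst + out, src[i + 1], count);  /* one cheap memset per run */
        out += count;
    }
    return out;
}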

The point here is: “In an embedded environment, even the application engineer needs to know the HW platform the software runs on.”