Optimization
This article is about improving programs. For the class of problems, see optimization problem.
To optimize a correct program is to engineer its design or implementation to improve its performance. This generally means decreasing its time complexity, space complexity, CPU time usage, or memory usage. Sometimes all of these can be decreased at once; in other cases one is sacrificed for another, most notably in the so-called time/memory tradeoff. Less commonly in algorithmic programming, optimization can also mean reducing disk seeks, cache misses, or response time, or increasing throughput. A specific technique that optimizes a program, such as I/O buffering, is also referred to as an optimization.

Generally, an optimization improves the targeted performance statistic for all or almost all conceivable inputs, or its improvements vastly outweigh its regressions (for example, an optimized program may run slightly more slowly on very small inputs but significantly more quickly on medium to large ones). It thus improves performance in both the average case and the worst case. When the average-case performance is already excellent but the worst-case performance needs improvement, randomization may be easier than optimization: choosing quicksort pivots at random, for instance, defends against adversarial worst-case inputs without changing the average-case running time.
For optimizations that do not affect asymptotic (scaling) behaviour, but improve time or memory usage by a constant factor only, see invisible constant factor.
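As a concrete illustration of both I/O buffering and a constant-factor improvement, the sketch below reads integers from standard input through a manual 64 KiB buffer instead of one library call per character. This is a minimal sketch, not code from the cited material; the input format (whitespace-separated nonnegative integers) and the task of summing them are assumptions made for the example.

```cpp
#include <cstddef>
#include <cstdio>

// I/O buffering as a constant-factor optimization: stdin is consumed in
// 64 KiB chunks instead of one character per library call. The asymptotic
// cost is unchanged; only the per-character overhead shrinks.
static char buf[1 << 16];
static size_t bufLen = 0, bufPos = 0;

static int nextByte() {
    if (bufPos == bufLen) {
        bufLen = fread(buf, 1, sizeof buf, stdin);
        bufPos = 0;
        if (bufLen == 0) return -1;            // end of input
    }
    return (unsigned char)buf[bufPos++];
}

// Parses one whitespace-separated nonnegative integer; returns false at EOF.
// (Assumed input format for this sketch.)
static bool readInt(long long& out) {
    int c = nextByte();
    while (c == ' ' || c == '\n' || c == '\r' || c == '\t') c = nextByte();
    if (c < '0' || c > '9') return false;
    for (out = 0; c >= '0' && c <= '9'; c = nextByte())
        out = out * 10 + (c - '0');
    return true;
}

int main() {
    long long x, total = 0;
    while (readInt(x)) total += x;             // assumed task: sum all integers
    printf("%lld\n", total);
    return 0;
}
```

The reading loop still touches every character exactly once, so the running time remains linear in the input size; the buffer only amortizes the fixed cost of each standard-library call over many characters, which is exactly the kind of constant-factor gain the note above describes.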
Pruning
In recursive backtracking algorithms, the central tenets of optimization are the following (a pruning sketch follows the list):[1]
- Prune early, prune often.
- Don't do anything stupid. Don't do anything twice.
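The sketch below applies the first maxim to a small subset-sum counter: two prunes cut off branches as early as possible, and sorting the values in descending order makes the overshoot prune fire sooner. It is a minimal illustration, assuming positive integer values and a hypothetical input; it is not the code from the cited USACO pages.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Counts subsets of `nums` summing exactly to `target`. Assumes positive
// values, so a branch that reaches the target cannot be usefully extended.
// suffix[i] holds nums[i] + nums[i+1] + ..., used by the second prune.
long long countSubsets(const std::vector<int>& nums,
                       const std::vector<long long>& suffix,
                       size_t i, long long sum, long long target) {
    if (sum == target) return 1;                     // exact hit (values are positive)
    if (i == nums.size() || sum > target) return 0;  // prune: overshoot is unrecoverable
    if (sum + suffix[i] < target) return 0;          // prune: not enough value remains
    // Branch: take nums[i], then skip it.
    return countSubsets(nums, suffix, i + 1, sum + nums[i], target)
         + countSubsets(nums, suffix, i + 1, sum, target);
}

int main() {
    std::vector<int> nums = {4, 8, 1, 3, 6, 2, 5};   // hypothetical input
    long long target = 10;                           // hypothetical target
    // Descending order lets the overshoot prune trigger as early as possible.
    std::sort(nums.rbegin(), nums.rend());
    std::vector<long long> suffix(nums.size() + 1, 0);
    for (size_t i = nums.size(); i-- > 0; )
        suffix[i] = suffix[i + 1] + nums[i];
    printf("%lld\n", countSubsets(nums, suffix, 0, 0, target));
    return 0;
}
```

The second maxim, "don't do anything twice," is most often realized by caching repeated (index, sum) states via memoization, which this sketch omits for brevity.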
References
- ↑ Optimization Techniques. USACO Training Pages, Section 4.1.