Overhead (computing)

From Wikipedia, the free encyclopedia

Overhead in computer systems consists of shared functions that benefit all users or processes but are not directly attributable to any specific task. It is thus similar to overhead in organizations. Computer system overhead shows up as slower processing, less memory, less network bandwidth, or higher latency than would be expected from reading the system specifications.[1] It is a special case of engineering overhead. Overhead can be a deciding factor in software design with regard to structure, error correction, and feature inclusion. Examples of computing overhead may be found in object-oriented programming (OOP), functional programming,[citation needed] data transfer, and data structures.

Software design

Choice of implementation

A programmer/software engineer may have a choice of several algorithms, encodings, data types or data structures, each of which have known characteristics. When choosing among them, their respective overhead should also be considered.

Tradeoffs

In software engineering, overhead can influence the decision whether or not to include features in new products, or indeed whether to fix bugs. A feature with high overhead may not be included – or may require a significant financial incentive to justify its inclusion. Often, even though software providers are well aware of bugs in their products, the benefit of fixing them does not justify the overhead involved.

For example, an implicit data structure or succinct data structure may provide low space overhead, but at the cost of slow performance (space/time tradeoff).
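
As an illustration, the following C++ sketch (with illustrative names, not a definitive implementation) shows the per-element space overhead that a pointer-based linked list pays for its links, which a contiguous array – closer in spirit to an implicit data structure – avoids at the cost of more expensive insertion:

// Minimal sketch: per-element space overhead of a pointer-based structure
// compared with a contiguous array holding the same payload.
#include <cstdio>

struct ListNode {
    int       value;  // 4 bytes of payload
    ListNode* next;   // link overhead: 8 bytes on a typical 64-bit system
};                    // (plus possible padding)

int main() {
    std::printf("payload per element:   %zu bytes\n", sizeof(int));
    std::printf("list node per element: %zu bytes\n", sizeof(ListNode));
    // A plain array of int stores only the payload, trading cheap insertion
    // for lower space overhead (a space/time tradeoff).
    return 0;
}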

Run-time complexity of software

Algorithmic complexity is generally specified using Big O notation. This makes no statement about how long an algorithm takes to run or how much memory it uses, but about how its resource requirements grow with the size of the input. Overhead is deliberately excluded from this measure, since it varies from one machine to another, whereas the fundamental growth rate of an algorithm's running time does not.

This should be contrasted with algorithmic efficiency, which takes into account all kinds of resources – a combination (though not a trivial one) of complexity and overhead.

Examples

Computer programming (run-time and computational overhead)

Invoking a function introduces a small run-time overhead.[2] Sometimes the compiler can minimize this overhead by inlining some of these function calls.[3]
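
The following C++ sketch (purely illustrative; the function and loop are not taken from any particular program) shows a trivial function whose call overhead a compiler may remove by inlining:

// Minimal sketch: a small function whose work (one addition) can cost less
// than the call overhead (argument passing, jump, return) itself.
#include <cstdio>

inline int add_one(int x) {
    return x + 1;
}

int main() {
    int total = 0;
    for (int i = 0; i < 1000000; ++i) {
        // With inlining, the compiler may replace this call with the
        // addition directly, removing the per-call overhead.
        total = add_one(total);
    }
    std::printf("%d\n", total);
    return 0;
}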

CPU caches

In a CPU cache, the "cache size" (or capacity) refers to how much data a cache stores. For instance, a "4 KB cache" is a cache that holds 4 KB of data. The "4 KB" in this example excludes overhead bits such as frame, address, and tag information.[4]
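
The following C++ sketch estimates this overhead for a hypothetical configuration – a 4 KB direct-mapped cache with 64-byte blocks, 32-bit addresses, and one valid bit per line; the parameters are assumptions for illustration, not those of any specific CPU:

// Minimal sketch: estimating the tag/metadata overhead of a hypothetical
// 4 KB direct-mapped cache with 64-byte blocks and 32-bit addresses.
#include <cmath>
#include <cstdio>

int main() {
    const int cache_bytes  = 4 * 1024;  // advertised "4 KB" of data
    const int block_bytes  = 64;        // bytes of data per cache line
    const int address_bits = 32;

    const int num_lines   = cache_bytes / block_bytes;                 // 64 lines
    const int offset_bits = static_cast<int>(std::log2(block_bytes));  // 6
    const int index_bits  = static_cast<int>(std::log2(num_lines));    // 6
    const int tag_bits    = address_bits - offset_bits - index_bits;   // 20
    const int valid_bits  = 1;                                         // per line

    const int overhead_bits = num_lines * (tag_bits + valid_bits);
    std::printf("data: %d bytes, metadata: %d bits (%.1f%% extra)\n",
                cache_bytes, overhead_bits,
                100.0 * overhead_bits / (cache_bytes * 8.0));
    return 0;
}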

Communications (data transfer overhead)

Reliably sending a payload of data over a communications network requires sending more than just the payload itself. It also involves sending various control and signalling data (for example, the headers and acknowledgements of TCP) required to deliver the payload to its destination. This creates so-called protocol overhead, as the additional data does not contribute to the intrinsic meaning of the message.[5][6]

In telephony, number dialing and call set-up time are overheads. In two-way (but half-duplex) radios, the use of "over" and other signaling needed to avoid collisions is an overhead.

Protocol overhead can be expressed as a percentage: the number of non-application bytes (protocol and frame synchronization) divided by the total number of bytes in the message.
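
For example, the following C++ sketch computes the protocol overhead for a single small payload carried in one TCP/IPv4 segment inside an Ethernet frame, assuming minimum header sizes with no options (the payload size is an arbitrary illustration):

// Minimal sketch: protocol overhead for one small payload carried in a
// single TCP/IPv4 segment inside an Ethernet frame. Header sizes are the
// common minimums (no options); real traffic varies.
#include <cstdio>

int main() {
    const int payload_bytes  = 100;      // application data
    const int tcp_header     = 20;       // TCP header, no options
    const int ipv4_header    = 20;       // IPv4 header, no options
    const int ethernet_frame = 14 + 4;   // Ethernet header + frame check sequence

    const int overhead = tcp_header + ipv4_header + ethernet_frame;
    const int total    = payload_bytes + overhead;

    std::printf("overhead: %d of %d bytes (%.1f%%)\n",
                overhead, total, 100.0 * overhead / total);
    return 0;
}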

Encodings and data structures (size overhead)

The encoding of information and data introduces overhead too. The date and time "2011-07-12 07:18:47" can be expressed as Unix time with the 32-bit signed integer 1310447927, consuming only 4 bytes. Represented as an ISO 8601-formatted, UTF-8-encoded string, 2011-07-12 07:18:47, the date consumes 19 bytes, a size overhead of 375% over the binary integer representation. Written as XML, as shown below, the date carries an overhead of 218 characters while adding the semantic context that it is a "changedate" with index 1.

<?xml version="1.0" encoding="UTF-8"?>
<datetime qualifier="changedate" index="1">
  <year>2011</year>
  <month>07</month>
  <day>12</day>
  <hour>07</hour>
  <minute>18</minute>
  <second>47</second>
</datetime>

The 349 bytes resulting from the UTF-8 encoded XML correspond to a size overhead of 8625% over the original integer representation.
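
The arithmetic behind these figures can be reproduced with a short C++ sketch (the XML size is taken from the example above):

// Minimal sketch: size overhead of text encodings of the same timestamp,
// reproducing the figures above (4-byte Unix time vs. 19-byte ISO 8601
// string vs. 349-byte XML document).
#include <cstdint>
#include <cstdio>
#include <cstring>

int main() {
    const std::int32_t unix_time = 1310447927;           // Unix time for the date above
    const char* iso8601          = "2011-07-12 07:18:47";

    const std::size_t binary_size = sizeof(unix_time);    // 4 bytes
    const std::size_t text_size   = std::strlen(iso8601); // 19 bytes
    const std::size_t xml_size    = 349;                  // size of the XML above

    std::printf("ISO 8601 overhead: %.0f%%\n",
                100.0 * (text_size - binary_size) / binary_size);  // 375%
    std::printf("XML overhead:      %.0f%%\n",
                100.0 * (xml_size - binary_size) / binary_size);   // 8625%
    return 0;
}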

File systems

Besides the files themselves, computer file systems use a portion of the available space to store metadata: directory names and listings, file names, the sector locations of files, attributes such as the date and time of creation and last modification, information on how files are fragmented, maps of written and free space, and, on some file systems, a journal.

Many small files create more overhead than a small number of large files.
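
One reason is that space is allocated in fixed-size units. The following C++ sketch assumes a hypothetical 4 KiB allocation unit and compares the on-disk footprint of one large file with that of many small files holding the same total payload (per-file metadata, not modelled here, adds further overhead):

// Minimal sketch: space consumed on disk when files are stored in fixed-size
// allocation units (a 4 KiB cluster size is assumed here for illustration).
#include <cstdio>

// Bytes actually consumed on disk for a file of the given logical size.
long long on_disk_size(long long file_bytes, long long cluster_bytes) {
    const long long clusters = (file_bytes + cluster_bytes - 1) / cluster_bytes;
    return clusters * cluster_bytes;
}

int main() {
    const long long cluster = 4096;   // assumed allocation unit
    // One 1 MiB file vs. 1024 files of 1 KiB each: same payload, very
    // different overhead, because each small file rounds up to a full cluster.
    const long long one_big    = on_disk_size(1024 * 1024, cluster);
    const long long many_small = 1024 * on_disk_size(1024, cluster);
    std::printf("one 1 MiB file:     %lld bytes on disk\n", one_big);
    std::printf("1024 x 1 KiB files: %lld bytes on disk\n", many_small);
    return 0;
}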

References

  1. ^ Denning, Peter (January 2003). "Overhead". Encyclopedia of Computer Science. John Wiley and Sons. pp. 1341–1343. ISBN 978-0-470-86412-8.
  2. ^ "Inline functions (C++)". Microsoft Learn. Microsoft. 22 January 2024. Retrieved 22 March 2024.
  3. ^ Mahaffey, Terry (24 July 2019). "Inlining Decisions in Visual Studio". C++ Team Blog. Microsoft.
  4. ^ Sorin, Daniel J. (2009). "Caches and Memory Hierarchies" (PDF). Retrieved March 13, 2019. Presentation for a course in computer architecture.
  5. ^ "Common Performance Issues in Network Applications Part 1: Interactive Applications". Windows XP Technical Articles. Microsoft.
  6. ^ "Protocol Overhead in IP/ATM Networks". Minnesota Supercomputer Center.