To avoid these problems, some data structures allow the inefficient operation to be postponed; this is called ''scheduling''. The only requirement is that the computation of the inefficient operation finishes before its result is actually needed. A constant part of the inefficient operation is performed during each subsequent call to an efficient operation, so that the inefficient operation is already complete by the time its result is needed, and each individual operation remains efficient.
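The idea can be sketched as follows. This is a minimal illustration in Python, not an implementation from any library: the `Schedule` helper and `reverse_steps` generator are hypothetical names, and the generator stands in for a suspended computation whose steps are forced one at a time.

```python
from collections import deque


class Schedule:
    """Hypothetical helper: an expensive computation split into
    constant-size steps, advanced one step per cheap operation."""

    def __init__(self, steps):
        self.steps = steps   # a generator yielding once per unit of work
        self.done = False

    def tick(self):
        # Perform one constant-size unit of the postponed work.
        if not self.done:
            try:
                next(self.steps)
            except StopIteration:
                self.done = True


def reverse_steps(src, out):
    """Reverse `src` into `out`, moving one cell per step."""
    for x in src:
        out.appendleft(x)
        yield


pending = deque()
sched = Schedule(reverse_steps([1, 2, 3, 4], pending))
for _ in range(4):   # four cheap operations, each advancing one step
    sched.tick()
# The reversal is now fully computed before its result is needed.
```

Each `tick` costs a constant amount of work, so the expensive reversal never makes any single operation slow.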
====Example: queue====
For example, [[amortized queue]]s are composed of two [[singly linked list]]s: the front and the reversed rear. Elements are added to the rear list and removed from the front list. Whenever the front list is empty, the rear list is reversed and becomes the new front list, and the rear list becomes empty. The amortized time complexity of each operation is constant, since each cell of a list is added, reversed and removed at most once. To avoid the occasionally inefficient operation in which the whole rear list is reversed at once, [[real-time queue]]s add the invariant that the rear list is never longer than the front list. To maintain this invariant, when the rear list would become longer than the front list, the rear list is reversed and appended to the front list. Since this operation is inefficient, it is not performed all at once. Instead, a constant part of it is carried out during each of the following operations. Therefore, each cell is computed before it is needed, and the new front list is totally computed before the next inefficient operation needs to be called.
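The two-list amortized queue can be sketched in Python as follows. The class and method names are illustrative, and Python lists stand in for the persistent singly linked lists a real implementation would use:

```python
class AmortizedQueue:
    """Two-list queue: `front` holds elements in order,
    `rear` holds them reversed (newest element first)."""

    def __init__(self, front=(), rear=()):
        self.front = list(front)
        self.rear = list(rear)

    def enqueue(self, x):
        # Push onto the reversed rear list: constant time with a
        # persistent linked list (Python lists are used only for clarity).
        return AmortizedQueue(self.front, [x] + self.rear)

    def dequeue(self):
        front, rear = self.front, self.rear
        if not front:
            # The occasionally inefficient step: reverse the rear list
            # to rebuild the front. Each cell is reversed at most once
            # over its lifetime, so the amortized cost stays constant.
            front, rear = list(reversed(rear)), []
        return front[0], AmortizedQueue(front[1:], rear)
```

For instance, after enqueueing 1, 2 and 3, the first `dequeue` triggers the reversal and returns 1; later dequeues return 2 and 3 in order. A real-time queue removes the occasional expensive reversal by performing a constant part of it during each operation, as described above.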