
Module 11: External Memory

Motivation

Different levels of memory

A typical current computer architecture includes

  • registers (very fast, very small)
  • L1/L2 caches (still fast, somewhat larger)
  • main memory
  • disk or cloud (slow, very large)

The External-Memory Model (EMM)

(Figure: the external-memory model. Computation happens in internal memory; data is moved to and from external memory in blocks of size \(B\), and cost is measured in block transfers.)

Stream-based algorithms

Stream-based algorithms (with \(O(1)\) resets) use \(\Theta\left(\frac{n}{B}\right)\) block transfers.


Tasks that can be performed:

  1. Text Compression:
    • Algorithms: Huffman, Run-Length Encoding, Lempel-Ziv-Welch.
  2. Pattern Matching:
    • Algorithms: Karp-Rabin, Knuth-Morris-Pratt, Boyer-Moore.
    • Assumes internal memory has \(O(|P|)\) space, where \(|P|\) is the pattern size.
  3. Sorting:
    • Merge:
      • Can be implemented with streams (a sketch follows this list).
    • Merge-Sort:
      • Requires \(O\left(\frac{n}{B} \log n\right)\) block transfers (can be optimized further).
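To illustrate the stream-friendly merge step referenced above, here is a minimal Python sketch. The two sorted runs are consumed strictly left to right, which is what a buffered, block-at-a-time implementation needs to achieve \(\Theta(n/B)\) block transfers; the function name merge_streams and the iterator interface are assumptions of this sketch, not part of the notes.

```python
from typing import Iterable, Iterator

def merge_streams(run_a: Iterable[int], run_b: Iterable[int]) -> Iterator[int]:
    """Merge two sorted input streams into one sorted output stream.

    Each input is read once, left to right (a single sequential scan),
    so with block-buffered I/O the merge costs Theta(n / B) block
    transfers overall.  Assumes the runs contain no None values.
    """
    it_a, it_b = iter(run_a), iter(run_b)
    a = next(it_a, None)
    b = next(it_b, None)
    while a is not None and b is not None:
        if a <= b:
            yield a
            a = next(it_a, None)
        else:
            yield b
            b = next(it_b, None)
    # One run is exhausted; drain the other.
    while a is not None:
        yield a
        a = next(it_a, None)
    while b is not None:
        yield b
        b = next(it_b, None)

# Example: merging two sorted runs.
print(list(merge_streams([1, 4, 9], [2, 3, 10])))  # [1, 2, 3, 4, 9, 10]
```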

External Dictionaries

Dictionaries in External Memory

  • Dictionaries: Store \(n\) key-value pairs, support search, insert, delete.
  • AVL-Trees:
    • Optimal in RAM: \(\Theta(\log n)\) runtime.
    • External memory: \(O(\log n)\) block transfers per operation.
  • Challenges:
    • Inserts happen at varying locations, so nearby tree nodes generally do not end up in the same block.
    • As a result, an operation typically costs \(\Theta(\log n)\) block transfers.
  • Goal:
    • Reduce to \(O(\log_B n)\) block transfers.
    • Example: For \(n \approx 2^{50}\) and \(B \approx 2^{15}\), \(\log n = 50\) while \(\log_B n = 50/15 \approx 3.3\), a significant difference.
  • Better Solution:
    • Design tree structure ensuring most search-path nodes fit in one block.

Towards a-b-trees

  • Multiway-tree:
    • A node can store multiple keys.
  • Definition:

    • A d-node:
      • Stores d keys.
      • Has d+1 subtrees.
      • Keys are arranged in increasing order \(k_1 < k_2 < \cdots < k_d\), and for \(0 \leq i \leq d\) every key in subtree \(T_i\) lies between \(k_i\) and \(k_{i+1}\) (where \(k_0 = -\infty\) and \(k_{d+1} = +\infty\)).

  • Properties:

    • Always one more subtree than keys (subtrees may be empty).
  • Advantages:
    • Allows flexible insert/delete by permitting varying numbers of keys in nodes (within limits).
    • Restricts where empty subtrees may exist.
    • Results in a smaller tree height compared to AVL-trees.
      • Benefit: Fewer block transfers.

Definition of a-b-trees

An a-b-tree (for some \(b \geq 3\) and \(2 \leq a \leq \lceil \frac{b}{2} \rceil\)) satisfies:

  1. Non-root Nodes:
    • Each is a \(d\)-node where \(a-1 \leq d \leq b-1\).
    • Has \(a\) to \(b\) subtrees and \(a-1\) to \(b-1\) keys.
  2. Root Node:
    • Is a \(d\)-node where \(1 \leq d \leq b-1\).
    • Has \(2\) to \(b\) subtrees and \(1\) to \(b-1\) keys.
  3. Empty Subtrees:
    • All empty subtrees are at the same level.

Typically, the order \(b\) is specified, and \(a\) is set as \(a = \lceil \frac{b}{2} \rceil\).

  • Note: Small height allows storing many keys.
  • Example: A 3-6-tree of height 2 can store up to: \((1 + 6 + 36) \cdot 5 = 215 \text{ keys.}\)

Note

Explanation of Calculation

  1. Structure of the Tree:
    • A 3-6-tree means:
      • Each node can have 3 to 6 subtrees.
      • Each node stores 2 to 5 keys (always one less than the number of subtrees).
  2. Number of Nodes at Each Level:
    • Level 0 (root): 1 node.
    • Level 1: the root has up to 6 children, so up to 6 nodes.
    • Level 2: each level-1 node has up to 6 children, so up to 36 nodes.
  3. Total Nodes:
    • Sum up the nodes at each level: \(1 + 6 + 36 = 43 \text{ nodes.}\)
  4. Total Keys:
    • Each node can have up to 5 keys: \(43 \cdot 5 = 215 \text{ keys.}\)
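The same counting works for any order and height. A small Python check of the arithmetic (the helper name max_keys is an invention of this sketch):

```python
def max_keys(b: int, height: int) -> int:
    """Maximum number of keys in a tree of the given height in which every
    node has at most b subtrees and at most b - 1 keys."""
    nodes = sum(b ** level for level in range(height + 1))  # 1 + b + ... + b^height
    return nodes * (b - 1)

print(max_keys(6, 2))   # 215, the 3-6-tree example above
```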

a-b-tree Height

Theorem:

An a-b-tree with \(n\) keys has height \(O(\log_a(n))\).

Proof:

To determine the minimum number of keys an a-b-tree of height \(h\) must have:

  1. Nodes per Level:
    • Level 1: At least 2 nodes.
    • Level 2: At least \(2a\) nodes.
    • Level 3: At least \(2a^2\) nodes.
    • Level h: At least \(2a^{h-1}\) nodes.
  2. Total Non-Root Nodes:

    • Sum of all non-root nodes:

      \[ \# \text{ non-root nodes} \geq \sum_{i=1}^h 2a^{i-1} = 2 \sum_{j=0}^{h-1} a^j = 2 \cdot \frac{a^h - 1}{a - 1} \]
  3. Total Keys:

    • Root contributes 1 key.
    • Each non-root node has at least \(a-1\) keys:

      \[ n = \#\text{ KVPs} \geq 1 + (a-1) \cdot \frac{2(a^h - 1)}{a - 1} = 2a^h - 1 \]

Here the \(1\) accounts for the key at the root, and each non-root node contributes at least \(a-1\) KVPs.

Conclusion

Solving \(n \geq 2a^h - 1\) for \(h\) shows that the height grows logarithmically in the number of keys:

\[ h \leq \log_a\left(\frac{n+1}{2}\right) \]

a-b-tree Operations

  • Similar to Binary Search Tree (BST):
    1. Compare the search key with keys at the current node.
    2. If not found, move to the appropriate subtree.
    3. Repeat until the key is found or an empty subtree is reached.
**abTree::search(k)**

z ← root, p ← NULL     // p: parent of z
while z is not NULL
    let ⟨T₀, k₁, ..., k_d, T_d⟩ be key-subtree list at z
    if k ≥ k₁
        i ← maximal index such that k_i ≤ k
        if k_i = k then return KVP at k_i
        else p ← z, z ← root of T_i
    else p ← z, z ← root of T₀
return "not found, would be in p"

Notes:

  • Visited Nodes: \(O(\log_a(n))\) (one per level of the tree).
  • Finding \(i\): Not constant time, depends on \(b\) (number of keys in a node).
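As a concrete rendering of abTree::search, here is a minimal Python sketch. It assumes a node is represented by three parallel lists (sorted keys, their values, and the child subtrees, with None standing for an empty subtree); the class name ABNode and the linear scan inside a node are choices of this sketch, not part of the notes.

```python
from dataclasses import dataclass, field
from typing import Any, Optional

@dataclass
class ABNode:
    keys: list = field(default_factory=list)       # k_1 <= ... <= k_d
    values: list = field(default_factory=list)     # value paired with each key
    children: list = field(default_factory=list)   # d + 1 subtrees; None = empty

def ab_search(root: ABNode, k) -> Optional[Any]:
    """Return the value stored with key k, or None if k is absent.

    One node is visited per level, so O(log_a n) nodes in total; the scan
    inside a node depends only on b, not on n."""
    z = root
    while z is not None:
        i = 0
        while i < len(z.keys) and z.keys[i] <= k:   # maximal i with k_i <= k
            i += 1
        if i > 0 and z.keys[i - 1] == k:
            return z.values[i - 1]                  # found the KVP
        z = z.children[i] if z.children else None   # descend into T_i
    return None                                     # not found

# Example: a single node storing the keys 10 and 20.
root = ABNode(keys=[10, 20], values=["a", "b"], children=[None, None, None])
print(ab_search(root, 20))   # "b"
print(ab_search(root, 15))   # None
```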

Insert

Steps:

  1. Perform abTree::search to locate the correct leaf.
  2. Add the key and an empty subtree at the leaf.
  3. Check for space:
    • If the leaf has room, the operation is complete.
    • If the leaf overflows (too many keys/subtrees), resolve it by node splitting.
**abTree::insert(k)**

z ← abTree::search(k)       // z: leaf where k should be
Add k and an empty subtree in key-subtree list of z
while z has b keys (overflow → node split)
    Let ⟨T₀, k₁, ..., k_b, T_b⟩ be key-subtree list at z
    if (z has no parent) create a parent of z without KVPs
    move upper median k_m of keys to parent p of z
    z' ← new node with ⟨T₀, k₁, ..., k_{m-1}, T_{m-1}⟩
    z'' ← new node with ⟨T_m, k_{m+1}, ..., k_b, T_b⟩
    Replace ⟨z⟩ by ⟨z', k_m, z''⟩ in key-subtree list of p
    z ← p

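Continuing the ABNode sketch from the search example, the following is a hedged Python rendering of insert with node splitting: descend to the leaf while remembering the search path, add the key, then resolve overflows bottom-up by splitting at the upper median and moving it into the parent, as in the pseudocode. The function name ab_insert and the explicit order parameter b are assumptions of this sketch.

```python
def ab_insert(root: ABNode, k, v, b: int) -> ABNode:
    """Insert (k, v) and return the (possibly new) root; a node overflows
    once it holds b keys."""
    # Descend to the leaf where k belongs, remembering the path and the
    # child index taken at each node.
    path = []
    z = root
    while True:
        i = 0
        while i < len(z.keys) and z.keys[i] <= k:
            i += 1
        path.append((z, i))
        child = z.children[i] if z.children else None
        if child is None:
            break
        z = child

    # Add k (and a matching empty subtree) to the key-subtree list of the leaf.
    leaf, i = path[-1]
    leaf.keys.insert(i, k)
    leaf.values.insert(i, v)
    if leaf.children:
        leaf.children.insert(i + 1, None)
    else:
        leaf.children = [None] * (len(leaf.keys) + 1)

    # Resolve overflow by node splits, moving up the path.
    depth = len(path) - 1
    while len(path[depth][0].keys) >= b:
        z, _ = path[depth]
        m = len(z.keys) // 2                        # index of the upper median
        mid_key, mid_val = z.keys[m], z.values[m]
        left = ABNode(z.keys[:m], z.values[:m], z.children[:m + 1])
        right = ABNode(z.keys[m + 1:], z.values[m + 1:], z.children[m + 1:])
        if depth == 0:
            # Splitting the root: create a new root holding only the median.
            return ABNode([mid_key], [mid_val], [left, right])
        parent, j = path[depth - 1]
        parent.keys.insert(j, mid_key)              # move the median up
        parent.values.insert(j, mid_val)
        parent.children[j:j + 1] = [left, right]    # replace z by ⟨left, mid, right⟩
        depth -= 1
    return root

# Example: build a 2-4-tree (b = 4) by repeated insertion, then search it.
root = ABNode(children=[None])
for key in [10, 20, 30, 40, 50]:
    root = ab_insert(root, key, str(key), b=4)
print(ab_search(root, 40))   # "40"
```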

a-b-tree Summary

  • Height:
    • An a-b tree has height \(O(\log_a(n))\).
    • For \(a \approx b/2\), this height is optimal:
      • Level \(i\) has at most \(b^i\) nodes.
      • Each node holds at most \(b-1\) key-value pairs (KVPs).
      • Total keys: \(n \leq b^{h+1} - 1\), with \(h \in \Omega(\log_b(n))\).
  • Operations:
    • Search and Insert: \(O(\log_a(n))\) node visits.
    • Delete:
      • Implemented with \(O(\log_a(n))\) node visits.
      • Lazy deletion is often used, since space is cheap in external memory.
  • Choosing Order \(b\):
    • Option 1: Small \(b\) (e.g., \(b = 4\)):
      • Similar to a balanced BST, competitive with AVL-trees.
    • Option 2: Large \(b\) (fits into one memory block):
      • Optimized for external memory as an ADT Dictionary realization.

2-4-tree to Red-Black-tree Conversion

Conversion Process:

  • A d-node in a 2-4-tree:
    • Becomes a black node with \(d-1\) red children.
    • These red children are arranged to form a BST of height at most 1.

Resulting Properties:

  1. Red Node Property:
    • Every red node must have a black parent.
  2. Black-Depth Property:
    • Any empty subtree \(T\) has the same black-depth (number of black nodes on the path from root to \(T\)).

This ensures the resulting red-black tree maintains balanced height and structure.
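A hedged Python sketch of this conversion. A 2-4-tree node is assumed to be given as a (keys, children) pair with len(keys) + 1 children (None for an empty subtree); the names RBNode and to_red_black, and the particular choice of which key becomes the black node, are illustrative choices of this sketch.

```python
from dataclasses import dataclass
from typing import Any, List, Optional, Tuple

RED, BLACK = "red", "black"

@dataclass
class RBNode:
    key: Any
    colour: str
    left: Optional["RBNode"] = None
    right: Optional["RBNode"] = None

# A 2-4-tree node is modelled as (keys, children); children has
# len(keys) + 1 entries, each another such pair or None (empty subtree).
Node24 = Tuple[List[Any], List[Optional["Node24"]]]

def to_red_black(t: Optional[Node24]) -> Optional[RBNode]:
    """Convert a 2-4-tree into a red-black tree: a node with d keys becomes
    a black node with d - 1 red children, a BST of height at most 1."""
    if t is None:
        return None
    keys, children = t
    sub = [to_red_black(c) for c in children]
    if len(keys) == 1:                      # one key: a single black node
        return RBNode(keys[0], BLACK, sub[0], sub[1])
    if len(keys) == 2:                      # two keys: black node, one red child
        red = RBNode(keys[0], RED, sub[0], sub[1])
        return RBNode(keys[1], BLACK, red, sub[2])
    left = RBNode(keys[0], RED, sub[0], sub[1])     # three keys: black middle key
    right = RBNode(keys[2], RED, sub[2], sub[3])    # with two red children
    return RBNode(keys[1], BLACK, left, right)

# Example: a 2-4-tree whose root has two keys and three leaf children.
tree = ([10, 20], [([5], [None, None]),
                   ([15], [None, None]),
                   ([25, 30], [None, None, None])])
rb = to_red_black(tree)
print(rb.colour, rb.key, rb.left.colour, rb.left.key)   # black 20 red 10
```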

Red-Black Trees

Definition

A red-black tree is a binary search tree (BST) with the following properties:

  1. Node Colors:
    • Every node is either red or black.
  2. Red Node Property:
    • Every red node has a black parent.
    • The root is always black.
  3. Black-Depth Property:
    • Every empty subtree \(T\) has the same black-depth:
      • The number of black nodes from the root to \(T\) is the same across all paths.

Notes:

  • Efficient Storage:
    • The color of each node can be stored using only one bit of overhead.
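Reusing the RBNode class (and the converted example rb) from the sketch above, here is a small checker for these properties; note that "every red node has a black parent" is checked in the equivalent form "no red node has a red child". The function name check_red_black is illustrative.

```python
def check_red_black(node, is_root: bool = True) -> int:
    """Check the red-node and black-depth properties below node and return
    the common black-depth of its empty subtrees; raises AssertionError
    if a property is violated."""
    if node is None:
        return 0                                           # empty subtree
    if is_root:
        assert node.colour == BLACK, "the root must be black"
    if node.colour == RED:
        for child in (node.left, node.right):
            assert child is None or child.colour == BLACK, "red node with red child"
    d_left = check_red_black(node.left, is_root=False)
    d_right = check_red_black(node.right, is_root=False)
    assert d_left == d_right, "black-depths disagree"
    return d_left + (1 if node.colour == BLACK else 0)

print(check_red_black(rb))   # 2: every empty subtree has black-depth 2
```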

Red-Black Tree to 2-4 Tree Conversion

Lemma:

Any red-black tree \(T\) can be converted into a 2-4 tree \(T'\).

Conversion Details:

  • A black node with \(0 \leq d \leq 2\) red children becomes a \((d+1)\)-node in the 2-4 tree.
  • This works because:
    • No red node in the red-black tree has a red child (every red node has a black parent), so each red node is absorbed into its black parent.
    • Empty subtrees end up at the same level, since every path has the same black-depth.

Proof:

  1. Node Conversion:
    • Black nodes and their red children combine to form single nodes in the 2-4 tree.
    • Ensures adherence to 2-4 tree properties.
  2. Black-Depth Preservation:
    • Empty subtrees remain aligned across levels in the resulting 2-4 tree, as black-depth remains consistent.

Red-Black Tree Summary

  • Red-black trees have height \(O(\log n)\). Each level of a 2-4-tree creates at most 2 levels in the red-black tree.
  • Insertions can be done in \(O(\log n)\) worst-case time:
    • Convert relevant parts to 2-4-tree.
    • Perform the insertion in the 2-4-tree.
    • Convert back to red-black tree.
  • Insertions can also be performed directly in the red-black tree using rotations and recoloring.

Deletions can also be performed in \(O(\log n)\) worst-case time.

Experiments show red-black trees use fewer rotations compared to AVL trees.

Red-black trees are widely used as balanced binary search trees (e.g., std::map).

B-trees

  • A B-tree adapts the a-b-tree to the external-memory model:
    • Nodes are aligned with memory blocks (size \(B\)).
    • The order \(b\) is chosen as large as possible so that a node still fits in one block; typically \(b \in \Theta(B)\).
    • The minimum number of subtrees per non-root node is \(a = \lceil b/2 \rceil\).

B-tree analysis

  • Operations (search, insert, delete):
    • Require visiting \(\Theta(\text{height})\) nodes.
    • Work within a node is done in internal memory (no block transfer).
  • Height:
    • \(\Theta(\log_a n) = \Theta(\log_B n)\) (since \(a = \lceil b/2 \rceil \in \Theta(B)\)).
  • Conclusion:
    • All operations require \(\Theta(\log_B n)\) block transfers.

B-tree summary

  • Block transfers:
    • All operations require \(\Theta(\log_B n)\) block transfers, which is asymptotically optimal.
    • Searching among \(n\) items requires \(\Omega(\log_B n)\) block transfers.
  • Practical height of B-tree:
    • Example: \(n = 2^{50}\), \(B = 2^{15}\).
    • \(b \approx \frac{1}{3} \cdot 2^{15}\), \(a \approx \frac{1}{3} \cdot 2^{14}\).
    • Any B-tree of height \(4\) must contain at least \(2a^4 - 1 > 2^{50}\) key-value pairs.
    • Thus, for \(n = 2^{50}\) the height is typically \(3\).
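A quick numeric check of this example in Python, using the bounds derived earlier (a tree of height \(h\) holds at least \(2a^h - 1\) and at most \(b^{h+1} - 1\) KVPs); the constant \(1/3\) mirrors the rough estimate above.

```python
from math import ceil

n = 2 ** 50
B = 2 ** 15
b = B // 3                  # order: roughly a third of the block size
a = ceil(b / 2)             # minimum number of subtrees per non-root node

print(2 * a ** 4 - 1 > n)   # True: any tree of height 4 holds more than n keys
print(b ** 4 - 1 > n)       # True: a tree of height 3 can hold all n keys
```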

External Hashing

Trie of blocks – Overview

  • Assumptions:
    • Store fixed-length bitstrings (from hash values, not necessarily distinct).
  • Construction:
    • Build a trie \(D\) (directory) of bitstrings in internal memory.
    • Stop splitting in \(D\) when remaining items fit in one block (similar to a pruned trie, but stop earlier).
  • Leaves and Blocks:
    • Each leaf of \(D\) refers to a block of external memory.
    • Blocks store key-value pairs (KVPs) in no particular order.

Trie of blocks – operations

  • search(\(k\)):
    • Traverse trie \(D\) to find leaf \(\ell\).
    • Load block at \(\ell\).
    • Search for \(k\) within the block.
    • Block Transfers: 1.
  • delete(\(k\)):
    • Perform search(\(k\)) to load the block.
    • Remove \(k\) from the block.
    • Transfer the updated block back to external memory.
    • Block Transfers: 2.
    • (Optional: Combine underfilled blocks, but this usually isn't worth the additional block transfers.)
  • insert(\(k\)):
    • Traverse trie \(D\) to locate leaf \(\ell\).
    • Load block \(P\) at \(\ell\).
    • If \(P\) is full:
      • Split the block and assign new children to \(\ell\).
      • Create two new blocks and redistribute items by their next bit.
    • Insert \(k\) into the appropriate block.
    • Transfer the updated block back to external memory.
    • Block Transfers: Typically 2–3.
    • (Note: Repeated splits are unlikely for large \(B\).)
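To make these operations concrete, here is a simplified Python sketch of a trie of blocks (in the spirit of extendible hashing). The directory lives in ordinary Python objects standing in for internal memory, a block is modelled as a list of at most BLOCK_CAPACITY pairs, and all names (Trie, BLOCK_CAPACITY, search, insert) are illustrative. The degenerate case of more than a block's worth of identical bitstrings (the collision problem discussed next) is not handled.

```python
from typing import Dict, List, Tuple, Union

BLOCK_CAPACITY = 4   # stand-in for B: how many KVPs fit into one external block

# A trie is either a leaf, i.e. a "block" (a list of (bitstring, value) pairs
# in no particular order), or a directory node {"0": subtrie, "1": subtrie}.
Trie = Union[List[Tuple[str, object]], Dict[str, "Trie"]]

def search(trie: Trie, bits: str):
    """Walk the directory on the bits of the hashed key, then scan one block
    (loading that block is the single block transfer)."""
    depth = 0
    while isinstance(trie, dict):
        trie = trie[bits[depth]]
        depth += 1
    for stored_bits, value in trie:
        if stored_bits == bits:
            return value
    return None

def insert(trie: Trie, bits: str, value, depth: int = 0) -> Trie:
    """Insert (bits, value); split a full block on the next bit if necessary."""
    if isinstance(trie, dict):
        branch = bits[depth]
        trie[branch] = insert(trie[branch], bits, value, depth + 1)
        return trie
    trie.append((bits, value))                # write into the loaded block
    if len(trie) <= BLOCK_CAPACITY:
        return trie
    # Overflow: replace the leaf by a directory node with two new blocks and
    # redistribute the items by their next bit.
    node: Dict[str, Trie] = {"0": [], "1": []}
    for stored_bits, stored_value in trie:
        node[stored_bits[depth]].append((stored_bits, stored_value))
    # If almost all items agree on that bit, one child may still overflow;
    # re-inserting splits it again (repeated splits are unlikely for large B).
    for branch in ("0", "1"):
        if len(node[branch]) > BLOCK_CAPACITY:
            overfull, node[branch] = node[branch], []
            for stored_bits, stored_value in overfull:
                node[branch] = insert(node[branch], stored_bits, stored_value, depth + 1)
    return node

# Example: keys are 8-bit hash values.
directory: Trie = []                          # initially a single block
for i, key in enumerate(["00010110", "00111000", "01000001", "10101010", "11110000"]):
    directory = insert(directory, key, i)
print(search(directory, "10101010"))          # 3
```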

External Hashing Collisions

  • Definition:
    • A collision occurs when different keys hash to the same bitstring, so the colliding items are placed in the same block.
    • Resolving collisions within a block is not a concern.
  • Problem:
    • If more than \(B\) items share the same hash value:
      • All bitstrings in the block are identical, preventing splits.
      • This indicates either:
        • High load factor.
        • Poor hash function.
      • Typically, rehashing would be necessary.
  • Solution:
    • Instead of rehashing, extend the hash function:
      • Replace \(h(k)\) with \(h(k) + h'(k)\), i.e., append the bits of a new hash function \(h'(k)\).
      • Initial bits remain unchanged, so other blocks are unaffected.

External Hashing Summary

  • Efficiency:
    • \(O(1)\) block transfers are expected for any operation.
  • Space Management:
    • To create more space, typically only one block is added.
    • The size of the directory is rarely changed.
    • No need to move all items (unlike rehashing).
  • Directory \(D\):
    • Usually fits into internal memory.
    • If \(D\) does not fit:
      • Similar strategies to B-trees can be applied.
      • \(D\) can also be stored as an array, reducing its size.
  • Space Usage:
    • Many blocks will not be completely full, but space usage remains efficient.
    • For randomly chosen bitstrings, each block is expected to be 69% full.