Non-Lexical Lifetimes (NLL) use a control-flow-graph (CFG) based dataflow analysis that computes the liveness of borrowed data at the MIR level. Instead of anchoring borrow lifetimes to lexical scopes, the compiler constructs a CFG whose nodes represent program points. A borrow is live only along paths from its creation to its last use, as determined by a backward liveness analysis. This lets the compiler accept programs where a mutable borrow begins after the last use of an immutable borrow, even within the same block. The analysis still rejects programs where any path could lead to a use-after-free, preserving safety while permitting previously rejected valid programs.
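A minimal sketch of what the analysis accepts: both borrows live in one lexical block, but the immutable borrow is dead before the mutable one begins.

```rust
fn main() {
    let mut data = vec![1, 2, 3];

    // Immutable borrow: under NLL it is live only until its last use below.
    let first = &data[0];
    assert_eq!(*first, 1);

    // Accepted under NLL: the dataflow analysis proves `first` is dead
    // here, even though both borrows occur in the same lexical block.
    data.push(4);
    assert_eq!(data, vec![1, 2, 3, 4]);
}
```

Under pre-2018 lexical lifetimes, the same code was rejected because `first` was considered alive until the closing brace.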
Problem: In a high-throughput telemetry system, a function scanned a packet buffer to validate checksums (immutable borrow), then immediately patched corrupted packets (mutable borrow). Pre-2018 Rust enforced lexical lifetimes, causing the immutable borrow to persist until the function's end, blocking the mutable patch.
Solution 1: Explicit cloning. Clone the entire buffer before validation so no borrow of the original remains, then mutate the clone. This approach is straightforward and works even on very old Rust versions. However, it doubles memory consumption and adds allocation latency, which is unacceptable for a system processing gigabit traffic where latency budgets are measured in microseconds.
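A sketch of the cloning workaround; the packet format and "checksum" (odd bytes stand in for corrupted packets) are hypothetical placeholders.

```rust
fn main() {
    let buffer = vec![1u8, 2, 3];

    // Clone first: the copy is independent, so reading the original and
    // mutating the clone never conflict, even under lexical lifetimes.
    let mut patched = buffer.clone();
    let corrupted: Vec<usize> = buffer
        .iter()
        .enumerate()
        .filter(|&(_, &b)| b % 2 == 1) // hypothetical "corrupted" test
        .map(|(i, _)| i)
        .collect();
    for i in corrupted {
        patched[i] = 0; // hypothetical fix-up
    }
    assert_eq!(patched, vec![0, 2, 0]);
}
```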
Solution 2: Lexical restructuring. Enclose the validation loop in a nested block { ... } so the immutable borrow ends before the mutable patch section. This avoids runtime overhead and works without a language upgrade. However, it obscures the code, fragmenting the logical "validate then patch" flow across nested scopes and complicating error handling that spans both phases.
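The workaround can be sketched as follows; the validation logic is again a hypothetical stand-in.

```rust
fn main() {
    let mut buffer = vec![1u8, 2, 3];
    let corrupted_count;
    {
        // The nested block forces the immutable borrow to end at its
        // closing brace, which is what lexical lifetimes require before
        // any mutation is allowed.
        let view = &buffer;
        corrupted_count = view.iter().filter(|&&b| b % 2 == 1).count();
    }
    // Mutable borrow is now permitted, even under pre-2018 rules.
    buffer.push(corrupted_count as u8);
    assert_eq!(buffer, vec![1, 2, 3, 2]);
}
```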
Solution 3: Adopt NLL. Migrate to Rust 2018 to leverage dataflow analysis, allowing borrows to end at their last point of use rather than the enclosing brace. This provides a zero-cost abstraction where code reads as a linear sequence without nesting or cloning. The compiler accepts the program because the analysis proves the immutable borrow is dead before the mutable borrow begins, though it requires a compiler upgrade and team training.
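Under NLL the same flow reads as a straight line. The packet layout and checksum logic below are hypothetical placeholders (odd bytes stand in for failed checksums).

```rust
// Collect indices of "corrupted" packets via an immutable borrow.
fn find_corrupted(buffer: &[u8]) -> Vec<usize> {
    buffer
        .iter()
        .enumerate()
        .filter(|&(_, &b)| b % 2 == 1)
        .map(|(i, _)| i)
        .collect()
}

fn main() {
    let mut buffer = vec![2u8, 3, 4, 5];
    // Immutable borrow ends right here under NLL...
    let corrupted = find_corrupted(&buffer);
    // ...so the mutable patch follows on the next line, with no nesting.
    for i in corrupted {
        buffer[i] = 0; // hypothetical fix-up
    }
    assert_eq!(buffer, vec![2, 0, 4, 0]);
}
```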
Chosen solution and result: Solution 3 was selected after confirming the production environment supported Rust 1.31+. The code was refactored to remove artificial nesting, allowing the immutable borrow to end immediately after validation and enabling the mutable patch on the next line. This reduced cyclomatic complexity from 12 to 4 and eliminated a 2MB heap allocation per batch, satisfying the strict latency requirements.
How does NLL interact with the drop order of temporary values in complex expressions, and why did this require changes to temporary lifetime rules?
Many candidates assume NLL shortened every lifetime, including those of temporaries. However, NLL deliberately changed only borrow regions, not drop points: RFC 2094 kept drop order and drop locations unchanged to avoid silently altering program semantics. In an expression like if let Some(x) = &mutex.lock().unwrap().data { ... }, the temporary MutexGuard is a scrutinee temporary that lives for the entire if let expression even under NLL, so re-locking the mutex inside the body still deadlocks. What NLL did require at the MIR level is precise drop elaboration: drop flags track, along each CFG path, whether a conditionally moved or initialized value still needs its destructor. The temporary-scope rules themselves were only tightened later, when the Rust 2024 edition shortened the lifetime of if let scrutinee temporaries.
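A runnable sketch of the statement-end rule: the guard temporary drops when its statement ends (a behavior NLL did not change), so sequential re-locking is fine, while holding the guard across the second lock would deadlock.

```rust
use std::sync::Mutex;

fn main() {
    let mutex = Mutex::new(vec![1, 2, 3]);

    // The temporary MutexGuard from lock() lives to the end of this
    // statement, then drops and releases the lock.
    let len = mutex.lock().unwrap().len() as i32;

    // Safe to re-lock here; had the guard above still been alive, this
    // call would deadlock.
    mutex.lock().unwrap().push(len);

    assert_eq!(*mutex.lock().unwrap(), vec![1, 2, 3, 3]);
}
```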
Why does NLL still reject programs where a mutable borrow is created after an immutable borrow, even if the immutable borrow is never used again, when the immutable borrow is part of a loop-carried dependency?
NLL performs a may-use analysis on the control-flow graph that is flow-sensitive but not path-sensitive. If an immutable borrow is created inside a loop and used in one iteration, a subsequent iteration cannot create a mutable borrow because the CFG back-edge conservatively assumes the old borrow might be accessed. Candidates often expect NLL to evaluate specific branch conditions (path-sensitivity). However, NLL guarantees safety for all possible execution paths, requiring a borrow to be definitely dead across every path before allowing a conflicting borrow. This prevents subtle use-after-free bugs in loop-carried dependencies that would be invisible in a simple lexical analysis.
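A sketch of the loop-carried case: a borrow stored across iterations keeps the conflicting mutation illegal inside the loop, yet the mutation becomes legal again once the borrow is dead on every path.

```rust
fn main() {
    let mut data = vec![1, 2, 3];
    let mut last: Option<&i32> = None;

    for _ in 0..2 {
        // The borrow stored in `last` flows around the loop back-edge,
        // so NLL must assume it may still be live at the top of every
        // iteration. Uncommenting the mutation below is rejected:
        // data.push(4); // error[E0502]: cannot borrow `data` as mutable
        last = Some(&data[0]);
    }
    assert_eq!(last, Some(&1));

    // After the final use of `last` above, the borrow is dead on every
    // path, so mutation is legal again.
    data.push(4);
    assert_eq!(data.len(), 4);
}
```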
What is the specific role of two-phase borrows within the NLL framework, and how do they resolve the "method receiver vs. arguments" conflict?
NLL introduced two-phase borrows specifically to handle method call autoref patterns like vec.push(vec.len()). During evaluation, the compiler reserves a mutable borrow for the receiver (vec) in a "reserved" state compatible with immutable borrows while evaluating arguments (vec.len()). After argument evaluation, the borrow "activates" to full mutability. Candidates often conflate this with general NLL lifetime shortening or reborrowing. The distinction is critical: two-phase borrows temporarily suspend exclusivity during argument evaluation, enabled by CFG analysis tracking reservation and activation points separately, which preserves method chaining ergonomics without breaking aliasing rules.
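The canonical pattern from the source, as a runnable sketch:

```rust
fn main() {
    let mut vec = vec![10usize, 20];

    // Desugars to roughly Vec::push(&mut vec, Vec::len(&vec)): the &mut
    // borrow of the receiver is merely "reserved" (compatible with the
    // shared borrow taken by vec.len()) and only "activates" to full
    // exclusivity when push itself runs.
    vec.push(vec.len());
    assert_eq!(vec, vec![10, 20, 2]);
}
```

Before two-phase borrows, this required hoisting the argument into a temporary (let n = vec.len(); vec.push(n);), which is exactly the ergonomic cost the feature removes.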