Inventory accounting requires precise tracking of costs as goods flow through a warehouse. FIFO (First-In-First-Out) assumes the oldest purchased items are sold first, which is crucial for perishables and matters most in inflationary environments. Unlike average costing, FIFO demands matching each sale to specific historical purchase lots, creating a many-to-many allocation problem that is notoriously awkward to express in purely declarative SQL.
Given two tables—purchases (lot_id, quantity, unit_cost, received_at) and sales (sale_id, quantity, sold_at)—we must allocate each sale quantity to the oldest available unsold inventory. This creates three complexities: a single sale may consume multiple partial lots, a single lot may span several sales, and the allocation must respect chronological order without procedural loops. Traditional JOIN approaches fail because they cannot track the depletion state of individual lots across rows.
Use window functions to calculate cumulative sums, transforming discrete quantities into contiguous intervals. Convert purchases into half-open cumulative ranges [prior_cumulative, current_cumulative) and sales into similar ranges. A JOIN on overlapping intervals identifies which lots feed which sales. The intersection length multiplied by the lot's unit_cost yields the cost basis. This set-theoretic approach avoids recursion and cursors, operating entirely in declarative, set-based SQL.
WITH purchase_cumulative AS (
    SELECT lot_id, unit_cost, received_at,
           SUM(quantity) OVER (ORDER BY received_at, lot_id
                               ROWS UNBOUNDED PRECEDING) - quantity AS cum_start,
           SUM(quantity) OVER (ORDER BY received_at, lot_id
                               ROWS UNBOUNDED PRECEDING) AS cum_end
    FROM purchases
),
sales_cumulative AS (
    SELECT sale_id, sold_at,
           SUM(quantity) OVER (ORDER BY sold_at, sale_id
                               ROWS UNBOUNDED PRECEDING) - quantity AS cum_start,
           SUM(quantity) OVER (ORDER BY sold_at, sale_id
                               ROWS UNBOUNDED PRECEDING) AS cum_end
    FROM sales
)
SELECT s.sale_id,
       p.lot_id,
       p.unit_cost,
       LEAST(s.cum_end, p.cum_end) - GREATEST(s.cum_start, p.cum_start) AS allocated_quantity,
       (LEAST(s.cum_end, p.cum_end) - GREATEST(s.cum_start, p.cum_start)) * p.unit_cost AS allocated_cost
FROM sales_cumulative s
JOIN purchase_cumulative p
  ON s.cum_start < p.cum_end
 AND s.cum_end   > p.cum_start
ORDER BY s.sale_id, p.received_at;
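The allocation query can be exercised end to end on a small dataset. The sketch below runs it in SQLite via Python's sqlite3 module, with one portability substitution: SQLite lacks LEAST/GREATEST, so its scalar MIN/MAX functions stand in. The toy data (two lots, two sales) is invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE purchases (lot_id TEXT, quantity INT, unit_cost REAL, received_at TEXT);
CREATE TABLE sales (sale_id INT, quantity INT, sold_at TEXT);
INSERT INTO purchases VALUES
  ('A', 10, 2.00, '2024-01-01'),
  ('B',  5, 3.00, '2024-01-02');
INSERT INTO sales VALUES
  (1, 8, '2024-01-03'),
  (2, 7, '2024-01-04');
""")

# Scalar MIN/MAX replace LEAST/GREATEST for SQLite compatibility.
rows = conn.execute("""
WITH purchase_cumulative AS (
  SELECT lot_id, unit_cost, received_at,
         SUM(quantity) OVER (ORDER BY received_at, lot_id
                             ROWS UNBOUNDED PRECEDING) - quantity AS cum_start,
         SUM(quantity) OVER (ORDER BY received_at, lot_id
                             ROWS UNBOUNDED PRECEDING) AS cum_end
  FROM purchases
), sales_cumulative AS (
  SELECT sale_id, sold_at,
         SUM(quantity) OVER (ORDER BY sold_at, sale_id
                             ROWS UNBOUNDED PRECEDING) - quantity AS cum_start,
         SUM(quantity) OVER (ORDER BY sold_at, sale_id
                             ROWS UNBOUNDED PRECEDING) AS cum_end
  FROM sales
)
SELECT s.sale_id, p.lot_id,
       MIN(s.cum_end, p.cum_end) - MAX(s.cum_start, p.cum_start) AS allocated_quantity,
       (MIN(s.cum_end, p.cum_end) - MAX(s.cum_start, p.cum_start)) * p.unit_cost AS allocated_cost
FROM sales_cumulative s
JOIN purchase_cumulative p
  ON s.cum_start < p.cum_end AND s.cum_end > p.cum_start
ORDER BY s.sale_id, p.received_at
""").fetchall()

for r in rows:
    print(r)
# (1, 'A', 8, 16.0)  -- sale 1 fully served by lot A
# (2, 'A', 2, 4.0)   -- sale 2 drains lot A's remaining 2 units...
# (2, 'B', 5, 15.0)  -- ...then spills into lot B
```

Note how sale 2 straddles two lots while lot A feeds two sales, the two many-to-many cases the prose describes, without any procedural loop.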
A pharmaceutical distributor tracks batches of medications with varying wholesale prices due to supplier fluctuations. FDA regulations require exact cost traceability for each pill sold, necessitating per-unit cost attribution rather than average costing. The warehouse processes thousands of daily transactions across hundreds of SKUs, with purchase lots arriving at unpredictable intervals and prices.
The initial approach used a CURSOR in a stored procedure, iterating sales sequentially and decrementing lot balances row by row. While functionally correct, this method caused severe locking contention during peak hours because it held locks on inventory tables for extended durations. The procedural logic also broke down under concurrent INSERT operations, where phantom reads allowed the same lot quantity to be allocated to two sales.
The team briefly considered using triggers to maintain a running balance table that updated automatically upon each sale. However, this introduced mutating table errors in Oracle and complex deferrable constraint management in PostgreSQL, cascading latency into the OLTP system. The trigger approach also complicated audit trails by obscuring the exact allocation logic within database metadata rather than explicit query code.
The chosen solution implemented the interval overlap method using window functions to pre-calculate cumulative boundaries. This enabled the database optimizer to utilize sort-merge joins rather than nested loop joins, reducing the cost calculation for a 10,000 unit sale report from 45 seconds to 200 milliseconds. The result enabled real-time cost-of-goods-sold reporting during month-end financial close without blocking inventory transactions, achieving full SERIALIZABLE isolation compliance.
How do you handle the edge case where purchase and sale events share the exact same timestamp, ensuring deterministic FIFO ordering?
Candidates often assume ORDER BY sold_at is sufficient. However, when timestamps collide, the window function's ordering of the tied rows is non-deterministic and may vary between query executions, so the same sale can receive different cumulative boundaries on different runs. The solution requires a tie-breaker column—typically the primary key or an auto-incrementing sequence—within the window function's ORDER BY clause. The frame specification matters too: with the default RANGE framing, tied peer rows share the same cumulative total, producing overlapping intervals in which two simultaneous sales consume the same lot quantity twice. Explicit ROWS framing plus a unique tie-breaker guarantees disjoint, repeatable intervals and preserves inventory integrity.
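A minimal demonstration of the tie-breaker, run against SQLite with hypothetical rows: two sales share a timestamp and are even inserted out of order, yet adding sale_id to the window's ORDER BY produces stable, disjoint cumulative ranges.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (sale_id INT, quantity INT, sold_at TEXT);
-- Inserted out of order, with identical timestamps.
INSERT INTO sales VALUES (2, 5, '2024-01-03'), (1, 3, '2024-01-03');
""")

rows = conn.execute("""
SELECT sale_id,
       SUM(quantity) OVER (ORDER BY sold_at, sale_id
                           ROWS UNBOUNDED PRECEDING) - quantity AS cum_start,
       SUM(quantity) OVER (ORDER BY sold_at, sale_id
                           ROWS UNBOUNDED PRECEDING) AS cum_end
FROM sales
ORDER BY sold_at, sale_id
""").fetchall()

print(rows)  # sale 1 always precedes sale 2 despite the timestamp tie
# [(1, 0, 3), (2, 3, 8)]
```

Dropping `sale_id` from the ORDER BY would leave the engine free to order the tied rows either way, and under RANGE framing both rows would report cum_end = 8.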
Why does using FLOAT or DOUBLE PRECISION for quantity columns corrupt the FIFO allocation results?
Many candidates use floating-point types for monetary or quantity calculations, unaware that IEEE 754 floating-point cannot precisely represent decimal fractions like 0.1. This imprecision causes cumulative sum errors that compound across thousands of rows, so a lot expected to contain exactly 100 units registers as 99.999999 or 100.000001. Consequently, the interval overlap mathematics either misses valid overlaps or creates phantom negative allocations. The solution mandates DECIMAL or NUMERIC types with explicit precision and scale for all quantity and cost columns, ensuring exact decimal arithmetic and preventing financial discrepancies.
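The drift is easy to reproduce in any IEEE 754 environment. This sketch contrasts binary floats with exact base-10 arithmetic using Python's decimal module, the same distinction DECIMAL/NUMERIC columns provide in SQL:

```python
from decimal import Decimal

# Accumulate 0.1 ten thousand times, as a running inventory
# balance might. Binary floats drift; Decimal stays exact.
float_total = sum(0.1 for _ in range(10_000))
decimal_total = sum(Decimal("0.1") for _ in range(10_000))

print(float_total)    # close to, but not exactly, 1000.0
print(decimal_total)  # exactly 1000.0

assert float_total != 1000.0
assert decimal_total == Decimal("1000")
```

In the interval-overlap query, such drift shifts cum_start/cum_end boundaries, which is precisely how overlaps go missing or turn negative.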
How do you correct accumulated rounding errors when a sale consumes fractional cents across multiple lots with different unit costs?
When a sale splits across three lots priced at $0.33, $0.33, and $0.34, naive rounding of each line item may cause the sum of allocated costs to diverge from the sale's total expected value by a penny. Candidates often calculate allocated_quantity * unit_cost directly without considering the rounding context or residual remainders. The robust solution applies banker's rounding (round half to even) or preserves unrounded values in a subquery, then applies a correction algorithm in the outer query. This adjustment adds the residual difference to the largest allocation line, forcing the sum to match the total sale value exactly while maintaining audit trail accuracy.
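One way to implement the correction described above, sketched in Python with the decimal module. The `reconcile` helper and the even three-way split are illustrative assumptions, not from the source; in SQL the same adjustment is typically done in an outer query over unrounded subquery values.

```python
from decimal import Decimal, ROUND_HALF_EVEN

CENT = Decimal("0.01")

def reconcile(lines, sale_total):
    """Round each allocation line to cents with banker's rounding,
    then add any leftover residual to the largest line so the
    rounded parts sum exactly to the sale total."""
    rounded = {line_id: cost.quantize(CENT, rounding=ROUND_HALF_EVEN)
               for line_id, cost in lines.items()}
    residual = sale_total - sum(rounded.values())
    if residual:
        largest = max(rounded, key=rounded.get)  # absorb residual here
        rounded[largest] += residual
    return rounded

# Hypothetical example: a $1.00 sale allocated evenly across three lots.
# Each unrounded line is 0.3333..., which rounds to 0.33; naive summing
# would yield 0.99, a penny short of the sale total.
lines = {"lot_a": Decimal(1) / 3, "lot_b": Decimal(1) / 3, "lot_c": Decimal(1) / 3}
result = reconcile(lines, Decimal("1.00"))
print(result)
assert sum(result.values()) == Decimal("1.00")
```

Pushing the residual onto the largest line keeps the relative distortion smallest, and because the adjustment is a single explicit step, it remains visible to auditors rather than hidden in per-line rounding.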