Business Analysis: Business Analyst

How would you mediate a requirements deadlock when the **CFO** mandates immediate decommissioning of a **Teradata** data warehouse due to licensing cost overruns, while the **Chief Data Officer** insists that the replacement **Databricks** lakehouse cannot support the sub-second query performance required by **Tableau** dashboards driving daily trading decisions, and the migration timeline coincides with the quarterly **SOX** audit prohibiting any data unavailability?


Answer to the question

The resolution requires a hybrid architectural compromise that decouples storage from compute while maintaining audit continuity. I would propose a phased migration utilizing Teradata as a read-only archival layer for historical SOX data while establishing a Databricks Delta Lake "hot" tier with Photon acceleration for current trading analytics. This approach requires negotiating a reduced Teradata license for archival-only nodes and implementing a Tableau data source federation layer to query both systems transparently, thereby satisfying the CFO's cost reduction targets, the CDO's performance requirements, and the audit's availability constraints simultaneously.

A real-world situation

Problem description

At a multinational asset management firm, I encountered this exact impasse six weeks before the fiscal year-end SOX audit. The CFO had received a $2.4M annual renewal invoice for Teradata and issued a hard stop on payments effective immediately, while the trading floor relied on five critical Tableau workbooks querying 18 months of tick data with sub-2-second refresh requirements. The Databricks proof-of-concept had demonstrated 8-second query latencies on equivalent datasets, and the audit committee explicitly prohibited any "data unavailable" exceptions in the control documentation. The project had stalled for three weeks with both executives refusing to attend joint meetings.

Solution 1: Lift-and-shift with query optimization

The first option involved migrating all data to Databricks and attempting aggressive Z-Ordering and Liquid Clustering optimizations to force sub-second performance.

Pros: This achieved complete Teradata elimination, satisfying the CFO's cost mandate entirely, and simplified the architecture to a single platform.

Cons: Despite three weeks of tuning, the best achievable latency remained 4.5 seconds due to the massive cardinality of unaggregated tick data, which violated the traders' decision-making workflow requirements. Additionally, the migration would require 72 hours of cutover downtime, conflicting with the SOX audit window's zero-downtime mandate.

Solution 2: Bi-directional active-active replication

We considered maintaining Teradata for historical SOX archives while building a real-time Change Data Capture pipeline using Debezium and Kafka to populate Databricks for current trading data, keeping both systems synchronized.

Pros: This preserved Teradata for audit queries while allowing Databricks to handle new data, potentially meeting the performance needs for recent datasets.

Cons: The licensing costs remained high for the active Teradata cluster, failing the CFO's primary objective. Furthermore, maintaining consistency across the Kafka streams introduced significant complexity, and the SOX auditors raised concerns about data lineage fragmentation across two active writeable systems, requiring extensive reconciliation controls.
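The replication mechanics behind this option can be illustrated with a minimal sketch of how Debezium-style change events might be applied to a downstream replica. The event envelope fields (`op`, `before`, `after`) follow Debezium's documented shape, but the in-memory dict standing in for the Databricks target is purely a hypothetical illustration, not the pipeline we actually built.

```python
# Minimal sketch: apply Debezium-style CDC events to a replica keyed by "id".
# The "op" codes follow Debezium's envelope: "c" create, "u" update,
# "r" snapshot read, "d" delete. The dict target is illustrative only.
def apply_event(table: dict, event: dict, key: str = "id") -> None:
    op = event["op"]
    if op in ("c", "u", "r"):
        # Inserts, updates, and snapshot reads all upsert the "after" image.
        row = event["after"]
        table[row[key]] = row
    elif op == "d":
        # Deletes carry the old row only in the "before" image.
        table.pop(event["before"][key], None)
```

Even this toy version hints at the consistency burden the auditors flagged: every event stream must be replayed in order and reconciled against the source, which is exactly the lineage-fragmentation concern that sank this option.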

Solution 3: Tiered storage with query federation (Chosen)

We negotiated a 70% license reduction by converting Teradata to a read-only "cold storage" archive for data older than 90 days, while migrating the active 90-day trading dataset to Databricks with Photon engine acceleration. We implemented Tableau data blending to federate queries across both sources, with Unity Catalog managing the metadata layer to present a unified semantic view to users.
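The routing rule at the heart of this tiered design can be sketched as a simple date-based dispatcher. The 90-day hot-tier window comes from the scenario above; the tier names and function are illustrative assumptions, not a real federation API.

```python
from datetime import date, timedelta

# Hypothetical sketch of tiered query routing: queries touching the last
# 90 days hit the Databricks hot tier; anything older falls through to the
# read-only Teradata archive. A range spanning the cutoff touches both.
HOT_TIER_DAYS = 90

def route_query(start: date, end: date, today: date) -> list[str]:
    """Return the storage tiers a query must touch for the given date range."""
    cutoff = today - timedelta(days=HOT_TIER_DAYS)
    tiers = []
    if end >= cutoff:
        tiers.append("databricks_hot")    # active trading data, Photon-accelerated
    if start < cutoff:
        tiers.append("teradata_archive")  # frozen SOX cold storage
    return tiers
```

In the real deployment this dispatch lived inside the Tableau federation layer and Unity Catalog metadata rather than application code, but the decision boundary is the same: cross-cutoff queries are the expensive case, which is why the most critical dashboards were backed by pre-computed extracts.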

Pros: This reduced infrastructure costs by 65% immediately, met the sub-second performance threshold for active trading data through Databricks's optimized execution, and maintained complete audit trail continuity by keeping Teradata accessible for historical SOX sample testing without new licensing penalties. The federation layer masked architectural complexity from end users.

Cons: The solution introduced minor complexity in Tableau workbook maintenance requiring dual data source management, and initial query warm-up times for cross-system joins averaged 3 seconds, necessitating pre-computed extracts for the most critical dashboards.

Why this solution was chosen

The tiered approach was selected because it was the only option that satisfied all three hard constraints simultaneously rather than optimizing for two at the expense of the third. The CFO accepted the reduced license as an interim victory, the CDO achieved acceptable performance on the active dataset, and the audit committee approved the architecture because Teradata's immutable archive state actually strengthened the SOX evidence trail by creating a physical separation between historical (frozen) and current (mutable) records.

Result

The migration completed four days before the audit window opened. Tableau dashboard performance improved by 40% for daily trading views due to Databricks's columnar compression, while the Teradata archival layer passed all SOX control tests with no discrepancies. The CFO extended the reduced Teradata license for an additional 18 months under a "compliance archive" SKU, and the firm subsequently adopted the tiered model as the standard for all regulated data workloads, resulting in $3.2M total annual savings.

What candidates often miss

How do you quantify the "cost of delay" when regulatory deadlines conflict with technical refactoring needs?

Candidates often focus solely on the technical feasibility or the regulatory text without calculating the financial impact of delayed decommissioning. The correct approach involves constructing a cost model that compares daily licensing burn rates against the risk-adjusted cost of audit findings. You must calculate the Net Present Value of the Teradata license savings ($2.4M annually = $6,575 daily) versus the probability-weighted cost of a SOX material weakness (typically 15-20% of market cap for public firms in regulated industries). This quantitative framing transforms the discussion from opinion-based deadlock to financial risk management, allowing stakeholders to make informed trade-offs between partial solutions.
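The cost-of-delay framing above can be reduced to a few lines of arithmetic. The $2.4M annual license figure comes from the scenario; the finding probability, market cap, and impact percentage below are placeholders that a real analysis would have to source from the firm's risk function.

```python
# Sketch of the cost-of-delay model described above.
# ANNUAL_LICENSE is from the scenario; the parameters to the functions
# below are illustrative placeholders, not sourced figures.
ANNUAL_LICENSE = 2_400_000
daily_burn = ANNUAL_LICENSE / 365  # ~ $6,575 per day of delayed decommissioning

def expected_finding_cost(prob: float, market_cap: float, impact_pct: float) -> float:
    """Probability-weighted cost of a SOX material weakness."""
    return prob * market_cap * impact_pct

def breakeven_delay_days(prob: float, market_cap: float, impact_pct: float) -> float:
    """Days of license burn that equal the risk-adjusted audit cost."""
    return expected_finding_cost(prob, market_cap, impact_pct) / daily_burn
```

Presenting the deadlock this way gives both executives the same unit of account, days of burn versus risk-weighted dollars, which is what moves the conversation from opinion to trade-off.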

What validation techniques ensure query result consistency across federated data sources during a platform migration?

Most candidates suggest manual sampling or simple row-count matching, which fails for analytical aggregates. The correct methodology involves implementing Great Expectations or Deequ validation suites to compare statistical distributions (mean, median, standard deviation) and referential integrity across the Teradata archive and Databricks active layer. You must establish "golden datasets" representing high-risk query patterns and automate daily reconciliation reports that flag variance beyond 0.01% tolerance. Crucially, you need to document the data lineage using Monte Carlo or OpenLineage to prove to auditors that the federation layer doesn't introduce transformation errors, ensuring that Tableau dashboards pulling from both sources present a single version of truth.
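The distribution-level reconciliation described above can be sketched in plain Python: pull the same "golden dataset" from both systems, compare summary statistics, and flag any metric drifting beyond the relative tolerance. In practice this logic would live in a Great Expectations or Deequ suite; the stdlib version here is only an illustration, and the metric names are my own.

```python
import statistics

# Sketch of statistical reconciliation between two sources. TOLERANCE is the
# 0.01% relative variance from the text; the metric set and function names
# are illustrative, standing in for a Great Expectations / Deequ suite.
TOLERANCE = 0.0001

def reconcile(source_a: list[float], source_b: list[float]) -> dict[str, bool]:
    """Return, per metric, whether both sources agree within tolerance."""
    metrics = {
        "count":  (len(source_a), len(source_b)),
        "mean":   (statistics.mean(source_a), statistics.mean(source_b)),
        "median": (statistics.median(source_a), statistics.median(source_b)),
        "stdev":  (statistics.stdev(source_a), statistics.stdev(source_b)),
    }
    def within(a: float, b: float) -> bool:
        return abs(a - b) <= TOLERANCE * max(abs(a), abs(b), 1)
    return {name: within(a, b) for name, (a, b) in metrics.items()}
```

A daily job running checks like this against the high-risk golden datasets, with failures routed to the reconciliation report, is what gives auditors evidence that the federation layer introduces no silent transformation errors.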

How do you negotiate "compliance archive" licensing terms with legacy vendors when standard contracts don't accommodate partial decommissioning?

Candidates frequently assume binary choices (full renewal vs. full termination) and miss creative contractual structures. The solution involves engaging procurement to negotiate an "audit preservation" or "compliance hold" SKU that provisions read-only access at 10-15% of standard licensing costs. You must frame the request not as a downgrade but as a risk-mitigation service, emphasizing that the vendor retains the account relationship while avoiding competitive displacement. Additionally, you should propose migrating the archive to the vendor's cloud offering (Teradata Vantage on AWS) under a "bring your own license" (BYOL) transfer, which often unlocks hybrid pricing models that finance teams can classify as cloud transformation rather than legacy maintenance, satisfying both the CFO's cost targets and the CDO's architectural roadmap.