Business Analysis: Business Analyst

Advise on the methodology for validating and prioritizing requirements when redesigning a **KYC** (Know Your Customer) onboarding workflow under three constraints: user behavior analytics from **Adobe Analytics** show a 60% abandonment rate at the identity verification stage; the risk management team mandates retaining all five verification checkpoints to satisfy **PCI DSS** Level 1 and internal **AML** (Anti-Money Laundering) policies; and the existing customer screening database resides on a legacy **IBM z/OS** mainframe accessible only through **SFTP** batch transfers with a 4-hour latency, prohibiting real-time integration with the proposed **React Native** mobile application.


Answer

Establish a requirements traceability matrix mapping each PCI DSS control and AML policy requirement to the specific user drop-off points identified in the Adobe Analytics funnel visualization. Facilitate Kano model workshops to classify mandatory compliance features as "basic needs" versus performance features, building stakeholder alignment around the view that excessive friction is itself a regulatory risk under Consumer Duty principles. Then architect a facade pattern: a Node.js middleware service issues provisional approvals from a Redis cache for low-risk profiles, while Apache Kafka handles asynchronous synchronization with the IBM z/OS mainframe via the existing scheduled SFTP batches.

This approach satisfies risk management through tiered verification while meeting user expectations for immediate account activation, effectively decoupling the frontend React Native experience from legacy backend constraints.
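The facade's routing logic can be sketched as follows. This is a minimal illustration, not a production design: the risk threshold and TTL values are assumptions, an in-memory `Map` stands in for the Redis cache, and a plain array stands in for the Kafka topic feeding the SFTP batch job.

```typescript
// Illustrative tiered-verification router. RISK_THRESHOLD and the TTL are
// hypothetical values a risk team would tune; Map/array are stand-ins for
// Redis and Kafka respectively.

type Applicant = { id: string; riskScore: number }; // 0 = safest, 1 = riskiest

type Decision =
  | { status: "provisional"; expiresAt: number } // cached approval, low limits
  | { status: "pending_batch" };                 // waits for mainframe screening

const provisionalCache = new Map<string, number>(); // id -> expiry (Redis stand-in)
const batchQueue: Applicant[] = [];                 // Kafka/SFTP stand-in

const RISK_THRESHOLD = 0.3;                    // assumed cutoff for "low risk"
const PROVISIONAL_TTL_MS = 4 * 60 * 60 * 1000; // matches the 4-hour batch latency

function routeApplicant(a: Applicant, now = Date.now()): Decision {
  batchQueue.push(a); // every applicant is still screened by the mainframe batch
  if (a.riskScore <= RISK_THRESHOLD) {
    const expiresAt = now + PROVISIONAL_TTL_MS;
    provisionalCache.set(a.id, expiresAt);
    return { status: "provisional", expiresAt };
  }
  return { status: "pending_batch" };
}
```

Note that the low-risk path never skips screening; it only changes *when* the user waits, which is the interpretive flexibility the answer relies on.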

Real-world scenario

A mid-sized fintech launching a React Native digital wallet discovered through Adobe Analytics that 60% of Gen Z users abandoned onboarding at the fifth verification checkpoint. The risk team refused to reduce steps, citing PCI DSS Level 1 certification requirements for payment instrument storage and internal AML sanctions screening protocols. The screening database resided on a legacy IBM z/OS mainframe that only accepted SFTP flat files every four hours, making real-time verification architecturally impossible without a multi-million-dollar mainframe modernization.

Solution A: Synchronous API Emulation via IBM z/OS Connect

The team evaluated building a REST API facade over the mainframe using IBM z/OS Connect to enable real-time responses. Pros included ideal user experience with instant approval and simplified React Native frontend logic requiring no state management for pending statuses. Cons included prohibitive licensing costs, a six-month development timeline that would miss the competitive market window, and severe performance risks as CICS regions historically crashed under web-scale concurrent loads, threatening system stability.

Solution B: Pure Asynchronous Batch Processing

This approach involved collecting all documentation upfront, transmitting via SFTP, and notifying users via email after the four-hour processing window. Pros included zero modification to the stable COBOL codebase and guaranteed compliance with AML screening requirements. Cons included projected abandonment rates spiking to 85% due to Gen Z's expectation of instant gratification, plus significant customer service burden from support tickets inquiring about application status, eliminating projected operational savings.

Solution C: Risk-Based Hybrid with Eventual Consistency

We implemented a tiered system using Apache Kafka event streaming and Redis caching. Low-risk customers verified through Experian digital identity APIs received provisional account access tokens valid for four hours, allowing immediate card usage with conservative transaction limits. High-risk profiles queued for the SFTP batch without provisional access. Pros included reducing perceived wait time for 80% of the user base while maintaining strict screening for edge cases. Cons included architectural complexity requiring Saga pattern implementation for compensating transactions if the batch rejected a provisionally approved user, necessitating account freezing and fund recovery workflows.
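The compensating-transaction step of that Saga can be sketched as below. This is a simplified illustration, assuming a batch-result handler is invoked once the SFTP file returns; the state names and in-memory stores are hypothetical, not a specific Saga framework's API.

```typescript
// Illustrative compensating transaction: if the 4-hour batch rejects a user
// who was provisionally approved, undo the approval (freeze + fund recovery).
// Map and array are stand-ins for real account and workflow stores.

type AccountState = "provisional" | "active" | "frozen" | "rejected";

const accounts = new Map<string, AccountState>();
const recoveryTickets: string[] = []; // fund-recovery workflow stand-in

function onBatchResult(userId: string, passedScreening: boolean): AccountState {
  const state = accounts.get(userId) ?? "rejected";
  if (passedScreening) {
    // Promote: conservative provisional limits are lifted once the
    // mainframe confirms the screening result.
    if (state === "provisional") accounts.set(userId, "active");
    return accounts.get(userId) ?? "rejected";
  }
  if (state === "provisional") {
    // Compensating transaction: the provisional approval is rolled back,
    // and any funds moved under provisional limits enter a recovery workflow.
    accounts.set(userId, "frozen");
    recoveryTickets.push(userId);
    return "frozen";
  }
  accounts.set(userId, "rejected");
  return "rejected";
}
```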

We selected Solution C because it balanced regulatory imperatives with market demands. The result was a reduction in abandonment to 15%, $12M incremental revenue in Q1, and successful passage of the annual PCI DSS audit without findings. The IBM z/OS system experienced zero performance degradation as SFTP loads remained within existing batch windows.

What candidates often miss

How do you negotiate "immutable" regulatory requirements when they conflict with user experience?

Many candidates treat PCI DSS or AML requirements as absolute binary constraints without examining interpretive flexibility. In practice, these standards often permit risk-based approaches regarding implementation timing, such as distinguishing between "verification before first transaction" versus "verification before high-value settlement." The Business Analyst must create a compliance risk matrix that quantifies residual risk of provisional access against business risk of customer abandonment, citing specific clause interpretations (e.g., PCI DSS v4.0 Requirement 8.2.3) to demonstrate defensible compliance. Candidates miss that regulatory guidance frequently allows for "soft declines" and tiered verification if supported by documented risk assessments and audit trails.
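A compliance risk matrix of the kind described boils down to comparing two expected losses. The sketch below uses entirely illustrative figures (exposure caps, lifetime value, volumes); only the 0.01% reversal rate comes from the text.

```typescript
// Back-of-envelope risk matrix: expected loss = probability x impact,
// scaled to a cohort. All dollar figures and the cohort size are
// hypothetical inputs a BA would replace with real data.

function expectedLoss(probability: number, impactPerCase: number): number {
  return probability * impactPerCase;
}

const COHORT = 100_000; // illustrative number of applicants

// Residual compliance risk: 0.01% of provisional users later fail screening,
// with exposure capped per account by conservative transaction limits ($500).
const residualRisk = expectedLoss(0.0001, 500) * COHORT;

// Business risk of friction: 60% abandonment x assumed $120 lifetime value.
const abandonmentRisk = expectedLoss(0.6, 120) * COHORT;
```

Even with deliberately pessimistic compliance inputs, the abandonment side dominates by orders of magnitude, which is the quantified argument the risk team needs to see.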

What is the specific technical constraint of "eventual consistency" in financial systems and how do you communicate it to business stakeholders?

Junior analysts often fail to explain that distributed systems using Apache Kafka or Redis caches operate on eventual consistency models, whereas legacy IBM z/OS mainframes assume immediate atomicity. When provisional approvals rely on cached data, a window exists in which the SFTP batch may later reject the user, creating a "false positive" scenario. The correct approach translates the CAP theorem trade-off into business terms through a service level objective (SLO) document, showing that a 0.01% provisional reversal rate matches the existing fraud tolerance for check deposits. Visualizing compensating transaction workflows with BPMN diagrams then lets stakeholders see, without needing technical expertise, that Saga pattern orchestration provides the required safety mechanisms.
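The SLO artifact itself can be tiny. A minimal sketch, assuming the 0.01% reversal budget from the text and an illustrative record shape (the field names are not from any standard):

```typescript
// Hypothetical SLO record a BA might table with stakeholders: the provisional
// reversal budget expressed in the same units as an existing, accepted risk.

type Slo = { metric: string; budget: number; comparableRisk: string };

const provisionalReversalSlo: Slo = {
  metric: "provisional approvals later rejected by batch screening",
  budget: 0.0001, // 0.01%, framed against check-deposit fraud tolerance
  comparableRisk: "check deposit fraud rate",
};

// Did the observed reversal rate stay inside the error budget?
function withinBudget(observedRate: number, slo: Slo): boolean {
  return observedRate <= slo.budget;
}
```

Framing the budget against a risk the business already tolerates is what makes the eventual-consistency window negotiable rather than alarming.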

How do you calculate the true cost of technical debt when integrating legacy systems via SFTP versus modernizing?

Candidates frequently present SFTP integration as the "cheap" option without accounting for operational burden in their Total Cost of Ownership (TCO) analysis. The missed line items include manual PGP key rotation workflows, exception-handling labor when flat files arrive corrupted, and the opportunity cost of data trapped in batch cycles that prevent real-time analytics. A proper analysis compares the Capex of IBM z/OS modernization against the Opex of maintaining SFTP bridges, including overnight staff to monitor batch windows and the 2-3 week release-cycle delays inherent in SFTP dependencies. This holistic view often reveals that middleware modernization delivers positive ROI within 18 months despite higher upfront investment.
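The Capex-versus-Opex comparison reduces to a break-even calculation. The sketch below is illustrative: the dollar figures are hypothetical inputs, not the fintech's actual costs, but with a fully loaded monthly SFTP Opex of $100k against a $1.5M modernization spend plus $10k/month run cost, break-even lands inside the 18-month window the section mentions.

```typescript
// Cumulative cost of an option after `months`: upfront Capex + recurring Opex.
function cumulativeCost(upfront: number, monthlyOpex: number, months: number): number {
  return upfront + monthlyOpex * months;
}

// First month at which modernization becomes cheaper than the status quo,
// or null if it never does within the horizon. All inputs are hypothetical.
function breakEvenMonth(
  sftp: { upfront: number; monthly: number },
  modern: { upfront: number; monthly: number },
  horizonMonths = 60,
): number | null {
  for (let m = 1; m <= horizonMonths; m++) {
    const keepSftp = cumulativeCost(sftp.upfront, sftp.monthly, m);
    const modernize = cumulativeCost(modern.upfront, modern.monthly, m);
    if (modernize <= keepSftp) return m;
  }
  return null;
}

const breakEven = breakEvenMonth(
  { upfront: 0, monthly: 100_000 },         // SFTP bridge: no Capex, high Opex
  { upfront: 1_500_000, monthly: 10_000 },  // modernization: Capex, low Opex
);
```

With these assumed inputs the crossover is month 17, i.e. within the 18-month ROI horizon; the point of the exercise is that the answer is a function of measured Opex, not a matter of opinion.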