This requires a rapid knowledge capture methodology that balances structured architecture discipline with accelerated ethnographic research techniques. The approach centers on intensive collaborative workshops using capability mapping frameworks to externalize implicit knowledge. Analysts must simultaneously reverse-engineer system touchpoints to validate hypothesized value streams against actual transactional data. This dual-track method ensures that documented capabilities reflect both expert testimony and objective system reality.
I was engaged to analyze a mid-sized logistics firm being acquired by a global 3PL provider. The target had operated for 20 years with process definitions passed down as oral tradition. Their entire customer onboarding logic existed only in the heads of two operations managers who were retiring in 10 days. The acquirer's ArchiMate enterprise architecture required standardized capability decomposition down to level 3 granularity. However, under the NDA terms, the virtual data room would shut down in 72 hours.
We considered conducting sequential one-on-one interviews with the experts, recording sessions for later transcription. This would yield deep contextual understanding and allow for detailed questioning. However, this approach would require a minimum of 5-7 days to cover all 40 critical capabilities, leaving no buffer for validation or cross-referencing against the SAP ERP transaction logs. The risk of conflicting interpretations between the two experts would also remain high without real-time reconciliation.
We chose to run parallel 12-hour intensive workshops using Miro boards with pre-populated TOGAF capability templates. This forced real-time consensus between the experts while simultaneously cross-referencing their statements against SQL query results from the legacy AS/400 database. This created immediate validation of claimed process steps against actual data flows. The method was exhausting for participants but ensured that tacit knowledge was externalized and verified within 48 hours.
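The cross-referencing step can be sketched in code. The following is a minimal, hypothetical illustration of checking an expert's claimed step sequence against recorded transaction events; the real source was an AS/400 database, so the table, column names, and sample rows here are invented stand-ins (using an in-memory SQLite database for self-containment):

```python
import sqlite3

# Invented stand-in for the legacy transaction log (the real system was
# an AS/400 database; table, columns, and rows are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE order_events (order_id TEXT, step TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO order_events VALUES (?, ?, ?)",
    [
        ("A1", "credit_check", "2024-01-02T09:00"),
        ("A1", "rate_quote",   "2024-01-02T09:30"),
        ("A1", "booking",      "2024-01-02T10:00"),
        ("A2", "rate_quote",   "2024-01-03T08:00"),  # no credit check recorded
        ("A2", "booking",      "2024-01-03T08:20"),
    ],
)

# Step sequence the experts claimed during the workshop.
claimed = ["credit_check", "rate_quote", "booking"]

def actual_sequence(order_id):
    """Return the recorded steps for one order, in timestamp order."""
    rows = conn.execute(
        "SELECT step FROM order_events WHERE order_id = ? ORDER BY ts",
        (order_id,),
    ).fetchall()
    return [r[0] for r in rows]

# Flag orders whose recorded flow deviates from the claimed process.
deviations = [
    oid
    for (oid,) in conn.execute("SELECT DISTINCT order_id FROM order_events")
    if actual_sequence(oid) != claimed
]
print(deviations)  # ["A2"] — evidence the claimed step is sometimes skipped
```

In the workshops this kind of query result was projected live, so a deviating order forced the experts to reconcile their account on the spot rather than in a later review cycle.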
We successfully documented 38 of 40 required capabilities with full ArchiMate relationships. The remaining two capabilities were flagged as high-risk knowledge gaps. This allowed the acquirer to negotiate a 15% reduction in purchase price to account for future process redesign costs. The architecture team had sufficient detail to begin integration planning within the ServiceNow repository before the data room closed.
Question 1: How do you validate business capabilities when subject matter experts deliberately obfuscate processes to protect job security?
This requires triangulation techniques comparing expert testimony against system logs, physical documentation artifacts, and output deliverables. When experts resist documentation, implement "process rehearsal" sessions where they must demonstrate the workflow while narrating decisions. This effectively bypasses their ability to abstract or omit steps. Cross-reference timestamps in Salesforce case histories or Oracle workflow engines to verify claimed cycle times and decision branches. This creates an audit trail that exposes gaps in their narrative without direct confrontation.
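The cycle-time verification described above can be sketched as a simple comparison of claimed figures against exported case-history timestamps. All field names, cases, and the claimed threshold below are invented for illustration; a real export from Salesforce or an Oracle workflow engine would supply the actual fields:

```python
from datetime import datetime

# Hypothetical case-history export (field names and values are invented).
cases = [
    {"case_id": "C-101", "opened": "2024-03-01T09:00", "closed": "2024-03-01T17:00"},
    {"case_id": "C-102", "opened": "2024-03-02T09:00", "closed": "2024-03-04T09:00"},
    {"case_id": "C-103", "opened": "2024-03-05T10:00", "closed": "2024-03-05T14:00"},
]

CLAIMED_MAX_HOURS = 8  # the expert's stated worst-case cycle time (assumed)

def cycle_hours(case):
    """Elapsed hours between a case's opened and closed timestamps."""
    fmt = "%Y-%m-%dT%H:%M"
    delta = datetime.strptime(case["closed"], fmt) - datetime.strptime(case["opened"], fmt)
    return delta.total_seconds() / 3600

# Cases exceeding the claimed cycle time expose gaps in the narrative.
outliers = [c["case_id"] for c in cases if cycle_hours(c) > CLAIMED_MAX_HOURS]
print(outliers)  # ["C-102"]
```

Presenting the outlier list, rather than an accusation, lets the expert explain the exceptions, which is usually where the omitted decision branches surface.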
Question 2: What is the critical difference between business capabilities and business processes in enterprise architecture, and why does confusing them cause integration failures?
A business capability represents the organization's capacity to achieve a specific outcome, remaining stable regardless of process changes or technology shifts. For example, "Customer Credit Assessment" persists whether executed via manual Excel review or automated AI risk scoring. A business process describes the specific flow of activities realizing that capability. Confusing them leads to rigid integrations that break when the target company modifies their workflow. Capability-based planning allows the acquirer to substitute processes while maintaining strategic function.
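The capability/process distinction maps naturally onto a strategy-style separation in code. The sketch below is illustrative only (class and method names are invented): the capability exposes a stable outcome while the realizing process can be swapped without touching anything that depends on the capability.

```python
from typing import Protocol

class CreditAssessmentProcess(Protocol):
    """Interface any process realizing the capability must satisfy."""
    def assess(self, customer_id: str) -> str: ...

class ManualExcelReview:
    """One concrete process: a manual spreadsheet-based review."""
    def assess(self, customer_id: str) -> str:
        return f"manual review queued for {customer_id}"

class AutomatedRiskScoring:
    """A replacement process: automated scoring."""
    def assess(self, customer_id: str) -> str:
        return f"auto-scored {customer_id}: low risk"

class CustomerCreditAssessment:
    """The capability: a stable outcome with a swappable process behind it."""
    def __init__(self, process: CreditAssessmentProcess):
        self.process = process

    def run(self, customer_id: str) -> str:
        return self.process.assess(customer_id)

capability = CustomerCreditAssessment(ManualExcelReview())
print(capability.run("CUST-42"))   # manual review queued for CUST-42
capability.process = AutomatedRiskScoring()  # process changes, capability does not
print(capability.run("CUST-42"))   # auto-scored CUST-42: low risk
```

Integrations that bind to `CustomerCreditAssessment` survive the process swap; integrations hard-wired to the Excel workflow would break, which is exactly the failure mode capability-based planning avoids.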
Question 3: How do you handle implicit business rules embedded in legacy code when no documentation exists and the original developers are unavailable?
Employ code archaeology paired with output analysis. Extract the executable logic from source artifacts such as COBOL programs and copybooks, PL/SQL packages, or Java classes. Feed sample inputs through the system to observe outputs and use decision table reconstruction to reverse-engineer the conditional logic. Correlate these findings with current-state process observations. When code behavior contradicts stated business rules, treat the code as ground truth for compliance purposes and document the discrepancies as "discovered constraints" rather than intentional requirements.
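Decision table reconstruction can be sketched as probing the black box over a small input grid and tabulating the outcomes. The legacy routine below is an invented stand-in for illustration; in practice the probe targets the running system or an extracted rule re-hosted behind a test harness:

```python
from itertools import product

def legacy_discount(region, volume):
    """Invented stand-in for an undocumented legacy pricing routine."""
    if region == "EU" and volume >= 100:
        return 0.15
    if volume >= 100:
        return 0.10
    return 0.0

# Probe the black box over a grid of representative inputs.
regions = ["EU", "US"]
volumes = [50, 100, 500]
decision_table = {
    (r, v): legacy_discount(r, v) for r, v in product(regions, volumes)
}

# The tabulated outcomes become the reconstructed decision table,
# ready to correlate against the rules the business claims to follow.
for (r, v), out in sorted(decision_table.items()):
    print(f"region={r}, volume={v} -> discount={out}")
```

Boundary values in the grid (here, the 100-unit threshold) are where undocumented conditionals reveal themselves; widening the grid around observed discontinuities recovers the branch structure without reading every line of legacy source.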