Manual Testing (IT): Manual QA Engineer

Develop an exhaustive manual testing procedure to verify accurate spatial anchoring, environmental occlusion, and thermal performance stability in an **ARCore**/**ARKit**-powered furniture placement feature within an **Android**/**iOS** e-commerce application that employs **SLAM** (Simultaneous Localization and Mapping) for surface detection and **Physically Based Rendering** (**PBR**) for material visualization.


Answer to the question

A systematic methodology begins with Environmental Baseline Establishment, where you document controlled lighting conditions (lux levels, color temperatures) and surface textures (feature-rich vs. uniform) to create reproducible test matrices.

Next, execute Session-Based Drift Detection by placing anchor points on detected planes and maintaining the camera feed for 10-15 minute intervals while periodically logging the virtual object's world-space coordinates against physical reference markers.
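The periodic coordinate logging can be reduced to a simple drift computation. This is an illustrative sketch, not an ARCore/ARKit API: the function name, the sample coordinates, and the 2 cm pass tolerance are all assumptions chosen for the example.

```python
import math

def drift_cm(initial, current):
    """Euclidean distance in cm between the anchor's initial and
    current world-space positions (coordinates in metres)."""
    return 100 * math.sqrt(sum((a - b) ** 2 for a, b in zip(initial, current)))

# Coordinates logged at placement and again after a 10-minute session
# (values invented for illustration).
placed_at = (0.120, 0.000, -1.450)   # x, y, z in metres
after_10m = (0.205, 0.004, -1.512)

drift = drift_cm(placed_at, after_10m)
print(f"Anchor drift after session: {drift:.1f} cm")
# A hypothetical pass criterion of 2 cm would flag the 8-12 cm drift
# observed on the Galaxy A52 in the case study below.
if drift > 2.0:
    print("FAIL: drift exceeds tolerance")
```

Logging against a physical reference marker (tape on the floor) rather than the framework's own coordinates avoids trusting the system under test to grade itself.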

For Occlusion Validation, introduce real-world physical occluders (chairs, tables) at varying distances and angles, verifying that virtual objects correctly render both in front of and behind these obstacles based on depth-map accuracy from LiDAR or stereo cameras.
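The per-pixel rule a tester is verifying can be stated explicitly: a virtual fragment should be hidden whenever the real-world surface at that pixel is closer to the camera. The tolerance value below is an assumption to absorb depth-map noise, not a framework constant.

```python
def is_occluded(real_depth_m, virtual_depth_m, tolerance_m=0.02):
    """A virtual fragment should be hidden when the real-world surface
    at the same pixel sits closer to the camera than the fragment."""
    return real_depth_m + tolerance_m < virtual_depth_m

# A chair 1.2 m away should hide the part of a virtual sofa placed 2.0 m away.
assert is_occluded(1.2, 2.0) is True
# Nothing between the camera and the sofa: the sofa renders normally.
assert is_occluded(3.5, 2.0) is False
```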

Implement Thermal State Monitoring by running GPU-intensive background applications before testing to simulate device heat, then measuring frame rates and tracking stability using Android GPU Profiler or Xcode Metal System Trace.
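The frame-rate measurements from that step can be evaluated with a simple onset check: at what point in the session does the pre-heated device first fall below the target frame rate? The sample values and the 30 fps floor below are invented for illustration.

```python
def throttling_onset(samples, fps_floor=30.0):
    """Return the first (minute, fps) pair where the average frame
    rate drops below the floor, or None if the run stays stable."""
    for minute, fps in samples:
        if fps < fps_floor:
            return minute, fps
    return None

# Minute-by-minute fps averages logged during a pre-heated test run
# (hypothetical numbers).
run = [(1, 59.8), (3, 58.1), (5, 44.6), (7, 27.3), (9, 24.9)]
print(throttling_onset(run))  # → (7, 27.3)
```

Recording the onset minute alongside the device's reported temperature gives a reproducible thermal-degradation point to compare across builds.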

Finally, conduct Cross-Platform Parity Testing to ensure that ARCore's coordinate system drift tolerance matches ARKit's behavior under identical environmental conditions, documenting discrepancies in plane detection speed and anchor retention.

Situation from life

During validation of a furniture retail app, we discovered that virtual sofas consistently drifted 8-12 centimeters from their initial placement after seven minutes of user interaction on Samsung Galaxy A52 devices, while iPhone 12 units maintained sub-centimeter accuracy in the same environment.

The problem manifested specifically on low-texture beige carpets under warm LED lighting, combined with thermal throttling that reduced the Snapdragon 720G SoC performance by 40% after sustained AR rendering.

Solution A: Controlled Lab Testing Only

We initially considered restricting tests to ideal conditions with high-contrast checkerboard patterns and continuous air-conditioned cooling.

Pros: Highly reproducible pass/fail criteria and minimal environmental variables.

Cons: Failed to catch the drift issue that 70% of users reported in homes with neutral-toned carpeting, rendering the test suite ineffective for production sign-off.

Solution B: Flagship Device Exclusivity

Another approach involved testing solely on iPhone 15 Pro and Samsung S24 Ultra with dedicated cooling rigs.

Pros: Eliminated thermal variables and showcased optimal PBR rendering quality.

Cons: Represented only the top 15% of the user base, masking critical performance issues affecting mid-tier devices where the app actually exhibited thermal throttling and SLAM tracking loss.

Solution C: Environmental Stress Matrix with Thermal Profiling

We chose to implement a comprehensive matrix combining five distinct surface textures (marble, shag carpet, wood grain, plain drywall, glass), three lighting scenarios (natural daylight, fluorescent office, warm incandescent), and two thermal states (cold boot versus post-gaming 45°C device temperature).

Pros: Accurately reproduced the drift and occlusion failures reported by users, while providing quantifiable data on thermal degradation points.

Cons: Required 3x more test execution time and necessitated purchasing various floor samples and lighting equipment.
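The matrix itself is just the Cartesian product of the three factors, which makes the "3x more execution time" cost concrete: 5 surfaces × 3 lighting scenarios × 2 thermal states yields 30 cases per device. A minimal sketch:

```python
from itertools import product

surfaces = ["marble", "shag carpet", "wood grain", "plain drywall", "glass"]
lighting = ["natural daylight", "fluorescent office", "warm incandescent"]
thermal  = ["cold boot", "post-gaming 45C"]

# Every combination becomes one manual test case.
matrix = list(product(surfaces, lighting, thermal))
print(len(matrix))  # → 30
```

Generating the matrix rather than hand-writing it guarantees no combination is silently skipped when a factor is added later.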

Chosen Solution and Result:

We adopted Solution C because it directly correlated with field failure reports. By testing on thermally stressed Galaxy A52 units on beige carpet, we confirmed that ARCore's point cloud confidence dropped below the 0.6 threshold required for stable tracking, triggering the drift. The development team implemented dynamic quality scaling that reduced shadow raycasting when device temperature exceeded 42°C, which stabilized the SLAM tracking and maintained consistent frame rates above 30 fps.
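The quality-scaling fix can be sketched as a temperature-gated policy. This is a reconstruction from the write-up, not the team's actual code: the feature names, the 42°C threshold, and the fps targets are taken from the description above, and the return structure is hypothetical.

```python
def render_quality(device_temp_c):
    """Illustrative quality-scaling policy: drop shadow raycasting and
    the frame-rate target once the device passes 42 C (threshold taken
    from the case study; structure and names are hypothetical)."""
    hot = device_temp_c > 42.0
    return {
        "shadow_raycasting": not hot,
        "target_fps": 30 if hot else 60,
    }

assert render_quality(38.0)["shadow_raycasting"] is True
assert render_quality(45.0) == {"shadow_raycasting": False, "target_fps": 30}
```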

What candidates often miss

How do you differentiate between SLAM tracking loss caused by insufficient visual features versus motion blur during manual testing?

Many candidates attribute all tracking instabilities to software bugs without considering environmental physics. Insufficient visual features (like white walls or glossy surfaces) cause ARCore/ARKit to report low trackingState confidence consistently in static conditions, visible in Logcat or Xcode console logs as "InsufficientFeatures" errors. Motion blur, conversely, correlates with high accelerometer readings from the IMU (Inertial Measurement Unit) showing rapid movement spikes while the camera feed exhibits smearing. To distinguish manually, hold the device completely still; if tracking remains unstable, visual features are the culprit. If stability returns when movement stops, motion blur is the cause.
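The hold-still heuristic described above can be written down as a small decision rule. Everything here is an assumption for illustration: the accelerometer threshold, the function name, and the labels are not ARCore/ARKit concepts, just a way to make the manual procedure explicit.

```python
def classify_tracking_loss(tracking_stable, accel_peak_ms2, moving):
    """Manual-test heuristic: hold the device still and observe.
    The 15 m/s^2 spike threshold is illustrative."""
    if not moving and not tracking_stable:
        # Static yet unstable: the scene itself lacks visual features.
        return "insufficient visual features"
    if moving and accel_peak_ms2 > 15.0 and not tracking_stable:
        # IMU spikes plus smearing in the camera feed: motion blur.
        return "motion blur"
    return "tracking nominal"

assert classify_tracking_loss(False, 0.5, moving=False) == "insufficient visual features"
assert classify_tracking_loss(False, 22.0, moving=True) == "motion blur"
```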

Why is PBR material validation necessary under multiple color temperatures, and how do you verify lighting estimation accuracy without a spectrometer?

Candidates often test PBR materials under a single artificial lighting condition and declare success, missing that ARKit's light estimation or ARCore's Environmental HDR mode might misinterpret incandescent 2700K light as daylight 6500K, causing gold metals to render as silver or matte plastics to appear metallic. To test manually without specialized hardware, place a physical X-Rite ColorChecker chart or a standard white A4 sheet next to the virtual object. Compare the virtual object's specular highlights and diffuse reflections against how the physical paper appears; if the virtual object looks unnaturally cool or warm compared to the paper under the same light, the lighting estimation algorithm requires calibration.
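The pass/fail judgment can be quantified when the framework exposes an estimated color temperature: compare it against the lamp's known rating. The function and the 800 K tolerance below are illustrative assumptions, not part of either framework's API.

```python
def lighting_estimation_error(estimated_kelvin, actual_kelvin, tolerance_k=800):
    """Flag a mismatch between the AR framework's estimated colour
    temperature and the known lamp rating (tolerance is illustrative)."""
    delta = abs(estimated_kelvin - actual_kelvin)
    return delta <= tolerance_k, delta

# The failure mode described above: 2700 K incandescent read as 6500 K daylight.
ok, delta = lighting_estimation_error(6500, 2700)
assert ok is False and delta == 3800
```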

What impact do protective phone cases have on SLAM performance, and why do testers often overlook this variable?

QA engineers frequently test on bare development devices, missing that 85% of users employ protective cases that can obstruct rear-facing time-of-flight sensors or LiDAR scanners. When these depth sensors are blocked, the system falls back to RGB camera-based tracking, which has significantly lower accuracy for occlusion detection and plane detection speed. Testers should validate with cases installed, particularly thick rugged cases or those with metallic rings, and verify that the app gracefully degrades by showing "Surface detection may be impaired" warnings when depth sensor obstruction is detected via Camera2 API diagnostics on Android or AVFoundation metadata on iOS.
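The graceful-degradation behaviour a tester should look for can be sketched as a mode decision. Everything here is hypothetical: the function, the 0.3 confidence cut-off, and the mode names are invented to make the expected fallback explicit, and the warning string is taken from the paragraph above.

```python
def surface_detection_mode(depth_sensor_available, depth_confidence):
    """Expected graceful degradation: fall back to RGB tracking and
    warn the user when depth data is missing or unreliable.
    The 0.3 cut-off and mode names are illustrative."""
    if not depth_sensor_available or depth_confidence < 0.3:
        return "rgb_fallback", "Surface detection may be impaired"
    return "depth_tracking", None

# A thick rugged case partially blocking the ToF sensor:
assert surface_detection_mode(True, 0.1) == (
    "rgb_fallback", "Surface detection may be impaired")
assert surface_detection_mode(True, 0.9) == ("depth_tracking", None)
```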