Sajjad Abedi

Veriff Auto Capture

Tl;dr

To cut costs, I tackled high resubmission rates and fraud with data insights, ML, and a simpler user experience. This reduced resubmissions by 23%, cut fraud by 25%, and improved session success, laying the groundwork for broader automation.

Setting the stage

Veriff was going through turbulent times. Leadership changes, a company-wide layoff, and shifting priorities created an atmosphere of uncertainty. Our team’s challenge was clear yet daunting—our resubmission rates were driving up costs, frustrating users, and tarnishing our reputation.

The problem

High resubmission rates weren’t just expensive; they also damaged trust. But what was causing users to fail their verification attempts?

Discovery

To find answers, I started with a deep dive into our data. I analyzed session breakdowns by country, resubmission rates, and the specific customers most affected. This gave me an initial glimpse into the scale of the problem. But numbers only told part of the story.
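As a rough illustration of the kind of breakdown I ran, the sketch below aggregates resubmission rates per country from raw session records. The field names and data shape are placeholders for illustration, not Veriff's actual schema.

```typescript
// Hypothetical sketch: computing resubmission rates per country from raw
// session records. Field names (country, resubmitted) are illustrative only.
interface SessionRecord {
  country: string;      // ISO country code of the end user
  resubmitted: boolean; // true if the session required another attempt
}

function resubmissionRateByCountry(sessions: SessionRecord[]): Map<string, number> {
  const counts = new Map<string, { total: number; resubmitted: number }>();
  for (const s of sessions) {
    const entry = counts.get(s.country) ?? { total: 0, resubmitted: 0 };
    entry.total += 1;
    if (s.resubmitted) entry.resubmitted += 1;
    counts.set(s.country, entry);
  }
  // Convert counts into a rate per country, e.g. 0.23 for 23%.
  const rates = new Map<string, number>();
  for (const [country, { total, resubmitted }] of counts) {
    rates.set(country, resubmitted / total);
  }
  return rates;
}
```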

Deep dive into the data.

Next, I explored qualitative sources. I pored over customer tickets, listened to support calls, and even shadowed our manual verification team. Patterns began to emerge—poor image quality and process misunderstandings were leading contributors.

Shadowing the manual verification team.

A quick win

We began by improving the manual verification process with ML. This helped our team catch errors faster and more consistently, reducing mistakes that led to resubmissions. It also proved that automation could empower humans rather than replace them.
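A minimal sketch of the idea, using hypothetical names and thresholds rather than the real pipeline: the model's suggestion is pre-filled when it is confident, and the session is flagged for a closer look when it is not.

```typescript
// Hypothetical sketch of surfacing an ML outcome to manual verification:
// confident decisions are pre-filled for the reviewer, low-confidence ones
// are flagged for closer review. Names and the 0.9 threshold are assumptions.
interface MlOutcome {
  sessionId: string;
  suggestedDecision: "approve" | "decline";
  confidence: number; // 0..1, produced by the model
}

interface ReviewTask {
  sessionId: string;
  prefill?: "approve" | "decline"; // shown to the reviewer as a suggestion
  needsCloseReview: boolean;
}

function toReviewTask(outcome: MlOutcome, threshold = 0.9): ReviewTask {
  const confident = outcome.confidence >= threshold;
  return {
    sessionId: outcome.sessionId,
    prefill: confident ? outcome.suggestedDecision : undefined,
    needsCloseReview: !confident,
  };
}
```

Framing it this way keeps the human in charge of the final decision while the model handles the obvious cases, which is what made the automation feel empowering rather than threatening.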

Verification tools
We brought the ML outcome into the manual verification tools.

Image capturing

Next, we tackled end-user image capture—a major contributor to verification failures. We designed prototypes that provided real-time feedback, helping users adjust lighting, framing, and glare before submission.
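To give a feel for what "real-time feedback" means in practice, here is a simplified sketch of a client-side quality check on a camera frame. The thresholds and heuristics are illustrative assumptions, not the production logic.

```typescript
// Hypothetical sketch of client-side quality feedback before submission.
// It samples pixels from a captured video frame and returns a simple hint.
type QualityHint = "too_dark" | "possible_glare" | "ok";

function checkFrameQuality(frame: ImageData): QualityHint {
  const { data } = frame; // RGBA bytes from a canvas-captured video frame
  const pixels = data.length / 4;
  let lumaSum = 0;
  let blownOut = 0;

  for (let i = 0; i < data.length; i += 4) {
    // Perceived luminance of one pixel.
    const luma = 0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2];
    lumaSum += luma;
    if (luma > 250) blownOut += 1; // near-white pixels often indicate glare
  }

  const meanLuma = lumaSum / pixels;
  if (meanLuma < 60) return "too_dark";            // assumed darkness threshold
  if (blownOut / pixels > 0.05) return "possible_glare"; // assumed glare ratio
  return "ok";
}
```

In the prototypes, a hint like "possible_glare" would translate into an on-screen message such as "Tilt your document to avoid reflections" before the image was ever submitted.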

Starting with the welcome screen and making sure the user is ready to capture the image.
Automatically detecting the document type and capturing the image.
Automatically capturing the selfie image.
Giving feedback to the user to make sure the image is good enough.
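The auto-capture behaviour shown above can be sketched as a simple stability check: only trigger the shutter once the on-device detector has been confident for a short run of consecutive frames. The detector interface, thresholds, and frame counts here are assumptions for illustration.

```typescript
// Hypothetical sketch of an auto-capture trigger: fire the shutter only once
// the document detector has reported a confident detection for several
// consecutive frames, so shaky or partial framings do not get captured.
interface Detection {
  documentFound: boolean;
  confidence: number; // 0..1 from an on-device model
}

class AutoCapture {
  private stableFrames = 0;

  constructor(
    private readonly minConfidence = 0.8,     // assumed confidence threshold
    private readonly requiredStableFrames = 10, // ~0.3s at 30 fps, assumed
    private readonly onCapture: () => void = () => {},
  ) {}

  // Call once per camera frame with the latest detection result.
  onFrame(detection: Detection): void {
    if (detection.documentFound && detection.confidence >= this.minConfidence) {
      this.stableFrames += 1;
      if (this.stableFrames >= this.requiredStableFrames) {
        this.onCapture();      // capture without the user pressing anything
        this.stableFrames = 0; // reset so we don't capture repeatedly
      }
    } else {
      this.stableFrames = 0; // detection lost; restart the stability window
    }
  }
}
```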

Through iterative testing, we fine-tuned these solutions, balancing usability with technical constraints. Resubmission rates began to drop as users succeeded on their first try.

Delivering results

We cut resubmissions by 23% and reduced fraud by 25% without asking users for any extra effort. Session drop-offs fell by 15%, and the higher-quality images made manual verification noticeably easier. Overall, it has been a great success.