The rise of peer-to-peer (P2P) payment apps has revolutionized the ease and speed of personal payments, giving individuals powerful financial tools that didn’t exist before. Despite the safeguards these platforms have in place, bad actors across the financial industry still attempt to exploit them, especially when people use the apps to pay individuals they don’t know personally. In 2023, consumers lost a record $10 billion to scams. Preventing fraud and scams requires enhanced security measures alongside user vigilance.
At Cash App, we’ve built a number of safeguards to protect our customers from scams, from auto-blocking high-risk payments to providing a range of educational resources. We’ve also widely rolled out a tool called Payment Warnings, which leverages our machine-learning (ML) systems to warn customers that their P2P transaction may be part of a scam. When customers receive a scam warning, they abandon or cancel the payment nearly 50% of the time, a clear demonstration of the value of these safeguards.
A pop-up warning is triggered whenever our models detect a high likelihood of a scam. When customers encounter the warning in Cash App, they have the option to cancel the payment or proceed. That information is then used to improve our ML systems.
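To make that flow concrete, here is a minimal sketch of a threshold-based warning with a feedback loop. The threshold value, function names, and data fields are assumptions chosen for illustration; they do not describe Cash App’s actual models or systems.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical score above which a warning is shown; the real threshold and
# model are not public, so treat this value as an assumption.
WARNING_THRESHOLD = 0.85


@dataclass
class PaymentDecision:
    scam_score: float                   # model-estimated likelihood the payment is a scam
    warning_shown: bool                 # whether the pop-up warning was triggered
    customer_proceeded: Optional[bool]  # None when no warning was shown


def log_feedback(decision: PaymentDecision) -> None:
    # Placeholder sink: in practice the customer's choice would be stored and
    # used as a labeled example when retraining the models.
    print(f"feedback: score={decision.scam_score:.2f}, proceeded={decision.customer_proceeded}")


def evaluate_payment(scam_score: float, prompt_customer: Callable[[], bool]) -> PaymentDecision:
    """Show a warning when the score crosses the threshold and record the customer's choice."""
    if scam_score < WARNING_THRESHOLD:
        return PaymentDecision(scam_score, warning_shown=False, customer_proceeded=None)

    proceeded = prompt_customer()  # True if the customer chooses to send the payment anyway
    decision = PaymentDecision(scam_score, warning_shown=True, customer_proceeded=proceeded)
    log_feedback(decision)
    return decision


# Example: a high-risk payment where the customer decides to cancel.
print(evaluate_payment(0.93, prompt_customer=lambda: False))
```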
The ML models that trigger this warning are trained on multiple data sources. One of the most important is historical scam reports submitted by our customers. We’ve made it simple for customers to report potential scams. Upon receiving these reports, we evaluate and investigate them as appropriate to address scam-related activity, and that data is used to improve the performance of our models. Cash App customers play an important role in keeping our platform safe.
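One way reports like these can improve a model is by serving as training labels. The sketch below shows that idea in its simplest form, joining confirmed reports to payment records; the table names, columns, and join key are hypothetical and stand in for whatever the real pipeline uses.

```python
import pandas as pd

# Hypothetical inputs: one row per payment, one row per report upheld after review.
payments = pd.DataFrame({
    "payment_id": [101, 102, 103],
    "amount": [250.0, 40.0, 980.0],
    "recipient_is_new_contact": [True, False, True],
})
confirmed_reports = pd.DataFrame({"payment_id": [103]})

# Payments named in a confirmed report become positive examples; the rest negative.
labeled = payments.assign(
    is_scam=payments["payment_id"].isin(confirmed_reports["payment_id"]).astype(int)
)
print(labeled)
```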
Other data sources that feed these ML models include previous customer support cases, data associated with our scam warnings, and a host of additional customer and transaction information. We also monitor how customers interact with the warnings and which kinds of payments trigger them, so we can keep optimizing how the tool is deployed.
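As an illustration of the kind of monitoring described above, the short sketch below computes an abandonment rate for warned payments, broken down by payment category. The event fields and category names are made up for the example and are not Cash App’s actual schema.

```python
from collections import defaultdict

# Hypothetical warning events: which category of payment triggered the warning,
# and whether the customer abandoned or canceled the payment afterward.
warning_events = [
    {"category": "online_marketplace", "abandoned": True},
    {"category": "online_marketplace", "abandoned": False},
    {"category": "romance", "abandoned": True},
    {"category": "romance", "abandoned": True},
]

totals = defaultdict(lambda: {"shown": 0, "abandoned": 0})
for event in warning_events:
    bucket = totals[event["category"]]
    bucket["shown"] += 1
    bucket["abandoned"] += int(event["abandoned"])

for category, counts in sorted(totals.items()):
    rate = counts["abandoned"] / counts["shown"]
    print(f"{category}: {rate:.0%} of warned payments abandoned ({counts['shown']} shown)")
```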
Cash App works tirelessly to keep customers informed and make sure the platform is as safe as possible. This tool is one example of how we’re constantly looking for ways to stay a step ahead of sophisticated fraudsters and evolving fraud patterns – and we’ll continue introducing new tools to achieve this goal.